Posts posted by thetuningspoon

  1. A client has requested that the pager always show the next page number when the total number of pages exceeds the numPageLinks setting. This is what it's showing right now when on Page 7:

    Prev   1   ...   6   7   ...   24   Next

    This is what they would like:

    Prev   1   ...   6   7   8   ...   24   Next

    I think it is a little more intuitive for some people to be able to click on the next page number rather than on the Next button.

    I looked through the options and didn't see anything for this. I started looking through the code as well to see if I could hack it but I'm having trouble pinpointing where this determination is made.

  2. On 1/19/2019 at 7:25 PM, Robin S said:

    I might have this wrong, but isn't the important thing that output formatting be off when you get and set values? It's not enough that it simply be off at the moment the page is saved.

    Doing the following doesn't solve the "problem" of output formatting...

    
    $page->foo = $page->bar . ' baz';
    $page->of(false);
    $page->save();

    ...and so likewise some feature that automatically turned off output formatting within the save() method wouldn't be a solution either. Output formatting has to be turned off earlier in the process of getting and setting values, and PW can't automatically know when to do that, so it has to be done manually by the developer.

    Edit: to clarify regarding getting values - output formatting only needs to be off if you are going to use that value in a field you are setting.

    I find that the need to set the value of one field to another field is so rare in practice that it cannot possibly outweigh the time, frustration and possible security flaws that dealing with turning off and on output formatting has caused me over the years. It is not hard to use $page->getUnformatted('field') in the rare instance when that is what I actually want to do. At least if I am unintentionally saving formatted values to other fields, it is just a mistake and not a potential security concern like outputting fields to the page that are not entity encoded is. And it certainly doesn't seem like it should be an application-terminating error.

    One example I recently ran into was calling a function to send an email while in the midst of editing a page via the API. In such a case you need to remember to turn output formatting back on before calling the email function (and then off again before doing your save), or your email will output unformatted fields. It took me ages to figure out why the dates in my emails were sometimes coming out as timestamps. I use setAndSave() now wherever I can, but I do wish I could use the regular object setting syntax sometimes.
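    The toggling dance described above looks something like this in ProcessWire API code (a sketch only; sendOrderEmail() is a hypothetical helper and the field names are made up):

```php
<?php namespace ProcessWire;

// Editing a page via the API, so output formatting is off
$page->of(false);
$page->summary = $page->getUnformatted('body');

// Formatting has to be back ON before anything renders field
// values (like an email body), or dates come out as raw timestamps
$page->of(true);
sendOrderEmail($page); // hypothetical helper that renders fields

// ...and OFF again before saving, or save() will throw an error
$page->of(false);
$page->save();

// setAndSave() sidesteps the whole dance for a single field
$page->setAndSave('status_note', 'Order email sent');
```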

    I think this is why people say that "global mutable state" is dangerous.

  3. 4 minutes ago, Steve_Stifler said:

    ....Oh, one other thing, to confirm I've got it right in my head as to how to attack this, I'm gathering I create a page/form per database "Table" or "Group of columns" in the back end and they then are completed on the front end when the data needs to be entered. Am I in essence creating a Page Template and each record is a different Page, or am I using the 1 Page and the data is being stored in a database somewhere?

    JA

    That is correct. A page can be thought of as a particular record and the template as the table/schema.

    But if you are creating a front-end form for populating data to pages, you may want to create a single page with its own template which has the form on it for creating the records, and then have a hidden area in the page tree where the data-only pages are kept (this is in essence how PW's admin page editor works). The possibilities are endless.  But either way, you're just working with pages--no outside database. 
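    A minimal sketch of that pattern, assuming a 'record' template and a hidden '/data/' parent page (both names are made up for illustration):

```php
<?php namespace ProcessWire;

// Inside the form page's template file, after the form validates:
// create one hidden "record" page per submission
$record = new Page();
$record->template = 'record';             // assumed template name
$record->parent = $pages->get('/data/');  // assumed hidden parent page
$record->title = $sanitizer->text($input->post->title);
$record->save();
```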

  4. For the initial live deployment, we just make a copy of the database on the server, sync the files up, and update the config file to point to the correct database. There's no need to run through the actual installer again.

    For ongoing database updates, we usually use PW's import/export functionality for syncing templates and fields, and then have to remember to add any new pages that are needed. But that is not always ideal, so a solution where all DB changes are done by script (as with the migrations module) is probably a better long term approach if you can pull it off. 

  5. I'm trying out the front end features in earnest for the first time. I'm trying to edit my title field inline. It works, but for some reason it's allowing me to enter new lines in a single line text input. If I enter new lines and hit save, they are converted to <br /> tags (via the HTML entities output formatter). But if I double click to edit again, the tags are processed as live HTML in edit mode (potential security flaw?). And if I try to remove all of the new lines, I still end up with at least one stray <br /> tag at the end of the line.

    [Animated GIF: screen recording of the inline editing issue described above]

  6. FWIW, I really like the font and shade of blue. I also really dig the home page and how you've presented PW, interspersed with the user quotes. I found it very convincing.

    Agreed on getting "headless" on the page somewhere.

    The documentation is really shaping up. I like the addition of the WordPress vs. PW section.

     

    One critique... The tertiary menus on the right side on the doc pages are confusing to me. For some reason I just don't understand what I'm looking at.

    Also, the hamburger menu looks a little generic. It would be nice to spice it up a little.

    I also noticed the issue with horizontal scrolling. If you scroll to the right there is a small bar of white that appears. I've found the way to deal with this is to add an additional element inside of the body element that contains everything else on the site and put overflow-x: hidden on it (adding that directly to the body tag doesn't work in all browsers). But I haven't actually looked at your code so I don't know what is causing it.

  7. On 10/27/2018 at 7:43 AM, bernhard said:

    Agree on all the above.

    This video is really great - and even if it was rebuilt I think it should be linked nearby. 8 years later and the foundation and principles are still the same and more trendy than ever before. Just awesome!! Headless as a totally new concept?! Bore me more.

    Maybe we should be marketing ProcessWire as "The original headless CMS"

  8. A bit of an update after some more experimentation tonight. 

    I added pages.templates_id as a field always selected in the RockFinder results, and then attempted to use the $pages->newPage() method to take the results of the SQL query and convert them into a PageArray of Page objects. This worked, thus eliminating that second trip to the database I mentioned (and also bypassing the page cache and probably some other nice features of the normal page creation process).

    Unfortunately, this slowed the whole thing way down again. So I'm thinking now that it is something else about constructing Pages that is slow. Maybe generating the page path or some other property is the problem. Perhaps I need to load up some additional fields up front. Will have to test more.

    WireData/WireArray works great, though.

  9. 4 hours ago, bernhard said:

    PS: 2 seconds still sounds very slow for 200 rows. May I see your finder setup?

    You're right... I was not counting some of the pages involved. There are at least 2 to 3 times that many.

    Also, I am counting the entire time from request to first response (not using a debug timer)

  10. @bernhard I've finally had a chance to try out your module tonight for a project where we're loading pages into a large data table (200+ rows) and were hitting a wall.

    Using RockFinder I now have the initial page load at ~2 seconds, down from ~7+ seconds! This is a fantastic module, Bernhard. It looks like it's really well thought out and has some really powerful features for building queries. I love how it extends PW's native selectors and allows you to return standard objects, making it easy to substitute this in for a regular $pages->find. Thank you for making this!

    I think I can answer my own question now... The main issue with creating Page objects is that page instantiation requires a trip back to the database. The initial $pages->find converts a selector into a SQL query which returns an array of matching page IDs. Then those IDs are used to go back to the database and get the pages (or pull them from cache if they're already loaded). Then for any page field requested that isn't auto-join, an additional database query is required. If you're looping through a lot of pages, that's a lot of DB queries!

    It seems like there might be a way to provide the functionality of RockFinder in the native PW core, as an option when loading pages. You would still end up with Page objects in the end (which in my case would be a huge boon since I like to extend the Page class with a custom class and methods for each template), but we could skip that second trip to the database (getting pages by IDs) if we could just tell PW which fields we wanted it to get up front. After that, any additional fields we didn't specify could be loaded with another trip to the DB, as they are now.

    That being said, I'm sure @ryan has a good reason for that second trip to the DB. But it seems like there must be a way that we could improve the speed of native pages, even if it is a hidden/advanced option with some caveats.

     

    One minor complaint: I noticed that the module seems to fail silently and return nothing when it can't find one of the fields. It would be good to throw an exception to make this easier to debug.

    Edit: Another thought... Is there a reason not to use WireData and WireArray for the objects returned from RockFinder, in place of a stdClass object? This would allow you to use WireArray's built-in sorting and selecting features on the result set:

    $results = $this->database->query($this->getSql());
    if($array) {
    	// Plain associative arrays, as before
    	$objects = $results->fetchAll(\PDO::FETCH_ASSOC);
    }
    else {
    	// Hydrate each row into a WireData object, then wrap the whole
    	// result set in a WireArray to get its sorting/selecting API
    	$objects = $results->fetchAll(\PDO::FETCH_CLASS, '\ProcessWire\WireData');
    	$objects = (new WireArray())->import($objects);
    }
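    For what it's worth, here's a sketch of what that would buy the caller: in-memory sorting and filtering on the result set with no further database queries (assuming $objects is the WireArray built above):

```php
// Sort newest-first and filter in memory using WireArray's
// selector support -- no extra trips to the database
$objects->sort('-created');
$recent = $objects->find('created>2018-01-01');
$first = $objects->first();
```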

     

  11. @LostKobrakai This is on a production server.

    @dragan There are no files or images.

    @teppo We are creating a box subscription service where the user can have multiple boxes and configure what is included in each of them. The page structure is:

     

    - Subscription (Not an actual user page, just linked to the user by a page field)
    - - Boxset v1  <-- This is the page I'm calling clone on, and the resulting clone is stored as a sibling
    - - - Box 1
    - - - Box 2
    - - - Box 3
    - - - Box 4
    - - Boxset v2
    - - - Box 1
    - - - Box 2
    - - - Box 3
    ... etc

     

    There are not many fields on each page (~3-4). But each Box has a repeater field on it which usually contains 6 entries. The repeater is using AJAX to generate new items. The 30 pages number I came up with included the repeater pages.

    I've switched to using a ProFields Table field instead of a repeater since I think it will actually serve us just as well in this case. The clone time is now greatly reduced, although still feels a bit slow to me.

    It does sound like I should set up a test case and some debugging to see if something weird is going on. Thanks everyone for your feedback and let me know if you have any further thoughts!

  12. In an app I'm building I frequently need to clone a set of pages that represent a user's 'order'. Every time they make changes to the order, I want to make a new version by cloning this page structure. The pages in the structure also contain repeaters, so while there is in theory no limit to the number of pages in the structure, there would realistically be no more than about 30 pages (repeaters included).

    From a coding perspective the clone() method makes this all a breeze. Unfortunately the clone operation just takes too long (~20 seconds). Does anyone have any insight into why this is so slow? Is there any way around this aside from reducing the number of pages? Is there hope that we might be able to make this more efficient in a future version of PW?
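    For context, the versioning itself is just a recursive clone (a sketch; the paths and names here are illustrative, not from a real install):

```php
<?php namespace ProcessWire;

// Clone the current version of the order structure, children
// (and their repeater pages) included, as a sibling page
$current = $pages->get('/orders/order-123/v1/');
$next = $pages->clone($current, $current->parent);

// Rename the clone to mark it as the next version
$next->of(false);
$next->name = 'v2';
$next->save();
```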

  13. For some reason I never really thought about the fact that, with client-side rendering, you are basically sending your entire application over the wire. It's like having to install an app each time you visit a website. Makes me feel more confident in the componentized server-side approach I've been pursuing, which seems to achieve 90% of what the fully client-side approach aims to achieve, without the complexity and overhead. And it still leaves room for plugging in a more progressive framework like Vue for the cases when you need that extra 10% of interactivity.

  14. This can now be done with owner selectors (http://processwire.com/blog/posts/processwire-3.0.95-core-updates/)

     

    $tagsThatHaveBeenUsedOnPosts = $pages->find('template=tag, tags.owner.template=post');

     

    Where tags is the name of the field on the post template that holds the tag pages.

     

    Also, for any given tag page you can check how many post pages reference it with $page->references('template=post')->count();

    https://processwire.com/blog/posts/processwire-3.0.107-core-updates/#what-pages-point-to-this-one

  15. An update on this for others:

    Using Redis for sessions solved the problem for me. I had to make some changes to the module to get it to work right (if anyone wants my code, let me know).

    Today I happened to try migrating my project from using ProcessWire's default MyISAM database engine to using InnoDB (had to convert all tables and set $config->dbEngine in site/config.php). I was playing around with the demo version of my system (which is not using Redis). And interestingly enough, I THINK this (in combination with SessionHandlerDB) actually resolved the issue with simultaneous ajax calls! Perhaps this is because InnoDB has row-level locking instead of table-level locking?
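    For anyone wanting to try the same switch, the change is roughly this (a sketch; each existing table has to be converted first, one ALTER statement per table):

```php
// Convert each existing table first, e.g. in MySQL:
//   ALTER TABLE pages ENGINE=InnoDB;
// ...then tell ProcessWire to use InnoDB for new tables
// by setting this in site/config.php:
$config->dbEngine = 'InnoDB';
```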
