Everything posted by joe_g

  1. Thanks for the insights, Ryan. I figured there were good reasons, since virtually every CMS works this way. Ultimately what I'm after is to decouple editing and visiting, for example by serving the site as static files from Amazon S3, but I understand this would be a whole different approach to website-making (with a number of issues).
  2. I have an existing site that I'm trying to convert to ProcessWire, so as part of writing the conversion script I'm transferring a lot of data and deleting it again. I was wondering if this might mess up some indexes and slow things down gradually (although I'm guessing not). (By the way, converting an existing site to ProcessWire is awesome.) About performance: I don't have a lot of pages per repeater, but I do have a lot of repeater fields. It seems like every added repeater (where the value needs to be displayed) adds a linear amount of processing: twice the number of repeater fields, twice the time required to get the data. I was hoping that 'autojoin' would join the data (including the repeater data) into some kind of big outer join, bundling it all into one SQL call, but that doesn't seem to be the case. So I figure autojoin makes more sense for regular fields, but not for repeaters (see the sketch below). In my case, I guess I should just be conservative with the amount of things I display, and try to simplify/denormalize the structure a bit. J
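     A minimal sketch of turning autojoin on for a regular field from the API side; the field name 'summary' is just an example, and the same setting is available on the field's Advanced tab in the admin:

     // assumes $fields is the ProcessWire Fields API variable
     $field = $fields->get('summary');                       // a regular (non-repeater) field
     $field->flags = $field->flags | Field::flagAutojoin;    // load its data together with the page query
     $field->save();
     // repeater fields don't benefit the same way: each repeater item is itself a page,
     // so its values are loaded with separate queries when accessed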
  3. Thanks, switching context would work, I'll try that. I thought the context would be the calling page, not the current page, but I guess it makes sense how things are. So, if I do

     $somepage = $pages->get('/somepage');
     echo $somepage->body; // body has the MyTextFormatter filter applied

     then inside MyTextFormatter.php, $this->page is equal to $page (the current page), not $somepage. This is what confused me. J
  4. Hi, the save button was showing, and I clicked it and got a refreshed page, but no saved values in the database, so this was a different error – but I've only run into it twice in total so far.
  5. Thanks, but from inside the text formatter module code there is no "$page", only "$this->page", and my problem is that $this->page refers to the wrong page – not the page where the text field and textformatter reside. So I can't do "$page->urlSegment"; I don't think I can get the current urlSegment from within a text formatter, and in any case it doesn't feel like a good idea. I found an easy workaround: simply inject the fields I need before I use the text field (the one that calls the text formatters). No big problems, but it's a little bit ugly:

     // hack in read.php
     $page->footnotes = $pages->get($input->urlSegment(1))->footnotes;

     Then in my TextFormatterFootnotes.php I can do $this->page->footnotes (a fuller sketch of the idea is below). j
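     A minimal sketch of that workaround; the class and field names follow the post, but the getModuleInfo() details and what format() actually does with the footnotes are only illustrative assumptions:

     // in read.php: copy the field from the url-segment page onto the current page
     // before outputting the text field that runs the formatter
     $item = $pages->get($input->urlSegment(1));
     $page->footnotes = $item->footnotes;   // injected so the formatter can reach it
     echo $item->body;

     // TextFormatterFootnotes.php (sketch)
     class TextFormatterFootnotes extends Textformatter {
         public static function getModuleInfo() {
             return array('title' => 'Footnotes', 'version' => 1, 'summary' => 'Footnote markers in text fields');
         }
         public function format(&$str) {
             // $this->page is the page being viewed (read), not the url-segment page,
             // which is why the value is injected in the template above
             $footnotes = $this->page->footnotes;
             if($footnotes) $str .= "<ol class='footnotes'>{$footnotes}</ol>";
         }
     }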
  6. Heyhey, I've run into this issue: I access a URL read/x where x is a URL segment. Then, in read.php (the template for read), I find the page x and display a text field it has, together with a text formatter I wrote that manipulates the text field. The problem is that inside the text formatter code $this->page points to read, not x – so my filter isn't working, because I'm trying to fetch data from other fields in x (I'm writing a filter for footnotes). Am I doing it wrong or is it a bug? (I'm probably doing it wrong.) thanks!
  7. Hi, it was another install with version 2.2.9, using the latest Chrome. I'm trying to recreate the problem but not succeeding; if I manage, I'll post the rest of the information. I'm not sure if it said 'saved page', but I remember the green fields indicating a change of the field. j
  8. Hi, this is a general CMS question, but since this is my favourite CMS, I'll ask it here. Instead of generating a page once the visitor asks for it, wouldn't it be better to generate it directly when the editor presses 'save'? Then all server-side performance issues are gone completely. It totally wouldn't matter how long a page takes to generate, and you could focus on making the most maintainable structure instead of having to worry about performance. All requests would be ProCache-level speed. I understand this would only work for content-driven sites, and not more complicated things like services, but it would still cover a lot of use cases. I also understand that you would need to keep track of all pages that depend on certain data, but that doesn't strike me as very complicated, even if it has to be done manually (compared to the big advantage of virtually perfect response time, all the time). I'm writing this because I'm working on a site where the structure is at the limit of what is possible to do with a CMS (ProcessWire or any other) without things becoming too slow, and it struck me that this construction would eliminate all my worries about performance. Would this be possible to make by hooking into page save or similar? A rough sketch of what I mean is below. j
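     A rough sketch of the idea as an autoload module hooking Pages::saved. The module name, export path and the S3 sync step are all assumptions, and rendering a page right after save may need more care than shown here:

     class StaticExport extends WireData implements Module {
         public static function getModuleInfo() {
             return array('title' => 'Static Export (sketch)', 'version' => 1, 'autoload' => true);
         }
         public function init() {
             // run after any page has been saved
             $this->pages->addHookAfter('saved', $this, 'exportPage');
         }
         public function exportPage(HookEvent $event) {
             $page = $event->arguments(0);
             if(!$page->viewable()) return;
             $html = $page->render();   // generate the markup at save time
             $file = $this->config->paths->cache . 'static' . $page->path . 'index.html';
             if(!is_dir(dirname($file))) mkdir(dirname($file), 0755, true);
             file_put_contents($file, $html);
             // a separate job could then sync the 'static' directory to S3
         }
     }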
  9. I got really excited there for a moment, and in some cases this could work. In my case I'll need to display the times anyway, so they need to be fetched one way or another, and the expense of getting those relational pages is still there. I take it that every relationship / "join" that involves other pages adds a linear expense, and there is no way around that except custom SQL? ...I think I have to rethink the structure.
  10. I had this happen again: the page doesn't save in Chrome, but it works if I use Safari. The green messages indicate all is well, but nothing happens.
  11. This is absolutely brilliant. Automatic denormalisation, sort of. The 1 second it took was indeed only that very line, but without the auto-joined repeaters things improved a great deal. I'm repeatedly importing thousands of pages and then deleting them, lots of times. Could there be a performance degradation due to that? (I'm guessing that's not the case.) The second snippet triggers the saveReady hook for existing pages, I guess? Many thanks for the help – I'll get back with the results later. J
  12. Edit: I did something wrong before. But this line

     $pages->get('/events')->children('template=event,onfrontpage=1,times.enddate>'.$today);

     returns 60 events and takes one full second. Is that normal? I would have imagined this would be a single SQL query with lots of outer joins in it? "times" is a repeater. Each event contains perhaps 5 page fields (of which 2 are repeaters) and 10-15 regular fields. Everything that can be auto-joined is auto-joined, although it seemingly makes little difference. All fields are tri-lingual, perhaps that makes a difference? Ryan, you suggested I should do

     $times = $pages->get('/events')->find('template=time,parent.onfrontpage=1,enddate>'.$today);

     but since times is a repeater, I can't really do that (and it would be weird logically as well). J I started trying this out, and surprisingly the speed issue isn't so much searching and finding, but displaying the data, as in looping through each field and echoing it (see the timing sketch below). I thought that (if I was using auto-join) once the stuff was loaded it would be super fast to display. Could it be that auto-join doesn't work for repeaters and references to other pages? It takes several seconds to loop through 200 events and display them including all their domain data (a couple of repeaters and other connected page references – say about 5 domain data fields and 10 regular data fields). I tried an equivalent SQL query before that joins together all data including the domain data (I was using the database of Symphony, but the abstraction is similar); I joined together everything and the query would execute in milliseconds, with all the data loaded and ready to be displayed. Is there anything to do about this, or is custom SQL the only option here? thanks!
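     A quick way to separate "finding" time from "displaying" time, assuming the core Debug class is available and using the field names from the post ('times', 'startdate'):

     $t = Debug::timer();
     $events = $pages->get('/events')->children('template=event,onfrontpage=1,times.enddate>'.$today);
     echo 'find: ' . Debug::timer($t) . 's<br/>';

     $t = Debug::timer();
     foreach($events as $event) {
         // each repeater access loads its own item pages, so this part scales
         // with the number of repeater fields being displayed
         foreach($event->times as $time) {
             echo $event->title . ': ' . $time->startdate . '<br/>';
         }
     }
     echo 'display: ' . Debug::timer($t) . 's<br/>';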
  13. I was using Chrome v30. But now (some time later, presumably after some caching has expired) everything works fine again... J
  14. No, if there are more than 50 pages they get paginated, so how do I move something from page 2 to page 1? Edit: this is a bit hard to talk about since everything is a 'page' – a bit like smurf language. What I mean is that I want to move page number 51 (on the second page) to, for example, position 1. But I don't know how to cross the pagination boundaries (currently 50). J
  15. Hello, I've got a silly problem: I don't know how to move pages across pagination boundaries in the backend. Any ideas? J
  16. I tried dragan's suggestion (a different browser) and it worked great. I suppose that means there is more complicated stuff stored in the session, rather than just 'logged in yes/no'. thanks! J
  17. Hello, I ran into a weird problem I'm not sure how to debug: I can't save anything in the backend, including templates/fields/names/structure. If I try to change the title I get "Session: Change: title", but no change. But I can delete pages. It started happening on the live site, and after copying it's the same locally – so it's something specific to the setup. No errors in apache_error.log, mysql_error_log.err, php_error.log or in PW's errors.txt (!) Version is ProcessWire 2.2.9, PHP 5.4.4. Mystery! J
  18. Hi there, if I do some conditional output, for example

     if($config->ajax) {
         echo 'something';
     } else {
         echo 'something else';
     }

     and then switch on the cache for this particular page, I guess the first hit will determine what gets stored in the cache? The second hit will get the same result regardless. So that means I can't really have conditional output and caching at the same time, right? If I use ProCache, it's cached by URL instead, right? So then there can be multiple versions of the same page, depending on the URL? For example, if I use #!, then the ?_escaped_fragment= version of the page would be cached separately, I suppose? Just trying to see if I understand how it works correctly. thanks, J *edit: ProCache question
  19. I disabled the automatic creation of entries in the repeater; that seems to solve the problem of not being able to insert the repeater entries without clearing out the database first. I still would like to: be able to delete everything and insert again in the same script (now I need to do it in two GETs), and not have cruft left behind when I delete pages (repeater items under /processwire/repeaters/for-field-112/for-page-0/). thanks in advance for any tips or hints. I can get by like this, and in the worst case I can clean up the database manually – but maybe the process could be cleaner. J
  20. Hi, delete($page, true) doesn't seem to delete my repeater fields. It's version 2.3.0, so this fix is included: https://github.com/ryancramerdesign/ProcessWire/commit/b2780236a2643d703c586a23991d80e2e6b171bf I'm deleting a large number of pages and inserting them again (working on a CMS conversion). The weird thing is that unless I delete all pages, and then all repeater pages, I can't insert any repeaters again. I have to clear out the DB from both pages and repeaters before I can insert anything again, otherwise no repeaters are stored. More weird: unless I do the delete in a separate GET, it doesn't work either. I have to first delete all the stuff:

     $events = $pages->get('/events');
     $events_children = $events->children();
     foreach($events_children as $c) {
         $pages->delete($c, true);
     }
     $times = $pages->get('/processwire/repeaters/for-field-112/for-page-0/')->children();
     foreach($times as $t) {
         $pages->delete($t, true);
     }

     and then fill in the new values:

     for($n = 1; $n < 10; $n++) {
         $event = new Page();
         $event->parent = $pages->get('/events');
         $event->template = $templates->get('event');
         $event->title = 'Event hello #' . $n;
         // insert two times (repeater items) per event
         for($t = 0; $t < 2; $t++) {
             $time = $event->times->getNew();
             echo $time->id . '<br/>';
             $time->title = 'Time #' . $t;
             echo $time;
             $time->startdate = rand_date('2011-01-01', '2013-12-01');
             $time->save();
         }
         $event->save();
     }

     If I don't do it in two steps, the times values are not inserted. thanks! rgds, J
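     One guess about the for-page-0 cruft and the missing repeater items: in the loop above, getNew() is called before the new event has ever been saved, so the event has no id yet and its repeater items get parked under for-page-0. A variation that saves the owning page first (only a sketch, using the same field and helper names as above):

     for($n = 1; $n < 10; $n++) {
         $event = new Page();
         $event->parent = $pages->get('/events');
         $event->template = $templates->get('event');
         $event->title = 'Event hello #' . $n;
         $event->save();                       // give the event an id before adding repeater items
         for($t = 0; $t < 2; $t++) {
             $time = $event->times->getNew();  // should now attach to for-page-<id> rather than for-page-0
             $time->title = 'Time #' . $t;
             $time->startdate = rand_date('2011-01-01', '2013-12-01');
             $time->save();
             $event->times->add($time);        // add the saved item to the repeater field
         }
         $event->save();                       // save again so the repeater field stores its items
     }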
  21. (parentmost) I was trying to get the top event (Event3) from Sub-sub-event2 (or Sub-event3, or sub-sub-sub-sub-sub-eventx). I guess the solution is simply ->parents('template=event')->first(); – usage sketched below. J
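     For reference, a tiny usage sketch from any nested event page, relying on parents() returning ancestors in order from the root downwards (as discussed in the next post):

     // from e.g. Sub-sub-event2:
     $topEvent = $page->parents('template=event')->first();   // the top-most ancestor using template 'event'
     echo $topEvent->title;   // "Event3" in the structure from the original question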
  22. (parentmost) Ehm... this was maybe a bit too easy. Maybe that's why it was confusing ;P But the result seems to be in hierarchy order, meaning the first result of parents() is the top-most. The cheatsheet doesn't mention it, but I guess I can count on that order as a reliable way to find the top-most page?
  23. (parentmost) Thanks kongondo, but I don't know how many parents the current page/event has, so I don't know how many ->parent calls I would have to chain. I realise I can 1. get ->parents(), then 2. loop through each result and count its parents, but this seems slow. I was hoping there was a fast way, similar to rootParent, that is only one call somehow...
  24. (parentmost) Sorry for the confusion. What I'm looking for is how to get the top event (Event3) from Sub-sub-event2. All levels underneath Events use the same template, 'event'. I can't use rootParent, because that points to Events. If I use parents(), how do I know which of the events is the top-most in the result, if they're all the same template? (Side note: how fast is parents() – does it recurse upwards for every step, or is it done in one call somehow?) thanks! The structure:

     Home
       Events
         Event1
         Event2
         Event3
           Sub-event1
           Sub-event2
           Sub-event3
             Sub-sub-event1
             Sub-sub-event2