
formulate

Members
  • Content Count

    119
  • Joined

  • Last visited

Community Reputation

44 Excellent

About formulate

  • Rank
    Sr. Member

Profile Information

  • Gender
    Not Telling
  • Location
    Canada

Recent Profile Visitors

1,732 profile views
  1. Adrian, thanks for your time and input on this. Definitely easier to have Ryan relax the limit. Robin, that is a clever solution with URL segments. While I don't normally like workarounds like this, it may be the only solution at this point.
  2. Unfortunately this didn't work. The name field is now varchar(256), and the name is still being truncated at some point after $sanitizer->pageName runs. I also changed InputfieldName to 256 and ensured InputfieldPageName was also set to 256. Still doesn't work.
  3. Teppo, you are of course right. Thanks for your diplomatic reply and placing some perspective on this. This is a very edge case and the first time I've encountered this need. It's specific to this one project and unfortunately, necessary. All major web browsers and the http spec itself have supported 2048 character URLs for nearly two decades now. Hence I find it interesting that PW's character limit is 128. However, I don't know the reasons behind it, be it database or other efficiency related issues. Unfortunately segments won't solve my issue. I need the URL to be a direct match with the page title. I'll change the table varchar to 256 and see what happens - seems like it could be the culprit.
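For reference, the kind of schema change described above might be sketched as follows. This is an assumption based on a default ProcessWire install (where the pages table has a name column of varchar(128)); altering core tables is unsupported, and as the later posts show, core code may still truncate to 128 elsewhere.

```php
// Sketch only: widen the page name column on a default ProcessWire install.
// Back up the database first; core sanitizers may still truncate to 128.
$database->exec("ALTER TABLE pages MODIFY name VARCHAR(256) NOT NULL");
```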
  4. Ok, as a workaround I created a script that uses $sanitizer->pageName, where I can set the maxLength option to 256. While this works, it doesn't solve the problem of my client creating pages and having correct URLs. I guess I can always hook into the page save and rewrite the name, but that seems ridiculous. There has to be some easy way to tell ProcessWire to just allow 256-character names when saving new pages. Edit: I was wrong. I thought for a moment my script worked, but it didn't. Even using $sanitizer->pageName with maxLength set to 256 doesn't work. It gets overridden somewhere and truncated to 128.
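For anyone following along, the sanitizer call discussed here takes a max-length argument, and the "hook into the page save" idea mentioned above could be sketched roughly as below (in ready.php, so the ProcessWire API variables are in scope). This is an untested sketch; per the later posts in this thread, the core limit may still truncate the name regardless.

```php
// Sketch: sanitize a long title into a page name, allowing up to 256 chars.
$name = $sanitizer->pageName($title, false, 256);

// Untested workaround idea: re-apply the longer name just before save.
// Per the thread, the core 128-char limit may still win out.
$wire->addHookBefore('Pages::saveReady', function(HookEvent $event) {
    $page = $event->arguments(0);
    $name = $event->wire('sanitizer')->pageName($page->title, false, 256);
    if($name && $name !== $page->name) $page->name = $name;
});
```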
  5. Yikes! This is a major issue for me. Any ideas about how to work around this? A character limit of 128 on page names seems kind of messed up. I wonder what Ryan's logic is behind this decision. At least relaxing it to 256 would surely be acceptable?
  6. Ouch, ok NOT solved. The PW version I'm running is 3.0.131, much newer than 3.0.86 that addressed this issue. What am I missing here? Do I need to configure something somewhere to allow more than the default 128 characters?
  7. The initial install was a couple months ago, so it seems like an update is in order! Thanks for the heads up dragan!
  8. I have a new project that uses extremely long page URLs. I need to bump the 128-character limit to 256. Does anyone know how to do this? As it is now, when creating a new page with a really long title (i.e. more than 128 characters), the URL is getting truncated to 128 characters. Thanks.
  9. Wow, WireHttp is perfect! How have I built a zillion PW sites without ever knowing about WireHttp? Sheesh. Thanks again, Horst.
  10. Very interesting. While I don't think I need the .htaccess part, the second half of your response is probably doable in some fashion. Something like this: Site B tries to pull the "optimized" image from a publicly accessible URL. If it can't get it, it triggers a script on site A to create the optimized image and then tries pulling it again. I'll probably have to use curl in order to properly get the 404 callback. I'll report back once I've rigged something up and tried it. Thanks Horst.
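The pull-then-trigger flow described above could also be sketched with WireHttp rather than raw curl, since WireHttp exposes the last HTTP status via getHttpCode(). The URLs and the trigger endpoint below are hypothetical placeholders, not anything from site A:

```php
// Sketch: try to fetch the optimized image from site A; on 404, ask site A
// to generate it, then retry once. All URLs here are hypothetical examples.
$http = new WireHttp();
$data = $http->get('https://site-a.example.com/optimized/photo.jpg');
if($http->getHttpCode() == 404) {
    // hypothetical endpoint on site A that creates the optimized variant
    $http->get('https://site-a.example.com/generate/?file=photo.jpg');
    $data = $http->get('https://site-a.example.com/optimized/photo.jpg');
}
```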
  11. Looking for some advice. On site A, I have Ryan's ServicePages module installed and am serving data as JSON. On site B, I'm retrieving the data by passing a page ID. This works great for all the data, but for images it is pulling the full image that's stored in the field for a given page. On site A I store full-size images, as I use them in a variety of ways and use the API to resize them. However, when using ServicePages and pulling the data to site B, I don't have an opportunity to resize first, so it's pulling 6 MB and 8 MB files over every time site B displays the data. I of course resize the images once they get to site B, but it still has to pull that initial full file over. I need the resize to happen on site A first. So a few thoughts:
      1) I could build a caching system on site B that uses cron to execute a script to pull the images, resize them and store them. I'd rather not do this if I can help it and would prefer a solution that is self-contained on site A.
      2) I could use the API on site A to create a compressed version of each file that is publicly accessible via HTTP. I would write a script to go through the thousands of pages I have and create these HTTP-accessible versions, then set up a cron job to fire off a script that creates these images on a regular basis to account for newly added pages. Not an ideal solution.
      3) I could hack the ServicePages module and somehow resize images before serving the JSON filename.
      4) Is the ServicePages module hookable? That would be preferable to #3 above.
      Any other thoughts? Some angle or idea I'm not seeing? Thanks for your time looking at this.
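Option 2 above, pre-generating publicly accessible variants on site A, might be sketched like this. The selector and target width are placeholder examples; the real pieces are ProcessWire's $image->width(), which creates (or reuses) a cached variation on disk, and httpUrl, which is the URL site B could then fetch.

```php
// Sketch for option 2: walk pages on site A and pre-generate resized
// variations so site B can pull small files. Selector/width are examples.
foreach($pages->find("template=gallery, images.count>0") as $p) {
    foreach($p->images as $image) {
        $thumb = $image->width(800); // creates/reuses a cached variation
        // $thumb->httpUrl is the publicly accessible URL site B can fetch
    }
}
```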
  12. Brilliant! In the end my code was more or less along the lines of what d'Hinnisdael posted above. Thanks everyone for chipping in and resolving this issue.
  13. Yeah, first thing I did was strip the conditional back. It appears that if I'm logged in as a superuser, your ready.php code works (as expected). If I'm logged in as any other role, a generic error is thrown and pages in the admin don't work. Below is the code I'm using. Note, I also tried removing the line that appends check_access=0 and it still throws an error (some issue with re-assigning the event arguments, which makes no sense to me).

      $this->addHookBefore('Pages::find', function(HookEvent $event) {
          $selector = $event->arguments(0);
          $selector .= ', check_access=0';
          $event->arguments(0, $selector);
      });

      Will look again tomorrow. Thanks Adrian.
  14. Couldn't get this to work on my first attempt. Out of time tonight, but will try another crack at it in the morning. Thanks for the github link as well.
  15. Ok, I can confirm the following:
      1. Adrian's advice above about check_access=0 works, provided you're doing #2 below.
      2. Custom selectors, and therefore the solution from #1 above, will NOT work using page list or autocomplete. You need to use a good old-fashioned select dropdown.
      Adrian, in your example above, how do you think you were able to circumvent issue #2 that d'Hinnisdael and I seem to be experiencing? Was your screenshot from a superuser or non-superuser role?
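For readers landing on this thread, the working ready.php pattern being confirmed above appends to the selector rather than replacing it. It is shown unconditionally here for brevity; in practice you would restore the conditional mentioned earlier in the thread so the hook only relaxes access checks where intended. And per point #2 above, it works with a plain select field, not the page list or autocomplete inputs.

```php
// ready.php sketch: append check_access=0 so non-superusers can find
// the pages. Scope this with a conditional in real use.
$wire->addHookBefore('Pages::find', function(HookEvent $event) {
    $selector = $event->arguments(0);
    $selector .= ', check_access=0';
    $event->arguments(0, $selector);
});
```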