formulate

Members

  • Content Count: 122
  • Joined
  • Last visited

Community Reputation: 45 Excellent

About formulate

  • Rank: Sr. Member

Profile Information

  • Gender: Not Telling
  • Location: Canada

Recent Profile Visitors: 1,894 profile views
  1. No restrictions on the address (I've experimented a lot with the module restrictions) and I have it whitelisted in the spam filter. It's weird that some emails from that same address work fine and some don't; I would expect it to be all or nothing. At any rate, I may work on switching the module over from the Flourish library to the newer, better-maintained Fetch library. Another approach I'm toying with is to have the email address pipe to a script on the server that bootstraps PW's index.php and simply creates the page (a rough sketch follows after this list). No need to mess around with checking an email account, etc.
  2. Rather than take over, I'd probably just do something from scratch. However, I'm extremely short on time at the moment. Heck, I don't even have time to be trying to debug this issue my client is having. Embarrassingly, at the moment I'm just processing his emails by hand as it's faster for me to just manually input his 2 or 3 emails a week than it is to try and fix this thing. Obviously not a long-term solution, but something I can afford to do with my limited time.
  3. Not sure if this is dead or if anyone is still working on it. I have a problem where emails sent from Outlook only work about 20% of the time; the other 80% of the time, the module just leaves them sitting on the mail server untouched. I can't find any reason why. If I send the exact same email from a non-Outlook client, the module picks it up no problem. Any ideas why this might be the case?
  4. Adrian, thanks for your time and input on this. Definitely easier to have Ryan relax the limit. Robin, that's a clever solution with URL segments. While I don't normally like workarounds like this, it may be the only solution at this point.
  5. Unfortunately this didn't work. The name field is now varchar(256) and the name is still being truncated at some point after $sanitizer->pageName happens. I also changed InputfieldName to 256 and ensured InputfieldPageName was also 256. Still doesn't work.
  6. Teppo, you are of course right. Thanks for your diplomatic reply and for putting some perspective on this. This is very much an edge case and the first time I've encountered this need; it's specific to this one project and, unfortunately, necessary. All major web browsers, and the HTTP spec itself, have supported 2048-character URLs for nearly two decades now, so I find it interesting that PW's limit is 128 characters. However, I don't know the reasons behind it, whether for database or other efficiency-related issues. Unfortunately segments won't solve my issue: I need the URL to be a direct match with the page title. I'll change the table varchar to 256 and see what happens; it seems like it could be the culprit.
  7. Ok, as a workaround I created a script that uses $sanitizer->pageName, where I can set the maxLength option to 256. While this works, it doesn't solve the problem of my client creating pages and getting correct URLs. I guess I could always hook into the page save and rewrite the name (a rough sketch follows after this list), but that seems ridiculous; there has to be some easy way to tell ProcessWire to allow 256-character names when saving new pages. Ok, I was wrong. I thought for a moment my script worked, but it didn't. Even using $sanitizer->pageName with maxLength set to 256 doesn't work; it gets overridden somewhere and truncated to 128.
  8. Yikes! This is a major issue for me. Any ideas about how to work around this? A 128-character limit on page names seems kind of messed up. I wonder what Ryan's logic is behind this decision. Surely relaxing it to at least 256 would be acceptable?
  9. Ouch, ok, NOT solved. The PW version I'm running is 3.0.131, much newer than the 3.0.86 that addressed this issue. What am I missing here? Do I need to configure something somewhere to allow more than the default 128 characters?
  10. The initial install was a couple of months ago, so it seems like an update is in order! Thanks for the heads-up, dragan!
  11. I have a new project that uses extremely long page URLs, and I need to bump the 128-character limit to 256. Does anyone know how to do this? As it is now, when creating a new page with a really long title (i.e. more than 128 characters), the URL gets truncated to 128 characters. Thanks.
  12. Wow, WireHttp is perfect! How have I built a zillion PW sites without ever knowing about WireHttp? Sheesh. Thanks again, Horst. (A minimal usage sketch follows after this list.)
  13. Very interesting. While I don't think I need the .htaccess part, the second half of your response is probably doable in some fashion. Something like this: Site B tries to pull the "optimized" image from a publicly accessible URL; if it can't get it, it triggers a script on Site A to create the optimized image and then tries pulling it again (see the sketch after this list). I'll probably have to use curl to properly get the 404 back. I'll report back once I've rigged something up and tried it. Thanks, Horst.
  14. Looking for some advice. On site A, I have Ryan's Service Pages module installed and am serving data as JSON. On site B, I'm retrieving the data by passing a page ID. This works great for all the data, but for images it pulls the full image stored in the field for a given page. On site A I store full-size images, since I use them in a variety of ways and resize them through the API. But when using Service Pages to pull the data to site B, I have no opportunity to resize first, so it pulls 6 MB and 8 MB files over every time site B displays the data. I do resize the images once they get to site B, but it still has to pull that initial full-size file. I need the resize to happen on site A first. So a few thoughts:
      1) I could build a caching system on site B that uses cron to execute a script that pulls the images, resizes them and stores them. I'd rather not do this if I can help it and would prefer a solution that is self-contained on site A.
      2) I could use the API on site A to create a compressed version of each file that is publicly accessible via HTTP. I'd write a script to go through the thousands of pages I have and create these HTTP-accessible versions, plus a cron job to run regularly and cover newly added pages (a sketch of this follows after the list). Not an ideal solution.
      3) I could hack the Service Pages module and somehow resize images before serving the JSON filename.
      4) Is the Service Pages module hookable? That would be preferable to #3 above.
      Any other thoughts? Some angle or idea I'm not seeing? Thanks for your time looking at this.
  15. Brilliant! In the end my code was more or less along the lines of what d'Hinnisdael posted above. Thanks everyone for chipping in and resolving this issue.
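
Below is a minimal sketch of the pipe-to-script idea from post 1, assuming a hypothetical "email-post" template and "/emails/" parent page; the naive header parsing stands in for a real MIME parser, and the bootstrap path would need adjusting to the actual install:

```php
<?php namespace ProcessWire;

// Sketch only: the template name, parent path, and naive parsing below
// are placeholders, not the module's actual behavior.

$raw = file_get_contents('php://stdin');   // raw message from the MTA pipe

// Very naive parse: Subject header as title, text after the first blank
// line as body. A real script would use a proper MIME parser.
$parts   = preg_split("/\r?\n\r?\n/", $raw, 2);
$headers = $parts[0];
$body    = isset($parts[1]) ? $parts[1] : '';
$subject = preg_match('/^Subject:\s*(.+)$/mi', $headers, $m) ? trim($m[1]) : 'Untitled';

// Bootstrap ProcessWire (adjust the path to the actual install);
// after the require, the API is available via $wire.
require '/var/www/site/index.php';

$p = new Page();
$p->template = $wire->templates->get('email-post');  // hypothetical template
$p->parent   = $wire->pages->get('/emails/');        // hypothetical parent
$p->title    = $wire->sanitizer->text($subject);
$p->body     = $wire->sanitizer->textarea($body);
$p->save();
```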
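
For reference, here is a rough sketch of the two-part workaround discussed in posts 5-7: widening the database column and re-sanitizing the name on save. Per post 5, this combination still wasn't sufficient, since the core apparently truncates names to 128 characters elsewhere in the save pipeline; the hook would live in something like site/ready.php:

```php
<?php namespace ProcessWire;

// One-time column change (posts 5-6), shown as a comment only:
// ALTER TABLE pages MODIFY name VARCHAR(256) NOT NULL;

// Save-time hook (post 7's idea). Sanitizer::pageName() accepts a
// max-length value as its third argument.
$wire->addHookAfter('Pages::saveReady', function(HookEvent $event) {
    $page = $event->arguments(0);
    if(!$page->title) return;
    $page->name = $event->wire('sanitizer')->pageName($page->title, true, 256);
});
```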
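
For anyone landing here, a minimal WireHttp usage sketch related to post 12 (the URL is a placeholder):

```php
<?php namespace ProcessWire;

// Fetch JSON from a remote ProcessWire install; get() returns the
// response body as a string, or false on failure.
$http = new WireHttp();
$response = $http->get('https://site-a.example.com/service-pages/?id=1234');

if($response !== false) {
    $data = json_decode($response, true);  // Service Pages output is JSON
} else {
    $error = $http->getError();            // last error message, if any
}
```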
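
And a rough sketch of the fallback flow described in post 13, assuming hypothetical URLs for the optimized image and a generator endpoint on Site A; WireHttp's getHttpCode() gives the response status, avoiding the drop to raw curl:

```php
<?php namespace ProcessWire;

// Hypothetical URLs: an optimized image on Site A, and an endpoint
// that creates it on demand.
$http = new WireHttp();
$optimizedUrl = 'https://site-a.example.com/optimized/photo.jpg';

$img = $http->get($optimizedUrl);
if($http->getHttpCode() === 404) {
    // Ask Site A to build the optimized version, then retry once.
    $http->get('https://site-a.example.com/generate/?file=photo.jpg');
    $img = $http->get($optimizedUrl);
}
```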
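
Finally, a sketch of option 2 from post 14: a cron-driven script on site A that bootstraps ProcessWire and pre-builds web-size variants so site B only ever pulls small files. The template and field names are placeholders:

```php
<?php namespace ProcessWire;

// Cron-run script on site A. Adjust the bootstrap path, template,
// and image field name to the actual install.
require '/var/www/site-a/index.php';

// findMany() avoids loading thousands of pages into memory at once.
foreach($wire->pages->findMany('template=gallery, images.count>0') as $p) {
    foreach($p->images as $image) {
        // width() creates the variant file on first call and reuses it
        // afterwards, so re-running the script is cheap.
        $variant = $image->width(1200);
        // $variant->httpUrl is the public URL site B can pull
    }
}
```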