Everything posted by formulate

  1. Totally agree with this. I imagine the most common use cases would call for Decimal rather than Float. Given the extra confusion around Float and how the numbers get manipulated (which isn't readily explained anywhere that I could see), it definitely seems like Float should not be the option that's installed by default.
  2. Oh, just figured out there's a "Decimal" fieldtype in the core that isn't installed by default. This looks like it will do the trick. Thanks everyone.
  3. Ok, so this is getting over my head a bit. My numbers are not scientific notation, I just want a way in PW to be able to specify a 7 or 8 digit number, possibly with 2 decimal places and have it not manipulated in any way. I guess I could just use a text field, but that defeats the purpose.
  4. Created a new float field and it is converting any digits past the sixth to zero. Example: I enter 123456789 into the field and after saving the number is 123456000. I have checked and ensured the character limit is set to its default "10". I also tried "0" and "999". None of that worked. PW version 3.0.184. SOLUTION: use the "Decimal" fieldtype that is built into the core. It's not installed by default; you need to enable it in Modules > Core.
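For anyone hitting the same thing, a likely explanation (my assumption, illustrated in plain PHP rather than anything PW-specific): a MySQL FLOAT column is a 32-bit single-precision float, which keeps only about 7 significant digits, so larger integers get rounded. The same loss can be simulated by round-tripping through `pack()`/`unpack()`:

```php
// Simulate storing a number in a 32-bit single-precision float, as a MySQL
// FLOAT column does: only ~7 significant digits survive the round trip.
$entered = 123456789;
$stored  = unpack('f', pack('f', $entered))[1]; // simulate 32-bit storage
echo $stored . "\n";                            // close to, but not exactly, 123456789
var_dump($stored == $entered);                  // bool(false)
```

This is why switching to the Decimal fieldtype (backed by an exact DECIMAL column) fixes it.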
  5. I did get it working! It took A LOT of time and money, but I eventually got it working. Their documentation is both terrible and misleading. I'll shoot you a message and we can pick this up in email (it'll be easier that way).
  6. Yes I've tried it with and without $raw_output. 'website' really is the API username.
  7. I'm at the end of my rope! I've been trying to use a 3rd-party API that requires authorization. I've tried a zillion different ways in both cURL and WireHttp. cURL always tells me the signature is wrong; WireHttp returns 401 Unauthorized. Here's the brief documentation for authorization with the API: https://integrate.spektrix.com/docs/authentication Here's the latest version of my WireHttp code:

```php
$date = date(DATE_RFC1123);
$string = "GET\n/sidwilliams/api/v3/customers/I-KS22-363P/\n$date";
$string = utf8_encode($string);
$key = base64_decode('XXXX');
$signature = base64_encode(hash_hmac('sha1', $string, $key));
$http = new WireHttp();
$http->setHeader('Host', 'system.spektrix.com');
$http->setHeader('Date', $date);
$http->setHeader('Authorization', 'SpektrixAPI3 website:' . $signature);
$response = $http->get('https://system.spektrix.com/sidwilliams/api/v3/customers/I-KS22-363P/');
if($response === false) {
    echo $http->getError();
} else {
    echo $response;
}
```

If anyone is interested in seeing my cURL, I can post that too. Thanks!
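One detail worth calling out (the poster mentions having tried both forms, but for reference): `hash_hmac()` returns lowercase hex by default, while HMAC schemes like this one conventionally base64-encode the *raw binary* digest, which requires passing `true` as the fourth argument. A self-contained sketch with the placeholder key from the post:

```php
// hash_hmac() returns 40 hex chars by default; passing true as the 4th
// argument yields the raw 20-byte SHA-1 digest, which is what normally gets
// base64-encoded into the Authorization header. 'XXXX' is a placeholder key.
$date = date(DATE_RFC1123);
$string = "GET\n/sidwilliams/api/v3/customers/I-KS22-363P/\n$date";
$key = base64_decode('XXXX');                  // placeholder secret from the post
$hex = hash_hmac('sha1', $string, $key);       // 40 hex chars (default)
$raw = hash_hmac('sha1', $string, $key, true); // 20 raw bytes
$signature = base64_encode($raw);              // 28-char base64 string
echo $signature . "\n";
```

Whether this matches what Spektrix expects is an assumption on my part; the shape of the two `hash_hmac()` return values is not.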
  8. Thanks to both of you for the feedback. Maybe I'm approaching this wrong. All I'm storing is timestamps. There's a hierarchical organization of pages and, within these, a need to store timestamps. I considered JSON at a lower sub-level of pages, but even then the JSON would exceed MySQL's character limit for the text field. The JSON would also become time-consuming to process and work with. Is there a better way of storing millions of timestamps that I'm not thinking of?
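One common pattern for this (a hedged sketch, not something from this thread: the table and column names are hypothetical) is to keep the timestamps out of the pages system entirely and put them in a dedicated, indexed table, accessed through ProcessWire's PDO layer:

```php
// Hedged sketch: store millions of timestamps in a dedicated table rather
// than as pages or JSON. Table/column names here are hypothetical; $somePage
// stands in for whatever PW page owns the timestamps.
$database = wire('database'); // WireDatabasePDO (a PDO wrapper)
$database->exec("
  CREATE TABLE IF NOT EXISTS timestamps (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT,
    pages_id INT UNSIGNED NOT NULL,   -- owning PW page
    ts TIMESTAMP NOT NULL,
    PRIMARY KEY (id),
    KEY pages_id_ts (pages_id, ts)
  ) ENGINE=InnoDB
");
$stmt = $database->prepare('INSERT INTO timestamps (pages_id, ts) VALUES (?, NOW())');
$stmt->execute([$somePage->id]);
```

A row per timestamp with a composite index keeps both the page tree and any single text field small, while still scaling to millions of rows.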
  9. I'm developing a web app in PW and have a lot of PW experience. However, this particular project will be very large scale and I have some questions. The "app" will be creating approximately 3,000 pages per day at launch and continue to grow, with an expected 300,000 pages per day after a few months. As you can see, even after half a year I will be over 100 million pages. Frankly, this seems ridiculous, but it's the case.
     1. Can ProcessWire even handle this?
     2. Does it just come down to server capabilities?
     3. Should I consider trying to break this down into separate multiple databases?
     4. Alternatively, instead of pages, should I look at using fields, maybe storing JSON in a text field? This would reduce the number of pages to fewer than 100 per day, even after half a year. I presume the database itself would still get very large.
     Thoughts? Thanks.
  10. No restrictions on the address (I've experimented a lot with the module restrictions) and I have it whitelisted in the spam filter. It's weird that from that same address some emails work fine and some don't; I would expect it to be all or nothing. At any rate, I may work on switching the module over from the Flourish library to the newer, better-maintained Fetch library. Another approach I'm toying with is to have the email address pipe to a script on the server that bootstraps PW's index.php and simply creates the page. No need to mess around with checking an email account, etc.
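The pipe-to-script idea above can be sketched roughly like this (my own sketch, not code from the thread: the install path, template name and parent path are all hypothetical, and real code would parse the MIME message rather than store it raw):

```php
#!/usr/bin/php
<?php
// Hedged sketch: the MTA pipes the raw email message to this script's stdin;
// the script bootstraps ProcessWire and creates a page from it.
$raw = file_get_contents('php://stdin');      // full raw message from the MTA
require '/var/www/site/index.php';            // bootstrap ProcessWire (hypothetical path)
$p = new Page();
$p->template = 'email-item';                  // hypothetical template
$p->parent = wire('pages')->get('/emails/');  // hypothetical parent page
$p->title = 'Piped email ' . date('Y-m-d H:i:s');
$p->body = wire('sanitizer')->textarea($raw); // naive: store the raw message
$p->save();
```

This sidesteps polling a mailbox entirely, which avoids the Outlook-specific fetching problem above.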
  11. Rather than take over, I'd probably just do something from scratch. However, I'm extremely short on time at the moment. Heck, I don't even have time to be trying to debug this issue my client is having. Embarrassingly, at the moment I'm just processing his emails by hand as it's faster for me to just manually input his 2 or 3 emails a week than it is to try and fix this thing. Obviously not a long-term solution, but something I can afford to do with my limited time.
  12. Not sure if this is dead or if anyone is still working on it. I have a problem where emails sent from Outlook only work about 20% of the time. The other 80% the module just leaves sitting on the mail server untouched. I can't seem to find any reason why. If I send the exact same email from a non-outlook client, the module picks it up no problem. Any ideas why this may be the case?
  13. Adrian, thanks for your time and input on this. Definitely easier to have Ryan relax the limit. Robin, that is a clever solution with URL segments. While I don't normally like workarounds like this, it may be the only solution at this point.
  14. Unfortunately this didn't work. Name field is now varchar(256) and the name is still being truncated at some point after $sanitizer->pageName happens. I also changed InputfieldName to 256 and ensured InputfieldPageName was also 256. Still doesn't work.
  15. Teppo, you are of course right. Thanks for your diplomatic reply and for placing some perspective on this. This is a very edge case and the first time I've encountered this need. It's specific to this one project and, unfortunately, necessary. All major web browsers and the HTTP spec itself have supported 2048-character URLs for nearly two decades now, so I find it interesting that PW's character limit is 128. However, I don't know the reasons behind it, be they database or other efficiency-related issues. Unfortunately segments won't solve my issue: I need the URL to be a direct match with the page title. I'll change the table varchar to 256 and see what happens; seems like it could be the culprit.
  16. Ok, as a workaround I created a script that uses $sanitizer->pageName where I can set the maxlength option to 256. While this works, it doesn't solve the problem of my client creating pages and having correct URLs. I guess I could always hook into the page save and rewrite the name, but that seems ridiculous. There has to be some easy way to tell ProcessWire to just allow 256-character names when saving new pages. Ok, I was wrong. I thought for a moment my script worked, but it didn't. Even using $sanitizer->pageName with the maxlength set to 256 doesn't work. It gets overridden somewhere and truncated to 128.
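For reference, the "hook into the page save" idea looks roughly like this (a minimal sketch, assuming the underlying pages.name column and the inputfield limits have already been raised to 256; per this thread, the hard-coded 128 limit elsewhere is the real blocker, so the hook alone is not enough):

```php
// Hedged sketch: regenerate a longer page name from the title on save.
// Sanitizer::pageName($value, $beautify, $maxLength) caps length per call,
// but other 128-char limits in PW can still truncate the result.
wire('pages')->addHookAfter('saveReady', function(HookEvent $event) {
    $page = $event->arguments(0);
    if($page->isNew() && $page->title) {
        $page->name = wire('sanitizer')->pageName($page->title, true, 256);
    }
});
```

As noted below, the cleaner fix was for Ryan to relax the limit in the core.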
  17. Yikes! This is a major issue for me. Any ideas about how to work around this? A character limit of 128 on page names seems kind of messed up. I wonder what Ryan's logic is behind this decision. At least relaxing it to 256 would surely be acceptable?
  18. Ouch, ok NOT solved. The PW version I'm running is 3.0.131, much newer than 3.0.86 that addressed this issue. What am I missing here? Do I need to configure something somewhere to allow more than the default 128 characters?
  19. The initial install was a couple months ago, so it seems like an update is in order! Thanks for the heads up dragan!
  20. I have a new project that uses extremely long page URLs. I need to bump the 128-character limit to 256. Does anyone know how to do this? As it is now, when creating a new page with a really long title (i.e. more than 128 characters), the URL is getting truncated to 128 characters. Thanks.
  21. Wow, WireHttp is perfect! How have I built a zillion PW sites without ever knowing about WireHttp? Sheesh. Thanks again Horst.
  22. Very interesting. While I don't think I need the .htaccess part, the second half of your response is probably doable in some fashion. Something like this: site B tries to pull the "optimized" image from a publicly accessible URL. If it can't get it, it triggers a script on site A to create the optimized image and then tries pulling it again. I'll probably have to use cURL in order to properly get the 404 back. I'll report back once I've rigged something up and tried it. Thanks Horst.
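The try/trigger/retry fallback described above can actually be sketched with WireHttp itself rather than cURL, since WireHttp exposes the last response code via getHttpCode(). A rough sketch (the site A URLs and the generator endpoint are hypothetical):

```php
// Hedged sketch of the fallback on site B: try the pre-optimized image on
// site A; on a 404, hit a (hypothetical) generator script on site A to
// create it, then retry once.
$http = new WireHttp();
$img = $http->get('https://site-a.example.com/optimized/photo.jpg');
if($http->getHttpCode() == 404) {
    $http->get('https://site-a.example.com/generate.php?file=photo.jpg'); // trigger resize on site A
    $img = $http->get('https://site-a.example.com/optimized/photo.jpg');  // retry
}
```

This keeps the image generation self-contained on site A, which was the stated preference.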
  23. Looking for some advice. On site A, I have Ryan's Service Pages module installed and am serving data as JSON. On site B, I'm retrieving the data by passing a page ID. This works great for all the data, but for images it is pulling the full image that's stored in the field for a given page. On site A I store full-size images, as I use them in a variety of ways and use the API to resize them. However, when using Service Pages and pulling the data to site B, I don't have the opportunity to resize first, so it's pulling 6 MB and 8 MB files over every time site B displays the data. I of course resize the images once they get to site B, but it still has to pull that initial full file over. I need the resize to happen on site A first. So a few thoughts...
     1. I could build a caching system on site B that uses cron to execute a script to pull the images, resize them and store them. I'd rather not do this if I can help it and would prefer a solution that is self-contained on site A.
     2. I could use the API on site A to create a compressed version of each file that is publicly accessible via HTTP. I would write a script to go through the thousands of pages I have and create these HTTP-accessible versions, then set up a cron job to create them on a regular basis to account for newly added pages. Not an ideal solution.
     3. I could hack the Service Pages module and somehow resize images before serving the JSON filename.
     4. Is the Service Pages module hookable? That would be preferable to #3 above.
     Any other thoughts? Some angle or idea I'm not seeing? Thanks for your time looking at this.
  24. Brilliant! In the end my code was more or less along the lines of what d'Hinnisdael posted above. Thanks everyone for chipping in and resolving this issue.
  25. Yeah, first thing I did was strip the conditional back. It appears that if I'm logged in as a superuser, your ready.php code works (as expected). If I'm logged in as any other role, a generic error is thrown and pages in the admin don't work. Below is the code I'm using. Note, I also tried removing the appending of the check_access=0 line and it still throws an error (some issue with re-assigning the event arguments, which makes no sense to me).

```php
$this->addHookBefore('Pages::find', function(HookEvent $event) {
    $selector = $event->arguments(0);
    $selector .= ', check_access=0'; // append with ".="; a plain "=" would discard the original selector
    $event->arguments(0, $selector);
});
```

Will look again tomorrow. Thanks Adrian.