
netcarver

PW-Moderators
  • Posts: 2,236
  • Joined
  • Last visited
  • Days Won: 47

Everything posted by netcarver

  1. Ok, it just required a small tweak to the regex that finds the H1 element in the support files (which get added to that page) and demotes the H1s to styled H2s. In ProFields' case, the H1 is not on the first line of the readme file, something I was never expecting, so we ended up with multiple H1s in the DOM and the associated styling confusion. Version 0.11.3 should fix it for you.
  2. Are you running the SessionDB Handler module?
  3. Thanks for this, Chris!
  4. Not fact-checked, but ChatGPT suggests this as a history of short tags in PHP... Does this fit with what you are seeing, given the version of PHP you are running?
  5. Also, what version of PW are you running?
  6. Is www/site/modules readable and traversable? (Either world-readable (r) and traversable (x), or readable by the user/group you run PHP as on that server.) Are there actually module files in the site/modules subdirectories? (i.e. does /home/cinemed/www/site/modules/ProcessHannaCode/ have a readable file called ProcessHannaCode.module?) Any errors at the end of www/site/assets/logs/errors.txt or www/site/assets/exceptions.txt?
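     If it helps, the same checks can be scripted from the CLI. This is just a sketch; `checkModuleDir()` is a hypothetical helper of mine, and the path is the example from the post, so adjust it to your install.

     ```php
     <?php
     // Report permission/content problems for a modules directory.
     function checkModuleDir(string $dir): array {
         $problems = [];
         if (!is_dir($dir)) {
             $problems[] = "$dir is not a directory";
         } elseif (!is_readable($dir)) {
             $problems[] = "$dir is not readable";
         } elseif (!is_executable($dir)) {
             $problems[] = "$dir is not traversable (no x bit)";
         }
         return $problems;
     }

     $root   = '/home/cinemed/www/site/modules'; // path from the post; change this
     $report = checkModuleDir($root);
     foreach (glob("$root/*", GLOB_ONLYDIR) ?: [] as $sub) {
         $report = array_merge($report, checkModuleDir($sub));
         // Does the subdirectory actually contain a .module file?
         if (!glob("$sub/*.module")) {
             $report[] = "$sub has no .module file";
         }
     }
     print_r($report);
     ```

     Run it as the same user PHP runs as on the server, otherwise the permission results won't match what the web server sees.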
  7. In this case, I'd probably just stick to bootstrapping PW from the cronjob script and then using parallel curl. I was just throwing out some options in case you wanted to explore using something specifically written for async IO like this.
  8. If you are running a CLI-based cronjob for this, then you could take a different approach and try out Framework-X, as it's built on ReactPHP specifically for this kind of task. Or take a look at ReactPHP's HTTP client class if you want to do something more low-level.
  9. BTW, if you do go for this approach, I'd also consider setting a timeout on each connection, so the whole thing definitely finishes within your 60-second limit: curl_setopt($handle, CURLOPT_CONNECTTIMEOUT, 5);
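     Note that CURLOPT_CONNECTTIMEOUT only caps the connect phase; a host that accepts the connection and then stalls can still hang the transfer. A sketch combining it with CURLOPT_TIMEOUT (the numbers here are illustrative, not recommendations):

     ```php
     <?php
     // Cap both the connect phase and the total transfer time so one
     // stalled host cannot push the batch past the 60-second cron window.
     $handle = curl_init('https://www.example.com/');
     curl_setopt($handle, CURLOPT_CONNECTTIMEOUT, 5); // seconds to establish the connection
     curl_setopt($handle, CURLOPT_TIMEOUT, 10);       // seconds for the entire request
     curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
     ```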
  10. Sure, but just to be clear, I meant with the parallel curl approach :)
  11. I don't see why you couldn't monitor 100 URLs every 60s, as long as the machine you are using turns over the sockets fast enough (and has high enough limits on how many open handles it can have, etc.). Your server might already have high enough resource limits to allow it, but if it doesn't, then ask ChatGPT about things like decreasing TCP connection recycling times (TIME_WAIT) and about the Linux max-open-files limit to see how to raise them.
  12. Thanks @FireWire Coming back to PHP, Brent and Roman do an interesting monthly review of PHP things...
  13. I know it's not native PW, but you can use parallel curl for this kind of thing:

     ```php
     $urls = [
         'https://www.example.com/',
         'https://www.google.com/',
         'https://www.github.com/',
     ];

     $handles = [];
     $multi_handle = curl_multi_init();

     foreach ($urls as $url) {
         $handle = curl_init();
         curl_setopt($handle, CURLOPT_URL, $url);
         curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
         curl_multi_add_handle($multi_handle, $handle);
         $handles[] = $handle;
     }

     // Drive all transfers; curl_multi_select() waits for activity
     // instead of busy-looping.
     $running = null;
     do {
         curl_multi_exec($multi_handle, $running);
         if ($running) {
             curl_multi_select($multi_handle);
         }
     } while ($running);

     foreach ($handles as $handle) {
         $result = curl_multi_getcontent($handle);
         echo $result;
         curl_multi_remove_handle($multi_handle, $handle);
         curl_close($handle);
     }

     curl_multi_close($multi_handle);
     ```
  14. Ok, are you running mod_security on the server?
  15. Some things to check first:
     1. Check the value of the upload_max_filesize directive in your PHP configuration file (php.ini). If it is set to 2.6MB or lower, it could be limiting the upload size; increase it to allow larger file uploads. After making changes to `php.ini`, restart your web server for the changes to take effect.
     2. Check the post_max_size directive; it determines the maximum amount of data that can be sent in a POST request. If the uploaded file exceeds this limit, it can result in incomplete or truncated uploads. Make sure this is set to a size greater than the maximum file size you want to allow.

     After that, if it still doesn't work, you could try looking at the web server configuration. In Apache, you may need to adjust the `LimitRequestBody` directive in your server configuration to allow larger file uploads.
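     To see which limit is actually in effect, here's a small stdlib-only sketch; the `toBytes()` helper is mine (PHP ini values use K/M/G shorthand, so the raw strings can't be compared directly):

     ```php
     <?php
     // Convert a php.ini shorthand size ("2M", "8K", "1G", "512") to bytes.
     function toBytes(string $val): int {
         $val  = trim($val);
         $unit = strtoupper(substr($val, -1));
         $num  = (int) $val;
         return match ($unit) {
             'G' => $num * 1024 ** 3,
             'M' => $num * 1024 ** 2,
             'K' => $num * 1024,
             default => (int) $val,
         };
     }

     $upload = ini_get('upload_max_filesize');
     $post   = ini_get('post_max_size');
     echo "upload_max_filesize = $upload\n";
     echo "post_max_size       = $post\n";
     if (toBytes($post) < toBytes($upload)) {
         echo "Warning: post_max_size is smaller than upload_max_filesize\n";
     }
     ```

     Drop it in a template or run it from the CLI; just remember the CLI often loads a different php.ini than the web server does.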
  16. Runtime-only fields and/or the dashboard module may also be helpful.
  17. I think @bernhard might be on the right track here. If you've changed the charset for the password table from the default, that would likely break things.
  18. Maybe 1) could have a configuration option?
  19. Looks like you might be saving some unvalidated data in the $currentUser there. I'm not familiar enough with Padloper to know if it handles pre-validating posted data like $input->post->email - but if it doesn't you might be leaving yourself open to stored XSS or an email header injection depending on how that field is used later in the code.
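     A minimal plain-PHP sketch of that pre-validation; inside ProcessWire the idiomatic call would be $sanitizer->email(), and `cleanEmail()` here is just a hypothetical stand-alone helper:

     ```php
     <?php
     // Validate an email before storing it. filter_var() rejects values
     // containing CR/LF, which also blocks header-injection payloads.
     function cleanEmail(?string $raw): string {
         $email = filter_var((string) $raw, FILTER_VALIDATE_EMAIL);
         return $email === false ? '' : $email;
     }
     ```

     Then something like `$currentUser->email = cleanEmail($input->post->email);` stores either a valid address or an empty string, never raw user input.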
  20. And if you switch back to the files session handler can you log in again?
  21. @zx80 There are actually several modules for this in the PW Asset Catalog, but you could always take the purple pill option (am I the only one who wanted a purple pill in The Matrix?) and create a textformatter module that people could hook up to their textfields.
  22. If you don't use phpMyAdmin, simply relying on Adminer as part of Tracy Debugger on your sites, you can use the `--omit-containers=dba` flag as well to remove that container from your setup...

     alias ddpw='ddev config --php-version=8.1 --database=mysql:8.0 --webserver-type=apache-fpm --omit-containers=dba --timezone=UTC'

     You can also install Adminer in its own container if you prefer to run that beside your PW container...

     alias ddmore='ddev get ddev/ddev-adminer && ddev restart && ddev describe'
  23. Log in to your server via SSH and navigate to the site/assets/logs/ directory and look at the end of the errors.txt and exceptions.txt files. You should see information there about what could be happening on the server.
  24. Simplest way to test is to edit the root .htaccess in your PW install directory, comment out the X-Frame-Options line (just start the line with a hash character '#'), save the .htaccess file, then clear the cache and reload the page in your browser. If it works, then this is the issue, and you'll either need to add an exception to the .htaccess to allow frame loading from www.domain.com, or re-add the line and fix it a different way by ensuring the source page and the iframe both load from domain.com (or both from www.domain.com).
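     For reference, a sketch of what that could look like in the .htaccess (exact directives depend on your Apache setup and PW version, and `domain.com` / `www.domain.com` are the placeholder hosts from this thread):

     ```apache
     # For testing only: comment out the existing header line
     # Header always append X-Frame-Options SAMEORIGIN

     # One possible permanent fix: replace X-Frame-Options with a CSP
     # frame-ancestors directive that allows both hosts
     Header set Content-Security-Policy "frame-ancestors 'self' https://domain.com https://www.domain.com"
     ```

     A cleaner long-term fix is usually a 301 redirect so all traffic lands on one canonical host in the first place.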
  25. I know it's a little counter-intuitive, but domain.com !== www.domain.com, so they are not the same origin. Yet it sounds like your server headers are explicitly telling the browser not to load the iframe unless it comes from the same origin.