

Popular Content

Showing content with the highest reputation on 03/05/2022 in all areas

  1. As we continue to work towards the next master version, this week I've been working on fixing reported issues. A new $sanitizer->words() method was also added which reduces a string to contain just words without punctuation and such. It was added in part to work on an issue reported with the tags feature in the field editor, but should be handy for other cases as well. As part of that update, the existing $sanitizer->word() (singular) method was re-written to support the features needed for the new words() plural method. This week I've also been working on a pull request from Bernhard that enables greater customization of AdminThemeUikit by way of custom render files and hooks. I'm bringing in that PR and it has a lot of good ideas that have inspired some related updates to it. I've got a bit more work and testing to do before committing, but that part should be ready early next week, along with more core updates. Thanks for reading and have a good weekend!
    11 points
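Based on the description above, a minimal sketch of how the new plural sanitizer might be used in a template context (the input string is illustrative, and no specific options are assumed):

```php
// Reduce user-entered tag text to plain words, stripping
// punctuation before it is stored (input string is illustrative).
$raw  = 'php, mysql & "front-end" dev!';
$tags = $sanitizer->words($raw);
// $tags now contains just the words from $raw without the
// punctuation; handy anywhere only word characters are wanted.
```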
  2. ProFields Table doesn't provide this as a feature. I requested it in 2015 and it has been on the "todo" list since then, with occasional re-requests from other users. Topic in the Pro support forum: Would be great if anyone who wants the feature could remind Ryan about it. Until the feature is part of the official ProFields Table, here is a module that does the job... Table Column Required Adds a "required" option for columns within ProFields Table. Saving a page with an empty required column will alert the user via error messages and highlight the relevant table cells. But it doesn't implement any non-default action set at Template > Advanced > "Required field action". Paginated tables are supported. Screenshots Page Edit Field config https://github.com/Toutouwai/TableColumnRequired
    4 points
  3. The module can generate basic ICS calendar strings and files.

Usage example:

$icsgen = wire()->modules->IcsGenerator;

// set properties
$icsgen->setArray(array(
    'date'        => new \DateTime('2033-12-24 12:00'),
    'dateEnd'     => new \DateTime('2033-12-24 13:00'),
    'summary'     => 'Event title',
    'description' => 'Event description',
));

// get path to a temporary .ics file
// (using wire()->files->tempDir)
$icspath = $icsgen->getFile();

// send email with ics file
$mail = wireMail();
$mail->attachment($icspath, 'calendar.ics');
$mail->to($user->email);
$mail->subject('ICS Demo');
$mail->body('This is an ICS demo.');
$numSent = $mail->send();

For more info see the GitHub readme or the modules page. If you experience reproducible issues, please open a GitHub issue.
    2 points
  4. Padloper 2 has received a number of updates, pushing it closer to a production release.

Stripe
Finished the Stripe payment gateway. It is based on the latest Stripe Payment Intents + Stripe Elements. The Stripe Elements widget is fully configurable (UI). I will be updating the docs about this. You can test this now in the demo site. Make sure to read 'about' first here. If upgrading, there are a number of simple steps (actually just one simple step): just create a page titled Stripe in the admin under the payment gateways parent page. Currently, it is not possible to create a payment gateway using the GUI.

Shipping Rate Selection
If more than one shipping rate is matched, the checkout form will now present the customer with a form to select their preferred rate (e.g. express - €5, standard - €2, etc.). [I have just noticed a bug when using the inbuilt form with this; I'll fix it asap.] You can test this by adding this product to the basket and selecting Kenya as the shipping country.

Variants
The demo site and the starter site have been updated to show how to handle products with variants: adding to basket, checkout, etc. You can test by adding this product or this Padloper [fake] product to the basket. Pay using Stripe and you'll even get to download Padloper! OK, the file's fake, obviously.

Reports
The view and functionality are now ready. Powered by htmx. Hoping to create a demo video of this and other backend views soon.

Downloads
This was not ready in the last release. It is now. Test with the 'Padloper' product linked to above. If upgrading, you will need to install the related Fieldtype + add it to the download template. I'll write about this separately.

Bug Fixes
Fixed a number of bugs.

Docs
I've updated the docs in some places.

New Requirement
I have added a new requirement for the PHP BC Math extension. Currently, Padloper will not work without it. Otherwise we get never-ending pennies/cents rounding errors. Whole libraries have been created just for this one issue; just Google it. BC Math solves it for us.

Pending
Manual order creation: this has been difficult, but I'm getting close to finishing it up. Documentation: especially the most pertinent parts. Some minor bug fixes. Some PayPal idempotency + rounding issues! If anyone knows how to pass amounts as pennies/cents to PayPal instead of whole currency units, please let me know. It used to work with the old API; I haven't been able to find how to do this in the latest API and checkout SDK.

Production Release
My plan is to release a stable version in March 2022. Beta testing has finished. Thank you all who've participated.

Edit: making the demo site pretty and performant on smaller screens is still on my todo list. It's not urgent though. Thanks.
    2 points
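The rounding problem BC Math solves can be seen in plain PHP: binary floats cannot represent most decimal fractions exactly, while the bcmath functions work on decimal strings at a fixed scale. (The example values are my own, not from Padloper.)

```php
// Binary floating point drifts with decimal currency values:
var_dump(0.1 + 0.2 == 0.3);        // bool(false)

// BC Math operates on decimal strings at a fixed scale (here 2),
// so cents never drift:
$total = bcadd('0.10', '0.20', 2); // "0.30"
$line  = bcmul('19.99', '3', 2);   // "59.97"
```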
  5. Should have some free time this weekend to work on these, hopefully that'll get us somewhere.
    2 points
  6. Subscribe to a podcast RSS feed and save it as whatever you want. The additional example module ProcessPodcastSubscriptionsEpisodes creates a new page per episode. Download/Install GitHub: https://github.com/neuerituale/ProcessPodcastSubscriptions Module directory: https://processwire.com/modules/process-podcast-subscriptions/ Composer: composer require nr/processpodcastsubscriptions
    1 point
  7. Do you output the SVG in an img tag? You probably need to output it as inline SVG, directly in the HTML, with something like: echo file_get_contents($page->image->filename); Edit: I'd recommend an extra image field called something like "svg" that only allows the SVG format. You don't want JPGs ending up inline like that. Also, you probably don't need the XML declaration in the SVG.
    1 point
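Putting the advice above together, a template snippet might look roughly like this. The field name svg and the regex for dropping the XML declaration are my own assumptions, not from the post:

```php
// Inline the SVG markup so CSS/JS can target its elements;
// an <img> tag would keep the SVG in a separate document.
$markup = file_get_contents($page->svg->filename);

// Drop the XML declaration, which is unnecessary once the
// SVG is embedded in an HTML document:
$markup = preg_replace('/<\?xml.*?\?>\s*/s', '', $markup);

echo $markup;
```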
  8. I’ll put the question to the developer of the generating app. Thanks again Adrian!
    1 point
  9. Is this for the front-end? If yes, you can write your own login routine that checks against the user data (an additional field denoting the selected client). Once verified, redirect to the desired page. Since you don't want users to register themselves, you can add a link to your login page (similar to "contact us") where potential clients can request an account. The request is emailed to the admin for processing. Just a thought.
    1 point
  10. According to https://github.com/ryancramerdesign/FileValidatorSvgSanitizer there should be a log at Setup > Logs > File-validator, does it say anything?
    1 point
  11. @rjgamer I'll check it. Thank you @adrian, the update also contains your change; feel free to ping me if something is broken. --- @rjgamer I checked it on PHP 7.4, 8, and 8.1 and I don't get any errors; the package is created after invoking cron.php. When you can, please update to v1.4.18 and file an issue if the problem persists.
    1 point
  12. I am getting errors when using cron.php. Can you check/test it? I am on my phone and cannot access the error message properly. Thanks.
    1 point
  13. Thanks @flydev - I've been using 1.4.16 for quite some time (with some sqldump command changes) and it's been working great. I just updated to 1.4.17 and all looks great running on PHP 8.0.
    1 point
  14. 1 point
  15. March 2022 UPDATE The module got updated; the current stable version is ^1.4.17. The module can be updated through ProcessUpgrade or from GitHub. All the updates and fixes made by @Autofahrn are now in the master branch, and some minor fixes were also pushed. The main feature of this update, besides all the fixes, is the native backup mode, which uses mysqldump if it is available and enabled in the Duplicator options. I have not tested it personally on PHP 8, but @rjgamer made some updates to fix notices and warnings. A test backup of 2 GB was successfully created in 169 seconds. duplicator.mp4
    1 point
  16. As @Robin S suggests, you can try mysqldump in a script; it might be available from the admin panel. You can also try the Duplicator module (I just updated the master version) in either web mode or native mode (which uses mysqldump) to see if it works. In the past we got quite good results on limited shared hosts.
    1 point
  17. First, it's probably best not to try to host such a site on shared hosting; if the site is that popular, the client should be able to afford better hosting. You could try using mysqldump in a cron job and see if that works. Example:

mysqldump -u'YOUR_DATABASE_USERNAME' -p'YOUR_DATABASE_PASSWORD' YOUR_DATABASE_NAME > /home/YOUR_ACCOUNT/public_html/site/assets/backups/database/YOUR_DATABASE_NAME_`date +\%Y\%m\%d_\%H\%M`.sql
    1 point
  18. @flydev Merci beaucoup (also for your wonderful module!) Eventually I found the problem: it was a missing trailing slash in the URL segment, even though the setting was configured to accept either. Thanks to all of you for the support!
    1 point
  19. I could be wrong, but what I think @wbmnfktr is looking for is a **standardized** ProcessWire API across all ProcessWire installations that is on by default? This could be beneficial by allowing:

- Third-party services to integrate easily with ProcessWire. Something like zapier.com could build a ProcessWire connector that consumes the API to allow no-code workflows connecting different systems and services together.
- A site aggregator website that could consume other ProcessWire websites' APIs and report back the details, for example which sites need module or ProcessWire updates. Something like https://sitedash.app/ for MODX.
- Static site generators to consume the API and build a fast static website that can be hosted on a global CDN.
- A single-page application built with Vue.js/React.js/React Native, etc. that could replace the ProcessWire admin. I think https://www.sanity.io/ can do this? Everything is fully decoupled. Why would you want a different admin? What if you wanted to build a native mobile app to administer your ProcessWire site?
- Admin components that consume the API for different admin experiences. WordPress uses the API for their new Block Editor: https://developer.wordpress.org/block-editor/

Sure, stuff like sitedash.app can be built right now with ProcessWire, but services like zapier.com and others aren't going to spend time building an API connector if it isn't included in the ProcessWire core and isn't standardized. I agree with flydev: there are other things to consider as well, like issuing API tokens, content throttling, API versioning, providing data in formats other than JSON and interfaces other than REST (like GraphQL), webhooks, and autogenerated API documentation like https://swagger.io/. https://api-platform.com/ covers a lot of these topics. https://strapi.io/ does a good job with some of these things, like issuing tokens for integrating third-party clients.

Thanks everyone for posting solutions that could work. I enjoyed reading and watching the many different ways you can do things with and without modules. Thanks @flydev for the AppAPI demo. Thanks @bernhard for showing/creating the RockHeadless module and demo; dang, you're fast. I like how you demonstrate that you can also expose the children of certain pages to the API. That is another aspect that has to be considered, since ProcessWire is different from most bucket-based CMSs: ProcessWire is tree-based, built around hierarchy.
    1 point
  20. @cpx3 give a read to @MoritzLost's article: https://processwire.dev/performance-optimizations-for-processwire/#server-side-caching-template-cache-and-procache and there https://github.com/MoritzLost/CachePlaceholders - #motivation-and-breakdown
    1 point
  21. That map is very nice. I’ll keep Leaflet in mind for my next improvement to my app. Thank you, as I see how some of that code you posted would be beneficial. Turns out MapMarker can’t be used in FormBuilder, which I didn’t realize. I wanted to use it for capturing coordinates as entries were saved. So instead, I ended up using a PHP script that I tailored to my needs. In PW, I queried and saved all the addresses (right now about 10–12, eventually up to 30) to an array. I then passed that array to Google’s GeoCoder API and let it do the hard work (parsing the address) instead of me having to figure out latitude and longitude. It works well and loads fast. I’m not a developer but thanks to helpful answers by everyone in this community, I’m able to learn as I go.
    1 point
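The batch-geocoding step described above might look roughly like this. This is only a sketch against Google's Geocoding API (the endpoint and JSON response shape are per Google's documentation); YOUR_API_KEY is a placeholder, error handling is minimal, and allow_url_fopen must be enabled for file_get_contents() to fetch URLs:

```php
// Look up coordinates for a list of addresses via Google's
// Geocoding API, so latitude/longitude need not be entered by hand.
$addresses = array('1600 Amphitheatre Pkwy, Mountain View, CA');
$coords = array();
foreach ($addresses as $address) {
    $url = 'https://maps.googleapis.com/maps/api/geocode/json'
         . '?address=' . urlencode($address)
         . '&key=YOUR_API_KEY';
    $data = json_decode(file_get_contents($url), true);
    $loc  = $data['results'][0]['geometry']['location'] ?? null;
    if ($loc) $coords[$address] = $loc; // ['lat' => ..., 'lng' => ...]
}
```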
  22. Thanks @wbmnfktr. The trick is saving the multi-page reference field after any "add". I was confused by the "add" command; I thought it saved the page too.
    1 point
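The trick described above (adding to a multi-page reference field does not persist anything until the page or field is saved) might look like this in the ProcessWire API; the field name clients and the variable $clientPage are hypothetical examples:

```php
// Adding an item to a multi-page reference field only changes the
// in-memory value; the page must be saved for it to persist.
$page->of(false);                 // turn off output formatting before changing values
$page->clients->add($clientPage); // add() alone does not save anything
$page->save('clients');           // persist just this one field
```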
  23. Wow, that's a powerful script. I am going to use it. I had been running a poor man's backup with these lines as cron jobs:

0 0 * * * mysqldump -uusername -ppassword databasename | gzip > /var/www/backups/sql/sql_$(date +\%m-\%d-\%Y).sql.gz
0 0 * * * cd /var/www/backups/files && zip -r files_$(date +\%m-\%d-\%Y).zip /var/www/docroot/site/assets/files
0 0 * * * find /var/www/backups/sql/ -mtime +7 -delete
0 0 * * * find /var/www/backups/files/ -mtime +7 -delete

The first two back up the database and the files folder at midnight. The last two remove backups older than a week.
    1 point
  24. Let me explain it quickly. Your original bash script needs to be modified whenever you want to back up a different database, the same database on a different server, or change the damn path. I really hate that; it takes me more than 30 seconds. That is why I wrote this one to automate it. The purpose of this script is to let you enter all the needed information in the shell (I also added history to the script, so next time, if you are still using the same credentials, just press up or down to navigate through them) instead of opening the script in vim and changing the values one by one. Of course it is slower if you have only one server and only one database to back up. But if you have more than one server or database, I'd suggest you use my script. Good news: I kept your workflow, so no worries at all. Cheers!
    1 point
  25. I added an interactive mode to your script; hope it helps someone as lazy as me.

#!/bin/bash
#----------------------------------------------
# INTERACTIVE REMOTE DATABASE DUMP SCRIPT
#----------------------------------------------
# This work is licensed under a Creative Commons
# Attribution-ShareAlike 3.0 Unported License;
# see http://creativecommons.org/licenses/by-sa/3.0/
# for more information.
#----------------------------------------------

SCRIPT=${0##*/}
IFS=$'\n'
HISTFILE="$HOME/.remotedump.history"

# Use colors, but only if connected to a terminal, and that terminal supports them.
if which tput >/dev/null 2>&1; then
    ncolors=$(tput colors)
fi
if [ -t 1 ] && [ -n "$ncolors" ] && [ "$ncolors" -ge 8 ]; then
    RED="$(tput setaf 1)"
    GREEN="$(tput setaf 2)"
    YELLOW="$(tput setaf 3)"
    BLUE="$(tput setaf 4)"
    BOLD="$(tput bold)"
    NORMAL="$(tput sgr0)"
else
    RED=""
    GREEN=""
    YELLOW=""
    BLUE=""
    BOLD=""
    NORMAL=""
fi

# Case-insensitive regex matching
shopt -s nocasematch

# Prepare history (enable command history in a non-interactive shell)
set -o history
history -c
history -r

# Input method: text
get_input() {
    read -e -p "${BLUE}$1${NORMAL}" "$2"
    history -s "${!2}"
}

# Input method: password (input hidden)
get_input_pw() {
    read -s -p "${BLUE}$1${NORMAL}" "$2"
    history -s "${!2}"
}

# Echo in bold
echo_b() {
    if [ "$1" = "-e" ]; then
        echo -e "${BOLD}$2${NORMAL}"
    else
        echo "${BOLD}$1${NORMAL}"
    fi
}

# Echo in colour
echo_c() {
    case "$1" in
        red    | r | -red    | -r | --red    | --r ) echo "${RED}$2${NORMAL}" ;;
        green  | g | -green  | -g | --green  | --g ) echo "${GREEN}$2${NORMAL}" ;;
        blue   | b | -blue   | -b | --blue   | --b ) echo "${BLUE}$2${NORMAL}" ;;
        yellow | y | -yellow | -y | --yellow | --y ) echo "${YELLOW}$2${NORMAL}" ;;
        * ) echo "${BOLD}$2${NORMAL}" ;;
    esac
}

# Prompt for a value until it is non-empty (skipped if already set)
require_input() {
    local prompt="$1" var="$2"
    while [ -z "${!var}" ]; do
        get_input "$prompt" "$var"
    done
}

# Same, but with hidden input for passwords
require_input_pw() {
    local prompt="$1" var="$2"
    while [ -z "${!var}" ]; do
        get_input_pw "$prompt" "$var"
        echo ""
    done
}

# Get input data and save to history
save_input() {
    require_input    "Local DB Directory > "      local_dir
    require_input    "SSH Username > "            remote_user
    require_input    "SSH Aliases/IP-address > "  remote_ip
    require_input    "Remote Backup Directory > " remote_dir
    require_input    "DB Username > "             db_user
    require_input_pw "DB Password > "             db_password
    require_input    "DB Name > "                 db_name
    # persist the entered values for the next run
    history -w "$HISTFILE"
}

change_pwd_rsync() {
    ## CD INTO LOCAL WORKING DIRECTORY
    ## this is where I keep my local dump SQL files.
    ## the most recent one is always named dump.sql
    cd "$local_dir"

    ## RSYNC LATEST DUMP.SQL FILE TO REMOTE SERVER
    rsync -avzP dump.sql "$remote_user@$remote_ip:$remote_dir"
    wait
}

remote_dump() {
    ## SSH INTO SERVER
    ssh "$remote_user@$remote_ip" /bin/bash << EOF
echo "**************************"
echo "** Connected to remote. **"
echo "**************************"
echo ""

## CD INTO REMOTE WORKING NON-PUBLIC DIRECTORY
## where the dump.sql file was rsynced to
cd "$remote_dir"
wait
sleep 1

## RUN MYSQLDUMP COMMAND
## save the SQL with date stamp
mysqldump --host=localhost --user=$db_user --password=$db_password $db_name > `date +%Y-%m-%d`.sql
echo "***************************************"
echo "** `date +%Y-%m-%d`.sql has been created. **"
echo "***************************************"
echo ""
wait
sleep 1

## IMPORT DUMP.SQL COMMAND
mysql --host=localhost --user=$db_user --password=$db_password $db_name < dump.sql
echo "*********************************"
echo "** DUMP.SQL has been imported. **"
echo "*********************************"
echo ""
wait
sleep 1

## REMOVE DUMP.SQL FILE
rm dump.sql
echo "********************************"
echo "** DUMP.SQL has been removed. **"
echo "********************************"
exit
EOF
}

main() {
    save_input
    change_pwd_rsync
    remote_dump
}

main
    1 point