elabx

Everything posted by elabx

  1. Thanks crew for jumping in! I know this situation is really odd and I want to dive into what might be happening, because I reproduced the same behaviour both live and in my local ddev environment. I went back and forth between 3.0.215 and 3.0.228 to confirm; it also caused issues in other parts where certain results were expected, so I'll dig into those too. Actually, the issue is with 3.0.228! Is this the moment where I learn not to update/deploy on a Friday? haha Will definitely do some detective work and come back to report.
  2. Is anyone having issues with selectors on the last few updates? I just had to roll back a big site that was showing issues in multiple places. Unfortunately I don't have time right now to make a clean install to reproduce the scenarios, but I wanted to communicate in case someone else is having issues. For instance, I have this selector:

template=ad, parent=1082, ad_status=1, sort=-verified, sort=-created, city_id|cities=11669, created_users_id!=41, title!=test|Test

which worked just fine, but after updating, the same find operation returned 0 pages. I was able to debug this because the title condition in the selector was only applied for guest users, so that admin users could see the "Test" pages that were created, but guests could not. I had to change the selector to the following to make it work again:

template=ad, parent=1082, ad_status=1, sort=-verified, sort=-created, city_id|cities=11669, created_users_id!=41

Notice that I removed title!=test|Test. So you can imagine my confused face wondering how this could cause no pages to be found at all. The site is on 3.0.215, and an important detail might be that it is using multi-language.
  3. I think this might be related to: https://github.com/processwire/processwire-issues/issues/1611, which should be solved from at least 3.0.204 onward.
  4. You know what, I misread your issue. What could probably work is placing a hook on emailFormReady: https://processwire.com/store/form-builder/hooks/#where-to-place-hooks

```php
wire()->addHookBefore("FormBuilderProcessor::emailFormReady", function($e) {
    /** @var InputfieldForm $form */
    $form = $e->arguments(0);
    /** @var FormBuilderEmail $email */
    $email = $e->arguments(1);
    if($form->name !== "the_form_in_question") return;
    // form_page_id is a hidden field in the form that stores the page id
    $input = $form->getChildByName('form_page_id');
    if($input && $input->value) {
        $contact_email = wire('pages')->get((int) $input->value)->contact_email;
        $email->to = $contact_email;
    }
});
```

form_page_id would be a hidden field in the form where you save the page id. Not sure if there is a nicer way to do this, that is, without a hidden field to pass on the id info.
  5. I think you'd have to customize the autoresponder template from FormBuilder. Check the instructions in site/modules/FormBuilder/email-autoresponder.php, in the comments at the top of the file.
  6. I think that as long as the field keeps the Hanna Code text formatter, you shouldn't have any issues. About Hanna Code insertion in the field, take a look at this: https://github.com/BitPoet/InlineCompleteTinyMCE
  7. I'm also very interested in this answer! I feel I'll never leave compiling SCSS/Less in PHP, even if it means using an outdated subset of Less; it's just so convenient! I'm very sold on HTMX + AlpineJS, the main reason being it saves me from the build step.
  8. If you want to save the data using the Page object, I see no other way than this or the meta object. I do understand it can get troublesome to manage all the custom fields, but I've seen a few module authors handle it like this! I'm not sure my explanation will be completely correct, but I'll give it a shot haha

So if you take a look at the Templates class, you'll see that it actually extends WireSaveableItems, which is where persistence to the database is implemented. See how $some_template->save() (Template, no "s") actually calls the $templates->save() method (Templates = the global $templates). If you look at the actual database through phpMyAdmin or Adminer and check the templates table, you'll find one row per template and a column named data that holds the template config. This is all thanks to WireSaveableItems!

The Page object is different: it doesn't have a data column, and it doesn't implement persistence of fields like this. Instead, each field's Fieldtype implements its own persistence mechanism. So when you are adding that InputfieldCheckboxes, the Page doesn't know anything about it, and later, when you assign the Inputfield's value as a property of the Page object, the Page doesn't know anything about how it should be saved either; this is all delegated to classes decoupled from the Page class, the Fieldtypes. Check how the PagesEditor class ends up calling savePageField(), a method of the Fieldtype abstract class.

So when you add these properties dynamically, they are just set as regular properties. The process that saves the page doesn't know that this property is a field, since it's not in the template/fieldgroup assigned to the page, and there is no default "save to the data column of the pages table" behaviour.
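The delegation described above can be sketched with a small toy example. To be clear, this is not actual ProcessWire code: the ToyPage/ToyFieldtype classes below are hypothetical stand-ins that only illustrate why a dynamically added property never reaches the database, while fields registered with a fieldtype do.

```php
<?php
// Toy sketch of the delegation pattern: the "page" doesn't persist its
// own fields; each field type implements its own persistence.
interface ToyFieldtype {
    public function savePageField(array $pageData, string $name): string;
}

class ToyFieldtypeText implements ToyFieldtype {
    public function savePageField(array $pageData, string $name): string {
        // Imagine an INSERT into a field-specific table happening here
        return "saved '{$pageData[$name]}' to table field_{$name}";
    }
}

class ToyPage {
    public array $data = [];
    // The page only knows which fields (and fieldtypes) its template has
    public array $fieldtypes = [];

    public function set(string $name, $value): void {
        $this->data[$name] = $value; // just a property, no persistence
    }

    public function save(): array {
        $results = [];
        // Persistence is delegated: properties without a registered
        // fieldtype are silently skipped, like dynamic page properties
        foreach ($this->fieldtypes as $name => $ft) {
            if (array_key_exists($name, $this->data)) {
                $results[] = $ft->savePageField($this->data, $name);
            }
        }
        return $results;
    }
}

$page = new ToyPage();
$page->fieldtypes['title'] = new ToyFieldtypeText();
$page->set('title', 'Hello');
$page->set('menus', 'x');  // no fieldtype registered: not persisted
print_r($page->save());    // only 'title' gets "saved"
```

Running this shows only the title being persisted; the dynamically set menus property is simply ignored on save, which mirrors the behaviour discussed in this thread.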
  9. What if you use the meta() function? Then, when adding the field, check the stored data and set the field's checked state accordingly. On save, instead of saving to the page, use the meta() API.
  10. From what I understand, there is no "menus" field on the pages you are running this module on, so this won't get saved anywhere when you set it on the $page object. So yes, you can add properties to the page, but that doesn't mean the property represents a Field/Fieldtype within the Page, which is what holds the actual persistence mechanism of the fields. Why not create the "menus" field from the start? Maybe explain a bit further what you are trying to solve with the module?
  11. I very much agree it seems complicated! I am very new to composer packages myself, and this is just the first thing I figured out; ddev made it really easy to make it "ergonomic" for daily use. It's at least a way that makes sense to me haha, and I end up with one extra command (ddev composer-local) that just works when I want to work with my local versions, and the regular command to go back to live/latest versions.
  12. Yes! This actually has nothing to do with ProcessWire itself.

Regularly, you just want to load the composer dependencies and that's it. But what if you want to test a library you are developing, one that has its own dependencies defined through composer? Ideally, you want it installed into an existing local project, and you want to do this with composer itself. Composer of course has this in mind: path repositories support a symlink flag, so composer knows it should load a local copy where you can have the repository checked out on any branch.

```json
{
    "repositories": [
        {
            "type": "path",
            "url": "../../packages/*",
            "options": { "symlink": true }
        }
    ]
}
```

In my long gone and not missed non-ddev days this worked great with just having my repos checked out in a folder above my project's. I just had to include the local path as a repository, and composer symlinked the libraries from the upper packages directory. This wasn't perfect: while working on the local version, composer.json and composer.lock got edits that I didn't want in the main branch of the project's repo, so on every change I had to roll back the updates to these two files and then install again if I wanted to test "live" versions locally; more on this later.

With ddev this is not so straightforward, since the container's volume is isolated from the host system, so composer threw an error about not finding the path to that repo. Of course, there is no "../packages" in the container's volume! Hence we use the first config to mount a path that is reachable by the composer path configuration within the container. The second config, the .env variable for the path, is not strictly necessary, but I thought it would be nice to handle this kind of thing in a variable, since each developer might place their repositories anywhere in their filesystem.
.ddev/docker-compose.composer-packages.yaml:

```yaml
services:
  web:
    volumes:
      - "${COMPOSER_PACKAGE_DIRECTORY}:/packages"
```

.ddev/.env:

```
COMPOSER_PACKAGE_DIRECTORY="../packages"
```

And the composer.json configuration for the local repo; note that in this case the path is defined from the root, since that's where the volume is mounted by the previous instruction:

```json
{
    "repositories": [
        {
            "type": "path",
            "url": "/packages/*",
            "options": { "symlink": true }
        }
    ]
}
```

And this is like 80% of the problem solved! The shell script wrapped in the ddev command is a way to not mess with the regular lockfile, which is the one I use to deploy to production. It creates an "alt" composer.json and lockfile, so you can work in a different composer setup in parallel to your "production" one.
  13. Just fan-tastic! You can really see how instantaneous all the setup can be. Seriously nice extra bonus, the interview with Randy! Coincidentally, I've been seeing him lead the ddev contributors' training sessions. From the few I've watched, they've been really great and interesting.
  14. A bit off-topic, but I realized this exact same thing when teaching ProcessWire to someone: it can be a really confusing concept, since it's a PW-only frontend concept. I still use it, but lately I've been thinking of switching to a templating language.
  15. Sure mate, come back at it anytime!
  16. The "raw" ProcessWire API is a bit more verbose, but not really a showstopper. To be clear, by API I mean the whole set of classes involved in the admin and pages/fields, like Field, Fieldtype, Inputfield, InputfieldForm, InputfieldWrapper, etc., and the hooks involved, not just the global $page, $pages, $input variables. Say with RockMigrations you'd do:

```php
$rm->migrate([
    'fields' => [
        'some_field' => [
            'label' => 'Some Field',
            'type' => 'title',
        ],
    ],
]);
```

In "plain" PW API:

```php
if(!$fields->get('some_field')) {
    $field = new Field();
    $field->name = "some_field";
    $field->type = "FieldtypePageTitle";
    $field->label = "Some Field";
    $field->save();
}
```

And most of this doesn't really change much with any other field. But indeed, since ProFields tend to tackle complex problems, they might involve a few more steps.
  17. RockMigrations, FormBuilder, ProCache, I think all if not most ProFields. DDEV for local development.
  18. A bit off-topic: specifically in UIkit, I've overcome most of these issues by adding the classes along with the component attributes. For example, if you have a uk-dropdown or uk-drop component, add the uk-drop class too. The uk-drop class is the one with the display: none property, hence the issue of seeing the element before the JS has added the class to the component:

```css
.uk-drop {
    display: none;
    position: absolute;
    z-index: 1020;
    --uk-position-offset: 0px;
    --uk-position-viewport-offset: 0;
    box-sizing: border-box;
    width: 300px;
}
```

All assuming you load the CSS before the document.
  19. You can upgrade it, DM Ryan to get an invoice generated.
  20. It's been a long time since I gave support to this site, but last I knew, I got a notification that they closed their Shopify store.
  21. Ok, so here's what worked for me, for now, to develop composer packages locally while having them working in a ddev project:

Add a volume to the web service in ddev by adding the file .ddev/docker-compose.composer-packages.yaml:

```yaml
services:
  web:
    volumes:
      - "${COMPOSER_PACKAGE_DIRECTORY}:/packages"
```

I've also heard it's advised to add this as a whole separate service? Let me know if anyone thinks that is better! You can then define whatever path on your file system in .ddev/.env, such as:

```
COMPOSER_PACKAGE_DIRECTORY="../../packages"
```

Now you can just git clone your repos into the path you defined in the variable and run "ddev composer install"; if you have the following configuration in your composer.json, composer will symlink your libraries instead of downloading them:

```json
{
    "repositories": [
        {
            "type": "path",
            "url": "/packages/*",
            "options": { "symlink": true }
        }
    ]
}
```

Check that the url matches the container-side path of the volume defined in the yaml file.

I added the additional layer proposed by the original article that got me started, so that my composer.json stays "clean", without the local package path definition. I set up the script advised in the original article:

```bash
#!/usr/bin/env bash
command -v jq >/dev/null 2>&1 || { echo >&2 "jq is required to support local development of composer packages, please install using your OS package manager before continuing"; exit 1; }
jq -s ".[0] * .[1]" composer.json composer.local.json > composer.dev.json
COMPOSER=composer.dev.json php /usr/local/bin/composer "$@"
rm composer.dev.json
[ -e "composer.dev.lock" ] && rm composer.dev.lock
```

Save this as .ddev/commands/web/composer-local.

Create a composer.local.json to hold the local configuration. It will be merged with composer.json into a temporary composer.dev.json file, and the install runs against this merged combination, guarding your composer.json and composer.lock files from being edited.
Example I am using as composer.local.json:

```json
{
    "repositories": [
        {
            "type": "path",
            "url": "/packages/*",
            "options": { "symlink": true }
        }
    ]
}
```

If you comment out the lines that delete composer.dev.json, you'll see that the "repositories" property is overwritten by our custom definition. Then run:

ddev composer-local install

You should see your composer dependencies available as local packages linked to your repositories. You can just roll back with the regular composer install:

ddev composer install

I'm still not sure whether deleting the composer.dev.json and .lock files is really necessary, but leaving them around did help me figure out what was going on. Please test if you find this useful and let me know how it goes. Thanks everyone for your thoughts and the work put into this thread; I'm really enjoying migrating to ddev!
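For anyone curious what the jq -s ".[0] * .[1]" step in the script actually does: it performs a recursive merge of the two JSON documents where keys from the second file win, and arrays are replaced wholesale rather than merged (which is why the "repositories" value ends up fully overwritten). A rough PHP approximation of those semantics, using hypothetical stand-in data rather than real composer files (assumes PHP 8.1+ for array_is_list()):

```php
<?php
// Approximation of jq's ".[0] * .[1]" merge semantics:
// - keys from the second document win
// - nested objects merge recursively
// - arrays (lists) are replaced wholesale, not merged
function jq_style_merge(array $a, array $b): array {
    foreach ($b as $key => $value) {
        $bothObjects = isset($a[$key])
            && is_array($a[$key]) && is_array($value)
            && !array_is_list($a[$key]) && !array_is_list($value);
        $a[$key] = $bothObjects ? jq_style_merge($a[$key], $value) : $value;
    }
    return $a;
}

// Hypothetical stand-ins for composer.json and composer.local.json
$base  = json_decode('{"name":"acme/site","config":{"sort-packages":true},"repositories":[]}', true);
$local = json_decode('{"repositories":[{"type":"path","url":"/packages/*","options":{"symlink":true}}]}', true);

$merged = jq_style_merge($base, $local);
// "repositories" is taken entirely from the local file; everything
// else from the base file survives untouched
echo json_encode($merged, JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES);
```

This is only meant to make the merge behaviour visible; the actual script of course keeps using jq inside the container.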
  22. I wanna put this in a frame ❤️ haha
  23. Hi crew! Does anyone know how to make symlinks work for composer? Maybe there is an alias path within the ddev container? To do something like this: https://medium.com/pvtl/local-composer-package-development-47ac5c0e5bb4 EDIT: Ok, found something! https://carnage.github.io/2019/07/local-development EDIT 2: A simplified version, more in line with what I had been doing: https://adithya.dev/using-docker-for-spatie-laravel-package-development/ I haven't tried any of these yet, but will get back when I do.
  24. Not super sure if these have been brought up, but I'll lay out my experience so far. Problems solved for me, and what I liked the most:

It solves sharing the exact environment with other devs within the project itself. I just recently started working with someone I am also training and, to give an example, explaining how to set up aliases for PHP versions in MAMP Pro was a pain: running composer had to be done with absolute paths for the PHP version, the composer install itself too, etc. So even if trivial for an experienced dev, it does introduce friction for less experienced ones. With ddev it's all there right after ddev start, clean and isolated from every other project. Of course, maybe there was a cleaner way to do this and I just never got to know it; don't wanna trash MAMP Pro haha.

The whole command suite, but I specifically love the database import/export commands: no need to open phpMyAdmin anymore for a simple db import.

I really understand the value now of each project running separately. One example: I was running a database of a few GB in MAMP Pro's single installation, and it was a pain to turn it on and off whenever I had to update any php.ini configuration, since that brought the whole thing up and down, even if I was working on a completely different project than the big-database one. I also didn't love the phpMyAdmin screen full of ALL the databases from ALL the projects. It's been really easy to move everything to ddev!

It got me into development with Docker and the whole Docker topic in general, which wasn't something I was really into since I didn't "need" it. Now I am looking into how to leverage this to deploy production sites with containers. Maybe this could be a whole other topic for another talk? I am trying to move away from panels such as RunCloud/Ploi into a "cloud services" approach, and I have the feeling that Docker is crucial to this.
Pain points: probably the only pain point I've had is that on a laptop where I might have had Docker installed before, I ran the installation commands and got into this issue. But I'm not really sure how to give more input here, since I went into "delete all and reinstall" mode and it just worked. Also, on my main computer I decided to try Colima and realized that the container volumes were not the same! Maybe this is obvious if you are familiar with Docker. But overall these pain points seem to be very specific, "layer 8 cause" problems haha.