benbyf

Members
  • Posts

    796
  • Joined

  • Last visited

  • Days Won

    6

Posts posted by benbyf

  1. YESSSS thank you! Working now. I changed the .env PHP version from 8.1 to 7.4 and it works; it was erroring about something in the AIOM module, but I'm sure it would have failed on something else too. I'd love to get some sites on 8.* at some point though, so I might revisit.

    Thanks for your help!

    • Like 1
  2. Just getting set up with Docker and Devilbox after lots of chat about it in the recent "share your setups" topic.

    I probably foolishly am starting from a duplicate of an existing site, having copied the files into the data/sitename/htdocs/ folder and imported the db using phpMyAdmin, but I'm getting 500 errors. I seem to be able to access .txt, .html etc. files OK, but my ProcessWire files return a title then an internal server error which I can't find the logs for... any help would be appreciated. (I also don't know if it's an issue that I created the folder and copied the files on my Mac and not in the container...?)
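In case it helps anyone with the same setup: a first thing worth trying is enabling ProcessWire's debug mode so the 500 shows an actual error message rather than a blank page. A minimal sketch using standard PW config (dev environments only):

```php
<?php namespace ProcessWire;

// site/config.php -- turn on debug mode so fatal errors and exceptions
// are displayed in the browser instead of a bare 500 response.
// Never leave this enabled on a production site.
$config->debug = true;
```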

     

  3. On 1/28/2022 at 11:04 PM, FireWire said:
    • URL Hooks - Use case - We rolled out a Web API so external platforms can make RESTful JSON requests to the site at dedicated endpoints. The syntax resembles application frameworks, which made development really enjoyable and productive. The code is organized separately from the templates directory and allowed for clean separation of responsibilities without dummy pages or having to enable URL segments on the root page. Also allowed for easily building pure endpoints to receive form submissions.
    • Page Classes - My usage - This was a gamechanger. Removing business logic from templates (only loops, variables, and if statements allowed) and using an OOP approach has been fantastic. Not sure if many people are using this, but it's made the code much more DRY, predictable, and well organized. Implementing custom rendering methods in DefaultPage allowed for easy "componentizing" of common elements (video galleries, page previews, forms, etc.) so that they are rendered from one source. Helped achieve no HTML in PHP and no PHP in HTML (with the exceptions above). Also allows for using things like PHP Traits to share behavior between specific Page Classes. I completely fell in love all over again with PW over this and now I couldn't live without it. This literally restructured the entire site for the better.

    Holy smokes, how did I miss these two!!! So useful!!!! Thanks for sharing your setup. I'm defos going to get on the PageSpeed train too on DigitalOcean, and the 7G Firewall looks super useful too.
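For anyone curious what the URL hooks mentioned above look like in practice, here's a minimal sketch (the feature landed in ProcessWire 3.0.173). The /api/status path and the payload are made up for illustration; the hook would typically live in site/ready.php or site/init.php:

```php
<?php namespace ProcessWire;

// site/ready.php -- hypothetical JSON endpoint via a PW URL hook.
// The '/api/status' path and the payload are illustrative only.
$wire->addHook('/api/status', function(HookEvent $event) {
    // Returning an array from a URL hook renders it as JSON
    return [
        'ok'   => true,
        'time' => date('c'),
    ];
});
```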

    • Like 3
  4. 2 hours ago, bernhard said:

    thx! I'm thinking about it, but I think not for the current version. RockMigrations has evolved over the last years and it's time to refactor and clean up, and that would be a good opportunity.

    What I'd like to know of you guys is why everybody seems to be so excited about YAML? Is it about the YAML thing or is it about the recorder? I'm asking because YAML would really be a drawback IMHO. What I think would be much better is to have a regular PHP file that returns a simple array. That's almost the same as a YAML file but you can do additional things if you want or need. See this example:

    <?php namespace ProcessWire;
    /** @var RockMigrations $rm */
    return [
      'fields' => [
        'foo' => [
          'type' => $wire->languages ? 'textlanguage' : 'text',
        ],
      ],
      'templates' => [
        'demo' => [
          'tags' => RockMails::tags,
      'fields' => [
            'title',
            'foo',
          ],
        ],
      ],
    ];

    That migration file would add a "foo" field and depending on the system make this field single or multi-language. That would be just one example which would not be possible using YAML syntax. Another thing is using class constants. Possible in PHP (see RockMails::tags in the example above), not possible in YAML.

    I'm really curious what it is really about, thx.

    I'm guessing it's any file format that shows changes to the system that can be committed to another system. PW uses JSON for its template and field export/import... so I'm guessing it really doesn't matter... but (kind of my original point about this module) migrations, in my mind, are about reflecting changes, whereas RockMigrations goes a step further and basically sticks all config in code. I would prefer using PW for config (opening the door to less technical admins, for example), and having those changes migratable like templates and fields but in a way that is less cumbersome. The fact you can code up your config isn't really migration to me; the migration is because of changes, wherever the changes happen.

    • Like 1
  5. 5 minutes ago, elabx said:

    Well in this scenario I was talking about a particular project that is more "app like". But the rest of sites I manage, which might have specific features regarding template/fields, different templates that are for one specific website/design, I handle them the same way, repo and deployment through pipelines. 

    I'd love to hear about what other things you've tried if you'd be keen on sharing. I've seen some folks move to Statamic/Kirby for this purpose.

    Thinking about this, I've thought a "migration recorder" built on field save hooks would be neat. Like, you're doing your work in PW thinking about the awesome features you wanna build, building repeaters/matrix, changing field widths, show conditions, all those things we love. Then you go into Setup > Migrations and you find all the work you just did on your repeater field recorded. Just a thought for one part of the whole problem. EDIT: I just read about the YAML thing Craft does

     

    They had a similar thing I used in Drupal 7 (back in the day): you would edit the Drupal config in the admin and it would track changes, which could then be exported and imported, or committed to git and executed on your live server, either by pressing a button or by adding a hook.
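The recorder idea could probably be sketched with a hook, something like the below. The log path and JSON format are invented for illustration, though Fields::save and getExportData() are real ProcessWire API:

```php
<?php namespace ProcessWire;

// site/ready.php -- sketch of the "migration recorder" idea:
// whenever a field is saved in the admin, append its export data
// to a log that could later be committed and replayed elsewhere.
// The log location and JSON shape here are invented for illustration.
$wire->addHookAfter('Fields::save', function(HookEvent $event) {
    $field = $event->arguments(0); // the Field that was just saved
    $entry = [
        'date'  => date('c'),
        'field' => $field->name,
        'data'  => $field->getExportData(), // PW's built-in field export
    ];
    file_put_contents(
        $event->wire()->config->paths->logs . 'field-changes.json',
        json_encode($entry) . "\n",
        FILE_APPEND | LOCK_EX
    );
});
```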

    • Like 2
  6. Happy New Year to you too!

    Thank you so much for sharing your setups. This has been an excellent thread, and it's nice to see some people in a similar boat to me (and those preferring as little setup as possible)... Love that quote about "Real Programmers Don't Use Frameworks..." I feel like I subscribe to this, but without the expertise to actually pull it off

    I might look again at Docker / Vagrant / VirtualBox and see if I can find a good local workflow on my M1 Mac (for reference).

    I've now got so many links to check out; will report back if I hit any revelations. Please continue to share your setups, as they may be useful for other peeps coming fresh to PW too.

    • Like 1
  7. I usually use:

    • IDE: visual studio code
    • Server: hosting with DigitalOcean droplets administered with Serverpilot
    • Staging: I edit over FTPS (I know!) on the dev server, then either copy or git to a live server.
    • PW Modules: always use AIOM, ProCache, RockMigrations (looking into it), ManageFiles, ProcessRedirects
    • Like 5
  8. Hi Everyone!

    I feel I'm falling behind and becoming a "dinosaur" with how I work on the web, so in the hope of making myself feel better and hopefully learning from others, please share how you usually work on projects and the tools you might use.

    Think: IDEs, servers and hosting, command line tools, PW modules that help with development, etc...

    • Like 2
  9. ok, thanks @bernhard... but... three things I guess, and two of them may sound stupid, so here goes...

    1.  I work on a remote server, so this simply won't work for me (as far as I can tell)... on small sites I tend to have a dev site on my dev server and roll out to live when it's done or close to it; my IDE just sees the files I'm editing.
    2. I seem to be drawn to the array/JSON-esque way of using the migrate function instead of each individual function, so again the IDE won't help here, I don't think?
    3. Less stupid I hope, but my IDE doesn't give me YOUR preferred usage, things left TODO / roadmap so we can pitch in, or other ways to interact with the module... which in this case seems like there are lots of use cases and places to use it.

     

  10. I believe the heat levels are an average or max... not sure, you'd have to read the info. I was provided the data, but Centric Lab will know the ins and outs of how it was derived.

    • Like 1
  11. 13 minutes ago, dab said:

    Fantastic site, great job

    Ditto. Some random postcodes threw an error, e.g. IV42 8YD, IV41 8WZ (selected off the web to compare against my home postcode).

    Fixed, so it shouldn't 500 now; it'll just let you know that the postcode couldn't be found.

    • Like 1
  12. https://right-to-know.org/

    A website to check your local area's stressors, e.g. air, noise and light pollution etc. Available for postcodes in the UK. I did quite a bit of database work to host postcode lookup on the site, and it caches any API results for the stressors on-site for a fast second query. We also chose against any tracking on the site (Google Analytics etc.) and only track the number of total queries and the number of requests to each specific postcode.

    [screenshot: right-to-know.org]

    • Like 8
  13. New module: OneTimeOnlyCode  https://github.com/benbyford/OneTimeOnlyCode

    Use this module to create codes with URLs (or any other string data) and later check a code and its data, e.g. to grant one-time access to content.

    I built this to send specific "exclusive" users codes at the end of URLs to new articles where normally they would have to log in -> now they can see the content only once and then have to log in if they navigate away. I made it only to show the content, not any user profile information etc., when using the code; be warned not to use it to show user information unless you know what you're doing!

    Codes can have an expiry, and the module uses LazyCron to cull out-of-date codes at a set interval (both the checking interval and the elapsed interval for the codes).

    Check it out!

    • Like 11
  14. 16 minutes ago, Jan Romero said:

    So are you looking for hidden pages or unpublished pages? Because for pages that have never been published, this may be faster:

    $pages->find('published=, include=all');

    The selector status=unpublished will turn into SQL as where pages.status & 2048. Now I don't know what optimizations MySQL can do there, but I suppose it will still have to look at a bunch of values and see if they match, whereas published=, include=all turns into pages.published IS NULL, so it should be a matter of returning a continuous range of rows from that index. Even better if you only want the ID; then it should never even touch the actual table.

    I'm looking for published pages that are hidden, but I would love to hear if there is a faster query to return those pages than $pages->find("{myField}=1, status=hidden") for sure!

  15. (if it is indeed the thing slowing things down) I think my issue was I was using

    $pages->find("{myField}=1, include=hidden")

    which i've now changed to

    $pages->find("{myField}=1, status=hidden")

    I'm hoping this helps. I've also turned down the frequency of the cron job... Although there may be thousands of pages, there should only really be a couple of hidden ones, so I need to find those as fast as possible.
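Following on from Jan Romero's point about only ever needing the IDs: a hedged sketch using $pages->findIDs(), which returns plain IDs without loading full Page objects (myField is the field name used in the posts above):

```php
<?php namespace ProcessWire;

// Sketch: when only IDs are needed (e.g. for a cron job counting or
// queueing pages), findIDs() avoids loading full Page objects.
// "myField" is the checkbox field from the posts above.
$ids = $pages->findIDs("myField=1, status=hidden");

foreach ($ids as $id) {
    // Load individual pages lazily, only if/when actually needed
    $page = $pages->get($id);
    // ... process $page ...
}
```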
