Beluga

Everything posted by Beluga

  1. See my previous comments regarding Leaflet 1.2.0. It needs some scripts included separately now, but there is still some mystery to be solved in the case of "single marker in map".
  2. I found this to be a great approach for using Grid today: https://www.smashingmagazine.com/2017/06/building-production-ready-css-grid-layout/ One caveat is that you naturally can't use "grid in grid" for navigation layout etc.
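The article's approach is essentially progressive enhancement with a feature query; a minimal sketch of the pattern (the selectors and sizes here are illustrative, not taken from the article):

```css
/* Flexbox fallback first; Grid-capable browsers take the @supports branch */
.gallery { display: flex; flex-wrap: wrap; }
.gallery > * { flex: 1 1 300px; }

@supports (display: grid) {
  .gallery {
    display: grid;
    grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));
    grid-gap: 1rem;
  }
  .gallery > * { flex: none; } /* neutralize the fallback sizing */
}
```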
  3. Another missing script (when I checked the JS console): <script src="https://unpkg.com/drmonty-leaflet-awesome-markers@2.0.2/js/leaflet.awesome-markers.js"></script> Now the markers appear in a map with multiple markers, but not in maps with a single marker. The console shows no error. Edit: I added a watcher to line 57 in MarkupLeafletMap.js (this.addMarker = function(lat, lng, url, title, extra) {) and stepped forward. The marker variable stayed blank.
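Collecting the pieces mentioned across these posts, a working Leaflet 1.2 setup would need includes along these lines (the leaflet-providers and awesome-markers script URLs are quoted from the posts; the core library and the two stylesheet links are my assumption of the matching files):

```html
<!-- Leaflet 1.2 core (CSS link assumed to match the JS version) -->
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.2.0/dist/leaflet.css">
<script src="https://unpkg.com/leaflet@1.2.0/dist/leaflet.js"></script>
<!-- Tile provider shortcuts, no longer bundled (from the posts above) -->
<script src="https://unpkg.com/leaflet-providers@1.1.17/leaflet-providers.js"></script>
<!-- Awesome Markers plugin (script from the posts above; CSS link assumed) -->
<link rel="stylesheet" href="https://unpkg.com/drmonty-leaflet-awesome-markers@2.0.2/css/leaflet.awesome-markers.css">
<script src="https://unpkg.com/drmonty-leaflet-awesome-markers@2.0.2/js/leaflet.awesome-markers.js"></script>
```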
  4. Turns out I had to add this script: <script src="https://unpkg.com/leaflet-providers@1.1.17/leaflet-providers.js"></script> Now the maps appear. Yet, the markers do not.
  5. With the newest module version, PW 3, Leaflet 1.2, I get this: jQuery.Deferred exception: L.tileLayer.provider is not a function jsMarkupLeafletMap Did they change something again?
  6. Ok, apparently slowest response to an email ever (3 years), but Microsoft has confirmed they are working on getting their Grid implementation updated in Edge.
  7. Now all major browsers except Edge ship with Grid Layout support. You can let Microsoft know you want it by voting in Uservoice. The votes do matter:
  8. If the details are not covered by an NDA, maybe you could share them publicly in this very topic
  9. Ping @benbyf - see the above discussion about your Subscribers module.
  10. Still with InnoDB, I tried with BCE. It only created 3660 pages out of 35k. I wonder why that is and if it's related to the "Connection to server lost" behavior.
  11. I realized tuning php-fpm is useless as it only helps when dealing with multiple users. I tuned MariaDB instead. I had created the PW install with InnoDB. Sadly, it took an equivalent amount of time to import with Ryan's module. Even emptying the trash with 35k pages took 30 minutes and I always get "Connection to server lost" while doing either import or trashing. It then does not allow me in the admin until it has finished the operation. Next I will try with BCE. Here is my override for my.cnf:
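The my.cnf override itself did not survive in this capture. For context, bulk-import tuning for InnoDB typically touches settings like these (values are illustrative guesses, not Beluga's actual file):

```ini
[mysqld]
# Give InnoDB a large working set and relax flush durability for bulk loads
innodb_buffer_pool_size        = 1G
innodb_log_file_size           = 256M
innodb_flush_log_at_trx_commit = 2
innodb_flush_method            = O_DIRECT
max_allowed_packet             = 64M
```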
  12. Ok, I set up a Docker dev environment with Caddy, php-fpm and MariaDB and now I could import the whole 35k rows with Ryan's module in about 2 hours. Still, that is much worse than Zeka's result. Maybe I have to tweak some php-fpm and MariaDB settings; I just don't know how exactly.
  13. Edit: solved using this gem of knowledge! I had to use "mariadb" as the DB host, because that is how I named (and linked) it in my compose file!

      I am running the PW installer. PW files are located on my host (Arch Linux), DB files as well. If I use localhost as the DB host, I get:

      SQLSTATE[HY000] [2002] Can't connect to local MySQL server through socket '/run/mysqld/mysqld.sock' (2 "No such file or directory")

      The solution given everywhere is to use 127.0.0.1 instead, because localhost makes it look at my host machine instead of the MariaDB container. But with 127.0.0.1 I get:

      SQLSTATE[HY000] [2003] Can't connect to MySQL server on '127.0.0.1' (111 "Connection refused")

      I have confirmed inside the database container (using sh) that the user and password are correct. The web server and database containers are linked. I am using https://github.com/abiosoft/caddy-docker/ and https://github.com/bianjp/docker-mariadb-alpine/

      I found some advice to create firewall rules, but they did not help:

      iptables -t filter -A INPUT -p tcp -i docker0 --dport 3306 -j ACCEPT
      iptables -t filter -A OUTPUT -p tcp -o docker0 --dport 3306 -j ACCEPT

      I also found advice to add bind-address = 0.0.0.0 to my.cnf so it listens on all interfaces. No help. What to try next? Here is my docker-compose.yml:
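The compose file is likewise missing from this capture, but the fix described (using the service name "mariadb" as the DB host) implies a layout roughly like this sketch (image tags, volumes and credentials are assumptions based on the two linked repos):

```yaml
version: '2'
services:
  caddy:
    image: abiosoft/caddy:php
    ports:
      - "80:80"
    volumes:
      - ./site:/srv
    links:
      - mariadb        # the link name doubles as the DB hostname
  mariadb:
    image: bianjp/mariadb-alpine
    environment:
      MYSQL_ROOT_PASSWORD: secret
      MYSQL_DATABASE: pw
    volumes:
      - ./db:/var/lib/mysql
```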
  14. I have MariaDB on my server, it's fine. I think I will try this dockerized Caddy + php-fpm and skip the VM https://github.com/abiosoft/caddy-docker
  15. Fantastic work, guys. Zeka's result proves that my local setup is somehow silly. Maybe it is too primitive: I don't even use php-fpm. However, you can imagine my surprise when I returned home just now after leaving the BCE import running: it had imported all 35k pages successfully! I should set my local environment up like I have it on the server, with php-fpm and Caddy. Thank you for restoring my sanity and trust in PW.
  16. I have a problem with the performance of importing CSV data with ~35k rows and 17 columns as pages. This is all on my local dev environment: PW 3.0.42 on an Ubuntu 16.10 virtual machine with SSD, 16GB memory and 4 CPU cores allocated (6th gen i7). PHP is 7.1 and MariaDB is 10.1.21. So the setup is beefier than the server environment this project would be living in, at least in the CPU department.

      So far I have tried Ryan's CSV import module. I started the import at 14:30 and left it running as I went to sleep. When I woke up and checked it, the Apache server had somehow borked; 14k pages existed. I tried to restart apache2, but it coredumped. After a restart of the VM, Apache worked again. If we assume the import ran for 10 hours (it might have run longer), that's about 23 pages per minute. Seems it could go faster. Next I will test with @adrian's Batch Child Editor.

      If someone wants to conduct their own testing, attached is a zip file with:
      - real-world CSV data from a proverb research database, Tab-separated and enclosed with ^ characters (because Ryan's module doesn't work with pure Tab-separated data); I have added dummy "title" field data, so it can be used with any of the importers without hacks
      - the needed fields exported from PW
      - my php.ini
      - my my.cnf

      csv-testcase.zip
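The throughput figure in the post is easy to sanity-check (numbers from the post; the 10-hour runtime is itself the post's own assumption):

```python
# Pages created before Apache died, divided by the assumed runtime
pages = 14_000
minutes = 10 * 60
print(round(pages / minutes, 1))  # pages per minute, roughly 23
```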
  17. It is in Ryan's module. I guess I could later share the file (+ exported PW fields) for testing as the data will be completely public anyways (and even is now in an older site).
  18. Thanks! I am using BCE anyway, even though not for this particular corner case. One thing that would be interesting to hear is import performance on large amounts of CSV data compared with Ryan's module. I am getting massive execution times (hours) on a local environment for a set with 16 columns and 35k rows (I have not yet done a successful import). I looked at these types of answers: http://stackoverflow.com/a/3908669 and tuned my MySQL settings, but I guess PW's page creation strategy should match.
  19. I'm working on a setup with no title fields. It was quite easy to edit "Import Pages From CSV" module to accept import data without titles. However, the module messes up the data->field pairings when using Tab separated data (I will report it later). I would like to try Batch Child Editor to import my data, but it is not as obvious how to tweak it so titles are not required. I mean how to deal with the page naming after commenting out the if(!in_array('title' block starting from line 249. Page names as IDs would be fine, but what is the most straightforward way to achieve it? Thanks. Edit: urgh.. apparently I was wrong with "Import Pages From CSV" messing up the field pairings. I don't understand why I cannot reproduce the problem anymore. The only thing I can think of is I was in somehow dyslexic mode when staring at the data.
  20. For pages not using the title field (which you can achieve with: Setup > Fields > title > Advanced tab > turn off the global flag), you can make a couple of modifications. Comment this out in the importPage function:

      /*
      if(!$page->name) {
          $this->error("Unable to import page because it has no required 'title' field or it is blank.");
          return false;
      }
      */

      Then in importPageValue you can construct a naming scheme or just use the id (after the else statement):

      $page->set($name, $value);
      $page->name = $page->id;
  21. Doug G's concern was that documentation moves to IRC, not if people want to discuss their cats and dogs on #processwire, or did I misunderstand you? I don't think anyone sees the forums or chat as a solution for lack of documentation. Please treat this lack as a problem of its own and do not muddle it with communication/user support channels.
  22. There have been a couple of these sorts of articles lately: https://80x24.net/post/the-problem-with-amp/ https://shkspr.mobi/blog/2016/11/removing-your-site-from-amp/
  23. Do you really believe you will get fewer answers to your questions on the forum due to the existence of IRC and Slack channels? Information will not be moved anywhere. Chat channels are just a different mode of communication, one where you have the advantage of rapid-fire interviewing regarding some problem. My typical use of IRC is coordinating work on open source software. Thus, IRC (or Slack, or Telegram etc. - these can all be bridged to unify communication) could be used as a way for ProcessWire's documentation team (does it exist yet?) to efficiently collaborate in producing documentation. See how it's really apples to oranges, right tool for the job etc.?
  24. http://caniuse.com/#feat=css-grid The CSS standard Grid Layout will soon be usable in the stable versions of the leading browsers. It should hit Chrome and Firefox in March. It is in the Safari tech preview, but with no clear date. MS Edge is working on updating its support. I guess support in mobile browsers will follow. 2017 is the year to use it on sites where it is OK to experiment with bleeding-edge stuff.

      - A complete guide to the system on CSS-Tricks
      - Learn by examples (includes video tutorials)
      - Rachel Andrew summarizing use cases for Grid, Flexbox and Box Alignment in a single article on Smashing Mag (note: heavy with Codepens)
      - News on all things CSS Layout, curated by Rachel
      - Polyfill support is unfortunately dragging behind, with no contributors stepping up to help Fremy