Everything posted by bernhard
-
Preview/Discussion: RockDataTables
bernhard replied to bernhard's topic in Module/Plugin Development
hi adrian, totally useful thoughts as always! i agree that for simple tables it would be easier to use the pw api, and i think that will make sense in many situations, so i will build the module to support both options. it's definitely not necessary to go the sql route when all you want to do is list 20 or so pages. i did some tests on findMany and findIDs, but as i need to access fields of the page it's not enough to just load the ids - and then it gets slow again...
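For context, a rough sketch of the two approaches compared above (the selector and template name are made up; ProcessWire 3.x API assumed):

```php
// fast: returns plain page IDs without loading page objects
$ids = $pages->findIDs("template=basic-page");

// but as soon as field values are needed, the pages get loaded anyway,
// so the speed advantage of findIDs() disappears:
foreach($pages->findMany("template=basic-page") as $p) {
    $row = [$p->id, $p->title, $p->parent->title]; // triggers page loading
}
```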
ok... i'm getting closer. it's definitely best to build the tables via sql queries, so the module will get an easy sql editor with live preview like shown here: you can see that the query loads 10.000 rows in around 500ms. another huge benefit of writing sql queries is that you can group and sum your data easily, like in this example, where i build sums of "field3" grouped by month. thanks to @adrian 's idea i zipped the json response of the sql query, making it shrink from 2.7mb in the first example to 121kb. I'll add some helpers to make multilang queries easy and to query views (for kind of a modular setup)
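A minimal sketch of the gzip idea (assumes PHP 7+ with the zlib extension; `$rows` stands for the result of the sql query):

```php
$json = json_encode($rows);
$accept = $_SERVER['HTTP_ACCEPT_ENCODING'] ?? '';
header('Content-Type: application/json');
if(strpos($accept, 'gzip') !== false) {
    // browser supports gzip, so send the compressed version
    header('Content-Encoding: gzip');
    echo gzencode($json, 9);
} else {
    echo $json;
}
```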
hm... not sure why you have `<region...></div>` in your example? you could also use a variable + if in your _main.php:

```
// _main.php
if($config->sidebar) {
    // your sidebar markup
    // grid
    // region body
    // region sidebar
} else {
    // no sidebar
    // region body
}
```

and in your template file:

```
$config->sidebar = false;
<region id="body">your content</region>

// or

$config->sidebar = true;
<region id="body">your content</region>
<region id="sidebar">your sidebar</region>
```

but i think there's also nothing wrong with setting regions to null
-
thanks kongondo, all valid points. the reason why i want to have all data loaded at once is that it makes the development of my module a lot easier and has some huge benefits over loading data paginated via ajax. when i have all the data available at the client i can use datatables' api to filter, sort, query, draw charts etc.; that would be very hard to achieve using server side techniques. i was also trying to avoid direct sql queries, that's why i took other approaches for my first two versions of the module. but it turns out that all of those approaches have some really big drawbacks. actually, building the queries manually in SQL is not as difficult as i thought (the genius api and the easy page->find() operations were one of the main reasons i fell in love with processwire). and i have some nice ideas how to make it even more comfortable and easy. I'm quite sure it will be really easy to use for everybody
-
nice! do you think you could extend this module to also store images pasted from the clipboard? PS: i think it would be great if the images were added via AJAX right after pasting (either urls or clipboard). what do you think @Robin S ?
-
because it's a LOT more performant. constructing my datastring via pages->find() and a foreach takes 16 seconds for 10.000 rows and 5 columns (and needs some extra seconds for every column added), whereas querying the database directly needs only some milliseconds. my other idea was to cache the table rows on dedicated templates, but that leads to problems when you have a "parent" or "category" column, because then it would take several seconds to recreate the cache of the table when the name of the category changes (updating up to thousands of rows' caches). it also leads to a lot of redundant data. all of those problems are solved when the DB is queried directly
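A sketch of what "querying the database directly" can look like with ProcessWire's `$database` (a PDO wrapper); the table layout follows ProcessWire's `field_*` convention, and the template id is just an example:

```php
$stmt = $database->prepare("
    select p.id, t.data as title
    from pages as p
    left join field_title as t on t.pages_id = p.id
    where p.templates_id = :tpl
");
$stmt->execute([':tpl' => 44]);
// all rows in one go, in milliseconds instead of seconds
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```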
-
glad it helped. actually i think that the usability of this site/module could be improved. it was also not self-explanatory for me when i did my first steps with multilanguage setups...
-
there's a thread for phpstorm and vscode - feel free to open one for atom...
-
thanks for your suggestion dragan! i modified your example a little bit. it's also clear and easy to extend with other fields... but i still find this one easier to read and maintain (one line for each field vs. three lines for each field in your example (and my previous tries)):

```sql
select
  id,
  (select data from field_title where pages_id = p.id) as title,
  (select data from field_field1 where pages_id = p.id) as field1,
  (select data11041 from field_field1 where pages_id = p.id) as field1_de,
  parent_id,
  (select data from field_title where pages_id = p.parent_id) as parent_title,
  (select group_concat(data separator ',') from field_test_page where pages_id = p.id) as test_page
from pages as p
where p.templates_id = 44
```

i did some tests and it seems that both have the same performance... 65ms for a query over all 10.000 rows. see "test_page" in the example above - also not that hard to get. the only problem could be special characters and quotes - not sure about that... but i think in those situations it would be best to add a field that gets populated via a save hook. then you could easily populate whatever value you want via PHP.
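A minimal sketch of the save-hook idea; the field name `mycache`, the template check, and the concatenated value are all assumptions for illustration:

```php
// site/ready.php
$wire->addHookAfter('Pages::saveReady', function(HookEvent $event) {
    $page = $event->arguments(0);
    if($page->template != 'basic-page') return;
    // populate a plain text field with whatever value you need in your sql query
    $page->mycache = $page->parent->title . '|' . $page->title;
});
```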
-
hi cesco, of course this is possible. maybe you have to refresh your file list (that has caught me out a few times too):
-
yep, i know - i talked to him while writing the blog post. i just wanted to help, as i think 100mb for everybody is not the best solution. i'm fine now, but i think it makes sense to restrict new users more than active ones. thanks for taking care of it now!
-
would be a perfect fit, as every developer should use tracy anyhow. no idea how differently the IDEs work or how difficult that would be (i guess it should not be too hard?), but it would be great to support different IDEs of course. at least vscode
-
https://themeforest.net/ https://templated.co/ https://html5up.net/
-
Just had an idea: what if we created a module that creates some kind of static file that the IDE can read, listing all fields of a given template? Maybe we could also extend this module to parse all hooks, even for properties that were added via a hook. does anyone need a challenge for christmas? to your question, pwuser1 (welcome btw): some use phpstorm, some vscode, some sublime, some atom...
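The static-file idea could look something like this - everything here (the output file name, the class naming scheme, the `mixed` type) is just an assumption to illustrate it:

```php
$out = "<?php\n";
foreach($templates as $template) {
    $out .= "/**\n";
    foreach($template->fields as $field) {
        // @property tags are what IDEs like phpstorm and vscode can index
        $out .= " * @property mixed \${$field->name}\n";
    }
    $out .= " */\n";
    $out .= "class Tpl_{$template->name} extends Page {}\n\n";
}
file_put_contents($config->paths->cache . "ide-helper.php", $out);
```

(template names with hyphens would need sanitizing before being used as class names)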
-
I'm still looking for the best solution for my datatables module and I think there is no way around using direct sql queries... That makes it more complicated to set up but saves a lot of other trouble like proper caching, cache maintenance etc... Unfortunately such sql queries can get quite complex - unless I'm missing any pw magic that makes it easier? This is what i have so far to query all pages of type "basic-page":

```sql
select
  id,
  title.data as title,
  headline.data as headline,
  body.data as body,
  (select group_concat(data separator ',') from field_test_page where pages_id = p.id) as test_page,
  test_repeater.data as test_repeater
from pages as p
left join field_title as title on title.pages_id = p.id
left join field_headline as headline on headline.pages_id = p.id
left join field_body as body on body.pages_id = p.id
left join field_test_repeater as test_repeater on test_repeater.pages_id = p.id
where p.templates_id = 29
```

the repeater is a little tricky, for example. it gets even worse when you have to handle multilanguage fields or you want references to other pages' field values...

edit: maybe better like this?

```sql
select
  id,
  (select data from field_title where pages_id = p.id) as title,
  (select data from field_field1 where pages_id = p.id) as field1,
  (select data11041 from field_field1 where pages_id = p.id) as field1_de,
  parent_id,
  (select data from field_title where pages_id = p.parent_id) as parent_title
from pages as p
where p.templates_id = 44
```

thanks for any hints
-
I think those are two totally different things... Migrator is for applying changes to sites in a programmatical way. So you can do... migrations. Page Import/Export may be helpful when you transfer sites or parts of sites. But I don't think it will be helpful when you have a live site and want to add some new features while you cannot take the site offline and there might be changes in data/content while you are working on the updates. But I haven't used either of them so far - I've just been watching them for a long time... For me, until now it was sufficient to work directly on the live server (having proper backups of course). I did some tests writing my own modules and applying update scripts... that was also quite easy. But there were situations where I wished I had a helper that takes care of some things like checking if the field that i want to create already exists, deleting all pages before deleting a template etc.; not sure how or if migrator handles that, or what exactly you plan with your rewrite, but I'm quite sure I would pay for a module that makes this process easier. And I'm quite sure that there is some demand for this staging/production migration topic... But I also think that tracydebugger has the bigger audience - so this would be more interesting as a business case
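A sketch of such a defensive migration step (the field name and type are examples, not from any existing module):

```php
// only create the field if it doesn't exist yet
if(!$fields->get('subtitle')) {
    $f = new Field();
    $f->type = $modules->get('FieldtypeText');
    $f->name = 'subtitle';
    $f->label = 'Subtitle';
    $fields->save($f);
}
```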
-
I'm using this one: https://marketplace.visualstudio.com/items?itemName=lukasz-wronski.ftp-sync I'm not 100% happy, but it works most of the time. Searching through all files only works when you have all the files on your computer, and this ftp sync sucks for that part. I always zip everything and download it... much faster most of the time. Maybe the new duplicator module could help here... Another thing that I do is browse the server over ssh via winscp and then just double-click the file. This opens the file in VSCode and, on save, uploads the new version.
-
not related to processwire but may be of interest anyhow:
-
a limit per time period or by number of posts would be great. it's totally understandable that not every user needs 100mb from the beginning. it would even make sense to have a low limit at first imho
-
you only need the entities for frontend OUTPUT, when you don't want the string to be interpreted as HTML
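In ProcessWire terms, a minimal example (the field name is just an illustration):

```php
// in a template file: encode on output only, keep the raw value in the DB
echo $sanitizer->entities($page->headline);

// plain PHP equivalent:
echo htmlspecialchars($page->headline, ENT_QUOTES, 'UTF-8');
```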
-
hey @adrian, what about creating a wiki page for tracy debugger? I played around with that on gitlab today because I'm looking for an easy and good way to write docs for my module too. I didn't get readthedocs to work properly... Gitlab/hub wikis seem to have all i need: write code easily in the browser, copy&paste screenshots (very handy!), automatic table of contents creation, and dividing content into several pages and sub-pages. see here: https://gitlab.com/baumrock/test-readthedocs/wikis/neue-testseite also gifs work, and collaboration would easily be possible too. not sure how docs would work for different versions of a module though. but all the pages have a history, so that should not be a problem...
-
Can you explain your exact usecase? I'm still not sure how useful the module could be on the frontend... It's really easy to implement a DataTable on your own on the frontend, and then you have all the flexibility that custom code provides. Not sure if it makes sense to trade flexibility for ease of use... This is an awesome example by @Macrura http://ohmspeaker.com/speaker-filter/?length=13&width=17 That would not be possible with my module. At least it would not be easier than without using it... So I'm not sure if my module could provide what you expect
just a quick sidenote on google pagespeed insights and pingdom... i analysed a page today out of curiosity because the images were loading slowly... devtools showed that the frontpage loaded 19,8MB (wordpress, what else...); pagespeed insights says 80/100 "needs work". so i checked another site from some days before which i knew got 22/100... this one has 9,1MB for the frontpage... pingdom at least shows the load time in seconds and a percentage of slower websites... but still, the 20MB site (pingdom said it's even 28MB) is faster than 38% of the tested sites... seriously?! ok, i get it... but jquery with its 84kB is too huge for the modern web
-
very nice, congratulations! and thanks for sharing
-
Ok guys, I got some REALLY nice results today @dragan sorry, I was unclear in my previous post... I had to leave for a christmas party. So here are some explanations and the new results: Table with 10.000 rows without cache: 400ms (see the screenshot). Table with 10.000 rows with cache: 200ms. The key was to get the data directly from the database. Of course I knew before that this possibility exists, but I didn't know how to solve the problem of multilanguage and returning complex data (like $page->parent->title for one column) and still keep the setup of a table simple and straightforward. Now I found a great way and I'm really eager to start working on this. What I meant by "stay on the client side" was that I want all the data transferred to the client and then rendered by datatables. The other option would be to use ajax pagination and load only chunks of the data to the client. But having all the data on the client is a huge benefit for manipulating, filtering, sorting, charting etc.; and you were right about my "cache" wording: I'm talking about a cached string holding all the data for the datatable. So if the cache exists it just loads the string - if not, it creates the string from the database and then loads the data into the datatables. Any wishes/ideas for features that I should think of when developing the next version of the module? ...see the feature list in the first post
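The cached-string idea could be sketched with ProcessWire's `$cache` (WireCache); the cache name, expiry, and query below are assumptions:

```php
$json = $cache->get('rockdatatables-basicpage', 3600, function() use($database) {
    // cache miss: rebuild the datatable string from the database
    $stmt = $database->query("select id from pages where templates_id = 44");
    return json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
});
// $json is now either the cached string or the freshly built one
```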