Everything posted by Guy Verville
-
It is so mysterious. When I am logged in, I notice that the session is not preserved. For example, if I click on a module's settings, I am simply returned to the module page with this message: No module specified...
-
Hello, I am not sure what you are talking about. I have done nothing special except trying to create a new page, and suddenly, this problem appeared. I think the DB is corrupted in some way.
-
Hi BitPoet, this is the same problem: I can't see the page tree. This happens when I log into the site... The calls are then:
http://www.guyverville.com/pw/page/?login=1&id=1
http://www.guyverville.com/pw/page/list/?id=1&render=JSON&start=0&lang=0&open=undefined&mode=actions
for which I receive: Failed to load resource: the server responded with a status of 404 (Not Found). Here is what I receive in the console. No Ajax call seems to be allowed, on any page. For example, I cannot see the logs or refresh the module page. Nothing works, while the same installation works locally...
-
I have the same problem here, only on the production site. On my local machine, I have no problem. I deleted the cache folder entirely, brought PW up to date (even with the dev version) and removed ProCache, but to no avail. I get this network error: "Unknow problem, please try later." The console tells me this:
?id=1&render=JSON&start=0&lang=0&open=undefined&mode=actions 404 xhr JqueryCore.js?v=1507602825:2 9.1 KB 284 ms
And I repeat, the local installation, which is exactly the same, does not have this problem.
-
Saving a multilingual field via API
Guy Verville replied to Guy Verville's topic in Module/Plugin Development
Ok, got it.
$french = $this->languages->get("francais");
$page->of(false);
// page->setLanguageValue($french, 'title', $columns['Titre']);
$page->setName($columns['Chemin'], $french);
$page->set("status$french", 1);
-
I have created a module based on Ryan Cramer's Import Pages From CSV in order to be able to import bilingual data. I wrote my code with the help of this page. While the title and the body (page_content) fields are saved, I cannot save the French path (this is an import of previous blog entries from another CMS; I must keep the URLs. The English URL is saved).

$p->of(false);
if($columns['Titre'] != "") {
    $french = $this->languages->get("francais");
    $p->of(false);
    // $p->title->setLanguageValue($french, $columns['Titre']);
    $p->page_content->setLanguageValue($french, $columns['Corps']);
    $p->name->setLanguageValue($french, $columns['Chemin']);
    $p->status->setLanguageValue($french, 1);
}

I get this error
-
Repeater Matrix and Page field
Guy Verville replied to Guy Verville's topic in API & Templates
My mistake... a simple typo... -
I am trying to implement a Repeater Matrix which contains a Page field (there are four fields: an image field, one text field and two page fields). The example given in the readme.txt is self-explanatory. However, this does not seem to work with a page field. Say my page field is called "attached_ceramic_skus":

<?php
foreach($page->test_matrix as $item) {
    echo "<h3>$item->headline</h3>";                  // works
    echo "<img src='{$item->image->url}'>";           // works
    foreach($item->attached_ceramic_skus as $skus) {  // throws an error
        // ...
    }
}

When I debug this, I see my image and text fields in the $item array, but not the page fields. Yet these fields are there and working when I enter data, and I have understood how they are kept in the database. What is the method/hook to read this type of field?
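For reference (the culprit above was indeed just a typo in the field name), a minimal sketch of how a Page field inside a Repeater Matrix item can be read. The field names (test_matrix, headline, image, attached_ceramic_skus) are taken from this post, and the instanceof checks cover both dereference settings a Page field can have:

<?php
// Minimal sketch, not the actual template: reading a Page field inside a
// Repeater Matrix item. Field names come from the post above.
foreach ($page->test_matrix as $item) {
    echo "<h3>{$item->headline}</h3>";
    echo "<img src='{$item->image->url}' alt=''>";
    $skus = $item->attached_ceramic_skus;
    if ($skus instanceof PageArray) {
        // Page field configured to hold multiple pages
        foreach ($skus as $sku) {
            echo "<p>{$sku->title}</p>";
        }
    } elseif ($skus instanceof Page) {
        // Page field configured to dereference to a single page
        echo "<p>{$skus->title}</p>";
    }
}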
-
I resolved my problem by adding a new crop setting to the field and switching my template to the new one... It does not explain why the previous crop setting still works on my local machine... Anyways...
-
I have a serious and mysterious problem with the CroppableImage3 module since I upgraded to PW 3.0.62. Since my local copy of the site works perfectly well, I uploaded it to the production site, along with the database. But I get this error when accessing a page that uses CroppableImage3:

Exception: There is no crop setting for the template 'promenade' called 'miniature' (in /home/guyvervi/public_html/site/modules/CroppableImage3/FieldtypeCroppableImage3/FieldtypeCroppableImage3.module line 209)

The miniature crop setting IS set. I really do not understand what is going on.
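For context, a minimal sketch of the kind of template call that triggers this exception. The image field name "images" is an assumption; the crop name "miniature" comes from the error above:

<?php
// Minimal sketch of requesting a CroppableImage3 crop from a template.
// "images" is an assumed field name; getCrop() throws when the requested
// crop name is not defined for the current template context.
$image = $page->images->first();
if ($image) {
    $thumb = $image->getCrop('miniature');
    echo "<img src='{$thumb->url}' alt='{$thumb->description}'>";
}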
-
The previous CSV file could indeed be the basis of the comparison. The site will not be a store, and the stock inventory will be accessible on demand. We will have six months to see the extent of the daily changes. There will be site administrators in charge of completing the missing information for each product (the external database contains only prices and basic information: SKU, number of items per pallet, etc.). Thank you for your input!
-
I just made a test with Phantom.js. It works like a charm. The script below is incomplete, because you have to stop the server once the script is done, but you get the idea. My local script resides at http://example.local/script-ajax, the same script I can fire from a browser. To make it run from a server, you create another little script like the one below:

"use strict";
var page = require('webpage').create(),
    system = require('system');

if (system.args.length < 2) {
    console.log('Usage: loadurltest.js URL');
    phantom.exit();
}

var address = system.args[1];
page.open(address, function(status) {
    if (status === 'success') {
        console.log('The script goes here');
    } else {
        console.log('Unable to load the address!');
        phantom.exit();
    }
});

Where you see console.log('The script goes here'); is where you should wait for the script to finish (because you must exit phantom). So the crontab just calls this:

phantomjs yourscript.js http://example.com/script-ajax
-
Hi, the data changes daily (mainly prices, but also additions or removals of products). A cron route would mean running a lengthy PHP script. To preprocess the data, I would have to read from PW to get the information. I don't have a created/modified date in the CSV file, which is built from many sources, so I have to read everything and compare it against a bunch of fields in the PW pages. Making a quick dump of the 50000 products and preprocessing it would be an option, but PW is not that fast (I tried a $pages->find of every product…). Since we are testing Elastic Search, it would perhaps be a way to get it done. The actual script is based on the Import CSV module from Ryan. It does essentially the same thing, with some extras: read a line, compare it with a page; if that page exists, check whether such and such data is the same, and if so, continue to the next line. If a new SKU is present (it is not at the end of the CSV file…), create the page, etc. Going the cron route would require a stepping mechanism (sketched below)… I know already that only 6-7 entries are read per second… The phantomjs way is elegant because I can test from a browser anytime without porting anything.
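A minimal sketch of what such a stepping mechanism could look like, not the actual script: the "product" template, the "sku" and "price" fields, the /products/ parent, the CSV path and the batch size are all assumptions.

<?php
// Minimal sketch of a cron-friendly stepping import, not the actual script.
$batchSize = 200;
$offset    = (int) $cache->get('import_offset'); // remember where the last run stopped

$fh = fopen('/path/to/products.csv', 'r');
for ($i = 0; $i < $offset && fgetcsv($fh) !== false; $i++); // skip rows already processed

$done = 0;
while ($done < $batchSize && ($row = fgetcsv($fh)) !== false) {
    list($sku, $price) = $row;
    $p = $pages->get('template=product, sku=' . $sanitizer->selectorValue($sku));
    if ($p->id) {
        if ((float) $p->price !== (float) $price) { // only save pages that changed
            $p->of(false);
            $p->price = $price;
            $p->save();
        }
    } else { // new SKU: create the page
        $p = new Page();
        $p->template = 'product';
        $p->parent   = $pages->get('/products/');
        $p->name     = $sanitizer->pageName($sku);
        $p->title    = $sku;
        $p->sku      = $sku;
        $p->price    = $price;
        $p->save();
    }
    $done++;
}
fclose($fh);

// keep the offset for the next cron run; reset it once the whole file is read
$cache->save('import_offset', $offset + $done, WireCache::expireNever);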
-
Hi, I am building a site containing around 50000 products which will be updated on a regular basis. We are far from putting this site into production; I am currently devising the future functionalities. The current update script uses an AJAX process to avoid timeouts. However, the update process will have to be fired automatically, with something like curl or phantom.js. What I have coded at the moment is probably not the best solution, albeit it works pretty well. There is a PW template called from an address, http://example.com/start-ajax, which starts a simple jQuery Ajax process on load. The 50000 products are listed in a CSV file that is previously broken down into small pieces (200 lines per file). The process is smooth and takes 40 minutes to read the data and update or create ProcessWire pages accordingly. I would like to know your advice and your experience in the matter. What would you do to create an automated batch process?
-
Float field rounds to 3 while set to 10
Guy Verville replied to Guy Verville's topic in General Support
I've seen this thread and it answers the problem.
-
Float field rounds to 3 while set to 10
Guy Verville replied to Guy Verville's topic in General Support
I have found that setting the input format to HTML5 sometimes resolves the issue. My site is bilingual, French and English. While the field is not correctly saved, the input form shows a comma instead of a dot when I set my account to English. Perhaps it's due to the browser being set to French. Anyway, the float field seems to behave erratically. I made more tests and the whole problem has reappeared...
-
I have a big problem with some float fields. They are set to round to 10 decimals. They are currency fields with 4 or 5 digits after the decimal point. It is very important to keep that information as is. However, when I enter a number such as 1.4567, it is rounded to 1.457 and saved as such. What am I missing here? I am using PW 3.0.60.
-
I completely agree with you, and that's why we are choosing PW for its UI. Structuring the data is also crucial, as we will have to deal with synchronizing data coming from another external DB.
-
Thank you all for your input. We will certainly continue our work with Elastic Search, which is quite impressive (and RESTful). My programmer colleague has already found that querying PW has its limits. I have asked him to look at your comments and maybe he will tell us his appreciation of the topic.
-
Hi, we have a big project for a national website which will present over 50000 products, and we are studying the possibility of using PW, following the success we had with a much smaller project. While we know that PW has great search capabilities, we are also looking at Elastic Search, since there will be many angles to search from and we want the most natural, swift rendering. The products will be distributed over six main categories, each with probably three subcategories. Those categories will have their own templates, and the calls for the associated SKUs (50000 products) will be made à la ProcessWire (see the sketch below). The website will be bilingual and, in a far-future phase 2, there might be some ecommerce features (not for all products). I would like to know if any of you have experience dealing with that amount of information on a PW website, and if you could share some tips or describe some pitfalls to avoid...
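For illustration, a minimal sketch of the kind of "à la ProcessWire" category call meant above. The template and field names (product, category) are hypothetical, and pagination is assumed to be enabled on the category template:

<?php
// Minimal sketch of a paginated category listing. Template and field names
// are hypothetical; MarkupPagerNav pagination is assumed to be enabled.
$products = $pages->find("template=product, category=$page, limit=50, sort=title");
echo "<ul>";
foreach ($products as $product) {
    echo "<li><a href='{$product->url}'>{$product->title}</a></li>";
}
echo "</ul>";
echo $products->renderPager(); // page numbers over the full set of SKUs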
-
I will ask my IT team. I am surprised that it is not set by default.
-
Better results now: http://www.webpagetest.org/result/170101_41_KJA/1/performance_optimization/#cache_static_content I am puzzled by the expiration problems. I have ProCache installed and the necessary tweaks have been put in the .htaccess.
-
Don't worry, I do respect what is said here and will act accordingly. In fact, some improvements have already been pushed to Git (lowering, for example, the image quality to 70), but our Git server is down today ;-)
-
Hi Francis, I know those tools. Perhaps I was too eager to show this site before optimizing it. We didn't stress this aspect that much because the client doesn't need to be known. That seems weird, but there is no Google Analytics set up on this site. Why? Because this is almost an intranet of a sort. Anyway, that doesn't excuse anything. As Webpagetest shows, there is another, more irritating aspect: the first-byte latency. Those statistics are to be taken with a grain of salt, too. The visitor gets visuals after 1.5 sec (http://www.webpagetest.org/video/view.php?id=161230_DH_MDH.1.0). What takes so long are the images in the photo album, which aren't that optimized even though there is a routine to implement srcset/sizes alternatives, and also the buttons (I don't understand why they aren't SVG; I'll ask the team to redo that). So, too soon to present this. :-(
-
You are absolutely right. The site had to be put online quickly and this optimization was not done well. It will be corrected when we are back at work next week.