adrian Posted February 27, 2017

17 minutes ago, k07n said:

@adrian one more thing: nothing happens on this:

$modules->get('ProcessTableCsvExport'); // load module
$pages->get('/clients/')->client_emails->exportCsv(',', '"', 'csv', '|', true);

I'm working on it now - looks like that was broken when I added the option to declare which columns to export. Will have a fix shortly.
adrian Posted February 27, 2017

@k07n and everyone else who uses this module: I just committed a pretty major revision to the API export method. It now works like this:

$modules->get('ProcessTableCsvExport'); // load module

// Arguments: field name, delimiter, enclosure, file extension, multiple-values separator,
// names in first row, columns to export, selector (filter) string.
// Columns to export can be indices starting at 1, or column names.
$page->exportTableCsv('table_field_name', ',', '"', 'csv', '|', true, array('col1', 'col2'), 'year=2017');

Unfortunately this is a breaking change, but I realized that the old approach didn't allow calling the method on an alternate page. This new version also supports limiting the exported columns (an array of column names, or indices starting at 1), as well as defining a selector filter. Please let me know if you have any problems.
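To make the two new capabilities concrete, here is a sketch that calls the method on an alternate page and selects columns by index rather than name (the page path, field name, and selector value are illustrative, borrowed from the earlier example in this thread rather than from the module docs):

$modules->get('ProcessTableCsvExport'); // load module

// export columns 1 and 3 of the 'client_emails' table field on the /clients/ page,
// limited to rows matching the selector 'year=2016'
$pages->get('/clients/')->exportTableCsv('client_emails', ',', '"', 'csv', '|', true, array(1, 3), 'year=2016');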
grimezy Posted March 22, 2017

Thanks @adrian for the great module! I'm having an issue at the moment when exporting to CSV via the admin area. I have a table field containing a page (select) column that has a default value set (for example default=page_id). When I export the CSV, only rows matching that default value are exported. I know this is the cause because when I change the default value of the page (select) column, the export changes to only those rows instead. I have tested this using PW 3.0.56 and version 1.0.3 of this module. Thanks.
adrian Posted March 22, 2017

Hi @grimezy - I hadn't ever used the default=pageid option before, but I just tested here and it still exports all rows. I am also running 3.0.56.

If the site is online, is there any chance I could take a look? If not, could you post your table field settings, including the specific settings for the page select field, as well as a screenshot of all rows and also the CSV that gets exported.

One more thought - what version of Table are you running? I am on the latest v15 beta.
grimezy Posted March 24, 2017

Hey @adrian, thanks for the quick reply. I didn't realise there was an updated version, so I have upgraded to v15, but it still only exports what is set to the default option. I was previously on v14.

I have made a video replicating the issue, which also shows all the info you might need. Just let me know if you need any other information. I'll PM you shortly with the video. Thanks.
Macrura Posted September 14, 2017

Hi Adrian,

I'm having a problem with the CSV export. I'm getting this error from the process module when exporting this table (on Table version 13):

Method Page::downloads_table does not exist or is not callable in this context

and also getting this:

PHP Notice: Array to string conversion in .../TableCsvImportExport/ProcessTableCsvExport.module:112

Not sure what's going on; downloads_table is the name of the field...
adrian Posted September 14, 2017

@Macrura Sorry about that - I should have a check in there for the required version of Table. When Ryan introduced pagination I had to make a lot of changes to support it and decided to no longer support older versions.

This commit (https://github.com/adrianbj/TableCsvImportExport/tree/b83df4d13fd18fcb6c4bc8ac467e6ae35cba9711) was the last version to support the older version of Table, so you should hopefully be ok with that one, although please note that I have added some new features (like the ability to select the columns to export) and some bug fixes in newer versions. I would suggest that you instead upgrade to v15 of Table.
adrian Posted September 14, 2017

Actually, I just noticed that Ryan is now up to v17 / v18beta - hopefully both of those should also work though. EDIT: just did a quick check with v18 and it looks to be working fine!
Macrura Posted September 14, 2017

This site is still on 2.7. I'm trying to convince the client to upgrade, but it's a complex site so we need a contingency budget... I'll try the old version and post back tomorrow.
adrian Posted September 14, 2017

Just now, Macrura said:

maybe if you have an old version we could get by with for now

Yep, the old version is linked to in this post: https://processwire.com/talk/topic/7905-profields-table-csv-importer-exporter/?do=findComment&comment=150953
Macrura Posted September 14, 2017

awesome, thanks again, got the export working now for them...
adrian Posted September 19, 2017

Hi all,

I have just committed a major new version (2.0.0) on the dev branch (https://github.com/adrianbj/TableCsvImportExport/tree/dev). This version has breaking changes to the frontend export/import API methods!!

Changes include:
- changed API methods - please see the ReadMe for details: https://github.com/adrianbj/TableCsvImportExport/blob/dev/README.md
- module config settings to:
  - set defaults for the import and export parameters (delimiter, enclosure, etc)
  - control whether users can override the default parameters
  - control whether users can select the columns/order in the exported CSV
  - control whether users can use an InputfieldSelector interface to filter table rows (this is an enhanced version of the built-in Table "Find" interface available when you have pagination enabled)

I would like to make this the stable (master) version as soon as possible because the old API methods were not well constructed, so I would really appreciate testing of this new version if you have any time. Thanks!

PS - has anyone out there actually been using the API methods?
adrian Posted November 7, 2017

Pretty quiet on that dev version, so I have decided to merge those changes into Master - please review the ReadMe for changes to the API and also visit the module settings to see everything new. And of course let me know if you find any problems.

PS I also have a new version in the wings that adds JSON import. I feel like it should be expanded to also allow JSON export, but maybe these additions should retire this module and have a new one named TableImportExport (without the CSV part).
Lutz Posted December 15, 2017

Hi Adrian, some minor problems.

In TableCsvImportExport.module, in the buildImportForm function, we should be able to translate "Append" and "Overwrite", so:

$f->addOption("append", __('Append'));
$f->addOption("overwrite", __('Overwrite'));

In the same function, there's a problem with the $fieldset->description. Your code:

$fieldset->description = __("The structure of the CSV must match the table fields. Import will happen on page save." . ($this->wire('fields')->get($actualFieldName)->allow_overwrite != 1 || !$this->wire('user')->hasPermission("table-csv-import-overwrite") ? "\nImported data will be appended to existing rows." : ""));

Try to translate it and you will see the problem. My simple fix was to separate the two strings:

if($this->wire('fields')->get($actualFieldName)->allow_overwrite == 1 && $this->wire('user')->hasPermission("table-csv-import-overwrite")) {
    ...
    $f->description = __("Determines whether to append new rows, or overwrite all existing rows.");
    ...
} else {
    $fieldset->description .= __("Imported data will be appended to existing rows");
}

I also had problems with wrong data in subfields when importing tab-separated data with empty strings in one of the subfields. Whenever the third of four subfields (a text field, optional) was left empty, the subfield wasn't empty after the import. Instead, the name of the column was inserted. I wasn't able to retrace what exactly happened. I fixed it this way, in the importCsv function:

foreach($rows->data as $data) {
    ...
    foreach($data as $subfieldKey => $fieldValue) {
        ...
        if($fieldValue == '') $tableEntry[$subfieldNames[$subfieldKey]] = "";
        ...
    }
    ...
}
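For context on the translation problem above: ProcessWire's language parser extracts only literal strings from __() calls, so an argument assembled at runtime (concatenation, ternary) cannot be picked up as a translatable phrase, which is why splitting the text into two separate literal strings fixes it. A minimal illustration (the $someCondition name is hypothetical, not from the module):

// translatable: a single literal string the language parser can extract
$f->description = __("Determines whether to append new rows, or overwrite all existing rows.");

// not translatable: the argument is built at runtime, so no single phrase
// can be extracted from the __() call
// $fieldset->description = __("Static text." . ($someCondition ? " Extra sentence." : ""));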
adrian Posted December 15, 2017

Hi @Lutz Heckelmann - welcome to the forums! Thanks for a useful first post.

I have taken care of the first two things locally (with a slightly different solution for #2), but I am having trouble reproducing #3. Any chance you could do a field export on the table field and also send me an example file to import so I can test? I understand what you are getting at, but I thought that the if($fieldValue != '') check was taking care of it. I can kinda see why it wouldn't, but I'd still really like to reproduce the problem so I can be certain of the solution. Thanks.
Lutz Posted December 16, 2017

Hi Adrian,

I have a field with four columns: von (DateTime) | bis (DateTime) | session (Text, optional) | notiz (Text, optional)

Import works with (tab-separated):

2017-10-11 8:00    2017-10-11 9:00    Session 1    Notiz zu Session 1

The error in the table occurs as described when importing:

2017-11-10 8:00    2017-11-10 10:00

(as CSV with the commas replaced by tabs: date,date,,)

The result is:

2017-11-10 8:00 | 2017-11-10 10:00 | Session |

Thank you.

test_csv-import_171216_0119.csv.zip
adrian Posted December 16, 2017

Thank you for that - I wasn't quite getting the scenario correct, so that helped a lot. I have fixed it, although a little differently - it turns out I can't see a need for that check when the value is blank, so I removed it and everything seems fine. I have committed a new version. Would you mind checking at your end also, please?
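For readers following along, a hedged sketch of the change being described (based on Lutz's importCsv snippet above, not the actual commit): the blank-value check is dropped so the cell value is written unconditionally, meaning an empty CSV cell ends up as an empty subfield instead of being skipped.

foreach($data as $subfieldKey => $fieldValue) {
    // previously blank values were skipped, roughly:
    // if($fieldValue != '') $tableEntry[$subfieldNames[$subfieldKey]] = $fieldValue;
    // with the check removed, the value is assigned regardless, so blanks stay blank
    $tableEntry[$subfieldNames[$subfieldKey]] = $fieldValue;
}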
Lutz Posted December 16, 2017

Hi Adrian,

It works well when I import records like the one I sent. However, when I try to copy only the two required columns, I get the error I mentioned above. The expected behavior would be that just the two required columns are filled out, but again the word "Session" was placed in the first empty column.

Field: four columns, two of them required.
CSV: two columns (for events that just need a start and an end date set).

2018-2-25 12:00    2018-2-25 14:00
adrian Posted December 16, 2017

@Lutz Heckelmann - I never planned on supporting importing of CSV or TSV where the number of columns for each row is different to the table it's being imported into. I should probably have added a warning that they didn't match and not imported anything. That said, maybe I should just support what you are looking for - I don't suppose there is any harm in it. I'll sort it out shortly.
adrian Posted December 16, 2017

@Lutz Heckelmann - please try the latest version - it now supports any number of missing columns on the end of the CSV/TSV.
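A rough sketch of how the missing-trailing-columns support could work, padding each pasted row with empty strings up to the table's column count (an illustrative assumption reusing the $rows->data and $subfieldNames names from the earlier snippets, not the module's actual code):

foreach($rows->data as $data) {
    // pad the pasted row so it has one entry per table column
    while(count($data) < count($subfieldNames)) {
        array_push($data, '');
    }
    // ... then build $tableEntry from $data as before
}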
Lutz Posted December 16, 2017

Adrian, I just finished some tests and everything seems to work as expected now! I thought for a moment we could loop with a while instead of the foreach($data as $subfieldKey => $fieldValue), using $i (the number of subfieldNames), but the array_push does it. Great response time, thanks a lot!
adrian Posted December 16, 2017

Just now, Lutz Heckelmann said:

I thought for a moment we could loop with a while instead of the foreach($data as $subfieldKey => $fieldValue), using $i (the number of subfieldNames), but the array_push does it.

Yeah, I could possibly refactor a little, but this approach seemed the simplest without messing with anything else. If you feel like optimizing things, though, I always like PRs.
Lutz Posted December 16, 2017

Good to know, and yep: "this approach seemed the simplest without messing with anything else". So I'm happy with that solution. This last problem wasn't a deal-breaker, of course. However, it's great to have this new flexibility. You know, it's good to keep things simple for the ones who use those features to import just by copying and pasting from Excel, even though they lack any experience with CSV/TSV... Thanks again.
adrian Posted December 16, 2017

1 minute ago, Lutz Heckelmann said:

However, it's great to have this new flexibility. You know, it's good to keep things simple for the ones who use those features to import just by copying and pasting from Excel, even though they lack any experience with CSV/TSV.

I completely agree - I do think it's going to be a very nice improvement for many users - thanks for the suggestions!
adrian Posted December 17, 2017

@Lutz Heckelmann - I just realized that I should also exclude trying to insert any additional columns in case the user has copied more columns than the table has. Would you mind doing another check at your end to make sure everything still looks ok? Thanks!
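A hedged sketch of that extra-columns guard (illustrative only, reusing the same names as the snippets above; the actual module code may differ): any pasted column beyond the table's own columns is simply ignored.

foreach($data as $subfieldKey => $fieldValue) {
    // ignore pasted columns that have no matching table column
    if(!isset($subfieldNames[$subfieldKey])) continue;
    $tableEntry[$subfieldNames[$subfieldKey]] = $fieldValue;
}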