Bulk saving pages


clsource

Hello, thanks for helping me.

Currently I'm using a CSV with all the data. I read the 2000 entries and, for each one, create a new page:

// CSV parsing (omitted here) fills $people with the rows

$system = $pages->get('/system/people');

// 2000 people
foreach($people as $friend) {

   $newFriend = new Page();
   $newFriend->template = 'person';
   $newFriend->parent = $system;
   $newFriend->name = md5($friend['name']);
   
   $newFriend->save();

   $newFriend->title = $friend['name'];

   $newFriend->save();
}

You are probably either hitting the time limit of your PHP script, or your server is running out of memory.

To avoid the time limit issue, put this at the top of your script:

set_time_limit(0);

That might be all you need. But you will probably hit memory issues too, since PW keeps pages in its cache. At some point memory issues will come - if not with 2000 pages, then probably with 20,000 pages or so. That is easy to avoid: just call this at the end of your foreach loop:

wire("pages")->uncacheAll();

Also make sure you don't have debug mode on when running this kind of script (there will be a memory leak, since debug mode logs all queries in memory).
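Putting those pieces together, a rough sketch of the whole import could look like this (same $people array and /system/people parent as in your post - treat it as a starting point, not a drop-in solution):

<?php
set_time_limit(0); // no execution time limit for this long-running import

$parent = wire("pages")->get('/system/people');

foreach($people as $friend) {

   $p = new Page();
   $p->template = 'person';
   $p->parent = $parent;
   $p->name = md5($friend['name']);
   $p->title = $friend['name'];
   $p->save();

   // release cached pages so memory usage stays flat
   wire("pages")->uncacheAll();
}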

I myself like to write this kind of script as a command-line script. More information here: http://processwire.com/api/include/
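A minimal sketch of such a bootstrap script (the path to ProcessWire's index.php is only a placeholder - adjust it to your install):

<?php
// import.php - run it from the shell: php import.php
// Bootstrapping ProcessWire by including its index.php makes the API available through wire()
include('/path/to/your/site/index.php');

set_time_limit(0);
$parent = wire("pages")->get('/system/people');
// ... create the pages here, same loop as above ...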


Thanks,

but if you have a greater number of pages that need to be created, like 10K for example, the script still takes O(n) time to run.

I tried to use the command line, but for unknown reasons the same script only saved one person.

Is there a way to use JavaScript to programmatically call a page with a chunk of people to save?
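Something like this is what I have in mind on the server side - just a rough sketch where the chunk size, the "offset" parameter and the parseCsv() helper are placeholders, not an existing API:

<?php
// Template of a page that imports one chunk per request, e.g. /import-chunk/?offset=0
// A client-side script would call it repeatedly, raising the offset by $chunkSize each time.
$chunkSize = 200;
$offset = (int) $input->get('offset');

// parseCsv() stands in for whatever CSV parsing produces the full $people array
$people = array_slice(parseCsv('/path/to/people.csv'), $offset, $chunkSize);
$parent = $pages->get('/system/people');

foreach($people as $friend) {
   $p = new Page();
   $p->template = 'person';
   $p->parent = $parent;
   $p->name = md5($friend['name']);
   $p->title = $friend['name'];
   $p->save();
}
$pages->uncacheAll();

echo count($people); // tell the caller how many were saved; 0 means the import is done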

Edit:

I found something about jQuery's queue:

http://learn.jquery.com/effects/uses-of-queue-and-dequeue/
https://www.inkling.com/read/jquery-cookbook-cody-lindley-1st/chapter-17/recipe-17-5

I will look into it and post the results


I think apeisa has all the usual suspects covered.

I imported 900+ plant species from a CSV yesterday. Each has 3 repeater fields that have around 8 entries each on average.

So that's roughly 21,600 pages (900 × 3 × 8). It took a little while, but it completed in one batch.



This is on a pretty solid University of Florida server, so I'm not sure how it would do on a typical shared host.


Yes, that is a good example. Did you get it working?

Yes, it works like a charm.

But I also discovered 2 issues:

1. The CSV did not have \n line endings (see the snippet below).

2. The SQL export included all the sessions; that was the main reason it was such a huge SQL file (nearly 12 MB).

After truncating the sessions table it was only 140 KB. Wow.
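In case it helps someone else, the first issue can be worked around roughly like this when reading the file with fgetcsv() (assuming the CSV uses \r-only line endings; the path is a placeholder):

<?php
// Let PHP auto-detect \r-only (classic Mac) line endings before opening the file
ini_set('auto_detect_line_endings', '1');

$people = array();
$handle = fopen('/path/to/people.csv', 'r');
while(($row = fgetcsv($handle)) !== false) {
   $people[] = $row;
}
fclose($handle);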

Thanks folks!

