alxndre Posted August 7, 2017

```php
<?php
function copyParent($begin, $times) {
    for ($i = $begin; $i <= $times; $i++) {
        $limit = 50;
        $set = $i;
        $start = ($set - 1) * $limit;
        $tx = wire('pages')->findMany("template=transaction-earn, start=$start, limit=$limit, sort=transactiondate");
        if ($tx->count() < 1) {
            wire('log')->save('migration', "no mas transaciones");
            break;
        }
        wire('log')->save('migration', "updating $start to " . $set * $limit);
        foreach ($tx as $t) {
            $p = $t->parent;
            $t->setAndSave([
                'tagreference'      => $p->tagreference,
                'idreference'       => $p->idreference,
                'transactiondate'   => $p->transactiondate,
                'merchantreference' => $p->merchantreference,
                'userreference'     => $p->userreference,
                'terminalreference' => $p->terminalreference,
                'offline'           => $p->offline,
                'amount'            => $p->amount,
                'product'           => $p->product
            ]);
        }
    }
}
```

I have this code that I'm trying to run in the background to copy some data from a parent node to a child node. The problem is that each iteration runs for about 15 seconds per 50 pages saved/copied (about 200 pages per minute). I have about half a million of these pages to update, so it would take two straight days of running. Could someone point out what's causing the slowdown? Or could anyone suggest a better way to do what I'm trying to do? Thanks a lot, as usual.

Edit: By the way, the reason I'm doing this is that when I query the child nodes with criteria from the parent node, it takes an incredibly long time, but close to nothing when I copy the data to the child nodes and query there. I hope that helps a bit.

```php
$tx = wire('pages')->findMany("template=transaction-earn, start=$start, limit=$limit, sort=transactiondate, [tagreference=$tag, merchantreference=$merchant]");
```
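[Editor's note: one frequent cause of slowdowns in long ProcessWire bulk loops is the in-memory page cache: every page loaded by a find stays cached, so memory use grows with each batch. Below is a minimal sketch of the same loop that clears that cache between batches with `$pages->uncacheAll()`. This is an assumption about the bottleneck, not a confirmed fix; the field list is abbreviated for brevity.]

```php
<?php
// Sketch only: same batch loop as above, but the page cache is cleared
// after each batch so memory use stays roughly flat over the whole run.
// Assumes the slowdown comes from cache/memory growth across batches.
function copyParentBatched($begin, $times, $limit = 50) {
    for ($i = $begin; $i <= $times; $i++) {
        $start = ($i - 1) * $limit;
        // find() is sufficient here since a limit is set on each batch
        $tx = wire('pages')->find("template=transaction-earn, start=$start, limit=$limit, sort=transactiondate");
        if ($tx->count() < 1) break;
        foreach ($tx as $t) {
            $p = $t->parent;
            $t->setAndSave([
                'tagreference' => $p->tagreference,
                'idreference'  => $p->idreference,
                // ... remaining fields copied as in the original ...
            ]);
        }
        // Release all pages loaded so far before the next batch
        wire('pages')->uncacheAll();
    }
}
```

If that isn't enough, copying the values with direct SQL against the per-field tables would avoid the per-page save overhead entirely, at the cost of bypassing ProcessWire's hooks and change tracking.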
bernhard Posted August 7, 2017

Just a wild guess, and maybe complete nonsense, but did you try your script without the logging? The log saves to files, so maybe that's getting slow (and huge logfiles)? But copying half a million pages' worth of data from parents to children doesn't sound too good anyhow, and I guess (second guess in 3 sentences ^^) there is a better solution...
Robin S Posted August 7, 2017

Not sure if it could cause the issue, but in your copyParent() function you could try find() instead of findMany() - no need for findMany() when you have a limit of 50.

Also, the selector you posted looks a bit off:

```php
$tx = wire('pages')->findMany("template=transaction-earn, start=$start, limit=$limit, sort=transactiondate, [tagreference=$tag, merchantreference=$merchant]");
```

What is going on with the section in square brackets at the end? It looks a bit like a subselector, but you are not matching it against a field.
alxndre Posted August 9, 2017 (Author)

Thanks for the response. I tried without logging, but it's the same. Also, sorry about the wrong selector - you're right, it should be parent=[selector] as a subselector. I'll try later using find() only. Thanks!
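[Editor's note: for reference, the corrected child-node query with a parent subselector might look like the sketch below. `$tag` and `$merchant` are placeholders for the values being matched, as in the original post.]

```php
<?php
// Hypothetical sketch: match child transactions whose *parent* matches
// the bracketed subselector; with a limit set, find() is sufficient.
$tx = wire('pages')->find(
    "template=transaction-earn, limit=50, sort=transactiondate, " .
    "parent=[tagreference=$tag, merchantreference=$merchant]"
);
```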
Jonathan Lahijani Posted August 18, 2017

I have a script that deletes a bunch of pages and it's very slow too, but I'm not sure if that's normal or if something's wrong. In addition, I'm running it from Bash. Did you have any luck improving the speed?