Adding pages through the API


jrtderonde

Hey,

I'm working on a module that adds pages to my ProcessWire installation. The pages are created from a JSON array that is imported on page save. Everything works fine until the pages are actually added, at which point I get the following error:

Integrity constraint violation: 1062 Duplicate entry '3e215ecd6774fd99c2b0eb5cadf36a07-1269' for key 'name_parent_id'

I'm using the following code/loop to generate the pages.

// Loop through the files
foreach ($p->importFile as $file)
{
    // Set the file location
    $name = $file->data["basename"];
    $path = $p->importFile->path;
    $location = $path . $name;

    // Get the file
    $json = file_get_contents($location);
    $json = json_decode($json);

    // Loop through the rows of the import
    foreach ($json->ttEntityDelAddrLink as $client)
    {
        // Create new page
        $new = new Page();

        // Create unique hash
        $unique = md5(date("Y-m-d H:i:s") . "-" . $client->CustomerCode);

        // Set some variables for the new page
        $new->setOutputFormatting(false);
        $new->template = "_client";
        $new->parent = $p;

        // Set the title and the "unique" name
        $new->title = $client->DelAddressName;
        $new->name = $unique;

        // Page specific fields
        $new->company = $client->DelAddressName;
        $new->companyId = $client->CustomerCode;
        $new->city = $client->DelAddressCity;
        $new->address = $client->DelAddressStreet;
        $new->postcode = $client->DelAddressZipCode;
        $new->country = $countries[$client->DelAddressCountryCode];

        // Save the page
        $new->save();
    }

    // Exit for debugging
    exit;
}


Does anybody know what's wrong?


Ok, but that's what I'm trying to prevent with the hash as $page->name. I'm using the customer's ID combined with a timestamp to generate an MD5 hash that should be "unique". (See the code; I'm pretty sure it should work.)

Also, the next entry in the array has a completely different customer ID, so it should be "impossible" for a duplicate to occur.

I'm using this code within a pageSaveHook (module). Could that be a problem? Furthermore, the array contains 11,000+ entries; could that be an issue? The platform is hosted on a stable, fast VPS and I've raised the PHP execution time to 10 minutes, so that shouldn't cause any serious trouble.
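
Just to rule out duplicates in the data itself, here's a quick check I could run before the import loop (a rough sketch in plain PHP, reusing $json from the code above):

// Count how often each customer code occurs in the import file
$codes = array();
foreach ($json->ttEntityDelAddrLink as $client) {
	$codes[] = $client->CustomerCode;
}
// Anything dumped here appears more than once in the data set
var_dump(array_filter(array_count_values($codes), function ($n) { return $n > 1; }));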


1 hour ago, jrtderonde said:

Ok, but that's what I'm trying to prevent with the hash as $page->name. I'm using the customer's ID combined with a timestamp to generate an MD5 hash that should be "unique". (See the code; I'm pretty sure it should work.)

What @BitPoet said. In other words, PHP is fast and MySQL is fast, so there is no guarantee that you will have only one page created per second (your date), or even per microsecond! That leaves uniqueness dependent only on the customer ID ($client->CustomerCode). If you can guarantee that customer IDs are unique and that each appears only once per data set (your JSON array), then you'd be fine. The fact that you're getting a SQL error tells us that either your customer IDs are not unique and/or one customer ID appears more than once in your data set (i.e., it may be unique to the customer, but not within the data set). If the database says so, then it must be so ;). Let me illustrate (non-unique customer ID and/or repeated customer ID):

$clientCustomerCode = 12345; // replicating repeated use of a customer ID
$p = $pages->get(1722);

for ($i = 0; $i < 30; $i++) {
	// Create new page
	$new = new Page();
	// Create "unique" hash {no guarantee this will be unique if the same customer ID is used}
	$unique = md5(date("Y-m-d H:i:s") . "-" . $clientCustomerCode);
	// even these (microseconds) do not guarantee uniqueness throughout
	#$unique = round(microtime(true) * 1000) . "-" . $clientCustomerCode;
	#$unique = microtime(true) . "-" . $clientCustomerCode;
	#$unique = microtime() . "-" . $clientCustomerCode;

	echo $i . ': ' . $unique . '<br>'; // display 'uniqueness'; you should see duplicates here

	// Set some variables for the new page
	//$new->setOutputFormatting(false); // not needed for a new page (I think)
	$new->template = "basic-page";
	$new->parent = $p;
	// Set the title and name
	$new->title = "Title " . rand(5, 15); // just for testing
	$new->name = $unique; // not really unique!
	// Save the page
	#$new->save(); // you'll get a PDO error here within a second
}

Using the above code, you'll be hit by the duplicate entry violation error within a second...
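
If you want a name that cannot collide, stop depending on the clock altogether. A rough sketch of one way to do it (untested; it assumes $sanitizer is available in your hook's scope and reuses the field names from your code): combine the customer code with a per-run counter, plus uniqid() to separate repeated imports:

$i = 0;
foreach ($json->ttEntityDelAddrLink as $client) {
	// Create new page
	$new = new Page();
	$new->template = "_client";
	$new->parent = $p;
	// The counter makes the name unique within this import even if the same
	// CustomerCode occurs twice; uniqid() separates repeated import runs
	$raw = $client->CustomerCode . "-" . (++$i) . "-" . uniqid();
	$new->name = $sanitizer->pageName($raw, true);
	$new->title = $client->DelAddressName;
	$new->save();
}

That way uniqueness no longer depends on how fast PHP and MySQL happen to be.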

Edited by kongondo
