How to export all user emails to a CSV file?


Vineet Sawant

Hello,

I have a website with over 40k registered users who sign up with their email addresses. We run a ticket booking service, and we want to alert users about certain functionality changes on the site.

Whenever I try to print all the emails using foreach, the page goes blank, due to a memory issue I suppose (it takes ages to respond).

Is there any way, or any module, to export all the emails to a single CSV file?

Thanks.


I don't think there's a module for this, but it's fairly easy to work around timeouts. You'd build a PHP file that fetches a reasonable number of users (1,000 or so) using ProcessWire's pagination feature. The results are then appended to a file somewhere in the system, and the PHP returns the next page number, ideally as a simple JSON string. Then you'd write some JS that calls the same page via AJAX and iterates as soon as each response comes back. After 40 calls you'd have your full CSV.
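The PHP side of that could be sketched roughly like this (assuming it runs inside a ProcessWire template file; the batch size, file path, and `start` parameter name are just examples):

```php
<?php
// Batch endpoint: exports one page of users per request, called repeatedly via AJAX.
$batchSize = 1000;
$start = (int) $input->get->start; // offset passed in by the JS caller, 0 on first call

$users = $pages->find("template=user, start=$start, limit=$batchSize");

// Append this batch's emails to the CSV (one column per line)
$fp = fopen($config->paths->assets . 'emails.csv', 'a');
foreach ($users as $u) {
    fputcsv($fp, array($u->email));
}
fclose($fp);

// Tell the JS whether to keep going, and where to continue
header('Content-Type: application/json');
echo json_encode(array(
    'done' => count($users) < $batchSize,
    'next' => $start + $batchSize,
));
```

The JS then just keeps requesting the page with `?start=next` from each response until `done` is true.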

Another way would be to use some SQL to get all the user IDs and select the emails directly from the email field's table.
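Something along these lines, assuming a default setup where ProcessWire stores each field in its own `field_<name>` table with the value in the `data` column:

```sql
-- All user emails, straight from the database.
-- Assumes the default 'user' template and an email field named 'email'.
SELECT p.id, f.data AS email
FROM pages p
JOIN field_email f ON f.pages_id = p.id
WHERE p.templates_id = (SELECT id FROM templates WHERE name = 'user');
```

You could dump the result to CSV from whatever database client you use (phpMyAdmin, the mysql CLI, etc.).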

  • Like 1

Another option is to call $pages->uncacheAll() at a regular interval. This helps you avoid memory issues in most cases, and allows you to run the whole query at once. For a rough example, see how it's done in VersionControl.module.

If you do take this route, make sure that your maximum execution time isn't an issue here either. If your script takes a long time to run, you'll usually want to increase the max execution time using set_time_limit() first. The default value is typically 30 seconds, and a script running longer than that gets killed.
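A rough sketch of those two points together (the interval of 500 is an arbitrary example):

```php
<?php
set_time_limit(0); // lift the 30-second limit for this long-running export

$i = 0;
$fp = fopen('emails.csv', 'w');
foreach (wire('pages')->find("template=user") as $u) {
    fputcsv($fp, array($u->email));
    if (++$i % 500 === 0) wire('pages')->uncacheAll(); // free memory every 500 pages
}
fclose($fp);
```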

First things first, though: you mentioned that the page goes blank, supposedly because of a memory issue. If you're not 100% sure of the reason, the very first thing you should do is check why it actually happens. Do the same thing in a test environment with $config->debug = true, or check the error logs on the live site. Assuming is never a good strategy :)

  • Like 2

This worked really well for me on a site with thousands of pages:

$selector = "template=pages_template"; // as an example
$id = 0; // start below the lowest possible page id

while (1) {
    $p = wire('pages')->get("{$selector}, id>$id, sort=id"); // get the page with the next id above the previous one
    if(!$id = $p->id) break; // store the current page's id, or break the loop if no page was found

    // do stuff using $p as the current page

    wire('pages')->uncacheAll();
}

Like this you never hold an array in memory, only one page at a time, and you can do the whole operation in one go. Of course you'd have to write to the CSV file constantly, but you can simply open the file once, write on each iteration, and close it at the end, so I don't think that would be a problem.

Edit: Ideally you would do this from the terminal by bootstrapping PW, to avoid overloading Apache.
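A bootstrapped version of the loop above might look like this (the path to index.php is an example and depends on where ProcessWire is installed):

```php
<?php
// export-emails.php -- run from the terminal as: php export-emails.php
include '/path/to/processwire/index.php'; // bootstraps the PW API, making wire() available

$fp = fopen('emails.csv', 'w');
$id = 0;
while (1) {
    $p = wire('pages')->get("template=user, id>$id, sort=id");
    if (!$id = $p->id) break;
    fputcsv($fp, array($p->email));
    wire('pages')->uncacheAll(); // keep memory usage flat
}
fclose($fp);
```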

  • Like 3

The code in the post above isn't foolproof: if the process is interrupted for any reason, you'd have to start over. Ideally, on each iteration you would read the last line of the file, identify the last exported user, find the next user, and write the next line.
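One way to sketch that resumable variant, assuming the page id is stored as the first CSV column so an interrupted run knows where it stopped:

```php
<?php
// Resumable export: on startup, read the last line of the CSV (if any)
// to recover the last exported id, then continue from there.
$file = 'emails.csv';
$id = 0;
if (is_file($file)) {
    $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if ($lines) {
        $last = str_getcsv(end($lines));
        $id = (int) $last[0]; // id column of the last written row
    }
}

$fp = fopen($file, 'a'); // append, so existing rows are kept
while (1) {
    $p = wire('pages')->get("template=user, id>$id, sort=id");
    if (!$id = $p->id) break;
    fputcsv($fp, array($p->id, $p->email));
    wire('pages')->uncacheAll();
}
fclose($fp);
```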

  • Like 1

I ended up using limit with foreach & pagination.

I also found out that start=n in the selector breaks the pagination module: it stops working. Pagination renders the page numbers, but after clicking a page (e.g. 3) it still shows data from page 1. I guess that happens because I'm using start=n; when I removed it, pagination started working just fine.

