
CSV import time out

edited April 2013 in Troubleshooting

Hi there,

We have an Nginx + Apache setup for Sendy, done the usual way: Nginx acts as a proxy and Apache as the backend. Everything works perfectly, but I have problems uploading big files. I've increased the timeouts on both and tried every combination... and sometimes I get this error:

PHP Fatal error: Maximum execution time of 300 seconds exceeded in /usr/local/src/sendy/includes/subscribers/import-update.php on line 124
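That fatal error comes from PHP's own execution-time cap, which is separate from the Nginx and Apache timeouts. One thing worth checking is raising it in php.ini; the value below is only an example, not a recommendation:

```ini
; php.ini — cap on how long a single PHP request may run.
; 300 matches the error above; raising it gives a long import more headroom.
max_execution_time = 900
```

Note that a script can also override this per-request with `set_time_limit()`, so the effective limit may not match php.ini.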

I don't know how to proceed. Does anybody have a solution or advice?

Thanks, cheers

Comments

  • Hi @rentalia,

    Did you set up a cron job for the CSV import?

    Ben

  • Yes, I did, and it works, but it doesn't insert all the records.

  • What's your cron job interval?

  • Every 5 minutes.

  • "Each 5 min."

    Wait around 5 minutes; the import will resume itself when the cron job kicks in 5 minutes after the timeout.
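For reference, a 5-minute import cron entry would look roughly like this. The install path is taken from the error log above, but the script name and exact invocation are assumptions; check Sendy's own setup instructions for the correct command:

```
# crontab -e — run Sendy's CSV import worker every 5 minutes
# (script name below is an assumption; use the one from Sendy's docs)
*/5 * * * * php /usr/local/src/sendy/import-csv.php > /dev/null 2>&1
```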

  • Yes, I know, but it doesn't work... while we've been talking about this, the list hasn't grown at all... so something went wrong.

  • Can you check your error log for any 'fatal errors'? Is the 'Maximum execution time of 300 seconds exceeded' error the only error?

    How many records are you trying to import and what is the CPU and memory of your server?

  • Yes, that's the only error I got on the backend... I'm trying to import 300,000 records. The server has 12GB of RAM with the latest MySQL version.

  • You have 12GB of RAM, but the memory_limit allocated for PHP's usage is only 512M. Try increasing it to a larger number in php.ini and utilize more of your RAM.
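As a sketch of that change: the 512M figure is the value mentioned above, and the new value is only illustrative; pick one appropriate to what else the machine runs, and restart Apache/PHP afterwards so it takes effect:

```ini
; php.ini — memory a single PHP process may allocate.
; Was 512M; raised here as an example to use more of the available RAM.
memory_limit = 2048M
```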

  • Hi @Ben,

    Sorry, I thought you were asking about the database server; the web host itself only has 1GB... I could try increasing the memory_limit...

    Thanks.

  • Besides, scheduled.php does too many things, doesn't it? I noticed it takes a lot of memory and makes the VM swap.

  • I'm sorry, do you mean the scheduled.php script does too many things? There are a lot of features users require; I could make it do fewer things, but that wouldn't make people happy. Also, sending multi-threaded newsletters in bulk requires more memory than a typical application like WordPress, for example.

    Having said all that, I do try to optimize things where I can. In the last test I did, a server with only 613MB of memory sent out 13,000 newsletters in 11 minutes.

    Thanks.

    Ben

  • Hi @Ben,

    Don't apologize... it's OK. I'll try increasing the PHP memory limit and let you know how it goes. It's just so ugly to receive a 504 from Nginx... :P
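On the 504 specifically: that status usually means Nginx gave up waiting on the Apache backend, which is governed by its proxy timeouts rather than PHP's limits. A fragment like the following is one place to look; the backend address and values are assumptions for illustration:

```nginx
# nginx server block fragment — give the Apache backend more time
# to finish a long import before Nginx returns a 504.
location / {
    proxy_pass http://127.0.0.1:8080;   # assumed backend address
    proxy_read_timeout 600s;            # wait this long for a response
    proxy_send_timeout 600s;
    client_max_body_size 64m;           # allow large CSV uploads
}
```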

    Thanks for everything, cheers.

  • Thanks rentalia. You can PM me if you'd like to let me know if it works better.

    Cheers.

This discussion has been closed.