Comments
@solamathew Saw your post about crons at 1 minute increments. I also have shared hosting, but I'm not allowed to run every 1 min either. What I can do, though, is set up duplicate crons for autoresponders.php at the following intervals: 0/30, 1/31, 2/32, 3/33, ... all the way up to 29/59. So that's 30 cron jobs set up, and this worked for me. Hope it helps.
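In crontab terms each pair looks something like this (the Sendy path is a placeholder; each entry fires twice an hour, so the 30 entries together cover all 60 minutes):

0,30 * * * * php /path/to/sendy/autoresponders.php > /dev/null 2>&1
1,31 * * * * php /path/to/sendy/autoresponders.php > /dev/null 2>&1
2,32 * * * * php /path/to/sendy/autoresponders.php > /dev/null 2>&1
...
29,59 * * * * php /path/to/sendy/autoresponders.php > /dev/null 2>&1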
@chrisatisb Curious to know how you've set this up? I'm a fan of Synology products and would like to hear any details on your config?
I'd been using webfaction (shared hosting based on centos), but decided to try something new. I got it up and running via Laravel Forge on a Digital Ocean droplet (a small one, about 10 USD / month) running mysql 8. Forge is nice because you just commit to git and it deploys.
Forge assumes nginx, so you need to tweak the nginx config of the forge "site" to make it work (someone shared a variant of this in a gist):
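Roughly, the Sendy-specific pieces mirror Sendy's .htaccess rewrites and look something like this (a sketch only; the fastcgi settings and other boilerplate come from whatever Forge generates):

# inside the server { } block Forge creates for the site
location / {
    # let extensionless Sendy URLs fall through to their .php equivalents
    try_files $uri $uri/ $uri.php?$args;
}
# tracking / subscribe endpoints
location /l/ { rewrite ^/l/([a-zA-Z0-9/]+)$ /l.php?i=$1 last; }
location /t/ { rewrite ^/t/([a-zA-Z0-9/]+)$ /t.php?i=$1 last; }
location /w/ { rewrite ^/w/([a-zA-Z0-9/]+)$ /w.php?i=$1 last; }
location /unsubscribe/ { rewrite ^/unsubscribe/(.*)$ /unsubscribe.php?i=$1 last; }
location /subscribe/ { rewrite ^/subscribe/(.*)$ /subscribe.php?i=$1 last; }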
Don't just copy-paste that; instead, you need to go line by line and "weave" it into whatever forge sets up for you. Also, let forge install the letsencrypt cert by itself; it will write the "forge ssl" section on its own.
Mysql 8 needed sql_mode set, but the setting cited in this forum did not work for v 8. Adding this line to /etc/mysql/mysql.conf.d worked, and I restarted mysql via the forge dash:
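For example, something along these lines in the [mysqld] section (the file name under that directory and the exact mode value are assumptions and may differ):

# /etc/mysql/mysql.conf.d/mysqld.cnf (assumed file name)
[mysqld]
# MySQL 8 enables stricter sql_mode defaults; relax them for Sendy
sql_mode = "NO_ENGINE_SUBSTITUTION"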
I tried Digital Ocean's managed databases and eventually got it working. It also defaults to mysql v8, but since it's a cluster, they require primary keys. That means, if you restore your sendy db into a DO managed db, you need to do a SET SESSION to turn that requirement off in your sql dump file before restoring it. See the SET SESSION line:

-- phpMyAdmin SQL Dump
-- version 4.9.4
-- https://www.phpmyadmin.net/
--
-- Host: localhost
-- Generation Time: Jun 14, 2020 at 10:37 AM
-- Server version: 5.6.48-88.0
-- PHP Version: 7.2.31
SET SESSION sql_require_primary_key = 0;
SET SQL_MODE = "NO_AUTO_VALUE_ON_ZERO";
SET AUTOCOMMIT = 0;
START TRANSACTION;
SET time_zone = "+00:00";
...etc...

I gave up on using a DO managed database and just decided to set up a cron job backup. I'm using gobackup, an easy-to-use golang program to back up to local storage and AWS S3. It needs a yml config, which looks like:
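A minimal sketch of the shape, with one local model and one S3 model (names, paths, bucket and credentials are placeholders; the exact keys depend on the gobackup version, so check its docs):

models:
  sendy_local:
    databases:
      sendy:
        type: mysql
        host: localhost
        port: 3306
        database: sendy
        username: sendy
        password: CHANGE_ME
    archive:
      includes:
        # extra config files worth keeping alongside the mysql dump
        - /etc/nginx/sites-available
        - /home/forge/sendy/includes/config.php
    compress_with:
      type: tgz
    store_with:
      type: local
      path: /home/forge/backups
  sendy_s3:
    databases:
      sendy:
        type: mysql
        host: localhost
        database: sendy
        username: sendy
        password: CHANGE_ME
    compress_with:
      type: tgz
    encrypt_with:
      type: openssl
      password: CHANGE_ME   # needed again when decrypting
    store_with:
      type: s3
      bucket: my-sendy-backups
      region: us-east-1
      path: backups
      access_key_id: CHANGE_ME
      secret_access_key: CHANGE_ME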
That sets up two targets - one local and one on s3. It includes some config files you might want to back up, besides the mysql dump.
You should make an IAM credential (programmatic is fine) and assign a policy that only allows the needed access to the bucket. Put its credentials in the yaml. Then you can run the backup via gobackup perform or make a crontab entry:
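For example (the binary path and schedule are placeholders):

# run all configured gobackup models nightly at 03:00
0 3 * * * /usr/local/bin/gobackup perform >> /var/log/gobackup.log 2>&1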
To decrypt the backup files:
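Roughly, assuming gobackup's OpenSSL (aes-256-cbc) encryption and a placeholder archive name:

# decrypt the downloaded archive, then unpack it
openssl aes-256-cbc -d -in sendy_s3.tar.gz.enc -out sendy_s3.tar.gz
tar xzf sendy_s3.tar.gz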
The openssl line will prompt for the encryption password you set in the yml. Hope this helps someone.
At Sendybay, where we provide hosting for Sendy, we recently launched Drag & Drop Email Builder Integration. It's tightly integrated with Sendy, plus you get 240 free templates. Do check us out. We believe we'll save you a lot of time and effort, and our added value is the cherry on top!
Is anyone here running Sendy on namecheap shared hosting?
I've used Sendy on the most expensive GoDaddy shared hosting before with pretty large lists (100K+) and it was not a great experience. I would get nowhere near my max send rate, which is hundreds per second; at best I could send about 5 per second.
@motionsquared With such large volume and large send rate, it's highly recommended to use a VPS server with adequate server CPU and memory resources.
Ben, can you define "adequate CPU and memory resources"? What is optimal for running large lists on Sendy?
To piggy-back onto my last post, I'm about to upgrade my servers. Can someone look at my existing specs and give me an idea of what specs I should upgrade to? These are not shared servers, but I do host about 50 websites on them for clients. Thanks in advance!
The web server:
16 vCPU cores
30GB RAM
50GB SSD storage
200GB SATA storage
The database server:
8 vCPU cores
15GB RAM
50GB SSD storage
75GB SATA storage
+1 for Digital Ocean. Smooth setup and reliable servers. Have another instance running on AWS Lightsail with Bitnami. Can't say I'm in love with it, but might be my lack of sysop skills. If you offload your images to S3 or use Cloudfront, you'd save your server a lot of peak traffic when you send out your mails.
AWS EC2 micro running Ubuntu 18.04 LTS
@CissyH: To give you an idea:
I hope this helps.
SiteGround. Tech support is the best I've seen, and I've had probably a half dozen different hosts over the years. I run a bunch of wordpress, host a dozen sites, have some other apps installed too, and when I blow something up in cpanel, they're so good at fixing it for me.
@ondrique I use 1 GB RAM, 1 vCPU, 40 GB SSD (and an external Amazon RDS database). This is a bespoke server for Sendy. My 15,000 mailing list takes an hour to send. CPU load only goes up to about 10%. I don't know whether there's benefit in running 2GB rather than 1GB RAM - my suspicion is that the bottleneck is the RDS server.
@jamescridland 10,000 emails should take less than 10 minutes to send. Yes, the bottleneck is RDS.
I highly recommend not using RDS with Sendy.
Users who use RDS experience slower sending speeds and unforeseen issues. RDS is good for 'storing' data but not optimized for quick real time data processing like what Sendy needs.
Here's some feedback from one user on RDS:
I suggest using a database hosted on the same instance that is hosting Sendy.
I would like to recommend AWS Lightsail ($10 monthly) with OPENLitePanel for quick deployment. It's the best solution.
Ref -
OPENLitePanel - https://openlitepanel.com
AWS - https://lightsail.aws.amazon.com/
DigitalOcean (CentOS7) and using Virtualmin for Administration.
It's an interesting combination. I created a subdomain from an existing domain name and did not have an installation problem.
DigitalOcean with Ubuntu 18.04.4 LTS.
I use UpCloud for Sendy mail
Just moved my Sendy to an Amazon EC2 t2.medium server and am very happy. I am able to send 70K emails in 50 minutes.
Used both a 1GB CentOS Linode with pure CLI and a 2GB Ubuntu 20.04 lightsail instance on AWS with VirtualMin as a control panel... professional solutions if you can work with the CLI and know how to secure a server. Love it.
Hello,
I'm using a VPS which is ULTRA CHEAP, BUT you need to know that its parameters are great for Sendy but won't be great for, let's say, a Minecraft server or TeamSpeak. It costs about $16.25/YEAR (yeah!) and support is via a Facebook group. It's more of an educational Linux server thing, but for Sendy it works nicely.
Link: http://bit.ly/mikrusvelkan
You're welcome.
Ah yes, the parameters are:
OS:
Ubuntu 20
Centos 8
Debian 10
Arch 2020
IP: only IPv6
Using all-inkl.com shared webspace to send to a list of 52k subscribers daily. Sending usually takes 2-3 hours.
Hello Sendy, Amazon AWS is giving me only 200 mails per day. What should I do to get 50,000 mails per day?
AWS EC2 t3.micro with CentOS 7. I share the instance with a plain HTML5 website and a Vtiger CRM. Works fine.
Been on namecheap.com for many, many years with versions 2.x and 3.x. I just updated to 5.2, and since functions.php is encrypted, they're telling me I cannot use it, or that I need to upload an unencrypted version of the file.
@artrain You'd need to request Amazon to raise your daily sending limits to the number you need by filling out this form → http://aws.amazon.com/ses/extendedaccessrequest. This is covered in Step 5 of the installation guide → https://sendy.co/get-started
@jan Sending speed depends on the type of server you use, VPS would have more performance compared to shared hosting. I suggest reading this guide to sending speed → https://sendy.co/troubleshooting#sending-speed. Thanks.
LiquidWeb Cloud Sites. Costs $150 a month, but if you want the flexibility of true cloud hosting without the management aspect, you can't beat the price.
@ben Have you thought about approaching the major VPS providers that offer preinstalled app packages (Vultr, Digital Ocean, Linode, Upcloud) and create a Sendy app that is kind of preconfigured (minus the license and Amazon SES keys of course)? It might be a huge help for those that are unfamiliar with scripts, MySQL, .htaccess, servers, etc.
For example, I use the Plesk app, and then use that to create domains/DBs to host Sendy and other sites:
https://www.vultr.com/features/one-click-apps/