Question

JasonDWilson77 on Fri, 17 Feb 2017 19:48:24


I am supporting a new hosted/SaaS web application where each customer has their own database. I have a very large SQL Server AlwaysOn cluster that spans multiple data centers and hosts the DBs for all the different customers (several hundred databases).

Our sales folks now want the customers to have the ability to connect to their DB (read-only) for custom reporting, integration with other systems, etc.

I want to be able to support this, but there is no way I am opening up the AlwaysOn servers for anyone in the world to connect to! I currently have the AlwaysOn servers set up behind multiple firewalls, on non-default ports, with only whitelisted IPs allowed to connect, etc.

What is the best way for me to support this new request without risking security or performance, given that all customers' DBs live on this AlwaysOn setup? Keep in mind that our customers are on different networks/domains, so I have no idea where they are coming from.

At this time I am not worried about real-time syncing, and I am fine if it is just a nightly push to somewhere for each DB (though the entire process must be fully automated). I want to support the request, but not at the cost of security or performance for everyone else.

Any help/guidance/ideas are greatly appreciated!


Replies

Hilary Cotter on Fri, 17 Feb 2017 19:57:45


Your post is rather hard to read.

If you are comfortable with your customers connecting to the primary replica, all you need to do is ensure that all logins exist on all replicas that belong to the customer's AG. That way they can connect to their listener for all read and write activity, and the same security they are currently restricted to on the primary replica will apply. Ensure that their connection strings are configured with the correct application intent.

If you do this, then as long as you are comfortable with the security on the primary, you should be equally comfortable on the secondary(ies).
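
As a rough sketch, something along these lines carries a SQL login over to a secondary with the same SID so the database user maps correctly (the login name, password, SID value, and listener details here are placeholders, not your actual setup):

    -- On the primary, look up the login's SID:
    SELECT name, sid
    FROM sys.server_principals
    WHERE name = N'CustomerLogin';

    -- On each secondary, recreate the login with that same SID:
    CREATE LOGIN [CustomerLogin]
        WITH PASSWORD = N'StrongPasswordHere',
             SID = 0x23A1B2C3D4E5F60718293A4B5C6D7E8F;  -- example only; use the value returned above

    -- Client connection string (assuming read-only routing is configured on the AG):
    -- Server=tcp:AgListener,1433;Database=CustomerDb;ApplicationIntent=ReadOnly;MultiSubnetFailover=True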

JasonDWilson77 on Fri, 17 Feb 2017 20:16:12


I apologize. I have tried to reformat and clarify things a bit. Since my end customers come from different external networks that we have no knowledge of or control over, I don't want to give them direct access to the replicas. I am almost thinking of just FTPing them a backup of their DB that we replace nightly, but I am hoping there is a better answer out there that doesn't risk security.

Hilary Cotter on Fri, 17 Feb 2017 20:47:00


I have had to do this before. What we did was back up the database and then put it on an FTP site for the customers to pick up.
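
A minimal sketch of the nightly backup we staged for pickup; the database name and share path are placeholders, and a SQL Agent job drove it on a schedule:

    BACKUP DATABASE [CustomerDb]
        TO DISK = N'\\ftpstage\backups\CustomerDb.bak'
        WITH COPY_ONLY,    -- don't disturb the regular backup chain
             COMPRESSION,  -- smaller file to move over the wire
             INIT,         -- overwrite last night's file
             CHECKSUM;     -- verify page checksums while backing up

On SQL Server 2014 and later you could also add backup encryption so the file is protected while it sits on the FTP site.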

Another option might be to use Azure and host the databases there. The customers would then go to the Azure DB, and you could use Azure to compartmentalize what each of them can access.

JasonDWilson77 on Mon, 20 Feb 2017 14:52:49


Thanks. I have never used Azure beyond playing around with it on my MSDN account. Is there a trial version or a getting-started tutorial that you know of that I could use to see if my customers want to try that route? As always, it comes down to price, and I have no idea where to even start in giving them an estimate of the costs so they can decide whether they want to explore it.

Ekrem Önsoy on Mon, 20 Feb 2017 15:12:52


Hello Jason,

How about log shipping? If you can locate a machine somewhere, such as in Azure, and feed (log ship) the databases with the transaction log backups you have already been taking from your AlwaysOn AG databases, then your customers can execute queries against that server. Log shipping is fairly flexible, easy to manage, and does not jeopardize the production environment.
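
To seed the reporting copy, something like this on the remote server would do it (the database name, logical file names, and paths are placeholders); the log backups you already take can then be applied on top:

    RESTORE DATABASE [CustomerDb]
        FROM DISK = N'D:\ship\CustomerDb_full.bak'
        WITH NORECOVERY,  -- stay in restoring state so log backups can be applied
             MOVE N'CustomerDb_Data' TO N'D:\data\CustomerDb.mdf',
             MOVE N'CustomerDb_Log'  TO N'D:\log\CustomerDb.ldf';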

I saw your reply to Hilary about Azure. Azure has a trial period, so you can give it a try; I'm not sure if the trial credit would be enough for you to test the setup. Fundamentally, you'll need to set up a site-to-site VPN connection between Azure and your on-premises network and ship the logs to an Azure VM that way.

Adding another replica as a reporting server probably won't be a good idea, because you never know how hard your customers will hammer that server; and if you locate it in Azure or some other cloud, it could be disconnected because of maintenance or problems, which could threaten your production environment.

Another, totally different option could be disk cloning. If you have the adequate disk infrastructure, you could talk to your SAN guys and have them clone the database disks. You could then write some scripts, coordinated with the SAN guys, to bring the cloned databases up daily.
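
Once the SAN guys present the cloned disks to the reporting server, the script side could be as simple as attaching the files and flipping the copy to read-only (the paths and names here are just examples):

    CREATE DATABASE [CustomerDb_Report]
        ON (FILENAME = N'E:\clone\CustomerDb.mdf'),
           (FILENAME = N'E:\clone\CustomerDb_log.ldf')
        FOR ATTACH;

    -- Keep the clone read-only for reporting
    ALTER DATABASE [CustomerDb_Report] SET READ_ONLY;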

JasonDWilson77 on Mon, 20 Feb 2017 16:05:42


Thanks, Ekrem! Log shipping to Azure is what I was thinking of exploring when I asked about it. My hang-up is that I don't know where to start with giving the sales folks a cost number so they can price it for our customers and see if it is even worth exploring.

Ekrem Önsoy on Mon, 20 Feb 2017 16:13:11


I set this up for a client of mine as an off-site backup option. You can configure it to apply (restore) the transaction log files only once a day, so the end users will not be disconnected while executing their reports during the day.
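
The daily apply is just a STANDBY restore, roughly like this (names and paths are placeholders):

    -- STANDBY leaves the database readable between restores; users are only
    -- disconnected for the duration of the restore itself.
    RESTORE LOG [CustomerDb]
        FROM DISK = N'D:\ship\CustomerDb_20170220.trn'
        WITH STANDBY = N'D:\ship\CustomerDb_undo.dat';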

Also, it's really easy to size up the VM when the configured one runs short on resources.