Unable to connect to Azure SQL DW from Databricks using a JDBC connection.

Category: azure sql data warehouse

Question

PriyaJha on Thu, 18 Apr 2019 12:36:05


Hi folks,

I am trying to connect to Azure SQL Data Warehouse from Databricks using a JDBC connection; the login uses SQL Server Authentication.

I am receiving the following error:

Error in SQL statement: SqlDWSideException: SQL DW failed to execute the JDBC query produced by the connector. Underlying SQLException(s): - com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'XXX'. ClientConnectionId:XXXX [ErrorCode = 18456] [SQLState = S0001]

Kindly post suggestions for the above issue.

Regards,

Priya Jha

Replies

Mike Ubezzi (Azure) on Fri, 19 Apr 2019 00:36:47


Hi Priya,

Please post your entire connection string minus any sensitive information such as password. 

In the Azure Portal for your database, there is a Connection Strings blade that details the correctly formatted connection string for the SQL Admin. You just need to add the password. The blade also links to the correct JDBC driver.

Please try to connect to your database as the SQL Admin with the information available in the portal; this establishes a baseline. Please also ensure you have added the client/app public IP address to the allow list in the firewall for the Azure SQL Server instance where your DWH is deployed.
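To compare against the baseline, it can help to reassemble the connection string programmatically in the same shape the portal's Connection Strings blade shows. The sketch below is illustrative only (the helper name and the placeholder server/database/user values are assumptions reusing this thread's redacted names, not real resources):

```python
# Minimal sketch (not Microsoft tooling): assemble an Azure SQL JDBC URL in
# the portal's format so it can be diffed against the one Databricks uses.
# All identifiers below are placeholders from this thread.

def build_jdbc_url(server: str, database: str, user: str, password: str) -> str:
    """Return a portal-style JDBC URL for Azure SQL."""
    props = {
        "database": database,
        "user": f"{user}@{server}",  # Azure SQL logins use user@server
        "password": password,
        "encrypt": "true",
        "trustServerCertificate": "false",
        "hostNameInCertificate": "*.database.windows.net",
        "loginTimeout": "30",
    }
    prop_str = ";".join(f"{k}={v}" for k, v in props.items())
    return f"jdbc:sqlserver://{server}.database.windows.net:1433;{prop_str}"

url = build_jdbc_url("XYZ", "ABC", "XXX", "XXXXX")
print(url)
```

A character-by-character diff between this output and the URL configured in Databricks often exposes a missing `@server` suffix on the user name or a dropped property separator.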

PriyaJha on Fri, 19 Apr 2019 06:51:57


Hi Mike,

Thanks for your response.

I used the JDBC connection string from the Azure portal. Please find the query that I am using to connect to ADW through Azure Databricks:

CREATE TABLE ReadDataTypeofWork
USING com.databricks.spark.sqldw
OPTIONS (
  url 'jdbc:sqlserver://XYZ.database.windows.net:1433;database=ABC;user=XXX@XYZ;password=XXXXX;encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30',
  forwardSparkAzureStorageCredentials 'true',
  dbTable 'dbo.sample',
  tempDir 'wasbs://XXX@XXXXX.blob.core.windows.net/Test'
);

I also tried the other languages provided by Azure Databricks, like Python and Scala, but I am getting the same error.

I am able to connect to ADW through SSMS using the same details provided above.

Allow access to Azure services is also enabled in the Azure portal.

Regards,

Priya Jha

Mike Ubezzi (Azure) on Fri, 19 Apr 2019 21:48:09


Can you please connect to your Azure SQL Database with the connection string detailed in the Azure Portal? This verifies that you can connect to the SQL Database with the credentials configured in Databricks. You would perform this test outside of Databricks by setting up a basic Java client and passing it the connection string found in the Azure Portal.
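A lighter-weight stand-in for that baseline client (a hypothetical helper, not part of any Azure SDK; the thread also used Python) is to first check raw TCP reachability of port 1433. This separates firewall/allow-list failures from credential failures: error 18456 means the server was reached and rejected the login, whereas a connection timeout means the firewall never let you in.

```python
# Rough baseline sketch: can we even reach the server's SQL endpoint?
# The hostname below is a placeholder from this thread, not a real server.
import socket

def port_reachable(host: str, port: int = 1433, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # DNS failure, refused, or timed out
        return False

# False here points at the firewall/allow list; True combined with
# error 18456 points at the credentials instead.
reachable = port_reachable("XYZ.database.windows.net")
```

This only proves network reachability; an actual login test still needs a driver (the Java client above, or SSMS as described below).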

If you are unable to perform this test, you should instead be able to log in to your Azure SQL DWH instance with SSMS using the credentials configured in Databricks. Again, this is just a test to isolate the issue. Can the user log in?

If the user can login, can the user browse to the `dbo.sample` table?

Also, you need to ensure that `Allow access to Azure services` is enabled in the firewall settings for the server.

I suspect that your connection string is breaking at this point:

  forwardSparkAzureStorageCredentials 'true',
  dbTable 'dbo.sample',
  tempDir 'wasbs://XXX@XXXXX.blob.core.windows.net/Test'

SQL DW to Azure Blob Storage

I don't see any reference to these specific parameters, but I suspect that you might just need to add `=` to your connection string for these values. Also, what is the default database (ABC?).

CREATE TABLE ReadDataTypeofWork
USING com.databricks.spark.sqldw
OPTIONS (
  url 'jdbc:sqlserver://XYZ.database.windows.net:1433;
    database=ABC;
    user=XXX@XYZ;
    password=XXXXX;
    encrypt=true;
    trustServerCertificate=false;
    hostNameInCertificate=*.database.windows.net;
    loginTimeout=30;
    forwardSparkAzureStorageCredentials=true;
    dbTable=dbo.sample;
    tempDir=wasbs://XXX@XXXXX.blob.core.windows.net/Test'
);
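One quick way to sanity-check the `key=value` point is to parse the semicolon-delimited property section of the URL and flag any segment that lacks an `=`. This is an illustrative sketch only (not part of the JDBC driver); it reuses the placeholder values from this thread:

```python
# Illustrative sketch: split a jdbc:sqlserver URL's property section and
# flag any segment that is not a well-formed key=value pair.
url = ("jdbc:sqlserver://XYZ.database.windows.net:1433;"
       "database=ABC;user=XXX@XYZ;password=XXXXX;encrypt=true;"
       "trustServerCertificate=false;"
       "hostNameInCertificate=*.database.windows.net;loginTimeout=30")

head, _, prop_str = url.partition(";")
props, malformed = {}, []
for segment in filter(None, prop_str.split(";")):
    key, sep, value = segment.partition("=")
    if sep:
        props[key] = value
    else:
        malformed.append(segment)

print(props)      # parsed key=value properties
print(malformed)  # segments missing '=' -> empty for a well-formed URL
```

Running this over the exact string configured in Databricks would immediately show which property, if any, is missing its `=`.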

Databricks Azure SQL Documentation: Connecting to SQL Databases using JDBC

I hope I have provided some steps to isolate and identify the root cause. Thanks, ~Mike