AAD authentication with password throws java.lang.NoClassDefFoundError: com/microsoft/aad/adal4j/AuthenticationException #28
Comments
Same error in my case |
Syed, |
@beloblotskiy Thanks. How did your |
We're using Databricks "Copy Data" now, it supports it out of the box. |
we get the same exception |
@allenwux @jiayuehu @CarlRabeler @kpanwar @zeqicui A colleague of mine and I are also getting the same issue.

Cluster Information

Code Snippet

import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._
val config = Config(Map(
"url" -> "<sqlsvrname>.database.windows.net",
"databaseName" -> "<sqldbname>",
"dbTable" -> "dbo.Bitcoin",
"user" -> "<username>",
"password" -> "<password>",
"authentication" -> "ActiveDirectoryPassword",
"encrypt" -> "true",
"ServerCertificate" -> "false",
"hostNameInCertificate" -> "*.database.windows.net"
))
val collection = sqlContext.read.sqlDB(config)
collection.show()

Error Message |
Any updates on this? We are running into a similar issue, with pretty much the same config code as in the previous post #28 (comment). @beloblotskiy, can you share the config that worked with a service principal? I have tried with both username/password and a service principal, and I get the same error in either case. I have verified the SPN by connecting with SSMS, and likewise with Azure Data Factory (both work and let me preview data). Thanks in advance, |
We are getting this same error. The issue does not seem to be specific to azure-sqldb-spark. Is this an incompatibility between adal4j and mssql-jdbc? Is there a specific authentication exception causing this, or is it a catch-all problem for AAD auth exceptions? |
@Mike-Ubezzi-MSFT please assign to @VanMSFT |
Following up with @GithubMirek and @allenwux.
Referenced article: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-spark-connector |
We have a workaround. It is lengthy and a bit complex, but it is verified. Hopefully it will unblock some of you.

Root cause
For AAD auth to work correctly, some extra JARs are needed. In addition, since the Azure Databricks runtime already ships an older version of the MSSQL JDBC driver, we need to replace that with a newer version first.

High-level steps to fix
Conceptually, the steps are:
Updated repro script
Before I proceed, I'd also like to share my minimal repro script, as this has nothing to do with the Spark connector for SQL; it is JDBC/ADAL specific. Use the repro below to test:

import com.microsoft.sqlserver.jdbc.SQLServerDataSource
val ds = new SQLServerDataSource()
ds.setServerName("<sqlsvrname>.database.windows.net")
ds.setDatabaseName("<sqldbname>")
ds.setUser("<username>"); ds.setPassword("<password>")
ds.setAuthentication("ActiveDirectoryPassword")
val connection = ds.getConnection()
val rs = connection.createStatement().executeQuery("SELECT SUSER_SNAME()")

Detailed steps
Step 1 is to use an init script to delete the default MSSQL JDBC driver that ships with the Databricks runtime. I put the below into a shell script (call it deletemssqljdbc.sh):

rm /databricks/jars/*mssql*

deletemssqljdbc.sh is then copied to dbfs using the Databricks CLI:

dbfs cp ./deletemssqljdbc.sh dbfs:/

Step 2 is to copy the appropriate JARs from a local machine to the Databricks workspace. I create a new folder under DBFS to copy these to:

dbfs mkdirs dbfs:/FileStore/adaljars

then copy:

dbfs cp dbfs:/FileStore/adaljars/

Step 3 is to install these as libraries and attach them to the cluster. Important: these versions are specific to the 6.4.0 release of the MSSQL JDBC driver. For higher versions, the versions of these dependency JARs will most likely differ. As mentioned previously, follow the steps at the JDBC driver wiki if you intend to use a JDBC driver version other than 6.4.0, to ensure that the dependencies are correct.

databricks libraries install --cluster-id <cluster-id> --jar dbfs:/FileStore/adaljars/mssql-jdbc-6.4.0.jre8.jar

Restart the cluster, and your AAD authentication code will now work. |
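The detailed steps above can also be collapsed into a single cluster-scoped init script. This is a sketch, not a verified script: it assumes the 6.4.0 driver-dependency versions listed above, uses repo1.maven.org (a later comment notes that HTTP access to central.maven.org is deprecated), and the choice of /databricks/jars as the download target is an assumption.

```shell
#!/bin/bash
# Remove the MSSQL JDBC driver bundled with the Databricks runtime,
# so the newer driver installed as a cluster library takes precedence.
rm -f /databricks/jars/*mssql*

# Fetch the extra JARs that AAD (adal4j) authentication needs.
# Versions match the 6.4.0 JDBC driver; adjust for other driver versions.
JARS="com/microsoft/azure/adal4j/1.6.0/adal4j-1.6.0.jar
com/nimbusds/oauth2-oidc-sdk/5.24.1/oauth2-oidc-sdk-5.24.1.jar
net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar
com/nimbusds/nimbus-jose-jwt/7.0.1/nimbus-jose-jwt-7.0.1.jar"

for path in $JARS; do
  wget --quiet -P /databricks/jars/ "https://repo1.maven.org/maven2/$path"
done
```

You would still install the newer mssql-jdbc itself as a library, as in Step 3 above.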
Click here to download a working notebook.

Create a Databricks Cluster
Known working configuration - Databricks Runtime 5.2 (includes Apache Spark 2.4.0, Scala 2.11)

Install the Spark Connector for Microsoft Azure SQL Database and SQL Server
Known working version - com.microsoft.azure:azure-sqldb-spark:1.0.2

Update Variables
Update the variable values (clusterName, server, database, table, username, password).

Run the Initialisation command (ONCE ONLY)
This will do the following:
Bash Script Commands:
Dependencies:
Restart the Databricks Cluster
This is needed to execute the init script.

Run the last cell in this Notebook
This will test the ability to connect to an Azure SQL Database via Active Directory authentication.

Init Command

// Initialisation
// This code block only needs to be run once to create the init script for the cluster (file remains on restart)
// Get the cluster name
var clusterName = dbutils.widgets.get("cluster")
// Create dbfs:/databricks/init/ if it doesn’t exist.
dbutils.fs.mkdirs("dbfs:/databricks/init/")
// Create a directory named (clusterName) using Databricks File System - DBFS.
dbutils.fs.mkdirs(s"dbfs:/databricks/init/$clusterName/")
// Create the adal4j script.
dbutils.fs.put(s"/databricks/init/$clusterName/adal4j-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/adal4j-1.6.0.jar http://central.maven.org/maven2/com/microsoft/azure/adal4j/1.6.0/adal4j-1.6.0.jar
wget --quiet -O /mnt/jars/driver-daemon/adal4j-1.6.0.jar http://central.maven.org/maven2/com/microsoft/azure/adal4j/1.6.0/adal4j-1.6.0.jar""", true)
// Create the oauth2 script.
dbutils.fs.put(s"/databricks/init/$clusterName/oauth2-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/oauth2-oidc-sdk-5.24.1.jar http://central.maven.org/maven2/com/nimbusds/oauth2-oidc-sdk/5.24.1/oauth2-oidc-sdk-5.24.1.jar
wget --quiet -O /mnt/jars/driver-daemon/oauth2-oidc-sdk-5.24.1.jar http://central.maven.org/maven2/com/nimbusds/oauth2-oidc-sdk/5.24.1/oauth2-oidc-sdk-5.24.1.jar""", true)
// Create the json script.
dbutils.fs.put(s"/databricks/init/$clusterName/json-smart-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/json-smart-1.1.1.jar http://central.maven.org/maven2/net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar
wget --quiet -O /mnt/jars/driver-daemon/json-smart-1.1.1.jar http://central.maven.org/maven2/net/minidev/json-smart/1.1.1/json-smart-1.1.1.jar""", true)
// Create the jwt script.
dbutils.fs.put(s"/databricks/init/$clusterName/jwt-install.sh","""
#!/bin/bash
wget --quiet -O /mnt/driver-daemon/jars/nimbus-jose-jwt-7.0.1.jar http://central.maven.org/maven2/com/nimbusds/nimbus-jose-jwt/7.0.1/nimbus-jose-jwt-7.0.1.jar
wget --quiet -O /mnt/jars/driver-daemon/nimbus-jose-jwt-7.0.1.jar http://central.maven.org/maven2/com/nimbusds/nimbus-jose-jwt/7.0.1/nimbus-jose-jwt-7.0.1.jar""", true)
// Check that the cluster-specific init script exists.
display(dbutils.fs.ls(s"dbfs:/databricks/init/$clusterName/"))

Test Command

// Connect to Azure SQL Database via Active Directory Password Authentication
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._
// Get Widget Values
var server = dbutils.widgets.get("server")
var database = dbutils.widgets.get("database")
var table = dbutils.widgets.get("table")
var username = dbutils.widgets.get("user")
var password = dbutils.widgets.get("password")
val config = Config(Map(
"url" -> s"$server.database.windows.net",
"databaseName" -> s"$database",
"dbTable" -> s"$table",
"user" -> s"$username",
"password" -> s"$password",
"authentication" -> "ActiveDirectoryPassword",
"encrypt" -> "true",
"ServerCertificate" -> "false",
"hostNameInCertificate" -> "*.database.windows.net"
))
val collection = sqlContext.read.sqlDB(config)
collection.show() |
Getting this issue with the above code:

Connection reset ClientConnectionId:7e49a51f-fb35-48b9-88f3-ac6031f81e02 |
This guy pulled it off using a service principal. Just tried it and it works. |
How do we fix this? Also, is AAD Password Auth supported for SQL Server on an IaaS domain-joined VM? |
Due to recent maven repo changes- https://central.sonatype.org/articles/2019/Apr/30/http-access-to-repo1mavenorg-and-repomavenapacheorg-is-being-deprecated/ , the init script would need to be changed to use "https://repo1.maven.org/" instead of "http://central.maven.org". |
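One way to apply that host change to a local copy of the init scripts is a small sed rewrite. This is a sketch; the function name fix_maven_host is hypothetical, and on Databricks you would instead regenerate the scripts via dbutils.fs.put with the new host.

```shell
# Rewrite the deprecated Maven host in a downloaded init script in place.
fix_maven_host() {
  sed -i 's#http://central.maven.org#https://repo1.maven.org#g' "$1"
}
```

For example, `fix_maven_host adal4j-install.sh` updates the adal4j script from the notebook above; the other three scripts follow the same pattern.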
So I just got all this working, except that my org uses MFA and I get this error... I can use Service Principals for automation, but what about ad-hoc user queries? |
I'm having the same issue with a Python notebook. Is there an easy workaround that doesn't involve installing lots of jars by hand? |
Actually, using the same init script works with Python via JDBC. Unfortunately MFA isn't supported yet. Please provide feedback here
|
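The Python route mentioned above uses Spark's generic JDBC reader with the same connection properties as the Scala config earlier in the thread. Below is a sketch of just the option assembly; the helper name sql_jdbc_options is hypothetical, and the final .load() requires a Databricks cluster with the jars from the init script installed.

```python
def sql_jdbc_options(server, database, table, user, password):
    # Connection properties mirror the Scala config above; these are standard
    # mssql-jdbc property names, passed through Spark's JDBC source.
    url = f"jdbc:sqlserver://{server}.database.windows.net:1433;databaseName={database}"
    return {
        "url": url,
        "dbtable": table,
        "user": user,
        "password": password,
        "authentication": "ActiveDirectoryPassword",
        "encrypt": "true",
        "hostNameInCertificate": "*.database.windows.net",
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

# On a cluster: spark.read.format("jdbc").options(**sql_jdbc_options(...)).load()
```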
We were able to get this working in our development environment. Unfortunately, when we deploy the same code in a UAT environment, it fails with the error below. We triple-checked the credentials and confirmed all access levels are working as expected. We cannot figure out why we are getting this error. Any ideas?

An error occurred while calling o1606.save.
|
@arvindshmicrosoft Not sure what I am missing! Thanks in advance. My Databricks runtime version is 6.5 (includes Apache Spark 2.4.5, Scala 2.11).

%scala
val config = Config(Map(
  "user" -> "",
  "password" -> "",
  "authentication" -> "ActiveDirectoryPassword",
  "encrypt" -> "true",
  "hostNameInCertificate" -> "*.database.windows.net"
))
val collection = spark.read.sqlDB(config) |
I had missed pointing the cluster's init script at this script. Good catch! I think this works; however, I am now getting another error, which should be related to my permissions. By any chance, @aravish, do you have an idea about the exception below?

com.microsoft.sqlserver.jdbc.SQLServerException: Failed to authenticate the user emailid@domain.com in Active Directory (Authentication=ActiveDirectoryPassword).
Caused by: java.util.concurrent.ExecutionException: com.microsoft.aad.adal4j.AuthenticationException: Server returned error in RSTR - ErrorCode: FailedAuthentication : FaultMessage: MSIS7068: Access denied.

This is the same error I get when I try to log in using SSMS (ActiveDirectoryPassword), despite my email id being the Active Directory administrator for the data warehouse. I suspect being outside my company VPN is the issue, preventing authentication via the ActiveDirectoryPassword method. |
I am having the exact same issue. Any solution to this would be greatly appreciated. |
Hello,
Do you know if MFA is enabled in the AD Tenant? Could you please check with your Azure admin?
|
Kamalesh,
Unfortunately MFA isn’t supported yet as mentioned on the github bug previously.
Please provide feedback below
https://feedback.azure.com/
|
Hey @aravish - no, MFA is not enabled. We were able to get this working in the US East region under the same tenant/active directory. When we try in the UNC region, it fails. |
@ramziharb, |
@aravish - I meant US North Central region. |
@aravish Thanks, but shouldn't I still be able to log in using Active Directory password instead of MFA? Is it one or the other? Because in SSMS, if I am inside the company VPN, I can log in using just the password. Only when I am outside the company network do I get the same error as in Databricks. |
For anyone happening upon this thread, let me save you some trouble. The solution above of using the cluster init scripts to remove the jars is a correct approach. But then do yourself a favor, load the latest mssql driver and adal4j packages and specify the driver class in the connection properties. Works as expected :) |
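A sketch of that approach, using Spark's built-in JDBC source rather than this connector. It assumes a recent mssql-jdbc and adal4j are already installed as cluster libraries, it must run on a Spark/Databricks runtime (where spark is in scope), and the <...> placeholders and table name are illustrative.

```scala
import java.util.Properties

// Placeholder connection details; substitute your own values.
val jdbcUrl = "jdbc:sqlserver://<sqlsvrname>.database.windows.net:1433;databaseName=<sqldbname>"

val props = new Properties()
props.setProperty("user", "<username>")
props.setProperty("password", "<password>")
props.setProperty("authentication", "ActiveDirectoryPassword")
props.setProperty("encrypt", "true")
props.setProperty("hostNameInCertificate", "*.database.windows.net")
// Name the driver class explicitly, per the suggestion above.
props.setProperty("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

val df = spark.read.jdbc(jdbcUrl, "dbo.Bitcoin", props)
df.show()
```

Specifying the driver class in the connection properties makes Spark bind to the newly installed driver rather than any older copy still visible on the classpath.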
@Kamalesh54 is your Azure SQL connected to a vnet and you're getting prompts you don't expect? If so, even if it's connected to a vnet whose egress is exempted in your conditional access policy(ies) or is whitelisted at the MFA service, the SQL instance's requests will never appear to come from the egress IP(s) you expect. Connections then fail because the user either needs to enroll in MFA or needs to use MFA. If you exempt the accounts that need to connect without MFA from your MFA policy(ies), it will work as expected. |
Hey @thereverand Thanks for your response. My issue as you stated is because I have the MFA enabled in the account and couldn't exempt it due to organizational policies. So once I can get MFA exempted, this approach should work. |
We finally solved the issue we were having; it turned out to be a firewall problem. Now we are facing this issue sporadically: it fails on some tries and works on others, and rerunning the same notebooks sometimes works and sometimes doesn't. Checking to see if anyone else is facing similar issues? |
I am not running on Databricks but experienced the same issue: in my case it was due to Hadoop including a very old version of the mssql driver. To fix it:
This is probably the same jar Databricks includes by default, which is why rm /databricks/jars/*mssql* fixes it. |
This issue has collected enough anecdotal evidence and guidance over time. Since it is not specific to the connector, but instead depends on the associated compute runtime environment, I am closing it; no action is expected or planned. |
The ActiveDirectoryPassword auth method throws
java.lang.NoClassDefFoundError: com/microsoft/aad/adal4j/AuthenticationException
Trace: