I created a new Azure Databricks workspace with these settings:
- VNet injection
- secure cluster connectivity
- front-end and back-end Private Link

The goal is to set up a connection from Databricks to our on-premises SQL Server. The VNet injected with the Databricks workspace is peered to our hub VNet, which connects to our on-premises network through an Azure VPN Gateway. I also set up a DNS forwarding ruleset on the VNet. I confirmed connectivity to on-prem resources by creating a compute cluster and running the following commands on it:
%sh
ping sql-serverpany.local
PING sql-serverpany.local (10.0.80.201) 56(84) bytes of data.
64 bytes from 10.0.80.201 (10.0.80.201): icmp_seq=1 ttl=126 time=28.9 ms
64 bytes from 10.0.80.201 (10.0.80.201): icmp_seq=2 ttl=126 time=27.4 ms
64 bytes from 10.0.80.201 (10.0.80.201): icmp_seq=3 ttl=126 time=27.4 ms
64 bytes from 10.0.80.201 (10.0.80.201): icmp_seq=4 ttl=126 time=27.1 ms
64 bytes from 10.0.80.201 (10.0.80.201): icmp_seq=5 ttl=126 time=27.9 ms
%sh
nc -zv sql-serverpany.local 1433
Connection to sql-serverpany.local (10.0.80.201) 1433 port [tcp/ms-sql-s] succeeded!
After confirming connectivity from the compute cluster to our on-prem SQL Server, I tried to set up a connection to the SQL Server in Databricks via Catalog > Add a connection. But the connection test still fails with a "cannot establish connection" error to our SQL Server database.
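One way to narrow this down: the Catalog "add a connection" test may not run on the same classic cluster that passed the ping/nc checks above (it can be validated from compute outside the injected VNet, which would have no route to on-prem). Running a JDBC read directly on that same cluster separates a network problem from a Catalog-connection problem. This is a minimal sketch, not a definitive fix; the database name (`master`), the probe query, and the helper `build_options` are my own placeholders, and real credentials should come from a secret scope:

```python
# Sketch: probe the on-prem SQL Server over JDBC from the SAME classic
# cluster that passed the ping/nc checks. If this read succeeds while the
# Catalog connection test fails, the failure is in the Catalog path
# (e.g. where its test runs from), not in the VNet/VPN network path.
# Placeholders: database name, user, password handling are assumptions.

jdbc_url = (
    "jdbc:sqlserver://sql-serverpany.local:1433;"
    "databaseName=master;encrypt=true;trustServerCertificate=true"
)

def build_options(user: str, password: str) -> dict:
    """Assemble the options dict for Spark's generic JDBC reader."""
    return {
        "url": jdbc_url,
        # A one-row inline query keeps the probe cheap.
        "dbtable": "(SELECT 1 AS ok) probe",
        "user": user,
        "password": password,
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

# In a Databricks notebook, `spark` is predefined:
# user = dbutils.secrets.get("my-scope", "sql-user")      # hypothetical scope/keys
# password = dbutils.secrets.get("my-scope", "sql-pass")
# df = spark.read.format("jdbc").options(**build_options(user, password)).load()
# df.show()
```

If the direct read works, it is worth checking which compute the Catalog connection test actually runs on and whether that compute sits inside (or can reach) the injected VNet.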