Accessing NetSuite using the SuiteAnalytics Connect JDBC driver
We're using SuiteAnalytics Connect successfully, mostly with an ODBC DSN installed on a gateway server. Now I'm trying to access NetSuite tables using PySpark in a Notebook. I learned from ChatGPT that I should probably use JDBC for this. I already know how to install and use the SuiteAnalytics Connect JDBC driver, since I use it for DBeaver access to the NetSuite database. Am I on the right track following the directions (from ChatGPT) to use the SuiteAnalytics Connect JDBC driver in my Fabric environment, so I can run queries against NetSuite tables from my Notebook? This will involve putting the driver jar file in our Lakehouse and accessing it from there in the Notebook code. Thanks.
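For reference, a PySpark JDBC read against NetSuite might look like the sketch below. The driver class name and connection-string shape are my assumptions based on the standard SuiteAnalytics Connect driver, and every angle-bracketed value is a placeholder; verify the details against NetSuite's Connect documentation before use.

```python
# Sketch of a PySpark JDBC read from NetSuite via SuiteAnalytics Connect.
# Assumptions: the OpenAccess driver class and the jdbc:ns:// URL format;
# all <angle-bracket> values are placeholders for your own account details.
jdbc_options = {
    "url": (
        "jdbc:ns://<account>.connect.api.netsuite.com:1708;"
        "ServerDataSource=NetSuite2.com;Encrypted=1;"
        "CustomProperties=(AccountID=<account>;RoleID=<roleId>)"
    ),
    "driver": "com.netsuite.jdbc.openaccess.OpenAccessDriver",
    "dbtable": "transaction",  # NetSuite table (or a SQL subquery alias) to read
    "user": "<user>",
    "password": "<password>",
}

# With a live Spark session (and the driver jar on the session's classpath):
# df = spark.read.format("jdbc").options(**jdbc_options).load()
# df.show(5)
```

Note that `spark.read.format("jdbc")` only works if the driver jar is actually on the Spark session's classpath, which is exactly the issue discussed in the accepted solution below.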
Solved! Go to Solution.
We submitted a ticket to Microsoft and the issue is now solved. What I learned from ChatGPT was ultimately correct; however, my Notebook still didn't work due to a known issue with the Spark session create method: the JDBC driver's jar file simply wasn't accessible to the notebook. The solution was to use `%%configure`, a magic configuration command, to add the driver jar to the session's configuration/classpath.
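A minimal sketch of the `%%configure` cell described above. The OneLake path and jar file name are placeholders, not the poster's actual values; check the Fabric documentation for the exact properties your runtime supports.

```
%%configure -f
{
    "conf": {
        "spark.jars": "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/drivers/<netsuite-jdbc>.jar"
    }
}
```

The `-f` flag forces the session to restart with the new configuration, so this cell should be the first one run in the notebook, before any Spark code executes.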
Hi gibbyj,
I couldn't find any official documentation covering this scenario, so treat ChatGPT's answer with caution and verify it before relying on it.
Best Regards
I'm trying this JDBC approach and getting this error:
