prabhatnath
Advocate III

How to Ingest Data from SSAS Cubes in a PySpark Notebook

Hello Friends,

 

Currently we ingest data from SSAS cubes into our Fabric Lakehouse using Dataflow Gen2, writing the DAX query and performing some transformations inside the Dataflow Gen2. For authentication we have a service account that is authorized on the cube side. Is it possible to achieve the same from a notebook, as that is more manageable and Git-supported as well?

 

Please suggest.

Thanks,

Prabhat


6 Replies
Anonymous
Not applicable

Hi @prabhatnath,

I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, kindly mark it as "Accept as Solution" and give it a 'Kudos' so others can find it easily.

Thank you,
Pavan.

Anonymous
Not applicable

Hi @prabhatnath,

I wanted to follow up since we haven't heard back from you regarding our last response. We hope your issue has been resolved.
If a community member's answer resolved your query, please mark it as "Accept as Solution" and select "Yes" if it was helpful.
If you need any further assistance, feel free to reach out.

Please continue using Microsoft community forum.

Thank you,
Pavan.

Anonymous
Not applicable

Hi @prabhatnath,

I wanted to check whether you have had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please mark it as "Accept as Solution" and give it a 'Kudos' so other members can easily find it.

Thank you,
Pavan.

Anonymous
Not applicable

Hi @prabhatnath,

Thank you for reaching out on the Microsoft Community Forum.

Yes, it is possible to ingest data from SSAS cubes into a Fabric Lakehouse using notebooks. This approach offers better manageability, version control with Git, and automation capabilities.

Please follow the steps below to ingest data from SSAS cubes into a Fabric Lakehouse using a notebook:

1. Install pyodbc for connecting to SSAS:
%pip install pyodbc

2. Connect using the service account credentials (note: depending on how the cube is exposed, SSAS may require the Analysis Services MSOLAP provider rather than the SQL Server ODBC driver):
import pyodbc

conn_str = f'DRIVER={{ODBC Driver 17 for SQL Server}};SERVER=<SSAS_SERVER>;DATABASE=<CUBE_NAME>;UID=<SERVICE_ACCOUNT>;PWD=<PASSWORD>'
conn = pyodbc.connect(conn_str)

3. Run the DAX query and load the results into a pandas DataFrame:
import pandas as pd

dax_query = "EVALUATE 'Sales'"
cursor = conn.cursor()
cursor.execute(dax_query)
columns = [col[0] for col in cursor.description]
df = pd.DataFrame.from_records(cursor.fetchall(), columns=columns)

4. Convert the pandas DataFrame to a Spark DataFrame and save it to the Lakehouse:
df_spark = spark.createDataFrame(df)
df_spark.write.format("delta").mode("overwrite").save("Tables/SSAS_Data")
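As a side note, since the connection string concatenates several values, a small helper (the function name below is my own, not a standard API) can assemble it with ODBC brace-quoting, so a password containing `;` or `}` does not break parsing:

```python
def build_conn_str(server: str, database: str, user: str, password: str,
                   driver: str = "ODBC Driver 17 for SQL Server") -> str:
    """Assemble an ODBC connection string with brace-quoted values."""
    def quote(value: str) -> str:
        # ODBC quoting: wrap the value in braces and double any closing brace
        return "{" + value.replace("}", "}}") + "}"
    parts = [
        f"DRIVER={quote(driver)}",
        f"SERVER={server}",
        f"DATABASE={database}",
        f"UID={user}",
        f"PWD={quote(password)}",  # quoted in case the password contains ';'
    ]
    return ";".join(parts)

# usage with placeholder values
conn_str = build_conn_str("myserver", "MyCube", "svc_user", "p;w}d")
```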

Please continue using the Microsoft Community Forum.

If you found this post helpful, please consider marking it as "Accept as Solution" and giving it a 'Kudos' to help other members find it more easily.

Regards,
Pavan.

Thank you for the reply on this.

Having the password in the notebook file is a risk. Is there a way to base authentication on the credentials of the notebook's owner, so the password need not be stored in the notebook itself? Please advise.

 

Thanks,

Prabhat

Anonymous
Not applicable

Hi @prabhatnath,

Thank you for reaching out on the Microsoft Community Forum.

Please follow the options below to secure authentication without storing passwords:

1. Use Azure Key Vault for secure storage:
Store the service account credentials (username and password) in Azure Key Vault and retrieve them securely at runtime in the notebook.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Connect to Key Vault
key_vault_url = "https://<YOUR-KEYVAULT-NAME>.vault.azure.net"
credential = DefaultAzureCredential()
client = SecretClient(vault_url=key_vault_url, credential=credential)

# Retrieve credentials
username = client.get_secret("SSAS-Username").value
password = client.get_secret("SSAS-Password").value

2. Use a service principal with Managed Identity:
Enable Managed Identity for your notebook in the Fabric Admin Portal and authenticate without storing credentials by using the Managed Identity token.

from azure.identity import ManagedIdentityCredential
import pyodbc

# Acquire a token with the Managed Identity (the scope below targets SQL
# endpoints; Analysis Services typically uses a different resource scope)
credential = ManagedIdentityCredential()
token = credential.get_token("https://database.windows.net/.default").token

# Connect to SSAS. ODBC Driver 17 has no 'Token=' connection-string keyword;
# with Authentication=ActiveDirectoryMsi the driver acquires the token itself,
# so the get_token call above is only needed if you pass the token explicitly
# via pyodbc's attrs_before.
conn_str = f'DRIVER={{ODBC Driver 17 for SQL Server}};SERVER=<SSAS_SERVER>;DATABASE=<CUBE_NAME>;Authentication=ActiveDirectoryMsi'
conn = pyodbc.connect(conn_str)
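If you do want to pass the token explicitly instead of relying on Authentication=ActiveDirectoryMsi, pyodbc accepts Azure AD access tokens through its `attrs_before` pre-connect attribute (`SQL_COPT_SS_ACCESS_TOKEN`, attribute id 1256). The driver expects the token UTF-16-LE encoded and prefixed with a 4-byte little-endian length; the helper name below is mine, not a library function:

```python
import struct

SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC pre-connect attribute for AAD tokens

def pack_access_token(token: str) -> bytes:
    """Encode an AAD access token the way the ODBC driver expects:
    UTF-16-LE bytes prefixed with a 4-byte little-endian length."""
    raw = token.encode("utf-16-le")
    return struct.pack(f"<I{len(raw)}s", len(raw), raw)

# usage with pyodbc (token obtained from ManagedIdentityCredential as above):
# conn = pyodbc.connect(conn_str,
#     attrs_before={SQL_COPT_SS_ACCESS_TOKEN: pack_access_token(token)})
```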

Please continue using the Microsoft Community Forum.

If you found this post helpful, please consider marking it as "Accept as Solution" and giving it a 'Kudos' to help other members find it more easily.

Regards,
Pavan.
