Picci
Helper I

Cannot attach an external CDM folder to a new dataflow

Hi everybody,

As Power BI admin, I configured Power BI Dataflows to save their data in my ADLS Gen2 storage account.

When I create a dataflow from Power Query in the Power BI Service, the corresponding folders and files are generated in ADLS Gen2.

But when I try to attach an external CDM folder to a new dataflow (as explained here: https://docs.microsoft.com/en-us/power-bi/service-dataflows-add-cdm-folder), I receive the following error:

Something went wrong
There was a problem creating a new dataflow.
Please try again later or contact support. If you contact support, please provide these details.
Activity ID: d6173bad-b724-4604-a586-18f7506586f6
Request ID: f9bce339-1e57-5796-7b8e-d4600a035fcb
Correlation ID: 9da79dac-7eea-24c1-d7af-73d4c963a4f6
Status code: 500
Time: Tue Feb 19 2019 19:22:00 GMT+0100 (Central European Standard Time)
Version: 13.0.8383.201
Cluster URI: https://wabi-north-europe-redirect.analysis.windows.net

Thanks for helping,

Elisa

1 ACCEPTED SOLUTION
Picci
Helper I

After some more trials, I managed to solve the issue by following these steps:

  1. Create the ADLS Gen2 folder in Azure Storage Explorer.
  2. BEFORE populating the folder, assign Read and Execute permissions on that folder to the user who is going to attach the CDM folder to the dataflow in the Power BI Service (see the ACL sketch right after this list).
  3. Run the Databricks notebook so that it writes inside that folder (model.json and the other subfolders inherit the user's permissions).
  4. In the Power BI Service: Create > Dataflow > Attach an external CDM folder.
  5. Paste the CDM folder path, ending with <…..>/model.json (a sketch of a minimal model.json follows further below).
  6. It works!
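
For step 2, here is a minimal sketch of granting the Read and Execute ACLs (plus matching default ACLs, so that whatever Databricks writes later inherits them) using the azure-storage-file-datalake Python SDK. The account name, file system, folder path and Azure AD object ID below are placeholders, not values from this thread:

```python
# Minimal sketch, not a definitive implementation: grant r-x (and default r-x)
# on the still-empty CDM folder to the user who will attach it in the Power BI
# Service. ACCOUNT_URL, FILE_SYSTEM, CDM_FOLDER and USER_OBJECT_ID are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"
FILE_SYSTEM = "powerbi"
CDM_FOLDER = "cdm/sales"
USER_OBJECT_ID = "<aad-object-id-of-the-attaching-user>"

service = DataLakeServiceClient(account_url=ACCOUNT_URL,
                                credential=DefaultAzureCredential())
fs = service.get_file_system_client(FILE_SYSTEM)

# Step 1 equivalent: create the (still empty) folder.
directory = fs.create_directory(CDM_FOLDER)

# Read + Execute for the user, plus default entries so new children inherit them.
acl = (f"user:{USER_OBJECT_ID}:r-x,"
       f"default:user:{USER_OBJECT_ID}:r-x")
directory.update_access_control_recursive(acl=acl)

# Note: the same user also needs execute (x) on the parent folders and the
# file system root in order to traverse down to CDM_FOLDER.
```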

 

Steps 2 and 5 are not so clear from the documentation (https://docs.microsoft.com/en-us/power-bi/service-dataflows-add-cdm-folder), IMHO.
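
To make step 5 more concrete: the path you paste is the full URL of the model.json sitting at the root of the CDM folder. Here is a rough sketch of what the Databricks notebook in step 3 could produce; the entity, columns and paths are made-up placeholders, not the real notebook from this thread:

```python
# Databricks notebook sketch (spark and dbutils are the notebook's built-ins).
# Writes one header-less CSV partition plus a minimal model.json at the folder
# root. Entity/column names and the BASE path are illustrative placeholders.
import json

BASE = "abfss://powerbi@<storage-account>.dfs.core.windows.net/cdm/sales"

# 1) One entity, one header-less CSV partition (columns are described in model.json).
df = spark.createDataFrame([(1, "Contoso"), (2, "Fabrikam")], ["CustomerId", "Name"])
df.coalesce(1).write.mode("overwrite").option("header", "false").csv(f"{BASE}/Customers")

def to_https(abfss_path: str) -> str:
    # abfss://<filesystem>@<account>.dfs.core.windows.net/<path>
    #   -> https://<account>.dfs.core.windows.net/<filesystem>/<path>
    rest = abfss_path[len("abfss://"):]
    filesystem, rest = rest.split("@", 1)
    account_host, path = rest.split("/", 1)
    return f"https://{account_host}/{filesystem}/{path}"

# 2) Describe the entity and point at its partition files from model.json.
parts = [to_https(f.path) for f in dbutils.fs.ls(f"{BASE}/Customers")
         if f.path.endswith(".csv")]
model = {
    "name": "sales",
    "version": "1.0",
    "entities": [{
        "$type": "LocalEntity",
        "name": "Customers",
        "attributes": [{"name": "CustomerId", "dataType": "int64"},
                       {"name": "Name", "dataType": "string"}],
        "partitions": [{"name": f"part{i}", "location": p}
                       for i, p in enumerate(parts)],
    }],
}
dbutils.fs.put(f"{BASE}/model.json", json.dumps(model, indent=2), True)
# The value to paste in step 5 is then BASE + "/model.json" (as an https URL).
```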

 

Thanks,

Elisa


2 REPLIES 2

Anonymous
Not applicable

I'm exploring using the attach-CDM capability: transitioning our ETL to ADF v2 and then producing output in the CDM format so that it can be consumed in Power BI. I'm trying to create my own model.json and folder structure to test this out before going the Databricks route.

Has using Databricks and then attaching the CDM folder for Power BI consumption worked well for you?
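
In case it helps with the hand-crafted test: a rough sketch (all names and paths are placeholders) of building the same kind of CDM folder without Databricks, by uploading one CSV partition and a model.json with the azure-storage-file-datalake SDK. The model.json shape mirrors the sketch in the accepted solution above:

```python
# Rough sketch of hand-crafting a CDM folder without Databricks: one header-less
# CSV partition plus a model.json describing it. All names/paths are placeholders.
import json
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT = "<storage-account>"
FILE_SYSTEM = "powerbi"
CDM_FOLDER = "cdm/sales-test"

service = DataLakeServiceClient(
    account_url=f"https://{ACCOUNT}.dfs.core.windows.net",
    credential=DefaultAzureCredential())
fs = service.get_file_system_client(FILE_SYSTEM)
fs.create_directory(f"{CDM_FOLDER}/Customers")

# One header-less CSV partition for the Customers entity.
fs.create_file(f"{CDM_FOLDER}/Customers/part-000.csv").upload_data(
    b"1,Contoso\n2,Fabrikam\n", overwrite=True)

# model.json at the folder root, pointing at the partition via its https URL.
partition_url = (f"https://{ACCOUNT}.dfs.core.windows.net/"
                 f"{FILE_SYSTEM}/{CDM_FOLDER}/Customers/part-000.csv")
model = {
    "name": "sales-test",
    "version": "1.0",
    "entities": [{
        "$type": "LocalEntity",
        "name": "Customers",
        "attributes": [{"name": "CustomerId", "dataType": "int64"},
                       {"name": "Name", "dataType": "string"}],
        "partitions": [{"name": "part-000", "location": partition_url}],
    }],
}
fs.create_file(f"{CDM_FOLDER}/model.json").upload_data(
    json.dumps(model, indent=2).encode("utf-8"), overwrite=True)
# Then attach the https URL ending in /model.json in the Power BI Service,
# as in the accepted solution.
```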
