andrewmvu41
New Member

Question on data limitation

I am in the beginning process of developing a data governance process at work. Does it make sense to require everyone who publishes a report to go through the Power BI pipeline deployment process? I am skeptical because we currently have an ecosystem of 140 workspaces, and only about 20% of them use the pipeline deployment process. If we create a governance policy that requires everyone to go through pipeline deployment, this would result in 300+ workspaces. My concerns: does this become too monstrous to manage? What is the best practice, and will I run into data storage issues? I know Premium capacity allows us up to 100 TB of data. The 1,000-dataset limitation also scares me. What does this mean for pipeline deployment?
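The 140-to-300+ projection can be sketched as a quick back-of-the-envelope calculation. This is a hypothetical illustration, not an official sizing method: it assumes a standard three-stage pipeline (Dev / Test / Prod, one workspace per stage) and that each workspace not yet in a pipeline becomes the Prod stage of a new one.

```python
def projected_workspaces(total: int, pipeline_share: float, stages: int = 3) -> int:
    """Estimate total workspaces if every report workspace must use a pipeline.

    Assumptions (hypothetical): a pipeline has `stages` workspaces, and each
    workspace not yet in a pipeline gains (stages - 1) companion workspaces.
    """
    already_pipelined = round(total * pipeline_share)
    not_yet = total - already_pipelined
    # each non-pipeline workspace adds (stages - 1) new stage workspaces
    return total + not_yet * (stages - 1)

# 140 workspaces, 20% already on pipelines, 3-stage pipelines
print(projected_workspaces(140, 0.20))  # → 364
```

Under these assumptions the estate grows to roughly 364 workspaces, consistent with the 300+ figure above; fewer stages per pipeline or shared Dev workspaces would lower the count.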

 

1 REPLY
v-luwang-msft
Community Support

Hi @andrewmvu41 ,

The following article describes deployment pipelines best practices; you can refer to it:

https://learn.microsoft.com/en-us/power-bi/create-reports/deployment-pipelines-best-practices 

Best Regards,

Lucien
