
icassiem
Post Patron

Large volume of data broken into chunks with multiple API calls

Good day, 

 

I have a large dataset (~1M records). We are trying to provide the data to a client via an API that takes orgID and areaID parameters.

The API only returns the data for a single orgID + areaID combination.

Thinking out loud:

1. For Power BI to load all the data, should I loop: get the list of orgID + areaID combinations, call the API per combination, and append the results? If so, how?

2. I'm not sure whether DirectQuery can do this, as the API was built around lazy loading, similar to an app that fetches data for the user's current selection, but that is not how I understand Power BI to work.
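The loop-and-append idea in point 1 can be sketched in Python (in Power BI itself this would be a Power Query custom function invoked per combination, with the results appended via Table.Combine). Here `fetch_records`, the org/area IDs, and the row fields are hypothetical stand-ins for the real API:

```python
from itertools import product

def fetch_records(org_id, area_id):
    # Hypothetical stand-in for the real API call, e.g.
    # GET /records?orgId=...&areaId=... returning JSON rows.
    fake_db = {
        ("org1", "A"): [{"org": "org1", "area": "A", "value": 1}],
        ("org1", "B"): [{"org": "org1", "area": "B", "value": 2}],
        ("org2", "A"): [{"org": "org2", "area": "A", "value": 3}],
    }
    return fake_db.get((org_id, area_id), [])

def load_all(org_ids, area_ids):
    # Call the API once per orgID + areaID combination and append the results.
    all_rows = []
    for org_id, area_id in product(org_ids, area_ids):
        all_rows.extend(fetch_records(org_id, area_id))
    return all_rows

rows = load_all(["org1", "org2"], ["A", "B"])  # 3 rows in this stub
```

In Power Query the same shape is a parameterized function applied to a table of combinations (Invoke Custom Function); watch out for API rate limits and privacy-level prompts when the query is parameterized.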

 

What other options do I have? Loading the data into a storage layer is not an option, since this is built for external clients' power users.

 

Any help is appreciated, thanks.

9 REPLIES
icassiem
Post Patron

Good day,

Any possible ideas? Looping, paging, etc.?

Please help.
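On the paging idea: if the API supports paged requests, the client keeps asking for pages until it gets a short (or empty) page back, then stops. A minimal Python sketch with a stubbed API; the `page`/`page_size` parameter names are assumptions, not the real API's contract:

```python
def fetch_page(page, page_size):
    # Stand-in for a real paged API call, e.g. GET /records?page=N&pageSize=M.
    data = list(range(25))  # pretend the full dataset has 25 records
    start = page * page_size
    return data[start:start + page_size]

def fetch_all(page_size=10):
    # Keep requesting pages until a short page signals the end of the data.
    rows, page = [], 0
    while True:
        batch = fetch_page(page, page_size)
        rows.extend(batch)
        if len(batch) < page_size:
            break
        page += 1
    return rows

all_rows = fetch_all()  # 25 rows regardless of page size
```

In Power Query this loop is usually written with List.Generate, terminating when a page comes back empty.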

icassiem
Post Patron

Is app-style lazy loading, or a paginated ("pages") load, possible?

What does DirectQuery do? Is it something I should explore?

Would looping work?

I might be overthinking your case. It sounds like you just need users to access your large dataset (semantic model?) filtered by orgID and areaID when they run a report or browse the semantic model. If so, Row-Level Security (RLS) should be the answer.


Hi @Mrquestionmark, there's no semantic layer; the users are external clients who need to use the API to build their own reports. The API was designed to return data from the operational/prod DB only for a single orgID + areaID, but we need to build reports for an org with all its areas. I was hoping I could build a custom connection and loop the API call over the list of areas. Any ideas, please?

I do not know how the API works in this case. Can you use dataflows to share your data? It is a neat solution.

 

You can ask Copilot for the details:

 

Dataflows in Power BI Service can be shared with external users, but there are specific steps and conditions that need to be met. Here's a high-level overview of the process:

Azure Active Directory B2B: External sharing in Power BI is built on Azure Active Directory (Azure AD) B2B (business-to-business). You need to invite external users as guest users in your Azure AD tenant.

App Workspace: The external users must be added to an app workspace in Power BI Service. You need to grant them access to the workspace where the dataflow is published.

Permissions: Ensure that the external users have the necessary permissions to view or edit the dataflow. Typically, they would need the Member or Contributor role in the workspace.

Power BI Licensing: Both the content sharer and the external user need appropriate Power BI licensing. This usually means both parties should have Power BI Pro licenses, or the content should be in a workspace on Power BI Premium capacity.

For a detailed guide and the latest updates on sharing Power BI content with external users, including dataflows, refer to the official Microsoft documentation.

 

Sorry @icassiem, not much I can provide.

 

If you are using Power BI Premium, dataflows or a datamart might help.

😞 Thank you @Mrquestionmark.

I am new to this app-style lazy loading and paginated ("pages") loading.

What does DirectQuery do? Is it something I should explore?

Would looping work?

Are you using Fabric? If so, there is a data mesh feature for sharing the data.

 

To provide datasets to external users, you can use the External Data Sharing feature in Microsoft Fabric. This feature allows Fabric users to share data from within their Fabric tenant with users in another Fabric tenant. The data is shared "in-place" from OneLake storage locations, meaning that no data is actually copied to the other tenant.

Here's a high-level overview of how to enable and use this feature:

1. Enable External Data Sharing: Go to the tenant settings in the consuming tenant and find the "Users can accept external data shares (preview)" setting under export and sharing settings. Enable the toggle and specify who in the tenant can create external data shares.

2. Create and Configure Domains: Organize your data into domains and subdomains to manage governance and control access based on business units or departments.

3. Assign Workspaces to Domains: Assign workspaces to the relevant domains or subdomains in the admin portal.

4. Share Data with External Users: Use the external data sharing feature to share the necessary datasets with users in another Fabric tenant.

This approach allows for decentralized data ownership and governance, enabling each business unit to define its own rules and restrictions according to its specific business needs. It's a powerful way to collaborate across different Fabric tenants while maintaining control over your data. For a detailed guide, refer to the resources provided by Microsoft Fabric.

 

Hi, thank you for assisting @Mrquestionmark.

Unfortunately no Fabric; our landscape is AWS. My recommendation was Redshift, but the thinking was to provide an API only, because clients use different visualization tools.

 

Any other ideas please?
