Hello, I'm creating a custom visual that reads data from a table (a CSV file). My table has three columns and 59,000 rows. I am reading the rows in via TypeScript, but the latest version of Power BI Desktop says "Too many values. Not showing all data. Click to see details". Publishing the report to a web page shows that only 1,000 rows were read in, because my console debug output says rows.length = 1000. I thought the row limit was 30,000? Any thoughts on this? Thanks.
Hi @RichardL,
I understand you're using TypeScript, but how is Power BI consuming it? Via 'web' HTTP? In other words, what do you do when you select 'Get Data'?
We have encoded data as JSON using Python and then loaded it into Power BI successfully. Yes, we have had issues with maximum rows, but that has been a server-side limitation rather than Power BI. However, I have not seen your error message before.
Cheers,
D
Hi @djnww
I'm consuming the data via 'web' HTTP. The same row limit exists both on the web and in Power BI Desktop. I see the limit of 1,000 rows when I press F12 in Chrome or Edge to view the console output.
Based on my research, you need to specify dataReductionAlgorithm in capabilities.json. It describes how to reduce the amount of data exposed to the visual.
https://github.com/Microsoft/PowerBI-visuals/issues/44
For a newbie: how/where do you edit capabilities.json? Where is it located?
capabilities.json is located in the root folder of your project. Did you look?
Hello,
Thanks for your reply.
I did look. I'm not sure where the root folder of the project is located or where I should look for it.
It is not in the folder where the Power BI project file is stored.
Am I maybe misunderstanding something?
I am just importing data into Power BI using a query ...
let
Source = Json.Document(Web.Contents("https://issues.companyname.com/rest/api/2/search?jql=project=PR&maxResults=3000")),
issues = Source[issues],
So perhaps this is not a project... As I said in the previous message, I am a newbie 🙂
Is there a way to include more search results for just a query?
Specifying dataReductionAlgorithm in capabilities.json did the trick. Thanks a bunch. I can now read in 30K rows. My dataViewMappings now looks like this:
"dataViewMappings": [
{
"table": {
"rows": {
"select": [
{ "for": { "in": "values" } }
],
"dataReductionAlgorithm": { "sample": { "count": 64000 } }
},
"rowCount": { "preferred": { "min": 2, "max": 100000 }, "supported": { "min": 1, "max": 100000 } }
}
}
],
This is good progress. However, almost half of my table still gets truncated. Didn't the Power BI team say they were raising the 30K-row limit?
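For anyone needing more rows than a "sample" or "top" reduction can deliver, below is a minimal sketch of the segmented-fetch pattern. It assumes a powerbi-visuals-api version whose host exposes fetchMoreData() and a mapping whose dataReductionAlgorithm uses "window" instead of "sample"; the class and names are illustrative, not code from this thread.

import powerbi from "powerbi-visuals-api";
import IVisual = powerbi.extensibility.visual.IVisual;
import IVisualHost = powerbi.extensibility.visual.IVisualHost;
import VisualConstructorOptions = powerbi.extensibility.visual.VisualConstructorOptions;
import VisualUpdateOptions = powerbi.extensibility.visual.VisualUpdateOptions;

export class Visual implements IVisual {
    private host: IVisualHost;

    constructor(options: VisualConstructorOptions) {
        this.host = options.host;
    }

    public update(options: VisualUpdateOptions): void {
        const dataView = options.dataViews && options.dataViews[0];
        if (!dataView || !dataView.table) {
            return;
        }

        // metadata.segment is present while the host still has rows to send;
        // fetchMoreData() requests the next window and triggers another update().
        if (dataView.metadata.segment && this.host.fetchMoreData()) {
            return;
        }

        // In the default (aggregating) mode, table.rows holds every row
        // fetched so far, not just the latest segment.
        console.log("total rows received:", dataView.table.rows.length);
    }
}

With a "window" reduction, the host streams rows in batches until the data is exhausted or the host's own limits apply, rather than capping at a single sample.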
Hi @RichardL
In terms of row limits for a CSV, Power BI Desktop is only limited by the amount of memory that your machine has.
We currently load millions of rows of data into Power BI on a weekly basis without any issues.
Are you actually reading a physical CSV file, or is Power BI trying to read a file in TypeScript?
Cheers,
Dan
Hi @djnww
My 64-bit machine has 32 GB of memory and only uses around 8 GB when I'm running Power BI Desktop.
I'm reading a physical CSV file from disk in TypeScript. In my update(options: VisualUpdateOptions) method, I have
var dataView = options.dataViews[0];
var rows: DataViewTableRow[] = dataView.table.rows; // table rows delivered to the visual
console.log('rows length: ', rows.length); // This would print out 1000
dataViewMappings is defined as follows in capabilities.json:
"dataRoles": [
{
"displayName": "Values",
"name": "values",
"kind": 2
}
],
"dataViewMappings": [
{
"table": {
"rows": {
"for": {
"in": "values"
}
}
}
}
],
Do you see anything out of the ordinary here? Thanks.
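Nothing in the role or mapping above looks malformed; what it lacks is the dataReductionAlgorithm that the accepted answer adds, so the host presumably falls back to its default reduction, which would explain the 1,000 rows seen in the console. As a small illustration (the helper name is made up, not from this thread), logging the delivered table shape from inside update() makes that reduction visible:

import powerbi from "powerbi-visuals-api";
import DataView = powerbi.DataView;

// Hypothetical debugging helper: call it from update() with
// options.dataViews[0] to see exactly what the host delivered.
function logTableShape(dataView: DataView | undefined): void {
    if (!dataView || !dataView.table) {
        console.log("no table data view delivered");
        return;
    }
    console.log("columns:", dataView.table.columns.map(c => c.displayName));
    console.log("rows delivered to the visual:", dataView.table.rows.length);
}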