Anonymous
Not applicable

Linking R Script to Power BI, is there a limit to the number of rows to import?

Is there a limit to the number of rows of data that can be brought into Power BI via an R script? When I limit the number of countries to 3, the data loads into Power BI. However, if I do not limit it, I get the screen below:

(screenshot attached: tan_thiam_huat_0-1654239291244.png)

And it keeps waiting.

When I run the whole R script outside Power BI, TotalData is only 252 rows with 17 variables. Is that too much for Power BI to handle? I can run the whole R script externally, generate a CSV file, and then import that CSV into Power BI, but I am wondering whether there is a limit.
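The workaround mentioned above (run the script externally, persist a CSV, import the CSV) can be sketched as below. This is a minimal, self-contained illustration: the two-row data frame stands in for the real 252-row TotalData, and the file path comes from tempdir() rather than a real location.

```r
library(readr)

# Illustrative stand-in for TotalData (252 rows x 17 variables in the real script)
TotalData <- data.frame(Country = c("Singapore", "Japan"), T2M = c(27.1, 12.4))

# In the external R session, persist the finished table to disk
csv_path <- file.path(tempdir(), "nasa_power_monthly.csv")
write_csv(TotalData, csv_path)

# Then import the CSV in Power BI (Get Data > Text/CSV), or from a
# lightweight R step that only reads the file and does no network I/O
reloaded <- read_csv(csv_path, show_col_types = FALSE)
```

This moves the slow, network-bound work outside Power BI, so the refresh step only touches a local file.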

R Script below:

 

# install.packages(c("httr", "jsonlite"))
# install.packages("rvest")
# https://power.larc.nasa.gov/data-access-viewer/
# https://power.larc.nasa.gov/api/temporal/monthly/point?parameters=T2M,T2MDEW,T2MWET,TS,T2M_MAX,T2M_MIN&community=RE&longitude=103.9690&latitude=1.3665&format=CSV&start=2019&end=2021

# Interactive-session housekeeping; harmless but unnecessary inside Power BI
rm(list = ls())               # remove all variables
cat("\014")                   # clear the console (RStudio)
if (dev.cur() != 1) dev.off() # close any open R plot device

library(Hmisc)
library(httr)
library(jsonlite)
library(data.table)
library(rvest)
library(tidyverse) # also attaches tibble, readr, dplyr, etc.

webpage <- read_html("https://developers.google.com/public-data/docs/canonical/countries_csv")
Countries <- html_nodes(webpage, "table") %>% html_table() %>% as.data.frame() %>% dplyr::rename(Country=name)
Countries$country <- NULL

selected <- c("Singapore","Malaysia","Japan","Australia","China","Switzerland","Spain","United Kingdom","Italy",
              "South Korea","North Korea","Nepal","Russia","Ukraine")
Countries_selected <- Countries %>% dplyr::filter(Country %in% selected)

res_LongLat <- function(Long,Lat)
{
  data = readr::read_delim(
    paste0(
          "https://power.larc.nasa.gov/api/temporal/monthly/point",
          "?parameters=T2M,T2MDEW,T2MWET,TS,T2M_MAX,T2M_MIN",
          "&community=RE",
          "&longitude=",Long,
          "&latitude=",Lat,
          "&format=CSV",
          "&start=2019",
          "&end=2021"
    ),
    skip = 14,
    delim = ","
  )
  
  data$ANN <- NULL
  return(data)
}

total_data <- vector("list", nrow(Countries_selected))
for (i in seq_len(nrow(Countries_selected))) {
  data <- res_LongLat(Countries_selected$longitude[i], Countries_selected$latitude[i])

  # Prepend the country metadata as the first three columns
  total_data[[i]] <- data %>%
    dplyr::mutate(
      Country   = Countries_selected$Country[i],
      Longitude = Countries_selected$longitude[i],
      Latitude  = Countries_selected$latitude[i],
      .before = 1
    )
}

TotalData <- dplyr::bind_rows(total_data) %>% as.data.frame()
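The symptom described ("it keeps waiting") is consistent with a network call that never returns inside the Power BI R step. A minimal sketch, assuming the hang comes from the API request: fetch each URL with an explicit timeout via httr so the step fails fast instead of waiting forever. `res_LongLat_timed` and `timeout_sec` are illustrative names, not part of the original script.

```r
library(httr)
library(readr)

res_LongLat_timed <- function(Long, Lat, timeout_sec = 30) {
  url <- paste0(
    "https://power.larc.nasa.gov/api/temporal/monthly/point",
    "?parameters=T2M,T2MDEW,T2MWET,TS,T2M_MAX,T2M_MIN",
    "&community=RE",
    "&longitude=", Long,
    "&latitude=", Lat,
    "&format=CSV&start=2019&end=2021"
  )
  resp <- GET(url, timeout(timeout_sec))  # error out instead of hanging
  stop_for_status(resp)                   # surface HTTP errors immediately

  data <- read_delim(I(content(resp, as = "text", encoding = "UTF-8")),
                     delim = ",", skip = 14, show_col_types = FALSE)
  data$ANN <- NULL
  data
}
```

If a request times out or the API returns an error, this version raises an R error that Power BI can display, instead of the query hanging indefinitely.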

7 REPLIES
Syndicate_Admin
Administrator

Were you able to fix your problem? Something similar happens to me. I have a script that creates a helper table for each month, and at the end it appends all those helper tables into a single table, which is the one I import into Power BI. It worked perfectly as an R script, but after a certain number of months it "hangs" in Power BI. I suspect it is more of a memory problem.

Best regards

v-henryk-mstf
Community Support

Hi @Anonymous ,

 

Does importing the complete data work when you use an external R IDE?

(screenshot attached: vhenrykmstf_0-1654568129999.png)

For more details, you can read the related documentation.

If the problem is still not resolved, please provide detailed error information and let me know. Looking forward to your reply.


Best Regards,
Henry


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Anonymous
Not applicable

Hi Henry, thanks for following up. Yes, I followed exactly the settings you shared in the link; my R installation and IDE are configured in Power BI. I still face the same issue. I am not sure whether the dataset is too large for Power BI to handle, but it should not be.

lbendlin
Super User

That's the least of your problems. You can ingest millions of rows and hundreds of columns.

 

But R support in Power BI is limited to a small subset of packages, especially if you plan to refresh the dataset in the Power BI service.

Anonymous
Not applicable

"But the R support in Power BI is limited to a small subset of packages" --> so the issue lies in some R packages that cannot be used inside Power BI? Why would it have that limitation? It just links to the R installation on our PC, doesn't it?

 

I doubt that is the reason the R script stops running, because if I limit the dataset, the whole R script runs successfully.

Anonymous
Not applicable

Thanks for the link above. So all my R packages are supported. Then what is the issue that prevents "252 rows with 17 variables" from loading inside Power BI?
