Hi All - I'm trying to transform Lat/Long coordinates into Congressional Districts using R. I can get the code to work in RStudio, but when I use it in Power Query it returns an empty table. All packages I'm using are supported: https://docs.microsoft.com/en-us/power-bi/service-r-packages-support#r-packages-that-are-supported-i...
I suspect it has something to do with the tigris package. Does anybody know what I might be doing wrong?
R script used in Power BI:
# 'dataset' holds the input data for this script
geos<-dataset
#Assumes working from a CSV with two columns for Lat/Long and one with an ID
# Load the required packages.
require(rgdal)
require(sp)
require(maps)
require(tigris)
#Uses the tigris package to pull in the congressional districts
districts <- congressional_districts(cb = TRUE, resolution = '20m')
#converts the Latitude and Longitude columns into a Geospatial Data Frame
coordinates(geos) <- c("Longitude", "Latitude")
#Sets Proj4Strings of geos to that of districts
proj4string(geos)<-proj4string(districts)
#determines which districts contain geos
inside.district <- !is.na(over(geos, as(districts, "SpatialPolygons")))
#Checks the fraction of geos inside a district
mean(inside.district)
#Takes the values for District and adds them to your geos data
geos$District <- over(geos,districts)$CD114FP
#Takes the values for State and adds them to your geos data
geos$State <- over(geos,districts)$STATEFP
#Exports the geos data
output <- geos
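One thing worth checking (my own observation, not something confirmed in this thread): after coordinates(geos) <- ... runs, geos is an sp SpatialPointsDataFrame rather than a plain data frame, and the Power Query "Run R script" step only returns plain data frames as output tables, which could explain the empty result. A minimal sketch of the last step under that assumption:
# Assumption: Power Query needs a plain data.frame as output.
# as.data.frame() on the sp object returns the attribute columns
# plus the Longitude/Latitude coordinate columns.
output <- as.data.frame(geos)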
R script run in RStudio (successful):
#install packages
install.packages("rgdal")
install.packages("maps")
install.packages("tigris")
#Assumes working from a CSV with two columns for Lat/Long and one with an ID
# Load the required packages.
require(rgdal)
require(sp)
require(maps)
require(tigris)
setwd("C:/Users/dacha/Documents/R/Coordinates2politics")
#Uses the tigris package to pull in the congressional districts
districts <- congressional_districts(cb = TRUE, resolution = '20m')
#Upload data to a dataframe
geos<-read.csv("Codes.csv")
#converts the Latitude and Longitude columns into a Geospatial Data Frame
coordinates(geos) <- c("Longitude", "Latitude")
#Sets Proj4Strings of geos to that of districts
proj4string(geos)<-proj4string(districts)
#determines which districts contain geos
inside.district <- !is.na(over(geos, as(districts, "SpatialPolygons")))
#Checks the fraction of geos inside a district
mean(inside.district)
#Takes the values for District and adds them to your geos data
geos$District <- over(geos,districts)$CD114FP
#Takes the values for State and adds them to your geos data
geos$State <- over(geos,districts)$STATEFP
#Exports the geos data to a CSV
write.csv(geos, "Districts.csv", row.names=FALSE)
Result:
Hi @Daniel122
The first R script is used in Power BI and the second is used in RStudio, right?
Based on my experience, to export data from Power BI to a CSV/Excel file, the R script should be in this kind of format.
But your first R script seems to load data (data stored in a CSV file) into Power BI.
I need to make a test.
Is the data shown in the first visual the original data stored in the CSV file, or the data that needs to be shown in Power BI?
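In the meantime, one thing that could be tested (an assumption on my part, not confirmed in this thread): congressional_districts() downloads its shapefile from the Census Bureau at run time, so if that download is blocked inside the Power Query environment there would be nothing to join against. A small diagnostic sketch that surfaces any download error as the script's output table:
# Hypothetical diagnostic: report whether the tigris download succeeds inside Power Query
require(tigris)
output <- tryCatch({
  d <- congressional_districts(cb = TRUE, resolution = '20m')
  data.frame(status = "ok", object_class = class(d)[1], stringsAsFactors = FALSE)
}, error = function(e) {
  data.frame(status = "error", message = conditionMessage(e), stringsAsFactors = FALSE)
})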
Thanks for your response @v-juanli-msft
Yes, the first script is the one used in Power BI and the second is used in RStudio. The first visual is the data that is stored in the CSV file. However, since it is already loaded into Power BI, isn't it already stored in a data frame called 'dataset'?
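For reference, that is correct: in a Power Query "Run R script" step the input table is exposed as a data frame named dataset (as the generated comment at the top of the first script notes). A minimal sketch, purely as a sanity check and not part of the original thread, for confirming the column names and types that Power Query actually passes in, since coordinates() needs numeric Longitude and Latitude columns:
# Sanity check (hypothetical): return the structure of 'dataset' as the output table
output <- data.frame(column = names(dataset),
                     type = sapply(dataset, function(x) class(x)[1]),
                     stringsAsFactors = FALSE)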