I need to import some CSV files from AWS S3 into Power BI. Below is the Python script I am trying to use. I have received the API URL and token from a colleague — how can I complete the bucket and key variables?
********************************************************************************************
import boto3
import pandas as pd
import io

bucket = "<input bucket name>"
key = "<file name>"

s3 = boto3.client('s3')
f = s3.get_object(Bucket=bucket, Key=key)
shape = pd.read_csv(io.BytesIO(f['Body'].read()), header=0, index_col=0)
shape = shape.apply(lambda x: x.fillna(0))  # replace missing values with 0
print(shape)
@Jeanxyz , I hope you are using the Python script as the source:
https://towardsai.net/p/cloud-computing/how-we-connected-amazon-s3-to-microsoft-powerbi-in-5-minutes
How to make Python work with Power BI- https://youtu.be/5D0BkNsu5CM
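For context on how the Python script source works: when you paste a script into Power BI's Get Data → Python script dialog, Power BI imports every pandas DataFrame that exists when the script finishes, so the script just needs to end with the data assigned to a DataFrame. A minimal sketch (the column values here are placeholder data, not from this thread):

```python
# Minimal Power BI "Python script" source.
# Any pandas DataFrame left in scope when the script ends
# appears as a selectable table in the Navigator.
import pandas as pd

aws_data = pd.DataFrame({"region": ["us-east-1", "us-west-2"], "files": [3, 5]})  # placeholder data
```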
Thanks, @amitchandak . I will go through the tutorials when I get some time.
I talked to our AWS admin and made some changes to the Python script; it works now (see script below).
Limitations of the Python script connector:
1. This import mode is slow, so I can only import a small CSV file, and if there is even a small error in the CSV file the import query fails. Is there a way to ignore CSV reading errors?
2. To import multiple files from the S3 bucket, I need to write a loop in the Python script.
*****************************************
import io

import boto3
import pandas as pd

my_bucket_name = "xx"
my_file_path = "xx.csv"
my_key = "xx"      # AWS access key ID
my_secret = "xx"   # AWS secret access key

# Authenticate explicitly instead of relying on a local AWS profile
session = boto3.Session(aws_access_key_id=my_key, aws_secret_access_key=my_secret)
s3Client = session.client("s3")

f = s3Client.get_object(Bucket=my_bucket_name, Key=my_file_path)
aws_data = pd.read_csv(io.BytesIO(f['Body'].read()), header=0, index_col=0)
print(aws_data)
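On limitation 1 (ignoring CSV reading errors): pandas can skip malformed rows instead of failing the whole import via `read_csv`'s `on_bad_lines="skip"` option (available in pandas 1.3+). A small self-contained sketch, using an in-memory CSV rather than S3 so the behaviour is easy to see:

```python
import io
import pandas as pd

def read_csv_lenient(raw_bytes):
    # on_bad_lines="skip" drops rows with the wrong number of fields
    # instead of raising and aborting the whole import (pandas >= 1.3).
    return pd.read_csv(io.BytesIO(raw_bytes), on_bad_lines="skip")

sample = b"a,b\n1,2\n3,4,5\n6,7\n"  # middle row has an extra field
df = read_csv_lenient(sample)
print(df)
```

In the S3 script above, the same keyword can simply be added to the existing `pd.read_csv(...)` call.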
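On limitation 2 (importing multiple files): the loop can list the bucket's objects with the S3 client's `list_objects_v2` paginator and concatenate the resulting DataFrames. A sketch under the same setup as the script above; `csv_keys` and `load_all_csvs` are illustrative helper names, not part of boto3, and `s3_client` is the client created with `boto3.Session(...).client("s3")` as shown earlier:

```python
import io
import pandas as pd

def csv_keys(s3_client, bucket, prefix=""):
    # List every .csv key in the bucket, handling pagination
    # (list_objects_v2 returns at most 1000 keys per page).
    keys = []
    paginator = s3_client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj["Key"].endswith(".csv"):
                keys.append(obj["Key"])
    return keys

def load_all_csvs(s3_client, bucket, prefix=""):
    # Read each CSV into a DataFrame and stack them into one table
    frames = []
    for key in csv_keys(s3_client, bucket, prefix):
        body = s3_client.get_object(Bucket=bucket, Key=key)["Body"].read()
        frames.append(pd.read_csv(io.BytesIO(body)))
    return pd.concat(frames, ignore_index=True)
```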