Dear Community,
I'm running a report that integrates Azure EA with Power BI, and it is well used by our customers.
Now I want to add AWS account details. Since the integration with Athena + Glue is giving us permission errors, I'd like to know whether any of you has found a good way to report AWS billing with Power BI.
AWS generates a folder structure in an S3 bucket, with monthly files named like "AWS_Billing_Report-00001.snappy.parquet".
I'm able to download a single file and read it in Power BI, but the integration with S3 seems complex to me, as it isn't just one file but a folder structure with new files every month.
Has anyone built this kind of report?
OK, now I have it working like this:
Use Power Automate to connect to S3 and copy the files to SharePoint Online.
Use a SharePoint folder as the source.
Filter the file list by .parquet.
Click the Combine button on the binary Content column.
Change the formula in the "Transform file" function from
Source = Parquet.Document(Parameter1, [Compression=null, LegacyColumnNameEncoding=false, MaxDepth=null])
to
Source = Parquet.Document(Binary.Buffer(Parameter1))
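Putting the steps above together, the combined query looks roughly like this in the Advanced Editor. This is only a sketch: the site URL and folder are placeholders for your own tenant, and the exact step names Power BI generates for you will differ.

```
let
    // SharePoint Online library that Power Automate syncs the S3 files into
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Billing", [ApiVersion = 15]),
    // Keep only the monthly AWS billing parquet files
    ParquetFiles = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),
    // Buffer each binary fully in memory before parsing; Parquet.Document
    // cannot read streamed binary values, which is what causes the refresh error
    Parsed = Table.AddColumn(ParquetFiles, "Data", each Parquet.Document(Binary.Buffer([Content]))),
    // Append all monthly tables into one
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```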
Hi,
After some days searching similar posts, I've found that Power Automate can map the S3 folder and, once it's updated, sync the files to SharePoint.
I'm able to map the SharePoint Online folder, create a filtered view, and combine the parquet files.
But there is an issue with the transformation.
If I import a single snappy.parquet file I can use
Parquet.Document(Binary.Buffer([Content]))
but it doesn't work when consolidating files; it gives the error:
Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [Parameter.Error] Parquet.Document cannot be used with streamed binary values..'.
I'm able to see the query content and it seems to be working, but it fails on apply...
I'll keep trying, as I wasn't able to apply that at the combination level.
@sakBI , I doubt Power BI has a folder read option for AWS. But if you move the files to a local or SharePoint folder, you can use the Folder / SharePoint folder connector in Power BI.
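For the local-folder route suggested above, the query is nearly identical to the SharePoint one; a minimal sketch, assuming the S3 files have been copied to a local path (the path below is a placeholder):

```
let
    // Local folder where the S3 billing files were copied
    Source = Folder.Files("C:\AWSBilling"),
    ParquetFiles = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),
    // Binary.Buffer avoids the "streamed binary values" error from Parquet.Document
    Parsed = Table.AddColumn(ParquetFiles, "Data", each Parquet.Document(Binary.Buffer([Content]))),
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```

Note that a local-folder source only refreshes in the Power BI service through a gateway, which is why the SharePoint approach is usually preferred.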