
sakBI
Frequent Visitor

Azure and AWS cost consolidation

Dear Community,

I'm running a Power BI report that integrates with Azure EA, and it is well used by our customers.

Now I want to integrate AWS account details as well. Since the integration via Athena + Glue is giving us permission errors, I wanted to know if any of you has found an optimal way to report AWS billing using Power BI.

AWS generates, in an S3 bucket, a folder structure with monthly files named "AWS_Billing_Report-00001.snappy.parquet".

I'm able to download a file and read it in Power BI, but the direct integration with S3 seems complex to me, as it isn't a single file but a folder structure with new files every month.
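For reference, reading one downloaded file can be done with a one-step M query like the following (a sketch; the local path is a placeholder, not from the original post):

```m
// Sketch: read a single downloaded billing file; replace the path with your own.
let
    Source = Parquet.Document(File.Contents("C:\Downloads\AWS_Billing_Report-00001.snappy.parquet"))
in
    Source
```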

Did any of you address this kind of report?

 

1 ACCEPTED SOLUTION
sakBI
Frequent Visitor

Ok, now I have it working:

Power Automate to connect to S3 and copy the files to SharePoint Online

SharePoint folder as the source

Filter the file list by .parquet

Click the Combine button on the binary Content column

Change the formula in the generated "Transform file" function from

Source = Parquet.Document(Parameter1, [Compression=null, LegacyColumnNameEncoding=false, MaxDepth=null])

to

Source = Parquet.Document(Binary.Buffer(Parameter1))
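The steps above can also be written as a single M query instead of relying on the generated "Transform file" helper functions. This is a sketch: the SharePoint site URL is a placeholder, and column names assume the standard SharePoint.Files output.

```m
// Sketch of the full flow: list SharePoint files, keep .parquet, parse, and combine.
let
    // Placeholder site URL - replace with your own SharePoint site
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Billing", [ApiVersion = 15]),
    ParquetOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),
    // Binary.Buffer loads each stream fully into memory, which is what lets
    // Parquet.Document accept it (it cannot read streamed binary values)
    Parsed = Table.AddColumn(ParquetOnly, "Data", each Parquet.Document(Binary.Buffer([Content]))),
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```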


3 REPLIES
sakBI
Frequent Visitor

Hi,

After some days searching similar posts, I've found that Power Automate can map the S3 folder and, once it's updated, sync the files to SharePoint.

I'm able to connect to the SharePoint Online folder, create a filtered view, and Combine the parquet files.

But there is an issue with the transformation.

If I import a single snappy.parquet file I can use

Parquet.Document(Binary.Buffer([Content]))

But it doesn't work when consolidating files. It gives the error:

Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: [Parameter.Error] Parquet.Document cannot be used with streamed binary values.'.

I'm able to see the query content and it seems to be working, but it fails on Apply.

I will keep trying, as I wasn't able to apply that at the combine step.


amitchandak
Super User

@sakBI , I doubt Power BI has a folder-read option for AWS. But if you move the files to a local or SharePoint folder, you can use the Folder / SharePoint folder connector in Power BI.
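The local-folder alternative mentioned here follows the same pattern with the Folder connector; a sketch, with the path as a placeholder:

```m
// Sketch: combine the billing files from a local folder instead of SharePoint.
let
    Source = Folder.Files("C:\AWSBilling"),   // placeholder path
    ParquetOnly = Table.SelectRows(Source, each Text.EndsWith([Name], ".parquet")),
    Parsed = Table.AddColumn(ParquetOnly, "Data", each Parquet.Document(Binary.Buffer([Content]))),
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```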
