Hi All,
I'm working on a small project with a few data sources, and the last one I need to pull in uses AWS Signature V4. I've got an access token, secret, region and API key, but the handshake is proving very hard to complete in order to pull the data over.
Does anyone have any experience with this?
I've spent a considerable amount of time on this problem as a bit of a side project of mine, and since this post is the first (and only relevant) result when searching for Fabric/PowerBI + AWS SigV4, I thought I would put my findings here in case others are dealing with this same scenario.
First, using the Web connector won't work, as there is no AWS SigV4 authentication option. The reason is that SigV4 requires an authentication signature to be created dynamically from the timestamp, region, keys and the body of the API call itself, all hashed and HMAC-signed multiple times in a specific sequence. Performing this signature generation is, I'm guessing, well beyond what Microsoft's dev teams are willing to do to enable access to their main rival's data stores. There's also a technical limitation: the authentication details would need to be stored as part of a Connection's configuration, yet SigV4 needs parameters (like the request body and timestamp) that are only defined downstream of the connection itself.
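To give a sense of what a connector would have to do, here's a simplified sketch (standard-library Python only) of just the signing-key derivation step from AWS's SigV4 specification; the helper names are mine, and the canonical-request hashing that precedes the final signature is omitted:

import hmac
import hashlib

def _hmac_sha256(key: bytes, msg: str) -> bytes:
    # One round of HMAC-SHA256, the primitive SigV4 chains repeatedly
    return hmac.new(key, msg.encode('utf-8'), hashlib.sha256).digest()

def derive_signing_key(secret_key: str, date_stamp: str, region: str, service: str) -> bytes:
    # SigV4 derives a fresh signing key from the secret key, the current date,
    # the region and the service name, chained through four HMAC rounds.
    k_date = _hmac_sha256(('AWS4' + secret_key).encode('utf-8'), date_stamp)  # date_stamp like '20250724'
    k_region = _hmac_sha256(k_date, region)
    k_service = _hmac_sha256(k_region, service)
    return _hmac_sha256(k_service, 'aws4_request')

# The Authorization header is then an HMAC of a "string to sign" that includes a
# SHA-256 hash of the full request body, which is why the signature can only be
# produced once the request itself is known.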
In short, it's not going to happen.
Second, the whole point of using SigV4 is to force outside parties through a robust security pathway in order to access data inside a particular AWS tenant. The whitepapers describing how to place a Microsoft Gateway inside the fence are just bypassing SigV4 entirely, and that approach requires a degree of trust that just isn't going to be there for most third-party APIs that leverage AWS to surface data to clients.
So what can we do in lieu of a specific connector? Well, if you just need to fetch data for ETL purposes, either as a one-off or on a schedule, then you can use a Notebook in conjunction with a Python library that does all the heavy lifting for you: namely, botocore.
Here's the Python if you're starting without a session token and need to generate one along with temporary keys:
%pip install botocore   # run this in its own notebook cell; requests is already available

import json

import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest
from botocore.credentials import Credentials

# Endpoint and credential configuration -- replace each placeholder
temp_cred_url = 'insertvalue'   # endpoint that exchanges long-lived keys for temporary ones
accessKey = 'insertvalue'
secretKey = 'insertvalue'
apiKey = 'insertvalue'
region = 'insertvalue'          # AWS region the API is hosted in
service_name = 'insertvalue'    # AWS service name used in the SigV4 scope
api_url = 'insertvalue'         # the data endpoint you actually want to query

# Step 1: exchange the long-lived keys for temporary credentials
payload = json.dumps({
    "accessKey": accessKey,
    "secretKey": secretKey
})
headers = {
    'x-api-key': apiKey,
    'Content-Type': 'application/json'
}
response = requests.request("POST", temp_cred_url, headers=headers, data=payload)
tmp_creds = response.json()
tmp_accessKey = tmp_creds.get('accessKey')
tmp_secretKey = tmp_creds.get('secretKey')
tmp_sessionToken = tmp_creds.get('sessionToken')

# Step 2: build the real request body
payload = json.dumps({
    # ---insert API body here---
})

# Step 3: sign the request with SigV4 and send it
credentials = Credentials(tmp_accessKey, tmp_secretKey, tmp_sessionToken)
request = AWSRequest(method='POST', url=api_url, headers=headers, data=payload)
SigV4Auth(credentials, service_name, region).add_auth(request)
response = requests.request('POST', api_url, headers=request.headers, data=payload)

df = json.loads(response.text)
display(df)
If your API source doesn't use temporary keys, just skip the first request entirely and pass your long-lived keys straight into Credentials. Modify the headers initialization based on your endpoint's requirements, and you should be good to go. Similarly, if the endpoint expects a GET with no body, adjust accordingly. Once you get past testing and start to implement this, you'll likely drop it into a Pipeline, in which case you'll want to replace display(df) with:
notebookutils.notebook.exit(response.text)
...which makes the API response value accessible to downstream activities in your Pipeline. You'll also want to store your keys outside of your Python code rather than hard-coding them, as a best practice.
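For example, if the keys live in Azure Key Vault, a Fabric notebook can fetch them at runtime with notebookutils rather than hard-coding them. A minimal sketch, where the vault URL and secret names are placeholders:

# notebookutils is available by default in Fabric notebooks
vault_url = 'https://insertvalue.vault.azure.net/'
accessKey = notebookutils.credentials.getSecret(vault_url, 'aws-access-key')   # placeholder secret names
secretKey = notebookutils.credentials.getSecret(vault_url, 'aws-secret-key')
apiKey = notebookutils.credentials.getSecret(vault_url, 'aws-api-key')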
Now, I mentioned earlier that this satisfies ETL purposes, in that you would be storing the results somewhere and perhaps then accessing it with Power BI for visualization. If, however, you're looking to use Power BI more like an app and have it dynamically fetch the data straight from the API via DirectQuery... this is possible, but requires an extra step.
You still need the Python code to perform the actual fetch, because again the Web connector can't do it. But what you can do is store the Python as an API-callable function, say as an Azure Function (or, and this is still in Preview as I write this, a Fabric User-Defined Data Function). This way you're essentially setting up an internal (to Fabric) API endpoint that doesn't require SigV4 authentication. You can then use Power BI's Web connector to call this function with an authentication method it does support; the function uses your Python to call the SigV4 endpoint and passes the results back as its output, and Power BI triggers all of this as a DirectQuery. You should even be able to parameterize this if required, as the Function can accept parameters from the Web connector and inject them into your Python call as headers or body, as needed.
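As a rough illustration of the wrapper half of this, here's a minimal sketch of such an Azure Function using the Python v2 programming model; the route, environment-variable names and request shape are placeholders I've made up, and the temporary-credential exchange from the Notebook version is omitted for brevity:

import json
import os

import azure.functions as func
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest
from botocore.credentials import Credentials

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="sigv4-proxy", methods=["POST"])
def sigv4_proxy(req: func.HttpRequest) -> func.HttpResponse:
    # Keys, region, service and target URL come from app settings rather than code.
    credentials = Credentials(os.environ["AWS_ACCESS_KEY"], os.environ["AWS_SECRET_KEY"])
    api_url = os.environ["API_URL"]

    # Whatever body the Web connector sends is forwarded as the SigV4-signed payload.
    payload = json.dumps(req.get_json())
    headers = {'x-api-key': os.environ["API_KEY"], 'Content-Type': 'application/json'}

    request = AWSRequest(method='POST', url=api_url, headers=headers, data=payload)
    SigV4Auth(credentials, os.environ["SERVICE_NAME"], os.environ["AWS_REGION"]).add_auth(request)
    response = requests.post(api_url, headers=dict(request.headers), data=payload)

    # Return the AWS response as-is so Power BI can parse it.
    return func.HttpResponse(response.text, status_code=response.status_code,
                             mimetype="application/json")

The Power BI Web connector then points at this Function's URL (passing the function key) instead of at the AWS endpoint directly.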
It adds some overhead, but this is the most straightforward solution I've come up with for accessing AWS SigV4 sources from Fabric. I hope it helps someone!
Hi @Jim_Bob ,
Maybe you can use the Web connector.
And I hope these links will help you:
https://docs.aws.amazon.com/whitepapers/latest/using-power-bi-with-aws-cloud/connecting-the-microsof...
https://docs.aws.amazon.com/whitepapers/latest/using-power-bi-with-aws-cloud/appendix-microsoft-powe...
ARCHIVED: Using Microsoft Power BI with the AWS Cloud
Best Regards,
Dino Tao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.