Anonymous
Not applicable

Azure Analysis Services and on-premises data gateway

I am trying out a Microsoft business intelligence solution that consists of an on-premises database/data warehouse running on SQL Server 2014 on a Windows server. The same SQL Server machine also runs SSIS, with a set of SSIS packages stored in the SSIS Catalog. An SSAS Tabular model runs in Azure Analysis Services, and the Power BI reports use an SSAS live connection.

 

Well, here is the problem. The customer does not yet have an Azure subscription, so I have been testing with free accounts, but this will be fixed soon. My problem is getting the on-premises data gateway to work with Azure Analysis Services. I should mention that the customer already has an on-premises data gateway running on another machine, using a Windows service user that they created.

Here is what I did. I created access to the data warehouse for that on-premises Windows service user. To create a gateway in Azure, I had to ask the DB admin to help me register an on-premises gateway in Azure from the on-premises data gateway dialog, using the account registered for that service (an onmicrosoft.com mail), not mine. So far so good. He helped me sign in to the Azure portal with that on-premises service account, and there I created a free account, set up Analysis Services, and connected it to the data gateway. I also gave my @onmicrosoft.com account admin rights in Azure. Then we logged out of that Azure account. From Visual Studio 2017 I could now deploy the SSAS Tabular project using "Do Not Process". Just before deploying, I changed the data source setting for the SSAS Tabular model to the IP address (like 10.xxx.x.xxx,1433), set impersonation to Service Account, clicked Save, unchecked encryption, and set the privacy level to Public (I will change that later). Then I deployed, and that went OK.
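For reference, after those changes the data source entry in the tabular project's Model.bim (compatibility level 1200) ends up looking roughly like the sketch below. The data source name, the provider, and the Initial Catalog value mydb are placeholders here; the server address and the impersonation mode are the settings I actually changed:

{
  "name": "SqlServer DW",
  "connectionString": "Provider=SQLNCLI11;Data Source=10.xxx.x.xxx,1433;Initial Catalog=mydb;Integrated Security=SSPI;Persist Security Info=false",
  "impersonationMode": "impersonateServiceAccount"
}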

 

BUT I need to process the cube, either from SSMS or from an SSIS package. I had hoped that my @onmicrosoft.com account, which also has admin rights in Azure, could be used. Maybe I have to use the other @onmicrosoft.com account that was created in Azure, but I don't have the password for that. I believe this is a credential problem, but I am asking if there is a way around it. In SSMS, I tried running the code below, using my @onmicrosoft.com account as the user name with the MTA setting, and asazure://northeurope.asazure.windows.net/xxx as the server name:

 

{ "refresh": { "type": "automatic", "objects": [ { "database": "My Cube" } ] } }

It runs for some seconds, but then it fails with this message:

 

The JSON DDL request failed with the following error: Failed to execute XMLA. Error returned: 'An error occurred during On-Premise Gateway related activity. Additional error details: DM_GWPipeline_Gateway_DataSourceAccessError
Received error payload from gateway service with ID 371137: An exception encountered while accessing the target data source.
An exception encountered while accessing the target data source
The specified length exceeds maximum capacity of SecureString.
Parameter name: length

Technical Details: RootActivityId: f4989df9-60c3-445f-8ef7-85fa9f7c48ac Date (UTC): 3/16/2019 8:00:33 AM 0: PFError::SetLastError() line 2160 + 0x0 (sql\picasso\engine\src\pf\eh\pferror.cpp) 1: PFSetLastError() line 2918 + 0x0 (sql\picasso\engine\src\pf\eh\pferror.cpp) 2: PFSetLastErrorExTag() line 3474 + 0x27 (sql\picasso\engine\src\pf\eh\pferror.cpp) 3: 0x00007FF913041541 (symbolic name unavailable)

I hope somebody can tell me what I am doing wrong or point me in the right direction here.

 

Regards

 

Geir

v-piga-msft
Resident Rockstar

Hi @Anonymous ,

Have you solved your problem?

If you have solved it, please accept the reply that helped as the solution, so that people who have the same question can find the answer directly.

If you still need help, please feel free to ask.

Best Regards,

Cherry

 

 

 

Community Support Team _ Cherry Gao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
v-piga-msft
Resident Rockstar

Hi @Anonymous ,

 

I am a little confused about your scenario.

 

Is the on-premises gateway you mentioned the on-premises data gateway used by Power BI?

 

If it is, I'm afraid that when your data source is Azure Analysis Services, you don't need to configure the on-premises gateway in the Power BI service.

 

If you really want to manage the cloud source through the gateway, you can select the option "Allow user's cloud data sources to refresh through this gateway cluster" under Gateway Cluster Settings when setting up the gateway, then click Apply.

 

In addition, if you want to achieve this ("I had hoped that my @onmicrosoft.com account that also has admin rights on Azure could be used"), you should contact the global admin of your tenant. You could also get help from Azure support.

 

If you still have problems with Power BI, please feel free to ask.

 

Best Regards,

Cherry

Community Support Team _ Cherry Gao
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Anonymous
Not applicable

Hi again. Thanks for spending time on this. I have made some progress. I succeeded in creating an on-premises data gateway on the same server where my database is, and I connected it to an Azure account (still the free version).

I created a user on the database for the gateway service account NT Service\PBIEgwService, like this:

 

-- Create a Windows login for the gateway service account
USE master;
GO
CREATE LOGIN [NT Service\PBIEgwService] FROM WINDOWS;
GO
-- Map it to a database user in the data warehouse and grant read access
USE mydb;
GO
CREATE USER [NT Service\PBIEgwService] FOR LOGIN [NT Service\PBIEgwService];
GO
EXEC sp_addrolemember 'db_datareader', N'NT Service\PBIEgwService';
GO
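
As a sanity check (this is just an optional query, not one of the original steps), the following confirms that the gateway service account really ended up in db_datareader; mydb is the placeholder database name used above:

-- Run against the data warehouse database
USE mydb;
GO
SELECT m.name AS member_name, r.name AS role_name
FROM sys.database_role_members AS drm
JOIN sys.database_principals AS m ON drm.member_principal_id = m.principal_id
JOIN sys.database_principals AS r ON drm.role_principal_id = r.principal_id
WHERE m.name = N'NT Service\PBIEgwService';
GO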

 

This works: I deployed my Visual Studio 2017 Analysis Services project to the Azure account, with impersonation set to Impersonate Service Account, and I was able to process the Azure Analysis Services model from SSMS.

So far so good. But in my customer's environment they already have an on-premises data gateway that is connected to the Active Directory service user domain\serviceuser. This on-premises data gateway is installed on a different machine, but inside the same domain of course. I tried to use the same approach, but this time I logged in to Azure with the Azure account that belongs to the customer's data gateway, if that makes sense. Everything seems identical, and the connection from Azure to the on-premises data gateway shows green. I then deployed to the new Azure Analysis Services instance (Do Not Process), and that was OK. Then I tried to process, and I got the following error:

 

Failed to save modifications to the server. Error returned: 'An error occurred during On-Premise Gateway related activity. Additional error details: DM_GWPipeline_Gateway_DataSourceAccessError
Received error payload from gateway service with ID 371137: An exception encountered while accessing the target data source.
An exception encountered while accessing the target data source
The specified length exceeds maximum capacity of SecureString.
Parameter name: length

 

I can't see what is different here or what is causing the error in the second approach. There are two different Azure accounts, but that is only to keep the two cases separate. Both are connecting to the same database, but the first uses the standard local on-premises service user and the second connects through the domain service user (set on the on-premises data gateway on the second machine).
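
For completeness, the access I set up for the customer's domain service account in the data warehouse roughly mirrors the block I posted above for NT Service\PBIEgwService; domain\serviceuser is only a placeholder name here:

-- Same grants as for the local gateway service account, but for the
-- customer's domain service account (placeholder name)
USE master;
GO
CREATE LOGIN [domain\serviceuser] FROM WINDOWS;
GO
USE mydb;
GO
CREATE USER [domain\serviceuser] FOR LOGIN [domain\serviceuser];
GO
EXEC sp_addrolemember 'db_datareader', N'domain\serviceuser';
GO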

 

I hope you and others understand what I was trying to explain here. If not, let me know and I will try to explain it better.

 
