
Error when loading data in Datamart


Hi,

 

I am trying to add dataflows to a datamart; however, when loading the data into the datamart, I get an error saying my date column is not a valid date/time field. Please see the error below.

 

Wrapping things up

InternalError
ErrorMessage: String was not recognized as a valid DateTime.
HttpStatusCode: 500

 

I suspect it is because the locale seems to reset itself (to United States) after you load data in the dataflow.

 

Could you please have a look?

 

thanks,

 

Stephan

Status: Accepted

Thanks to @Anonymous for sharing; users who have faced the same issue can try this workaround:

 

It takes a valid table and a list of column names as parameters and assumes those columns are of type text.

 

Feel free to use and adapt it (e.g. making it more generic for different time zones).

 

Function TransformSQLDTZ () :

 

let
    // Strips the "+01:00"/"+02:00" offsets and replaces the ISO "T" separator with a space in the listed text columns.
    Quelle = (Tab as table, cols as list) =>
        let
            Quelle1 = Table.ReplaceValue(
                Table.ReplaceValue(
                    Table.ReplaceValue(Tab, "+01:00", "", Replacer.ReplaceText, cols),
                    "T", " ", Replacer.ReplaceText, cols),
                "+02:00", "", Replacer.ReplaceText, cols)
        in
            Quelle1
in
    Quelle


Implementation (copy this query code snippet into your query and adapt "StepBeforeDateConversion" to the name of your previous step):


// The second step is renamed so it does not shadow the TransformSQLDTZ function query (avoids a cyclic reference in M).
ChangeDatestoText = Table.TransformColumnTypes(StepBeforeDateConversion, {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),
CleanedDTZ = TransformSQLDTZ(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"}),


After those amendments you should be able to import datetimes/datetimezones successfully into the datamart (as text values).
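For context, a hypothetical end-to-end query using this workaround could look as follows (the source name MyDataflowTable is a placeholder; the column names are taken from the snippet above):

let
    // Placeholder for whatever step precedes the date handling in your query.
    StepBeforeDateConversion = MyDataflowTable,
    // Convert the affected columns to text, then strip the timezone markers with the function above.
    ChangeDatestoText = Table.TransformColumnTypes(StepBeforeDateConversion, {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),
    CleanedDTZ = TransformSQLDTZ(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"})
in
    CleanedDTZ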

 

 

Best Regards,
Community Support Team _ Yingjie Li

Comments
jackstrudleywsh
Frequent Visitor

I've been seeing nothing but issues when it comes to pulling Dataflows into the Datamart.

Initially, if I selected too many entities it would crash, so I had to start doing them one at a time.

I've attempted to ingest two working Premium Dataflows, and for one of them it seems to be working. For the other, I'm unable to see the data from those entities within the Datamart, and when you look at the data in the generated Dataset, some of the entities show data whereas others are completely blank.

The Datamart had been refreshing successfully, but now that has stopped working. Initially I could view the refresh history, where I'd just see a failure without any message as to the cause, but now the refresh history is not even accessible.

JP8991
Advocate II

I think Datamarts are going to be a game changer, but like @jackstrudleywsh I am having loads of problems too.

 

It feels like this should be at the alpha development stage, not in public preview, to be honest.

I am confident Microsoft will get this fixed though, very excited to have this up and running.

v-yingjl
Community Support
Status changed to: Investigating

As far as my tests go, the datamart loads columns correctly on my side.

Even a datetime column that I created with an M function loads correctly:

The Local column is created by the DateTime.FixedLocalNow() function.
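For anyone who wants to reproduce the test, here is a minimal sketch of how such a column can be added (the source table and column names are placeholders for this illustration):

let
    // Throwaway placeholder table just for the illustration.
    Source = #table({"Id"}, {{1}, {2}}),
    // Add a datetime column built from DateTime.FixedLocalNow(), as described above.
    AddLocal = Table.AddColumn(Source, "Local", each DateTime.FixedLocalNow(), type datetime)
in
    AddLocal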


 

So it looks like a regional issue rather than a global one, since it works for some users and not for others.

 

Perhaps you can try creating a new test datamart with the same dataflow in a different workspace to check this issue again. If it still has the same issue and it is urgent for you, you can create a support ticket (linked at the end of the page) for faster investigation.


 

 

Best Regards,
Community Support Team _ Yingjie Li

TototheK
Regular Visitor

Thanks for the reply.

The error message is misleading. I had two datetime columns (based on Power Query M functions) and one date column. The issue was NOT one of the datetime columns, but the date column.

 

How I fixed it:

I transformed the column type using the regional (locale) date type instead of just date. After that, the import to the datamart works fine.
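For reference, that is the "Change Type > Using Locale" option in the Power Query UI; in M it corresponds to passing a culture to Table.TransformColumnTypes. A minimal sketch (the step, column and culture names are placeholders):

// "Using Locale" type change; "PreviousStep", "MyDateColumn" and "de-DE" are placeholders.
ChangedWithLocale = Table.TransformColumnTypes(PreviousStep, {{"MyDateColumn", type date}}, "de-DE")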

v-yingjl
Community Support
Status changed to: Investigating

Thanks for sharing; anyone who has the same issue may want to try this as well.

 

Best Regards,
Community Support Team _ Yingjie Li

Anonymous
Not applicable

Hey

 

Same issue here, but switching the column type to DateTimeZone resolved it.
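In case it helps, the change I mean is just the target type of the conversion; a minimal sketch with placeholder step and column names:

// Placeholder names; the relevant part is the target type of datetimezone.
ChangedToDTZ = Table.TransformColumnTypes(PreviousStep, {{"MyDateColumn", type datetimezone}})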

 

Regards

iwadmin
Regular Visitor

Getting the same issue when trying to ingest data from an existing Dataflow and also from an existing Power BI dataset already loaded to the service:

Wrapping things up

InternalError
ErrorMessage: String was not recognized as a valid DateTime.
HttpStatusCode: 500
 
Attempting to change the data type from 'Date' to 'Date Locale' returns this error during transformation:

Building the datamart ETL

DataflowContainsLinkedComputedEntities
ErrorMessage: Linked entities can't be modified. Please undo any changes made to the following linked entities and try again: MasteraDateTableParam0MasteraDateTable

 

RadouaneSbaa
New Member

I also see strange behavior with Dataflows.

I can set them up.

In Power Query from the Datamart, I can load them successfully.

But when I accept my changes, there are no rows to display after the Loading Data steps...

Really confusing...

 

[EDIT: It may be because I did not refresh at the Datamart level...

It does work, finally.

I left this comment for people struggling in the same "situation" as me...]

BaBallero
Frequent Visitor

@iwadmin 

Exactly the same issue here...

Anonymous
Not applicable

Hello all.

The problem arises whenever you load datetime or datetimezone types from a dataflow into a Power BI datamart.

For datetimes you can simply convert them to text before reusing them in a datamart.

 

However, for datetimezones it is more complicated, because the SQL Server behind the datamart does not interpret them correctly even if they are of text type.
The reason is the "+/-[HH:MM delay to GMT]" offset and possibly the "T" letter used in the timezone format.

For datetimezones, you need to change the values into a plain datetime format, while keeping the text type, before the import into the PBI datamart succeeds.

Just converting the type from DTZ to DT will not do (in many cases), because it will add/subtract the +/- timezone delay and thus shift the timestamp incorrectly.
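To make that shift concrete, here is a small sketch with an invented sample value: simply dropping the zone keeps the wall-clock time, whereas normalizing to another zone first moves the timestamp by the offset.

let
    // Invented sample value: 2022-06-01 10:00 at +02:00.
    Sample = #datetimezone(2022, 6, 1, 10, 0, 0, 2, 0),
    // Offset simply dropped: 2022-06-01 10:00 stays as-is.
    KeepsWallClock = DateTimeZone.RemoveZone(Sample),
    // Normalized to UTC first, then the zone dropped: 2022-06-01 08:00, shifted by two hours.
    ShiftedToUtc = DateTimeZone.RemoveZone(DateTimeZone.SwitchZone(Sample, 0))
in
    [KeepsWallClock = KeepsWallClock, ShiftedToUtc = ShiftedToUtc]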

I wrote a function that covers that conversion sufficiently for the German timezone offsets.

It takes a valid table and a list of column names as parameters and assumes those columns are of type text.

 

Feel free to use and adapt it (e.g. making it more generic for different time zones).

 

Function TransformSQLDTZ () :

 

let
    // Strips the "+01:00"/"+02:00" offsets and replaces the ISO "T" separator with a space in the listed text columns.
    Quelle = (Tab as table, cols as list) =>
        let
            Quelle1 = Table.ReplaceValue(
                Table.ReplaceValue(
                    Table.ReplaceValue(Tab, "+01:00", "", Replacer.ReplaceText, cols),
                    "T", " ", Replacer.ReplaceText, cols),
                "+02:00", "", Replacer.ReplaceText, cols)
        in
            Quelle1
in
    Quelle


Implementation (copy this query code snippet into your query and adapt "StepBeforeDateConversion" to the name of your previous step):


// The second step is renamed so it does not shadow the TransformSQLDTZ function query (avoids a cyclic reference in M).
ChangeDatestoText = Table.TransformColumnTypes(StepBeforeDateConversion, {{"AnalyticsUpdatedDate", type text}, {"CompletedDate", type text}, {"StartedDate", type text}}),
CleanedDTZ = TransformSQLDTZ(ChangeDatestoText, {"AnalyticsUpdatedDate", "CompletedDate", "StartedDate"}),


After those amendments you should be able to import datetimes/datetimezones successfully into the datamart (as text values).

 

 

Please mark this as a solution if applicable.