
QIAO
Helper II

OLE DB or ODBC error: [Expression.Error] We couldn't fold the expression to the data source.

When I try to use DirectQuery, I get an error: "OLE DB or ODBC error: [Expression.Error] We couldn't fold the expression to the data source."


It's a third-party ODBC driver, and it works fine in Import mode.

26 REPLIES
sameeksha3006
Frequent Visitor

@QIAO @Anonymous @yan @Arebgb612 

I was able to resolve this issue by moving my report-level DAX fields to the backend source. It looks like Power BI is not able to fold the data because I was using a report-level DAX field inside another field (shown below).

Original Formula

DAX_Column=COMBINEVALUES(" ", column1_fromsource, column2_fromsource, column3_fromsource, column4_fromreportdax)

 

New Formula

DAX_Column=COMBINEVALUES(" ", column1_fromsource, column2_fromsource, column3_fromsource, column4_fromsource)

Note: I moved column4_fromreportdax to my backend source and then used that new field in my source query.

All of this worked when my source was SAP HANA; I had to make the above changes when I switched the source to Snowflake.

 

Anonymous
Not applicable

Hi, 

I'm not sure if my problem is related to this one, but I would appreciate any help here, please.

I have the following error message:

OLE DB or ODBC error: [Expression.Error] We couldn't fold the expression to the data source. Please try a simpler expression.


I am connected to a PostgreSQL database in Azure in DirectQuery mode, and I think something is wrong with the data type (which I cannot edit in Transform data, since that is not supported in DirectQuery mode), because when I look at the data in Transform data the data type is correct.


But when I try to build my model, Power BI doesn't recognize the data type, and therefore it is unable to create my graph.


Could you please help me?

Many thanks in advance!

Marina

 

Check what data type 'fecha_resultado' is; it most probably isn't a supported type in Power BI. Since you can't transform on the fly (DirectQuery mode), cast or convert that field to a supported type (string, numeric, ...) at the DBMS level, here Postgres.

How? Create a SQL view based on the source table and use that view as the source in Power BI instead.
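To make that concrete, here is a minimal Power Query (M) sketch of pointing Power BI at such a view; the view name public.v_resultados_pbi and the server/database names are hypothetical placeholders:

let
    // Connect to the Azure Database for PostgreSQL server (placeholder names).
    Source = PostgreSQL.Database("myserver.postgres.database.azure.com", "mydb"),
    // Use the view (where the casts were done in SQL) instead of the base table.
    Resultados = Source{[Schema = "public", Item = "v_resultados_pbi"]}[Data]
in
    Resultados

The view itself would cast fecha_resultado to a plain timestamp (or text) so that DirectQuery only ever sees a supported type.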

Anonymous
Not applicable

Many thanks for your help Yan!

 

I have created a SQL view and the same thing happens...


 

 

I have also tried with a native query, with the same result:

= Value.NativeQuery(PostgreSQL.Database ............#(lf)p.fecha_pedido::timestamp,#(lf)r.fecha::timestamp as fecha_resultado,................, null, [EnableFolding=true])
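For reference, a fuller sketch of that same Value.NativeQuery pattern; the server, database, table and join names below are hypothetical stand-ins for the parts elided above:

let
    Source = PostgreSQL.Database("myserver.postgres.database.azure.com", "mydb"),
    // EnableFolding = true lets Power Query fold later steps back into this native query where it can.
    Query = Value.NativeQuery(
        Source,
        "select p.fecha_pedido::timestamp, r.fecha::timestamp as fecha_resultado from pedido p join resultado r on r.pedido_id = p.id",
        null,
        [EnableFolding = true]
    )
in
    Query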

 

Any other suggestions I can try, please?

 

Many thanks for your help again!

Marina

I see. Since these are meant to be timestamps, either the formatting is off (which would be odd, as the Postgres-to-Power BI data type mapping is standard), or you have an invalid value on the Postgres side (like 00:00:00).

To confirm, cast the fecha fields in your view to string, do a temporary Import-mode load in Power BI, and transform them there to dates. Power BI should tell you which value it couldn't convert. Then adapt your view to clean those fields before trying DirectQuery again.
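A small M sketch of that check, assuming a temporary Import-mode query named Resultados in which fecha_resultado arrives as text (both names are placeholders):

let
    Source = Resultados,
    // Keep only the rows whose text value cannot be parsed as a date/time,
    // so the offending values are easy to spot and then clean up in the view.
    BadRows = Table.SelectRows(Source, each (try DateTime.FromText([fecha_resultado]))[HasError])
in
    BadRows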

Power BI is pretty bad at telling you which value and record it could not coerce (convert implicitly); the generic message doesn't help troubleshooting at all.

Anonymous
Not applicable

Hi Yan,

 

I switched from DirectQuery to Import mode, and Power BI immediately recognized the column as Date :'( Therefore, I assume the date format in Postgres is correct and the problem is in DirectQuery mode... could it be a bug in Power BI? Do you know how we can report it to Microsoft so they can fix it?

 

Just to share our conclusions: what we are going to do is create three columns in the Postgres database with the year, month and day, build some graphs with those, and see if that is enough to represent what we need...

 

Many many thanks for your help, it has been very helpful!!

Anonymous
Not applicable

This is something I'm still digging into, but in my understanding the folding error has to do with query run-time data type mapping and the associated flags.

I'm trying to roll my own ODBC DirectQuery solution because we have some proprietary data types which have to be handled. A lot of this comes down to what is included in the custom connector (https://github.com/Microsoft/DataConnectors/blob/master/docs/odbc.md). Here's what I've learned so far (a hedged sketch of the matching connector-side overrides follows the list):


Super common folding errors:

 
  • "Data Type of column order_status with searchable property UNSEARCHABLE should be SEARCHABLE or ALL_EXCEPT_LIKE." = Text column not being recognized as searchable (One of my proprietary text columns which didn't get correctly cast to WLONGVARCHAR) (ex. sort functions)
  • "This ODBC driver doesn't support SQL_FN_TD_TIMESTAMPDIFF. You can override this by using SqlGetInfo for SQL_TIMEDATE_FUNCTIONS." = Date column that is being cast to Timestamp for an odd reason and so Date math cannot be applied. (ex. date filters)

  • "This ODBC Driver does not support SQL_GB_NO_RELATION. You can override this by using SqlCapabilities" = Driver does not support the highest level of Group By Requirements, has to be set in custom driver as another one of these values:
    SQL_GB_NOT_SUPPORTED = 0,
    SQL_GB_GROUP_BY_EQUALS_SELECT = 1,
    SQL_GB_GROUP_BY_CONTAINS_SELECT = 2,
    SQL_GB_NO_RELATION = 3,
    SQL_GB_COLLATE = 4
    (ex. group numeric by name)

  • "This ODBC driver doesn't support SQL_AF_SUM. You can override this by using SqlGetInfo for SQL_AGGREGATE_FUNCTIONS" = Flag not set to allow standard aggregate functions / not picked up in Sql92Conformity parameter. 
    (ex. any numeric "default summarizations")
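A hedged sketch of what those overrides can look like in a custom connector's M, following the pattern described in the DataConnectors odbc.md linked above. The option field names should be verified against that doc, and the numeric values are plain ODBC constants/bitmasks (check sqlext.h against what your driver actually supports):

let
    // connectionString would normally be built from the data source path/options
    // passed into the connector; a hypothetical DSN is used here.
    connectionString = [DSN = "MyOdbcDsn"],
    OdbcSource = Odbc.DataSource(connectionString, [
        HierarchicalNavigation = true,
        // Declare the GROUP BY shape the driver actually supports.
        // 2 = SQL_GB_GROUP_BY_CONTAINS_SELECT, per the value list above.
        SqlCapabilities = [
            GroupByCapabilities = 2
        ],
        // Override what the driver reports via SQLGetInfo.
        // 0x7F ORs the SQL_AF_* bits together so SUM/COUNT/etc. are treated as supported;
        // SQL_TIMEDATE_FUNCTIONS can be overridden the same way for TIMESTAMPDIFF.
        SQLGetInfo = [
            SQL_AGGREGATE_FUNCTIONS = 0x7F
        ]
    ])
in
    OdbcSource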

At least... that's how far I've gotten so far. If you're interested in contributing, here's my repo. 

https://github.com/sgoley/DirectQuery-for-ODBC-in-PowerBI

SamTrexler
Helper IV

I am having the same issue with the Amazon Redshift connector in Direct Query mode. Including any text column in the query produces this error when I try to create a slicer, table or matrix visual. Number columns work fine.

 

I have reduced the number of columns retrieved, and it doesn't matter if I restrict the number of rows retrieved by setting filters on date columns in Query Editor. Even a small number of rows and columns fails when I try to create a visual.

@SamTrexler I have faced the exact same issue even after more than 2 years ☹️! Any workaround for this?

https://github.com/microsoft/DataConnectors

Please read the above URL; it should point you to a solution.

I solved the problem using ideas from that link. Honestly, I don't remember much, as it's the first and last connector I wrote. I borrowed some ideas from the Kylin ODBC driver and wrote an ODBC driver to connect to AWS Athena using DirectQuery two years ago.

Sam, enabling diagnostics tracing in Power BI could give you more details on why folding isn't working. I understand MS may push the fix back onto the driver to make it compliant (e.g. specific attribute values), but we should at least get guidance on what to request from the driver's author.

For BigQuery, the driver is embedded in PBI desktop so it's all in MS hands...

v-huizhn-msft
Microsoft Employee

Hi @QIAO,

You can connect to the database in DirectQuery, but when you try to create a visual it displays the error message, right? What version of Power BI Desktop do you use? Please update it to the latest one and check whether it works.

Best Regards,
Angelia

I am getting the same error with a custom data connector on ODBC that I built for Apache Kylin. Preview and schema work fine, but data loading fails with the same error.

Hi,

Have you solved your problem? I ran into the same issue when trying to connect ClickHouse to Power BI using DirectQuery.

Hi, I have the same problem with the BigQuery connector.

@pstanek what's the data type of the aggregated field in BigQuery? If it's NUMERIC, I just reported a bug about that.

 

Unfortunately MS / Power BI doesn't make the SQL data type mapping (ODBC driver <--> PBI) configurable. There are many sources and ODBC drivers out there, and PBI enables folding only if the driver returns all the criteria for a given data type.

It would really be great if such mappings and attributes were editable, in a config file for instance. That would allow us to "force" query folding (i.e. send native SQL to the source) where it is actually supported, and as such increase acceptance of PBI.


 

Hello, I get the same error when connecting to Snowflake using Direct Query. I am using the latest version of Power BI (April 2018). When I try to use any field from the Date dimension or a Date field in the Fact table in a visual, I get the error posted above. 

 

 

Anonymous
Not applicable

Hi, I see this is an old post, but I am getting the same error when I change Snowflake from Import mode to DirectQuery. Were you able to solve the issue? Any specific pointers for Snowflake?
