Norm
Advocate I

Enhanced metadata with Direct Query SQL Server = Many sys.* queries


Hi,

 

In the Power BI Service, when I open a DirectQuery SQL Server report that was generated with the Enhanced Metadata setting, I see that when the dataset is loaded, a LOT of queries are sent to the SQL Server to retrieve various metadata. Is that normal?

 

For example, I have a report that has only one table (MySchema.Table1), and the only visual in it is a card displaying the number of rows. When I'm not using Enhanced Metadata, the only query sent to the SQL Server is "select count(*) from MySchema.Table1". But if the same report was generated using Enhanced Metadata, then I see queries like these sent to the SQL Server:

 

select t.[TABLE_CATALOG], t.[TABLE_SCHEMA], t.[TABLE_NAME], t.[TABLE_TYPE], tv.create_date [CREATED_DATE], tv.modify_date [MODIFIED_DATE], cast(e.value as nvarchar(max)) [DESCRIPTION]
from [INFORMATION_SCHEMA].[TABLES] t
join sys.schemas s on s.name = t.[TABLE_SCHEMA]
join sys.objects tv on tv.name = t.[TABLE_NAME] and tv.schema_id = s.schema_id and tv.parent_object_id = 0
left outer join sys.extended_properties e on tv.object_id = e.major_id and e.minor_id = 0 and e.class = 1 and e.name = 'MS_Description'

select
s.name [TABLE_SCHEMA],
o.name [TABLE_NAME],
c.name [COLUMN_NAME],
cast(c.column_id as bigint) [ORDINAL_POSITION],
c.is_nullable [IS_NULLABLE],
case when (t.is_user_defined = 0 and t.name is not null) then t.name when (c.system_type_id = 240 or t.name is null) then 'udt' else t_system.name end [DATA_TYPE],
case when (c.system_type_id in (59, 62)) then 2 when (c.system_type_id in (48, 52, 56, 60, 104, 106, 108, 122, 127)) then 10 else null end [NUMERIC_PRECISION_RADIX],
c.precision [NUMERIC_PRECISION],
case when (c.system_type_id in (59, 62)) then null else c.scale end [NUMERIC_SCALE],
case when (c.system_type_id in (40, 41, 42, 43, 58, 61)) then c.scale else null end [DATETIME_PRECISION],
case when (c.system_type_id in (231, 239)) then floor(c.max_length / 2) when (c.system_type_id in (165, 167, 173, 175)) then c.max_length else null end [CHARACTER_MAXIMUM_LENGTH],
cast(e.value as nvarchar(max)) [DESCRIPTION],
d.definition [COLUMN_DEFAULT],
cc.definition [COLUMN_EXPRESSION],
case when c.is_identity = 1 or c.is_computed = 1 or t.system_type_id = 189 or c.generated_always_type > 0 then 0 else 1 end [IS_WRITABLE]
from sys.objects o
join sys.schemas s on s.schema_id = o.schema_id
join sys.columns c on o.object_id = c.object_id
left join sys.types t on c.user_type_id = t.user_type_id
left join sys.types t_system on t.system_type_id = t_system.user_type_id
left join sys.default_constraints d on d.object_id = c.default_object_id
left join sys.computed_columns cc on c.object_id = cc.object_id and c.column_id = cc.column_id
left join sys.extended_properties e on o.object_id = e.major_id and c.column_id = e.minor_id and e.class = 1 and e.name = 'MS_Description'
where (s.name = N'MySchema' and o.name = N'Table1')


Is it normal that Power BI executes all those queries? My problem is that with a large dataset (for example, 30 tables), those queries can take almost 30 seconds, so a report that used to display data in 2 seconds now takes 32 seconds.
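Not part of the original post, but one way to quantify that overhead: if the metadata queries are still in the plan cache, their count and cumulative elapsed time can be read from the DMVs. A sketch (requires VIEW SERVER STATE; the LIKE filters are assumptions matching the query text shown above):

```sql
-- Sum the cached cost of the catalog/metadata queries Power BI sends.
-- total_elapsed_time is reported by the DMV in microseconds.
SELECT TOP (20)
    qs.execution_count,
    qs.total_elapsed_time / 1000 AS total_elapsed_ms,
    LEFT(st.text, 100) AS query_text_start
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%sys.extended_properties%'
   OR st.text LIKE '%INFORMATION_SCHEMA%'
ORDER BY qs.total_elapsed_time DESC;
```

Comparing execution_count before and after a report load also shows whether the queries run once per load or once per table.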

I'm using the latest version of Power BI Desktop and the July version of the gateway.

 

Thanks!

2 REPLIES

lbendlin
Super User

Yes, the metadata queries seem to have multiplied over the last few releases, with many data sources now spending a lot of time in the "Evaluating" stage.

 

Can you see these metadata queries consistently, or are they only run occasionally/on query structure changes? I am not seeing a consistent impact but haven't figured out the true behavior yet. Running Fiddler traces at all times is not sustainable, I fear.
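One server-side alternative to running Fiddler continuously: an Extended Events session on the SQL Server that captures only batches touching the metadata catalogs. This is a sketch, not a tested recipe; the session name, target filename, and filter string are assumptions you would adjust for your server:

```sql
-- Capture completed batches whose text references sys.extended_properties,
-- which appears in the Power BI metadata queries shown above.
CREATE EVENT SESSION [pbi_metadata_trace] ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.client_app_name, sqlserver.client_hostname)
    WHERE sqlserver.like_i_sql_unicode_string(batch_text,
          N'%sys.extended_properties%')
)
ADD TARGET package0.event_file (SET filename = N'pbi_metadata_trace');
GO
ALTER EVENT SESSION [pbi_metadata_trace] ON SERVER STATE = START;
GO
-- Later: read the captured events (duration, batch text, client app).
SELECT CAST(event_data AS xml) AS event_xml
FROM sys.fn_xe_file_target_read_file('pbi_metadata_trace*.xel',
                                     NULL, NULL, NULL);
```

Leaving the session running and checking the file after each report load would show exactly when the metadata queries fire.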

Norm
Advocate I

I haven't done extensive testing, but so far what seems to trigger the queries is the first DAX query execution.

For example, I upload my report to the workspace, connect to the dataset using DAX Studio, and do the following:
- Run the query EVALUATE ROW("dummy", 1) -> no metadata queries are run.
- Run the query EVALUATE ROW("my measure", [My measure]) -> all the metadata queries are run.
- Run the query EVALUATE ROW("my measure", [My measure]) again -> no metadata queries are run.
- Disconnect from DAX Studio, reconnect, and run EVALUATE ROW("my measure", [My measure]) again -> no metadata queries are run.

My workspace is on an Azure Embedded capacity. If I restart the capacity and run the query EVALUATE ROW("my measure", [My measure]) again, all the metadata queries are run the first time.

We want to use the new Enhanced Metadata feature so we can use calculation groups, perspectives, etc., but after seeing that huge performance hit the first time the report is loaded, I'm very worried.
