I have been using this code for years to combine the same table from multiple SQL servers and databases. I need to make it dynamic so that I don't have to edit all my dataflows when the list changes.
let
    TableName = "SQLTable",
    Source = Table.Combine({
        Table.SelectRows(Sql.Database("SqlServer1", "SQLDB1"), each [Item] = TableName){[Schema = "dbo", Item = TableName]}[Data],
        Table.SelectRows(Sql.Database("SqlServer2", "SQLDB2"), each [Item] = TableName){[Schema = "dbo", Item = TableName]}[Data],
        Table.SelectRows(Sql.Database("SqlServer3", "SQLDB3"), each [Item] = TableName){[Schema = "dbo", Item = TableName]}[Data],
        Table.SelectRows(Sql.Database("SqlServer4", "SQLDB4"), each [Item] = TableName){[Schema = "dbo", Item = TableName]}[Data],
        Table.SelectRows(Sql.Database("SqlServer5", "SQLDB5"), each [Item] = TableName){[Schema = "dbo", Item = TableName]}[Data]
    })
in
    Source
I plan to query the list of servers/databases from a single SQL table, something like "Select ServerName, DatabaseName from ListOfDatabases Where DatabaseType=1"
In case it helps, I found this other code in an earlier post. It looks like a good start, but I haven't been able to tweak it to support different database names or to pull the list from the master table (ListOfDatabases).
let
    serverList = {"<SqlServer1>", "<SqlServer2>", ..., "<SqlServerN>"},
    fxReadData = (x as text) =>
        let
            Source = Sql.Database(x, "<MyDatabase>"),
            #"Navigation 1" = Source{[Schema = "dbo", Item = "MyTable"]}[Data]
        in
            #"Navigation 1",
    Custom = Table.Combine(List.Transform(serverList, fxReadData))
in
    Custom
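One way to extend that helper is to give it both a server and a database parameter and drive it from a list of records instead of a list of server names. This is a minimal sketch under that assumption; the connection list, server names, and table name are placeholders, not real values:

```m
let
    // Hypothetical hard-coded list of server/database pairs;
    // replace with your real connections
    connectionList = {
        [Server = "SqlServer1", Database = "SQLDB1"],
        [Server = "SqlServer2", Database = "SQLDB2"]
    },
    // Two-parameter version of fxReadData: both server AND database vary
    fxReadData = (server as text, database as text) =>
        let
            Source = Sql.Database(server, database),
            Nav = Source{[Schema = "dbo", Item = "SQLTable"]}[Data]
        in
            Nav,
    Combined = Table.Combine(
        List.Transform(connectionList, each fxReadData([Server], [Database]))
    )
in
    Combined
```

Note that each distinct server/database pair still needs its own credentials (or gateway connection) registered with the dataflow, which is where the dynamic-source limitation discussed below comes in.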
Putting this here for future reference...
I got some really cool code from GPT 4.1 which read my list of servers and databases from a SQL table, looped through them to combine all the tables. Also got a version which would let me use custom SQL to pull from each one instead of just a table. It all worked great in dataflow design view, but the dataflow would not save. It gave me the following error: "One or more tables references a dynamic data source." On researching this I realized there's probably no way around it, or at least no easy way, because... every time you add a new connection to a dataflow you also have to add the connection credentials (in my case a gateway connection). Even if the code would dynamically connect to a new server/database, the refresh would fail for lack of having the credentials.
For posterity's sake I am going to post the two versions of the code from GPT. (I don't have the actual working versions any more because I saved them in a comment block before the "let" statement of my old code and when I went back to retrieve it I discovered the DF dropped all the code. Argh!)
Here's the GPT code for running a custom SQL statement across servers. I scrubbed the code of real server, database, and table/column names, so hopefully I didn't jack it up too badly 😂
let
    DBConnectionQuery = "
        select [ServerName], [DatabaseName]
        from [ServersAndDatabases]
    ",
    DBConnections = Sql.Database(
        "<SQL server name>",
        "<SQL database name>",
        [Query = DBConnectionQuery]
    ),
    CustomSQL = "
        Select * from tblYourTable
        Where ...
    ",
    ConnectionRows = Table.ToRecords(DBConnections),
    GetTable = (r as record) =>
        let
            server = r[ServerName],
            database = r[DatabaseName],
            result = Sql.Database(server, database, [Query = CustomSQL])
        in
            result,
    YourTables = List.Transform(ConnectionRows, GetTable),
    Combined = Table.Combine(YourTables)
in
    Combined
Here's the version for aggregating a whole table across servers:
let
    // 1. Get the list of server/database pairs dynamically
    DBConnectionQuery = "
        select [ServerName], [DatabaseName]
        from [ServersAndDatabases]
    ",
    Connections = Sql.Database(
        "<SQL server name>",
        "<SQL database name>",
        [Query = DBConnectionQuery]
    ),
    // 2. Set the table name and schema
    TableName = "tblYourTable",
    SchemaName = "dbo",
    // 3. Convert to records for easier iteration
    ConnectionRows = Table.ToRecords(Connections),
    // 4. Function to get the required table for one connection
    GetTable = (r as record) =>
        let
            server = r[ServerName],
            database = r[DatabaseName],
            allTables = Sql.Database(server, database),
            tableRow = Table.SelectRows(
                allTables,
                each [Schema] = SchemaName and [Item] = TableName
            ),
            data = if Table.RowCount(tableRow) > 0
                then tableRow{0}[Data]
                else null // handle missing tables gracefully
        in
            data,
    // 5. Get all tables (some could be null if not present)
    AllTables = List.Transform(ConnectionRows, GetTable),
    // 6. Remove any nulls in case some tables do not exist
    PresentTables = List.RemoveNulls(AllTables),
    // 7. Combine into one table
    Combined = Table.Combine(PresentTables)
in
    Combined
Hi @Believer ,
If you get a chance, please review the responses shared by @lbendlin and @AnkitKukreja; both community members have explained the possible solutions well.
If you need any additional information, please let us know.
@lbendlin , Thanks for your continuous valuable inputs.
I left this question open in the hopes that someone else would have a more complete answer. Basically I heard "this is hard, don't do it if you don't have to" and "yes, this is hard but here's what you can do in theory..." 🙂 At this point I suppose I need to ask Google instead... <wink>
Hi @Believer,
As lbendlin mentioned, it's difficult to get this right. You have to create a table in M code and then invoke a function to use it. You can also use parameters to choose the server and database, create the tables in the dataflow, and then combine them.
You can also try Copilot for code tweaks.
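The combine step in that approach can stay entirely static, which sidesteps the dynamic-source error. This is a minimal sketch assuming each per-server table has already been built as its own query in the dataflow (the query names are placeholders):

```m
// Final query: each SqlServerN_Table is a separate query in the
// dataflow, each with its own static connection and credentials
let
    Combined = Table.Combine({SqlServer1_Table, SqlServer2_Table, SqlServer3_Table})
in
    Combined
```

Adding a new server then means adding one new static query and its connection, and appending it to this list, rather than editing dynamic connection logic.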
You can try, but dynamic sources are notoriously difficult to get right. The Formula Firewall will work against you on that. If your sources only change occasionally, stay with your static sources.