talam
New Member

How to Import large BigQuery data (8GB or more) into Fabric Data Warehouse

Hi,

I am trying to import data from BigQuery using Dataflow Gen2. My table is 8 GB of physical bytes in BigQuery. I am getting the following error, which seems to be caused by the size of the table. Can anyone explain what the alternatives are for importing the data into the Fabric environment? Thank you.
-----------------------------------------------
Pages_Views_Sessions_Users: Error Code: Mashup Exception Data Source Error, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Failed to insert a table., InnerException: #{0}: #{1}, Underlying error: ODBC: ERROR [HY000] [Microsoft][BigQuery] (100) HTTP Error 403: responseTooLarge (Response too large to return. Consider specifying a destination table in your job configuration. For more details, see https://cloud.google.com/bigquery/troubleshooting-errors). Details: Reason = DataSource.Error;Message = ODBC: ERROR [HY000] [Microsoft][BigQuery] (100) HTTP Error 403: responseTooLarge (Response too large to return. Consider specifying a destination table in your job configuration. For more details, see https://cloud.google.com/bigquery/troubleshooting-errors).;Detail = [DataSourceKind = "GoogleBigQuery", DataSourcePath = "GoogleBigQuery", OdbcErrors = error "Microsoft.Mashup.Engine1.Runtime.ValueException: [Expression.Error] Value was not specified.#(cr)#(lf) at Microsoft.Mashup.Engine1.Language.ValueCreator.CreateValueForThrow(IThrowExpression throwExpr)#(cr)#(lf) at Microsoft.Mashup.Engine1.Language.ValueCreator.<>c__DisplayClass23_0.<CreateValueForRecord>b__0(Int32 index)#(cr)#(lf) at Microsoft.Mashup.Engine1.Runtime.RecordValue.DemandRecordValue.get_Item(Int32 index)#(cr)#(lf) at Microsoft.Data.Mashup.ProviderCommon.MashupResource.TryGetValue(Func`1 getValue, IValue& value, String& errorMessage)#(cr)#(lf)Record"];Message.Format = #{0}: #{1};Message.Parameters = {"ODBC", "ERROR [HY000] [Microsoft][BigQuery] (100) HTTP Error 403: responseTooLarge (Response too large to return. Consider specifying a destination table in your job configuration. For more details, see https://cloud.google.com/bigquery/troubleshooting-errors)."};Microsoft.Data.Mashup.Error.Context = User (Request ID: 2daea118-d580-4466-b8b7-4935b06bafb0).
------------------------------------------

5 REPLIES
pqian_MSFT
Microsoft Employee

Can you raise a ticket with Google? BQ operates in two modes: if the result set is small, it returns it inline; if it is large enough, it first buffers it to an external table and points us to that table to download. The default limit for the external-table case is 10 GB, and when that is exceeded you get this error. However, the BQ doc says:

https://cloud.google.com/knowledge/kb/bigquery-response-too-large-to-return-consider-setting-allowla...

 

The query needs to set the "allowLargeResults" option, which is actually set implicitly. Evidently BQ did not respect that setting. This should be examined by Google on the BQ server side.
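
For reference, here is a minimal sketch (Python, google-cloud-bigquery) of the server-side workaround the error message itself suggests: run the query with an explicit destination table so the result is materialized in BigQuery first. The project, dataset, and table names below are hypothetical placeholders.

from google.cloud import bigquery

# Hypothetical project/dataset/table names; adjust to your environment.
client = bigquery.Client(project="my-gcp-project")

dest = bigquery.TableReference.from_string(
    "my-gcp-project.staging.pages_views_sessions_users"
)

job_config = bigquery.QueryJobConfig(
    destination=dest,  # materialize the result server-side
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    allow_large_results=True,  # only relevant for legacy SQL
    use_legacy_sql=False,      # standard SQL just needs a destination table
)

sql = "SELECT * FROM `my-gcp-project.analytics.pages_views_sessions_users`"
client.query(sql, job_config=job_config).result()  # wait for the job to finish

The Dataflow Gen2 query could then read from the staging table instead of the original one. Whether that avoids the responseTooLarge error depends on how the ODBC connector issues its own query, so treat this as something to test rather than a confirmed fix.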

Thanks for your reply. I am also having issues sending data from a BQ table that has 400 million rows. I am trying to split the original table into smaller tables, transfer the data to Fabric, and combine it in the Fabric environment. After that I will do a daily incremental refresh for new data only. Thanks
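
In case a sketch of that splitting approach helps others, here is one rough way to do it with Python and google-cloud-bigquery: hash a stable key column into N buckets and write each bucket to its own staging table, then import each staging table as a separate query in Dataflow Gen2 and append them in Fabric. The table and column names are hypothetical placeholders.

from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

SOURCE = "my-gcp-project.analytics.events"  # hypothetical 400M-row table
KEY = "user_id"                             # hypothetical stable key column
N_CHUNKS = 16

for i in range(N_CHUNKS):
    dest = bigquery.TableReference.from_string(
        f"my-gcp-project.staging.events_chunk_{i:02d}"
    )
    job_config = bigquery.QueryJobConfig(
        destination=dest,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    sql = f"""
        SELECT *
        FROM `{SOURCE}`
        WHERE MOD(ABS(FARM_FINGERPRINT(CAST({KEY} AS STRING))), {N_CHUNKS}) = {i}
    """
    # Each chunk lands in its own staging table, small enough to pull separately.
    client.query(sql, job_config=job_config).result()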

pqian_MSFT
Microsoft Employee

We'll soon ship a feature that helps you partition your data into smaller chunks that can be incrementally refreshed.

 

Although I still think you should raise a ticket with Google to check why they return a result-set-too-large error when "allowLargeResults" is true.
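
On the incremental-refresh part, a minimal sketch of a daily delta pull, assuming the source table has a date column to filter on (event_date below is a hypothetical name):

from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

# Pull only rows from the last day instead of re-reading the full table.
sql = """
    SELECT *
    FROM `my-gcp-project.analytics.events`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
"""
rows = client.query(sql).result()
print(f"{rows.total_rows} rows to load into Fabric")

The same date filter can be expressed directly in the Dataflow Gen2 query so that only the new rows cross the connector each day.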

Hi @talam

We haven't heard from you since the last response and just wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it may help others.
Otherwise, please reply with more details and we will try to help.

Thanks.

I do not have a solution yet. I am exploring alternative ways to transfer the data. See my response above. Thanks
