
Most Recent
Rufyda
Kudo Kingpin

One of Microsoft Fabric's most attractive features is the ability to create shortcuts to external storage systems such as Amazon S3.
A shortcut gives you the convenience of accessing external data as if it were already part of OneLake, without copying or duplicating files.

But here’s the catch: while shortcuts simplify connectivity, they don’t eliminate one of the biggest hidden costs in cloud analytics — data transfer fees.


How Shortcuts Work

A Fabric shortcut is essentially a pointer to the data. When you query parquet files in S3 through Fabric, the compute engine (running in Azure) must fetch the bytes from AWS. This means the data is leaving AWS, and every gigabyte transferred counts as egress traffic.


So even though the files aren’t duplicated inside Fabric storage, AWS still charges you for every read that crosses into Azure.

The Cost of Reading 200 GB Daily

 

Let’s consider a realistic example:

Your S3 bucket contains about 200 GB of parquet files.

These files are refreshed daily, and your Fabric semantic model needs a daily refresh.

That means 200 GB per day × 30 days = ~6 TB per month.

Based on typical AWS S3 data transfer rates (around $0.09 per GB for the first 10 TB), you’re looking at:

 6,000 GB × $0.09 ≈ $540 per month in AWS egress charges.

That’s before considering Fabric compute costs.
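The arithmetic above can be sketched as a small helper. This is a back-of-envelope estimate only; the $0.09/GB figure is AWS's published internet egress rate for the first 10 TB per month, so check the current S3 pricing page before relying on it.

```python
# Back-of-envelope estimate of monthly AWS egress cost for a daily full read.
# The default rate assumes ~$0.09/GB (first 10 TB/month tier); verify against
# the current AWS pricing page before budgeting.

def monthly_egress_cost(gb_per_day: float, days: int = 30,
                        rate_per_gb: float = 0.09) -> float:
    """Return the estimated monthly AWS egress charge in USD."""
    return gb_per_day * days * rate_per_gb

# 200 GB/day -> 6,000 GB/month -> $540
print(monthly_egress_cost(200))
```

Plugging in smaller daily deltas (say, 10 GB instead of 200 GB) shows how quickly incremental loading changes the picture.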

 

Why Shortcuts Don’t Reduce Egress Fees

It’s important to understand that shortcuts don’t magically reduce data transfer charges. They prevent duplication of storage, but the actual bytes must still move from AWS to Azure every time you run a query or refresh your model.

So, if you’re reading the full 200 GB daily, you’ll pay egress fees as if you were downloading the data each day.

Strategies to Optimize Costs

The good news is that you don’t have to accept those fees at face value. There are practical ways to bring them down:

Initial Full Copy + Incremental Loads
Do one large migration of your dataset into OneLake (or Azure Data Lake). After that, only copy the new or updated files each day. This reduces daily transfers to just the delta, which is usually far smaller than the entire dataset.

Partitioning and Predicate Pushdown
Structure your parquet files by date or partition keys. Ensure your queries are selective so that Fabric only reads what’s necessary instead of scanning all 200 GB.
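To make partition pruning concrete: with hive-style paths (`table/date=YYYY-MM-DD/part-*.parquet`), an engine that supports pruning only touches the folders a query's filter selects. The helper below is a hypothetical stand-in for that pruning logic, not Fabric's actual implementation.

```python
# Hypothetical stand-in for engine-side partition pruning over
# hive-style paths: only files in the requested date partition are read.

def prune_partitions(paths, wanted_date: str):
    """Return only the files in the partition for `wanted_date`."""
    return [p for p in paths if f"/date={wanted_date}/" in p]

paths = [
    "sales/date=2025-09-01/part-0.parquet",
    "sales/date=2025-09-02/part-0.parquet",
    "sales/date=2025-09-02/part-1.parquet",
]

# A query filtered to one day reads 2 of 3 files instead of the whole table.
print(prune_partitions(paths, "2025-09-02"))
```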

Push Changes from AWS
Instead of letting Fabric pull data every day, configure S3 event triggers (with Lambda or DataSync) to push only the new files into Azure as they arrive.
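A minimal sketch of the push pattern: an S3 event notification invokes a Lambda, which forwards just the new object to Azure. The event parsing below follows the documented S3 notification shape; the actual copy step (a boto3 `get_object` plus an Azure Blob upload) is left as comments since it needs real clients and credentials.

```python
# Sketch of an S3-triggered Lambda that would push new files to Azure.
# Parsing follows the documented S3 event notification structure.

def parse_s3_event(event):
    """Extract (bucket, key) pairs from an S3 event notification."""
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]

def handler(event, context=None):
    for bucket, key in parse_s3_event(event):
        # body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"]
        # azure_blob_client.upload_blob(body, overwrite=True)
        print(f"would push s3://{bucket}/{key} to Azure")
```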

Compression and Column Pruning
Since parquet is columnar, make sure your reports only pull the columns that are actually needed. This reduces the amount of data read — and the egress bill.
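A rough way to size the saving, under the simplifying (and hypothetical) assumption that all columns are about the same size:

```python
# Back-of-envelope estimate of the columnar read after pruning, assuming
# columns of roughly equal size (a simplification; real parquet columns vary).

def pruned_read_gb(total_gb: float, total_cols: int, used_cols: int) -> float:
    """Approximate GB actually read when only `used_cols` columns are needed."""
    return total_gb * used_cols / total_cols

# A report using 5 of 40 columns of the 200 GB dataset reads ~25 GB, not 200.
print(pruned_read_gb(200, 40, 5))
```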

Evaluate Long-Term Data Residency

 

If your workload is permanent and heavy, it may be more cost-effective to migrate the dataset fully into Azure and avoid continuous cross-cloud transfers.

 

Fabric shortcuts offer a great way to connect to S3 without moving data right away, but they don’t avoid AWS data transfer charges. If you access large volumes of S3 data every day, costs can add up quickly.

The most effective approach is usually to copy once, then refresh incrementally, while designing your data to minimize unnecessary reads. That way, you get the best of both worlds: the convenience of Fabric integration and a controlled cloud bill.

Ilgar_Zarbali
Super User

Microsoft Fabric revolutionizes data architecture by offering a unified platform that integrates Power BI, data science, real-time analytics, and more. At the heart of this ecosystem is the Lakehouse, a powerful, flexible, and scalable storage layer tailored for modern data engineering workflows.

In this article, we explore how Lakehouses work in Microsoft Fabric, how to set one up, and how they serve as the foundation for managing both files and structured data—all without the traditional complexity of data platforms.

Read more...

Rufyda
Kudo Kingpin

Microsoft Fabric is a powerful data platform that brings together data movement, transformation, and analytics in one unified environment. One of the core workflows in Fabric involves ingesting, exploring, transforming, and preparing data for analysis. This article provides an overview of how to work with data in Microsoft Fabric—starting from ingestion and ending with clean, ready-to-use datasets.

 

 


Read more...

Ilgar_Zarbali
Super User

OneLake is a unified storage system in Microsoft Fabric that eliminates data silos by storing all data in a single location. Now, we’re going to discuss Direct Lake, a new way Power BI interacts with this storage for faster performance and efficiency.


Source: https://learn.microsoft.com/en-us/fabric/fundamentals/direct-lake-overview

Read more...

technolog
Super User

The dynamic landscape of modern business demands more than just data-driven insights; it requires a seamless and adaptive approach to business intelligence (BI) adoption. While advanced tools like Microsoft Fabric and Power BI have revolutionized data analytics, many organizations struggle to translate their potential into impactful decision-making. This article outlines actionable strategies to simplify BI adoption, emphasizing the importance of aligning key performance indicators (KPIs), fostering agile practices, enhancing data quality, and leveraging advanced visualizations. By integrating these principles, supported by Microsoft’s ecosystem, businesses can transform their BI capabilities into a competitive advantage.

To achieve a seamless BI adoption, organizations must focus on aligning KPIs with evolving business objectives while ensuring they remain measurable and actionable. Agile BI practices, including iterative development and user feedback, are crucial to maintaining relevance in a fast-changing environment. Robust data governance, effective integration of diverse sources, and consistent validation are essential for ensuring data quality and trust. Employing advanced visualization techniques tailored to user needs enhances clarity and engagement, while automating workflows minimizes errors and saves time.

User training and support play a pivotal role in fostering a culture of data literacy, ensuring employees can confidently navigate tools like Microsoft Fabric and Power BI. Monitoring user engagement and incorporating feedback ensures reports evolve in alignment with user needs. Measuring the impact of BI on business outcomes validates its strategic value, while collaboration between IT and business teams ensures a holistic approach. Finally, full utilization of Microsoft’s tools, including Power BI and Azure OpenAI, maximizes the transformative potential of BI systems. Together, these elements drive continuous improvement and enable organizations to achieve excellence in data-driven decision-making.

Read more...

