
Ilgar_Zarbali

Creating Azure Data Lake Storage (ADLS) in Azure: A Step-by-Step Guide

In modern data platforms, building efficient and reliable data pipelines is at the heart of every data engineering workflow. This is where Microsoft Fabric Data Factory comes into play. It provides a powerful, intuitive interface to design, orchestrate, and automate data movement across different systems. With its visual pipeline designer, rich set of activities, and built-in scheduling and monitoring capabilities, Data Factory enables data engineers to create scalable workflows—from simple data ingestion to complex transformations—without heavy coding.
As we move from understanding the interface to building real solutions, the next essential step is preparing a robust data source. In this article, we will start by creating an Azure Data Lake Storage Gen2 (ADLS) account in Azure, uploading sample data, and laying the foundation for our upcoming data pipelines.
We’ll begin by setting up an Azure Storage account, where we’ll upload some sample data that will later be used in our data pipeline. I’ll assume you already have access to a Microsoft Azure account. Start by opening a new browser tab and navigating to the Azure portal (portal.azure.com). From there, we’ll proceed to create the storage account within your Azure environment.

1.png

 

 

To set up the storage account, search for "Storage accounts" in the Azure portal, where you'll find the Create option, and go ahead and click it. You'll then be prompted to select your subscription and define a new resource group. Next, provide a globally unique name for your storage account. You can leave the region and other settings at their defaults if they suit your needs and proceed by clicking Next. To configure the account as Azure Data Lake Storage Gen2, make sure to check Enable hierarchical namespace, which you'll find on the Advanced tab.
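If you prefer scripting over clicking through the portal, the same account can be created with the Azure CLI. This is only a sketch: the resource group name, account name, and region below are placeholders, and the commands assume you are already signed in with `az login`.

```shell
# Sketch only: names and region are placeholders -- adjust to your environment.
# Assumes an authenticated Azure CLI session (az login).

# Create a resource group to hold the storage account
az group create --name rg-fabric-demo --location eastus

# Create the storage account; --hns true enables the Hierarchical
# Namespace, which is what turns a regular StorageV2 account into ADLS Gen2
az storage account create \
  --name adlsfabricdemo123 \
  --resource-group rg-fabric-demo \
  --location eastus \
  --sku Standard_LRS \
  --kind StorageV2 \
  --hns true
```

The `--hns true` flag is the CLI equivalent of the Enable hierarchical namespace checkbox in the portal.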

 

2.png

After that, continue clicking Next through the remaining steps. In the final stage, you’ll see a summary page displaying all your selected configurations. Take a moment to review everything, and once you’re satisfied, click Create to proceed.
The deployment may take a few moments to complete.

3.png

Once the Azure Storage account is successfully created, navigate to the resource to continue.
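If you created the account from a script, you can also confirm from the CLI that the deployment finished and that the hierarchical namespace is enabled (the account and resource group names below are placeholders):

```shell
# Check provisioning state and HNS setting of the new account.
# Assumes an authenticated Azure CLI session (az login).
az storage account show \
  --name adlsfabricdemo123 \
  --resource-group rg-fabric-demo \
  --query "{name:name, hnsEnabled:isHnsEnabled, state:provisioningState}" \
  --output table
```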

 

4.png

 

Once the storage account is ready, the next step is to create a container. Navigate to Data Lake Storage, add a new container, and give it a name such as building-container (container names may contain only lowercase letters, numbers, and hyphens), then click Create.
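The container can also be created from the CLI. A minimal sketch, assuming the placeholder account name from earlier and an authenticated `az login` session:

```shell
# Sketch: create the container without the portal.
# Account and container names are placeholders; assumes az login
# and the Storage Blob Data Contributor role on the account.
az storage container create \
  --name building-container \
  --account-name adlsfabricdemo123 \
  --auth-mode login
```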

5.png

 

Within this container, we can now upload our sample data. Simply select the required files—such as orders, products, and customers—and click Upload to add them.
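The upload step can likewise be scripted. A hedged sketch, assuming the sample files sit in a local `./sample-data` folder and reusing the placeholder account and container names from above:

```shell
# Sketch: upload every file in a local folder to the container.
# Paths and names are placeholders; assumes az login and the
# Storage Blob Data Contributor role on the account.
az storage blob upload-batch \
  --destination building-container \
  --source ./sample-data \
  --account-name adlsfabricdemo123 \
  --auth-mode login
```

`upload-batch` preserves the folder's file names as blob names, which keeps the layout predictable for the pipeline we'll build next.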

6.png

 

 

7.png

 

That’s it—your data has been successfully uploaded to the Azure Data Lake Storage account. In the next article, we’ll build a pipeline to read this data from ADLS and load it into a Fabric Lakehouse.