Orchestrating Data Movement with Azure Data Factory
1 h 15 m
Lab Overview
In this module, you will learn how Azure Data Factory can be used to orchestrate data movement across a wide range of data platform technologies. You will be able to explain the capabilities of the service and set up an end-to-end data pipeline that ingests data from Azure SQL Database and loads it into Azure SQL Data Warehouse. You will also demonstrate how to call a compute resource from a pipeline.
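The end-to-end flow described above can be sketched in Data Factory's JSON authoring format. This is a minimal sketch of a pipeline with a single Copy Activity; the pipeline and dataset names (CopySqlToSqlDw, SqlDbSourceDataset, SqlDwSinkDataset) are hypothetical placeholders, and the referenced datasets and linked services would be defined separately against your own SQL Database and SQL Data Warehouse instances.

```json
{
  "name": "CopySqlToSqlDw",
  "properties": {
    "activities": [
      {
        "name": "CopyFromSqlDatabase",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SqlDbSourceDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SqlDwSinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "sink": { "type": "SqlDWSink" }
        }
      }
    ]
  }
}
```

In the Azure portal this JSON is generated for you by the visual authoring canvas, but seeing the underlying definition makes it clear that a pipeline is simply an ordered collection of activities wired to datasets.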

Related Learning Path(s):
DP-200: Implementing an Azure Data Solution
Azure Solutions Architect Expert (AZ-303 and AZ-304)
Lab Exercises:
Exercise 1: Deploy the services you will leverage throughout this lab.
Exercise 2: Use the Data Factory Copy Activity to copy a dataset to Azure Data Lake Store Gen2.
Exercise 3: Use a Mapping Data Flow to read data from the source, transform it, and write it to your data sink.
Exercise 4: Use a Databricks access token to allow Data Factory to access Databricks as a Linked Service, enabling you to leverage Databricks as a compute target in your pipeline.
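Registering Databricks as a Linked Service comes down to giving Data Factory the workspace URL and an access token. A minimal sketch of such a linked-service definition in Data Factory's JSON format is shown below; the name, domain, token, and cluster ID are placeholders you would replace with your own values (in practice the token is best stored in Azure Key Vault rather than inline).

```json
{
  "name": "AzureDatabricksLinkedService",
  "properties": {
    "type": "AzureDatabricks",
    "typeProperties": {
      "domain": "https://<region>.azuredatabricks.net",
      "accessToken": {
        "type": "SecureString",
        "value": "<databricks-access-token>"
      },
      "existingClusterId": "<cluster-id>"
    }
  }
}
```

Once the linked service exists, a Databricks Notebook activity in the pipeline can reference it to run transformation code on the cluster as part of the orchestrated flow.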