Course DP-203T00 Data Engineering on Microsoft Azure

Course Description

In this course, professionals will learn about data engineering as it applies to building batch & real-time analytical solutions with Azure data platform technologies. They will begin with the core compute & storage technologies used to build an analytical solution, then learn to interactively explore data stored in files in a data lake. Professionals will also learn how to transform data using the same technologies used to ingest it, & will understand the importance of security in keeping data protected at rest & in transit. Finally, students will learn how to create an analytical system that can produce real-time results.

Prerequisites

Successful professionals start this course with knowledge of cloud computing & core data concepts, & professional experience with data solutions.

Specifically, they should have completed:

  • AZ-900 - Azure Fundamentals
  • DP-900 - Microsoft Azure Data Fundamentals

Target Audience

The primary audience for this training is data professionals, data architects, & business intelligence professionals who want to learn about data engineering & building analytical solutions using Microsoft Azure data platform technologies. The secondary audience for this course is data analysts & data scientists who work with analytical solutions built on Microsoft Azure.

Content Outline

  • List the business issues that Azure Synapse Analytics tries to solve.
  • Explain the core capabilities of Azure Synapse Analytics.
  • Determine when to use Azure Synapse Analytics.
  • Provision an Azure Databricks workspace.
  • Identify core workloads & personas for Azure Databricks.
  • Explain key concepts of an Azure Databricks solution.
  • Decide when you should use Azure Data Lake Storage Gen2
  • Create an Azure storage account by using the Azure portal (see the storage-account sketch after this outline)
  • Compare Azure Data Lake Storage Gen2 & Azure Blob storage
  • Explore the stages for processing big data by using Azure Data Lake Store
  • List the supported open-source platforms
  • Understand data streams.
  • Understand event processing.
  • Understand window functions (see the windowing sketch after this outline).
  • Get started with Azure Stream Analytics.
  • Identify capabilities & use cases for serverless SQL pools in Azure Synapse Analytics
  • Query CSV, JSON, & Parquet files using a serverless SQL pool (see the serverless SQL sketch after this outline)
  • Create external database objects in a serverless SQL pool
  • Use a CREATE EXTERNAL TABLE AS SELECT (CETAS) statement to transform data.
  • Encapsulate a CETAS statement in a stored procedure.
  • Include a data transformation stored procedure in a pipeline.
  • Understand lake database concepts & components
  • Explain database templates in Azure Synapse Analytics
  • Create a lake database
  • Choose an authentication method in Azure Synapse serverless SQL pools
  • Manage users in Azure Synapse serverless SQL pools
  • Manage user permissions in Azure Synapse serverless SQL pools
  • Explain key elements of the Apache Spark architecture.
  • Create & configure a Spark cluster.
  • Explain use cases for Spark.
  • Use Spark to process & analyze data stored in files (see the Spark sketch after this outline).
  • Use Spark to visualize data.
  • Explain the core features & capabilities of Delta Lake.
  • Create & use Delta Lake tables in Azure Databricks (see the Delta Lake sketch after this outline).
  • Create Spark catalog tables for Delta Lake data.
  • Use Delta Lake tables for streaming data.
  • Identify core features & capabilities of Apache Spark.
  • Configure a Spark pool in Azure Synapse Analytics.
  • Run code to load, analyze, & visualize data in a Spark notebook.
  • Explain the integration methods between SQL & Spark Pools in Azure Synapse Analytics
  • Understand the use cases for SQL & Spark Pools integration
  • Authenticate in Azure Synapse Analytics
  • Transfer data between SQL & Spark Pool in Azure Synapse Analytics
  • Authenticate between Spark & SQL Pool in Azure Synapse Analytics
  • Integrate SQL & Spark Pools in Azure Synapse Analytics
  • Externalize the use of Spark Pools within the Azure Synapse workspace
  • Transfer data outside the Synapse workspace using SQL Authentication
  • Transfer data outside the Synapse workspace using the PySpark Connector
  • Transform data in Apache Spark & write back to SQL Pool in Azure Synapse Analytics
  • Understand data loading design goals
  • Explain loading methods into Azure Synapse Analytics
  • Manage source data files
  • Manage singleton updates
  • Set up dedicated data-loading accounts
  • Manage concurrent access to Azure Synapse Analytics
  • Apply workload management
  • Simplify ingestion with the Copy Activity
  • Introduction
  • List the data factory ingestion methods
  • Explain data factory connectors
  • Exercise: Use the data factory copy activity
  • Exercise: Manage the self-hosted integration runtime
  • Exercise: Set up the Azure integration runtime
  • Understand data ingestion security considerations
  • Knowledge Check
  • Summary
  • Understand Azure Data Factory
  • Explain data integration patterns
  • Explain the data factory process
  • Understand Azure Data Factory components
  • Azure Data Factory security
  • Set up Azure Data Factory
  • Create linked services
  • Create datasets
  • Create data factory activities & pipelines (see the Data Factory sketch after this outline)
  • Manage the integration runtime
  • Introduction
  • Explain Data Factory transformation methods
  • Explain Data Factory transformation types
  • Use Data Factory mapping data flow
  • Debug mapping data flow
  • Use Data Factory to wrangle data
  • Use compute transformations within Data Factory
  • Integrate SQL Server Integration Services packages within Data Factory
  • Knowledge Check
  • Summary
  • Introduction
  • Understand data factory control flow
  • Work with data factory pipelines
  • Debug data factory pipelines
  • Add parameters to data factory components
  • Integrate a Notebook within Azure Synapse Pipelines
  • Execute data factory packages
  • Knowledge Check
  • Summary
  • Explain Hybrid Transactional / Analytical Processing patterns.
  • Identify Azure Synapse Link services for HTAP.
  • Configure an Azure Cosmos DB Account to use Azure Synapse Link.
  • Create an analytical store-enabled container.
  • Create a linked service for Azure Cosmos DB.
  • Analyze linked data using Spark (see the Synapse Link sketch after this outline).
  • Analyze linked data using Synapse SQL.
  • Understand network security options for Azure Synapse Analytics
  • Configure Conditional Access
  • Configure Authentication
  • Manage authorization through column & row-level security
  • Manage sensitive data with Dynamic Data Masking
  • Apply encryption in Azure Synapse Analytics
  • Explore proper usage of Azure Key Vault
  • Manage access to an Azure Key Vault
  • Explore certificates held in Azure Key Vault
  • Configure a Hardware Security Module Key-generation solution
  • Plan & apply data classification in Azure SQL Database
  • Understand & configure row-level security & dynamic data masking
  • Understand the usage of Microsoft Defender for SQL
  • Explore how Azure SQL Database Ledger works
  • Create an event hub using the Azure CLI
  • Configure applications to send or receive messages through the event hub (see the Event Hubs sketch after this outline)
  • Evaluate the performance of the event hub using the Azure portal
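
A note on the storage-account item above: the course creates the account in the Azure portal, but the same step can be scripted. Below is a minimal sketch using the azure-mgmt-storage Python SDK; the subscription ID, resource group, account name, & region are placeholders. Enabling the hierarchical namespace is what makes the account a Data Lake Storage Gen2 account rather than plain Blob storage.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

# Placeholders: substitute your own subscription, resource group, & names.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# is_hns_enabled=True enables the hierarchical namespace, which is what
# distinguishes Data Lake Storage Gen2 from ordinary Blob storage.
poller = client.storage_accounts.begin_create(
    "<resource-group>",
    "<storageaccountname>",
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        location="eastus",
        is_hns_enabled=True,
    ),
)
account = poller.result()  # waits for provisioning to finish
print(account.name, account.provisioning_state)
```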
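
For the window-function item: Azure Stream Analytics expresses tumbling, hopping, & sliding windows in its own SQL dialect, which the course covers. As an illustration of the same tumbling-window idea in a language the course also uses, here is a PySpark sketch over a small in-memory DataFrame; the column names & readings are invented for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import window, col

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Invented sample events: timestamp, device, reading.
events = spark.createDataFrame(
    [
        ("2024-01-01 10:00:03", "sensor-1", 21.5),
        ("2024-01-01 10:00:07", "sensor-1", 22.0),
        ("2024-01-01 10:00:12", "sensor-2", 19.8),
    ],
    ["ts", "device", "reading"],
).withColumn("ts", col("ts").cast("timestamp"))

# Group events into fixed, non-overlapping 10-second (tumbling) windows
# & average the readings per device within each window.
(events
    .groupBy(window(col("ts"), "10 seconds"), col("device"))
    .avg("reading")
    .show(truncate=False))
```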
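
For the serverless SQL pool & CETAS items, the sketch below connects to a Synapse serverless SQL endpoint from Python with pyodbc, queries a CSV file with OPENROWSET, & then persists a transformed result with CREATE EXTERNAL TABLE AS SELECT. The server name, credentials, storage URL, & the pre-existing external data source & file format are all placeholders for your own workspace objects; it assumes a SQL login exists on the endpoint.

```python
import pyodbc

# Placeholder connection details for a Synapse serverless SQL endpoint.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace>-ondemand.sql.azuresynapse.net;"
    "Database=<database>;UID=<user>;PWD=<password>;Encrypt=yes;",
    autocommit=True,  # DDL such as CETAS should not run inside a transaction
)
cursor = conn.cursor()

# Query a CSV file in the data lake directly; no table needs to exist.
cursor.execute("""
    SELECT TOP 10 *
    FROM OPENROWSET(
        BULK 'https://<account>.dfs.core.windows.net/files/products.csv',
        FORMAT = 'CSV', PARSER_VERSION = '2.0', HEADER_ROW = TRUE
    ) AS rows
""")
print(cursor.fetchall())

# CETAS: write a transformed result set to the lake as Parquet & wrap it in
# an external table. Assumes the data source & file format already exist.
cursor.execute("""
    CREATE EXTERNAL TABLE dbo.ProductSummary
    WITH (
        LOCATION = 'summaries/',
        DATA_SOURCE = files_data_source,
        FILE_FORMAT = parquet_file_format
    )
    AS
    SELECT Category, COUNT(*) AS ProductCount
    FROM OPENROWSET(
        BULK 'https://<account>.dfs.core.windows.net/files/products.csv',
        FORMAT = 'CSV', PARSER_VERSION = '2.0', HEADER_ROW = TRUE
    ) AS rows
    GROUP BY Category
""")
```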
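
For the Spark items on loading, analyzing, & visualizing data, here is a minimal notebook-style PySpark sketch. The storage path & column names are placeholders; the pattern of reading files from the lake, aggregating with DataFrame methods, & handing a small result to matplotlib is what the course exercises.

```python
from pyspark.sql import SparkSession
import matplotlib.pyplot as plt

spark = SparkSession.builder.appName("explore-sales").getOrCreate()

# Load Parquet files from the data lake (path & columns are placeholders).
df = spark.read.parquet("abfss://files@<account>.dfs.core.windows.net/sales/")

# Analyze: total revenue per category.
summary = (df.groupBy("Category")
             .sum("Revenue")
             .withColumnRenamed("sum(Revenue)", "TotalRevenue"))
summary.show()

# Visualize: the aggregate is small, so bring it to pandas & plot it.
pdf = summary.toPandas()
plt.bar(pdf["Category"], pdf["TotalRevenue"])
plt.xlabel("Category")
plt.ylabel("Total revenue")
plt.show()
```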
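
For the Delta Lake items, this sketch creates a Delta table, registers it in the Spark catalog, & opens it as a streaming source. It assumes a runtime with the Delta Lake libraries available, as in Azure Databricks or a Synapse Spark pool; the path, schema, & table name are illustrative.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-demo").getOrCreate()

# A tiny illustrative DataFrame to persist.
df = spark.createDataFrame(
    [(1, "Widget", 2.99), (2, "Gadget", 9.99)],
    ["ProductID", "ProductName", "Price"],
)

# Write the data in Delta format.
delta_path = "/delta/products"
df.write.format("delta").mode("overwrite").save(delta_path)

# Register a catalog table over the same files so SQL can query it.
spark.sql(f"CREATE TABLE IF NOT EXISTS Products USING DELTA LOCATION '{delta_path}'")
spark.sql("SELECT * FROM Products").show()

# Delta tables also work as streaming sources (& sinks).
stream_df = spark.readStream.format("delta").load(delta_path)
```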
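
The Data Factory items on linked services, datasets, activities, & pipelines are taught through the authoring UI, but the same objects can be defined in code. Below is a sketch using the azure-mgmt-datafactory Python SDK that creates a pipeline containing one copy activity; the subscription, resource group, factory name, & the two datasets (assumed to already exist) are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource,
)

# Placeholder subscription; the factory & datasets must already exist.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_step = CopyActivity(
    name="CopyRawToStaging",
    inputs=[DatasetReference(type="DatasetReference", reference_name="RawBlobDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="StagingBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Create (or update) a pipeline that runs the single copy activity.
client.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "IngestPipeline",
    PipelineResource(activities=[copy_step]),
)
```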
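
For the Azure Synapse Link items, the sketch below reads a Cosmos DB analytical store from a Synapse Spark pool notebook using the cosmos.olap format. The linked-service name, container name, & column are placeholders; it assumes the container was created with the analytical store enabled & that a linked service already exists in the workspace.

```python
# Runs in a Synapse Spark pool notebook, where `spark` is predefined.
df = (spark.read
          .format("cosmos.olap")
          .option("spark.synapse.linkedService", "CosmosDbLinkedService")
          .option("spark.cosmos.container", "Sales")
          .load())

# Analyze the HTAP data without touching the transactional store.
df.groupBy("customerId").count().show()
```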
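
The final items create an event hub with the Azure CLI & then configure applications to send or receive messages. As a sketch of the sending side, this snippet uses the azure-eventhub Python SDK; the connection string & hub name are placeholders for values from your own namespace.

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: copy these from your Event Hubs namespace.
producer = EventHubProducerClient.from_connection_string(
    conn_str="<namespace-connection-string>",
    eventhub_name="<event-hub-name>",
)

# Send one batch containing a single JSON event.
with producer:
    batch = producer.create_batch()
    batch.add(EventData('{"deviceId": "sensor-1", "reading": 21.5}'))
    producer.send_batch(batch)
```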

FAQs

What is Microsoft Azure?

Microsoft Azure is a public & private cloud platform. Azure takes virtualization technology & applies it on a massive scale in Microsoft data centers worldwide. The cloud, in this sense, is a set of physical servers in one or more data centers that run virtualized hardware on behalf of customers.

How does Microsoft Azure work?

Microsoft Azure has been described as a "cloud layer" on top of many Windows Server systems, which use Windows Server 2008 & a customized version of Hyper-V, known as the Microsoft Azure Hypervisor, to provide virtualization of services.

Microsoft Azure's architecture runs on a massive collection of servers & networking hardware, which in turn hosts a complex array of applications that control the operation & configuration of the software & virtualized hardware on those servers. This complex orchestration is what makes Azure so powerful.

How do I create an Azure diagram?

To create an Azure diagram, follow these steps:

Desktop: Select File > New > Templates > Network > Azure Diagrams.

Online: Select File > New > Microsoft Azure Diagrams.

 

What equipment do I need to attend the training?

To attend the training session, you should have an operational desktop or laptop with the required specifications & a good internet connection to access the labs.

What if I miss a live session?

We always recommend attending the live session to practice, clarify doubts instantly, & get more value from your investment. However, if due to some contingency you have to skip a class, Radiant Techlearning will help you with the recorded session of that particular day. Note that those recorded sessions are meant only for personal consumption & NOT for distribution or commercial use.

How will I access the labs?

Radiant Techlearning has a data center containing the virtual training environment for participants' hands-on practice.

Participants can easily access these labs over the cloud with the help of a remote desktop connection.

Radiant virtual labs allow you to learn from anywhere in the world & in any time zone.

 

Will I work on real-world projects during the training?

Yes. We engage learners in real-world, industry-oriented projects during the training program. These projects will improve your skills & knowledge, give you better hands-on experience, & help you with your future tasks & assignments.


