Description

In this course, 'Implementing an Azure Data Solution', professionals will implement various data platform technologies into solutions that align with business and technical requirements, covering on-premises, cloud, and hybrid data scenarios with both relational and NoSQL data. They will learn about data processing using a range of technologies and languages for both batch and streaming data.
The 'Implementing an Azure Data Solution' course enables professionals to explore how to implement data security, including authentication, authorization, and data policies and standards. We will discuss how to define and deploy data solution monitoring for both data processing and data storage activities. Finally, they will troubleshoot and manage Azure data solutions, including the optimization and disaster recovery of big data, streaming data, and batch processing solutions.


Radiant Techlearning offers the 'Implementing an Azure Data Solution' training program in Classroom and Virtual Instructor-Led (online) modes.


Duration: 3 Days

Pre-requisites

Professional experience is a prerequisite for this course. Additionally, technical knowledge equivalent to the Azure Fundamentals course is recommended.

 

Audience profile

This course is designed for data architects, data professionals, and business intelligence professionals interested in learning about the data platform technologies existing on Microsoft Azure.

The secondary audience includes individuals who develop applications that deliver content from the data platform technologies on Microsoft Azure.

Course Content

Module 1: Azure for the Data Engineer

This section explores how the world of data has evolved and how cloud data platform technologies are providing new opportunities for businesses to explore their data in different ways. Professionals will gain an overview of the various data platform technologies that are available, and learn how a Data Engineer's role and responsibilities have evolved to work in this new world to an organization's benefit.

Lessons

  • Explain the evolving world of data
  • Survey the services in the Azure Data Platform
  • Identify the tasks that are performed by a Data Engineer
  • Describe the use cases for the cloud in a Case Study

 

Lab: Azure for the Data Engineer

  • Identify the evolving world of data
  • Determine the Azure Data Platform Services
  • Identify tasks to be performed by a Data Engineer
  • Finalize the data engineering deliverables

 

This module enables professionals to:

  • Explain the evolving world of data
  • Survey the services in the Azure Data Platform
  • Identify the tasks that are performed by a Data Engineer
  • Describe the use cases for the cloud in a Case Study

 

Module 2: Working with Data Storage

This section teaches the variety of ways to store data in Azure. Professionals will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data they want to store in the cloud. They will also learn how Data Lake Storage can be created to support a wide variety of big data analytics solutions with minimal effort.

Lessons

  • Choose a data storage approach in Azure
  • Create an Azure Storage Account
  • Explain Azure Data Lake storage
  • Upload data into Azure Data Lake

 

Lab: Working with Data Storage

  • Choose a data storage approach in Azure
  • Create a Storage Account
  • Explain Data Lake Storage
  • Upload data into Data Lake Store

 

This module enables professionals to:

  • Choose a data storage approach in Azure
  • Create an Azure Storage Account
  • Explain Azure Data Lake Storage
  • Upload data into Azure Data Lake
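As a rough sketch of the workflow above, the Azure CLI can create a Data Lake Storage Gen2 account (a StorageV2 account with the hierarchical namespace enabled) and upload a file into it. The resource group, account, and file names below are hypothetical placeholders, not part of the course labs.

```shell
# Create a resource group (all names here are placeholders)
az group create --name rg-data-demo --location eastus

# Create a StorageV2 account with the hierarchical namespace enabled,
# which makes it an Azure Data Lake Storage Gen2 account
az storage account create \
  --name datalakedemo001 \
  --resource-group rg-data-demo \
  --kind StorageV2 \
  --hns true

# Create a file system (container) and upload a local file into it
az storage fs create --name raw --account-name datalakedemo001
az storage fs file upload \
  --file-system raw \
  --source ./sales.csv \
  --path landing/sales.csv \
  --account-name datalakedemo001
```

The same steps can also be performed through the Azure portal or Storage Explorer; the CLI form is shown only to make the sequence concrete.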

 

Module 3: Using Azure Databricks to Enable Team-based Data Science

This section introduces professionals to Azure Databricks and how a Data Engineer works with it to enable an organization to run team data science projects. They will learn the basics of Azure Databricks and Apache Spark notebooks, how to provision the service and workspaces, and how to perform data preparation that contributes to a data science project.

Lessons

  • Explain Azure Databricks
  • Work with Azure Databricks
  • Read data with Azure Databricks
  • Perform transformations with Azure Databricks

 

Lab: Using Azure Databricks to Enable Team-based Data Science

  • Explain Azure Databricks
  • Work with Azure Databricks
  • Read data with Azure Databricks
  • Perform transformations with Azure Databricks

 

This module enables professionals to:

  • Explain Azure Databricks
  • Work with Azure Databricks
  • Read data with Azure Databricks
  • Perform transformations with Azure Databricks

 

Module 4: Using Cosmos DB to Build Globally Distributed Databases

In this section, professionals will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service, and how to load and query data in it using Visual Studio Code extensions and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure availability options so that users can access the data from anywhere in the world.

Lessons

  • Create an Azure Cosmos DB database built to scale
  • Insert and query data in your Azure Cosmos DB database
  • Build a .NET Core app for Cosmos DB in Visual Studio Code
  • Distribute your data globally with Azure Cosmos DB

 

Lab: Building Globally Distributed Databases with Cosmos DB

  • Create an Azure Cosmos DB
  • Insert and query data in Azure Cosmos DB
  • Build a .NET Core app for Azure Cosmos DB using VS Code
  • Distribute data globally with Azure Cosmos DB

 

This module enables professionals to:

  • Create an Azure Cosmos DB database built to scale
  • Insert and query data in your Azure Cosmos DB database
  • Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
  • Distribute your data globally with Azure Cosmos DB
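As a small illustration of the querying covered in this module: with the Cosmos DB SQL (Core) API, items are stored as JSON documents and queried with a SQL-like dialect. The container, partition key, and property names below are invented for the example.

```sql
-- Hypothetical 'orders' container partitioned by /customerId.
-- Cosmos DB's SQL API queries JSON items with SQL-like syntax;
-- 'c' is the conventional alias for the container's items.
SELECT c.id, c.customerId, c.total
FROM c
WHERE c.customerId = "cust-42" AND c.total > 100
ORDER BY c.total DESC
```

In the labs, queries like this are issued through Visual Studio Code extensions or the .NET Core SDK rather than typed directly.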

 

Module 5: Working with Relational Data Stores in the Cloud

In this section, professionals will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. They will be able to explain why they would choose one service over another, and how to provision, connect to, and manage each of the services.

Lessons

  • Use Azure SQL Database
  • Describe Azure SQL Data Warehouse
  • Create and query an Azure SQL Data Warehouse
  • Load Data into Azure SQL Data Warehouse using PolyBase

 

Lab: Working with Relational Data Stores in the Cloud

  • Use Azure SQL Database
  • Describe Azure SQL Data Warehouse
  • Create and query an Azure SQL Data Warehouse
  • Load Data into Azure SQL Data Warehouse using PolyBase

 

This module enables professionals to:

  • Use Azure SQL Database
  • Describe Azure SQL Data Warehouse
  • Create and query an Azure SQL Data Warehouse
  • Load Data into Azure SQL Data Warehouse using PolyBase
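The PolyBase load pattern named above can be sketched in T-SQL: define an external data source over storage, an external file format, and an external table, then load into a distributed internal table with CREATE TABLE AS SELECT. All object names, paths, and columns below are placeholders, not a definitive lab script.

```sql
-- Sketch of a PolyBase load into Azure SQL Data Warehouse (placeholder names)
CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://data@mystorageaccount.blob.core.windows.net');

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

-- External table reads the CSV files in place, without moving them
CREATE EXTERNAL TABLE ext.Sales
( SaleId INT, Amount DECIMAL(10,2) )
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureStorage,
      FILE_FORMAT = CsvFormat);

-- CTAS performs the actual parallel load into the warehouse
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(SaleId))
AS SELECT * FROM ext.Sales;
```

The CTAS step is what makes PolyBase loads fast: the data movement happens in parallel across the warehouse's compute nodes.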

 

Module 6: Performing Real-Time Analytics with Stream Analytics

In this section, professionals will learn the concepts of event processing and streaming data and how they apply to Event Hubs and Azure Stream Analytics. They will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis. Finally, they will learn how to manage and monitor running jobs.

Lessons

  • Explain data streams and event processing
  • Ingest data with Event Hubs
  • Process data with Stream Analytics jobs

 

Lab: Performing Real-Time Analytics with Stream Analytics

  • Explain data streams and event processing
  • Ingest data with Event Hubs
  • Process data with Stream Analytics jobs

 

This module enables professionals to:

  • Explain data streams and event processing
  • Ingest data with Event Hubs
  • Process data with Stream Analytics jobs
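A Stream Analytics job of the kind described above is defined by a SQL-like query over a streaming input. This hedged sketch assumes an Event Hub input alias named 'telemetry' and an output alias named 'alerts', with a tumbling-window aggregate; the aliases and fields are invented for illustration.

```sql
-- Hypothetical Stream Analytics query: average temperature per device
-- over 60-second tumbling windows, read from an Event Hub input
SELECT
    deviceId,
    AVG(temperature) AS avgTemp,
    System.Timestamp() AS windowEnd
INTO alerts
FROM telemetry TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 60)
```

TIMESTAMP BY tells the job to window on the event's own timestamp rather than its arrival time, which is the usual choice for out-of-order streaming data.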

 

Module 7: Using Azure Data Factory for Orchestrating Data Movement

In this section, professionals will learn how Azure Data Factory can be used to orchestrate data movement and transformation across a wide range of data platform technologies. Professionals will learn to explain the capabilities of the technology and to set up an end-to-end data pipeline that ingests and transforms data.

Lessons

  • Explain how Azure Data Factory works
  • Describe Azure Data Factory components
  • Use Azure Data Factory with Databricks

 

Lab: Orchestrating Data Movement with Azure Data Factory

  • Explain how Azure Data Factory works
  • Describe Azure Data Factory components
  • Use Azure Data Factory with Databricks

 

This module enables professionals to:

  • Explain how Azure Data Factory works
  • Describe Azure Data Factory components
  • Use Azure Data Factory with Databricks
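Conceptually, a Data Factory pipeline is a JSON document describing a sequence of activities. This minimal sketch (pipeline, activity, and dataset names invented) shows a single Copy activity moving data from one dataset to another, of the kind an end-to-end pipeline is built from.

```json
{
  "name": "CopySalesPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySalesData",
        "type": "Copy",
        "inputs":  [ { "referenceName": "BlobSalesDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SqlDwSalesDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink":   { "type": "SqlDWSink" }
        }
      }
    ]
  }
}
```

In practice such pipelines are usually authored visually in the Data Factory UI, with the JSON generated behind the scenes.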

 

Module 8: Securing Azure Data Platforms

In this section, professionals will learn how Azure provides a multi-layered security model to protect data. They will explore how security can range from setting up secure networks and access keys, to defining permissions, through to monitoring across a range of data stores.

Lessons

  • An introduction to security
  • Key security components
  • Securing Storage Accounts and Data Lake Storage
  • Securing Data Stores
  • Securing Streaming Data

 

Lab: Securing Azure Data Platforms

  • An introduction to security
  • Key security components
  • Securing Storage Accounts and Data Lake Storage
  • Securing Data Stores
  • Securing Streaming Data

 

After completing this module, professionals will be able to:

  • Explain key security concepts
  • Describe key security components
  • Secure Storage Accounts and Data Lake Storage
  • Secure data stores
  • Secure streaming data

 

Module 9: Troubleshooting and Monitoring Processing and Data Storage

In this section, professionals will get an overview of the range of monitoring capabilities available to provide operational support should an issue arise with a data platform architecture. They will explore common data processing and data storage issues. Finally, disaster recovery options are presented to help ensure business continuity.

Lessons

  • Explain the monitoring capabilities that are available
  • Troubleshoot common data storage issues
  • Troubleshoot common data processing issues
  • Manage disaster recovery

 

Lab: Monitoring and Troubleshooting Data Storage and Processing

  • Explain the monitoring capabilities that are available
  • Troubleshoot common data storage issues
  • Troubleshoot common data processing issues
  • Manage disaster recovery

 

This module enables professionals to:

  • Explain the monitoring capabilities that are available
  • Troubleshoot common data storage issues
  • Troubleshoot common data processing issues
  • Manage disaster recovery

FAQs

Q: What is Azure Data Solution?

 

A: Implementing an Azure data solution covers the data-related implementation tasks that Azure data engineers are responsible for, including provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data.

 

Q: What is Cosmos in Azure?

 

A: Azure Cosmos DB is Microsoft's globally distributed, multi-model database service. With a click of a button, Cosmos DB enables users to elastically and independently scale throughput and storage across any number of Azure regions worldwide.

 

Q: What is Azure Data Factory?

 

A: Azure Data Factory is a cloud-based ETL and data integration service that allows users to create data-driven workflows for orchestrating data movement and transforming data at scale.

 

Q: How can I prepare for DP-200?

 

A: Preparing for Microsoft Exam DP-200: Implementing an Azure Data Solution

  • Understand Skills Measured in Exam DP-200.
  • Complete Microsoft Learning Paths for Azure Data Engineers.
  • Read and Reference the Azure Docs.
  • Review Azure Solution Architectures.
  • Watch SQLBits Session Recordings.

 

Q: How do you ensure the quality of training program?

 

A: Radiant has highly intensive selection criteria for the Technology Trainers & Consultants who deliver your training programs. Our trainers & consultants undergo a rigorous technical and behavioral interview and assessment process before they are onboarded in the company.

Our technology experts, trainers & consultants have deep knowledge of the technical subject and are certified by the OEM.

Our training programs are practice-oriented, with 70–80% hands-on work with the technology or tool. Each training program focuses on one-on-one interaction with every participant, the latest curriculum content, and real-time projects and case studies during the program.

Our faculty will teach each course from the fundamentals in an accessible way, and you are free to ask your respective faculty questions at any time.

Our trainers have the patience and ability to explain difficult concepts in a simple way, with depth and breadth of knowledge.

To ensure quality learning, we provide support sessions even after the training program.

 

Q: What if I/we have doubts after attending your training program?

 

A: Radiant's team of experts is available at Support@radianttechlearning.com to answer your technical queries even after the training program.

We also conduct a 3–4 hour online session two weeks after the training program to respond to your queries and review the project assigned to you.

 

Q: If I face technical difficulty during the class what should I do?

 

A: Technical issues are unpredictable and might occur with you as well. Participants must ensure that they have a system with the required configuration and good internet speed to access the online labs.

If the problem persists, or you face any challenge during the class, you can report it to us or to your trainer. In that case, Radiant will provide you the recorded session of that particular day. However, those recorded sessions are meant only for personal consumption and NOT for distribution or any commercial use.

 

Q: Does this training program include any project?

 

A: Yes, Radiant will provide you the most up-to-date, high-value, and relevant real-time projects and case studies in each training program.

We include projects in each training program, from fundamental to advanced level, so that you don't face any difficulty in the future. You will work on exciting projects that will upgrade your skills, knowledge, and industry experience.

Unable to find a batch?

Request a Batch