Building Resilient Streaming Analytics Systems on Google Cloud

Course Overview

Processing streaming data is growing in popularity because streaming enables organizations to obtain real-time metrics on business operations. This course covers how to build streaming data pipelines on Google Cloud. It describes Pub/Sub for ingesting incoming streaming data, shows how to apply aggregations and transformations to streaming data with Dataflow, and covers storing processed records in BigQuery or Cloud Bigtable for analysis. Students gain hands-on experience building streaming data pipeline components on Google Cloud through Qwiklabs.
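As a rough illustration of that Pub/Sub-to-Dataflow-to-BigQuery flow (a minimal sketch, not one of the course labs), the following Apache Beam pipeline in Python reads messages from a Pub/Sub topic, parses them, and streams the rows into BigQuery. The project, topic, table, and schema names are hypothetical placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Streaming mode is required for unbounded sources such as Pub/Sub.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Read raw messages from a Pub/Sub topic (hypothetical topic name).
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/taxi-events")
        # Decode and parse each message body as JSON.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        # Stream the parsed rows into a BigQuery table (hypothetical table and schema).
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:rides.realtime",
            schema="ride_id:STRING,fare:FLOAT,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```

Run on the Dataflow runner (with the appropriate project, region, and staging options), this kind of pipeline performs the ingest, transform, and load steps the course walks through.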

The badge shown above can be yours once you finish this course. Visit your profile page to see all the badges you have earned, and showcase your new skills to raise the visibility of your cloud career.

Prerequisites

  • Experience building cloud-based big data solutions, analyzing and visualizing big data, and transforming/processing datasets.
  • Fundamentals of Big Data and Machine Learning on Google Cloud (or equivalent experience)
  • Some familiarity with Java

Audience Profile

This course is designed for developers, data scientists, and analysts who want to build solutions with demanding requirements such as high availability, resilience, high throughput, and real-time streaming analytics on Google Cloud.

Learning Objectives

  • Interpret use cases for real-time streaming analytics.
  • Use the Pub/Sub asynchronous messaging service to manage data events (a publishing sketch follows this list).
  • Build streaming pipelines and make adjustments as appropriate.
  • Use Dataflow, BigQuery, and Pub/Sub together for real-time streaming and analysis.
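To make the second objective concrete, here is a minimal sketch of publishing an event to Pub/Sub with the google-cloud-pubsub Python client. The project ID, topic name, and event fields are hypothetical placeholders rather than anything defined by the course.

```python
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# Hypothetical project and topic IDs.
topic_path = publisher.topic_path("my-project", "taxi-events")

event = {"ride_id": "a1b2c3", "fare": 12.50}
# publish() is asynchronous: it returns a future that resolves to the
# server-assigned message ID once Pub/Sub has accepted the message.
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print("Published message ID:", future.result(timeout=30))
```

Because publish() returns a future, a producer can fire off many messages and resolve the futures later, which is what makes the service asynchronous and well suited to high-throughput ingestion.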

Content Outline

  • This module introduces the course and agenda.
  • This module discusses the challenges of processing streaming data.
  • This module discusses using Pub/Sub to ingest incoming streaming data.
  • This module recaps Dataflow and focuses on its streaming data processing capabilities.
  • This module covers BigQuery and Bigtable for streaming data.
  • This module covers more advanced features of BigQuery.
  • This module recaps the topics covered in the course.

FAQs

Q: What are the approaches to handling streaming data?

A: There are three approaches to handling streaming data: processing it in batches at intervals ranging from hours to days, processing it in real time, or a hybrid approach that combines both. The advantage of batch processing is that it allows extensive analysis, including machine learning; the disadvantage is significant latency.

Q: What is streaming data?

A: Streaming data is the continuous flow of data produced by numerous sources; working with it is known as event stream processing. With stream processing technology, data streams can be processed, stored, analyzed, and acted upon as they are generated, in real time.

Q: What is streaming analytics?

A: Streaming analytics means processing and analyzing data records continuously rather than in batches. It is generally helpful for data sources that emit small records (often kilobytes in size) in a continuous flow as the data is generated.
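As an illustration (not part of the course materials), the following Apache Beam snippet shows the windowed style of computation that continuous analysis implies: an unbounded stream of small keyed records is grouped into one-minute event-time windows and aggregated as results arrive. The function and element names are illustrative assumptions.

```python
import apache_beam as beam
from apache_beam.transforms import window


def count_per_minute(events):
    """events: an unbounded PCollection of (key, value) pairs, e.g. (sensor_id, reading)."""
    return (
        events
        # Group the continuous stream into one-minute event-time windows.
        | "FixedWindows" >> beam.WindowInto(window.FixedWindows(60))
        # Emit a count per key for each window as it closes.
        | "CountPerKey" >> beam.combiners.Count.PerKey()
    )
```

Unlike a batch job, this computation never "finishes"; each window produces results shortly after its data arrives.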

Q: Why do organizations need real-time streaming analytics?

A: Businesses that collect data at rates of hundreds of thousands or even millions of events per second produce massive datasets, and traditional systems can take days to deliver insights from that volume of data.

Producing real-time actions requires real-time data processing and analysis, which is achievable with the appropriate infrastructure and data-streaming platform. For example, stream analytics built on Google Cloud products and services enables companies to ingest, process, and analyze data streams in real time.
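For instance, individual events can be streamed straight into BigQuery for near-real-time analysis. The sketch below uses the google-cloud-bigquery client's insert_rows_json streaming API; the project, dataset, table, and field names are hypothetical placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # hypothetical project ID
table_id = "my-project.rides.realtime"           # hypothetical dataset.table

rows = [
    {"ride_id": "a1b2c3", "fare": 12.5, "event_ts": "2024-05-01T12:00:00Z"},
]
# insert_rows_json uses BigQuery's streaming insert API; successfully
# inserted rows typically become queryable within seconds.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Encountered errors while inserting rows:", errors)
```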

Q: Who delivers the training, and what makes Radiant Techlearning's programs different?

A: Radiant has highly selective criteria for the technology trainers and consultants who deliver its training programs. Our trainers and consultants undergo a rigorous technical and behavioral interview and assessment process before they are onboarded.

Our technology experts, trainers, and consultants have deep knowledge of their technical subjects and are certified by the OEM.

Our training programs are practice-oriented, with 70-80% of the time spent hands-on with the technology tools. Each program emphasizes one-on-one interaction with every participant, up-to-date curriculum content, and real-time projects and case studies during the training.

Our faculty builds up each course from the fundamentals, and you are free to ask your faculty questions at any time.

Our trainers have the patience and ability to explain complex concepts simply, with both depth and breadth of knowledge.

To ensure quality learning, we provide support sessions even after the training program.

Q: What do I need to attend the training sessions?

A: To attend the training sessions, you need a working desktop or laptop that meets the required specifications and a good internet connection to access the labs.

Q: What happens if I miss a live session?

A: We recommend attending the live sessions so you can practice, clarify doubts instantly, and get more value from your investment. However, if you have to skip a class due to some contingency, Radiant Techlearning will provide you with the recorded session for that day. These recordings are meant only for personal use and NOT for distribution or any commercial use.

Q: How will I access the hands-on labs?

A: Radiant Techlearning operates a data center with a virtual training environment for participants' hands-on practice.

Participants can easily access these labs over the cloud using a remote desktop connection.

Radiant's virtual labs allow you to learn from anywhere and in any time zone.

Q: Will I work on real projects during the training?

A: Yes. We engage learners in real-world, industry-oriented projects during the training program. These projects improve your skills and knowledge, give you better practical experience, and will help you considerably in your future tasks and assignments.


