Advanced · 10 Steps · 7h 5m · 53 Credits
This advanced-level quest is unique among Qwiklabs offerings. Its labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer certification exam. From BigQuery to Dataproc to TensorFlow, this quest is composed of specific labs that will put your GCP data engineering knowledge to the test. Be aware that while these labs will strengthen your skills and abilities, you will need other preparation as well: the exam is quite challenging, and additional study, experience, and/or background in cloud data engineering is recommended.
Prerequisites
This quest requires proficiency with GCP services, particularly those for working with large datasets. It is recommended that you first earn a badge by completing the hands-on labs in the Baseline: Data, ML, and AI and/or GCP Essentials quests before beginning. Additional lab experience with the Scientific Data Processing and Machine Learning APIs quests will also be useful.
In this lab you analyze historical weather observations using BigQuery and use weather data in conjunction with other datasets. This lab is part of a series of labs on processing scientific data.
In this lab you analyze a large (137 million rows) natality dataset using Google BigQuery and Cloud Datalab. This lab is part of a series of labs on processing scientific data.
In this lab you will learn to use a Cloud TPU to accelerate specific TensorFlow machine learning workloads on Compute Engine.
In this lab you'll use Ibis to query the Stack Overflow public dataset in BigQuery.
In this lab you will build an end-to-end machine learning solution using TensorFlow and Cloud ML Engine, and leverage the cloud for distributed training and online prediction.
In this lab you will create a Maven project with the Cloud Dataflow SDK and run a distributed word count pipeline on Google Cloud Dataflow using the Google Cloud Platform Console.
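The lab itself uses the Java SDK via Maven, but the core logic of a word count pipeline (read lines, tokenize, count) can be sketched locally in plain Python; this is an illustrative sketch of the stages Dataflow distributes, not the lab's actual code:

```python
from collections import Counter
import re

def word_count(lines):
    """Count word occurrences across lines, mirroring the classic
    read -> tokenize -> count stages of a WordCount pipeline."""
    counter = Counter()
    for line in lines:
        # Tokenize on non-letter characters; lowercase so "To" and "to" group.
        counter.update(w for w in re.split(r"[^a-zA-Z']+", line.lower()) if w)
    return dict(counter)

sample = ["To be or not to be", "that is the question"]
print(word_count(sample))  # 'to' and 'be' each appear twice
```

In the real pipeline each stage becomes a parallel transform, so the same counting runs across many workers instead of one loop.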
In this lab you will build several data pipelines that ingest data from a publicly available dataset into BigQuery.
In this lab you’ll use Google Cloud Composer to automate the transform and load steps of an ETL data pipeline.
This lab shows you how to connect and manage devices using Cloud IoT Core, ingest the stream of information using Cloud Pub/Sub, process the IoT data using Cloud Dataflow, and analyze the IoT data using BigQuery. Watch this short video: Easily Build an IoT Analytics Pipeline.
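The ingest-process-analyze flow above can be sketched in miniature with the standard library; the queue stands in for a Pub/Sub topic, the loop for a Dataflow transform, and the result rows for records landing in BigQuery. Device names and fields here are hypothetical:

```python
import queue

# Toy stand-in for a Pub/Sub topic: devices publish readings here.
topic = queue.Queue()

# "Devices" publish temperature readings (hypothetical sensors).
for device_id, temp_c in [("sensor-1", 21.5), ("sensor-2", 19.0)]:
    topic.put({"device": device_id, "temp_c": temp_c})

# "Dataflow" step: consume each message and enrich it before analysis.
rows = []
while not topic.empty():
    msg = topic.get()
    msg["temp_f"] = msg["temp_c"] * 9 / 5 + 32  # simple transform
    rows.append(msg)  # rows ready for a BigQuery-style table

print(rows)
```

The real pipeline does the same three hand-offs, but with durable, scalable services in place of an in-process queue and loop.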
Cloud Dataprep is Google's self-service data preparation tool. In this lab, you will learn how to use Cloud Dataprep to clean and enrich multiple datasets using a mock use case scenario of customer info and purchase history.
In this lab you build several data pipelines that ingest data from the USA Babynames dataset into BigQuery, simulating a batch transformation.
In this lab you will use a newly available ecommerce dataset to run typical queries that answer questions businesses have about their customers' purchasing habits.
In this lab you will explore millions of New York City yellow taxi cab trips available in a BigQuery public dataset, create a machine learning model inside BigQuery to predict the fare, evaluate the model's performance, and use it to make predictions.
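For a regression model like a fare predictor, BigQuery ML's evaluation reports metrics such as root mean squared error (RMSE). A toy local sketch of that metric, with entirely hypothetical fares and a naive flat-rate prediction, shows what the evaluation step is measuring:

```python
import math

def rmse(actual, predicted):
    """Root mean squared error: how far predictions deviate
    from actual values, on average, in the target's own units."""
    return math.sqrt(
        sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    )

# Hypothetical trips: actual fares vs. a naive $2.50 base + $2.00/mile guess.
trip_miles = [1.0, 3.0, 5.0]
actual_fares = [5.0, 9.5, 12.0]
predicted_fares = [2.5 + 2.0 * m for m in trip_miles]

print(rmse(actual_fares, predicted_fares))  # error in dollars
```

A lower RMSE means predictions sit closer to the true fares; the lab's evaluation step compares this number against a benchmark before the model is used for prediction.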