Cloud technology is a major source of value for companies. By combining the potential of this technology with that of data, it is possible to create even more value and offer new customer experiences. "Exploring Data Transformation with Google Cloud" introduces the value that data can bring to an organization and the ways Google Cloud can make data useful and accessible. This course is part of the Cloud Digital Leader learning path. It is designed to help participants grow in their roles and shape the future of their companies.
This course aims to upskill Google Cloud partners to perform the specific tasks of migrating data from Microsoft SQL Server to Cloud SQL using the built-in replication capabilities of SQL Server. Sample data will be used during the migration. Learners will complete several labs that focus on the process of transferring schema, data, and related processes to corresponding Google Cloud products. One or more challenge labs will test the learner's understanding of the topics.
Processing streaming data is an increasingly popular practice because it lets businesses obtain real-time metrics on their operations. This course covers how to build streaming data pipelines on Google Cloud and introduces Pub/Sub, a solution for handling incoming streaming data. You will also learn how to apply aggregations and transformations to streaming data using Dataflow, and how to store processed records in BigQuery or Bigtable for analysis. Learners will get hands-on experience building streaming data pipeline components on Google Cloud using Qwiklabs.
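To make the windowed-aggregation idea concrete, here is a minimal pure-Python sketch of a tumbling (fixed) window count, the kind of aggregation a Dataflow pipeline typically applies to a Pub/Sub stream. No Google Cloud APIs are used; event timestamps and keys are illustrative.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed windows and count per key.

    Illustrative only: a real Dataflow pipeline expresses the same idea
    with windowing and a combine/count transform over unbounded data.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # assign event to its window
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Three page-view events spread across two one-minute windows.
events = [(0, "home"), (30, "home"), (65, "checkout")]
print(tumbling_window_counts(events))
# {0: {'home': 2}, 60: {'checkout': 1}}
```

The per-window results are exactly the kind of records the course stores in BigQuery or Bigtable for later analysis.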
Data pipelines typically fall into one of the paradigms EL (extract and load), ELT (extract, load, and transform), or ETL (extract, transform, and load). This course describes which paradigm should be used for batch data processing, depending on the context. It also covers several Google Cloud technologies for data transformation, including BigQuery, running Spark on Dataproc, pipeline graphs in Cloud Data Fusion, and serverless data processing with Dataflow. Learners will get hands-on experience building data pipeline components on Google Cloud using Qwiklabs.
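As a rough sketch of the ETL paradigm named above, the following self-contained Python snippet runs a transform step between extract and load; the source rows and field names are invented for illustration. In ELT, by contrast, the raw rows would be loaded first and the transform would run inside the warehouse (for example, as a BigQuery SQL statement).

```python
def extract():
    """Simulated extract step: raw rows as they arrive from a source."""
    return [{"name": " Ada ", "score": "90"}, {"name": "Lin", "score": "85"}]

def transform(rows):
    """Clean and type-cast before loading (the T happens before the L in ETL)."""
    return [{"name": r["name"].strip(), "score": int(r["score"])} for r in rows]

def load(rows, warehouse):
    """Simulated load step: append rows to the destination table."""
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)  # ETL ordering: E -> T -> L
print(warehouse)
# [{'name': 'Ada', 'score': 90}, {'name': 'Lin', 'score': 85}]
```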
This course describes a migration from Oracle to Cloud Spanner using HarbourBridge, based on an example scenario that uses sample data. The process covers using HarbourBridge for assessment, schema conversion, schema transformation, and data migration, along with supporting tools for data validation.
This course covers a migration from MySQL to Cloud Spanner using Dataflow. It includes sample mock data and all necessary steps, from initial assessment through validation, including migrating users and grants.
This workload aims to upskill Google Cloud partners to perform specific tasks associated with priority workloads. Learners will perform the tasks for migrating data from AWS Redshift to BigQuery using BigQuery Data Transfer Service, which includes sample mock data. Learners will complete a challenge lab that focuses on the process of transferring both schema and data from a Redshift data warehouse to BigQuery.
This workload aims to upskill Google Cloud partners to perform specific tasks associated with priority workloads. Learners will perform the tasks of migrating data from Snowflake to BigQuery. Sample data will be used during the migration. Learners will complete several labs that focus on the process of transferring schema, data, and related processes to corresponding Google Cloud products. One or more challenge labs will test the learners' understanding of the topics.
This course covers common challenges faced by data analysts and explains how to solve them with the big data tools available on Google Cloud. You will learn some SQL fundamentals and how to use BigQuery and Dataprep to analyze and transform your datasets. This is the first course in the "From Data to Insights with Google Cloud" series. After completing it, enroll in the "Creating New BigQuery Datasets and Visualizing Insights" course.
This course discusses the key elements of Google's Data Warehouse solution portfolio and strategy.
This course continues to explore the implementation of data load and transformation pipelines for a BigQuery Data Warehouse using Dataflow.
This course explores how to implement a streaming analytics solution using Pub/Sub.
This course explores how to implement a streaming analytics solution using Dataflow and BigQuery.
This course explores the geographic information system (GIS), GIS visualization, and machine learning enhancements to BigQuery.
This course explores how to leverage Looker to create data experiences and gain insights with modern business intelligence (BI) and reporting.
Welcome to Intro to Data Lakes, where we discuss how to create a scalable and secure data lake on Google Cloud that allows enterprises to ingest, store, process, and analyze any type or volume of full fidelity data.
Welcome to Migrate Workflows, where we discuss how to migrate Spark and Hadoop tasks and workflows to Google Cloud.
Welcome to Data Governance, where we discuss how to implement data governance on Google Cloud.
This workload aims to upskill Google Cloud partners to perform specific tasks associated with priority workloads. Learners will perform the tasks of migrating data from Teradata to BigQuery using the Data Transfer Service and the Teradata TPT Export Utility. Sample data will be used with both methods. Learners will complete a challenge lab that focuses on the process of transferring schema, data, and SQL from a Teradata data warehouse to BigQuery.
In this course, you explore the four components that make up the BigQuery Migration Service: Migration Assessment, SQL Translation, Data Transfer Service, and Data Validation. You will use each of these tools to perform a migration to BigQuery.
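To illustrate the idea behind the Data Validation component, here is a minimal sketch of one common check: comparing a row count plus an order-insensitive fingerprint of a source and a target table. The in-memory tables and field names are invented; the real Data Validation Tool runs far richer checks (column-level aggregates, schema comparison, and more) against live databases.

```python
import hashlib

def table_fingerprint(rows):
    """Return (row_count, sha256) for a table, independent of row order.

    Each row dict is serialized with sorted keys, rows are sorted, and the
    concatenation is hashed, so identical data in any order matches.
    """
    serialized = sorted(repr(sorted(r.items())) for r in rows)
    digest = hashlib.sha256("".join(serialized).encode()).hexdigest()
    return len(rows), digest

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 2, "amt": 20}, {"id": 1, "amt": 10}]  # same rows, other order
print(table_fingerprint(source) == table_fingerprint(target))  # True
```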
This course covers BigQuery fundamentals for professionals who are familiar with SQL-based cloud data warehouses in Snowflake and want to begin working in BigQuery. Through interactive lecture content and hands-on labs, you learn how to provision resources, create and share data assets, ingest data, and optimize query performance in BigQuery. Drawing upon your knowledge of Snowflake, you also learn about similarities and differences between Snowflake and BigQuery to help you get started with data warehouses in BigQuery. After this course, you can continue your BigQuery journey by completing the skill badge quest titled Build and Optimize Data Warehouses with BigQuery.
In this course, you will receive technical training for Enterprise Data Warehouses solutions using BigQuery based on the best practices developed internally by Google’s technical sales and services organizations. The course will also provide guidance and training on key technical challenges that can arise when migrating existing Enterprise Data Warehouses and ETL pipelines to Google Cloud. You will get hands-on experience with real migration tasks, such as data migration, schema optimization, and SQL Query conversion and optimization. The course will also cover key aspects of ETL pipeline migration to Dataproc as well as using Pub/Sub, Dataflow, and Cloud Data Fusion, giving you hands-on experience using all of these tools for Data Warehouse ETL pipelines.
This course identifies best practices for migrating data warehouses to BigQuery and the key skills required to perform successful migration.
In this course, you perform a migration from Oracle to BigQuery using SQL Translation and Dataflow with sample data. Learners will complete a quiz that focuses on the process of transferring both schema and data from an Oracle enterprise data warehouse to BigQuery.
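To give a flavor of what SQL translation involves, the toy sketch below rewrites two Oracle functions to their BigQuery equivalents. The mapping and the regex pass are purely illustrative: the BigQuery Migration Service's SQL translator parses statements properly rather than string-replacing them.

```python
import re

# Toy lookup of Oracle functions and their BigQuery (GoogleSQL) equivalents.
ORACLE_TO_BQ = {
    "NVL": "IFNULL",                      # NVL(x, y) -> IFNULL(x, y)
    "SYSDATE": "CURRENT_TIMESTAMP()",     # SYSDATE -> CURRENT_TIMESTAMP()
}

def translate(sql):
    """Naively rewrite known Oracle function names; illustration only."""
    for oracle_fn, bq_fn in ORACLE_TO_BQ.items():
        sql = re.sub(rf"\b{oracle_fn}\b", bq_fn, sql)
    return sql

print(translate("SELECT NVL(bonus, 0), SYSDATE FROM emp"))
# SELECT IFNULL(bonus, 0), CURRENT_TIMESTAMP() FROM emp
```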
This workload aims to upskill Google Cloud partners to perform specific tasks associated with priority workloads. Learners will perform the tasks of migrating data from five products hosted on Cloudera or Hortonworks to corresponding Google Cloud services and hosted products. The migration solutions addressed are: HDFS data to Cloud Dataproc and Cloud Storage; Hive data to Cloud Dataproc and the Dataproc Metastore; Hive data to BigQuery; Impala data to BigQuery; and HBase to Bigtable. Sample data will be used during all five migrations. Learners will complete several labs that focus on the process of transferring schema, data, and related processes to corresponding Google Cloud products. One or more challenge labs will test the learners' understanding of the topics.
This course covers BigQuery fundamentals for professionals who are familiar with SQL-based cloud data warehouses in Redshift and want to begin working in BigQuery. Through interactive lecture content and hands-on labs, you learn how to provision resources, create and share data assets, ingest data, and optimize query performance in BigQuery. Drawing upon your knowledge of Redshift, you also learn about similarities and differences between Redshift and BigQuery to help you get started with data warehouses in BigQuery. After this course, you can continue your BigQuery journey by completing the skill badge quest titled Build and Optimize Data Warehouses with BigQuery.
This course introduces MLOps tools and best practices for deploying, evaluating, monitoring, and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Machine learning engineers use tools to continuously improve and evaluate deployed models. They work with (or may themselves be) data scientists who develop models, enabling fast and rigorous deployment of the best-performing machine learning solutions.
In this course, you will learn how to build an image captioning model using deep learning. You will learn about the components of such a model, like the encoder and decoder, and how to train and evaluate it. By the end of the course, you will be able to build your own image captioning models and use them to generate captions for images.
Welcome to Design in BigQuery, where we map Enterprise Data Warehouse concepts and components to BigQuery and Google data services with a focus on schema design.
This course covers BigQuery fundamentals for professionals who are familiar with SQL-based cloud data warehouses in Teradata and want to begin working in BigQuery. Through interactive lecture content and hands-on labs, you learn how to provision resources, create and share data assets, ingest data, and optimize query performance in BigQuery. Drawing upon your knowledge of Teradata, you also learn about similarities and differences between Teradata and BigQuery to help you get started with data warehouses in BigQuery. After this course, you can continue your BigQuery journey by completing the skill badge quest titled Build and Optimize Data Warehouses with BigQuery.
This course provides partners the skills required to scope, design and deploy Document AI solutions for enterprise customers utilizing use-cases from both the procurement and lending arenas.
Incorporating machine learning into data pipelines increases the ability to extract insights from data. This course reviews several ways to include machine learning in data pipelines on Google Cloud. You will learn about AutoML for cases that require little to no customization, as well as Notebooks and BigQuery ML for situations that call for more tailored machine learning capabilities. Finally, you will learn how to run machine learning solutions in production with Vertex AI.
As the use of artificial intelligence and machine learning in business grows, so does the importance of developing these technologies responsibly. For many, the real challenge lies in putting responsible AI into practice, which proves far more complex than the theory. If you want to learn how to operationalize responsible AI in your organization, this course is for you. In this course, you will learn how Google Cloud does this today, drawing on best practices and lessons learned, to give you a framework for building your own responsible AI approach.
Welcome to "Virtual Agent Development in Dialogflow CX for Citizen Devs", the second course in the "Customer Experiences with Contact Center AI" series. In this course, learn how to develop customer conversational solutions using Contact Center Artificial Intelligence (CCAI), and be introduced to adding voice (telephony) as a communication channel to your virtual agent conversations using Dialogflow CX.
Welcome to "Virtual Agent Development in Dialogflow CX for Software Devs", the third course in the "Customer Experiences with Contact Center AI" series. In this course, learn how to develop more customized customer conversational solutions using Contact Center Artificial Intelligence (CCAI). You'll be introduced to more advanced and customized handling for virtual agent conversations that need to look up and convey dynamic data, methods available to you for testing your virtual agent, and logs that can be useful for understanding issues that arise. This is an intermediate course, intended for learners with the following type of role: Software developers: Codes computer software in a programming language (e.g., C++, Python, JavaScript), often using an SDK/API.
Welcome to "CCAI Operations and Implementation", the fourth course in the "Customer Experiences with Contact Center AI" series. In this course, learn some best practices for integrating conversational solutions with your existing contact center software, establishing a framework for human agent assistance, and implementing solutions securely and at scale. You'll be introduced to: Agent Assist and the technology it uses, so you can delight your customers with the efficiency and accuracy of services provided when customers require human agents; connectivity protocols, APIs, and platforms you can use to integrate your virtual agent with the services already established for your business; Dialogflow's Environment Management tool for deploying different versions of your virtual agent for various purposes; and compliance measures and regulations you should be aware of when bringing your virtual agent to production. You'll also be given tips from virtua…
Welcome to "Virtual Agent Development in Dialogflow ES for Citizen Devs", the second course in the "Customer Experiences with Contact Center AI" series. In this course, learn how to develop customer conversational solutions using Contact Center Artificial Intelligence (CCAI). You will use Dialogflow ES to create virtual agents and test them using the Dialogflow ES simulator. This course also provides best practices on developing virtual agents. You will also be introduced to adding voice (telephony) as a communication channel to your virtual agent conversations. Through a combination of presentations, demos, and hands-on labs, participants learn how to create virtual agents. This is an intermediate course, intended for learners with the following types of roles: Conversational designers: Designs the user experience of a virtual assistant. Translates the brand's business requirements into natural dialog flows. Citizen developers: Creates new business applications fo…
Welcome to "CCAI Virtual Agent Development in Dialogflow ES for Software Developers", the third course in the "Customer Experiences with Contact Center AI" series. In this course, learn to use additional features of Dialogflow ES for your virtual agent, create a Firestore instance to store customer data, and implement cloud functions that access the data. With the ability to read and write customer data, learners' virtual agents are conversationally dynamic and able to deflect contact center volume from human agents. You'll be introduced to methods for testing your virtual agent and logs that can be useful for understanding issues that arise. Lastly, learn about connectivity protocols, APIs, and platforms for integrating your virtual agent with services already established for your business.
Welcome to "CCAI Conversational Design Fundamentals", the first course in the "Customer Experiences with Contact Center AI" series. In this course, learn how to design customer conversational solutions using Contact Center Artificial Intelligence (CCAI). You will be introduced to CCAI and its three pillars (Dialogflow, Agent Assist, and Insights), and the concepts behind conversational experiences and how the study of them influences the design of your virtual agent. After taking this course you will be prepared to take your virtual agent design to the next level of intelligent conversation.
This workload aims to upskill Google Cloud partners to perform specific tasks for modernization using LookML on BigQuery. A proof of concept takes learners through the process of creating LookML visualizations on BigQuery. During this course, learners will be guided on how to write Looker Modeling Language (LookML) to create semantic data models, and will learn how LookML constructs SQL queries against BigQuery. At a high level, this course focuses on using basic LookML to create and access BigQuery objects, and on optimizing BigQuery objects with LookML.
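As a rough illustration of the semantic modeling described above, a minimal, hypothetical LookML view over a BigQuery table might look like the following; the project, dataset, and field names are invented for this sketch.

```lookml
# Hypothetical LookML view over a BigQuery table; all names are illustrative.
view: orders {
  sql_table_name: `my_project.sales.orders` ;;

  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  measure: order_count {
    type: count
  }
}
```

From an Explore built on a view like this, Looker generates the corresponding SQL against BigQuery, roughly a SELECT of the chosen dimensions with the measure aggregated and a matching GROUP BY.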
This course covers BigQuery fundamentals for professionals who are familiar with SQL-based cloud data warehouses in Oracle and want to begin working in BigQuery. Through interactive lecture content and hands-on labs, you learn how to provision resources, create and share data assets, ingest data, and optimize query performance in BigQuery. Drawing upon your knowledge of Oracle, you also learn about similarities and differences between Oracle and BigQuery to help you get started with data warehouses in BigQuery. After this course, you can continue your BigQuery journey by completing the skill badge quest titled Build and Optimize Data Warehouses with BigQuery.
This learning experience guides you through the process of utilizing various data sources and multiple Google Cloud products (including BigQuery and Google Sheets using Connected Sheets) to analyze, visualize, and interpret data to answer specific questions and share insights with key decision makers.
This course continues to explore the implementation of data load and transformation pipelines for a BigQuery Data Warehouse using Cloud Data Fusion.
This course explores the implementation of data load and transformation pipelines for a BigQuery Data Warehouse using Dataproc.
Welcome to Optimize in BigQuery, where we map Enterprise Data Warehouse concepts and components to BigQuery and Google data services with a focus on optimization.
Data lakes and data warehouses are the two key components of data pipelines. This course presents use cases for each type of storage, as well as the technical details of the data lake and data warehouse solutions available on Google Cloud. It also describes the role of data engineers and the benefits of a successful data pipeline for business operations, and explains why data engineering should be done in a cloud environment. This is the first course in the "Data Engineering on Google Cloud" series. After completing it, enroll in the "Building Batch Data Pipelines on Google Cloud" course.
In this course, you learn how to do the kind of data exploration and analysis in Looker that would formerly be done primarily by SQL developers or analysts. Upon completion of this course, you will be able to leverage Looker's modern analytics platform to find and explore relevant content in your organization’s Looker instance, ask questions of your data, create new metrics as needed, and build and share visualizations and dashboards to facilitate data-driven decision making.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.