Priyanka Makhija
Member since 2024
Diamond League
3193 points
Demonstrate the ability to create and deploy deterministic virtual agents using Dialogflow CX and augment responses by grounding results on your own data, integrating with Vertex AI Agent Builder data stores and leveraging Gemini for summarization. You will use the following technologies and Google Cloud services: Vertex AI Agent Builder, Dialogflow CX, and Gemini.
Demonstrate the ability to create and deploy generative virtual agents with natural language using Vertex AI Agent Builder, and augment responses by integrating Gemini responses with third-party APIs and your own data stores. You will use the following technologies and Google Cloud services: Vertex AI Agent Builder, Gemini, and Cloud Functions.
In this Quest, the experienced user of Google Cloud will learn how to describe and launch cloud resources with Terraform, an open-source tool that codifies APIs into declarative configuration files that can be shared among team members, treated as code, edited, reviewed, and versioned. In these nine hands-on labs, you will work with example templates and learn how to launch a range of configurations, from simple servers to full load-balanced applications.
This Quest is most suitable for those working in a technology or finance role who are responsible for managing Google Cloud costs. You’ll learn how to set up a billing account, organize resources, and manage billing access permissions. In the hands-on labs, you'll learn how to view your invoice, track your Google Cloud costs with Billing reports, analyze your billing data with BigQuery or Google Sheets, and create custom billing dashboards with Looker Studio. References made to links in the videos can be accessed in this Additional Resources document.
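For example, once billing export to BigQuery is enabled, cost per service can be summarized with a query like the sketch below, issued here through the BigQuery Python client. This is only an illustration: the export table name is a placeholder you must replace, and the `service.description`, `cost`, and `invoice.month` fields follow the standard usage cost export schema.

```python
# Summarize Google Cloud costs per service from the BigQuery billing export.
# The export table name is a placeholder; field names follow the standard
# usage cost export schema. Assumes application default credentials.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT service.description AS service, ROUND(SUM(cost), 2) AS total_cost
FROM `my-project.billing_dataset.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE invoice.month = '202401'
GROUP BY service
ORDER BY total_cost DESC
"""

for row in client.query(sql).result():
    print(row.service, row.total_cost)
```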
This course, Integrate Vertex AI Search and Conversation into Voice and Chat Apps, is composed of a set of labs that give you hands-on experience interacting with new generative AI technologies. You will learn how to create end-to-end search and conversational experiences by following examples. These technologies complement predefined intent-based chat experiences created in Dialogflow with LLM-based, generative answers that can be based on your own data. They also allow you to provide enterprise-grade search experiences for internal and external websites to search documents, structured data, and public websites.
In this course, you'll use text embeddings for tasks like classification, outlier detection, text clustering and semantic search. You'll combine semantic search with the text generation capabilities of an LLM to build Retrieval Augmented Generation (RAG) solutions, such as for question-answering systems, using Google Cloud's Vertex AI and Google Cloud databases.
This course explores Google Cloud technologies to create and generate embeddings. Embeddings are numerical representations of text, images, video and audio, and play a pivotal role in many tasks that involve the identification of similar items, like Google searches, online shopping recommendations, and personalized music suggestions. Specifically, you’ll use embeddings for tasks like classification, outlier detection, clustering and semantic search. You’ll combine semantic search with the text generation capabilities of an LLM to build Retrieval Augmented Generation (RAG) systems and question-answering solutions, on your own proprietary data using Google Cloud’s Vertex AI.
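A minimal sketch of the semantic-search idea these courses build toward, assuming the Vertex AI Python SDK (`vertexai.language_models.TextEmbeddingModel`) is available in your project; the project ID, region, model version, and sample texts below are illustrative assumptions, not part of the course material.

```python
# Semantic-search sketch: embed documents and a query with Vertex AI,
# then rank documents by cosine similarity. Project, region, model version,
# and texts are illustrative assumptions.
import numpy as np
import vertexai
from vertexai.language_models import TextEmbeddingModel

vertexai.init(project="your-project-id", location="us-central1")  # assumed project/region
model = TextEmbeddingModel.from_pretrained("textembedding-gecko@003")  # assumed model version

documents = [
    "BigQuery is a serverless data warehouse.",
    "Dialogflow CX builds conversational agents.",
    "Dataflow runs Apache Beam pipelines.",
]
query = "Which service runs Beam pipelines?"

doc_vectors = np.array([e.values for e in model.get_embeddings(documents)])
query_vector = np.array(model.get_embeddings([query])[0].values)

# Cosine similarity between the query and each document.
scores = doc_vectors @ query_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vector)
)
print(documents[int(np.argmax(scores))])
```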
In this course, explore AI-powered search technologies, tools, and applications. Learn about semantic search with vector embeddings, hybrid search methods that blend semantic and keyword search, and Retrieval Augmented Generation (RAG) for building grounded AI agents that minimize AI hallucinations. Get hands-on experience with Vertex AI Vector Search and build your own intelligent search engine.
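The hybrid-search idea mentioned above can be sketched without any cloud services: blend a keyword score with a vector-similarity score using a weighting factor. The scoring functions, the weight `alpha`, and the stand-in embeddings below are illustrative assumptions, not the course's implementation (which uses Vertex AI Vector Search).

```python
# Toy hybrid-search sketch: blend a keyword-overlap score with a vector
# similarity score. The embeddings here are random stand-ins; in practice
# they would come from an embedding model.
import numpy as np

def keyword_score(query: str, doc: str) -> float:
    """Fraction of query terms that appear in the document."""
    q_terms, d_terms = set(query.lower().split()), set(doc.lower().split())
    return len(q_terms & d_terms) / max(len(q_terms), 1)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def hybrid_score(query, doc, q_vec, d_vec, alpha=0.5):
    """alpha weights the semantic (vector) score against the keyword score."""
    return alpha * cosine(q_vec, d_vec) + (1 - alpha) * keyword_score(query, doc)

rng = np.random.default_rng(0)
docs = ["serverless data warehouse", "conversational agents with Dialogflow"]
doc_vecs = rng.normal(size=(2, 8))          # stand-in embeddings
query, q_vec = "data warehouse", rng.normal(size=8)

scores = [hybrid_score(query, d, q_vec, v, alpha=0.6) for d, v in zip(docs, doc_vecs)]
print(docs[int(np.argmax(scores))])
```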
This course teaches you how to create an image captioning model using deep learning. You will learn about the different components of an image captioning model, such as the encoder and decoder, and how to train and evaluate your model. By the end of this course, you will be able to create your own image captioning models and use them to generate captions for images.
This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You will learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You will also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course is estimated to take approximately 45 minutes to complete.
This course gives you a synopsis of the encoder-decoder architecture, a powerful and common machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You will learn about the main components of the encoder-decoder architecture and how to train and deploy these models. In the corresponding lab walkthrough, you will code a simple implementation of the encoder-decoder architecture from scratch in TensorFlow for poetry generation.
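A minimal sketch of what such an encoder-decoder can look like in TensorFlow/Keras, assuming an LSTM-based model; the vocabulary sizes, dimensions, and layer choices are illustrative assumptions and may differ from the lab's actual implementation.

```python
# Toy LSTM encoder-decoder in Keras, illustrating the two components named
# above. Vocabulary sizes, dimensions, and layer choices are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, Model

vocab_size, embed_dim, units = 1000, 64, 128

# Encoder: embed the source sequence and summarize it into LSTM states.
enc_inputs = layers.Input(shape=(None,), name="encoder_tokens")
enc_emb = layers.Embedding(vocab_size, embed_dim)(enc_inputs)
_, state_h, state_c = layers.LSTM(units, return_state=True)(enc_emb)

# Decoder: generate the target sequence, initialized with the encoder states.
dec_inputs = layers.Input(shape=(None,), name="decoder_tokens")
dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_inputs)
dec_out, _, _ = layers.LSTM(units, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c]
)
logits = layers.Dense(vocab_size)(dec_out)

model = Model([enc_inputs, dec_inputs], logits)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.summary()
```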
This course equips machine learning practitioners with the essential tools, methods, and best practices for evaluating generative and predictive AI models. Sound model evaluation is critical to ensuring that machine learning systems deliver reliable, accurate, and efficient results in real-world use. Learners gain a deep understanding of evaluation metrics and methods, and how to apply them appropriately across different model types and tasks. The course highlights the unique challenges posed by generative AI models and offers strategies to address them effectively. Using Google Cloud's Vertex AI Platform, learners discover how to implement effective evaluation workflows for model selection, optimization, and continuous monitoring.
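As a small illustration of the predictive-model side of evaluation, the sketch below computes common classification metrics with scikit-learn; the labels and predictions are fabricated, and scikit-learn is an assumption here rather than a course requirement.

```python
# Toy evaluation of a classifier's predictions with standard metrics.
# The labels and predictions are fabricated for illustration only.
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # ground-truth labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # model predictions

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred))
print("recall   :", recall_score(y_true, y_pred))
print("f1       :", f1_score(y_true, y_pred))
```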
This course introduces you to diffusion models, a family of machine learning models that have recently shown great promise in the image generation space. Diffusion models draw inspiration from physics, specifically thermodynamics. Within the last few years, they have become a popular research topic and gained traction across the industry. Many state-of-the-art image generation models and tools on Google Cloud are built on diffusion models. This course introduces you to the theory behind diffusion models and how to train and deploy them on Vertex AI.
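The "forward process" at the core of diffusion models can be sketched in a few lines: progressively mix an image with Gaussian noise according to a noise schedule. The linear schedule, timestep count, and image shape below are illustrative assumptions.

```python
# Forward (noising) process of a diffusion model, DDPM-style:
# x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise.
# The schedule and image shape are illustrative assumptions.
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)          # linear noise schedule
alphas_bar = np.cumprod(1.0 - betas)        # cumulative product of (1 - beta_t)

def add_noise(x0: np.ndarray, t: int, rng=np.random.default_rng(0)):
    """Sample x_t given a clean image x0 at timestep t."""
    noise = rng.normal(size=x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1.0 - alphas_bar[t]) * noise

x0 = np.zeros((32, 32, 3))                  # stand-in "image"
x_half = add_noise(x0, t=T // 2)            # partially noised
x_final = add_noise(x0, t=T - 1)            # nearly pure noise
print(x_half.std(), x_final.std())
```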
This course introduces you to the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You will learn how attention works and how it can be used to improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering.
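A common way to see the idea concretely is scaled dot-product attention, the form used in Transformers; the array shapes below are illustrative assumptions.

```python
# Scaled dot-product attention: weights = softmax(Q K^T / sqrt(d_k)); output = weights V.
# Shapes are illustrative: 4 query positions, 6 key/value positions, dimension 8.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # softmax over keys
    return weights @ V, weights

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 8))
output, attn = scaled_dot_product_attention(Q, K, V)
print(output.shape, attn.shape)   # (4, 8) (4, 6)
```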
This is an introductory-level micro-learning course aimed at explaining what responsible AI is, why it's important, and how Google implements responsible AI in its own products. The course also introduces Google's 7 AI principles.
In the last installment of the Dataflow course series, we will introduce the components of the Dataflow operational model. We will examine tools and techniques for troubleshooting and optimizing pipeline performance. We will then review testing, deployment, and reliability best practices for Dataflow pipelines. We will conclude with a review of Templates, which make it easy to scale Dataflow pipelines to organizations with hundreds of users. These lessons will help ensure that your data platform is stable and resilient to unanticipated circumstances.
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity, access, and management tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
In this course, you will get hands-on experience working through real-world challenges faced when building streaming data pipelines. The primary focus is on managing continuous, unbounded data with Google Cloud products.
In this intermediate course, you will learn to design, build, and optimize robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting. Get hands-on practice using Dataflow for Apache Beam and Serverless for Apache Spark (Dataproc Serverless) for implementation, and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence. A basic knowledge of data warehousing, ETL/ELT, SQL, Python, and Google Cloud concepts is recommended.
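A minimal Apache Beam pipeline of the kind these courses build on, runnable locally with the default DirectRunner; the input lines and word-count transforms are illustrative only, and a Dataflow job would add runner, project, and region pipeline options.

```python
# Minimal batch pipeline with the Apache Beam Python SDK (runs locally on the
# DirectRunner). The sample data and word-count logic are illustrative only.
import apache_beam as beam

lines = [
    "serverless data processing with dataflow",
    "batch and streaming pipelines with apache beam",
]

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "CreateInput" >> beam.Create(lines)
        | "SplitWords" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```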
While the traditional approaches of using data lakes and data warehouses can be effective, they have shortcomings, particularly in large enterprise environments. This course introduces the concept of a data lakehouse and the Google Cloud products used to create one. A lakehouse architecture uses open-standard data sources and combines the best features of data lakes and data warehouses, which addresses many of their shortcomings.
Complete the intermediate skill badge course Build a Data Warehouse with BigQuery to demonstrate the following skills: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; you earn it by completing an assessment in an interactive, hands-on environment that proves you can apply what you've learned. Complete this skill badge course and the final assessment challenge lab to receive a digital badge and show off your skills to your network.
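For instance, a date-partitioned BigQuery table with an array-of-structs column can be created with standard SQL DDL like the sketch below, issued here through the BigQuery Python client; the dataset and table names are placeholders, not the skill badge's actual lab schema.

```python
# Create a date-partitioned table with an ARRAY<STRUCT<...>> column using
# BigQuery standard SQL DDL via the Python client. Dataset/table names are
# placeholders; the client assumes application default credentials.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `my_dataset.orders` (
  order_id STRING,
  order_date DATE,
  items ARRAY<STRUCT<sku STRING, quantity INT64, price NUMERIC>>
)
PARTITION BY order_date
"""

client.query(ddl).result()  # wait for the DDL job to finish
print("Table created (or already exists).")
```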
This course explores the implementation of data load and transformation pipelines for a BigQuery Data Warehouse using Dataproc.
Looking to build or optimize your data warehouse? Learn best practices to Extract, Transform, and Load your data into Google Cloud with BigQuery. In this series of interactive labs you will create and optimize your own data warehouse using a variety of large-scale BigQuery public datasets. BigQuery is Google's fully managed, NoOps, low-cost analytics database. With BigQuery you can query terabytes and terabytes of data without having any infrastructure to manage or needing a database administrator. BigQuery uses SQL and a pay-as-you-go model, letting you focus on analyzing data to find meaningful insights. Looking for a hands-on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end to receive an exclusive Google Cloud digital badge.
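As a small taste of what the labs do, the sketch below queries one of BigQuery's public datasets (`bigquery-public-data.samples.shakespeare`) with the Python client; it assumes application default credentials and a default billing project.

```python
# Query a public dataset with the BigQuery Python client. Assumes application
# default credentials and a default project for billing.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT corpus, SUM(word_count) AS total_words
FROM `bigquery-public-data.samples.shakespeare`
GROUP BY corpus
ORDER BY total_words DESC
LIMIT 5
"""

for row in client.query(sql).result():
    print(row.corpus, row.total_words)
```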
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam, assess their exam readiness, and create their individual study plan.