Naveen Chandragiri Poornachandra
Member since 2022
Gold League
38965 points
Learn to use LangChain to call Google Cloud LLMs, generative AI services, and data stores to simplify the code of complex applications.
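As a rough illustration of the pattern this course teaches, the sketch below calls a Vertex AI-hosted Gemini model through LangChain's ChatVertexAI integration; the model name and prompt are illustrative and not taken from the course.

```python
# A minimal sketch, assuming the langchain-google-vertexai package and an
# authenticated Google Cloud project; the model name is illustrative.
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI(model_name="gemini-1.5-flash")
response = llm.invoke("Summarize what Vertex AI Vector Search does in one sentence.")
print(response.content)  # the model's text reply
```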
(Previously named "Developing apps with Vertex AI Agent Builder: Search". Please note there may be instances in this course where previous product names and titles are used.) Enterprises of all sizes have trouble making their information readily accessible to employees and customers alike. Internal documentation is frequently scattered across wikis, file shares, and databases. Similarly, consumer-facing sites often offer a vast selection of products, services, and information, but customers are frustrated by ineffective site search and navigation capabilities. This course teaches you to use AI Applications to integrate enterprise-grade generative AI search.
This course explores Google Cloud technologies to create and generate embeddings. Embeddings are numerical representations of text, images, video and audio, and play a pivotal role in many tasks that involve the identification of similar items, like Google searches, online shopping recommendations, and personalized music suggestions. Specifically, you’ll use embeddings for tasks like classification, outlier detection, clustering and semantic search. You’ll combine semantic search with the text generation capabilities of an LLM to build Retrieval Augmented Generation (RAG) systems and question-answering solutions, on your own proprietary data using Google Cloud’s Vertex AI.
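To make the semantic-search idea concrete, here is a minimal, hedged sketch: it embeds a query and a handful of documents with a Vertex AI text embedding model and ranks the documents by cosine similarity. The model name, documents, and query are illustrative and not from the course.

```python
# A minimal semantic-search sketch, assuming the google-cloud-aiplatform SDK
# and an initialized project; the embedding model name is illustrative.
import numpy as np
from vertexai.language_models import TextEmbeddingModel

model = TextEmbeddingModel.from_pretrained("text-embedding-004")
docs = ["Return policy for online orders", "Warranty coverage details", "Store locations"]
query = "How do I send an item back?"

doc_vecs = [np.array(e.values) for e in model.get_embeddings(docs)]
query_vec = np.array(model.get_embeddings([query])[0].values)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

ranked = sorted(zip(docs, doc_vecs), key=lambda d: cosine(query_vec, d[1]), reverse=True)
print(ranked[0][0])  # the most semantically similar document
```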
This course introduces AI search technologies, tools, and applications. Topics include semantic search with vector embeddings; hybrid search mechanisms that combine semantic and keyword approaches; and building grounded AI agents with Retrieval Augmented Generation (RAG) to minimize AI hallucinations. You get hands-on practice with Vertex AI Vector Search to build an intelligent search engine.
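As an illustration of the hybrid-search idea only, the sketch below blends a keyword score with a vector-similarity score using a simple weighted sum; the scores, documents, and weighting are hypothetical, and a managed service such as Vertex AI Vector Search handles this ranking for you in practice.

```python
# An illustrative hybrid-search scoring sketch; all values are made up.
def hybrid_score(keyword_score: float, vector_score: float, alpha: float = 0.5) -> float:
    """Linear blend of normalized keyword and semantic scores (each in 0..1)."""
    return alpha * keyword_score + (1 - alpha) * vector_score

candidates = {
    "doc_a": {"keyword": 0.9, "vector": 0.4},
    "doc_b": {"keyword": 0.3, "vector": 0.8},
}
ranked = sorted(candidates,
                key=lambda d: hybrid_score(candidates[d]["keyword"], candidates[d]["vector"]),
                reverse=True)
print(ranked)  # ['doc_a', 'doc_b'] with alpha=0.5
```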
This course equips full-stack mobile and web developers with the skills to integrate generative AI features into their applications using LangChain. You'll learn how to leverage LangChain’s capabilities for backend flows and seamless model execution, all within the familiar environment of Python. The course guides you through the entire process, from prototyping to production, ensuring a smooth journey in building next-generation AI-powered applications.
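A hedged sketch of the kind of backend flow the course describes: a LangChain prompt template piped into a Vertex AI chat model and a string output parser. The model name and prompt are illustrative, not from the course.

```python
# A minimal sketch, assuming langchain-core and langchain-google-vertexai are
# installed and the environment is authenticated to Google Cloud.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_google_vertexai import ChatVertexAI

prompt = ChatPromptTemplate.from_template(
    "Write a one-paragraph product description for {product}."
)
chain = prompt | ChatVertexAI(model_name="gemini-1.5-flash") | StrOutputParser()
print(chain.invoke({"product": "a solar-powered phone charger"}))
```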
This self-paced training course gives participants a broad study of security controls and techniques on Google Cloud. Through recorded lectures, demonstrations, and hands-on labs, participants explore and deploy the components of a secure Google Cloud solution, including Cloud Storage access control technologies, Security Keys, Customer-Supplied Encryption Keys, API access controls, scoping, shielded VMs, encryption, and signed URLs. It also covers securing Kubernetes environments.
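As one concrete example of the controls listed above, the hedged sketch below generates a V4 signed URL that grants time-limited read access to a Cloud Storage object; the bucket and object names are placeholders.

```python
# A minimal sketch, assuming the google-cloud-storage client library and
# credentials with permission to sign URLs; names are hypothetical.
from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket("example-bucket").blob("reports/q1.pdf")
url = blob.generate_signed_url(version="v4", expiration=timedelta(minutes=15), method="GET")
print(url)  # shareable link that expires after 15 minutes
```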
This on-demand accelerated course introduces participants to the comprehensive and flexible infrastructure and platform services provided by Google Cloud. Through a combination of video lectures, demonstrations, and hands-on labs, participants explore and deploy solution elements, including securely interconnecting networks, load balancing, autoscaling, infrastructure automation, and managed services.
This on-demand accelerated course introduces participants to the comprehensive and flexible infrastructure and platform services provided by Google Cloud, with a focus on Compute Engine. Through a combination of video lectures, demonstrations, and hands-on labs, participants explore and deploy solution elements, including infrastructure components such as networks, systems, and application services. The course also covers deploying practical solutions, including customer-supplied encryption keys, security and access management, quotas and billing, and resource monitoring.
This introductory microlearning course explains what responsible AI is, why it matters, and how Google implements it in its own products. It also introduces Google's seven AI principles.
Earn a skill badge by passing the final quiz and demonstrate your understanding of foundational concepts in generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
This course teaches participants how to build highly reliable and efficient solutions on Google Cloud using proven design patterns. It is a continuation of the Architecting with Google Compute Engine or Architecting with Google Kubernetes Engine courses and assumes hands-on experience with the technologies covered in either of those courses. Through a combination of presentations, design activities, and hands-on labs, participants learn to define and balance business and technical requirements in order to design Google Cloud deployments that are highly reliable, highly available, secure, and cost-effective.
Complete the intermediate Build Infrastructure with Terraform on Google Cloud skill badge course to demonstrate the following knowledge and skills: applying infrastructure as code (IaC) principles with Terraform, provisioning and managing Google Cloud resources using Terraform configurations, managing state effectively (both local and remote), and modularizing Terraform code for reuse and easier management. A skill badge course tests your hands-on knowledge of specific products through labs and a challenge assessment. Complete the course, or go straight to the challenge lab, to earn the badge. Badges demonstrate your expertise, strengthen your professional profile, and open up more career opportunities. Earned badges are displayed on your profile.
Earn a skill badge by completing the Set Up Google Cloud Networks course. You'll learn how to perform basic networking tasks on Google Cloud Platform, including creating custom networks, adding subnet firewall rules, and creating VMs and testing communication latency between them. A skill badge is an exclusive digital badge issued by Google Cloud that recognizes your proficiency with Google Cloud products and services and shows that you have tested that knowledge in an interactive hands-on environment. Complete this course and the final assessment challenge lab to earn a digital badge that you can share with others.
This course helps learners create a study plan for the Professional Cloud Architect (PCA) certification exam. Learners explore the breadth and scope of the domains covered in the exam, assess their exam readiness, and create an individual study plan.
Earn a skill badge by completing the Create Google Cloud Networks course. This course covers multiple ways to deploy and monitor applications, including reviewing IAM roles and adding or removing project access, creating Virtual Private Cloud networks, deploying and monitoring Compute Engine VMs, writing SQL queries, and deploying applications in multiple ways with Kubernetes. A skill badge is an exclusive digital badge issued by Google Cloud that recognizes your proficiency with Google Cloud products and services and shows that you have tested that knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to earn a skill badge that you can share with your network.
This course introduces Vertex AI Studio, a tool for interacting with generative AI models, prototyping business ideas, and launching them to production. Through immersive use cases, engaging lessons, and hands-on labs, you'll explore the prompt-to-production lifecycle and learn how to use Vertex AI Studio for Gemini multimodal applications, prompt design, prompt engineering, and model tuning. The goal of this course is to enable you to unlock the potential of generative AI in your projects with Vertex AI Studio.
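To show what the prompt-to-production step can look like in code, here is a minimal sketch that sends a prompt prototyped in Vertex AI Studio to a Gemini model through the Vertex AI SDK; the project, location, and model name are placeholders.

```python
# A minimal sketch, assuming the google-cloud-aiplatform SDK is installed and
# the project/location values below are replaced with real ones.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # hypothetical project
model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content("List three prompt design best practices.")
print(response.text)
```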
This skill badge course is designed to offer hands-on experience through labs, enabling participants to master Document AI for document processing and extraction tasks. By the end of the course, participants will be proficient in creating and testing Document AI processors, customizing document extraction using Document AI Workbench, and building custom processors to tackle real-world document processing challenges.
This workload aims to upskill Google Cloud partners to perform specific tasks associated with building a Custom Doc Extractor using the Google Cloud AI solution. The following will be addressed:
Service: Document AI
Task: Extract fields
Processors: Custom Document Extractor and Document Splitter
Prediction: Using Endpoint to programmatically extract fields
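A hedged sketch of the prediction step named above: call a Document AI processor endpoint programmatically and read the extracted fields. The project, location, processor ID, and file name are placeholders.

```python
# A minimal sketch, assuming the google-cloud-documentai client library and a
# deployed custom extractor processor; identifiers below are hypothetical.
from google.cloud import documentai

client = documentai.DocumentProcessorServiceClient()
name = client.processor_path("my-project", "us", "my-processor-id")

with open("invoice.pdf", "rb") as f:
    raw = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

result = client.process_document(request=documentai.ProcessRequest(name=name, raw_document=raw))
for entity in result.document.entities:  # fields extracted by the custom extractor
    print(entity.type_, entity.mention_text)
```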
Complete the intermediate Inspect Rich Documents with Gemini Multimodality and Multimodal RAG skill badge course to demonstrate the following skills: using multimodal prompts with Gemini to extract information from text and image data, generate video descriptions, and retrieve additional information beyond the video; and using Gemini's multimodal Retrieval Augmented Generation (RAG) capabilities to build metadata for documents containing text and images, retrieve all relevant text chunks, and print citations. A skill badge is an exclusive digital badge issued by Google Cloud that recognizes your proficiency with Google Cloud products and services and shows that you have tested that knowledge in an interactive hands-on environment. Complete this course and the final assessment challenge lab to earn a skill badge that you can share with your network.
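As a rough illustration of a multimodal prompt, the sketch below passes an image and a text instruction to Gemini through the Vertex AI SDK; the Cloud Storage URI and model name are placeholders.

```python
# A minimal multimodal prompt sketch; the image URI is hypothetical and the
# environment is assumed to be initialized with vertexai.init().
from vertexai.generative_models import GenerativeModel, Part

model = GenerativeModel("gemini-1.5-pro")
image = Part.from_uri("gs://example-bucket/diagram.png", mime_type="image/png")
response = model.generate_content([image, "Describe the key components shown in this diagram."])
print(response.text)
```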
This course explores how to leverage Looker to create data experiences and gain insights with modern business intelligence (BI) and reporting.
Demonstrate your ability to implement updated prompt engineering techniques and utilize several of Gemini's key capabilities, including multimodal understanding and function calling. Then integrate generative AI into a RAG application deployed to Cloud Run. The labs in this course serve as a limited-scope test environment for checking your understanding as a learner; in a real-world environment, these technologies can be used with fewer limitations.
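A hedged sketch of the function-calling capability mentioned above, using the Vertex AI SDK: declare a tool, let the model decide to call it, and read back the requested function call. The function name and schema are hypothetical.

```python
# A minimal sketch, assuming an initialized Vertex AI environment; the tool
# schema and model name are illustrative.
from vertexai.generative_models import FunctionDeclaration, GenerativeModel, Tool

get_order_status = FunctionDeclaration(
    name="get_order_status",
    description="Look up the status of a customer order",
    parameters={"type": "object", "properties": {"order_id": {"type": "string"}}},
)
model = GenerativeModel("gemini-1.5-pro", tools=[Tool(function_declarations=[get_order_status])])
response = model.generate_content("Where is my order 12345?")
print(response.candidates[0].content.parts[0].function_call)  # name and arguments to execute
```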
Text Prompt Engineering Techniques introduces different strategic approaches and techniques to apply when writing prompts for text-based generative AI tasks.
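One such technique is few-shot prompting; the illustrative example below shows how a couple of labeled examples can steer a model toward the desired output format. The reviews are made up.

```python
# An illustrative few-shot prompt; send it to any text generation model.
prompt = """Classify the sentiment of each review as positive or negative.

Review: "The battery lasts all day and the screen is gorgeous."
Sentiment: positive

Review: "It stopped working after a week and support never replied."
Sentiment: negative

Review: "Setup took five minutes and everything just worked."
Sentiment:"""
# The expected completion for the last review is "positive".
```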
This course, Integrate Vertex AI Search and Conversation into Voice and Chat Apps, is composed of a set of labs that give you hands-on experience interacting with new generative AI technologies. You will learn how to create end-to-end search and conversational experiences by following examples. These technologies complement predefined intent-based chat experiences created in Dialogflow with LLM-based, generative answers that can be based on your own data. They also allow you to provide enterprise-grade search experiences for internal and external websites to search documents, structured data, and public websites.
This course covers how to implement the various flavors of production ML systems: static, dynamic, and continuous training; static and dynamic inference; and batch and online processing. You delve into TensorFlow abstraction levels, the various options for doing distributed training, and how to write distributed training models with custom estimators. This is the second course of the Advanced Machine Learning on Google Cloud series. After completing this course, enroll in the Image Understanding with TensorFlow on Google Cloud course.
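As a small illustration of one distributed training option, the sketch below builds a Keras model inside a tf.distribute.MirroredStrategy scope so that training is replicated across available GPUs; the toy model and data are illustrative.

```python
# A minimal synchronous data-parallel training sketch with TensorFlow.
import numpy as np
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer="adam", loss="mse")

x, y = np.random.rand(64, 4), np.random.rand(64, 1)
model.fit(x, y, epochs=1, batch_size=8)  # each batch is split across replicas
```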
This course introduces participants to MLOps tools and best practices for deploying, evaluating, monitoring and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Machine Learning Engineering professionals use tools for continuous improvement and evaluation of deployed models. They work with (or can be) Data Scientists, who develop models, to enable velocity and rigor in deploying the best performing models.
This course covers building ML models with TensorFlow and Keras, improving the accuracy of ML models and writing ML models for scaled use.
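For context, a minimal Keras sketch in the spirit of this course: define, compile, and train a small classifier while holding out a validation split to track accuracy. The synthetic data is made up.

```python
# A toy end-to-end Keras example with synthetic data.
import numpy as np
import tensorflow as tf

x = np.random.rand(200, 10).astype("float32")
y = (x.sum(axis=1) > 5).astype("float32")  # toy binary labels

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=3, validation_split=0.2, verbose=0)
print(model.evaluate(x, y, verbose=0))  # [loss, accuracy]
```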
The course begins with a discussion about data: how to improve data quality and perform exploratory data analysis. We describe Vertex AI AutoML and how to build, train, and deploy an ML model without writing a single line of code. You will understand the benefits of BigQuery ML. We then discuss how to optimize a machine learning (ML) model and how generalization and sampling can help assess the quality of ML models for custom training.
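As an illustration of what BigQuery ML looks like in practice, the hedged sketch below submits a CREATE MODEL statement from Python; the dataset and model names are placeholders, and the training query follows the pattern used in public BigQuery ML examples.

```python
# A minimal sketch, assuming the google-cloud-bigquery client and an existing
# dataset named `mydataset`; names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
    CREATE OR REPLACE MODEL `mydataset.penguin_weight_model`
    OPTIONS (model_type = 'linear_reg', input_label_cols = ['body_mass_g']) AS
    SELECT * FROM `bigquery-public-data.ml_datasets.penguins`
    WHERE body_mass_g IS NOT NULL
""").result()  # blocks until the model finishes training
```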
This course explores what ML is and what problems it can solve. The course also discusses best practices for implementing machine learning. You’re introduced to Vertex AI, a unified platform to quickly build, train, and deploy AutoML machine learning models. The course discusses the five phases of converting a candidate use case to be driven by machine learning, and why it’s important to not skip them. The course ends with recognizing the biases that ML can amplify and how to recognize them.
This course explores the benefits of using Vertex AI Feature Store, how to improve the accuracy of ML models, and how to find which data columns make the most useful features. This course also includes content and labs on feature engineering using BigQuery ML, Keras, and TensorFlow.
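To make the feature-engineering idea concrete, the illustrative sketch below uses Keras preprocessing layers to normalize a numeric feature and one-hot encode a categorical one; the toy data is made up.

```python
# A small feature engineering sketch with Keras preprocessing layers.
import numpy as np
import tensorflow as tf

ages = np.array([[23.0], [45.0], [31.0], [52.0]], dtype="float32")
colors = np.array([["red"], ["blue"], ["red"], ["green"]])

normalizer = tf.keras.layers.Normalization()
normalizer.adapt(ages)  # learn mean and variance from the data

encoder = tf.keras.layers.StringLookup(output_mode="one_hot")
encoder.adapt(colors)   # build the vocabulary from the data

print(normalizer(ages).numpy())  # zero-mean, unit-variance ages
print(encoder(colors).numpy())   # one-hot encoded colors
```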
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.