Article

How to install an open source tool for creating machine learning pipelines

May 5, 2022 | JooHo Lee
Tags: AI/ML, CI/CD

Table of contents:
* Deploy the Open Data Hub Operator
* Create a KfDef to deploy Pachyderm, JupyterHub, and Ceph Nano
* Video demo
* Troubleshooting

Pachyderm is an open source tool for creating and running machine learning (AI/ML) pipelines. It runs on Kubernetes and provides modern developer benefits such as versioning and autoscaling. Pachyderm also integrates with the JupyterHub information sharing platform. On March 8, 2022, Red Hat announced that Pachyderm had been added to the Open Data Hub (ODH), a blueprint for building an AI-as-a-service platform on the Red Hat OpenShift Container Platform. In this article, you'll learn how to install Pachyderm using Open Data Hub.

To walk through the steps in this article, you'll need a Red Hat OpenShift cluster with a default StorageClass. The procedure has been tested in the following environments:

* OpenShift Dedicated 4.9 on AWS with a gp2 StorageClass
* An OpenShift cluster using Red Hat OpenShift Local (formerly Red Hat CodeReady Containers) with an nfs StorageClass set up by the NFS Provisioner Operator

The first option is used in this article. OpenShift Dedicated provides a default gp2 StorageClass, but it is not cost-free. As an alternative, the second option lets you set up a cost-free environment as follows:

1. Use OpenShift Local to install an OpenShift all-in-one cluster on your laptop.
2. Add an nfs StorageClass using the NFS Provisioner Operator, available from OperatorHub or GitHub.

Once you've followed those steps, you'll have essentially the same environment as OpenShift Dedicated. This article also contains an embedded video illustrating the steps.

If you want to experiment with the Red Hat OpenShift Local test environment, refer to the following articles:

* Configure CodeReady Containers for AI/ML development
* Create and manage local persistent volumes with CodeReady Containers

Deploy the Open Data Hub Operator

Installing an Operator is the easiest step in this procedure. Go to the OperatorHub menu option in the OpenShift console, search for the Open Data Hub Operator, and click its link (Figure 1).

Figure 1: Find the Open Data Hub Operator in the OpenShift console.

You'll be taken to the Operator page for the Open Data Hub Operator (Figure 2). Click Install.

Figure 2: From the Open Data Hub Operator page, click Install to begin the installation.

Next, you'll see the Install Operator page (Figure 3). Keep all the defaults and click Install again.

Figure 3: Click Install on the Install Operator page.

When installation is complete, you'll see a message saying "Installed operator — ready for use," as in Figure 4.

Figure 4: The Operator is now installed.
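If you also want to confirm the installation from a terminal, a quick check like the following should work. It assumes you kept the default installation mode, which makes the Operator available to all namespaces through the openshift-operators namespace; the exact ClusterServiceVersion name will vary with the Operator version:

$ oc get csv -n openshift-operators | grep -i opendatahub

The ClusterServiceVersion's phase should read Succeeded once the installation finishes.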
After you install the Open Data Hub Operator, create a new project that we'll call opendatahub, where all the required components (JupyterHub, Ceph Nano, and Pachyderm) will be deployed:

$ oc new-project opendatahub

Create a KfDef to deploy Pachyderm, JupyterHub, and Ceph Nano

Pachyderm supports any storage option compatible with AWS S3 object storage. Open Data Hub provides two of these storage options:

* Full automation: Deploy Ceph Nano on Open Data Hub, which creates a secret for Pachyderm.
* Partial automation: Manually create a secret with the credentials to access S3 or another S3-compatible object storage, such as MinIO.

Full automation using Ceph Nano

Open Data Hub provides a full automation YAML configuration using a Kubernetes Job named pachyderm-deployer. Here's an excerpt of the configuration:

    # Ceph Nano
    - kustomizeConfig:
        repoRef:
          name: manifests
          path: ceph/object-storage/scc
      name: ceph-nano-scc
    - kustomizeConfig:
        repoRef:
          name: manifests
          path: ceph/object-storage/nano
      name: ceph-nano
    # Pachyderm operator
    - kustomizeConfig:
        parameters:
          - name: namespace
            value: openshift-operators
        repoRef:
          name: manifests
          path: odhpachyderm/operator
      name: odhpachyderm-operator
    # Pachyderm deployer
    - kustomizeConfig:
        repoRef:
          name: manifests
          path: odhpachyderm/deployer
      name: odhpachyderm-deployer

The configuration contains a script that makes sure Ceph Nano is in a ready state, and then creates an S3 bucket in Ceph Nano. After that, the script creates a secret for the S3 bucket credentials, which Pachyderm will use to gain access to the bucket.

To use full automation on Kubernetes, you need a KfDef custom resource (CR). A manifest for this KfDef can be found in my GitHub repository. Create the KfDef on OpenShift with the following command:

$ oc create -f https://bit.ly/3wHwt59

Partial automation using S3 or other compatible storage (MinIO)

If you want to go the partial automation route, the only difference from using Ceph Nano is that you need to create a secret before creating the KfDef, then pass that information to pachyderm-deployer in the KfDef. The relevant line can be found in context in the YAML file.

An oc command that creates a secret for AWS S3 looks like this:

$ oc create secret generic pachyderm-aws-secret \
    --from-literal=access-id=XXX \
    --from-literal=access-secret=XXX \
    --from-literal=region=us-east-2 \
    --from-literal=bucket=pachyderm

An oc command that creates a secret for MinIO looks like this:

$ oc create secret generic pachyderm-minio-secret \
    --from-literal=access-id=XXX \
    --from-literal=access-secret=XXX \
    --from-literal=custom-endpoint=${minio_ip} \
    --from-literal=region=us-east-2 \
    --from-literal=bucket=pachyderm
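The ${minio_ip} variable above is the endpoint of your MinIO installation, which this article doesn't pin down. If MinIO happens to run as a service in your cluster, a sketch like the following can populate the variable; the service name and namespace (minio/minio) are assumptions for illustration, so adjust them to your deployment:

# Hypothetical lookup: adjust the service name and namespace to your MinIO deployment,
# and append a port to the endpoint if your MinIO service requires one.
$ minio_ip=$(oc get svc minio -n minio -o jsonpath='{.spec.clusterIP}')

# Confirm the secret exists in the opendatahub project before creating the KfDef.
$ oc get secret pachyderm-minio-secret -n opendatahub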
The following excerpt from a KfDef manifest shows how to use the secret with S3. The example uses pachyderm-aws-secret for the secret:

    # Pachyderm Operator
    - kustomizeConfig:
        parameters:
          - name: namespace
            value: openshift-operators
        repoRef:
          name: manifests
          path: odhpachyderm/operator
      name: odhpachyderm-operator
    # Pachyderm Deployer
    - kustomizeConfig:
        parameters:
          - name: storage_secret        #<=== Must set this
            value: pachyderm-aws-secret #<=== Use your secret name
        repoRef:
          name: manifests
          path: odhpachyderm/deployer
      name: odhpachyderm-deployer

Once you've created the secret, you can create the KfDef with the following command:

$ oc create -f https://bit.ly/3NkV31I

After you create the KfDef, OpenShift creates several pods in the opendatahub project. Four pods are created for Pachyderm:

$ oc get pod
etcd-0                        1/1   Running   0   12m
postgres-0                    1/1   Running   0   12m
pachd-874f5958c-7w98p         1/1   Running   0   11m
pg-bouncer-7587d49769-gwn8f   1/1   Running   0   11m

Even more pods might be devoted to Pachyderm if you are using Red Hat OpenShift Local. If resources on your cluster are tight, it could take some time to create the pods. Now you can try Pachyderm on your cluster.

Video demo

The following video illustrates the steps outlined so far.

Troubleshooting

If you are running this example on your laptop, you might see some errors with the JupyterHub pods, jupyterhub and jupyterhub-db, due to a lack of resources (Figure 5).

Figure 5: Sometimes, jupyterhub and jupyterhub-db show errors at startup.

traefik-proxy pods might also show some errors, but you can ignore them: once jupyterhub and jupyterhub-db recover, traefik-proxy heals automatically.

If you see these errors, start a rollout of the DeploymentConfigs for jupyterhub and jupyterhub-db, as shown in Figures 6 and 7.

Figure 6: Pull up the DeploymentConfigs page to get access to pages for jupyterhub and jupyterhub-db.

Figure 7: On the jupyterhub page, choose Start rollout.

Start a rollout for jupyterhub-db in the same way. If these steps don't solve the problem, roll out jupyterhub-db first, wait until it is ready, then roll out jupyterhub.
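If you prefer to trigger the rollouts from a terminal, the console's Start rollout action corresponds to oc rollout latest on the DeploymentConfig. A minimal sketch, assuming the pods live in the opendatahub project:

# Roll out the database first and wait for it, then roll out JupyterHub.
$ oc rollout latest dc/jupyterhub-db -n opendatahub
$ oc rollout status dc/jupyterhub-db -n opendatahub
$ oc rollout latest dc/jupyterhub -n opendatahub
$ oc rollout status dc/jupyterhub -n opendatahub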
Then enjoy experimenting with what Pachyderm has to offer!