Run Jupyter Notebooks on Red Hat OpenShift Data Science - WIP

Prerequisites

  • The Configure RHODS workspace section has been completed.

Run Jupyter notebooks

  1. prepare_env.ipynb 20240220131717
  2. Train model.ipynb 20240220133400
  3. Test_Model.ipynb 20240220132122
  4. Store_Model.ipynb
    Change the MinIO values and run the notebook 20240220134738
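Before running Store_Model.ipynb, it can help to keep the MinIO values in one place. A minimal sketch with hypothetical variable names; the endpoint, bucket, and credentials shown are the example values used elsewhere in this document, so replace them with your own:

```python
# Hypothetical MinIO connection settings for Store_Model.ipynb.
# Replace these with the values from your own cluster.
minio_settings = {
    "endpoint": "https://minio-api-mltesting.apps.mltesting.sandbox2831.opentlc.com",
    "bucket": "mybucket",
    "access_key": "minio",
    "secret_key": "minio123",
}

def object_url(settings: dict, key: str) -> str:
    # Derive the object URL a stored model would be reachable at.
    return f"{settings['endpoint']}/{settings['bucket']}/{key}"

print(object_url(minio_settings, "model.pkl"))
```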

Create pipelines with Tekton integration

Create a file called demo.pipeline

Click on Pipeline Editor 20240220135438, then rename the pipeline to demo pipeline 20240220135555

Configure Pipeline Runtime

20240220135727

  • Data Science Pipelines API Endpoint*
    • https://ds-pipeline-pipelines-definition-mltesting.apps.mltesting.example.com
  • Public Data Science Pipelines API Endpoint
    • https://rhods-dashboard-redhat-ods-applications.apps.mltesting.sandbox2831.opentlc.com/pipelineRuns/mltesting
  • Data Science Pipelines API Endpoint Password Or Token
    • OpenShift API token
  • Cloud Object Storage Endpoint*
    • https://minio-api-mltesting.apps.mltesting.sandbox2831.opentlc.com
  • Cloud Object Storage Bucket Name*
    • mybucket
  • Cloud Object Storage Credentials Secret
    • `aws-connection-mybucket`
  • Cloud Object Storage Username
    • minio
  • Cloud Object Storage Password
    • minio123

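The endpoint values above can be sanity-checked before saving the runtime configuration. A small stdlib-only sketch; the URLs are the examples from this document, so adapt them to your cluster:

```python
from urllib.parse import urlparse

# Example endpoints from the runtime configuration above.
endpoints = {
    "pipelines_api": "https://ds-pipeline-pipelines-definition-mltesting.apps.mltesting.example.com",
    "object_storage": "https://minio-api-mltesting.apps.mltesting.sandbox2831.opentlc.com",
}

def check_endpoint(url: str) -> bool:
    # The runtime configuration expects full https:// URLs with a hostname;
    # a bare host like "minio-api-mltesting" will not work here.
    parsed = urlparse(url)
    return parsed.scheme == "https" and bool(parsed.netloc)

assert all(check_endpoint(u) for u in endpoints.values())
```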
20240220140905 20240220140937

Configure the data volume for each job as follows:

  • Mount Path* /opt/app-root/src/
  • Persistent Volume Claim Name* pins-workspace
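All four notebooks mount the same PVC, so artifacts written under the mount path by one step are visible to the next. A minimal sketch; the helper name is an assumption, and the path is the Mount Path configured above:

```python
from pathlib import Path

# Mount Path from the data volume configuration above.
MOUNT = Path("/opt/app-root/src")

def artifact_path(name: str) -> Path:
    # Write artifacts (datasets, trained model files) under the shared
    # PVC mount so downstream steps such as Test_Model.ipynb and
    # Store_Model.ipynb can read them.
    return MOUNT / name

print(artifact_path("model.pkl"))
```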

Optional: Add GPU information

  • GPU: 1
  • GPU Vendor: nvidia.com/gpu
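When a notebook node requests nvidia.com/gpu, a quick in-notebook check confirms the pod actually landed on a GPU node. A minimal stdlib-only sketch, assuming the NVIDIA driver exposes nvidia-smi inside the container:

```python
import shutil

def gpu_visible() -> bool:
    # nvidia-smi is available in the container when the NVIDIA driver and
    # the nvidia.com/gpu device plugin have exposed a GPU to the pod.
    return shutil.which("nvidia-smi") is not None

print("GPU visible:", gpu_visible())
```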
Set the node properties for each notebook:

  1. prepare_env.ipynb 20240220141325
  2. Train model.ipynb 20240220142450
  3. Test_Model.ipynb 20240220142610
  4. Store_Model.ipynb
    20240220142703

Final view of the pipeline

20240220142735

Start pipeline run

20240221104217 You should see this screen on a successful run 20240221104236

20240221104730

20240221104743