Usage Examples

hl-smi Example

  1. Create an hl-smi.yaml file.

apiVersion: v1
kind: Pod
metadata:
  name: habanalabs-gaudi-demo
spec:
  containers:
  - name: habana-ai-base-container
    image: vault.habana.ai/gaudi-docker/1.6.1/ubuntu20.04/habanalabs/tensorflow-installer-tf-cpu-2.9.1:latest
    workingDir: /root
    command: ["hl-smi"]
    args: [""]
    resources:
      limits:
        habana.ai/gaudi: 1
  2. Apply the hl-smi.yaml file.

oc apply -f hl-smi.yaml -n habana-system
  3. Check the results.

oc logs habanalabs-gaudi-demo -n habana-system
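The hl-smi command runs once and exits, so the logs are only complete after the pod finishes. One way to wait for that, assuming a recent oc/kubectl release that supports jsonpath conditions and the pod and namespace names from the example above:

```shell
# Block until the pod reaches the Succeeded phase (or time out), then read logs.
oc wait pod/habanalabs-gaudi-demo -n habana-system \
  --for=jsonpath='{.status.phase}'=Succeeded --timeout=120s
oc logs habanalabs-gaudi-demo -n habana-system
```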

mnist Example

  1. Create a Dockerfile_model file.

FROM vault.habana.ai/gaudi-docker/1.7.1/ubuntu20.04/habanalabs/tensorflow-installer-tf-cpu-2.10.1:latest
RUN git clone -b |Version| https://github.com/HabanaAI/Model-References.git
  2. Build the Docker image. Make sure the image is built on the same operating system where the Dockerfile was created.

docker build -f Dockerfile_model -t mnist-example:rhel8.6 .
  3. Once the build is finished, push the image to your Docker registry. For example:

docker login <your docker registry URL>
docker push <your docker registry URL>/<path where your images are stored>/mnist-example:rhel8.6
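Note that the image built in the previous step is tagged only as mnist-example:rhel8.6, so it must be retagged with the registry prefix before it can be pushed. A sketch of the full sequence, using a hypothetical registry path (registry.example.com/images) that you would replace with your own:

```shell
# Hypothetical registry path for illustration; substitute your own registry URL.
REGISTRY=registry.example.com/images
docker tag mnist-example:rhel8.6 ${REGISTRY}/mnist-example:rhel8.6
docker push ${REGISTRY}/mnist-example:rhel8.6
```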
  4. Create a pod-mnist.yaml file.

apiVersion: v1
kind: Pod
metadata:
  name: habanalabs-gaudi-demo2
spec:
  containers:
  - name: habana-ai-base-container
    image: <path to the image pushed in the previous step>
    workingDir: /
    command: ["python3"]
    args: ["/Model-References/TensorFlow/examples/hello_world/example.py"]
    resources:
      limits:
        habana.ai/gaudi: 1
  5. Apply the pod-mnist.yaml file.

oc apply -f pod-mnist.yaml -n habana-system
  6. Check the results.

oc logs habanalabs-gaudi-demo2 -n habana-system
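As with the hl-smi example, the training script runs to completion, so you may want to wait for the pod to finish before reading its logs and then remove it to free the Gaudi device. A possible sequence, assuming the pod and namespace names from the example above:

```shell
# Wait for the example script to finish, read its output, then clean up.
oc wait pod/habanalabs-gaudi-demo2 -n habana-system \
  --for=jsonpath='{.status.phase}'=Succeeded --timeout=300s
oc logs habanalabs-gaudi-demo2 -n habana-system
oc delete pod habanalabs-gaudi-demo2 -n habana-system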