Usage Examples

Note

For the following examples, make sure to use TensorFlow on Intel Gaudi software version 1.14.0 or below.

hl-smi Example

  1. Create a hl-smi.yaml file.

apiVersion: v1
kind: Pod
metadata:
  name: habanalabs-gaudi-demo
spec:
  containers:
  - name: habana-ai-base-container
    image: vault.habana.ai/gaudi-docker/1.14.0/ubuntu22.04/habanalabs/tensorflow-installer-tf-cpu-2.15.0:latest
    workingDir: /root
    command: ["hl-smi"]
    args: [""]
    resources:
      limits:
        habana.ai/gaudi: 1
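Before applying the manifest, you can optionally confirm that the habana.ai/gaudi resource is advertised on your worker nodes. This is a minimal check, assuming the resource name used in the manifest above and that your user is allowed to describe nodes:

oc describe node | grep habana.ai/gaudi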
  2. Apply the hl-smi.yaml file.

oc apply -f hl-smi.yaml -n habana-system
  3. Check the results.

oc logs habanalabs-gaudi-demo -n habana-system
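If the log output is empty, the pod may still be scheduling or running. A quick status check, assuming the pod name and namespace used above:

oc get pod habanalabs-gaudi-demo -n habana-system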

mnist Example

  1. Create a Dockerfile_model file.

FROM vault.habana.ai/gaudi-docker/1.14.0/ubuntu22.04/habanalabs/tensorflow-installer-tf-cpu-2.15.0:latest
RUN git clone -b 1.14.0 https://github.com/HabanaAI/Model-References.git
  2. Build the Docker image. Make sure the image is built on the same operating system where the Dockerfile was created.

docker build -f Dockerfile_model -t mnist-example:rhel8.6 .
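As an optional check before uploading, you can confirm that the build produced the expected image by listing local images that match the repository name used above:

docker images mnist-example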
  3. Once the build is finished, upload the image to your local Docker registry. For example:

docker login <your docker registry URL>
docker push <your docker registry URL>/<path where your images are stored>/mnist-example:rhel8.6
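Note that the build step tags the image locally as mnist-example:rhel8.6, so it usually needs to be re-tagged with your registry path before the push succeeds. A minimal sketch, using a hypothetical registry my-registry.example.com/images as the destination:

docker tag mnist-example:rhel8.6 my-registry.example.com/images/mnist-example:rhel8.6
docker push my-registry.example.com/images/mnist-example:rhel8.6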
  4. Create a pod-mnist.yaml file.

apiVersion: v1
kind: Pod
metadata:
  name: habanalabs-gaudi-demo2
spec:
  containers:
  - name: habana-ai-base-container
    image: <PATH to the image pushed in step 3>
    workingDir: /
    command: ["python3"]
    args: ["/Model-References/TensorFlow/examples/hello_world/example.py"]
    resources:
      limits:
        habana.ai/gaudi: 1
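If you want to validate the manifest without creating the pod, oc apply supports a client-side dry run; a sketch assuming the file name and namespace used in this example:

oc apply -f pod-mnist.yaml --dry-run=client -n habana-system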
  5. Apply the pod-mnist.yaml file.

oc apply -f pod-mnist.yaml -n habana-system
  6. Check the results.

oc logs habanalabs-gaudi-demo2 -n habana-system
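When you are finished with both examples, the demo pods can be removed; this sketch assumes the pod names and namespace used above:

oc delete pod habanalabs-gaudi-demo habanalabs-gaudi-demo2 -n habana-system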