Usage Examples

Note

TensorFlow is supported only with Intel® Gaudi® software version 1.14.0 or below. Make sure you are running a supported version before proceeding.

hl-smi Example

  1. Create a hl-smi.yaml file:

    apiVersion: v1
    kind: Pod
    metadata:
      name: habanalabs-gaudi-demo
    spec:
      containers:
      - name: habana-ai-base-container
        image: vault.habana.ai/gaudi-docker/1.14.0/ubuntu22.04/habanalabs/tensorflow-installer-tf-cpu-2.15.0:latest
        workingDir: /root
        command: ["hl-smi"]
        resources:
          limits:
            habana.ai/gaudi: 1
    
  2. Apply the .yaml file:

    oc apply -f hl-smi.yaml -n habana-system
    
  3. Check the results:

    oc logs habanalabs-gaudi-demo -n habana-system
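
The pod runs `hl-smi` to completion, so `oc logs` may show partial or no output if queried while the container is still starting. A minimal sketch of waiting for the pod to finish before reading the logs (pod name and namespace as above; the `jsonpath` form of `oc wait` assumes a reasonably recent client):

```shell
# Wait up to 2 minutes for the demo pod to reach the Succeeded phase,
# then print its logs (pod name/namespace from the steps above).
oc wait pod/habanalabs-gaudi-demo -n habana-system \
  --for=jsonpath='{.status.phase}'=Succeeded --timeout=120s
oc logs habanalabs-gaudi-demo -n habana-system
```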
    

MNIST Example

  1. Create a Dockerfile_model file:

    FROM vault.habana.ai/gaudi-docker/1.14.0/ubuntu22.04/habanalabs/tensorflow-installer-tf-cpu-2.15.0:latest
    RUN git clone -b 1.14.0 https://github.com/HabanaAI/Model-References.git
    
  2. Build the Docker image. Make sure the image is built on the same operating system where the Dockerfile was created.

    docker build -f Dockerfile_model -t mnist-example:rhel8.6 .
    
  3. Once the build is finished, upload the image to your local Docker registry:

    docker login <your docker registry URL>
    docker push <your docker registry URL>/<path to your images>/mnist-example:rhel8.6
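
If your registry requires authentication, the cluster needs an image pull secret to pull the image (created with `oc create secret docker-registry` and linked to the pod's service account). The pod spec in the next step would then reference it; a minimal sketch, with `my-registry-secret` as a placeholder name:

```yaml
# Fragment of a pod spec; goes under the top-level "spec:" key.
spec:
  imagePullSecrets:
  - name: my-registry-secret   # placeholder; must match the created secret
```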
    
  4. Create a pod-mnist.yaml file:

    apiVersion: v1
    kind: Pod
    metadata:
      name: habanalabs-gaudi-demo2
    spec:
      containers:
      - name: habana-ai-base-container
        image: <path to the image pushed in step 3>
        workingDir: /
        command: ["python3"]
        args: ["/Model-References/TensorFlow/examples/hello_world/example.py"]
        resources:
          limits:
            habana.ai/gaudi: 1
    
  5. Apply the pod-mnist.yaml file:

    oc apply -f pod-mnist.yaml -n habana-system
    
  6. Check the results:

    oc logs habanalabs-gaudi-demo2 -n habana-system
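
When you are finished, the demo pods can be deleted so the examples can be re-applied cleanly (pod names as above):

```shell
# Remove the demo pods created in the two examples above.
oc delete pod habanalabs-gaudi-demo habanalabs-gaudi-demo2 -n habana-system
```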