Usage Examples
hl-smi Example
Create an hl-smi.yaml file.
apiVersion: v1
kind: Pod
metadata:
  name: habanalabs-gaudi-demo
spec:
  containers:
    - name: habana-ai-base-container
      image: vault.habana.ai/gaudi-docker/1.8.0/ubuntu20.04/habanalabs/tensorflow-installer-tf-cpu-2.9.1:latest
      workingDir: /root
      command: ["hl-smi"]
      args: [""]
      resources:
        limits:
          habana.ai/gaudi: 1
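Optionally, before creating the pod, you can confirm that the habana.ai/gaudi resource is being advertised on your worker nodes. This check is not part of the original example; the node name below is a placeholder for one of your Gaudi worker nodes.
oc describe node <gaudi-worker-node> | grep habana.ai/gaudi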
Apply the hl-smi.yaml file.
oc apply -f hl-smi.yaml -n habana-system
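To confirm the pod has run to completion before reading its logs, a standard status check can be used. The pod name matches the manifest above.
oc get pod habanalabs-gaudi-demo -n habana-system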
Check the results.
oc logs habanalabs-gaudi-demo -n habana-system
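If the Gaudi device was allocated correctly, the log should contain the hl-smi status table for the allocated device. When you are done, the demo pod can be removed; this cleanup step is not part of the original example.
oc delete pod habanalabs-gaudi-demo -n habana-system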
mnist Example
Create a Dockerfile_model file.
FROM vault.habana.ai/gaudi-docker/1.8.0/ubuntu22.04/habanalabs/tensorflow-installer-tf-cpu-2.11.0:latest
RUN git clone -b |Version| https://github.com/HabanaAI/Model-References.git
Build the Docker image. Make sure the image is built on the same operating system where the Dockerfile was created.
docker build -f Dockerfile_model -t mnist-example:rhel8.6 .
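As an optional sanity check that is not part of the original steps, you can verify that the Model-References clone is present in the freshly built image before pushing it. The path matches the script used later in the pod spec; --entrypoint is used so the listing runs regardless of any entrypoint defined by the base image.
docker run --rm --entrypoint ls mnist-example:rhel8.6 /Model-References/TensorFlow/examples/hello_world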
Once the build is finished, upload the image to your local Docker registry. For example:
docker login <your docker registry URL>
docker push <your docker registry URL>/<path where your images are stored>/mnist-example:rhel8.6
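Depending on your registry layout, the locally built tag usually needs to be re-tagged with the full registry path before it can be pushed. A typical sequence looks like the following, where the registry URL and path are placeholders for your environment.
docker tag mnist-example:rhel8.6 <your docker registry URL>/<path>/mnist-example:rhel8.6
docker push <your docker registry URL>/<path>/mnist-example:rhel8.6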
Create a pod-mnist.yaml file.
apiVersion: v1
kind: Pod
metadata:
  name: habanalabs-gaudi-demo2
spec:
  containers:
    - name: habana-ai-base-container
      image: <path to the image pushed in the previous step>
      workingDir: /
      command: ["python3"]
      args: ["/Model-References/TensorFlow/examples/hello_world/example.py"]
      resources:
        limits:
          habana.ai/gaudi: 1
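If your registry requires authentication, the pod also needs a pull secret. The snippet below is a minimal sketch of the standard Kubernetes imagePullSecrets field, assuming a secret named regcred has already been created in the habana-system namespace; it is not part of the original manifest.
spec:
  imagePullSecrets:
    - name: regcred   # hypothetical pull secret name
  containers:
    # ... unchanged from the pod spec above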
Apply the pod-mnist.yaml file.
oc apply -f pod-mnist.yaml -n habana-system
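As with the first example, you can check that the pod has reached the Completed state before reading the logs.
oc get pod habanalabs-gaudi-demo2 -n habana-system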
Check the results.
oc logs habanalabs-gaudi-demo2 -n habana-system
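The hello_world script is a small MNIST training example (hence this section's name), so the log should show the script's training output rather than device status. When finished, the pod can be deleted with a standard cleanup command; this step is not part of the original example.
oc delete pod habanalabs-gaudi-demo2 -n habana-system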