Mirror of https://github.com/openai/whisper.git, synced 2025-11-24 14:35:57 +00:00
Add Dockerfile_hpu + README.md section of hpu usage

parent c479ff3614
commit 6e66088bd0

Dockerfile_hpu (new file, 35 lines)
@@ -0,0 +1,35 @@
```dockerfile
# Use the official Gaudi Docker image with PyTorch
FROM vault.habana.ai/gaudi-docker/1.17.0/ubuntu22.04/habanalabs/pytorch-installer-2.3.1:latest

# Set environment variables for Habana
ENV HABANA_VISIBLE_DEVICES=all
ENV OMPI_MCA_btl_vader_single_copy_mechanism=none
ENV PT_HPU_LAZY_ACC_PAR_MODE=0
ENV PT_HPU_ENABLE_LAZY_COLLECTIVES=1

# Install essential Linux packages and ffmpeg
ENV DEBIAN_FRONTEND="noninteractive" TZ=Etc/UTC
RUN apt-get update && apt-get install -y \
    tzdata \
    bash-completion \
    python3-pip \
    openssh-server \
    vim \
    git \
    iputils-ping \
    net-tools \
    protobuf-compiler \
    curl \
    bc \
    gawk \
    tmux \
    ffmpeg \
    && rm -rf /var/lib/apt/lists/*

# Copy the Whisper repo contents
ADD whisper /root/whisper
WORKDIR /root/whisper

# Install Python packages from Whisper requirements
RUN pip install --upgrade pip \
    && pip install -r requirements.txt \
```
README.md (30 lines added)
@@ -140,6 +140,36 @@ result = whisper.decode(model, mel, options)
print(result.text)
```

## Intel® Gaudi® HPU usage

### Build the Docker Image

```bash
docker build -t whisper_hpu:latest -f Dockerfile_hpu .
```

### Run the Container

```bash
docker run -it --runtime=habana \
  -e HABANA_VISIBLE_DEVICES=all \
  -e OMPI_MCA_btl_vader_single_copy_mechanism=none \
  --cap-add=sys_nice \
  --net=host \
  --ipc=host \
  -v /path/to/your/whisper:/workspace/whisper \
  whisper_hpu:latest \
  /bin/bash
```

Make sure to replace `/path/to/your/whisper` with the path to the Whisper repository on your local machine.
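For launcher scripts, the same invocation can be assembled programmatically. This is an illustrative sketch: the helper name `habana_run_cmd` is hypothetical, and the default mount path is the README's placeholder, not a fixed location.

```python
import subprocess

def habana_run_cmd(whisper_dir: str = "/path/to/your/whisper") -> list[str]:
    # Flags mirror the `docker run` example above; whisper_dir should point
    # at a local Whisper checkout (the README's placeholder by default).
    return [
        "docker", "run", "-it", "--runtime=habana",
        "-e", "HABANA_VISIBLE_DEVICES=all",
        "-e", "OMPI_MCA_btl_vader_single_copy_mechanism=none",
        "--cap-add=sys_nice",
        "--net=host",
        "--ipc=host",
        "-v", f"{whisper_dir}:/workspace/whisper",
        "whisper_hpu:latest",
        "/bin/bash",
    ]

# subprocess.run(habana_run_cmd())  # uncomment on a Gaudi host with Docker installed
```

Keeping the flags in a list avoids shell-quoting issues when the mount path contains spaces.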

### Command-line usage with Intel® Gaudi® HPU

To run the `whisper` command on an Intel® Gaudi® HPU, use the `--device hpu` option:

```bash
whisper audio.flac audio.mp3 audio.wav --model turbo --device hpu
```
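A Python script inside the container can pick the device the same way. A minimal, dependency-free sketch, assuming Habana's PyTorch bridge ships as the importable package `habana_frameworks`; it falls back to CPU when that package is absent:

```python
import importlib.util

def pick_device() -> str:
    # Use "hpu" only when Habana's PyTorch bridge is importable; otherwise CPU.
    # (CUDA probing via torch.cuda.is_available() is omitted to keep this
    # sketch free of a torch dependency.)
    if importlib.util.find_spec("habana_frameworks") is not None:
        return "hpu"
    return "cpu"

device = pick_device()
# model = whisper.load_model("turbo", device=device)  # as in the Python usage section above
```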

## More examples

Please use the [🙌 Show and tell](https://github.com/openai/whisper/discussions/categories/show-and-tell) category in Discussions for sharing more example usages of Whisper and third-party extensions such as web demos, integrations with other tools, ports for different platforms, etc.