Commit 2c3511fb authored by Thomas Lips's avatar Thomas Lips

update readme

This emulates the setup above but starts from the GPULab base image to be compatible.
- run the container using `docker run -it <container-tag> bash`
### VSCode Local Development Containers
- go to the docker registry and copy the desired image URL (use the highest version tag or the `:latest` tag)
- configure the `.devcontainer.json` to pull that docker image
- `pip install` your local package(s); creating a script for this is strongly recommended for convenience.
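The install step above can be scripted so it is reproducible across container rebuilds. A minimal sketch, assuming a hypothetical package checkout at `/workspace/my-package` (adjust the path to your own repo layout):

```shell
#!/usr/bin/env bash
# Sketch of a convenience install script; the package path is an assumption.
set -euo pipefail

# Editable installs (-e) let local code changes take effect without reinstalling.
pip install -e /workspace/my-package
```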
To enable GPU access on local devices, install the NVIDIA Container Toolkit and pass the `--gpus all` argument to `docker run` so the container can discover the GPUs. The toolkit uses the host's on-device drivers while the CUDA toolkit is installed inside the container. To validate that the GPUs are enabled, run `nvidia-smi` inside the container.
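The validation step can be run as a one-off container command; a sketch, where `<container-tag>` is whichever image you pulled and the NVIDIA Container Toolkit is assumed to be installed on the host:

```shell
# Run nvidia-smi inside a throwaway container (--rm) to confirm the GPUs are
# visible; if passthrough works, this prints the same table as on the host.
docker run --rm --gpus all <container-tag> nvidia-smi
```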
### Paard
- pull the docker container
- run the container (with the appropriate mounts): `docker run --gpus all --name <project> -it -v /fast_storage/:/fast_storage -v /storage/:/storage -v /home/tlips/:/home/user -d --shm-size 16G <container-tag>`
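Since the run command above starts the container detached (`-d`), you attach to it afterwards; a sketch, using the `--name` given at run time:

```shell
# Open an interactive shell in the already-running container named <project>.
docker exec -it <project> bash
```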
### IDLab Jobs:
- use the website or the CLI to create a job using the base-dl container.
- use the CLI to ssh into the job and run any necessary command
## IDLab JupyterHub:
- use `` as custom docker image.