Commit d41f6c27 authored by Thomas Lips
parent 82dd9a15
Pipeline #62649 passed with stages in 13 minutes and 43 seconds
@@ -9,9 +9,9 @@ stages:
DOCKER_HOST: tcp://docker:2375/
# gpulab
# Docker Resources
Docker resources for local development, AIRO local machines and GPULab.
## Available Docker images
### `base-dl`
This is a CUDA-enabled PyTorch image for deep learning.
It installs a conda environment and creates a non-root user (called "user"). This container should be used for all use cases except the JupyterLab interface of GPULab.
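As a quick sanity check of the image contents (the local tag `base-dl:latest` is an assumption; use whatever tag you built or pulled):

```shell
# The non-root account should report as "user"
docker run --rm base-dl:latest whoami

# conda should be on the PATH inside the container
docker run --rm base-dl:latest conda --version
```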
### `jupyterlab-pytorch`
A PyTorch ecosystem image for use with the JupyterHub on GPULab. This emulates the `base-dl` setup but starts from the GPULab base image to stay compatible; the non-root user is called "jovyan" instead of "user".
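To try the JupyterLab interface locally, a sketch (the local tag `jupyterlab-pytorch:test` and the Dockerfile path are assumptions):

```shell
# Build the image (assumes docker/jupyterlab-pytorch.Dockerfile)
docker build -t jupyterlab-pytorch:test -f docker/jupyterlab-pytorch.Dockerfile docker

# Expose the notebook server and enable the Lab interface
docker run -p 8888:8888 -e JUPYTER_ENABLE_LAB=yes -it jupyterlab-pytorch:test
```

Follow the URL printed in the CLI to open the JupyterLab instance and test the environment.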
## Usage
### Local builds
- build using `docker build -t <container-tag> -f docker/<container>.Dockerfile docker`
- run the container using `docker run -it <container-tag> bash`
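For example, for the `base-dl` image (the tag `base-dl:test` is illustrative; the Dockerfile is assumed to live in the `docker/` folder):

```shell
# Build from the docker/ folder as build context
docker build -t base-dl:test -f docker/base-dl.Dockerfile docker

# Open an interactive shell in the container
docker run -it base-dl:test bash
```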
### VSCode Local Development Containers
- go to the docker registry and copy the desired image url (use the highest tag or the "latest" tag)
- configure the `.devcontainer.json` to pull that docker image
- pip install your local package(s). Creating a script for this is strongly recommended for convenience.
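A minimal `.devcontainer.json` along these lines (the image URL is a placeholder to fill in from the registry; the `postCreateCommand` installing local packages and the GPU `runArgs` are assumptions to adapt to your setup):

```json
{
  "name": "gpulab-dev",
  "image": "<registry-url>/<image>:latest",
  "runArgs": ["--gpus", "all"],
  "postCreateCommand": "pip install -e ."
}
```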
To enable GPU support on local devices: install the nvidia docker toolkit and pass the `--gpus all` argument to `docker run` to allow discovering the GPUs. The nvidia toolkit uses the on-device drivers, while the CUDA toolkit is installed in the container. To validate that the GPUs are enabled, run the `nvidia-smi` command in the container.
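A quick GPU check (the image name is a placeholder):

```shell
# Lists the GPUs visible inside the container; fails if the
# nvidia toolkit is missing or --gpus was not passed
docker run --rm --gpus all <image> nvidia-smi
```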
### Paard
- pull the desired docker image from the registry
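Sketch of the pull (the registry path is a placeholder; copy the real URL from the docker registry):

```shell
docker pull <registry-url>/<image>:latest
```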
## IDLab JupyterHub
- use `` as custom docker image.
### Known Issues
- sometimes a 403 error appears in JupyterLab when opening the `python38` kernel; this is caused by Chrome caching. The solution is to do a hard refresh (CTRL + F5).
## Documentation & inspiration
- gitlab build: