
The Docker container stops when using `python3 scripts/launch_triton_server.py --world_size 1 --model_repo=model_repo/` as the starting command in the Docker Compose YAML file #580

Open
Aquasar11 opened this issue Aug 21, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@Aquasar11

System Info

  • CPU EPYC 7H12 (32 core)
  • GPU NVIDIA A100-SXM4-80GB

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

  1. Pull the official Triton image.
  2. Clone the tensorrtllm_backend repository.
  3. Move your model into the model repo and create the config files.
  4. Start the container with a docker-compose.yaml file like the one below:
```yaml
services:
  tritonserver:
    image: triton_trt_llm
    network_mode: "host"
    container_name: triton
    shm_size: '1gb'
    volumes:
      - /data:/workspace
    working_dir: /workspace
    restart: always
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: [gpu]
    command: bash -c "python3 ./tensorrtllm_backend/scripts/launch_triton_server.py --world_size=1 --model_repo=tensorrtllm_backend/all_models/inflight_batcher_llm/"
```
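As a stopgap while reproducing, a common workaround (my suggestion, not part of the original report) is to keep the container's main process alive by appending a blocking command after the launcher script, for example:

```yaml
    command: >
      bash -c "python3 ./tensorrtllm_backend/scripts/launch_triton_server.py
      --world_size=1
      --model_repo=tensorrtllm_backend/all_models/inflight_batcher_llm/
      && tail -f /dev/null"
```

This only masks the symptom; the container stays up even if the Triton server later dies, so the fix below in the launcher script is preferable.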

Expected behavior

After running the docker compose up command, I expect the container to start the Triton server, wait for it, and remain running unless an error occurs.

Actual behavior

The container starts, runs the Python script, and exits immediately without waiting for the Triton server and TensorRT-LLM backend.

Additional notes

This bug can be fixed by adding a single call after the last line of scripts/launch_triton_server.py, like this:

```python
child = subprocess.Popen(cmd, env=env)
child.communicate()  # block until the Triton server process exits
```
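To illustrate why the extra call matters: `subprocess.Popen` returns as soon as the child is forked, so the launcher script (the container's PID 1) exits immediately and Docker stops the container. `communicate()` blocks until the child terminates. A minimal, self-contained sketch (using `sleep` as a stand-in for the Triton server process):

```python
import subprocess

# Popen returns immediately after forking the child; if the script ended
# here, a container running it as its main process would stop right away.
child = subprocess.Popen(["sleep", "1"])

# communicate() waits for the child to exit (and drains its pipes, if any),
# keeping the launcher -- and therefore the container -- alive.
child.communicate()
print("child exit code:", child.returncode)
```

`child.wait()` would have the same effect here; `communicate()` is the safer default when the child's stdout/stderr are piped, since it also prevents pipe-buffer deadlocks.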
@Aquasar11
Author

I've suggested this PR to fix the issue.
