redteams.ai

# inference-server

2 articles tagged with “inference-server”

## Attacking AI Deployments

Security assessment of AI deployment infrastructure, including container escapes, GPU side channels, inference server vulnerabilities, and resource exhaustion attacks.

Tags: deployment, containers, gpu, inference-server, infrastructure
Advanced

## Lab: Inference Server Exploitation

Attack vLLM, TGI, and Triton inference servers to discover information-disclosure vulnerabilities, denial-of-service vectors, and configuration weaknesses in model-serving infrastructure.

Tags: lab, inference-server, infrastructure, vllm, triton
Advanced
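
A first reconnaissance step in a lab like this is fingerprinting which serving stack is running by probing its unauthenticated metadata and health endpoints. The sketch below is a minimal, stdlib-only probe; the paths reflect the documented HTTP APIs of these servers (vLLM exposes an OpenAI-compatible `/v1/models`, Triton follows the KServe v2 protocol, TGI exposes `/info` and `/health`), but the base URL and port are assumptions for illustration, not taken from the lab itself.

```python
import urllib.request
import urllib.error

# Common unauthenticated metadata/health endpoints on popular inference
# servers. Paths follow each server's documented HTTP API; which ones
# respond tells you what is running and what it discloses.
PROBE_PATHS = [
    "/v1/models",        # vLLM / OpenAI-compatible API: lists served model IDs
    "/v2",               # Triton (KServe v2): server name, version, extensions
    "/v2/health/ready",  # Triton: readiness check
    "/health",           # TGI: health check
    "/info",             # TGI: model ID, dtype, version info
]

def probe_targets(base_url: str) -> list[str]:
    """Build the full list of URLs to probe for a given server base URL."""
    return [base_url.rstrip("/") + path for path in PROBE_PATHS]

def probe(base_url: str, timeout: float = 3.0) -> dict[str, str]:
    """GET each endpoint and record the HTTP status or error per URL."""
    results: dict[str, str] = {}
    for url in probe_targets(base_url):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[url] = f"HTTP {resp.status}"
        except urllib.error.HTTPError as exc:
            # The server answered; even a 404/405 confirms something is listening.
            results[url] = f"HTTP {exc.code}"
        except (urllib.error.URLError, OSError):
            results[url] = "unreachable"
    return results
```

Calling `probe("http://localhost:8000")` (port 8000 is an assumed default) returns a status per endpoint; a reachable `/v1/models` but dead `/v2` suggests vLLM rather than Triton, which narrows the follow-on attack surface.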