# tee
3 articles tagged with “tee”
Confidential Computing for LLM Inference
Using trusted execution environments and confidential computing for secure LLM inference and data protection.
defense, confidential-computing, tee
AI Workload Isolation
Isolation techniques for AI workloads using VMs, containers, and trusted execution environments (TEEs).
infrastructure, isolation, containers, tee, confidential-computing
Trusted Execution Environments for AI Workloads
Security analysis of Intel SGX, AMD SEV, and ARM TrustZone for protecting AI model inference and training in untrusted environments.
infrastructure, confidential-computing, tee, hardware-security, side-channels