# local-models
標記為「local-models」的 8 篇文章
Lab: Setting Up Ollama for Local LLM Testing
Install and configure Ollama for local LLM red teaming, download models, perform basic interactions, and compare safety behavior between local and API-hosted models.
Ollama Security Testing Walkthrough
Complete walkthrough for security testing locally-hosted models with Ollama: comparing safety across models, testing system prompt extraction, API security assessment, and Modelfile configuration hardening.
Local Model Analysis and Testing with Ollama
Walkthrough for using Ollama to run, analyze, and security-test local LLMs, covering model configuration, safety boundary testing, system prompt extraction, fine-tuning vulnerability assessment, and building a local red team lab.
Ollama for Local Red Teaming
Using Ollama as a local red teaming environment: model selection, running uncensored models, API-based testing, comparing safety across model families, and building a cost-free testing lab.
實驗室: Setting Up Ollama for Local LLM Testing
Install and configure Ollama for local LLM red teaming, download models, perform basic interactions, and compare safety behavior between local and API-hosted models.
Ollama 安全 Testing 導覽
Complete walkthrough for security testing locally-hosted models with Ollama: comparing safety across models, testing system prompt extraction, API security assessment, and 模型file configuration hardening.
Local 模型 Analysis and Testing with Ollama
導覽 for using Ollama to run, analyze, and security-test local LLMs, covering model configuration, safety boundary testing, system prompt extraction, fine-tuning vulnerability assessment, and building a local red team lab.
Ollama for Local 紅隊演練
Using Ollama as a local red teaming environment: model selection, running uncensored models, API-based testing, comparing safety across model families, and building a cost-free testing lab.