Running Local Models with Codex CLI

By Joel Dare - Written September 2, 2025

Install Codex CLI

I installed Codex CLI using npm:

npm install -g @openai/codex

As an alternative, you can use Homebrew:

brew install codex

Install Ollama

I installed Ollama from the Ollama website. I grabbed the .dmg file, opened it, and dragged the app into my Applications folder.
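Before pointing Codex at a model, it can help to pull it with Ollama ahead of time so Codex doesn't sit waiting on a download. A minimal sketch, assuming Ollama is installed and the qwen3:latest model used later in this post:

```shell
# Download the model to your local Ollama library
# (swap qwen3:latest for any model Ollama offers)
ollama pull qwen3:latest

# Confirm the model is available locally
ollama list
```

You can also sanity-check the model interactively with ollama run qwen3:latest before handing it to Codex.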

Run Codex Using Ollama

Now that you have both Ollama and Codex installed, you can run Codex with the --oss flag and point it at your model of choice.

codex --oss --model qwen3:latest
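If you run local models often, you may prefer setting this up in Codex's config file rather than passing flags every time. Codex CLI reads ~/.codex/config.toml. The sketch below is an assumption based on that config format and Ollama's default port (11434); names and keys may differ between Codex versions, so check the Codex docs for your release:

```toml
# ~/.codex/config.toml (hypothetical example)
model = "qwen3:latest"
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
```

With something like this in place, a bare codex invocation would use the local model by default.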