
Running Local Models with Codex CLI

Install Codex CLI

I installed codex using npm:

npm install -g @openai/codex

As an alternative, you can use Homebrew:

brew install codex
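
Either way, it doesn't hurt to confirm the install worked. Assuming the CLI follows the usual convention of printing its version, this is a quick sanity check:

codex --version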

Install Ollama

I installed it from the Ollama website. I grabbed the .dmg file, opened it, and dragged the app to Applications.
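
If you pull the model ahead of time, your first Codex session won't be stuck waiting on a download. The qwen3:latest tag below matches the model I point Codex at later; ollama list just confirms it's available locally:

ollama pull qwen3:latest
ollama list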

Run Codex Using Ollama

Now that you have both Ollama and Codex installed, you can run codex with the --oss flag and point it at your model of choice.

codex --oss --model qwen3:latest
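
The value you pass to --model is just an Ollama model tag, so if qwen3:latest is too heavy for your machine you can point Codex at a different size. The tag below is only an example; run ollama list to see what you actually have pulled:

codex --oss --model qwen3:4b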


