Run agents fully local. brocco auto-detects localhost:11434.
Run Ollama locally. brocco pings localhost:11434/v1/models when /app loads and surfaces a "Use local Ollama" toggle if it gets a response.
# Default
ollama serve
ollama pull llama3.3:70b
# brocco picks it up automatically
brocco runs your business while you sleep. 100 agent runs free, every month, forever. No card required.