```shell
git clone https://github.com/diyism/cc-nim
cd cc-nim
cp .env.example .env
# set your NVIDIA key
nano .env
./cc-nim.sh &
# Test a streaming request; if many lines stream back quickly and continuously, the proxy started successfully:
curl -N -X POST http://localhost:3001/v1/messages -H "Content-Type: application/json" -H "x-api-key: test" -d '{
  "model": "z-ai/glm4.7",
  "max_tokens": 256,
  "messages": [{"role": "user", "content": "Say hello in one sentence"}],
  "stream": true
}'
```
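The same check can be scripted. A minimal sketch using only the Python standard library; it assumes the proxy is listening on localhost:3001 as started by cc-nim.sh, and that events arrive as `data: {...}` SSE lines (the Anthropic streaming convention):

```python
# Minimal SSE reader for the proxy's /v1/messages endpoint (sketch).
# Assumes the proxy from the quickstart above is running on localhost:3001.
import json
import urllib.request


def parse_sse_event(line: str):
    """Return the decoded JSON payload of one 'data: ...' SSE line, or None."""
    line = line.strip()
    if not line.startswith("data: "):
        return None
    return json.loads(line[len("data: "):])


def stream_messages(prompt: str, url: str = "http://localhost:3001/v1/messages"):
    body = json.dumps({
        "model": "z-ai/glm4.7",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }).encode()
    req = urllib.request.Request(url, data=body, headers={
        "Content-Type": "application/json",
        "x-api-key": "test",
    })
    with urllib.request.urlopen(req) as resp:  # iterates the response line by line
        for raw in resp:
            event = parse_sse_event(raw.decode())
            if event is not None:
                yield event
```

Each yielded event is a parsed Anthropic streaming event dict (`message_start`, `content_block_delta`, and so on).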
Use Claude Code CLI for free with NVIDIA NIM's free unlimited 40 reqs/min API. This lightweight proxy converts Claude Code's Anthropic API requests to NVIDIA NIM format. Includes Telegram bot integration for remote control from your phone!
- Get a new API key from build.nvidia.com/settings/api-keys
- Install claude-code
- Install uv
```shell
git clone https://github.com/Alishahryar1/cc-nim.git
cd cc-nim
cp .env.example .env
```

Edit `.env`:
```shell
NVIDIA_NIM_API_KEY=nvapi-your-key-here
MODEL=moonshotai/kimi-k2-thinking
```

Terminal 1 - Start the proxy:

```shell
uv run uvicorn server:app --host 0.0.0.0 --port 8082
```

Terminal 2 - Run Claude Code:

```shell
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

That's it! Claude Code now uses NVIDIA NIM for free.
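Under the hood, the proxy's job is a format translation. A hedged sketch of the request side, assuming NVIDIA NIM accepts an OpenAI-compatible chat-completions payload; this is illustrative, not the proxy's actual code:

```python
# Sketch of the Anthropic -> NIM request translation the proxy performs.
# Anthropic-side field names follow the public Messages API; the NIM side
# is assumed to be OpenAI-compatible.
def anthropic_to_nim(anthropic_req: dict) -> dict:
    messages = []
    if "system" in anthropic_req:
        # Anthropic keeps the system prompt outside "messages";
        # OpenAI-style APIs expect it as the first message.
        messages.append({"role": "system", "content": anthropic_req["system"]})
    messages.extend(anthropic_req["messages"])
    return {
        "model": anthropic_req["model"],
        "messages": messages,
        "max_tokens": anthropic_req.get("max_tokens", 1024),
        "stream": anthropic_req.get("stream", False),
    }
```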
Control Claude Code remotely via Telegram! Send tasks from your phone and watch Claude work.
- Get a Bot Token:
  - Open Telegram and message @BotFather
  - Send `/newbot` and follow the prompts
  - Copy the HTTP API Token
- Add to `.env`:

```shell
TELEGRAM_BOT_TOKEN=123456789:ABCdefGHIjklMNOpqrSTUvwxYZ
ALLOWED_TELEGRAM_USER_ID=your_telegram_user_id
```

💡 To find your Telegram user ID, message @userinfobot on Telegram.
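If @userinfobot is unavailable, you can also ask your own bot: the official Bot API `getUpdates` method returns the numeric `from.id` of anyone who has messaged it. A small standard-library sketch (send your bot any message first):

```python
# Find Telegram user IDs via the official Bot API getUpdates method.
import json
import urllib.request


def extract_sender_ids(updates: dict) -> set:
    """Collect the numeric sender IDs from a getUpdates response."""
    return {u["message"]["from"]["id"]
            for u in updates.get("result", [])
            if "message" in u}


def sender_ids(token: str) -> set:
    url = f"https://api.telegram.org/bot{token}/getUpdates"
    with urllib.request.urlopen(url) as resp:
        return extract_sender_ids(json.load(resp))
```

Any ID printed by `sender_ids(TELEGRAM_BOT_TOKEN)` is a candidate for `ALLOWED_TELEGRAM_USER_ID`.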
- Configure the workspace (where Claude will operate):

```shell
CLAUDE_WORKSPACE=./agent_workspace
ALLOWED_DIR=C:/Users/yourname/projects
```

- Start the server:

```shell
uv run uvicorn server:app --host 0.0.0.0 --port 8082
```

- Usage:
  - Send `/start` to your bot
  - Send any text prompt to start a task
  - Send a message to yourself on Telegram with a task
  - Claude will respond with:
    - 💭 Thinking tokens (reasoning steps)
    - 🔧 Tool calls as they execute
    - ✅ Final result when complete
  - Send `/stop` to cancel a running task
See nvidia_nim_models.json for the full list of supported models.
Popular choices:
- `moonshotai/kimi-k2.5`
- `z-ai/glm4.7`
- `minimaxai/minimax-m2.1`
- `mistralai/devstral-2-123b-instruct-2512`
Browse all models at build.nvidia.com
To update nvidia_nim_models.json with the latest models from NVIDIA NIM, run the following command:

```shell
curl "https://integrate.api.nvidia.com/v1/models" > nvidia_nim_models.json
```

See .env.example for all supported parameters.
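A quick sanity check of the downloaded file. This sketch assumes the conventional OpenAI-style model listing (`{"data": [{"id": ...}, ...]}`) that `/v1/models` endpoints return; adjust if NIM's payload differs:

```python
# List the model identifiers in a /v1/models listing (sketch).
import json


def model_ids(listing: dict) -> list:
    """Return the sorted model identifiers from a /v1/models listing."""
    return sorted(m["id"] for m in listing.get("data", []))


# Usage:
#   with open("nvidia_nim_models.json") as f:
#       print("\n".join(model_ids(json.load(f))))
```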
To run the test suite, use the following command:

```shell
uv run pytest
```

Extend BaseProvider in providers/ to add support for other APIs:

```python
from providers.base import BaseProvider, ProviderConfig


class MyProvider(BaseProvider):
    async def complete(self, request):
        # Make the API call, return raw JSON
        pass

    async def stream_response(self, request, input_tokens=0):
        # Yield Anthropic SSE format events
        pass

    def convert_response(self, response_json, original_request):
        # Convert to Anthropic response format
        pass
```

Extend MessagingPlatform in messaging/ to add support for other platforms (Discord, Slack, etc.):
```python
from messaging.base import MessagingPlatform
from messaging.models import IncomingMessage


class MyPlatform(MessagingPlatform):
    async def start(self):
        # Initialize connection
        pass

    async def stop(self):
        # Cleanup
        pass

    async def queue_send_message(self, chat_id, text, **kwargs):
        # Send message to platform
        pass

    async def queue_edit_message(self, chat_id, message_id, text, **kwargs):
        # Edit existing message
        pass

    def on_message(self, handler):
        # Register callback for incoming messages
        # Handler receives an IncomingMessage object
        pass
```
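To make the provider hooks above more concrete, here is a hedged sketch of what a `convert_response()` body could look like when the upstream returns OpenAI-style chat completions (as NVIDIA NIM does). The Anthropic-side field names follow the public Messages API; the helper name and the simplifications (text-only content, no tool calls) are this sketch's, not the repo's:

```python
# Sketch: map an OpenAI-style chat completion to an Anthropic-style message.
def openai_to_anthropic(response_json: dict, original_request: dict) -> dict:
    choice = response_json["choices"][0]
    usage = response_json.get("usage", {})
    # OpenAI finish reasons -> Anthropic stop reasons (text-only subset).
    stop_map = {"stop": "end_turn", "length": "max_tokens"}
    return {
        "id": response_json.get("id", "msg_0"),
        "type": "message",
        "role": "assistant",
        "model": original_request["model"],
        "content": [{"type": "text", "text": choice["message"]["content"]}],
        "stop_reason": stop_map.get(choice.get("finish_reason"), "end_turn"),
        "usage": {
            "input_tokens": usage.get("prompt_tokens", 0),
            "output_tokens": usage.get("completion_tokens", 0),
        },
    }
```

A real provider would also translate tool calls and streaming deltas, but the shape of the mapping is the same.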