
General Information

Posted: Sun Aug 10, 2025 12:28 pm
by JHPJHP
Ollama REST API

REFERENCES:
Ollama User Guide
• Ollama Desktop app
• Command Line
• Windows Subsystem for Linux (WSL)
• Ollama Python Client
• Check Your GPU
• Hardware Performance Benchmarks
• Quantization in LLMs
• Mixture-of-Experts (MoE) in LLMs
Ollama WSL Setup Guide
• Step 1: Determine How Ollama is Running in WSL
• Step 2: Stop Ollama if Running as a systemd Service
• Step 3: Find Your WSL IP Address
• Step 4: Start Ollama Bound to All Interfaces (Temporary)
• Step 5: Test Connectivity
• Step 6: (Skip if Using systemd service)
• Step 7: Make Binding Permanent — Ollama Running as a systemd Service
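The setup steps above can be sketched as a shell walkthrough. This is a minimal sketch, assuming Ollama's defaults: the systemd unit name `ollama`, the default port 11434, and the `OLLAMA_HOST` environment variable; adjust to match your install.

```shell
# Step 1: determine how Ollama is running inside WSL
systemctl is-active ollama 2>/dev/null   # "active" means it runs as a systemd service
pgrep -a ollama                          # shows a manually started "ollama serve" process

# Step 2: stop the systemd service before rebinding manually
sudo systemctl stop ollama

# Step 3: find the WSL IP address (reachable from the Windows side)
hostname -I | awk '{print $1}'

# Step 4: start Ollama bound to all interfaces (temporary; lasts for this shell only)
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Step 5: test connectivity from Windows (replace <wsl-ip> with the Step 3 output)
curl http://<wsl-ip>:11434/api/tags

# Step 7: make the binding permanent when Ollama runs as a systemd service
sudo systemctl edit ollama
#   add in the drop-in editor:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl daemon-reload && sudo systemctl restart ollama
```

Step 6 is skipped here, matching the guide's note that it does not apply when Ollama runs as a systemd service.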
Ollama WSL Log Guide
• Step 1: View Ollama Logs in Real-Time
• Step 2: View Recent Ollama Logs
• Additional Useful Journalctl Commands
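The log guide's steps map onto standard `journalctl` invocations. This assumes the same `ollama` systemd unit name as above; the line counts and time windows are placeholders.

```shell
# Step 1: follow Ollama logs in real time (Ctrl+C to stop)
journalctl -u ollama -f

# Step 2: view the most recent log lines
journalctl -u ollama -n 100 --no-pager

# Additional useful journalctl commands
journalctl -u ollama --since "1 hour ago"   # logs from a recent time window
journalctl -u ollama -p err                 # error-priority entries only
journalctl -u ollama -b                     # everything since the last boot
```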

Connect_Ollama.pb:
• Sends a POST request with a user-defined model and prompt.
• Attempts to connect to Ollama via the Windows Subsystem for Linux (WSL).
• Alternatively, attempts to connect through a local Ollama service.
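As a reference for the request Connect_Ollama.pb builds, the same POST can be reproduced with curl. The model name and prompt below are placeholders for the user-defined values, and 11434 is Ollama's default port; substitute the WSL IP from the setup guide when Ollama runs inside WSL.

```shell
# POST a user-defined model and prompt to Ollama's generate endpoint;
# swap localhost for the WSL IP address if connecting into WSL
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}'
```

With `"stream": false` the endpoint returns one JSON object containing the full response, which is simpler to parse from a client than the default line-delimited streaming output.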