  1. LM Studio - Local AI on your computer

    Run local AI models like gpt-oss, Llama, Gemma, Qwen, and DeepSeek privately on your computer.

  2. Download LM Studio - Mac, Linux, Windows

    Discover, download, and run local LLMs with LM Studio for Mac, Linux, or Windows.

  3. Welcome to LM Studio Docs!

    Learn how to run Llama, DeepSeek, Qwen, Phi, and other LLMs locally with LM Studio.

  4. Get started with LM Studio | LM Studio Docs

    Download and run Large Language Models like Qwen, Mistral, Gemma, or gpt-oss in LM Studio.

  5. Tool Use | LM Studio Docs

    LM Studio parses the text output from the model into an OpenAI-compliant chat.completion response object. If the model was given access to tools, LM Studio will attempt to parse the …

  6. Model Catalog - LM Studio

    FunctionGemma is a lightweight, open model from Google, built as a foundation for creating your own specialized function calling models.

  7. LM Studio Developer Docs

    Build with LM Studio's local APIs and SDKs — TypeScript, Python, REST, and OpenAI‑compatible endpoints.

  8. lmstudio-python (Python SDK) | LM Studio Docs

    Getting started with LM Studio's Python SDK. Read more about lms get in LM Studio's CLI here. Interactive Convenience, Deterministic Resource Management, or Structured Concurrency? …

  9. System Requirements | LM Studio Docs

    Windows: LM Studio is supported on both x64 and ARM (Snapdragon X Elite) based systems. CPU: AVX2 instruction set support is required (for x64). RAM: LLMs can consume a lot of …

  10. Offline Operation | LM Studio Docs

    LM Studio can operate entirely offline; just make sure to get some model files first.