🤖 Running AI Models Locally and Remotely
This section explains Jan's dual capability to run AI models both locally, for privacy, and remotely via APIs. It covers how to choose between local and remote execution, the benefits of each approach, and detailed instructions for setting up local models such as Llama or Mistral and for connecting to remote APIs such as ChatGPT or Claude.
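As a rough illustration of the local-versus-remote choice, the sketch below switches a single OpenAI-compatible client between a local endpoint and a hosted one. The local URL (http://localhost:1337/v1), the placeholder API key, and the model IDs are assumptions for illustration only, not confirmed defaults; check your own Jan settings for the actual server address and installed models.

```python
# A minimal sketch, assuming Jan exposes an OpenAI-compatible local server.
# The base_url, api_key, and model names below are placeholders, not verified defaults.
import os

from openai import OpenAI  # pip install openai

USE_LOCAL = True  # flip to False to send requests to a remote provider instead

if USE_LOCAL:
    # Local execution: prompts and responses stay on the machine.
    client = OpenAI(base_url="http://localhost:1337/v1", api_key="not-needed")
    model = "mistral-7b-instruct"  # hypothetical local model ID
else:
    # Remote execution: the same client talks to a hosted API with a real key.
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    model = "gpt-4o-mini"  # hypothetical remote model ID

response = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Summarize the benefits of local inference."}],
)
print(response.choices[0].message.content)
```

Because both paths use the same client interface, an application can switch between local privacy and remote capability by changing only the endpoint and model, which is the trade-off the rest of this section walks through.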