Duration: 00:37:16
KulenDayz 2024 - How to Run an LLM Locally

Marko Lohert

There are multiple benefits to running an LLM on a local machine: our sensitive data is not sent to a third party, performance is consistent, and there are no usage fees to pay. In a demo we’ll see how to run an LLM on a local machine and how to call that LLM from the application we are building. We can also chat with an LLM that is running locally. There is a growing list of LLMs to choose from: Llama by Meta, Google’s Gemma, Mistral, etc. We’ll see in a demo how to run LLMs using tools like LM Studio and Jan. We can also locally run an SLM (Small Language Model) that is targeted at a specific business domain and has a faster response time than an LLM. We’ll learn about Phi-3, a popular SLM from Microsoft.
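As a rough illustration of calling a locally running LLM from an application, the sketch below sends a chat request over an OpenAI-compatible HTTP API, which tools such as LM Studio can expose locally. The base URL (LM Studio's default local server address) and the model name `phi-3` are assumptions; adjust them to match your own setup.

```python
# Minimal sketch: talk to a local LLM via an OpenAI-compatible endpoint.
# The URL and model name are assumptions -- change them for your setup.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # assumed LM Studio default


def build_chat_request(prompt: str, model: str = "phi-3") -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask_local_llm(prompt: str) -> str:
    """POST the prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_local_llm("Summarize the benefits of running an LLM locally."))
```

Because the server speaks the same wire format as the OpenAI API, the same application code can later be pointed at a hosted endpoint by changing only the base URL.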
Published: 07.10.2024
Category: Education
VoD packages: KulenDayz 2024.