Below you will find all the pages tagged with #Wsl
Running LLMs locally with Ollama
I’m sure you already have experience interacting with LLMs through online services like OpenAI’s ChatGPT. You might even have tried multiple services and various models. And while you have been utilizing those services, you might have been wondering: can I run large language models on my own computer?
Let's give Azure Data Explorer Kusto emulator a spin in WSL
The Kusto emulator was released back in September 2022, encapsulating the Kusto Query Engine and making it available on local compute in a Docker container. Until June 2023, the emulator was only available as a Windows container.