The world of AI is no longer reserved for cloud providers: some AI models now run on your own machines, with surprisingly good results. But how far can you really go? In this session, we'll explore what you can achieve with solutions like Ollama and Stable Diffusion, right on your own hardware, whether that's a high-end laptop, a beefy desktop, or a small local server. We'll cut through the hype and focus on real-world capabilities, limitations, and practical considerations. What models can you run? How much performance do you need? Which use cases make sense? Where does local AI shine, and where does it fall short? Along the way, we'll dive into concrete examples of how we used Ollama for run.events, together with Semantic Kernel, Microsoft's framework for building AI-driven applications, to show how you can combine local models with powerful orchestration and build real, working solutions. If you're curious about the real possibilities of self-hosted AI, and about what's just wishful thinking, this is the session for you.
Creator, lead architect, and mastermind at run.events. An Information Technology professional with 35 years of experience, an Event Tech specialist, event organizer, and public speaker with over 400 sessions delivered in the past 25 years. A Microsoft Most Valuable Professional (MVP) for Microsoft 365 and Microsoft Azure, and a Microsoft Regional Director. Spiritus movens at CollabSummit, AI & CloudSummit, and BizAppsSummit. Firmly believes in leading by example and motivation. Lives beside the oldest vineyard in the Rhine valley. Prefers collecting stories to collecting possessions. Can make his wife and kids laugh anytime.