<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/00791498f1d84e4ba6d7476bd2e1442f&quot; frameborder=&quot;0&quot; width=&quot;1108&quot; height=&quot;831&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>831</height><width>1108</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>831</thumbnail_height><thumbnail_width>1108</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/00791498f1d84e4ba6d7476bd2e1442f-3120c50efa263701.gif</thumbnail_url><duration>235.489</duration><title>Setting Up OpenCode with LightLLM: A Quick Demo 🚀</title><description>In this quick demo, I walk you through setting up OpenCode with LightLLM, highlighting my local LightLLM proxy and the three Azure GPT models I&apos;ve exposed. After installing OpenCode, I show how to configure it at either the global or the project level, depending on your needs. I emphasize the importance of defining the provider configuration, particularly for LightLLM, and show how to set the base URL to point to localhost. I also walk through listing the models and assigning aliases so they are easy to reference in OpenCode. Follow along and enter your API key to see the configured models.</description></oembed>