<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/87c5d243cde642ff942783024ff037e3&quot; frameborder=&quot;0&quot; width=&quot;1670&quot; height=&quot;1252&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1252</height><width>1670</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1252</thumbnail_height><thumbnail_width>1670</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/87c5d243cde642ff942783024ff037e3-b802d89d5fd27a34.gif</thumbnail_url><duration>57.758</duration><title>LiteLLM CLI - Self Serve</title><description>In this video, I demonstrate how to use the LiteLLM proxy CLI tool to give developers access to LiteLLM. This is particularly useful for granting access to a large number of users, since they only need to log in through the CLI. I walk through the login process, which redirects to your IdP for authentication (I&apos;m using Microsoft in this case). Once authenticated, I show how to list the available models by running &quot;models list.&quot; I encourage you to try this out and streamline access for your team.</description></oembed>