{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/87c5d243cde642ff942783024ff037e3\" frameborder=\"0\" width=\"1670\" height=\"1252\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1252,"width":1670,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1252,"thumbnail_width":1670,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/87c5d243cde642ff942783024ff037e3-b802d89d5fd27a34.gif","duration":57.758,"title":"LiteLLM CLI - Self Serve","description":"In this video, I demonstrate how to use the LiteLLM proxy CLI tool to provide developers with access to LiteLLM. This is particularly useful for granting access to a large number of users, as they simply need to log in through the CLI. I walk you through the login process, which redirects to your identity provider (IdP) for authentication—I'm using Microsoft in this case. Once authenticated, I show how to list the available models by typing \"models list.\" I encourage you to try this out and streamline access for your team."}