<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/2c651abeb3394c38a218f2860084da0d&quot; frameborder=&quot;0&quot; width=&quot;2580&quot; height=&quot;1935&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1935</height><width>2580</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1935</thumbnail_height><thumbnail_width>2580</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/2c651abeb3394c38a218f2860084da0d-6c9d57c417e8eab3.gif</thumbnail_url><duration>584.1926</duration><title>Figma and Atlassian MCPs for Cursor</title><description>This third video in the series explores how MCP (Model Context Protocol) servers can enhance productivity when working with AI tools like Cursor. An MCP server allows the AI to automatically fetch relevant context—such as design specs from Figma or user stories from Jira—without the user having to describe everything manually in the prompt.

The video demonstrates a practical workflow: by connecting Cursor to Figma and Jira through MCP servers (with authentication via tokens or plugins), the AI gains direct access to the design and the acceptance criteria. With this setup, the AI can generate a login page using both sources as references. The process is smoother and faster, as the developer only needs to specify where the code should go.
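The kind of setup described is typically registered in Cursor's MCP configuration file (.cursor/mcp.json). A minimal sketch — the server names and URLs below are illustrative assumptions, not details taken from the video:

```json
{
  "mcpServers": {
    "figma": {
      "url": "http://127.0.0.1:3845/mcp"
    },
    "atlassian": {
      "url": "https://mcp.atlassian.com/v1/sse"
    }
  }
}
```

Once servers are registered here, Cursor's agent can call their tools (for example, fetching a Figma frame or a Jira issue) instead of the developer pasting that context into the prompt by hand.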

The results show that while the generated UI may not be perfect initially, it gets significantly closer to the original design with iterative feedback—such as comparing screenshots against the Figma layout and asking the AI to refine the UI.

In summary, using MCP servers reduces the effort needed to write detailed prompts by letting the AI pull context itself. This streamlines development, though the developer still needs to manage context size and fine-tune the results. The video highlights the powerful potential of MCP in AI-assisted coding workflows.