<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/a8eb6fa3bece4f89835c778011637a6c&quot; frameborder=&quot;0&quot; width=&quot;1512&quot; height=&quot;1134&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1134</height><width>1512</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1134</thumbnail_height><thumbnail_width>1512</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/a8eb6fa3bece4f89835c778011637a6c-00001.gif</thumbnail_url><duration>160.68</duration><title>Estuary Flow: Extracting Data from Supabase for analytics in Snowflake</title><description>In this video, I will guide you through the process of extracting data from Supabase and pushing it into Snowflake using Estuary Flow. We will start by creating a free account on estuary.dev and then proceed to create a new capture using the Supabase-specific connector. Estuary will show you the necessary configuration information and point you to the Supabase-specific docs. Additionally, you will need to provision a dedicated IPv4 address within Supabase and retrieve it for further setup. Finally, I will demonstrate how to save the tables as collections in Estuary Flow and push them to your own cloud storage, enabling unique capabilities such as creating transformations using SQL and TypeScript and pushing to multiple destinations from a single source.</description></oembed>