{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/babdb42fa6ef493b8bfefc547f9689f0\" frameborder=\"0\" width=\"1920\" height=\"1440\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1440,"width":1920,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1440,"thumbnail_width":1920,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/babdb42fa6ef493b8bfefc547f9689f0-23b6e9d990ff3e2c.jpg","duration":1104.272,"title":"Building an End-to-End ETL Pipeline with Databricks and Delta Lake 🚀","description":"In this video, I walk you through a hands-on data engineering project where I designed and implemented an end-to-end ETL pipeline for a fictional e-commerce company called Novacard. The project uses Databricks and Delta Lake, following the medallion architecture to transform raw transactional data into meaningful business insights. I detail the steps from ingesting the data to cleaning and enriching it, and finally aggregating it to compute total sales and average order value. I also highlight the use of Delta Lake features for data governance and optimization. I encourage you to consider how similar approaches could enhance your own data workflows."}