<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/9e3bfa9b4a7f4fc681284bc4002a91a8&quot; frameborder=&quot;0&quot; width=&quot;1280&quot; height=&quot;960&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>960</height><width>1280</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>960</thumbnail_height><thumbnail_width>1280</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/9e3bfa9b4a7f4fc681284bc4002a91a8-00001.gif</thumbnail_url><duration>117.33631011599998</duration><title>prodigy dupes</title><description>In this video, I thank Vincent and Ryan for their help reviewing my work. Despite implementing their suggestions, I am still encountering the same issue. I made some additional changes, such as adding &quot;overwrite true&quot; to the get stream call and fixing a typo in the JSON structure. However, I am puzzled that the machine runs twice in the logs. Moving on to Prodigy, there are now only 58 rows, and the task has become quite messy. I show the row counter and demonstrate how it fluctuates. Thank you for watching!</description></oembed>