<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/792ce1a6b73c4fc0bd0c5696a8bd6b60&quot; frameborder=&quot;0&quot; width=&quot;1316&quot; height=&quot;987&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>987</height><width>1316</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>987</thumbnail_height><thumbnail_width>1316</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/792ce1a6b73c4fc0bd0c5696a8bd6b60-f214bac154b7701d.gif</thumbnail_url><duration>746.628</duration><title>Exploring Memory Strategies in Conversational AI 🤖</title><description>In this video, I present my memory comparison project, where I explore three different memory strategies for a LangChain conversational agent: baseline, session memory, and long-term memory. The baseline strategy stores no information, session memory retains context within a single thread, and long-term memory allows cross-thread recall of key facts. I demonstrate these strategies through three scripted conversations, highlighting their distinct behaviors and storage mechanisms. I encourage you to consider how these memory strategies could be applied in real-world scenarios, such as customer support or personal AI assistants. Please take a look at the code and tests I implemented to see how these strategies function in practice.</description></oembed>