<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/b6e9684d0fb94bfab892e03910179a87&quot; frameborder=&quot;0&quot; width=&quot;1914&quot; height=&quot;1435&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1435</height><width>1914</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1435</thumbnail_height><thumbnail_width>1914</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/b6e9684d0fb94bfab892e03910179a87-3137f469a0beb197.gif</thumbnail_url><duration>364.465</duration><title>Understanding A-B Test Reporting for Better Marketing Outcomes 📊</title><description>In this video, I give an overview of how A-B test reporting works and how it differs from our conventional reporting methods. I walk through an example in which a text-to-buy message generated $126,000 per month in incremental revenue, a 14% lift. A-B testing measures incremental results by comparing two groups, which can surface insights that conventional reporting misses, such as the negative impact of overwhelming users with messages. I encourage you to consider how these insights can inform your own campaigns and to explore the A-B test reports available in our system.</description></oembed>