<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/0f624c26551b4273b33371358c3164d3&quot; frameborder=&quot;0&quot; width=&quot;1668&quot; height=&quot;1251&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1251</height><width>1668</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1251</thumbnail_height><thumbnail_width>1668</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/0f624c26551b4273b33371358c3164d3-6719ec66493fe0f9.gif</thumbnail_url><duration>316.24</duration><title>Bridging Large Language Models and Molecular Graphs for Enhanced Understanding 🌌</title><description>In this video, I discuss my recent work, accepted at ICLR 2026, on bridging large language models with molecular graphs. We address the challenges posed by differing molecular modalities and propose two key innovations: entropy-guided patching, which preserves molecular structure, and a DynamicQueryFormer that enhances the Q-Former framework. Our model demonstrates superior performance across various benchmarks, including multiple-choice question and property generation tasks. I encourage viewers to review our findings and consider their implications for future research in this area.</description></oembed>