<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/bd98db65474f4e828bd4db65d556159c&quot; frameborder=&quot;0&quot; width=&quot;1920&quot; height=&quot;1440&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1440</height><width>1920</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1440</thumbnail_height><thumbnail_width>1920</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/bd98db65474f4e828bd4db65d556159c-552ede8b46276907.gif</thumbnail_url><duration>799.718</duration><title>Querying Ollama with schema.org ontology and using it with Logseq</title><description>In this video, I demonstrate how to query an Ollama LLM and use the results in a Logseq graph, focusing on books and notable figures such as Feynman. I walk through the process of importing data, correcting inaccuracies, and enhancing our knowledge graph with additional properties. I also highlight the importance of the schema.org ontology in structuring our queries. Please take a moment to review the script and let me know if you have any questions or feedback!</description></oembed>