<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/02833f38e3a6461983a2d7c3a3a570b8&quot; frameborder=&quot;0&quot; width=&quot;2560&quot; height=&quot;1920&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>1920</height><width>2560</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>1920</thumbnail_height><thumbnail_width>2560</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/02833f38e3a6461983a2d7c3a3a570b8-00001.gif</thumbnail_url><duration>1454.03</duration><title>Langchain OpenSearch - demos-n-examples - 28 April 2023</title><description>This code uses OpenAI GPT, LangChain, a Redis cache, OpenSearch, and Unstructured to fetch content from URLs, sitemaps, PDFs, PowerPoint files, Notion docs, and images, create embeddings/vectors from it, and save them in a local OpenSearch database. The resulting collections can then be used with GPT to answer questions.
More details on our repo: https://github.com/voiceflow/demos-n-examples/tree/master/langchain-local-knowledgebase</description></oembed>