{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/02833f38e3a6461983a2d7c3a3a570b8\" frameborder=\"0\" width=\"2560\" height=\"1920\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1920,"width":2560,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1920,"thumbnail_width":2560,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/02833f38e3a6461983a2d7c3a3a570b8-00001.gif","duration":1454.03,"title":"Langchain OpenSearch - demos-n-examples - 28 April 2023","description":"This code uses OpenAI GPT, LangChain, Redis Cache, OpenSearch and Unstructured to fetch content from URLs, a sitemap, PDFs, PowerPoint files, Notion docs and images, create embeddings/vectors, and save them in a local OpenSearch database. The resulting collections can then be used with GPT to answer questions.\nMore details in our repo: https://github.com/voiceflow/demos-n-examples/tree/master/langchain-local-knowledgebase"}