{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/adedbd6566014bb499fd39f118f3c5c6\" frameborder=\"0\" width=\"1920\" height=\"1440\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":1440,"width":1920,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":1440,"thumbnail_width":1920,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/adedbd6566014bb499fd39f118f3c5c6-5976bd658323251d.gif","duration":299.016,"title":"Local ChatGPT for Famous People and Places","description":"Hi, I’m Dedemirc, and this Loom is my demo for Project 3. Our goal was to build a ChatGPT style assistant that answers questions about famous people and famous places, with everything running locally and no external APIs. I ingest 20 people and 20 places from Wikipedia, chunk them into overlapping word based chunks of about 500 words, embed them with a local sentence transformer, and store everything in SQLite and Chroma. At question time, I embed the query, do a basic person or place classification, retrieve relevant chunks, and generate a grounded answer using a local Llama 3.2 model through Ollama. The main trade offs are slower local responses and no chat memory, and I suggested improving classifier quality, adding streaming, and implementing chat history. There was no specific action requested from viewers."}