{"type":"video","version":"1.0","html":"<iframe src=\"https://www.loom.com/embed/169fb8acd08b4397af336bda1d1298a6\" frameborder=\"0\" width=\"1152\" height=\"864\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>","height":864,"width":1152,"provider_name":"Loom","provider_url":"https://www.loom.com","thumbnail_height":864,"thumbnail_width":1152,"thumbnail_url":"https://cdn.loom.com/sessions/thumbnails/169fb8acd08b4397af336bda1d1298a6-df0ac52fdd764443.gif","duration":353.066,"title":"X2X Project Presentation","description":"Hi everyone, I'm Youssef Abid from Team Paranoid Android at Inset. Today, we're presenting our project, X2X, developed for Orange's hackathon with HexaBot. We've created helpers and plugins that turn any-modality input into any-modality output, covering text, image, and MP3 audio translations. We used APIs such as Grok, Whisper, Llama 3.2, and ElevenLabs for the various functionalities. No action requested."}