<?xml version="1.0" encoding="UTF-8"?><oembed><type>video</type><version>1.0</version><html>&lt;iframe src=&quot;https://www.loom.com/embed/039aa0f549ea4b01a203a61cf1bb1fb9&quot; frameborder=&quot;0&quot; width=&quot;1114&quot; height=&quot;835&quot; webkitallowfullscreen mozallowfullscreen allowfullscreen&gt;&lt;/iframe&gt;</html><height>835</height><width>1114</width><provider_name>Loom</provider_name><provider_url>https://www.loom.com</provider_url><thumbnail_height>835</thumbnail_height><thumbnail_width>1114</thumbnail_width><thumbnail_url>https://cdn.loom.com/sessions/thumbnails/039aa0f549ea4b01a203a61cf1bb1fb9-980a971fc0efd4a7.gif</thumbnail_url><duration>300.1711</duration><title>Perplexity ReSearcher</title><description>Purpose &amp; Impact: 
ReSearcher targets a $30 billion inefficiency: 40% of conference-accepted researchers cannot afford the $4,700 cost of attending, while companies spend $30,000+ per hire searching for exactly those candidates. 
We built a reasoning graph that autonomously matches peer-reviewed papers to job descriptions, cutting recruitment costs by 85% while funding conference access. Deployed across major AI conferences, this could fund conference attendance for 5,000+ researchers annually while saving companies $150M in recruitment fees.

Technical Implementation: Our dual-portal architecture uses Perplexity&apos;s sonar-pro model with return_citations: true and search_domain_filter: [&apos;arxiv.org&apos;, &apos;scholar.google.com&apos;, &apos;github.com&apos;] for academic grounding.
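A minimal sketch of that request configuration, assuming the standard Perplexity chat completions payload (the buildResearchQuery helper and prompt text are illustrative, not the production code):

```typescript
// Illustrative helper: builds the request body sent to Perplexity for
// academically grounded queries. Only model, return_citations, and
// search_domain_filter come from the project description.
function buildResearchQuery(prompt: string) {
  return {
    model: "sonar-pro",
    messages: [
      { role: "system", content: "Ground every claim in peer-reviewed sources." },
      { role: "user", content: prompt },
    ],
    return_citations: true,
    search_domain_filter: ["arxiv.org", "scholar.google.com", "github.com"],
  };
}

// The body would then be POSTed to the chat completions endpoint, e.g.:
// fetch("https://api.perplexity.ai/chat/completions", {
//   method: "POST",
//   headers: { Authorization: "Bearer " + apiKey, "Content-Type": "application/json" },
//   body: JSON.stringify(buildResearchQuery("Summarize this researcher")),
// });
```

The domain filter restricts web search to scholarly sources, which is what keeps the grounding academic rather than general-web.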
Multi-step reasoning pipeline:
fetchResearcherProfile() — Real-time Scholar search with authorship validation
matchPaperToJob() — Semantic analysis returning 0-100 scores with alignment/gaps
askAboutResearcher() — Skill extraction with citations
Evidence mapping — Paper sections → job requirements
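The matching step above can be sketched as follows; the MatchResult fields and the packageMatch helper are our assumptions, since the description only specifies a 0-100 score plus alignment/gaps:

```typescript
// Hypothetical result shape for matchPaperToJob(); field names are illustrative.
interface MatchResult {
  score: number;       // 0-100 semantic fit between paper and job description
  alignment: string[]; // paper evidence that satisfies job requirements
  gaps: string[];      // requirements with no supporting evidence in the paper
}

// Sketch of the packaging step only: the real semantic comparison is delegated
// to the sonar-pro model; here we clamp and round the raw model score so that
// downstream code can rely on the documented 0-100 range.
function packageMatch(rawScore: number, alignment: string[], gaps: string[]): MatchResult {
  const score = Math.min(100, Math.max(0, Math.round(rawScore)));
  return { score, alignment, gaps };
}
```

Clamping at the boundary is a cheap safeguard against a model occasionally returning an out-of-range number.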

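One way to run the pipeline for several researchers concurrently behind a short-lived cache; matchOne, matchAllCached, and the cache layout are our assumptions, with only the 30-minute window taken from the project:

```typescript
const TTL_MS = 30 * 60 * 1000; // 30-minute cache window
const cache: { [name: string]: { at: number; value: number } } = {};

// Stand-in for one researcher's pipeline run; the real version calls the API.
async function matchOne(name: string) {
  return name.length; // placeholder score
}

// Fan out all misses concurrently; cached entries return without an API call.
async function matchAllCached(names: string[]) {
  return Promise.all(
    names.map(async (name) => {
      const hit = cache[name];
      if (hit !== undefined) {
        if (hit.at + TTL_MS > Date.now()) return hit.value; // fresh cache hit
      }
      const value = await matchOne(name); // misses run in parallel via Promise.all
      cache[name] = { at: Date.now(), value };
      return value;
    })
  );
}
```

Because the per-researcher calls are independent, Promise.all turns N sequential round-trips into one batch bounded by the slowest call, which is where the speedup over sequential processing comes from.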
Built with Next.js 14 and TypeScript, the system matches 5 researchers in ~5 seconds via parallel processing (a 5-10x speedup over sequential calls) with 30-minute result caching.</description></oembed>