Video by The Linux Foundation via YouTube

AI infrastructure is scaling to hundreds of thousands of automatically generated queries per second, and traditional search methods can’t keep up: inaccurate vector search results drive compute waste and hallucinations across the AI stack. This isn’t just a performance problem. It’s an architectural challenge facing every enterprise deploying agentic AI at scale.
In this exclusive interview with Swapnil Bhartiya at TFiR, Bianca Lewis, Executive Director of the OpenSearch Software Foundation, explains how OpenSearch has evolved from an AWS fork into critical AI infrastructure powering NVIDIA’s agentic AI platform. Since joining the Linux Foundation 18 months ago, OpenSearch has doubled downloads to 1.4 billion, expanded to over 400 contributing companies, and secured enterprise deployments at Changi Airport and Atlassian.
Key Topics Covered:
• Hybrid search architecture combining lexical and semantic search to prevent AI hallucinations and reduce compute costs
• AI-native telemetry integration with OpenTelemetry, Prometheus, and observability suites for monitoring agentic AI workflows
• Real-time trace and log correlation in unified dashboards for enterprise-grade AI infrastructure visibility
• Vendor-neutral governance model driving OpenSearch growth under Linux Foundation stewardship
• Production case studies from Changi Airport (world’s largest airport retail search) and Atlassian (RPM acquisition)
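The hybrid search idea mentioned above can be sketched in a few lines: run a lexical (BM25-style) query and a semantic (vector) query separately, normalize each score set, then blend them with a weight. This is a minimal illustrative sketch, assuming min-max normalization and a tunable `alpha` weight — the function names and weighting scheme here are hypothetical, not OpenSearch’s actual hybrid query implementation.

```python
# Minimal hybrid-search score fusion sketch (illustrative only).
# `lexical` and `semantic` map document IDs to raw relevance scores.

def min_max_normalize(scores):
    """Rescale scores to [0, 1] so lexical and semantic values are comparable."""
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {doc: 1.0 for doc in scores}
    return {doc: (s - lo) / (hi - lo) for doc, s in scores.items()}

def hybrid_rank(lexical, semantic, alpha=0.5):
    """Blend normalized scores; alpha weights the semantic side."""
    lex = min_max_normalize(lexical)
    sem = min_max_normalize(semantic)
    docs = set(lex) | set(sem)
    combined = {
        d: (1 - alpha) * lex.get(d, 0.0) + alpha * sem.get(d, 0.0)
        for d in docs
    }
    # Highest blended score first
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)
```

For example, a document that ranks mid-pack lexically but very high semantically can win the blended ranking — which is the point: keyword precision and embedding recall correct each other’s failure modes.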
Read the full story & transcript at www.tfir.io
#OpenSearch #AgenticAI #AIInfrastructure #VectorSearch #Observability #LinuxFoundation #HybridSearch #RAG #OpenTelemetry #EnterpriseAI
