The Lab is Explora's working research and development environment, where we test emerging tools, build internal solutions, and document what actually works before we recommend it to anyone else.
We believe you can only advise what you've actually built, and only teach what you've genuinely used.
The Lab is how Explora stays current. We use it to stress-test AI tools in real production contexts, build proprietary workflows, and explore how emerging technologies apply to learning design, strategy consulting, and media production.
What starts in the Lab either gets quietly retired or becomes embedded in how we work with clients and what we bring to the classroom.
Tools and systems we've built, integrated, or actively refined, all in real use across consulting, education, and production.
AI-powered course monitoring system that tracks learner engagement patterns, flags at-risk students, and surfaces early indicators of course quality issues across UBC program delivery and corporate training environments.
Personalized learning pathway engine that maps individual learner progress against program outcomes, recommends next steps, and adapts content sequencing based on demonstrated competency and engagement data.
Intelligent digital asset management system that uses AI to auto-tag, categorize, and retrieve media assets across production workflows, surfacing relevant materials and tracking asset usage across active projects.
AI-augmented analytical tools built around Explora's proprietary Compass Framework, supporting rapid gap analysis, readiness assessments, and strategic scenario modelling for client engagements.
Practical cost-planning tool for teams evaluating or scaling AI API usage. Models token consumption across use cases, compares pricing across providers, and surfaces budget projections for production deployments.
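The core of a cost-planning tool like this is simple arithmetic over per-token rates. A minimal sketch of the budget-projection step, using hypothetical provider names and illustrative prices (real per-million-token rates vary by provider and change often):

```python
# Illustrative USD rates per 1M tokens; NOT real provider pricing.
PRICING = {
    "provider_a": {"input": 3.00, "output": 15.00},
    "provider_b": {"input": 0.25, "output": 1.25},
}

def monthly_cost(provider, requests_per_day, in_tokens, out_tokens, days=30):
    """Project monthly API spend for one use case on one provider."""
    rates = PRICING[provider]
    total_in = requests_per_day * in_tokens * days
    total_out = requests_per_day * out_tokens * days
    return (total_in * rates["input"] + total_out * rates["output"]) / 1_000_000

# Compare providers for the same workload before committing to one.
workload = dict(requests_per_day=1000, in_tokens=500, out_tokens=200)
projections = {p: monthly_cost(p, **workload) for p in PRICING}
```

Because output tokens typically cost several times more than input tokens, projections are most sensitive to response length, which is why the tool models input and output consumption separately.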
Structured tool for designing and generating capstone research frameworks, assessment rubrics, and milestone scaffolding for professional programs. Reduces design time while ensuring alignment with learning outcomes.
Tracks organizational progress against major AI governance frameworks including ISO 42001, NIST AI RMF, and internal policy requirements. Generates gap reports and prioritized remediation plans for client advisory work.
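At its simplest, gap reporting of this kind is set difference over a framework's required controls. A minimal sketch, using made-up control names (the actual ISO 42001 clauses and NIST AI RMF functions are more granular than shown here):

```python
# Hypothetical, simplified control lists; real frameworks define many more.
REQUIRED_CONTROLS = {
    "iso_42001": ["ai_policy", "risk_assessment", "impact_assessment"],
    "nist_ai_rmf": ["govern", "map", "measure", "manage"],
}

def gap_report(framework, implemented):
    """Return the controls still missing, preserving the framework's own order
    so the output doubles as a rough remediation priority list."""
    done = set(implemented)
    return [c for c in REQUIRED_CONTROLS[framework] if c not in done]
```

Keeping the framework's own ordering in the output means the gap list can feed directly into a prioritized remediation plan rather than an unordered checklist.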
Automated pipelines connecting communication, scheduling, document generation, and project tracking, using n8n for orchestration and locally run open-source models for privacy-preserving AI processing.
An experimental bridge between motion capture data and generative AI pipelines, enabling real-time AI-driven responses to human movement for immersive theatre, interactive installations, and adaptive media production.
We don't subscribe to tools we don't use. Everything here has been evaluated under real production conditions, and most of it is active right now.
The stack is actively evolving; we evaluate and integrate new tools continuously.
The Lab isn't a showcase; it's a working environment. Every stage has a purpose.
We don't assess tools from the outside. We integrate them into actual workflows (content development, client strategy, course production) and measure what changes and what doesn't.
Lab experiments run against real deadlines, real clients, and real complexity. If a tool can't hold up under production conditions, it doesn't make the cut, regardless of the demo.
Validated tools and methods get embedded into how Explora operates: in consulting engagements, in client deliverables, and in our own business workflows. The Lab feeds the work.
What we learn in the Lab shapes what we teach. Course content is grounded in tools and methods we've actually used, not theoretical overviews of what the industry is doing.
Whether you want to understand how we're applying AI in our own work, explore what that means for your organization, or collaborate on something new, we'd like to hear from you.