
Hiring is broken because the test is wrong.


LeetCode measures recall in a world that has Copilot. Maven measures the only thing that still matters — how a developer thinks, debates, and decides while working with AI.

The thesis

Every developer uses AI now. GitHub says 92% of developers use AI coding tools. Yet the hiring industry still bans them from assessments, testing for a world that no longer exists.

The result: companies hire based on memorized algorithms instead of the skill that actually predicts on-the-job performance — the ability to think clearly while working alongside AI.

Maven was built on a simple insight: the gap between how developers actually work and how companies test them has never been wider. We decided to close it.

What we measure

Most platforms grade the final output. Maven watches the entire session — every keystroke, every AI prompt, every debug cycle, every decision to accept or reject a suggestion.

From that raw signal, we produce an AI Collaboration Score — a 0-to-100 measure of how effectively a developer works with AI tools. Not whether they used AI, but how they used it.

Prompt quality
Are they asking the right questions, or pasting errors and hoping?

Critical evaluation
Do they verify AI output, or blindly accept the first suggestion?

Iterative refinement
Can they steer a conversation toward a working solution?

Independent judgment
Do they know when to code themselves vs. when AI adds value?

Debugging under pressure
When the AI is wrong, can they course-correct quickly?
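
To make the scoring concrete: one plausible way to roll the five dimensions into a single 0-to-100 score is a weighted average. This is a sketch only; the weights and the aggregation method are assumptions for illustration, not Maven's actual model.

```python
# Hypothetical sketch: combining per-dimension scores (each 0-100) into
# one AI Collaboration Score. Dimension names come from the list above;
# the weights and averaging scheme are illustrative assumptions.

DIMENSIONS = {
    "prompt_quality": 0.25,
    "critical_evaluation": 0.25,
    "iterative_refinement": 0.20,
    "independent_judgment": 0.15,
    "debugging_under_pressure": 0.15,
}

def collaboration_score(scores: dict) -> float:
    """Weighted average of per-dimension scores, clamped to 0-100."""
    total = sum(DIMENSIONS[d] * scores[d] for d in DIMENSIONS)
    return round(min(100.0, max(0.0, total)), 1)

print(collaboration_score({
    "prompt_quality": 80,
    "critical_evaluation": 90,
    "iterative_refinement": 70,
    "independent_judgment": 60,
    "debugging_under_pressure": 76,
}))
```

A weighted average rewards balanced performance; a real model would likely also penalize patterns like long streaks of unverified acceptances rather than treating each dimension independently.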

What we believe

AI fluency is the new literacy.
Within two years, every engineering role will require it. The companies that figure out how to measure it first will hire the best people.

Process reveals more than output.
Two developers can produce identical code. The one who got there by reasoning through the problem — questioning the AI, testing edge cases, iterating on the approach — is a fundamentally different hire than the one who copy-pasted the first suggestion.

Assessments should mirror real work.
Developers don't whiteboard in production. They have an IDE, a terminal, documentation, and AI. Maven gives candidates the same tools they use every day, then watches how they use them.

Transparency builds trust.
We publish our pricing. We show candidates exactly what's being measured. We never train models on assessment data. Hiring is high-stakes — everyone involved deserves to know the rules.

How it works

An employer creates a role-specific coding challenge — or lets Maven generate one calibrated to seniority and tech stack. The candidate gets a full IDE in the browser with a built-in AI copilot, terminal, and file system. No downloads, no setup.

During the session, Maven's WorkGraph engine captures a complete behavioral timeline: code edits, AI interactions, test runs, debug cycles, pauses, and revisions. Every action is timestamped and attributed.
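A timeline like this can be pictured as a stream of timestamped, attributed events. The event types and field names below are illustrative assumptions for the sketch, not the actual WorkGraph schema.

```python
# Hypothetical sketch of a behavioral timeline captured during a session.
# Event kinds and field names are illustrative, not Maven's real schema.
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float   # seconds since session start
    actor: str         # "candidate" or "ai"
    kind: str          # e.g. "edit", "prompt", "suggestion", "accept", "reject", "test_run"
    detail: str = ""

timeline = [
    Event(12.0, "candidate", "prompt", "How do I parse ISO dates?"),
    Event(14.5, "ai", "suggestion", "use datetime.fromisoformat"),
    Event(20.0, "candidate", "accept"),
    Event(95.0, "candidate", "test_run", "3 passed"),
]

# One signal derivable from the stream: how often the candidate
# accepted AI suggestions versus rejecting them.
accepts = sum(1 for e in timeline if e.kind == "accept")
rejects = sum(1 for e in timeline if e.kind == "reject")
acceptance_rate = accepts / max(1, accepts + rejects)
print(acceptance_rate)
```

Because every event carries a timestamp and an actor, downstream analysis can reconstruct not just what was done but in what order and by whom.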

After submission, the engine analyzes the session across five behavioral dimensions and produces a structured report: an overall score, a developer archetype classification, risk flags, and an evidence-backed hire/pass recommendation. The employer sees the full picture. The candidate sees their score.
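As a rough picture of what such a report might contain — the field names and the archetype label here are illustrative assumptions, not Maven's actual output format:

```python
# Hypothetical sketch of the structured report described above.
# All field names and values are illustrative assumptions.
report = {
    "overall_score": 82,               # the 0-to-100 AI Collaboration Score
    "archetype": "Critical Evaluator", # hypothetical classification label
    "risk_flags": [],                  # e.g. ["long unverified-acceptance streak"]
    "recommendation": {
        "decision": "hire",
        "evidence": [
            "rejected 3 of 7 AI suggestions after testing them",
            "wrote edge-case tests before accepting generated code",
        ],
    },
}

# Per the page: the employer sees the full report,
# while the candidate sees only their score.
candidate_view = {"overall_score": report["overall_score"]}
print(candidate_view)
```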

01
Create
Company sets a challenge or generates one for their role

02
Invite
Candidates get a link — sign up and start in under a minute

03
Assess
Full IDE with AI copilot, terminal, real-world environment

04
Analyze
WorkGraph captures every action, scores AI collaboration

05
Decide
Structured report with score, archetype, and recommendation

Where we are

Maven is live and in early access. The platform is fully functional — employers are creating assessments, candidates are completing them, and the scoring pipeline is producing results.

We're focused on the college hiring pipeline first: university career services and employers recruiting from early-career talent pools. This is where the old approach fails hardest — new grads don't have years of LeetCode prep, but they do know how to work with AI. Maven lets them prove it.

If you're hiring developers and tired of assessments that test the wrong thing, we'd like to talk.

Get in touch

Reach out at hello@maven.dev — we reply within one business day. No sales deck required.
