kwale Score

Predict brain response before you spend a dollar

Upload any video and get per-second neural predictions across 20,484 cortical vertices. Know what works before you publish.

Join Waitlist
The Problem

Billions spent on content nobody watches

Marketers spend over $500 billion annually on content with no way to predict whether it will resonate before publishing. Focus groups take weeks and cost tens of thousands. A/B testing only works after you have already shipped. The result: most content underperforms, and nobody knows why until the budget is gone.

Traditional neuromarketing offers answers, but at $50,000 to $500,000 per study, with 4 to 8 weeks of lead time and samples of just 30 to 100 lab subjects, it is reserved for the largest brands and the highest-stakes campaigns.

What if you could predict brain response computationally -- in seconds, for cents, with no lab, no participants, and no scheduling?

$500B+
spent annually on content marketing with no way to predict neural impact before publishing.
How It Works

From upload to action items in seconds

01

Upload your video

Drop any video file or paste a URL. TikTok, YouTube, feature films, product demos -- any format, any length up to 30 minutes on Pro.

02

ORCLE predicts cortical activation

Six frozen neural encoders process your video in parallel -- visual, audio, text, OCR, long-context, and reasoning streams. The fusion transformer predicts activation across 20,484 cortical vertices.
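To make the multi-stream idea concrete, here is a toy sketch of the pattern described above: several frozen "encoders" each extract features from the same input, and a fusion step combines them into one activation prediction. The encoder functions, dimensions, and weights below are illustrative placeholders, not ORCLE's actual architecture.

```python
# Toy multi-stream fusion: each frozen encoder emits a feature vector,
# and a fusion step maps the concatenated features to one scalar.
# All functions and weights here are illustrative stand-ins.

def visual_encoder(frames):
    return [len(frames) * 0.1, 0.5]

def audio_encoder(samples):
    return [sum(samples) * 0.01]

def text_encoder(captions):
    return [float(len(captions))]

def fuse(streams):
    # Stand-in for the fusion transformer: concatenate stream features
    # and reduce them to a single "activation" with fixed weights.
    features = [x for stream in streams for x in stream]
    weights = [0.2] * len(features)
    return sum(w * f for w, f in zip(weights, features))

streams = [
    visual_encoder(["f1", "f2", "f3"]),
    audio_encoder([10, 20, 30]),
    text_encoder("buy now"),
]
activation = fuse(streams)
```

In the real system each stream would be a large pretrained model and the fusion step a transformer predicting 20,484 vertex activations per second, but the data flow is the same shape.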

03

Atlas maps to 20 metrics

The neural atlas aggregates vertex-level predictions into 20 interpretable metrics: hook score, sustained attention, emotional resonance, memory encoding, cognitive load, and more.
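The aggregation step can be pictured as averaging vertex-level predictions over region masks. The masks, vertex indices, and metric names below are hypothetical toy values, not kwale Score's actual atlas.

```python
# Hypothetical sketch of vertex-to-metric aggregation: average predicted
# activation over each region's vertices. Masks and names are illustrative.

def aggregate_metrics(vertex_activations, region_masks):
    """vertex_activations: one float per cortical vertex.
    region_masks: metric name -> list of vertex indices in that region."""
    metrics = {}
    for name, indices in region_masks.items():
        metrics[name] = sum(vertex_activations[i] for i in indices) / len(indices)
    return metrics

# Toy example with 8 "vertices" instead of 20,484.
activations = [0.9, 0.8, 0.1, 0.2, 0.7, 0.6, 0.3, 0.4]
masks = {
    "hook_score": [0, 1],       # e.g. NAcc vertices
    "memory_encoding": [4, 5],  # e.g. hippocampal vertices
}
metrics = aggregate_metrics(activations, masks)
```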

04

VLM generates editing prescriptions

A vision-language model analyzes the neural timeseries alongside your video and generates specific, timestamped editing recommendations referencing brain mechanisms.

05

Act on the results

Export a full PDF report, use the API to integrate scores into your workflow, or feed results directly into kwale Studio for automatic optimization.
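An API integration might look like the sketch below. The endpoint URL, payload fields, and response shape are assumptions for illustration only; the public kwale Score API contract is not documented here.

```python
import json

# Hypothetical scoring request. Endpoint, field names, and response
# shape are placeholders -- consult the real API docs for the contract.
API_URL = "https://api.example.com/v1/score"  # placeholder endpoint

payload = {
    "video_url": "https://example.com/my-ad.mp4",
    "dimensions": ["hook_score", "attention", "memory_encoding"],
}

# A response might carry a headline score plus per-second timeseries:
sample_response = json.loads("""
{
  "hook_score": 0.82,
  "timeseries": {"attention": [0.71, 0.74, 0.69]}
}
""")

hook = sample_response["hook_score"]
attention = sample_response["timeseries"]["attention"]
print(f"Hook score: {hook:.2f}, peak attention: {max(attention):.2f}")
```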

Key Capabilities

Six dimensions of neural intelligence

Hook Score

Predicts the probability of capturing attention in the first 3 seconds. Trained on NAcc (nucleus accumbens) reward-anticipation signals.

NAcc -- Reward anticipation

Attention Tracking

Per-second sustained attention predictions based on dorsolateral prefrontal cortex (dlPFC) activation patterns across your entire video.

dlPFC -- Executive attention

Emotional Resonance

Measures predicted emotional arousal and valence from amygdala and anterior insula (AIns) activation, identifying moments of peak emotional impact.

Amygdala + AIns -- Emotional processing

Memory Encoding

Predicts which moments will be remembered 24 hours later based on hippocampal formation activation patterns during viewing.

Hippocampus -- Memory consolidation

Modality Ablation

Isolates the contribution of each sensory channel (visual, audio, text) to overall engagement, revealing which elements are carrying the experience.

Visual cortex + Auditory cortex

Demographic Scoring

Predicts how different audience segments respond. Demographic conditioning adjusts the neural model for age, gender, and interest profiles.

TPJ -- Social cognition & theory of mind
The Science

The neuroscience behind kwale Score

kwale Score is built on ORCLE, a brain encoder trained on 451.6 hours of fMRI data from 720+ human subjects watching naturalistic video. It predicts activation across these key brain regions:

NAcc (Nucleus Accumbens) -- Reward & motivation
AIns (Anterior Insula) -- Interoception & salience
dlPFC (Dorsolateral Prefrontal Cortex) -- Executive attention
DMN (Default Mode Network) -- Narrative self-reference
Hippocampus -- Memory encoding & retrieval
Amygdala -- Emotional processing
TPJ (Temporo-Parietal Junction) -- Social cognition
Visual Cortex -- Scene & object processing
Selected References
Huth, A.G., et al. (2016). Natural speech reveals the semantic maps that tile human cerebral cortex. Nature, 532, 453-458.
Kell, A.J., et al. (2018). A task-optimized neural network replicates human auditory behavior. Neuron, 98(3), 630-644.
Allen, E.J., et al. (2022). A massive 7T fMRI dataset to bridge cognitive neuroscience and artificial intelligence. Nature Neuroscience, 25, 116-126.
Gifford, A.T., et al. (2023). The Algonauts Project 2023 Challenge. arXiv:2301.15469.
View full research references →
Pricing

Choose your plan

Free

$0

5 scores per month

  • 5 video scores/month
  • 3 score dimensions
  • 10-second timeseries preview
  • Watermarked PDF export
  • Up to 5-minute videos
Join Waitlist

Pro

$49/mo

200 scores, full metrics

  • 200 video scores/month
  • All 6 dimensions + 18 metrics
  • Full per-second timeseries
  • Interactive 3D brain map
  • AI action items
  • API access (100 rpm)
  • A/B comparison
  • Up to 30-minute videos
Join Waitlist

Enterprise

Custom

Unlimited, custom models

  • Unlimited scores
  • Custom demographic models
  • Raw vertex data download
  • White-label PDF export
  • Dedicated CSM + Slack
  • SLA guarantees
  • Up to 4-hour videos
Join Waitlist

Ready to predict how brains respond?

Join the waitlist for kwale Score and start scoring content against real neuroscience.

Join Waitlist

Built on peer-reviewed neuroscience. Patent pending.