Letterboxd vs IMDb vs Rotten Tomatoes: Which Rating System Should You Actually Trust?

You’ve been there: a movie sits at 95% on Rotten Tomatoes, 6.8 on IMDb, and 3.4/5 on Letterboxd. Three platforms, three wildly different signals. Which one do you go with?

The answer: it depends on what you’re trying to find out, because the three systems don’t measure the same thing. Rotten Tomatoes’ Tomatometer isn’t a quality score — it’s a binary “did critics like it or not” percentage. IMDb is the broadest audience pulse on the internet, skewed toward action, sci-fi, and English-language mainstream releases. Letterboxd is what cinephiles actually think, and it runs about 0.3–0.5 stars colder than the equivalent IMDb score because its audience takes film seriously.
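To see what that offset means in practice, you can put the two scales side by side. This is a back-of-envelope sketch, not an official conversion — the 0.4-star correction is simply the midpoint of the 0.3–0.5 range above, and the function name is made up for illustration:

```python
def letterboxd_to_imdb_scale(stars, offset=0.4):
    """Rough, illustrative conversion of a Letterboxd rating (0.5-5 stars)
    to an IMDb-style 1-10 number. The 0.4-star offset compensates for
    Letterboxd's colder-rating audience; it is an assumption, not a
    published formula from either platform."""
    return min(10.0, (stars + offset) * 2)

print(letterboxd_to_imdb_scale(3.4))  # → 7.6
```

So the 3.4 from the opening example is, by this rough yardstick, closer to a mid-7 on IMDb’s scale than to its literal doubled value of 6.8.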

Knowing which system to trust — and when — is worth more than blindly following any single score.

The most misunderstood thing in film ratings: a 91% on Rotten Tomatoes does NOT mean critics rated it 9.1/10. It means 91% of critics gave it a positive review. A film where every critic gives it a B+ scores 100%. A polarizing masterpiece where half the critics give it an A+ and half give it an F scores 50%. The numerical average rating used to disambiguate these cases, but RT removed it in April 2025 — so the Tomatometer now has to be read with that limitation in mind.
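The arithmetic behind that distinction is easy to sketch. Assuming reviews on a simple 0–10 scale with 6.0 as the “positive” cutoff (the cutoff here is illustrative — RT’s real positive/negative call is made per review, not by a fixed numeric threshold):

```python
def tomatometer(scores, positive=6.0):
    """Percent of reviews counted as positive -- a headcount, not a
    quality measure."""
    return 100 * sum(s >= positive for s in scores) / len(scores)

def average(scores):
    """Mean review score -- the signal the Tomatometer discards."""
    return sum(scores) / len(scores)

reviews_safe = [7.0] * 12                # twelve critics all say "B+, fine"
reviews_polar = [10.0] * 6 + [1.0] * 6   # half rave, half pan

print(tomatometer(reviews_safe), average(reviews_safe))    # 100.0 7.0
print(tomatometer(reviews_polar), average(reviews_polar))  # 50.0 5.5
```

The unremarkable film scores a perfect 100%; the divisive one scores 50% despite a comparable average. Same metric, wildly different meanings.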


| Criterion | Letterboxd | IMDb | Rotten Tomatoes |
| --- | --- | --- | --- |
| What it actually measures | Cinephile audience average (0.5–5 stars) | General audience weighted average (1–10) | % of critics positive (Tomatometer) + audience % (Popcornmeter) |
| Who's rating | Self-selected film enthusiasts — higher bar | 200M+ registered users — broadest sample | Approved critics (Tomatometer) + verified buyers (audience) |
| Score inflation / generosity | Lowest — runs 0.3–0.5 stars colder than IMDb | Moderate — tends to favour popular genres | Most misleading — 60% positive = "Fresh", inflating the quality signal |
| Review-bombing resistance | Highest — cinephile audience, less motivated to bomb | Lowest — easiest to manipulate with organised campaigns | Medium — verified-buyer requirement helps the audience score |
| Best for: arthouse / indie / foreign film | Best — this is Letterboxd's home turf | Unreliable — fewer votes, demographic skew | Critics catch it; audience score often misses |
| Best for: mainstream / blockbuster | Underrates franchise films — cinephile bias | Most reliable — massive sample, weighted average | Tomatometer unreliable for franchises; audience score better |
| Best for: quick "is it worth watching?" | Check score + top reviews — takes 30 sec more | Score alone is a decent filter at scale | Tomatometer above 75% = almost certainly watchable |
| Social / discovery features | Diary, lists, friends, film discussion — most social | Watchlists, basic tracking — improving | Aggregator only — no social layer |
| Database depth (catalog size) | Large — growing fast since the 2023 Tiny Mammoth acquisition | Deepest — film, TV, and full cast/crew industry data | Film + TV — no industry data |
| TV coverage | Good for prestige TV — limited mainstream tracking | Best — episode ratings and season-by-season tracking | Often only reflects Season 1 critics; doesn't update well |
| Critic vs audience split visibility | No critic score — pure audience only | No critic separation — single score | Shows both clearly — best for spotting critic/audience divergence |
| Quick rule of thumb | Above 3.5 = genuinely good (rare on Letterboxd) | Above 7.5 = safe bet for most viewers | Tomatometer above 75% + audience above 70% = double confirmation |

Final Thoughts

Letterboxd

Check it on: Letterboxd

Trust it for indie, arthouse, and “is it actually good?”

Letterboxd runs cold on purpose — cinephiles are harder to please than the average moviegoer, which makes a high score mean more. A 3.8+ on Letterboxd is a genuine stamp of quality. It’s also the only platform that functions as an actual community — film diaries, curated lists, friend activity. For mainstream blockbusters, it underrates. For everything else, it’s the most honest signal of the three.

IMDb

Check it on: IMDb

Best for mainstream and the only one that works for TV

200 million users means IMDb’s score on a big film is as close to “what humanity thinks” as you’ll get. The weighted average filters out obvious manipulation better than most people realize. For TV specifically — episode ratings, season-by-season tracking, show evolution — nothing else comes close. The bias is clear: it skews male, English-speaking, 18–35, and strongly favors action, sci-fi, and superhero content. Foreign and arthouse films need at least 50K ratings before the IMDb score gets reliable.
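IMDb does not publish its weighting formula, so the mechanics can only be illustrated generically. The sketch below shows the basic idea behind any trust-weighted average — down-weighting suspicious votes (say, a wave of brand-new accounts) pulls the score back toward the established audience — and every number and weight in it is invented for the example:

```python
def weighted_rating(votes):
    """Trust-weighted mean of (score, trust_weight) pairs. Purely
    illustrative: IMDb's real weighting scheme is undisclosed."""
    total = sum(score * trust for score, trust in votes)
    weight = sum(trust for _, trust in votes)
    return total / weight

established = [(8, 1.0)] * 90   # long-standing accounts rate it 8
brigade = [(1, 0.1)] * 100      # bombing campaign, down-weighted votes

print(round(weighted_rating(established + brigade), 1))  # → 7.3
```

A raw average of those same 190 votes would land around 4.3 — the weighting is what keeps the brigade from dragging the headline number down.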

Rotten Tomatoes

Check it on: Rotten Tomatoes

Best quick filter — but the score means less than you think

The Tomatometer is uniquely useful for one thing: cutting out the worst films fast. Anything below 50% is almost certainly not worth your time. Above 75%, you’re in safe territory. But the number itself is not a quality rating — it’s a headcount of positive reviews. A 100% can mean “universally loved masterpiece” or “a dozen critics all thought it was fine.” After RT removed the numerical average score in April 2025, the Tomatometer alone lost even more nuance. Use it as a filter, not as a verdict.
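That filter logic boils down to a few lines. The thresholds come straight from the text above; the function name, the tier labels, and the exact tiering are just an illustration of how to use the score as a filter rather than a verdict:

```python
def quick_verdict(tomatometer, audience=None):
    """Sketch of the article's rule of thumb. Below 50% = skip; above 75%
    = safe; above 75% with a 70%+ audience score = double confirmation;
    anything in between needs actual reviews, not a number."""
    if tomatometer < 50:
        return "skip"
    if tomatometer > 75:
        if audience is not None and audience > 70:
            return "safe bet (double confirmation)"
        return "safe territory"
    return "read reviews"  # the score alone can't decide the middle band

print(quick_verdict(91, 85))  # → safe bet (double confirmation)
print(quick_verdict(62))      # → read reviews
```

Note what the middle band returns: not a judgment, just an instruction to go read the reviews — which is exactly the point.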
