The Girlfriend Experience

I generally look at the score, and if a movie has received a solid enough rating (7.3+) and I believe there's upside beyond that (based on the director, cast, certain reviews, genre, and my expectations), I'll rent it.
 
This is how I treat IMDB. Depending on the type of movie, I adjust what I perceive to be a solid rating. For the big, fan-driven movies, I don't pay attention to the ratings at all. When I first started following IMDB, "Return of the King" had just been released, and there was a fanboy campaign to give "The Godfather," "The Godfather Part II," and "Shawshank" ratings of "1" just so "Return of the King" would take the top spot on the Top 250. From that point on, I have taken the Top 250 with a grain of salt. The list of movies is solid. Even Kicky should admit that. The actual rankings are pretty weak.
 
Kicky, as a member of the upper upper class, just hates the commoner, and rightfully so. He'd rather not have to deal with the average Joe, because most of the time they don't have a law degree, so their opinion is useless and doesn't count. The average guy likes movies with fart humor and big ole breasts and doesn't have analytical reasoning skills, so his ability to follow a plot is greatly retarded. So when these guys get hold of a movie rating system that represents them, and it is used as a significant marker for anyone deciding to see a certain movie, it really grinds Kicky's gears. Those people are going to see movies based on the opinions of everyone who has seen that movie and bothered to vote. What f'ing lemmings those people are, doing what other people are doing, when they could be acquiring a degree of a learned manner and determining for themselves how big of an idiot the normal person actually is, the kind who likes movies such as Stepbrothers and can't understand Citizen Kane.
Those damn industrial ******** even created a system for elitist pricks called Rotten Tomatoes, which tallies the opinions of other elitist pricks who are even deemed experts in the field. This would seem like a system Kicky would go for, because finally it would shut the idiotic normal person up. But no, for some reason it isn't elitist enough for him.
So I propose that before you see a movie, you ask Kicky whether or not you should see it; that's the only movie review you'd ever really need.

So with that thought process in mind, I ask Kicky... what was your opinion of "Busty Babes 7"? I picked it up at the local movie store because I was intrigued: a series that has spanned 7 movies must be as good as Star Wars or Star Trek. I'm an idiot normal person, so I feel justified using that kind of logic.

Kicky > Democracy > Science < Beantown
 
I think the ratings are horrendously structurally flawed.

If your argument is (and I believe it is) that the list itself is pretty good if you completely ignore the order, then I'll grant that's a more reasonable way to use it. However, the list is still fundamentally skewed by a huge bias toward more recent films, which excludes a lot of older movies that are better than a good chunk of the recent films on the list, particularly those near the bottom.

The recency issue messes up the quality of the list significantly because it places films like "Changeling" and "How to Train Your Dragon" on a list of the top 250 movies of all time, while a movie from the 1940s has to be of the quality of Arsenic and Old Lace or The Philadelphia Story (numbers 249 and 250) in order to make it. Those are rightly considered some of the top comedies of all time by film historians, and they can barely crack this list, ready to be pushed off by the next fanboy film. The recency issue is unlikely to be resolved because the formula for compiling the Top 250 heavily privileges movies that receive a lot of votes, and those films tend to be new releases, while older films' vote totals are somewhat capped.

As it stands, this flaw critically omits some tremendous movies from older decades, even reaching into the 1970s. Some examples:

His Girl Friday
The French Connection
Deliverance
Being There
The Man Who Shot Liberty Valance
Sergeant York
A Night at the Opera
The Thin Man
Battleship Potemkin (this film is one of the most influential ever made, because it invented the montage. Seriously)

Etc etc. I could probably make a list of at least 25-30 films that should almost inarguably replace films made in the last 20 years that are on the IMDB Top 250 solely because they were made more recently.
 
Here's an example of how the math works, and why the recency bias is structural to the IMDB 250. I'm giving it its own post because I had to compile the information independently.

The formula is listed as:

weighted rating (WR) = (v ÷ (v+m)) × R + (m ÷ (v+m)) × C where:

R = average for the movie (mean) = (Rating)
v = number of votes for the movie = (votes)
m = minimum votes required to be listed in the Top 250 (currently 3000)
C = the mean vote across the whole report (currently 6.9)

Let's compare two movies that have the exact same average rating. In this instance we'll say it's 8.0. An older movie might have 20k votes (an actual representative number for several). A newer movie, like The Dark Knight, might have 400,000.

Let's plug those into the formula for their weighted averages for IMDB 250 purposes.

Old movie: (20,000 ÷ (20,000+3,000)) × 8.0 + (3,000 ÷ (20,000+3,000)) × 6.9 ≈ 7.86
New movie: (400,000 ÷ (400,000+3,000)) × 8.0 + (3,000 ÷ (400,000+3,000)) × 6.9 ≈ 7.99 (almost exactly 8.0)

That creates a built-in spread of roughly .14 weighted points favoring films that get more votes, which are largely newer films. That matters a LOT given that the spread between #141 and #250 is .2 points.
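The arithmetic above can be checked with a few lines of code. This is a minimal sketch of the weighting formula as quoted in the post, using the post's stated values (m = 3000, C = 6.9; IMDB's actual parameters change over time, so these are assumptions):

```python
# Sketch of the quoted Top 250 weighting: blend a film's own mean rating R
# with the site-wide mean C, weighted by its vote count v.
# m and C taken from the post (m = 3000 minimum votes, C = 6.9 site mean).
def weighted_rating(R, v, m=3000, C=6.9):
    return (v / (v + m)) * R + (m / (v + m)) * C

old = weighted_rating(8.0, 20_000)    # older film with ~20k votes
new = weighted_rating(8.0, 400_000)   # recent blockbuster with ~400k votes
print(round(old, 2), round(new, 2))   # → 7.86 7.99, a built-in spread of ~0.14
```

With more votes, the film's own average dominates; with fewer, the score gets dragged toward the 6.9 site mean, which is the structural penalty on older, less-voted films.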
 
I think YB agreed with me that the rankings are a joke but that the ratings are a pretty good barometer to go by, flawed or not.
 
Makes sense, though. The more a movie has been seen, the more reliable the rating, and the more weight it gets.
An 8 with 500,000 votes is definitely worth more than an 8 with 50,000.
 
Except for the fact that it's a tremendous structural disadvantage for an entire class of films. The difference is actually magnified the farther you get from 6.9, as well.

The goal is to rate the BEST 250 films, period, not the best 250 films of the last 30 years. The IMDB Top 250 comes a lot closer to doing the latter than the former, and frankly it sucks at the latter too. It is a measure of popularity over the last ten years only. Nothing more. It is barely more useful for discerning quality than looking at box office receipts.
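The magnification claim can be verified directly. Rearranging the formula gives WR = R − (m ÷ (v+m)) × (R − C), so the gap between a low-vote and a high-vote film with the same average R grows linearly with R's distance from C. A quick sketch, assuming the same vote counts used earlier in the thread (20k vs 400k) and the quoted m = 3000, C = 6.9:

```python
# Rewrite the quoted formula as WR = R - (m/(v+m)) * (R - C).
# Two films with equal average R but different vote counts then differ by
# (m/(v_lo+m) - m/(v_hi+m)) * (R - C), which scales with |R - C|.
def gap(R, v_lo=20_000, v_hi=400_000, m=3000, C=6.9):
    factor = m / (v_lo + m) - m / (v_hi + m)
    return factor * (R - C)

for R in (7.5, 8.0, 9.0):
    print(R, round(gap(R), 3))   # → 0.074, 0.135, 0.258: penalty grows with R
```

In other words, the higher a great old film's true average, the more weighted points it loses to an equally rated new release, which is exactly the "magnified farther from 6.9" effect.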
 
I've said this before and I'll ask again: what distinguishes overall popularity from inferred quality?
 
I'll put it this way. An entertainment center sold at Wal-Mart, made by O'Sullivan, is probably 1000x more popular than all the entertainment centers ever sold by Ethan Allen.

One is popular; the other is quality.
 
Making decisions based on the general area of the rating (i.e., 8.0+ = fantastic to great; 7.9-6.0 = great to good, or watchable), while knowing and acknowledging the flaws within the IMDB system, gives you a good overall consensus opinion of how good a movie is.

Anybody who puts weight on a movie's exact ranking, or on whether it makes the top 250, without taking the time to read a respected review or two, consult friends, or watch the movie themselves, is going to have a problem with the system.

Used for what it is, the IMDB rating can be a useful metric, especially if you want to know what the masses think about a movie you might be curious about.
 
You were nicer than me. I was tempted just to make fun of him for, in a single post:

*deleted*
 

Popular:

[attached images: transformers2.jpg (Transformers 2), justin_bieber.jpg, tracy.jpg, a Jersey Shore season 1 promo image]
Does that popularity imply quality?
 
I'll put it this way. An entertainment center sold at Wal-Mart, made by O'Sullivan, is probably 1000x more popular than all the entertainment centers ever sold by Ethan Allen.

One is popular; the other is quality.

Not a good comparison. You need something that is accessible to everyone regardless of their socio-economic position in life. Not everyone can afford Ethan Allen, so there's no real basis for comparing it to O'Sullivan. I'd guess that most of the people who have purchased O'Sullivan have no idea what Ethan Allen even is.

Now, if Ethan Allen were just as inexpensive as O'Sullivan, better quality, and still less popular, you could use it as an example.

What exactly makes a movie good, bad, entertaining, popular? This is highly subjective, and viewers have varying criteria. Was Transformers entertaining? Yes. Was it popular? Yes. Was it visually stimulating? Yes. So how do you turn around and say it was of poor quality? It was poorly written and some of the acting was questionable, but it wasn't unwatchable. Now compare this to something like The English Patient. The story is well written and the acting good, but it is so damn boring you want to slam your head against a wall. Great, it's a quality film, but it's nearly unwatchable. Chariots of Fire? Bleh.

I know what Sirkicky considers quality based on his reviews, and quite frankly, I think a lot of the time his reviews suck, built on criteria I don't care about. I find this to be the case with a lot of film critics.
 
Saw this movie on Netflix. It's a little interesting for a while... and then you finish it and it seems kind of pointless. It's certainly a different type of filmmaking than you'll usually see.
 