Hollywood Sucks
Posted 9/18/10 , edited 9/18/10
Nowadays, Hollywood sucks because it keeps delivering bad movies and remakes. When will they ever learn? Who knows? They only care about making money, and they decide what to put out regardless of what we actually want to see. They will never learn, now or in the future. Not a single good movie comes out in theaters anymore, including horror movies. Geez. Hollywood doesn't even know what the term "horror" means. Horror movies are supposed to be scary, and instead they deliver pieces of crap like these remakes. Do we need remakes? No, we don't. They should make originals, as long as they're actually scary. Take a good look at Japanese horror movies. They're really scary. I watched the original Ring, and it was scary and creepy. Now they're remaking Japanese horror movies too. Geez. Next thing you know, they'll be remaking Indian horror movies.

Is there any better way to teach these Hollywood douchebags a lesson? Even if we do something about it, will they listen? I guess not. This is exactly why I don't watch most new movies that come out. If one is really good, then yeah, I'll watch it. But the sequels and remakes are horrible. We can't stand the horror. THE HORROR!!!!

Hollywood, if you don't have any bright ideas, don't make crappy movies just for the sake of money. We're really tired of watching them. There are reasons a good movie gets made: a good story, good dialogue, and good characters. Once you have all of those, you might even win Oscars. Movies are not just about making money. Movies are meant to entertain the audience with a good story and characters.

What do you think? Will Hollywood ever learn?
Posted 9/18/10 , edited 9/19/10
There have been great movies that came from Hollywood. It's today's writers, directors, etc. who have given us bad movies.
I regret to say this, but this is a duplicate thread.