Hollywood has long been regarded as the world's leading movie producer, but here is what American Hollywood movies have taught us:
- American movies have taught us that people in Chinese movies have nothing better to do than teach or practice kung fu, or make films about karate and judo.
- American movies have shown us that more than 50% of the American population are FBI or CIA agents, possibly working undercover.
- American movies have shown us countless scenes where, if a man survives a mishap with a lady, he is entitled to a kiss, even if he has met her only once or for the very first time.
- American movies have proved that American cities are especially prone to attacks by various monsters, from Godzilla to dinosaurs.
- They show that America is aliens' favorite country to invade.
- American movies have proved that the scientists there are experts in mutating genes and species, and that every new species or organism is invariably created by some lab mishap.
- American movies have made the world aware that America is home to many superheroes, like Superman, Spiderman, Ironman, Hulk, Batman, etc.