Originally Posted by Vincey!
Hollywood movies DON'T represent American culture at all. Out of EVERY movie industry, Hollywood is the number one at chasing profit before anything else. Independent films from smaller countries may take more pride in coming from a certain place, but if there's one industry that only promotes its own interests, it's definitely movies/music and everything linked to art. That's why some are so controversial. Also, what would give a movie its "nationality"? lol The director's or producer's nationality? The country where it was filmed? The average nationality of the actors playing in it? The home of the studios? I mean, you remember the names of the actors and the director much more than the country of the movie.
Hollywood movies do represent American culture, because they're cultural products from America that are popular both in America and elsewhere, i.e., people actually watch them in their free time. They might not be art, but they're definitely part of the culture.
And yes, I often find myself checking where a movie was made, or sometimes deliberately looking for movies from a specific country to learn something about it, something I can't get from a Wikipedia entry.