I believe that future US historians will view the impact of Hollywood on the country as predominantly negative. Since the emergence of Hollywood and popular culture in the 1980s, there has been a noticeable decline in morality.

Interestingly enough, the very individuals who control Hollywood also have control over the banking system in the US. Quite a coincidence, isn't it?
