@RyanM · Posted 06 Dec. 2021
Are movies becoming more political? I feel like a lot of the movies and TV shows we watch get very political. Do you feel the same? I mean, a lot of the movies I see tend to touch on real-life politics, and you can often tell from the plot. Sometimes they'll make fun of politics in general, and sometimes they'll mock what they think is the bad side while promoting what they think is the good side.
Anyway, do you think movies have become more political these days, or have there always been political movies out there?
Debbie Katz Free Spirit
@debkatz78 · Posted 07 Dec. 2021
Entertainment has always had political agendas and messages within it. It's just more obvious today than, say, 30 years ago. I think this goes back at least to WWII, when publishing and broadcasting used messages, narratives, characters, and ideas as a means of both entertainment and empowerment. Nothing has changed.
@Osheen.Sharma · Posted 06 Dec. 2021
I think movies and books have always been political to some extent, but people are only now starting to notice it more, since these works force us to ask ourselves some difficult questions. I personally like art that carries a message; however, including a message just for the sake of it irks me. So yes, art is political and always has been to some extent. We're just noticing it a lot more now.