I wondered the same thing. It's just as bad as the Nobel Peace Prize being awarded to someone who claimed he wanted to achieve peace in the Middle East and bring Americans together, erasing racial tensions, yet he has managed to make both areas worse since becoming President. As we know, awards have no meaning anymore. There are so many award events for actors and actresses. These are people who don't achieve anything other than to pretend to be someone they are not. Most movies have blatant or hidden left-wing agendas that brainwash the American people into believing their ideas about history and reality. When making a historical movie, Hollywood will claim that the movies are for entertainment and not about accuracy in history.