I am so disgusted by the changes to society brought about by Hollywood that it has caused me to reject, and start hating, America. I look around at my community and I'm disgusted. It is no longer a place I want my children to grow up in. I see the American experiment as a gigantic failure, and now I actively preach against a country I would once have died for. This attitude is growing in America, and I support spreading it with all the powers at my disposal. I just can't support America anymore...I just can't.