I think American society has quietly been waging war on capitalism for the last 60 years. Shortly after WWII we stopped encouraging people to become capitalists and take control of their own means of production. Instead we began encouraging people to get jobs working for someone else, someone who could harness their productivity for his own advantage and in turn provide a wage and benefits.
This ideology has been beaten into the last few generations: "get an education so you can get a good job," or "be a good worker," so that your employer can provide everything for you. This is a socialist ideology. Why are we no longer encouraging people to take control of their own lives? When people give up their lives to their employer in exchange for being provided for, why is anyone surprised when those same people expect the same treatment from their government?