Tuesday, April 19, 2016

The Decline and Fall of the American University

From Had Enough Therapy?, "The Decline and Fall of the American University":

"America’s universities are self-deconstructing. Once lauded as a great civilizational achievement, they have been degraded by radicals. If your chance for a better life involves going to college and learning what you need to know in order to take your place in the business or professional world, that chance is being systematically undone."
