Do PUBLIC universities have a right to label themselves as "Conservative" or "Liberal"?
Should a PUBLIC university, which accepts PUBLIC funding, be allowed to associate with a political party? For a while I was proud to attend a "liberal" university, until I realized I was being brainwashed. That's when I could relate to Pink Floyd's lyrics, "We don't need no education. We don't need no thought control." Looking back on it, I feel a bit cheated, because I would have liked a more well-rounded education than the one I got. What do you think?

[EDIT] To eliminate some confusion: I am not a Conservative any more than I am a Liberal. In fact, I am sure I would have felt the same way if I had attended a Conservative college. That was just an example from my experience.

[EDIT #2] I am not talking about a "liberal arts" degree. I am talking about the descriptive word for politics.