I grew up in a relatively conservative community and attended a relatively conservative college, so it's no surprise that I generally view the mainstream media as having a liberal bias. But today I ran across an interesting post claiming the opposite: that the mainstream media are really in the pocket of the ultraconservatives, who ask them not to investigate the reality of the American health care system.
But isn't the media really a business? Don't they just report what they think (in their well-researched opinions) their clientele want to hear? What I'm saying is: the mainstream media reflect what the market wants, which is what newswatchers want.
So if the mainstream media aren't reporting on something, doesn't that suggest the mainstream doesn't care about it?