I've been seeing this more and more in online discussions frequented by American Internet users:
- Original Poster: "The selfish behavior of large companies is harming society."
- First Reply: "A company has no responsibilities or obligations toward society. Its sole responsibility is to make money for its shareholders."
This exact line comes up so often that I'm starting to assume:
- Many Americans are indoctrinated in various ways to think like this
- And an army of paid trolls is posting this specific line over and over everywhere that matters
I personally find the concept of companies not having any responsibilities whatsoever toward greater society very, very strange.
So individuals in society are expected to act responsibly, but a company, which is nothing more than a group of people working toward a shared goal, somehow has no responsibilities other than making profit for "shareholders"?
Does anybody else think that this is a very, very strange and destructive concept?