Workism Is Making Americans Miserable
Highlights
What is workism? It is the belief that work is not only necessary to economic production but also the centerpiece of one's identity and life's purpose, and the belief that any policy to promote human welfare must always encourage more work.