Dr. Sandra Trappen


What is Critical Thinking?


The term “critical thinking” is probably one of the most clichéd terms in higher education today (“rigor” runs a close second). Given this, there’s a pretty good chance that every time you walk into a new classroom, your sage professor will at some point emphasize how one of the goals of their class is to “help students engage in critical thinking” (commence eye rolls). Unfortunately, the meaning of words erodes when they are used this much.

No doubt, this is frustrating for students, who I imagine have buzzword fatigue. School administrators and ed tech companies in particular love to hype critical thinking. And while there is ample evidence that employers find critical thinking skills desirable, no similar evidence indicates that colleges and universities are delivering on their promise to teach them.

But of course, none of this is surprising when you take into account the history and evolution of public education in the United States. As it turns out, public schools continue to excel at what they were originally designed to do — they train obedient workers (not thinkers). The comedian George Carlin has a particularly famous riff on this (albeit a very explicit one), which you can check out on your own time on YouTube.

While education holds out great promise as an engine of social mobility in the United States, it remains deeply embedded in what sociologists refer to as the “social reproduction” of class privilege. This is why, despite much evidence attesting to the ability of education to combat social inequality, there is competing evidence suggesting it can also reproduce and deepen pre-existing social inequalities.

One development that has contributed to the decline of education in the U.S. in particular is the emphasis on testing and evaluation. This approach is far more rigidly ingrained in public schools, which have less autonomy than their private school counterparts to determine curriculum. Moreover, when testing does occur, its content and rigor are weighted heavily in favor of the middle classes and their offspring. The result is that the entire system of “objective testing” is essentially rigged against working-class and poor students (for complex reasons that are too lengthy to discuss here).

In the interest of staying on point, let’s just say that middle- and upper-class families have resources that help prepare their kids for exams in ways that less advantaged families cannot. Their kids benefit from access to private high schools, and they have the money to pay for tutors and extracurricular activities, all of which support access and admittance to higher education institutions. They are also more likely to benefit from private school curricula that are more comprehensive and flexible (not narrowly focused on testing), making it easier to teach those students critical thinking and leadership skills.

As a college professor, I work on the front lines, where I am a witness to what our education system produces. I have seen changes occur over the years that trouble me. For one, I notice that an increasing number of high school graduates who enter my classroom exhibit difficulty expressing themselves clearly in speech and in writing. They similarly struggle when they attempt to execute what appear to me to be basic intellectual tasks (e.g., independently reading a syllabus). Many of these same students could not pass a basic argument literacy test.

Even more concerning, relatively few students can distinguish fact from opinion; they can’t tell you how science differs from non-science, or how natural science differs from social science, and so on down the line. “Fake News” barely scratches the surface in describing the problems that exist with media and information literacy in our present-day social landscape.

Put in different terms, critical thinking – defined more broadly as abstract, reasoned thinking – has become an unfortunate casualty of the era of standardized testing. Testing regimes privilege memorizing facts and learning how to play word analogy games as a way to measure and assess “knowledge.”

What if I were to tell you that real life is not going to present you with options that are consistent with the choices presented on standardized tests? What if you took a class with me and I told you that I wasn’t going to test and grade your memorization skills? The latter should feel liberating, but students often feel intimidated because they haven’t been taught to think in a rational, disciplined way.

Some of you, I hope, are nodding your heads in agreement, having already experienced the utter pointlessness of the test-taking trap. I imagine you have always sensed this. You knew something wasn’t right. But what could you have done to resist? Sadly, not too much.

When people are forced to forgo more substantive approaches to education in order to play intellectual word games, they will at some point be left out in the cold. Their desire to learn will become blunted, and their learning potential will not be fully realized; they won’t be able to develop the thinking “muscles” required for abstract reasoning (or they will become easily exhausted when they try).

Absent critical thinking skills, students will be ill-equipped to question and challenge the status quo. They may even give up on the idea of education altogether and quit as soon as they are able to work. Sneaky, yes? This is how obedience and conformity are learned, as resistance appears to be futile.

Rather than engaging in a fully developed critical analysis (which they most likely have not been taught how to make), students compensate by investing time and effort into trying to figure out the “correct” answer to a given problem. This has been, in their experience, the tried and true path to make the grade and to be rewarded and recognized as a “good student.”


Another unfortunate result of the standardized approach to learning is that it encourages people to assign too much importance to their own thinking. Because their critical thinking skills are weak or undeveloped, they sometimes cannot (or will not) try to see problems objectively, preferring instead to rely on their own subjective “experiences.”

They trust their “gut” feelings, and only what they’ve seen with their own eyes, because their reasoning skills are too poorly developed to do otherwise.

Given this, many students find it difficult to execute advanced learning tasks that require them to perform analysis/synthesis of information, as opposed to performing memorization and recall (they prefer the latter because they know how to do it).

All of this, for obvious reasons, can produce frustration. And as I mentioned already, it causes many young people to become cynical and give up pursuing education. Disinterested and discouraged, they are less likely to pursue advanced learning (like higher education). They might feel they are not smart enough, and they may even harbor a residual resentment toward so-called “experts.” They may go so far as to cultivate a preference for media sources that tell them not to value expert knowledge.

That being said, I would be remiss if I didn’t call attention to the fact that everything I just described remains the students’ problem to solve (which seems unfair, right?). In light of this, we have to do some work together to address where we go from here. But first, let’s examine some of the specific pitfalls that get in the way of critical thinking. The path is a bit long and winding. Along the way, we’ll look at some famous sociologists who called attention to these problems and proposed solutions.

Binary Thinking

Binary thinking describes an approach to problem conceptualization that reduces problems to two competing sides in order to arrive at a simple truth (e.g., right vs. wrong, left vs. right, liberal vs. conservative, pros vs. cons, good vs. evil). This framework tries to impose order and control on problems that are, more often than not, complex, nuanced, and dynamic.

A derivative of binary thinking is what has come to be known in our contemporary moment as “both sides” journalism. This is the favored thought paradigm of the television era, and the best examples of it can be found on 24-hour cable news programming. The stars of these shows are pundits and talking heads who take two sides of an issue and get into heated arguments with each other. Conflict is the “spice” that enlivens entertainment presented as information.

This development in our media landscape has been exacerbated by the proliferation of online news and political opinion outlets, including social media. As one study put it:

“this raises concerns anew about the vulnerability of democratic societies to fake news and other forms of misinformation. The shift of news consumption to online and social media platforms has disrupted traditional business models of journalism, causing many news outlets to shrink or close, while others struggle to adapt to new market realities. Longstanding media institutions have been weakened. Meanwhile, new channels of distribution have been developing faster than our abilities to understand or stabilize them” (Baum et al. 2017).

It further calls into question the notion that American journalism operates on the principle of objectivity. In the contemporary era, we have seen a noticeable shift: journalists who once endeavored to be dispassionate about the subjects they cover now operate like referees in a cage fight.

In taking pains not to take sides, they “both sides” every topic, contriving a match-up of two distinctly opposed positions on a given issue. But here is where the danger lies: in doing so, they imply that each side carries equal weight (false equivalence). That they do this even in cases where one side represents an extreme or fringe view (or simply a view the evidence disputes) is plainly absurd. The result is that people with an appetite for conflict, dogma, and ideological thinking are left vulnerable to manipulation.

Illustration:

Do you believe in one or both sides of the flat vs. round earth debate?

Do you believe that there are two sides to every problem?

Do you believe that both sides have their own facts?

Again, while all of this might seem pointless, if not ridiculous, to many people, there are many in our society for whom this kind of thinking makes perfect sense. A large number of Americans have been ill-served by our corporate media, whose approach to discussing major social issues and problems is either so simplistic that it serves no particular purpose, or so defeatist that it portrays problems as too difficult to solve.

The “all problems have sides” approach remains particularly problematic because it is intellectually dishonest. Even worse, this framing has become normalized in our public discourse. Juxtaposing a liar opposite an ethical professional, as if the two represent legitimate sides in a debate, is a farce.

The Baum study authors conclude that the cognitive, social, and institutional constructs of misinformation are complex and that we must be vigilant and seek input from a variety of academic disciplines to solve the problem, as the “current social media systems provide a fertile ground for the spread of misinformation that is particularly dangerous for political debate in a democratic society” (Baum et al. 2017).

Jay Rosen, a journalism professor at New York University, points out: “The whole doctrine of objectivity in journalism has become part of the [media’s] problem” (Sullivan, 2017). When journalists use binary constructs to create structural and moral false equivalency, they become complicit agents, who undermine truth and understanding. All of this is a predictable outcome, given that understanding was never the goal here – manufacturing conflict is the aim of the game. These news programs don’t exist to inform you; they’re here to entertain you and sell advertising.

A better and infinitely more responsible approach would be to report things fairly, accurately, and comprehensively. Journalists might further acknowledge their biases upfront instead of trying to exist in a magical bias-free zone.

One of the problems we are left with, even when we aim to do better, is that sharing factual data with the public does not always change strongly felt but erroneous views that conflict with people’s personal beliefs and feelings (facts being immaterial).


“I feel like” and Using the Personal Pronoun “I”

Ever notice how often, when a problem comes up for conversation or discussion, instead of addressing it objectively people respond by saying something like “yes, but I feel like…” and proceed to relate the problem to their own personal experiences? Notice how those same people defend their feelings much harder than they defend facts?

Lots of people do this. It’s a very subtle and often unchecked form of narcissism that lets you divert attention away from an issue that may be controversial (or that you simply don’t know much about) and default to the subject you are most comfortable discussing – yourself! To some extent, this is to be expected, considering how today’s students have come of age in a time of growing diversity and political polarization. No one could blame them for wanting to back away from confrontation (Worthen, 2016).

In North American English, “I feel like” and “I believe” have in the last decade come to stand in for “I think” (Worthen, 2016). Sociologist Charles Derber describes this tendency as “conversational narcissism.” Often subtle and unconscious, it betrays a desire to take over a conversation, to do most of the talking, and to turn the focus of the exchange to yourself (Headlee, 2017).

To illustrate, people say things like “I feel X about XYZ problem.” What would be better is for them to say something like: “I think, based on XYZ evidence, that the idea of X appears to be supported.”  In the case of the latter, the speaker “de-centers” themselves (emphasizing evidence and conclusions, not personal beliefs) and uses objectified scientific language.

Note: if you are taking my course in Research Methods, you will need to demonstrate to me that you can write using objectified scientific language.

Worthen points out that when we use language like “I feel like,” we are playing a trump card. When people cite feelings or personal experience, “you can’t really refute them with logic.” Why not? Because doing so is like saying they didn’t have that experience, or that their experience is not valid. “It halts an argument in its tracks” (Worthen, 2016).

The recourse to feelings is problematic because it limits what you can discuss and engage with to the narrow confines of your own personal experience. This is not solid ground upon which to build knowledge of a subject; subjective perspectives and personal “feelings” are not frameworks that facilitate the analysis of complex problems.

I write this with full knowledge of the fact that there are, of course, long-standing debates in the social sciences about objectivity and the role value judgments play and, more importantly, about whether value-free research is possible and even desirable. In his widely cited essay “‘Objectivity’ in Social Science and Social Policy,” Max Weber argued that the idea of an “aperspectival” social science was meaningless (Weber 1904a [1949]).

Nevertheless, in the spirit of compromise, I would argue that we are well served when we can simultaneously reflect on how our personal experiences/feelings connect to a problem we are trying to understand, even as we aspire to think about those problems critically. The ethic that guides us here should be to cultivate a degree of professional detachment from what we are studying, while also acknowledging our biases. Trust me when I say this – it’s hard. And that’s okay.

Political Thinking vs. Critical Thinking

With the advent of 24-hour cable news, young and old alike have become enthralled by the theatrical banter of entertainment news media, which thrives on conflict and encourages demagoguery as part of the process of discussing important social issues.

Political thinking, such as what we often see featured in our MSM (mainstream news media) can contribute to the cultivation of a closed mindset; one that is rigid, dogmatic, defensive, and most of all not critical (to be sure, there is plenty of criticizing – but that’s different from how I am using the term “critical”).

This process of relying on entertainment media to become “informed” has proven to be deeply satisfying for many people. Politics is, in many respects, a performance that is experienced like a “television show.” People have their favorite characters, as political news is both rendered and experienced like a “story.” That it sometimes “informs” is merely tangential to the process.

Some experts have speculated that our media have become addictive. This is because political parties and our MSM all traffic in audience-tested (focus-grouped) simple frameworks – sound bites – that operate like intellectual shortcuts. They encourage simple solutions to complex problems, giving people a feeling of control connected to a deeply held desire to believe they understand everything. It follows that people don’t have to think very hard. And guess what – that’s attractive to many people!

A perfect example of this is when a person accepts a political party’s full roster of beliefs, which is often coupled with a tendency to defend all of its policy positions. This is not at all unlike being a sports fan and rooting for a sports team. The ongoing contest between the Democrats and Republicans is in many ways similar to the Steelers and Ravens rivalry. Such thinking, oddly enough, shares common elements with religious belief (see Durkheim below). Again, you don’t have to think very hard about the issues. Rather, you simply need to identify with and support your team.

Not surprisingly, individuals who desire political affinity are attracted to people who hold the same views. As long as they stay within the confines of their social group, their views won’t be challenged, they can feel like they are “in the know,” and they almost always “get to be right.”

Identity-confirming behavior like this further generates strong feelings of belonging, which may, in some instances, constitute a vital aspect of a person’s self-concept. In this instance, confirming social identity is more important than exercising rationally informed, independent, critical thinking! Mind blown!

The affinities that drive identity politics are powerful precisely because they enable individuals to solidify their social group membership by identifying with issues and problems in ways that mirror party-line thinking. The attraction here – and I can’t emphasize this enough – is that political thinking helps people not feel alone; in sacrificing their independence to the group, they relieve themselves of the burden of thinking on their own.

One interesting issue I want to call attention to is the case where people are forced to confront information they cannot reconcile with their deeply held political beliefs. Psychologists who study this conflict call it cognitive dissonance.

Cognitive dissonance usually involves feelings of discomfort that most of us would prefer to avoid. To reconcile the mental discomfort and restore balance, people with strong political beliefs may actively resist and dismiss facts and information that conflict with their cultivated worldview.

Critical Thinking in the Social Sciences

C. Wright Mills & the Sociological Imagination

The American sociologist C. Wright Mills coined the term “sociological imagination” in his 1959 book of the same title to describe a type of critical insight offered by the discipline of sociology. The term is often used in introductory sociology textbooks to explain how sociology might help people cultivate a “habit of mind” with relevance to daily life; it stresses that individual problems are often rooted in aspects of society itself.

Mills intended the concept of the sociological imagination to describe “the awareness of the relationship between personal experience and the wider society.” More specifically, he intended the concept to help people distinguish between “personal troubles” and “public issues.” To this end, an individual might use this cultivated awareness to “think himself away” from the familiar routines of daily life. We will use this concept in our work together to think critically about social issues and problems.

Personal troubles are problems that affect individuals, where other members of society typically blame the individual’s own personal and moral failings for their lack of success. Examples include unemployment and job loss, eating disorders, divorce, and drug problems.

Public issues find their source in the social structure and culture of a society; they are problems that affect many people. Mills believed it is often the case that problems considered private troubles are perhaps better understood to be public issues.

Let’s take Mills’s example of unemployment. If only a couple of people were unemployed, we could perhaps explain their unemployment by saying they were lazy, lacked good work habits, and so on. That is to say, their unemployment would be their “personal trouble.” To be sure, some unemployed individuals are no doubt lazy or lack good work habits. Notwithstanding, when we find millions of people out of work, unemployment is better understood as a public issue. Which is to say, a structural explanation that looks at the lack of opportunity in the economy is better suited to explain why so many people might be out of work (Mills, 1959, p. 9).

By following Mills and developing a sociological imagination, people might develop a deep understanding of how one’s personal biography is the result of a historical process. Everyday experiences are connected to a larger social context.

Individual vs. the Social (Agency vs. Structure)

The root of a given problem, according to Mills, is almost always found in the structure of the society and the changes happening within it. Put another way, Mills is saying that many of the problems individuals confront in society have social roots. Moreover, the problems that people assume are theirs and theirs alone are, in reality, shared by many other people. This is why sociologists spend so much time trying to illustrate the sociological roots of problems: it helps people understand how their biography is linked to the structure and history of society.

So why do we bother with all of this? Well, for some of us (researchers included), it is because we want to empower individuals to transcend their day-to-day personal troubles, to see how they are public issues and in the process help facilitate social change.

Let’s look at a practical illustration. Take, for example, a person who can’t find a job, pay the mortgage, pay the rent, etc. These are problems that are typically (and sadly) often seen to be the result of a personal failing or weakness. The individual is thought to be the cause of their own problem due to some failure or error.

In the same manner, unemployment can be an extremely negative private experience. Feelings of personal failure are common when one loses a job. Unfortunately, when the unemployment rate climbs (any number above 6 percent is considered high in the United States), people often see it as the exclusive result of a character flaw or weakness, not the result of larger and more overwhelming structural forces.

Now, for the record, Mills is not arguing that individuals are never responsible for their own problems – that they don’t have the ability to make choices, even bad ones, or that they have no personal responsibility for their actions. It’s just that many of their decisions don’t occur in a void. They are socially structured.

The same holds true for people who commit crime. Rather than focusing on individual pathology alone, we should also look at the social and political context within which crime occurs. That is, we need to look at the role that structure plays in determining who commits a crime and who becomes a victim of crime.

Mills would ask: is there something within the structure of society that is contributing to the problem?

As it turns out, the answer is often YES! In many countries today, unemployment may be explained by the public issue of economic downturn (deindustrialization), caused by industry failures (mortgage, banking, manufacturing). In other words, the problem is social and institutional – not simply the result of the personal shortcomings of one person or a group of people not working hard.

Again, this is why it is important to distinguish that Mills is not saying people shouldn’t work hard. The sociological imagination should not be used as an excuse for an individual to not try harder to achieve success in life, or for people to not claim some measure of personal responsibility for their problems.

Rather, what Mills is saying is that in many situations a person may fail even if they try to do everything right, work hard, go to school, get a job, etc.

When it appears that many people or social groups in society lack the ability to achieve success, instead of being quick to assign blame, Mills says we should dive in and identify the structural roots of the problem, such as ineffective political solutions; racial, ethnic, and class-based discrimination; and the exploitation of labor.

We should all be reminded that there are problems that cannot be solved by individuals alone (by individuals working harder).

In light of this, it is important that we use our sociological imagination and apply it in our daily lives; in this way, we might be able to change our personal situations and, in the process, create a better society.

Emile Durkheim (note: this passage about Durkheim is reblogged here from an article by Galen Watts sourced below)

Globally, we are currently experiencing tremendous social and political turbulence. At the institutional level, liberal democracy faces the threat of rising authoritarianism and far-right extremism. At the local level, we seem to be living in an ever-increasing age of anxiety, engendered by precarious economic conditions and the gradual erosion of shared social norms. How might we navigate these difficult and disorienting times?

Emile Durkheim, one of the pioneers of the discipline of sociology, died just 101 years ago this month. Although few outside of social science departments know his name, his intellectual legacy has been integral to shaping modern thought about society. His work may provide us with some assistance in diagnosing the perennial problems associated with modernity.

Whenever commentators argue that a social problem is “structural” in nature, they are invoking Durkheim’s ideas. It was Durkheim who introduced the idea that society is composed not simply of a collection of individuals, but also social and cultural structures that impose themselves upon, and even shape, individual action and thought. In his book The Rules of the Sociological Method, he called these “social facts.”

A famous example of a social fact is found in Durkheim’s study, Suicide. In this book, Durkheim argues that the suicide rate of a country is not random, but rather reflects the degree of social cohesion within that society. He famously compares the suicide rate in Protestant and Catholic countries, concluding that the suicide rate in Protestant countries is higher because Protestantism encourages rugged individualism, while Catholicism fosters a form of collectivism.

What was so innovative about this theory is that it challenged long-standing assumptions about individual pathologies, which viewed these as mere byproducts of individual psychology.

Adapting this theory to the contemporary era, we can say, according to Durkheim, the rate of suicide or mental illness in modern societies cannot be explained by merely appealing to individual psychology, but must also take into account macro conditions such as a society’s culture and institutions.

In other words, if more and more people feel disconnected and alienated from each other, this reveals something crucial about the nature of society.

The Shift from Premodern to Modern

Born in France in 1858, the son of a rabbi, Durkheim grew up amid profound social change. The Industrial Revolution had drastically altered the social order and the Enlightenment had by this time thrown into doubt many once-taken-for-granted assumptions about human nature and religious (specifically Judeo-Christian) doctrine.

Durkheim foresaw that with the shift from premodern to modern society came, on the one hand, incredible emancipation of individual autonomy and productivity; while on the other, a radical erosion of social ties and rootedness.

An heir of the Enlightenment, Durkheim championed the liberation of individuals from religious dogmas, but he also feared that with their release from tradition individuals would fall into a state of anomie — a condition that is best thought of as “normlessness” — which he believed to be a core pathology of modern life.

For this reason, he spent his entire career trying to identify the bases of social solidarity in modernity; he was obsessed with reconciling the need for individual freedom and the need for community in liberal democracies.

In his mature years, Durkheim found what he believed to be a solution to this intractable problem: religion. But not “religion” as understood in the conventional sense. True to his sociological convictions, Durkheim came to understand religion as another social fact, that is, as a byproduct of social life. In his classic The Elementary Forms of Religious Life, he defined “religion” in the following way:

“A religion is a unified system of beliefs and practices relative to sacred things, that is to say, things set apart and forbidden — beliefs and practices which unite into one single moral community called a Church, all those who adhere to them.”

The Sacred and the Quest for Solidarity

For Durkheim, religion is endemic to social life because it is a necessary feature of all moral communities. The key term here is sacred. By sacred, Durkheim meant something unquestionable, taken-for-granted, and binding, something that emits a special aura. Wherever you find the sacred, thought Durkheim, there you have religion.

There is a sense in which this way of thinking has become entirely commonplace. When people describe, say, European soccer fans as religious in their devotion to their home team, they are drawing on a Durkheimian conception of religion. They are signaling the fact that fans of this nature are intensely devoted to their teams — so devoted, we might say, that the team itself, along with its associated symbols, are considered sacred.

We can think of plenty of other contemporary examples: one’s relationship with one’s child or life partner may be sacred, some artists view art itself — or at least the creation of it — as sacred, and environmentalists often champion the sacrality of the natural world.

The sacred is a necessary feature of social life because it is what enables individuals to bond with one another. Through devotion to a particular sacred form, we become tied to one another in a deep and meaningful way.

This is not to say that the sacred is always a good thing. We find the sacred among hate groups, terrorist factions, and revanchist political movements. Nationalism in its many guises always entails a particular conception of the sacred, be it ethnic or civic.

But, at the same time, the sacred lies at the heart of all progressive movements. Just think of the civil rights, feminist and gay liberation movements, all of which sacralized the liberal ideals of human rights and moral equality. Social progress is impossible without a shared conception of the sacred.

Durkheim’s profound insight was that despite the negative risks associated with the sacred, humans cannot live without it. He asserted that a lack of social solidarity within society would not only lead individuals to experience anomie and alienation but might also encourage them to engage in extremist politics. Why? Because extremist politics would satiate their desperate desire to belong.

Thus we can sum up the great dilemma of liberal modernity in the following way: how do we construct a shared conception of the sacred that will bind us together for the common good, without falling prey to the potential for violence and exclusion inherent to the sacred itself?

This question, which preoccupied Durkheim throughout his entire life, remains as urgent today as ever before.

~ end blog post

Contrarianism is a Helluva drug – The Devil’s Advocate 

Playing the “Devil’s Advocate” (DA) is not a very good way to exercise critical thinking. That is because this particular approach to debate is premised on the idea that every intellectual idea can be explored through a discussion of its opposite.

One tell you will notice is that the DA often advocates for viewpoints that have already been discredited, or ones they are not prepared to offer critical evidence to support (…and I might add here, often ones that they personally believe but are embarrassed to admit as much).

This is a Game

The devil who proposes a game of “Devil’s Advocate” in a debate is actively circumventing and undermining critical thinking under the guise of being reasonable; they’re playing a game, and the goal of the game is to “win.”

People who play DA are not helping to advance a discussion of a problem or situation. Even though they may profess an interest in political neutrality by evaluating “both sides” of an argument, what they are doing with this pedagogical approach is indulging in binary thinking. Playing a game of “point-counterpoint,” the DA aims to limit perspective (confining it to the two poles of the binary) even as they pretend to open it up. And as has already been pointed out, real problems that are worth solving are generally more complicated than “two sides.”

I will concede one point here, however: the Devil’s Advocate (DA) approach can be a legitimate debating tool if it helps integrate a perspective that is not being considered, where the aim is to get someone to consider new information. In this respect, the DA can be useful when the goal is to overcome ideas perceived to be one-sided and biased (Fabello, 2015).


Righteous dude, Elon Musk, hits a blunt with Joe Rogan, while workers in his Tesla plant are subject to drug testing. Like Hegel before him, he is smart but he’s a walking contradiction (no one likes him either). Don’t be Elon.

Where groups of students are concerned, debate games are admittedly a good way to learn about a topic. This approach can help “juice up” classroom dynamics beyond what might be accomplished by a traditional lecture, as it gives students an opportunity to compete, inform, and have fun with each other.

Unfortunately, in real life, when the Devil’s Advocate shows up to argue, it doesn’t work out like this. The debate tends to play itself out and quickly becomes tiring for those relegated to the role of witness.

The Dialectics of the DA

The DA is a master of contradiction. Unfortunately, constant contrarianism does little to foster understanding of a problem. In a classroom it can be downright annoying for those forced to listen on the sidelines.

When the DA joins the debate, their true objective is to provoke conflict. Mistaking antagonism for skepticism and critical inquiry, the DA is often the consummate polished bully.

DAs like zero-sum combat. Nothing delights them more than dragging a room full of spectators through seemingly endless, fruitless, and circular discussions, which ultimately never serve to help anyone change their point of view.

That’s why this approach cannot be dismissed as only a tiresome debate tactic; it’s far more than that – it’s a malignant thought paradigm that aggressively works to shut down critical thinking.

Reducing problems to two sides – the formula for a classic conflict paradigm – is a Hegelian exercise in futility; it’s designed to wear you out. As a result, you may be tempted to give up on critical thinking and side with the bully in the room, if only to avoid sheer exhaustion!


The Devil in Disguise

A typical DA positions himself as a well-meaning “honest broker” who merely wants to provide us with “the other side of the story.” DA thinking goes something like this: by arguing from an opposite, contrarian perspective, I will contribute a much-needed missing perspective and help achieve a new level of understanding.

Efforts are made to signal neutrality when discussing the topic at hand. Convinced of their own earnestness, the DA will often emphasize how they want to intelligently and rationally debate a topic (even if they have zero experience with said topic). Sounds legit. But hold on. There’s more.

In some instances, a DA may go so far as to politely assert that viewpoints other than their own are not wrong; they are just uninformed or misguided. No doubt, they are convinced they are flexing their critical thinking muscles when debating this way, as they attempt to wrestle people to the mat and bring them around to accepting their simple truths. Sadly, this debate tactic screams bush league; it’s high school-level debate. Hint: this is why no one ever likes the devil’s advocate.

In reality, DAs advocate for, and even perform to some extent, a combination of the following:

1) They advocate popular/conventional “status quo” viewpoints.

2) They advocate polarized “contrarian” thinking.

3) They accuse others who don’t espouse their point of view of being biased.

4) They aim to shut down conversation and discourse – not add to it.

5) They aim to “mainstream” retrograde (out of favor) philosophies.

When I encounter the devil’s advocate, more often than not I find someone who is actively trying to interrupt and, in some cases, dominate my presentation of a problem/idea. But it’s not the fact that they asked me to consider new information that I find to be a problem; it’s their assumption (conceit) that I did not already consider alternative ideas before making my presentation. It’s as if they are telling me they don’t trust my ability to evaluate research and think critically (Fabello, 2015).

Gender & the DA

Shutting down conversation and discourse with debate tactics does not add to learning; it subtracts from learning and that’s oppressive. Nevertheless, the DA is not one who gives up easily.

As is typical with DAs, they will sometimes go so far as to accuse their debate partner of having a personal bias (even though it never seems to occur to them that they may be biased as well). This happens a lot. Sadly, more often than not, it happens with men in particular [sadder still is the fact that men talking down to women remains a common classroom occurrence].

Journalist Melissa Fabello explains: “men are used to living in a world that affirms and validates their experience as ‘the way things are,’ and they are almost never asked to consider those biases.” When they accuse others of being a victim of their own subjectivity, they do so while proclaiming that they are, in fact, the one demonstrating critical thinking (Fabello, 2015).

What the devil’s advocate offers, more often than not, are feelings and personal opinions stated with confidence, which are disguised as reasoned, conventional, contrarian positions. In a final act of projection, they tend to accuse the very people who are practicing critical thinking of offering personal opinions.

Unable to engage in problem-solving based on the actual substance of research and evidence, the DA may resort (at their laziest) to suggesting that the other person’s position or argument simply be dismissed. In keeping with this, they may go so far as to suggest we consider the radical position that we do nothing at all to solve the problem.

It’s Easy to be a Devil’s Advocate

To put it simply, a DA is a performance artist; one who puts a lot of effort into telling people they are wrong. They are so obsessed with being rational that they consistently mistake their own feelings for objective logic, reasoning that simply believing in rationality makes their feelings magically rational; thus, their logic system remains a closed circle – and they always get to be right!

Keep in mind that it’s easy to play the role of the Devil’s Advocate. It’s easy to be a contrarian and say “let’s go to opposite land and look at the opposite of what you proposed.” You know what’s hard? Solving a problem. It’s far more difficult to say “yes, we may not be doing it right, so let’s try to think through some different approaches that will allow us to look at this problem from different perspectives and maybe do something new.”

Just to be clear, I want to state for the record that debate and dissent are useful, valuable, and indispensable to the pursuit of knowledge; they can help sharpen and sculpt our efforts to achieve knowledge and understanding. Unfortunately, many DA-type contrarians are not interested in these things.

Understanding different points of view, respecting people for those views, and still having the courage to advance your views based on evidence is the goal of rationally informed critical inquiry. The best of us study for years in order to learn about the efforts and approaches others have tried in the service of solving problems. Standing on the shoulders of giants, we look for ways to make a contribution to the research – that means we must do some heavy lifting first, where we evaluate the best ideas and research methods and subsequently devise a new plan to move the research forward. Success here is often achieved in small increments.

In the end, we may or may not make a significant contribution. But we try. Even if we are only one step closer to solving a problem that’s still something to be proud of. The point is that thinking critically about the different steps in the process, collecting the best data and evidence, and even revising our approach along the way is more important than winning a debate or scoring points in the court of public opinion.

Remember that knowledge isn’t a one-way street. The more you treat knowledge acquisition as a competition, the more annoying you’re going to be as a person. You’ll find you won’t gain more wisdom. You won’t influence anyone with your ideas…and you probably won’t make any friends.

Being smart isn’t about gamesmanship and proving someone wrong; sometimes it’s about fostering agreement through disagreement while getting someone to see your side of it. Wisdom is ultimately less about pride and being correct and more about having empathy and building bridges to knowledge.

What is the Solution?

There are no simple answers. We must begin the process of rational inquiry by putting the problem itself at the center of our efforts. By taking ourselves out of the equation and attempting to begin from neutral ground (some people argue all research is ME-search), we can then set about the task of asking a critically informed question.

At this point, assuming there is intellectual curiosity, I hope we can move forward together to discover:

1) what experts have to say about problems; and

2) conceive of a plan (a research design) that can support the gathering of new data to advance understanding and suggest a solution to the problem.

Rinse and repeat as necessary.

Case Dismissed

It cannot be emphasized enough that this debating strategy is not driven by a desire to engage in rational inquiry to advance discourse and knowledge, even if it appears to do that. At its best, it’s an amusing parlor game. At its worst, it betrays disrespect.

Summary

Critical thinking is more than a “buzz” term. It is essential to sound decision-making and to functioning in modern, complex societies. The pitfalls and barriers to critical thinking discussed here are important to distinguish not only because they are incredibly common, but because they operate as structures that govern how people understand the world around them – how they interpret their feelings and achieve agreement reality about important social issues and problems. These thought paradigms shape how people develop logic systems and whether or not they can think critically; they have a profound impact on everything you know and what you think it may be possible to know.

Finally, don’t avoid dissent. Don’t always seek out people who agree with you. Try to prove yourself wrong. Disagree. Debate. Admit mistakes. But be respectful.  Carve out time and space to consciously reflect. And never forget – if you are the smartest person in the room, you are in the wrong room!

Sources

“Pioneering Sociologist Foresaw Our Current Chaos 100 Years Ago,” by Galen Watts, 2018.

“4 Things Men Are Really Doing When They ‘Play Devil’s Advocate,’” by Melissa Fabello, 2015.

“Why We Should All Stop Saying ‘I Know Exactly How You Feel,’” by Celeste Headlee, 2017.

“Stop Saying ‘I Feel Like,’” by Molly Worthen, 2016.

“Max Weber and Objectivity in Social Science.”

“This Week Should Put the Nail in the Coffin for ‘Both Sides’ Journalism,” by Margaret Sullivan, 2017.

Discussion

What do you do when you have a certain understanding of what you know as Truth and others do not?

What do you do when others are operating under a belief system gained from what others have told them instead of basing their understanding on the knowledge of experts and/or research? Do you challenge them? Or do you “agree to disagree?” [Hint: do not concern yourself if others believe differently. Model what you are coming to know by means of accessing good sources of knowledge and information.]

When you were in high school did you feel prepared to take your exams? Did you do well on your exams? If you did not do well, how did this impact your interest in higher education and your assessment of whether or not you were “smart enough” to do well when you enrolled in college? 

How comfortable are you when it comes to discussing sensitive topics in the classroom? Do you ever feel intimidated by professors or classmates?

What would it take for you to feel more comfortable engaging in “difficult” types of conversations?

Course: Classical Social Theory, Criminology, Race & Ethnicity, Social Problems

The War on the Poor

172 Comments


Most Americans who have lived a middle-class life have no idea how people living in poverty negotiate their lives. There is a tendency to think of “the poor” in the abstract – a group of unwashed masses who seem to always live in inner cities or trailer parks. When poor people are thought of as people at all, they are often referred to using terms like “slackers,” “losers,” “drop-outs,” “takers” – the people who “just don’t want to work.”

Even when they do work, they don’t work the “right” jobs. They work in what are thought to be “unskilled” traditional “teenager jobs” – retail sales clerks, gas station attendants, McDonald’s counter work, etc.  And so the thinking goes, these are not the kind of jobs that serious people apply themselves to. People who have serious intentions to one day raise a family graduate from these jobs: they go to college, aspire to better jobs, and move up in the world. If poor people could simply grasp this common wisdom, they wouldn’t be doing what they’re doing AND they wouldn’t be poor. How many times have you thought this?

What you think about poor people and whether or not you believe we should help them probably boils down to your take on a single question: why don’t poor people work? As it turns out, there’s research on that!

But before we get into the research, let’s take a moment to consider how much one’s personal beliefs in this regard are influenced by one’s social ecology (family, friends, and the neighborhoods where one grew up) and personal experiences. These traditional sources of knowledge are handed down and form a repository of what we call “common wisdom.” This kind of thinking is the opposite of critically reasoned, empirically informed, rational thinking. But it feels good, because you get to share the views of everyone around you – confirmation bias – and so why change?

Research shows that the kinds of things your friends and family say about poor people are the primary sources of influence when it comes to what you believe about poor people – far more than any knowledge based on research and data. When “beliefs” and “facts” don’t line up very well, the result is “cognitive dissonance.” In the case of politicians and policy makers, rather than do the heavy lifting required to explain contradictions and misunderstandings, they often take the easy way out (they go with which way the political wind is blowing). By validating constituent beliefs (even when those beliefs are misinformed and/or wrong), people feel gratified and vote accordingly. Ultimately, the public is not well served, because legislators aren’t fixing social problems. Instead, they’re playing a game of crass politics calibrated to “what will get me re-elected?”

In addition to the influence of friends and family, what you think about “the poors” is probably also shaped by your political ideology. It has been well documented that people who identify as liberal will more often attribute poverty to social factors, like discrimination, whereas those who identify as socially conservative will more often point to the (bad) individual choices people make and their subsequent failure to assume personal responsibility for their fate.

Political party ID also has a major impact on how people understand unemployment, welfare benefits, food stamps, etc. People tend to argue that short-term unemployment, particularly as it impacts middle-class families, naturally fluctuates with the job market. Long-term unemployment, however, is thought to be more influenced by cultural forces and not the market – a “culture of poverty” associated with bad behavior. This sentiment reflects a simmering resentment that the poor could work if they wanted to, but that a culture of sloth combined with a generous social safety net coddles them and acts as a disincentive to work. Put another way, political conservatives argue that government programs designed to alleviate poverty are doing the opposite: they encourage poor people not to work.

A key data point cited to support this belief in individual sloth comes from the Census. Every year, the Census Bureau asks unemployed Americans why they’re not working. And traditionally, only a small percentage of poor adults say it’s because they can’t find employment. Census figures, however, can be interpreted in a number of different ways. Taking the figures at face value, a few lessons are in order.

First, it is helpful to think of poverty and unemployment not only as a function of individual behavior, but also as a function of history and social context.


The recession in 2008 changed poverty in many respects. From that point forward, more of the non-working poor have reported being unable to find a job than at any point in the past two decades. Of course, unemployment rates fluctuate, so this shouldn’t be much of a surprise. What is interesting is that when researchers took a closer look, they found that many of the poor who “choose” not to work aren’t necessarily doing so out of laziness (the stereotype). Rather, it is due to other personal obligations: they’re caring for relatives, they’re ill, or they’re trying to make their way through school.

Poor people also tend to work jobs that are physically taxing – jobs that require a lot of lifting and standing. This puts them at higher risk of suffering disabling injuries. Chronic pain and reliance on pain medication open the door to addiction for many of them, leaving them in a progressively worse situation.

The research also notes there are big differences with regard to gender. Women are more likely than men to cite family reasons for not working; men are more likely to cite their inability to find a job (Weissman).

Setting aside those who don’t work for a moment, let’s take a look at the people who are working. Most of the poor who can work are working. The problem is that wages remain low. Fifty-seven percent of families below the poverty line in the U.S. are working families – families working at jobs that simply don’t pay enough to support them.

Ask yourself: is it okay to pay people poverty wages (wages that can’t feed or support their families) when they are doing work that is socially necessary? Is service work the new “plantation” of the 21st century?

So it’s not that the poor are content to work “teenager jobs” or that they’re lazy; they often do back-breaking and thankless work, and many work more than one job. Keep in mind that many of these low-paid “unskilled” jobs fulfill important social needs. Caregiving and restaurant work are very much in demand from U.S. consumers. The real problem is that wages are too low.

Researchers have recently started to focus their efforts on understanding poor working people and the kind of work they do. This is what they have discovered: occupationally, poor employed people tend to be childcare workers, home health care workers, janitors, house cleaners, lawn-service workers, bus drivers, hospital aides, waitresses, nursing home employees, security guards, cafeteria workers, and retail cashiers.

Sadly, not only do we expect them to do these incredibly important jobs, we expect them to also live in poverty while they do it. And we scream bloody murder if they don’t want to do the jobs that we all agree are thankless, difficult, and low-paid.

It should not need to be said (but it does) – these are not stupid, lazy people performing unimportant jobs. These are, in fact, precisely the kinds of jobs that help make society work for everyone else; they enable other people to go to their jobs and earn a living. Nonetheless, people who perform this kind of work are summarily written off as socially unworthy, shiftless, lazy, and even stupid for making a “bad” career choice.


History

For more than 30 years, politicians in the United States have worked to systematically undermine the poorest Americans. How did they do it? They prioritized tax cuts for the wealthy and engaged in unchecked military spending, all the while convincing average Americans (who don’t benefit from the tax cuts and military spending) that their way of life was in danger if these things were not accomplished. Both of the major political parties in the U.S. have done this to different degrees, with some more than others engaging in fear-mongering and the provoking of racial antipathy to get it done.

Put another way, they discovered that the way to get Americans to support their agenda was to push the narrative that the economy was being hurt by welfare slackers, unrestrained criminality, teen pregnancy, affirmative action recipients, and “illegal alien” criminal immigrants.

This narrative script still holds sway over our current politics – now more than ever, as even middle-class Americans are feeling the pressure. The mindset of middle-class people tends to be aspirationally focused: they look up the economic ladder and aim to find common cause with wealthy Americans, whose ranks they hope to join if they “work real hard.” They seek proximity to wealth through their jobs and their consumption habits. Part of doing so means they must also work to socially distance themselves from the poor – this helps them demonstrate, by way of belief and personal practice, that “I am not like them.”

In our recent history, attacks on former President Barack Obama fit seamlessly into these developments. He was accused of not being American through a politics of fear that tries to generate suspicion of “enemy” others who are “not like us.” In the end, even Obama capitulated in many respects, advocating for social and monetary policies that benefitted banks more than they helped poor people.

Consequently, any attempts to fix social problems like poverty and social welfare programs through responsible governance (i.e. effective policy making) have been largely defeated by politicians who employ cynical politics to manipulate voters and by the voters themselves who have allowed their emotions to be captured by this process.

The Underclass 

Academic poverty studies are prolific. Columbia University sociologist Herb Gans argued in his 1995 book “The War Against the Poor” that the label “underclass” – a term applied to a variety of people, including the working poor, welfare recipients, teenage mothers, drug addicts, and the homeless – reduced members of these groups to a single condemned “untouchable” class. As a result, poor people became feared and despised by the rest of society.

This label has proved to be powerful and long-lasting. Perhaps the most pernicious effect is how it effectively transformed the individual’s experience of being in poverty into what academics have referred to as a spoiled identity – an identity marked by personal failing and, in particular, a failure to make good choices in life.

On the policy front, all social welfare policies, but especially those popularly referred to as “food stamps,” have been effectively stigmatized and rendered highly controversial in the United States.

The entrenchment of negative stereotypes about poor people has helped political contrarians, among others, to call into question the legacy of FDR’s New Deal and LBJ’s “Great Society” programs, along with other civil rights era policies formulated during the 1960’s and 1970’s. This has been greatly aided by the efforts of the political pundit class (big media mouthpieces paid to carry water for the wealthy who pay them) to criticize “welfare entitlements.” In recent years, the attacks have become vitriolic and relentless.

In what amounts to a piling-on effect, the efforts of politicians and the traditional media have been amplified a thousand times by the echo chamber of digital and social media. In social media, the ultimate currency is the “click.” In light of this, social media platforms have become notorious for click-bait images that provoke outrage. The promotion of negative cultural stereotypes has become valuable to efforts to drive clicks and thus revenue. Social media effectively monetizes anger and outrage.

Reagan’s Welfare Queen

There is no more significant political figure in this history than Ronald Reagan, who in the late 1970’s, during a period of significant economic adjustment, restructuring, and de-industrialization in the United States, managed to divert people’s attention away from larger macroeconomic issues by exploiting white working class fears about the expansion of social welfare benefits. For local reference, this is the same period in time during which many of the steel mills in Pittsburgh began to falter and good jobs started to become scarce.


The “Welfare Queen” of Reagan’s speeches was intended to provoke outrage; it was an affront to the political philosophy of personal responsibility and rugged individualism espoused by “hard working people” (racially coded white people), who were experiencing considerable economic pain as a direct result of his administration’s economic policies.

Reagan relied on what social psychologists refer to as “narrative scripts.” His 1976 campaign trail stump speech included the story of a woman from Chicago’s South Side arrested for welfare fraud. According to Reagan:

“She has 80 names, 30 addresses, 12 Social Security cards and is collecting veteran’s benefits on four non-existing deceased husbands. And she is collecting Social Security on her cards. She’s got Medicaid, getting food stamps, and she is collecting welfare under each of her names.”

The storyline proved to be extraordinarily effective for its ability to tap into entrenched race, class, and gender stereotypes dating back to the American Civil War about African American women (uncontrolled sexual appetites) and African-American work ethic (laziness).


Three falsehoods emerge from the “Welfare Queen” narrative:

1) that most people living in poverty are women (in reality, most are children and the elderly);

2) that most people on public assistance are urban (in reality, most are rural);

3) that most people on public assistance are black (in reality, most are white).

None of these narratives hold up to research scrutiny, yet they prevail today and are perhaps as powerful as ever.

Who Is on Welfare?

Though many are shocked by this, whites are by far the biggest beneficiaries of government safety-net programs like Temporary Assistance for Needy Families (TANF), commonly referred to simply as “welfare.” If we break this down further, the largest sub-group of people presently living in poverty is white children, followed by the elderly. And while black women represent more than one-third of the total number of women on welfare, data show they account for only ten percent of the total number of welfare recipients.

The Temporary Assistance for Needy Families (TANF) program is the program most frequently called welfare; it was created in the famous “welfare reform act” of 1996. As a result of that reform, the program that exists today is much smaller than its predecessor, Aid to Families with Dependent Children (AFDC), serving only 2.7 million people in 2016.

Of course, many of these facts run counter to what many people have come to believe as social fact. As a new HuffPost/YouGov survey shows, the American public has wildly distorted views about which groups are the largest beneficiaries of government programs.

Fifty-nine percent of Americans say either that most welfare recipients are black, or that welfare recipiency is about the same among black and white people. Only 21 percent correctly said there are more white than black food stamp recipients.

Additionally, when survey respondents were asked to estimate who receives welfare, they provided answers that tracked closely with their estimates of who also gets food stamps. Elizabeth Lower-Basch, a senior analyst with the Center for Law and Social Policy, says that people significantly overestimate the number of black Americans benefiting from the largest programs: “It’s not surprising because we all know people’s images of public benefits are driven by stereotype” (Delaney and Edwards-Levy, 2018).

In a similar fashion, the urban vs. rural dichotomy is also not sustained. Rural people receive benefits in far greater numbers than urban people. And here again, most rural people receiving public assistance are white. These social dynamics are visually rendered in both the map and table below. The map below shows the geographic distribution of food stamp recipients in the United States (dark shaded states have more people collecting benefits); the table under the map illustrates the racial identification of food stamp (SNAP) recipients.

Let’s use a sociological intersectional analysis to look at how race, social class, region, and education all work together and tell us something about who is on welfare.

[Map: geographic distribution of food stamp (SNAP) recipients by state]

[Table: racial identification of food stamp (SNAP) recipients]

Black is Shorthand for Poor

What does poverty look like in America? Judging by how it’s portrayed in the media, it looks black.

That’s the conclusion of a new study by Bas W. van Doorn, a professor of political science at the College of Wooster, in Ohio, which examined 474 stories about poverty published in Time, Newsweek, and U.S. News and World Report between 1992 and 2010. In the images that ran alongside those stories in print, black people were overrepresented, appearing in a little more than half of the images, even though they made up only a quarter of people below the poverty line during that time span. Hispanic people, who account for 23 percent of America’s poor, were significantly under-represented in the images, appearing in 13.7 percent of them (Pinsker).

According to the USDA's report for fiscal year 2013, 40% of aid recipients are white and 26% are black. And while the food stamp program has one of the lowest rates of abuse of any welfare program, many people find it easy to buy into the misconception that "lazy blacks" account for the fraud woes of government assistance. Here again, the data above refute these narratives and common stereotypes.

White on Welfare

How many people – be honest now – are surprised to learn that the biggest recipients of federal poverty-reduction programs are working-class white people? This is a widely documented fact, even if it is not commonly understood.

White people without a college degree, ages 18 to 64, are the largest class of adults lifted out of poverty by such programs, according to the Center on Budget and Policy Priorities (a progressive policy think tank). Its 2017 report, based on U.S. Census data, stated that 6.2 million working-age whites were lifted above the poverty line in 2014, compared to 2.8 million blacks and 2.4 million Hispanics.

So the question remains: why is this not more widely understood? Unfortunately, the answer relates back to the enduring power of stereotypes and prejudices, which people tend to inherit when they rely on traditional sources or trust "common knowledge" (i.e., friends and family). That this occurs testifies to the power of traditional narratives to act as cultural shortcuts; they help us fill in the gaps where our actual knowledge is lacking.

Consequently, even though blacks and Hispanics have substantially higher rates of poverty (both in numbers and as a proportion of their respective social groups), whites receive the most benefits and populate the welfare rolls in higher numbers (see a new study published by the Center on Budget and Policy Priorities).

One of the significant findings of this particular study was that the numbers do not simply reflect the fact that there are more white people in the country; they demonstrate a good deal more than this: the percentage of poor whites lifted out of poverty by government safety-net programs is substantially higher than for all other social groups (44 percent of whites, compared to 35 percent of otherwise poor minorities, per the CBPP study).

According to Isaac Shapiro, a senior fellow at the Center on Budget and Policy Priorities and one of the report's authors:

“There is a perception out there that the safety net is only for minorities. While it’s very important to minorities because they have higher poverty rates and face barriers that lead to lower earnings, it’s also quite important to whites, particularly the white working class.”

Research on Stereotypes of Poverty and Welfare

Researchers like Princeton Political Scientist Martin Gilens have documented how negative media portrayals of African Americans contribute to the perception that there are more blacks than whites who live in poverty and “take” benefits.

A February 2015 incident in Brushton, New York, illustrates that, contrary to these stereotypes, the opposite is often the reality. Police officers in Brushton arrested 30 people in connection with illegal use of their food stamp cards. All were white.

Racial stereotypes not only harm the people they are intended to malign; such views also contribute to under-estimating the actual number of poor whites living under these circumstances. Once entrenched, these narrative scripts can be particularly difficult to dislodge due to a phenomenon called "confirmation bias" (or confirmatory bias): people incorporate the unsupported narratives into their belief systems and interpret ALL subsequent new information in ways that refer back to, and thus confirm, their pre-conceived beliefs.

Confirmation bias is demonstrated when decision makers actively seek out and assign greater emphasis to information/evidence that supports their pre-existing beliefs, while at the same time they conveniently ignore evidence that contradicts or undermines those beliefs. Some people (including your professor) refer to this biased selection process as “cherry picking.”

The Experiment

To test the pernicious effect of Reagan's stereotypical "Welfare Queen," Franklin D. Gilliam, Jr., a professor in the Department of Political Science at the University of California, Los Angeles, conducted an experiment. Gilliam constructed a series of television news stories about the impact of welfare reform that featured a woman named Rhonda Germaine.

In his report, Gilliam had this to say:

“One of the more controversial issues on the American domestic agenda is social welfare policy. The near unanimity surrounding the “Great Society” programs and policies of the mid-to-late 1960’s has given way to discord and dissonance. Conservative thinkers and politicians first launched attacks on the “welfare state” in the aftermath of the civil rights disturbances of the late 1960’s and early 1970’s. While Barry Goldwater, George Wallace and Richard Nixon charted the course, Ronald Reagan encapsulated the white majority’s growing unease with the perceived expansion of the social welfare apparatus. In particular, Reagan was able to forge a successful top-down coalition between big business and disaffected white working-class voters. The intellectual core of the movement was a well-funded punditry class that offered a theoretical vision for the “New Right.”

While this perspective touched on the cornerstones of American political philosophy, individualism and egalitarianism, it also carried with it a heavy undercurrent of gender and racial politics. In the midst of this evolving political landscape on which new debates about welfare were taking place, the news media played, and continues to play, a critical role in the public's understanding of what "welfare" is and what it ought to be. According to Gilliam:

Utilizing a novel experimental design, I wanted to examine the impact of media portrayals of the “welfare queen” (Reagan’s iconic representation of the African-American welfare experience) on white people’s attitudes about welfare policy, race and gender. My assumption going into this study was that the notion of the welfare queen had taken on the status of common knowledge, or what is known as a “narrative script.”

The welfare queen script has two key components: 1) welfare recipients are disproportionately women; and 2) women on welfare are disproportionately African-American.

What I discovered is that among white subjects, exposure to these script elements reduced support for various welfare programs, increased stereotyping of African-Americans, and heightened support for maintaining traditional gender roles. And these findings have implications both for the practice of journalism and the development of constructive relations across the lines of race and gender.”

How it Worked

Study participants, who differed on the basis of race and gender, were randomly assigned to one of four groups. Each group watched one of four different news stories. The first group watched the news story with Rhonda cast as a white woman. The second group saw a story that depicted Rhonda as an African-American woman. The third group watched the story without seeing a visual representation of Rhonda. The last group, a control group, did not watch any TV stories about welfare. At the end of the videotape, study participants were given a lengthy questionnaire that probed their political and social views.

Study Findings

The welfare queen script assumed the status of common knowledge. When white subjects were asked to recall what they had seen in the newscasts, nearly 80 percent of them accurately recalled the race of the African-American Rhonda; less than 50 percent recalled seeing the white Rhonda.

Subjects who recorded the most "liberal" views about gender roles turned out to express the most hostility toward blacks after they were exposed to the African-American Rhonda. In other words, gender-liberal white participants who were shown the image of the African-American woman were more likely to say they opposed welfare spending; they attributed social problems mainly to "individuals" making bad decisions and endorsed negative characterizations of African-Americans. This tendency was most pronounced among women respondents.

Social Welfare and Social Media

So now that we've looked at some research and data, take a look at the following video and see if you can pick up on the cultural stereotypes and narrative scripts that distinguish not only this vlogger's thinking, but maybe even the thinking of some people you know (or your own):

The next two images might be familiar to you, if only because they constantly recycle themselves through Facebook and other social media. They are shared because they are assumed to be funny, but only if you are "in" on the joke, which in this case is that poor people are not as poor as you think. Rather, they are apparently living the high life, taking advantage of the system, while we poor dupes work hard to pay for their care-free and easily acquired lifestyle.

Both images imply that people who are not working and are collecting benefits should not have smartphones. This is odd thinking when we consider how helpful a smartphone might be for someone waiting for an employer to call them for a job that would potentially get them OFF welfare. It's as if the thought never occurs to people that welfare recipients might previously have been employed; that prior to their job loss, they might have had sufficient resources to own things like smartphones, cars, and nice clothing. Is the expectation here that people should sell off all of their personal possessions in the event of sudden unemployment?

Social media, as demonstrated here, traffics in a cultural logic that, while not explicitly stated, nonetheless assumes a set of "unwritten" rules for welfare benefit/"food stamp" recipients:

1) if you lose a job and apply for public assistance, you should sell any and all electronics and luxury goods you were fortunate enough to have possessed prior to your job loss;

2) don’t dress/present yourself too well in public, since this demonstrates you don’t really need financial assistance;

3) don't dress/present yourself too poorly, because this demonstrates your general unworthiness and constitutes even more proof that you shouldn't be receiving welfare benefits.

By now the contradictions should be obvious. In the end, it's pretty hard for anyone who is poor to win here. The assumption is that in order to be seen as a "good" poor person, you must divest yourself of all personal possessions (even if you acquired them when times were good); you must also take care to look properly disheveled, but not too much. Striking the right balance is the key. Thus, it is only by walking this fine line that you can escape public judgment and ridicule.

Social media shaming works hand in hand with real-life shaming. In recent times, a number of high-profile videos have surfaced showing what are almost always white-identified people verbally abusing, and sometimes attacking, poor minority people for doing things like demonstrating that they are bilingual (they can speak Spanish as well as English whenever they choose). Many low-wage workers and families are also subject to dirty looks from fellow shoppers for using food stamps in the checkout line, even though, as any cashier will tell you, plenty of fully employed people in low-wage jobs use this small food supplement to help make ends meet.

iPhone & Refrigerator Shaming Continued

Former U.S. Representative Jason Chaffetz of Utah once told a CNN television reporter that Americans might have to decide between owning a smartphone and having health insurance. In other words, Americans are going to have to cut back on modern "luxuries" like a phone if they want to be able to afford health insurance.

According to Chaffetz: “And so maybe rather than getting that new iPhone that they just love and they want to go spend hundreds of dollars on that, maybe they should invest that in their own healthcare.”

While Chaffetz was justifiably made fun of for his comparison (people pointed out that a year’s worth of health care would roughly equal 23 iPhone 7 Pluses in price), he was merely giving voice to what is again a commonly held belief – that poverty in the United States is the result of laziness, immorality, irresponsibility, and poor individual choices.

A similar argument was featured in a Fox News story, in which anchor Stuart Varney talked about how "lucky" poor people are to have things like refrigerators.

Once more, we see evidence of a simplified logic: that if people engaged in better decision making — if they worked harder, stayed in school, got married, didn't have children they couldn't afford, spent what money they had more wisely and saved more — then they wouldn't be poor. This deeply entrenched narrative continues to shape public thinking and beliefs about poor people in general. Further, it influences the thinking of government policy makers on BOTH sides of the polarized political spectrum, who, perhaps more than anyone, should be incorporating published research findings if they are truly committed to solving social problems.

Welfare Reform

While this flawed thinking owes a debt to Reagan, we find it similarly embedded deep within the logic of President Bill Clinton's signature 1996 welfare restructuring law, the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA). This is the logic that drove efforts to impose work requirements on Medicaid recipients, to drug-test people collecting SNAP benefits and/or unemployment insurance, and, in some circumstances, to prevent aid recipients from buying steak and lobster.

Social media images, memes and the like, are not trivial. Because of their ubiquity and appeal to humor, they serve as powerful framing mechanisms for non-critical ways of thinking about who is poor and deserving of help – the worthy poor – as opposed to people deemed not “poor enough” and thus not deserving of help.

Although the images are not explicitly racialized, research like that conducted by Gilens and Gilliam demonstrates that expressed support for or opposition to benefits is likely to reflect "beliefs" drawn from gender, race, and class stereotypes, not understanding based on data and evidence derived from research.

The images, in terms of their effect, do nothing to "inform" or promote understanding; rather, they entertain while stoking the fires of conflict and misunderstanding, thereby perpetuating resentment of disadvantaged groups.

Consequently, no matter what poor people do (or don't do), the poor in the United States are treated like social outcasts and criminals. This is not to say that poor people never make bad choices; of course some do. Research, however, demonstrates that the structural system of capitalism has a far greater impact on how earning opportunity is organized, who has access to it, and who will ultimately have to struggle more than others.

Because all of this is complicated, most people would rather argue that the system is working and that it’s individual people who are solely to blame for the cause of their own poverty.

The Politics of Visibility

Working-class people – people who are only slightly more successful than the working poor located just below them on the social class ladder – don't come anywhere close to getting the government giveaways that corporate shareholders get at places like Amazon, Google, and Boeing, or that wealthy people typically get. Their social status and social location make it difficult for them to "see" how wealthy people benefit from the system – specifically, from infinitely more generous tax policy on things like capital gains (taxed at a lower rate than wages from labor), government subsidies, and arcane accounting maneuvers like the carried-interest loophole.

Consequently, because they can't see how tax dollars and wealth transfers flow at those high corporate levels, there is a tendency to criticize the things they CAN SEE – a poor person begging on the street corner, someone using an EBT card, a poor person getting free medication through Medicaid and/or "Obamacare." They see people getting FREE stuff while they themselves are barely scraping by, and it makes them crazy mad.

Put another way, the incredible wealth and income benefits, tax deductions, and incentive programs that the richest members of society have access to are largely INVISIBLE to most working people. People getting government assistance, however, are HIGHLY VISIBLE and are thus much easier to judge. This is why they become a target for social wrath.

Sources:

Information about Gilliam’s “Welfare Queen” experiment can be found in an article published by the author in Harvard University’s Nieman Report, which can be accessed here.

Read more about the Brushton New York arrests here.

The Washington Post article with data published by the CBPP study can be accessed here.

Article by Stephen Pimpare, “Laziness Isn’t Why Poor People Are Poor.” Pimpare is the author of “A People’s History of Poverty in America” and the forthcoming “Ghettos, Tramps, and Welfare Queens: Down and Out on the Silver Screen.” He teaches American politics and public policy at the University of New Hampshire.

“6 Things Paul Ryan Doesn’t Understand About Poverty (But I Didn’t Either),” by Karen Weese

Arthur Delaney and Ariel Edwards-Levy, "Americans Are Mistaken About Who Gets Welfare," Huffington Post, 2018.

Federal statistics on welfare program usage are published by the Census Bureau and the U.S. Bureau of Labor Statistics.

Pennsylvania demographic statistics are published by the World Population Review.

Help Overcome Misinformation, Denial, and Cognitive Dissonance

Many of you reading this may be encountering a lot of this information for the first time. You likely have many years under your belt in which you believed things the data here clearly dispute. Given this, you may not be ready to give up those beliefs. You might even go so far as to weigh your personal opinions against the data, and then elect to simply dismiss the data. If you do this, just know that there are some very deep and powerful psychological forces at work that may be preventing you from accepting the new information.

In this instance, the psychological defense mechanism of “denial” is operating in conjunction with “cognitive dissonance.” Cognitive dissonance, according to psychologists, is the resulting mental state of stress/discomfort that occurs when someone simultaneously holds two or more contradictory beliefs, ideas, or values. Thus, the pre-existing beliefs, ideas, and values become contradicted by the new information; in order to resolve the mental stress, the individual simply dismisses/denies the new information.

According to Stephen Pimpare, denial serves a few functions.

"First, it's founded on the assumption that the United States is a land of opportunity, where upward mobility is readily available and hard work gets you ahead. We've recently taken to calling it grit. While grit may have ushered you up the socioeconomic ladder in the late 19th century, it's no longer up to the task today. Rates of intergenerational income mobility are, in fact, higher in France, Spain, Germany, Canada, Japan, New Zealand and other countries in the world than they are here in the United States. And that mobility is in further decline here, an indicator of the falling fortunes not just of poor and low-income Americans, but of middle-class ones, too. To accept this as reality is to confront the unpleasant fact that myths of American exceptionalism are just that — myths — and many of us would fare better economically (and live longer, healthier lives, too) had we been born elsewhere. That cognitive dissonance is too much for too many of us, so we believe instead that people can overcome any obstacle if they would simply work hard enough" (Pimpare).

"Second, to believe that poverty is a result of immorality or irresponsibility helps people believe it can't happen to them. But it can happen to them (and to me and to you). Poverty in the United States is common; according to the Census Bureau, over a three-year period, about one-third of all U.S. residents slip below the poverty line at least once for two months or more" (Pimpare).

"Third — and conveniently, perhaps, for people like Chaffetz or House Speaker Paul D. Ryan (R-Wis.) — this stubborn insistence that people could have more money or more health care if only they wanted them more absolves the government of having to intervene and use its power on their behalf. In this way of thinking, reducing access to subsidized health insurance isn't cruel; it's responsible, a form of tough love in which people are forced to make good choices instead of bad ones. This is both patronizing and, of course, a gross misreading of the actual outcome of laws like these" (Pimpare).

To conclude, Pimpare offers the following:

“There’s one final problem with these kinds of arguments, and that is the implication that we should be worried by the possibility of poor people buying the occasional steak, lottery ticket or, yes, even an iPhone. Set aside the fact that a better cut of meat may be more nutritious than a meal Chaffetz would approve of, or the fact that a smartphone may be your only access to email, job notices, benefit applications, school work and so on.

Why do we begrudge people struggling to get by the occasional indulgence? Why do we so little value pleasure and joy? Why do we insist that if you are poor, you should also be miserable? Why do we require penitence?

People like Chaffetz, Ryan, and their compatriots offer us tough love without the love, made possible through their willful ignorance of (or utter disregard for) what life is actually like for so many Americans who do their very best against great odds and still, nonetheless, have little to show for it. Sometimes not even an iPhone.”

At the end of the day, there are a lot of different ways we might tackle poverty. The problem is, we can’t make any progress until the vast majority of people in society are willing to take a hard look at how their thinking too often diverges from reality on the matter; they must be willing to give up many of the assumptions they have about who is poor, on welfare, etc.

In short, they will need to set aside these beliefs and prejudices that are emotionally satisfying, but not informed by data and research; they need to begin to try to envision what the world really feels like for families living in poverty every day.

Discussion Questions:

How might your own thinking about social welfare benefits and poverty be shaped by family perspectives, media narratives and social media imagery? How might these sources of information potentially influence how people understand “who is poor” in the United States? How might those understandings influence ideas about how to fix the problem?

Were you surprised to learn that statistics document there are more white people that receive food stamps and other social welfare benefits than African Americans?

Do you trust poor people who ask you for money? Do you think that most poor people are trying to get over on the system?

Why do you think it is that programs that benefit poor people (food stamps/SNAP benefits) are referred to as "welfare," while other programs similarly classified as public assistance are not thought of as "welfare"? These include benefits like the home mortgage interest deduction, unemployment compensation, and the GI Bill. And what about other benefits that tax dollars pay for, like capital gains tax write-offs and the incentives and subsidies paid to corporations? Which of these categories of benefits do you imagine costs taxpayers the most: benefits for poor people or benefits paid to wealthy people? As for the latter, why don't we think of these benefits as "welfare," considering that taxpayers pay for them too, often at a far greater cost?

Consider the following: Are you more concerned that a very small number of "undeserving" folks might "cash in" and get a "free lunch" as a result of efforts to promote less restrictive school lunch programs, or are you more concerned with guaranteeing that not one single deserving child misses a meal? What is most important to you?

Why do you think people get upset about poor people receiving aid and benefits that are far less costly in terms of total dollars than, let’s say, other government programs, i.e. military spending, agriculture subsidies, and tax incentives for people and/or corporations? (see the list below)

[Images: lists comparing federal spending on welfare programs with other government spending]

Course: Classical Social Theory, Race & Ethnicity, Social Problems

The Fight for 15

55 Comments

What is the Minimum Wage?

The federal minimum wage is $7.25 an hour. This translates to an annual income of about $15,000 a year for someone working 40 hours per week.
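The arithmetic behind that annual figure is simple enough to check; here is a quick sketch (the 40-hour week and 52 paid weeks per year are the standard assumptions):

```python
# Annual earnings for a full-time worker at the federal minimum wage.
hourly_wage = 7.25      # federal minimum wage, dollars per hour
hours_per_week = 40     # full-time schedule
weeks_per_year = 52     # assumes no unpaid weeks off

annual_income = hourly_wage * hours_per_week * weeks_per_year
print(f"${annual_income:,.0f}")  # → $15,080
```

The exact figure, $15,080, is usually rounded to "about $15,000 a year."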

In 2011, more than 66 percent of Americans surveyed by the Public Religion Research Institute supported raising this figure to $10. The charts that follow illustrate some historical trends that are interesting to look at within the context of the debate over what the minimum wage should be.

What Should the Minimum Wage Actually Be?

Between the end of World War II and the late 1960s, American productivity and wages grew steadily. And why wouldn't they? We were the only major industrialized country whose economy had not been devastated by a major war. The minimum wage peaked in 1968, and since that time increases in productivity have outpaced growth in the minimum wage.

Put another way, the purchasing power of the minimum wage in the late 1960s was nearly $9.54 an hour (in 2014 dollars). That’s more than two dollars above the current level of $7.25 an hour!

While raising the minimum wage to $9.54 would provide a large improvement in living standards for millions of workers who are currently paid at or near the minimum wage, it is worth asking a slightly different question: what if the minimum wage had kept in step with productivity growth over the last 44 years?

Alternatively, rather than just holding our purchasing power constant at the 1968 level, suppose that our lowest-paid workers had shared evenly in the economic growth over the intervening years. What would they be paid if that had occurred? (And why does it seem so far-fetched that workers should get a slice of the profit from the increased productivity that, hey, they produced?) This brings up another related question: what is the appropriate wage floor for the labor market?

During the years from 1947 to 1969, the minimum wage actually did keep pace with productivity growth. (This might also be true for the decade from when the federal minimum wage was first established in 1938 to 1947, but we don’t have good data on productivity for this period.) For more on these trends see the Center for Economic Policy Research website.

In another study, economists found that the minimum wage would have reached $21.72/hour in 2012 if it kept up with increases in worker productivity [see the study by the Center for Economic and Policy Research].  That means when we adjust the minimum wage for BOTH inflation AND increased worker productivity (which has more than doubled since 1968), the minimum wage would be much higher than it is today.

Consequently, even though advancements in technology and worker productivity combined have increased the amount of goods and services that can be produced in a set amount of time, wages have remained relatively flat while this occurred. Why?

[Note: “Inflation” is the term economists use to describe an economic trend, where there is a general increase in prices accompanied by a loss of purchasing power/decreasing value of money].
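The $21.72 estimate can be roughly reconstructed from the figures above. A minimal sketch, where the productivity multiplier of 2.28 is an illustrative assumption chosen to approximate "more than doubled since 1968" (the CEPR study's own methodology is more detailed):

```python
# Rough reconstruction of a productivity-adjusted minimum wage.
wage_1968_real = 9.54        # 1968 minimum wage in 2014 dollars (from the text above)
productivity_growth = 2.28   # assumed cumulative productivity growth since 1968

adjusted_wage = wage_1968_real * productivity_growth
print(f"${adjusted_wage:.2f}")  # → $21.75, close to the CEPR estimate of $21.72
```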


By contrast, a study from the nonpartisan Congressional Budget Office (CBO) found that a minimum wage increase to $15 per hour would likely eliminate about 1.3 million jobs by 2025, or roughly 0.8% of payrolls nationwide, and possibly as many as 3.7 million.

The CBO recommends a gradual increase over time. The data, they say, suggest an increase to $10 an hour would have the least detrimental effect on employment levels. This study, however, did not break down industry sectors; rather, it attempted to forecast trends by looking at the economy as a whole, which is difficult to project accurately.

Recent Trends

More recently, during the years 2000 to 2012, workers boosted their productivity by 25 percent. In spite of this substantial increase, which had a pretty sizeable impact on the bottom line profitability of their respective firms, workers saw their earnings FALL rather than rise.

Meanwhile, during this same time period, the top 1% of wage earners saw their earnings increase dramatically, by more than 33%. This development is why some leading economists refer to the early 21st century as a "lost decade" for American workers. You don't need a degree in economics to process all of this. Just ask yourself a simple question: Does this seem fair?

Bear in mind, if you are one of the people arguing against the workers who want to be paid $15/hour (a rate far less than what these charts say the wage should be), you are arguing in favor of efforts to intervene in sacrosanct "markets" to keep wages at an artificially low rate.

Put another way, you are taking the proverbial "invisible hand of the market" and using it to bludgeon workers as you argue to tilt the wage scales in favor of wealth interests (not waged labor). That is, you would rather let corporate owners increase profits on the backs of the many workers who produce their wealth.

To put these charts into greater relief: if the minimum wage had grown at the same rate as the earnings of the top one percent of Americans, the federal wage floor would be roughly triple the current hourly minimum of $7.25, or over $21.00 per hour.

This clearly did not happen. Instead, the minimum wage has been lower than a poverty wage since 1982.


In 2015, President Obama called for increasing the federal minimum wage to $10.10 an hour by the end of 2015. He argued that after 2015, increases in the minimum wage should be tied to inflation, with the minimum wage rising in line with the consumer price index. While this incremental step is not sufficient in the eyes of many labor advocates, it’s still better than not raising the minimum wage and letting it remain flat.

The $15 minimum wage has become a campaign issue in the 2020 presidential race, where bills calling for the so-called “living wage” have been introduced in both the U.S. House of Representatives and the Senate. The CBO has reported data on the House version, which is very close to the measure that was introduced in the Senate by Bernie Sanders, I-Vt.

Today, more than 19 states and the District of Columbia have a wage floor higher than the federally mandated minimum wage. New Jersey later became the 11th state to index its wage to the cost of living/inflation. (Graph courtesy of Sam Waldman at the American Prospect.)

New York state also just implemented a $15 wage policy for state workers (City University of New York workers, who are also state workers, were surprisingly singled out by Governor Cuomo and excluded).

Some local governments took things even further, like the Seattle/Tacoma commerce region, which put into effect a local $15 minimum wage.


2020 Update:

Since the Fight for $15 was launched by striking fast-food workers in 2012, states representing approximately 21 percent of the U.S. workforce—California, Massachusetts, New York, and the District of Columbia—have approved raising their minimum wages to $15 an hour.

Additional states—including Washington, Oregon, Colorado, Arizona, Missouri, Michigan, and Maine— approved minimum wages ranging from $12 to $14.75 an hour.

Seattle’s experiment has been running for more than 5 years. Here are two studies published by the University of Washington that discuss results in two different industries (child care and food stores).

How many people earn the minimum wage?

The short answer is: Not many. But that’s also the wrong question.

According to the Bureau of Labor Statistics, 1.57 million Americans, or 2.1 percent of the hourly workforce, earned the minimum wage in 2012. More than 60 percent of them either worked in retail or in leisure and hospitality, which is to say hotels and restaurants, including fast-food chains.
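A quick back-of-the-envelope check of that 2.1 percent figure. The sketch below assumes an hourly workforce of roughly 75.3 million, the approximate BLS count for 2012 – an assumption on my part, not a number from this post:

```python
# Sanity check: share of hourly workers earning the federal minimum in 2012.
# The 75.3 million hourly-workforce figure is an approximation (assumption).
hourly_workforce = 75_300_000
at_minimum_wage = 1_570_000

share = at_minimum_wage / hourly_workforce
print(f"{share:.1%}")  # prints 2.1%, matching the BLS figure cited above
```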

If you want to honestly debate the merits of raising the minimum wage, however, you need to think beyond who earns it today. After all, there are millions of workers making $8 or $9 an hour assembling burgers or changing sheets who might be affected by a hike.

The Economic Policy Institute estimates that even if Washington increased the minimum wage to $10.10, some 21.3 million employees would eventually be guaranteed a raise, assuming they kept their jobs.

  • According to 2019 analysis, gradually raising the federal minimum wage to $15 by 2024 would lift pay for nearly 40 million workers— 26.6 percent of the U.S. workforce.
  • Two-thirds (67.3 percent) of the working poor in America would receive a pay increase if the minimum wage were raised to $15 by 2024.
  • A $15 minimum wage would begin to reverse decades of growing pay inequality between the lowest-paid workers and the middle class.
  • Failure to adequately increase the minimum wage accounts for 48 percent of the increase in inequality between women at the middle and bottom of the wage distribution since 1979 (page reference, Economic Policy Institute).

Of course, other companies could follow along with the federal increase and elect to adjust their wage scales upward to reflect changes in the cost of living for wage earners across all occupations. This would address the complaints of those people fixated on the income of paramedics (or fill in the blank occupation) relative to burger flippers.

Speaking of paramedics, here’s one paramedic’s response to the controversy:


Raising the Minimum Wage Helps People of Color (POC)

One puzzle piece missing from the graphics presented here is demographic data on “race” and “ethnicity,” two variables that tend to be deeply intertwined.

Since racial minorities are over-represented among the minimum wage workforce, raising the minimum wage to just $10.10 would effectively lift 3.5 million people of color out of poverty.

Nearly two-fifths (38 percent) of African Americans and one-third (33 percent) of Latinos would get a raise if the federal minimum wage were increased to $15.

Many people of color qualify for food stamps even though they work full-time jobs. Whenever this happens – and it happens a lot – U.S. taxpayers are effectively subsidizing giant corporations with their tax dollars, since those corporations (Walmart, Amazon, etc.) clearly earn sufficient profits to pay their people a higher wage. Think of it as “Corporate Welfare.”


The Typical Beneficiary

The typical person who would benefit from a federal minimum wage increase is, according to analysis, a 35-year-old woman who works full-time and takes care of children.

  • Fewer than 10 percent of minimum wage earners are teenagers, while more than half are prime-age adults between the ages of 25 and 54 (this is where a lot of people went to work when they lost their previously held “good jobs”).
  • More than half (58 percent) are women.
  • 60 percent work full time.
  • Nearly half (44 percent) have some college experience.
  • 28 percent have children.
  • The average worker with a spouse or child who would benefit from a $15 minimum wage provides 52 percent of his or her family’s total income.

Top Reasons to Raise the Minimum Wage

1. It Will Put Money Into The Pockets Of Hard-Working Americans. Raising the minimum wage to $10.10 will raise wages for 28 million workers by $35 billion in total. Since many of those workers will turn around and spend that money, that is a huge boost for the economy.

2. It Will Reduce Income Inequality. The average CEO shouldn’t make 933 times more than a full-time minimum wage worker.


3. It Won’t Hurt Job Creation. States have raised the minimum wage 91 times since 1987 during periods of high unemployment, and in more than half of those instances the unemployment rate actually fell. Over 600 economists signed a letter agreeing that a minimum wage increase doesn’t hurt job creation.


4. It Is Unlikely To Significantly Impact Prices. A higher minimum wage would mean a DVD at Walmart will cost just one cent more.

5. It Would Help People Get Off Of Food Stamps. A study by the Center for American Progress found that raising the minimum wage to $10.10 would help 3.5 million Americans get off food stamps.

6. It Will Save The Government Money. The same CAP study found that, in conjunction with helping people get off of food stamps, raising the minimum wage would save the government $46 billion over 10 years in spending on the Supplemental Nutrition Assistance Program (SNAP) as people earn enough on their own to no longer qualify.

7. It Will Improve People’s Economic Security. It is no longer the case that the people making the minimum wage are largely teenagers. In fact, now more than half of workers earning under $10.10 an hour are forced to support themselves on that as their primary income.

8. It Will Lift People Out Of Poverty. The non-partisan Congressional Budget Office states that raising the minimum wage to $10.10 would lift 900,000 people out of poverty. For full time workers earning the federal minimum wage, this bump would give them a raise of over $4,000 dollars — enough to take a family of three out of poverty.
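The arithmetic behind that raise is easy to verify. The sketch below assumes a 2,080-hour work year (40 hours × 52 weeks) and, for the poverty comparison, the approximate 2014 HHS poverty guideline for a family of three (~$19,790) – both figures are assumptions on my part, not numbers from the CBO report:

```python
# Annual raise for a full-time worker if the minimum rose from $7.25 to $10.10.
hours_per_year = 40 * 52                         # 2,080 hours (assumed full-time year)
annual_raise = (10.10 - 7.25) * hours_per_year
annual_income = 10.10 * hours_per_year

poverty_line_family_of_3 = 19_790                # approx. 2014 HHS guideline (assumption)

print(f"raise:  ${annual_raise:,.0f}")           # about $5,900 -- comfortably "over $4,000"
print(f"income: ${annual_income:,.0f}")          # about $21,000
print(annual_income > poverty_line_family_of_3)  # True: above the family-of-three line
```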

9. Businesses Recognize That They Will Also Benefit. Many successful businesses, such as Gap and Costco, already pay their employees wages above $10.10. They do this so they can “attract and retain great talent.” And 60% of small business owners recognize that their businesses would benefit if we raise the wage.

10. Millions Of Children Will Be More Secure. If we raise the minimum wage to $10.10, 21 million children will have at least one parent whose pay will go up.


Still Not Convinced?

I thought minimum-wage earners were mostly just suburban teenagers. Is that true?

Think tanks like the Heritage Foundation, which is funded by political donors, would like you to think this is true. Often, we find conservative groups arguing that, contrary to what liberals claim, most of the minimum wage workforce isn’t really made up of desperate parents struggling to make ends meet. They say the wage force consists of middle-class teens and married women living above the poverty line who want to work part-time while raising young children. Now, they’re not all wrong about this, but they are deliberately misleading us.

Teens, it has already been shown, are represented in this labor pool, but they do not represent the largest demographic sub-group in the minimum wage workforce (between 10 and 12% of minimum wage workers are teens). So let’s dispel that once and for all: the vast majority of minimum wage workers aren’t teenagers!

Interestingly enough, 62 percent of those under 25 who are earning these wages are, according to Heritage, enrolled in school. This means that this group is doing precisely what everyone argues they should do – they are upgrading their skills to get a better job. In other words, they are not planning to make a career out of folding snack wraps.

Among minimum wagers older than 25, Heritage notes that the average household income is $42,000 a year. Is that poverty? Not unless you’re a single parent with eight children (or trying to pay rent as a single person in New York City). But is it rich? Of course not. In fact, it’s still well below the median household income of $51,000.

I get it. It’s complicated. But what does the research say about the impact of raising the minimum wage?

Ask a liberal economist, and they’ll likely point to a 2009 meta-analysis (a study of studies) by Hristos Doucouliagos and T.D. Stanley that pooled together the results of 61 different research papers published over the decades. When averaged together, the results suggested that raising the minimum wage had close to zero impact on employment (to the extent that jobs might be lost). An increase of 10 percent in the minimum wage, they found, might reduce employment by about 0.1 percent, which they concluded had “no meaningful policy implications.” “If correct, the minimum wage could be doubled and cause only a 1 percent decrease in teenage employment,” they wrote.

So fears of jobs being lost or replaced with automation are largely exaggerated. According to the data, taking small losses to jobs would be a worthy sacrifice, considering the raises other workers would receive. In short, the change is almost certainly worth it.


If raising the minimum wage doesn’t kill jobs, who stands to benefit the most? The poor, or the middle class?

Mostly the middle class. As was discussed up top, most minimum-wage earners don’t live under the poverty line. So you shouldn’t be surprised to learn that most of the people who stand to gain from raising it are also not in poverty.

A 2010 study by Joseph Sabia and Richard Burkhauser, who fall on the solidly conservative side of this issue, finds that if the minimum wage were increased to $9.50 from $7.25, only 11.3 percent of beneficiaries would live in impoverished households.

So maybe it’s better to think of the minimum wage as a way of getting more money to the broader working class.

This might sound like a selfish question, but how much more expensive would my hamburgers get if we raised the minimum wage?

You’re not being selfish at all! If raising the minimum wage caused a lot of inflation in the economy, it might cancel out the benefit to workers.

Thankfully, the studies and evidence suggest that probably wouldn’t be the case. Sara Lemos reviewed the literature on this topic and found that most studies report a 10 percent US minimum wage increase raises food prices by no more than 4 percent and overall prices by no more than 0.4 percent (see the Seattle studies cited above).

But what about burgers specifically? Well, their prices might go up a bit more. Based on data from the ’80s and early ’90s, Daniel Aaronson estimated that a 10 percent increase in the minimum wage drove up the price of McDonald’s burgers, KFC chicken, and Pizza Hut’s pizza-like product by as much as 10 percent.

Assuming that holds true today, it means that bringing the minimum wage to $10.10 would tack $1.60 onto the cost of your Big Mac. That said, there is international evidence—Mickey D’s makes a killing in high-wage countries like Australia and France, where it’s against the law to pay their workers poverty wages—that suggests the price hike could be even lower.
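Aaronson’s estimate makes the $1.60 figure easy to reproduce. The sketch below assumes a roughly $4.00 Big Mac (a typical U.S. price around that time – my assumption, not a figure from the article) and his upper-bound pass-through of 10 percent in prices per 10 percent in wages:

```python
# Reproducing the burger arithmetic under Aaronson's upper-bound estimate.
old_wage, new_wage = 7.25, 10.10
wage_increase = (new_wage - old_wage) / old_wage  # about 39%

big_mac_price = 4.00   # assumed typical price, not a figure from the article
pass_through = 1.0     # 10% price rise per 10% wage rise (upper bound)

price_bump = big_mac_price * wage_increase * pass_through
print(f"${price_bump:.2f}")  # about $1.57 -- close to the $1.60 cited
```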


My dad owns a small business (restaurant) and he said he would go out of business if he had to pay his workers $15/hour.

In what has been a major development since the late 1970s, the United States effectively made the transition from a manufacturing-based to a service-based economy. This has been good news for American restaurants, which, unfortunately, have not shared much of their new-found wealth with their employees – workers who studies have shown labor under some of the worst conditions in the U.S., suffering from low wages, high levels of sexual harassment, wage theft, etc.

“There’s no question that every variety of sleazy scam is at work on the floor, in the front of the house,” said chef-turned-author (and my sometime employer) Anthony Bourdain, whose memoir of a twenty-eight-year restaurant career, Kitchen Confidential, published in 2000, shines a light on some of the unsavory facets of the business.

The average weekly earnings for an employee in a full-service restaurant were $274, according to data from the Bureau of Labor Statistics. That’s an annual income of $14,248, which puts a family of two or more below the poverty line. Only a small fraction of American restaurant workers have health insurance or paid sick days, and stories of wage theft, both inadvertent and intended, are common.

A study, out of Cornell University, examines the effect of minimum wage increases on the restaurant industry specifically—an industry that is labor-heavy, employs lots of low-wage workers, and can be relied upon to lobby intensely (National Restaurant Association) against any sort of minimum wage increases. Here is a bite-sized takeaway of the findings:

 [The] results of this study confirm previous findings, namely, that the relatively modest mandated increases in employees’ regular and tipped minimum wages in the past twenty years have not had large or reliable effects on the number of restaurant establishments or restaurant industry employment levels, although those increases have raised restaurant industry wages overall. Even when restaurants have raised prices in response to wage increases, those price increases do not appear to have decreased demand or profitability enough to sizably or reliably decrease either the number of restaurant establishments or the number of their employees. Although minimum wage increases almost certainly necessitate changes in restaurant prices or operations, those changes do not appear to dramatically affect overall demand or industry size. Furthermore, there is strong evidence that increases in the minimum wage reduce turnover, and good reason to believe that it may increase employee productivity as well.

Is the current restaurant model being subsidized by underpaid employees? The data says – YES! What is the true cost of dinner in a restaurant, and is the dining public willing to pay it? Will restaurants benefit from a higher minimum wage? Potentially, according to many studies, the answer is yes.

Economic studies indicate that higher minimum wages do not in fact decrease employment. David Card and Alan B. Krueger’s 1994 study, “Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania,” published in the American Economic Review, is a landmark piece of research that refutes job-loss claims. A paper by Arindrajit Dube, T. William Lester, and Michael Reich that built on Card and Krueger’s research design, “Minimum Wage Effects Across State Borders,” published in the November 2010 issue of the Review of Economics and Statistics, “found that increases in the minimum wage raise workers’ earnings without reducing employment,” according to a summary by the National Employment Law Project (NELP).

Simply put, studies have shown that when working people have extra money in their pockets, they almost always SPEND it – not on luxury items like yachts, as the rich do, but on everyday consumption. This includes having more money to go out and pay for a meal, which they might previously not have been able to afford. That translates into more business for restaurants.

Bear in mind now, many restaurants have reliably benefitted from a business model currently set at a notch or two above slavery. Letting go of that model is going to be difficult for many of them. Introducing a new model, one based on a living wage, is scary. Yet it is one that offers new-found benefits to restaurants that are good at what they do.

The bottom line: if your business model is viable only because of wage slavery, you don’t deserve to be in business.

Why should we pay good wages to unskilled workers? (article written by E. McClelland)

Let me tell you the story of an “unskilled” worker in America who, as it turns out, lived better than most of today’s college graduates.

In the winter of 1965, Rob Stanley graduated from Chicago Vocational High School, on the city’s Far South Side. Pay rent, his father told him, or get out of the house. So Stanley walked over to Interlake Steel, where he was immediately hired to shovel taconite into the blast furnace on the midnight shift. It was the crummiest job in the mill, mindless grunt work, but it paid $2.32 an hour — enough for an apartment and a car. That was enough for Stanley, whose main ambition was playing football with the local sandlot all-stars, the Bonivirs.

Stanley’s wages would be the equivalent of $17.17 today — more than the “Fight For 15” movement is demanding for fast-food workers. Stanley’s job was more difficult, more dangerous and more unpleasant than working the fryer at KFC (the blast furnace could heat up to 2,000 degrees). According to the laws of the free market, though, none of that is supposed to matter. All that is supposed to matter is how many people are capable of doing your job. And anyone with two arms could shovel taconite. It required even less skill than preparing dozens of finger lickin’ good menu items, or keeping straight the orders of 10 customers waiting at the counter. Shovelers didn’t need to speak English. In the early days of the steel industry, the job was often assigned to immigrants off the boat from Poland or Bohemia. “You’d just sort of go on automatic pilot, shoveling ore balls all night,” is how Stanley remembers the work.
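The $17.17 figure is a standard CPI adjustment. A minimal check, using approximate annual-average CPI-U values (31.5 for 1965 and about 233 for the early 2010s – both assumed, not taken from the article):

```python
# Inflation-adjusting Stanley's 1965 wage with the CPI ratio method.
cpi_1965 = 31.5   # approximate annual-average CPI-U for 1965 (assumption)
cpi_now = 233.0   # approximate CPI-U for the early 2010s (assumption)

wage_1965 = 2.32
wage_today = wage_1965 * (cpi_now / cpi_1965)
print(f"${wage_today:.2f}")  # about $17.16 -- essentially the $17.17 in the text
```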

Stanley’s ore-shoveling gig was also considered an entry-level position. After a year in Vietnam, he came home to Chicago and enrolled in a pipefitters’ apprenticeship program at Wisconsin Steel. So why did Rob Stanley, an unskilled high school graduate, live so much better than someone with similar qualifications could even dream of today? Because the workers at Interlake Steel were represented by the United Steelworkers of America, who demanded a decent salary for all jobs. The workers at KFC are represented by nobody but themselves, so they have to accept a wage a few cents above what Congress has decided is criminal.

The “no skills” argument holds more or less that if the teenager cleaning the grease trap wants more money, he should get an education. While we can agree this sounds at least superficially logical, it has little connection to economic reality. Workers are not simply paid according to their skills – they’re paid according to what they can negotiate with their employers. And in an era when only 6 percent of private-sector workers belong to a union, and when going on strike is almost certain to result in losing your job, low-skill workers have no negotiating power whatsoever.


Won’t raising the minimum wage cause more unemployment, as companies reduce head-count to cope with increasing labor costs?

“Increasing the minimum wage will cause more unemployment” is the prototypical “Econ 101” trope that turns out to be much less straightforward (or flat-out wrong) in real life.


Okay, Fine. Explain to me again why paramedics, who actually save lives, don’t make $15/hr. Why should fast food workers make $15/hr?

This question reflects a classic divide-and-conquer strategy. It’s the basis for a false equivalency narrative that infuses many anti-labor arguments, which aim to pit workers against each other. The narrative wants us to believe that fast food employees are being selfish/greedy because they don’t possess comparable work skills, and so they should not expect to be paid a wage that exceeds that of paramedics. Unfortunately, this analogy is not only misleading, it is flat-out wrong.

Pitting groups of workers against each other is a tactic used to justify keeping wages low. Paramedics’ compensation has nothing to do with fast food employees’ compensation, other than the fact that BOTH are not being paid a fair wage by their employers and BOTH deserve to be paid more!

When this logic is accepted, however, our attention gets deflected away from the fact that BOTH are being exploited by their respective employers. BOTH are routinely asked to perform tasks that exceed and/or do not reflect the material reality of their job function/compensation.

Pitting workers against each other takes our focus away from something far more important – the relationship with the EMPLOYER.

These facts tend to get muddled when workers instead opt to compete in what amounts to a “Hunger Games” race to the bottom.


It’s the Economy, Stupid

Changes in the economy, some recent, and others trending for quite some time, are making it difficult if not impossible for the average person to get ahead.

Now, you might say to yourself, “well, I’m not average, I’m above average, so I will be okay.” Really? Think about this. Look around. Which is easier to believe: that everyone around you is lazy (obviously not you), or that people born into wealth, power, and privilege are willing to do whatever it takes to keep the money flowing into their pockets and maintain the status quo that made them rich?

Lean on social science as you go forward. Don’t just think about your individual-level personal observations and experience. Instead, think about how those individual experiences are likely shaped by the larger structural forces that are potentially at work here. These are, more often than not, infinitely stronger than any one individual, desire and hard work notwithstanding.

This is important, because how you think about this may ultimately determine your own success. You have two choices really – 1) you can work to help bring about positive social change that enables everyone’s success; or 2) you can side with those who want to weaken worker benefits to the detriment of us all.

In the first scenario, you at least have a chance to master your own destiny. The latter only serves to foster the social reproduction of pre-existing wealth and privilege; it helps those with wealth and privilege to leverage their dominance and secure their social class interests.

To put it bluntly, when you choose this second course of action you become at best what some have termed a “useful idiot” for the powerful monied classes, who are more than happy to use you as a tool in their efforts to oppress other people – people who, whatever their race and country of origin, are not really a lot different from you.


Don’t Like Low Wages? Get an Education

Again, the “no skills” argument assumes that the teenager cleaning the grease trap is simply asking for more money for low-skilled work. “They should get an education” is offered in response to the demand for higher wages. But here again, while the argument sounds superficially logical, it has little connection to economic reality. As it turns out, 60% of minimum wage workers are in fact pursuing an education while working. But let’s put this argument into historical perspective.

How easy is it for somebody earning a minimum wage today to pay for college compared to someone who worked in the 1970s? How long would you have to work at minimum wage to pay the average in-state tuition and required fees at a U.S. public university?

In the early 1970’s it was less than 300 hours, or about seven and a half weeks of full-time work. Now? More than 1,000 hours, or 25 weeks.
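The hours comparison is simple division: tuition divided by the hourly wage. The dollar figures in the sketch below are illustrative assumptions (roughly $450 in-state tuition and fees at the $1.60 minimum of the early 1970s, versus roughly $7,500 at $7.25 today), not numbers taken from the charts:

```python
# Hours of minimum-wage work needed to cover average in-state tuition and fees.
# Tuition figures are illustrative assumptions, not from the charts.
def hours_needed(tuition: float, wage: float) -> float:
    return tuition / wage

early_1970s = hours_needed(450, 1.60)   # ~$450 tuition at the $1.60 minimum
today = hours_needed(7500, 7.25)        # ~$7,500 tuition at the $7.25 minimum

print(round(early_1970s))  # 281 hours -- about seven weeks of full-time work
print(round(today))        # 1034 hours -- about 26 weeks of full-time work
```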


How Much College Would a Summer’s Minimum Wage Pay For?


In light of these statistics, working a minimum wage job for an entire summer covers approximately 25% of tuition costs at a public university – less at a private university. And this, of course, assumes the person earning the minimum wage is a teen who doesn’t have to use their wages to pay for other living expenses like rent or child care. If they do have to pay for those things, then the likelihood that they can work their way out of their minimum wage job by going to school is – well, you do the math. It’s virtually impossible, and it is cruel to suggest that they can (and then, when they can’t, to say they are lazy because they’re not working or trying hard enough).

“Old Economy Steve” – my favorite meme!

Summary

Simply put, it’s an economic no-brainer for Congress to raise the minimum wage – at least to around $10/hour. It will put more money into the pockets of hard-working Americans and will not negatively impact businesses and job creation. Raising the wage is a critical step in creating an economy that works for everyone, not just the wealthy few.

The fact that many of today’s college graduates have the same standard of living as the lowest-skilled workers of the 1960s proves that “race to the bottom” market ideologies – which stipulate we should simply let the market dictate what people should be paid – are the wrong approach. And by this I mean, they don’t work for our economy, and they don’t work for workers or the businesses they work for (okay, maybe a few rich owners).

None of the data produced outside of the politically biased realm of think tanks and the lobbying industry supports keeping the minimum wage artificially low.

If we want to fix our economy and restore what we’ve traditionally thought of as the middle class, we have to stop thinking of ourselves as middle class, no matter how much we earn, or what we do to earn it. “Working class” should be defined by your relationship to your employer, not whether you perform physical labor. People with college degrees can be “working class” (and some are even poor). Unless you own the business, you’re working class.

Sources

This post reflects data and information that originally appeared and was published in two articles. The first article is by Edward McClelland, “The ‘middle class’ myth: Here’s why wages are really so low today.” He is also the author of “Nothin’ But Blue Skies: The Heyday, Hard Times and Hopes of America’s Industrial Heartland.” McClelland cites passages from another work you might check out – “Methland: the Death and Life of an American Small Town,” by Nick Reding.

Economic Policy Research website.

Bureau of Labor Statistics website.

Bureau of Labor Statistics, Occupational Employment Statistics, May 2016.

Congressional Budget Office (CBO) website.

Projections for the consumer price index were applied to the Economic Policy Institute’s Family Budget Calculator, which measures the amount of income a family needs to attain a secure yet modest standard of living in all counties and metro areas across the U.S.

High-End Food, Low-wage Labor, by Laurie Woolever, 2012.

“Should We Raise the Minimum Wage? 11 Questions and Answers.” Published in “The Atlantic,” by Jordan Weissmann – Center for Economic Policy Research.

“These Charts Show How Much College a Minimum Wage Job Paid For, Then and Now,” by Greg Schoofs.

Sylvia Allegretto and David Cooper, Twenty-Three Years and Still Waiting for Change: Why It’s Time to Give Tipped Workers the Regular Minimum Wage, Economic Policy Institute, July 10, 2014.

Ken Jacobs, Ian Perry, and Jenifer MacGillvary, The High Public Cost of Low Wages, University of California Berkeley, Labor Center, April 2015.

Cooper, Raising the Federal Minimum Wage, Appendix Table 3. See also Laura Huizar and Tsedeye Gebreselassie, What a $15 Minimum Wage Means for Women of Color, National Employment Law Project, December 13, 2016.

Doruk Cengiz, Arindrajit Dube, Attila Lindner, and Ben Zipperer, “The Effect of Minimum Wages on Low-Wage Jobs: Evidence from the United States Using a Bunching Estimator,” LSE Center for Economic Performance Discussion Paper 1531, February, 2018.

Paul J. Wolfson and Dale Belman, “15 Years of Research on U.S. Employment and the Minimum Wage,” Tuck School of Business Working Paper No. 2705499, 2016.

David Autor, Alan Manning, and Christopher L. Smith, “The Contribution of the Minimum Wage to U.S. Wage Inequality over Three Decades: A Reassessment,” American Economic Journal: Applied Economics vol. 8, no. 1, January 2016.

Raw data for college tuition and minimum wage statistics can be found here.

Discussion Questions

Do you think it makes sense to pass along all of the profits gained from productivity increases to business owners (not sharing any of that money with workers)?

Have you ever worked for minimum wage? If so, what were the social demographics of your workforce? Were your co-workers predominantly young, old, male, or female? Or was it mixed?

If you worked in such a job, did you feel that you were paid a wage appropriate for the amount of work that you did?

Do you think that fast food and other minimum wage workers perform needed services? Do you see yourself personally enjoying benefits from what their labor provides? Do you think those workers should be paid “slave” wages because you like to eat cheap hamburgers and chicken?

Do you think it is mathematically possible for a person making the minimum wage to pay their household costs as well as save money to attend college? Have you ever tried to do this?

Do you think it’s fair that taxpayers are forced to subsidize the low wages of extremely profitable corporations (again, because many of their employees can’t live on their wages; to work there, they must file for food stamps and other benefits)?

Have any of these statistics helped you to think differently about the minimum wage as a social problem?

Course: Classical Social Theory, Race & Ethnicity, Social Problems

We are the 53%

19 Comments


The different responses to the “We are the 99 percent” movement are somewhat funny, though they are also a little bit heartbreaking and tragic for the level of cognitive dissonance they imply.

Recalling former Presidential candidate Mitt Romney’s famous gaffe that 47% of Americans were “takers” (in the sense that they were not paying taxes – not true), the self-declared “Mr. 53%” depicted here aimed to declare how he, by way of contrast, was not a taker (aka a slacker, not “entitled”). He says he doesn’t blame Wall Street for the fact that, despite his being a veteran, he has had to struggle to work and pay for school. For him, these struggles are something akin to a badge of honor that he wears with pride.

For a different point of view, consider what this blogger has to say:

“I bet the Wall Street elite are thankful that they have some idiot kid with no sense of history willing to act as one of their brown-shirts if things start getting ugly in this new class war. There aren’t too many of the top 1% who are ex-military and a lot of liberals are afraid to death of former soldiers. I served too, and I’d like to tell this kid something. I didn’t serve in the military to protect a country that seems only out to make life better for the richest few.

If the conservative hero Ronald Reagan hadn’t slashed the living shit out of the GI Bill this kid would have had a lot easier time making it through college. He could have done it in four years while only working part-time. But Saint Ronnie said that government spending sucks so he made a complete mockery of educational benefits for veterans (while drastically increasing overall military spending). If you don’t believe me you can look it up. I was actually serving during the Reagan administration and luckily for me I had completed most of my degree before I enlisted.

And f*^ck you for calling people whiners who are fighting for the rights of the working class. I’d be the first to say that many of the Occupy Wall Street folks have their heads up their asses and don’t have much of an idea of what to do, but at least they are doing something. They are actually protesting so people like you don’t have to work like a 16th century peasant.  I suppose that you think the early union organizers in America were whiners for trying to protect workers from the worst abuses of the industrial era. You should be ashamed of yourself instead of being so smug and high and mighty.

It was a huge government “bail-out” of returning WWII veterans that turned America into a country which actually created the idea of the middle class. Veterans were able to go to college and then buy homes with VA loans. These men were mostly lower middle class deadbeats who would have never been able to afford a college education, much less buy a house. My father was one of those vets.

This was the United States government that did this, so be careful when you spew out shit from Rush Limbaugh criticizing the government. And perhaps you skipped the class on civics, but in a democracy we are the government. This means that if we don’t like something we can work to change it. If our elected officials aren’t carrying out our wishes, then we can protest our government. It’s legal and it’s in the constitution.  Try reading it sometime instead of having some right-wing moron spoon-feed you their bizarre interpretation of our founding document.

[Photo: another “53%” statement]

Discussion Questions

Based on our diverse readings about the plight of the white working class, what do you think might explain why the people pictured here are proud to claim they are “making it” without government handouts? Why do you think they don’t question the fact that they are forced to work so hard for so little?

Why are poor and working class people content to see their children sent to fight in wars of choice in exchange for free education? Why don’t they demand that education be provided without a service obligation? Why do they let their children expose themselves to physical harm and even death, without fighting for other service/policy options?

Do you think policy makers are hesitant to make education “free” in the United States because they might lose out on “volunteers” for military service?


Course: Classical Social Theory, Social Problems

Who Is Middle Class?

24 Comments


When you read a report about how the “middle class has shrunk” or that the president gave a speech vowing to “strengthen the middle class,” what does it mean?

Given how much attention is paid to the “middle class,” you may be surprised to learn that there isn’t a hard official definition of who gets to be in it. The vague parameters may help to explain why people aren’t great at deciding whether they themselves are middle class or not.

For example, while we know that wealth inequality is growing, and therefore presumably those in the middle class are showing signs of falling down the economic ladder (or, for a rare few, moving up it), the number of Americans who report being in the middle class has remained largely unchanged for years – until recently. What is perhaps more surprising and even a little bit galling is that some of the wealthiest Americans in the country consistently self-report their status as “middle class.”

An analysis published by the Pew Charitable Trusts took a look at this question. Pew defined middle class households as those earning 67%-200% of a state’s median income.

Take a look at the chart below. Pew computed their range using median income numbers published by the US Census Bureau’s 2013 American Community Survey. The ACS median numbers are listed in the leftmost column. Pew shows the median figure opposite the range, which it breaks out further by state, to show how much middle-class earners make in each state. The states are listed in descending order based on median income.

[Chart: middle-class income ranges by state, computed from 2013 ACS median incomes]
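Pew’s 67%-200% rule is simple enough to sketch in code. A minimal illustration (the $60,000 median below is a hypothetical figure, not taken from the Pew table):

```python
# Pew's definition: "middle class" = household income between
# 67% and 200% of the state's median income.
def middle_class_range(state_median):
    return (0.67 * state_median, 2.00 * state_median)

# Hypothetical state median of $60,000 (illustrative only):
low, high = middle_class_range(60_000)
print(f"${low:,.0f} to ${high:,.0f}")  # $40,200 to $120,000
```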

Middle Class but Economically Insecure

Middle-classness is often not so much about actual earnings as about how people feel. And as it turns out, people’s feelings (in terms of identifying with middle class status) are remarkably broad. Many wealthy as well as poor people simultaneously identify as middle class, even when their incomes fall far outside of the middle range. Earnings and assets are thus, for many of us, only part of the definition of class. That is to say, there is an important “psychological” component to class that interacts in a dynamic way with the economic dimensions.

Consequently, you might look at this chart and think that the “lower bound” middle class income category doesn’t really feel like middle class at all. That’s because the term “middle” here is being used in its most literal sense. Middle income, in other words, does not always signify middle class. So if you find yourself located in the middle range but it doesn’t feel like middle class to you, it might have something to do with the skyrocketing cost of living (driven up principally by housing costs) and the fact that most people aren’t seeing large boosts to their salaries. Lost jobs, stagnating income, and general feelings of insecurity are all as important as economic criteria in determining how a given individual feels about their class status.

Interestingly, because there is such an extreme concentration of earnings at the top end of the scale, the “middle” number is trending higher over time. If the trend persists, more and more people will fall outside this range. It’s no wonder that a Pew survey found that between 2000 and 2013, the number of people considered “middle class” fell in every single state. People are falling out of the middle class at an alarming rate! These numbers should serve as a stark reminder of just how little most Americans actually have. The take-away here is that middle class ain’t what it used to be.

More than this, there isn’t one middle class; rather, there are “many” middle classes. What all of them require, according to experts, is a feeling of income security. Without income security, theoretically and practically speaking, there can be no middle class.

[Note: using the “median,” rather than the “mean,” measure of income generally produces a result that reflects a more accurate picture of income/earnings, because the median is less likely to be skewed by extreme values at either end of the distribution].
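The note above can be demonstrated with a toy example: one extreme outlier drags the mean far above the typical value, while the median stays put. The incomes below are illustrative:

```python
from statistics import mean, median

# Five illustrative household incomes; one extreme outlier.
incomes = [30_000, 45_000, 52_000, 61_000, 2_500_000]

print(f"mean:   ${mean(incomes):,.0f}")    # $537,600 -- pulled up by the outlier
print(f"median: ${median(incomes):,.0f}")  # $52,000  -- the "typical" household
```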

Median Household Income and the US Economy

Since 1980, U.S. gross domestic product (GDP) per capita has increased 67%, while median household income has only increased by 15%. An economic recession will normally cause household incomes to decrease, often by as much as 10%.

[Chart: median household income vs. GDP per capita over time]

Median household income is a politically sensitive indicator. Voters can be critical of their government if they perceive that their cost of living is rising faster than their income. Figure 1 shows how American incomes have changed since 1970. The previous recession, in the early 2000s, began with the bursting of the dot-com bubble and affected most advanced economies, including the European Union, Japan and the United States.

The current crisis began with the bursting of the U.S. housing bubble, which caused a problem in the dangerously exposed subprime-mortgage market. This in turn triggered a global financial crisis. In constant prices, 2011 American median household income was 1.13% lower than it was in 1989. This corresponds to a 0.05% annual decrease over a 22-year period. In the meantime, GDP per capita increased by 33.8%, or 1.33% annually.
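As a check on the arithmetic above, the annualized figures follow from compounding the total change over the 22-year span:

```python
# Turn a total percentage change over a period into the equivalent
# compound annual rate: (1 + total) ** (1 / years) - 1.
def annual_rate(total_change, years):
    return (1 + total_change) ** (1 / years) - 1

# Median household income: down 1.13% from 1989 to 2011 (22 years).
print(f"{annual_rate(-0.0113, 22):+.2%} per year")  # -0.05% per year

# GDP per capita: up 33.8% over the same 22 years.
print(f"{annual_rate(0.338, 22):+.2%} per year")    # +1.33% per year
```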

The Politics of Poverty

Now, if we were to think about these findings in light of poverty issues, the picture becomes even more grim. The federal guidelines for what we consider the “poverty line” set an income threshold to determine who is poor enough to qualify for benefits. That threshold, as it turns out, is set painfully low.

So for example, according to the 2015 guidelines, an income above $11,770 (for a single person) means, as far as the government is concerned, that you aren’t really poor. Unfortunately, this delusion has real world consequences. Everything from healthcare subsidies to food stamps to housing assistance is based on this rigid guideline, which has proven difficult to change for reasons that are largely political. This means that many people who are struggling can’t get access to the help they need.

Discussion Questions

How do you feel about your individual or family status, based on what is reflected on the income chart?

Do you feel anxiety about your earnings ability and your ability to secure a comfortable lifestyle?

How do the psychological dimensions of class interact with the economic dimensions of class in your own life? Do the numbers say one thing, and your feelings another? In other words, do you sometimes feel conflicted about your class status?

Course: Classical Social Theory, Race & Ethnicity

Should We “Fix” Poverty?

76 Comments


Poverty in the Land of the Free

Why is there so much poverty in a wealthy country like the United States? And we might also ask: why do so many Americans dislike anti-poverty programs? This is the question posed by Martin Gilens in his book Why Do Americans Hate Welfare? (1999).

Dramatic cuts in welfare have been called for by politicians from both major political parties in the U.S. In doing so, they are capitalizing on distorted public opinions and “feelings,” rather than data, to further erode crucial aspects of a social safety net that is already full of holes. So again, we must ask: why?

Gilens’s research aims to answer this question (more on that later). For now, let’s take a look at some facts and figures contained in official government statistics, which are put together by the US Census Bureau.

In order to talk about “poverty” we should first agree on a working definition.

To define poverty in America, the Census Bureau uses what are called ‘poverty thresholds’ or Official Poverty Measures (OPM), updated each year. Note that there are two different versions of the federal poverty measure. The differences may be slight but they are important:

  • The poverty thresholds, and
  • The poverty guidelines

Poverty thresholds are the original version of the federal poverty measure. They are updated each year by the Census Bureau. The thresholds are used mainly for statistical purposes — for instance, they are used to prepare estimates of the number of Americans in poverty each year. To be clear, all U.S. government official poverty population figures are calculated using the poverty thresholds, not the guidelines. These thresholds are applied to a family’s income to determine their poverty status. Official poverty thresholds do not vary geographically, but they are updated for inflation using the Consumer Price Index.

Note that the official poverty definition uses money income before taxes and does not include capital gains or non-cash benefits (such as public housing, Medicaid, and food stamps). To put it simply, in 2020, a family of four is considered to be living in poverty if their family income falls below $26,200.

The poverty guidelines are another federal poverty measure. They are issued each year in the Federal Register by the Department of Health and Human Services (HHS). The guidelines are a simplification of the poverty thresholds, which are used to determine financial eligibility for certain federal programs.
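As a rough sketch of how a threshold is applied (using the 2020 family-of-four figure cited above; the bare comparison is a simplification of the Census Bureau’s actual procedure, and the incomes below are illustrative):

```python
# The official definition compares a family's pre-tax money income
# (excluding capital gains and non-cash benefits) to the threshold
# for its family size.
THRESHOLD_FAMILY_OF_FOUR_2020 = 26_200

def below_threshold(pretax_money_income, threshold=THRESHOLD_FAMILY_OF_FOUR_2020):
    return pretax_money_income < threshold

print(below_threshold(24_000))  # True  -- counted as living in poverty
print(below_threshold(30_000))  # False
```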

Poverty as of 2019

In 2019, the overall poverty rate in the U.S. was 10.5%, or 34.0 million people. Almost half of those (15.5 million) were living in deep poverty, with reported family income below one-half of the poverty threshold.

To put this in terms of income, the poverty line in 2019 was $25,926 for a family of four.

Child Poverty Rate: 14.4% (10.5 million people)

Percentage of children under age 18 who fell below the poverty line in 2019

Women’s Poverty Rate: 11.5% (19.0 million people)

Percentage of females who fell below the poverty line in 2019

African American Poverty Rate: 18.8% (8.1 million people)

Percentage of African Americans who fell below the poverty line in 2019

Hispanic Poverty Rate: 15.7% (9.5 million people)

Percentage of Hispanics who fell below the poverty line in 2019

White Poverty Rate: 7.3% (14.2 million people)

Percentage of non-Hispanic whites who fell below the poverty line in 2019

Native American Poverty Rate: 23.0% (600,000 people)

Percentage of Native Americans who fell below the poverty line in 2019

People with Disabilities Poverty Rate: 22.5% (3.3 million people)

Percentage of people with disabilities ages 18 to 64 who fell below the poverty line in 2019

To summarize, these rates tell us that whites by far constitute the largest number of people living in poverty, while African Americans are disproportionately represented as a group (18.8% vs. 7.3% of whites). This out-sized representation contributes significantly to the perception that African Americans are taking advantage of the system, even though more whites receive benefits. Children are also represented in high numbers, as are the elderly, who are not broken out separately in this table.
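The distinction between rates and counts in this summary can be made concrete: dividing each group’s poverty count by its rate backs out the group’s approximate total population, which is why whites can have the lowest rate yet the highest count. Figures are from the 2019 data above:

```python
# Poverty counts (millions of people) and rates from the 2019 figures.
groups = {
    "White (non-Hispanic)": (14.2, 0.073),
    "African American":     (8.1, 0.188),
}

for name, (count_m, rate) in groups.items():
    implied_population_m = count_m / rate  # back out the group's total size
    print(f"{name}: {count_m}M poor at {rate:.1%} implies ~{implied_population_m:.0f}M people")
```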


United Nations Report on Extreme Poverty

Not long ago (December 2017), the United Nations Special Rapporteur on extreme poverty and human rights, Professor Philip Alston, issued a formal statement which provided an assessment of poverty in the United States. His report details findings from a 15-day fact-finding mission that took him into some of the poorest neighborhoods in the U.S., in places that included California, Alabama, Georgia, Puerto Rico, West Virginia, and Washington DC.

Alston began his statement with a nod to the passing of sweeping new tax reforms, as he said “my visit coincides with a dramatic change of direction in US policies relating to inequality and extreme poverty. The proposed tax reform package stakes out America’s bid to become the most unequal society in the world, and will greatly increase the already high levels of wealth and income inequality between the richest 1% and the poorest 50% of Americans.”

Alston goes on to acknowledge that “the United States is one of the world’s richest, most powerful and technologically innovative countries; but neither its wealth nor its power nor its technology is being harnessed to address the situation in which 40 million people continue to live in poverty.”

“American exceptionalism,” he points out, “was a constant theme in my conversations.  But instead of realizing its founders’ admirable commitments, today’s United States has proved itself to be exceptional in far more problematic ways that are shockingly at odds with its immense wealth and its founding commitment to human rights.  As a result, contrasts between private wealth and public squalor abound.”

He further notes that “in practice, the United States is alone among developed countries in insisting that while human rights are of fundamental importance, they do not include rights that guard against dying of hunger, dying from a lack of access to affordable healthcare, or growing up in a context of total deprivation. . . at the end of the day, particularly in a rich country like the USA, the persistence of extreme poverty is a political choice made by those in power. With political will, it could readily be eliminated.”

[Note: Alston is also a professor of law at New York University].

The Deserving and the Undeserving Poor

Back to Gilens. His research draws on a wide range of empirical sources to argue that the problem is more complex: Americans don’t simply all hate welfare.

According to his findings:

  • Americans support government aid to people they believe are “deserving” recipients; in other words, the worthy poor.
  • Americans are grossly misinformed about who is actually getting formal assistance, mainly because the media misrepresent welfare recipients.
  • Media representations, which are mostly visual, disproportionately over-represent African-Americans as aid recipients – especially single mothers.
  • Media executives, especially editors and journalists, are as misinformed as the public. Their life experiences are typically far removed from first-hand experience of poverty or knowing poor people. This makes it difficult for them to understand and appropriately relate to those experiences, which in turn distorts media narratives and results in misreporting.
  • Distorted understandings of race are deeply embedded in the making of welfare policy, resulting in welfare being understood as a “black”-serving program. As such, people judge it as not deserving of support (Gilens, 1999).

Contradictions

What is interesting about Gilens’s research is that he is able to analyze public opinion polling data to show that there is, in fact, widespread support for the idea of a social safety net in general and for welfare to the poor in particular. But some inconsistencies emerge: these sentiments did not carry over and translate into support for African Americans. How did this happen?

According to Gilens, media representations of people living in poverty changed over time. He studied book reviews and stories about poverty and noted that coverage began to increase in the 1960s, when the number of welfare recipients started to grow amid the racial turmoil and civil unrest of that era. This was true for black as well as white recipients; whites, due to their larger overall numbers, constituted the largest number of welfare recipients. Despite this, the public came to see welfare as a program that mainly benefited African-Americans. Gilens attributes this to distorted media narratives about poverty and welfare, many of which still have currency in our present time.

The important takeaway here is that the media do not simply act as an amplifier of public opinion; they are in many respects responsible for manufacturing public opinion. Ultimately, this exerts a major influence on our public policy, which instead of being based on facts ends up cynically indulging people’s feelings about who should get public help and who should be written off as unworthy.

This is why we see in the United States unwavering support for what are essentially draconian welfare reforms that hurt the most needy in the interest of punishing those the public believes deserve punishment. Americans, according to Gilens, support these cuts because they are mistaken about who is on welfare, assuming many recipients are undeserving.

These views link up with other narratives and ideas that run deep in American culture: the idea that everyone who works hard will be able to achieve their dreams, the idea that everyone must exercise “personal responsibility” when it comes to work and taking care of their family, and the idea that relying on government help for any reason is indicative of personal failing.

A Perfect Problem In An Imperfect World

(The following article is re-blogged: “The myth destroying America: Why social mobility is beyond ordinary people’s control,” by Sean McElwee)

Many cultures have viewed poverty as an inescapable part of an imperfect world. Throughout history, societies have suffered from two kinds of poverty: social poverty, which withholds from some people the opportunities available to others; and biological poverty, which puts the very lives of individuals at risk due to lack of food and shelter. Perhaps social poverty can never be eradicated, but in many countries around the world, biological poverty is a thing of the past.

Until recently, most people hovered very close to the biological poverty line, below which a person lacks enough calories to sustain life for long. Even small miscalculations or misfortunes could easily push people below that line, into starvation. Natural disasters and man-made calamities often plunged entire populations over the abyss, causing the death of millions.

Today most of the world’s people have a safety net stretched below them [note: the very idea of a “safety net” is under attack in the United States for political reasons and ideologies born out of “free market” fundamentalism; some politicians have referred to the net as a “hammock”]. Individuals are protected from personal misfortune by insurance, state-sponsored social security and a plethora of local and international NGOs. When calamity strikes an entire region, worldwide relief efforts are usually successful in preventing the worst. People still suffer from numerous degradations, humiliations and poverty-related illnesses, but in most countries, nobody is starving to death. In fact, in many societies, more people are in danger of dying from obesity than from starvation.

As science began to solve one unsolvable problem after another, many became convinced that humankind could overcome any and every problem by acquiring and applying new knowledge. Poverty, sickness, wars, famines, old age and death itself were not the inevitable fate of humankind. They were simply the fruits of our ignorance.

We are living in a technical age. Many are convinced that science and technology hold the answers to all our problems. We should just let the scientists and technicians go on with their work, and they will create heaven here on earth. But science is not an enterprise that takes place on some superior moral or spiritual plane above the rest of human activity. Like all other parts of our culture, it is shaped by economic, political and religious interests.

Poverty, consequently, rather than being seen as a “technical” problem that might be fixed, is often seen as a moral failing: it is the poor themselves who are to be blamed.

Research on Poverty

According to a new report from the Pew Research Center, Americans are almost evenly split over who is responsible for poverty and whether the poor have it easy or hard. Here are some figures from the data:

  • 44% think that the government should do more for the needy, even if it means more debt
  • 51% think the government can’t afford to do more for the needy and shouldn’t
  • 45% think that poor people today have it easy
  • 47% think that poor people have it hard

[Chart: Pew survey responses on poverty and government aid]

What is interesting here is how survey responses correlate with whether the respondents themselves are rich or poor. Not surprisingly, a larger proportion of the least economically secure (two-thirds) think government benefits don’t go far enough; the proportion of people who share this view diminishes among economically secure people (only one-third). The pattern repeats when people are asked whether the government should and can do more: 60% of the least economically secure say “yes,” while 62% of the most secure say “no.”

The Myth of the American Dream

In the United States, there is a strongly held conviction that with hard work, anyone can make it into the middle class. Pew finds, however, that Americans are far more likely than people in other countries to believe that work determines success, as opposed to other factors beyond an individual’s control. Unfortunately, this positivity comes with a negative side — a tendency to pathologize those living in poverty.

In other words, Americans are more inclined to blame individuals for structural problems. Thus we find that 60 percent of Americans (compared with 26 percent of Europeans) say that the poor are “lazy.” Only 29 percent of Americans say those living in poverty are trapped in poverty by “factors beyond their control” (compared with 60 percent of Europeans).

Again, it is important to distinguish here how the survey responses reflect people’s “beliefs” – and this differs from the data and evidence. While a majority of Americans might think that hard work determines success and that it should be a relatively simple business to climb and remain out of poverty, the empirical reality is that the United States has a relatively entrenched upper class, but very precarious, ever-shifting lower and middle classes.

As for welfare, while many Americans hate welfare, the data suggest they are fairly likely to rely on it at one point or another. In their recent book, “Chasing the American Dream,” sociologists Mark Robert Rank, Thomas Hirschl and Kirk Foster argue that the American experience is more fluid than both liberals and conservatives believe. Using Panel Study of Income Dynamics (PSID) data — a survey that tracked 5,000 households (18,000 individuals) from 1968 to 2010 — they show that many Americans have temporary bouts of affluence (defined as eight times the poverty line), but also temporary bouts of poverty, unemployment and welfare use.

Keep in mind that “welfare” is not just food stamps. This study tracked use of Medicaid, food stamps, Temporary Assistance for Needy Families (formerly Aid to Families with Dependent Children), Supplemental Security Income, and any other cash or in-kind programs that rely on income level to qualify. The chart below illustrates different measures of economic insecurity experienced by people relative to time spent claiming benefits.

[Chart: measures of economic insecurity vs. time spent claiming benefits]

Researchers found that a large number of Americans eventually fall into one of the “welfare” categories, but few stay “welfare dependent” for long. Instead, the social safety net does as it is intended – it catches them – and allows them to get back on their feet.

The same authors also found that the risk of poverty is higher for people of color. (Since the PSID began in 1968, most non-white people in the survey have been black.) And while most Americans will at some time experience affluence, again, this experience is segregated by race.

[Chart: experience of affluence and poverty by race]

Social Mobility

In a study published earlier this year, Rank and Hirschl examine the top 1 percent of wage earners and find that entry into it is more fluid than previously thought. They find that 11 percent of Americans will enter the 1 percent at some point in their lives. But here again, access is deeply segregated. Whites are nearly seven times more likely to enter the 1 percent than non-whites. Further, those without physical disability and those who are married are far more likely to enter the 1 percent. The researchers, however, didn’t measure how being born into wealth affects an individual’s chances, but there are other ways to estimate this effect.

For instance, a 2007 Treasury Department study of inequality allows us to examine mobility at the most elite level. On the horizontal axis (see below) is an individual’s position on the income spectrum in 1996. On the vertical axis is where they were in 2005. To examine the myth of mobility, I focused on the chances of making it into the top 10, 5 or 1 percent. We see that these chances are abysmal. Only 0.2 percent of those who began in the bottom quintile made it into the top 1 percent. In contrast, 82.7 percent of those who began in the top 1 percent remained in the top 10 percent a decade later.

[Chart: income mobility, 1996–2005 (Treasury Department)]

One recent summary of twin studies suggests that “economic outcomes and preferences, once corrected for measurement error, appear to be about as heritable as many medical conditions and personality traits.” Another finds that wages are more heritable than height. Economists estimate that the intergenerational elasticity of income, or how much income parents pass on to their children, is approximately 0.5 in the U.S. This means that parents in the U.S. pass on 50 percent of their incomes to their children. In Canada, parents pass on only 19 percent of their incomes, and in the Nordic countries, where mobility is high, the rate ranges from 15 percent (in Denmark) to 27 percent (in Sweden).

There is reason to believe that wealth, which is far more unequally distributed than income, is also more heritable. In his recent book, “The Son Also Rises,” Gregory Clark explores social mobility in societies spanning centuries. According to Clark, “current studies… overestimate overall mobility.” He argues as follows:

“Groups that seem to persist in low or high status, such as the black and the Jewish populations in the United States, are not exceptions to a general rule of higher intergenerational mobility. They are experiencing the same universal rates of slow intergenerational mobility as the rest of the population. Their visibility, combined with a mistaken impression of rapid social mobility in the majority population, makes them seem like an exception to a rule. They are instead exemplary of the rule of low rates of social mobility.”

Clark finds that the residual effects of wealth remain for 10 to 15 generations. As one reviewer writes, “in the long run, intergenerational mobility is far slower than conventional estimates suggest. If your ancestors made it to the top of society… the probability is that you have high social status too.” While parents pass on about half of their income (at least in the United States), Clark estimates that they pass on about 75 percent of their wealth.

Thus, what Rank and Hirschl identify, an often-changing 1 percent, is primarily a shuffling between the almost affluent and the rich, rather than what we would consider true social mobility.

The American story, then, is different than normally imagined. For one, many Americans are living increasingly precarious existences. In another paper, Hirschl and Rank find that younger Americans in their sample are more likely to be asset poor at some point in their lives. But more importantly, a majority of Americans will at some point come to rely on the safety net. Rather than being a society of “makers” and “takers,” we are a society of “makers” who invest in a safety net we will all likely come in contact with at one point or another.

The Gini coefficient measures how equally resources are distributed, on a scale from 0 to 1. With a coefficient of 0, everyone shares all resources equally; in a society with a coefficient of 1, a single person would own everything. While income in the U.S. is distributed unequally, with a Gini of .574, wealth is distributed far more unequally, with a Gini of .834 — and financial assets are distributed with a Gini of .908, with the richest 10 percent owning a whopping 83 percent.
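For readers who want to see how the coefficient behaves, here is one standard formulation (a sketch for intuition, not the method used in the studies cited):

```python
# Gini coefficient of a list of holdings: 0 = everyone holds the same,
# approaching 1 = one person holds everything.
def gini(values):
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    # Equivalent to the mean-absolute-difference definition:
    # G = sum_i (2i - n + 1) * x_i / (n * total), x sorted, i = 0..n-1.
    return sum((2 * i - n + 1) * x for i, x in enumerate(xs)) / (n * total)

print(gini([1, 1, 1, 1]))    # 0.0  -- perfect equality
print(gini([0, 0, 0, 100]))  # 0.75 -- nears 1 as the group grows
```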

Wealth and financial assets are the ticket to long-term financial stability; those who inherit wealth need never fear relying on the safety net. And it is these few individuals, shielded from the need to sell their labor on the market, who have created the divisive “makers” and “takers” narrative in our contemporary politics.

Using race as a wedge, they have tried to gut programs that nearly all Americans will rely on. They have created the myth of the self-made individual, when in fact, most Americans will eventually need to rely on the safety net. They treat the safety net as a benefit exclusively for non-whites, when in reality, whites depend upon it too (even if people of color are disproportionately affected).


As many scholars have noted before, the way the welfare state works (where inefficient tax credits are given to the middle class) is a big part of why this delusion has been sustained.

It is therefore not that Americans believe themselves to be “temporarily embarrassed millionaires,” but rather “self-made men” (with a dose of racism and sexism), that drives opposition to the welfare state.

And by this, I mean that while most people understand they are not likely to become millionaires, few among them realize how much government programs have benefited them throughout their lives.

Sources

The source for this article, including the charts referenced in it, is Sean McElwee. His original article, published by Salon, is entitled “The myth destroying America: Why social mobility is beyond ordinary people’s control.” Link no longer available.

Poverty Data Sources

The Census Bureau reports poverty data from several major household surveys and programs.

The Annual Social and Economic Supplement (ASEC) to the Current Population Survey (CPS) is the source of official national poverty estimates. The American Community Survey (ACS) provides single and multi-year estimates for smaller areas.

The Survey of Income and Program Participation (SIPP) provides longitudinal estimates.

The Small Area Income and Poverty Estimates (SAIPE) program provides model-based poverty estimates for school districts, counties, and states.

Discussion Questions

How should an affluent society like the United States respond to poverty?

Millions of Americans lack access to sufficient food and shelter. What should we do with them?

Why do you think so many Americans hate the idea of welfare even as they also support helping the poor?

Do you think the United States should provide for a social safety net? (setting a minimum threshold for subsistence…or not?)

When you close your eyes and imagine a picture of someone who fits the description of “deserving poor,” what do they look like? Do the same for “undeserving poor.” What do they look like? (Think in terms of age, gender, and race.)

What do you think about programs like Medicaid and Medicare? Do you know what they are and how they work? (Medicaid is an anti-poverty program; Medicare is a benefit for people over the age of 65 that is funded through payroll deductions over the course of one’s working lifetime.) Should we maintain these programs, make them more or less available, or get rid of them?

How might “personal responsibility,” “personal freedom,” and “small government” narratives make it difficult to deal with social problems at the policy level?

How do you think we might address the problem of persistent inter-generational poverty and social inequality (think about places like Appalachia, West Virginia, and Kentucky in particular, and even rural and deindustrialized parts of Pennsylvania)?

Do you think that the government providing things like job training and food stamps is enough to fix the problem? Is it too much help or not enough?

What do you think about the sentiment “No one deserves to be poor?” Or do some people deserve it and, likewise, deserve to be punished?

How might our economy be systematically organized, even “rigged,” to condemn many people, including a disproportionate number of African Americans, to live lives of poverty and desperation?

Look at your own neighborhoods and towns. Do you think the poverty that you see is a product of economic structural failure (widespread job loss and the re-ordering of the local economy to provide only low wage jobs) or do you think it is the result of people simply not working hard enough?

Course: Classical Social Theory, Current Social Theory, Race, Crime & Justice, Social Problems

Survival of the Richest

17 Comments

 


Recent economic data paints a bleak picture of the financial health of middle-class Americans. This group is rapidly losing ground to another group sometimes derisively referred to as the “one-percenters” (or, as Marx called them, the “bourgeoisie”), a group that averaged $5 million in wealth gains over just three years. The global 1% increased their wealth as well, from $100 trillion to $127 trillion over the same period.

Let’s break that down with some analogies we can all relate to:

Each Year Since the Recession, America’s Richest 1% Have Made More Than the Cost of All U.S. Social Programs

What you have here, effectively, is a transfer of wealth in reverse: from the poor to the rich. Even as political conservatives blame Social Security for being too costly and social welfare programs for being too generous, most members of the 1% wealth club continue to accumulate wealth at record speed. The numbers are nearly unfathomable. Different estimates cite the American 1% as taking in anywhere from $2.3 trillion to $5.7 trillion per year.

Even the smaller estimate of $2.3 trillion per year is more than the budget for Social Security ($860 billion), Medicare ($524 billion), Medicaid ($304 billion), and the entire safety net ($286 billion for SNAP, WIC [Women, Infants, Children], Child Nutrition, Earned Income Tax Credit, Supplemental Security Income, Temporary Assistance for Needy Families, and Housing).
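The arithmetic behind that comparison is worth making explicit. A back-of-the-envelope check, using only the budget figures quoted above (not official budget data):

```python
# Annual program budgets quoted above, in billions of dollars
social_security = 860
medicare = 524
medicaid = 304
safety_net = 286  # SNAP, WIC, Child Nutrition, EITC, SSI, TANF, Housing

total_programs = social_security + medicare + medicaid + safety_net
print(total_programs)  # 1974 billion, i.e. just under $2 trillion

# Even the low-end estimate of annual 1% income exceeds this total
one_percent_income_low = 2300  # $2.3 trillion, in billions
print(one_percent_income_low > total_programs)  # True
```

So the combined cost of Social Security, Medicare, Medicaid, and the entire safety net comes to roughly $1.97 trillion, below even the smallest estimate of what the 1% takes in each year.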


Even the Upper Middle Class Is Losing 

In just three years, from 2011 to 2014, the bottom half of Americans lost almost half of their share of the nation’s wealth, dropping from a 2.5% share to a 1.3% share.

Most of the top half lost ground, too. The 36 million upper middle class households just above the median (6th, 7th, and 8th deciles) dropped from a 13.4% share to an 11.9% share. Much of their portion went to the richest one percent.

This is big money. With total U.S. wealth of $84 trillion, the three-year change represents a transfer of wealth of over a trillion dollars from the bottom half of America to the richest 1%, and another trillion dollars from the upper middle class to the 1%.
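Those two “trillion dollar” figures follow directly from the share changes quoted above; here is a quick sketch of the arithmetic, assuming the $84 trillion total:

```python
TOTAL_WEALTH = 84.0  # total U.S. wealth, in trillions of dollars

# Bottom half of households: share fell from 2.5% to 1.3%
bottom_half_loss = (0.025 - 0.013) * TOTAL_WEALTH
print(round(bottom_half_loss, 2))  # roughly 1.01 trillion dollars

# Upper middle class (6th-8th deciles): share fell from 13.4% to 11.9%
upper_middle_loss = (0.134 - 0.119) * TOTAL_WEALTH
print(round(upper_middle_loss, 2))  # roughly 1.26 trillion dollars
```

A 1.2-point share change sounds small, but applied to an $84 trillion base it amounts to about a trillion dollars, which is why the article describes it as a trillion-dollar transfer.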

Almost None of the New 1% Wealth Led To Innovation and Jobs

In 2005, for example, for every $1 of financial wealth there was 66 cents of non-financial (home) wealth. Ten years later, for every $1 of financial wealth there was just 43 cents of non-financial (home) wealth. What happens to all this financial wealth?

Over 90% of the assets owned by millionaires are held in low-risk investments (bonds and cash), the stock market, and real estate. Business startup costs made up less than 1% of the investments of high net worth individuals in North America in 2011. A recent study found that less than 1 percent of all entrepreneurs came from very rich or very poor backgrounds; most come from the middle class.

On the corporate side, stock buybacks are employed to enrich executives rather than to invest in new technologies. In 1981, major corporations were spending less than 3 percent of their combined net income on buybacks, but in recent years they’ve been spending up to 95 percent of their profits on buybacks and dividends.

Just 47 Wealthy Americans Own More Than Half of the U.S. Population

Oxfam reported that just 85 people own as much as half the world. Here in the U.S., with nearly a third of the world’s wealth, just 47 individuals own more than all 160 million people (about 60 million households) below the median wealth level of about $53,000.

The Upper Middle Class of America Owns a Smaller Percentage of Wealth Than the Corresponding Groups in All Major Nations Except Russia and Indonesia.

The upper middle class in the U.S., defined as everyone in the top half below the richest 20%, owns 11.9 percent of the wealth. Indonesia at 10.5 percent and Russia at 7.5 percent are worse off, but in all other nations the corresponding upper middle classes own 12 to 27 percent of the wealth.

America’s bottom half compares even less favorably to the world: dead last, with just 1.3 percent of national wealth. Only Russia comes close to that dismal share, at 1.9 percent. The bottom half in all other nations own 2.6 to 10.2 percent of the wealth.

Ten Percent of the World’s Total Wealth Was Taken by the Global 1% in the Past Three Years

As in the U.S., the middle class is disappearing at the global level. An incredible one of every ten dollars of global wealth was transferred to the elite 1% in just three years. A level of inequality deemed unsustainable three years ago has gotten even worse.

For more on this, consider the following:


The Emperor’s New Clothes

The rich are getting richer, everyone else is struggling. Is that fair? Watch Russell Brand’s new documentary The Emperor’s New Clothes when it debuts in selected cinemas on April 21, 2015.

https://www.youtube.com/watch?v=mBCwM2UdV9c


Graphics shown here are published by Mother Jones magazine, which can be accessed at the following link:

Charts: How the Recovery Left Most Americans Behind

More information appears in an article originally published by Alternet, which features data published by the Credit Suisse 2014 Global Wealth Databook (GWD). You can access it here: http://www.alternet.org/economy/stark-facts-global-greed-disease-challenging-climate-change

Discussion Questions

Despite overwhelming data and evidence that present-day global economic policies, including domestic policies in the U.S., are by their very design transferring public-sector wealth (tax dollars) into private hands, why do you think average Americans remain oblivious to, and even applaud, this process? Why do they get upset about “welfare entitlements,” which pale in comparison to the amount of their individual tax dollars used to underwrite the financial adventures (and misadventures) of wealthy people?

How do you think the different financial crises will impact you (i.e. global economic crisis, student loan debt, housing debt)?

Why is wealth inequality, and not just income inequality, a problem?

Course: Classical Social Theory, Current Social Theory, Race & Ethnicity

Why Should I Care About Social Inequality?

88 Comments


Sociological Concepts: Class, Status, & Party

It might strike you as odd, but classical sociologists were deeply engaged with broad questions about modernity and social inequality. This led them to question the very basic structures and functions of society, and to ask how all of this was bound up with questions about social change.

Marx, Weber, and Durkheim used the concepts of “class,” “status,” and “party” to explain how people organize themselves into a society. Social groups, furthermore, reflected how people self-organized into social niches or social “strata.”

Contemporary sociologists still turn to this foundational work, because it furnishes concepts that we can use to understand complex issues like social inequality and the role that capitalism plays in society.

Why Should I Care About Social Inequality?

Isn’t it more or less dumb to assume that everyone should be equal? Isn’t it just “natural” that some people are smarter, stronger, and richer than other people? Don’t people achieve success and better outcomes simply because they work harder than others? Superficially, many might say “yes.” But the real answer, of course, is that “life is more complex than this.”

Concepts like social “class” allow us to think critically and pursue an answer to these questions. That’s why it is such an important component of theory building in the social sciences. Yet “class” can be understood both as a theoretical concept and as a “variable.” In the case of the latter, when we look closely, we find that what appears to be one variable, “class,” is in reality composed of other related, interlocking concepts and/or variables like wealth, income, and education.

Given this, social science researchers are keen to study the different ways that the “class” variable is constructed. Moreover, they are interested to understand how this dynamic variable potentially interacts with other variables that reflect different levels or “dimensions” of social stratification like race and gender.

Whenever researchers try to understand the interlocking dynamics of concepts (or variables) like class, race, gender, and party, this is called taking an intersectional theoretical approach. Some people might call this a “radical” approach; social scientists, however, think it merely reflects an effort to understand the complexity of human social life.

Defining concepts and variables is the first step in any systematic effort to measure outcomes that are embedded within social patterns. This helps us to appreciate the dynamic, multifaceted nature of social group experience.

Looking deeper into social class, it is interesting to examine how groups maintain their “groupness.” In doing so, they often make great efforts not only to distinguish their class, but also to assert social status (because “status” confers social advantages). They do this in ways that are both conscious and unconscious.

As part of this process, they similarly make choices about how to socially identify with political parties (and candidates). To this end, they may not have an especially deep grasp of a party or candidate’s stated policy goals; however, they make subjective choices to affiliate with them based on “feelings” as well as the sense of belonging they achieve through political group affiliation.

Groupthink

Often more interesting are the ways in which individuals demonstrate a committed desire to maintain a social group affiliation, which political scientists find frequently overrides any one individual’s desire to demonstrate independent thought on a political issue. Psychologists sometimes refer to this as “groupthink,” a mode of thinking in which individual members of cohesive groups tend to accept a viewpoint or conclusion when it represents a perceived group consensus; they do this regardless of whether or not they, as individual group members, believe it to be valid, correct, or even optimal. For this type of person, it is more important to be socially recognized as professing beliefs consistent with their social group identification than it is to be “right” about any given issue.

Equality vs. Equity

Karl Marx

Marx analyzed the development of modern capitalism and predicted the emergence of polarized social class conflict. He was interested in how a given individual’s relation to the means of production determined their class position. His simple approach held that people fit into one of two basic social groups: they are either part of the bourgeois ownership class (the capitalists) or part of the proletariat working class…and nothing in-between.

For Marx, it is the dynamic contradictory social relationship between the two different social classes that forms the basis for social class conflict – he sees conflict as the major motive force for change in social systems based on capitalism.

Put another way, Marx saw social class as the main axis around which power relationships (economic and political) are organized. An individual’s class outlook (how you see the world) is determined by their material position (wealth). Of course, there are theorists who followed Marx and find this conceptual framing a bit too limited (e.g., Pierre Bourdieu). In short, they felt that other things, like “feelings” (i.e., rage), might motivate people to override or disregard their class interests.

Researchers have found over and over again that the material position you are born into has not only a major impact on the way you see the world, but also on how you understand things like politics, major issues, and social problems.

Sadly, they have also found that it is common for people to betray their class interests in favor of indulging their emotions, feelings, and desire to seek proximity to power (even if that “power” seeks their perpetual subjugation).

They love the “invisible hand of the free market” until it one day slaps them in the face. This is the basis for a lot of today’s grievance politics.

Social Mobility

For the record, many people can and do change social classes. Yet despite the efforts of some people, who engage in “hard work” to facilitate this, it is an increasingly uncommon occurrence, statistically speaking. In the United States, for example, the statistical norm is for people to remain in the social class they were born into.

The concept of social mobility refers to the process whereby someone moves from one class to another, either up or down. Such people are known as “class travelers.” That is to say, they change their social class over the course of their lifetime. When this happens, the highest attained social class status will tend to exert the most influence over how one sees the world.

Despite what you might expect, even lower class/status people can possess what are called “aspirational” social identities; which is to say, they identify more strongly with the social class situated above them (the class status they hope to achieve) than with their current social class.

The Tools of Distraction

Marx understood that one of the primary goals of capitalists was to distract working-class people (the proletariat) and prevent them from achieving class consciousness. This can be accomplished in any number of ways. It accounts for why capitalists spend a lot of money and effort trying to entertain working-class people – the classic “bread and circuses” ploy.

Workers are easily distracted by entertainment because they have hard jobs and simply desire some pleasant distraction that doesn’t require hard thinking. They don’t always have the luxury of time to think about, let alone understand, abstract concepts like “capitalism,” or to reflect on how they are exploited in society by capitalists, who can use their money and power to manipulate them into buying products (that they often cannot afford) and serving capitalist interests.

When working-class people buy into the belief system that they must conspicuously consume (materialist ideology) in order to be seen as worthy, one outcome, according to Marx, is that it prevents them from achieving class consciousness and solidarity with other workers. Now they have to work even harder to pay for the things they bought, and they remain committed to individualist endeavors because they are always chasing the next paycheck to pay the bills.

Why is this important? Because alienated workers who carry debt burdens are easy to manipulate; they don’t have the time or inclination to seek out or create social bonds with other workers, which are their best shot at leveraging their collective power to overthrow the system that oppresses them.

Now, some people may say “exploited my ass….just quit your job if you don’t like the work.” But that’s not always an option for poor or working class people, who are often focused on “survival” activities so they can get through another day. Future planning looks like a luxury to people who are struggling to survive.

Marx & Conflict Theory

Conflict theorists in the social sciences trace their intellectual roots to Marx, in light of his emphasis that social “conflict” defines the relationship between class factions and is the motive force for social change. Conflict, according to Marx, is inevitable, because the capitalists who control the material resources and wealth in society are not likely to ever give up the game of exploiting workers in order to attain more wealth and profit.

More to the point, capitalists keep the deck stacked in their favor by using their influence to shape the key major social institutions in society: the education system, the criminal justice system & laws, the media, and healthcare.

Violence is hard-wired into the System

Violence is, according to Marx, understood to be a hard-wired feature of capitalism. In order to keep people laboring at low levels of compensation (or maybe even working for free, i.e. slavery), capitalists will at some point have to resort to coercion; that is, violence. This refusal by capitalists to acknowledge and compensate workers relative to the important contributions that they make, in Marx’s view, necessitates the violent overthrow of the social system. 

Marx says that the only way to end the cycle of violence is to end capitalism and reorganize the social order in a more equitable fashion, so that ALL people can profit from the system, not just the wealthiest of the wealthy.

Alienation

Another concern Marx had with regard to capitalism and social inequality is that it also produces alienation. According to him, as capitalism advanced and people were forced to sell their labor to survive, the division of labor increased; this resulted in workers becoming alienated. Marx says workers became alienated on four different levels:

• alienated from the objects/products they produce
• alienated from the process of production
• alienated from themselves – their “species-being”
• alienated from other people.

In other words, the experience of selling their labor resulted in a loss of control and power over their life as well as their labor. People became slaves of the objects they produced in the same manner as they became appendages of machines.

The inherent structural disparity between the members of the working class and the owners of capital, Marx believed, contained the seeds for revolution, where the working class would have no choice but to rise up to throw off their capitalist oppressors.

Put another way, he believed that the two social classes must eventually “clash” and that capitalism, as a system of social organization, must be superseded in order for man to recover his alienated self and be free from class domination.

Here’s a little video to explain Marx’s concept of alienation:

Max Weber

Weber was most interested in the formation of the modern state and the rise of modern organizations. He formulated a three-component theory of social stratification, comprising class, status, and party. Some people describe his intellectual work as the sociology of domination.

He was also committed to understanding the history of “rationality.” To be more specific, we might understand his intellectual project as one dedicated to understanding why modern forms of domination – rational legal authority – developed first in Europe. For Weber, rationalization was a process whereby “ends” and “means” were progressively clarified.

Unlike Marx, Weber thought it was important to think about social positioning in terms that took into account non-economic qualities like honor, prestige, religion, and the political power domain.

Weber does not see class as forming a basis for social action and change. He doesn’t theorize a “working class” in the same manner as Marx does. Weber rather speaks of an upper, middle, and lower class within lifestyle groups. Nevertheless, he still considers social class important in determining an individual’s “life chances.”

Weber, in this respect, understood there to be a multiplicity of classes in any given society, which contained multiple overlapping dimensions and groups. People were driven to achieve their individual goals/ends by employing different means within these overlapping structures. That’s because for Weber, unlike Marx, it’s not just about economics.


Weber is also known for his seminal work The Protestant Ethic and the Spirit of Capitalism, wherein he coined the term the “Protestant work ethic” to describe how links made by theologians between religion, work, and capital laid the groundwork for capitalism.

Calvinist theology is heavily invested in the idea of “predestination,” which dictates that only an elect few are predestined for salvation from birth. Confoundingly, the opposite also held true: poverty and abjection were signs you’d been denied God’s grace; that you were not one of the “chosen” people. You don’t have any material success to point to as an outward sign of God’s grace and so you are damned.

This double-predestination doctrine, despite being cruel and despotic, has been (and remains) very successful. Historically, Protestant societies have tended to be wealthier than Catholic ones, which in turn have tended to be wealthier than Orthodox ones. The United States represents an example of a wealthy society built on Calvinism. Yet here, even as the “Protestant ethic” persists, most people have long since forgotten its religious origins. It is also worth noting that when this ethic developed in the U.S., many people in the country weren’t even considered people (i.e., slaves). In this social context, slaves were blamed for their misfortune, subjugation, and torture, even as the legitimacy and value of their labor was erased (slaves worked hard but weren’t even regarded as human).

Naturally, all of this caused people quite a bit of personal anxiety, and so they were compelled to look for hints or signs that they were members of the elect. Since they were invested in the idea that material success was among the most notable indicators of God’s favor, they spent their lives doing hard work, setting about to create God’s kingdom on Earth through a secular vocation. Hard work was thus considered the primary pathway to God’s grace.

Again, these feelings and beliefs continue to persist, even though the original religious connection may be no longer recognizable. A strong argument can be made that this kind of religious morality continues to inform how many of us think not only about work but also about people whose ancestors were slaves (African Americans).

Note, if religious Calvinism served as the nasty model for capitalism, by way of contrast, the Nordic model (based on Lutheranism), represents an attempt to clean it up a bit, as it aimed to smooth over capitalism’s rough edges and messy exploitation, while still emphasizing the virtue of working hard. This is why people tend to praise the Nordic model whereas they criticize the U.S./neoliberal model of capitalism.

Race & the Protestant Work Ethic

“The Protestant work ethic that influenced the founding of this country included a belief that the more material wealth you have, the closer you are to God,” said Robin DiAngelo, a professor whose research focuses on how white people are socialized to collude with institutional racism.

“So during slavery, we said, ‘You must do all the work but we will never allow that to pay off.’ Now we don’t give black people access to work. Then and now they have not been allowed to participate in wealth building or granted the morality we attach to wealth” (DiAngelo).

This historical entanglement of property and virtue continues to inform racial views.

“Property among white Americans is seen as something to be treasured and revered,” said Winbush. “Black Americans, for reasons having to do with this history of disenfranchisement, have not viewed themselves as truly owning anything in America” (DiAngelo).

Put differently, when we think about people within a context where work/labor contributes to building God’s kingdom on Earth in a very physical way, anyone whose labor was not recognized as legitimate was denied access to material wealth, success, and social status – all the things that supposedly were indicators of one’s “chosen” status.

Denying the labor of black Americans thus became an important part of the legitimating ideology of white supremacy – it helped reinforce it.

Ethicist Katie Geneva Cannon has written at length about how the institutional denial of citizenship and freedom to black people essentially precluded the possibility of them ever being seen as virtuous in white society.

“The ‘rightness of whiteness’ counted more than the basic political and civil rights of any Black person…

Eventually, institutional slavery ended, but the virulent and intractable hatred that supported it did not,” Cannon wrote in The Emergence of Black Feminist Consciousness. For it is through both erasure and ignorance that we continue to deny the virtue and legitimacy of black citizenship and labor (Cannon).

Although it has been a long time since the ideas of the Protestant work ethic took hold – and slavery too was a looooong time ago – we are still trapped within the legacy of inequities brought about by these social dynamics and these ways of understanding who “works hard” and who is socially worthy. While the religious explanations underlying the logic no longer seem relevant, the economic logic remains, emptied of its religious content.

American industriousness would not have come about without free slave labor. War and imperialism (“offense,” taking other people’s stuff, combined with a little bit of “defense”) were also key to industrial progress. That these things are plainly obvious, yet still manage to escape critical recognition, is a problem – one that gets in the way of achieving real social progress.

Pierre Bourdieu

Bourdieu’s work represents something of an elaboration and synthesis of the work of both Marx and Weber. He proposed a functional theory that linked the “material” elements of class with the symbolic “psychic” dimensions of class (could it be that they’re not really separate at all?). Bourdieu was, in this respect, interested to explore how social class gets internalized; that is, how it not only reflects the material aspects of your life, but how it also affects your habits of mind, including the tastes and preferences that you develop for culture (i.e. hair, clothes, music, art, food, sports, people, body style, and other aesthetics). All of these things, according to Bourdieu, reflect the individual’s relation to the dominant class in society; they distinguish the dominated from the dominant.

In light of this important work, we might consider how the cultural realm continues to constitute an important site of tension, conflict, and expression, as people jockey for position in the status hierarchy and engage in status-seeking behavior in order to claim their position in the social order. Bourdieu’s ideas further illuminate how economic and cultural relations might be merged with relations of subordination, based on class, age, and gender (he didn’t say much about race) and in doing so shows how multiple forms of subordination articulate and may be deeply intertwined.


Wealth Distribution

Presently, in the United States, we have an economic situation similar to what Marx described.

The vast majority of people are resource-poor, relatively speaking, when compared to their well-off brethren, who own upwards of 40% of the country’s wealth. The following video clip offers a powerful illustration of wealth distribution dynamics in the contemporary U.S.:

How to Fix it: Policy Solutions

For a long time, economists, policymakers, and many lawmakers have argued we shouldn’t worry about social and wealth inequality; that the real problem to solve is how to reduce poverty. Many of these same people insisted that high levels of inequality were unimportant because policies that benefit wealth accumulation among high-income earners effectively help everyone. “A rising tide lifts all boats” is how the wisdom goes.

New research by Branko Milanovic and Roy van der Weide takes issue with this kind of thinking. They assert that policies derived from such flawed logic are dead wrong. According to the authors, social inequality doesn’t produce gains for the economy as a whole (and everyone across the earning spectrum); rather, inequality only benefits the very very rich.

Thomas Piketty advances claims that echo Marx, as he argues that social inequality is not an accident or the simple result of individual actors/groups making good or bad choices; rather, he says social inequality is a distinguishing feature of capitalism that can only be reversed through state-based policy intervention. Unless capitalism is reformed, he says, the entire democratic social order will be put at risk.

Prior to this research, economists were prone to argue that wealth redistribution slowed down economic growth. These economists maintained that any attempts to reduce economic inequality through the stimulus of policy mechanisms would have the effect of making poverty worse.

If you can follow this contradictory logic, what they are essentially saying is that financial incentives/rewards are a good way to get the wealthy to stimulate the economy, but similar financial incentives/rewards provided to poor people don’t work. This logic further suggests that incentivizing poor people with money will only make them lazy and less likely to work hard and achieve. In short, incentives for me but not for thee.

Given the benefit of an expanding literature on the subject, we now know these assumptions aren’t true; that social inequality actually reduces economic growth. That is to say – the assumptions of many leading economists were backward.

The latest research findings indicate that structuring economic and tax policies to motivate rich people to invest produced even higher levels of social inequality, which prompted poor people to take on more debt as a means to catch up and survive. This has had the effect of destabilizing the economy.

In a consumer-driven economy, when poor and middle-class families cannot afford to consume products, businesses are forced to contend with less revenue and fewer customers. Consequently, instead of providing the poor and middle class with an incentive to better their lives (so they might achieve the American dream), higher levels of social inequality gave rich people an incentive to pull up the economic ladder, leaving everyone else’s boat stranded.

The rich, in other words, left the poor and the middle class behind. More than this, they invested their riches in building walls to seclude themselves, out of fear of angry poor people, who they know are at some point going to figure out that they got played. Now you know why people increasingly feel like they need so many guns to protect themselves…and why billionaires are building bunkers in New Zealand.

So now, instead of being forced to always work harder, the rich are able to sit back and enjoy the fruits of their accumulated wealth (check out HBO’s “The White Lotus”). Research on spending patterns demonstrates that when they’re not spending their wealth on luxury consumables, they often choose to redirect earnings outside the local economy and into off-shore tax havens, through the use of complicated accounting mechanisms and tax deferment schemes.

The poor and middle class, not surprisingly, lose hope and become disenchanted…so much so that they sometimes argue against their own economic interests, insisting that the rich have every right to become rich by any means necessary (even if it means stepping on the necks of poor and working people)…because that’s just “smart business.” This disenchantment stems from restricted opportunity, a perceived lack of fairness in the system, and a lack of belief that the world could be organized differently, so that poor people would not have to be exploited.

The problem with the continued oppression of poor people is twofold: 1) at some point they will get angry and rise up; and 2) with earnings depressed and employment not always stable, they will have less money to spend/invest to keep the system chugging along. Bear in mind, one consequence of all this is that they probably have less money to invest in their own education to get ahead, as they are increasingly buried in debt and find it nearly impossible to plan for their future.

Social inequality reduces economic growth because it reduces demand at the same time as it curtails upward social and economic mobility.

In light of this, it is important to think about how your different class and status positions might influence how you think and feel about things like social problems. Or are you so privileged as to think that social inequality is not a problem at all?


What Does a More Equitable Society Look Like?

A more equal society would mean everyone has shelter, healthcare, education, food, and time to rest and play as well as work. It would mean not discriminating on grounds of identity, sex or skin color.

It would mean having a social system that provides everyone with access to facilities such as libraries, galleries, and parks.

It would involve foregrounding egalitarian goals and dramatically curbing corporate power and high pay.

It would mean heeding the call for universal public services (i.e. internet, heat, and hot water).

It would mean prioritizing climate change as a social issue that affects everyone.

It would mean prioritizing healthcare as a social issue that affects everyone.

These are structural social problems; they are not “work ethic” problems.

Discussion Questions:

Why is there social inequality? Is it “natural” as many people assume? Is it made by our institutions and social policy? Or is it simply a reflection of the fact that some people don’t work hard?

Is the United States, land of the free, a “classless” society? Or are there defined social classes? If so, what social class do you relate to? Can you identify your class position (as determined by your birth) and relate to how it might impact the way you see social problems as well as opportunities in the world?

Apply concepts and theories related to class conflict, social inequality, and social stratification (i.e. social class, race, gender) to explain your own life experiences.

How might your class position influence how you see people located at both the lower and higher ends of the wealth scale?

Where have you traditionally believed that most of the wealth is located in our society? Do you think it is held by middle-class people? How might your class position influence the way you think about social problems and policy solutions? 

If you don’t feel particularly “exploited” based on your present-day economic circumstances, does that make it difficult for you to relate to someone else who might feel that way?

Do you find yourself saying things like, “if you don’t like the way you are being treated, get a better job?” (without questioning why exploitation should be a natural part of someone’s work experience). Think, for example, about McDonald’s workers, Walmart workers, and even union workers.

Do you think that the accumulation of assets at the high end is a simple reflection of “hard work” or “smart work” invested by the people who end up there?

Do you think that the people at the bottom of the wealth distribution are there because they are the natural losers, who didn’t work hard/smart; that they simply failed?

How is the notion of “bootstrapping” and/or the philosophy of “individualism” challenged by the video?

Why do you think working class people are often among the loudest complainers with respect to redistributive politics and programs (programs critics call “socialism”)? What is the major source of their complaint? Do you think, for example, that maybe they complain because they don’t like or relate to many of the people who comprise the working poor (blacks and Hispanics), or do they simply not support any form of benefits (i.e. “handouts”) for poor people in general?

Why do these same working class people often not complain about giving away tax dollars to support tax incentives for wealthy people (i.e. policies that give money/handouts to banks, Wall Street, etc.)?

The chart clearly indicates that most of the country’s wealth is concentrated among the top 1% of wage earners. When you add to this the fact that most government and social programs are paid for by wage taxes extracted from the middle class (because neither the very poor nor the very rich pay a high percentage of their income in taxes), why do you think it is that so many people across the income spectrum are calling for the poor (and not the rich) to pay more taxes? Does this seem logical?

Why do you think that programs that benefit poor people (food stamps/SNAP benefits) are referred to as “welfare,” but programs that benefit the working and middle-class people (home mortgage interest deduction, unemployment compensation, GI bill), wealthy (capital gains taxes), and corporations (tax incentives, subsidies) are not similarly thought of as corporate “welfare?”

Why do you think so many people accept the upside down logic that rich people need financial incentives (high pay or tax cuts) to produce jobs, but providing financial incentives to poor people is bad policy, because giving them money/benefits rewards bad behavior?

Do you think corporations are the true “wealth” generators in society…or might they be the real “Welfare Queens?”


Course: Classical Social Theory, Race & Ethnicity

Selling the American Dream

161 Comments

American Dream III - Winter Wonderland - Peter Crawford

People in general (and especially politicians) love to talk about the American Dream – the idea that competitive individualism and hard work pay off with guaranteed steady progress up the economic ladder. The “dream,” as such, is theoretically attainable for anyone who is willing to chase it. But what if this were not true? Have you ever stopped to think about how you came to believe this?

The American Dream speaks to us all in different ways. Realistically, then, there is not just one dream, but in all likelihood, there are many dreams, even if they are just a variation on a theme: “If I work hard I will be rewarded.”

Why do some people believe in the dream fervently, whereas others do not? How might the extent to which someone embraces the dream (or rejects it) depend on social factors (i.e. circumstances of their upbringing, which includes family status, household wealth, social class, and education)?

Regardless of whether or not one “believes” in the dream, the power of its allure is real. Here is the simple formula that comprises the dream:

“hard work = wealth/success.”

Here again, the beating heart and soul of this type of thinking is an American obsession with competitive rugged individualism and self-reliance. This makes it acceptable to do things like applaud people for giving up vacation, family time, and leisure activities. So much so, that many people work themselves to death to prove themselves worthy of the dream. This formula for success has become so culturally ingrained it almost functions as a social law.

And so it follows, the people who can’t “make it” are those who fail to exert sufficient hard work and effort to achieve success. When or if they fail, it is assumed that they made a “choice.” To quote Donald Trump, from his book “Crippled America,” people who don’t buy into this vision for America are looked at in terms of “disgust,” “weakness,” “losing,” and “pathetic” (Lowndes, 2020).

Presently, during a time when nations the world over are struggling with the COVID-19 pandemic, it is curiously only in the United States that we see people calling for, at best, benign neglect, and even worse in some cases, outright death for the elderly and disabled. Don’t want to risk exposure at your minimum wage job at the grocery store or care home? Too bad. Toughen up. Sux that you don’t have a strong immune system. When people are deemed no longer useful for business, many of their fellow Americans are willing to let them perish. All because they are deemed no longer productive members of the work force.

The rhetorical oppositions of work vs. welfare, self-reliance vs. dependence, individuals vs. the state, citizens vs. foreigners are all oppositions that are animated by the social categories of race, gender, and class—they run deep in American political culture. All are reflected in the politics of the pandemic right now, and offer a grim political vision of American freedom (Lowndes, 2020).

The fact that so many people fail and fall through the cracks, as it were, is more indicative of a flawed economic system than it is of any given individual’s work ethic. Failure, defined in economic terms, is not always a simple matter of being lazy.

Alternatively, those who are either wealthy or have good paying jobs are considered the good people, as indicated by their success. And how do we know they are successful? Well, they usually have the “toys” to prove it (i.e. nice cars, nice house) which proves they worked harder than the average person. In short, they “made it” through their own individual efforts. Right?

This logic, unfortunately, only accounts for a limited spectrum of dynamics that may influence a person’s opportunity for success. It fails, for example, to take into consideration that people’s life chances, more often than not, are determined not only by factors associated with their birth, which include access to family wealth, but also by access to institutions and other opportunity structures (check out the voluminous body of social mobility research that proves this).

Briefly put, one’s attained success in life is not a simple linear formula predicted by the single magic variable of “individual effort.” The reality is more complex than this.

Dreaming But Not Believing

According to a recent annual American Values Survey of 4,500 Americans, nearly half of Americans who once believed in the American dream (defined as working hard to get ahead) now think it no longer exists. Similarly, close to half of all Americans over 18 think their generation is better off financially than their children’s will be (Pathe). That’s pretty bleak.

What symptoms are Americans experiencing that have led to this gloomy outlook? With the results of the survey, the Public Religion Research Institute created an Economic Insecurity Index to try to pinpoint the source of the American economic malaise. They asked their survey participants whether they’d experienced any of six different forms of economic insecurity: Had they reduced meals or cut back on food to save money? Were they unable to pay a monthly bill? Did they put off seeing a doctor for financial reasons? Had they lost a job or had hours reduced? Were they receiving food stamps or unemployment benefits?

The most commonly reported form of economic insecurity reveals a dimension of hardship less publicized than layoffs or unemployment: food insecurity, with 36 percent of respondents saying they’d experienced it. Why food? For a lot of Americans, this is the one budgetary item they feel they can manage/control (compared to whether or not they pay a required monthly bill), and so they find it the easiest expense to cut.

Research demonstrates that blacks, more than Hispanics or whites, have had to cut back on food for economic reasons [note that there are variations among whites based on social class, particularly between those with and without a college education]. These factors are strong predictors of who will have to make food sacrifices.

Even more sadly, the research documents increasing food insecurity among college students. In light of this, colleges are establishing food pantries as a way to help combat the problem.


The PRRI study additionally found that most Americans have a decidedly negative self-evaluation of their financial situation. Roughly 4-in-10 Americans say they are currently in excellent (7%) or good (34%) financial shape (down from 50% in 2010), while a majority of the public report being in only fair (37%) or poor (20%) financial shape. In 2010, half of Americans indicated they were in excellent (9%) or good (41%) shape financially (Pathe).

Today, only 30% of Americans believe the economy has gotten better over the last two years, while 35% say it has gotten worse, and 33% say it has stayed about the same. Keep in mind, this is occurring at a time when the stock market is performing at an all-time high. That says something about who is benefitting (and who is not) from our economic policies in the United States. More on that later (Pathe) [Note: most stocks in the United States are owned by members of the top 1% of society].

Psychology & the Virtue of Selfishness

Imagine for a moment a person who enjoys great wealth and status. What would happen to them if they discovered that the success and privileges they enjoy are not the result of their own hard work?  In other words, how do they process the fact that the model of hard work = wealth/success doesn’t really explain their particular case? They are forced to confront what is perhaps an uncomfortable contradiction; one that may provoke an existential crisis (something they would rather avoid). Psychologists call this mental disconnect “cognitive dissonance.”

To overcome the disconnect/cognitive dissonance, wealthy people have to create a competing narrative: one that lets them return to a state of mental balance. Otherwise, they might have to ask themselves: Do I truly deserve what I have? Do other people deserve to not have the things that I have? Is it fair that I have things (despite not having worked hard) when other people work hard, cannot claim the success that I was given, and thus cannot enjoy the same things that I enjoy?

This is why many people who identify as wealthy/financially well-off (and even those who aspire to wealth) put extra effort into rationalizing that they are the “natural” beneficiaries of their own hard work and virtue (I’m genetically smarter, work harder, etc.). I should point out here that this is not to say that they don’t work hard. Many do. But many more do not, for the simple reason that they are the beneficiaries of wealth handed down to them by parents, family, etc. This goes on quite a bit with some middle class people too (think about someone you know who inherited a family business).

Perhaps more than others, they invest a lot of time and effort to stress the hard work = success logic. That’s because they have to continually prove to themselves and everyone else that 1) they too are hard-working people who are worthy (not frauds or layabout heirs/idle rich); and 2) the poor are truly poor because they refuse to adopt a hard-working, enterprising lifestyle.

As the economist and social philosopher Max Weber points out, they can never be satisfied that they have simply been fortunate; they have to continuously work to prove they have a right to their financial rewards/fortune; and they want to make extra certain that everyone else is assured that they deserve it (because they themselves can never be assured). They’re gaslighting us all while they gaslight themselves.

Avocado Wars

This leads me to sidestep into one of the more recent meme wars that perfectly illustrate this contradiction – the avocado wars! Millennials, in particular, have taken a bashing, as they are told they spend too much money on luxury food items, including avocado toast and brunch. As the logic goes, they are more or less told “your financial problems are not the result of a broken economy; they’re the result of your self-indulgent food choices.”

So along comes Mr. Moneybags – let’s call him Avocado bruh – a supposed “self-made” millionaire to point out the error of their ways:

But as it turns out, here’s what’s really going on:

There are many examples of this type of thinking in our culture. The fact of the matter is that most people in the United States who are financially well off – not that there are no exceptions – probably got that way the old-fashioned way: they inherited wealth or were given the money. Imagine what you could do if someone handed you a pile of money.

Looking Out for Number One

When all else fails to sustain the delusion, appeals to the philosophy of Ayn Rand are often trotted out as a means to claim some literary credibility. Rand’s writing, as demonstrated in works like Atlas Shrugged and The Virtue of Selfishness, is often used to justify a moral philosophy that refuses any ethical basis for a social contract (an ethic that recognizes mutual social obligations to others), particularly when those social obligations are thought to be achieved at the expense of curbing individual desire and ambition.

We know from Rand’s published journals that she modeled Danny Renahan, the protagonist of her planned early novel The Little Street, on the notorious serial killer William Edward Hickman – a despicable psychopath if there ever was one. This was her first-draft portrayal of what she conceived as her “ideal man,” which she would later refine and portray in the Atlas Shrugged character John Galt. Rand writes in her journal entry:

“[Renahan] is born with a wonderful, free, light consciousness — [resulting from] the absolute lack of social instinct or herd feeling. He does not understand because he has no organ for understanding, the necessity, meaning, or importance of other people … Other people do not exist for him and he does not understand why they should.”

This is the psychopathic model of rugged “individualism” as Rand imagined it; it is a model for behavior that continues to inspire people (and many politicians among them), who fantasize about life free from mutual obligations and “government interference.”

Not only is this thinking deeply depraved, it’s the polar opposite of the altruistic empathy for others that Christian moral philosophy mandates for its adherents. Neither does it constitute an effective basis for governance. One cannot aspire to live in a complex modern “society” governed by rules and cooperation if one imagines they are an unaccountable free-agent.

Ayn Rand

Boot-strapping Psychology

Where it really gets interesting is when we look at the different ways this type of thinking manifests among people of average means – the “boot-strappers.” These are people who generally worked to overcome some level of disadvantage and, through their own efforts (and often with help), managed to “bootstrap” their way to success.

As a result of having undergone this experience of working hard, with no perceived help from others, they now feel that everyone, regardless of the obstacles they might encounter, should similarly be able to overcome disadvantage and achieve success. Compared to the wealthy person, who often inherited their privilege, the existential anxiety that this person experiences is a bit more complex. So let’s take a closer look at this person.

Not having been blessed with the luck to be born into privilege, the rugged individualist/bootstrapper must contend with at least two significant fears. Fear #1: someone might discover the “secret” of their less privileged/low birth past; thus they work even harder to maintain the veneer of success – success they further equate with evidence of their inherent “goodness” as a person (the very thing that they hope distinguishes them from the people and social groups they are aiming to stand apart from). Fear #2: that if they stop working, even for a minute, they will fall back to the low place from which they ascended and in the process cease to be a good person.

Buying into the American Dream is critically important to the intellectual disposition and psychic make-up of the “bootstrapper.” While they might perceive that they derive some limited benefits from the current system, their social relation remains one of subservience to true wealth. But rather than recognize this, this person would rather “shoot the messenger” who dares to point out these facts and contradictions. Consequently, they quickly become upset when anyone dares to unmask the system of exploitation from which they only marginally benefit.

Personal responsibility narratives run a close second to beliefs in rugged individualism, and maintain a powerful hold in compelling conforming behavior among the rich and poor alike. Nevertheless, regardless of how popular or entrenched the thought process might be, one thing is certain: it is a fatal error to assume that people without money are lazy moochers.  People don’t simply “choose” success. Again, success is not a simple calculus of hard work plus the sum total of our individual choices. There’s a lot more going on beneath the surface that explains success.

The same people who lay claim to boot-strapping ideology are likely also to subscribe to the belief that there is no such thing as structural poverty (or structural racism, etc.). In denying this, they put the onus for failure totally on the individual. It’s the secular version of the prosperity gospel that has in the recent era destroyed the social justice mission of many churches.

The idea that individuals alone are responsible for what happens to them is not supported by empirical research and evidence. Notwithstanding, the idea, as such, is highly destructive and corrosive of our collective social well-being.

To this end, we would all be better served if we simply questioned and attempted to understand the root causes of why so many people are not doing well. At the very least, we should be open to considering that attaining the American Dream is, for many people, not as simple as many of us so desperately want to imagine.

Stars and Stripes on back of pickup truck, USA

Sources:

Excerpts from this post were derived from the article by Simone Pathe, “Why half of U.S. Adults No Longer Believe in the American Dream.” Last accessed May 2016.

Public Religion Research Institute, Economic Insecurity Index, by Daniel Cox,  Juhem Navarro-Rivera, and Robert P. Jones

Discussion Questions:

How do the two car advertisements engage with the idea of the American Dream? What vision of the dream is each selling?

What kind of language and symbols do they employ to motivate potential car buyers?

How do the ads subtly (and not so subtly) exploit issues of race, class, and gender?

What is George Carlin’s basic argument about the American dream?

How is Carlin’s argument similar to Marx’s argument?

Do you think the American Dream can be attained in today’s society? If so (or not) comment on your own experience trying to “live the dream.” Do you think it is attainable for you?

What do you see potentially getting in the way of your success and ability to live this dream?

Artist: David Horsey, LA Times

Course: Classical Social Theory, Race & Ethnicity

Wealth, Wages & Social Inequality

58 Comments

Wage and salary information indicated as a share of GDP. Statistics provided by the Federal Reserve Bank of St. Louis

Wages Are Declining

The chart pictured above offers a graphic representation of the trajectory of working people’s wages as a share of GDP since 1950. The downward trend indicates that, particularly since 1970, economic growth has stalled because people’s earning power and living standards have been eroded. To make matters worse, economic indicators since the Wall Street-generated financial crisis of 2008 reveal that while the US economy has grown, worker wages have fallen sharply. Individuals in low- to middle-income jobs are the ones seeing their wages suppressed, while the wages of top earners are reaching all-time highs. The number of workers making $5 million or more grew almost 27 percent, to 8,982 workers, up from 7,082 workers in 2011. Total wages earned by these highly paid workers grew 40 percent, which is 13 times the overall increase in compensation for workers.
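The “almost 27 percent” figure can be checked with basic arithmetic; the sketch below simply recomputes the growth rate from the two worker counts quoted above:

```python
# Recompute the growth rate of $5M+ earners from the figures quoted above.
workers_2011 = 7_082   # workers earning $5M or more in 2011
workers_later = 8_982  # workers earning $5M or more in the later year

growth_pct = (workers_later - workers_2011) / workers_2011 * 100
# growth_pct is roughly 26.8, i.e. "almost 27 percent"
```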

For over thirty-five years American workers’ wages have been stuck. This comes while we’ve seen tremendous growth in our country’s productivity and soaring corporate profits.

What happened? Why is the middle class dissolving and childhood poverty rising while the economy continues to create more millionaires and billionaires?

The answer is simple. Our elected officials in Washington and across the country have adopted policies, written by the very wealthy, which are meant to ensure that the lion’s share of income growth continues to go to the top 1 percent while the rest of the country is left behind.

Wealth and Perception of Social Inequality

The video below, which represents work conducted by a research group at Harvard, is troubling for a number of reasons. Not only does it graphically illustrate income distribution and income inequality in the United States, it touches on another important issue – people’s perceptions of inequality. Why is this troubling? Because you can’t fix something if people don’t recognize the nature and depth of the problem.

To be sure, people have a “sense” that something is askew. An NBC News/Wall Street Journal poll of American adults found that in spite of the fact that the economy is registering record economic gains, 57 percent still think the economy is in recession. This is occurring at the same time that people don’t think social inequality is a problem. How do we account for these contradictions? What other social factors, combined with a tangible knowledge deficit, might be driving these perceptions?

One reason is that there is both a wealth gap and a wage gap.

Robert Reich shared some statistics on social media that were recently published by the French economist Thomas Piketty. Notice how the bar chart dramatically illustrates that almost ALL of the economic gains accrued in the recent recovery are going to the top 10%. Almost nothing is accruing to the vast majority of wage earners, who are within their rights to ask: economic recovery for whom?


To put this in terms Marx would use, the economy is growing; however, the growth is disproportionately accruing to the owners of capital, who are not sharing any of the profits with their proletarian wage earners. When you combine weak job growth with abundant (increasingly desperate) cheap labor and toxic ideologies about “makers and takers,” “welfare cheats,” and immigration opportunists, mobilizing public action to reverse the policies creating this pattern becomes virtually impossible.

So Where Are the Jobs?

Americans have for more than 30 years been deceived by the mythical narrative of “trickle down” economics, which assured them that all they had to do was support cutting taxes for the rich, who would in turn take all that money and pass it along to everyone else in the form of job creation. The problem, as many economists and researchers note, is that despite all indicators that corporations are reaping record profits, the promised jobs never materialized. Almost none of the profits realized from increased worker productivity over the last 30 years have accrued to workers. All of it went to the owners of capital. Put differently, corporations have realized the profits of their dreams, which they accomplished on the backs of their workers, who made all of the labor concessions, contributed all of the productivity, and assumed more of the public tax burden in order that their employers could achieve these market gains.

Young people are especially hurt by these regressive economic policies. Often scolded and told to “stop whining” and “get a better job,” young people are facing increasing obstacles to realizing their goals and dreams. Unlike college students of the 1970s and ’80s, today’s college students are often forced to work full-time while going to school; others, not favorably positioned to attend college, are forced to work a never-ending series of unstable, low-paid minimum wage jobs that not only fail to confer a living wage but also make it impossible to “work their way out” by getting the education that would permit them to escape that situation.


Discussion Questions:

Based on current demographics, among what social groups do we find wealth concentrated in the United States?

Do you think things like poverty and joblessness are the result of deliberate choices made by individuals not to work hard, or are there perhaps other social factors that influence these outcomes?

After watching the video, do you think “bootstrapping” individualism alone can keep people out of poverty, or do you think there is a potential role for government to play (through social policy) to control the power of money and capital to make markets serve only the interests of the wealthy?

Do you think social policies should address the problem of social inequality, or should everything be left to “the free market”?

Do you think social policies should only be concerned with equal “opportunity” and not be concerned about unequal outcomes?

Which of your “beliefs” are challenged by what the video portrays?

Why do you think it is more often than not working-class people who register the loudest complaints about redistributive politics and programs labeled “socialism” that are perceived to benefit poor people? Why is there no similar outrage about the redistribution of middle-class wealth to support people who are wealthy (i.e., banks, Wall Street, etc.)?

Given that most of the wealth is concentrated among the top 1% of earners, and given that most government and social programs are paid for by wage taxes extracted from the middle class (because neither the very poor nor the very rich pay a high percentage of their income in taxes), why are so many convinced that the way to solve fiscal problems is by making the poor pay more?

Why do you think that programs that benefit poor people (food stamps/SNAP benefits) are referred to as “welfare,” while programs that benefit the middle class (the home mortgage interest deduction, unemployment compensation), the wealthy (preferential capital gains tax rates), and corporations (subsidies) are not similarly thought of as “welfare”?


Course: Classical Social Theory, Race & Ethnicity

Marxist, Fascist, Communist, or Socialist?

11 Comments

Mark Twain once wrote, “If you don’t read the newspaper, you’re uninformed. If you read the newspaper, you’re misinformed.” This is a good place to start, considering how much of the contemporary discourse outside of academic and activist circles continues to get everything wrong when it comes to talking about things like Capitalism, Socialism, and Fascism.

People in the U.S. get caught up on labels like communism, socialism, left wing, and right wing. Sadly, they are often indoctrinated by their social groups to see these words through an “emotional” lens, where they don’t fully appreciate the history and meaning of the terms so much as they dive off the deep end, convinced that their non-favored political candidate poses an existential threat to their “freedom.” Well, maybe?

Traditionally, Nazism, Fascism, Communism, and Stalinism are all forms of authoritarian governance. It is important to understand the nuances that separate them. Unfortunately, labeling unfolds as a parlor game and favorite political pastime, with people throwing these words around like hand grenades in public debates without ever thinking much about their substantive meaning.

As it stands now, we have never seen communism put into practice in the United States, or in the world for that matter (okay, maybe the kibbutzim in Israel come close to practicing this ideology). Many people don’t even realize that there are differences between Neoliberalism/Liberalism (forms of capitalism), Social Democracy (also a form of capitalism), and actual Socialism. What did the term socialism even mean to you before you might have passively learned that Karl Marx associated it with the dictatorship of the proletariat?

To make things more difficult, our contemporary media landscape – online as well as print – tends to privilege “entertainment” and “clicks” over “information.” Consequently, the average person is provoked a thousand different ways to fly off the handle without thinking too hard (a comfort for some). Driven to nearly endless distraction, it becomes next to impossible to stay informed on important ideas that are relevant to civic discourse. Think about it: how many of you took high school civics classes that were taught by the football coach? Is it any wonder that people are barely literate in this regard?

Motivated learners might try to “do their research” on the Internet, but more than likely they are self-selecting (more like “cherry-picking”) sources of information that line up with a worldview they have already formulated. This selection bias occurs even among well-intentioned people who view themselves as “politically conscious” or “informed.”

Labels, or isms like the ones listed above (i.e., Socialism), provide people with an opportunity to feel smart while engaging with important social issues. Yet their very appeal is grounded in the fact that, as “isms,” they manage to be simple and totalizing at the same time.

Take, for example, the terms left wing and right wing. Their meaning becomes even more difficult to grasp when you attempt to compare them to social contexts outside the U.S. In short, someone who identifies as right wing in the U.S. does not share the exact same views as someone who identifies as right wing in England. This is because right-wing ideology in the U.S. context privileges individualism at the expense of social order and social responsibility. Right-wing and conservative ideologies in Europe, Australia, New Zealand, and Canada lean toward a more mixed understanding of what constitutes an ideal social order and social responsibility within a capitalist economic framework. The same is true of left-wing ideology in the U.S. Sometimes it is nuanced and discerning, privileging social responsibility and care for the human condition over unfettered individualism. But in some cases, much like centrism, it merely exists as a general oppositional ideology to U.S. right-wing conservatism.

Ultimately, the labels themselves are a problem because, more often than not, they engage a limited spectrum of understanding. Again, their very appeal derives from the way they offer intellectual shortcuts for people who tend not to want to think very hard about problems that are dynamic and complex!


Labeling Theory: Governments & Social Systems

In carving out an approach to the study of these topics, one first needs to develop a habit of mind of not accepting everything one reads at face value. Ideally, one should cultivate broad sources of information (which people often don’t do, because why spoil the fun of being “right” about everything?). Failing to do so means that, despite even the best intentions to be informed, they will tend to achieve the opposite. Even worse, at the precise moment when they think they are sounding “smart” about a topic, those who are well informed will dismiss their ideas as superficial and ill-conceived.

Political labels like “Marxist,” “Fascist,” “Communist,” and “Socialist” are perfect examples of this. We don’t need to confine our views of these things to history, as there are emerging and ongoing examples of them in action now, as indicated by dictatorships and autocratic governments.

Fascism

More recently, we have seen a resurgence of the word “fascist” circulating in American politics. What does it mean? Where does it come from? Is it a mere label? Or does the proverbial “shoe fit” in some instances?

Italy’s Benito Mussolini is credited with bringing the word fascism into popular usage. He coined the term in 1919 to describe his political movement (from the Italian “fascio,” which means group and refers to a kind of militant brotherhood). There are different definitions of fascism, and people may debate its qualifying characteristics: some refer to it as a political philosophy – one that relies on a “strong-man” type of leader to promote the philosophy – or a theory of mass movements, while others define it in terms of specific political actions. Whatever the case, most people agree that fascism is authoritarian, that it has a fascination with racial purity, and that it promotes nationalism at all costs.


Are all Fascists Nazis?

Not exactly. Whereas all Nazis were fascists, it is not the case that all fascists are Nazis. An example of a modern-day fascist is Hungary’s Viktor Orbán. We know this because he tells us.

Oh, and for what it’s worth, there is no such thing as a “semi-fascist.” Nice try, but take the gloves off and say the “F” word: increasingly, one is either an out-and-proud fascist or one rejects fascism in all of its manifestations, in theory as well as in practice.

What is a Social Democracy?

The closest nations have come to achieving a utopia for their citizens, in terms of the social organization of civic life, is in social democratic states (i.e., Finland, Denmark), which have a record of recognizing, through the implementation of social policy, the social rights of their citizens. Programs and policies are, nonetheless, prioritized within the framework of capitalism; hence capitalism is to some degree humanized. These states are not “socialist” in the way that people who are fond of labels tend to think of them.

By way of contrast, the U.S. stands as the paradigmatic example of an advanced industrialized state with the weakest social democratic values; its economy is based on unrestrained capitalism, which is to say freedom for companies, but not always for people. This peculiar mix of freedom is what people in the U.S. are always banging on about. Far from being a welfare state (and I’m not talking about people on welfare or receiving welfare services), the U.S. offers its citizens the least return on their tax dollars in the form of social services investment (i.e., healthcare, education, recreation) when compared to other developed industrialized (OECD) countries. This is because the U.S. prioritizes investment in the economy over investment in people. In policy terms, this looks like stimulus for corporations (tax cuts and subsidies) combined with robust military spending. The amazing part of all this is how they have managed to convince a large segment of the middle-earning population (who don’t equally benefit) to pay for and support this system.

Take, for example, claims made about “free markets” or even “freedom.” There’s no such thing as a “free” market. Simply put, “the market” is fundamentally a political and social construct – it’s not something that exists in a pure form in nature. As a political matter, governments everywhere impose all kinds of rules and regulations on “the free market,” beginning with the protection of private property (encoded in law and enforced by the police). They regularly enter into negotiations and dialogue with representatives from industry to actively shape markets in ways that benefit national interests. This kind of activity is normal and varies across countries and states. As for the results of this approach, the financial rewards predictably line up to serve the interests of people with wealth and power.


Politics as “Team Sports”

Critical thinking is often stymied, and when we combine this inability to think with the “sportsification” of politics, we find people fond of screaming into the void – loudly stating their personal “beliefs” about issues and problems without even a rudimentary ability to advance an argument based on facts and evidence. Emotion rules. Not surprisingly, these beliefs tend to reflect their political ideology (their chosen political team).

Regardless of what ideology you subscribe to, many people have been effectively convinced (or conned) that the biggest social problem is lazy poor people. Can’t afford a house? It’s because you didn’t work hard enough. Sadly, poor people themselves are not immune from getting caught up in this thinking, given how they have been indoctrinated to believe that poverty is their fault (and not structural) because they lack an appropriate work ethic. Really? Go tell the home health aide taking care of your grandma, the nurse, the teacher (who must buy their own school supplies), and the guy hauling your garbage every day that they can’t get ahead in this world because they’re simply not working hard enough.

And let’s not forget “Schrödinger’s immigrant.” The term perfectly illustrates the paradoxical, twisted logic that relies on immigrant stereotypes. The expression has been used to call attention to the contradictions bound up in anti-immigrant discourse, which describes immigrants as simultaneously “stealing our jobs” and “laying around” collecting public benefits.

Put another way, it’s easy to mobilize an audience through emotional manipulation, as powerful media narratives call upon people to summon their own ignorance. They’ve been taught to focus criticism inward (blame the self) and not outward and up the social ladder; they dare not criticize their corporate masters, who they imagine sit around all day trying to think up ways to create jobs for them.

Regionally and locally, we are seeing this play out among Appalachian people, steelworkers, and other labor groups who have been exploited and taken advantage of for years. Public lands were virtually given to coal barons and steel magnates, who extracted the resources (natural and human) for their sole profit, which they delivered in product form (coal and steel) for the rest of the country. In return for the wages and benefits their unions had to constantly fight for, they were forced to labor in unsafe and contaminated working conditions. The products of those industries fouled the air and the water to such an extent that the residual contamination still exists today. This continues to harm regional populations, who have some of the highest cancer rates in the country.

And yet some of these same people continue to act as full-throated apologists for the wealthy people and industries that did great harm to them (and their ancestors) even as they employed them. They continue to vote in solidarity with the people who exploit them. Why? Because the people who benefit from this social arrangement are skilled in making appeals to their emotions (FEAR), despite almost never delivering to them the promise of a better life.

Right-wing extremist ideologies gave people something far more seductive than policy results: false hope that “life will get better if you work harder” (even as the goalposts for success keep moving). It’s a sucker’s bet, and people continue to fall for it.

More moderate/centrist ideologies have tended to follow the technocrat playbook: instead of emotion, they appeal to a different kind of logic, one that privileges evidence and the granular details of social policy. The problem with this logic is that when they try to explain their reasoning, the public is often too tired and/or not interested enough to educate themselves on the facts of a given matter. Facts are boring. They are not sensational.

Emotion-based reasoning, by way of contrast, is far more satisfying. Consequently, even as political moderates have delivered incremental (slow) change, they are almost always perceived as failing to deliver (even when they have lacked the majority power in government to fully advance their agenda and produce more impactful results, due to being blocked by the opposition party).

Another dynamic operating here is that scared middle-class people, frantically looking over their shoulder at the dreaded “poors,” are trying desperately to stay ahead by consolidating their class advantage/position so as not to lose ground. They are fearful of becoming poor themselves.

To this end, we see the middle class embracing symbolic gestures and performing rationally calculated acts of solidarity with people they perceive to be their financial betters, situated far above them on the social class ladder. They are enacting what psychologists call aspirational social identities; that is, they hope that by lending their support to their betters and engaging in gratuitous praise of wealth and power, they might insulate themselves from sliding back down the class ladder. Put another way, they are playing a deceptive game of “fake it until you make it.”

Now you might say, “well, what’s wrong with aspiring to acquire money and to live a comfortable life?” And the answer is, in and of itself, absolutely nothing. Where the problem lies is in the incessant need to “punch down” on poor people as a way to psychically separate yourself, if only so you can feel better about the fact that you’re forced to work in an economic system that’s rigged against just about everyone except those born into wealth and privilege.

Why are people like this? Does working three jobs to make ends meet cause them to be so tired they can’t manage the effort to question the organization of the system? Is it any wonder that people have become seething rage monsters, whose anger can only be soothed by a combination of White Claw and trips to the firing range to shoot their AR-15s?  Have they taken so many drugs and eaten so much junk food to manage their pain that it clouds their thinking? Why have people come to accept this state of affairs as normal? And why, when given the opportunity to vote, do they tend to vote against their own best interest?

The Power of Labels & Elections

Every time election season rolls around, you will hear people letting loose with labels and insults. Senator Bernie Sanders was regularly savaged by candidates like Donald Trump, and even some Democrats, for the seeming crime of being a Socialist. Alternatively, proponents of unregulated capitalism are accused of being inherently fascist. Many people say Donald Trump is a Fascist. And they may be right. Believe people when they tell you who they are.

Ultimately, the effect of label-slinging is that it shuts down substantive debate about actual social policies and does nothing to invite interrogation of the nuances of argument. What prevails instead is gamesmanship and name-calling. This is not an accident. Focus groups have found that voters are more responsive to emotion-based appeals than they are to reason and logic. Consequently, appeals to labels aren’t going away any time soon, because they effectively reach people, even as they remain poisonous to engaged debate on important issues.


What’s the Difference Between Socialism, Democratic Socialism, and Communism?

Political scientist Jeffrey Isaac explains in an interview:

“in the most general terms, ‘socialism’ is the idea that the productive wealth of society—factories, offices, large-scale service firms, etc.– should not be owned, controlled, and deployed for the benefit of a small class of people, but should be owned, controlled, and deployed for the benefit of the society as a whole. The basic rationale for such “socialization” of productive wealth is simple: the knowledge, techniques, and relations of production that produce wealth are all social. In the story of Robinson Crusoe an individual works more or less from scratch (with his “man Friday!”) on “virgin” nature. But in reality, all wealth is social. Particular individuals may innovate. They may even deserve special rewards for their innovations. But most members of “the one percent” are not innovators of this kind. Further, even those that are innovators did not grow up in the wild and innovate through their own efforts alone. They matured in a society with an educational system and a knowledge base and an infrastructure and a division of labor, and their innovations involved a complex network of others.

The idea of socialism is the idea that because all innovation and all production is “social” in this way, the production process ought to be organized in a way that ensures some democratic social control and some broad social welfare. Socialism is a very old idea, and it can be traced back to Plato, the early Christians, Sir Thomas More, and many important modern writers who wrote before Karl Marx was even born. Marx and Engels were socialists who claimed that their socialism was “scientific.”

Marxism is a complex subject. Suffice to say that the founder of Soviet Communism—Lenin—was a Marxist, but so too were founders of German social democracy and advocates of a parliamentary road to socialism, such as Eduard Bernstein and Karl Kautsky. More importantly, while most Marxists have been socialists, and some even democratic socialists, most socialists are not Marxists at all. Some examples include Albert Einstein, George Orwell, Bertrand Russell, W.E.B. Dubois, and perhaps even Martin Luther King, Jr.

Democratic socialism is a variant of socialism that emphasizes the importance of democracy in two ways: a socialist society ought to be run on a democratic basis and not as a dictatorship—as Lenin and his Soviet and Chinese followers believed—and it ought to be achieved by working through the institutions of a liberal, representative democracy, mobilizing citizens and voters, winning elections, and legislating social reform.

In the 20th century U.S. a number of important figures were democratic socialists, most notably Eugene V. Debs, Norman Thomas, and Michael Harrington, whose book Socialism is but one example of a good book on the topic. Harrington was the founder of Democratic Socialists of America, a group that is strongly backing Bernie Sanders. This group does NOT believe in state control of all economic assets. It believes in the use of a democratic state to institute egalitarian social reforms and a more “progressive” system of taxation and to steer social investments in more public ways (think public transportation as opposed to publicly-subsidized, privately-owned sports mega-stadiums). Sanders has had some ties to this group—which has always seen itself as “the left-wing of the Democratic party—and the things he supports are the kinds of things this group has long supported, and also the kinds of things that European social democrats—in Germany, the UK, France, and Scandinavia—have long supported.


People like Donald Trump are red-baiting when they call Bernie Sanders a ‘Communist,’ even though this vision of socialism is democratic and historically it is anti-Communist.

If the broad mass of Americans were more historically informed, they would know that self-styled socialists have played an important role in U.S. history, that most of the leaders of the early trade and industrial union movement were socialists, that important New Deal figures were socialists, that one of the most important leaders of the U.S. civil rights movement—Bayard Rustin—was a socialist, and that a socialist—Michael Harrington—is widely credited with having inspired LBJ’s “Great Society” programs through his book on poverty, The Other America.

Indeed, the so-called “neoconservative” movement in the U.S. was founded by former socialists, many of whom had earlier been not just simple socialists, but Communist Trotskyists. These people turned hard to the right. Others like Harrington and Irving Howe, the founder of Dissent magazine, and also Sanders, continued to be active in the struggle to democratically achieve democratic socialism.”

But Isn’t There a Strain of Democratic Socialism That Advocates for State Control of the Economy?

Isaac continues: “This is a complicated question. The simple answer is no.  One of the defining features of modern democratic socialism is an opposition to the widespread “collectivization” of the economy as was practiced by the Soviets.  Some European democratic socialist parties have supported public enterprise and some forms of nationalization of certain industries—but so have non-socialists in Europe. (Indeed, one need look no further than the enormous bailout of U.S. banks in 2008 to see that it is not only socialists who advocate for government socialization—they simply advocate socializing the losses of big business, but not the gains.) But none support the wholesale collectivization of the economy. Sanders, for example, supports a “mixed economy,” as any of his statements and/or position papers makes clear.


Puerto Rican rapper Residente, real name René Pérez Joglar, is one of the founders of the alternative rap group Calle 13. Not only is he the recipient of numerous Grammy awards, he is also an out-and-proud, self-declared socialist (as it turns out, his father was a member of the Puerto Rican Socialist Party who traveled to Cuba and to Nicaragua during the Sandinista Revolution).

How is what Sanders wants the same/different from what modern European social democrats want/have? Or from modern Russia? Or China?

It is very similar to what European social democrats have long advocated and enacted, as he himself has stated repeatedly.  It is also much less ambitious than the most ambitious social democratic party platforms. For reasons explained above, it is totally different from the Soviet or Chinese or Cuban or North Korean models.

Sanders does NOT advocate the abolition of private property in the means of production. He does not even advocate massive wealth expropriations. He advocates breaking up banks and more progressive income taxes and the public subsidization of health care and public education (most of these things are quite common in Europe). Further, all of his policy proposals are contributions to the ongoing democratic debate of a democratic society, advanced as proposals to be legislated when and if a democratic majority of citizens can bring such an agenda into office through democratic elections.

Sanders is seeking the freely given electoral support of American citizens. He is not organizing a vanguard revolutionary party intent on seizing power! He is, in other words, a democratic socialist.


Sources

Jeffrey C. Isaac, “Bernie Sanders, Democratic Socialist: A Primer.” Last accessed Feb 12, 2016.

Discussion Questions

Take a brief moment (after looking up definitions for the terms) and consider the following questions:

Are Marxists the same as Communists?

What are the defining characteristics of a Fascist?

Is it possible to be a Fascist, a Communist, and at the same time a Socialist?

What is a Democratic Socialist?

How is being a Democratic Socialist different from being a Socialist and from being a Communist?

Do you think labels like this contribute to productive political discourse, or do they confuse important issues?


Course: Classical Social Theory, Race & Ethnicity
