Monday, February 27, 2017

And now, for something completely different - Existence

Previously, I described brief notions of objective and subjective truth. But I pointed out that the latter is somewhat problematic. For example, I have experiences, and these experiences have a foundation in the physical world, and together with my biology they sum to preferences for sights, foods, and smells. It can be difficult to tease apart this information in such a way that we can say, “It is in fact true that I like such and such.” So why bother?

Because our subjective experiences are the foundation for institutions that exist outside of ourselves. This is a rather abstract statement, so let’s unpack it. In an objective framework, the earth and all life on it are just a collection of matter. There isn’t inherently anything meaningful about the organization of so many atoms. Now, those with religious beliefs might object, but I won’t address those here (due to the impossibility of proving or disproving such propositions) except to say that a deity that could create our universe could easily create a virtually identical one without souls, so perhaps imagine what that would be like.

So in this framework, the cluster of atoms is void of meaning, but that is not to say that meaning cannot exist. If we zoom in to focus on what we classify as “life”, there’s a lot going on. Consider a very simple, classical ecosystem: Rabbits eat grass, have sex and produce more rabbits, are eaten by foxes, who also have sex and produce more foxes, and both rabbits and foxes die, decompose and fertilize the grass, which also reproduces in its way. Is there meaning here, and if so, what is it? Ask a lot of people, and you’ll get a lot of different answers. But regardless of who you ask, what’s the key here? It’s not just that we’re on the outside looking in, it’s that to evaluate meaning, we have to bring our own experiences to the table.

From the moment we’re born, we engage in learning about the world around us, compiling and categorizing everything that we perceive, whether or not we’re aware of that fact. The objective nature of one’s being (for example, genetics) is going to lock in some physical facts like the color of your eyes, hair, and skin. But the subjectivity of experience plays a significant role in how we interact with the world around us. If someone reputable offers you a job, your reaction is going to be rooted not just in the purchasing power offered by such a wage, but also in your perception of the job. Consider reasons why you might decline. Maybe you’re hesitant to take on a job that is either too tough for the compensation, or isn’t a good fit for your skills. Maybe it’s boring work, and you would prefer to be challenged, or find something more personally fulfilling. Perhaps you have an unfavorable view of the type of people that frequently perform that task, and you don’t want to be associated with them. Suppose that one of these reasons is true and you accept the job in spite of that. Your acceptance of the job doesn’t magically change everything you’ve been through and learned up to this point, so there’s a decent chance that your misgivings are prophetic. On the other hand, you might be happier if you live in a country with high unemployment, or if you reframe how you view your work.

As Jean-Paul Sartre said, “Existence precedes essence.” A couple that is looking to have children might discuss the potential appearances and personalities of those as yet unborn progeny, but until they’re actually born, those qualities are mere possibilities. Who a child grows up to be is not just a matter of environment but also of choices: choices made by the parents, by other relatives, friends, teachers, and on and on. And of course, the child too; we have a tool that allows us to change and develop. And it’s not free will, not in the traditional sense. It’s metacognition, and we’ll get into that next time before looping back around to objective truth and science. Until then, be well, and do good.

Monday, February 13, 2017

Subjective "Truth"

If truth is verifiable, then there is a potential (and somewhat age-old) problem. Some statements that we want to deem true aren’t really verifiable at all. For example, I like pizza. You can take me at my word that this is a true statement, but if you want to dispute it, there is no clear-cut way to prove that it is a false statement. The closest you might come to demonstrating that my declaration is not true would be by monitoring my brain while I eat pizza, and determining that the cascade of chemical responses and interactions is indicative of my not actually liking pizza. Even that technique is prone to flaws. Addiction, for example, provides a positive neural response to behaviors in which a person might not otherwise want to engage. “I really hate taking this pain medication, it’s destroying my life, but I can’t stop.” Thankfully, pizza is not destroying my life, and while I do enjoy it, I can’t seem to eat more than two or three slices at once.

There is actually no solution to this problem. Preferences and opinions are experiential. If you find certain foods like Brussels sprouts or broccoli to be bitterly unpalatable, it might be due to your genetics. And while there can be underlying objective features that help define preferences, our feelings can be subject to change, though not always. Coffee and beer are often said to be “acquired tastes”; keep drinking either one, and eventually you’ll probably end up liking it. Keep pursuing an unrequited love interest and get slapped with a restraining order.

To be sure, there are underlying objective features that give rise to these preferences. However, we lack the computational power and observational data to make such deterministic calculations, similar to how we can’t perfectly predict the weather. Even simple rational assumptions about preferences such as transitivity (I like pizza more than I like burgers, and I like burgers more than I like tacos, hence I like pizza more than I like tacos) don’t actually hold true. Biological feedback loops can account for malleable preferences, such as sudden cravings for sweet or salty items, or for a drink over food. Recency of consumption matters too; if I’ve eaten pizza and burgers for the past three nights, tacos might seem like a much better option.
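The transitivity point above can be made concrete with a tiny sketch (the function name and the preference data are illustrative, not from the post): encode pairwise preferences as a relation and check whether every chain of preferences closes.

```python
def is_transitive(prefers):
    """prefers is a set of (a, b) pairs meaning 'a is preferred to b'."""
    items = {x for pair in prefers for x in pair}
    for a in items:
        for b in items:
            for c in items:
                # If a > b and b > c, transitivity demands a > c.
                if (a, b) in prefers and (b, c) in prefers and (a, c) not in prefers:
                    return False
    return True

# The textbook-rational preferences are transitive:
stable = {("pizza", "burgers"), ("burgers", "tacos"), ("pizza", "tacos")}
print(is_transitive(stable))  # True

# But after three nights of pizza and burgers, tacos jump the queue,
# and the recorded preferences form a cycle, violating transitivity:
after_three_nights = {("pizza", "burgers"), ("burgers", "tacos"), ("tacos", "pizza")}
print(is_transitive(after_three_nights))  # False
```

The point of the sketch is that real, context-dependent preferences behave like the second set: a snapshot taken after a few repetitive dinners simply doesn’t satisfy the axiom that a deterministic model would require.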

Preferences and opinions are more difficult than they might seem at first glance. It is true that we know what we like, but to dig into why we like what we do is complex, perhaps unanswerable in many cases. It would be nice to be able to discard the notion of preferences and opinions as being irrelevant since the question of why often doesn’t lead us to mutable characteristics; my knowing that I don’t like Brussels sprouts doesn’t help me eat them without some measure of displeasure. But that doesn’t mean that it’s not important to consider the context and framing that preferences and opinions impose upon each and every one of us.

Where, then, is the connection between the experiential and the verifiable? If a deep understanding of the origin of our own preferences and opinions isn’t necessary for navigating the world, why are we talking about these things, and what does it have to do with objective truth? Maybe you’ve heard about cognitive biases. There are a lot of them. (As an aside, I would actually posit that a lot of these different biases are the results of splitting hairs, and there are really just a couple of underlying features [not bugs!] of brain function that lead to these biases. If I had to pick two to be the most familiar with, it would be confirmation bias and fundamental attribution error.) A simple way to think about these cognitive biases is that we prefer to selectively accept experiential information over information that we can’t immediately verify through sensory perception. Context and framing matter because without metacognition (thinking about our own thoughts), it’s easy to allow the subjective truth to supersede the objective. We’ll get into much greater detail as to why this is next time. Until then, be well, and do good.

Tuesday, January 31, 2017

First Steps of 2017 - Truth

It’s already a month into a new year. I’ve written sporadically on seemingly disconnected subjects, with likely little success in conveying what it is that I really believe. So, let 2017 be the beginning of a journey into exploring an idea that spans truth and human behavior, wrapped in an existentialist bow. Regardless of your values and views of and about the world, I hope you will join me. Without fanfare, let us begin.

Truth - what is it, and why is it so important? These can be viewed as either trivial or impossible questions depending on who is being asked, making it a seemingly odd place to start. But start somewhere I must, and so let me define objective truth as the material state of the world. I don’t need to be present, much less alive, in order for something to be true. When I state that something is true, I’m asserting something about the configuration of matter in the neighborhood of some dimensional coordinate system. There are tons of ways to pick this apart, including linguistic and cognitive arguments getting into how all of this works. There will be time for that later. But for now, suffice it to say that when I tell you that I live in the United States on the planet Earth, your accepting this as truth gives you some insight about the general location of my person.

The why, then, should be apparent from this example. Truth imparts information about reality, and it informs our decision-making process. If you stop and ask me for directions (pretend your smartphone’s battery is dead), you’re relying on the idea that our underlying cognitive constructs are similar enough that I can convey information that’s useful to you. “Go down the hill, past the first stoplight, left at the second, straight past the stop sign, should be on your right.” In this instance, it doesn’t actually matter if my memory of the area is fuzzy, as long as the stated details correspond to the rest of the world. Even small errors might be tolerable, such as if your destination is actually on your left instead of on the right side of the street, and you notice this. But you might not be too happy if I misremember that it’s actually the third stoplight you were supposed to turn at, and not the second.

I don’t think it’s a stretch to say that it would be impossible for a single person to travel to every single location on the planet, much less remember all of those locations accurately. So we make maps. Multiple people take information they perceive (aided by tools and instruments) and consolidate it into forms that communicate to others something about the material state of the world. Deciding where I want to go on vacation is informed by available information about the material state of the world (plus expectations and predictions about the future - to be discussed another time). I’m going to be very confused (and then mad) if I look at a resort advertising beautiful white sand beaches and arrive to find a snowy evergreen forest. And it’s not just about leisure, as sometimes bad directions can have dire consequences.

There exist facets of the world for which truth is straightforward. Nevertheless, our perception is far from perfect. Remember the dress illusion that drove social media nuts for a while? Even as disagreement raged over the actual color of the dress, there was still a tacit agreement that there is a “true” answer. In spite of the fact that a large portion of our brains is devoted to visual processing, we still get so much wrong. Optical illusions aren’t just something fun to look at in children’s books; they tell us something very real about the limits of our own perception. If there is a lesson to be learned here, it is that relying on a single person for all of your truths about the world at large is probably a really bad idea. Better to head in the opposite direction.

I posit that multiple perspectives can be beneficial to approximating truth. Suppose you and I are sitting at a table opposite each other. There sits on the table a Rubik’s Cube, and the green side is facing directly toward me. If I had never before seen one of these objects, then I wouldn’t be able to say anything certain about the colors of the sides out of my sight - not without touching the cube. But you could tell me that the side facing you is blue. In this way, I learn more about the state of the world from your perspective. Many experiences are outside of our reach, and we can only rely on the perspectives of others to arrive at a better understanding of what is, and to make smarter, informed decisions. A more practical example can be found in Wikipedia, which has ostensibly replaced its etymological cousin, the encyclopedia, in most households. Wikipedia relies on the crowdsourcing of information and labor to produce articles for perusal, while the paper versions rely on a comparatively small selection of experts to write the same. Studies vary in their conclusions, but largely find that the accuracy of Wikipedia compares favorably to the Encyclopædia Britannica. Even if the former isn’t perfect (and one should not assume that the latter necessarily is), it is impressive what people interested in collaboration can do.

From the last few months, it is quite clear that no small number of people have no interest in truth - assume safely that these people are not interested in your ability to make well-informed decisions. Whether or not you agree with a person’s agenda should have little to no bearing on a healthy skepticism and measured analysis. If a particular point of view is worth holding, it should be supportable by data on which there can be found a consensus - to suggest otherwise is to simply attempt to invalidate the perspectives of Others. We’ll visit this subject more in the near future. Until then, be well, do good.

Wednesday, November 9, 2016

Basketball tips for a post-whatever era

When I was young, I lived for a few years in an apartment complex that had a basketball court not more than 100 feet from my front door. When the weather was nice enough, I’d frequently play, as did many others in the neighborhood. I wasn’t very good. I’m terrible at all sports. Sometimes, people might have at first glance expected more from me, as I’m a fair bit taller than average, but I can’t jump, can’t shoot, and can only control the ball with the most rudimentary moves. It was fun, but as much as I played, I never really improved. My lack of skill was readily apparent to any observer. Sure, I’d make shots now and again. My height would even allow for an occasional block. But in drafting for a pickup game, I wouldn’t be someone you would choose early.

Off the court, my skill in basketball is completely irrelevant to, well, everything in the universe. By not playing, it’s not as if I’m wasting talent or throwing away some collegiate or professional opportunity. So, maybe I’m not causing your team to lose by being present, but for the most part, does anyone care that I’m not good at basketball? Definitely not.

Whether it is the presence or lack of a quality that is of interest is dependent on the quality itself. For those qualities that can be demonstrated or observed, declarations are pointless. Waiting to be chosen for a team in a pickup game, it doesn’t make sense for me to yell, “Hey, I’m awesome, pick me!” if I can’t back up my claim. If I further go on to miss some really easy shots, I’m just going to look like a fool. When I’m not playing basketball, which these days is, uh, always, I don’t bother inserting into conversations that I suck at basketball. No one cares about that.

Here are a few other statements that no one cares about:

I’m not racist.
I’m not sexist.
I’m not xenophobic.
I’m not homophobic.

Why? Because those qualities will be borne out through conversation. There’s no need to make such a declaration if the truth of that statement is going to be determined outside of and independently of the assertion itself. And these types of sentences are not caveats. If you want to start talking about the inferiority of a particular group of people, no amount of, “Well I’m not XYZ but I just have to say that—” is going to mitigate what comes after. Such a prologue does, however, bring attention to whatever subject you’re broaching, so there can be no complaining that the discussion/argument/flame war became about XYZ. You kinda stepped in it.

This is not, as some might be inclined to rebut, “political correctness”. Stop and think for a moment about what those terms above mean. Racism or sexism or homophobia aren’t about just using epithets when talking about people, or telling certain jokes. It is about inflexibly ascribing a quality (or lack thereof) to a person based on another quality that is often immutable. This doesn’t even have to be a conscious process. In fact, a lot of the time, it isn’t. Plenty of psychology studies on implicit bias demonstrate this. Neuroscience gives us some hints as to why our brains operate in this manner (familiarity with others plays a large role). But there’s good news! Metacognition, or analyzing one’s own thoughts, can help overcome these cognitive “features”.

The bad news is that being adamant about not being XYZ-ist means you are opting out of metacognition. If you’re absolutely sure, absolutely without question or doubt that you have or do not have a particular quality, then any evidence to the contrary is going to sound stupid. It’s going to make you mad. It’s obviously not worth considering, because it can’t be true. So don’t be that person. Be willing to listen to others. Be willing to dissect the ideas that float around in your head. And don’t feel the need to start a conversation with, “You know, I’m not—” because let’s be honest. No one cares about that.

Tuesday, November 1, 2016

The Weaponization of Truth - Or How I Learned To Stop Worrying and Embrace the Media

For a very long time, I didn’t bother to consume news in any way, shape, or form. I didn’t read the newspaper. I didn’t watch the local news, not even to take in the weather forecast. I would rather have been caught off-guard by a sudden, and to me, unexpected atmospheric event than sit through ten minutes of whatever stories seemed likely to garner ratings. With the advent of the internet, I still chose not to opt in to keeping up with current events, whether pop cultural or political. It just wasn’t something that interested me. News tends to be bleak, because there is something inherently captivating about disaster. I was in elementary school, and more specifically, I was playing outside during recess when the Challenger space shuttle exploded shortly after launch. There are two things I will always remember about that day. The first is that my teacher was crying when we came back inside. She explained to us through her tears what had happened.

Footage of the explosion played on CNN all afternoon. This is how the news cycle plays out now, to the extent that we can actually label this as a thing and call it the “news cycle”, but for those not old enough to remember, CNN was the only 24-hour news game in town at that time. Now, as an aside, many years after the fact, I can appreciate the conundrum the blossoming Cable News Network faced with covering a major event - because the station is running 24 hours a day, not everyone will tune in at any particular time to consume news, as with a local newscast. How, then, does one keep those who have just tuned in informed? Play the same story over and over again? That wouldn’t normally work, except if the event is sufficiently horrifying. Audience attrition is low because it’s so damn hard to stop watching, while people tuning in can get in on the ground floor, so to speak. And that is the other thing that will stay with me about that day. It’s a pattern that I suspect isn’t intentionally used for any ulterior motive, provided that you take as given that information and profit are the two driving motivations for any news organization. Major tsunami or hurricane? Oh yeah, we’ll be seeing a lot of footage of that. Earthquake or mudslide? That’s good for at least two or three days. September 11th? If footage were still on physical film, the originals would have snapped from wear and tear within days.

My attempt to avoid news wasn’t an aversion to these kinds of events, per se. I’ve just never found value in the constant repetition of minimal information. If a news event has some significant developments, hey, that might be worth taking a look at, but what constitutes a development has definitely changed out of desperation to keep audiences glued. The irony is that there develops a sort of numbness to the onslaught of catastrophe. People tune out. Eventually, they forget. Worse, they might stop caring. Personally, I couldn’t find the value in trying to invest myself in this kind of news. On the one hand, I think it’s important for people to be aware of what’s going on in the world around them, no matter how far away. This awareness can make manifest real, physical aid and assistance as when donations and relief flood into a hot spot being actively covered. And on the other hand, it can, and does, strip attention away from other happenings. Clearly, no one can be aware of everything that goes on. It is easy to criticize others for being blissfully unaware of tragedy unfolding somewhere else on this planet, particularly because there is always tragedy unfolding somewhere else on this planet.

These days, I try to be at least aware of major headlines as they emerge around our country, and around the world at large. So what changed? Did the advent of the internet evoke a dramatic shift in the patterns of news organizations? Nah. In fact, plenty of research shows that polarization has increased due to the way news is consumed using the internet. This isn’t just about politics, but think about almost any topic, and the kind of news you’re going to get in your happy corner of the internet is going to be a lot different than that of someone with differing viewpoints. Vaccinations? Climate change? Dog training? Gender tropes in gaming? Whatever your point of view is, I’m pretty sure you can find a set of websites that conform to those views. I’m not interested in that, as nice as it can be to read something that feels self-affirming. I want the truth, and yes, Colonel Jessup, I can handle the truth.

What changed is that at a certain point, I realized that good decisions cannot be made without knowledge of objective truth. For some people, that’s not really a sticking point. To them, truth is easy. They’re more than happy to tell you what the truth is - what their version of the truth is. But they’re not concerned with the minutiae of data or facts. You might say, “Oh, I’ve heard this all before, there’s confirmation bias, there’s cognitive dissonance, yada yada yada.” Or perhaps you prefer to reject the science point blank. Congratulations, you’re not concerned with the minutiae of data or facts! Empirics have provided the foundation for our current technology, so an absolute rejection of empirical methods is probably not warranted. A good dose of skepticism though, well, that can be healthy. Let’s hang onto that for a moment.

For the rest of us that aren’t beholden to pundits and their impenetrable views, what makes objective truth hard? Well, to start, objective truth is exactly that: objective. This kind of truth describes nothing more than the physical state of the world. This kind of truth isn’t dependent on frame of reference. This kind of truth doesn’t imply any kind of values, ethics, or morality. That might sound simple, like a zen koan, but the induced rabbit hole is equally dizzying. Rightness and wrongness are a by-product of the filters we apply to perceived events that may not even be objectively true. To get at truth requires being skeptical of every faculty, every source, every notion, and further still requires us to not sink from skepticism into cynicism; there are schools of thought, beginning with the ancient Greeks, that were skeptical of everything. That’s not going a step too far, that’s running over the edge of the cliff.

So let’s bring it back around to the media. If you start with the assumption that the media cannot be trusted, then you’re already not concerned with truth. In this day and age, practically everything is verifiable with enough work (and for those things that are not, because they’re state or corporate secrets or whatever, well, there is such a thing as reserving judgment until facts are revealed). News outlets on any part of the political spectrum are aware of this. The ones that are surviving have a business model in play. Part of that business model involves not getting sued for libel, or slander, or to placate the vindictive whims of vengeful billionaires that have more dollars than sense. There is going to be some inkling of truth behind every news story, barring honest errors or misreporting. These things do happen. So be skeptical, right? Figure out what the author thinks, and reverse-engineer that filter. It shouldn’t be too hard. It shouldn’t, but it can be. The crux of the problem is that facts can be, in a sense, weaponized. Facts, things that are objectively true, can be used to construct multiple narratives that contradict each other.

A silly example: Two people drive through an intersection. Their cars collide. Driver A states that he had a green light, and that it is Driver B’s fault. Driver B states that he had a green light, and not only is it Driver A’s fault, but that Driver A is a liar. Tensions escalate, and a fight ensues. Later, it becomes known that there was a malfunction with the traffic light and all directions were green. No one was really at fault. However, the facts of the matter, taken separately, lead to justifiable but false conclusions. This is perhaps the most hammy way of stating that if one does not know both sides of a story, one doesn’t know the story at all. Now, I stated that objective truth is immutable; frame of reference doesn’t matter. Fault requires a frame of reference. Blame requires a frame of reference. To espouse these notions is already to apply one’s own filters and impose preconceived notions on incoming information. First, the bad news: we can’t really stop doing that, it’s just how our brains are wired. The good news? There is no good news. But I do have worse.

Instead of dryly reporting a series of events, a lot of news attempts to construct a narrative. Stories are appealing. Again, this goes back to the way that our brains are wired. Studies have shown that people are more likely to believe and remember stories than disconnected chunks of information. The narrative structure of news, then, actually serves to reinforce more than just the actual facts - it provides us with a false sense of understanding. Take the anti-vaccination movement as an example. Andrew Wakefield, the doctor who authored the original (and thoroughly debunked) paper discussing a link between autism and the MMR vaccine, was shown to have a financial interest in other alternative vaccination options that were still under patent. But for some, all of this is irrelevant, and what does matter is that there is a positive correlation between diagnosed cases of autism and the use of vaccinations. And no matter how many times you say that correlation is not causation, and pick apart the myriad reasons for why these disparate trends are both going upward, none of that mitigates the parental response to a narrative about children being “damaged”.

Mass media operates in a similar fashion, some unconsciously, others predatorily. It’s how big content providers keep people tuned in to television stations, listening to the radio. It’s why a cooking website that just has recipes is just an online version of a cookbook, but a cooking blog keeps people hungry for more (I’ll see myself out). It’s why people conjure up grand conspiracies portending an evil agenda that threatens the very fabric of our nation. The last of these is not the product of an overnight shift in sentiment about the media. It’s a confluence of many factors, not the least of which are people that have been successfully preaching a narrative of despair. Pundits carefully pack a critical mass of facts like the radioactive core of a nuclear bomb - enough to gain a foothold of credibility with the undecided or disillusioned, and weave in an explosive narrative. When it detonates, it spreads, it stays, and it doesn’t look a damn thing like the truth. In a society that values freedom and liberty, this kind of storytelling cannot simply be shut down. Sure, with time and pressure, things can change - culture changes, society evolves. If you want to convince others that your perspective is valid, then throwing out facts in an attempt to rebut Others is an exercise in futility. However, that doesn’t make the collection of facts any less important. The pursuit of objective truth might be akin to chasing unicorns, but we stand to learn a lot about the world and the people in that world. Taking the time to unravel information, doing due diligence when it comes to fact checking instead of believing what sounds pleasant, and trying to understand the values that color an overarching narrative can help us make real sense of a complicated era. And right now, we need all the help we can get.

Wednesday, December 9, 2015

Why Being Moderate Isn't Sexy

It’s pretty tough to take in the news or use social media without running into some strong opinions on politics or social issues. Often, these views aren’t formed as a result of facts or events as they unfold, but represent values that people hold dear. This sort of behavior should be expected; it’s how our brains are wired. That gray matter in your skull isn’t a perfect, rational, impartial information processor. It’s designed to help you survive. A hunter/gatherer that had to empirically determine each and every time he or she looked at a berry that THIS berry is the sweet, tasty, non-lethal kind instead of THAT berry which leads to bitter, intestine-wrenching diarrhea is gonna have a bad time. And probably a very sore butt.

Information is available in vast quantities, and covers more subjects than which kinds of berries are safe to eat, and where the bushes are located. The heuristics which we use, however, have not really evolved. In fact, the first stop on the brain-train is usually the amygdala, the center of the brain responsible for the so-called “fight or flight” response. That’s because the amygdala handles fear and anger (hate is the next station down, and suffering is the end of the line). When we see or read something that makes us fearful, or angry, then survival mechanisms immediately kick in. A direct consequence is that the frontal lobe, the portion of the brain that’s actually responsible for a lot of heavy lifting when it comes to thinking, doesn’t actually get to do its job. You get angry, you don’t think. It’s that simple. The hunter/gatherer that was just bitten by a voracious white rabbit? He’s not putting effortful processing into determining motive, he’s pissed. And he just found lunch.

The problem, then, and there is definitely a problem, is that it’s easy to foment rage. Even a small crowd, a tiny segment of the population that gets whipped up into a frenzy can do an immeasurable amount of damage to the fabric of society. Violence borne out of this kind of emotion isn’t what democracy, republics, liberty or freedom are all about. But there are certainly provocateurs, whether wittingly or not, that are spewing vitriolic ideas that have a causal link to others being hurt. Is it any coincidence that all of the anti-Muslim rhetoric on the news corresponds to more threats and violence against, not even specifically Middle Eastern Muslims, but just brown people in general? What about a spate of assaults and murders of the transgender community, and the polar and polarizing rhetoric of prominent anti-LGBT religious leaders and politicians? Or the so-called “pro-life” movement inciting terrorist action against clinics and doctors that may or may not even be involved with abortion services? Of course it’s not a coincidence.

On the whole, issues such as these get the most attention, because they evoke an emotional response. The topics that keep people coming back for more are polarizing ones, regardless of whether we understand them fully (in fact, the less we understand something, the easier it is to get angry). And this is precisely why being a moderate isn’t sexy. A moderate response must necessarily put aside the initial emotional reaction, because it requires splitting the difference between extremes. Obviously, this isn’t possible if you’re so upset that your face is flushed, and spit is flying out of your mouth at such a pace that even rabid raccoons are captivated by the marvelous fountain of foaming ire.

Hence, being a moderate is often not about a call to action, but rather a call for compromise. Loud and influential players on the fringes effectively set the boundaries for discourse, while those in the middle are often content to express discontent and vote accordingly. Any vocalized concerns are usually drowned out by the opposition, and the undecided will tend to gravitate toward one extreme or the other, unless they’re interested in contemplative discussion and analyzing data for the purposes of policy optimization. I know what you’re thinking. “Ooh. That sounds hot.” Right?

Those of us, then, that are moderates have a job to do. It’s time to foster dialog, and that’s going to happen by priming our friends and relatives with the idea that they should step a bit closer to the center. Just a tiny step. Because the next step will be that much easier. The tools are available. Cell phones are ubiquitous, and social media means everyone is available all of the time (ready to post the next image with misattributed quotes and made up data)! Being a moderate isn’t sexy, so embrace it. Dragging people away from extremist views isn’t glamorous, but without citizens willing to engage in the arduous conversations that might eventually result in someone’s views budging even slightly, the only thing we can do is hope that bad, hurtful ideas don’t take hold. There are far too many examples in history of that plan not working out so well to take the risk.

Saturday, August 15, 2015

On Political Correctness

There is a backlash against so-called “political correctness”. It was referenced in the recent GOP debate: “I think the big problem this country has is being politically correct.” It has been brought up by customers angry with Target. It gets trotted out by dwarves that - don’t ask me why. I don’t really get dwarves.

So what is political correctness, exactly? It’s a unicorn. That’s right. A virgin-friendly, rainbow-farting, wide-eyed unicorn. A non-existent creature.

And that’s just it. A unicorn, like political correctness, does not exist in any real, tangible form. Even if it did, it’s unattainable; U for unicorn and Utopia. Unironic use of the phrase “political correctness” is to grasp at some ideal world that just isn’t going to happen. On the other hand, there are people that would love to hunt unicorns, because there are people that like to hunt every kind of animal. Call ‘em Ishmael.

To be sure, there has been a great deal of societal pressure to avoid using certain pejorative labels. Using a certain word that begins with ‘n’ can mean the end of one’s career. Just ask Paula Deen or Michael Richards. The justification for not using this and other slurs of race, gender, sexuality, and so on is pretty straightforward. Outside of very specific, friendly contexts, use of such terms is, at best, dismissive, and probably outright hateful. It’s demeaning and dehumanizing. Treating these labels with care isn’t about some wider political agenda; it’s about treating people with respect.

If you can’t grasp this, just imagine someone frowning at you and yelling, “You’re a fucking idiot,” without provocation. No matter what you say or do, this person refuses to address you as anything else. Would you be inclined to believe that this person has any respect for your life? Probably not. And why would you? An assessment of you as a person is being made at a distance, then permanently etched into all subsequent interactions.

And further still, how would you feel if, when challenged, the other person simply asserted that calling you a fucking idiot really wasn’t any kind of statement about your character or intelligence, not really. “I’m not calling you stupid, and I have nothing against you, it’s just what you are.” If someone can do this to you without you having any violent thoughts, you’re either a much better person than most, or you belong in an Asimov novel.

There are reasons to be concerned about the evolution of language. Stupid shit can happen, like the word literally losing its literal meaning by meaning both the thing it was and the thing it was not. Decrying words that are used only pejoratively against people is not stupid shit. If you think you have a good argument about group X, and you can’t make it without resorting to basically name-calling, then you don’t actually have a good argument.

But the crazy thing about invocations of “political correctness” is that they are invariably accompanied by dysphemistic language. If so-called PC advocates are attempting to make language less offensive by removing words from usage, the anti-PC crowd is trying to remove words from usage by making them more offensive. Consider: what is a socialist? In the current political climate, the answer depends not on what philosophy or economics textbooks a person has read, but on his or her present political affiliation.

This weaponization of words has also been applied to political correctness itself. The unicorn, it is whispered, is actually not a friendly, innocent equine but a fire-breathing lizard that will eat your children. Our unicorn still doesn’t exist, but it is precisely its non-existence that makes the concept so malleable.

Hence, people vying for presidential candidacy decry the idea as a way to insult others carte blanche (but don’t you dare retort, or you’re a “loser”); people make the choice not to label toy aisles about sweeping structural changes mandated by the government, rather than about no longer coercing children into playing with one type of toy and not another based on their genitalia; and they make dwarf civilization about… well, never mind the dwarves.

I don’t think chasing unicorns is particularly healthy. Just ask Ishmael. The best option, then, is to cut through the bullshit and actually be more honest about what we mean. Some people are going to be offended, and that’s okay. In many cases, it’s not even sufficient to say that offense is justifiable or warranted. The offense each and every one of us should take at being dehumanized is necessary to our well-being and existence. So yes, people are going to get called out. They will lose their jobs and tarnish their reputations because of Shit People Say. Instead of pretending that something more sinister is going on, hiding behind excuses of the deterioration of our moral fiber or planned social programming, how about we all just take ownership of the words that come out of our mouths and the consequences that follow. Because if we can’t do that, we might just be fucking idiots.