Wednesday, July 18, 2018

Why Is It Suddenly So Hard for People To Know Things?

If you’re like me, you’re watching the news, actively engaging with social media, and from time to time engaging in actual face-to-face conversations with other people, and you’re asking the question:

Why is it so hard, suddenly, for people to know things?  Or put more cynically: How is it that I and other right-minded people can be so secure in our knowledge about a great many things — including and most importantly What’s Really Going On in the World — while at the same time a roughly equal number of folks on the other side are utterly, stubbornly deluded and resistant to plain facts?

If you’re interested in sustaining a civil society, this seems like an important question.  Maybe even the important question, as an informed and engaged citizenry is essential to the operation of a democratic republic.

(This seems obvious, but as a thought exercise let’s take a moment to toggle the switches and explore the three other possibilities:


  • Uninformed, disengaged population --> tyranny in a hurry.
  • Informed, disengaged population --> tyranny by the scenic route, and with a few more complaints.
  • Engaged, uninformed population --> division and anarchy, until some collection of interests comes to predominate over all others, at which point you can fairly expect tyranny.)


    Having covered all that, how do we describe our current condition?  It’s fair to say that none of the four categories adequately captures the particular predicament we’re confronting right here, right now, in the United States in 2018.  To be sure, there is a critical mass of people who care deeply about the nation, its direction, and the many issues and controversies at the forefront of our politics.  This to the point that I’d say, taken together from both sides of the aisle, there are enough of us invested in current events and their implications to support the conclusion that for better or for worse, yes, the people are engaged.


    The next question is: are the people informed?  I don’t think we can deny that they are.  This is, after all, the Information Age.  Each of us has access to orders of magnitude more information than the American people did even twenty years ago.  And each of us participates daily in the circulation and flow of that information.  At least, that’s what I think we’re doing on our phones.

    But there is a problem here.  Something’s broken.  In fact, in a time of perhaps unprecedented division and bitterness in our politics — at least since the First Civil War (see what I did there?) — the following may be the single proposition on which liberals and conservatives largely agree: There is a real, and threatening, information gap cleaving the nation in two.  Partisans will complain that framing the problem at this level of generality was a trick.  You did get us to agree, Phutatorius, but for different reasons.  The left believes the right is willfully under- and misinformed, blinded by ideology and impervious to the actual evidence.  The right sees the left as captives of corrupt institutions that distort the political process with slanted information.  I have my own opinion on who’s right here, but I’ll save it for later.  For now, let’s offer the neutral framing that the two sides are differently informed and leave it there.

    Having settled that proposition, we can start breaking down how we landed ourselves in this blind alley, where we have ready, near-immediate access to 1.2 million terabytes of online information (some of it, admittedly, behind paywalls), but we can’t reach basic agreement on what we know.

    A first step is to consider, at the 50,000-foot level, how we come to know things.  I.e., what processes do we undertake to draw conclusions about the world we live in?  The short answer is all kinds, but because I’m a lawyer, and becoming a lawyer indelibly alters your mind, one such process strikes me as especially attractive and beneficial.  And here it is: we come to know things by (1) making observations, (2) examining the evidence provided by those observations, and (3) drawing conclusions.

    Elaborate, Phutatorius.

    I shall.  Let’s talk first about observations.  We can often observe immediate events personally — I know my son’s and daughter’s ages because I witnessed their births firsthand.  I was present, in the hospital room, on a particular date, when my children entered the world, and I can calculate their current ages with reference to that date.

    (It’s not that simple, necessarily.  Because as time passes, the births of my children become increasingly remote events, not so much in place as in time, with the result that I have to rely on my memory.  That’s easy to do for an event like the birth of my children.  But there are often cases where facts slip from memory — or they would — so we rely on supporting evidence that records them: say, a birth certificate, or the date-stamps on photos of the newborn baby.  More on this below.)

    Events that are place-remote are more problematic for me to observe, for the obvious reason that I’m not there.  In cases like these, I need to rely on someone else’s account of what happened.  I am able to know Mithridates’ daughter’s age because he told me when his daughter was born.  He was there, he strikes me as someone who would be a reliable source on this question, and I don’t have any reason to believe he would lie.  These three factors — immediate presence at the event, a relationship suggesting he would be particularly attentive to the event, and no obvious motivation to lie — make Mithridates a reliable authority on the question of when his daughter was born.  On this basis I can receive his evidence and, without requiring corroborating information, arrive at a definitive conclusion on his daughter’s date of birth.  Do I know the DOB to the point of absolute certainty?  No, but I am satisfied.

    So we have firsthand evidence (what we personally witnessed), and secondhand evidence (what someone else tells us).  Written documents, generally speaking, are secondhand evidence.  The birth certificate records the hospital’s testimony as witness to a child’s birth.  It isn’t more authoritative, necessarily, than what Mithridates told me.  But because we can imagine cases where we might have reason to find Mithridates less reliable — say, he was out of the room, he was fried on LSD, or he is embroiled in a legal controversy to establish his child’s citizenship status — contemporaneous written documentation from a neutral arbiter (here, the hospital) comes with an additional measure of authority that can be important.

    (We can ask whether a written recording of firsthand evidence — a note to myself, written in my journal — is secondhand evidence, too.  If I don’t remember the event per se, and I need the writing to remind me, one could argue that the written evidence is in fact secondhand evidence.  And in the course of evaluating its authority I should apply many of the same principles I would to a writing prepared by someone else.  Indeed, my first question should be: do I remember writing this?  If I don’t, I can examine the handwriting to determine if it looks like mine.  I can examine the journal to determine if it is familiar to me and has ever left my custody.  And so on.  Ultimately, the question I am required to answer is whether the journal entry is an authoritative record of the events it purports to describe.  Whether my journal entry provides secondhand evidence or a recording of firsthand evidence is really an academic question.)

    A further point about documents merits brief discussion.  Certain documents may be regarded as more reliable than others, because rather than simply inscribe what a witness perceived, they are instead created by technologies that allow for the direct recording of an event.  An audio or video recording of a courtroom proceeding will generally be regarded as more reliable than the court reporter’s transcript, because in the latter case events in the courtroom are mediated by the court reporter’s perception, with the result that the transcript could be marred by mistakes, or for that matter by the court reporter’s bad faith.  By the same token, an audio or video recording could be fabricated or doctored, and it is for this reason that “chain of custody” evidence is important in a court of law.

    This process that we undertake, in order to know things, happens in most cases on automatic.  There’s no clutch we have to engage, and we don’t talk ourselves through the work of gathering, sorting through, and evaluating evidence.  All day long we receive information, assess it, assign measures of authority to it, and draw conclusions from it — largely unaware that we are working through a process each time we do it.  I look at the baseball score.  The Indians are losing.  I don’t belabor the question of whether the MLB.com website is reporting accurate information, whether it might have been hacked and is now reporting a false score to me, whether the true score left MLB.com but was garbled or intercepted and replaced on its way to me.  And I don’t have any reason to question the conclusion.  The Indians are playing the Yankees.  I would like them to be winning, but it doesn’t strain credulity that they’re not.  I see the score, I accept it with resignation, and I move on.

    Because in most cases we draw final conclusions about what we know so easily, so quickly — so blithely — we tend to understate the extent to which we personally play a role in deciding what we know.  Although at times it might seem so, information does not act upon us.  It doesn’t just pilot its way to our brains, land and stick there.  We actually operate a tower that summons the information and assigns it a runway.  And there’s always the possibility that we might reject the information.  We may conclude that it does not, in fact, come from an authoritative source.  Or we may conclude that the source is authoritative as a general matter (it’s never been wrong before), and it seems as if it would be authoritative in this particular instance (like Mithridates, it was present, attentive, faithfully recording what it perceived, and having no reason to lie), but it just doesn’t fit into the broader system of conclusions we’ve previously drawn.  It just can’t be true.  That guy sold us Pudding Pops for decades.  He can’t be a rapist.

    We’ve covered some ground here, but it seems to me we’ve only scratched the surface of real epistemology, and we have not yet even started to consider why a process like the one I’ve described could fail, or could lead to such a yawning, sprawling, chasmic Information Gap as we’re seeing right here, right now in 2018.  Returning, then, to the question embedded in the title of this post — Why is it so hard, suddenly, for people to know things? — I’ll throw out these ideas, which I’ll explore more fully in subsequent posts (not necessarily in this order):

    • First, there are gradations of knowable-ness.  For example, What did the President say at the press conference? is, one should think, a question we should be able to settle, among all parties.  What did the President mean to say? is a harder question.  Why did the President say it? and What was he thinking? are still harder.  And don’t even get me started on a question like Do tax cuts promote economic growth?  I would argue that over time, our tendency to engage in vigorous disputation on the less knowable propositions has bled over into the more knowable propositions.  We have called off the hard arguments on the notion that “reasonable minds can disagree,” and in so doing we’ve set the stage to apply that principle to questions where, to be honest, reasonable minds probably shouldn’t.  Add to this the proposition that nothing — nothing at all — can be known to absolute epistemological certainty, and suddenly everything seems like fair game.
    • Second, the networked world makes it harder to know things.  One obvious reason for this is that in a networked world, there is more information interference.  But more than this, cause and effect determinations, which comprise one important subset of knowable propositions, are much harder to make, because the Internet enables a multiplicity of distant and diffuse causes to render actual effects, big or small, without regard to time displacement or distance.
    • Third, the Internet and digital media have created a crisis of authority, by devolving the capacity to generate and broadly publish information from capitalized elites to, well, anybody.  This is a good thing if, for example, you’re a nobody to book publishers and you’re starting a blog.  But it problematizes knowledge determinations by, among other things, enabling the publication of information untethered to journalistic or editorial standards.  For centuries we relied on media elites to provide information to us, and we assumed reliability.  Our media literacy — our ability to assess the quality and bona fides of information sources — has atrophied.  Or we never had occasion to develop it in the first place.  As a result, we are struggling to apply traditional indicia of reliability when fairly and reasonably evaluating evidence.
    • Fourth, and relatedly, the Internet and digital media have created a crisis of authenticity.  We are accustomed to examining documents — in text, photographs, recorded audio and video — and taking at face value that they are authentic and accurate.  But through technology unprincipled persons can, and increasingly do, fabricate and alter documents with an expertise that we’re just not gonna catch.  Seriously.  This is only going to get worse.
    • Fifth, and with my apologies to Republicans for walking back from neutrality, there has been for decades now a concerted effort by the right wing to attack the legitimacy of institutional sources of information that are central to our civil society, in order to obtain short-term political advantage.  Targets include the New York Times, the Washington Post, CNN, network news, institutions of higher education, and lately even the Congressional Budget Office and the Joint Committee on Taxation.  As a result, a large portion of the country has been convinced to reject out of hand reporting and assessments from principled and generally reliable information sources.
    • Sixth, not everyone hews to or even favors an evidence-based approach to knowledge all of the time.  Am I engaging in unfair speculation when I suggest that religious fundamentalists who predicate so much of their understanding of the spiritual world on faith (which I won’t criticize) will be susceptible to basing their understanding of material and sublunary political matters on faith, too (which I will)?  There is a go-with-your-gut, reject-the-evidence mentality that predominates on one side of the political debate.  The process these folks follow will absolutely lead to divergent results, as against evidence-based analysis.
    • Seventh, and I mentioned this briefly above: in the end we decide what we know.  And it is at that point in the process — even in the evidence-based process I would favor — that we are most at risk.  Every human weakness is brought to bear, and applies its leverage, at that point.  And there are a lot of human weaknesses.  Just to name a few: prejudice, spite, pride, grievance, group identity.
    More to come.

    Sunday, July 15, 2018

    Where Have You Been, and Why Are You Here Now?

    We’re back.  It’s years later, and here we are crawling back.  And we’re out of practice.  And weak.  But bear with me, because I think it’s worth pausing a minute to explain, as best I can, why we’ve been gone and why we’re back now.  

    To be fair, I don’t even know if we are back.  I can only speak for myself.  So for now, and with crossed fingers — Mithridates: are you there? — I’ll abandon the first-person plural, lest I appear to be writing as The Royal We.

    Edit made, then: I’ll explain why I’ve been gone and why I’m back now.  Not that it makes a damn bit of difference one way or the other in the Grand Scheme of Things (which, incidentally, is starting to look increasingly like an Actual Grand Scheme), but because this seven-year drift out and back might allow me room to reflect on the progress of said Grand Scheme, through what exactly it’s done to me.
    For sure, there are lots of personal reasons that figure into why I’ve posted exactly once to this blog in seven years — why, when I visited the home page just a moment ago, the URL didn’t even auto-complete, and I had to type the whole damned thing and hit Return to conjure it up.  Let’s run through those quickly:
    • Small children getting older, meaning later bedtimes, and thus fewer opportunities to sit down, in a moment’s tranquility, to gather my thoughts.
    • Small children getting older, meaning more commitments over and above the eight-hour workday.  To take one example, I’ve been coaching a soccer team [pause for laugh track].
    • Recommitment to fitness.  When I’ve had an hour free in the evening, I’ve been out running, or on the elliptical.
    • Peak TV — which, among other things, helps to get me on the elliptical.
    • Exhaustion from the recommitment to fitness, which is all the greater because I am on the cusp of middle age.

    So fine: I have needed to make, and have made, personal choices.  Not much about that is interesting or meaningful outside of the four walls of this room.

    But there are two other factors that might be, and here they are: (1) around 2011 I got hold of my first smartphone, and (2) around the same time, I got lost in social media.  Now it is not my intention here to cast ballots for or against iPhones and Facebook.  I will, however, testify to how these two innovations changed my relationship with the Internet, with the result that I have been reacting to a great many things, but not thoughtfully, and not so very productively, either.

    When Mithridates and I founded this blog, we were genuinely excited about the ability that the Internet and digital media conferred upon us — that we might communicate our ideas directly and immediately to anyone in the world, untroubled by the constraints of time and space and without submission to the preferences of the gatekeepers who had, by virtue of owning printing presses, heretofore held the privilege of deciding What the World Would Read.  This after, for my part, I had spent more than a decade performing an elaborate Dance of the Seven Veils for literary agents and fiction publishers who were at the least immune to — and probably nonplussed by — my awkward attempts at seduction.

    We were genuinely beside ourselves with excitement.

    And we generated content.  Content of all kinds: political commentary, legal analysis, film and record reviews, news roundups.  We were, the two of us, trying to produce an online general interest magazine.  And this was in addition to the ridiculous serial first-person fiction I was writing and a blog where I published letters I had sent, public and private, back when I wrote letters.

    On a given day I might imagine what would happen if the Flaming Lips’ Yoshimi challenged Triangle Man, They Might Be Giants’ reigning champion, to a fight.  On another day Mithridates would propose an elegant, neutral solution to probably the single most significant political problem facing the country.  And when we weren’t creating, editing, posting, commenting on one another’s posts, we would rush over to review the site analytics, where we would learn that in the past three days, we’d had three visitors.  To the entire site.

    If you were using Google Analytics ten years ago, then you would know that at least back then (can’t say what it does now) it tracked visitors by state of origin.  Notwithstanding that extremely blunt instrument for traffic analysis, we were still able to know exactly who was visiting our site — because our user base was that small and consisted entirely of firsthand acquaintances.

    And while it is true that there is an internal satisfaction that comes from writing — from framing, rendering, and recording one’s own thoughts for Some Future, Slightly Different Iteration of Self to review later — there comes a time when the Self downloads and installs its latest update, squints into the computer screen, and asks the question:

    Why do I keep doing this, when next to no one (else) is reading it?

    And along comes Facebook.  And Twitter.  And Facebook and Twitter promise some greater satisfaction, in the form of (1) an audience of Friends/ Followers, and (2) a (limited, in the case of FB) promise that some of your content will find its way into a feed for those Friends/ Followers to look at, every night, between 10:30 and 11:30 PM.  And if your Friends/ Followers like what they see, they can hit a button and tell you so.

    And that’s gratification.  To be sure, it doesn’t mean much: Likes and retweets are unlimited resources.  They never run low, they cost nothing, and so they’re easy to give away.  But if you’re me, and you’ve been sending query letters into The Literary Void for years on end, if you’ve run at various times five blogs that have, over fourteen years, accumulated more comments from bots than from real people … well, let’s just say a guy can get hooked on Likes.  (Not so much the retweets, in my case, because I’ve never really taken to Twitter like I have to FB.)

    But of course there’s a catch to Like-fishing on Facebook, and it’s this: unless you’re Dan Rather (who somehow, some way, is getting away with it), nobody wants to read a 1000-word post on Facebook.  You have to keep it short.  You have to keep it snappy.  Hell, in most cases, you’re best off simply recirculating something short and snappy that someone else — some distant person you’ll never meet, if he/she even exists at all and isn’t in fact 100 lines of code written to generate and release sentiments optimized for viral distribution — wrote three days ago in a meme square.

    So the trade-off is as follows: you can spend your evening on Facebook, snapping off a dozen or more clever and poignant observations, under three lines please, and actually elicit one of five prefabbed reactions (Like, Love, Sad, Angry, Funny) from a dozen or more people, or you can spend an evening on Blogger, trying against the odds to write a single nuanced, complex, insightful post, secure in the knowledge that no one other than possibly Mithridates will ever know it exists.  And in a seven-year moment of weakness, I chose Facebook, fully realizing that I was adding nothing of value to the discourse — even snapping off, from time to time, clever and poignant FB posts bemoaning the devalued state of FB discourse — but it was enough to know that I was actually participating in a discourse.

    And that’s Facebook, which doesn’t even have character limits.  Don’t even get me started on Twitter, which is distinguishable from Facebook principally in that Facebook’s object is to make you feel good through surface-level interaction with Friends, while Twitter’s is to make you feel terrified and hopeless through surface-level interaction with strangers who hate you and want you to die.  We credited Twitter with destroying tyranny in North Africa, and we were quite surprised when it was later deployed to destroy democracy in Europe and the Americas.  But the truth is that we should have seen this coming.  Building things, and then sustaining what you’ve built, is hard.  It requires nuance and complexity and insight, and just you try to fit that into 140 characters.  Destroying things is easy — it takes comparatively little text to insult people, or to alienate, frighten, or incite them to violence.  Character limits call for incisiveness, the elimination of niceties.  They favor the clever, and clever turns quickly into snarky, and snarky into mean.  You might think that doubling the character limit to 280 would open the door for a more constructive discourse, but by the time that happened, the rules of engagement were too well established in the Twitterverse, and the concrete had set solid.  They had only doubled the number of barrels in the gun.

    Is there a more telling commentary on the state of our social, political, and intellectual discourse than this: that someone actually reduced the phrase “too long; didn’t read” to an acronym — and that, without irony, it caught on?

    But I worry I’m piling on here, when the truth is that by virtue of social media I access more information, from more sources, than I ever would through a web browser or search engine.  A hundred, maybe a thousand links flash by my eyes each day, via FB and Twitter.  That’s great news.  We all, each of us, curate collections of content, and through our relationships on social media and the authority we accrue through those relationships, we are able to share those collections with hundreds of direct contacts, and they proliferate onward to thousands, maybe millions.  And with a cell phone in my hand, I can do this from a moving train, from the beach, from under a desk during this week’s mass shooting.  This is powerful, and actually empowering.

    But for me, in this moment, it’s not enough.  It’s not enough, because while I may be curating that collection of content, reading an article or watching a video, passing it along, attaching an endorsement, a wry comment, or a brief expression of outrage, I’m not creating anything.  And I have been allowing that part of my brain that formulates and expresses ideas to atrophy.  Eight to ten years into the Social Media Evolution, I am emotionally sharp — I LOL with the best of them, and I can rise to anger at a moment’s notice.  I just don’t think anymore, like I used to, and I certainly don’t write anymore.

    That needs to change.  Because the state of the world today has me thinking — a lot — about what’s important.  And what I read and see online, including and especially on FB and Twitter, prompts me to think even more, and (I hope) in a more sophisticated way.  There’s room here, on this blog, for me to write all this down.  It might be that only a handful of people actually find it here, and every one of them takes a brief look, rolls his/her eyes and types TL;DR in the comments.  But I’m back in the mode now where it’s important to me — and it can be enough — just to write it down.


    So here goes.