
Truth is a funny thing to think about, isn’t it?  The idea that something can be True forever? Something about the concept of Forever is disquieting, or majestic, or both. Contemplating the idea of an infinity is like standing at the foot of a mountain and looking up–or like standing at the top and looking down.

It’s a mathematical truism that infinity comes in different sizes. This can seem silly or unnecessarily complicated but really…didn’t we already know that? After all, it’s true for time; there are different lengths of eternity which we navigate from day to day.

And yet, despite this, we don’t think about eternity much in the modern day. Eternity, and eternal Truths, seem to have gone out of vogue. Everything is rapidly changing: politics, fashion, the environment, society, math, biology, technology. We harnessed coal and steam and changed the face of the world, but coal and steam did not last forever. We created nations and kingdoms and armies, but those don’t endure. The borders are constantly redrawn. We thought we understood biology–of humans, and of animals–but alarmingly often, what we thought we knew about medicine in 2010 turns out to be exactly the opposite.

And, of course, there’s no room for eternity in daily life. When you spend 16 hours awake and 8 of them working, that leaves only 8 hours for everything else: 2 hours to see family and friends, 2 hours to eat, 1 hour to take care of yourself, 1 hour to exercise…and just 2 hours remain for anything more.

The “onward march of progress” has included, among other things, the slow and inexorable hunt and extermination of infinities. Some eternities remain–mostly in pseudo-religious contexts, things that were once sacred and spiritual which have now been secularized. Ceremonies. Holiday celebrations. Communal gatherings. Other eternities survive in the recesses of personal life. Lovers’ trysts. Family gatherings. Deep conversations that start at 2 in the morning and seem to last forever.

The age of eternal truths has ended as well. At the risk of sounding dramatic, postmodern thought has killed Truth. This is not necessarily a bad thing: many empires were founded upon Truth, and the collars of many prisoners were shackled at its altar. But it remains that there are few True things we can attest to.

In the past one might have said “Well, I know very little about the world, but I know I am a Man, and that tells me what I must do!” We’ve since explored more fully what we assume “A Man” to be, and found it not to our liking. As a matter of fact, many of us kill ourselves trying to fit into that definition. So we reject it as Truth and accept it as a guideline–something to steer by, occasionally. But we take no Truth to replace it. We have overthrown the definition which imprisons us, without bothering to find out what it is we should use in its place.

I don’t mean this to sound like an indictment of feminism. Far from it; the feminist movement has been immensely enriching to the lives of all people (yes, including men). But feminism and critical identity theory and postmodern thought have done their work with the enthusiasm of termites, undermining the structures which oppress the population, and leaving very little accessible to us. What remains is a kind of desolation. What values can we embrace, when we know that the ones we grew up with are problematic? How can we anchor ourselves in the world when the words and ways we interact with it are linked so closely to old violence?

There must (I hope) come a response. The purpose of religious ritual is to put us in touch with Eternity: to remind us of our place in the cosmos, and allow us to take part and take pride in the World. The purpose of Truth is to equip us to understand falsehood; as Chesterton said:

“It is ludicrous to suppose that the more sceptical we are the more we see good in everything. It is clear that the more we are certain what good is, the more we shall see good in everything.”

I think that we have really, as a culture, still not fully recovered from postmodernism. It passed over us like a fever; it left a great many systems cleaner and clearer, and we are closer to good health…but we’re still shaken and scatterbrained in its wake.

G.K. Chesterton wrote his book Heretics on this subject. Gilbert is a journalist, a splendid and punchy writer of editorials, and Heretics is, in short, an editorial review of the 20th century’s Western social and intellectual traditions. Chesterton finds his Truth–his Eternity–in his Christian faith, something we seem to be moving away from, as a species. I ask, then, of the West: with what Truth are we to replace it? Skepticism has torn down many dogmas and pointed out that many powerful men have used them to deceive and beguile us. But in the process it pulled down Truth and Eternity with them, and now we have to start over in figuring out where to go, as individuals, and as a species.

That’s the bad news. The good news is, when everything is meaningless, there’s no bad place to start finding meaning. Find eternity in art. In cleaning or building. In coloring. In conversation. Find Truth in laughter, in good company, in helping your friends. Find meaning in everything; especially in the things that mean something to you.

 


I’m a writer–I think.

I like writing…

…I think.

The problem with writing is that it is an attempt to translate the infinite into the finite. This is a source of endless frustration. I have worlds upon worlds in my head, enough material for an endless number of television series (including all the relevant information for casting, costume, set design, combat choreography, soundtrack, photography, storyboarding, and beat-by-beat scene direction).

I know more than one thing about anthropology. And about philosophy. That sounds silly to say–there are not many fields in the world which we can say we know only one thing about. (Hermeneutics might be one of them: all I know about hermeneutics is that Heidegger critiqued it.) (That was a joke. I know more than one thing about hermeneutics.)

In fact, I know multiple things about anthropology, to the point where it would take more than twelve pages to write all those things down in their simplest possible form. For any given thing that there is, I can say more things about it than I have room for–and I have an infinite number of ways to say it, ways to attack it, ways to think about the problem.

That’s the infinite.

But I don’t have nine hours to spend typing out an exhaustive, nuanced exploration of every political issue on my Facebook wall. No one wants to stand around for a week and listen to a 40-hour lecture on comparative religion in response to the question “So why is Princess Mononoke your favorite Miyazaki movie?” And no one will buy my novel if it is an eighteen-part epic that’s thicker than a human thigh. I’m not Alexandre Dumas, and my novel isn’t The Count of Monte Cristo.

My blog post has to be small enough that you’ll read it all without losing interest (it’s gonna be touch and go, here). My novel has to have a number of pages such that it is ecologically viable to print more than one copy. I can’t go around quoting the entirety of Aristotle’s Nicomachean Ethics every time I want to talk about why it’s hard to do the right thing.

That’s the finite.

I have to take this: (please here imagine a Doctor Strange-style expanding wall montage where I make some grandiose gesture and reveal that we are standing in a massive chamber of knowledge which makes the Library of Alexandria look like a rural-Montana Bookmobile from the 1960s), and fit it into this (please now imagine me holding up a piece of paper approximately large enough for two thousand words, single-spaced 12 point).

How does it fit? Well, quite simply…it doesn’t. It never all makes it onto the page. I never fully say what I mean. You never get all of it. No one gets all of it, in fact, not even me, because eventually I have to eat, or sleep, or do my accounting, and then I can’t keep on thinking about this.

That’s immensely discouraging for me. I pretty regularly have a crisis wherein I wonder “what’s the point of the whole thing?” I can’t even fully articulate my own opinion of Starbucks–how the hell am I supposed to put something as big and nasty and complicated as a novel into the world?  And so, logically, I stop. There’s no point in communicating halfway, I think. No reason to engage with politics. Fruitless to write for any reason other than my own enjoyment.

I spend a few weeks like this, maybe a month or two at the most, before I think to myself: “You know…I can’t get it all out there…but I can get pretty close. And anyway…isn’t that the fun of writing? The ability to, in another man’s better words, fit a universe into a grain of sand? To gesture to the infinities present in everything?”

And then, I suck it up, grab a keyboard, and start to write again.

So hi, again.

I’m a writer.

I like writing.

You hear the phrase “constructive criticism” a lot, don’t you? Sometimes people use it in the way that F-22 jets deploy flares against heat-seeking missiles—to draw attention away from them and off to something else. “Don’t be so sensitive; I was just offering constructive criticism.”

I’ve thought about doing a post on criticism and feedback for a while now, so this has been a long time coming. In the next few hundred words, we’ll go over (briefly) the difference between critique and feedback, and which one is more helpful in different situations. I’ll close by talking about how I give constructive criticism, why that criticism is constructive, and some easy ways to make your own critiques more constructive as well. I don’t know much about critiquing sculpture, but I do know how to critique writing, and that’s what I’ll be drawing on throughout this post.

First of all, like any good philosopher, I’m going to get us clear on our terms before anything fun happens. This is where all the important moves occur in a philosophical text—at the very beginning, when you decide what words mean. When we define the meaning of our terms, we choose what we want to emphasize about them, and what we want to downplay. Define your terms adroitly enough, and you can change the entire interpretation of your text.

(So the next time you have to read something that doesn’t seem quite right, look at the way they introduce their terms, and the way they are defining their words. Chances are, they’re doing some work “off the page”—by changing the definitions of their words in mid-page, or by using a different definition than you are.)

So what, exactly, is criticism? Isn’t it the same thing as feedback?

Well, yes. The way most of us talk about criticism and feedback, you can fairly safely use the two words as meaning the same thing. But the point I want to make in this blog post requires me to separate the two of them, to drill down through the “general” meaning and make a big deal about the subtle difference between the two.

Feedback is a response. It’s also an A/V term (that stands for audiovisual, for those of you who aren’t tech-savvy)–more specifically, an audio term. In a recording-tech sense, feedback is what happens when there is an overlap between an input and an output, e.g. a microphone within range of a speaker. I mention this to help make a point in a few sentences, but I’m going to use a semi-psychological definition for feedback, because that definition helps me to explain why I am right.

The way I always think about feedback is as a response. Audio feedback is a response to something occurring in the environment. In behavioral psychology, feedback is the response the brain gets following an action. So if you are a rat in an experiment, you push a button and get feedback. That feedback can be positive (you get a delicious raisin) or negative (you get a terrifying and painful electric shock). The feedback then affects your behavior: if you receive positive feedback for pushing the button, you are going to push the button more often. If you get negative feedback for pushing the button, you are going to want to stop pushing the button as quickly as possible.

So when would you give feedback in this sense? Well, when you want to encourage or discourage activity. For example, when you show up late to a party and someone saved you a slice of cake, you thank them, giving them positive feedback which makes them want to perform similar actions in the future. If someone steps on your toe, you give them negative feedback by saying “OUCH THAT HURT YOU SON OF A”, to make them want to think twice before stepping on your toe again.

Criticism is slightly different in this view. If feedback is a response, then in the context of art, feedback is your response to a piece. Responses differ from one person to the next. Some people love Thoreau. Others can’t stand poetry at all. Some people like Tim Burton’s films, while others dislike the dark, gothic atmosphere his works project. Some people like Taylor Swift’s music. Others are wrong.

The point is: feedback is what you think about a piece. If you say “I loved that movie!” then you’re giving feedback. If you say “ugh I hate this song,” you’re giving feedback. That feedback then communicates to whoever is listening, either encouraging them to do something (watch the movie again, or talk about the movie more, or, if you’re lucky enough to be talking to a filmmaker, to inspire them to develop more films like the one you loved) or discouraging them from doing something (such as never playing that song again around you—or entirely avoiding songs in that genre). Feedback is a response to stimulus, which can encourage or discourage the repetition of that stimulus. This is how feedback can make or break a young artist’s interest in creating—if they happen to get only negative feedback—by sheer random chance—then they will be sufficiently discouraged to avoid that activity in the future. Conversely, if an artist gets enough positive feedback, they will have the encouragement and reinforcement they need to go on creating.
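If you like, the rat-and-button version of feedback can be written down as a toy loop. This is my own sketch, not real psychology: the function names, the 0-to-1 “preference” number, and the learning rate are all invented for illustration, but they show the shape of the idea–positive feedback nudges the odds of repeating a behavior up, negative feedback nudges them down.

```python
def update_preference(preference, feedback, rate=0.2):
    """Nudge the probability of repeating a behavior based on feedback.

    preference: current probability (0..1) of choosing the behavior again.
    feedback: positive (+1, the raisin) or negative (-1, the shock).
    """
    target = 1.0 if feedback > 0 else 0.0
    # Move a fraction of the way toward "always" or "never".
    return preference + rate * (target - preference)

# A rat that starts out indifferent to the button (50/50).
p = 0.5
for _ in range(5):
    p = update_preference(p, feedback=+1)  # five raisins in a row
print(round(p, 3))  # 0.836 -- climbing toward "push it constantly"

for _ in range(5):
    p = update_preference(p, feedback=-1)  # five shocks in a row
print(round(p, 3))  # 0.274 -- sinking toward "never touch it again"
```

The same loop is a decent mental model for the young artist: a run of shocks (negative feedback) drives the preference for creating toward zero, a run of raisins keeps it alive.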

Criticism is not just whether or not you like the subject matter, or the writing style, or the performance. “True” criticism is a careful analysis, with a totally different purpose. We’ll talk directly about writing now for the sake of simplicity:

Feedback is about whether or not you ever want to read the piece again, regardless of how good it is. Criticism aims to provide the writer with a way to make that piece—or the next one—better.  It is a deliberate evaluation of the successes and failures of the subject, which provides recommendations both on what to change, and what to keep the same.  When you offer criticism, you recognize the subject’s merits and faults equally, measure them against one another, and suggest ways for moving forward.

Now that we’re all on the same page, it’s time to talk about constructive criticism. What does this term mean? Well, remember, we generally talk about criticism and feedback like they were the same thing. We also talk about criticism as if it were negative feedback, which makes things even more confusing. When someone criticizes you, we interpret that as meaning they are discouraging you from doing something—which is, as we know, the definition of negative feedback.  In my opinion, constructive criticism is a way that we try to reclaim the difference between feedback and criticism—to emphasize that what is being provided is intended to make the subject better.

I think most people don’t understand how to give constructive criticism. Often, when people offer “constructive criticism,” they are simply giving negative feedback, discouragement that is badly disguised. Other times, when people provide constructive criticism, they focus solely on the bad things, and not on the good.

I’ve always found that constructive criticism works best—which is to say, it provides the most improvement in the subject of critique—when it incorporates both critique—an analysis of the subject’s faults and virtues, suggestions on how to improve—and positive feedback—encouragement to continue. I’ve written a separate little blurb about how to provide constructive criticism to writers. 

So when you go about your life, listen closely to how people talk about feedback and critique. When they say they’re giving constructive criticism, are they really trying to help you improve? Or are they just giving you negative feedback—discouragement? People use the terms interchangeably, so they might not even notice the difference unless you point it out—though you should also ask yourself: why would you point it out? To help improve the way they interact with others? Or to discourage them from giving feedback?

Being clear on these ideas of feedback and criticism can improve not only your ability to edit other people’s work, but also to edit your own—and change the way you interact with people in your life. Knowing whether you want to offer criticism or feedback can be empowering—because sometimes, you just want somebody to stop making racist jokes. Sometimes, people do things—and really enjoy things—that they absolutely suck at. Knowing the difference lets you ask yourself: “which one should I use? What do I intend to accomplish?”—which can make you more mindful, more helpful, and more encouraging to the people around you. And who doesn’t want that?

That’s all for today. Thanks for reading, and stay tuned!

I like critiquing other people’s written work. It’s fun for me. People write the way they think, and it’s fascinating to see the way other people think. When I critique other people’s written work, I have a certain process I go through. It’s taken me a few years to see that I even have a process—but I do, and I think it’s high time it was written down.

This sequence was developed critiquing essays and short stories. It works most directly for those—but you can extrapolate it to provide constructive criticism for something as long as a novel, or as short as a haiku. Be warned: this will take a lot of reading to do properly. But don’t worry too much: if you’re practiced at it, the whole process won’t take long.

  • Read the piece as a whole. Does the author succeed in saying their part? Did they end too early? Do they make their point halfway through and then keep talking for no reason? Strip away the idea. Read the bare text. Can the text express itself adequately even without the idea in mind?
  • Re-read the piece as a whole. Focus on what the author is trying to say. Get inside their idea. What is the intention of the piece? Go beyond what the actual text says—go to the idea. Can you tell the idea of the piece from what is written there?
  • If you can get the idea from the text—why? That means the author succeeded. Where did they succeed, and how can they do it again next time? If you can’t—why not? If you can’t figure out what the idea of the piece is from the text alone, the author failed somehow. Where? How can they fix it?
  • What did you really like about the piece? Writers are not like numbers—they do not have a single positive or negative value. If you can’t find a single thing you like about the piece, the problem isn’t the writer. Read it over and over again until you know exactly what you love about it.
  • Now go through specifics, one paragraph at a time. What sentences are done really well? Where is the author eloquent, brilliant, flawless? What sentences are done poorly? Where does the train of thought get confusing? Highlight sentences in both of these categories—done well and done poorly.
  • Get out your grammar book. Hunt out the little errors. Deploy the red pen with ruthless glee.

When steps 1 through 6 have been completed, go back to the author. Start with number 4. Tell them the things you really liked, and tell them why. Once you have built that groundwork of positive feedback, move into number 3. If they succeeded in expressing the main idea of the text, congratulate them. That’s the hardest part of writing. If they didn’t, don’t just tell them they failed. Tell them how, and tell them how to fix it. If you don’t know how they can fix it, then your job as critic is to help them figure it out.

Finally, when you’ve taken care of that, you can move on to the small stuff—5 and 6. Explain the good sentences and the bad ones, and offer suggestions for how to fix the bad ones. At this level, it’s ok to not know how to fix the confusing sentences. That problem you can give to the writers—chances are, they’ve already spent time wrestling with that sentence even before you saw it. Finally, bring the grammar book out and fix all the little issues that remain.
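The whole process can be condensed into a little sketch. To be clear, this is just my own shorthand for the post above–the step names are mine, and nothing here is a formal system–but it captures the one non-obvious move: you read in order 1 through 6, and then deliver in the order 4, 3, 5, 6, leading with the good.

```python
# The six reading steps, in the order you perform them.
READ_ORDER = [
    "read the bare text",          # 1
    "re-read for the idea",        # 2
    "judge success or failure",    # 3
    "find what you loved",         # 4
    "mark strong and weak lines",  # 5
    "hunt grammar errors",         # 6
]

# Delivery reorders them: lead with the good (4), then the big
# question of success (3), then the line-level notes (5 and 6).
DELIVERY_ORDER = [4, 3, 5, 6]

def delivery_plan():
    """Return the reading steps in the order they get delivered."""
    return [READ_ORDER[i - 1] for i in DELIVERY_ORDER]

for step in delivery_plan():
    print(step)
```

Steps 1 and 2 never appear in the delivery at all: they’re for the critic, not the writer.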

And boom. It’s just that easy! You’ve just given constructive criticism to a writer. Now not only do they feel good about their work, they’ll have a good idea of how to improve it–from the global scale, all the way down to individual sentences and Oxford commas.

Dr. Q, I would never have been able to articulate this so eloquently had it not been for your course on Empires. So, as (somewhat humorously) promised: here’s one last short write.

————————————————————

What is the purpose of learning? Why bother taking in mountains of information about American Imperialism? Why read the long, long list of books which cover the depressing things that happened to different brown people?

Well, why do we read the Instructables.com page about “how to tie a sheep’s knot”? Why do we read “Fitness For Dummies”? Why take a late-night dance class? The answer here is easier–we want to learn to do something, or to do something better. To plug this answer right into the original question: Why do we expose ourselves to the horrors of the past? So we can learn to do something.

Why do we learn statistics? So we can be better academics, able to produce and decipher statistical analyses. So why do we study imperialism, oppression, militarism, neocolonialism? So we can be better people, able to understand and explain systems of oppression.

Cynicism cannot be the answer. If you turn away in cynical despair when faced with the knowledge of a new tragedy–if your reaction to Bad Things In The World is just “Of course. Why do I bother asking? Of course bad things happened,” then the knowledge is wasted. Knowledge is power–the power for change. To make use of that power–the knowledge we have come to possess–we must also embrace the moral imperative that comes with it. To respond to inhumanity with resignation and cynicism is to allow the world to stagnate. With learning and optimism comes the single most dangerous thought in the world, the beginning of any quest for change:

“It doesn’t have to be this way.”

For me, that is the essence of the activist’s passion. That is why I learn. The more I learn about the bad things in the world, the more I understand their subtleties, the more I can see what can change, what could be different, what could be better. And with the knowledge of bad things past, I can see bad things in the present, experience wholly new things, and see how they could be different as well–because with an awareness of history and a cross-cultural education comes, once again, the idea that I always return to in the course of any social activism, the thing that runs first through my head when I see something wrong, something morally outrageous–because this is why it is morally outrageous, that is why it is wrong, because the status quo is not as fixed as cynics think, and because no tragedy is inevitable, because, in short:

Things don’t have to be this way.

I recently had the pleasure of reading this impassioned piece from the Washington Post. By “recently” I mean “today.” I’ve since reread it several times, because the commodification of American colleges and the narrowing of academic fields is an issue very dear to my heart.

As I reread this piece (which, now that I’ve begun this blog post, I confess I find rather uninspiring), I must ask: who is its audience? The author’s entire argument can be summarized by skipping the entire article and reading the last sentence: “Genuine education is not a commodity, it is the awakening of a human being.”

Great. Super. Fantastic. I’m on board. I agree.

Who are you talking to? Are you talking to me? I just finished college. I’m probably going back for more school. My response to a similar article was a little salty, to say the least–but still. Is this directed at future employers? At students? At current college professors?

I think it’s the third one. C. Door Number Three. The soaring rhetoric and its entrenched location within the Washington Post seem to corroborate this first impression. Use of the word “naive” (a word often used to encompass the analytical category of “people who don’t understand academia”) deepens my suspicion.

So what is this article saying, then, if it is directed at other professors? What is the ultimate message being conveyed? I don’t know–I’m not a professor (my only degree is a B.A., which I promise you was hard-earned).

My first and immediate point to make in response to this article is:

A): re: “Genuine education is not a commodity.” True. “Education” in a liberal arts context is learning how to engage with and integrate multiple disciplinary, cultural, and/or epistemological perspectives. In other words, it’s about learning multiple different ways of doing things, in order to be able to apply the appropriate one(s) to all relevant situations. It’s something you can achieve on your own, or with the help of your parents, or with the help of unpaid teachers, or at a state school, or at an Ivy League university. Education is, as this good professor says, “the discovery that you can use your mind to make your own arguments and even your own contributions to knowledge.” I’ve made this analogy before: it’s mastering multiple different theoretical perspectives, much like Bruce Lee learned multiple martial arts, to be able to better accomplish your own goals with the most effective method.

HOWEVER

What happens throughout this article is a persistent (and, if I can borrow the word briefly, pernicious) conflation of the process of learning, the undergraduate experience, and the university-as-business. 

EDUCATION, as we’ve previously pointed out, can happen anywhere. An argument can be made that it’s easier to achieve “education” in a college environment. That argument is not occurring here (or, rather, it’s occurring off the page, a dirty trick that I would have expected from a philosopher, not a classics scholar, though the difference is sometimes hard to spot. The easy test? Do they ever mention German names?). This education is what “good” students are after. This is why they are good students–because whether by temperament or economic good fortune, they are highly interested in the self-improvement aspect of a college education, not just its value as a commodity. I was one of those students, because I was extremely lucky financially, and because I am a huge nerd.

The undergraduate experience is a whole other canteen of nematodes, which I’m not going to get into right now, but here’s the shorthand version: going to a college, participating in classes, learning from professors, etc., all are part and parcel of what makes college so transformative. College can force you out of your comfort zone (if you aren’t EXTREMELY, NEUROTICALLY devoted to remaining within it), and it’s when we’re out of our comfort zone that we grow. However, it’s not the issue at hand.

Number three (again number three! Second one in the article! I wonder if it has any cosmological significance…?): The university-as-business. This is the part that our author seems to be worked up about, which I find troubling for reasons I’ll expand on below. But basically, my response: College has become a commodity in the U.S. (and in the wider world, I’m sure, but I am not concerned with that at the moment). As our author acknowledges, college is “replacing high school as the required ticket for a career.” This means that having a college education makes you stand out (even at my workplace, my co-workers make jokes about my “fancy college degree”). Your odds of being employed (and employed well) skyrocket. Success in college pretty heavily indicates success later in life.

Now, unlike the last article I blasted on my blog, I don’t entirely reject the author’s point here. The commodification of education is a problem (not just because of the way in which it bars the doors to the lower and middle classes). Some students do treat their college purely as a business, feeling entitled to a degree with no effort or challenge on their part. The government sees colleges as businesses, and so does not offer them any great degree (ha) of support.

Ultimately, colleges have adapted. The college I attended occupied an uncertain middle ground between being a business and a place of education. There was a tension between the institution’s bottom line and its values. On the student side, there was a similar tension: we sought to balance our role as students with our newfound power as customers. Acknowledging that they are customers gives students an unprecedented degree of power. We have to figure out what that power MEANS, all of us, students and professors.

Hence, my concern. Rather than acknowledging the changing face of education (and trying to offer some direction going forward), this article seems to deny it. Education is not about money, the author says. It’s about the students’ engagement with the material. Well…that’s not true. Not any more.

Education is about money. It is inextricably, inalienably, unavoidably about money. Even when you’re talking about student engagement–who are the students who can afford to be engaged? The ones who don’t work an extra 30 hours each week to pay for school? The ones who don’t have to take care of children? The ones who could afford to go to school in the first place? The ones who could buy the textbooks–the list goes on. The point is: “student engagement” is not the boogieman to pin the problems on. Like it or not, the problems are more complex than that. We can’t escape the complexities of the present-day university by just demanding that students pay more attention.

That is where my problem with this piece comes in. I don’t think it’s wrong…I just think it’s not asking the right question. The question we need to ask ourselves is: What does it mean that students are now customers? What new pressures does that place on faculty? On students? On administration? What power does this give all of those involved in higher education? When a bad grade or a faculty grudge can make or break a student’s future, how do we negotiate these structures? And when a bad review or an angry parent can ruin a professor’s future, how do we negotiate these structures? What about higher education needs to change? And what needs to stay the same?

And for the love of GOD, can we not commit the fallacy of equivocation so damn much? Jeez, people. More of you need to take philosophy classes.

I am not sure why the term “fellow traveler” came to mind when I was writing this post. I think, in my head, it had a much different emotional undertone than its actual historical context suggests. Despite its name, this will not be a post about the legacy of communism in the late 1940s, nor about the Russian intellectual movement following the revolution of 1917 (sorry, Helen).

Rather, this blog post is about a particular kind of emotional connection that I have begun to notice as having a pattern. This blog post is about the moment when you connect with someone you recognize as one of your “tribe.”  Not just when someone recognizes the obscure T-shirt you’re wearing, or when your TV-show ringtone turns someone’s head–but when you exchange a few words with someone and find that, somehow, you understand them, and they understand you.

An example of this is an interaction I had at work the other day (side note: “The Other Day” is another of my favorite expressions, a verbal [citation challenge] which nods to humans’ nonlinear, irregular perception of the passage of time–but I’ll write another post about that later). AS I WAS SAYING:

At the place I work, we are required to wear aprons (huge denim aprons which either look awful or adorable depending on whether or not you ask my girlfriend) and nametags. My name tag says my name, which is one of the most common names in the Western world (Michel, Miguel, Micky, Michael, Michelle, Mike, Mikael, Michal, Michele, etc.). I am ringing out a woman’s purchases when a man comes up, looks at my name tag, and addresses me.

“It means God-Like,” he says. “You know. The name Michael.”

I know what he means immediately. The conventional etymology of the name Michael is, originally, a question, posed by an angel to a devil: “Who is like God?” What the man is doing is interpreting the name without a question mark–a little conceit which I am sure many Michels have indulged over the years–changing the meaning from “Who is like God?” to: “[subject] who is like God.”

I smile, and I reply, “Yes. Quis ut deus, in the Latin, meaning ‘who is like God?’ It’s in the Bible.”

He points at me, and smiles back, and in that moment we understand a great many things about each other, all at once, with no words spoken. And then he leaves.

I see this happen a great deal with elderly women. They pass each other by, pause, and smile at one another. What are they thinking? I have no idea. I am not an old woman, and it’s highly unlikely that I will ever be one. I also see it with nerds. And I’m not just talking about people who watched Game Of Thrones. It’s the moment that happens when you ask someone “Who’s your favorite character?” and they reply with the correct answer: “Arya Stark.” You smile at one another. Perhaps you exchange words, but it’s not the words that are important–it’s the moment when you understand that here is a person whose values align with yours. Here is a validation of your beliefs, in front of you, in the flesh.

It’s akin to the feeling when you see a familiar face in a crowd of strangers, or find a friendly gesture amid hostility (or even amid indifference). The feeling when you make a connection that you could not have anticipated, but which touches some deep chord, and shakes you to the core.

What is it that makes this moment so powerful? It doesn’t just apply to interpersonal connections. I have had moments like this with a song. Or a physical object. Or an animal. A moment of discovery. A small-scale miracle. We discover outside, in the world, something which we had previously assumed existed only in our heart–a piece of soul–and we say, I know you. I have met you before. (TITLE DROP) We are fellow travelers, you and I. The same feeling is present, according to archetypal psychology, in an Anima-figure dream–a dream wherein we meet a mysterious individual (usually a young woman but not necessarily) with whom we connect, and converse, and are haunted after waking by the idea that we know their face…from somewhere. 

And like everything strange, everything mystical, everything in the world that I can’t quite explain, I find myself asking the same question:

What does it mean?

That’s all for now, readers.

I would like to apologize in advance for this post. This lecture was originally delivered to the empty air at my workplace, early in the morning before we opened, as I worked at restocking the toy department. In its original form, the lecture was a masterpiece—a gem of rhetorical brilliance which I know I will not soon match. However, the workday that followed wiped out all but the roughest memory of my eloquence, and so what remains for you now is a pale imitation of the communiqué which should rightfully have been displayed here.

But disclaimers notwithstanding:

This rant was inspired by a throwaway line in James Cameron’s Avatar, a line which I may have remembered entirely incorrectly as being: “Good science is good observation.” Whether or not any character spoke these words, the line became stuck in my head, and I couldn’t get it out without a ten-foot polemic.

It started me thinking (not surprisingly) about “theory” and observation.

“Theory” is a word I throw around a lot with some of my peers and mentors. We play fast and loose with it because we have a good sense of what “theory” is supposed to be. But when it comes time to explain “theory” for the very first time, to a wide-eyed audience (be they fifteen-year-old brothers or sleep-deprived undergraduates), the best metaphor I have so far found is the Theory as Lens.

Theory is like a pair of tinted glasses—or, more accurately, like the colored lenses in those glasses. It highlights certain shades of whatever it is you look at, and makes everything look somewhat alike. That lets us compare those things across something approaching the same dimension. For example, a theory of gravity lets us compare physical interactions across the same dimension—across a single, monochromatic dimension.

Now, there are issues with this metaphor—most prominently that this metaphor entails the idea that we are using theory to look at something. Really, a theory is an image of an object. The key points of the theory correspond to key points in the reality it represents—or, to put that another way: “That the elements of the picture are combined with one another in a definite way, represents that the things are so combined with one another.”  “Theory” is a representation of reality—so instead of glasses with colored lenses, briefly imagine a Polaroid with colored lenses. Isn’t that a weird image? This is why we went with the glasses thing.

But there are also real advantages to this metaphor, and one of my favorite points is this: If theory is like a lens, which highlights certain features of whatever we are looking at, then it becomes intuitive that looking at the same object with the same lens gives us no new information. To gain new information, we must make a change, either internally, in the way we approach the lens, or instrumentally, in the kind of lens we use. In other words, you can only learn so much by looking through one lens. Like looking at a multi-colored picture through a mono-color lens, the world has more information than can be parsed by a single theory. To put it in a punchy philosophical one-liner: Complex phenomena require a theoretical complex.

In the effort to investigate complicated situations, we have to use multiple theoretical perspectives. Jung employed “modern” psychology, Gnostic text, and echoes of the German Romantic tradition in pursuit of a theory of the soul. James Hillman, following him, employs Jung, comparative religion, and biographic methods while seeking the same goal. The classic French sociologists integrate philosophy, sociological theory, public statistics, and historical methods to investigate the patterns of organization and interaction between humans on the individual level and above.

So a theory is like a martial art—it’s good to master one, but you become Bruce Lee if you master all of them.

Okay, that was a weird way of putting it. More accurately:

Any one theory can be an extremely powerful way of representing events in the world. Theories can accentuate the shared factors in areas which might appear vastly different to “the naked eye,” letting us examine, for example, human silence and conspiracy on the level of friend groups and on the level of entire cultures.

But a single theory can only do so much work. And so the point becomes a little teleological—which theories you deploy (and how many) depends on what you want to do. For extremely basic physics calculations, Newton’s laws are good enough to get by. For higher-level work, you might also want to include theories on wind resistance, breaking points, aerodynamics, and even particle interactions. No one theory is going to get a rocket to the moon, and no one theoretical perspective is going to create a discipline. So for some tasks, a single theory will get you far. But for others…you need to get a little more creative.

This is just the beginning. More on theory and disciplinary boundaries will follow.

Stay tuned for more semi-weekly rants about theory, politics, and whatever action/sci-fi movie I was watching last night!

There are a whole host of things that interest me that are the subject of no great focus for academia.

Synchronicity is one of them—meaningful coincidence. A concept introduced by none other than MY MAIN MAN CARL, who was perhaps the king of weird not-quite-science psychology in the early 1900s. (The crown passed eventually to Hillman, who is also a total whackjob of a theorist and whose works I treasure.) I rather like synchronicity because it makes it possible to talk about many things which we all notice. The song that comes on the radio at just the right time (a song chosen by an intern we don’t know at a radio station miles away with no reference to our lives whatsoever). Finding the exact right job opportunity just when you decide to change your life. Meeting the perfect person at just the right time in just the right place.

It’s possible to speak of synchronicity in the context of other theoretical frameworks. Priming, for example, offers a comforting mechanistic explanation, viewing the human brain as a data-processing machine which utilizes different schemas at different times. Looking at it this way, we wear particular cognitive “lenses”, which highlight or downplay details in the world around us (this is how immature people can always pick out a “420” or “69” in a credit card or phone number). When we are “primed” for something (like a gun), we spot it more easily, remain open to it, and are affected by it more dramatically.

Going way over to the other side, we can also talk about Daoism. Like water, if we follow the flow of our life mindfully, without struggling or fighting, then that path will eventually take us to wherever we need to go. When we remain open to the universe, the universe responds—all we have to do is assume the proper place, and the Ten Thousand Things will arrange themselves around us.

(I’ll note here that I’m far from being a scholar of Daoism, and any resemblance between the paragraph above and the actual Daodejing is purely the result of happy mischance.)

Where do I go if I want to study that? Where do I go if I want to understand how it is that the “right moment” can be the one we wait for, or can be the one we make happen? Where do I study if I want to explore the many coincidences that make up each human life?  Which university can help me investigate the world around us, and the way that world impacts and is impacted by each person in it?

Sonder is a word I use occasionally—a neologism signifying the moment of realization that your own rich life is surrounded by many equally rich lives, most of which will touch yours only for an instant, only from a distance, as a passing car on the freeway or a plane crossing overhead—but all of which are as intense, complex, and laden with private thoughts and significations.

I turned to Anthropology when I was an undergrad. I thought that by studying the people who study human lives, I could develop a greater insight into human lives. Instead I learned a great deal about the study of human lives. Most of my lessons in humanity have come from extracurricular activities—from long nights spent in dormitory doorways, an endless string of stresses, or soft voices in the dark.

Now I’m finished with college. My (dearly bought) degree sits on the wall of my bedroom in a $12 frame, above my antique Don Quixote headboard and just to the right of my autographed Smaug (a treasure in its own right, one of the Hildebrandts’ great works). I’ve learned a lot about books—my head is full of authors, mostly French, mostly dead—and from them I can trace out most of the index of ideas in my brain.  But I want to learn more.

I don’t think it’s a kind of learning I can get from college. Undergraduate programs in this nation are undergoing rapid decay, with too many factors to be easily listed. The cost of education still rises, even as government programs are shoehorned into place. The “debate” over “political correctness” rages in the academic sphere. Sexual assault plagues universities across the world (but especially in the U.S.). The academic disciplines have turned inward, becoming trade schools for their own practice, teaching students how to teach more students the things that they were taught as students. What counts as academic practice is changing—there is a Right Way to study, and a Wrong Way, and if you choose the Wrong Way, well, how can you really Know anything meaningful? How can you “situate yourself in the literature?”

Don’t get me wrong—college is critical. I wish everyone had the opportunity to attend a university like the one I did. Even in its present condition, the process of going to school, learning a discipline, and graduating teaches you an immense amount about yourself. Many dedicated people work long (underpaid) hours to ensure the continuing quality of education.

I learned a lot. But I wonder now if ultimately, I learned what I wanted. And I don’t think so. I think learning to understand the world requires a different kind of study—and I want to undertake it. I want to know how it is that coincidences happen—not the simple statistics, but what it means. I want to know why gods of Reason are often sky-gods or sun-gods—what makes Reason the same as Light? Why do we think in analogies and metaphors? Why are we so radically different from one another—physically, emotionally, intellectually, culturally—even if we are so much the same? Why do people who map onto the same areas of the Myers-Briggs share similar facial structures? How is it we can get a “good feeling” off someone after we have only known them for two minutes? Why do patriarchal systems arise? Why does music exist?

The answers are not purely psychology, or philosophy, or anthropology, or sociology, or statistics, or biology.  But these are the old questions. As a teacher of mine (not just a professor, but a teacher) would say: “These are the big questions.”

I can’t lie; I like big questions. This puts me at an unfortunate impasse with most disciplines—when you practice analytic philosophy, or study depth psychology, in order to survive you must narrow your questions. The scope must shrink to one context, one time, one group, one place, one event.

Case studies are good—I don’t mind looking at one person at a time if I have to (though 7 billion is a lot to go through)—but the door has to swing both ways.

What point is there to the pursuit of the finite, if we cannot use it in contemplation of the Infinite?

Isn’t that the point? We are finite, the universe infinite. We use what tools we have to understand what we don’t know. We can never know the Infinite–it is, somewhat by definition, beyond our reach. But as anyone who’s been in love can tell you, it’s not necessary to fully know an infinity in order to understand it.

I’ll close with the only question I ever really ask–the question I mutter to myself under my breath, speak aloud to random strangers, the question that flits through my head at each new experience:

What does it mean?

Article: “What’s the Point of a Professor?” by Mark Bauerlein

http://www.nytimes.com/2015/05/10/opinion/sunday/whats-the-point-of-a-professor.html

Well, he’s not wrong. These facts are not false. I even agree with the final paragraph, although it seems to come out of nowhere with terrifying speed (the thing almost took my head off!). My quarrel with this, rather, is that (as one might expect from the man who wrote a book about us called “The Dumbest Generation”) he’s not looking at the big picture. Here’s where I’d make a joke about English majors, but the thing is, writers are supposed to be well-read. You have to know the world to write it well. And I haven’t read this guy’s writing, but he seems to be leaving something out.

Well, several somethings.

Actually, a near-infinite array of somethings, subtly interconnected and inter-relationally constitutive.

A lot of things. What I’m saying is THIS GUY IS MISSING A LOT OF THINGS.

AMONG THEM, how we were raised. Maybe, Professor Bauerlein, you went to college purely because you wanted to learn. Perhaps for you, developing relationships with professors was an entirely intellectual act, devoid of all ulterior motives. (I doubt either of these are the case; but it’s possible).  If so, I envy you, because I never had that luxury. I grew up in a population-dense, competition-heavy world, where to get any job worth having, you needed at least a college degree. To get a job that I wanted—talking about animals, ecosystems, human interaction, psychology, writing, etc.—I knew I’d need another degree after that, or else an extreme measure of luck or intelligence.

I grew up in late capitalism. I grew up knowing that, wherever I was going to want to go, there would be thirty other people there before me, most of them smarter and better qualified, and to keep up with them I would have to be perfect, a glittering diamond of 3.5s and recommendation letters.  I grew up knowing that, no matter how hard I looked myself, I was more likely to get a job from someone I knew personally, because today’s job market is insane, and I have been taught from an early age that if I don’t get mine quick, someone else is going to get it.

That’s hard to pass by. And while I have learned a great deal in college, and I have pushed myself to learn more, and I’ve built great and rewarding relationships with my professors—many students don’t or can’t do that. Professor Bauerlein points out with a trembling voice that As now constitute 43 percent of all grades. I applaud my peers, if that is true, because they deserve applause. I have seen the world they inhabit—a continual battle with anxiety and scheduling, trying to fit in five courses, all A’s, and work a job (because otherwise they can’t afford the tuition, even with the massive loans, because the price of education has skyrocketed; the cost of college increased by at least 20 percent for private schools, 40 percent for public, from 2011-2012), and maybe an internship, and oh, maybe they can find something to do for the summer, and maybe, if they’re lucky, spend time with friends.

I’ve seen that kill people.

Professor Bauerlein, what your piece seems to be missing, in my humble opinion, is an acknowledgement of the social conditions that have brought us to this place. The lack of relationship-building you describe is only part of the problem—and, frankly, something that is very low on the list of “Things that are Messed Up in the American Economy and Educational System.”  In other words (and again—not surprising from someone who wrote a book on how dumb we are), you don’t know what it’s like.

Further, while I don’t see a word of blame explicitly spoken, I don’t like your tone. “When College is more about career than ideas,” you delicately state. Your word choices throughout the article are a subtle, breathy indictment—oh, these children, they haven’t quite got the point, have they? Don’t they know what they’re doing wrong? Of course I understand, after all, ha ha, I was young once, but still—couldn’t someone show them the right way? “When paycheck matters more than wisdom”—do you know why we count a paycheck higher than wisdom? Do you know why Millennials are overwhelmingly concerned with their financial security? Why 63% of us don’t have a credit card? Why we’re so obsessed with our paycheck?

We saw the world burn, Professor.  For those of us who were old enough to see 9/11, that is almost a literal statement. We saw the recession hit hard, for our parents, for our friends, for our family.  We cut off the big, grand, expensive Christmases. We might not have lost our food, or our water, or our power—but we lost other things. The things you remember. Most of us have always known we were going into a world that is ready to financially dismember us. For most of us, it’s a rush to find a job that will pay you enough to stay afloat (good luck with that on the U.S. minimum wage!), until you can find a job more fulfilling, to make yourself secure, to do well, to be able to retire someday, maybe.

Many of us traded our childhood for the fantasy of financial freedom. Is it so surprising we want the world to deliver? And those 43% A grades I mentioned—I’m glad for that. Because the world expects us to be A students. There are so many of us, and we are coming fast, seeking the slim jobs that exist. We’re smart, and driven, and we travel in packs, wrapped up in 3.5s and 4.0s and flawless test scores, because that’s the minimum requirement. That’s what you need to even be considered—or at least, that’s what we were taught. We stress, slave, and cry over our GPAs. We need those As, the grades that professors naively claim are still pure representations of our skill. False. In our age of information and high educational requirements for employment, we come to colleges–pay colleges–for a commodity. That commodity is a good GPA and a Bachelor’s, and in many cases, our whole future is tied to that degree. Professors are no longer “just” teachers (if they ever were)–you are the gateway to our career. Our life.

So, Professor Bauerlein, I think you should read some history books. And some economics books. Since you’ve apparently written a book looking at generational change, might I suggest you try to explore the factors that play into it? The increasing globalization of our economies. The deep cuts to education and assaults on social support, a time-honored tradition dating back to Reagan. The rising cost of college (average costs around $30,000 per year for public schools, $40,000 per year for private schools). The depressed (recessed?) economy, increasing pollution, international controversy, social issues–We millennials didn’t just wake up some morning between 2002 and 2011 and say “I’m about to be a freshman—I think I’ll perniciously alter the face of college education!”

I agree with your last point—professors should be mentors. The system should be different. Certainly things will change once we millennials hit the top of the ladder. Professors should take it upon themselves to build relationships with their students. Perhaps, while you’re at it, you should talk to us about something other than our classes. (After all, “There’s so much more.”) Ask us if we’re worried about finding a job (hint: we are; you work in an English department, in an aesthetically bankrupt nation that places writing and the arts somewhere below “pizza delivery person” on the salary ladder). Ask us if we’re worried about our grades. Ask us if we’re losing sleep over the future.

And maybe, just maybe, ask us how you can help?