Consistency, anyone?

I don’t usually write about political topics, but a recurring theme has lately been… well, recurring, and I’m curious what others think.

I’m wondering why seemingly no one making political arguments is able to do so with any logical consistency.

It’s a meta-issue, really. I don’t really care whether you’re for the war or against it, pro-life or pro-choice, or whether your tea goes with “tax” or with “party.”

I’d just like you to have a little consistency when you launch your next barrage of political speech!

Is it really too much to ask?

Personally, in my second year as a law student, I’ve already become well trained at arguing all sides of an issue. (If you thought there were only two sides to an issue, there’s your first mistake.) But what I’ve also gotten good at (perhaps unfortunately) is identifying logical inconsistencies in arguments. And as those close to me would tell you if you could find them to ask, I tend to get more hung up on such inconsistencies than I do on the subject matter at hand.

Maybe that makes me weird.

Before going further, let me give you a few brief, concrete examples. In the political arena, neither side is inconsistency-free, so we can take our pick:

On the liberal side of the aisle, we could talk about affirmative action. Categorizations based on race are decried as bigoted, and it is woefully politically incorrect to speak of different groups of people being better at certain activities—or, to take a controversial example, differing crime rates in different racial or ethnic communities. Yet when it comes time to talk about hiring, firing, or educating (and let’s not forget about both private and federal aid), then it’s time to make decisions about people (or groups of people) based on race.

Little bit inconsistent?

Don’t worry; I won’t spare the conservatives. One inconsistency up their sleeve is their mixed relationship with government regulation. In talking to most conservatives, you’d quickly get the impression that the best place for government is far, far away. Regulation is bad; the government is only here to provide for the national defense and perhaps the interstate highway system, and it certainly shouldn’t tell us what we can and can’t do in our private lives. Oh—but it should regulate the behaviors we don’t like: abortion, pornography distribution, marijuana consumption, and other things that (mostly) only liberals do.

Bit of a double standard?

My point here is not really to pick on either side. I’m sure someone with more knowledge (or, at the very least, fervor) than me could explain why these points, which I see as inconsistent, are not really inconsistent at all.

But aren’t they?

The point becomes especially clear when you carefully examine the way someone addresses two entirely distinct issues. If you look closely, you’ll notice that while they may defend a position in one situation on the basis of some dearly-held principle, they’ll refuse to apply this same beloved principle in a separate but related context.

Take the two examples above, for instance. The deeply important principle in the first scenario is that it’s wrong to judge people on the basis of race. To me, that sounds like a pretty good principle. So why not apply it when it comes to benefits as well as burdens? Similarly, whether you agree with it or not, the principle at work in the second scenario is that the federal government should intrude as little as possible in the private lives and decisions of citizens. Again, this is a tenable principle. So why throw it out the window when it comes to particular personal decisions?

To me, when you say, “This is a really solid principle… except here, here, and here,” your principled stand is in fact somewhat shaky—or, to splice a metaphor, rings very hollow indeed.

I had a conversation with someone today about two separate issues occupying the news. One, of which most of the nation is by now aware, is the construction of an Islamic community center near the site of the 9/11 attacks. The other, less well known, relates to a very large amount of funding—roughly $650,000—accepted by Harvard’s sociology department in recent weeks. Harvard’s acceptance of the donation was controversial because the man in whose honor it was collected and donated (Mr. Martin Peretz) had made some very unsavory remarks about Muslims. If you’re unfamiliar with the latter situation, The Atlantic has an informative article on it here.

My interlocutor’s position was that it was wrong of Harvard to accept the money from those honoring the Muslim-basher and wrong for anyone to speak out against the construction of the Islamic community center in New York.

Digging deeper, I asked this individual whether he thought that the donors behind the New York project had the legal right to build the community center wherever they wanted. Unmistakably yes, he said. (I agreed.) Then I asked him whether Mr. Peretz had the legal right to say whatever he wanted about a particular group of people. Yes, he said—but with some reluctance.

I reminded him of a little thing we Americans like to call the First Amendment, and then he seemed more sure of his answer. Yes, he said, Mr. Peretz had the legal right to say whatever he wanted; that’s freedom of speech.

Just to round out the trilogy, I asked him whether Harvard had the legal right to accept the money. “Of course,” he said. “It’s a donation. There’s nothing illegal about it.” But again he seemed uncomfortable with his answer.

He explained that while Mr. Peretz had the legal right to disparage Muslims thanks to the First Amendment, it was neither tasteful nor prudent to do so. Similarly, he said, while Harvard certainly had the legal right to accept a donation collected in honor of Mr. Peretz, both prudence and good taste—not to mention respect for a large community of people—should have cautioned Harvard against doing so.

“Ah,” I said. “So let’s determine the operating principle here. You agree that even when someone has the full legal right to do something, there may be circumstances when they should defer to overwhelming concerns of prudence, good taste, and respect?”

“Yes,” he said.

“So what about our other situation—the one in New York? Is it possible that the Islamic community center, despite fully existing legal rights, should be built somewhere else due to just those concerns?”

“Well,” he said… [and in that hesitation all consistency was abandoned]

“… that’s different.”



The lanyard effect

Yesterday was freshman move-in day on the Harvard campus, and there could have been no doubt in anyone’s mind that this was the case.

There were all the usual signs. The queue of overloaded family vehicles might have been the first indicator, coiling all the way around Harvard Yard as though attempting to strangle buildings built centuries before automobiles were even thought possible and trees planted when green was just green, not “the new Crimson,” as Harvard is now so politically correct to point out.

The cars belong unmistakably to the parents of Harvard students. Nary a Chevrolet or Ford dares to show its face on move-in day; nearly all the queuing cars are Mercedes, BMWs, and Volvos, almost as if the vehicles are as ashamed of America as many of the students purport to be here. If a parking lot could aspire to the status of the final clubs which many of these eager young first-years will “punch” within a year’s time, Harvard Yard on move-in day would be it.

Then there are the parents. There are a variety of these—some dragging, others being dragged along. This, no doubt, is not much different from move-in day on any other college campus. Students have reached varying degrees of assertiveness by the time they are ready to begin college, while parents have (not always correspondingly) resigned themselves to their changed role with varying degrees of success. Some students are more than happy to have their parents drop them off on the curb, while others quiver at the first sound of the German motor purring away into the distance. The same is true of the parents who practically move in with their students for the first week or so of the experience. The only thing more odd than observing a student trying to run away from his parents on Day 5 of the orientation program is watching a student still happily sauntering everywhere with hers.

Meanwhile, in Harvard Square, businesses of every flavor switch into high gear. Banks set up tables with (apparently) enticing balloons, offering everything from free money in the checking account to a chance to win a trip to Maui or a fantastic new computer. The local Starbucks (plural) ramp up production to meet the needs of frazzled parents who would probably much rather be drinking somewhere (and something) else. Above all, the various competing Harvard tours hawk their wares at high volume, with very little regard for historical accuracy or, to put it more bluntly, the truth. Unsuspecting parents blithely drink in the massaged fiction offered by the well-instructed tour guides at the rate of “just” $25 a head, which also just happens to be the “low, low” amount needed to open a checking account at the local credit union.

And so we come full circle.

Above and beyond all of these familiar move-in signals, however, there are… The Lanyards.

Anyone who has been at Harvard for at least one school year knows that the surest sign of an incoming freshman is The Lanyard, a small cloth necklace used to hold the key to one’s room and intended (as freshmen apparently presume) to be worn around the neck. I suppose the logic is unassailable: if you clip your room key (and perhaps also your university ID) to your neck, you can’t lose them. (My own first year of college demonstrated otherwise, but of course no one ever listens to me…)

The Lanyard is just absolutely signature Harvard College Freshman. No one else would be caught dead in one, but the freshmen—all 1600 of them—traipse to and fro on campus, fully lanyard-ed and ready to do battle with any college dorm door as long as it happens to be their own. (Which, in a week when many of them are also being exposed to alcohol for the first time in their lives, is not always a safe assumption.)

To be fair, the freshmen have been warned. For example, the Harvard Crimson (motto: “Writing for a fairly mediocre college newspaper makes you a vastly superior human being, and by the way you don’t need to check facts, grammar, or other stuff too good either”) counsels as follows:

After you get your photo taken, an employee of the University will likely hand you a lanyard on which you can put your ID and room key. Don’t do this. It’s important not to lose these items, but at the ripe age of 18, one should be able to do so without the help of a collar. (“Freshman Week: Accepting Your Awkwardness,” 8-20-2009)

And for once, the Crimson is right. Sooner or later, all the freshmen figure it out. By October, some of them have already stopped wearing The Lanyard altogether, while others have covertly transitioned to a beltloop-and-pocket arrangement. By the time everyone returns from winter break, there are few lanyards to be seen, and these veteran freshmen are now one step away from mocking incoming newbies for the same flagrant fashion offense the following autumn.

What interests me about all of this is the mob mentality it evinces. At one point early in the academic year, it’s cool to wear The Lanyard. Everyone is doing it, and no one wants to be the odd one out. Much later in the year, the cool thing to do is not to wear The Lanyard, but to make fun of those who do. In some sense, it’s almost a rite of passage.

I tried to think back to my own first year at Harvard College, but I don’t really recall for how long I wore The Lanyard. Certainly, there is no photographic evidence of my having ever done so, but that may be because I made sure to destroy it all. Honestly, though, I don’t think it was for more than a week. I just didn’t like how it looked.

At the same time, though, the story didn’t end there. At some point when I was packing up at the end of that first year, I must have saved my lanyard, because the other day when I was unpacking—now as a resident tutor in charge of students who have already made it through their first year, and beyond—I came across it once again. If nothing else, it was a friendly reminder that I was once younger and (much) more naïve than I am today.

Of course, I now lose my keys quite a bit more, too.


If these walls could talk

Moving into a new place is always a complicated experience. Between the frenzy of moving and lifting and shifting boxes, the complex questions of furniture placement, and the frayed exhaustion that begins to set in at the mere thought of the sheer amount of work to be done, it’s not an experience I ever enjoy. And I hate packing—probably because everything is such a mess for so long before everything is finally packed (or unpacked), but maybe because I just don’t like too much change.

Still, there’s something exciting about the whole process. Whether it’s an office, a classroom, or an apartment, moving to a new place is an opportunity to make it yours—to put your own personal touch on a space, large or small, that was formerly generic or (worse) belonged to someone else.

Then again, while it can be rewarding to make your mark in a new place, it often turns out that a prior owner has left marks of his or her own—some of them indelible.

Already in the past few days, I have made a few interesting discoveries in my own new home. For instance, in the sitting room, there is a very long, very deep gash in the wood floor. I’m really not sure what could have done it, but it looks dark and old—as though it has been varnished and revarnished since sometime during the Eisenhower administration. Perhaps someone wearing particularly penetrating heels engaged in a seriously hazardous dance move in the midst of a debaucherous party fifty years ago. Or perhaps someone simply dragged a heavy trunk with sharp metal corners across the floor.

Then, in the bathroom, there are some curious marks near the toilet. Never fear—no lewd or unsavory remarks here. It’s just that the toilet is quite close to the facing wall, and at roughly knee level (seated, of course) there are a number of what appear to be stray marks from highlighters and pens. It must be that someone holed up and did a great deal of studying there. And what can I say? I suppose we’ve all had days like that.

Whenever you find yourself in a space that was previously used, take the time to look for subtle physical evidence of the past. Even leftover nails in the wall can make you wonder what hung there before. And while many physical marks may be fairly easily interpreted, some are more difficult to fathom.

For example, take the Q-tip I found this morning, wedged into a random corner of the bathroom window, inside the window frame. I have no earthly idea how that got there—or why. Did previous residents engage in a furious Q-tip-throwing extravaganza and one of them just happened to land inside the windowsill? Perhaps I’ll find more in random places throughout the year. Or perhaps a previous resident was just extraordinarily messy and things were always turning up in odd places. I’m sure there’s no extremely interesting reason for the Q-tip’s sudden appearance on my windowsill this morning, but imagining is always fun. Who knows? Maybe one day it was just raining cats and Q-tips.

If these walls could talk, who knows what kind of stories they might tell?

Given the unfortunate fact that the walls where I live happen to be very thin, I’m already anticipating that they’ll do more talking than I’d like. I expect that I’ll end up learning quite a bit about my neighbors just by living next door to them. But that’s the present; I’m talking about the past.

Living in a fairly old building as I do (and long have), it’s almost impossible to escape the desire to know what went on in here before I arrived. The students who previously inhabited my college dorm rooms went on to all walks of life. Some vanished into obscurity, while some became household names. When I was a first-year college student, one of my good friends lived in a room formerly occupied by a young Robert Lincoln, son of the president.

Of course, even at that age, Lincoln was important because of his father, but Thoreau and Emerson, who lived in the next building over, were not. Those were men who made a name for themselves later, and the places they lived did not become noteworthy until long after they had left.

This is always a great curiosity to me: how can something acquire significance based on subsequent events that occurred entirely elsewhere? Why do music enthusiasts flock frenziedly to the suite of rooms in which Mozart was born? For so many years, there was nothing special about that room, but renewed appreciation for Mozart’s work made it so.

Meanwhile, the Upper West Side apartment where Barack Obama lived while a student at Columbia University was going for a cool $1,900 per month as of June 2010—really not a bad price for a one-bedroom in New York. (Maybe the rent will rise or fall with Mr. Obama’s job approval rating, though only time will tell.)

I suppose this is ultimately a complex philosophical question of why cultures ascribe importance to certain sites and locations based on historical developments, and the same could be (and has been) wondered about art. The average price of a toothbrush might be $1.49, but that number would quickly skyrocket if we were to learn that the toothbrush belonged to Picasso. And yet the object remains the same. So, too, with a painting: until it is determined to be the work of “someone,” it is worth little, and art forgery remains a serious crime. Oddly, while nothing about the painting changes other than its history, the documented touch of the master drastically changes its worth.

And so, before I launch myself into the stratosphere of meandering musings, I’ll leave you with this: while few of us will leave our marks on history to the extent of a Lincoln or a Picasso, all of us leave traces of our presence everywhere we go. I may not know who lived here before me, but I know that someone did.

In light of this, ask yourself: if these walls really could talk, what would they say about you?


How it ends

What if we knew how things would end before they began?

What if every time we opened a book, we started with the last page? Or if every time we ordered something from the menu, we knew how it would taste?

Would we want that?

The last line of a song by Devotchka has been bouncing around in my head. The song is How It Ends, and the last line is, “Yeah, you already know how this will end.” As soon as these words pass, of course, the song has ended, leaving you to wonder whether they refer reflexively (and perhaps ironically) to the song itself, or to some outside situation as narrow as a particular story or as broad as life itself.

It got me to thinking (as I suppose is apparent from the fact that I’m writing about it here) that we probably wouldn’t want to live in a world in which endings were known and outcomes were assured. Much of the unduplicable zest of life comes precisely from wondering just what the food will taste like or how the story will be resolved.

Well, you say, these are only trivial examples. Of course we want to be surprised by fiction or food, but not by the big things—relationships, home purchases, our investments, and the like. These are, perhaps, areas in which we wish to lock down the results and proceed towards a fixed outcome.

Wouldn’t it be nice to know which stocks would pay off, which home would triple in value, and which man or woman was the one with whom a first date would blossom into a solid and loving marriage? If that were the case, we would all certainly make much better choices…

… wouldn’t we?

Of course, many if not most industries trade on some degree of risk. Many institutions would simply cease to exist if outcomes were known in advance. Consider the stock market, all forms of betting and gambling, and the accompanying sports and other competitions of skill, just to name a few. Demand would either skyrocket or plummet given the known future value of a commodity, while the insurance industry would either collapse or become even more extortionate than it already is. The dating industry, by comparison, would suffer a major setback, as people would need only one date to determine whether this one was the one—and, by necessary implication, divorce lawyers would find themselves entirely out of a job.

Oh, there might be some positive side effects. Government programs might be scrapped before—rather than after—they had wasted millions (if not billions) of taxpayer dollars on unachievable goals. Doomed aircraft or vessels or vehicles would never take off, set sail, or leave the driveway, and countless hearts would never be broken.

Yet for all that we may sometimes express a desire for a world filled with certainty rather than risk, I would suggest that we do not really want to know in advance the outcome of every venture, the success of every march. In fact, we don’t even live as though this is our wish—nor should we.

Despite all the uncertainty of beginning any great undertaking, every year millions of people enroll in college, change careers, move to a new place, and (yes) even get married. No one ever knows how these decisions will end or what the future will bring as a result of making such choices, but we—all of us—make them just the same.

Why do we do that?

It’s not because we’re gluttons for punishment. No one would ever willingly sign up for a college dropout or an unsatisfying new job or an unwelcoming new home or an empty marriage. That’s simply not how we’re wired.

Nor is it because we really enjoy uncertainty. All sorts of human behaviors demonstrate that we attempt to minimize uncertainty whenever possible. We invest. We buy insurance for everything from plane tickets to our own lives. We fireproof our homes, waterproof our attire when we go out in the rain, and (try to) injury-proof our children before they ride their bikes. No, we don’t like uncertainty.

And yet we keep opting for it—not because it’s something we enjoy, but because the negative feeling of uncertainty is outweighed by a much more powerful positive feeling: hope.

We take big leaps like relocation and marriage because we have hope: hope that the grass will be greener, hope that investment will pay off, hope that promises will be kept.

Are we stupid to do this, in a world that keeps demonstrating the apparent futility of hope? In a world where jobs often turn out to be failures, homes often turn out to be money pits, and marriages are often shattered by divorce?

I’m not sure.

But what I am sure of is this: while we may sometimes think we want to live in a world without uncertainty, we definitely do not want to live in a world without hope.

And that’s exactly what would happen if outcomes were always certain and secure. There would be no hope because there would be no reason for it. Why hope for an outcome you already know will occur? Why hope for anything when the result is always known?

Yes, the leaps we take and the plunges we make often end without success. And in the process, we often get hurt. Sometimes, the pain is enough to make us feel like complete and utter fools for ever daring to hope at all.

But we do it because we enjoy that taste—the taste of hope. We take risks, get messy, and even make fools of ourselves because the experience of hope is delicious, tantalizing, and so incredibly human. Indeed, I really think we wouldn’t be human without it.

We do not already know how it ends, nor do we wish to. The constant possibility of renewed hope makes even the darkest mistakes bearable—and, in the process, mercifully makes the future ever more attractive than the past.


Return to me

In the 2000 film Return to Me, a loving husband loses his wife in a freak accident and later falls in love with the woman to whom his wife donated her heart. When the entire film is summarized that way, it sounds like an extraordinary concept; my friends in medical school would no doubt be quick to explain that the heart is an organ that pumps blood, not a center of feelings or the instrument of love.

Yet we have long associated the heart with love, perhaps because it is the heart that gives life to the body just as love gives life to the soul. Indeed, we’ve come a long way since the Middle Ages, when many believed that the source of our feelings was the spleen. (Thus the still-used phrase, “venting one’s spleen.”)

Of course, the heart is also featured in several familiar phrases, and the one I have in mind today is “home is where the heart is.” This phrase, like so many others, refers not to the organ that pumps your blood (after all, we would hope that would be wherever you are), but to love itself. Home is often where your love is, rather than where you are.

I have long had two homes and thus, I suppose, a divided heart. On the one hand, there has always been Milwaukee, home of my family, the church of my growing years, and many of my long-time friends. On the other hand, there is Boston—a more recent development, where I’ve formed many new friendships and where I’ll ultimately spend a total of seven years in school.

This being the case, I’ve spent several years simultaneously enjoying the home where I am and missing the home where I’m not. And no matter which place I find myself returning to on a given flight, I always feel like I’m going home. No doubt many long-time students feel the same way.

But in the relatively short time since I last wrote, much has changed. And most of it is due to the state of my heart.

This applies as much to writing as anything else. My original intent in beginning this blog was complex, but much of it was related to the confusion and need for direction I was feeling at the time. At this time last year, a long-term relationship had ended—certainly for the best, as I see now, but this was difficult to grasp then. Last autumn was characterized by a string of bad dates and the beginning of law school, a place I wasn’t sure I wanted to be given the now-changed circumstances surrounding my initial decision to go there. By the time the new year rolled around, I realized that I needed structure and a healthy outlet, and writing was to be the key.

So the writing went along just fine until something more fulfilling came along. By February, just when I had given up on romance (certainly not forever, but for the time being), someone new came along. This someone new was in fact an old friend—indeed, someone with whom I had always been “just” friends, and I never had any reason to expect that this would change.

(Hope? Maybe. But expect? Not a chance… or so I thought.)

Looking back on it now, the whole experience would make for a movie script that (in my humble opinion) would outsell even Return to Me: a romantic weekend enjoying the dizzying opulence that only Manhattan can offer, a first date amidst the flurries of a real New England snowstorm, and a Valentine’s Day that coincided (perhaps not coincidentally) with the entire trip. By the time that week in February had ended, it was clear to me that this would not be like any other journey I had taken.

And suddenly, writing seemed a whole lot less interesting.

Oh, I carried on, nearly making it through the end of April, but by then writing had been entirely replaced by another distraction—which in fact was no distraction at all, but my very waking and sleeping and life and breath.

Am I a complete sap? You bet.

But I’m not ashamed of it, not a bit—and I don’t care who knows it.

Especially now that we’re engaged. (!)

You see, by the end of the spring, this had become my entire aim and ambition for the summer: to ask the woman I love to be my wife. All else became secondary, and (needless to say) the writing took a hit. So did several other things, like sleeping enough and staying in touch with my friends.

So I hope you’ll all forgive me.

Because if you knew the woman I’m going to marry, you would completely understand.

She is distilled sunlight, radiating brilliance and warmth. Her smile would melt ice, and has certainly melted my heart. (No comparisons, please.) My astonishment at her incredible kindness and generosity borders on disbelief—and yet every day offers more proof that such an angel does exist. She is loyal and honest, full of good sense and patient understanding, and she is fiercely loving. She is a true honor to her parents and a joy to all who know her exuberant spirit. She is all of this, and she is by far the best thing that has ever happened to me.

Sadly, she is also back in Milwaukee, the home I left today for my other home, where I’ll spend the year preparing a place for her to come after our wedding day.

Leaving was incredibly hard today, and Boston has never felt less like home. The wet and windy weather with which it greeted me seemed to match the current state of my heart. The upcoming months will be a challenge, and both of us know that full well.

Yes, now more than ever before, home is where my heart is, and my heart is not here. But soon enough, home—and my heart—will return to me.

And in the meantime, I’ll be writing again—this time, not because I’m searching, but because I’m missing what I’ve found.


That’s brusCHetta… with a K

The titular primo piatto

It’s been said that we live in an age of hyper-correctness.

These days it seems that society can’t get out of bed in the morning without its daily dose of political correctness, and we all tiptoe around in an acrobatic attempt to avoid offending groups of people we have decided are worthy of special protection. We are all very careful to refer to these groups using the most up-to-date words, and any failure to do so results in scandalized opprobrium from those who are most fully “in the know.” When someone gets it wrong, it’s never long before someone else makes it all too clear.

Because we spend so much time making sure to be correct in this political realm, our inflated sense of correctness bleeds into other areas, as well. For instance, I now hear a number of grammatical mistakes that result from people trying too hard to get it right. Here are a few:

This is a gift from your mother and I.

Actually, this is a gift from your mother and me. Including yourself in a phrase joined by ‘and’ does not require you to refer to yourself as ‘I.’ In fact, when you appear as the object of a preposition, the correct pronoun is almost always ‘me.’ However, because it’s simpler to teach children never to say “and me” at all (as the subject of a sentence, after all, it’s usually wrong), hyper-correctness applies the rule unnecessarily here.

It was such a nice day that I decided to lay out in the sun.

Here, because both ‘lie’ and ‘lay’ are irregular verbs, many people opt for whichever seems more irregular because, I suppose, it seems more likely to be correct. When in doubt, in other words, go with the stranger of the two—knowing English grammar, it’s probably the right choice. But that would be the wrong conclusion to draw here. In fact, ‘lay’ in the present tense needs a direct object: you can lay an egg (well, a hen can), ‘lay out’ funds, or ‘lay into’ someone. Without an object, though, ‘lay’ only works as the preterite of ‘lie’—i.e., while today I will lie out in the sun, yesterday I lay inside.

If you have any questions after the lecture, please speak to Professor Franklin or myself.

This one is hyper-correctness at its worst. There is no earthly reason to use ‘myself’ in this sentence, but because it sounds more formal (even more legalistic, perhaps), people seem to think it’s the right way to go. Unfortunately, it results in sheer grammatical ridiculousness, since ‘myself’ (along with every other pronoun ending in -self) is reflexive and therefore applies only to actions one does to or with oneself (or serves as an intensifier, as in ‘I myself’). You, or your neighbor, or even the Queen of England cannot talk to myself; only I can do that. And you can talk to yourself, and she can talk to herself, and so on. In this sentence, the speaker should simply have said, “please speak to Professor Franklin or me.”

Interestingly, however, while hyper-correctness has led us to substitute ‘myself’ for ‘me’ and ‘vertically challenged’ for ‘short,’ it has not yet managed to make people pronounce common foreign terms correctly. (You knew I had to get to bruschetta eventually.)

The other day, I was going about my business on a sunny Saturday afternoon. It had already begun to feel like summer, and I had gone with a certain someone to enjoy a glass of wine and an appetizer on the patio of a small restaurant outside of town. The breeze was blowing, there were several white wines on the menu, and it was (as I’m sure I don’t even need to tell you) the perfect day for bruschetta.

Now, after learning the secret from a good friend in Boston, I make a pretty excellent bruschetta myself. It’s one of the simplest appetizers in the world and requires no cooking, and I’ve found very few that I prefer to my own. But from time to time, when it’s on the menu and the mood strikes, I like to see what other places do with the simple dish.

And just to get this out there, the word ‘bruschetta’ is absolutely, positively pronounced with a ‘K’ sound in the middle. The ‘sch,’ in other words, is hard—just as in ‘paschal,’ ‘scherzo,’ or ‘scheme.’ Written phonetically, it is broo-SKET-tah, in case anyone wants to know.

Apparently, our server didn’t want to know. Here’s where the hyper-correctness comes in: as soon as I had ordered the appetizer, he came right back with a modification. “Oh, you mean brussssssssshhhhhetta?” he replied.

Now, because I happened to be with someone whose company I esteem very highly, I decided to let it go. After all, what would be served by re-correcting (or perhaps un-correcting) my server? Oh, perhaps a small but substantial vindication of the entire Italian language, but aside from that…

At the same time, though, it really irked me that I, who knew exactly how the word ought to be pronounced, was corrected by someone who clearly did not. I have a hard enough time handling correction when I’m in the wrong, much less when I know I’m in the right! And yet I smiled and swallowed my words, together with a very large mouthful of that-appetizer-which-shall-not-be-named.

Honestly, I had half a mind to ask him if he could bring over some Eye-talian dressing to serve on the side.

Of course, I realize that I’m making an Apennine out of an appetizer here. But it bothers me that some self-important server is running around somewhere in a small town in Wisconsin, instructing well-meaning patrons in how to butcher the Italian language and mispronounce their words. Isn’t there enough error in the world already, without him creating more?

I don’t think I’m free from error. I’m sure that when I place my order at a French restaurant, I make all sorts of mistakes with regard to my entrée. But if that’s the case, I would expect to be corrected. That’s what should be done when someone is wrong.

If you ask me, though, there’s far too much correcting going on when someone is right—and that applies as much to what we call political “correctness” as it does to anything else.



Elbow grease?

When was the last time you cleaned your toilet?

I once got into an extended discussion with a close friend regarding the benefits of manual labor. Specifically, we were debating the implications of paying someone to do cleaning you don’t wish to do yourself, particularly in the bathroom. The job we settled on as the point of reference for our discussion was cleaning the toilet.

If you know me well, you won’t be surprised to learn that I was arguing in favor of paying someone to do the job, while my friend argued that doing so would reflect something distasteful about me as a person. I couldn’t for the life of me understand why.

I still recall where we happened to be at the time: standing on San Francisco’s Pier 39 last June, looking out at Alcatraz Island with sea lions barking noisily in the foreground. (The sea lions, by the way, mysteriously disappeared in December and did not begin returning until late February. They are now back en masse, but for a while I was thinking that perhaps our extended conversation drove them away—allowing six months or so for the content to penetrate, of course.)

In any case, my friend’s point was that there are certain jobs that build character because of their menial nature. Cleaning the toilet, scrubbing the floor, and washing the dishes are all tasks that require both a strong work ethic and humility. Not only do you have to put your back into such work—i.e., you actually exert physical effort and may even work up a sweat—but you are also reminded by doing so that you are not above these tasks. Thus, they keep you humble; they remind you that you are human.

While there is a great deal to be said for this approach, I think I made a few good points of my own. First of all, while many people may be born into the sort of privilege that allows them to pay for the completion of their housework, many more achieve this ability as the result of their own hard work. In my case, for instance, if I’m ever in a position to hire some household help, I don’t think I’ll ever forget the summers I spent cutting grass and cleaning gutters, or the winters I spent shoveling snow. In fact, having had those experiences would only make me appreciate my roots even more, reminding me of a time when I was the one being hired that way.

In addition, I think there is some work that just makes more sense for me to do. I’m not talking white-collar vs. blue-collar jobs here. I’m talking about the old economic notion of comparative advantage. If I’m better than someone else at writing a legal memo, and that person is better than me at fixing broken plumbing, then I should write the memo and he should fix the plumbing—even if he could, with some effort, write the memo, and I could, with considerably more effort, fix the plumbing. In my mind, the same goes for cleaning toilets and scrubbing floors.

Yes, said my friend, but then eventually you grow detached from your humanity. You forget that you can do things with your hands. You forget that you can make things happen using physical power rather than purchasing power. You constantly delegate “lower” tasks to someone else until you begin to think that not only the tasks, but also the people who perform them, are beneath you. Over time, you develop the very superiority complex that doing your own labor serves to prevent.

Well, I don’t know about that. I came back with a number of arguments of my own (among them, the idea that a service industry sustained by people who can pay for it creates jobs for people with limited skills), but I couldn’t help but notice that I was coming across as a bit elitist—exactly as my friend was subtly characterizing me.

In the end, though, all of this talking in circles cannot compete with the reality of the labor itself—as I saw today, scrubbing my floors at home.

Yes, that’s right: after all of my high-flown words at this time last year, there I was, on my hands and knees, scrubbing the family linoleum with a good old brush and bucket and thinking back to our conversation on Pier 39.

And you know what? I decided that I was right.

Oh, I tried my hardest to see it from my friend’s point of view. With shaking knees and tired arms, I reminded myself that this was being human, that I wasn’t above this sort of work, and that physical labor has a virtue unto itself.

That lasted for all of three minutes.

By the time I was done with all of the floors, I was thoroughly convinced that Merry Maids and I would be developing a very close relationship in the coming years. If I ever want to be reminded that I can do such work myself, I can always pick up a brush or mop and give it a go. (Or perhaps I can just look to my chiropractor bills as a still more vivid reminder.) But in general, if I can afford to pay for it, and someone is more willing to do it than I am, why on earth would we not take each other up on the deal made possible by his willingness and my ability to pay?

You see, I am a firm believer in the free market economy and the potential gains from trade, and I think this is a perfect example of those principles. To suggest that there is something morally superior about cleaning your own toilet seems at best inefficient and at worst supercilious, and I don’t think that all the elbow grease in the world could convince me that allowing someone to do me a service is in fact doing a disservice to myself.

Perhaps I’m wrong, but (as usual) I think not.

If you’d like to come and convince me, however, I would only ask that you take off your shoes before walking across the floor.

