
Thursday, November 29, 2012

An inquiry into bananas


The other day I had to go for a ride with my parents (whom I’m visiting) to see my grandfather. It was sort of sudden, and we were leaving at about the time I usually eat lunch. Unable to prepare a genuine meal, I grabbed a few bites of snacks and leftovers, and then I searched for something to take with me on the road. I grabbed a banana, and began eating it as we pulled out of the driveway.


It was a singularly disappointing experience. 

The problem with bananas is, quite simply, that they suck. They are the most low-rent, low-class, low-character fruit known to man, yet out of all the fruits in existence I have probably, in my life, eaten more bananas than any other. 

Bananas suck, but they’re ubiquitous. In homes, in school lunchrooms, in continental breakfasts, in fruity fruit salads—bananas. Whenever a junk-food emporium gas station decides to throw a sop to healthy eating, you can be sure they’ll do it with a wicker basket full of bananas (and a few withered oranges and waxy Red Delicious apples). 

We live in a very bananocentric culture. There’s no escaping the tropical yellow phallus. 

When I ate the banana the other day, I had to sit back for a little while and just look at the thing, wondering what the hell I or anyone else could possibly be thinking when we make the decision to eat one. The thing is, I’ve never cared for bananas that much, and yet I used to buy a full bunch every week, as if it were my duty to eat them. I would walk through the produce section and say, “Duh, better get da bananas,” and then spend the rest of the week forcing myself to choke them down. 

This went on for years. I lost the habit a while back, I guess, though it happened without any deliberate decision on my part. This recent experience, however, only confirmed the distance that had grown between the banana and me.

After all, the banana is really just the donut of the fruit world. It turns into a starchy, sickly sweet, mucusy mess in the mouth, like pureed squash mixed with honey and snot. Where other fruits leave my stomach feeling cleansed and refreshed, a banana in my body feels like a chemically reactive marshmallow ballooning up to the size of a small pillow. I spend at least ten minutes feeling bloated and unhappy, and my entire olfactory cavity remains stained with the scent of bananas for the better part of an hour. 

I’m sick of banana apologists. I’ve heard all the reasons for the banana’s supposed greatness: it’s cheap, it doesn’t need to be washed, it comes with a built-in handle. Whatever. These are not excuses for a shitty fruit. It’s time for the banana to go back where it came from.

Saturday, November 17, 2012

Waste and time


The EPA found in 2007 that the United States flushes 4.8 billion gallons of water down its toilets every day. I mention that not to shame you into doing your business in a hole in the backyard from now on, but only because the other day I had cause to reflect on toilet water a bit. 

I was shopping at the local food co-op and felt the need to relieve myself. This co-op is in a pretty new facility, and it’s replete with eco-friendly bells and whistles. Solar panels, native species planted around the parking lot—that kind of thing. It also has in the bathroom stalls those fancy toilets with separate buttons for “solid waste” and “liquid waste.”

These toilets always bring out the righteously embarrassed Victorian in me. A gentleman’s excretions should be his own private business, should they not? How can we tolerate the barbarous humiliation of being asked to indicate the “solidity” or “fluidity” of our leavings? It’s almost more than a civilized man can bear. 

I suppose we should only be thankful they extend us the courtesy of euphemizing the substances as “solid waste” and “liquid waste.” Were they to label the buttons just “urine” and “feces,” the indignity might be so severe I would simply have to hold my function until such time as I could perform it in the privacy of my own home.  

I was in an unusually observant mood the other day, though, and I noticed that I didn’t actually need to describe my wastes to the toilet machine at all. Reading the fine print below the buttons, I learned that a sensor in the toilet could make the decision as to “solid waste” or “liquid waste” for me. Surprising in itself, but even more surprising is how it makes that decision. The sensor determines the proper flush volume, I read, “based on the amount of time spent in the stall.”

I had to think about that for a while. Someone working for a toilet manufacturer (or a toilet-sensor manufacturer—they could be totally different industries) had to settle on a cut-off time between urination and defecation. Who was that guy, and how did he do it, and what time did he decide on? What kind of statistical analysis goes into determining the average times for human waste elimination?

It must have been a hell of a job to research. Did they place stopwatches and video cameras in bathroom stalls across America, or did they bring in test subjects for a series of blind clinical trials? How careful were they to get a representative sample of the population?

It’s just that I have to imagine there’s sometimes overlap between the time it takes to pee and the time it takes to poop. If, for example, you imagine a really slow urinator—say, an old guy with a narrow urethra and a large bladder and an uncircumcised penis—and you imagine a really fast defecator—a fit young guy who gets plenty of fiber and just had a cup of coffee—you have to figure that occasionally the latter finishes before the former.

These are boundary cases, sure, but the toilet scientists must have accounted for them with some kind of statistical weight, right? Did they have heated arguments about the relevance or irrelevance of urinary outliers? Maybe after months of debate they just said, “Fuck it. Anything longer than a minute thirty-five gets the poop flush.” So there may be fast poopers out there routinely finding their creations met by a mere trickle insufficient to the task, while slow pissers find theirs met by a roaring deluge capable of sucking down a bag of rocks.

And once they decided on an upper time limit for urination, did the toilet engineers consider an upper bound for what we could reasonably assume to be healthy defecation? Surely there’s a time beyond which we would have to start worrying. If someone’s on that toilet for longer than half an hour, does the toilet sensor automatically call an ambulance? Seems like it should.

These thoughts all ran through my head as I stood there eliminating that morning’s beverages, and probably dribbling some of them on the floor in my absentmindedness. I reached the conclusion that designing environmentally responsible toilet solutions for the public may be one of the hardest jobs in America. Those guys must really care about saving those few extra gallons of water.


Monday, November 12, 2012

Come restaurant with me


A mastery of parallel constructions is one of the skills that separates the careful writer from the slapdash. Our world is rife with false parallelisms: “As a teacher, I will need to possess patience, warmth, respect, and be a good listener.” Ugh.

Airports are pretty reliable places to observe bad writing. I had to spend about 83 minutes of my life in one the other day, and, sure enough, I saw a false parallelism that made my skin crawl for a good five minutes or so. It was part of the slogan (if you can call it that) of the restaurant near my gate. Here it is:

[Image: the Scotty Browns sign, with the words “restaurant · lounge · socialize” beneath the name]

Now let’s take a look at those words beneath the restaurant’s name and try to insert them into some kind of grammatically parallel structure. It won’t work, but what’s interesting about this false parallelism is that it is palindromic in its falseness. That is, the elements are parallel in both directions, excluding the final element. Observe:
“Come to Scotty Browns. It is a restaurant. It is a lounge. It is a socialize.”
“Come to Scotty Browns. You can socialize. You can lounge. You can restaurant.”
With the middle element being equally plausible as a noun and a verb, we can’t be quite sure which element actually makes the parallelism false. Maybe lounge is meant as a verb, in which case the noun restaurant is the false element. If lounge is a noun, then socialize is the offender.

I found this rather confounding as I sat staring at the sign with nothing else to occupy my thoughts. (I was by myself, of course. Maybe I should have gone to Scotty Browns and enjoyed the socialize they have there.) I began to wonder if I wasn’t just observing the bleeding edge of language change; maybe restaurant is expanding to become not only a noun but also a verb. It’s happened before with contact and access, so why not with restaurant?

Maybe in the year 2013 we’ll say to our friends, “Hey, if you’re not busy Friday, why don’t we restaurant?” Our social communication could be revolutionized. Here are some other possibilities:
“I’m pretty broke, so I can’t restaurant much for a while.”
“I love New York; the restauranting is divine!”
“So this totally creepy guy tried to restaurant me the other night, and I had to lie that I had plans with my sister just to get him to leave me alone.”
If it’s the case that in ten years we’ll all go “restauranting” without giving the term a second thought, then I guess I have to commend the proprietors of Scotty Browns for bringing linguistic innovation right into the mainstream. They’ve shown the masses how to be flexible and creative with words. Usually this is the kind of thing that only poets do. 

Monday, November 5, 2012

Half and half


Is the glass half-empty or is the glass half-full? I guess it depends on what we’re talking about here. Are we thirsty or are we not? That could definitely influence our interpretation of this glass and its contents. If we’re not thirsty, then I guess the question is meaningless. We’re indifferent. The glass might as well not exist, for all we care.

But if we are thirsty, then surely we regard the glass as half-empty, because we’re disappointed to have so little water to slake our thirst. Unless we’re not really that thirsty, in which case the half-capacity glass provides just the amount we need, so we consider it half-full. Or unless we’re desperately thirsty, in which case we’re glad to have anything at all, and we would again consider it half-full. 

But how large is this glass? I think that matters, doesn’t it? Because if it’s a tiny juice glass, then it might not be satisfying even when filled to the brim. And if it’s a big tumbler, then just a quarter of its volume might be enough to satisfy us. I think I’m entitled to know what kind of glass I’m looking at. Does anyone have figures on the average size of a drinking glass in America? 

Is it truly sufficient for the purposes of this question, though, to only consider glasses in America? Other cultures may drink out of considerably larger or considerably smaller glasses. I don’t feel I’m prepared to answer this question without being given the time and the means to explore the world and observe more drinking glasses. 

(Is it weird that I always imagine this glass being filled with water? It makes sense that it would be, because water is the one beverage we all need to survive, but maybe other people are imagining something else. Does the glass’s being filled with juice or beer or wine affect how one perceives its fullness or emptiness? A baby would want the glass to be filled with breast milk, after all. Gotta remember the babies.)

I feel like I need to be told what the water in the glass is even meant for. Is it there to be drunk? Maybe whoever poured the water in the glass intended to use it to water a small plant. In that case, it’s probably better to have it at half capacity, so it won’t run down the side and make a mess while being poured out. So you’d have to call that half-full.

And mustn’t we also consider how the water level came to be at the halfway mark to begin with? If someone filled the glass to the brim and then drank half of the water, leaving half of it for me, I couldn’t help but regard that as half-empty. It has been emptied by half. But suppose someone takes an empty glass and carefully pours water to exactly the halfway mark. They have taken nothing and replaced it with something. Half-full.

But, God, if I just happen upon this glass without knowing who poured the water and when, isn’t it awfully presumptuous of me to begin projecting my views and my needs onto it? I mean, this is just a glass, right? Where do I get off pontificating on its fullness or emptiness, as if I’m the center of the universe? Other people have needs, too, and it isn’t my prerogative to go labeling a glass from just my own narrow viewpoint.

I really shouldn’t be so selfish.

Sunday, November 4, 2012

Silent achievements


In a book about cycling I read recently, the author tries to convince readers to shun the world of spandex-wearing racing cyclists and instead ride for the sheer joy of it. He advises cyclists who feel themselves pressured into participating in over-serious cycle events to ask themselves this question beforehand: Would I do this if I couldn’t tell anyone about it? If I couldn’t wear the t-shirt afterward—if I couldn’t tell friends, family, and coworkers about my weekend-warrior sweatathon—would I still choose to do it? His point is that most of us would say no—we’d rather join a buddy for a casual ride out to the swimming hole.

It’s a great question that cuts down the pretensions of the athletic “image.” I think you can take it further, though, and apply the question to life as a whole: What would I choose to do with myself if I could never talk about it?

Suppose you lived in a world where no one was ever going to ask, “What do you do?” You could write, sell, or build whatever you wanted, and have your name attached to it, but you could never bring up the topic of your vocation to anyone. People would have to connect your name with your creation on their own. You could say, “I’m Art Fry”—and let them realize who you are—but you couldn’t say, “I’m Art Fry, inventor of Post-it Notes.”

In that situation, what would you spend your time doing? A handful of people could answer, “Exactly what I’m doing now.” But most couldn’t. Our sense of what we find intrinsically rewarding is usually clouded by what we find—or think we will find—socially rewarding. Prestige and recognition misdirect us into doing things we dislike.

If people could no longer boast of their achievements, we’d probably see thousands if not millions of people in high-status jobs suddenly quit. Law firms and investment banks would empty pretty quickly. Enrollments at Harvard and other prestigious institutions would drop considerably. Politics might disappear entirely. How would a politician sustain his morale when robbed of the opportunity for self-promotion? 

That’s not to say all lawyers and bankers are self-deceiving hypocrites working at things they hate in the hope of being admired, nor that they are the only ones doing so. Surely there are some lawyers who love what they do, and just as surely there are aspiring novelists who don’t really enjoy creating literature, but who just want to “be novelists.”

It might be impossible to fully separate our genuine enjoyments from our desire for social gratification. For all we talk about doing what makes us—individually—happy, we assert just as often that human beings are “social animals.” We don’t live in solitude, and everyone needs at least some kind of recognition. Even work that seems more purely creative—writing, architecture, fine cooking—is still done for an audience.

It’s probably fatuous, then, to ask “What would you do for no recognition? What would you do just for yourself?” If we had no one but ourselves to live for, we’d probably all commit suicide. So the better question is probably, “What would you do for no recognition other than that which the work itself generated?” That line might be a bit clumsier to include in a motivational speech, but it’s probably the best way to get to the truth.

Terrible baby names


Back in the ‘90s George Carlin did a bit on how he was sick of guys named Todd. It’s a newfangled, weak-sounding, goofy name, he said. He listed a bunch of other ridiculous boys’ names—Cody, Tucker, and Flynn being a few of them. He’s right; these are silly names. And things have only gotten worse since the ‘90s.

If you spend any time around people who are at the baby-making stage of life, you’ve probably encountered some real head-scratchers of names lately. It’s not limited to boys. Baby girls are suffering, too. American society is currently in a state of baby-naming anarchy. Anything goes, evidently. Mothers and fathers are naming their kids after fruits, turning surnames into first names, using improbably spelled foreign names, and even (it seems) inventing names out of thin air.

It would be impossible to compile an exhaustive list of all the bad baby names I’ve heard recently. But you know what kind of names I’m talking about. Like when your cousin comes to a family get-together and says, “This is our new baby ______,” and people furrow their brows a bit and say, “Oh.” Or the baby announcements in the employee newsletter: “Eric and Katie welcomed their new son ____________,” and it looks like they filled in the blank by mixing up Scrabble letters on the kitchen table.

I think in the past few years I’ve heard more bad baby names than good ones. Classic names like Paul, Michael, Jane, James, Claire, and Katherine seem to be on the wane. If you were to ask the parents why they chose not to use one of these more recognizable names, you’d probably get the same response from all of them: “We wanted something unique.” 

It’s almost an admirable impulse. Almost. Why wouldn’t a kid want an unusual name? It sounds like what everyone wishes for: going through life being one of a kind, and without any effort. What parents don’t realize is that it’s actually out of selfishness that they choose their strange baby names. No one chooses “Ryder” over “Peter” because it’s likely to make the kid happier. They choose it because it’s likely to make them, as parents, seem hipper (or something). 

Parents have a difficult time seeing their children as independent people, and so they don’t consider what it might be like to live as an adult with a name like “Fenton” or “Milligan” (for a woman, no less!) or “Camden.” Would you really want people to do a double-take every time you introduced yourself at a party? Would it not be better to be able to say, “Hi, I’m John,” and just be done with it?

Picking up where Carlin left off, Louis C.K. does a bit where he says “there should be a couple of laws” about what you can and can’t name your kid. I think there used to be laws about that, and they were collectively called “culture.” Parents 100 years ago had just as much freedom to name their kids whatever the hell they wanted, but they didn’t, because they were living in a society that actually had expectations for civilized behavior.

So is it symbolic of the decadence of American culture that we now have no norms for what we call our kids? If our baby-naming has descended into anomie, will our morals and relationships follow? Maybe so, but I’m not one to predict the future.

One thing I will do is give parents-to-be this advice: give your kid a name that could plausibly be that of a soldier, an artist, or a corporate executive. Those three vocations pretty well cover the entire spectrum of life choices. See how well the proposed name fits into each of these blanks:
“_______ has been decorated in battle several times over for bravery, selflessness, and outstanding skull-crushing ability.”


“An invite to _______’s SoHo loft is one of the most coveted status symbols in the New York art world.”


“_______ cleared eight figures last year through a series of hostile takeovers and cutthroat stock market deals.”

If the kid’s name works for all three of these, go with it.

Monday, August 13, 2012

In the city of the future . . .


The previews for the remake of Total Recall (I’m not going to bother seeing the whole thing) show a city near the end of the twenty-first century—for the sake of this essay, let’s just say it’s 2092. The city of the future has all the things you would expect: flying cars, robot cops, and buildings that seem to hang in midair. At first, 2092 sounds pretty far off. It seems plausible to imagine our cities will have been drastically transformed by then. But it’s actually just 80 years away. How much is going to change in that amount of time?

Take another example: Blade Runner, made in 1982, opens with flying cars, flaming industrial wastelands, monolithic skyscrapers, and a sky totally obscured by smog. The title card says it’s the year 2019. A lot could happen in the next seven years, but I think it’s fairly safe to say Blade Runner’s art directors were pretty far off in how they imagined the future.

It’s a problem in most films set in the future, even those that really try for something plausible. The creative team imagines many highly visible changes to our reality: cars that drive up the sides of buildings, wholly synthetic food, homes made out of clear plastic. But the truth is that technological changes are rarely so high-profile. Most of the important ones, in fact, are nearly invisible at first glance. 

Thinking again about the timeframe established in the new Total Recall, what would be most shocking to Americans of 1932 if they could see footage of “the future” of 2012? 

Imagine showing New Yorkers from 1932 a low-altitude flyover shot of the city of 2012, at a scale where they couldn’t see people, cars, or advertisements. There might be nothing at all to shock or impress them, other than that the city had gotten bigger. It would look quite familiar, just the way a city ought to look. The urban flyover is a commonplace of films set in the future. We always see a cityscape transformed beyond recognition. But New Yorkers of the past would find plenty to recognize in the New York of 2012.  

Now imagine moving in to a smaller scale and showing them footage of a walk down Fifth Avenue. The first things they’d notice would probably be the changes in fashion and the changes in car design. Men are going out in jeans and hoodies and driving streamlined cars made in Japan? No doubt these things would genuinely shock them, but clothing and automobiles aren’t recent technological developments. They had those things in 1932, just in different forms. They also had neon lights and recorded music, so an average urban street scene would not impress them, as far as technology goes.

It would only be when we started to show things in higher resolution that they would see the most important technological changes in the last 80 years. A quick stroll down Fifth Avenue might not be that impressive, but if we began poking into coffee shops along the way, looking at laptops and iPhones and explaining the internet, then even the most worldly viewer in the 1930s would have to be impressed. 

It takes a bit of digging to see these technologies, yet they’ve done more to change our lives than a flying car ever could. Compared to the internet, a flying car is kind of boring. The latter changes only how we commute. The former changes how we communicate. 

Business, socializing, and information exchange have all been revolutionized in the past 80 years. But these things are hard to show visually. They don’t translate well to film. What do translate well to film are the changes brought about by the Industrial Revolution, around the turn of the last century. That revolution took place on a huge, visually awesome scale. Accordingly, these are the changes from which filmmakers extrapolate their depictions of the future.

The tech revolution (if you want to call it that) doesn’t work that way, though. It takes place on a physically small scale—smallness being one of the revolution’s chief tenets. But within that small scale, buried in chips and code, are abstract concepts that dwarf even some of our greatest industrial achievements. Our cities may look nearly the same, but our lives are very different.

It’s possible our technological advancement will continue down this path—becoming smaller and less visible, yet changing our lives even more. If that’s the case, films depicting the future will probably get to be even further off the mark.


Wednesday, June 20, 2012

Petition interruption


A while ago I was walking and talking with a friend who is of Russian extraction. We were approached by a girl gathering signatures for a petition. I don’t remember the cause—stop coal, improve salmon self-esteem, elect Jesus, whatever. She asked us if we were interested in helping it. “Oh, I already signed that one,” I lied. Seemed like an easy way to get out of hearing about it. My friend had a more interesting response. “Sorry,” he said, gesturing toward me and obviously a bit put out, “but we’re in the middle of a conversation right now.”

I can’t imagine many Americans being that frank. It was refreshing. He was right that we were in the middle of a conversation, and that it’s rude to interrupt two people conversing. Until he pointed it out like that, I had never realized how often these petitioners violate basic manners. Nor had I realized how accustomed Americans are to it. We accept intrusive cause-hawkers the way we accept bad weather.

Maybe it’s worse in the Northwest than it is in other places. We have many causes. The environment is a big deal—there being rather more of it here than other places—and we have more college degrees than we probably need. It’s a recipe for lots of street-corner activism.

People like me only encourage the petitioners in their bad manners. I’ve been educated against my will to believe that causes and activism and awareness-building are endeavors to be solemnly admired. I’ve been programmed to feel ashamed to admit in public that I consider my own business more important than that of stopping squirrel poachers. So I wimp out and avoid telling the signature-collectors they’re intruding. “I already signed it” or “No, thanks” are about the best I ever do.

I wish I could be more like my friend and tell these people—politely and correctly—why they can go ahead and kiss off. Maybe it’s that Russians are raised on more traditional values. My impression has always been that they’re much more friend- and family-centered than Americans. They tend to get really involved in their conversations. For a stranger to insinuate that his pet political project is more important than two Russian men’s discussion of Moscow women versus Petersburg women? Highly insulting.

Most petitioners would probably counter that their interruptions are really for the benefit of the people they’re interrupting: “Invasive plant species affect you.” That may be true (though I take it all with a grain of salt), but it doesn’t excuse the interruption. Stopping people on the street is just bad marketing. It’s door-to-door sales and cold-calling and spam. Sophisticated sellers don’t use those methods. 

It doesn’t matter what the content of the interruption is. If I stopped two people in the park and started recounting one of my hilarious anecdotes, they would immediately find it rude (crazy, too) and maybe even call it out as such. Why should they react differently when an environmental science major tells them the sky is falling without their signatures?


Tuesday, June 19, 2012

Getting and getting rid


I got rid of some stuff a couple weeks ago. Clothes, old shoes, a few other things. Now that all that stuff is gone, I can’t remember why I ever had it. I can’t even remember what it all was. If you asked me to list the contents of the four bags I took to Value Village, I probably couldn’t come up with even half of them. 

 A lot of the stuff we have is like that—stuff we wouldn’t even notice being gone. Yet most of us have a hard time getting rid of it. Even if it’s something we haven’t used in a year and don’t plan to use anytime soon (or something that never had a use to begin with), we convince ourselves we will someday regret getting rid of it. When we come to the moment of truth—throwing it away, giving it away, selling it on Craigslist—we falter.

We get anxious thinking about discarding possessions. It feels like throwing away money. Even if we plan to sell these things, we worry that we’ll get a disappointing price. Easier to just hang on to them and lie that they still have value in our lives. Easier to avoid the distressing feeling that when we toss something, it’s forever. 

Getting rid of stuff is emotionally fraught. But being rid of it is easy. I now feel nothing about the stuff I donated recently—it wasn’t a meaningful part of my routine. I don’t go absentmindedly reaching for that stuff and find the shelf empty. Not having it is a non-experience. 

There’s a similar contrast between getting and having. Getting something, either a gift or a purchase, is a thrill. Having it is boring. Of all the things in your home you were excited to get, how many do you actually find fun to own? The answer—for me—is only a handful. If you took my bike or my computer, I would notice within a day. If you took my kettlebells, I would notice within a week. But if you took my copy of On the Road—which I didn’t like that much and never plan to reread—I might not notice ever. For stuff like this, owning and not owning are equal states.

Getting and getting rid—charged with feeling. Having and not having—as humdrum as flossing. We love hellos and hate goodbyes. But the stuff in between and after, we barely notice. 

Monday, June 18, 2012

When you wish upon a lottery


A few months ago I had a conversation with two older coworkers about what we would do if we won the lottery. They started the conversation, not I. Everyone was talking about the lottery at the time. The jackpot had gotten huge, and the numbers were going to be drawn in a few days.

My coworkers mentioned the typical dreams: sail to some island, build some big house, move someplace where the sun always shines and the water tastes like rose nectar. They seemed pretty well convinced that if you won the lottery you would have it made.

I presented the heretical idea that maybe having a huge lump of cash wouldn’t be that great. Maybe, I suggested, what makes life good is having something meaningful with which to occupy yourself. Maybe people need to produce something with their lives in order to feel fulfilled. Maybe, if you feel lost when you have little money, you would feel just as lost with lots of it.  

They wouldn’t have it. They laughed and implied that I was overcomplicating things: “Well, I wouldn’t have any problem just sitting in Fiji for the rest of my life!”

Lots of people say things like that. I think they’re mistaken. People want to believe they would be content living off a massive nest egg with a Mai Tai in hand and no worries in their heads. But that kind of contentment can’t last very long. Passive pleasure is not in itself a reason to live. Any thoughtful person who has spent a continuous stretch of time pursuing it can attest to that fact.

I once took a chunk of savings and spent over four months traveling with no obligations. I remember many long days. Sixteen hours is a lot of time to spend with nothing productive to do. After only a few weeks, the pleasure and excitement had become routine.

My coworkers turned the discussion to less selfish aspects of lottery winnings. They thought this might be more rhetorically persuasive. “Think of all the people you could help,” one of them said. 

It’s true, to a small extent. If I had ten million dollars sitting around, I could give it to the hospital and pay for every cancer patient’s chemo. But the thing is, you can help people with just ten thousand dollars. Or even one hundred. Or even with nothing more than your time. Most of us don’t do so—at least not very often. It’s just not a strong desire.

So unless you’re someone who gives every last spare cent to charities and causes—and loves doing it—you probably wouldn’t be made that happy by giving away millions. And even if giving away millions was a happy experience, how long could that single experience sustain you? Five years later, would you be looking out the window with a cup of coffee in hand thinking, Boy, I sure do feel good about giving that money to those needy strangers half a decade ago? Would you ruminate on that one thought for six hours straight and go to bed that night feeling satisfied with yourself?

Again, count me doubtful. Giving money to the needy may be an admirable thing to do, but as an experience it’s boring. There’s no creativity in it.

This discussion with my coworkers didn’t bring up a single money-enabled thing that sounded to me like it could make me happy long-term. Maybe for a month, or two, or even a year—but not for life. 

I realized how rarely money is a solution to anything. Some of our problems can indeed be solved with cash. You need a knee replacement. Your basement flooded. Your car won’t start. But these aren’t the kinds of problems people are thinking about when they dream of winning the lottery. 

The problems people really hope money can solve are a lot more serious. Things like aimlessness. Loneliness. Confusion. Money doesn’t help with these things. 

In fact, a big sum of money that you simply won might actually be detrimental to solving these problems. The daily necessity of making a living gives shape to our lives. If approached mindfully, it may be in that process that we have the best hope of finding our true direction. And the best hope of connecting with others. And maybe also the best hope of seeing clearly. 

Saturday, June 16, 2012

The dinosaur method


Five years ago, when I was 20, I found a way you could teach someone to like eating vegetables—or teach yourself how to like them more. That is, if you know you should eat more vegetables, you want to eat more of them, but you just can’t get very excited about them—this is the method. All you have to do is pretend you’re a dinosaur when you’re eating them.

Wash a whole leaf of lettuce and hold it in two hands. Bend at the waist a bit to mimic a more horizontal spine. Imagine you’re some bipedal herbivorous dinosaur, like a parasaurolophus, iguanodon, or pachycephalosaurus. Rip a huge mouthful of vegetation out of your hands. Don’t be afraid to really put your neck into it. Chew extravagantly. Steal furtive glances from side to side as you watch for predators.

Go ahead and do it right now. I promise you it will be the funnest god damned leaf of lettuce you ever eat in your life.

Even if you only do it a few times, the dinosaur method works. It gives you the idea of vegetables as something precious and coveted. I never hated vegetables, but doing a bit of dinosauring a few years back took my appreciation of them to the next level. Even when not eating like a dinosaur, per se, I can still keep the idea in the back of my mind that hey—fuck it—I’m just a tenontosaurus, and munchin’ on salad is what I do.

I learned the dinosaur method—in a different form—when I was six or seven. Like all right-thinking boys of that age, I was obsessed with dinosaurs. When I would leave class to use the bathroom or drinking fountain I would—as discreetly as possible—walk in the gait and posture of my favorite dinosaurs. I would put my hands right up against my pecs with my fingers curved and facing outward like claws. Two fingers meant I was a tyrannosaurus. Three fingers—and a bit more forearm extending from my chest—meant I was a velociraptor.

(I had one friend who got even more into character than I did. He really didn’t give a fuck what anyone thought if they saw him. He would walk in lumbering, menacing steps and look at you with the possessed eyes of a predator. His posture was such that you would swear he had a tail. He’d even snarl a bit and refuse to respond in words if you said anything to him. Damn, he was good.)

The dinosaur method helped alleviate some of the boredom of school—and other parts of life. I would do it whenever I needed to. The physicality of it made the imagining more intense. (Again—just try it. It can’t not make you feel good.) 

I guess when I was about nine, though, I decided acting like a dinosaur was no longer an appropriate way to spice up my life. It was over a decade later—as I looked at a head of lettuce—that I rediscovered its effectiveness.    

Adults need more fantasy in their lives.


Monday, April 9, 2012

Capitalization punishment

Working with words is a dangerous business. I don’t mean working with words in the sense of writing amateur essays or poetry or the like (though that can be fun). I mean working with words in the sense of a 9-to-5 job, where words are business. If you’re a natural linguistic nitpicker (like me) a job working with words will probably only amplify that shameful part of your personality. 

You find yourself internalizing house style standards and applying them to everything you see. Every publication (or publishing house) worthy of the name has some sort of house style guide. Some of these are extremely long—some houses follow Chicago to the letter—while others are shorter and less formal. But even blogs (the professional ones, anyway) have made choices concerning fonts, line breaks, jump breaks, spacing, margins, and other basic formatting. They do it because they want their readers (who are, after all, customers) to know what aesthetic to expect when they visit the site. It’s like McDonald’s.

Working within a defined style standard for a long time trains your eye to catch minor discrepancies as though they were nickel-sized blotches of ink spilled on the page. I can now tell the difference between an en dash and an em dash faster than most people can tell the difference between a Great Dane and a Shih Tzu. I know where each respective dash is supposed to be used, and when I see one in the wrong place I find it—well, appalling. Conventions for line breaks, closing up, and ampersands I’ve taken to heart as well.

But the rule I’ve internalized more than any other is one for capitalization. A convention within the niche I work in (technical publishing) is to list all titles of papers in sentence case (e.g. “A compact hyperspectral imaging spectrometer for spaceborne . . .”), with only the first letter capitalized. When I was introduced to this style standard, I thought it looked odd. I had spent my entire academic life writing papers with titles like “Regressive Transcendence and Subversion: An Analysis of Postantimodernism in the Novels of . . .” where you gave a cap to pretty much everything other than a preposition or an article. 

Taking a title down to sentence case seemed almost like a deliberate humiliation to the author. “Your paper’s title is no longer a capital-T Title; it is simply a sentence that happens to be on the top line.” It felt like we were intentionally announcing to the world that these papers were not important enough to warrant capital letters.

It wasn’t until I had been employing this standard for a while that I realized that this demotion is, in a sense, the point. We apply this standard to proceedings papers, which are just reflections of presentations given at conferences. They aren’t peer-reviewed, and they don’t make or break careers. They’re typically a snapshot of work in progress. The idea behind them is that of sharing information. From that perspective—sharing—a title in sentence case makes sense. You’re just saying, “Hey, this is what my team and I did—see if you find it interesting.” You’re not trying to impress anyone with a big show.

The standard works. It makes the papers appear less pompous. If you care to see how fully I’ve internalized this style, just take a look at the title of every post on this blog. If I gave these blog posts title-case titles, each one would become “a piece,” as in, “That piece I wrote on x.” And I don’t consider a single one of them to be that. They are, at best, snapshots of ideas in progress. 

Some writing—the long stuff, the very thorough stuff, the painstakingly crafted stuff—does deserve a title in caps. But much doesn’t, and it’s actually better to admit it. Take a look at The Economist. They use sentence-case titles (though they cap the first word after a colon, which I dislike). Now take a look at Time, capped up to the gills. Which of the two looks smarter? 

Lately I’ve noticed odd capitalization all over the place. I got thinking about this from something I saw in the dentist’s office today. They had on the wall a wooden placard in the foofy, curlicue style of those “Home is Where the Heart Is” things your mom or grandma probably has in the kitchen. It read thus:

We Always
Make room for your
Friends
and Referrals

I won’t even bother commenting on the extremely strange choice of line breaks (though you could probably write an essay on that, too). I’ll confine myself to the capitalization. The first baffler is the inconsistency. Why are “room” and “your” lower-case in a sea of caps? But that almost seems like a typographical mistake. 

Let’s say they were capped like the rest. It would still be an odd thing to read: “We Always Make Room for Your Friends and Referrals.” I hope its oddness is especially apparent when written in-line without special breaks. The oddness is that it’s a damn sentence! It’s not a title or non-sentence phrase like “Home Sweet Home.” It’s a diagrammable, textbook sentence with subject, predicate, preposition, and objects of preposition. So why the caps? Make it sentence-case and give it a period. It would look like an actual statement.

Just below this in the rack was a magazine called Trailer Life (god only knows what is found in those pages or who the hell bothers to find it). They listed a feature on the top line of the cover: “Improve Truck Handling With One Simple Add-On.” Man, they even capped the prepositions in this one, and yet this too is precisely a sentence—of the imperative variety, with the implied subject and modal verb of “you can” or “you shall” (the latter being definitely the funnier). 

So what’s with the loud caps? Is this a “piece” that trailer enthusiasts are going to cherish for its wisdom and pass on to friends and family? Or is it just some mildly informative throwaway content that readers will forget as soon as it’s no longer useful to them? Turn it into the sentence that it is. Give it no more than its due. Stop yelling at us with titles too big for their britches. 

Anyway, my hypercritical perspective.

Saturday, March 31, 2012

When no news is good news

I’ve been abstaining from the news lately. Not that I’ve ever been a news junkie, but for the past week or so I haven’t read or watched any at all. I’d found myself tired of it—the repetitiveness, the irrelevance—and then I read an essay by Rolf Dobelli that convinced me to try giving it up altogether (at least for a while, as an experiment). The guy makes a cogent argument against the Standard American News Diet—one of the key points being that nearly everything you read in the news has no discernible effect on your life. After all, if the effect was discernible, you wouldn’t need the news to tell you about it. 

My news-abstinence became significant early yesterday morning, though. There was a fire at the marina, clearly visible from my apartment, about a mile away across the water. It was only 6:30, and still twilight. I watched the flames and the rising black smoke while I brushed my teeth and buttoned my shirt. It looked like the fire was pretty big.

As I drove in to work I thought about the fire and its relation to my news-abstinence. Obviously, this was a newsworthy event. In only a few minutes I would be at my computer, where I could look up some local “BREAKING NEWS” site and surely get a preliminary, on-the-scene report. And I kind of wanted to. I was curious.

But of course my curiosity had no basis. I don’t sail or own a sailboat, nor do any of my friends/family, nor do I even think very highly of sailing as a pastime (very cliquish, it seems). I knew (as Dobelli’s essay predicts) there was almost certainly nothing in that fire that would affect me. My curiosity was morbid, voyeuristic, and transitory.

So I decided to maintain my celibacy and not look it up online. I decided instead to conduct an observational study. I was the first person in my little work area to arrive that morning, so I figured my two workmates, when they got in half an hour later, wouldn’t know about the fire (not everyone’s home has a kick-ass bay view like mine, heh heh). I wanted to see how they’d react when I brought it up. I know neither one of them sails, and that they’d have as little reason as I did to imagine they’d be affected by the fire.

Only a few minutes after they’d each sat down, I said (nonchalantly), “Hey, did you guys see or hear about the fire at the marina?” One said she’d noticed smoke on her commute; the other wasn’t aware at all. Obviously, since I had only seen it from a distance and hadn’t read about it, I had nothing to offer by way of explanation. “Yeah, I could see it from my apartment, and it looked kind of big,” I think I said.

They immediately wheeled around to their respective computers—in unison, and as if in a deliberate effort to confirm my expectations—and looked it up online. I couldn’t help cracking a wry smile behind their backs. They each found it on a different site, too. The fire—still burning—had made not only local but also regional news. My, the speed of the internet in spreading tidings of disaster.

They shared aloud some of the details, and sure enough there was nothing meaningful to any of us. A boathouse had gone up. No one knew why. If anyone was hurt or dead, they didn’t know who. It was still rather hot at the moment.

So the whole checking-it-out-online, gathering-of-information thing had been a waste of time and attention. Not a huge waste, obviously, but a waste just the same. Imagine how that interaction might have gone if we didn’t have the internet (or other rapid-delivery news channels). The fire is the type of thing I probably would bring up anyway, even if I wasn’t conducting an observational study, just in case my workmates had friends or family they might need to be concerned about. Here’s how I see the web-free version:

Me: “Did you see/hear about the fire?”
Workmates: “No.”
Me: “Looked kinda big.”
Workmates: “Oh. I hope no one’s hurt.” (Maybe get on phones to call loved ones.)

I’m not trying to deride my workmates for turning around to their computers the way they did. Turning to a computer (or pulling out a smartphone) is the instinctual 21st-century response to hearing about anything. Can’t blame anyone for it. But the nature of that response should by itself indicate to us that what we’re responding to isn’t terribly relevant to our lives.

How would you respond if someone told you something that seemed like it might be a cause for your personal concern? If a coworker came in and said there had just been an explosion at the tire factory, and you knew your son/husband/brother was supposed to have just started his shift at the tire factory, would you turn to your computer and try to find a news story about it? Far more likely is that you’d be getting on the phone and/or running out to your car.

My guess is that many people around town turned to their computers over word of this marina fire. And my guess is that it was for them as irrelevant as it was for my workmates and me. It’s funny, since you would expect that if any news were to be relevant, it would be the local news. The truth, though, is that most of the things that genuinely affect us are either mundanities that don’t make the news, or they are emergencies so personally critical that our knowledge of them bypasses news streams altogether. 

I’ve only been avoiding the news for a matter of days, so I can’t confirm if a news-free life is truly better. There may in the future be something seriously important that I—in my book-reading, semi-unplugged world—totally miss out on. Maybe I’ll miss the opportunity to work for some hot new company because I didn’t read about it in a Time magazine profile. Or maybe I’ll lose all my money when I don’t read the CNN Money article about fears of my bank’s becoming insolvent. Or maybe my mother will be shot in the face by a terrorist and I won’t hear about it until she’s already buried.

I’ll be sure to let you know.   

Sunday, March 25, 2012

Reading the signs

Now that I’ve spent a decent portion of my life working in a place where I hear about marketing campaigns, branding, web copy, etc., and where I know some of the people directly responsible for those things, I’ve started giving a lot more thought to the specific words that businesses use to attract customers.

Along those lines, I noticed something yesterday that made me chuckle. I was riding my bike through an area of fairly cheap-looking suburban apartment complexes—the kind that cater to college students and people who started families way too young. One of these places had a sign that looked like this:

NEWPORT
an apartment community

If you had asked me beforehand, I would have said “apartment community” is an oxymoron. Apartment complexes are collections of transient strangers. Usually the feeling between apartment neighbors is one of mutual apathy—and sometimes antipathy.

And sure enough, this Newport place didn’t exhibit outward signs of being any more a community than the apartment complexes surrounding it. When I think of community, I think of people planting a garden together, or lots of kids roughhousing good-naturedly, or—I don’t know—music and dancing or something. Newport had drawn shades and a few cars parked in front.

So I thought the sign was maybe misleading. If this place is a community, I’d hate to see what a non-community looks like.

I had to wonder if any customers are actually influenced by the use of the word “community” here. Are there people who see it and find themselves (consciously or unconsciously) more inclined to consider living there? Do the existing residents drive by that sign and feel reassured in their choice to call Newport home? It seems hard to believe. 

So that leads to the corollary question: who do the proprietors of the complex imagine they’re reaching with this sign? The sign obviously cost some time and money, so I’m sure they gave a bit of thought to how they were going to brand the place. They probably drafted several other slogans, and the choice of “community” was most likely not careless.

They would do better to keep the sign purely factual (e.g., “Newport Apartments”) and spend their time instead just perfecting the design. That, more than manipulative word-spinning, I think, is the way to make a sale.

We all know from our experiences online how important design is in forming customer relationships. We’re unlikely to buy from poorly designed websites. They look like they might be scams. Bad design affects other decisions, too. Would you apply for a job at a company with a really cheap-looking website? I’d be hesitant.

Good design builds customer trust. I trusted Dropbox when I signed up for it six months ago not only because my friends told me about it, but also because the site just looks professional. And although Dropbox does have a slogan (“Simplify your life”), they don’t put it front-and-center on their homepage. The design is the more important part.

So the point is that this Newport place really should have just gone for broke on fonts, spacing, color schemes, and the like. It’s obvious from their use of “community” that they’re trying to set themselves apart from the other apartment complexes. Design would have been the better way to do that, especially since slogans and marketing copy, when so patently false, can backfire. I’m less likely to want to live there, because I find their sign insulting to my intelligence.

If the sign had just looked really, really nice, I might have given them a second thought.