
Monday, March 18, 2013

Sight-skipping


Sometime in May 2008, I was sitting on a train crossing Central Java, en route from Bandung to Surabaya. Almost exactly halfway through the journey, the train pulled into the station at Yogyakarta, a small city famous for being the gateway to Borobudur, the largest Buddhist archaeological site in the world. Once it had stopped, the train disgorged a handful of passengers, took a few more on, and resumed its course eastward.

I stayed in my seat, watched Yogya recede from view as the train got up to speed, and skipped what is probably the biggest man-made sight in Southeast Asia after Angkor Wat. I didn’t regret the decision then, and I don’t regret the decision now, five years later. I had been on the road for three months. I was weary of sightseeing. I just wanted to move.

The handful of big sights I’ve seen in my life (the man-made ones, that is) have been an incredible bore and disappointment to me. Not because they were smaller, uglier, or more defiled than I had imagined, but simply because nothing at all happened when I saw them. One day in Sydney, for example, I walked past the Opera House and over the Harbour Bridge. That’s it. There’s no story to the experience, just as there never is in sightseeing. The grand sights of Sydney were boring, as were the Grand Palace of Bangkok, the Empire State Building, and all of Washington, D.C.

The most interesting and memorable part of Sydney, in fact, was my nearly unhinged middle-aged roommate who got drunk on boxed wine almost every night and told me about the plans he and his girlfriend had to resurrect dinosaurs from the rich jungle soil of Vanuatu. In retrospect, the time I spent seeing the sights—and not chatting with this loony character—seems like a waste. 

Paul Theroux wrote that sightseeing “has all the boredom and ritual of a pilgrimage and none of the spiritual benefits.” I have to agree. The few times I’ve gone out sightseeing, it has always been with a sense of duty. Up until the day before I passed through Yogyakarta, I had intended to do what I was supposed to—go see Borobudur, snap a picture, be able to say I had seen it. But my road-weariness made me realize I was simply following what other people thought I should care about, not what I actually did care about. So I passed it up.

Was my travel experience necessarily richer for having skipped this major attraction? Hard to know. But I do know that the best memories from that trip involve people rather than sights, and if the choice were between seeing the wonders of the world and skipping them in favor of more days like the one I had in Medan—where two college guys befriended me, taught me how to ride a motorbike, and introduced me to goat meat and durian—I’d take the latter in a heartbeat.

Now that I’m going to be doing some traveling again, I’ll probably be skipping a lot more sights. What I hope is that, in skipping them, I can maximize the experiences that make travel truly rewarding—namely, human connection and honest observation of what makes a place tick. It’s not always easy. Travel can be lonely, and the days can be long. Sightseeing is often a way to fill a void of time. But I’m convinced that simply existing in a place, maintaining a slow pace and an open mind, is a more likely route to travel fulfillment than dutifully trekking from one sight to the next.

After all, the people living in a place rarely go out of their way to gawk at the local monuments and grand buildings. What New Yorker cares about the Statue of Liberty? Life happens elsewhere, and that life is what I’m most interested in when I visit a place. To see it, absorb it, and be richer for it.


Postscript
None of the above applies to the genuine grandeur and awe of the natural world. The sight of Mount Rainier trumps any other experience I know of.

Friday, March 15, 2013

The end of reader's vanity


I ordered a Kindle the other day, something I long thought I would never do. I’ve held out against e-reading for steadfastly aesthetic and sentimental reasons: I like the weight of books, their smell, the sound of pages turning. I like looking at covers and dust jackets, and I like using a Post-it as a bookmark so I can jot down words, places, and names I want to look up later. 

What made me abandon my hang-ups about e-readers is that I’m heading overseas soon, and for a long time. My reading interests tend toward the specific, and I couldn’t bear the thought of spending a year or longer reading nothing but bestselling paperbacks passed on from traveler to traveler. (One time while traveling I read the traumatically bad Angels and Demons, simply because there was nothing else available.) A “device,” as everyone enjoys calling them, seemed the only answer.

I’m sure I’ll be pleased with the purchase. Once I’m in those faraway lands where English-language books are scarce, the intellectual pleasure I get from reading exactly what I want will easily trump the aesthetic displeasure I get from holding an electronic gadget instead of a paper-and-ink book.

But as I sat on the couch last night reading one of those paper-and-inkers, I realized that by moving to e-readers we’re losing something more than just aesthetics. We’re losing reader’s vanity.

Even though reading itself is a private activity, books clearly have a social dimension to them. We carry books around with us, we read them in public, and other people—friends, roommates, strangers at the coffee shop—often see them. I doubt there are many bookish people who haven’t secretly hoped someone would notice what they were reading and admire them for it, maybe even fall in love with them for it.

Reader’s vanity like that has been with me my entire reading life. I like hearing people comment on what I’m reading. I like the idea that my reading reflects well on me. I haven’t ever read anything purely for vanity’s sake—slogging through something I hated just because it looked smart—or ostentatiously called attention to a book I was holding, but I’ve always gotten a thrill from the way a book subtly advertises my intellectual life. 

Of course, that advertisement no longer exists with a Kindle. As e-books continue to supplant print, the population of readers will become an undifferentiated mob of tablet-holders. Their reading choices will be no one’s business but their own. No one will be able to tell if you’re reading an e-book with a file size of two megabytes or ten. Nor will they be able to tell whether it’s a trashy murder mystery, or a serious historical treatise, or even a damned magazine.

Also, to look at things from the other side of the glass, the proliferation of e-readers is going to make it impossible, as a people-watcher, to judge others by what they’re reading. Who hasn’t taken a greater or lesser interest in a stranger based on what book they had in front of them? Reading choices can be a window into someone’s soul. A Kindle shuts the blinds.

It’s not unreasonable to suppose that e-books will completely (or very nearly completely) take over print within our lifetimes. It’s a strange vision of the future: libraries and bookstores moved entirely online, everyone staring at identical black or slate-gray tablets. I imagine it will be harder to make an intellectual map of one’s environment. Books are part of the color and life of our world, and in their absence we’ll probably lose a shade of understanding of those around us.

But if there’s one lesson to be drawn from technological development in the past ten years, it’s that where there’s a desire, it will be met. If readers feel they’ve lost something by migrating to e-books, someone will find a way to give it back to them. Most likely, as books and libraries move online, so too will readers’ vanities and judgments.

I noticed on my roommate’s Kindle, for example, that there’s a “share” feature: Let your friends know what you’re reading, in real time! You can highlight text and press a button to broadcast it to the world, presumably through Facebook, Twitter, or some other social time-wasting site. 

At the moment this kind of sharing seems a bit too vain, even for me. I want people to notice what I’m reading; I don’t want to shout it at them. But in time the crafty souls at Amazon and elsewhere may find more subtle ways for us to regain our reader’s vanity. Perhaps a small screen on the opposite side of the tablet that displays the cover of the book you’re reading. Or an app that detects other e-readers within a fifty-foot radius and tells you what’s on them.

Maybe these things are already in development. But until the purveyors of e-readers give me a dignified way to display my reading choices, I’ll just have to learn to be a bit less vain, and a bit less judgmental. Both of those would be healthy changes for me. But that doesn’t mean I have to like them.

Thursday, November 29, 2012

An inquiry into bananas


The other day I had to go for a ride with my parents (whom I’m visiting) to see my grandfather. It was sort of sudden, and we were leaving at about the time I usually eat lunch. Unable to prepare a genuine meal, I grabbed a few bites of snacks and leftovers, and then I searched for something to take with me on the road. I grabbed a banana, and began eating it as we pulled out of the driveway.


It was a singularly disappointing experience. 

The problem with bananas is, quite simply, that they suck. They are the most low-rent, low-class, low-character fruit known to man, yet out of all the fruits in existence I have probably, in my life, eaten more bananas than any other. 

Bananas suck, but they’re ubiquitous. In homes, in school lunchrooms, in continental breakfasts, in fruity fruit salads—bananas. Whenever a junk-food emporium gas station decides to throw a sop to healthy eating, you can be sure they’ll do it with a wicker basket full of bananas (and a few withered oranges and waxy Red Delicious apples). 

We live in a very bananocentric culture. There’s no escaping the tropical yellow phallus. 

When I ate the banana the other day, I had to sit back for a little while and just look at the thing, wondering what the hell I or anyone else could possibly be thinking when we make the decision to eat one. The thing is, I’ve never cared for bananas that much, and yet I used to buy a full bunch every week, as if it were my duty to eat them. I would walk through the produce section and say, “Duh, better get da bananas,” and then spend the rest of the week forcing myself to choke them down. 

This went on for years. I lost the habit a while back, I guess, though it happened without any deliberate decision on my part. This recent experience, however, only confirmed the distance that had grown between the banana and me.

After all, the banana is really just the donut of the fruit world. It turns into a starchy, sickly sweet, mucusy mess in the mouth, like pureed squash mixed with honey and snot. Where other fruits leave my stomach feeling cleansed and refreshed, a banana in my body feels like a chemically reactive marshmallow ballooning up to the size of a small pillow. I spend at least ten minutes feeling bloated and unhappy, and my entire olfactory cavity remains stained with the scent of bananas for the better part of an hour. 

I’m sick of banana apologists. I’ve heard all the reasons for the banana’s supposed greatness: it’s cheap, it doesn’t need to be washed, it comes with a built-in handle. Whatever. These are not excuses for a shitty fruit. It’s time for the banana to go back where it came from.

Saturday, November 17, 2012

Waste and time


The EPA found in 2007 that the United States flushes 4.8 billion gallons of water down its toilets every day. I mention that not to shame you into doing your business in a hole in the backyard from now on, but only because the other day I had cause to reflect on toilet water a bit. 

I was shopping at the local food co-op and felt the need to relieve myself. This co-op is in a pretty new facility, and it’s replete with eco-friendly bells and whistles. Solar panels, native species planted around the parking lot—that kind of thing. It also has in the bathroom stalls those fancy toilets with separate buttons for “solid waste” and “liquid waste.”

These toilets always bring out the righteously embarrassed Victorian in me. A gentleman’s excretions should be his own private business, should they not? How can we tolerate the barbarous humiliation of being asked to indicate the “solidity” or “fluidity” of our leavings? It’s almost more than a civilized man can bear. 

I suppose we should only be thankful they extend us the courtesy of euphemizing the substances as “solid waste” and “liquid waste.” Were they to label the buttons just “urine” and “feces,” the indignity might be so severe I would simply have to hold my function until such time as I could perform it in the privacy of my own home.  

I was in an unusually observant mood the other day, though, and I noticed that I didn’t actually need to describe my wastes to the toilet machine at all. Reading the fine print below the buttons, I learned that a sensor in the toilet could make the decision as to “solid waste” or “liquid waste” for me. Surprising in itself, but even more surprising is how it makes that decision. The sensor determines the proper flush volume, I read, “based on the amount of time spent in the stall.”

I had to think about that for a while. Someone working for a toilet manufacturer (or toilet-sensor manufacturer—they could be totally different industries) had to decide on the cut-off time between urination and defecation. Who was that guy, and how did he do it, and what time did he decide on? What kind of statistical analysis goes into determining the average times for human waste elimination?

It must have been a hell of a job to research. Did they place stopwatches and video cameras in bathroom stalls across America, or did they bring in test subjects for a series of blind clinical trials? How careful were they to get a representative sample of the population?

It’s just that I have to imagine there’s sometimes overlap between the time it takes to pee and the time it takes to poop. If, for example, you imagine a really slow urinator—say, an old guy with a narrow urethra and a large bladder and an uncircumcised penis—and you imagine a really fast defecator—a fit young guy who gets plenty of fiber and just had a cup of coffee—you have to figure that occasionally the latter finishes before the former.

These are boundary cases, sure, but the toilet scientists must have accounted for them with some kind of statistical weight, right? Did they have heated arguments about the relevance or irrelevance of urinary outliers? Maybe after months of debate they just said, “Fuck it. Anything longer than a minute thirty-five gets the poop flush.” So there may be fast poopers out there routinely finding their creations met by a mere trickle insufficient to the task, while slow pissers find theirs met by a roaring deluge capable of sucking down a bag of rocks.

And once they decided on an upper time limit for urination, did the toilet engineers consider an upper bound for what we could reasonably assume to be healthy defecation? Surely there’s a time beyond which we would have to start worrying. If someone’s on that toilet for longer than half an hour, does the toilet sensor automatically call an ambulance? Seems like it should.

These thoughts all ran through my head as I stood there eliminating that morning’s beverages, and probably dribbling some of them on the floor in my absentmindedness. I reached the conclusion that designing environmentally responsible toilet solutions for the public may be one of the hardest jobs in America. Those guys must really care about saving those few extra gallons of water.


Monday, November 12, 2012

Come restaurant with me


A mastery of parallel constructions is one of the skills that separate the careful writer from the slapdash. Our world is rife with false parallelisms: “As a teacher, I will need to possess patience, warmth, respect, and be a good listener.” Ugh.

Airports are pretty reliable places to observe bad writing. I had to spend about 83 minutes of my life in one the other day, and, sure enough, I saw a false parallelism that made my skin crawl for a good five minutes or so. It was part of the slogan (if you can call it that) of the restaurant near my gate. Here it is:

[Image: the sign for Scotty Browns, with the words “restaurant,” “lounge,” and “socialize” beneath the name]
Now let’s take a look at those words beneath the restaurant’s name and try to insert them into some kind of grammatically parallel structure. It won’t work, but what’s interesting about this false parallelism is that it is palindromic in its falseness. That is, the elements are parallel in both directions, excluding the final element. Observe:
“Come to Scotty Browns. It is a restaurant. It is a lounge. It is a socialize." 
“Come to Scotty Browns. You can socialize. You can lounge. You can restaurant.”
With the middle element being equally plausible as a noun and a verb, we can’t be quite sure which element actually makes the parallelism false. Maybe lounge is meant as a verb, in which case the noun restaurant is the false element. If lounge is a noun, then socialize is the offender.

I found this rather confounding as I sat staring at the sign with nothing else to occupy my thoughts. (I was by myself, of course. Maybe I should have gone to Scotty Browns and enjoyed the socialize they have there.) I began to wonder if I wasn’t just observing the bleeding edge of language change; maybe restaurant is expanding to become not only a noun but also a verb. It’s happened before with contact and access, so why not with restaurant?

Maybe in the year 2013 we’ll say to our friends, “Hey, if you’re not busy Friday, why don’t we restaurant?” Our social communication could be revolutionized. Here are some other possibilities:
“I’m pretty broke, so I can’t restaurant much for a while.”
“I love New York; the restauranting is divine!”
“So this totally creepy guy tried to restaurant me the other night, and I had to lie that I had plans with my sister just to get him to leave me alone.”
If it’s the case that in ten years we’ll all go “restauranting” without giving the term a second thought, then I guess I have to commend the proprietors of Scotty Browns for bringing linguistic innovation right into the mainstream. They’ve shown the masses how to be flexible and creative with words. Usually this is the kind of thing that only poets do. 

Monday, November 5, 2012

Half and half


Is the glass half-empty or is the glass half-full? I guess it depends on what we’re talking about here. Are we thirsty or are we not? That could definitely influence our interpretation of this glass and its contents. If we’re not thirsty, then I guess the question is meaningless. We’re indifferent. The glass might as well not exist, for all we care.

But if we are thirsty, then surely we regard the glass as half-empty, because we’re disappointed to have so little water to slake our thirst. Unless we’re not really that thirsty, in which case the half-capacity glass provides just the amount we need, so we consider it half-full. Or unless we’re desperately thirsty, in which case we’re glad to have anything at all, and we would again consider it half-full. 

But how large is this glass? I think that matters, doesn’t it? Because if it’s a tiny juice glass, then it might not be satisfying even when filled to the brim. And if it’s a big tumbler, then just a quarter of its volume might be enough to satisfy us. I think I’m entitled to know what kind of glass I’m looking at. Does anyone have figures on the average size of a drinking glass in America? 

Is it truly sufficient for the purposes of this question, though, to consider only glasses in America? Other cultures may drink out of considerably larger or considerably smaller glasses. I don’t feel I’m prepared to answer this question without being given the time and the means to explore the world and observe more drinking glasses.

(Is it weird that I always imagine this glass being filled with water? It makes sense that it would be, because water is the one beverage we all need to survive, but maybe other people are imagining something else. Does the glass’s being filled with juice or beer or wine affect how one perceives its fullness or emptiness? A baby would want the glass to be filled with breast milk, after all. Gotta remember the babies.)

I feel like I need to be told what the water in the glass is even meant for. Is it there to be drunk? Maybe whoever poured the water in the glass intended to use it to water a small plant. In that case, it’s probably better to have it at half capacity, so it won’t run down the side and make a mess while being poured out. So you’d have to call that half-full.

And mustn’t we also consider how the water level came to be at the halfway mark to begin with? If someone filled the glass to the brim and then drank half of the water, leaving half of it for me, I couldn’t help but regard that as half-empty. It has been emptied by half. But suppose someone takes an empty glass and carefully pours water to exactly the halfway mark. They have taken nothing and replaced it with something. Half-full.

But, God, if I just happen upon this glass without knowing who poured the water and when, isn’t it awfully presumptuous of me to begin projecting my views and my needs onto it? I mean, this is just a glass, right? Where do I get off pontificating on its fullness or emptiness, as if I’m the center of the universe? Other people have needs, too, and it isn’t my prerogative to go labeling a glass from just my own narrow viewpoint.

I really shouldn’t be so selfish.

Sunday, November 4, 2012

Silent achievements


In a book about cycling I read recently, the author tries to convince readers to shun the world of spandex-wearing racing cyclists and instead ride for the sheer joy of it. He advises cyclists who feel themselves pressured into participating in over-serious cycle events to ask themselves this question beforehand: Would I do this if I couldn’t tell anyone about it? If I couldn’t wear the t-shirt afterward—if I couldn’t tell friends, family, and coworkers about my weekend-warrior sweatathon—would I still choose to do it? His point is that most of us would say no—we’d rather join a buddy for a casual ride out to the swimming hole.

It’s a great question that cuts down the pretensions of the athletic “image.” I think you can take it further, though, and apply the question to life as a whole: What would I choose to do with myself if I could never talk about it?

Suppose you lived in a world where no one was ever going to ask, “What do you do?” You could write, sell, or build whatever you wanted, and have your name attached to it, but you could never bring up the topic of your vocation to anyone. People would have to make the connection of your name and your creation on their own. You could say, “I’m Art Fry”—and let them realize who you are—but you couldn’t say, “I’m Art Fry, inventor of Post-it Notes.” 

In that situation, what would you spend your time doing? A handful of people could answer, “Exactly what I’m doing now.” But most couldn’t. Our sense of what we find intrinsically rewarding is usually clouded by what we find—or think we will find—socially rewarding. Prestige and recognition misdirect us into doing things we dislike.

If people could no longer boast of their achievements, we’d probably see thousands if not millions of people in high-status jobs suddenly quit. Law firms and investment banks would empty pretty quickly. Enrollments at Harvard and other prestigious institutions would drop considerably. Politics might disappear entirely. How would a politician sustain his morale when robbed of the opportunity for self-promotion? 

That’s not to say all lawyers and bankers are self-deceiving hypocrites working at things they hate in the hope of being admired, nor that they are the only ones doing so. Surely there are some lawyers who love what they do, and just as surely there are aspiring novelists who don’t really enjoy creating literature, but who just want to “be novelists.”

It might be impossible to fully separate our genuine enjoyments from our desire for social gratification. For all we talk about doing what makes us—individually—happy, we assert just as often that human beings are “social animals.” We don’t live in solitude, and everyone needs at least some kind of recognition. Even work that seems more purely creative—writing, architecture, fine cooking—is still done for an audience.

It’s probably fatuous, then, to ask “What would you do for no recognition? What would you do just for yourself?” If we had no one but ourselves to live for, we’d probably all commit suicide. So the better question is probably, “What would you do for no recognition other than that which the work itself generated?” That line might be a bit clumsier to include in a motivational speech, but it’s probably the best way to get to the truth.