
Saturday, December 31, 2011

Thoughtfully inked up

I feel I may be one of the last people of my generation who doesn’t like tattoos (or wild piercings). I got one of each when I was 18, but I started disliking both after only three years. The piercing came out and closed up, and thankfully the tattoo is small and in a place people rarely see.


I have two reasons for disliking tattoos—one aesthetic and the other intellectual. I can explain the aesthetic reason pretty simply: I don’t like the way tattoos look. I think the human body looks pretty nice the way it is, and tattoos just muck it up. Small body modifications like the standard earlobe piercing don’t bother me, because they just frame the face, which the eye is drawn to anyway. But a tattoo on the arm, leg, chest, or wherever else draws the eye in the same way a scar does. It just makes for inelegant design, like slapping stickers on a Ferrari.

My intellectual reason for disliking tattoos is more complex, probably because people come up with complex ways of intellectualizing their tattoos. Once in a while you might find someone who admits their tattoos are purely for aesthetics, saying something like “They just look cool.” I would disagree, but the discussion would end there. 

Hardly anyone says this, though. Most people say their tattoos represent something. I’ve heard girls say, “I got a bird because it represents freedom to me,” or “I got a butterfly because it symbolizes change.” Not everyone has explanations this trite, but the explanations still follow the same template: “I got x because it means y.”

When I was 18 I would have agreed with this thinking. Sure, if something means something to you, why not get it stamped on your body? Now that thinking seems bafflingly odd. If some object or image carries meaning for you, wouldn’t it be far more powerful to create something based around that object or image? If birds represent freedom to you, why not paint one, sculpt one, photograph one, or write a poem about one? Having done one of those things, you probably will have expressed your feelings about birds more clearly, and you will have done it in a way that can be shared with more people.

Many would protest that a tattoo is art. That may be true, but it’s the art of the tattoo artist, not the person wearing it. Few people have any hand in the design of their tattoos, and even those who do are still leaving the final creation of it to someone else. If a tattoo is art, it’s art commissioned from a stranger. Surely the image has lost some of its meaning by the wearer being so far removed from it.

But there’s nothing wrong with decoration, right? A painting by someone else can be meaningful enough to you that you choose to hang it in your living room. But there’s something different going on with tattoos. The painting in the living room is pretty easy to look away from. Tattoos—even artistic ones, even small ones—are conspicuous, mainly because of our innate visual interest in other human beings. Having an artistic tattoo is sort of like walking around with the Mona Lisa screen-printed on a t-shirt. It looks like an advertisement for one’s taste in art.

I think that’s basically what tattoos are—advertisements. They’re more about vanity and group acceptance than they are about self-expression. The hope is that someone will see it and think the wearer cool, smart, sophisticated, deep, passionate, etc. Even those who get tattoos in very private places are still hoping some special boy or girl will see it one day and think those things.

Creative works (including blogs, heh heh) definitely have an element of vanity to them as well, but there is far more effort involved in making them than there is in getting a tattoo. Also, significantly, a blog or a painting typically isn’t thrust in the faces of friends, lovers, and random people on the street. If I wore a t-shirt that said “thepresenthistory.blogspot.com,” people would think me a pompous ass, full of pretentious notions of myself as “a writer.” But if I got a quill and inkpot tattooed on the inside of my forearm, then I would be an intellectual.
  

Friday, December 30, 2011

Demographically uncooperative

As I grabbed some groceries at my local food co-op this evening, I was struck by something printed on the side of the paper bag I was carrying. It was a quote from Peter Shumlin, who evidently is the Governor of Vermont and also a member of a food co-op in that state. Here’s what he said:
“The great thing about the co-op is that you see people from all walks of life coming together. It’s really a community center. It’s a community resource.”


It’s a nice sentiment that’s probably shared by many co-op shoppers and administrators, but I doubt there’s much truth in it. My guess is that the average food co-op draws from a significantly narrower demographic than the average supermarket.

Here’s a profile of the average co-op shopper: middle-class white with some college education. It only took seven words. There are exceptions to the profile, sure, but step into any co-op in the country and you probably have a pretty good idea of who you’re going to see. You might see a private divorce attorney standing in line behind a suburban hippie, but the real outsiders in our culture (the natives on the reservation, for instance) would see in these two people more similarities than differences. 

One thing that’s certain is that co-ops do not bring in people from all (or even most) walks of life. This isn’t because of any discriminatory mandate. It’s just inherent in their business. Co-ops sell healthy, sustainable foods (or at least foods that are marketed as such), and they slap a higher price tag on them. Co-op shoppers pay the higher price because the foods appeal to one of three senses: their sense of taste, their sense of health, or their sense of piety. In many shoppers it’s probably a mixture of all three.

So a co-op shopper is someone who (a) believes tasty/healthy/sustainable food to be important and (b) has the money to pay for it. These criteria eliminate a huge proportion of the population. Only a handful of people care about preparing healthy and delicious foods from scratch. Supermarkets offer low prices. Everyone likes those. 

Stroll through a large supermarket and you’re likely to find a much more diverse collection of people than at a co-op. You might see the attorney and the hippie again, but you’ll also see more students, more minorities, more families, and more old people. You might also see people from the margins who’ve just scrounged up enough money to grab a beer. I don’t think I’ve ever seen that at the co-op.

Try to explain to a supermarket shopper the appeal of a co-op, and you’ll probably draw a blank stare. They’ve already found what they want. The supermarket meets their needs and wants (ill-considered though the latter may sometimes be), and in that sense it’s as much a community resource as any other place. 

I read something recently in my co-op newsletter about the Board of Directors being very keen on finding ways to “reach out to minorities.” I found the statement embarrassing in its condescension (vestiges of college diversity programs) and laughable in its business sense. If, statistically speaking, minorities have below-average incomes, then the best way to “reach out” to them is to give them what they want: inexpensive goods. Of course, doing that would require a co-op to change its business model to be more like that of—well, a supermarket.

Co-ops exploit a niche of educated, health-conscious, halfway-affluent people. They shouldn’t feel the need to lie about it or make guilty announcements of their efforts to change it. Co-ops serve their customers admirably, and most of those who don’t shop at the co-op are unlikely ever to be convinced. For a certain part of the community, nutritious and environmentally friendly food has a strong appeal. But for a much larger part of the community, money appeals more. 

Thursday, December 29, 2011

Unlocking history

History ranks right up there with economics on the list of subjects most people are happiest to have permanently dispensed with upon graduating college. History classes seem like nothing more than a dismal cataloguing. I sure felt that way in school. Santayana’s maxim was a bad joke. What historical mistakes were there to learn from? All we had were names and dates.

It’s a shame so many people believe history ends with what gets taught in school. Students aren’t exactly wrong for thinking history boring. School history is boring. I doubt there’s any way schools can make it interesting.

I became enthusiastic about history accidentally. My dad knew I liked Russian novels, so he suggested I read Robert Massie’s biography of Peter the Great. When I finally picked it up, it fascinated me and led to all sorts of questions. The books I picked up to answer those questions led to other questions. And so forth.

Now the school subject that bored me more than any other is all I want to read about. I’ve been at it for a while, and there’s no end in sight. My reading list is long, and I can’t imagine ever being without new questions that I need answered.

So how is it that my interests should change so dramatically in so little time? If history went from a pain to a passion in only five months after graduation, that suggests the fault lay with the school rather than the student.

The problem is that schools take history and try to impose too much structure on it. You sign up for a class that’s going to cover European history from 1750 to 1900. They cut the subject into discrete (and arbitrary) units, then give them to you piece by piece. I doubt this is the best way to learn anything, and I’m certain it’s not the best way to learn history.

Proper history isn’t a cataloguing of events, but rather a pulling of threads. Interesting details are tugged at until some relevant truth is discovered. To do this, you need time and freedom—two things school rarely allows you. 

You need time to find interesting details and the freedom to follow them wherever you see them leading. The broad scope of events is usually too abstract to be of much interest. In school you learn that Peter the Great founded St. Petersburg and began modernizing Russia. Big deal. But did you know that after the Streltsy revolt he personally oversaw the torture and execution of thousands of men formerly loyal to the tsar and even got up in their faces screaming “Confess!”?

Suddenly 17th-century Russia seems like a place worth reading about. The details bring it to life. But schools can’t allow the time for learning details about one ruler in one country. In the interest of being balanced and unbiased, they try to present everything. The result is that they teach almost nothing. Piquant details are what excite us and stick with us, and we get very few of them in classrooms.

The freedom to follow your own interests is an even more important part of learning history. In a history class, the instructor has already decided what will be taught—has decided, in short, what you need to know. But in most cases it’s the curious student who knows best what he needs next.

The Russian Revolution and the Weimar Republic may have followed World War I, but that doesn’t mean you necessarily need to read about those things directly following your study of the War. Maybe what you really need to read is a biography of Jack Fisher. Or maybe what would really strike your fancy is learning about Japan’s post-war acquisitions in the Central Pacific. It may well be that something totally outside the scope of your class is what would best help you personally understand the things you just studied.

Trying to conform historical study to linear structures of geography and chronology is silly and unnatural. No one sits around remembering things in chronological order. I think about the trip I made to DC when I was 13, then I remember my high-school friend’s story about riding the DC subway when he was five, then I remember my first plane ride when I was six. Memory isn’t a line, but a web.

So too is a web the only sensible model for learning history. Say you just finished reading something interesting about the Russian Revolution. Maybe you’re less interested in learning what happens next in Russia than you are in learning about revolutions in general. You now read about the French Revolution, the American Revolution, the Revolutions of 1848, the Indian independence movement. By not following a carefully prescribed chronological path, you’re certainly skipping over things. But by following your genuine enthusiasm, you’re going to retain more of what you read and be the wiser for it.

I can’t imagine any school that would allow such truly independent study. In fact, with study that independent, the need for the school would cease to exist. A library would be better. If instead of saying to kids, “Here’s what happened in 1776,” we said to them, “Find one history or biography that sounds interesting and then follow up on the questions it raises,” we’d probably end up with far more historically literate people. And that literacy would give us more perspective as a society. 

People learn the most when they teach themselves. I think all we need to do to get kids interested in history is help them find one historical work (from any time and any place) that truly excites them. If they have any intellectual curiosity, they’re bound to seek other good books to put the first one in context. Eventually the past will fall into place. The route they take may be circuitous, but the web they build should in the end be tight enough to catch the important stuff.


Wednesday, December 28, 2011

Sinfully uncreative

Now that I’ve had this stupid little blog going for a whole month and written something for it more days than I haven’t, I wonder why I didn’t start it earlier. It’s fun, and it’s easy. All I have to do is write down the things I talk to myself about all day anyway. And as anyone who writes already knows, the process itself produces more ideas. Every one of my posts so far has come out longer than I initially imagined it would be. Once I begin saying things, I find I have more to say. 

In the past thirty days I’ve done more creating than I had done in the previous three years. That’s not so much a statement of the achievement of the past month as it is a statement of the tragedy of the time preceding it. 

Most of us spend hardly any time being independently creative. The things we create for school typically consist of phony ideas crammed into structures determined by others. The things we create for work (if anything) typically reflect no ideas at all and adhere to structures even more rigid and imposed. And in both cases, it’s someone else (the prof or the boss) initiating the project.

I now feel mildly ashamed for having spent so much time creating nothing of lasting and unique value. Sure, I’ve spent plenty of time creating valuable things like good meals or my physical fitness (exercise is a type of creativity), but these things are transitory and difficult to share with many people. There are few tangible items from my first 25 years that I can point to as my own.

For a long time I thought about doing more writing, starting a blog, doing something other than just going to work every day and fulfilling the few personal responsibilities necessary to maintain my self-respect. But I always made some kind of excuse. I would say I had no time (untrue), that I had no ideas (How could I know that until I tried playing with them?), or that I would have a tiny or nonexistent audience (Maybe so at first, but who cares?).

Now I think there’s no excuse for any self-proclaimed intellectual not to have a blog. You’re a thoughtful and articulate person, right? Then why aren’t you writing and sharing it?

The internet would have been a dream come true for the struggling writers of previous centuries. It’s an instantaneous and free vanity press. At the click of a button, your ideas are available to an audience of—potentially—the entire world. It’s a way to practice your craft at almost no risk. Anyone with a genuine love of writing would take advantage of that in a heartbeat.

Lest I sound grandiose, I should point out there are only about four people who currently read this blog of mine. I lack either the confidence or the shamelessness (sometimes the same thing) to promote myself very heavily, so my audience remains small. But still, the opportunity for me to reach more readers is available should I choose to take it. 

In university English programs, you hear a lot of talk about being a writer. Kids say they want to write novels or that they want to enroll in an MFA program and “learn to be a writer.” I was always skeptical of these kids, and now even more so. The first question I would ask any undergrad who says he wants to write is “Do you have a blog that you post on regularly and at length?” If not, why not? Blogging is writing. If you’re not willing to write about difficult questions you haven’t totally answered, for no pay, for no real recognition, and on a consistent basis, writing probably isn’t your thing.

Most of the college kids with authorial ambitions just like the idea of proudly holding up a well-received hardcover book with their name on it. But to succeed at it, I think you have to like the mundane and unglamorous part of just putting words on a page. I’m not saying I will succeed at it. I’m only saying that great novelists and essayists have probably all started small, written what came to them naturally, and taken whatever audiences and publishing opportunities they could get. 

Seizing opportunities and practicing are usually the only ways to get good at something.  Excluding a few extreme outliers, no one begins any career as a titan. Creating daily is how they get there.

Tuesday, December 27, 2011

Images of forgotten selves

My mother gave my family a very thoughtful and loving Christmas gift this year. She took our old home movies on Betamax tapes and had them transferred to DVD. It ain’t cheap. She and my dad started recording these movies on Christmas Eve 1986 (the night they were given the camera), only five weeks into my life. Most of these old movies probably hadn’t been watched by anyone since we got rid of our Beta player over 10 years ago.

We spent some time as a family watching these DVDs over the past few days. They were amusing but sort of baffling. My brother is barely talking, and I’m incapable of holding up my own head. At one point while watching this, my dad sort of shook his head in wonder and said to my brother and me, “It must be amazing to see this kind of footage and know that it’s you.”

The old man was born in 1946, so of course he’s never had that experience. The only records of his infancy are some fairly grainy black-and-white prints. I understood what he was getting at, but in truth I was ambivalent about the amazingness of seeing the infant Dan being bathed, being fed, etc. It was amazing to see my parents 25 years younger and to see what they had to do to raise their children, but I never had any strong feeling that I was looking at me.

The thing about infants is that they’re essentially the same. One may be fussier than another and one may be smilier, but they haven’t developed clearly identifiable personalities. They have no mannerisms, their faces are less distinct than adults’, they communicate only the most basic needs through the most basic means. Imagine I had never seen a baby photo or baby video of myself. If you showed me at age 25 a video of me at age five weeks, would I be able—independently—to identify the baby as myself? Of course not. 

I feel no connection at all to that helpless little larva with squinty eyes. Why would I? Some hyper-geniuses claim to remember parts of infancy, but I sure don’t. I doubt that I would be able to identify myself at age one, either—presuming, again, I had no prior knowledge of my childhood appearance nor any prompting from family. And I might not be able to do it even at age two—or later. When you think about it, it’s really hard to say. If I sat down and watched home movies of myself from birth to age five—and we’ll just imagine they’re some kind of clinical home movies in which my family members don’t appear and my name isn’t said—I would probably come across something that I remembered independent of the movie. At that point I would be able to connect the image with the past experience and realize who I was looking at.

Until that point, though, the kid in the pictures would be a stranger. How could it be otherwise? Do you remember what your hair looked like when you were three or what your voice sounded like when you were four? You might tell yourself you do, but that’s only because you’ve been regularly looking at photos and watching videos of yourself for years. 

I have few distinct memories from the first five years of my life. I think this is probably true for most people. I have many strong impressions—remnants of emotions—from that time, but only a handful of memories of specific incidents. That makes it pretty tough to match up a photo or a video of early childhood with a genuine memory. Very rarely am I able to look at an old photo and say, “Yes, I remember this exact moment.” I can say things like, “Oh, this is from our trip to the Oregon Coast when I was six,” but again, I can only say this because I’ve been looking at those photos forever. 

People born in the second half of the twentieth century and later remember their lives in a way profoundly different from that of any human beings previously. We are inundated with images of ourselves. Every time I visit my folks, I walk through the hall where we have the portrait of me at three months, the family portrait where I’m one, the snapshot of me at age six holding my pet rat, and many others. I’ve seen all of these literally thousands of times, and I’d guess my experience is typical. The result of seeing these pictures over and over is that we have mental images of our past with no actual memories underneath them. I have a picture of my family’s life in 1988. But do I remember it? No.

The conventional wisdom is that photos evoke memories. I don’t think that’s true. I think people use photos to build memories. The photo of me standing next to Yaquina Head Lighthouse doesn’t evoke a memory. It is the memory. I tell myself I remember it, because I’m holding incontrovertible proof that it took place. Clearly the photo knows more than I do.

I’m not sure it’s necessarily a bad thing that we’ve fabricated memories this way. But it does probably mean we don’t know ourselves quite as well as we think we do. 

Friday, December 23, 2011

Hello to a now-distant friend

Visiting my parents for the holidays means being reintroduced to something I’ve been without for a long time: TV. It’s always a bit of a rude shock—similar, I imagine, to the way it would feel riding a crowded subway after several months living in a quiet lake cabin. Every time I get a fresh dose of TV, it seems louder, brighter, faster, more crass than it did the last time.


Getting off TV happened more or less accidentally for me. I brought an old TV with me to my dorm room when I first moved away in 2005. In the first month of college, I wasn’t watching much of it, because I was busy meeting people and getting to know my new town. Then the old TV up and broke around the end of October, so I wasn’t watching any of it. I couldn’t justify the expense of a new TV then or for the remainder of my time in college. By the time I was in a financial position to do a little spending, I had long since realized how much better my life was without a TV in it.

Now when I happen to be around it, I find it extremely disruptive. I can’t think, I feel agitated, my sense of the progression of the day gets turned on its head. Funny that I should be so disturbed by something that 10 years ago felt as natural as breathing.

The other day I read that for a few years now, the average American home has had more TVs in it than people. We have 2.86 television sets to 2.5 people. My parents contribute to that absurd statistic. Back in the day, we had four TVs (five if you count the old one in a box that I took to college) and four people. I don’t think anyone ever planned to have enough TVs for each of us to sit in blissful solitude while watching our own preferred programming, but it just kind of happened that way over the years. In 2005 and 2006, my folks got rid of two people—but none of the TVs. Now in this house cable boxes outnumber human beings two to one. 

I’m fortunate at least that my parents are not the type who turn the TV on first thing in the morning and leave it blaring until bedtime. They read, work on projects around the house and in the yard, go on walks, and participate in plenty of creative and thoughtful activities. But from late afternoon through evening, you can bet there will be at least one TV going, and sometimes without anyone even really watching it. It’s at that point that I begin thinking about where in the house I might find some refuge.

Even though I seriously dislike the way TV makes me feel, there’s no denying its magnetism. Last night I was washing the dishes while my dad watched football on his monstrous big screen in the room adjoining the kitchen. I don’t care for football in general, and I especially don’t care for watching it on TV, but I still found myself pausing in my work with soapy hands at my side to gawk at the Colts making a comeback in the last two minutes. When it’s right there in front of me, I can’t help getting sucked in.

It’s almost impossible to keep your eyes off a lit-up TV screen. That’s where the movement and the voice of the room are. The screen seems like the place to be. 

I don’t miss it at all when I’m in my own place, but when it’s in my face and instantly available, I have to make conscious efforts to get the hell away from it and do the things that matter to me. If it hadn’t been for the TV noise filling the house last night, I probably would have written something for this blog. As it happened, I felt intellectually distracted all night, even when I retreated to a quiet room. The TV had blasted the ideas right out of me. Only now—after a good night’s sleep and in the morning before the TVs come on—do I have the right flow to express myself.

I can’t really fault people like my parents who retain the habit I kicked so long ago. They live fine lives, and I know it’s hard to see how you might get more out of life by getting rid of something. But I’m grateful that old TV of mine broke so I could learn a bit about living without.

Wednesday, December 21, 2011

Pack your bags, prepare for departure

Tomorrow morning I’m heading across the state to my parents’ house for five nights. The process of packing this evening took a grand total of about 30 minutes. That’s not a newsworthy item in itself, but it’s funny considering what a big deal capital-p Packing was throughout my childhood. Before any trip, my mother would hound my brother and me about packing up, getting packed, making sure we packed everything. I doubt she was atypical among mothers in this regard.

When I was really little, I basically fell into thinking that way, imagining that packing was a critical and time-consuming chore. But as I got older, and especially after I started backpacking frequently in my pre- and early teens, I realized how little time it actually took. All you had to do was know what you needed, not need much, and focus briefly on cramming it into a bag. My brother and I would rarely even think about packing—no matter how long the trip—until about 8:00 the night before we left. We’d each knock it out in about an hour, and I can’t recall any time our procrastination or haste created any serious problems.

Some people never really learn this. They imagine travel, especially foreign travel, requires complex preparations and checklists. It’s sad, because it frequently limits people in where they go and what they do. I know some older people—my parents spring immediately to mind—who say they really want to travel abroad, but who constantly shy away from it because it seems like such a project just to put the stuff together. It’s easier to simply defer it until later.

The truth is you need almost nothing to travel. After clothes and a handful of hygiene items, what is there? Unless you’re going to the Amazon, the Himalaya, or the Sahara, less planning is probably better. Just grab what you would for a weekend trip to a friend’s house two hours away, then multiply that by about three.

You’re never going to do much interesting travel if you’re not low-maintenance. Sometimes the must-pack-everything people actually end up making trips happen, but that can be sadder than when they talk themselves into staying at home. In backpacker ghettos the world over, you often see people carrying not only an enormous overnight backpack stuffed to bursting, but also on their chests an equally stuffed daypack. I call these people sandwich-packers, and I have to think theirs is one of the most miserable plights on Earth.

Hanging out in the living room of a hostel in Sydney, I listened to two Canadian girls chat with some other travelers about how they wanted to spend a few nights in Bondi before leaving the Sydney area. I had seen these girls arrive, and they had brought mountains of stuff. Getting out to Bondi meant either taking a cab (unthinkable with Sydney’s prices) or taking public transportation, which would involve a couple train transfers and a fair amount of walking. “It would be nice to go out there,” one of the girls said, “but, I mean, we can’t take all our shit out there for just a few nights.”

What a tragedy, I thought. They had come to the other side of the planet, and they were going to spend their months-long trip only going to places that had the utmost in convenient travel infrastructure. And they weren’t just being wimps. I wouldn’t have wanted to haul all their shit out to Bondi either. I don’t know how their travels ended up, but I do know that only about a week later I hopped on a plane to Kuala Lumpur and spent four months getting around on motorbike taxis and tin-can provincial buses where you store your luggage between your knees and schoolchildren ride on the roof. I felt justified in having only my tiny little 10 kg bag.

If you travel minimally, you’re probably not going to look or smell very glamorous at any point in your trip. But there’s plenty of time for that when you get back home. You defeat the purpose of travel if you don’t have the freedom to move. Packing carelessly is one of the best things you can do for yourself.


Tuesday, December 20, 2011

Talk your way to the top

I fear I’m becoming a bad listener at work. The things my colleagues say are increasingly going in one ear and out the other while my eyes defocus. I don’t think it’s my fault, though, and I don’t think it’s your fault if the same thing is happening to you. The problem is that most of the people we work with are bad talkers.


If there’s one thing I’ve learned so far in my year and a half in the corporate-ish world, it’s that talk is king. Talk is a substitute for results in a job where the results are nearly identical no matter who’s standing in front of the computer. Talk is what distinguishes a professional from a mere worker. And talk is a way of inflating the mundane reality of jobs we’d rather not do.

Most everyone at work is constantly looking for an opportunity to interject. Sit back and observe the next meeting you get called into. It will likely be a barely controlled riot of interruptions, tangents, and self-repetitions. You will leave wondering what was decided and what was accomplished. The answer is nothing, and that’s because meetings are not places to decide or accomplish things. They are places to talk.

In my organization (and in plenty of others, I imagine), most of the work we do is individual. We work and work by ourselves, and then at specific intervals we briefly connect our work with that of others. This isolation, coupled with the generally procedural nature of the tasks, makes it nearly impossible to discern adequate work from outstanding work. In some cases, sadly, the constraints of one’s organizational role might prevent outstanding work from even happening. I can hardly tell if the people in my own department are great or just good enough, and I certainly can’t tell about people on the other side of the building.

Most people realize this on some level, even if only unconscious. Those who are insecure compensate by talking. Thus talk becomes the professional currency. I have no way of knowing if Ursula in Logistics is worth her wages, Ursula knows I have no way of knowing, Ursula wants me to know, so the next time Ursula and I get thrown in a meeting together, she talks until she’s blue in the face. Evidence that others are buying it is how much they talk in turn. A thriving trade develops.

Most of the talkers mean well enough. They’ve found themselves in jobs that mean little to them and that offer few intrinsic rewards, but in which they’re stuck for one reason or another. Talking emphatically about their work to a basically captive audience provides badly needed validation. The sad part is that all the talk only contributes to the wall of pretense surrounding any professional-type job. And that pretense makes it more difficult to actually change one’s position within the organization or to have the self-honesty to leave.

And of course the frustrating part is that I, along with the handful of other quiet people in the place (who have my utmost respect), have to sit and listen to all the talk. My own solution to occupational dissatisfaction—withdrawing in silence while plotting ways to save money and quit—isn’t anything to be terribly proud of. But at least my withdrawal doesn’t go in anyone else’s ears.


Sunday, December 18, 2011

Paper words, digital words

A little while ago, I had an argument with a coworker about the Kindle and other e-readers, and whether they are a boon or a curse for the reading community and for society. I generally try to avoid arguments with people at work for the good reason that, according to an adage I read somewhere, “Talk sense to a fool and he calls you foolish.” That was certainly the case here. Somehow I got sucked into it.


I came out firmly in favor of the Kindle. My much older coworker (whom I’ll call “Babs” for convenience) came out firmly against. I’ll say right now, though, that I don’t own a Kindle, and I have a genuine affection for the printed volume. But I recognize my affection is purely sentimental and illogical. I like the weight of books in my hands, I like looking at them on my shelf, I like the way they smell. Also, a lot of the books I read are long, oldish, not-widely-read history books that may have no Kindle version available or only a low-quality one.

My love for antiquated paper and ink notwithstanding, I think the Kindle is a fantastic innovation. Someday I may come around to buying one. Babs could only shake her head. Her first argument against them wasn’t really an argument at all, just a repeated statement that something about reading on an electronic tablet “isn’t the same” as reading on paper. Since there was little logic here to rebut, the only thing I could say was that the important part is literacy—not the medium by which we exercise it.

Her other point was more interesting to argue against. “Think about all the people who make books,” she said. “All the people working the printing presses and running bookstores. I just think about all those people who are going to lose their jobs.” It’s a nice sentiment, in a way—and expected from someone who is part of a generation for which “commodity” means a physical object—but it’s a totally backward way of looking at economics.

Babs is right that plenty of people in print publishing are going to lose their jobs. It’s already been happening for years. This is undoubtedly frustrating and even tragic for the individuals and families involved, but it’s only a fraction of the story. Think of all the people involved in building the software to make the Kindle work. Think of all the people involved in building the slick hardware with the screen that—damn it—really does look like a printed page. Think of the workers assembling the parts and shipping them. Think of the people making Kindle versions of old books, maintaining the purchasing and download systems, handling customer service. These are jobs that didn’t even exist a few years ago.

And it doesn’t end there. E-readers are opening up fantastic opportunities for amateur writers to self-publish at minimal cost. One of the barriers to launching a writing career—selling yourself to a monolithic, institutional publishing house—is coming down. This is also probably opening up opportunities for small-time freelance editors to begin finding more clients. These, too, are groups of workers benefiting from innovation.

But it doesn’t end there, either. E-readers benefit consumers. I enjoy the time I spend poking around libraries and bookstores looking for print volumes. But I understand there are plenty who don’t. The Kindle ends up saving its users a significant amount of time. A cynic would say these people are probably going to waste that saved time anyway—and he may be right—but he couldn’t say the opportunity isn’t there. Maybe a Kindle reader will find that he finally has the time to reconcile general relativity and quantum mechanics. Or to be less grandiose, maybe he’ll just find he has more time to do his reading and also play with his kids. That’s a good thing, right?

Ah, but there are still those poor typesetters and ink-rollers standing in the unemployment line. What about them? Well, they failed to abandon a sinking ship. Rather than sticking to their established jobs with the blinders on, they should have learned about programming for the Kindle. I don’t mean to belittle them, because I can’t say I won’t someday be caught behind the curve of some great new innovation that makes my job obsolete. But if I do, I won’t expect anyone’s pity or philanthropy.

It’s true that innovation frequently puts certain groups out of work. But it also puts so many more groups to work. It may destroy a few jobs here and there, but it never destroys wealth. It always creates it. And that’s the main thing.

Friday, December 16, 2011

Observe your fuck-ups

I made a wonderful, boring, accidental discovery at work today. In a web browser on Windows, pressing Alt+D selects all the text in the address bar. I did a little jig to celebrate.


Like I said, it’s totally boring. It’s relatively trivial, too, and I’m sure it’s not news to some people. But the way I discovered it illustrates a lesson worth keeping in mind. 

I was trying to press Alt+S to navigate to a particular tab in a database I spend a lot of my time working in. My finger slipped and I pressed Alt+D. It took me only a split second to see I wasn’t getting to the tab I wanted, so I more-or-less unconsciously corrected the mistake in my second attempt. After I had got where I wanted, though, I realized that during that split second when I was correcting, the address bar had turned blue. After poking around for a few seconds, I figured out it happened from pressing Alt+D. And I was happy, because I like to use quick keys as much as possible, and I had previously wondered whether there was a quick key for selecting all in the address bar. Hooray!

Obviously, this new trick is going to save me only a negligible amount of time in my web browsing. But this is far from the first time something like this has happened. I’m routinely showing my colleagues faster, more efficient ways to manipulate the systems we use—systems they’ve been using for a decade and I’ve been using for 18 months. They ask me how I find all these clever shortcuts. The answer is that I pay attention to my mistakes.

Everyone always says to learn from your mistakes. This advice is fairly easy to follow when our mistakes come from conscious choices and take place in a measurable amount of time. If you make a bad investment and lose thousands of dollars, that’s a mistake you can study. But a lot of our mistakes happen unconsciously and fast—slipping and pressing the wrong buttons, for example. We don’t study these mistakes because we usually don’t know what we did wrong, and we do know how to immediately remedy the mistake.

You mean to press a key combination that will print your document, but you accidentally press two keys that close the program. Fuck! You go and reopen your document and try again. But wait a second. What keys did you hit that closed the program? It was frustrating in the moment, but maybe it’s something you could use later.

This is how I’ve discovered most of the slick navigation that has eluded my more experienced colleagues. In my haste, I routinely mess up and do things I don’t mean to.  Nine times out of ten, the mistaken click or button combination does nothing at all, or at least nothing I didn’t already know about. But once in a while it will do something unexpected, and I will retrace my steps to see what I did and how I might employ it in the future.

My colleagues don’t do this. Their routine mistakes are just another of the many frustrations facing them each day. They would benefit from thinking forward to circumstances in which the “mistake” might be useful. On an individual level, none of the shortcuts I’ve found amount to a significant increase in productivity. But they add up to eventually make a difference.

More than anything, I think it’s just the right attitude to have. We all have so many routines in our lives. There must be times during those routines when we make mistakes that could actually improve other parts of our lives, but which we take no time to examine. Who knows what we might be missing? 

Thursday, December 15, 2011

Uniqueness in historical violence

I’ve been reading and thinking a lot lately about the World Wars. When I say “lately,” I mean for the past year. I’ve read almost nothing else. Until you’ve spent some serious time studying them, you have no idea of their scale. They were bloody and destructive almost beyond imagining. The cursory surveys you get in high school and college tell you nothing.


One of the questions that continually comes to my mind is whether the carnage took place on such an appalling scale because of a particular rabidity of the belligerents or simply because of improvements in technology. Put another way, did millions (as opposed to hundreds of thousands in previous major wars) die because of mindsets unique to that period in time, or did millions die only because the artillery was better and airplanes could rain death from the sky?

It’s a difficult question to puzzle out. There’s no doubt the fighting was fierce. Over 700,000 became casualties at Verdun as France and Imperial Germany threw everything they had into the maelstrom. Over 1.2 million became casualties at Stalingrad 26 years later as Soviet Russia and Nazi Germany did the same.

The proximate cause of all the death and dismemberment was the devastatingly effective weaponry produced by 20th-century industry. At Verdun, the soldiers defended themselves with machine guns and hurled two-thousand-pound high-explosive shells at enemy positions. At Stalingrad, they had tanks, dive-bombers, self-propelled artillery, and truck-mounted rocket launchers. For the first time in history, soldiers had the means to level entire cities in short order.

But of course the weapons didn’t discharge themselves. They were fired by living, breathing men at the behest of the military leaders and politicians who gave the wars their shape. Were these leaders unique in how they saw the world and how they saw their mission in war? Were the circumstances unique? If 20th-century weapons had been available 100 years earlier during the Napoleonic Wars, would those have become as bloody? Could a conflict as bloody happen again?

In many ways, the causes of the wars and the attitudes of those fighting them now seem impossibly remote. The First War was basically a clash of empires. Germany had stolen Alsace from France, was posing a threat to Britain’s naval supremacy, and was joining Austria-Hungary in challenging the backward empire of Russia. Imperialism still lives in 2011, but the notions of national pride, honor, and glory that carried the great empires to war have basically died. At least, no one now articulates these notions as credulously as they did then.

The Second War followed as a consequence of the First’s ravages. The human and material losses, as well as the shame of defeat, gave birth to the peculiar ideologies of Fascism, Nazism, and Stalinism. Observing these movements at the distance of 70 years—seeing their pageantry, their fanatical devotion, their constant propagandizing and stage management—they seem like historical anomalies. They were meteoric in both their rise and their fall. How could there ever be another man quite like Hitler, and how could he flourish in any circumstances other than those of a Germany ruined by war and humiliated by Versailles?

This would be the comforting conclusion to come to. It would be comforting to believe the unprecedented violence of these conflicts resulted from the storms of a special and never-to-be-repeated historical moment. But at heart, the architects of both wars were driven primarily by two things: desire and hate. These qualities certainly still exist. Circumstances can change at the drop of a hat, and there’s no telling who might be in power tomorrow.

The great war leaders of the 20th century fought with the means available to achieve their ends. Those means happened to be hellishly destructive. There’s little to suggest that great war-makers of the past fought with anything less than the full means available during their time, either. Nor is there much to suggest that current leaders would restrain themselves from using their every available means. The time since the World Wars has of course not been enough for the baser qualities that drove those conflicts to have been bred out of us. And if the destruction of those wars came from a logical progression from bows and arrows to rifles to aerial bombardment, it spells an uncertain future.

Wednesday, December 14, 2011

The aesthetically minimal man

I live in an apartment with nothing on the walls. It’s been that way since I moved in a year and a half ago. I have six beloved houseplants that add a bit of color, but otherwise the place is naked. I like it this way. If someone were to give me, say, a framed Mark Rothko print, I would hang it and probably enjoy it (he had pretty subdued tastes as well). But I don’t want one enough to go buy it myself. Nor would I care to buy anything else for decoration.


I used to be the opposite. In my high school and early college days, I liked to live in hyper-decorated rooms with Christmas lights hanging from the ceiling and posters covering nearly every inch of wall space. I felt a room like that was more exciting. I also felt it was a great way to broadcast my interests to anyone who might visit. My taste in movies and music, on garish display, would show my guests how fascinating I was. That was my hope, anyway.

Sometime since then I went and got boring. I may have felt all those old posters were a bit immature for a guy older than 21. I may have felt they just didn’t represent my tastes anymore. And I may have just decided that, after all the moving around in college, it was too much hassle to bother tacking anything onto the wall. I can’t exactly remember now, so I’m guessing it was a combination of all three.

The upshot is that I now live in a place that my 18-year-old self would probably be appalled to see. It would look so suspiciously adult to him. But what the idiot high school kid wouldn’t see is how superior the visually subdued environment is as a place to actually live. Those posters only amounted to so much mental clutter. I don’t need to see Robert De Niro’s face on a Goodfellas poster every time I sit on the couch. I don’t need to be reminded that I like Pink Floyd every time I walk in the door. I already know it.

We only have so much time and energy to think and feel. Routinely subjecting ourselves to visual (or auditory) stimulation that doesn’t stimulate our intellect or emotions seems like a losing proposition. The time it took my brain to process my Nirvana poster each day was minuscule, of course, but it was still something, and it all adds up. 

The busy-ness of my old rooms would now exhaust me. I would feel distracted and restless. A handful of beautiful paintings wouldn’t be so bad to have. But so many people these days are in the habit of filling their homes to the rafters with visual bric-a-brac that I can’t help thinking there’s a lot of good thinking time being crowded out. 

It’s the opportunities for thinking that make an aesthetically minimal living space trump the alternative. My mind works better surrounded by bare walls. I also feel it might be a better environment for conversation. When I have someone over now, we’re just two people sitting among some greenery having a chat. De Niro’s mean mug isn’t there to distract us. It’s a quiet pleasure most people wouldn’t recognize right away, but it’s well worth trying.

Tuesday, December 13, 2011

Remember not to forget to unfuck yourself

 I got a little gem from the staff of my apartment building today. The Winter Newsletter scotch-taped to my door this afternoon reminded me where to park and where not to park, how to dispose of my Christmas tree, and why I shouldn’t let grease and food particles run down the drain. They also had me “note that we charge $30 for after hour lockouts.” Added to that was this helpful advice: “Please leave a spare key with a trusted friend or neighbor if you are likely to lock yourself out.”

I had to wonder what kind of paradoxically responsible airhead would be both likely to lock himself out and aware that he was likely to do so. It seems that the people most likely to lock themselves out are the most likely precisely because they don’t consider themselves to be so. 

Here’s how we might rephrase the apartment staff’s statement: “If you are predictably and reliably stupid, please consider taking steps to mitigate the ill effects of your stupidity.” That’s basically what they’re getting at. But we could go further:

“Hey, stupid. Yeah, I’m talking to you. Remember all the times in the past when you ruined everything and imposed on the people around you? Like the time you locked your keys in the car with the engine still running? Or the time you tried to microwave your soup while it was still in the can—do you remember that, doofus? Or the time you poured motor oil in the wiper fluid reservoir and antifreeze in the oil reservoir? That was a doozy. I don’t point these things out to shame you. I only point them out to show that you have established a historical pattern of thinking with your asshole and blowing it big-time. You fuck up frequently. You fuck up severely.

“So here’s the short version of it, fucknuts: the probability of your locking yourself out of this apartment late on a Sunday night is approximately 73%. That ain’t so good, brainiac. For all I know, you’ve already done it once before. And if you do it, you’ll make the maintenance man unhappy, and he will make you pay him $30, which will make you unhappy.

“That’s not to say there isn’t a solution. If you leave a key with a neighbor, you can lock yourself out to your heart’s content without consequence. No one is asking you to stop being stupid or to stop making stupid mistakes. We love you the way you are. But if you just make a few simple arrangements beforehand, you can make your stupid mistakes less of an inconvenience. Thank you.”

Monday, December 12, 2011

High-resolution holidays

Now is about the time you start hearing people talk about the resolutions they’re going to make once January 1 rolls around. There’s an endless variety. You might hear a coworker say she’s going to get back into going for long fitness walks. A friend of a friend says he’s going to start reading more of those books he’s been meaning to get to. Pardon me while I laugh.



No matter how they’re phrased, all New Year’s resolutions are essentially the same thing, which is self-delusion. It boils down to this: if you were genuinely serious about whatever it is you’re resolving to do, you wouldn’t wait for an arbitrary date on the calendar to begin doing it. There’s nothing remotely special about January 1. It’s just another day. It doesn’t even correspond to any celestial event. If you resolved to start eating right on the summer solstice, I would still think you were full of shit, but at least I would have more respect for the thought you put into picking a significant date.

The bigger point, though, is that change doesn’t happen by deferring it to the future. The future is interesting, but it’s only an idea. For anything you want to do, there must be some way to get started today. There’s no time like the present, because the present is the only time we actually live in.

I wouldn’t be so cynical as to say that all resolutions are bogus. People can and do improve themselves. But the postdated resolutions people begin trotting out each December bear none of the qualities of effective resolutions. People start eating healthy and exercising when they see an embarrassing photo of themselves and suddenly realize how far they’ve let themselves go. People stop drinking when they have a falling out with an old friend and suddenly realize how much they’ve lost through alcohol. The commonality among resolutions like this is the inescapable feeling that one has to begin now. A resolution will never stick if there isn’t a sense of urgency to it. 

But the holiday season provides a convenient excuse for our weaknesses. It’s filled with bad food and distracting activities that we can point to as the source of our failings. Once the season is over, and all the bad things are gone, we’ll get down to that serious business of living. Only it doesn’t work like that. You’re going to eat better once the cookies and candies are gone? Well, there are always cookies and candies available. You’re going to start that website or read those books or paint those pictures or whatever once there’s less friends and family stuff going on? Well, there’s always friends and family stuff going on. 

To really change any part of your life, you have to realize that your life is happening at every moment. Time is continuous. It doesn’t start and stop at our defined intervals. Any moment is as ripe as any other to begin doing anything. The holiday season does present particular difficulties (especially for the perennial resolution of eating better), but if your January resolution is going to stick, doesn’t that necessarily mean defying those difficulties next December? Why not rip the band-aid off now and be stronger than it all this year? If you don’t get started now, you probably won’t ever.

Saturday, December 10, 2011

Religions without gods

I stopped believing in God when I was about 13 years old. Like dear old Bert Russell, I just didn’t feel God had provided enough evidence. I had been raised (sort of) to be a believing Christian. For a time I actually did believe, but only in the naive way kids can be said to really believe in anything. They were totally unexamined beliefs, held because I had been told to hold them. Once I began looking at the world through eyes of reason, God disappeared.


I wouldn’t say that I became completely unreligious, though. I spent my teens living in a sort of dissipated manner, but that was mostly because I had so little responsibility for my own life. Food and shelter were taken care of, and the pocket change my folks gave me for cleaning the house and mowing the lawn was enough to buy pot on the weekends. Why bother living my life by any code?

When I got out on my own, the empty pleasures of high school (which had never been that much fun anyway) weren’t enough to sustain me. I spent a few years feeling lost, and when I was 21 I became a strict vegetarian. I’m not sure anymore why I made that choice, but I certainly believed in it at the time. I think what attracted me to it more than anything was the asceticism of it. It demanded a level of self-control that, it seemed, would inevitably sharpen one’s sense of identity and purpose. And anyone who’s ever known or been a vegetarian knows how it becomes a religion. 

After about 18 months of it, though, I began to question whether it was really such a healthy diet. I began eating meat again, but my eating habits overall may have become more religious rather than less. I cut out all processed foods, all grains and starches, and all but the tiniest amount of sugar. I also began working out in more intense and focused ways than I had before. It was the religion of yoga and bean soup shifted laterally to the religion of kettlebells and steak salad. I did and still do make the occasional exception, but so far it appears this religion is going to stick.

If there’s one defining characteristic of true religions, it’s that of practice. I say “true religions,” because much of what passes for religion in America is really just a Sunday-morning lecture series that’s forgotten as quickly as were the lessons of Political Science 101. Real religions require their adherents to do things. Orthodox Jews have kashrut and separate sets of plates (and sometimes even separate dishwashers) for meat and dairy. Balinese Hindus have their morning offering. Muslims everywhere have their numerous abstentions and their five daily prayers. It’s these rituals—practiced day in, day out—that give a religion its character.

I think most people hunger for some sort of religious practice. It reminds us who we are. But when we don't find it—maybe for lack of searching, maybe for the vagaries of fate—we'll generally settle for habit. Junk food, television, wasting money on stuff we don’t need—these things become religions by default. A good percentage of people find rituals that genuinely fire their character and inspire self-improvement. But many don’t. Those who do typically have a hard time articulating why, exactly, their rituals matter to them. It just sort of works, they might say. 

A religious practice of fitness and healthy eating (like the one I’ve stumbled upon) is hardly groundbreaking, and it definitely doesn’t provide all the answers in life. But then, no one ever said that it would. What it does is give me just a little something that makes me different from a lot of other people, and it makes me feel generally positive about my life. This seems like a good goal for religions, and if it were the goal of Christianity, I might have been more inclined to just stick with that one to begin with.