
The Great Sellout

May 2, 2012

“The best minds of my generation are thinking about how to make people click ads.” – Jeff Hammerbacher

A famous American game show once asked: “Who Wants To Be a Millionaire?” By now, I think we can safely say that nearly everyone wants desperately to be a millionaire. This pervasive attitude runs rampant in Silicon Valley. There, young entrepreneurs compete with one another to design the next hot app or massive social network. They leave no stone unturned in their great quest for fame and fortune. Facebook’s $1 billion purchase of photo-sharing app Instagram has only whetted this entrepreneurial appetite for riches. The men (and women) behind these startups tell us that they are helping people connect, and maybe they are, but only if by connecting them they become millionaires.

Answer: Everyone.

We can think of this current situation as a kind of intelligence bubble, ready to burst. Here are the brightest minds of my generation, and they’re more concerned with generating ad revenue and user subscriptions than they are with the urgent concerns of now. I recently met a girl who told me that she’d make a great elementary school teacher, but her real dream was to work in advertising. I asked her why and she weakly explained that advertising would provide her with a great creative outlet. It would give her meaningful work. What work, I asked, could be more meaningful than teaching a child? She couldn’t tell me. I wonder if she’ll one day look back on her life with pride, remembering all those clever ads she wrote – ads that moved products off of shelves, but which stirred not a single heart.

What I find most troubling is the growing superficiality of my generation’s interest in global issues. There’s this pervasive belief that technology (and by extension, capitalism) can solve the world’s problems (just watch a TED talk). Kony 2012 is a great example of this digital era activism (the so-called slacktivism). Some have defended the popularity of the Kony campaign, arguing that something that creates interest in an international issue, however superficial that interest may be, cannot be bad. Yet, the Kony campaign makes no effort to help its supporters understand the circumstances that allow a man like Kony to exist. Regular people – Ugandans, Congolese, Rwandans – may as well not exist. The real problems – colonialism, resource exploitation, rampant corruption – are not worth discussing. The entire conflict is boiled down to a single, solitary symbol of a man (who may or may not already be dead), whose capture or death is meant to make us feel good about ourselves. There’s this need to believe that social media, the great diversion of our age, can actually be a force for good. We use global issues like Kony and climate change to distract ourselves from the guilt we feel.

Of course, there are plenty of extraordinarily talented people out there who are capable of solving the great issues of our time. What I see most people my age doing instead is convincing themselves, day after day, that their job is not merely satisfying, but meaningful. They delude themselves because staring truth in the face is too painful and too frightening. They come up with clever nicknames for themselves like “coding ninja” or “copywriting ninja.” It’s a way for them to imbue banal, meaningless work with artificial, self-created value. Without it, they’d be staring out into the abyss.

Below, I’ve embedded a clip from one of my favorite films, the 1938 George Cukor classic Holiday, with Cary Grant and Katharine Hepburn. In the clip, Grant outlines his plan to get out of business (“while [he’s] still young and feels good all over”), travel, and figure out what he wants to do with his life. Watch as his fiancée and her father try to persuade him to stay in business and make money (“But you don’t understand how exciting business can be” and “There’s no such thrill in the world as making money”). It’s a classic Faustian bargain. Most people today have no idea that they’re involved in such a great sellout. Because it’s not about money; it’s about self-deception. Once we’ve convinced ourselves that our meaningless work has meaning, only then do we lose our souls.

The Letter Writing Project: Response 12

May 1, 2012

You can read all the letters and look at the updated statistics here.

Uzbekistan! Now that’s an excuse!

The Rise of Klout and the Triumph of Celebrity

May 1, 2012

“Klout is one of the worst ideas ever put online.” – Tom Scott, founder

“This is the intersection of self-loathing with brand opportunity.” – Wired Magazine

“You are never penalized for connecting or engaging with someone with a low Klout score.” –

You know that person who has close to 1000 Facebook friends and a seemingly endless supply of wall posts, all of which garner scores of “likes” and comments? This individual probably maintains a very strong influence over his or her friends. If this “influencer” suggests you check out a new bar or restaurant, chances are that you will check it out. Klout, an online app, measures just how influential your acquaintance is on a rating scale of 1-100. The more influential Klout finds your acquaintance, the more likely it is that he or she will receive perks, prizes, and maybe even land that dream job.

As outlandish as this may seem, it looks like the future of social media. If you ever wondered where social media was taking us, look no further than Klout. While some believe that Klout is democratizing influence, giving average users a chance to influence others and reap the benefits, others believe that Klout is destroying our relationships. One website even has an algorithm that measures your level of “asshattery” (try it if you have a Twitter handle). Despite the outcry, Klout offers a vision of our social media future – one in which our connectivity and engagement correlates with the benefits and services that we receive. Those who refuse to engage will be left in the cold.

An example of how Klout works.

What Is Klout?

Klout founder Joe Fernandez envisioned social media as “an unprecedented eruption of opinions and micro-influence, a place where word-of-mouth recommendations—the most valuable kind—could spread farther and faster than ever before.” To this end he invented Klout, which uses a ratings metric to measure the social media influence of a user. This metric takes into account three variables: “true reach,” amplification, and network. “True reach” is the total number of people you influence. Amplification measures just how much you actually influence people. Network measures the influence of the people who are in your network (that is, it measures how far your influence can travel).
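Klout has never published its exact formula, but the three variables above suggest a weighted combination. The sketch below is purely illustrative: the weights, the normalization, and the function name are my own assumptions, not Klout’s actual algorithm.

```python
# Hypothetical sketch of a Klout-style score. The real algorithm is
# proprietary; these weights and names are illustrative assumptions only.

def influence_score(true_reach, amplification, network, max_score=100):
    """Combine the three variables into a 1-100 score.

    true_reach    -- how many people you influence (normalized to 0.0-1.0)
    amplification -- how strongly you actually influence them (0.0-1.0)
    network       -- how influential your own audience is (0.0-1.0)
    """
    # Weight reach most heavily, then amplification, then network.
    raw = 0.5 * true_reach + 0.3 * amplification + 0.2 * network
    return max(1, round(raw * max_score))

print(influence_score(0.9, 0.8, 0.7))  # a highly influential user: 83
print(influence_score(0.1, 0.1, 0.1))  # a typical user: 10
```

However Klout actually weights these inputs, the principle is the same: a huge follower count alone doesn’t make you influential unless people respond to what you post and carry it further.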

Klout uses all of your social media to tabulate its rating of your influence. It draws from your Facebook, Twitter, Google+, FourSquare, and LinkedIn. It leaves no social media stone unturned (OK. Maybe Pinterest). To give you an idea of how the ratings work, pop star Justin Bieber has a perfect score of 100 (could be that he has over 21 million followers on Twitter!), while President Obama (the most powerful man on the planet) has a paltry score of 91. The average user scores in the 20s, and according to Klout, it becomes exponentially more difficult to raise your score once you get into the 50s. Just because you have a few friends that re-tweet you does not mean that you are influential.

We Are All (Unpaid) Marketers Now

A few weeks ago, I wrote that we are now all marketers. In reference to the rise of personal branding, I argued that my generation has been forced to learn marketing skills in order to survive in a highly competitive job market. Now, it looks like marketing ourselves isn’t enough. Instead, we are being asked to market other brands in return for a few perks. It works like this: a brand recognizes that you have a high Klout score (meaning that you have solid social media influence), provides you with good customer service or a sample of its product, and then hopes that you will sing its praises to your social media followers. The benefit may be free stuff, but what they’re asking you to do is market for them…for free! Klout calls its high-scoring users “influencers,” but I just call them unpaid marketers.

Klout's promise.

Klout, like Pinterest, is an amazing gift to brands major and minor. The Wall Street Journal identified this when it wrote that a brand’s goal [in using Klout] is “to find the equivalent of the blogger in Texas, get her engaged, and push a product pitch across the Web.” Allow me to translate this for you into language that you can understand. Companies want to use Klout to identify influential people, exploit them, and make them sell their products. Klout helps companies identify the people who might be able to push their brands out into the blogosphere or onto Twitter and Facebook and now even gives the companies the chance to give those people free stuff in the form of “Klout Perks.”

This transformation of the consumer into a commodity ought to worry you. Shoving products down people’s throats is one thing, but it’s quite another when the person doing the shoving is your friend. I don’t fault the brands for doing this – it’s the natural evolution of advertising in the age of social media – but you don’t have to be a part of it. However, as one Klout user put it, the VIP status associated with a high Klout score “is an ego thing.” Democratizing influence means that we have also democratized celebrity. When CEO Joe Fernandez says that he “see[s] Klout as a form of empowerment for the little guy,” I wonder if what Klout is empowering is only a stronger sense of entitlement for the “me” generation.

Defining Who We Are

Klout is part of a wider trend in which we ask others to define who we are. We are allowing ourselves to be defined by an arbitrary ratings system. Instead of thinking deeply about who we are, what we want from life, and what we expect from our relationships, we immerse ourselves in social media and imagine ourselves as true VIPs, the centers of our perfectly constructed social universes. As one writer put it: “Because, in this era of self-created media/social networks, Klout isn’t measuring some distant and massive media corporation. Rather, it’s measuring you.” Once you realize how scary that really is, you begin to understand the troubling direction in which we are heading. The New Yorker summed up this dilemma of identity very neatly when it asked: “Do you really want something in your pocket that will tell you what you’re worth?”

What I find most worrying of all is that Klout appears to be transforming into something we have no way to opt out of. Think about this: “In February, the enterprise-software giant introduced a service that lets companies monitor the Klout scores of customers who tweet compliments and complaints; those with the highest scores will presumably get swifter, friendlier attention from customer service reps.” Combine this with perks, such as upgrades and free samples, and you begin to see the formation of a class system. One marketing guru described this as the formation of “social media caste systems.” That is, places “where people with high scores get preferential treatment by retailers, prospective employers, even prospective dates.”

I’m reminded of a famous exchange from Shakespeare’s Henry IV. In it, Falstaff pleads with Prince Henry (Hal/Harry) to forgive him for what he’s done. He ends his speech with this line: “Banish not him thy Harry’s company, Banish plump Jack, and banish all the world” (2.5.437-438). Hal responds, chillingly: “I do. I will.” As our own disconnected lives plead with us, “Banish us and banish all the world,” we too respond like Hal. “I do. I will.”

Should Everyone Learn to Code?

April 30, 2012

Since the beginning of the Computer Age, there has been this persistent fear that one day, computers will become smarter than we are. In the most pessimistic scenarios this means nothing less than the complete annihilation of the human race. We fear an artificial intelligence that can think and feel like a human can. This fear may be misplaced; we are generations away from this type of computer intelligence. But perhaps we are asking all of the wrong questions. Instead of wondering if this could happen, we should ask ourselves why this possibility scares us.

Steve Jobs: Modern Prometheus.

As technology becomes more and more complex, the average user understands it less and less. This forms a knowledge chasm that appears impassable. It may also explain why Steve Jobs was idolized by so many; like Prometheus, Jobs delivered the fire of the gods (iPhone, iPad, iPod) to mere mortals. He bridged the impassable chasm by making products that appeared simple and intuitive enough to understand. Only, Jobs’s bridge was constructed of paper and glue. He hid the complexities of his products behind a veneer of simple design and total functionality. He made us believe that we, the average user, could understand and control our computers.

Of course, we can understand and control our computers. Computers are dumb, requiring the commands of a programmer in order to operate. Do you know the language of your computer? Is it irresponsible not to know?

The Importance of Code

Peel back a website to its source code and you will see a string of letters, numbers, and symbols, the architecture of the Internet. This is code. The Internet, smartphones, computers, video game consoles, this website – all governed by the digital languages we call code. Look at the image below. On the left is this website. On the right is the same page of the website broken down into the code that defines it.

Mirror image.

Code underwrites the “magic” of modern electronics. There is nothing special or remarkable about your smartphone or your laptop computer. Each requires an input from the user before producing the right output. A computer requires the user to tell it what to do. Programmers make this easy for us; they write the code that tells our computers what to do. When we click a button and are taken to the right screen, we have the programmer to thank. The programmer writes elegant code that tells the computer what to do so that we don’t have to.
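That input-to-output relationship can be reduced to a few lines. This toy Python function (the names are mine, purely for illustration) is a complete set of instructions: the computer does exactly what the code says, nothing more, nothing less.

```python
# A computer is "dumb": it only follows the rule a programmer wrote.
# Here the rule is trivial -- take a name as input, return a greeting.

def greet(name):
    # The programmer's instruction: given an input, build the output.
    return "Hello, " + name + "!"

print(greet("Ada"))  # -> Hello, Ada!
```

Everything your phone or laptop does, from rendering a webpage to playing a song, is this same pattern scaled up by many orders of magnitude.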

Should We All Learn Code?

Recently, the company FreeCause asked all of its employees to learn how to code. CEO Michael Jaconi explained the decision this way: “[Coding] would arm our employees with a new skill set, bring our technical and non-technical teams closer together, and provide the entire company with a deeper understanding and appreciation of what we do.” To put it simply, learning code would mean that all FreeCause employees spoke the same language.

Separated by the knowledge chasm.

In our technologically reliant society, is it irresponsible to remain ignorant of code? I argue yes. We should not remain content to allow so-called experts to set the parameters for our Internet use. While we need not become experts ourselves, we must understand the languages of the digital age in order to participate in this ongoing dialogue about privacy, use, and protection. To remain ignorant of code is to silence our voices in a conversation that directly impacts our lives.

The language of programming is not inaccessible. Take a course online. Join Codecademy. Watch a video on YouTube. Take a book out from your local library. Your options are limitless, your excuses negligible. It’s more important than ever to have at least a basic understanding of code. Start now.

AXIOM #5: As technology becomes more and more complex, the average user understands it less and less.

The Letter Writing Project: Response 11

April 28, 2012

You can read all about the project and all of the responses here.

Saving the Traditional University

April 27, 2012

Earlier this week, I wrote about online education and the effects that it will have on the traditional university. Widespread changes will likely be subtle and may require decades before establishing themselves in the public consciousness; American universities are powerful corporations with a monetary interest in maintaining the status quo. The last thing that the universities want is websites that offer free, university-level courses that provide students with acceptable certificates of accomplishment. Such a system would, in theory, make the traditional universities all but obsolete.

In the spirit of fairness, I’ve imagined how America’s universities could make themselves relevant again. Some of my thoughts cut right to the heart of the identity of America’s elite universities; these ideas I’ve since either tempered, altered, or nixed outright, knowing that they would otherwise derail my purpose, which is not to attack certain fields of study, but to recognize what schools do well, what they do poorly, and what they could improve. These ideas are an invitation to a dialogue about what higher education is, what it ought to be, and how we can make our idealized image of higher education a reality.

Join the Revolution

The first thing that universities must do is get with the times. It might appear counter-intuitive, but universities must get online. Universities like Princeton, Stanford, and the University of Pennsylvania, among several others, have already recognized that online education is very much a part of their educational vision and a part of their future. Some schools may sneer at this attitude, but those schools will be left behind, relics of a bygone era in education.

How can universities best educate their paying students when they are offering their courses online for free? Universities offer resources that non-paying students simply cannot access: free high-speed Internet, laboratories, peer-to-peer study sessions, face-to-face meeting time with their professors, and real-time feedback on projects and coursework. In an interview with Charlie Rose, Sebastian Thrun (former Stanford professor and founder of Udacity) mentions that 170 of the 200 Stanford students enrolled in his now-famous Artificial Intelligence course chose to watch his lectures online. When he asked them why they preferred to watch the lectures when they could attend the class in person, the students said that they preferred his online lecture style and the ability to rewind. So doesn’t it make sense to put lectures up online, freeing students to spend that time studying, while also making it more likely that the students will “attend” the lecture? That class time, now free, can be used for the seminars where real peer-to-peer learning happens.

Lest you forget, America’s biggest universities are brands. Harvard. Yale. MIT. These are world-famous education brands. The rest of America’s biggest schools are competing with one another for similarly lofty international reputations. What better way to export your brand than by offering free lessons and courses, taught by your finest faculty, to people all over the world?

Curb Grade Inflation

As part of my work-study as an undergraduate at the University of Chicago, I worked in the registrar’s office. Now, when an alumnus dies, the university checks the course records of the deceased. Often, I’d have to retrieve an old microfilm cartridge and then run it through a microfilm reader. These grades, most of them from the 1930s, 40s, and 50s, never ceased to amaze me. Students commonly received Cs and Ds, even in such basic courses as Western Civilization and English 101. Grades, it would seem, were handed out on merit. Compare these scores with those of Humanities courses today. I once entered grades for a class where all fourteen students received an A. Tell me, how is that even possible?

The rise of grade inflation. Private schools are the worst offenders; no surprise since they rely far more on donor money than do public universities.

Grade inflation is widespread, a festering virus that has infected every university in America (perhaps not this college). While there are a number of reasons for this trend, it has had the effect of rendering a bachelor’s degree nearly worthless. Let’s recognize grade inflation for what it is: the sad by-product of the self-esteem movement and a symptom of the modern university’s never-ending pursuit of alumni donations. What good are grades if everyone knows that they’re worthless?1

Degree Standards: Rigor Creates Value

Let me be clear right now: my history degree is hardly worth the paper that it’s printed on. That’s not because the study of history is worthless, or because I didn’t learn anything as an undergrad. Rather, it’s because the history major lacks rigor. Take a look at the requirements for a history major from America’s top university:

“All History concentrators are required to take 10 half-courses.” (emphasis mine)

1. History 97 – History Analysis

2. 1 Reading Seminar

3. 1 Research Seminar

4. 1 half-course in western History

5. 1 half-course in non-western History

6. 1 half-course in premodern History

7-10. 4 additional electives (here, they note: “Normally, only one of these electives may be in a Related Field”).

To summarize, there is minimal emphasis placed on historiographical skills, an emphasis on breadth of historical understanding, and very little opportunity for or emphasis on interdisciplinary study (in other words, they treat the modern study of history as if this were the 19th century). Interestingly, the department does not require the study of a foreign language.

I’ve come up with my own program, which consists of a whopping 18 (!) courses (in a trimester school year – between 15 and 16 courses in a semester system).

1. 4 courses Historical Analysis (including research, writing for History, theory, and historiography)

2. 5 courses in History (including a minimum of 3 classes related to the student’s research)

3. 6 courses (or 4 in a semester system) in a foreign language (or demonstration of intermediate competency)

4. 3 courses in interdisciplinary fields (e.g. archaeology, anthropology, statistics, economics, sociology, geography, etc.)

An undergraduate degree ought to mean something. The only way to truly imbue a degree with value is by adding rigor and standards to each area of study.2

The Value of a Physical Campus

The University of Chicago's new center for the arts.

Traditional universities have at least one thing that online education lacks: a physical campus. There’s no denying that the physical campus is itself a quintessential and extremely important part of the university experience. Thus, schools ought to be thinking hard about how to make their campus a place where students can both collaborate and innovate. Whether this means investing in expensive labs, creating different kinds of libraries, or offering unique study spaces, schools must do whatever it takes to instill value in the place itself. In other words, they must think about what their campuses offer that cyberspace cannot.

Place takes on even more importance when considering the skyrocketing costs of higher education. If you are paying $60k/year, you at least deserve unlimited access to campus resources: laboratories, libraries, study rooms, football fields, equipment, etc. Many schools prefer to keep these things locked up in webs of bureaucracy. Unlock these tools and resources and make them freely available to the student body. It’s the least they deserve for what they’re paying.

Help Students Network

When asked about the price of a college education, many people will tell you that you’re paying for the brand name on the degree. Harvard. Yale. Stanford. There’s some truth to that; a brand name could help you get your foot in the door. However, if you expect the name on your degree to get you a job, you’ll be in for a world of hurt. The real value of your undergraduate education is in the opportunities you have for networking. Stanford has recognized this, providing ample opportunities for its students to rub shoulders with some of the most important names in Silicon Valley.

Stanford is not alone, of course. Many universities, especially the best universities in the country, offer a steady stream of networking events for their undergraduates. However, schools don’t do enough to emphasize the importance of these networking events. Schools should be doing everything they can both to provide opportunities to network and to encourage undergraduates to participate in these events.

Make the Liberal Arts More Exclusive/Incentivize the STEM Fields

Here’s where things get a little controversial. I strongly believe that universities should limit the number of students who can major in liberal arts fields. These openings should be made available on a scholarship-only basis. This both creates competition for spots within the field and increases the overall value of the major. It also reduces the financial burden on students and the U.S. economy by limiting the amount of total student debt, especially for students in fields with little monetary value (the liberal arts). Students in the STEM fields should not be prohibited from studies in the humanities (they should, in fact, be encouraged to study outside their field!), but they should be limited to a minor in an unrelated liberal arts field. As for the scholarship students, they should have to forfeit their scholarship if they decide to change majors.

Academics would protest this change on the grounds that students should be free to study whatever they want. That’s fine for students fortunate enough to be paying almost nothing for their education. However, for the rest of the student body, a degree in the STEM fields is the only thing that makes financial sense. Academics need students to keep their jobs; they might complain when we suggest cutting Gender Studies or Romance Languages and Literature, but the burden is on them to justify their field when a degree costs students over $150,000, causes the national student debt to balloon, lacks rigor, and offers no value on the job market. Yes, college is a time for exploration and inquiry, but not at its current costs.

1 I am not sure that this is true for some of the STEM fields. For example, engineering students routinely receive Bs, Cs, and Ds. Yet, they still receive recognition from employers because most are aware of the rigor of the courses, the general lack of grade inflation in the field, and the overall difficulty involved. An A in an engineering class means a lot more than my A in a class on the Crusades.
2 Once again, the STEM fields come out on top here. Many of these programs already possess the kind of rigor that creates strong students with measurable competencies.

Samsung’s Identity Crisis

April 26, 2012

Earlier this week, a large black bus pulled up in front of a Sydney, Australia Apple store. Out of the bus poured black-clad protestors waving signs that read “WAKE UP.” The protestors stood outside the store, holding up their signs and chanting “wake up!” Who organized the protest and what it actually meant have caused some speculation on the Internet. Not unreasonably, tech website Mashable suggests that the protest was organized by Australian ad agency Tongue for electronics giant Samsung, in anticipation of the May 3 launch of Samsung’s new Galaxy S III smartphone.

Not content to carve out its own share of the smartphone market, Samsung has relentlessly attacked Apple in 2012. In January, Samsung ran a commercial during the Super Bowl that attacked Apple’s iPhone and its product culture. The commercial highlights the coolest feature of Samsung’s Galaxy Note phone. But something weird is going on. Instead of highlighting the fact that the iPhone does not have the same functions as the Galaxy Note, Samsung attacks Apple users. The 2012 ads demonstrate just how confused Samsung is. Is Samsung trying to convert Apple customers (if they are, they’re doing a bad job of it by insulting potential customers)? Is the Galaxy phone supposed to be an alternative to the iPhone for “everyone else?” The only thing we know for certain is that Samsung wants to be taken seriously.

Let’s step into my WABAC machine and travel back to the first few years of the past decade. Then, Apple had finally carved out a share of the computer market, having been brought back from the brink of bankruptcy by returned-CEO Steve Jobs. In 1997, Apple launched its “Think Different” campaign, one of the most successful ad campaigns of the last twenty years. The goal of that campaign was to articulate the Apple philosophy:

Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes. The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them. About the only thing you can’t do is ignore them. Because they change things. They push the human race forward. While some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do. – Apple Inc.

With this one campaign, Apple defined its brand and its customers. To this day, that quote above still defines how consumers view Apple the corporation, Apple products, Steve Jobs, and even Apple users.

From L to R: PC, Mac.

In 2006, Apple changed tack. Having successfully established its brand and having carved out for itself a small share of the market, Apple turned its attention to its competitors. Apple launched a series of ads in which its products were favorably compared to PC products. These now-famous ads (Mac v. PC) did everything a negative ad should: they reinforced the Apple brand (simple, minimal, sleek) and set it against an image of its competitor (dull-witted, geeky, and a little chubby). The ads never skewed negative in tone, but their messages were impossible to ignore. Macs, these ads implied, were not just cooler than PCs, they were better. Did Apple target PC users? No. Apple was trying to convert PC users, not alienate them.

Flash forward to 2012. Apple is on pace to become the richest company in the world. It possesses what is arguably the most famous and well-defined brand in advertising. Samsung, a company with no brand image at all and only minimal name recognition with the average consumer, is trying to cut into Apple’s share of the market by attacking it. The Korean electronics giant would do well to follow the formula that Apple created.

Step 1: Define your brand. Who are you? What do you stand for? What does it say about someone who buys a Samsung product?

Step 2: Attack your competitor. Understand who or what you want to attack. Is it a product (the iPhone)? A company (Apple)? Or your potential customer (Apple users)? Do what Apple did. Contrast your product with your competitor’s. Make it about the product. If the Galaxy is really so much better than the iPhone, show us.

Step 3: Let your customers do your marketing for you. Hardly anyone waits in line to buy a Galaxy phone. Until people do, you will never compete with Apple.

Remember, you can’t skip Step 1, jump right to Step 2, and expect to sway consumers. The Galaxy is a great phone, but this current marketing plan will only alienate the people that Samsung is trying desperately to convert.