Saturday, June 27, 2009

Internet Groans Under Weight of Michael Jackson Traffic by Jacqui Cheng


The news of pop icon Michael Jackson's collapse and subsequent death sent ripples across the Web on Thursday afternoon, affecting numerous services and sparking yet another spam campaign. Twitter, Google, Facebook, various news sites, and even iTunes were practically crushed under the weight of the sudden spike in Internet traffic. A traffic spike like this may not be new for any individual service, but combined across so many of them at once, it was one of the most significant surges in recent memory.

When news first broke that Jackson had collapsed in his home, Twitter was immediately abuzz. There were several points when the Ars staff observed between 6,000 and 13,000 new tweets per minute mentioning Michael Jackson before Twitter began to melt down—all before anyone other than TMZ.com was reporting his death. Of course, most of us are intimately familiar with the famed Fail Whale at this point, though Twitter's meltdown was mostly reflected in a major slowing of the service and the inability to send new tweets.

In fact, Twitter cofounder Biz Stone told the L.A. Times that the news of Jackson's passing caused the biggest spike in tweets per second since the US Presidential Election. (Similarly, Facebook—also known as Wannabe Twitter—saw a spike in status updates that was apparently three times the site's average, though a spokesperson said the site remained free of performance issues.)

Google, on the other hand, began receiving so many searches for news about Jackson that it caused the search engine to believe it was under attack. The site went into self-protection mode, throwing up CAPTCHAs and malware alerts to users trying to find news. A Google spokesperson described the incident as "volcanic" compared to other major news events, confirming that there was a service slowdown for some time.

The spammers have come out in droves as well (never assume that someone isn't working on a way to instantaneously exploit the death of a major celebrity). Security researchers at Sophos put up a warning this morning saying that the first wave of spam messages has gone out claiming to have "vital information" regarding Jackson's death. There doesn't appear to be any call to action or URL; the message seems designed simply to harvest e-mail addresses from recipients who make the mistake of replying. We're sure this trend will continue in the coming days, not just about Michael Jackson but also about actress Farrah Fawcett (who also passed away earlier on Thursday) and former Tonight Show sidekick Ed McMahon.

Finally, Apple's iTunes Store appeared to experience some slowdowns upon the confirmation of Jackson's passing, though the service has been running smoothly since. Users, however, are paying tribute to MJ (literally), as several of Jackson's hit singles are climbing into the iTunes Top 10, including "Man In the Mirror," "Thriller," and "Don't Stop 'Til You Get Enough." That's pretty impressive considering that he has placed three songs on the list in less than 24 hours, competing with the likes of Black Eyed Peas' "Boom Boom Pow" (which has been close to the top of the iTunes Top 10 for what seems like three millennia—seriously, please quit buying it so it goes away).

Coupled with news that fans have been gathering in cities across the US to perform the renowned (if a bit morbid, all things considered) Thriller zombie dance, we can't help but feel as if the user-driven age of the Internet will keep his memory alive in ways that weren't available to past music icons. (Only a couple of Ars staffers—we're looking at you, Bangeman and Timmer—are old enough to remember the passing of Elvis Presley in 1977, which dominated newspapers and TV for days afterwards.) YouTube even has a Michael Jackson spotlight on its front page right now, so if you're feeling nostalgic, head on over and check out MJ's smooth moves from the days of yore. R.I.P., King of Pop.

Tuesday, June 23, 2009

The Ins and Outs of Borderline Tennis Calls by Alan Schwarz


When a line judge at Wimbledon rules on a hair-splittingly close call and says the ball is out, the inevitably disgruntled player should not only consider challenging the call for review by the digital replay system.

He should also consult a recent issue of Current Biology.

A vast majority of near-the-line shots called incorrectly by Wimbledon line judges have come on balls ruled out that were actually in, according to a study published in October by researchers at the University of California-Davis. To the vision scientist, the finding added to the growing knowledge of how the human eye and brain misperceive high-speed objects. To the tennis player, it strongly suggests which calls are worth challenging and which are best left alone.

The researchers identified 83 missed calls during the 2007 Wimbledon tournament. (Some were challenged by players and overruled, and others were later identified as unquestionably wrong through frame-by-frame video.) Seventy of those 83 calls, or 84 percent, were on balls ruled out — essentially, shots that line judges believed had traveled farther than they actually did.
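
To see just how lopsided that split is, here is a minimal back-of-the-envelope sketch in Python using only the two figures quoted above (83 missed calls, 70 of them on balls ruled out); the script is purely illustrative arithmetic, not part of the researchers' analysis.

    # Illustrative arithmetic only, using the study figures quoted above:
    # 83 missed calls at the 2007 Wimbledon, 70 of them on balls ruled "out"
    # that were actually in.
    wrong_calls_total = 83
    wrong_out_calls = 70                                  # called out, actually in
    wrong_in_calls = wrong_calls_total - wrong_out_calls  # 13 called in, actually out

    share_out = wrong_out_calls / wrong_calls_total
    ratio = wrong_out_calls / wrong_in_calls

    print(f"Share of mistakes on 'out' calls: {share_out:.0%}")   # ~84%
    print(f"Wrong 'out' calls per wrong 'in' call: {ratio:.1f}")  # ~5.4

In other words, a mistaken call was more than five times as likely to be an erroneous out call as an erroneous in call, which is the asymmetry behind the challenge advice that follows.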

Called perceptual mislocalization by vision scientists, this subconscious bias is known less formally to Wimbledon fans as “You cannot be serious!” — John McEnroe’s infamous dissent when, yes, a 1981 shot was ruled out. Now that players can resort to a more evolved appeal procedure, the researchers’ discovery suggests that players should generally use their limited number of challenges on questionable out calls rather than on those called in, because such out calls have a far better chance of being found mistaken on review and overturned.

“What we’re really interested in is how visual information is processed, and how it can be used to a player’s advantage,” said David Whitney, an associate professor at U.C.-Davis’s Center for Mind and Brain and the paper’s lead author. “There is a delay of roughly 80 to 150 milliseconds from the first moment of perception to our processing it, and that’s a long time. That’s one reason why it’s so hard to catch a fly — the fly’s ability to dance around is faster than our ability to determine where it is.”

This is the third Wimbledon in which players can challenge questionable calls for review by the Hawk-Eye system, which uses high-speed video cameras to record balls’ flight. (About 25 percent of all challenges result in overturned calls.) A successful challenge costs the player nothing, but after three unsuccessful challenges in a set a player may not challenge again. Whether through strategy or residual tennis etiquette, most players leave many challenges unused.

Theoretically, line judges should be equally prone to call an out ball in as they are an in ball out. But when objects travel faster than humans’ eyes and brains can precisely track them — for example, Andy Roddick’s 150-mile-per-hour serves — they are left having to fill in the gaps in their perception. In doing so they tend to overshoot the object’s actual location and think it traveled slightly farther than it truly did.
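
To get a rough sense of the scale involved, the short sketch below combines the 150-mile-per-hour serve speed mentioned here with the 80-to-150-millisecond processing delay Whitney cited earlier; the unit conversion is standard, and the numbers are meant only to convey how much flight the visual system has to fill in.

    # How far a 150 mph serve travels during the 80-150 ms perceptual delay
    # cited earlier. Both figures come from the article; the conversion
    # factor (1 mph = 0.44704 m/s) is standard.
    MPH_TO_MPS = 0.44704

    serve_mps = 150 * MPH_TO_MPS  # roughly 67 m/s

    for delay_ms in (80, 150):
        distance_m = serve_mps * (delay_ms / 1000.0)
        print(f"In {delay_ms} ms the ball travels about {distance_m:.1f} m")
    # Prints roughly 5.4 m at 80 ms and 10.1 m at 150 ms: several meters of
    # flight inside the window the brain is still filling in.

That extrapolation window leaves plenty of room for the slight overshoot that shows up as a ball called long when it actually landed in.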

Both the successful challenges and the overlooked mistakes that the researchers later identified were several times more likely to come on “long” calls than on “in” calls. (The same pattern existed at Wimbledon last year, Whitney said, although the paper did not present that data.) So players are better off using as many challenges as possible on balls called out, because those are the calls most likely to be wrong; if a player thinks an “in” call was wrong, chances are his own eyes were as fooled as line judges’ sometimes are.

Without knowing about this mislocalization effect, tennis officials are already trained in ways that help compensate for it. Published instructions for United States Tennis Association line judges tell them to “focus your eyes on the portion of the line where the ball will land,” rather than attempt to track the ball in flight. “Get to the spot well before the ball arrives,” they are advised.

Rich Kaufman, the association’s director of officials and a linesman and chair umpire from 1976 to 1997, said that, of all things, “one of the hardest things to teach new linesmen is to take their eye off the ball.”

“I once asked an eye doctor, then what am I seeing on a bounce?” Kaufman said. “The doctor said that’s your brain working — you think you see the initial point of impact but it’s the blur of the entry and exit of the ball.”

A player using his knowledge of this effect in challenging calls could see a benefit of about one or two overturned points per match, Whitney said, plus any psychological boost from feeling vindicated rather than robbed. But Whitney added that understanding how the brain misperceives visual stimuli can help in more real-life matters, like the design and placement of high-speed safety equipment, automobile brake lights and warning signs of all types.

As for Wimbledon, it appears as if the new information can only help players, not the judges who vex them. Kaufman said: “You have to call what you see. Or what you think you see.”

Monday, June 22, 2009

Why the Eyes Have It by Christopher Chabris


Why are we humans so good at seeing in color? Why do we have eyes on the front of our heads rather than on the sides, like horses? And how is it that we find it so easy to read when written language didn’t even exist until a few thousand years ago—a virtual millisecond in evolutionary time?
Most of us, understandably, have never given much thought to questions like these. What is surprising is that most cognitive scientists haven’t either. People who study the brain generally ask how it works, not why it works the way it does. But Mark Changizi, a professor at Rensselaer Polytechnic Institute and the author of “The Vision Revolution,” is indeed a man who asks why, and lucky for us: His ideas about the brain and mind are fascinating, and his explanations for our habits of seeing are, for the most part, persuasive.
Mr. Changizi takes care not to call himself a practitioner of evolutionary psychology. This is the one discipline of the mind sciences that focuses on why questions, but it often answers them by telling just-so stories that cannot be disproved. (Why do men have better spatial ability than women? Because a long time ago, in Africa, men needed spatial skills to track prey and to kill at a distance—a plausible theory but one that is difficult to test with experiments.) Instead Mr. Changizi calls himself a “theoretical neuroscientist,” seeking explanations for the design of the mind that are based on mathematical and physical analysis. He has his own stories, it is true, but they are grounded solidly in neuroscience, and they are backed up by data about a surprising range of human activities, from the colors found in historical costumes to the correspondence between the shapes found in written letters and the shapes found in nature.
Let’s start with the question of color. It is such a natural part of our visual experience that we don’t stop to wonder why we can see it at all. Without color television there would have been no “Miami Vice,” of course, but were we really missing out on so much when we had only black and white? The consensus explanation for our superior ability to perceive color is that primates evolved it to see fruit—you can’t order dinner if you can’t read the menu.
Mr. Changizi thinks otherwise. He proposes that color vision is useful for distinguishing the changes in other people’s skin color—changes that are caused by shifts in the volume and oxygenation levels of the blood. Such shifts, like blushing, often signal emotional states. The ability to see them is adaptive because it helps an observer to “read” states of mind and states of health in others, information that is in turn useful for predicting their behavior.
Our brains evolved in a time when people lived their entire lives without ever seeing someone with a skin color different from their own. Thus the skin color we grow up seeing, Mr. Changizi says, is “neutral” to us: It serves as a kind of baseline from which we notice even minor deviations in tint or hue. Almost every language has distinct words for some 11 basic colors, but none of them aptly describe the look of skin, which seems colorless (except in our recent multicultural societies, where skin color is newly prominent). As one might expect, primates without color vision tend to have furry faces and hands and thus less need to perceive skin color; primates with color vision are more “naked” in this respect, humans most of all.
Conventional wisdom may be similarly misleading when it comes to binocular vision. It is said that we have two forward-facing eyes, which send our brains two separate images of almost everything in our field of view so that the brain can compare those images to estimate the distance of objects—a generally useful thing to know. But people who are blind in one eye, Mr. Changizi notes, can perform tasks like driving a car by using other cues that help them to judge distance. He offers a different explanation: that two eyes give us a sort of X-ray vision, allowing us to see “through” nearby objects to what is beyond.
You can experience this ability yourself by closing one eye and holding your forefinger near your face: It will appear in your field of vision, of course, and it will block what lies beyond or behind it. If you open both eyes, though, you will suddenly perceive your finger as transparent—that is, you will see it and see, unblocked, the full scene in front of you. Mr. Changizi observes that an animal in a leafy environment, with such an ability, gains an advantage: It can lurk in tall grass and still see what is “outside” its hiding place. He correlates the distance between the eyes and the density of vegetation in the habitats of animals and finds that animals with closer-set eyes do tend to live in forests rather than on plains.
As for reading, Mr. Changizi stops to observe how remarkable this ability is and how useful, giving us access to the minds of dead people (i.e., deceased writers) and permitting us to take in words much faster than we can by merely listening to them. He claims that we learn to read so easily because the symbols in our written alphabets have evolved, over many generations, to resemble the building blocks of natural scenes—exactly what previous millennia of evolution adapted the brain to perceive quickly. A “T,” for example, appears in nature when one object overlaps another, like a stone lying on top of a stick. With statistical analysis, Mr. Changizi finds that the contour patterns most common in nature are also most common in letter shapes.
Mr. Changizi has more to say about our visual experience—about optical illusions, for instance, which he sees as artifacts of a trick the brain uses to cope with the one-tenth of a second it takes to process the light that hits our eyes and to determine what is actually in front of us. He calls for a new academic discipline of “visual linguistics,” and he tells us why there are no species with just one eye.
What does all this add up to? Provocative hypotheses but not settled truth—at least not yet. As a theoretician, Mr. Changizi leaves it to others to design experiments that might render a decisive verdict. Someone else will have to study how accurately people can perceive mental states from shifting skin tones, and someone else will have to determine whether, in most cases, looking at another person’s skin adds any useful information to what is easily known from facial expression, tone of voice and body language.
Still, the novel ideas that Mr. Changizi outlines in “The Vision Revolution”—together with the evidence he does present—may have a big effect on our understanding of the human brain. Their implication is that the environments we evolved in shaped the design of our visual system according to a set of deep principles. Our challenge now is to see them clearly.