July 07, 2008

Behind The Scenes

I decided to put together a blog post displaying some of the art used in Mike White & Derek Jessome's 2006 animated music video. Not nearly as epic as the Kung Fu Panda Art Book by any means, but I had been meaning to do this for a year now and hadn't had the time until recently to track down and assemble all the artwork that was made for the 'Work On You' project.

Anyways, here it is, enjoy.

Milt Kahl Pencil Test

Ken Harris Dance Animation Pencil Test

The Demo Reel of Bridgette Connell

The Demo Reel of Pedro Daniel Garcia

The Demo Reel of Pablo Navarro

The Demo Reel of Richard Bailey

Picture of the Day

Hot Dog

Gum Condom

Triple Whopper

Space Invaders Thirty Years Later

In 1977 Tomohiro Nishikado, a 33-year-old Japanese computer programmer, was a company employee, like millions of others. In his spare time he played video table-tennis, manoeuvring white paddles up and down a black-and-white TV screen to return a white blob of a ball to his opponent. In the amusement arcades, he had noticed a new game called Breakout, which involved moving a paddle from side to side to hit a ball, in order to destroy a series of blocks at the top of the screen.

One day, Nishikado had an idea. What if the blocks in Breakout could fire back? “I was absolutely hooked on Breakout,” Nishikado recalls. “I had already developed some games, so I wanted to make one that was better than this. I realised that the fun thing about Breakout was the sense of accomplishment when you finish a stage by clearing a set number of targets. At the time, we had heard good things about Star Wars, so I thought it might be a good idea to shoot some aliens.”

It was a great idea, so great that Space Invaders, the game he invented, is still going strong 30 years later, with a new version, Space Invaders Extreme, released yesterday.

When Space Invaders arrived in Japanese arcades in 1978, it was a sensation. Entire arcades were given over to the game; at one point, the Japanese Government was forced to mint extra 100-yen coins because, it is said, the game’s cash-boxes were removing so many from circulation.

The following year, it reached Britain, and it hooked me right in – my idea of a perfect Saturday morning as a boy was being given £3 and then seeing how long I could make it last. The first time I managed to clock the machine (score so many points that the score counter returns to zero) is as memorable as my first kiss, but took a lot longer.

Seen from the high-definition flat-screen world of 2008, Space Invaders is laughably simple. Five rows of white aliens march horizontally across a screen, descending progressively towards your laser cannon, which is protected by a series of green buildings. You must destroy all the invaders before they reach the bottom of the screen. As they descend, the aliens drop bombs, which you must avoid. You have three lives, then it’s game over.
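
The marching behaviour described above boils down to a single rule: the formation slides sideways until it hits a screen edge, then drops a row and reverses. A minimal sketch of that rule in Python (illustrative only; the original ran in Intel 8080 assembly, and the step and drop sizes here are assumptions, not the arcade machine's values):

```python
# Sketch of the classic invader march: move sideways until an edge is
# reached, then drop one row and reverse direction. Screen width 224 px
# matches the arcade display; step/drop values are illustrative.

def march(x, direction, width, formation_width, step=2, drop=8):
    """Advance the formation one tick; return (new_x, y_delta, new_direction)."""
    nx = x + step * direction
    if nx < 0 or nx + formation_width > width:
        return x, drop, -direction   # hit an edge: drop down and reverse
    return nx, 0, direction

# Walk a 100-px-wide formation across a 224-px screen for 200 ticks.
x, d, y = 0, 1, 0
for _ in range(200):
    x, dy, d = march(x, d, 224, 100)
    y += dy   # each edge hit brings the formation closer to the cannon
```

After 200 ticks the formation has bounced off an edge three times and descended three rows, which is the slow squeeze that gives the game its tension.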

It’s simple but hugely addictive; the frustration of being destroyed keeps you coming back for more. It’s a model of perfect gameplay, but Nishikado wanted more. “I wasn’t particularly happy with the game,” he says. “The capacity of the hardware was very low. I wanted to make it faster and more colourful, but I couldn’t.”

Elsewhere, though, Space Invaders was creating waves. In Kyoto, Japan’s second city, Shigeru Miyamoto, a young graphic artist and amateur cartoonist with no real interest in video games, played Space Invaders for the first time. Two years out of college, Miyamoto had just landed his first job, with a local company that specialised in producing playing cards but had recently branched out into electronic entertainment. The company was called Nintendo, and over the next 30 years Miyamoto and his team would transform it into an entertainment giant.

The cast of celebrated characters that Miyamoto has created in the scores of games he has personally developed for the company – Mario, Zelda, Donkey Kong – is matched only in the 20th century by Walt Disney. Miyamoto’s most recent triumph is the best-selling Nintendo Wii console. His lasting legacy, however, is likely to be the part that he has played in turning a minority activity into a global business. The Mario games alone have sold nearly 300 million copies, and in a recent report, PricewaterhouseCoopers estimated that the global sales of video games will be worth $68.3 billion in 2012, up from $41.9 billion in 2007. For comparison, music industry sales were worth $11.5 billion last year. And it all started with Space Invaders.

We’ve come a long way in 30 years. Happy birthday, Space Invaders.

A (VERY) BRIEF HISTORY OF GAMING

1977 Atari releases the 2600, its first game console, and the first to use cartridges. It sells over 2 million units by the end of 1980.

1978 The golden age of the arcade begins with Space Invaders.

1980 Pac-Man arrives in arcades, the first video game to come with a named, animated hero.

1981 Donkey Kong arrives, and Mario (then called Jumpman) is born.

1983 Nintendo launches the Famicom, its first home entertainment console.

1986 Launch of NES home console.

1989 SimCity is created by Will Wright; Sega launches the Mega Drive.

1990 Launch of SNES.

1994 Sony launches PlayStation. Sega counters with Saturn in 1995.

1996 Lara Croft’s debut; Nintendo 64 offers 3D.

1999 Sega launches Dreamcast.

2000 PS2 launches, followed in 2001 by the Nintendo GameCube, the Game Boy Advance and Microsoft's Xbox. Sega quits the console business.

2004 Nintendo DS launches; Sony's PSP follows in 2005.

2006 Next-gen consoles Xbox 360 and PS3, plus the Nintendo Wii.

Space Invaders Extreme is out on Sony PSP and Nintendo DS

[Via TimesOnline]


Military Supercomputer Sets Record

SAN FRANCISCO — An American military supercomputer, assembled from components originally designed for video game machines, has reached a long-sought-after computing milestone by processing more than 1.026 quadrillion calculations per second.

The new machine is more than twice as fast as the previous fastest supercomputer, the I.B.M. BlueGene/L, which is based at Lawrence Livermore National Laboratory in California.

The new $133 million supercomputer, called Roadrunner in a reference to the state bird of New Mexico, was devised and built by engineers and scientists at I.B.M. and Los Alamos National Laboratory, based in Los Alamos, N.M. It will be used principally to solve classified military problems to ensure that the nation’s stockpile of nuclear weapons will continue to work correctly as they age. The Roadrunner will simulate the behavior of the weapons in the first fraction of a second during an explosion.

Before it is placed in a classified environment, it will also be used to explore scientific problems like climate change. The greater speed of the Roadrunner will make it possible for scientists to test global climate models with higher accuracy.

To put the performance of the machine in perspective, Thomas P. D’Agostino, the administrator of the National Nuclear Security Administration, said that if all six billion people on earth used hand calculators and performed calculations 24 hours a day and seven days a week, it would take them 46 years to do what the Roadrunner can in one day.
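
D’Agostino’s comparison can be sanity-checked with a little arithmetic. The quote leaves each person’s calculator speed implicit, so the sketch below (the variable names and the per-person rate are my own working, not figures from the article) solves for it:

```python
# Back-of-the-envelope check of the hand-calculator comparison.
ROADRUNNER_CALCS_PER_SEC = 1.026e15   # the petaflop milestone from the article
SECONDS_PER_DAY = 86_400

people = 6e9                           # "all six billion people on earth"
seconds_in_46_years = 46 * 365.25 * 24 * 3600

roadrunner_one_day = ROADRUNNER_CALCS_PER_SEC * SECONDS_PER_DAY
implied_rate = roadrunner_one_day / (people * seconds_in_46_years)
# implied_rate is the calculations-per-second each person would need to sustain
```

The implied rate comes out to roughly ten calculations per second per person, so the comparison assumes a brisk but not outlandish pace on the calculators.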

The machine is an unusual blend of chips used in consumer products and advanced parallel computing technologies. The lessons that computer scientists learn by making it calculate even faster are seen as essential to the future of both personal and mobile consumer computing.

The high-performance computing goal, known as a petaflop — one thousand trillion calculations per second — has long been viewed as a crucial milestone by military, technical and scientific organizations in the United States, as well as a growing group including Japan, China and the European Union. All view supercomputing technology as a symbol of national economic competitiveness.

By running programs that find a solution in hours or even less time — compared with as long as three months on older generations of computers — petaflop machines like Roadrunner have the potential to fundamentally alter science and engineering, supercomputer experts say. Researchers can ask questions and receive answers virtually interactively and can perform experiments that would previously have been impractical.

“This is equivalent to the four-minute mile of supercomputing,” said Jack Dongarra, a computer scientist at the University of Tennessee who for several decades has tracked the performance of the fastest computers.

Each new supercomputing generation has brought scientists a step closer to faithfully simulating physical reality. It has also produced software and hardware technologies that have rapidly spilled out into the rest of the computer industry for consumer and business products.

Technology is flowing in the opposite direction as well. Consumer-oriented computing began dominating research and development spending on technology shortly after the cold war ended in the late 1980s, and that trend is evident in the design of the world’s fastest computers.

The Roadrunner is based on a radical design that includes 12,960 chips that are an improved version of an I.B.M. Cell microprocessor, a parallel processing chip originally created for Sony’s PlayStation 3 video-game machine. The Sony chips are used as accelerators, or turbochargers, for portions of calculations.

The Roadrunner also includes a smaller number of more conventional Opteron processors, made by Advanced Micro Devices, which are already widely used in corporate servers.

“Roadrunner tells us about what will happen in the next decade,” said Horst Simon, associate laboratory director for computer science at the Lawrence Berkeley National Laboratory. “Technology is coming from the consumer electronics market and the innovation is happening first in terms of cellphones and embedded electronics.”

The innovations flowing from this generation of high-speed computers will most likely result from the way computer scientists manage the complexity of the system’s hardware.

Roadrunner, which consumes roughly three megawatts of power, or about the power required by a large suburban shopping center, requires three separate programming tools because it has three types of processors. Programmers have to figure out how to keep all of the 116,640 processor cores in the machine occupied simultaneously in order for it to run effectively.
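The article doesn’t describe Roadrunner’s actual toolchains, but the scheduling problem it names, keeping every core occupied at once, shrinks to a familiar pattern on any multicore machine: carve the work into many small chunks so no worker sits idle behind one oversized slice. A generic Python sketch of that idea (not Roadrunner’s code; the kernel and chunk sizes are invented for illustration):

```python
# Generic load-balancing sketch: feed a pool of workers many small
# chunks rather than one large slice each, so cores stay busy.
from concurrent.futures import ThreadPoolExecutor

def simulate(chunk):
    """Stand-in for a physics kernel: just sums squares over the chunk."""
    return sum(i * i for i in chunk)

data = list(range(100_000))
# 100 chunks of 1,000 items each; finer chunks mean less idle time
chunks = [data[i:i + 1_000] for i in range(0, len(data), 1_000)]

with ThreadPoolExecutor(max_workers=8) as pool:
    total = sum(pool.map(simulate, chunks))
```

On Roadrunner the same balancing act had to span three different processor types and 116,640 cores, which is why three separate programming tools were needed.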

“We’ve proved some skeptics wrong,” said Michael R. Anastasio, a physicist who is director of the Los Alamos National Laboratory. “This gives us a window into a whole new way of computing. We can look at phenomena we have never seen before.”

Solving that programming problem is important because in just a few years personal computers will have microprocessor chips with dozens or even hundreds of processor cores. The industry is now hunting for new techniques for making use of the new computing power. Some experts, however, are skeptical that the most powerful supercomputers will provide useful examples.

“If Chevy wins the Daytona 500, they try to convince you the Chevy Malibu you’re driving will benefit from this,” said Steve Wallach, a supercomputer designer who is chief scientist of Convey Computer, a start-up firm based in Richardson, Tex.

Those who work with weapons might not have much to offer the video gamers of the world, he suggested.

Many executives and scientists see Roadrunner as an example of the resurgence of the United States in supercomputing.

Although American companies had dominated the field since its inception in the 1960s, in 2002 the Japanese Earth Simulator briefly claimed the title of the world’s fastest by executing more than 35 trillion mathematical calculations per second. Two years later, a supercomputer created by I.B.M. reclaimed the speed record for the United States. The Japanese challenge, however, led Congress and the Bush administration to reinvest in high-performance computing.

“It’s a sign that we are maintaining our position,” said Peter J. Ungaro, chief executive of Cray, a maker of supercomputers. He noted, however, that “the real competitiveness is based on the discoveries that are based on the machines.”

Having surpassed the petaflop barrier, I.B.M. is already looking toward the next generation of supercomputing. “You do these record-setting things because you know that in the end we will push on to the next generation and the one who is there first will be the leader,” said Nicholas M. Donofrio, an I.B.M. executive vice president.

By breaking the petaflop barrier sooner than had been generally expected, the United States’ supercomputer industry has been able to sustain a pace of continuous performance increases, improving a thousandfold in processing power in 11 years. The next thousandfold goal is the exaflop, which is a quintillion calculations per second, followed by the zettaflop, the yottaflop and the xeraflop.
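
A thousandfold improvement over 11 years corresponds to a steady annual growth factor, which a one-line calculation makes concrete:

```python
# Annual growth factor implied by a 1000x improvement over 11 years.
growth = 1000 ** (1 / 11)
# growth is about 1.87, i.e. supercomputer performance nearly doubling
# every year over that stretch
```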

By JOHN MARKOFF [Via nytimes.com]

Why Facebook Is the Future

On Aug. 14, 2007, a computer hacker named Virgil Griffith unleashed a clever little program onto the Internet that he dubbed WikiScanner. It's a simple application that trolls through the records of Wikipedia, the publicly editable Web-based encyclopedia, and checks on who is making changes to which entries. Sometimes it's people who shouldn't be. For example, WikiScanner turned up evidence that somebody from Wal-Mart had punched up Wal-Mart's Wikipedia entry. Bad retail giant.

WikiScanner is a jolly little game of Internet gotcha, but it's really about something more: a growing popular irritation with the Internet in general. The Net has anarchy in its DNA; it's always been about anonymity, playing with your own identity and messing with other people's heads. The idea, such as it was, seems to have been that the Internet would free us of the burden of our public identities so we could be our true, authentic selves online. Except it turns out--who could've seen this coming?--that our true, authentic selves aren't that fantastic. The great experiment proved that some of us are wonderful and interesting but that a lot of us are hackers and pranksters and hucksters. Which is one way of explaining the extraordinary appeal of Facebook.

Facebook is, in Silicon Valley-ese, a "social network": a website for keeping track of your friends and sending them messages and sharing photos and doing all those other things that a good little Web 2.0 company is supposed to help you do. It was started by Harvard students in 2004 as a tool for meeting--or at least discreetly ogling--other Harvard students, and it still has a reputation as a hangout for teenagers and the teenaged-at-heart. Which is ironic because Facebook is really about making the Web grow up.

Whereas Google is a brilliant technological hack, Facebook is primarily a feat of social engineering. (It wouldn't be a bad idea for Google to acquire Facebook, the way it snaffled YouTube, but it's almost certainly too late in the day for that. Yahoo! offered a billion for Facebook last year and was rebuffed.) Facebook's appeal is both obvious and rather subtle. It's a website, but in a sense, it's another version of the Internet itself: a Net within the Net, one that's everything the larger Net is not. Facebook is cleanly designed and has a classy, upmarket feel to it--a whiff of the Ivy League still clings. People tend to use their real names on Facebook. They also declare their sex, age, whereabouts, romantic status and institutional affiliations. Identity is not a performance or a toy on Facebook; it is a fixed and orderly fact. Nobody does anything secretly: a news feed constantly updates your friends on your activities. On Facebook, everybody knows you're a dog.

Maybe that's why Facebook's fastest-growing demographic consists of people 35 or older: they're refugees from the uncouth wider Web. Every community must negotiate the imperatives of individual freedom and collective social order, and Facebook constitutes a critical rebalancing of the Internet's founding vision of unfettered electronic liberty. Of course, it is possible to misbehave on Facebook--it's just self-defeating. Unlike the Internet, Facebook is structured around an opt-in philosophy; people have to consent to have contact with or even see others on the network. If you're annoying folks, you'll essentially cease to exist, as those you annoy drop you off the grid.

Facebook has taken steps this year to expand its functionality by allowing outside developers to create applications that integrate with its pages, which brings with it expanded opportunities for abuse. (No doubt Griffith is hard at work on FacebookScanner.) But it has also hung on doggedly to its core insight: that the most important function of a social network is connecting people and that its second most important function is keeping them apart.

BY: LEV GROSSMAN (www.time.com)