July 07, 2008

Triple Whopper

Space Invaders Thirty Years Later

In 1977 Tomohiro Nishikado, a 33-year-old Japanese computer programmer, was a company employee, like millions of others. In his spare time he played video table-tennis, manoeuvring white paddles up and down a black-and-white TV screen to return a white blob of a ball to his opponent. In the amusement arcades, he had noticed a new game called Breakout, which involved moving a paddle from side to side to hit a ball, in order to destroy a series of blocks at the top of the screen.

One day, Nishikado had an idea. What if the blocks in Breakout could fire back? “I was absolutely hooked on Breakout,” Nishikado recalls. “I had already developed some games, so I wanted to make one that was better than this. I realised that the fun thing about Breakout was the sense of accomplishment when you finish a stage by clearing a set number of targets. At the time, we had heard good things about Star Wars, so I thought it might be a good idea to shoot some aliens.”

It was a great idea, so great that Space Invaders, the game he invented, is still going strong 30 years later, with a new version, Space Invaders Extreme, released yesterday.

When Space Invaders arrived in Japanese arcades in 1978, it was a sensation. Entire arcades were given over to the game; at one point, the Japanese Government was forced to mint extra 100-yen coins because, it is said, the game’s cash-boxes were removing so many from circulation.

The following year, it reached Britain, and it hooked me right in – my idea of a perfect Saturday morning as a boy was being given £3 and then seeing how long I could make it last. The first time I managed to clock the machine (score so many points that the score counter returns to zero) is as memorable as my first kiss, but took a lot longer.

Seen from the high-definition flat-screen world of 2008, Space Invaders is laughably simple. Five rows of white aliens march horizontally across a screen, descending progressively towards your laser cannon, which is protected by a series of green buildings. You must destroy all the invaders before they reach the bottom of the screen. As they descend, the aliens drop bombs, which you must avoid. You have three lives, then it’s game over.
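That marching behaviour is simple enough to sketch in a few lines of modern code (an illustrative reconstruction, not Nishikado's original program; the function name and all the numbers here are invented for the example):

```python
# Sketch of the classic invader march: the formation steps sideways each
# tick, and when it would cross a screen edge the whole wave reverses
# direction and drops one row closer to the player.
def step_formation(xs, y, direction, left=0, right=200, step=4, drop=8):
    """Advance the wave one tick; xs are invader x-positions, y is depth."""
    if (direction > 0 and max(xs) + step > right) or \
       (direction < 0 and min(xs) - step < left):
        return xs, y + drop, -direction  # edge hit: drop and reverse
    return [x + step * direction for x in xs], y, direction

xs, y, d = [20, 40, 60, 80, 100], 0, 1
for _ in range(30):  # simulate thirty ticks
    xs, y, d = step_formation(xs, y, d)
print(y)  # 8: the wave has dropped one row after bouncing off the right edge
```

In the arcade original the wave also famously sped up as invaders were destroyed, a side effect of the hardware redrawing fewer sprites per frame, which Nishikado kept as a built-in difficulty curve.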

It’s simple but hugely addictive; the frustration of being destroyed keeps you coming back for more. It’s a model of perfect gameplay, but Nishikado wanted more. “I wasn’t particularly happy with the game,” he says. “The capacity of the hardware was very low. I wanted to make it faster and more colourful, but I couldn’t.”

Elsewhere, though, Space Invaders was creating waves. In Kyoto, Japan’s second city, Shigeru Miyamoto, a young graphic artist and amateur cartoonist with no real interest in video games, played Space Invaders for the first time. Two years out of college, Miyamoto had just landed his first job, with a local company that specialised in producing playing cards but had recently branched out into electronic entertainment. The company was called Nintendo, and over the next 30 years Miyamoto and his team would transform it into an entertainment giant.

The cast of celebrated characters that Miyamoto has created in the scores of games he has personally developed for the company – Mario, Zelda, Donkey Kong – is matched only in the 20th century by Walt Disney. Miyamoto’s most recent triumph is the best-selling Nintendo Wii console. His lasting legacy, however, is likely to be the part that he has played in turning a minority activity into a global business. The Mario games alone have sold nearly 300 million copies, and in a recent report, PricewaterhouseCoopers estimated that the global sales of video games will be worth $68.3 billion in 2012, up from $41.9 billion in 2007. For comparison, music industry sales were worth $11.5 billion last year. And it all started with Space Invaders.

We’ve come a long way in 30 years. Happy birthday, Space Invaders.

A (VERY) BRIEF HISTORY OF GAMING

1977 Atari releases the 2600, its first game console, and the first to use cartridges. It sells over 2 million units by the end of 1980.

1978 The golden age of the arcade begins with Space Invaders.

1980 Pac-Man arrives in arcades, the first video game to come with a named, animated hero.

1981 Donkey Kong arrives, and Mario (then known as Jumpman) is born.

1983 Nintendo launches the Famicom, its first home entertainment console.

1986 Launch of NES home console.

1989 SimCity created by Will Wright; Sega launches the Mega Drive.

1990 Launch of SNES.

1994 Sony launches PlayStation. Sega counters with Saturn in 1995.

1996 Lara Croft’s debut; Nintendo 64 offers 3D.

1999 Sega launches Dreamcast.

2000 PS2 launches, followed in 2001 by the Nintendo GameCube, Game Boy Advance and Microsoft's Xbox. Sega quits the console business.

2004 Nintendo DS launches; Sony's PSP follows in 2005.

2006 Next-gen consoles Xbox 360 and PS3, plus the Nintendo Wii.

Space Invaders Extreme is out on Sony PSP and Nintendo DS

[Via TimesOnline]


Military Supercomputer Sets Record

SAN FRANCISCO — An American military supercomputer, assembled from components originally designed for video game machines, has reached a long-sought-after computing milestone by processing more than 1.026 quadrillion calculations per second.

The new machine is more than twice as fast as the previous fastest supercomputer, the I.B.M. BlueGene/L, which is based at Lawrence Livermore National Laboratory in California.

The new $133 million supercomputer, called Roadrunner in a reference to the state bird of New Mexico, was devised and built by engineers and scientists at I.B.M. and Los Alamos National Laboratory, based in Los Alamos, N.M. It will be used principally to solve classified military problems to ensure that the nation’s stockpile of nuclear weapons will continue to work correctly as they age. The Roadrunner will simulate the behavior of the weapons in the first fraction of a second during an explosion.

Before it is placed in a classified environment, it will also be used to explore scientific problems like climate change. The greater speed of the Roadrunner will make it possible for scientists to test global climate models with higher accuracy.

To put the performance of the machine in perspective, Thomas P. D’Agostino, the administrator of the National Nuclear Security Administration, said that if all six billion people on earth used hand calculators and performed calculations 24 hours a day and seven days a week, it would take them 46 years to do what the Roadrunner can in one day.
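As a back-of-the-envelope check (ours, not the article's), that comparison only works out if each person is assumed to punch in roughly ten calculations per second, nonstop:

```python
# Sanity check on the hand-calculator comparison.
# Assumption (ours, not the article's): each person performs
# about 10 calculations per second, around the clock.
roadrunner_flops = 1.026e15            # calculations per second (petaflop scale)
seconds_per_day = 24 * 60 * 60
work_per_day = roadrunner_flops * seconds_per_day  # ~8.9e19 calculations

people = 6e9
calcs_per_person_per_sec = 10          # assumed rate
human_rate = people * calcs_per_person_per_sec     # calculations per second
seconds_per_year = 365.25 * seconds_per_day

years = work_per_day / (human_rate * seconds_per_year)
print(round(years))  # 47, close to the quoted 46 years at this assumed rate
```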

The machine is an unusual blend of chips used in consumer products and advanced parallel computing technologies. The lessons that computer scientists learn by making it calculate even faster are seen as essential to the future of both personal and mobile consumer computing.

The high-performance computing goal, known as a petaflop — one thousand trillion calculations per second — has long been viewed as a crucial milestone by military, technical and scientific organizations in the United States, as well as a growing group including Japan, China and the European Union. All view supercomputing technology as a symbol of national economic competitiveness.

By running programs that find a solution in hours or even less time — compared with as long as three months on older generations of computers — petaflop machines like Roadrunner have the potential to fundamentally alter science and engineering, supercomputer experts say. Researchers can ask questions and receive answers virtually interactively and can perform experiments that would previously have been impractical.

“This is equivalent to the four-minute mile of supercomputing,” said Jack Dongarra, a computer scientist at the University of Tennessee who for several decades has tracked the performance of the fastest computers.

Each new supercomputing generation has brought scientists a step closer to faithfully simulating physical reality. It has also produced software and hardware technologies that have rapidly spilled out into the rest of the computer industry for consumer and business products.

Technology is flowing in the opposite direction as well. Consumer-oriented computing began dominating research and development spending on technology shortly after the cold war ended in the late 1980s, and that trend is evident in the design of the world’s fastest computers.

The Roadrunner is based on a radical design that includes 12,960 chips that are an improved version of an I.B.M. Cell microprocessor, a parallel processing chip originally created for Sony’s PlayStation 3 video-game machine. The Sony chips are used as accelerators, or turbochargers, for portions of calculations.

The Roadrunner also includes a smaller number of more conventional Opteron processors, made by Advanced Micro Devices, which are already widely used in corporate servers.

“Roadrunner tells us about what will happen in the next decade,” said Horst Simon, associate laboratory director for computer science at the Lawrence Berkeley National Laboratory. “Technology is coming from the consumer electronics market and the innovation is happening first in terms of cellphones and embedded electronics.”

The innovations flowing from this generation of high-speed computers will most likely result from the way computer scientists manage the complexity of the system’s hardware.

Roadrunner, which consumes roughly three megawatts of power, or about the power required by a large suburban shopping center, requires three separate programming tools because it has three types of processors. Programmers have to figure out how to keep all of the 116,640 processor cores in the machine occupied simultaneously in order for it to run effectively.
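The quoted core count matches the Cell chips alone if each Cell contributes nine cores, one general-purpose core plus eight synergistic processing elements; this breakdown is our reading of the figures, not one given in the article:

```python
# Cross-check of the quoted core count against the chip count.
# Assumption (ours): each Cell contributes 9 cores (1 PPE + 8 SPEs).
cell_chips = 12_960
cores_per_cell = 9
print(cell_chips * cores_per_cell)  # 116640, the figure quoted in the text
```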

“We’ve proved some skeptics wrong,” said Michael R. Anastasio, a physicist who is director of the Los Alamos National Laboratory. “This gives us a window into a whole new way of computing. We can look at phenomena we have never seen before.”

Solving that programming problem is important because in just a few years personal computers will have microprocessor chips with dozens or even hundreds of processor cores. The industry is now hunting for new techniques for making use of the new computing power. Some experts, however, are skeptical that the most powerful supercomputers will provide useful examples.

“If Chevy wins the Daytona 500, they try to convince you the Chevy Malibu you’re driving will benefit from this,” said Steve Wallach, a supercomputer designer who is chief scientist of Convey Computer, a start-up firm based in Richardson, Tex.

Those who work with weapons might not have much to offer the video gamers of the world, he suggested.

Many executives and scientists see Roadrunner as an example of the resurgence of the United States in supercomputing.

Although American companies had dominated the field since its inception in the 1960s, in 2002 the Japanese Earth Simulator briefly claimed the title of the world’s fastest by executing more than 35 trillion mathematical calculations per second. Two years later, a supercomputer created by I.B.M. reclaimed the speed record for the United States. The Japanese challenge, however, led Congress and the Bush administration to reinvest in high-performance computing.

“It’s a sign that we are maintaining our position,” said Peter J. Ungaro, chief executive of Cray, a maker of supercomputers. He noted, however, that “the real competitiveness is based on the discoveries that are based on the machines.”

Having surpassed the petaflop barrier, I.B.M. is already looking toward the next generation of supercomputing. “You do these record-setting things because you know that in the end we will push on to the next generation and the one who is there first will be the leader,” said Nicholas M. Donofrio, an I.B.M. executive vice president.

By breaking the petaflop barrier sooner than had been generally expected, the United States’ supercomputer industry has been able to sustain a pace of continuous performance increases, improving a thousandfold in processing power in 11 years. The next thousandfold goal is the exaflop, which is a quintillion calculations per second, followed by the zettaflop and the yottaflop.
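A thousandfold improvement over 11 years implies, as a rough sketch, that peak supercomputer performance has been compounding at close to 90 per cent a year:

```python
# Implied annual growth rate from a 1000x improvement over 11 years.
factor = 1000
years = 11
annual_growth = factor ** (1 / years)     # ~1.87x per year
print(round((annual_growth - 1) * 100))   # ~87 percent annual growth
```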

By JOHN MARKOFF [Via nytimes.com]

Why Facebook Is the Future

On Aug. 14, 2007, a computer hacker named Virgil Griffith unleashed a clever little program onto the Internet that he dubbed WikiScanner. It's a simple application that trolls through the records of Wikipedia, the publicly editable Web-based encyclopedia, and checks on who is making changes to which entries. Sometimes it's people who shouldn't be. For example, WikiScanner turned up evidence that somebody from Wal-Mart had punched up Wal-Mart's Wikipedia entry. Bad retail giant.

WikiScanner is a jolly little game of Internet gotcha, but it's really about something more: a growing popular irritation with the Internet in general. The Net has anarchy in its DNA; it's always been about anonymity, playing with your own identity and messing with other people's heads. The idea, such as it was, seems to have been that the Internet would free us of the burden of our public identities so we could be our true, authentic selves online. Except it turns out--who could've seen this coming?--that our true, authentic selves aren't that fantastic. The great experiment proved that some of us are wonderful and interesting but that a lot of us are hackers and pranksters and hucksters. Which is one way of explaining the extraordinary appeal of Facebook.

Facebook is, in Silicon Valley-ese, a "social network": a website for keeping track of your friends and sending them messages and sharing photos and doing all those other things that a good little Web 2.0 company is supposed to help you do. It was started by Harvard students in 2004 as a tool for meeting-- or at least discreetly ogling--other Harvard students, and it still has a reputation as a hangout for teenagers and the teenaged-at-heart. Which is ironic because Facebook is really about making the Web grow up.

Whereas Google is a brilliant technological hack, Facebook is primarily a feat of social engineering. (It wouldn't be a bad idea for Google to acquire Facebook, the way it snaffled YouTube, but it's almost certainly too late in the day for that. Yahoo! offered a billion for Facebook last year and was rebuffed.) Facebook's appeal is both obvious and rather subtle. It's a website, but in a sense, it's another version of the Internet itself: a Net within the Net, one that's everything the larger Net is not. Facebook is cleanly designed and has a classy, upmarket feel to it--a whiff of the Ivy League still clings. People tend to use their real names on Facebook. They also declare their sex, age, whereabouts, romantic status and institutional affiliations. Identity is not a performance or a toy on Facebook; it is a fixed and orderly fact. Nobody does anything secretly: a news feed constantly updates your friends on your activities. On Facebook, everybody knows you're a dog.

Maybe that's why Facebook's fastest-growing demographic consists of people 35 or older: they're refugees from the uncouth wider Web. Every community must negotiate the imperatives of individual freedom and collective social order, and Facebook constitutes a critical rebalancing of the Internet's founding vision of unfettered electronic liberty. Of course, it is possible to misbehave on Facebook--it's just self-defeating. Unlike the Internet, Facebook is structured around an opt-in philosophy; people have to consent to have contact with or even see others on the network. If you're annoying folks, you'll essentially cease to exist, as those you annoy drop you off the grid.

Facebook has taken steps this year to expand its functionality by allowing outside developers to create applications that integrate with its pages, which brings with it expanded opportunities for abuse. (No doubt Griffith is hard at work on FacebookScanner.) But it has also hung on doggedly to its core insight: that the most important function of a social network is connecting people and that its second most important function is keeping them apart.

BY: LEV GROSSMAN (www.time.com)

July 06, 2008

20th Century's Greatest Voice Talent

BBC Radio has produced a 30-minute documentary on Mel Blanc in celebration of his 100th birthday on May 30th. You can listen to the entire broadcast here:
That's All Folks! The Mel Blanc Story.

Raised By Cobra Commander

From as early as I can recall, Chris Latta (real name Christopher Charles Collins) has scared me into feverish nightmares with his raspy, evil voice. Anyone who grew up in Atlantic Canada during the early 1980s will no doubt remember the Romper Room-action cartoon combos that would start at 6:00 am and feature Hercules and Astro Boy to give the kids a little blast of violence before heading off to kindergarten. It was during these much-loved morning cartoon sessions that I first saw G.I.Joe. Needless to say, my mind was blown.

I would sit there with my sister eating our cereal, eyes glued to the non-stop action. We knew full well that the antics of Flint and Lady J would be far, far more interesting than anything the abusive teachers would try to beat into our heads that day. With all its colorful, wonderfully stereotypical characters, G.I.Joe had only one character that mattered to me. A character whose voice would be ringing in my head all morning in school, and fuel my impersonation of him during recess. Cobra Commander.

As a kid I only got to see a handful of G.I.Joe episodes before it was replaced by Astro Boy. The damage however was done. Cobra Commander’s personality and voice had a massive impact on my 8-year-old mind. Keep in mind that Cobra Commander was leader of a terrorist organization bent on the destruction of the U.S. and the enslavement of the world. I didn’t care, he was a hero to me, and I wanted to be a member of Cobra so bad I was making my own uniform. My lifelong obsession with the military and villains started in the early mornings on CTV. It started with Chris Latta.

Had Latta only played Cobra Commander it would have been enough for me to always remember and appreciate him, but as I would learn, he had a villainous role that predated G.I.Joe. I wouldn’t discover it until a year or two later, watching the amazing new wonder “Cable TV” at my grandmum’s house. We were watching a really wild space cartoon called Space Battleship Yamato. “Japanese Animation” had been broadcast in Canada since the late 70s, but it was still so early that people didn’t really even use the term yet. At any rate, while watching Yamato I noticed (to the point of running around the room in hysterics) that a villain on the show called Sgt. Knox had the same voice as Cobra Commander. It was the same wheezy, sinister, cackling voice. The giant magical world of TV got a tiny bit smaller that day.

In the late 1970s, voice acting and cartoon production were still relatively pure art forms. Thankfully this was before Flash animation, child psychologists and Tipper Gore ruined it for everyone. So if you were clever enough, you could make a villain on a cartoon as diabolical as you wanted to. I remember my old man getting such a kick out of Cobra Commander’s and Zartan’s fits of pure rage and threats of violence against even their own allies. It was like Jonny Quest all over again for him. Cartoons of this time were written and acted by men and women who had little interference from the networks. The bottom line was they worked for the toy companies, and if you ask me, that never should have changed.

Just as Chris Latta’s voice-acting career got rolling in the early 80s, a bizarre and miraculous thing happened. Due to the recession in the U.S. at the time, the government told the FCC to back off and let toy companies do their own thing in the hopes of making some money. This created a window of almost unlimited freedom for smart companies like Hasbro and Mattel that lasted for almost five years. During this crazy time, some of the best action cartoons the world had seen since the 1960s were made. Chris Latta was at ground zero for the whole party.

Transformers got its first season off just before the axe fell. I was lucky enough as a kid to have seen all of seasons one and two of this unforgettable animated masterpiece. Transformers seemed to play over and over, year after year, and I am so grateful it did. This is where Latta’s greatest and most loved character was created. Most kids I knew who watched Transformers loved the Decepticons and the Dinobots most of all, but most kids’ admiration was for the evil, deceitful, backstabbing, plot-scheming leader of “the Jets”, Starscream. Chris Latta’s performance as Starscream was incredible. He used some of the methods he had used with Cobra Commander, but cooled the rage down a bit in favor of a controlled, tactical, sarcastic hatred. Starscream was not the leader this time, but his attempts to obtain leadership were as entertaining as they now are legendary. Instead of Cobra Commander’s unrivaled leadership, Starscream was second, or more arguably third, in command after Megatron and Soundwave. We had the pleasure of seeing Starscream plot not only the destruction of mankind and the Autobots, but of his own leader as well. Evil and ambition never had a more apt pupil than Starscream.
I could watch those Jets come screaming down out of the sky wreaking havoc forever. Nothing in cartoons has given me more pleasure than watching Starscream kill innocent humans on Transformers. Unrivaled entertainment.

Hasbro started to make so much money they had to stop counting it and start weighing it to see just how rich they had gotten from brainwashing kids like me. Not that I’m complaining; I like the number Hasbro and Mattel did on my brain. All that cash got people really pissed off, and once the recession ended, so did the party. The whiny, nerdy voices of the FCC, the networks and the “frothing at the mouth with anger” parents groups got louder and louder until everything even remotely evil, violent, or scary had to go. Chris Latta’s characters and Hasbro animation continued on, but it was painfully obvious that the shows had been castrated of all that was good about them. The ridiculous Emperor Serpentor replaced Cobra Commander as Cobra’s leader. The once great Commander was now held in check until comic relief was needed. The ultimate disgrace forced on us all, though, lay in wait within Transformers: The Movie. As any kid who went to see it can tell you, it was one of the saddest, most disturbing things a child of the 80s could have witnessed. It was like watching the friends you grew up with die; no, it WAS watching the friends you grew up with die. Starscream, along with countless others, did not survive the “changing of the guard”. It was a goddamn terrible time for kids’ animation, and it’s only gotten worse since then.

Besides being cast as the original voice of C. Montgomery Burns on The Simpsons, it was in the early 90s that Chris Latta’s live-action career started to pick up speed. Chris was a big, intimidating man who often played the thug, mugger or tough guy on sitcoms like Married With Children and Seinfeld. Star Trek: TNG and Deep Space Nine also got Chris several roles. Although, much like how he hid behind animation in the 80s, Chris Latta continued to hide behind inches of foam latex makeup for his parts on Star Trek. It seems only fitting that Chris was playing aliens in costume. He was trained in mime and theater, as most costumed performers are, and brought a lot of skill to each of his memorable Star Trek roles. As much as I love the 60s Trek, I have to admit not liking Star Trek: TNG or Deep Space Nine very much. I actually hate them. Still, upon learning of Chris Latta’s roles on those shows, I tracked down and watched those few episodes; hell, I love the guy. I’d watch all his TV commercials too if I knew where to get them.

Chris Latta passed away on June 12, 1994, after losing his battle with cancer. He was taken from us just as his career was starting to take off in live-action TV. He was, in my opinion, one of the greatest voice actors animation was ever lucky enough to have, and he has had more of an impact on me than any other actor I can think of. What Chris Latta accomplished in his short career as a player of villainous roles has created for him a legacy that will live on in the hearts of 80s kids forever. There will never be another like him.

Written by Cory Laffin

See the full list of Chris Latta's voice acting credits here.

July 04, 2008

Presto

Heads-up, Pixar’s latest short Presto is now available on iTunes for $1.99. It’s currently the #1 selling short-film download on Apple’s site. This is Pixar's most recent short film, a slapstick ode to Warner Bros. and Tom & Jerry cartoons of old. Director Doug Sweetland and character designer Teddy Newton drew on cartoon greats such as Chuck Jones and Tex Avery to inspire their own creative visions.


Diablo 3 Artwork

Click on image to see breath-taking concept art behind the upcoming swords & sorcery PC/Mac game, Diablo III, along with some video clips of great gameplay footage as well.


12 Notable Movies with Multiple Directors

With the news that Steven Spielberg and Peter Jackson will be co-directing Tintin (although, due to Directors Guild rules, only Spielberg will be credited), we thought it'd be interesting to take a look at some other famously co-directed movies. It was.


Grindhouse

When it was released a little over a year ago, Grindhouse rode a wave of internet hype so high and fast it had the industry believing. And why not? It was two feature-length movies -- one directed by a revitalized Quentin Tarantino after the success of Kill Bill, and another directed by Robert Rodriguez, himself flush from the success of Sin City. So, what went wrong? Well, maybe most people who talk about how excited they are about a movie on the internet stay on the internet when that movie is released, perhaps to talk about the next movie they're really excited about. Or maybe they just saw the running time was over three hours and figured they'd rather watch nine sitcoms instead. Whatever the reason, Grindhouse mightily disappointed at the box office, scuttling plans for a sequel. Critically, though, it flourished. Apparently, in some cases there is accounting for taste, but there's no accounting for laziness.



There's Something About Mary

The team of Bobby and Peter Farrelly have directed a few hilarious comedies together -- and, more recently, a few not-so-hilarious comedies. While their debut, Dumb and Dumber, was a box office phenomenon, There's Something About Mary took their success to another level and became a touchstone of the 90s. How many times have you seen the hair gel scene parodied? (Answer: Far too fucking many.) It represents a level of commercial and critical success that neither they nor many other comedies have achieved since, and with the disaster that was The Heartbreak Kid, and with Mary now ten years behind them, it seems that the comedy world has passed them by.


The Big Lebowski

This list could be populated almost entirely by Coen Brothers movies and, while it would be stupid and pointless, it might be kind of tough to argue with. It might have made sense to exclude teams of brothers; does it really count as having been directed by multiple people if the people involved seem to share one brain? I guess it has to, but what I can do -- and have done -- is limit every team to one selection. So why The Big Lebowski over Oscar winners Fargo and No Country for Old Men? Could it be because The Big Lebowski is the best fucking movie ever? That might just be the reason. Brilliantly acted and simultaneously hilarious and meditative, Lebowski is an iconic film that's not only one of the most quotable movies ever made, but is also one of the most purely enjoyable ways to spend two hours ever invented. Disagree? Shut the fuck up, Donny. You're out of your element.


Borat

When Todd Phillips signed on to make Borat, he was maybe the hottest comedy auteur in Hollywood. He had just followed up a solid comedy and one-of-the-movies-that-enjoyed-a-popularity-bump-by-way-of-riding-Tom-Green's-briefly-worn-coattails in Road Trip with Old School, and he seemed unstoppable. Then, part of the way through production on Borat, Sacha Baron Cohen went to a rodeo in character, stood up in the ring in front of the crowd and sang the "Kazakhstan National Anthem" ('Kazakhstan is number one exporter of potassium / All other countries have inferior potassium.') to the tune of the U.S. National Anthem. The crowd didn't like it (kind of the point), and neither did Phillips.

He left the project and was replaced more than adequately by longtime sitcom guru Larry Charles, who helmed the movie to the type of success that inspires almost immediate backlash due to a disgusting level of oversaturation. In the meantime, Todd Phillips has struggled to get anybody from Old School interested in making Old School Dos. Also, School for Scoundrels was terrible. Don't see that.


Casino Royale

No, this is not the 2006 reboot of the James Bond franchise. This is the 1967 disaster of a movie that proves the adage, "Too many cooks spoil the broth, and also they fail miserably when they try to collaborate on a movie together." If you're like me, you Netflix'd this bad boy when you saw the names of Billy Wilder, Woody Allen, Terry Southern, Peter Sellers and Joseph Heller under the 'writing credits' section of imdb. I don't just like all of those guys, I love them. If it was feasible, I'd impregnate them all. Or let them impregnate me. Don't care. At any rate, as someone who was aware of the above adage, I was skeptical. And I was right to be, as, although Casino Royale brings together as much comedy talent as any movie before or since, or more, it is a confusing mash-up of loosely related plots and settings that fails at pretty much everything it attempts. It is not only the weirdest James Bond movie -- weird enough that it's pretty dissociating that it even exists -- it's one of the strangest movies ever.


Superman 2

Originally, Richard Donner signed on to direct the first two Superman movies in one extended shoot. His (and the studio's) plan was to shoot both movies back to back, then edit them and release them both in relatively short order. However, with the photography on Superman complete and 3/4 complete for Superman II, the decision was made to halt production so that Donner could settle down with his editors and prepare the first movie for release. After a series of intractable disagreements with the producers -- most notably the fact that they decided to cut all of Marlon Brando's scenes from the movie as a result of his demands for over 10% of profits -- Donner was replaced on the project by Richard Lester, who rejiggered the entire project to suit the whims of the producers as well as the ideas he'd fermented while working as an uncredited producer on the original shoot. When it was finally released, Superman II proved to be a disappointment. Eventually, in 2006, the original director's cut was released on DVD, although much of the footage shot by Lester still had to be used due to the aforementioned fact that only 75% of the movie had been shot before production halted. The results aren't always pretty, but they're certainly interesting.


The Matrix

More brothers. Or are they brother and sister now? Sex changes aside, the Wachowskis scored a gargantuan surprise hit with The Matrix in 1999. It was a movie that I, and many others, ignored based on the previews, mainly because of deja vu over Johnny Mnemonic. Unlike the case with the Coen brothers, The Matrix is a no-brainer choice to represent the cooperative career of the Wachowskis, whose previous and subsequent attempts to resonate with audiences failed to garner more than niche interest. I mean, come on -- raise your hand if you thought Speed Racer was some sort of elaborate prank the first time you saw the trailer. The Matrix, meanwhile, melded pop philosophy with kung fu and moderately attractive women in long leather jackets in a way that had never been done before. The special effects broke new ground, and we learned to love Keanu again. Well, sort of. We learned to tolerate him. And, somehow, it was actually believable that he knew kung fu.


Four Rooms

If, without looking at IMDb, you can name either of the directors who worked on Four Rooms who are not Robert Rodriguez or Quentin Tarantino, give yourself a point. If you can also name a single movie either of them worked on, give yourself a round of applause. When Four Rooms was released, Rodriguez and Tarantino were hot commodities. Tarantino had released Pulp Fiction the previous year and was a critics' darling; Rodriguez was the king of stylized action with his transition from the ultra-low budget El Mariachi to the ridiculous but also insanely entertaining sequel, Desperado. It's hard to say that anybody was eagerly anticipating the other two sections of the movie, except in that one of them featured Madonna, to the delight of probably somebody. All in all, Four Rooms was timed well enough to generate interest, but it wasn't good enough or cohesive enough (to be expected) to generate anything resembling a following, and it has become an afterthought in the oeuvres of Tarantino and Rodriguez. Unfortunately for them, this was the pinnacle of exposure for the other two directors, Allison Anders and Alexandre Rockwell.


Menace II Society

Brothers again: this time, the Hughes brothers weigh in with by far their most appreciated and important work, although, it should be said, their movies are generally underappreciated. Menace is the story of one young man's descent into a life of crime in Watts, L.A. Gritty and deterministic, this movie's depiction of urban life went unchallenged for a few years until the release of the sickeningly gritty and frighteningly deterministic Kids.


Ratatouille

With perhaps the best director working in animation today at the helm, it probably shouldn't have been a surprise that Ratatouille turned out as well as it did. However, given the track record of movies that've changed directors in the middle of production, it's hard to argue that skepticism was unwarranted. Indeed, the original director, Jan Pinkava, had been working on Ratatouille for almost five years when Brad Bird replaced him on the project for unknown reasons. Pinkava retained a co-director credit on the film, but has refused to comment on the switch and has since left Pixar. Bird, meanwhile, looks untouchable.


Sin City

The third entry to involve Robert Rodriguez, Sin City also marked the transition of comic book writer and artist Frank Miller to the director's chair -- a role he's playing fully for the hotly anticipated Christmas '08 release The Spirit. It was only logical for Miller to lend a hand in directing Sin City, as Rodriguez made the decision to translate Miller's unique visual style to the screen. Sin City has to be considered one of the most effective and seamless movies ever co-directed by two people who are not related to each other, perhaps a testament to the communication, the shared vision and the distribution of roles between the two. It probably didn't hurt that the movie kicked ass for two straight hours, either. Two sequels are in the works.


American History X

In one of the most publicized feuds in movie history, director Tony Kaye attempted to remove his name from the credits of American History X after... something happened. There are conflicting reports about what that 'something' was: it could have been Tony Kaye's insistence on reshooting much of the movie while making peace and poetry a central theme, or it could have been Ed Norton's insistence that he be given more screen time. Whatever the case, Kaye ended up leaving the production and attempted to get the film credited to Humpty Dumpty. Unfortunately for him, the Directors Guild doesn't tend to allow any names other than 'Alan Smithee' to replace that of the original director, and additionally, Kaye talked to the press about why he wanted his name off the project: a direct no-no in the eyes of the Guild. So, while Kaye fought with the studio and the Guild over what name would appear in the director's credit, Norton and the producers assumed the role of director for the final edit of the movie, which, it should be noted, contains a ton of screen time for Ed Norton. And not a lot of poetry.
[Via filmwad.com]

AXTI

Unboxing a Mint Apple //c 20 Years Later


Geek porn - taking painstaking photos of every step of the unboxing process - has been around for just a short while on the Internet. It’s so recent, in fact, that it didn’t exist when the Apple //c was brand new. Fortunately, there are still unopened Apple //c’s out there in the world, and Flickr user Dansays found one on eBay. And because he’s a contemporary geek, he documented every step of the process. It's a fascinating reminder of just how completely Apple committed to design in the mid-1980s. Frog Design’s Snow White design language is still as sophisticated today as it was then. And the intricacies of the packaging! It’s like looking into the future - 20 years ago. Make sure to click through to see many, many more images.

Flickr via Boing Boing via Andre Torrez’s notes.

Marty McFly

Tangle 1.1.1

I’m sure there’s a major discovery to be made in the world of science that would explain how my iPod headphones get tangled up so thoroughly and rapidly. It seems that no matter what cunning tricks I employ, nor how tidy I try to be, my headphones always appear in a knotted mess when I want to use them, which tends to make me angry on the scale of ‘want to kick a puppy’. Surprisingly, then, I really like Tangle, which, in a broad sense, is rather like untangling a set of iPod headphones or ten.

It’s safe to say that Tangle is gaming at its purest level. There are no characters or storylines. Instead, there are a bunch of green circles, connected with gray lines, displayed in an aesthetic manner that most 8-bit computers would have little trouble with. The idea is to drag the circles around until no lines are crossed, whereupon you’re provided with a jaunty little jingle, a time, and a means of accessing the next level (which has more lines to uncross).
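The mechanic described above - a level is solved once no two connecting lines cross - boils down to a standard segment-intersection test over every pair of edges. Here's a minimal sketch in Python of how such a check might work; all of the names are my own illustrations, not anything from the actual game:

```python
# Sketch of a Tangle-style win condition: the level is "solved" when
# no two edges of the graph cross. Names are illustrative only.

def ccw(a, b, c):
    # Cross product: positive if a -> b -> c turns counter-clockwise.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, p3, p4):
    # Proper intersection test for segments p1-p2 and p3-p4.
    # Edges that merely share an endpoint (a common vertex) don't count.
    if p1 in (p3, p4) or p2 in (p3, p4):
        return False
    d1 = ccw(p3, p4, p1)
    d2 = ccw(p3, p4, p2)
    d3 = ccw(p1, p2, p3)
    d4 = ccw(p1, p2, p4)
    # The segments cross when each straddles the line through the other.
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def is_solved(vertices, edges):
    # vertices: list of (x, y) positions; edges: list of (i, j) index pairs.
    segs = [(vertices[i], vertices[j]) for i, j in edges]
    return not any(
        segments_cross(*segs[a], *segs[b])
        for a in range(len(segs))
        for b in range(a + 1, len(segs))
    )
```

Dragging a circle in the game would just update one entry in `vertices` and re-run `is_solved`; the brute-force pairwise check is quadratic in the number of edges, which is plenty fast at the handful of edges per level an 8-bit-style puzzle uses.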

Tangle isn’t rocket science—it has a kind of mindless quality that’s akin to Tetris. But as most people who’ve sampled Alexey Pajitnov’s classic will testify, it’s often the simplest games that are the most enduring. Although Tangle isn’t on a par with the Russian block-stacking game, and, frankly, is a little overpriced, it’s still a fun title to while away the odd half-hour. And despite the extremely basic visuals, on-screen feedback is clear, and the online leaderboard enables you to pit your capabilities against Tangle ninjas around the world.

Product Review By Craig Grannell [Via cultofmac.com]


Tangle screen grab

Manufacturer: MC Hot Software

Price: $20

URL: mchotsoftware.com/tangle/

Mercury is shrinking, Nasa space probe reveals

The smallest planet in the solar system is shrinking, a space mission to Mercury has revealed.

The Messenger space probe, which began its 5 billion mile journey to Mercury four years ago, has provided new evidence that the closest planet to the sun is contracting in on itself.

Nasa scientists revealed images of cliffs and faults crossing the planet's surface, which they say are the result of the crust buckling as Mercury's molten core cools down and solidifies.

The planet may have lost around 3 miles from its 3,000 mile diameter throughout its history.

They said Messenger had revealed that the shrinking caused by this cooling is far greater than had previously been thought.

In a special issue of the journal Science, researchers reveal that volcanoes have covered large swathes of the landscape with smooth plains of lava flow during the early part of Mercury's history – around three to four billion years ago.

Messenger, which managed to take photographs of around a fifth of the planet's surface in a fly-by in January this year, has also solved the mystery surrounding the source of Mercury's magnetic field.

Scientists had been unsure whether the magnetic activity was actively produced by the stirring of the planet's molten core, as it is on Earth, or was simply the result of large deposits of iron that act like a giant bar magnet, as is the case on the Moon.

The mission revealed that Mercury's magnetic field is actively generated, like the Earth's, raising hopes that the planet can be used to study how Earth would have behaved and looked in its early history.

"After five months of analysis we know have some fantastic results," said Sean Solomon, principal investigator on the Messenger Mission and a researcher at the Carnegie Institution of Washington, Washington DC.

"The dominant landforms on Mercury are features called lobate scarps. These are huge cliffs that mark the tops of faults that formed during the contraction of the surrounding area.

"The data we have from the fly-by tells us that the total contraction is at least one-third greater than we had appreciated."

Mercury is unusual compared to other planets in the solar system because it has such a large core.

Scientists estimate that the mainly iron-based core accounts for 60 per cent of the planet's mass and 75 per cent of its diameter.

Because solid iron is denser than liquid iron, the planet's innards contract as its core cools, causing the crust to buckle inwards and create the faults seen on the surface.

Scientists estimate that Mercury's diameter has decreased by about one tenth of one percent - a big shrinkage in geological terms.

During its first fly-by of Mercury in January this year, Messenger flew within 125 miles of the planet's surface to take pictures and carry out other tests.

The images sent back by Messenger have solved a 30-year controversy about patches of smooth rock first seen on the surface by the Mariner 10 mission in 1975.

Researchers debated whether these were hardened lava or material thrown out by asteroid impacts.

The images showed a large volcanic vent in the centre of one of these smooth patches, which have now been confirmed as the remains of lava flows that spewed out of the vents like a volcanic fountain.

Another vent also showed signs of molten rock that had oozed outwards over the surface.

Colour images of an area known as the Caloris basin showed that it was completely filled with these smooth volcanic plains.

Scientists also said they were unable to rule out that there was still volcanic activity taking place on the surface of the planet.

Messenger is due to return to Mercury for another fly-by in October this year and again in September next year before settling into orbit around Mercury in 2011.

It is hoped that far more of the planet's surface can be studied in greater detail and evidence for ongoing volcanic eruptions may be found.

James Head, a planetary geologist at Brown University, Providence, Rhode Island, warned, however, that eruptions will have become more infrequent as the planet's surface contracted.

He said: "The more shrinkage there is the less lava will get out and Mercury has experienced a huge amount of shrinkage.

"The volcanic activity we have seen seems to date from the first half of the solar system's history, but because the interior of the planet is still active we will be looking for evidence that there are still eruptions."

The probe has also shown how violent volcanic eruptions have helped to shape the surface of the planet.

By Richard Gray [Via telegraph.co.uk]

Sigg Jones

Another day, another shiny, fantastic, eye-poppingly good short film from the students at powerhouse animation school Supinfocom. "Sigg Jones" is just another example of the high calibre projects the school continues to turn out year after year after year. It was created by Douglas Lassance, Matthieu Bessudo, and Jonathan Vuillemin (2 out of the 3 have already been snatched up by The Mill) using 3ds Max, Premiere, and Photoshop during the 2005/2006 academic year at Supinfocom. Watch Sigg Jones here.

Will Harry Potter and the Half-Blood Prince be Movie Magic?

Harry Potter and the Half-Blood Prince

Yes, it’s nearly that time again! Harry Potter is returning in Harry Potter and the Half-Blood Prince. Empire Magazine has a cover of Harry determined and bloodied, ready to go to battle for his beloved Hogwarts. They also have a link to USA Today’s first look at the new film. That first look includes some new photos - well, some are new, some have been previously released. Half-Blood Prince, which arrives in theaters November 21, will, among other things, explore Draco Malfoy giving in to the dark side. It’s not like we didn’t see this coming. He’s been a nasty little creature since the first film.

As usual Hogwarts is threatened and Harry and company must do their best to save the old school so they can fight on for another day and at least two more films.

The Harry Potter literary saga is now complete. The readers of the books know the end to the story. It’s very likely that people who haven’t read the books know how it ends, so there has to be an effort to make these movies stand on their own apart from the books and each other.

Can this be accomplished? So far, I don’t believe any of the films have been particularly great. They’re okay to good, and they please many of the Harry Potter fans. Looking at the saga purely from the perspective of their merit as films, I don’t think any of them have achieved greatness.

The final book, “Deathly Hallows” will be split into two films and I wonder if those final films, along with Half Blood Prince will truly create magic on film.

Have a look at the new Empire Magazine Cover as well as a gallery of great new photos from the film below:

Half Blood Prince Empire Cover

MICHAEL GAMBON as Albus Dumbledore in Warner Bros. Pictures' fantasy "Harry Potter and the Half-Blood Prince"


JIM BROADBENT as Professor Horace Slughorn and EMMA WATSON as Hermione Granger in Warner Bros. Pictures' fantasy "Harry Potter and the Half-Blood Prince"

EMMA WATSON as Hermione Granger, RUPERT GRINT as Ron Weasley and DANIEL RADCLIFFE as Harry Potter in Warner Bros. Pictures' fantasy "Harry Potter and the Half-Blood Prince"

Director DAVID YATES on the set with DANIEL RADCLIFFE as Harry Potter and BONNIE WRIGHT as Ginny Weasley in Warner Bros. Pictures' fantasy "Harry Potter and the Half-Blood Prince"

TOM FELTON as Draco Malfoy in Warner Bros. Pictures' fantasy "Harry Potter and the Half-Blood Prince"

RUPERT GRINT as Ron Weasley in Warner Bros. Pictures' fantasy "Harry Potter and the Half-Blood Prince"

By Robin Ruinsky [Via Film School Rejects]

July 03, 2008

"MUTO" - Some Seriously Amazing Stuff!




Sarah Silverman

Mythbusting: Ideas Do Not Spread Because They Are Good

I’d like to debunk a myth that has gone on, rampant and unchallenged in marketing circles, especially viral and social marketing, for some time now, but first I feel a few caveats are in order.

First: product quality is important; no amount of marketing will alchemize a bad product into a good one. Second: even the most virulent of viral marketing campaigns can leave a brand or product right where it started. And third: I acknowledge that far too often the term “viral” is thrown around, misunderstood and slathered on like a panacea, but most of the people who do this also attempt to ruin many other good concepts with pseudo-science and smoke-and-mirrors.

Now the myth: For an idea, piece of content or product to spread or (cringe) “go viral” it has to be a great product. This is WRONG.

When Richard Dawkins coined the term meme in 1976 (over three decades ago and before I was born) he said:

Remember that ‘survival value’ here does not mean value for a gene in a gene pool, but value for a meme in a meme pool.

That book, The Selfish Gene, posited (and largely put the argument to bed) that genes replicate for their own good, not the good of the host. Genes survive and thrive not based on how much value they bring to the creature they inhabit but based on how good they are at replicating; they’re selfish. There are plenty of genes whose phenotypes produce negative results for their hosts, yet they continue to spread.

The same is true, and perhaps even more obviously, for memes. Auto-toxic memes are harmful to their host, and exo-toxic memes are dangerous to others. The list of virulently “adopted” bad ideas is endless, but here’s a small sample:

  • Blood feuds

  • Terrorism

  • Suicide

  • Drug abuse

  • Antisemitism

  • Pyramid schemes

  • Cults

Daniel Dennett gave a talk on harmful memes at TED in 2002:


So clearly, ideas do not spread based on their “quality” or the “value” they provide; in fact they have an entirely different set of selection criteria, which Francis Heylighen has detailed.

Perhaps finally we can rid ourselves of the admittedly quaint and comforting notion that we only adopt ideas, content and products because of how good and useful they are, and start to understand that we adopt them because they are good at getting adopted.

Via Dan Zarrella