Friday, March 4, 2011

The wrong solution

A few days ago, I watched Avatar, a movie I'd been unwilling to pay to go see because, well, I'd been told it really has to be watched in 3D, and frankly, I find 3D uncomfortable and disconcerting. It doesn't bode well for a movie if a technical issue makes a significant difference to whether it's worth watching (mind you, I wouldn't watch 2001 pan-and-scanned, so...)

And it was awful. Just awful. I'm sorry, I know that will probably rub some of my readers up the wrong way, but as I told a friend, the dialog and acting came across as sub-George Lucas.

But that wasn't the only issue. A major distraction was the very thing the movie was supposed to showcase: it was entirely computer generated. And while there were many scenes where the computer generation wasn't obvious, a major problem - to me - was that there were far too many scenes where it was. Scenes where everything from textures to human movement (and I'm not just talking about the big blue people here) was unreal and, well, creepy.

(Let me just say, before anyone gets the wrong idea, that it's an extraordinary technical accomplishment. It just doesn't work.)

Now, obviously James Cameron did something right, because the movie made more than a billion dollars, but it needs to be said: this could have been a more watchable movie. Leaving aside the cheesy script, it could have been more watchable had the right technologies been used. Having seen it, I have to wonder whether the issue with watching it in 2D isn't that you lose the breadth that 3D viewing gives you, but that 3D is necessary precisely so that you're distracted by it and overlook the computer graphics.

Lessons to learn? Well, I guess they're twofold: (1) just because something is technologically impressive doesn't mean it's ready yet, and (2) just because something isn't ready yet doesn't mean you can't make billions from it. See also: the Apple iPad.
