> Quoting Miike Quenling Ellis <flagrant_sake at yahoo.com>:
>
>> that explains all those bad CG films. some day the tech will get
>> advanced enough so that it actually looks like a real film.
>
> I wonder about this. The programmers say that the sheer programming
> requirements of visual effects are mind-boggling, so you quickly run
> out of overhead, i.e. you'd need vast amounts of processing power to
> display relatively simple things, like a landscape or even a single
> halfway photographic-quality tree. The films are evolving as fast as
> the CGI technology, so it's the never-ending race of the Red Queen.
The above's a little behind the times. For one thing, filmmakers _do_ have the resources to make film-quality CG imagery. Check into the capabilities of the rendering farms created by WETA for the Tolkien films.
Are we rendering the tree or the landscape as a complete 3-D model in the computer? That'd take massive amounts of data (branches, curvature, texture, light bounce, etc.). But if we're simply compositing photographed images of trees onto a flat background image, then it's really pretty easy to do.
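To make "compositing onto a flat image" concrete, here's a minimal sketch of the standard "over" operator that layers one image element on top of another, one pixel at a time. The pixel format and example values are my own invention for illustration, not anything from a particular film pipeline:

```python
# Sketch of the "over" compositing operator: a foreground pixel
# (say, from a photographed tree element) is layered over a background
# pixel. Each pixel is (r, g, b, a) with straight alpha in 0..1.

def over(fg, bg):
    """Composite one foreground RGBA pixel over one background RGBA pixel."""
    fr, fg_g, fb, fa = fg
    br, bg_g, bb, ba = bg
    out_a = fa + ba * (1.0 - fa)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda f, b: (f * fa + b * ba * (1.0 - fa)) / out_a
    return (blend(fr, br), blend(fg_g, bg_g), blend(fb, bb), out_a)

# A fully opaque foreground pixel simply replaces the background:
print(over((1.0, 0.0, 0.0, 1.0), (0.0, 0.0, 1.0, 1.0)))
# A half-transparent one mixes the two:
print(over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0, 1.0)))
```

No geometry, lighting, or light-bounce calculation anywhere in sight, which is why flat compositing is so much cheaper than rendering a full model.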
And one doesn't have to render the stuff in complete, excruciating detail, either. Let's say we've modelled a tree in our computer... but the tree's only a tiny part of an overall scene. So one doesn't need to create a fully-detailed tree. One uses a model that's just good enough to make the shot work. (Albert Whitlock, one of the greatest matte painters, used a lot of pointillist techniques in his work. The paintings looked very indefinite, almost impressionistic... but the viewers' own minds would 'fill in the details' when they saw the final scenes.)
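This "just good enough" idea is what renderers call level of detail: the smaller the tree is on screen, the coarser the model you can get away with. A toy sketch of the selection logic (the thresholds and polygon counts below are invented purely for illustration):

```python
# Hypothetical level-of-detail (LOD) picker. The fraction of the frame
# an object covers decides how detailed a model is actually needed.
# Thresholds and polygon counts are made up for this example.

TREE_LODS = [
    (0.20, "hero tree, full geometry (~500k polygons)"),
    (0.05, "mid tree, simplified mesh (~20k polygons)"),
    (0.01, "far tree, a few textured cards"),
    (0.00, "distant tree, a single billboard sprite"),
]

def pick_lod(screen_fraction):
    """Choose a model by the fraction of the frame the tree occupies."""
    for threshold, model in TREE_LODS:
        if screen_fraction >= threshold:
            return model
    return TREE_LODS[-1][1]

print(pick_lod(0.30))   # close-up: needs the detailed model
print(pick_lod(0.002))  # speck on the horizon: a flat sprite will do
```

The distant tree is Whitlock's pointillism in digital form: a vague blob the viewer's mind fills in.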
The problem with computer-generated effects isn't resolution or fineness of detail. It's mainly motion (CG creatures don't always move right), motion artifacts (like blur), mismatches in the color palette (the animals in _Jumanji_ looked pretty washed-out), or perspective.
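On the blur point: a film camera averages the scene over the time its shutter is open, while a naive renderer samples a single crisp instant, which is part of why fast-moving CG elements can look strobe-like. A one-dimensional toy sketch of the difference (the object, speeds, and sample counts are all invented for illustration):

```python
# Toy sketch of motion blur: a film camera averages over the shutter
# interval, so a renderer can fake it by averaging several sub-frame
# samples instead of rendering one razor-sharp instant.

def position(t):
    """Hypothetical 1-D object moving at a constant 10 units per frame."""
    return 10.0 * t

def render_crisp(frame_time):
    # One sample per frame: the strobing, "too sharp" CG look.
    return position(frame_time)

def render_blurred(frame_time, shutter=0.5, samples=4):
    # Average evenly spaced samples across the open-shutter interval.
    times = [frame_time + shutter * i / (samples - 1) for i in range(samples)]
    return sum(position(t) for t in times) / samples

print(render_crisp(1.0))    # the exact position at frame 1
print(render_blurred(1.0))  # smeared toward where the object is heading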