As CPUs and GPUs continue to get more powerful with each new generation, the push for ever more realistic graphics in blockbuster games shows no signs of slowing down. Today's best-looking titles already look stunning, so how much better can they possibly get? Which technologies will become as commonplace as texture filtering or normal mapping is today? What systems will help developers reach these higher standards?

Join us as we take a look at what awaits us in the future of 3D graphics.

Before we head off into the future, it's worth taking stock of the advances in 3D graphics over the years. Much of the theory behind 3D rendering (e.g. vertex transformations, viewport projections, lighting models) is decades old, if not older.

The concept of the z-buffer is generally attributed to Ph.D. student Wolfgang Straßer, who was working at TU Berlin in 1974. It is nothing more than a portion of memory used to store depth information about the objects in a scene, and it is primarily used to determine whether or not a surface is hidden behind something else (which in turn allows hidden surfaces to be discarded instead of rendered; it can also be used to generate shadows).

The first commercial hardware to make use of the buffer appeared within five years or so, but the general public would have to wait over 20 years, until the mid-90s, for the arrival of the Nintendo 64 and its z-buffer-enabled Reality co-processor.

For its time, one of the most powerful graphics chips available to the general public.

The same is true for other rendering standards: Gouraud shading (Henri Gouraud, 1971), texture mapping (Edwin Catmull, 1974), and bump mapping (Jim Blinn, 1978). It would be decades before any casual gamer got to see these techniques in action on home entertainment systems, and it was the likes of Sony, Sega, and Nintendo that made it happen, with their 3D-focused consoles.

By the standards of today, games for those early machines, such as the first PlayStation, were primitive in the extreme, as developers were still getting to grips with 'modern' rendering. Wobbly textures? Glitchy polygons? It has to be the original Tomb Raider from 1996.

At the same time, PC hardware vendors were also getting in on the 3D act, and in just five years, desktop computers around the world were sporting graphics cards boasting support for shaders, z-buffers, texture mapping, et al.

It would be the evolution of these graphics chips that would drive the development of 3D graphics, but predicting how games might look in the near future was somewhat tricky, despite the obvious path that GPUs would take. One company gave it a go: MadOnion (later Futuremark) attempted to show everyone what graphics might look like with 3DMark, based on feedback it received from developers and hardware vendors. But all its efforts really demonstrated was that the evolution of graphics, both software and hardware, was too rapid to accurately predict how things were going to turn out.

Just two years separate the Nature tests of 3DMark2001 and 3DMark03.

Everything that could be measured in numbers was increasing at a frantic rate. In the early days of 3D games, the number of polygons per frame was often used as a selling point, but now it never gets mentioned, because it's just a ridiculous amount. For example, the first Tomb Raider running on the PlayStation used 250 triangles for Lara, whereas the PC version of Shadow of the Tomb Raider can use up to 200,000 for the main character.

Texture size and the number of textures used have ballooned each year, too, and we're now at the point where even just an HD texture pack can be the same size as the rest of the game's assets combined (take a step forward, Far Cry 6).

Doom (1993) vs Doom (2016) - 0.01 GB vs.

This constant increase in polygons and textures is an unfortunate necessity in the search for ever more realistic, or rather, ever more detailed, graphics.

So we can start our journey into the future by being certain about one thing: while there's still plenty of ongoing research into procedural texture generation (which uses algorithms to generate textures on the fly) and other shortcuts, the traditional use of bitmap images for textures, along with vertices and indices for models, isn't going to disappear any time soon.
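The z-buffer's hidden-surface test described above can be sketched in a few lines. This is a minimal, CPU-side illustration of the per-pixel depth comparison that GPUs perform in hardware; the buffer sizes and names here are illustrative, not from any real renderer.

```python
# Minimal z-buffer sketch: one depth value is stored per pixel, and a new
# fragment is kept only if it is closer to the camera than whatever was
# drawn there before. Real GPUs do this comparison in dedicated hardware.

WIDTH, HEIGHT = 4, 4
FAR = float("inf")  # every pixel starts "infinitely far away"

depth_buffer = [[FAR] * WIDTH for _ in range(HEIGHT)]
color_buffer = [["black"] * WIDTH for _ in range(HEIGHT)]

def draw_fragment(x, y, z, color):
    """Draw a fragment only if it passes the depth test (smaller z = closer)."""
    if z < depth_buffer[y][x]:
        depth_buffer[y][x] = z
        color_buffer[y][x] = color
        return True   # fragment was visible and written
    return False      # hidden behind an earlier surface, so discarded

# A red surface at depth 5.0, then a blue surface behind it at depth 9.0,
# both covering the same pixel. Draw order no longer matters.
draw_fragment(1, 1, 5.0, "red")
drawn = draw_fragment(1, 1, 9.0, "blue")

print(color_buffer[1][1], drawn)  # red False
```

Note that the test works regardless of the order surfaces are submitted, which is exactly why it replaced painter's-algorithm-style depth sorting.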
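The procedural texture generation mentioned above can also be sketched briefly: instead of storing a bitmap, each texel is computed on demand from its coordinates. A checkerboard is the classic toy example (real engines use noise functions the same way); the function name and parameters here are illustrative only.

```python
# Minimal procedural texture sketch: the texture is a function of the
# texture coordinates (u, v), computed on the fly, so no image data is
# ever stored. A checkerboard is the simplest such pattern.

def checker_texel(u, v, squares=4):
    """Return 0 or 1 for texture coordinates u, v in [0, 1)."""
    return (int(u * squares) + int(v * squares)) % 2

# Sample an 8x8 region of the "texture" without allocating a bitmap.
size = 8
texture = [[checker_texel(x / size, y / size) for x in range(size)]
           for y in range(size)]

for row in texture:
    print("".join("#" if t else "." for t in row))
```

The appeal is obvious: the pattern costs a handful of arithmetic operations per texel instead of megabytes of storage, which is why this line of research keeps attracting attention even as bitmap textures remain the norm.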