3D computer graphics, or three-dimensional computer graphics (in contrast to 2D computer graphics), are graphics that use a three-dimensional representation of geometric data (often Cartesian) stored in the computer for the purposes of performing calculations and rendering 2D images. Such images may be stored for viewing later or displayed in real time. One example of this concept is a video game that rapidly renders changing 3D environments to produce an illusion of motion. Computers have been capable of generating simple 2D images such as lines and polygons in real time since their invention. However, quickly rendering detailed 3D objects is a daunting task for traditional Von Neumann architecture-based systems; an early workaround to this problem was the use of sprites, 2D images that could imitate 3D graphics.
3D computer graphics rely on many of the same algorithms as 2D computer vector graphics in the wire-frame model and 2D computer raster graphics in the final rendered display. In computer graphics software, the distinction between 2D and 3D is occasionally blurred; 2D applications may use 3D techniques to achieve effects such as lighting, and 3D may use 2D rendering techniques.
3D computer graphics are often referred to as 3D models. Apart from the rendered graphic, the model is contained within the graphical data file. However, the two are distinct: a 3D model is the mathematical representation of a three-dimensional object, and it is not technically a graphic until it is displayed. A model can be displayed visually as a two-dimensional image through a process called 3D rendering, or it can be used in non-graphical computer simulations and calculations.
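The distinction above can be made concrete with a minimal sketch: a 3D model is just geometric data (here, Cartesian vertex coordinates) until a rendering step projects it onto a 2D image plane. The cube, the focal length, and the camera distance below are illustrative assumptions, not a real renderer.

```python
# A minimal sketch: a 3D "model" is only data until it is rendered.
# The cube vertices, focal_length, and camera_z values are assumptions
# chosen for illustration, not any particular engine's conventions.

# Eight vertices of a unit cube, stored as Cartesian coordinates.
cube = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

def project(vertex, focal_length=2.0, camera_z=5.0):
    """Perspective-project a 3D point onto a 2D image plane."""
    x, y, z = vertex
    depth = z + camera_z          # distance from the camera
    scale = focal_length / depth  # farther points shrink on screen
    return (x * scale, y * scale)

# "Rendering" here is just projecting every vertex to 2D.
image_points = [project(v) for v in cube]
```

The same vertex data could equally feed a non-graphical simulation (collision tests, volume calculations) without ever being drawn, which is the sense in which a model is not yet a graphic.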
Real-time computer graphics or real-time rendering is the sub-field of computer graphics focused on producing and analyzing images in real time. The term can refer to anything from rendering an application's graphical user interface to real-time image analysis, typically using a GPU.
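"Real time" in this context means each image must be produced within a fixed per-frame budget (about 16.7 ms at 60 frames per second). The loop below is a hedged sketch of that idea; the `render_frame` stub is an assumption standing in for actual GPU work, not any real engine's API.

```python
import time

# Sketch of a real-time render loop: do the frame's work, then sleep off
# whatever remains of the frame budget so frames are paced evenly.
# render_frame is a hypothetical placeholder for real rendering work.

FRAME_BUDGET = 1.0 / 60.0  # seconds per frame at 60 FPS

def render_frame(frame_number):
    """Placeholder for per-frame rendering work."""
    return f"frame {frame_number}"

def run(num_frames):
    frames = []
    for i in range(num_frames):
        start = time.perf_counter()
        frames.append(render_frame(i))
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # pace to the budget
    return frames

frames = run(3)
```

If the work exceeds the budget, a real system must drop frames or reduce quality, which is why real-time rendering is usually offloaded to a GPU.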
Recently, Epic Games and The Mill in New York teamed up to revolutionize the conventions of digital filmmaking in an endeavor code-named Project Raven. Epic provided an advanced version of its Unreal Engine, along with features still under development, for real-time production. The Mill, an expert in VFX and production, used its just-released virtual production tool kit, Mill Cyclops, along with a technology-laden, fully adjustable vehicle rig called the Mill Blackbird that captures high-quality environment data. Car manufacturer Chevrolet gave the collaboration additional traction with design information for two of its sports cars (a 2017 Camaro ZL1 and an FNR concept vehicle) and further functionality resulting in a new level of customer engagement.
The first innovation came when lighting and other ever-changing environmental information was captured and processed instantaneously on set, then applied to two CG models that seamlessly replaced the Blackbird as seen through the camera – all in real time. Later, this same real-time technology was used to create, alter, and produce a short film that uses augmented reality (AR) to highlight a new methodology of production. Called “The Human Race,” the film merges real-time photorealistic visual effects and live-action storytelling, as the two Chevrolet vehicles – one an autonomous futuristic car and the other a high-end sports car – zip over a winding mountain road during a race between man and machine.
Neither car is real in the live-action scenes; both are computer-generated. What’s special here is that they were not added in post. Rather, they were rendered at photoreal quality in real time and composited seamlessly into the scene, utilizing the lighting information captured by the Blackbird, the only physical vehicle at the location aside from the film car carrying the director, DP, and others.
The film will be used to kick off a multi-platform campaign marking the 50th anniversary of the Camaro. First, though, a live re-creation of the AR visualization was shown during Epic’s presentation at the Game Developers Conference in March, showcasing the real-time technologies used for the project.