② Some types of cameras expose the film by sliding a
rectangular slit across the film. This leads to interesting effects when
objects are moving in a different direction from the exposure slit
(Glassner 1999; Stephenson 2007).
Furthermore, most digital cameras read
out pixel values from scanlines in succession over a period of a few
milliseconds; this leads to rolling shutter artifacts, which have
similar visual characteristics. Modify the way that time samples are
generated in one or more of the camera implementations in this chapter to
model such effects. Render images with moving objects that clearly show
the effect of accounting for this issue.
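As one possible starting point, the sketch below makes each ray's time depend on its raster-space y coordinate so that scanlines are exposed in succession. Float, Point2i, Lerp(), shutterOpen, shutterClose, and CameraSample::pFilm are pbrt's; the readoutFraction parameter (the portion of the shutter interval spent scanning out rows) is a hypothetical addition.

    // Sketch of a rolling-shutter time mapping for a pbrt-style camera.
    Float RollingShutterTime(const CameraSample &sample, Float shutterOpen,
                             Float shutterClose, const Point2i &fullResolution,
                             Float readoutFraction) {
        // Scanline's fractional position in the image; rows farther down
        // start their exposure window later.
        Float rowFrac = sample.pFilm.y / fullResolution.y;
        // Each scanline is exposed over a window of normalized length
        // (1 - readoutFraction), offset according to its position.
        Float t0 = Lerp(rowFrac * readoutFraction, shutterOpen, shutterClose);
        Float t1 = Lerp(rowFrac * readoutFraction + (1 - readoutFraction),
                        shutterOpen, shutterClose);
        return Lerp(sample.time, t0, t1);
    }

The returned value would replace the uniform Lerp(sample.time, shutterOpen, shutterClose) used when a camera's GenerateRay() method sets the ray's time.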
② Write an application that loads images rendered by the
EnvironmentCamera, and uses texture mapping to apply them to a sphere
centered at the eyepoint such that they can be viewed interactively. The
user should be able to freely change the viewing direction. If the
correct texture-mapping function is used for generating texture
coordinates on the sphere, the image generated by the application will
appear as if the viewer were at the camera’s location in the scene when it
was rendered, thus giving the user the ability to interactively look around
the scene.
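For the texture-coordinate generation, one approach is to invert the latitude-longitude parameterization used by pbrt's EnvironmentCamera (which treats y as the polar axis), converting each viewing direction on the sphere to (u, v) coordinates. The sketch below assumes that convention:

    // Maps a viewing direction to (u, v) coordinates in a latitude-longitude
    // environment map, inverting the EnvironmentCamera's parameterization
    // (y is the polar axis). The viewer application would call this per
    // vertex or per pixel.
    #include <cmath>

    struct UV { float u, v; };

    UV DirectionToLatLongUV(float x, float y, float z) {
        const float Pi = 3.14159265358979f;
        float len = std::sqrt(x * x + y * y + z * z);
        x /= len; y /= len; z /= len;
        float theta = std::acos(std::fmax(-1.f, std::fmin(1.f, y)));  // [0, Pi]
        float phi = std::atan2(z, x);                                 // [-Pi, Pi]
        if (phi < 0) phi += 2 * Pi;                                   // [0, 2Pi)
        return { phi / (2 * Pi), theta / Pi };
    }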
② The aperture stop in the RealisticCamera is modeled
as a perfect circle; for cameras with adjustable apertures, the aperture is
generally formed by movable blades with straight edges and is thus an
n-gon. Modify the RealisticCamera to model a more realistic
aperture shape and render images showing the differences that result from your model.
You may find it useful to render a scene with small, bright, out-of-focus
objects (e.g., specular highlights), to show off the differences.
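One way to do this, sketched below under the assumption of a regular polygonal aperture, is to replace the uniform unit-disk sample used for the lens position with a uniform sample over an n-gon inscribed in the unit circle; the result would then be scaled by the aperture radius as before. The blade count n would be a new, hypothetical camera parameter.

    // Uniformly samples a regular n-gon inscribed in the unit circle by
    // choosing one of n wedge triangles and warping (u1, u2) onto it.
    // u1 and u2 are uniform random samples in [0, 1).
    #include <algorithm>
    #include <cmath>

    struct P2 { float x, y; };

    P2 SampleRegularPolygon(int n, float u1, float u2) {
        const float Pi = 3.14159265358979f;
        // Select a wedge and rescale u1 to [0, 1) within it.
        float scaled = u1 * n;
        int wedge = std::min(n - 1, (int)scaled);
        u1 = scaled - wedge;
        // Unit-circle vertices bounding the chosen wedge.
        float a0 = 2 * Pi * wedge / n, a1 = 2 * Pi * (wedge + 1) / n;
        P2 v0 = { std::cos(a0), std::sin(a0) };
        P2 v1 = { std::cos(a1), std::sin(a1) };
        // Warp (u1, u2) uniformly onto the triangle (origin, v0, v1).
        float r = std::sqrt(u1);
        return { r * ((1 - u2) * v0.x + u2 * v1.x),
                 r * ((1 - u2) * v0.y + u2 * v1.y) };
    }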
② The standard model for depth of field in computer
graphics models the circle of confusion as imaging a point in the scene to
a disk with uniform intensity, although many real lenses produce circles
of confusion with nonuniform intensity variation, such as a Gaussian distribution.
This effect is known as “Bokeh” (Buhler and Wexler 2002). For example,
catadioptric (mirror) lenses produce doughnut-shaped highlights when small
points of light are viewed out of focus. Modify the implementation of
depth of field in the RealisticCamera to produce images with this
effect (e.g., by biasing the distribution of lens sample positions).
Render images showing the difference between this and the standard model.
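As one example of biasing the lens samples, the sketch below remaps a uniformly distributed unit-disk sample (e.g., from pbrt's ConcentricSampleDisk()) to a uniform sample over an annulus, roughly mimicking the central obstruction of a catadioptric lens; the innerRadius parameter is a hypothetical tuning knob.

    // Remaps a uniform unit-disk sample to a uniform sample over the annulus
    // [innerRadius, 1], emptying out the center of the aperture to imitate a
    // mirror lens's central obstruction.
    #include <cmath>

    struct P2 { float x, y; };

    P2 BiasTowardAnnulus(P2 pLens, float innerRadius /* e.g., 0.5 */) {
        float r2 = pLens.x * pLens.x + pLens.y * pLens.y;
        if (r2 == 0) return pLens;
        // For a uniform disk sample, r^2 is uniformly distributed in [0, 1],
        // so this remapping yields a uniform distribution over the annulus.
        float rNew = std::sqrt(innerRadius * innerRadius +
                               (1 - innerRadius * innerRadius) * r2);
        float scale = rNew / std::sqrt(r2);
        return { pLens.x * scale, pLens.y * scale };
    }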
② Focal stack rendering: a focal stack is a series
of images of a fixed scene where the camera is focused at a different
distance for each image. Hasinoff and Kutulakos (2011) and Jacobs et al. (2012)
introduce a number of applications of focal stacks, including freeform
depth of field, where the user can specify arbitrary depths that are in
focus, achieving effects not possible with traditional optics. Render
focal stacks with pbrt and write an interactive tool to control focus
effects with them.
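A minimal way to produce the stack itself, assuming a thin-lens perspective camera (pbrt's "lensradius" and "focaldistance" parameters) and a scene body stored in a separate file pulled in with pbrt's Include directive, is a small driver like the following; the file names and the sampled focus range are placeholders.

    // Emits one small pbrt scene file per focus distance and invokes pbrt on
    // each, producing a focal stack. "scene_body.pbrt" (hypothetical) is
    // assumed to hold the sampler, integrator, and world description.
    #include <cstdio>
    #include <cstdlib>
    #include <string>

    int main() {
        const int nSlices = 8;
        const float zMin = 1.f, zMax = 20.f;   // focus distances to sample
        for (int i = 0; i < nSlices; ++i) {
            float focus = zMin + (zMax - zMin) * i / float(nSlices - 1);
            std::string name = "stack_" + std::to_string(i);
            std::FILE *f = std::fopen((name + ".pbrt").c_str(), "w");
            std::fprintf(f, "Film \"image\" \"string filename\" \"%s.exr\"\n",
                         name.c_str());
            // A scene-specific LookAt would normally be written here, before
            // the Camera directive, to place the camera.
            std::fprintf(f,
                "Camera \"perspective\" \"float lensradius\" 0.1 "
                "\"float focaldistance\" %f\n", focus);
            std::fprintf(f, "Include \"scene_body.pbrt\"\n");
            std::fclose(f);
            std::system(("pbrt " + name + ".pbrt").c_str());
        }
        return 0;
    }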
③ Light field camera: Ng et al. (2005) discuss the
physical design and applications of a camera that captures small images of
the exit pupil across the film, rather than averaging the radiance over the
entire exit pupil at each pixel, as conventional cameras do. Such a camera
captures a representation of the light field—the spatially and
directionally varying distribution of radiance arriving at the camera
sensor. By capturing the light field, a number of interesting operations
are possible, including refocusing photographs after they have been taken.
Read Ng et al.’s paper and implement a Camera in pbrt that
captures the light field of a scene. Write a tool to allow users to
interactively refocus these light fields.
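One way to structure such a camera, sketched below, is to treat the film as a grid of macropixels of S x S subpixels, where the subpixel offset selects the point on the lens and the macropixel index selects the spatial sample. The struct and function names here are hypothetical; pFilmX and pFilmY correspond to pbrt's CameraSample::pFilm.

    // Decodes a film-sample position into the (spatial, directional) pair a
    // light field camera needs: which macropixel the sample falls in and
    // which point on the lens aperture it should pass through.
    struct LightFieldCoords {
        int xMacro, yMacro;   // spatial (macropixel) index
        float uLens, vLens;   // lens-aperture coordinate in [0, 1)^2
    };

    LightFieldCoords DecodeFilmSample(float pFilmX, float pFilmY, int S) {
        LightFieldCoords c;
        c.xMacro = (int)pFilmX / S;
        c.yMacro = (int)pFilmY / S;
        // The offset within the macropixel, normalized to [0, 1), fixes the
        // point on the lens; a conventional camera instead draws this point
        // independently at random for every ray.
        c.uLens = (pFilmX - c.xMacro * S) / S;
        c.vLens = (pFilmY - c.yMacro * S) / S;
        return c;
    }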
③ The RealisticCamera implementation places the film at
the center of and perpendicular to the optical axis. While this is the
most common configuration of actual cameras, interesting effects can be
achieved by adjusting the film’s placement with respect to the lens system.
For example, the plane of focus in the current implementation is always
perpendicular to the optical axis; if the film plane (or the lens system)
is tilted so that the film isn’t perpendicular to the optical axis, then
the plane of focus is no longer perpendicular to the optical axis. (This
can be useful for landscape photography, for example, where aligning the
plane of focus with the ground plane allows greater depth of field even
with larger apertures.) Alternatively, the film plane can be shifted so
that it’s not centered on the optical axis; this shift can be used to keep
the plane of focus aligned with a very tall object, for example.
Modify RealisticCamera to allow one or both of these adjustments and
render images showing the result. Note that a number of places in the
current implementation (e.g., the exit pupil computation) make assumptions
that these changes will violate and that you will need to address.
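As a starting point, a tilt and shift of the film can be expressed as a rigid transformation applied to the film-space point before the ray toward the rear lens element is built in RealisticCamera::GenerateRay(). In the sketch below, Transform, RotateX(), Translate(), Point3f, Vector2f, and Vector3f are pbrt's, while tiltDegrees and shift are hypothetical new camera parameters; once film points leave the z = 0 plane, the exit pupil bounds and focusing code must be revisited as noted above.

    // Applies a lateral shift in the film plane followed by a tilt about the
    // x axis to a film-space point, before it is used to sample the exit
    // pupil and construct the ray into the lens system.
    Point3f AdjustFilmPoint(const Point3f &pFilm, Float tiltDegrees,
                            const Vector2f &shift) {
        // The translation is applied first, then the rotation: the film is
        // shifted off the optical axis and then tilted about the x axis.
        Transform t = RotateX(tiltDegrees) *
                      Translate(Vector3f(shift.x, shift.y, 0));
        return t(pFilm);
    }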