3.11 Interactions

The last abstractions in this chapter, SurfaceInteraction and MediumInteraction, respectively represent local information at points on surfaces and in participating media. For example, the ray–shape intersection routines in Chapter 6 return information about the local differential geometry at intersection points in a SurfaceInteraction. Later, the texturing code in Chapter 10 computes material properties using values from the SurfaceInteraction. The closely related MediumInteraction class is used to represent points where light interacts with participating media like smoke or clouds. The implementations of all of these classes are in the files interaction.h and interaction.cpp.

Both SurfaceInteraction and MediumInteraction inherit from a generic Interaction class that provides common member variables and methods, which allows parts of the system for which the differences between surface and medium interactions do not matter to be implemented purely in terms of Interactions.

<<Interaction Definition>>= 
class Interaction {
  public:
    <<Interaction Public Methods>>
    Interaction() = default;

    Interaction(Point3fi pi, Normal3f n, Point2f uv, Vector3f wo, Float time)
        : pi(pi), n(n), uv(uv), wo(Normalize(wo)), time(time) {}

    Point3f p() const { return Point3f(pi); }

    bool IsSurfaceInteraction() const { return n != Normal3f(0, 0, 0); }
    bool IsMediumInteraction() const { return !IsSurfaceInteraction(); }

    const SurfaceInteraction &AsSurface() const {
        CHECK(IsSurfaceInteraction());
        return (const SurfaceInteraction &)*this;
    }
    SurfaceInteraction &AsSurface() {
        CHECK(IsSurfaceInteraction());
        return (SurfaceInteraction &)*this;
    }

    // used by medium ctor
    PBRT_CPU_GPU
    Interaction(Point3f p, Vector3f wo, Float time, Medium medium)
        : pi(p), time(time), wo(wo), medium(medium) {}
    PBRT_CPU_GPU
    Interaction(Point3f p, Normal3f n, Float time, Medium medium)
        : pi(p), n(n), time(time), medium(medium) {}
    PBRT_CPU_GPU
    Interaction(Point3f p, Point2f uv) : pi(p), uv(uv) {}
    PBRT_CPU_GPU
    Interaction(const Point3fi &pi, Normal3f n, Float time = 0, Point2f uv = {})
        : pi(pi), n(n), uv(uv), time(time) {}
    PBRT_CPU_GPU
    Interaction(const Point3fi &pi, Normal3f n, Point2f uv)
        : pi(pi), n(n), uv(uv) {}
    PBRT_CPU_GPU
    Interaction(Point3f p, Float time, Medium medium)
        : pi(p), time(time), medium(medium) {}
    PBRT_CPU_GPU
    Interaction(Point3f p, const MediumInterface *mediumInterface)
        : pi(p), mediumInterface(mediumInterface) {}
    PBRT_CPU_GPU
    Interaction(Point3f p, Float time, const MediumInterface *mediumInterface)
        : pi(p), time(time), mediumInterface(mediumInterface) {}

    PBRT_CPU_GPU
    const MediumInteraction &AsMedium() const {
        CHECK(IsMediumInteraction());
        return (const MediumInteraction &)*this;
    }
    PBRT_CPU_GPU
    MediumInteraction &AsMedium() {
        CHECK(IsMediumInteraction());
        return (MediumInteraction &)*this;
    }

    std::string ToString() const;

    Point3f OffsetRayOrigin(Vector3f w) const {
        return pbrt::OffsetRayOrigin(pi, n, w);
    }
    Point3f OffsetRayOrigin(Point3f pt) const {
        return OffsetRayOrigin(pt - p());
    }

    RayDifferential SpawnRay(Vector3f d) const {
        return RayDifferential(OffsetRayOrigin(d), d, time, GetMedium(d));
    }
    Ray SpawnRayTo(Point3f p2) const {
        Ray r = pbrt::SpawnRayTo(pi, n, time, p2);
        r.medium = GetMedium(r.d);
        return r;
    }
    PBRT_CPU_GPU
    Ray SpawnRayTo(const Interaction &it) const {
        Ray r = pbrt::SpawnRayTo(pi, n, time, it.pi, it.n);
        r.medium = GetMedium(r.d);
        return r;
    }

    Medium GetMedium(Vector3f w) const {
        if (mediumInterface)
            return Dot(w, n) > 0 ? mediumInterface->outside : mediumInterface->inside;
        return medium;
    }
    Medium GetMedium() const {
        return mediumInterface ? mediumInterface->inside : medium;
    }

    <<Interaction Public Members>>
    Point3fi pi;
    Float time = 0;
    Vector3f wo;
    Normal3f n;
    Point2f uv;
    const MediumInterface *mediumInterface = nullptr;
    Medium medium = nullptr;
};
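The SpawnRay() and SpawnRayTo() methods above are how the rest of the system generates rays that leave an interaction: the ray origin is offset via OffsetRayOrigin() to avoid incorrect self-intersection with the surface being left (Section 6.8.6), and the ray's medium is found with GetMedium(). The following is a minimal usage sketch only; it assumes pbrt's interaction.h and namespace and is not part of pbrt itself.

    // Usage sketch; assumes pbrt's headers and namespace pbrt. Neither helper
    // is pbrt code; both simply show how the methods above compose.

    // Continue a path from an interaction in a (sampled) direction wi. The
    // spawned ray's origin is offset so it does not re-hit the same surface.
    RayDifferential ContinuePath(const Interaction &intr, Vector3f wi) {
        return intr.SpawnRay(wi);
    }

    // Shadow ray from one interaction toward another (e.g., a sampled point on
    // a light); the ray's medium is looked up from its direction via GetMedium().
    Ray ShadowRayBetween(const Interaction &from, const Interaction &to) {
        return from.SpawnRayTo(to);
    }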

A variety of Interaction constructors are available; depending on what sort of interaction is being constructed and what sort of information about it is relevant, corresponding sets of parameters are accepted. This one is the most general of them.

<<Interaction Public Methods>>= 
Interaction(Point3fi pi, Normal3f n, Point2f uv, Vector3f wo, Float time)
    : pi(pi), n(n), uv(uv), wo(Normalize(wo)), time(time) {}

All interactions have a point $\mathrm{p}$ associated with them. This point is stored using the Point3fi class, which uses an Interval to represent each coordinate value. Storing a small interval of floating-point values rather than a single Float makes it possible to represent bounds on the numeric error in the intersection point, as occurs when the point $\mathrm{p}$ was computed by a ray intersection calculation. This information will be useful for avoiding incorrect self-intersections for rays leaving surfaces, as will be discussed in Section 6.8.6.

<<Interaction Public Members>>= 
Point3fi pi;
Interaction provides a convenience method that returns a regular Point3f for the interaction point for the parts of the system that do not need to account for any error in it (e.g., the texture evaluation routines).

<<Interaction Public Methods>>+=  
Point3f p() const { return Point3f(pi); }
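To make the interval representation concrete, here is a small standalone sketch in plain C++ of a point whose coordinates carry conservative error bounds, along with a midpoint extraction analogous to what p() provides. It is illustrative only; pbrt's actual Interval and Point3fi classes are more careful (e.g., about rounding).

    // Standalone sketch of the idea behind Point3fi; not pbrt code.
    #include <cstdio>

    struct Interval {
        float low, high;
        float Midpoint() const { return (low + high) * 0.5f; }
        float Width() const { return high - low; }
    };

    struct IntervalPoint {
        Interval x, y, z;
    };

    int main() {
        // Hypothetical intersection point whose x coordinate carries accumulated
        // floating-point rounding error, represented as a small interval.
        IntervalPoint pi{{0.999f, 1.001f}, {2.f, 2.f}, {-3.f, -3.f}};
        std::printf("midpoint: (%g, %g, %g)\n",
                    pi.x.Midpoint(), pi.y.Midpoint(), pi.z.Midpoint());
        std::printf("width of x error bounds: %g\n", pi.x.Width());
    }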

All interactions also have a time associated with them. Among other uses, this value is necessary for setting the time of a spawned ray leaving the interaction.

<<Interaction Public Members>>+=  
Float time = 0;

For interactions that lie along a ray (either from a ray–shape intersection or from a ray passing through participating media), the negative ray direction is stored in the wo member variable, which corresponds to $\omega_{\mathrm{o}}$, the notation we use for the outgoing direction when computing lighting at points. For other types of interaction points where the notion of an outgoing direction does not apply (e.g., those found by randomly sampling points on the surface of shapes), wo has the value $(0, 0, 0)$.

<<Interaction Public Members>>+=  
Vector3f wo;
For interactions on surfaces, n stores the surface normal at the point and uv stores its $(u, v)$ parametric coordinates. It is fair to ask: why are these values stored in the base Interaction class rather than in SurfaceInteraction? The reason is that there are some parts of the system that mostly do not care about the distinction between surface and medium interactions; for example, some of the routines that sample points on light sources given a point to be illuminated. Those make use of these values if they are available and ignore them if they are set to zero. By accepting the small dissonance of having them in the wrong place here, the implementations of those methods and the code that calls them are made that much simpler.

<<Interaction Public Members>>+=  
Normal3f n;
Point2f uv;
It is possible to check whether a pointer or reference to an Interaction refers to one of the two subclasses; a nonzero surface normal distinguishes a surface interaction from a medium interaction.

<<Interaction Public Methods>>+=  
bool IsSurfaceInteraction() const { return n != Normal3f(0, 0, 0); }
bool IsMediumInteraction() const { return !IsSurfaceInteraction(); }

Methods are provided to cast to the subclass types as well. This is a good place for a runtime check to ensure that the requested conversion is valid. The non-const variant of this method as well as the corresponding AsMedium() methods follow similarly and are not included in the text.

<<Interaction Public Methods>>+=  
const SurfaceInteraction &AsSurface() const {
    CHECK(IsSurfaceInteraction());
    return (const SurfaceInteraction &)*this;
}
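The following sketch shows how calling code might combine these tests and conversions. It assumes pbrt's headers and namespace and is not code from the system itself.

    // Usage sketch only; assumes pbrt's interaction.h and namespace pbrt.
    void HandleInteraction(const Interaction &intr) {
        if (intr.IsSurfaceInteraction()) {
            // Safe: the CHECK() inside AsSurface() would fire otherwise.
            const SurfaceInteraction &si = intr.AsSurface();
            // ... use surface-specific data such as si.shading.n ...
        } else {
            // A zero-valued surface normal marks a medium interaction.
            const MediumInteraction &mi = intr.AsMedium();
            // ... use medium-specific data such as mi.phase ...
        }
    }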

Interactions can also represent either an interface between two types of participating media using an instance of the MediumInterface class, which is defined in Section 11.4, or the properties of the scattering medium at their point using a Medium. Here as well, the Interaction abstraction leaks: surfaces can represent interfaces between media, and at a point inside a medium, there is no interface but there is the current medium. Both of these values are stored in Interaction for the same reasons of expediency that n and uv were.

<<Interaction Public Members>>+= 
const MediumInterface *mediumInterface = nullptr;
Medium medium = nullptr;
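GetMedium(), shown in the class definition above, encodes the corresponding lookup rule: at an interface, the sign of the dot product of the ray direction with the surface normal selects the medium on the side the ray is heading toward; otherwise the stored medium is returned directly. Here is a standalone sketch of that rule using simplified stand-in types, not pbrt's own.

    // Standalone sketch with stand-in types; mirrors the logic of
    // Interaction::GetMedium(Vector3f w) but is not pbrt code.
    struct Medium;  // opaque stand-in
    struct MediumInterface { const Medium *inside, *outside; };

    const Medium *SelectMedium(const MediumInterface *mi,
                               const Medium *mediumAtPoint, float dotWN) {
        if (mi)
            // Positive dot product: the ray heads toward the side the normal
            // points to, i.e., the "outside" medium.
            return dotWN > 0 ? mi->outside : mi->inside;
        // No interface (e.g., a point inside a medium): just the current medium.
        return mediumAtPoint;
    }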

3.11.1 Surface Interaction

As described earlier, the geometry of a particular point on a surface (often a position found by intersecting a ray against the surface) is represented by a SurfaceInteraction. Having this abstraction lets most of the system work with points on surfaces without needing to consider the particular type of geometric shape the points lie on.

<<SurfaceInteraction Definition>>= 
class SurfaceInteraction : public Interaction {
  public:
    <<SurfaceInteraction Public Methods>>
    SurfaceInteraction() = default;

    SurfaceInteraction(Point3fi pi, Point2f uv, Vector3f wo, Vector3f dpdu,
                       Vector3f dpdv, Normal3f dndu, Normal3f dndv, Float time,
                       bool flipNormal)
        : Interaction(pi, Normal3f(Normalize(Cross(dpdu, dpdv))), uv, wo, time),
          dpdu(dpdu), dpdv(dpdv), dndu(dndu), dndv(dndv) {
        <<Initialize shading geometry from true geometry>>
        shading.n = n;
        shading.dpdu = dpdu;
        shading.dpdv = dpdv;
        shading.dndu = dndu;
        shading.dndv = dndv;
        <<Adjust normal based on orientation and handedness>>
        if (flipNormal) {
            n *= -1;
            shading.n *= -1;
        }
    }

    SurfaceInteraction(Point3fi pi, Point2f uv, Vector3f wo, Vector3f dpdu,
                       Vector3f dpdv, Normal3f dndu, Normal3f dndv, Float time,
                       bool flipNormal, int faceIndex)
        : SurfaceInteraction(pi, uv, wo, dpdu, dpdv, dndu, dndv, time, flipNormal) {
        this->faceIndex = faceIndex;
    }

    void SetShadingGeometry(Normal3f ns, Vector3f dpdus, Vector3f dpdvs,
                            Normal3f dndus, Normal3f dndvs,
                            bool orientationIsAuthoritative) {
        <<Compute shading.n for SurfaceInteraction>>
        shading.n = ns;
        if (orientationIsAuthoritative)
            n = FaceForward(n, shading.n);
        else
            shading.n = FaceForward(shading.n, n);
        <<Initialize shading partial derivative values>>
        shading.dpdu = dpdus;
        shading.dpdv = dpdvs;
        shading.dndu = dndus;
        shading.dndv = dndvs;
    }

    std::string ToString() const;

    void SetIntersectionProperties(Material mtl, Light area,
                                   const MediumInterface *primMediumInterface,
                                   Medium rayMedium) {
        material = mtl;
        areaLight = area;
        <<Set medium properties at surface intersection>>
        if (primMediumInterface && primMediumInterface->IsMediumTransition())
            mediumInterface = primMediumInterface;
        else
            medium = rayMedium;
    }

    PBRT_CPU_GPU
    void ComputeDifferentials(const RayDifferential &r, Camera camera,
                              int samplesPerPixel);
    PBRT_CPU_GPU
    void SkipIntersection(RayDifferential *ray, Float t) const;

    using Interaction::SpawnRay;
    RayDifferential SpawnRay(const RayDifferential &rayi, const BSDF &bsdf,
                             Vector3f wi, int /*BxDFFlags*/ flags, Float eta) const;

    BSDF GetBSDF(const RayDifferential &ray, SampledWavelengths &lambda,
                 Camera camera, ScratchBuffer &scratchBuffer, Sampler sampler);
    BSSRDF GetBSSRDF(const RayDifferential &ray, SampledWavelengths &lambda,
                     Camera camera, ScratchBuffer &scratchBuffer);

    PBRT_CPU_GPU
    SampledSpectrum Le(Vector3f w, const SampledWavelengths &lambda) const;

    <<SurfaceInteraction Public Members>>
    Vector3f dpdu, dpdv;
    Normal3f dndu, dndv;
    struct {
        Normal3f n;
        Vector3f dpdu, dpdv;
        Normal3f dndu, dndv;
    } shading;
    int faceIndex = 0;
    Material material;
    Light areaLight;
    Vector3f dpdx, dpdy;
    Float dudx = 0, dvdx = 0, dudy = 0, dvdy = 0;
};

In addition to the point $\mathrm{p}$, the surface normal n, and the $(u, v)$ coordinates from the parameterization of the surface from the Interaction base class, the SurfaceInteraction also stores the parametric partial derivatives of the point, $\partial \mathrm{p}/\partial u$ and $\partial \mathrm{p}/\partial v$, and the partial derivatives of the surface normal, $\partial \mathbf{n}/\partial u$ and $\partial \mathbf{n}/\partial v$. See Figure 3.30 for a depiction of these values.

<<SurfaceInteraction Public Members>>= 
Vector3f dpdu, dpdv;
Normal3f dndu, dndv;

Figure 3.30: The Local Differential Geometry around a Point $\mathrm{p}$. The parametric partial derivatives of the surface, $\partial \mathrm{p}/\partial u$ and $\partial \mathrm{p}/\partial v$, lie in the tangent plane but are not necessarily orthogonal. The surface normal $\mathbf{n}$ is given by the cross product of $\partial \mathrm{p}/\partial u$ and $\partial \mathrm{p}/\partial v$. The vectors $\partial \mathbf{n}/\partial u$ and $\partial \mathbf{n}/\partial v$ record the differential change in surface normal as we move $u$ and $v$ along the surface.

This representation implicitly assumes that shapes have a parametric description: for some range of $(u, v)$ values, points on the surface are given by some function $f$ such that $\mathrm{p} = f(u, v)$. Although this is not true for all shapes, all of the shapes that pbrt supports do have at least a local parametric description, so we will stick with the parametric representation, since this assumption is helpful elsewhere (e.g., for antialiasing of textures in Chapter 10).
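As a concrete instance of such a parametric description, consider the simplest case, a planar patch $f(u, v) = \mathrm{p}_0 + u\,\mathbf{e}_1 + v\,\mathbf{e}_2$: its partial derivatives are the constant edge vectors, and their cross product gives the (unnormalized) surface normal. The following standalone sketch (plain C++, no pbrt types) works through that arithmetic.

    #include <cstdio>

    struct Vec3 { double x, y, z; };

    // Cross product of two 3D vectors.
    static Vec3 Cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }

    int main() {
        // For f(u, v) = p0 + u*e1 + v*e2, dp/du = e1 and dp/dv = e2 everywhere.
        Vec3 dpdu{1, 0, 0}, dpdv{0, 2, 0};
        Vec3 n = Cross(dpdu, dpdv);  // (0, 0, 2): normal to the xy plane
        std::printf("unnormalized normal: (%g, %g, %g)\n", n.x, n.y, n.z);
    }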

The SurfaceInteraction constructor takes parameters that set all of these values. It computes the normal as the cross product of the partial derivatives.

<<SurfaceInteraction Public Methods>>= 
SurfaceInteraction(Point3fi pi, Point2f uv, Vector3f wo, Vector3f dpdu,
                   Vector3f dpdv, Normal3f dndu, Normal3f dndv, Float time,
                   bool flipNormal)
    : Interaction(pi, Normal3f(Normalize(Cross(dpdu, dpdv))), uv, wo, time),
      dpdu(dpdu), dpdv(dpdv), dndu(dndu), dndv(dndv) {
    <<Initialize shading geometry from true geometry>>
    shading.n = n;
    shading.dpdu = dpdu;
    shading.dpdv = dpdv;
    shading.dndu = dndu;
    shading.dndv = dndv;
    <<Adjust normal based on orientation and handedness>>
    if (flipNormal) {
        n *= -1;
        shading.n *= -1;
    }
}
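For illustration, here is a hedged sketch of how a shape might invoke this constructor for a point on a flat patch. It assumes pbrt's headers and namespace, and the helper itself is hypothetical rather than part of pbrt.

    // Hypothetical helper, not pbrt code; assumes pbrt's headers and namespace.
    SurfaceInteraction MakePatchInteraction(Point3f p, Point2f uv, Vector3f wo,
                                            Float time) {
        Vector3f dpdu(1, 0, 0), dpdv(0, 2, 0);  // edge vectors of a planar patch
        Normal3f dndu(0, 0, 0), dndv(0, 0, 0);  // the normal is constant on a plane
        bool flipNormal = false;  // no ReverseOrientation, no handedness switch
        // The constructor computes n = Normalize(Cross(dpdu, dpdv)) itself.
        return SurfaceInteraction(Point3fi(p), uv, wo, dpdu, dpdv, dndu, dndv,
                                  time, flipNormal);
    }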

SurfaceInteraction stores a second instance of a surface normal and the various partial derivatives to represent possibly perturbed values of these quantities—as can be generated by bump mapping or interpolated per-vertex normals with meshes. Some parts of the system use this shading geometry, while others need to work with the original quantities.

<<SurfaceInteraction Public Members>>+=  
struct {
    Normal3f n;
    Vector3f dpdu, dpdv;
    Normal3f dndu, dndv;
} shading;

The shading geometry values are initialized in the constructor to match the original surface geometry. If shading geometry is present, it generally is not computed until some time after the SurfaceInteraction constructor runs. The SetShadingGeometry() method, to be defined shortly, updates the shading geometry.

<<Initialize shading geometry from true geometry>>= 
shading.n = n;
shading.dpdu = dpdu;
shading.dpdv = dpdv;
shading.dndu = dndu;
shading.dndv = dndv;

The surface normal has special meaning to pbrt, which assumes that, for closed shapes, the normal is oriented such that it points to the outside of the shape. For geometry used as an area light source, light is by default emitted from only the side of the surface that the normal points toward; the other side is black. Because normals have this special meaning, pbrt provides a mechanism for the user to reverse the orientation of the normal, flipping it to point in the opposite direction. A ReverseOrientation directive in a pbrt input file flips the normal to point in the opposite, non-default direction. Therefore, it is necessary to check if the given Shape has the corresponding flag set and, if so, switch the normal’s direction here.

However, one other factor plays into the orientation of the normal and must be accounted for here as well. If a shape's transformation matrix has switched the handedness of the object coordinate system from pbrt's default left-handed coordinate system to a right-handed one, we need to switch the orientation of the normal as well. To see why this is so, consider a scale matrix $\mathbf{S}(1, 1, -1)$. We would naturally expect this scale to switch the direction of the normal, although because we have computed the normal by $\mathbf{n} = \partial \mathrm{p}/\partial u \times \partial \mathrm{p}/\partial v$,

$$
\begin{aligned}
\mathbf{S}(1, 1, -1)\,\frac{\partial \mathrm{p}}{\partial u} \times \mathbf{S}(1, 1, -1)\,\frac{\partial \mathrm{p}}{\partial v}
&= \mathbf{S}(-1, -1, 1)\left(\frac{\partial \mathrm{p}}{\partial u} \times \frac{\partial \mathrm{p}}{\partial v}\right) \\
&= \mathbf{S}(-1, -1, 1)\,\mathbf{n} \\
&\neq \mathbf{S}(1, 1, -1)\,\mathbf{n}.
\end{aligned}
$$

Therefore, it is also necessary to flip the normal’s direction if the transformation switches the handedness of the coordinate system, since the flip will not be accounted for by the computation of the normal’s direction using the cross product. A flag passed by the caller indicates whether this flip is necessary.

<<Adjust normal based on orientation and handedness>>= 
if (flipNormal) {
    n *= -1;
    shading.n *= -1;
}
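The equation above can be verified numerically. The following standalone sketch (plain C++, independent of pbrt) applies $\mathbf{S}(1, 1, -1)$ to the two partial derivatives, takes their cross product, and compares the result to the directly scaled normal; the two differ by an overall sign, which is exactly the flip this fragment applies.

    #include <cstdio>

    struct Vec3 { double x, y, z; };

    static Vec3 Cross(Vec3 a, Vec3 b) {
        return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
    }
    // Apply a diagonal (scale) matrix componentwise.
    static Vec3 Scale(Vec3 s, Vec3 v) { return {s.x * v.x, s.y * v.y, s.z * v.z}; }

    int main() {
        Vec3 dpdu{1, 0, 0}, dpdv{0, 1, 1};
        Vec3 n = Cross(dpdu, dpdv);             // (0, -1, 1)
        Vec3 s{1, 1, -1};                       // handedness-switching scale
        Vec3 fromScaledDerivs = Cross(Scale(s, dpdu), Scale(s, dpdv));  // (0, 1, 1)
        Vec3 scaledNormal = Scale(s, n);        // (0, -1, -1): negated!
        std::printf("cross of scaled derivatives: (%g, %g, %g)\n",
                    fromScaledDerivs.x, fromScaledDerivs.y, fromScaledDerivs.z);
        std::printf("directly scaled normal:      (%g, %g, %g)\n",
                    scaledNormal.x, scaledNormal.y, scaledNormal.z);
    }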

pbrt also provides the capability to associate an integer index with each face of a polygon mesh. This information is used for certain texture mapping operations. A separate SurfaceInteraction constructor allows its specification.

<<SurfaceInteraction Public Members>>+=  
int faceIndex = 0;

When a shading coordinate frame is computed, the SurfaceInteraction is updated via its SetShadingGeometry() method.

<<SurfaceInteraction Public Methods>>+=  
void SetShadingGeometry(Normal3f ns, Vector3f dpdus, Vector3f dpdvs,
                        Normal3f dndus, Normal3f dndvs,
                        bool orientationIsAuthoritative) {
    <<Compute shading.n for SurfaceInteraction>>
    shading.n = ns;
    if (orientationIsAuthoritative)
        n = FaceForward(n, shading.n);
    else
        shading.n = FaceForward(shading.n, n);
    <<Initialize shading partial derivative values>>
    shading.dpdu = dpdus;
    shading.dpdv = dpdvs;
    shading.dndu = dndus;
    shading.dndv = dndvs;
}
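As an example of a call site, a triangle mesh with per-vertex normals might install the interpolated normal as shading geometry along these lines. The helper below is hypothetical (it assumes pbrt's headers and namespace), and the shading partial derivatives are whatever values the shape computed alongside the interpolated normal.

    // Hypothetical call-site sketch, not pbrt code.
    void InstallInterpolatedNormal(SurfaceInteraction &si, Normal3f ns,
                                   Vector3f dpdus, Vector3f dpdvs,
                                   Normal3f dndus, Normal3f dndvs) {
        // false: the geometric normal remains authoritative for orientation, so
        // the shading normal is the one flipped into its hemisphere if necessary.
        si.SetShadingGeometry(ns, dpdus, dpdvs, dndus, dndvs, false);
    }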

The shading normal is set directly from the value passed to the method; the implementation then flips either the shading normal or the true geometric normal if needed so that the two normals lie in the same hemisphere. Since the shading normal generally represents a relatively small perturbation of the geometric normal, the two of them should always be in the same hemisphere. Depending on the context, either the geometric normal or the shading normal may more authoritatively point toward the correct “outside” of the surface, so the caller passes a Boolean value that determines which should be flipped if needed.

<<Compute shading.n for SurfaceInteraction>>= 
shading.n = ns;
if (orientationIsAuthoritative)
    n = FaceForward(n, shading.n);
else
    shading.n = FaceForward(shading.n, n);
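This fragment relies on FaceForward(), which flips its first argument if needed so that it lies in the same hemisphere as its second. A minimal standalone sketch of that convention (plain C++, not pbrt's implementation):

    struct Vec3 { double x, y, z; };

    static double Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Flip n, if necessary, so that it lies in the same hemisphere as v.
    static Vec3 FaceForward(Vec3 n, Vec3 v) {
        return Dot(n, v) < 0 ? Vec3{-n.x, -n.y, -n.z} : n;
    }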

With the normal set, the various partial derivatives can be copied.

<<Initialize shading partial derivative values>>= 
shading.dpdu = dpdus;
shading.dpdv = dpdvs;
shading.dndu = dndus;
shading.dndv = dndvs;

3.11.2 Medium Interaction

As described earlier, the MediumInteraction class is used to represent an interaction at a point in a scattering medium like smoke or clouds.

<<MediumInteraction Definition>>= 
class MediumInteraction : public Interaction {
  public:
    <<MediumInteraction Public Methods>>
    MediumInteraction(Point3f p, Vector3f wo, Float time, Medium medium,
                      PhaseFunction phase)
        : Interaction(p, wo, time, medium), phase(phase) {}

    std::string ToString() const;

    <<MediumInteraction Public Members>>
    PhaseFunction phase;
};

In contrast to SurfaceInteraction, it adds little to the base Interaction class. The only addition is a PhaseFunction, which describes how the particles in the medium scatter light. Phase functions and the PhaseFunction class are introduced in Section 11.3.

<<MediumInteraction Public Methods>>= 
MediumInteraction(Point3f p, Vector3f wo, Float time, Medium medium,
                  PhaseFunction phase)
    : Interaction(p, wo, time, medium), phase(phase) {}
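For illustration, here is a hedged sketch of how a MediumInteraction might be created for a scattering event found at parametric distance t along a ray. It assumes pbrt's headers and namespace, and the helper itself is hypothetical; concrete phase functions such as the Henyey–Greenstein model are introduced in Section 11.3.

    // Hypothetical helper, not pbrt code; assumes pbrt's headers and namespace.
    MediumInteraction MakeMediumInteraction(const Ray &ray, Float t, Medium medium,
                                            PhaseFunction phase) {
        // As with surface interactions, wo is the negated ray direction.
        return MediumInteraction(ray(t), -ray.d, ray.time, medium, phase);
    }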

<<MediumInteraction Public Members>>= 
PhaseFunction phase;