C.3 BasicScene and Final Object Creation
The responsibilities of the BasicScene are straightforward: it
takes scene entity objects and
provides methods that convert them into objects for rendering. However,
there are two factors that make its implementation not completely trivial.
First, as discussed in Section C.2, if the
Import directive is used in the scene specification, there may be
multiple BasicSceneBuilders that are concurrently calling
BasicScene methods. Therefore, the implementation must use mutual
exclusion to ensure correct operation.
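The requirement can be sketched in a few stand-alone lines. The class below is illustrative only (SceneSink and its members are hypothetical names, not part of pbrt); it shows how a per-container mutex makes concurrent Add-style calls from multiple parsing threads safe:

```cpp
#include <cassert>
#include <mutex>
#include <string>
#include <thread>   // used in the usage example
#include <utility>
#include <vector>

// Hypothetical sketch: each container that parsing threads append to is
// guarded by its own mutex, mirroring BasicScene's per-container mutexes.
class SceneSink {
  public:
    void AddNamedMaterial(std::string name) {
        std::lock_guard<std::mutex> lock(materialMutex);
        names.push_back(std::move(name));
    }
    size_t NamedMaterialCount() {
        std::lock_guard<std::mutex> lock(materialMutex);
        return names.size();
    }

  private:
    std::mutex materialMutex;
    std::vector<std::string> names;
};
```

Several threads may then call AddNamedMaterial() concurrently without corrupting the vector, just as multiple BasicSceneBuilders may concurrently call the real Add* methods.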
The second consideration is performance: we would like to minimize the time
spent in the execution of BasicScene methods, as time spent in them
delays parsing the remainder of the scene description. System startup time
is a facet of performance that is worth attending to, and so
BasicScene uses the asynchronous job capabilities introduced in
Section B.6.6 to create scene objects
while parsing proceeds when possible.
<<BasicScene Definition>>=
class BasicScene {
  public:
    <<BasicScene Public Methods>>
    BasicScene();

    void SetOptions(SceneEntity filter, SceneEntity film,
                    CameraSceneEntity camera, SceneEntity sampler,
                    SceneEntity integrator, SceneEntity accelerator);
    void AddNamedMaterial(std::string name, SceneEntity material);
    int AddMaterial(SceneEntity material);
    void AddMedium(MediumSceneEntity medium);
    void AddFloatTexture(std::string name, TextureSceneEntity texture);
    void AddSpectrumTexture(std::string name, TextureSceneEntity texture);
    void AddLight(LightSceneEntity light);
    int AddAreaLight(SceneEntity light);
    void AddShapes(pstd::span<ShapeSceneEntity> shape);
    void AddAnimatedShape(AnimatedShapeSceneEntity shape);
    void AddInstanceDefinition(InstanceDefinitionSceneEntity instance);
    void AddInstanceUses(pstd::span<InstanceSceneEntity> in);
    void Done();

    Camera GetCamera() {
        cameraJobMutex.lock();
        while (!camera) {
            pstd::optional<Camera> c = cameraJob->TryGetResult(&cameraJobMutex);
            if (c)
                camera = *c;
        }
        cameraJobMutex.unlock();
        return camera;
    }

    Sampler GetSampler() {
        samplerJobMutex.lock();
        while (!sampler) {
            pstd::optional<Sampler> s = samplerJob->TryGetResult(&samplerJobMutex);
            if (s)
                sampler = *s;
        }
        samplerJobMutex.unlock();
        return sampler;
    }

    void CreateMaterials(const NamedTextures &sceneTextures,
                         std::map<std::string, Material> *namedMaterials,
                         std::vector<Material> *materials);
    std::vector<Light> CreateLights(
        const NamedTextures &textures,
        std::map<int, pstd::vector<Light> *> *shapeIndexToAreaLights);
    std::map<std::string, Medium> CreateMedia();
    Primitive CreateAggregate(
        const NamedTextures &textures,
        const std::map<int, pstd::vector<Light> *> &shapeIndexToAreaLights,
        const std::map<std::string, Medium> &media,
        const std::map<std::string, Material> &namedMaterials,
        const std::vector<Material> &materials);
    std::unique_ptr<Integrator> CreateIntegrator(
        Camera camera, Sampler sampler, Primitive accel,
        std::vector<Light> lights) const;
    NamedTextures CreateTextures();

    <<BasicScene Public Members>>
    SceneEntity integrator, accelerator;
    std::vector<ShapeSceneEntity> shapes;
    std::vector<AnimatedShapeSceneEntity> animatedShapes;
    std::vector<InstanceSceneEntity> instances;
    std::map<InternedString, InstanceDefinitionSceneEntity *> instanceDefinitions;

  private:
    <<BasicScene Private Methods>>
    Medium GetMedium(const std::string &name, const FileLoc *loc);
    void startLoadingNormalMaps(const ParameterDictionary &parameters);

    <<BasicScene Private Members>>
    AsyncJob<Sampler> *samplerJob = nullptr;
    mutable ThreadLocal<Allocator> threadAllocators;
    Camera camera;
    Film film;
    std::mutex cameraJobMutex;
    AsyncJob<Camera> *cameraJob = nullptr;
    std::mutex samplerJobMutex;
    Sampler sampler;
    std::mutex mediaMutex;
    std::map<std::string, AsyncJob<Medium> *> mediumJobs;
    std::map<std::string, Medium> mediaMap;
    std::mutex materialMutex;
    std::map<std::string, AsyncJob<Image *> *> normalMapJobs;
    std::map<std::string, Image *> normalMaps;
    std::vector<std::pair<std::string, SceneEntity>> namedMaterials;
    std::vector<SceneEntity> materials;
    std::mutex lightMutex;
    std::vector<AsyncJob<Light> *> lightJobs;
    std::mutex areaLightMutex;
    std::vector<SceneEntity> areaLights;
    std::mutex textureMutex;
    std::vector<std::pair<std::string, TextureSceneEntity>> serialFloatTextures;
    std::vector<std::pair<std::string, TextureSceneEntity>> serialSpectrumTextures;
    std::vector<std::pair<std::string, TextureSceneEntity>> asyncSpectrumTextures;
    std::set<std::string> loadingTextureFilenames;
    std::map<std::string, AsyncJob<FloatTexture> *> floatTextureJobs;
    std::map<std::string, AsyncJob<SpectrumTexture> *> spectrumTextureJobs;
    int nMissingTextures = 0;
    std::mutex shapeMutex, animatedShapeMutex;
    std::mutex instanceDefinitionMutex, instanceUseMutex;
};
<<BasicScene Method Definitions>>=
void BasicScene::SetOptions(SceneEntity filter, SceneEntity film,
                            CameraSceneEntity camera, SceneEntity sampler,
                            SceneEntity integ, SceneEntity accel) {
    <<Store information for specified integrator and accelerator>>
    <<Immediately create filter and film>>
    LOG_VERBOSE("Starting to create filter and film");
    Allocator alloc = threadAllocators.Get();
    Filter filt = Filter::Create(filter.name, filter.parameters, &filter.loc, alloc);
    // It's a little ugly to poke into the camera's parameters here, but we
    // have this circular dependency that Camera::Create() expects a Film,
    // yet now the film needs to know the exposure time from the camera....
    Float exposureTime = camera.parameters.GetOneFloat("shutterclose", 1.f) -
                         camera.parameters.GetOneFloat("shutteropen", 0.f);
    if (exposureTime <= 0)
        ErrorExit(&camera.loc,
                  "The specified camera shutter times imply that the shutter "
                  "does not open. A black image will result.");
    this->film = Film::Create(film.name, film.parameters, exposureTime,
                              camera.cameraTransform, filt, &film.loc, alloc);
    LOG_VERBOSE("Finished creating filter and film");
    <<Enqueue asynchronous job to create sampler>>
    <<Enqueue asynchronous job to create camera>>
    cameraJob = RunAsync([camera, this]() {
        LOG_VERBOSE("Starting to create camera");
        Allocator alloc = threadAllocators.Get();
        Medium cameraMedium = GetMedium(camera.medium, &camera.loc);
        Camera c = Camera::Create(camera.name, camera.parameters, cameraMedium,
                                  camera.cameraTransform, this->film,
                                  &camera.loc, alloc);
        LOG_VERBOSE("Finished creating camera");
        return c;
    });
}
When SetOptions() is called, the specifications of the geometry and lights in the scene have not yet been parsed, so it is not yet possible to create the integrator (which needs the lights) or the acceleration structure (which needs the geometry). Therefore, their specifications are saved in member variables for use once parsing has finished.
<<Store information for specified integrator and accelerator>>=
integrator = integ;
accelerator = accel;
<<BasicScene Public Members>>=
SceneEntity integrator, accelerator;
However, it is possible to start work on creating the Sampler,
Camera, Filter, and Film. While they could all be created
in turn in the SetOptions() method, we instead use RunAsync()
to launch multiple jobs to take care of them. Thus, the
SetOptions() method can return quickly, allowing parsing to resume,
and creation of those objects can proceed in parallel as parsing proceeds
if there are available CPU cores. Although these objects usually take
little time to initialize, sometimes they do not: the RealisticCamera
requires a second or so on a current CPU to compute exit pupil bounds and
the HaltonSampler takes approximately 0.1 seconds to initialize its
random permutations. If that work can be done concurrently with parsing
the scene, rendering can begin that much more quickly.
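The benefit can be sketched with standard C++ facilities. The stand-alone example below uses std::async rather than pbrt's RunAsync, and all of its names (Permutations, StartPermutationJob, and so forth) are hypothetical; it shows an expensive initialization overlapping with continued work on the calling thread:

```cpp
#include <cassert>
#include <chrono>
#include <future>
#include <thread>
#include <vector>

// Stand-in for an object that is expensive to create (e.g., the
// HaltonSampler's random permutation tables).
struct Permutations {
    std::vector<int> table;
};

inline Permutations ComputePermutations() {
    // Simulate a slow initialization step.
    std::this_thread::sleep_for(std::chrono::milliseconds(20));
    return Permutations{{0, 1, 2, 3}};
}

// Kick off creation in the background and return immediately, the way
// SetOptions() returns so that parsing can resume.
inline std::future<Permutations> StartPermutationJob() {
    return std::async(std::launch::async, ComputePermutations);
}

// Later, when the object is actually needed, harvest the result; if the
// job has already finished, this call does not block at all.
inline Permutations HarvestPermutations(std::future<Permutations> &job) {
    return job.get();
}
```

Between StartPermutationJob() and HarvestPermutations(), the calling thread is free to keep parsing the scene description, which is the source of the startup-time savings described above.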
<<Enqueue asynchronous job to create sampler>>=
samplerJob = RunAsync([sampler, this]() {
    LOG_VERBOSE("Starting to create sampler");
    Allocator alloc = threadAllocators.Get();
    Point2i res = this->film.FullResolution();
    return Sampler::Create(sampler.name, sampler.parameters, res, &sampler.loc,
                           alloc);
});
The AsyncJob * returned by RunAsync() is held in a member variable.
The BasicScene constructor also initializes threadAllocators
so that appropriate memory allocators are available depending on whether
the scene objects should be stored in CPU memory or GPU memory.
<<BasicScene Private Members>>=
AsyncJob<Sampler> *samplerJob = nullptr;
mutable ThreadLocal<Allocator> threadAllocators;
Briefly diverting from the BasicScene implementation, we will turn to
the Sampler::Create() method that is called in the job that creates
the Sampler. (This method is defined in the file
samplers.cpp with the rest of the Sampler code.) It
checks the provided sampler name against all the sampler names it is
aware of, calling the appropriate object-specific creation method when it
finds a match and issuing an error if no match is found. Thus, if the
system is to be extended with an additional sampler, this is a second place
in the code where the existence of the new sampler must be registered.
Most of the values that are passed to the object constructors are extracted from the ParameterDictionary in the object-specific Create() methods, though a few values that are not available in the parameter list (here, the uncropped image resolution) are passed directly as arguments to the Create() methods.
<<Sampler Method Definitions>>=
Sampler Sampler::Create(const std::string &name,
                        const ParameterDictionary &parameters, Point2i fullRes,
                        const FileLoc *loc, Allocator alloc) {
    Sampler sampler = nullptr;
    if (name == "zsobol")
        sampler = ZSobolSampler::Create(parameters, fullRes, loc, alloc);
    <<Create remainder of Sampler types>>
    else if (name == "paddedsobol")
        sampler = PaddedSobolSampler::Create(parameters, loc, alloc);
    else if (name == "halton")
        sampler = HaltonSampler::Create(parameters, fullRes, loc, alloc);
    else if (name == "sobol")
        sampler = SobolSampler::Create(parameters, fullRes, loc, alloc);
    else if (name == "pmj02bn")
        sampler = PMJ02BNSampler::Create(parameters, loc, alloc);
    else if (name == "independent")
        sampler = IndependentSampler::Create(parameters, loc, alloc);
    else if (name == "stratified")
        sampler = StratifiedSampler::Create(parameters, loc, alloc);
    else
        ErrorExit(loc, "%s: sampler type unknown.", name);

    if (!sampler)
        ErrorExit(loc, "%s: unable to create sampler.", name);
    parameters.ReportUnused();
    return sampler;
}
The <<Create remainder of Sampler types>> fragment handles the remaining types of samplers, each following the same pattern as the "zsobol" case. All the other base interface classes like Light, Shape, Camera, and so forth provide corresponding Create() methods, all of which have the same general form.
BasicScene also provides methods that return these asynchronously
created objects. All have a similar form, acquiring a
mutex before harvesting the result from the asynchronous job if needed.
Calling code should delay calling these methods as long as possible,
doing as much independent work as it can to increase the likelihood that
the asynchronous job has completed and that the AsyncJob::GetResult() calls
do not stall.
<<BasicScene Public Methods>>=
Sampler GetSampler() {
    samplerJobMutex.lock();
    while (!sampler) {
        pstd::optional<Sampler> s = samplerJob->TryGetResult(&samplerJobMutex);
        if (s)
            sampler = *s;
    }
    samplerJobMutex.unlock();
    return sampler;
}

<<BasicScene Private Members>>+=
std::mutex samplerJobMutex;
Sampler sampler;
Medium creation is also based on RunAsync()’s asynchronous job
capabilities, though in that case a std::map of jobs is
maintained, one for each medium. Note that it is important that a mutex be
held when storing the AsyncJob * returned by RunAsync() in
mediumJobs, since multiple threads may call this method
concurrently if Import statements are used for multi-threaded
parsing.
<<BasicScene Method Definitions>>+=
void BasicScene::AddMedium(MediumSceneEntity medium) {
    <<Define create lambda function for Medium creation>>
    std::lock_guard<std::mutex> lock(mediaMutex);
    mediumJobs[medium.name] = RunAsync(create);
}
<<BasicScene Private Members>>+=
std::mutex mediaMutex;
std::map<std::string, AsyncJob<Medium> *> mediumJobs;
Creation of each Medium follows a similar form to Sampler
creation, though here the type of medium to be created is found from the
parameter list; the MediumSceneEntity::name member variable holds
the user-provided name to associate with the medium.
<<Define create lambda function for Medium creation>>=
auto create = [medium, this]() {
    std::string type = medium.parameters.GetOneString("type", "");
    <<Check for missing medium “type” or animated medium transform>>
    if (type.empty())
        ErrorExit(&medium.loc, "No parameter \"string type\" found for medium.");
    if (medium.renderFromObject.IsAnimated())
        Warning(&medium.loc,
                "Animated transformation provided for medium. Only the "
                "start transform will be used.");
    return Medium::Create(type, medium.parameters,
                          medium.renderFromObject.startTransform, &medium.loc,
                          threadAllocators.Get());
};
All the media specified in the scene are provided to callers via a map
from names to Medium objects.
<<BasicScene Method Definitions>>+=
std::map<std::string, Medium> BasicScene::CreateMedia() {
    mediaMutex.lock();
    if (!mediumJobs.empty()) {
        <<Consume results for asynchronously created Medium objects>>
    }
    mediaMutex.unlock();
    return mediaMap;
}
The asynchronously created Medium objects are consumed using calls
to AsyncJob::TryGetResult(), which returns the result if it is
available and otherwise unlocks the mutex, does some of the enqueued
parallel work, and then relocks it before returning. Thus, there is no
risk of deadlock from one thread holding mediaMutex, finding that
the result is not ready and working on enqueued parallel work that itself
ends up trying to acquire mediaMutex.
<<Consume results for asynchronously created Medium objects>>=
for (auto &m : mediumJobs) {
    while (true) {
        pstd::optional<Medium> med = m.second->TryGetResult(&mediaMutex);
        if (med) {
            mediaMap[m.first] = *med;
            break;
        }
    }
}
mediumJobs.clear();
<<BasicScene Private Members>>+=
std::map<std::string, Medium> mediaMap;
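The deadlock-avoidance idea behind TryGetResult() can be illustrated with a stand-alone sketch. MiniJob and its members are hypothetical names, and pbrt's actual AsyncJob is more general; the point is only the unlock-work-relock pattern:

```cpp
#include <cassert>
#include <deque>
#include <functional>
#include <mutex>
#include <optional>

// Sketch of the TryGetResult() idea: if the result is not ready, release
// the caller's mutex, run one item of pending work (which may itself need
// that mutex), and reacquire the mutex before returning.
struct MiniJob {
    std::mutex workMutex;
    std::deque<std::function<void()>> pendingWork;  // enqueued parallel work
    std::optional<int> result;

    std::optional<int> TryGetResult(std::mutex *callerMutex) {
        if (result)
            return result;
        callerMutex->unlock();  // let other threads (and the work) make progress
        RunOnePendingTask();    // help out rather than spinning
        callerMutex->lock();    // reestablish the caller's locking invariant
        return result;
    }

    void RunOnePendingTask() {
        std::function<void()> task;
        {
            std::lock_guard<std::mutex> lock(workMutex);
            if (pendingWork.empty())
                return;
            task = std::move(pendingWork.front());
            pendingWork.pop_front();
        }
        task();  // executed without holding any mutex
    }
};
```

Because the caller's mutex is released before the pending task runs, a task that itself acquires that mutex (as the Medium-creation jobs may) cannot deadlock against the thread that is waiting on the result.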
As much as possible, other scene objects are created similarly using
RunAsync(). Light sources are easy to handle, and it is especially
helpful to start creating image textures during parsing, as reading image
file formats from disk can be a bottleneck for scenes with many such
textures. However, extra attention is required due to the cache of images
already read for textures (Section 10.4.1). If an image
file on disk is used in multiple textures, BasicScene takes care not
to have multiple jobs redundantly reading the same image. Instead, only
one reads it and the rest wait. When those textures are then
created, the image they need can be efficiently returned from the cache.
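One way to get this only-load-once behavior, sketched here with hypothetical names (pbrt's actual image cache differs in its details), is to key in-flight loads by filename under a mutex, so that only the first request for a file launches a read and later requests wait on the same shared future:

```cpp
#include <cassert>
#include <future>
#include <map>
#include <mutex>
#include <string>

// Stand-in for a decoded image; the real work would be reading the file.
struct Image {
    std::string filename;
};

class ImageLoadCache {
  public:
    // Returns a future for the image; the first caller for a given
    // filename launches the load, and all later callers share its result.
    std::shared_future<Image> Lookup(const std::string &filename) {
        std::lock_guard<std::mutex> lock(mutex_);
        auto iter = loads_.find(filename);
        if (iter != loads_.end())
            return iter->second;  // someone is already loading this file
        ++uniqueLoads_;           // count actual reads, for illustration
        std::shared_future<Image> f =
            std::async(std::launch::async,
                       [filename]() { return Image{filename}; })
                .share();
        loads_.emplace(filename, f);
        return f;
    }
    int UniqueLoads() const { return uniqueLoads_; }

  private:
    std::mutex mutex_;
    std::map<std::string, std::shared_future<Image>> loads_;
    int uniqueLoads_ = 0;
};
```

With this structure, two texture-creation jobs that reference the same file both end up blocking on one shared_future, so the file is read from disk exactly once.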
In return for the added complexity of this asynchronous object creation, we have found that for complex scenes it is not unusual for this version of pbrt to start rendering four times more quickly than the previous version.