PA3: Textures
Add visual complexity to your renders with 2D and 3D textures.
We will primarily be referring to chapters of Ray Tracing - The Next Week in this assignment.
Task 1: The Texture class
Read Ray Tracing - The Next Week: Chapter 4, and create a Texture base class following this chapter (perhaps in a new texture.h file within your include/darts directory). Remember to add this file to the list of headers or source files in CMakeLists.txt. Instead of accepting u, v, and p parameters as done in the book, we suggest you have your Texture::value() function take a const HitInfo & as the parameter (you can then retrieve u, v, and p from within this HitInfo). In fact, it can be useful to allow textures to vary based on the incoming ray direction, so we also include that as the parameter wi.
Here is what our interface looks like:
```cpp
// Base class for all textures
class Texture
{
public:
    Texture(const json &j = json::object());
    virtual ~Texture() = default;

    virtual Color3f value(const Vec3f &wi, const HitInfo &hit) const = 0;
};
```
We put our implementation file in src/textures/texture.cpp.
Constant textures
Create a derived class ConstantTexture : public Texture which has a member variable Color3f color; and whose value(...) function just returns it. Make a constructor that takes a const json &j and tries to assign the parameter named "color" to your color member variable by using j.value("color", color);. We put both our class definition and implementation into the file src/textures/constant.cpp.
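For reference, a minimal ConstantTexture might look like the following sketch. The include path and the default color are our own assumptions here; everything else follows the interface above.

```cpp
#include <darts/texture.h> // assumed location of the Texture base class from Task 1

// A texture that returns the same color everywhere
class ConstantTexture : public Texture
{
public:
    ConstantTexture(const json &j = json::object()) : Texture(j)
    {
        // overwrite the default only if the scene file specifies a "color" field
        color = j.value("color", color);
    }

    Color3f value(const Vec3f &wi, const HitInfo &hit) const override
    {
        return color;
    }

    Color3f color{0.8f, 0.8f, 0.8f}; // arbitrary default if the scene omits "color"
};
```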
Retrofitting the parser
To make this work with the rest of darts, we need to refactor our code a bit so that Materials use Textures instead of Color3fs, and so that it plays nice with our JSON parser.
We first need to inform our JSON parser how to create each type of texture, and what type string (e.g. "constant") to look for when creating it. Calling the following macro at the bottom of src/textures/constant.cpp will do this for the new ConstantTexture:
```cpp
DARTS_REGISTER_CLASS_IN_FACTORY(Texture, ConstantTexture, "constant")
```
Now, change all the Color3f members of your Materials to Textures, e.g. Color3f albedo; in Lambertian becomes shared_ptr<const Texture> albedo;. You'll also need to change the corresponding constructors of any Material to call DartsFactory<Texture>::create() instead of reading a color directly from the passed json object, e.g. this:

```cpp
albedo = j.value("albedo", albedo);
```

becomes:

```cpp
albedo = DartsFactory<Texture>::create(j.at("albedo"));
```
DartsFactory<Texture>::create() will look for a "type" field in the passed-in json object, and create the appropriate Texture subclass. This works because of the DARTS_REGISTER_CLASS_IN_FACTORY macro we invoked above.
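For instance, a Lambertian constructor might end up looking roughly like the sketch below; the base-class call and exact signature should match whatever your Material already uses.

```cpp
// member changed from `Color3f albedo;` to:
//     shared_ptr<const Texture> albedo;
Lambertian::Lambertian(const json &j) : Material(j)
{
    // let the factory construct whatever texture the scene specifies for "albedo"
    albedo = DartsFactory<Texture>::create(j.at("albedo"));
}
```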
You'll also want to support texturing the roughness parameter of your Metal material. This is currently a float, but you can still turn it into a Texture, and when calling its value(...) function, just convert the result to a scalar using the luminance() function provided in the darts/ headers. An alternative approach would have been to treat color and grayscale textures separately, or to allow roughness to differ across the three color channels, but we'll stick with the simpler approach.
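With the roughness stored as, say, shared_ptr<const Texture> roughness (a naming assumption), the conversion back to a scalar is a one-liner wherever Metal previously used the float; wi below stands for the incoming ray direction your scatter() already has.

```cpp
// evaluate the roughness texture at the hit point and collapse it to a scalar
float rough = luminance(roughness->value(wi, hit));
// ... use `rough` to perturb the mirror direction exactly as before
```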
Once you have this implemented, you should be able to render the scene scenes/assignment3/constant-cornell-box.json. This scene looks similar to the Cornell box from the previous assignment, except all albedos are now specified with a constant texture.
Backwards scene compatibility
One nuisance with our current solution is that we would have to re-write all our previous scene files (that directly specified a color) to instead use a "constant" texture. Previously we could have simply written:

```json
"albedo": [0.8, 1.0, 0.3],
```

in the scene file, but now we have to write the more clunky:

```json
"albedo": {
    "type": "constant",
    "color": [0.8, 1.0, 0.3]
}
```
Let's add backwards compatibility so that the scene can specify colors directly too.
Take a look at DartsFactory::create() in darts/factory.h. Recall that this method of our DartsFactory is responsible for creating instances of materials, surfaces, etc. (determined by the base class T) by checking for a "type" field in the passed-in json object. We'd like the factory to do something a bit more sophisticated when we try to create a subclass of Texture.
To do that, we'll need to create a specialization of the DartsFactory::create() method that is used specifically when the template type is Texture. Create a method in factory.h with the following signature (just below the current DartsFactory::create() implementation):
```cpp
template <>
inline shared_ptr<Texture> DartsFactory<Texture>::create(const json &j)
{
    // your code here
}
```
Start by copying over the implementation of the generic DartsFactory::create() method.
To allow backwards compatibility, you can inspect the const json &j by calling j.is_object(), j.is_array() or j.is_number(). If j is an object, you can proceed just as in the generic implementation to construct the appropriate Texture. However, if j is an array or number, you should create a ConstantTexture directly. You can do this by calling BaseFactory::create_instance("constant", ...). Note that you may also need to modify your ConstantTexture constructor.
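Putting the pieces together, the specialization could be structured like the sketch below. The object branch should really just mirror your existing generic create() (including its error handling); the create_instance call, the {"color": ...} wrapping, and the error message are assumptions you should adapt to your code.

```cpp
template <>
inline shared_ptr<Texture> DartsFactory<Texture>::create(const json &j)
{
    // a full specification like {"type": "constant", "color": [...]}:
    // dispatch on the "type" string just like the generic implementation does
    if (j.is_object())
        return BaseFactory::create_instance(j.at("type").get<string>(), j);

    // shorthand: a bare color array or a single number -> constant texture.
    // Here we wrap it in a {"color": ...} object; alternatively, teach your
    // ConstantTexture constructor to accept arrays/numbers directly.
    if (j.is_array() || j.is_number())
        return BaseFactory::create_instance("constant", json{{"color", j}});

    throw std::runtime_error("cannot create a Texture from this json value");
}
```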
Once you have this implemented, you should be able to render all the scenes from the previous assignment. Give it a try.
Task 2: Solid textures
You now have the framework to easily add other types of textures, and these textures can be used any place you previously used a fixed Color3f!
Checkerboard
Add a CheckerTexture class following the description in Section 4.3 of the book, but adapted to the darts framework. Your constructor should look for fields named "odd" and "even" and create the two nested textures from them using DartsFactory<Texture>::create(). Register the class with the factory under the "checker" type string.
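Here is a sketch of what such a class could look like; the scale member, the default values, and the .x/.y/.z accessors on hit.p are assumptions to adapt to your own code.

```cpp
#include <cmath> // std::sin

// 3D checkerboard that alternates between two nested textures
class CheckerTexture : public Texture
{
public:
    CheckerTexture(const json &j = json::object()) : Texture(j)
    {
        scale = j.value("scale", scale);
        odd   = DartsFactory<Texture>::create(j.at("odd"));
        even  = DartsFactory<Texture>::create(j.at("even"));
    }

    Color3f value(const Vec3f &wi, const HitInfo &hit) const override
    {
        // the sign of the product of sines selects one of the two sub-textures
        float sines = std::sin(scale * hit.p.x) * std::sin(scale * hit.p.y) * std::sin(scale * hit.p.z);
        return sines < 0.f ? odd->value(wi, hit) : even->value(wi, hit);
    }

    float                     scale = 1.f;
    shared_ptr<const Texture> odd, even;
};

DARTS_REGISTER_CLASS_IN_FACTORY(Texture, CheckerTexture, "checker")
```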
Once you have this implemented, you should be able to render the scene scenes/assignment3/checker.json, producing an image like the one shown here.
Perlin noise
Read Chapter 5 of the book and then implement Perlin noise. Also implement the noise_texture class from the book, but call it MarbleTexture instead. Allow the JSON scene file to specify the scale member variable of your MarbleTexture, and instead of always interpolating between black and white as in the book, allow the user to specify these two colors in the JSON scene file just like we did with "odd" and "even" for the checkerboard. Look for the named fields "veins" and "base" in the JSON object for this purpose. Make sure to register the texture with the DartsFactory and add your Perlin and MarbleTexture files to your CMakeLists.txt.
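The value() function can then follow the book's marble pattern while blending the two user-specified colors. A sketch, assuming scale, perlin, veins, and base are members of your MarbleTexture and that your Perlin class exposes a turb() function like the book's:

```cpp
Color3f MarbleTexture::value(const Vec3f &wi, const HitInfo &hit) const
{
    // warp a sine stripe pattern with Perlin turbulence and remap to [0,1]
    float t = 0.5f * (1.f + std::sin(scale * hit.p.z + 10.f * perlin.turb(hit.p)));

    // t = 0 -> "veins" color, t = 1 -> "base" color (pick whichever convention you like)
    return (1.f - t) * veins->value(wi, hit) + t * base->value(wi, hit);
}
```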
Render scenes/assignment3/marble.json.
Transforming texture lookups (optional)
Particularly with procedural textures, you might find it useful to be able to scale, rotate, and otherwise transform how the textures are placed on your surfaces. A simple way to accomplish this is to add a Transform xform; member variable directly to the Texture base class. This way, you can transform the lookup point (hit.p) by xform in a Texture's value() method. You can read this in from the json object in the Texture base class constructor Texture::Texture(const json &j), but make sure to only look for it if j.is_object() == true. This allows our shorthand for specifying constant colors directly as json arrays to continue working. The PBR book discusses a more full-featured way to handle such mappings.
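A sketch of what this could look like in the base class; how a Transform is parsed from json and how it is applied to a point depends on your Transform class, so the point() call below is a placeholder.

```cpp
// In the Texture base class (texture.h / texture.cpp):
//     Transform xform; // identity by default
Texture::Texture(const json &j)
{
    // only full json objects can carry an "xform"; bare arrays/numbers remain
    // valid shorthand for constant colors and are skipped here
    if (j.is_object())
        xform = j.value("xform", xform);
}

// ...and in a derived value() function, transform the lookup point first:
//     Vec3f p = xform.point(hit.p);
```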
Task 3: Image texture mapping
The ImageTexture class
Read Chapter 6 of the book and implement image texture mapping. We already provide you with image loading support, which you should leverage using the Image3f class declared under include/ and implemented in src/image.cpp. Your ImageTexture class should accept a "filename" string parameter from the json object. Once you read this string, use the global file resolver to resolve this filename into a path, like so:
```cpp
string path = get_file_resolver().resolve(filename).str();
```
get_file_resolver() is defined in the darts/ headers.
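A sketch of the ImageTexture class follows. The Image3f calls (load(), width(), height(), pixel access), the hit.uv naming, and the header paths are assumptions; replace them with whatever the provided image class and your HitInfo actually expose.

```cpp
#include <algorithm> // std::clamp
#include <stdexcept> // std::runtime_error

// Looks up colors from an image using the hit point's (u, v) coordinates
class ImageTexture : public Texture
{
public:
    ImageTexture(const json &j = json::object()) : Texture(j)
    {
        string filename = j.at("filename").get<string>();
        string path     = get_file_resolver().resolve(filename).str();
        if (!image.load(path))
            throw std::runtime_error("could not load image texture: " + path);
    }

    Color3f value(const Vec3f &wi, const HitInfo &hit) const override
    {
        // map (u, v) in [0,1]^2 to pixel indices, flipping v so images aren't upside down
        int x = std::clamp(int(hit.uv.x * image.width()), 0, image.width() - 1);
        int y = std::clamp(int((1.f - hit.uv.y) * image.height()), 0, image.height() - 1);
        return image(x, y); // adapt to Image3f's actual pixel-access syntax
    }

    Image3f image;
};

DARTS_REGISTER_CLASS_IN_FACTORY(Texture, ImageTexture, "image")
```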
Texture coordinates
Extend your Sphere's intersection function to output the UV coordinates as described in the book. Since there are many different conventions for spherical and UV coordinates, we already provide handy functions that consistently perform the Cartesian-to-spherical and spherical-to-Cartesian coordinate mappings for you; they are the direction_ and spherical_ helper functions in include/. You should now be able to render scenes/assignment3/earth.json.
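If you want to see the book's mapping spelled out, it takes a unit, sphere-local direction to (u, v) as in the sketch below; the provided darts helpers implement an equivalent, consistent convention, so prefer those in your actual code (direction_to_uv is just an illustrative name here).

```cpp
#include <algorithm> // std::clamp
#include <cmath>     // std::acos, std::atan2

// Latitude-longitude mapping from a unit direction d to (u, v) in [0,1]^2,
// following the convention in Ray Tracing - The Next Week.
Vec2f direction_to_uv(const Vec3f &d)
{
    const float pi = 3.14159265358979f;
    float theta = std::acos(std::clamp(-d.y, -1.f, 1.f)); // polar angle from the -y pole
    float phi   = std::atan2(-d.z, d.x) + pi;             // azimuth in [0, 2*pi)
    return {phi / (2.f * pi), theta / pi};
}
```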
Next, extend your Quad's intersection function to output UV coordinates. Render scenes/assignment3/textured-box.json.
Finally, extend your single-triangle intersection function to output UV coordinates. The function already receives three parameters (t0, t1, and t2) that specify the UV texture coordinates at the three triangle vertices. As with the normals, if the mesh doesn't provide per-vertex UV coordinates, the Triangle intersection code passes nullptrs for the t0, t1, and t2 parameters, and you'll need to fall back to some reasonable default. In this case, you should just store the barycentric coordinates in place of interpolated UV coordinates in hit.uv.
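Inside your single-triangle intersection code, the UV handling might then look roughly like this sketch, where alpha, beta, and gamma stand for whatever you named the barycentric coordinates of the hit (with alpha = 1 - beta - gamma):

```cpp
// interpolate per-vertex UVs with the hit's barycentric coordinates,
// or fall back to the barycentric coordinates themselves
if (t0 && t1 && t2)
    hit.uv = alpha * (*t0) + beta * (*t1) + gamma * (*t2);
else
    hit.uv = Vec2f(beta, gamma); // one reasonable default; just be consistent
```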
If you render scenes/02/triangles_textures/teapot.json, you should see a scene with a teapot mesh that has texture coordinates and a texture applied, as shown below.
Task 4: Fresnel blend material
Many real-world materials have a more complex structure because the surface actually consists of multiple layers: e.g. a clear varnish on top of wood. There are many sophisticated techniques in graphics that try to model such multiple layers in a physically based way. We will instead take a physically inspired but simple approach to blend between a dielectric varnish and a substrate layer based on the index of refraction of the varnish.
We will first implement a simple linear blend between two materials. Implement the class BlendMaterial in a new file src/materials/blend.cpp. This material will blend between two materials based on a user-specified amount, and then call a different scatter() function for each. To handle this, the material should store two shared_ptrs to Materials (specified as "a" and "b" in the JSON file) and a shared_ptr to a Texture ("amount" in JSON) specifying the amount to blend between a and b (0 giving full weight to a, and 1 giving full weight to b). Make sure to register the new material with the DartsFactory and add it to your CMakeLists.txt file.
In the BlendMaterial::scatter() function, evaluate the amount texture and convert the value to a scalar using luminance(). Then draw a random number using randf(): if it is less than this percentage, call the scatter() function of material b; otherwise, call the scatter() function of material a.
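A sketch of that logic, assuming the RTOW-style scatter() signature from the previous assignments, members a, b, and amount as described above, and that ray.d is the incoming direction:

```cpp
bool BlendMaterial::scatter(const Ray3f &ray, const HitInfo &hit, Color3f &attenuation,
                            Ray3f &scattered) const
{
    // reduce the blend texture to a single probability of choosing material b
    float t = luminance(amount->value(ray.d, hit));

    // stochastically delegate to one of the two nested materials
    if (randf() < t)
        return b->scatter(ray, hit, attenuation, scattered);
    else
        return a->scatter(ray, hit, attenuation, scattered);
}
```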
You should now be able to render scenes/assignment3/blend.json.
This allows us to blend with a user-specified amount, but real varnish exhibits Fresnel reflection, where the surface becomes increasingly reflective at grazing angles. Since our BlendMaterial stores the blend amount as a texture, we can accomplish this by creating a texture that changes the blend amount based on the angle between the ray and the surface normal. Implement a new FresnelTexture in a new file src/textures/fresnel.cpp. This texture should take a single parameter "ior", specifying the index of refraction. The FresnelTexture::value function should then evaluate the Fresnel equations for the given direction wi and the hit normal stored in hit. We provide an implementation of the full Fresnel equations as a fresnel_ helper function in include/.
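A sketch of the texture: the fresnel_dielectric() call is a placeholder for the actual fresnel_ helper provided in the darts headers (its name and signature may differ), hit.sn stands for the shading normal in your HitInfo, and dot()/normalize() for your vector helpers.

```cpp
class FresnelTexture : public Texture
{
public:
    FresnelTexture(const json &j = json::object()) : Texture(j)
    {
        ior = j.value("ior", ior);
    }

    Color3f value(const Vec3f &wi, const HitInfo &hit) const override
    {
        // cosine of the angle between the incoming ray and the shading normal
        float cos_theta = std::abs(dot(normalize(wi), hit.sn));
        // placeholder: call the provided fresnel_ routine with this cosine and the IOR
        float F = fresnel_dielectric(cos_theta, 1.f, ior);
        return Color3f(F);
    }

    float ior = 1.5f; // default index of refraction (assumption)
};
```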
We can now vary the amount of reflection and refraction based on the view direction by using our FresnelTexture to specify the blend "amount".
Render scenes/assignment3/fresnel_blend.json.
Task 5: Normal mapping (grad students only; undergrad extra credit)
Implement either bump mapping or tangent-space normal mapping. Ch. 9.3 of the PBR book covers bump mapping, and there are many resources online [1, 2, 3, 4] to learn about normal mapping.
Think about how this should be specified in the JSON scene file, and how you would need to extend your implementation to support modifying the surface normal appropriately. In our implementation, a normal map acts as a wrapper around another material. It is specified in the JSON file like this (this is also what our Blender exporter assumes):
{ "type": "normal map", "name": "some material name", "normals": { "type": "image", "filename": "the_normal_map.exr" }, "nested": { "type": "lambertian", "albedo": [0.75, 0.75, 0.75] } }
For either of these techniques you will need access to not just the surface normal at the hit point, but also the tangent (and bi-tangent). Hint: you will need to compute a globally-consistent uv parametrization. The tangent and bi-tangent are then simply ∂p/∂u and ∂p/∂v. Extend the HitInfo structure and return this information for the surfaces that support bump/normal mapping. Normal mapping requires a bit less work to implement, so we recommend you start with that.
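As a rough sketch of the tangent-space normal mapping step in a hypothetical wrapper material (here called NormalMap): the member names normals and nested, and the HitInfo fields t, b, and sn, are assumptions you should adapt to however you extend your structures.

```cpp
bool NormalMap::scatter(const Ray3f &ray, const HitInfo &hit, Color3f &attenuation,
                        Ray3f &scattered) const
{
    // fetch the tangent-space normal from the map and remap [0,1] -> [-1,1]
    // (skip the remap if your normal map already stores signed values)
    Color3f c    = normals->value(ray.d, hit);
    Vec3f   n_ts = normalize(2.f * Vec3f(c[0], c[1], c[2]) - Vec3f(1.f, 1.f, 1.f));

    // rotate from tangent space into world space using the TBN frame
    HitInfo perturbed = hit;
    perturbed.sn = normalize(n_ts.x * hit.t + n_ts.y * hit.b + n_ts.z * hit.sn);

    // shade with the nested material, but using the perturbed normal
    return nested->scatter(ray, perturbed, attenuation, scattered);
}
```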
Render some images with the technique turned on/off and which convincingly show that this feature is working correctly.
Task 6: Interesting scene
Now create an interesting scene (or scenes) showcasing all the features you've implemented. Be creative. Find some interesting meshes and textures online, or create some new procedural textures by combining your existing functionality. Consider applying textures to various parts of a material, like the blend factor, the roughness, and emission. Sketchfab.com and polyhaven.com have thousands of high-quality textured meshes available for free. Here are a few that we created and provide the scene files for:
What to submit
In your report, make sure to include:
- Rendered images of all the scenes in scenes/assignment3 and your interesting scene
Then submit according to the instructions in the Submitting on Canvas section of the Getting started guide.