Task 1: Filling in the sample loop
Fill in PathTracer::raytrace_pixel() in pathtracer.cpp. This function returns a Spectrum corresponding to the integral of the irradiance over this pixel, which you will estimate by averaging over ns_aa samples. Since we are taking ns_aa samples per pixel, think about how this relates to antialiasing.
The inputs to this function are integer coordinates in pixel space. When ns_aa > 1, you should generate ns_aa random rays through this pixel by sampling random points inside it and casting a ray through each sample. When ns_aa == 1, you should instead generate your ray through the center of the pixel, i.e., (x + 0.5, y + 0.5). You can evaluate the radiance of a ray r with est_radiance_global_illumination(r).
Notes:
- PathTracer owns a gridSampler, which has a method you can use to get random samples in the unit square (see sampler.h/cpp for details).
- You will most likely want to pass the camera a location that has been scaled down to normalized [0, 1] image coordinates. To do this, use the width and height of the pixel buffer, stored in sampleBuffer.w and sampleBuffer.h.
- You can generate a ray by calling camera->generate_ray() (which you will implement in Task 2).
- Remember to be careful about mixing int and double types here, since the input variables have integer types.
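The steps above can be sketched as a minimal, self-contained loop. Everything here is stubbed for illustration (Spectrum, Vector2D, the sampler, and a constant radiance function); in the project you would use the real gridSampler, camera->generate_ray(), est_radiance_global_illumination(), and sampleBuffer.w/sampleBuffer.h.

```cpp
#include <cassert>
#include <cstdlib>

struct Spectrum { double r = 0, g = 0, b = 0; };
struct Vector2D { double x, y; };

// Stub for gridSampler->get_sample(): a random point in [0, 1)^2.
Vector2D get_sample() {
  return { rand() / (RAND_MAX + 1.0), rand() / (RAND_MAX + 1.0) };
}

// Stand-in for est_radiance_global_illumination(): constant radiance here.
Spectrum est_radiance(double /*nx*/, double /*ny*/) { return {1.0, 0.5, 0.25}; }

Spectrum raytrace_pixel(int x, int y, int ns_aa, int w, int h) {
  Spectrum total;
  int n = (ns_aa > 1) ? ns_aa : 1;
  for (int i = 0; i < n; ++i) {
    // Center of the pixel when ns_aa == 1, otherwise a random offset.
    Vector2D s = (ns_aa == 1) ? Vector2D{0.5, 0.5} : get_sample();
    // Normalize to [0, 1] image coordinates (note the implicit double math;
    // x, y, w, h are ints, so be careful not to do integer division).
    double nx = (x + s.x) / w, ny = (y + s.y) / h;
    Spectrum L = est_radiance(nx, ny);
    total.r += L.r; total.g += L.g; total.b += L.b;
  }
  total.r /= n; total.g /= n; total.b /= n;  // average over the samples
  return total;
}
```

The same shape carries over directly once the stubs are replaced by the project's classes.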
Task 2: Generating camera rays
Fill in Camera::generate_ray() in camera.cpp. The input is a 2D point you calculated in Task 1. Generate the corresponding world space ray as depicted in this slide.
The camera has its own coordinate system. In camera space, the camera is positioned at the origin, looks along the -z axis, and has the +y axis as image space "up". Given the two field of view angles hFov and vFov, we can define a sensor plane one unit along the view direction with its bottom left and top right corners at
Vector3D(-tan(radians(hFov)*.5), -tan(radians(vFov)*.5),-1)
Vector3D( tan(radians(hFov)*.5), tan(radians(vFov)*.5),-1)
respectively. Convert the input point to a point on this sensor plane so that (0, 0) maps to the bottom left corner and (1, 1) maps to the top right corner. This gives your ray's direction in camera space. We can convert this vector to world space by applying the transform c2w. The result becomes the direction of our ray, r.d (don't forget to normalize it!). The origin of the ray, r.o, is simply the camera's position pos. The ray's minimum and maximum t values should be set from the nClip and fClip camera parameters.
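The camera-space part of this mapping can be sketched as follows. This is only the direction computation; the helper name and degree-valued hFov_deg/vFov_deg parameters are illustrative, and the real Camera::generate_ray() would additionally apply c2w, normalize, and set r.o, min_t, and max_t.

```cpp
#include <cassert>
#include <cmath>

struct Vector3D { double x, y, z; };

const double PI = 3.14159265358979323846;

// Map a point (x, y) in [0, 1]^2 onto the sensor plane at z = -1:
// (0, 0) lands on the bottom left corner, (1, 1) on the top right.
Vector3D camera_space_dir(double x, double y, double hFov_deg, double vFov_deg) {
  double tx = tan(PI / 180.0 * hFov_deg * 0.5);  // sensor half-width
  double ty = tan(PI / 180.0 * vFov_deg * 0.5);  // sensor half-height
  // Linearly remap [0, 1] to [-tx, tx] and [-ty, ty].
  return { (2.0 * x - 1.0) * tx, (2.0 * y - 1.0) * ty, -1.0 };
}
```

As a sanity check, (0.5, 0.5) should map to the sensor center, i.e. straight down the view direction.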
Tasks 1 and 2 will be tricky to debug before implementing part 3, since nothing will show up on your screen!
Task 3: Intersecting Triangles
Fill in both Triangle::intersect() methods in static_scene/triangle.cpp. You are free to use any method you choose, but we recommend using the Möller-Trumbore algorithm from this slide. Make sure you understand the derivation of the algorithm: here is one reference.
Remember that not every intersection is valid -- the ray has min_t and max_t fields defining the valid range of t values. If t lies outside this range, return false. Otherwise, update max_t to t so that intersections with farther-away primitives will be discarded later.
Once you get the ray's t-value at the intersection point, you should populate the Intersection *isect structure in the second version of the function as follows:
- t is the ray's t-value at the hit point.
- n is the surface normal at the hit point. Use barycentric coordinates to interpolate between n1, n2, n3, the per-vertex mesh normals.
- primitive points to the primitive that was hit (use the this pointer).
- bsdf points to the surface bsdf at the hit point (use get_bsdf()).
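The core of the recommended algorithm can be sketched as a standalone function. The vector type and helper names here are illustrative; the project's Triangle::intersect() would additionally check t against [min_t, max_t], update max_t, and fill in the Intersection fields listed above.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
Vec3 sub(Vec3 a, Vec3 b)   { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z-a.z*b.y, a.z*b.x-a.x*b.z, a.x*b.y-a.y*b.x}; }
double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Moller-Trumbore ray-triangle test. Returns true on a hit; outputs the ray
// parameter t and the barycentric coordinates (b1, b2) of the hit point
// (the third barycentric coordinate is 1 - b1 - b2).
bool moller_trumbore(Vec3 o, Vec3 d, Vec3 p0, Vec3 p1, Vec3 p2,
                     double& t, double& b1, double& b2) {
  Vec3 e1 = sub(p1, p0), e2 = sub(p2, p0), s = sub(o, p0);
  Vec3 s1 = cross(d, e2), s2 = cross(s, e1);
  double denom = dot(s1, e1);
  if (fabs(denom) < 1e-12) return false;   // ray is parallel to the triangle
  double inv = 1.0 / denom;
  b1 = dot(s1, s)  * inv;                  // first barycentric coordinate
  b2 = dot(s2, d)  * inv;                  // second barycentric coordinate
  t  = dot(s2, e2) * inv;                  // ray parameter at the hit point
  // Valid only if t is positive and the barycentrics lie inside the triangle.
  return t > 0 && b1 >= 0 && b2 >= 0 && b1 + b2 <= 1;
}
```

The barycentrics (1 - b1 - b2, b1, b2) are exactly what you need to interpolate the per-vertex normals n1, n2, n3.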
After implementing this, you should be able to run your pathtracer on dae/sky/CBspheres_lambertian.dae to test your first three tasks! Don't forget to press R to start rendering. The spheres won't show up (we'll implement this in the next part!), so you should see an empty box:
Task 4: Intersecting Spheres
Fill in both Sphere::intersect() methods in static_scene/sphere.cpp. Use the quadratic formula and this slide. You may wish to implement the optional helper function Sphere::test(). As with Triangle::intersect(), set r.max_t in both routines and fill in the isect parameters for the second version of the function. For a sphere, the surface normal is a scaled version of the vector pointing from the sphere's center to the hit point.
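The quadratic-formula test can be sketched in isolation. The names here are illustrative; the project's Sphere::intersect() would also clip t to [min_t, max_t], update max_t, and fill in the isect fields (including the normalized center-to-hit-point normal).

```cpp
#include <cassert>
#include <cmath>

struct V3 { double x, y, z; };

// Solve |o + t*d - c|^2 = r^2 for t, i.e. the quadratic a*t^2 + b*t + cc = 0.
// Returns true on a hit and outputs the nearest positive root in t.
bool sphere_hit(V3 o, V3 d, V3 c, double r, double& t) {
  V3 oc = {o.x - c.x, o.y - c.y, o.z - c.z};       // origin relative to center
  double a  = d.x*d.x + d.y*d.y + d.z*d.z;
  double b  = 2.0 * (oc.x*d.x + oc.y*d.y + oc.z*d.z);
  double cc = oc.x*oc.x + oc.y*oc.y + oc.z*oc.z - r*r;
  double disc = b*b - 4.0*a*cc;
  if (disc < 0) return false;                      // ray misses the sphere
  double t1 = (-b - sqrt(disc)) / (2.0 * a);       // nearer root
  double t2 = (-b + sqrt(disc)) / (2.0 * a);       // farther root
  t = (t1 > 0) ? t1 : t2;                          // prefer the nearer positive hit
  return t > 0;
}
```

Note that both roots matter: if the ray origin is inside the sphere, the nearer root is negative and the farther one is the valid hit.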
Now the spheres should appear in dae/sky/CBspheres_lambertian.dae:
Writeup
Don't forget to read this article for writeup instructions!