Example images (captions only; the images themselves are not reproduced here):
- A transformed sphere and a transformed cube, one with a solid texture and one with a surface texture
- Non-uniform scaling of sphere and cube; a simple near cutting plane
- Textured objects with reflection; textured objects with reflection and refraction; a more complex near cutting plane
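The solid textures named in the sample input ("sines" and "stripes") are not defined in this handout, so the formulas below are purely illustrative guesses: two procedural solid textures evaluated at an object-space point, returning a blend factor in [0, 1].

```cpp
#include <cmath>

// Hypothetical "stripes" texture: alternating bands along the x axis.
// The band width (1/4) is an arbitrary choice, not the lab's definition.
double stripes(double x, double /*y*/, double /*z*/) {
    return (std::fmod(std::floor(4.0 * x), 2.0) == 0.0) ? 1.0 : 0.0;
}

// Hypothetical "sines" texture: smooth sinusoidal variation in x and y.
// The frequency (10) is likewise an arbitrary choice.
double sines(double x, double y, double /*z*/) {
    return 0.5 * (1.0 + std::sin(10.0 * x) * std::sin(10.0 * y));
}
```

Because these are evaluated at the object-space hit point, the pattern sticks to the object under any transformation, which is why the pseudocode below records the object-space point alongside the world-space one.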
Your raytracer should read scene-file commands of the form:
scale <x> <y> <z>
cube <r> <g> <b> <texture-name> <kd> <ks> <n> <kr> <gl> <kt> <n1> <tr>
cube 0.8 0.2 0.2 sines 0.7 0.6 3 0.6 0.0 0.5 1.05 0.0
sphere 0.3 0.6 0.2 stripes 0.7 0.6 3 0.0 0.0 0.0 0.0 0.0
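One way to read such a line is a whitespace-split parse. The struct below is a sketch, not the lab's definition: the field names and my interpretations of the less obvious parameters (`gl` as glossiness, `tr` as translucency) are assumptions.

```cpp
#include <sstream>
#include <string>
#include <cassert>

// Assumed material record; field meanings beyond r/g/b, texture, kd, ks
// are guesses based on common raytracer conventions.
struct Material {
    double r, g, b;        // base color
    std::string texture;   // texture name, e.g. "sines" or "stripes"
    double kd, ks;         // diffuse and specular coefficients
    double n;              // specular (Phong) exponent
    double kr, gl;         // reflection coefficient, glossiness (assumed)
    double kt, n1;         // transmission coefficient, index of refraction
    double tr;             // translucency (assumed)
};

// Parse one object line ("cube ..." or "sphere ...") into type + material.
// Returns false if the line does not have all 13 fields.
bool parseObjectLine(const std::string& line, std::string& type, Material& m) {
    std::istringstream in(line);
    return static_cast<bool>(in >> type >> m.r >> m.g >> m.b >> m.texture
                                >> m.kd >> m.ks >> m.n >> m.kr >> m.gl
                                >> m.kt >> m.n1 >> m.tr);
}
```

Note that both object types share the same 12 fields after the keyword, so a single parse routine can serve both.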
make
lab4 input.txt
convert output.ppm output.jpg
display output.jpg
for each pixel
    form ray
    for each object
        inverse transform ray (source point and direction) into object space
        if (intersection of ray and object exists)
            if (closest intersection so far)  // compare in world space
                transform point and normal back into world space
                // besides intersection info, enough info to retrieve
                // texture colors is also needed
                record object, object space point, world point, world normal
    if closest object is textured
        get texture color from object, texture and object space point
    do illumination model using color, world point, world normal
    record pixel color
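The inverse-transform and intersection steps above can be sketched as follows, assuming 4x4 affine transforms and a unit sphere at the origin as the canonical object; all type and function names here are illustrative, not from the lab.

```cpp
#include <array>
#include <cmath>
#include <optional>
#include <cassert>

using Vec4 = std::array<double, 4>;
using Mat4 = std::array<std::array<double, 4>, 4>;

// Multiply a 4x4 matrix by a homogeneous vector (w = 1 for points,
// w = 0 for directions, so translation does not affect directions).
Vec4 mul(const Mat4& M, const Vec4& v) {
    Vec4 r{0, 0, 0, 0};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            r[i] += M[i][j] * v[j];
    return r;
}

// Intersect a ray (origin o, direction d) with the unit sphere in object
// space; o and d must already be inverse-transformed. Returns the smallest
// positive t along the ray, or nothing on a miss.
std::optional<double> hitUnitSphere(const Vec4& o, const Vec4& d) {
    double A = d[0]*d[0] + d[1]*d[1] + d[2]*d[2];
    double B = 2.0 * (o[0]*d[0] + o[1]*d[1] + o[2]*d[2]);
    double C = o[0]*o[0] + o[1]*o[1] + o[2]*o[2] - 1.0;
    double disc = B*B - 4.0*A*C;
    if (disc < 0.0) return std::nullopt;
    double t = (-B - std::sqrt(disc)) / (2.0 * A);
    if (t <= 0.0) t = (-B + std::sqrt(disc)) / (2.0 * A);
    if (t <= 0.0) return std::nullopt;
    return t;
}
```

To complete the loop: transform the ray origin (w = 1) and direction (w = 0) by M inverse before calling the intersector, map the hit point back to world space with M, and map the normal with the transpose of M inverse; compare hit distances in world space so that non-uniform scaling does not skew the depth comparison between objects with different transforms.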