Nodal Scene Interface
Authors
May 9, 2017
Contents
1 Background
2 The Interface
2.1 The interface abstraction
2.2 The C API
2.2.1 Context handling
2.2.2 Passing optional parameters
2.2.3 Node creation
2.2.4 Setting attributes
2.2.5 Making connections
2.2.6 Evaluating procedurals
2.2.7 Error reporting
2.2.8 Rendering
2.3 The Lua API
2.3.1 API calls
2.3.2 Function parameters format
2.3.3 Evaluating a Lua script
2.3.4 Passing parameters to a Lua script
2.3.5 Reporting errors from a Lua script
2.4 The C++ API wrappers
2.5 The interface stream
3 Nodes
3.1 The root node
3.2 The global node
3.3 The set node
3.4 The mesh node
3.5 The faceset node
3.6 The cubiccurves node
3.7 The particles node
4 Script Objects
5 Rendering Guidelines
5.1 Basic scene anatomy
5.2 A word or two about attributes
5.3 Instancing
5.4 Creating osl networks
5.5 Lighting in the nodal scene interface
5.5.1 Area lights
5.5.2 Spot and point lights
5.5.3 Directional and HDR lights
5.6 Defining output drivers and layers
5.7 Light layers
5.8 Inter-object visibility
Chapter 1
Background
The Nodal Scene Interface (nsi) was developed to replace existing apis in our renderer
which are showing their age. Having been designed in the 80s and extended several
times since, they include features which are no longer relevant and design decisions
which do not reflect modern needs. This makes some features more complex to use
than they should be and prevents or greatly increases the complexity of implementing
other features.
The design of the nsi was shaped by multiple goals:
Simplicity The interface itself should be simple to understand and use, even if com-
plex things can be done with it. This simplicity is carried into everything which
derives from the interface.
Interactive rendering and scene edits Scene edit operations should not be a spe-
cial case. There should be no difference between scene description and scene
edits. In other words, a scene description is a series of edits and vice versa.
Tight integration with Open Shading Language osl integration is not superfi-
cial and affects scene definition. For example, there are no explicit light sources in
nsi: light sources are created by connected shaders with an emission() closure
to a geometry.
Scripting The interface should be accessible from a platform independent, efficient
and easily accessible scripting language. Scripts can be used to add render time
intelligence to a given scene description.
Performance and multi-threading All api design decisions are made with perfor-
mance in mind and this includes the possibility to run all api calls in a concur-
rent, multi-threaded environment. Nearly all software today which deals with
large data sets needs to use multiple threads at some point. It is important for
the interface to support this directly so it does not become a single-thread communication
bottleneck. This is why commands are self-contained and do not rely on a current state.
Chapter 2
The Interface
NSIContext_t NSIBegin(
int nparams,
const NSIParam_t *params );
struct NSIParam_t
{
const char *name;
const void *data;
int type;
int arraylength;
size_t count;
int flags;
};
This structure is used to pass variable parameter lists through the C interface. Most
functions accept an array of this structure in a params parameter, along with its length
in an nparams parameter. The meaning of these two parameters will not be repeated
for every function. Instead, each function documents the parameters which can be
given in the array.
The name member is a C string which gives the parameter's name. The type
member identifies the parameter's type, using one of the following constants:
NSITypeFloat for a single 32-bit floating point value.
NSITypeDouble for a single 64-bit floating point value.
NSITypeInteger for a single 32-bit integer value.
NSITypeString for a string value, given as a pointer to a c string.
NSITypeColor for a color, given as three 32-bit floating point values.
NSITypePoint for a point, given as three 32-bit floating point values.
NSITypeVector for a vector, given as three 32-bit floating point values.
NSITypeNormal for a normal vector, given as three 32-bit floating point values.
NSITypeMatrix for a transformation matrix, given as 16 32-bit floating point
values.
void NSICreate(
NSIContext_t context,
NSIHandle_t handle,
const char *type,
int nparams,
const NSIParam_t *params );
This function is used to create a new node. Its parameters are:
context
The context returned by NSIBegin. See subsection 2.2.1
handle
A node handle. This string will uniquely identify the node in the scene.
If the supplied handle matches an existing node, the function does nothing if all
other parameters match the call which created that node. Otherwise, it emits an
error. Note that handles need only be unique within a given interface context. It
is acceptable to reuse the same handle inside different contexts. The NSIHandle_t
typedef is defined in nsi.h:
typedef const char* NSIHandle_t;
type
The type of node to create. See chapter 3.
nparams, params
This pair describes a list of optional parameters. There are no optional parameters
defined as of now. The NSIParam_t type is described in subsection 2.2.2.
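As an illustration, the same creation call serialized in the nsi stream format (section 2.5) might look as follows; the handle, node type and attribute values are hypothetical, and the exact serialization is sketched by analogy with the stream examples later in this document:

```
Create "floor" "mesh"
SetAttribute "floor"
    "nvertices" "int" 1 [4]
    "P" "point" 4 [-1 0 -1  -1 0 1  1 0 1  1 0 -1]
```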
void NSIDelete(
NSIContext_t ctx,
NSIHandle_t handle,
int nparams,
const NSIParam_t *params );
This function deletes a node from the scene. All connections to and from the node are
also deleted. Its parameters are:
context
The context returned by NSIBegin. See subsection 2.2.1
handle
A node handle. It identifies the node to be deleted.
It accepts the following optional parameters:
void NSISetAttribute(
NSIContext_t ctx,
NSIHandle_t object,
int nparams,
const NSIParam_t *params );
This function sets attributes on a previously created node. All optional parameters
of the function become attributes of the node. On a shader node, this function is used
to set the implicitly defined shader parameters.
void NSISetAttributeAtTime(
NSIContext_t ctx,
NSIHandle_t object,
float time,
int nparams,
const NSIParam_t *params );
This function sets time-varying attributes (i.e. attributes for motion blur). The time parameter
specifies at which time the attribute is being defined. There is no expected order for
the time parameter. In most uses, attributes that are motion blurred must have the
same specification throughout the time range. A notable exception is the P attribute
on particles which can be of different size for each time step because of appearing or
disappearing particles.
void NSIDeleteAttribute(
NSIContext_t ctx,
NSIHandle_t object,
const char *name );
This function deletes any attribute with a name which matches the name parameter
on the specified object. There is no way to delete an attribute only for a specific time
value.
void NSIConnect(
NSIContext_t ctx,
NSIHandle_t from,
const char *from_attr,
NSIHandle_t to,
const char *to_attr,
int nparams,
const NSIParam_t *params );
void NSIDisconnect(
NSIContext_t ctx,
NSIHandle_t from,
const char *from_attr,
NSIHandle_t to,
const char *to_attr );
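For example, attaching a shader to a geometry's attributes node could be expressed with a connection like the one below (hypothetical handles; the attribute name follows the conventions used in chapter 5):

```
Connect "metal_shader" "" "object_attributes" "surfaceshader"
```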
void NSIEvaluate(
NSIContext_t ctx,
int nparams,
const NSIParam_t *params );
This function includes a block of interface calls from an external source into the current
scene. It blends together the concepts of a straight file include, commonly known as an
archive, with that of procedural include which is traditionally a compiled executable.
Both are really the same idea expressed in a different language (note that for delayed
procedural evaluation one should use the procedural node).
The nsi adds a third option which sits in-between: Lua scripts (section 2.3). They
are much more powerful than a simple included file yet they are also much easier to
generate as they do not require compilation. It is, for example, very realistic to export
a whole new script for every frame of an animation. It could also be done for every
character in a frame. This gives great flexibility in how components of a scene are put
together.
The optional parameters accepted by this function are:
filename.................................................................string
The name of the file which contains the interface calls to include.
type ..................................................................... string
The type of file which will generate the interface calls. This can be one of:
apistream To read in an nsi stream.
lua To execute a Lua script, either from file or inline. See section 2.3 and
more specifically subsection 2.3.3.
dynamiclibrary To execute native compiled code in a loadable library.
script...................................................................string
A valid Lua script to execute when type is set to "lua".
parameters .............................................................. string
Optional procedural parameters.
backgroundload ......................................................... int (0)
If this is nonzero, the object may be loaded in a separate thread, at some later
time. This requires that further interface calls not directly reference objects
defined in the included file. The only guarantee is that the file will be loaded
before rendering begins.
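Combining these parameters, a background-loaded archive include might be written as follows in the stream format (the file name is hypothetical):

```
Evaluate
    "filename" "string" 1 ["city_block.nsi"]
    "type" "string" 1 ["apistream"]
    "backgroundload" "int" 1 [1]
```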
2.2.7 Error reporting
enum NSIErrorLevel
{
NSIErrMessage = 0,
NSIErrInfo = 1,
NSIErrWarning = 2,
NSIErrError = 3
};
The text of the message will not contain the numeric identifier nor any reference
to the error level. It is usually desirable for the error handler to present these values
together with the message. The identifier exists to provide easy filtering of messages.
The intended meaning of the error levels is as follows:
NSIErrMessage for general messages, such as may be produced by printf in
shaders. The default error handler will print this type of message without an
end-of-line terminator, as it is the duty of the caller to format the message.
NSIErrInfo for messages which give specific information. These might simply
inform about the state of the renderer, files being read, settings being used and
so on.
NSIErrWarning for messages warning about potential problems. These will gen-
erally not prevent producing images and may not require any corrective action.
They can be seen as suggestions of what to look into if the output is broken but
no actual error is produced.
NSIErrError for error messages. These are for problems which will usually break
the output and need to be fixed.
2.2.8 Rendering
void NSIRenderControl(
NSIContext_t ctx,
const char *action,
int nparams,
const NSIParam_t *params );
This function is the only control function of the api. It is responsible for starting,
suspending and stopping the render. It also allows synchronizing the render with
interactive calls that might have been issued. The action parameter can take one of
the following string values:
start Starts rendering the scene in the provided context. The render
proceeds in parallel and the control flow is not blocked.
wait Waits for a render to finish.
synchronize For an interactive render, applies all the buffered calls to the
scene's state.
suspend Suspends the render in the provided context.
resume Resumes a previously suspended render.
cancel Cancels the render in the provided context and frees all related
resources.
The function also accepts optional parameters:
progressive.............................................................int (0)
If set to 1, render the image in a progressive fashion.
interactive.............................................................int (0)
If set to 1, the renderer will accept commands to edit the scene's state while rendering.
The difference with a normal render is that the render task will not exit even if
rendering is finished. Interactive renders are by definition progressive.
frame........................................................................int
Specifies the frame number of this render.
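A typical control sequence, sketched in C under the signature above, might look like this. This fragment is illustrative only: it assumes the renderer's nsi.h header and the parameter-passing conventions of subsection 2.2.2:

```
/* Illustrative sketch; requires nsi.h from the renderer. */
NSIContext_t ctx = NSIBegin(0, NULL);
/* ... create and connect nodes here ... */

int progressive = 1;
struct NSIParam_t params[] = {
    { "progressive", &progressive, NSITypeInteger, 0, 1, 0 }
};

NSIRenderControl(ctx, "start", 1, params); /* returns immediately */
NSIRenderControl(ctx, "wait", 0, NULL);    /* blocks until the image is done */
NSIEnd(ctx);
```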
2.3 The Lua API
No need to pass an nsi context to function calls since it is already embodied in the
nsi Lua table (which is used as a class).
The type parameter specified can be omitted if the parameter is an integer, real
or string (as with the Kd and filename in the example above).
nsi parameters can either be passed as a variable number of arguments or as
a single argument representing an array of parameters (as in the "ggx" shader
above)
There is no need to call NSIBegin and NSIEnd equivalents since the Lua script is
run in a valid context.
Function C equivalent
SetAttribute NSISetAttribute
SetAttributeAtTime NSISetAttributeAtTime
Create NSICreate
Delete NSIDelete
DeleteAttribute NSIDeleteAttribute
Connect NSIConnect
Disconnect NSIDisconnect
Evaluate NSIEvaluate
data - The actual attribute data. Either a value (integer, float or string) or an
array.
Here are some examples of well-formed parameters:
--[[ strings, floats and integers do not need a type specifier ]] --
p1 = {name="shaderfilename", data="emitter"};
p2 = {name="power", data=10.13};
p3 = {name="toggle", data=1};
Evaluate
"filename" "string" 1 ["test.nsi.lua"]
"type" "string" 1 ["lua"]
It is also possible to evaluate a Lua script inline using the script parameter. For
example:
Evaluate
"script" "string" 1 ["nsi.Create(\"light\", \"shader\");"]
"type" "string" 1 ["lua"]
Both a file name and an inline script can be specified to NSIEvaluate at the same time,
in which case the inline script will be evaluated before the file and both scripts will
share the same nsi and Lua contexts. Any error during script parsing or evaluation
will be sent to nsi's error handler. Note that all Lua scripts are run in a sandbox in
which all Lua system libraries are disabled. Some utilities, such as error reporting, are
available through the nsi.utilities class.
Evaluate
"filename" "string" 1 ["test.lua"]
"type" "string" 1 ["lua"]
"userdata" "color[2]" 1 [1 0 1 2 3 4]
print( nsi.scriptparameters.userdata.data[5] );
The error codes are the same as in the C api and are shown in Table 2.3.
Lua C equivalent
nsi.ErrMessage NSIErrMessage
nsi.ErrWarning NSIErrWarning
nsi.ErrInfo NSIErrInfo
nsi.ErrError NSIErrError
Note that since Lua is part of the api, one can use Lua files for api streaming.
1 The streamable nature of the RenderMan api, through rib, is an undeniable advantage.
Chapter 3
Nodes
The following sections describe available nodes in technical terms. Refer to chapter 5
for usage details.
an nsi context without being connected to the root, but it won't affect the render in
any way. The root node has the reserved handle name .root and doesn't need to
be created using NSICreate. The root node has two defined attributes: objects and
geometryattributes. Both are explained in section 3.12.
maximumraydepth.refraction............................................int (4)
Specifies the maximum bounce depth a refraction ray can reach.
maximumraydepth.specular..............................................int (2)
Specifies the maximum bounce depth a specular ray can reach (sometimes re-
ferred to as a glossy ray).
maximumraydepth.diffuse ............................................... int (3)
Specifies the maximum bounce depth a diffuse ray can reach.
maximumraydepth.hair...................................................int (4)
Specifies the maximum bounce depth a hair ray can reach. Note that hair is
akin to a volumetric primitive and might need elevated ray depth to properly
capture the illumination.
show.displacement......................................................int (1)
When set to 1, enables displacement shading. Otherwise, it must be set to 0,
which forces the renderer to ignore any displacement shader in the scene.
show.osl.subsurface....................................................int (1)
When set to 1, enables the subsurface() osl closure. Otherwise, it must be set
to 0, which will ignore this closure in osl shaders.
statistics.progress....................................................int (0)
When set to 1, prints rendering progress as a percentage.
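For instance, assuming the global node uses a reserved handle analogous to .root (the handle .global is an assumption here, not stated above), these attributes could be set in the stream format as:

```
SetAttribute ".global"
    "maximumraydepth.diffuse" "int" 1 [2]
    "statistics.progress" "int" 1 [1]
```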
P..........................................................................point
The positions of the object's vertices. Typically, this attribute will have the
NSIParamIndirect flag and will be addressed indirectly through a P.indices
attribute.
nvertices...................................................................int
The number of vertices for each face of the mesh. The number of values for this
attribute specifies total face number (unless nholes is defined).
faces........................................................................int
This attribute is a list of face indices. It identifies which faces of the original
geometry will be part of this face set.
nvertices...................................................................int
A single value which gives the number of vertices for each curve. This must be
at least 4.
P..........................................................................point
The positions of the curve vertices. The number of values provided, divided by
nvertices, gives the number of curves which will be rendered.
width ..................................................................... float
The width of the curves.
basis.....................................................string (catmull-rom)
The basis functions used for curve interpolation. Possible choices are:
b-spline B-spline interpolation.
catmull-rom Catmull-Rom interpolation.
Attributes may also have a single value, one value per curve, one value per vertex
or one value per vertex of a single curve, reused for all curves. Attributes which fall in
that last category must always specify NSIParamPerVertex. Note that a single curve
is considered a face as far as use of NSIParamPerFace is concerned.
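Putting the intrinsic attributes together, a minimal two-curve node might be declared as below (hypothetical handle and values; 8 points divided by an nvertices of 4 gives 2 curves):

```
Create "hair" "cubiccurves"
SetAttribute "hair"
    "nvertices" "int" 1 [4]
    "P" "point" 8 [
        0 0 0  0 1 0  0 2 0  0 3 0
        1 0 0  1 1 0  1 2 0  1 3 0 ]
    "width" "float" 1 [0.05]
    "basis" "string" 1 ["b-spline"]
```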
appear or disappear during the motion interval. Having such identifiers allows
the renderer to properly render such transient particles. This implies that the
number of ids might vary for each time step of a motion-blurred particle cloud
so the use of NSISetAttributeAtTime is mandatory by definition.
All other attributes on this node are considered parameters of the shader. They may
either be given values or connected to attributes of other shader nodes to build shader
networks . osl shader networks must form acyclic graphs or they will be rejected.
Refer to section 5.4 for instructions on osl network creation and usage.
visibility..................................................................int
This attribute controls the visibility for all ray types. When marching the at-
tribute visibility chain for a particular object, this attribute is gathered along the
way to determine the final visibility status. An example situation:
The visibility.specular attribute determined from the attribute chain
is 1 (visible to specular rays).
The visibility.diffuse attribute determined from the attribute chain is
3 (visible to diffuse rays).
The visibility attribute determined from the same chain is -2 (invisible
to everything).
The result makes the object invisible to specular rays (maxabs(1, -2) = -2)
and visible to diffuse rays (maxabs(3, -2) = 3).
matte....................................................................int (0)
If this attribute is set to 1, the object becomes a matte for camera rays. Its trans-
parency is used to control the matte opacity and all other closures are ignored.
Any extra attributes are also forwarded to the output driver which may interpret them
however it wishes.
pixelfilterwidth.........................................................float
The width of the pixel filter in pixels.
pixelaspectratio.........................................................float
Ratio of the physical width to the height of a single pixel. A value of 1.0 corre-
sponds to square pixels.
shutterrange ............................................................ double
Time interval during which the camera shutter is at least partially open. It is
defined by a list of exactly two values:
Time at which the shutter starts opening.
Time at which the shutter finishes closing.
shutteropening..........................................................double
A normalized time interval indicating the time at which the shutter is fully open
(a) and the time at which the shutter starts to close (b). These two values define
the top part of a trapezoid filter. The end goal of this feature is to simulate a
mechanical shutter on which opening and closing movements are not instantaneous. Figure 3.1
shows the geometry of such a trapezoid filter.
Figure 3.1: An example shutter opening configuration with a=1/3 and b=2/3.
fov........................................................................float
The field of view angle, in degrees.
fstop ..................................................................... float
Relative aperture of the camera.
focallength .............................................................. float
Focal length of the camera lens.
focaldistance ............................................................ float
Distance in front of the camera at which objects will be in focus.
aperturesides..........................................................integer
Number of sides of the camera's aperture. A value of zero is a special case that
results in a circular aperture.
apertureangle ............................................................ float
A rotation angle (in degrees) to be applied to the camera's aperture, in the image
plane.
fov........................................................................float
Specifies the field of view for this camera node, in degrees.
mapping..................................................................string
Defines one of the supported fisheye mapping functions:
equidistant Maintains angular distances.
equisolidangle Every pixel in the image covers the same solid angle.
orthographic Maintains planar illuminance. This mapping is limited to a
180-degree field of view.
stereographic Maintains angles throughout the image. Note that stereographic
mapping fails to work with fields of view close to 360 degrees.
fov........................................................................float
Specifies the vertical field of view, in degrees. The default value is 90.
horizontalfov ............................................................ float
Specifies the horizontal field of view, in degrees. The default value is 360.
eyeoffset.................................................................float
This offset allows rendering stereoscopic cylindrical images by specifying an eye
offset.
Chapter 4
Script Objects
It is a design goal to provide an easy to use and flexible scripting language for nsi. The
Lua language has been selected for such a task because of its performance, lightness
and features. A flexible scripting interface greatly reduces the need to have api
extensions. For example, what is known as conditional evaluation and Ri filters in
the RenderMan api are superseded by the scripting features of nsi.
To be continued . . .
Chapter 5
Rendering Guidelines
[Figure 5.1: a minimal scene graph: a geometry and a camera, each under a transform connected to .root]
A minimal (and useful) nsi scene graph contains the three following components:
1 For the scene to be visible, at least one of the materials has to be emissive.
The scene graph in Figure 5.1 shows a renderable scene with all the necessary
elements. Note how the connections always lead to the .root node. In this view, a
node with no output connections is not relevant by definition and will be ignored.
[Figure 5.2: attribute inheritance and override through attribute nodes connected along transforms to .root]
Those familiar with the RenderMan standard will remember the various ways to
attach information to elements of the scene (standard attributes, user attributes, prim-
itive variables, construction parameters2 ). In nsi things are simpler and all attributes
are set through the SetAttribute() mechanism. The only distinction is that some
attributes are required (intrinsic attributes) and some are optional: a mesh node needs
to have P and nvertices defined otherwise the geometry is invalid3 . In osl shaders,
attributes are accessed using the getattribute() function and this is the only way to
access attributes in nsi. Having one way to set and to access attributes makes things
simpler (a design goal) and allows for extra flexibility (another design goal). Figure 5.2
2 Parameters passed to Ri calls to build certain objects. For example, knot vectors passed to
RiNuPatch.
3 In this documentation, all intrinsic attributes are usually documented at the beginning of each section.
Attributes inheritance Attributes attached at some parent transform (in this case,
a metal material) affect geometry downstream.
Attributes override It is possible to override attributes for a specific geometry by
attaching them to a transform directly upstream (the plastic material overrides
metal upstream).
Note that any non-intrinsic attribute can be inherited and overridden, including vertex
attributes such as texture coordinates.
5.3 Instancing
Figure 5.3: Instancing in nsi with attribute inheritance and per-instance attribute
override
[Figure 5.4: a simple osl network: noise, read_attribute and read_texture shader nodes feeding a ggx_metal shader whose output connects to the surfaceshader attribute of an attributes node]
The semantics used to create osl networks are the same as for scene creation. Each
shader node in the network corresponds to a shader node which must be created using
NSICreate. Each shader node has implicit attributes corresponding to the shader's
parameters, and connections between said parameters are made using NSIConnect.
Figure 5.4 depicts a simple osl network connected to an attributes node. Some observations:
Listing 5.1: NSI stream to create the osl network in Figure 5.4
Figure 5.5: Various lights in nsi are specified using the same semantics
CHAPTER 5. RENDERING GUIDELINES 41
There are no special light source nodes in nsi (although the environment node, which
defines a sphere of infinite radius, could be considered as a light in practice). Any scene
geometry can become a light source if its surface shader produces an emission() closure.
Some operations on light sources, such as light linking, are done using more
general approaches (see section 5.8). What follows is a quick summary of how to
create different kinds of lights in nsi.
// Copyright (c) 2009-2010 Sony Pictures Imageworks Inc., et al. All Rights Reserved.
surface emitter [[ string help = "Lambertian emitter material" ]]
(
float power = 1 [[ string help = "Total power of the light" ]],
color Cs = 1 [[ string help = "Base color" ]])
{
// Because emission() expects a weight in radiance, we must convert by dividing
// the power (in Watts) by the surface area and the factor of PI implied by
// uniform emission over the hemisphere. N.B.: The total power is BEFORE Cs
// filters the color!
Ci = (power / (M_PI * surfacearea())) * Cs * emission();
}
surface spotLight(
    color i_color = color(1),
    float intensity = 1,
    float coneAngle = 40,
    float dropoff = 0,
    float penumbraAngle = 0 )
{
    /* Cosine of the angle between the shading normal and the direction
       toward the ray origin (assumed definition; not shown in the
       original listing). */
    float cosangle = dot(-normalize(I), N);
    color result = i_color * intensity * M_PI;
    if (dropoff > 0)
    {
        result *= clamp(pow(cosangle, 1 + dropoff), 0, 1);
    }
    Ci = result / surfacearea() * emission();
}
[Figure 5.6: an output driver connected through a camera and transform to .root]
Figure 5.6 depicts an nsi scene creating one file with three layers. In this case, all layers
are saved to the same file and the render uses one view. A more complex example
is shown in Figure 5.7: left and right cameras are used to drive two file outputs, each
having two layers (Ci and Diffuse colors).
[Figure 5.7: left and right cameras driving two file outputs, left.exr and right.exr, each with Diffuse and Ci layers]
Figure 5.8: Gathering contribution of a subset of lights into one output layer
The ability to render a certain set of lights per output layer has a formal workflow
in nsi. One can use three methods to define the lights used by a given output layer:
1. Connect the geometry defining lights directly to the outputlayer.lightset attribute.
2. Create a set of lights using the set node and connect it into outputlayer.lightset.
3. Use a combination of both 1 and 2.
Figure 5.8 shows a scene using method 2 to create an output layer containing only
illumination from two lights of the scene. Note that if there are no lights or light
sets connected to the lightset attribute then all lights are rendered. The final output
pixels contain the illumination from the considered lights on the specific surface variable
specified in outputlayer.variablename (section 3.14).
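Method 2 could be serialized as follows. The handles are hypothetical, and the assumption that the set node collects its members through an objects attribute (as the root node does) should be checked against the set node's documentation:

```
Create "keylights" "set"
Connect "lamp1" "" "keylights" "objects"
Connect "lamp2" "" "keylights" "objects"
Connect "keylights" "" "layer1" "lightset"
```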
The ghost transform has a visibility attribute set to -1, which makes the ghost
invisible to all ray types.
The hat of the ghost has its own attribute with a visibility set to 1, which makes
it visible to all ray types.
The mirror object has its own attributes and these are used to override the
visibility of the ghost as seen from the mirror. The nsi stream code to achieve
that would look like this:
Connect "mirror_attribute" "" "ghost_attributes" "visibility"
"value" "int" 1 [2]