Core Feature List

This page exists to suggest, detail, and track the features JPatch needs in order to reach the level of functionality expected by a typical user who wants to create animation.

Most of the actual discussion of these features takes place on the JPatch forum.

Features not required for “Core” but which should still be part of JPatch should be flagged with an [Advanced] tag.

Brief Overview

From the beginning, JPatch was envisioned as a tool for creating character-based animation. Although the primary focus was on spline-based models, features such as bones and morphs were always part of the design.

Because much of what was originally in JPatch was inspired by Hash’s Animation:Master, JPatch was initially seen by some as an A:M clone. However, this was never the goal of JPatch, and because of issues with supporting spline patches, JPatch eventually replaced spline-based patches with subdivision surfaces.

The initial scope of JPatch was much smaller than it is today. With the addition of a built-in renderer and animation module, JPatch has moved from being a tool for creating POV-Ray models to a stand-alone tool for creating animation.

Target Market

JPatch distinguishes itself from other programs by focusing on:

  • Character-based animation
  • Intuitive tools
  • Support for multiple renderers (POV-Ray, Renderman, Sunflow)
  • Platform independence (Windows, Macintosh, Linux)

Subdivision Surface Representation and Realtime display [Done]

Orthogonal Display

This mode uses orthographic projection to render objects.

Perspective Display

This mode uses perspective projection to render objects.

General half-wing representation of meshes

A half-wing representation stores mesh connectivity so that adjacency queries (which faces share an edge, which edges meet at a vertex) can be answered efficiently.
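
As a sketch of the general idea, here is a minimal half-edge structure in Java (the class names are illustrative, not JPatch’s actual code): each half-edge stores its twin, the next edge around its face, and its origin vertex, so adjacency queries become cheap pointer hops.

  // Minimal half-edge mesh sketch (illustrative, not JPatch's API).
  class HalfEdge {
      Vertex origin;     // vertex this half-edge starts at
      HalfEdge twin;     // oppositely oriented half-edge of the same edge
      HalfEdge next;     // next half-edge around the same face
      Face face;         // face this half-edge borders
  }

  class Vertex {
      double x, y, z;
      HalfEdge outgoing; // one arbitrary half-edge leaving this vertex
  }

  class Face {
      HalfEdge edge;     // one arbitrary half-edge of this face
  }

For example, all faces around a vertex can be visited by hopping twin/next pointers (this assumes a closed mesh; boundaries need an extra check):

  static void facesAround(Vertex v, java.util.function.Consumer<Face> visit) {
      HalfEdge e = v.outgoing;
      do {
          visit.accept(e.face);
          e = e.twin.next;   // rotate to the next outgoing half-edge
      } while (e != v.outgoing);
  }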

Infinitely sharp corners and creases, and "interpolate boundary" behavior

Subdivision surfaces refine a coarse control mesh into a smooth surface. However, there needs to be a way to tag edges and corners as “sharp” so that they are not automatically smoothed.

Non-integer corner sharpness

In addition to flagging a corner as “sharp”, it’s useful to be able to specify how sharp a corner should be.

Non-integer crease sharpness, varying crease sharpness

As with non-integer corners, being able to specify the sharpness of a crease (i.e. an “edge”) is also useful.
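
One common approach (Pixar-style semi-sharp creases) treats a crease of sharpness s as infinitely sharp for the first floor(s) subdivision levels, then blends between the sharp and smooth rules using the fractional remainder. A minimal sketch, assuming that scheme (not necessarily what JPatch implements):

  // Blend between the smooth and sharp subdivision rules for a vertex on
  // a crease with fractional sharpness (illustrative).
  static double[] creasePosition(double sharpness,
                                 double[] smoothPos, double[] sharpPos) {
      if (sharpness >= 1.0) return sharpPos;   // still fully sharp
      if (sharpness <= 0.0) return smoothPos;  // ordinary smooth rule
      double t = sharpness;                    // 0 < t < 1: blend
      return new double[] {
          smoothPos[0] * (1 - t) + sharpPos[0] * t,
          smoothPos[1] * (1 - t) + sharpPos[1] * t,
          smoothPos[2] * (1 - t) + sharpPos[2] * t
      };
  }

After each subdivision step the remaining sharpness is reduced by one (childSharpness = max(0, sharpness - 1)), so a crease of sharpness 2.5 stays fully sharp for two levels and is half-blended at the third.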

Dicing of hierarchical meshes

Subdividing a mesh consists of replacing each face with several smaller faces to approximate a curved surface. When the mesh becomes “sufficiently smooth”, subdivision stops.

The challenge is to perform this efficiently, balancing memory usage against the cost of recomputing values.
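
One possible approach is a lazy cache: each subdivision level is computed on demand from the level below it and discarded when the base mesh is edited. A rough sketch (the class and method names are hypothetical):

  // Hypothetical lazy cache of subdivision levels: levels are built on
  // demand and dropped after an edit, trading memory for recomputation.
  class SubdivisionCache {
      private final java.util.List<Mesh> levels = new java.util.ArrayList<>();

      SubdivisionCache(Mesh base) {
          levels.add(base);
      }

      Mesh level(int n) {
          while (levels.size() <= n) {
              Mesh prev = levels.get(levels.size() - 1);
              levels.add(subdivideOnce(prev));  // one refinement step
          }
          return levels.get(n);
      }

      void invalidate() {                       // call after any edit
          Mesh base = levels.get(0);
          levels.clear();
          levels.add(base);
      }

      private Mesh subdivideOnce(Mesh m) {
          /* Catmull-Clark refinement omitted */
          return m;
      }
  }

  class Mesh { }  // placeholder for the mesh type

A finer-grained version would invalidate only the regions of each level affected by an edit, rather than the whole cache.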

Rendering materials with basic material attributes

The built-in renderer needs to be able to render the mesh with at least some basic attributes, such as color and specularity.
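
For illustration, here is what a minimal attribute evaluation could look like, using the classic Phong model (a sketch, not JPatch’s actual shading code):

  // Minimal Phong-style shading sketch: diffuse * (N.L) plus
  // specular * (R.V)^shininess, returned as a grey-scale intensity.
  class BasicMaterial {
      double red, green, blue;  // surface color
      double diffuse = 1.0;     // diffuse reflectance
      double specular = 0.0;    // specular reflectance
      double shininess = 32;    // specular exponent

      // n = surface normal, l = direction to light, v = direction to
      // viewer; all assumed normalized.
      double shade(double[] n, double[] l, double[] v) {
          double nDotL = Math.max(0, dot(n, l));
          // reflect l about n: r = 2(n.l)n - l
          double[] r = { 2 * nDotL * n[0] - l[0],
                         2 * nDotL * n[1] - l[1],
                         2 * nDotL * n[2] - l[2] };
          double spec = Math.pow(Math.max(0, dot(r, v)), shininess);
          return diffuse * nDotL + specular * spec;
      }

      private static double dot(double[] a, double[] b) {
          return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
      }
  }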

Subdivision Surface Modeling

Assign materials to faces

Extrude/Intrude tool

Lathe tool

Add/Remove Vertices/Edges/Faces

Split faces

Corner/crease tool

Editing of level-2 vertices and edges

Editing of level 3,4 and 5 vertices and edges

General tools

JPatch 0.4 uses tools roughly modeled after those in A:M. These should probably be replaced by Maya-style tools. Here is a link to documentation on various manipulators in Blender.

Lasso select [Done?]

Rotate [Done]

The Rotate tool allows a point or bone to be rotated around a center point.

Move [Done]

The Move (Translate) tool allows a point or bone to be moved.

Scale [In Process]

The Scale tool allows the selected points to be scaled (resized).

Stretch/Squash

The Stretch tool allows a bone to be made longer or shorter. There should be a Maintain Volume option which will make the attached mesh fatter/thinner as the bone is squashed/stretched.

FIXME Will JPatch implement this, or will surge be implemented via scaling along the z axis?

Combo

The Combo tool combines the Rotate, Move and Scale tools into a single tool.

FIXME Will JPatch have a similar tool?

This is the Combo tool in Blender:

U/V Mapping Tools

U/V mapping refers to the process of “painting” a texture onto a mesh. This is done by “projecting” the 3D points of the model (x,y,z) onto a 2D map of the model (u,v), much like a 2D map of the Earth is created by projecting the 3D points of the Earth’s sphere onto a 2D plane. This 2D “map” can be colored using a traditional paint tool, and the result is then projected back onto the surface of the model.

Here’s a screenshot from Cheetah3D showing a 3D model, and its 2D texture map:

Because projecting a 3D surface onto a 2D map causes distortion, several different projection options are usually available. Additionally, different parts of a model may use different forms of projection.

U/V Maps can be used to control the following attributes of a model (depending on what the renderer supports):

  • Color: An image applied to a mesh
  • Transparency: The transparency (alpha) of a surface
  • Bump: Simulated 3D surface detail on a model
  • Specularity: How shiny a surface appears
  • Diffuse: How diffuse a surface appears
  • Reflectivity: How reflective a surface is

Various sorts of u/v maps (from this tutorial):

The maps applied to the model:

Planar Projection

Projects the points of the model onto 2D space by ignoring the z depth.

An example of planar projection:

Box Projection

A variant of Planar Projection that projects each part of the mesh onto whichever of a box’s six faces it most directly faces.

Cylindrical Projection

Projects the points of the model’s mesh outward as if the map were wrapped around the object like a cylinder.

Spherical Projection

Projects the points of the model’s mesh outward as if the map were wrapped around the object like a sphere.
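
All three projections amount to simple coordinate transforms. A minimal sketch of each, assuming the model is centered at the origin (real tools add axis selection, scaling, and normalization):

  // u/v coordinates from each projection (illustrative).
  static double[] planarUV(double x, double y, double z) {
      return new double[] { x, y };  // drop the z depth
  }

  static double[] cylindricalUV(double x, double y, double z) {
      double u = (Math.atan2(z, x) + Math.PI) / (2 * Math.PI);  // angle around the axis
      return new double[] { u, y };  // v runs along the axis
  }

  static double[] sphericalUV(double x, double y, double z) {
      double r = Math.sqrt(x * x + y * y + z * z);
      double u = (Math.atan2(z, x) + Math.PI) / (2 * Math.PI);  // longitude
      double v = Math.acos(y / r) / Math.PI;                    // latitude
      return new double[] { u, v };
  }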

LSCM (Least Squares Conformal Map) [2.0]

LSCM (Least Squares Conformal Map) is an advanced method of mapping which attempts to automatically create a u/v map with the least amount of distortion by preserving the angles of polygons in the mesh.

This has been implemented in Blender 2.42.

Material editor

The material editor in JPatch is based heavily on the material attributes in POV-Ray.

It would be nice if JPatch were able to call the specific renderer instead of using the internal renderer (i.e. using POV-Ray or Sunflow to create the preview image).

Model Import [In Progress]

off

obj

jpt [Done]

JPatch will be able to import models created with the spline modeler (prior to version 7.x).

Model Export

sds

rib

RIB is the scene description format for Renderman-compliant renderers.

polygon

off

obj

pov

POV is the format used by the Persistence of Vision raytracer, a popular cross-platform raytracer.

Rendering

Users must be able to create renders with JPatch out of the box. For users who need features not supplied by JPatch’s embedded renderer, JPatch should be able to communicate to external renderers.

JPatch currently opens a modal render window to display the current frame being rendered, similar to Blender.

Internal Renderers

These exist so that users will be able to use JPatch as a stand-alone application, instead of having to rely on an additional tool to perform rendering.

OpenGL Rendering (Previz)

JPatch needs a fast renderer for “Previz” (pre-visualization) purposes. Currently, JPatch uses OpenGL, which is accessible from Java through bindings such as JOGL.

Raytraced Rendering

JPatch currently uses Inyo, an open-source Java based raytracer. It is expected that Inyo will be replaced by Sunflow, another open-source Java-based raytracer. Sunflow has more features than Inyo, and enjoys strong community support.

Sunflow also has a fast progressive refinement mode, which would be useful to integrate into JPatch’s viewport.

External Rendering

For users who need features that cannot be found in the built-in renderer, JPatch needs to be able to work with external renderers. JPatch was initially conceived as a modeling tool with no rendering capabilities, relying on either POV-Ray or Renderman for rendering.

JPatch communicates to the external renderer by creating a data file describing the frame to be rendered. It then creates a thread that calls the external renderer and passes the name of the file to be rendered. When the external renderer is finished rendering the frame, the thread is notified, and JPatch displays the rendered frame to the user.
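
A rough sketch of that flow in Java (the renderer command and flag shown are placeholders, not JPatch’s actual invocation):

  import java.io.File;

  // Write the scene file first, then run this job on its own thread.
  class ExternalRenderJob implements Runnable {
      private final File sceneFile;
      private final RenderListener listener;

      ExternalRenderJob(File sceneFile, RenderListener listener) {
          this.sceneFile = sceneFile;
          this.listener = listener;
      }

      public void run() {
          try {
              Process p = new ProcessBuilder(
                      "povray", "+I" + sceneFile.getPath())  // hypothetical command line
                      .start();
              p.waitFor();                     // block until the frame is done
              listener.frameFinished(sceneFile);
          } catch (Exception e) {
              listener.frameFailed(sceneFile, e);
          }
      }
  }

  interface RenderListener {
      void frameFinished(File sceneFile);
      void frameFailed(File sceneFile, Exception cause);
  }

  // Usage: new Thread(new ExternalRenderJob(file, listener)).start();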

In theory, it is possible to have JPatch watch the image file as it is being written, so the user could see the frame as it is being generated instead of waiting for the entire frame to be rendered. POV-Ray supports this.

JPatch can fairly easily be adapted to work with other renderers. For example, a fork of JPatch was modified to use Sunflow as an external renderer.

POV

The Persistence of Vision (POV) raytracer is a popular cross-platform raytracer. JPatch currently supports POV.

Renderman

In addition to being Pixar’s flagship rendering tool, Renderman is also a specification. A list of Renderman-compliant renderers can be found here. Free and open source versions of Renderman include Aqsis, jrMan and Pixie.

Rigging

Rigging is the process of setting up a model so that it can be animated. It typically refers to the process of adding bones to the model, but also includes morphs, poses, and other issues.

A hierarchical bone rig in A:M:

Morph editor

This allows the points on the mesh to be edited and stored to simulate “muscle” actions, such as smiles, blinks, mouth shapes, muscle bulges, and so on.

A morph’s weight ranges from -1 to 1. There can be multiple morph targets for a single morph - for example, targets at -1, 0.24, and 1.0.

This was implemented in JPatch 0.4.
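
Evaluating such a morph amounts to finding the two targets that bracket the current weight and blending between them. A hypothetical sketch (target weights are assumed sorted; each target stores per-vertex offsets from the base mesh):

  // Blend the two morph targets that bracket the current weight.
  // positions = sorted target weights (e.g. -1, 0.24, 1.0);
  // offsets[t][v] = xyz offset of vertex v in target t.
  static double[][] evaluateMorph(double weight,
                                  double[] positions,
                                  double[][][] offsets) {
      if (weight <= positions[0]) return offsets[0];
      int last = positions.length - 1;
      if (weight >= positions[last]) return offsets[last];
      int i = 0;
      while (weight > positions[i + 1]) i++;   // find the bracketing pair
      double t = (weight - positions[i]) / (positions[i + 1] - positions[i]);
      double[][] a = offsets[i], b = offsets[i + 1];
      double[][] out = new double[a.length][3];
      for (int v = 0; v < a.length; v++)
          for (int c = 0; c < 3; c++)
              out[v][c] = a[v][c] * (1 - t) + b[v][c] * t;
      return out;
  }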

Skeleton editor

This allows a skeleton to be assigned to a mesh, so that moving the bones of the skeleton manipulates the mesh.

This was implemented in JPatch 0.4.

A rigged character’s bones in A:M:

Smartskin Morphs

This is the same as a regular morph, but is bound to a particular bone rotation. For example, posing a character using only bones will often cause the mesh to be distorted unrealistically, because there are no underlying muscles to support the mesh. Using “smartskin” morphs, morphs can be assigned to various bone angles, so that as the bone flexes at a particular angle and axis, the morph is applied to correct the distortion.

Automatic vertex to bone binding

This causes mesh points to be automatically assigned to the bones they are closest to. Each bone is assigned a separate color, and mesh points bound to a bone are set to the same color as that bone. If a bone is flagged as not affecting any mesh points, that bone is ignored.

This method often requires correction of points, but is faster than assigning all the mesh points manually.

This was implemented in JPatch 0.4.
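
The assignment itself is straightforward: for each mesh point, find the nearest bone, measuring the distance to a bone as the distance to its start-end segment. A sketch of the idea (illustrative, not JPatch’s code):

  class Bone {
      double[] start, end;          // segment endpoints
      boolean deformsMesh = true;   // false = flagged as ignored
  }

  // Return the index of the closest deforming bone, or -1 if none.
  static int closestBone(double[] p, Bone[] bones) {
      int best = -1;
      double bestDist = Double.MAX_VALUE;
      for (int i = 0; i < bones.length; i++) {
          if (!bones[i].deformsMesh) continue;  // skip flagged bones
          double d = distanceToSegment(p, bones[i].start, bones[i].end);
          if (d < bestDist) { bestDist = d; best = i; }
      }
      return best;
  }

  static double distanceToSegment(double[] p, double[] a, double[] b) {
      double[] ab = { b[0] - a[0], b[1] - a[1], b[2] - a[2] };
      double[] ap = { p[0] - a[0], p[1] - a[1], p[2] - a[2] };
      double len2 = ab[0] * ab[0] + ab[1] * ab[1] + ab[2] * ab[2];
      double t = len2 == 0 ? 0 : Math.max(0, Math.min(1,
              (ap[0] * ab[0] + ap[1] * ab[1] + ap[2] * ab[2]) / len2));
      double dx = p[0] - (a[0] + t * ab[0]);
      double dy = p[1] - (a[1] + t * ab[1]);
      double dz = p[2] - (a[2] + t * ab[2]);
      return Math.sqrt(dx * dx + dy * dy + dz * dz);
  }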

Weighting tools [Advanced]

A mesh point may be influenced by more than one bone. Weighting defines how much influence a particular bone has on a point.
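
The usual technique here is linear blend skinning: a point’s deformed position is the weighted sum of where each influencing bone’s transform would carry it. A minimal sketch, assuming each point’s weights sum to 1:

  interface Matrix4 {
      double[] transform(double[] p);  // rest-pose to posed transform
  }

  // Deform one point influenced by several bones (illustrative).
  static double[] skin(double[] restPos, int[] boneIds, double[] weights,
                       Matrix4[] boneTransforms) {
      double[] out = new double[3];
      for (int i = 0; i < boneIds.length; i++) {
          double[] moved = boneTransforms[boneIds[i]].transform(restPos);
          out[0] += weights[i] * moved[0];
          out[1] += weights[i] * moved[1];
          out[2] += weights[i] * moved[2];
      }
      return out;
  }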

Discussion

David: Is this even planned for JPatch?

Poses

A Pose consists of both morphs and bone rotations. For example, a mouth shape may require a bone rotation (to open the jaw) and morphs (to pose the lips). Another common example would be a hand pose, such as a clenched fist.

Discussion

David: From the user’s point of view, Poses and Morphs can probably be combined into a single interface.

Pose-control widgets (as in "Stop staring")

Actions

An action is a short snippet of animation.

Discussion

David: Can Actions be replaced by Clips? They seem to be doing the same thing.

Hair [2.0]

Discussion

David: I’ve already implemented a hair simulation that works well. The main thing it’s missing is collision detection, which is generally pretty expensive. Most programs use proxies (for example, spheres) which are a lot cheaper than performing collision detection with a mesh.

Still, this isn’t a high priority item for JPatch. From the UI side, there would be a need to be able to assign hair to the mesh, assign color to the hair, edit the hair - the actual hair simulation turns out to be the simplest part!

Animation

Timeline

The timeline allows moving to a particular frame in the animation, as well as a method for playing back a range of frames.

The Maya timeline:

Discussion David: While Googling around, I found this suggested redesign for Maya's timeline. It’s got some interesting ideas in it. (Some of them may already have been incorporated.)

Motion Curve Editor

Displays the animation keys in a 2D format, and shows how each key is interpolated to the next key. The following types of interpolations are supported:

  • Instantaneous: Key values jump instantly from one value to the next.
  • Linear: Key values progress linearly from one value to the next.
  • Spline: Key values follow a spline curve, giving values an automatic ease-in and ease-out.

By default, JPatch applies a Spline interpolation. JPatch also automatically adjusts the tangents to prevent spline undershoot and overshoot.
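
For illustration, here is one common way the three modes and the tangent adjustment could be implemented (a sketch, not JPatch’s actual code). The spline mode uses a cubic Hermite basis, and auto tangents are flattened at local extremes to prevent overshoot:

  // k0/k1 are the bracketing key values; t runs from 0 to 1 between them.
  static double instantaneous(double k0, double k1, double t) {
      return t < 1.0 ? k0 : k1;        // hold k0, jump at the next key
  }

  static double linear(double k0, double k1, double t) {
      return k0 * (1 - t) + k1 * t;
  }

  // m0/m1 are the tangents at the two keys (cubic Hermite basis).
  static double spline(double k0, double k1, double m0, double m1, double t) {
      double t2 = t * t, t3 = t2 * t;
      return (2 * t3 - 3 * t2 + 1) * k0 + (t3 - 2 * t2 + t) * m0
           + (-2 * t3 + 3 * t2) * k1 + (t3 - t2) * m1;
  }

  // Auto tangent: Catmull-Rom slope, flattened when the key is a local
  // extreme so the curve cannot overshoot it (assumes keys one unit
  // apart; scale by the key spacing otherwise).
  static double autoTangent(double prev, double key, double next) {
      if ((key >= prev && key >= next) || (key <= prev && key <= next))
          return 0.0;                  // local extreme: flat tangent
      return (next - prev) * 0.5;
  }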

Here’s an example of overshoot in Blender:

Here’s the example automatically corrected:

Spline values may be adjusted manually.

Here’s the Maya Graph Editor:

Here’s a shot of A:M’s motion curve editor (from Jeff Lew’s training video):

Dope Sheet

The Dope Sheet is similar to the Motion Curve Editor in that it shows the keys for each element. However, the Dope Sheet displays only the position of the keys along the timeline, not the actual key values themselves.

The Dope Sheet makes it easy to adjust the timing of an animation.

Here’s an example of the Dope Sheet in A:M, with a block of keys selected:

Scene graph [2.0]

A scene graph is a hierarchical representation of the scene being animated.

Limits and Constraints

A constraint keeps something within a particular range, or ties it to another element of the scene.

The current design distinguishes between limits and constraints. Each floating-point attribute can be bounded by limits (a minimum and/or maximum). Limits are always enforced immediately, which means they are “stronger” than constraints: a bounded attribute is never allowed to take a value outside of its limits, not even temporarily. Constraints, in contrast, are enforced in a separate step. First the node is set up according to its location within the scene graph, then the constraints on the node are applied (they too have an order, so one constraint can override another).

This also implies that, to prevent loops, a node cannot be constrained to one of its descendants in the scene graph. There also needs to be a loop check for constraints (e.g. you can’t constrain A to B, B to C, and C to A).
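
The loop check can be a simple reachability test over the constraint graph: before adding a constraint, check whether the source is already reachable from the target. A hypothetical sketch:

  class Node { }  // scene-graph node (stub)

  // True if constraining 'source' to 'target' would close a cycle,
  // i.e. 'source' is already reachable from 'target' via constraints.
  static boolean wouldCreateLoop(Node source, Node target,
          java.util.Map<Node, java.util.List<Node>> constraintTargets) {
      java.util.Deque<Node> stack = new java.util.ArrayDeque<>();
      java.util.Set<Node> seen = new java.util.HashSet<>();
      stack.push(target);
      while (!stack.isEmpty()) {
          Node n = stack.pop();
          if (n == source) return true;
          if (!seen.add(n)) continue;
          for (Node next : constraintTargets.getOrDefault(n,
                  java.util.Collections.emptyList()))
              stack.push(next);
      }
      return false;
  }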

Spherical Limits

The minimum and maximum rotation angle for a bone. This prevents armatures from being twisted into impossible poses.

Aim At Constraint

Aim At aims a bone at a target bone. For example, this can be used to have guns on a turret automatically follow another object, or a character’s eyes follow a moving object.

Here’s an example from the Art of Illusion user manual:

Note the “cross-eyed” effect. This would be taken care of by having only one eye constrained using Aim At, and the other eye constrained to the first eye with an Orient Like constraint.

In his training video, Jeff Lew notes that Aim At can be used to constrain elbows and knees, to get better control over the IK solver. Of course, this assumes there’s a full IK solver running, which 1.0 won’t have.
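
At its core, an Aim At constraint builds an orientation whose forward axis points from the bone to the target. A minimal sketch, assuming the up hint is not parallel to the aim direction:

  // Build a rotation (as three orthonormal axis rows) aiming at a target.
  static double[][] aimAt(double[] bonePos, double[] targetPos, double[] up) {
      double[] fwd = normalize(new double[] {
              targetPos[0] - bonePos[0],
              targetPos[1] - bonePos[1],
              targetPos[2] - bonePos[2] });
      double[] right = normalize(cross(up, fwd));
      double[] newUp = cross(fwd, right);
      return new double[][] { right, newUp, fwd };
  }

  static double[] cross(double[] a, double[] b) {
      return new double[] { a[1] * b[2] - a[2] * b[1],
                            a[2] * b[0] - a[0] * b[2],
                            a[0] * b[1] - a[1] * b[0] };
  }

  static double[] normalize(double[] v) {
      double len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
      return new double[] { v[0] / len, v[1] / len, v[2] / len };
  }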

Aim Roll At Constraint

The bone copies the roll of the target bone.

Attach To Constraint

Attach To is exactly the same as applying both the Translate Like and Orient Like constraints. This can be used to attach an object to another object.

Discuss

Reflex has a cool graphical way of representing this:

Where possible, using a graphical mechanism to represent constraints is a good thing.

Sascha: I’d use just “attach to” constraints instead of separate translate-like and orient-like constraints. The “attach to” constraint would have 6 checkboxes (translate like x, y and z and rotate like x, y and z), so the user can turn on/off the individual translate and orient constraints per axis.

David: I like this idea a lot, but I think some sort of icon at the attachment point should be used to indicate if Translate Like (a pin?) and Orient Like (a curved arrow?) are active.

Constrain To Path Constraint [2.0]

Constrain To Path causes a bone to follow a spline path. JPatch doesn’t currently implement spline paths.

Kinematic/IK Solver Constraint[2.0]

The IK chain follows a target bone.

Orient Like Constraint

This causes rotations applied to one bone to be applied to another bone. Consider an object constrained to an airplane that is performing a loop. If only the Translate Like constraint is applied, the constrained object will move with the airplane, but not rotate with it. The Orient Like constraint takes care of this, causing the object to rotate with the constraining object.

Translate Like Constraint

This causes a Translate action applied to one bone to be applied to another bone as well. For example, if you wanted to place an object in a moving car, you could constrain the object’s root bone to Translate Like the car’s root bone. Moving the car would then cause the object to move with the car.

Rotate Plane IK Constraint

This constrains the IK solver to find a solution that lies in a given plane. A Rotate Plane IK solver is simpler to implement than a full IK solver. This is a simple one-joint IK chain, with one end locked in place:
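
For a one-joint chain the solution is analytic: the joint angle follows from the law of cosines, and the rotate plane then determines which way the joint bends. A sketch:

  // Interior joint angle for a two-bone chain reaching for a target.
  // upperLen/lowerLen are the bone lengths, targetDist the distance
  // from the chain root to the IK target.
  static double jointAngle(double upperLen, double lowerLen,
                           double targetDist) {
      // An out-of-reach target simply straightens the chain.
      double d = Math.min(targetDist, upperLen + lowerLen);
      double cos = (upperLen * upperLen + lowerLen * lowerLen - d * d)
                 / (2 * upperLen * lowerLen);
      return Math.acos(Math.max(-1, Math.min(1, cos)));
  }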

Discuss

David: Version 1.0 will have limited support for IK. This is primarily useful when you want to lock feet in place, and move the body. The feet of the character will rotate around (since they aren’t part of the chain); this can either be manually corrected, or fixed by constraining the feet with an Aim At bone. This is also useful for being able to manipulate an arm by grabbing the wrist and dragging it.

Onion skinning

An example of Onion Skinning in Reflex:

Discussion These are called “ghosts” in Reflex.

Sascha: I’d move onion skinning under animation. Since actions are simple animations there is some overlap though. Are cycles special cases of actions, or should they be treated specially?

Walk Cycles

Walk Cycles allow an animator to create a reusable walk, and have it automatically applied to a path. For example, the animator might create a walk cycle that takes 12 frames for the character to move 8 units. The cycle can then be applied to an arbitrary path over any arbitrary time, and JPatch will automatically calculate the correct rate of speed for the walk cycle to be applied.
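
The retiming is a simple proportion: the distance traveled along the path, divided by the stride length, tells JPatch which frame of the cycle to display. A sketch using the numbers from the example above:

  // Which frame of the cycle to show after moving a given distance.
  // strideLength = units covered per cycle (8 in the example),
  // cycleFrames = frames per cycle (12 in the example).
  static double cycleFrameAt(double distanceAlongPath,
                             double strideLength, double cycleFrames) {
      double cycles = distanceAlongPath / strideLength;
      return (cycles * cycleFrames) % cycleFrames;
  }

Driving the cycle by distance traveled, rather than by elapsed time, is what keeps the feet from appearing to slip.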

Here’s an example from the Blender User Manual

Here is a link to an image of Animation:Master’s stride editor.

Discuss

David:

Typically, a path is a user-defined spline. However, since moving a character from one point to another automatically defines a spline path, it would seem that there would be no need to create a separate path object. Here’s a shot from Reflex, which displays motion curves (another neat feature to have in JPatch):

When a cycled action is activated, JPatch could automatically display the “path” for the range of that walk cycle.

It is important to accurately measure the length of a stride (the distance a character moves in a single walk cycle), because if the stride length is incorrectly measured, it will appear that the character’s feet are “slipping”.

Blender tracks the following parameters for its Actions:

  • Timeline Range:
    • Strip Start: When the strip starts, relative to the timeline
    • Strip End: When the strip ends, relative to the timeline
  • Action Range:
    • Action Start: When the action begins, relative to the strip
    • Action End: When the action ends, relative to the strip
  • Blending:
    • Blendin: How many frames to ease into the action
    • Blendout: How many frames to ease out of the action
  • Options:
    • Repeat: How many times the action is repeated
    • Stride: Stride length
    • Use Path: Constrains an object to a path
    • Hold: ?
    • Add: Does this action add to or replace a prior action?

These can be incorporated into the idea of Animation Layers.

Animation Layers/Action Strips

Reflex has a concept where the timeline actually consists of multiple, overlaying timelines called layers:

Use Layers in Reflex much as you would use layers in a common graphics editor. Layer movements to create the primary action, add secondary actions on top, try something new you're unsure of in a new Layer, or switch between two or more versions of the same movement to decide which you like better. You can cycle animation on individual layers, as well. Want to increase or decrease the amplitude of a movement? Layer envelopes work similar to Layer "opacity" in graphics editors but the envelope values can vary over time.

You can toggle layers on, off, and solo, similar to using a multitrack recorder:

Layers are a convenient way to create reusable motions (perhaps “clips” would be a good name for them), as well as cycles (such as walk cycles).

So each layer would have the following options:

  • On: Can be used to turn a layer on or off.
  • Lock: Used to prevent a layer from being edited.
  • Cycle: If non-zero, length of cycle.
  • Name: Name of layer
  • Relative: If on, the position of the character will be relative, not absolute.
  • Layer Intensity Envelope: How much of the layer to add (see the sketch below).
  • Layer Rate Envelope: How quickly to play back layer.
  • Restart: Reset the current cycle frame to 0.
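
A rough sketch of how the intensity and rate envelopes could combine layers (the Layer class here is hypothetical):

  abstract class Layer {
      boolean on = true;
      double cycleLength = 0;                  // 0 = no cycling
      abstract double intensity(double time);  // intensity envelope
      abstract double rate(double time);       // rate envelope
      abstract double valueAt(double localTime);
  }

  // Sum the contribution of every active layer at a given time.
  static double evaluateLayers(Layer[] layers, double time) {
      double result = 0;
      for (Layer layer : layers) {
          if (!layer.on) continue;
          double localTime = time * layer.rate(time);  // rate warps time
          if (layer.cycleLength > 0)
              localTime %= layer.cycleLength;          // cycling layer
          result += layer.intensity(time) * layer.valueAt(localTime);
      }
      return result;
  }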

Blender does something similar with their NLA (non-linear animation) tools, where strips of reusable animations can be used to build up animations:

Audio

At a minimum, JPatch needs to allow the following:

  • Optionally loading an audio file for playback with a scene.
  • Mute button for audio playback.
  • Volume control.
  • Start of playback, either in time or frame number (in case audio needs an offset).
  • Playback of audio file with animation playback on the timeline.
  • Playback of audio with “scrubbing” along the timeline.
  • Playback of audio with preview of rendered frames.

Synchronized Sound Playback

Sound playback can be optionally toggled on to match playback of animation, or rendered frames.

Lipsync Editor

The lipsync editor allows the animator to assign visemes (facial expressions associated with sounds) to a pre-recorded audio track. Since visemes are generally mutually exclusive, the lipsync editor should make it possible to select a single viseme (and strength) per sound, and automatically replace the prior viseme.

There should be some way to match phonemes or words to the timeline or waveform, since that would make it easier both to create a lipsync, and match up expressions to words.

The dopesheet for lipsync in A:M is hierarchical, breaking down words into phonemes automatically:

Discussion David: It would be nice to incorporate something like Yolo into JPatch.

Simple audio editing

Discussion David: I’m not sure what you had in mind here, but I’d leave it out, if possible.

Advanced

Rendering layers

OpenEXR support

Post-processing effects

  • Layer compositing
  • Transitions
  • Motion blur
  • Focal blur

Material editor

The Material Editor in Maya:

  • Java shaders (Sunflow?)
  • RSL shaders (RenderMan)
  • GLSL shaders (OpenGL 2.x)

Project Management

Project Management involves how the assets of a JPatch project are maintained. JPatch should support:

  • Copying assets (moving projects from one computer to another)
  • Reuse of assets (models, scenes, animations)
  • Sharing assets (collaboration)

JPatch 0.4 uses an XML file format.

Status: The workspace-manager class, predefined folder structure, and local history are currently being worked on.

Workspaces and project directory structures (as in Eclipse)

Local history with rollback functions (as in Eclipse)

Collaboration tools

Projects like IMP (the Internet Movie Project) use computer animation tools to create collaborative films over the Internet. JPatch should be designed with the goal of supporting such projects. This includes:

  • Use of XML as a common file format
  • Sharing of projects on CVS/SVN servers

Collada [2.0]

Collada (COLLAborative Design Activity) defines an open standard XML schema for exchanging digital assets among various graphics software applications that might otherwise store their assets in incompatible formats. COLLADA documents that describe digital assets are XML files, usually identified with a .dae (digital asset exchange) filename extension.

The COLLADA Schema supports all the features that modern 3D interactive applications need, including programmable shader effects and physics simulation. It can also be easily extended by the end users for their own specific purposes. In other words, COLLADA is not designed to be a temporary data transport mechanism, but rather to be the schema for the source data for your digital assets. It is not designed as a delivery mechanism, but to be a content holder for any target platform. COLLADA’s choice of language is XML, thus gaining many of the benefits of the eXtensible Markup Language including native support of international UTF-8 encoding. The COLLADA XML Schema document is publicly accessible on the Internet for online content validation.

Discussion The wikipedia COLLADA article notes that “…some applications have adopted COLLADA as their native format or as one variety of native input rather than simply using it as an intermediate format.” It may be worth looking into COLLADA as a native format for JPatch.

In any event, this may be a useful format to support if it gains widespread adoption and support.

Hair and Feathers [Advanced]

Creating a hair simulator isn’t that difficult; there’s already one written in Java that can be used. However, it’ll take a bit of work to integrate into JPatch. There needs to be some way to:

  • Specify that polys in JPatch have hair, along with the color and density of the hair. Most systems seem to have adopted a UV mapping scheme.
  • Groom the hair. Some sort of “brush” tool is generally the most intuitive system.
  • Specify the physics of the system - how springy it is, etc.
  • Handle collisions. This is typically done by using proxy objects, such as spheres. (This would be helpful for cloth simulation as well).

Hair guides (from A:M):

Many renderers already have support for “tubelike” objects, so rendering is probably not that problematic.

Handling feathers, leaves and other such objects adds another level of complexity, because “hairs” would need to be replaced with some sort of object, typically a “card” with a texture applied. Renderers like Sunflow support instancing of objects, but with the large numbers of objects that feathers would require, rendering would still take a large amount of time.

Because of this issue, it might make sense for JPatch to have a specialized rendering system which could handle hair and feathers, and be composited with other rendering pipelines.

An interesting feature in the A:M hair animation system is that the hair “thickness” can be specified with a graph:

This can then be applied to hairs, to create leaves or feathers:

 