Tuesday, 19 November 2013


Creating and Rendering Sparks

This session demonstrates sparks from a collision, as well as from a pulsing (expression-driven) emitter. Starting from scratch, we generate the elements required to produce the effect: a volume emitter, particles, fields, animation, expressions, collision events, a particle shader and utility nodes. Sparks are a fairly common yet simple effect that can be created effectively using Maya dynamics. For rendering we use the Tube software render type as the final output, but we also discuss how sparks can be rendered using the MultiStreak hardware render type.
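The step-by-step example below concentrates on the hardware streak setup; for the collision-driven version listed above, the wiring might look roughly like the following Maya Python (maya.cmds) sketch. The object names (ground, sourceParticles, sparkParticles) and all numeric values are placeholders for illustration, not values from the session.

import maya.cmds as cmds

# Placeholder scene: a ground plane, a source particle object that will
# collide with it, and an empty spark particle object for the event to fill.
ground = cmds.polyPlane(width=20, height=20, name='ground')[0]
source, sourceShape = cmds.particle(name='sourceParticles')
sparks, sparksShape = cmds.particle(name='sparkParticles')

# Something to drive the source particles toward the ground.
emitter = cmds.emitter(type='directional', rate=100, speed=10.0,
                       directionX=0, directionY=-1, directionZ=0)[0]
cmds.move(0, 5, 0, emitter)
cmds.connectDynamic(source, emitters=emitter)

# Make the ground a collision object and connect it to the source particles.
cmds.collision(ground, resilience=0.3, friction=0.2)
cmds.connectDynamic(source, collisions=ground)

# Collision event: every collision emits a burst of particles into the
# spark object.
cmds.event(source, count=1, emit=10, spread=0.3, target=sparks, die=False)

# Tube (9) is the software render type used for the final output here;
# MultiStreak (1) is the hardware alternative mentioned above.
cmds.setAttr(sparksShape + '.particleRenderType', 9)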
                                   
The following example uses hardware particles to create sparks from a circular cutting saw and examines how they can be integrated with a scene.
Create a ground plane and an emitter. Sparks generally streak as they move through space and exhibit several other distinctive properties in terms of lifespan and colour. Set the emitter to be directional so that the sparks can fly off in the direction of the cutting saw's rotation, and set the direction to 1.0 in Y. It is far easier and more intuitive to have the emission occur in a single direction and simply rotate the emitter than to control the sparks' direction through the direction attributes. Set the rate to 300 and the speed to 15.0, change the particle type to Streak, and click the Current Render Type button so that you can start changing the look of the sparks (a scripted version of this setup is sketched after the list below). Several particle attributes will need to be changed for the sparks to look like the real thing:

- Spark shape
- Lifespan
- Colour
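As a rough guide, the emitter and particle setup described above might be scripted like this in Maya Python (maya.cmds); the object names and values are placeholders and can just as easily be set in the Attribute Editor.

import maya.cmds as cmds

# Ground plane for reference (and for later collisions).
ground = cmds.polyPlane(width=20, height=20, name='ground')[0]

# Directional emitter: emission along +Y, rotated later to aim the sparks.
emitter = cmds.emitter(type='directional', rate=300, speed=15.0,
                       directionX=0, directionY=1, directionZ=0,
                       name='sparkEmitter')[0]

# Particle object connected to the emitter.
sparks, sparksShape = cmds.particle(name='sparkParticles')
cmds.connectDynamic(sparks, emitters=emitter)

# Streak render type (6): the scripted equivalent of choosing Streak and
# clicking the Current Render Type button in the Attribute Editor.
cmds.setAttr(sparksShape + '.particleRenderType', 6)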
The spark shape will be a combination of the render attributes scaled by the speed of the particles, so it is important to get the movement right first. Rotate the emitter so that the particles are emitted at a 45-degree slant to the ground plane, and add a gravity field with a low magnitude and no attenuation so that the particles want to return to the ground. Once the speed is correct, set the Tail Size and Line Width attributes so that the sparks begin to take shape. Because the tail extends backward from the spark position, you will need to change the min/max distance on the emitter so that the spark tails appear from the emitter and not before it.
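Continuing the sketch above, the movement and shape adjustments might look like the following; the gravity magnitude, lifespan, tail size, line width and distance values are only starting points to tune by eye.

import maya.cmds as cmds

# Tilt the emitter so particles leave at roughly 45 degrees to the ground.
cmds.rotate(0, 0, 45, 'sparkEmitter')

# Weak, un-attenuated gravity so the sparks arc back toward the ground.
gravity = cmds.gravity(magnitude=2.0, attenuation=0.0)[0]
cmds.connectDynamic('sparkParticles', fields=gravity)

# Constant lifespan so sparks fade out rather than live forever.
cmds.setAttr('sparkParticlesShape.lifespanMode', 1)   # 1 = constant
cmds.setAttr('sparkParticlesShape.lifespan', 1.5)

# Streak render attributes (normally added by the Current Render Type button).
if not cmds.attributeQuery('tailSize', node='sparkParticlesShape', exists=True):
    cmds.addAttr('sparkParticlesShape', longName='tailSize',
                 attributeType='float', defaultValue=1.0)
if not cmds.attributeQuery('lineWidth', node='sparkParticlesShape', exists=True):
    cmds.addAttr('sparkParticlesShape', longName='lineWidth',
                 attributeType='long', defaultValue=1)
cmds.setAttr('sparkParticlesShape.tailSize', 2.0)
cmds.setAttr('sparkParticlesShape.lineWidth', 2)

# Push emission slightly away from the emitter so the tails do not draw
# behind it.
cmds.setAttr('sparkEmitter.minDistance', 0.5)
cmds.setAttr('sparkEmitter.maxDistance', 0.5)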

MATCHMOVING

What is Match moving?

In cinematography, match moving is a technique that allows the insertion of computer graphics into live-action footage with correct position, scale, orientation, and motion relative to the photographed objects in the shot. The term is used loosely to describe several different methods of extracting camera motion information from a motion picture. Sometimes referred to as motion tracking or camera solving, match moving is related to rotoscoping and photogrammetry.

2D vs. 3D
Match moving has two forms. Some compositing programs, such as Shake, Adobe After Effects, and Discreet Combustion, include two-dimensional motion tracking capabilities. Two-dimensional match moving only tracks features in two-dimensional space, without any concern for camera movement or distortion. It can be used to add motion blur or image stabilization effects to footage. This technique is sufficient to create realistic effects when the original footage does not include major changes in camera perspective.
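As a loose illustration of what 2D tracking involves, the sketch below follows corner features from frame to frame with OpenCV and turns their median shift into a simple stabilization offset. OpenCV (cv2), NumPy and the clip name 'plate.mov' are assumptions for the example, not tools named in the post.

import cv2
import numpy as np

cap = cv2.VideoCapture('plate.mov')
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Pick trackable corner features in the first frame.
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)

offsets = []  # per-frame 2D translation to apply for stabilization
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Follow the same features into the new frame (Lucas-Kanade optical flow).
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                     points, None)
    good_new = new_points[status.flatten() == 1]
    good_old = points[status.flatten() == 1]

    # The median 2D shift of the features approximates the frame-to-frame
    # motion; negating it gives a crude stabilization offset.
    shift = np.median((good_new - good_old).reshape(-1, 2), axis=0)
    offsets.append(-shift)

    prev_gray, points = gray, good_new.reshape(-1, 1, 2)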

Three-dimensional match moving tools make it possible to extrapolate three-dimensional information from two-dimensional photography. These tools allow users to derive camera movement and other relative motion from arbitrary footage. The tracking information can be transferred to computer graphics software and used to animate virtual cameras and simulated objects. A number of dedicated programs are capable of 3D match moving.
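As a hedged sketch of the 3D side, the function below recovers the camera's relative rotation and translation between two frames from matched 2D points, using OpenCV's essential-matrix routines. The point arrays and the intrinsic matrix K are assumed inputs (for example, from a 2D tracker like the one above), and the translation is only recovered up to scale.

import cv2
import numpy as np

def relative_camera_motion(pts1, pts2, K):
    """Estimate camera rotation R and unit-scale translation t between frames.

    pts1, pts2 : Nx2 float arrays of corresponding image points.
    K          : 3x3 camera intrinsic matrix (focal length, principal point).
    """
    # Robustly fit the essential matrix to the correspondences.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                      method=cv2.RANSAC, threshold=1.0)
    # Decompose it into the camera's relative rotation and translation.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t

A full match move repeats and refines this over many frames, but the principle is the same: camera motion is inferred purely from how tracked 2D points move.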

Automatic vs. interactive tracking
There are two methods by which motion information can be extracted from an image. Interactive tracking, sometimes referred to as "supervised tracking", relies on the user to follow features through a scene. Automatic tracking relies on computer algorithms to identify and track features through a shot. The tracked points' movements are then used to calculate a "solution". This solution is composed of all the camera's information, such as its motion, focal length, and lens distortion.
The advantage of automatic tracking is that the computer can create many points faster than a human can. A large number of points can be analyzed with statistics to determine the most reliable data. The disadvantage of automatic tracking is that, depending on the algorithm, the computer can be easily confused as it tracks objects through the scene. Automatic tracking methods are particularly ineffective in shots involving fast camera motion such as that seen with hand-held camera work and in shots with repetitive subject matter like small tiles or any sort of regular pattern where one area is not very distinct. This tracking method also suffers when a shot contains a large amount of motion blur, making the small details it needs harder to distinguish.
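One common way the most reliable data is selected in practice is a forward-backward consistency check: each point is tracked forward and then backward, and points that do not return close to where they started are discarded. The OpenCV sketch below shows the idea; it is an illustration, not a description of how any particular tracker works.

import cv2
import numpy as np

def reliable_tracks(prev_gray, gray, points, max_error=1.0):
    """Keep only automatically tracked points that pass a round-trip test."""
    # Track forward into the new frame, then backward into the old one.
    fwd, st_f, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    bwd, st_b, _ = cv2.calcOpticalFlowPyrLK(gray, prev_gray, fwd, None)

    # Distance between each original point and its round-trip position.
    fb_error = np.linalg.norm((points - bwd).reshape(-1, 2), axis=1)
    keep = (st_f.flatten() == 1) & (st_b.flatten() == 1) & (fb_error < max_error)
    return points[keep], fwd[keep]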
The advantage of interactive tracking is that a human user can follow features through an entire scene and will not be confused by features that are not rigid. A human user can also determine where features are in a shot that suffers from motion blur; it is extremely difficult for an automatic tracker to correctly find features with high amounts of motion blur. The disadvantage of interactive tracking is that the user will inevitably introduce small errors as they follow objects through the scene, which can lead to what is called "drift".


Professional-level motion tracking is usually achieved using a combination of interactive and automatic techniques. An artist can remove points that are clearly anomalous and use "tracking mattes" to block confusing information out of the automatic tracking process. Tracking mattes are also employed to cover areas of the shot which contain moving elements such as an actor or a spinning ceiling fan.