NNFlowVector comes bundled with a set of utility tools, collectively called “NNFlowVector Utils”. These tools use the vector output of NNFlowVector to help you with common compositing tasks. They are meant to encourage artists to use good-quality motion vectors for more than just adding motion blur, and to show the breadth of what is possible with them. The tools do not try to be a complete tool set for everything motion vectors and smart vectors can be used for, but rather a collection of utilities that come in handy quite often. These are the tools that together make up “NNFlowVector Utils”:
This tool can create a Tracker node with up to four tracked points from anywhere in the input material. It expects an image sequence as input that carries the “motion” layer. It helps if you also have the RGB layer, because then you can view the images while working with the tool. This is the standard output from NNFlowVector, i.e. if you pre-comp out the result of that node when it is set to generate “motion vectors”, it will output both the RGB channels of the material and the generated “motion” layer.
You then find a good frame where the points you want to track are visible. Enable as many points as you need and place them where necessary on the image. Then press the “bake and create tracker” button, which calculates the tracks for the specified frame range and finally creates a new Tracker node in the DAG, ready to use somewhere in your comp.
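Conceptually, tracking a point with motion vectors amounts to repeatedly sampling the vector under the point and advecting the point forward, frame by frame. A minimal numpy sketch of that idea, assuming (hypothetically) that the “motion” layer stores per-pixel (dx, dy) offsets from each frame to the next; the real layer layout and sampling are more sophisticated:

```python
import numpy as np

def track_point(point, motion_frames):
    """Advect a 2D point through a sequence of motion-vector fields.

    `motion_frames` is a list of (H, W, 2) arrays where each pixel holds
    the (dx, dy) offset from that frame to the next (an assumed layout).
    Returns the point's position at every frame.
    """
    x, y = point
    track = [(x, y)]
    for motion in motion_frames:
        h, w, _ = motion.shape
        # Sample the vector at the nearest pixel (real tools interpolate).
        xi = int(round(min(max(x, 0), w - 1)))
        yi = int(round(min(max(y, 0), h - 1)))
        dx, dy = motion[yi, xi]
        x, y = x + dx, y + dy
        track.append((x, y))
    return track
```

Baking a track then just means running this advection over the chosen frame range and writing the resulting positions into a Tracker node's keyframes.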
This tool is very similar to the “MotionVector_to_Tracker” tool mentioned above, except that it instead expects an input that carries all the smart vector layers. Otherwise the tool works exactly the same.
This tool can automatically animate Roto shapes and RotoPaint brush strokes to follow the movement of objects in plates, using the motion vectors from a NNFlowVector node. You can apply the animation either to all the individual vertices or as a global translation of the shapes/strokes. If you apply the animation to the individual vertices, the shapes deform to the local movement over time instead of just tracking the overall movement.
It works as follows:
First create a Roto node and draw the roto shape you need, or paint what is needed with the normal brushes in a RotoPaint node. You can keyframe a couple of important shapes over time to help the solve if you want. Create a “MotionVector_to_Roto” node and connect it to a pre-comped output from NNFlowVector that has motion vectors. With the node panels of both the “MotionVector_to_Roto” node and the “Roto”/“RotoPaint” node open, select the “Roto”/“RotoPaint” node in the DAG and also select the shapes/strokes you want the animation applied to inside the node. In the “MotionVector_to_Roto” parameters, use the solve buttons at the bottom to either solve one frame range between keyframes at a time, or press the “ALL” button to solve the full frame range specified in the node.
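The per-vertex mode described above boils down to offsetting each shape vertex by the motion vector sampled underneath it, once per frame step. A rough numpy sketch under the same assumed layer layout as before (per-pixel (dx, dy) offsets between two frames, nearest-pixel sampling):

```python
import numpy as np

def advect_vertices(vertices, motion):
    """Offset each (x, y) vertex by the motion vector sampled under it.

    `motion` is an (H, W, 2) array of per-pixel (dx, dy) offsets between
    two frames (an assumed layout). Global-translation mode would instead
    apply one shared offset, e.g. the average of the sampled vectors,
    to every vertex.
    """
    h, w, _ = motion.shape
    out = []
    for x, y in vertices:
        xi = int(round(min(max(x, 0), w - 1)))
        yi = int(round(min(max(y, 0), h - 1)))
        dx, dy = motion[yi, xi]
        out.append((x + dx, y + dy))
    return out
```

Because each vertex samples its own vector, vertices on differently moving parts of the image travel differently, which is what lets the shape deform to local movement.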
This tool can create a frame blend of neighbouring frames, using either 3 or 5 frames (3 means 1 frame on each side of the current frame, and 5 means 2 frames on each side). The powerful thing is that before the frames are blended, each one is individually distorted using the motion vectors to match the current frame, i.e. all frames are pre-warped to the current frame before they are used. You can choose between “average” and “median” as blending modes. One use case is to smooth out flickering material (average); another is to remove small, fast-moving objects like snowflakes (median).
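The warp-then-blend idea can be sketched in a few lines of numpy. This is an illustrative simplification, not the tool's implementation: it assumes the motion layer gives, for each pixel of the current frame, the (dx, dy) offset to its source position in the neighbouring frame, and it uses nearest-neighbour sampling for brevity:

```python
import numpy as np

def warp_to_current(frame, motion):
    """Backward-warp `frame` onto the current frame's pixel grid.

    `motion` is an (H, W, 2) array: for each current-frame pixel, the
    (dx, dy) offset to its source position in `frame` (assumed layout).
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sx = np.clip(np.round(xs + motion[..., 0]).astype(int), 0, w - 1)
    sy = np.clip(np.round(ys + motion[..., 1]).astype(int), 0, h - 1)
    return frame[sy, sx]

def blend_frames(current, neighbours, motions, mode="average"):
    """Pre-warp each neighbour to the current frame, then blend the stack."""
    stack = [current] + [warp_to_current(f, m) for f, m in zip(neighbours, motions)]
    stack = np.stack(stack)
    # "average" smooths flicker; "median" rejects outliers such as
    # small fast-moving objects that only appear in one frame.
    return np.median(stack, axis=0) if mode == "median" else stack.mean(axis=0)
```

Because the neighbours are aligned to the current frame first, the blend averages the same scene content rather than smearing moving edges, which is why this works even on fast-moving material.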
This tool uses the motion vectors to distort a neighbouring frame to the current one. You choose which frame to use with the “which” parameter, which has a valid range of -2 to 2. One use case for this node is distorting a neighbouring frame to the current one when doing cleanup of small passing objects such as wires.
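In spirit, this is the single-frame version of the pre-warp used by the frame-blend tool: pick the frame at offset “which” and warp it onto the current frame's grid. A hypothetical numpy sketch, again assuming the motion layer maps each current-frame pixel to its source position in the chosen neighbour:

```python
import numpy as np

def distorted_neighbour(frames, t, which, motion):
    """Return frame t+which warped onto frame t's pixel grid.

    `which` must be in -2..2, matching the tool's parameter range.
    `motion` is an (H, W, 2) array of (dx, dy) offsets from frame t's
    pixels to their source positions in frame t+which (assumed layout).
    """
    if not -2 <= which <= 2:
        raise ValueError("which must be in the range -2 to 2")
    src = frames[t + which]
    h, w = src.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sx = np.clip(np.round(xs + motion[..., 0]).astype(int), 0, w - 1)
    sy = np.clip(np.round(ys + motion[..., 1]).astype(int), 0, h - 1)
    return src[sy, sx]
```

For wire cleanup, the warped neighbour lines up with the current frame, so clean pixels from a moment earlier or later can be merged straight over the object being removed.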
This tool converts Nuke-compatible smart vectors to normal motion vectors. Smart vectors are a more complex type of motion vectors used by several advanced NukeX nodes, such as VectorDistort. They do however also contain normal motion vectors, which can be extracted so you can use them with tools that expect a motion vector input (for example, many of the NNFlowVector Utils nodes mentioned above). If you already have pre-comped smart vectors from the NNFlowVector node, it is much faster to convert them to motion vectors than to render a new motion vector output from NNFlowVector.