Boris FX SynthEyes 2024.1
  

Production Grade Matchmoving

Looking for a reliable and high-performing 3D camera solving application? Look no further than Boris FX SynthEyes! Designed to optimize camera, object, geometry, and planar tracking, stabilization, and motion capture, SynthEyes is packed with an extensive list of features, affordable pricing, and seamless exports to various applications.

Helping VFX artists since 2003, SynthEyes™ is a standalone application optimized for camera, object, geometry, planar tracking, and much more. Discover real tracking power and performance with complete control over tracking and solving, blazing-fast performance, a huge feature list, and exports to many applications — all at an affordable price!

  • Incredibly fast tracking and solving saves time, even on shots with thousands of frames.
  • Professional tracking means complete control over tracking and solving
  • Full lens distortion analysis, including radial-4th and anamorphic-6th with corresponding distortion nodes in major compositing applications
  • Exports to an extensive list of industry-leading 3D and compositing applications

Experience the unrivaled power and performance of SynthEyes today and take your VFX projects to new heights!

SynthEyes’ standalone 3D tracking application allows VFX artists to perform complex 3D set extensions, place CG characters into scenes, and animate them using motion capture. Over its twenty-year history, SynthEyes has been used on blockbusters, including Black Panther, The Guardians of the Galaxy series, The Curious Case of Benjamin Button, Pan’s Labyrinth, and King Kong, as well as beloved episodics such as Foundation, Stranger Things, Game of Thrones, Fringe, and Lost.

SynthEyes has been helping VFX artists since 2003 in more than 90 countries. Whether you’re starting out, or need to step up to real tracking power and performance, put SynthEyes to work today!

What’s new in SynthEyes 2024.1

Introducing the new “SynthEyes Advanced Lens Distortion” plugin for After Effects.

  • Directly supports all SynthEyes lens models, animated distortion, and off-center lenses.
  • Export 3D meshes and textures directly to After Effects with the AE version selector set to “Beta after 2024.1.”
  • Updated plugins for compatibility with Apple M1/M2/M3 processors and Multi-Frame Rendering on Windows and macOS.
  • Automatically reduces the number of exported tracker layers based on a user-defined limit.
  • Supports solver-side distortion, though not identical to Fusion and Nuke’s zero-pass workflow due to AE limitations.
  • Properly supports shots with non-square pixels without precomps.

 

What can SynthEyes help me do?

You can use SynthEyes to help:

  • Insert animated creatures or vehicles
  • Stabilize shaky conventional or 360° VR shots
  • Extend or fix a set
  • Add virtual sets to green-screen shoots
  • Replace signs or insert monitor images
  • Remove unwanted objects from shots
  • Produce 360° virtual reality or 3D stereoscopic films
  • Create architectural previews
  • Reconstruct accidents or crashes
  • Do product placements after the shoot
  • Move imagery from one shot to another
  • Add 3D cybernetic implants, cosmetic effects, or injuries to actors
  • Produce panoramic backdrops or clean plates
  • Build textured 3-D meshes from images
  • Add 3-D particle effects
  • Capture body motion to drive computer-generated characters

And those are just the more common uses; we’re sure you can think of more.

What are its features?

How much time do you have? For a longer list, click the Details tab up top. Here are a few to start.

  • SynthEyes offers 3-D tracking, set reconstruction, stabilization, and motion capture.
  • It handles camera tracking, 2- and 3-D planar tracking, object tracking with or without a reference mesh, geometry tracking, geometric hierarchy tracking, secondary tracking, camera+object tracking, fiducial tags, survey shots, multiple-shot tracking, tripod (nodal, 2.5-D) tracking, mixed tripod and translating shots, stereoscopic shots, nodal stereoscopic shots, zooming shots, 360° VR shots, lens distortion, light solving.
  • A keyer simplifies and speeds tracking for green-screen shots.
  • The image preprocessor helps remove grain, compression artifacts, off-centering, or varying lighting; improve low-contrast shots; or convert to and from 360° VR shots.
  • Meshes can be built from tracking data, and their textures extracted from the image sequence, producing higher resolution and lower noise than any individual image.
  • SynthEyes offers complete control over the tracking process for challenging shots, including an efficient workflow for supervised trackers, combined automated/supervised tracking, offset tracking, incremental solving, rolling-shutter compensation, a hard and soft path locking system, distance constraints for low-perspective shots, and cross-camera constraints for stereo.
  • A solver phase system lets you set up complex solving strategies with a visual node-based approach (not in Intro version).
  • You can set up a coordinate system with tracker constraints, camera constraints, an automated ground-plane-finding tool, by aligning to a mesh, a line-based single-frame alignment system, manually, or with some cool phase techniques.
  • The ViewShift system allows you to do object removals, combine split takes, generate animated texture maps, and more.

Eyes starting to glaze over at all the features? Don’t worry, there’s a big green AUTO button too.

What can SynthEyes talk to?

SynthEyes is a tracking app; you’ll use the other apps you already know to generate the pretty pictures. SynthEyes exports to about 25 different 2-D and 3-D programs. A revolutionary Instructible Assistant, Synthia™, helps you work faster and better, from typed or even spoken natural language directions. The Sizzle scripting language lets you customize the standard exports, or add your own imports, exports, or tools. You can customize toolbars, color scheme, keyboard mapping, and viewport configurations too. Advanced customers can use the SyPy Python API/SDK.

 

 

 

Features

Production Grade Tracking And Solving

SynthEyes features an extensive supervised tracking feature set with high-performance automatic tracking, 3D planar tracking, AprilTags, cleanup, and add-tracker tools.

  • SynthEyes offers advanced tracking capabilities, including automatic tracking, 3D planar tracking, AprilTags, and tools to clean up and add trackers
  • Track both cameras and moving objects in your footage
  • Use a Geometric Hierarchy tracking system to track objects with complex relationships
  • Precise 3D and 2D Planar Tracking with mask options
  • Benefit from neural-based tracking methods and handle multiple supervised trackers simultaneously
  • Tools for identifying and managing problematic trackers
  • Support for tracking in stereoscopic 3D
  • Calibrate various lens types for accurate tracking
  • Ideal for 360° VR shots and motion capture
  • Quickly solve complex, long shots
  • Support for different lens models, including radial-4th and anamorphic-6th
  • Control camera and object paths for precise solves
  • Achieve stable results with tripod-mounted cameras
  • Animated lens distortion parameters to track realistically
  • Flexible solver settings to meet your specific needs
  • Post-solve tools to clean up and fine-tune your tracking results
  • Solve and stabilize 360° VR footage with ease
  • Match to 3D set models such as lidar scans

Scene Exports

SynthEyes can export the tracked scene to a large number of other packages and formats, supporting formats such as USD, FBX, OBJ, and Alembic.

  • Export native project data to 3D packages including Maya, 3ds Max, Blender, Lightwave, and Cinema 4D
  • Create project scenes in compositing applications, including After Effects, Nuke, Fusion/Resolve, Flame, and Houdini
  • Animated stabilization and lens distortion can be exported as distortion maps (for packages that support them)
  • More advanced pipelines: Take advantage of flexible ASCII text exporters for tracker 2D paths and 3D positions, camera/object paths, and animated illumination (a toy parser is sketched after this list)
  • Included exporters for
    • After Effects (2-D, 3-D incl. C4D layers, 360VR, and 360VR stabilization exports, with plugin effects for AE CC/CS6)
    • Alembic ABC (1.5+ Ogawa format) (Consider USD instead?)
    • AutoCAD DXF
    • Bentley Microstation
    • Biovision BVH (import also)
    • Blender (ongoing versions including 3.0+, full-featured with 360VR)
    • Brainstorm 3D (for InfinitySet etc)
    • Carrara (now in the ‘Older’ submenu)
    • Cinema 4D
    • COLLADA
    • Combustion (2- & 3-D. Now in the ‘Older’ submenu)
    • Electric Image
    • 3D Equalizer. Read about the SynthEyes to 3D Equalizer exporter.
    • Filmbox FBX, including deforming rigs and vertex caches (import too). Good for Maya, 3dsmax, Unreal, …
    • FLAIR motion control cameras
    • Flame (2- & 3-D)
    • Fusion (full-featured 3-D with 360VR and 360VR stabilization; 2-D paste of trackers from SynthEyes to Fusion; 2-D planar corner pin). Can usually export Fusion’s built-in lens distortion node with 0-pass (overscan rendering), 1-pass, or 2-pass lens workflow.
    • Hash Animation:Master
    • Hitfilm (powerful full 3D export! + 360VR stabilization)
    • Houdini (CMD and USD)
    • Inferno (2- & 3-D)
    • Lidar XYZ
    • Lightwave (including 360VR)
    • MAXscript (3ds max, 3D Studio VIZ) (Filmbox or USD preferred! Now in the ‘Older’ submenu.)
    • Maya ASCII. Updated .ma exporter now available in the Customer Support area. Filmbox (FBX), USD, or Alembic may also be suitable. Old Maya ASCII exporter is now in the ‘Older’ submenu.
    • Metashape/PhotoScan
    • Mistika
    • MDD animated mesh vertices
    • Modo (including 360VR)
    • Motion (2- & 3-D)
    • Nuke (3-D & 2-D planar corner pin). Can usually export Nuke’s built-in lens distortion node with 0-pass (overscan rendering), 1-pass, or 2-pass lens workflow.
    • OBJ meshes
    • Particle Illusion
    • PC2 Point Caches
    • Poser
    • Realsoft 3D
    • Resolve (Fusion, see above). (See Andrew Hazelden’s tutorial Using SynthEyes with Resolve 15).
    • Shake (2- & 3-D, in the ‘Older’ submenu)
    • Smoke2008 (2- & 3-D)
    • Softimage dotXSI. Now in the ‘Older’ submenu
    • toxik (pre2009) (in the ‘Older’ submenu)
    • trueSpace (in the ‘Older’ submenu)
    • Universal Scene Description aka USD. Full-featured! (For many apps including Houdini, Maya, Unity3D, Unreal…)
    • Vue 5 & 6 Infinite (in the ‘Older’ submenu)
  • Features vary from exporter to exporter. See the demo version.
  • Certain older exporters are found in the File/Export/Older submenu; look there if you can’t find a listed export.
  • If you are keen on a certain export and are willing to help, see what is needed to develop new exporters.
  • Product names listed here are trademarks of their various owners.
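
For pipeline work, the ASCII text exporters mentioned above write plain text that is easy to consume in scripts. As a purely hypothetical illustration (the real export layouts are configurable and vary by exporter), the sketch below parses a per-frame camera path, assuming a whitespace-delimited "frame tx ty tz pan tilt roll fov" column layout:

    # Hypothetical parser for a plain-text camera-path export.
    # The column layout assumed here (frame tx ty tz pan tilt roll fov) is for
    # illustration only; adapt it to whatever your exporter actually writes.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class CameraSample:
        frame: int
        position: Tuple[float, float, float]   # tx, ty, tz in scene units
        rotation: Tuple[float, float, float]   # pan, tilt, roll in degrees
        fov: float                              # horizontal field of view in degrees

    def read_camera_path(path: str) -> List[CameraSample]:
        samples = []
        with open(path) as fh:
            for line in fh:
                parts = line.split()
                if len(parts) < 8 or parts[0].startswith("#"):   # skip blanks, comments, short lines
                    continue
                f, tx, ty, tz, pan, tilt, roll, fov = parts[:8]
                samples.append(CameraSample(int(f),
                                            (float(tx), float(ty), float(tz)),
                                            (float(pan), float(tilt), float(roll)),
                                            float(fov)))
        return samples

    if __name__ == "__main__":
        for s in read_camera_path("camera_path.txt")[:5]:   # hypothetical export file
            print(s.frame, s.position, s.fov)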

Coordinate System Setup

SynthEyes offers powerful tools to position, align, and size the entire scene in a 3D environment, so you can ensure the world scale and coordinates of your scene match the rest of your workflow. (A toy realignment sketch follows the list below.)

  • Auto-placement tool analyzes scene structure to create a good initial scene coordinate system
  • Quick precision setup wizard — click 3 trackers, then realign
  • Easy manual complete-scene re-positioning and scaling
  • Line-based alignment system for nodal-tripod and lock-off shots, or use the mesh pinning tool in its camera setting
  • Flexible coordinate system alignment controls for complex situations
  • Constrained Points (Axis Control) view to quickly examine coordinate system settings of all trackers at once with a right-click menu for modifications
  • Coordinate placement methods based on camera positioning
  • Select the desired solution using coordinate polarity controls when several solutions are possible
  • Exact constraints for survey data, including infrequent GPS waypoints
  • Distance constraints to utilize on-set sizing measurements
  • Coordinate system setup before or after solving
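
As a toy illustration of what a coordinate-system setup ultimately does (not SynthEyes' actual implementation), the sketch below re-positions and re-scales a set of solved tracker points so one chosen tracker sits at the origin and a second one establishes world scale; it assumes numpy:

    # Toy whole-scene realignment: translate so one tracker lands at the origin,
    # then scale so the distance to a second tracker matches a measured value.
    import numpy as np

    def realign(points, origin_idx, unit_idx, target_distance=1.0):
        """Translate points[origin_idx] to the origin, then scale so the distance
        to points[unit_idx] equals target_distance (e.g. an on-set measurement)."""
        pts = np.asarray(points, dtype=float)
        pts = pts - pts[origin_idx]                   # move the chosen tracker to the origin
        d = np.linalg.norm(pts[unit_idx])
        if d > 0:
            pts *= target_distance / d                # set the world scale
        return pts

    trackers = [(2.0, 1.0, 5.0), (4.0, 1.0, 5.0), (3.0, 2.0, 6.0)]
    print(realign(trackers, origin_idx=0, unit_idx=1, target_distance=2.5))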

Lens Calibration

SynthEyes has extensive tools not only for calculating distortion during solving, but also for lens calibration from lens grids. Rectify Lens Grid provides “just fix it” unmodeled lens distortion correction, especially for complex lens types, with tilt detection and correction if grid spacing and camera-to-grid measurements are available. The Lens Master Calibration system handles linear, inverse linear, and anamorphic lenses, plus four fisheye lens variants. (The radial distortion model itself is sketched after the list below.)

  • Rectify Lens Grid for distortion correction
  • Calibration for various lens types
  • Calibration methods including random-dot calibration
  • Lens distortion presets for different workflows
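
The “radial-4th” model referred to throughout this page is a Brown-Conrady-style polynomial in the squared radius. The minimal sketch below applies such a model to normalized, centered image coordinates; the k1/k2 values are illustrative only, and SynthEyes' exact parameterization and sign conventions may differ:

    # Minimal sketch of a 4th-order radial (Brown-Conrady-style) distortion model,
    # applied to normalized, centered image coordinates. k1/k2 are illustrative.
    def distort_radial4(x, y, k1, k2):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2   # 1 + k1*r^2 + k2*r^4
        return x * factor, y * factor

    # Example: a point near the image corner with mild barrel distortion (k1 < 0)
    print(distort_radial4(0.8, 0.45, k1=-0.05, k2=0.002))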

Geometric Hierarchy Tracking

A spectacularly powerful and flexible toolset for 3D tracking multi-level constrained hierarchies of moving parts, directly tracking supplied meshes, or using normal (supervised) trackers. (A toy forward-kinematics sketch follows the list below.)

  • Toolset for tracking moving parts in 3D
  • Tracking of kinematic chains and secondary animation
  • Integration with motion capture for BVH export
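
Conceptually, a multi-level hierarchy of moving parts is evaluated by composing each node's local transform with its parent's world transform. The sketch below is a toy forward-kinematics pass over a three-joint chain using numpy 4x4 matrices; it is illustrative only, not SynthEyes' implementation:

    # Toy forward kinematics: a node's world transform is its parent's world
    # transform times its own local transform. Parents must precede children.
    import numpy as np

    def local_transform(tx=0.0, ty=0.0, tz=0.0, rz_deg=0.0):
        c, s = np.cos(np.radians(rz_deg)), np.sin(np.radians(rz_deg))
        m = np.eye(4)
        m[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]   # rotation about Z
        m[:3, 3] = [tx, ty, tz]                          # translation
        return m

    def world_transforms(nodes):
        """nodes: list of (name, parent_index_or_None, local 4x4 matrix)."""
        worlds = []
        for name, parent, local in nodes:
            world = local if parent is None else worlds[parent][1] @ local
            worlds.append((name, world))
        return worlds

    hierarchy = [
        ("base", None, local_transform(rz_deg=30)),
        ("arm",  0,    local_transform(tx=2.0)),
        ("tip",  1,    local_transform(tx=1.5, rz_deg=-15)),
    ]
    for name, w in world_transforms(hierarchy):
        print(name, np.round(w[:3, 3], 3))   # world-space position of each joint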

Stabilization

SynthEyes features integrated stabilization for normal and 360VR shots driven by 3D solves, including creating a “physical” rig for export. (The basic idea is sketched after the list below.)

  • Stabilization capabilities are built into the image preprocessor including normal (with lens distortion) and 360VR shots
  • Stabilization driven by full 3D solves, or approximate 2D tracking data
  • Manual adjustments as needed
  • Automatic “zoom” determination for normal shots (multiple modes)
  • Stabilization rig creator for non-360VR shots creates a “physical” equivalent for the stabilization, so downstream 3D packages can perform the actual stabilization
  • Export of 360VR stabilization data directly to supported packages including After Effects, Fusion, and Hitfilm
  • Setup of 360VR stabilization as an animated textured sphere for alternative export
  • Integrated resampling to other output resolutions and aspects
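
The underlying idea of 2D stabilization is simple: smooth the measured motion and offset each frame by the difference, so only the low-frequency (intended) motion remains. The toy sketch below does this with a moving average and numpy; SynthEyes' stabilizer is driven by the full 3D solve and lens model and is considerably more capable:

    # Toy 2D stabilization: smooth the tracked motion, then offset each frame by
    # (smoothed - raw) so only the low-frequency camera move remains.
    import numpy as np

    def smooth(path, window=9):
        """Centered moving average along the frame axis; path is (frames, 2)."""
        kernel = np.ones(window) / window
        pad = window // 2
        padded = np.pad(path, ((pad, pad), (0, 0)), mode="edge")
        return np.stack([np.convolve(padded[:, i], kernel, mode="valid")
                         for i in range(path.shape[1])], axis=1)

    raw = np.cumsum(np.random.randn(100, 2), axis=0)   # a jittery 2D track
    offsets = smooth(raw) - raw                        # per-frame stabilizing shift
    print(offsets[:3])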

Object Separation

SynthEyes has user-friendly tools to separate elements and exclude them from tracking.

  • Automatically select trackers only within a green screen region
  • Set up regions for each moving object or garbage mattes for actors, for example
  • Keyframe-animated splines to define regions
  • Quick square and circle spline setup
  • Import spline control points from tracker paths for rapid setup
  • Animated enables for splines
  • Use a rotoscoped alpha-channel

ViewShift System

Use ViewShift for complex object removals, combining split takes, generating animated texture maps, and more!

  • Camera mapping and rendering system
  • Uses matchmoved 3D camera paths and set models
  • Removes objects against various reflector meshes (not just planes)
  • Quick setup tools for working from a clean plate
  • Illumination compensation for better matching
  • Multiple timing modes
  • Control via animated splines or mesh outlines
  • Various output modes for previews and compositing
  • Matte generation with softening
  • Animated texture maps from meshes
  • Multiple independent ViewShift outputs

Meshes In/Out

SynthEyes can read and write a variety of mesh and vertex cache formats, supporting most commonly used 3D / compositing applications used in the industry.

  • Import and export industry standard 3D mesh formats: OBJ, C4D, COLLADA (DAE), DXF, PTS, LWO and XYZ Lidar, SynthEyes SBM, 3DS (import only). (A minimal OBJ reader is sketched after this list.)
  • Custom mesh options for quick insertion into SynthEyes scenes, on par with the built-in cubes, spheres, etc.
  • Vertex caches in/out: Alembic .abc, Maya .mcx, Lightwave .mdd, 3dsmax .pc2
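
OBJ is the simplest of the mesh formats listed above. The minimal reader below collects vertex positions and triangular faces and ignores normals, UVs, and materials; real OBJ files may need more handling (quads, negative indices, groups):

    # Minimal OBJ reader: vertex positions and triangular faces only.
    def load_obj(path):
        vertices, faces = [], []
        with open(path) as fh:
            for line in fh:
                parts = line.split()
                if not parts:
                    continue
                if parts[0] == "v":                     # vertex position
                    vertices.append(tuple(float(v) for v in parts[1:4]))
                elif parts[0] == "f":                   # face: keep vertex indices only
                    idx = [int(p.split("/")[0]) - 1 for p in parts[1:]]   # OBJ is 1-based
                    if len(idx) == 3:
                        faces.append(tuple(idx))
        return vertices, faces

    if __name__ == "__main__":
        verts, tris = load_obj("set_geometry.obj")      # hypothetical file name
        print(len(verts), "vertices,", len(tris), "triangles")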

Scripting

Feel the awesome next-generation power and flexibility of typed or spoken natural language control with the Synthia instructible assistant. Automate frequent tasks using Synthia, Sizzle, or Python scripting. (A stand-in automation sketch follows the list below.)

  • Use native Sizzle scripting language or Python API/SDK
  • The Synthia natural language intelligent assistant turns plain language into powerful script commands
  • Use scripts to modify and extend SynthEyes functionality, including access to scene and image preprocessing
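
As a flavor of the kind of batch task scripting is used for, here is a stand-in sketch of “prune trackers whose solve error is too high.” The data model below is invented for illustration and is not the real Sizzle or SyPy API; consult the SyPy documentation for the actual classes and methods:

    # Stand-in data model, NOT the real SyPy API; it only shows the shape of a
    # "prune high-error trackers" batch task you might automate with scripting.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Tracker:
        name: str
        error: float    # hypothetical per-tracker RMS reprojection error, in pixels

    @dataclass
    class Scene:
        trackers: List[Tracker] = field(default_factory=list)

    def prune_bad_trackers(scene: Scene, max_error: float = 1.5) -> int:
        """Remove trackers whose error exceeds max_error; return how many were cut."""
        before = len(scene.trackers)
        scene.trackers = [t for t in scene.trackers if t.error <= max_error]
        return before - len(scene.trackers)

    scene = Scene([Tracker("A01", 0.4), Tracker("A02", 2.7), Tracker("A03", 0.9)])
    print(prune_bad_trackers(scene), "tracker(s) pruned")   # -> 1 tracker(s) pruned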

Types of Shots

SynthEyes handles a wide variety of shots: moving objects, nodal, zoom, stereo, single frame, motion capture, surveys…

  • Camera tracking
  • Moving-object tracking (of geometry or via trackers, singly or in hierarchies, with or without a known mesh)
  • Stereoscopic tracking (stereo features not available in Intro version)
  • Fully automatic match-moving for straightforward shots
  • Rapid supervised tracking for difficult tracks.
  • Fixed (known or unknown) or zooming lens field of view
  • Manual or automated lens distortion calculation
  • Nodal solving: compute camera pan/tilt and tracker directions when camera was mounted on a tripod (ie computes a ~2½-D track because 3-D is not possible)
  • Nodal shots down to a single tracker, in whole or in part, with automatic roll lock.
  • Mixed nodal and translating shots
  • Nodal “all-far” stereoscopic shots
  • 360° Virtual Reality shots (spherical equirectangular)
  • Multiple cameras and moving objects with simultaneous solving, including mixes such as tripod-mounted camera(2½-D) with moving object(3-D), or a video shot plus digital camera reference stills
  • Survey shots: multiple stills taken from different locations, possibly with different image formats and zoom.
  • Motion Capture — for faces or whole bodies, using multiple cameras. Single-camera mixed rigid- and deforming- facial mocap. Conversion to joint angles for BVH export.

Image Preprocessor

Preprocess the images to aid tracking, fix colors, correct lens distortion, or reduce RAM consumption.

  • Flexible image preprocessing engine for easier tracking and RAM storage of large film-resolution shots (Intro version: RAM cache limited to 1.25 GB).
  • Disk Caching complements RAM caching (Pro only).
  • Image stabilization engine, directable by user, integrated with 3-D tracking.
  • Image format re-sampling and aspect changes.
  • Kurves spline-based color correction (Y, R, G, B, S).
  • Traditional hue/saturation and animated level adjustments.
  • 1D and 3D Color LUTs: .3dl, .cube, .csp, .ilut, .olut, .1dlut
  • Color adjustment settings can be written as a .cube 3D LUT for reuse in SynthEyes or other applications.
  • R/G/B channel selection and inversion.
  • Blur to remove film grain and compression artifacts.
  • Additional Noise Reduction algorithm.
  • Additional separate luma and chroma blurs for DNG noise reduction.
  • High-pass filter for varying lighting, such as strobes and explosions.
  • Lens distortion removal—animate for zooms with distortion.
  • Classic (quadratic/cubic/quartic), advanced (radial-4, anamorphic-6, fisheye), image-based, or table-based distortion models.
  • Import and export lens distortion as color-coded image maps for standardized interchange with other applications (a toy map-application sketch follows this list).
  • Optic axis re-centering.
  • DeRez to maximize RAM cache.
  • Mirror imaging for mirror-based stereo rigs.
  • Create and restore image preprocessor presets; presets can affect some or all image preprocessor controls.
  • Save the processed image sequence to disk.
  • For QC checks of 2-pass lens distortion workflows, render meshes with distortion matching the original footage, overlaid on the raw imagery or over black for external compositing.
  • Reapply lens distortion and cropping for total lens distortion workflow.
  • Conversion back and forth between normal perspective and 360° spherical virtual reality shots.
  • Selectable resampling filter.
  • 8-bit, 16-bit fixed, 32-bit float processing available, storage in those formats plus 16-bit float.
  • Separate alpha image sequences can be attached.
  • Use for geometric conform of stereo shots.
  • PrepSet Manager so configurations can be saved and reused from shot to shot or project.
  • Animated Region of Interest to reduce RAM on object-tracking shots.
  • Multi-threaded image fetch for high performance.
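
A color-coded lens distortion map stores, in each output pixel, the normalized source coordinate to sample (the idea behind the “ST maps” commonly used for interchange in compositing). The toy warp below applies such a map with nearest-neighbor sampling using numpy; the exact encoding SynthEyes writes may differ:

    # Toy application of a color-coded distortion map: red/green of each output
    # pixel hold the normalized source coordinate to sample (nearest neighbor).
    import numpy as np

    def apply_st_map(image, st_map):
        """image: (H, W, C) float array. st_map: (H, W, 2) with values in [0, 1]."""
        w_idx = (st_map[..., 0] * (image.shape[1] - 1)).round().astype(int)
        h_idx = (st_map[..., 1] * (image.shape[0] - 1)).round().astype(int)
        xs = np.clip(w_idx, 0, image.shape[1] - 1)
        ys = np.clip(h_idx, 0, image.shape[0] - 1)
        return image[ys, xs]

    # An identity map leaves the image unchanged; a real map encodes the distortion.
    h, w = 4, 6
    img = np.random.rand(h, w, 3)
    u, v = np.meshgrid(np.linspace(0, 1, w), np.linspace(0, 1, h))
    identity = np.stack([u, v], axis=-1)
    print(np.allclose(apply_st_map(img, identity), img))    # -> True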

Tracking

Extensive supervised tracking feature set, high-performance automatic tracking, 3-D planar tracking, AprilTags, cleanup and add-tracker tools… (A toy pattern-match sketch follows the list below.)

  • Geometric Hierarchy tracking system (described in its own tab).
  • 3-D and 2-D planar tracking subsystem, featuring auto-mask, roto-masking, alpha-masking, in-plane masks, layer ordering.
  • Corner detector for man-made scenes emphasizes non-local corners, as image pixels may be unhelpful.
  • Neural automatic and supervised tracking methods, for general use, crossed markers, and checkerboards.
  • Lighting-invariant supervised tracking mode.
  • SimulTrack window for simultaneous atemporal monitoring of one or more supervised trackers
  • High-performance multi-threaded autotrack
  • Built-in support for locating and identifying the QR-code-like AprilTags fiducial patterns for multi-shot set alignments and some object-tracking applications.
  • Stereoscopic autotrack with automated correspondence
  • Rapid supervised stereo-pair creation
  • “Find erratic trackers” tool to help identify trackers on actors, vehicles, reflections, false features, etc, before starting the solve
  • Mix supervised and automatic tracking on the same shot
  • Match, white spot, black spot, symmetric, planar trackers
  • Tracker size, aspect, and search size can be animated
  • Offset trackers, including short-term temporary offsets, long-term drift-correcting offsets, or full-length offsets for detailed modeling.
  • Hand-animation mode with splined paths
  • Bidirectional tracking
  • Automatic periodic keying
  • Smoothing at keys
  • Reference crosshairs that can be animated per tracker per frame.
  • Mini-view shows tracker figure of merit while tracking.
  • “Radar” display to quickly identify problematic trackers or frames
  • Tracker cleanup wizard to locate problematic trackers
  • Real-time error calculation and position calculation after initial 3-D solve
  • Special tracking modes for hand-held shots
  • Nudge tool for hand-place tracker seeds
  • Option to re-track automatic trackers with supervised algorithms to increase accuracy
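
At its core, a supervised pattern tracker searches a region of the new frame for the patch that best matches a reference pattern. The toy sketch below does an exhaustive normalized-cross-correlation search with numpy; production trackers add subpixel refinement, lighting invariance, planar warping, and far better performance:

    # Toy supervised-tracker core: slide a reference patch over a search area and
    # keep the offset with the highest normalized cross-correlation (NCC).
    import numpy as np

    def ncc(a, b):
        a = a - a.mean(); b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom > 0 else 0.0

    def match(search, patch):
        ph, pw = patch.shape
        best, best_xy = -2.0, (0, 0)
        for y in range(search.shape[0] - ph + 1):
            for x in range(search.shape[1] - pw + 1):
                score = ncc(search[y:y + ph, x:x + pw], patch)
                if score > best:
                    best, best_xy = score, (x, y)
        return best_xy, best

    rng = np.random.default_rng(1)
    frame = rng.random((40, 40))
    patch = frame[12:20, 25:33].copy()        # the reference pattern from a key frame
    print(match(frame, patch))                # -> ((25, 12), ~1.0)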

Solving

The powerful and flexible solver includes constraints, path locks, camera/camera stereo locks, lens distortion, zooms, zero-weighted trackers, phases for multistage solving, a light solver…

  • High performance to accommodate shots with many thousands of frames and trackers (on suitable computers)
  • Fixed or zooming lenses with no requirement for on-set information
  • Multiple advanced lens models, including Radial-4, Anamorphic-6 (pro version), and fisheye types (cropped fisheye lenses are typical for action cams, drones, and surveillance cameras).
  • Native 360VR solving — see the 360 VR section.
  • Can solve for the rolling-shutter value, or compensate for a known one, to produce an idealized solve for a non-rolling camera.
  • Per-axis animated soft or hard locks for cameras or moving objects
  • Animated soft or hard field of view (FOV, ie zoom) locking
  • Animated lens distortion parameters, especially keyed-frame animation with linear interpolation, for generating smooth solves without distortion-induced jitter.
  • Anamorphic distance solving, to address differing horizontal and vertical entrance pupil (nodal point) locations, which can’t be described with a traditional 2-D distortion.
  • Flexible camera/camera locking controls for stereo—work with varying IOD and vergence
  • “Overall distance” constraint for camera/origin or object/camera distance allows smooth control over this jittery direction, especially for low-perspective object tracks.
  • Solver controls to emphasize rapidly stopping the solve for operator intervention and repair (or for continuing as best possible)
  • Tracker monitoring during solving for increased robustness
  • Object and tracker weighting
  • Zero-weighted trackers: 3-D positions computed without affecting the camera path, used for instant position calculation during tracking; a stereo version is also available. (A triangulation sketch follows this list.)
  • Zero-weighted frames: marginal frames (for example with high blur) can be prevented from compromising the rest of the solve
  • Skip-frame track for auto-tracking and solving
  • Frame-decimated solves
  • Controlled post-solve camera/object path filtering
  • Post-solve tracker clean-up tool, can be run automatically
  • Refine previously-computed solutions, for rapid tweaking
  • Rapidly create additional trackers with specified accuracies after initial solve
  • Coalesce co-located trackers: trackers on the same image feature, but typically during different sections of the shot
  • Light solver to determine light position or direction from ray or planar geometry, or from a reflection from a mesh
  • Unsolve tool, to remove compromised portions of a long solve before restarting it
  • PHASES for controllable and repeatable node-based solver strategy setup, documentation, and re-use. (Phases not supported in Intro version)
  • Phase-based methods to set up a coordinate system, including Set Horizon or Set Heading, Linearize to any straight line for dolly shots, and easy scene scaling from camera height or distance to tracker.
  • Novel phase-based techniques for setting relative scale of shots with moving cameras AND moving objects.
  • Phase view and panel, graph editor interface, and Sizzle scripting interface. Phase configuration copy/paste, open/save, and a user phase library.
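
A zero-weighted tracker's 3D position comes purely from its 2D observations and the already-solved cameras: each observation defines a ray, and the point is the least-squares intersection of those rays. A minimal numpy sketch of that triangulation step follows; the real solver uses every frame plus weighting and outlier handling:

    # Least-squares intersection of view rays: the 3D point minimizing its
    # perpendicular distance to every ray (one ray per solved camera/frame).
    import numpy as np

    def triangulate(centers, directions):
        """centers: (N, 3) camera positions; directions: (N, 3) view rays."""
        A = np.zeros((3, 3)); b = np.zeros(3)
        for c, d in zip(np.asarray(centers, float), np.asarray(directions, float)):
            d = d / np.linalg.norm(d)
            M = np.eye(3) - np.outer(d, d)     # projector perpendicular to the ray
            A += M
            b += M @ c
        return np.linalg.solve(A, b)

    # Two cameras looking at the point (0, 0, 5) from (-1, 0, 0) and (1, 0, 0):
    target = np.array([0.0, 0.0, 5.0])
    cams = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
    rays = [target - c for c in cams]
    print(np.round(triangulate(cams, rays), 6))   # ≈ (0, 0, 5)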

360° Virtual Reality

SynthEyes can work with 360° spherical equirectangular images for tracking, stabilization, and display.

SynthEyes supports an entire workflow for stabilizing 360VR shots and inserting 3D objects into them. See the 12-part 2-hour 360VR tutorial and the follow-on when we added native 360VR solving.

  • Supports standard monocular 360°x180° spherical equirectangular images. (Smaller formats such as 360×120 can be padded to the nominal size.) Read about stereo 360VR.
  • Can use image maps for easy fisheye to 360VR conversion. Generate maps via calibration or from detailed manufacturer data (esp. Entaniya).
  • Workflow for simple VR stabilization.
  • Direct omnidirectional native 360VR solving.
  • Or, convert 360VR shots to linear for tracking (the pixel-to-direction mapping is sketched after this list).
  • Absolute world stabilization from a 3D solve, so the viewer sees an easy-to-understand consistent relationship between the virtual world and their real world.
  • Many ways to do horizon alignment, for example based on a 3D coordinate setup or a golden frame.
  • 360VR integrated into full 3D scene exports to After Effects, Blender, Fusion, Lightwave, and Modo.
  • Export numeric 360° VR stabilization information to After Effects; use an included SynthEyes effect plugin, or those built into AE CC2018 and later, to stabilize the images within After Effects (ie with no SynthEyes render).
  • Similar 360° VR stabilization exports for Fusion (including free) and Hitfilm. These do not require any extra plugins.
  • 360VR object tracking: independently-moving cars, boats, planes, etc visible in the camera image.
  • Zero-weighted trackers solved directly from 360 VR tracks.
  • Add Many, Tracker Cleanup, Coalesce in 3-D, Drop on Mesh, Error View, Texture Extraction, Tracker Radar all support 360VR.
  • Able to align the VR camera relative to the camera path if desired.
  • Generate additional perspective cameras that closely track inserted meshes, for fast and efficient rendering even by conventional non-360° rendering applications.
  • Able to convert those conventional rendered images into 360° VR.
  • Display of meshes correctly over 360VR images in the camera view.
  • Able to generate quick 360VR preview movies from the camera view.
  • Built-in spherical screen for viewing 360° VR shots in the perspective view; a spherical screen can also be created for exporting 360° VR shots for viewing in other applications.
  • Export 360° VR stabilization information as image distortion map sequences for use in other applications supporting distortion maps.
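
In an equirectangular frame, pixel position maps directly to longitude and latitude on the view sphere, which is what makes conversion to a linear (perspective) view straightforward. The small helper below converts a pixel to a 3D view direction; the axis conventions are illustrative only:

    # Equirectangular pixel -> 3D view direction (the core of a 360VR-to-linear
    # conversion). Axis and orientation conventions here are illustrative.
    import math

    def equirect_pixel_to_direction(px, py, width, height):
        lon = (px / width) * 2.0 * math.pi - math.pi      # -pi .. +pi across the image
        lat = math.pi / 2.0 - (py / height) * math.pi      # +pi/2 at top, -pi/2 at bottom
        x = math.cos(lat) * math.sin(lon)
        y = math.sin(lat)
        z = math.cos(lat) * math.cos(lon)
        return (x, y, z)

    # Center of a 4096x2048 frame looks straight ahead; the top row looks straight up.
    print(equirect_pixel_to_direction(2048, 1024, 4096, 2048))   # ~ (0, 0, 1)
    print(equirect_pixel_to_direction(2048, 0,    4096, 2048))   # ~ (0, 1, 0)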

Set Reconstruction

Amazing texture extraction system pulls low-noise textures from shots. Create meshes from computed 3-D tracker locations. Add cards for quick geometry.

  • Texture-extraction subsystem uses all (or specified) images in the sequence for high-quality results.
  • Multiple extraction modes suitable for different shot types.
  • Supports 360VR input, including quick conversion from fisheye to 360VR: wide field of view makes on-set photography coverage easier.
  • Intro version: extract textures to a maximum resolution of 2048×2048
  • Mesh-to-mesh blocking of textures
  • Garbage matting for textures
  • Automatic alpha-channel creation for extracted textures
  • Alpha channel painting
  • Add-Card with lasso and robust automatic plane-fitting
  • Light direction and position determination from ray or planar geometry, or even a reflection from a mesh
  • Light illumination level measurement for shots with rapidly-varying lighting
  • Mesh-building commands to turn a collection of trackers into an exportable 3-D mesh, to build shadow catchers and proxy geometry
  • Assemble-mesh mode for quick mesh building
  • “Far meshes” allow reasonable-size meshes to hold distant backdrops
  • “Punch-in” tool for adding trackers to meshes
  • Smudge and relax tools for adjusting meshes to better match imagery.
  • Linkage system between trackers and the vertices they create in meshes
  • Pinning tool for aligning meshes to single frames
  • Alignment tools for aligning meshes to trackers, or the entire scene to a mesh

Graph Editor

Examine tracker or camera position or velocity curves, or keyframe track views for coverage analysis. Always-visible error curve mini-view.

  • Graph and track editing for monitoring and cleanup
  • Works on trackers, cameras, moving objects, image preprocessor
  • Sort trackers alphabetically, by error, by history, or by lifespan
  • Advanced multi-curve display
  • Squished-track view and other features to learn about the entire shot at a glance
  • “Number zone” showing changeable numeric channel values for the current frame.
  • Quick glitch fixing and tracker splitting
  • Bake animation to keys, or decimate keys to spline
  • Embed in viewport, or floating; any number at once
  • Separate mini-view shows a color-coded error curve and numeric value for the selected tracker(s), the overall total error curve, or the tracking figure of merit for an individual tracker while tracking.

Perspective Window

Navigate the scene independent of the solved camera, place trackers on imported meshes or lidar data, produce 3D stereo views, create preview movies with antialiasing and motion blur.

  • Navigate the 3-D scene independent of the solved camera. Native or Maya-style navigation.
  • Projection screens with green-screen or alpha keying permit better previewing of scenes. Can lock to a specific location in 3D.
  • Projection screens can dynamically remove distortion or convert 360VR to linear.
  • Place seed points on imported meshes/lidar to aid tough object tracks.
  • Use mesh planes for clipping lidar data vertex selection.
  • Anaglyph, Interlaced, and Left-over-Right stereo display in perspective views (including for 3-D viewing of non-stereo shots).
  • Create preview movies from the tracked camera for real-time playback, including antialiasing and motion blur, plus timecode, timestamp, or frame number burn-in.
  • Create separate non-tracked render cameras to show the 3-D environment from a separate viewpoint, with or without hand camera animation.
  • Preview movies can be written as sequences with alpha for downstream compositing (useful for dynamic texturing effects).
  • Lit or unlit texture display with alpha and alpha only and front-projection onto meshes.
  • Shaded, outlined shaded, wireframe, occluded wireframe, or cartoon outline wireframe display modes.
  • Depth map outputs.
  • Shadow-casting onto the ground plane or constructed mesh(es). Can create a shadow map texture for a mesh that can be written to disk for use in other applications.
  • Object-manipulation handles.

Images In/Out

SynthEyes can read a variety of still image formats, movie files such as ARRI, BRAW, and RED. It can also write various image and movie formats for preview movies.

  • Image Input: ARRIRAW (.ARI or .MXF), AVI*, Blackmagic RAW .BRAW, BMP, Cineon, CinemaDNG/DNG**, DPX, DV*, JPEG, MPEG, QuickTime™ (Apple) MOV*, MP4*, OpenEXR, Photoshop PSD, PNG, Apple ProRes™ (.mov, all platforms), RED R3D (GPU-accelerated), SGI RGB, Targa, TIFF, WMV (Win). RED SDK 8.0.0 support includes Komodo, Helium, Dragon, HDR and RED ROCKET and OpenCL/CUDA decoding. Lidar .XYZ import including per-vertex RGB.
  • Apple ProRes™ movies may be read and written on Windows and Linux SynthEyes versions, in addition to macOS, with all codecs from ProRes Proxy to full ProRes 4444 XQ, including 16-bit/channel data and an alpha channel.
  • Reading and writing HEVC/H.265 requires Win 10 Creator’s Edition (1703), macOS 10.13 (High Sierra), or later.
  • Important Note for reading HEVC/H.265 .mp4 files on Windows 10: Microsoft has (re)moved HEVC/H.265 support for some users to an optional Windows Store download, due to licensing issues. Try HEVC Video Extensions from Device Manufacturer (Free) or HEVC Video Extensions ($0.99). See also the note below.
  • Image Output: ASF (Win), AVI (Win), BMP, Cineon, DPX, JPEG, MOV*, MP4 (Win), OpenEXR, PNG, Apple ProRes™ (.mov, all platforms), SGI, Targa, TIFF, WMV (Win).
  • Pro version: Any practical resolution, image or pixel aspect ratio (no resolution limits), ie SD, HD, 2K, 3K, 4K, 5K, 6K, 8K, ….
  • Intro version: maximum of 1920×1080 HD.
  • 8 bits/channel, 16 bits/channel, “half” (16 bit) floating, and 32-bit floating-point. Availability varies based on image type.
  • Metadata extraction and writing for select image types. Ability to use a tandem file for metadata.
  • Timecode extraction with burn-in to camera and perspective views, and on Save Sequence and Preview Movies.

*: The readability of any given “movie” file depends on the operating system and codec software installed on your particular machine and cannot be predicted in advance. Additional file extensions may also be readable. Older Windows versions permit only HD MP4 resolution. Please verify particular formats via the SynthEyes demo version. Due to operating-system limitations, AVI/MOV processing on 64-bit systems may need to go through a 32-bit server, slowing speeds and limiting output capabilities somewhat. The Linux version supports only image sequences, ARRIRAW (only!) .mxf, Blackmagic .braw, Apple ProRes™, and RED .r3d movies, not other movie-file formats such as AVI, MOV, or MP4.

**: While DNG specifies the RAW image, the final RGB image delivered to applications is not specified as part of the standard, so DNG images appear differently in different applications. SynthEyes delivers a direct image that is much less processed than in applications such as Photoshop. Hopefully future versions of DNG will standardize this.

Workflow/User Interface

Flexible user-interface features such as rooms, viewport layouts, free-floating configurable toolbars, keyboard maps, shot presets, coordinate axis directions, versioning auto-save. Notes for inter-artist communication.

  • Stand-alone 64-bit application for Windows, Mac, or Linux. See system recommendations.
  • Extensive multi-threaded/multi-core and AVX optimization throughout the application; tuned for efficient operation on machines with many cores.
  • Multi-shot batch processing for unattended track/solve or sequence writing
  • Scene summary generator and “Notes” for communication between tracking artists and supervisors
  • Multi-way Branching Undo capability allows you to try and get back to different alternatives
  • File-information dialog for comments and image and mesh version tracking
  • Merge tracked files
  • Scrubbable and real-time RAM Buffer Playback with 3-D CG object overlays
  • “Rooms”-based approach for quick and flexible view configuration
  • High-DPI per-monitor support on Windows, Retina mode support on macOS. Linux version supports a system-wide high-DPI adjustment.
  • User-alterable layout specification language (XML-based). Option for panel on the right, for example.
  • Handy free-floating toolbars and manager
  • Configurable keyboard mapping
  • Configurable viewport layouts
  • Configurable shot presets
  • Configurable coordinate system definition (Max, Maya, Lightwave)
  • All views and control panels can be floated, if desired.
  • Hierarchy view for examining, cloning, and changing parenting etc
  • Extensive preferences panel with many user-alterable parameters and colors. Able to save preferences to a file and reload on the same or different machine.
  • PrepSet Manager to save your standardized image preprocessor presets for reuse in other shots or projects.
  • File re-finding after projects have been moved.
  • Highly configurable file auto-save and auto-increment for versioning
  • Interface can be slimmed down as a RAM player, including user-configurable safe area display
  • Customer care center featuring automatic downloads of new versions, messaging system, and feature suggestion system
  • Translated hover-over tooltips for 25 different languages, to facilitate usage by non-native-English users.
  • Unicode UTF-8 support in files, tracker/mesh/object/etc names, user-written notes, etc. OpenGL-based windows (vary by OS) can display only latin/roman-based character sets, unless specialized font data is available (currently available only for Hiragana).

 

Minimum Windows Requirements

  • 64-bit Intel version of Windows 10 or 11.
  • Intel or AMD “x64” processor with AVX (“Sandy Bridge” or later or comparable AMD, no Atom-based processors.)
  • 2 GB RAM minimum. 4+ GB strongly recommended. 8-32 GB or more typical for pro, 360VR, and film users. Disk usage: 1.7 GB.

Minimum macOS Requirements

  • macOS Sonoma (14), Ventura (13), Monterey (12), Big Sur (11), or Catalina (10.15)
  • Apple Silicon (M1/M2/etc) or Intel processor supporting AVX (“Sandy Bridge” or later).
  • 2 GB minimum RAM. 4+ GB strongly recommended. 8-32 GB or more typical for pro, 360VR, and film users. Disk usage: 410 MB.

Minimum Linux Requirements

  • Redhat/CentOS 7 LTS. Other Linux versions are likely to work but are not officially supported.
  • 64-bit Intel x64 architecture processor with AVX
  • 2 GB minimum RAM. 4+ GB strongly recommended. 8-32 GB or more typical for pro, 360VR, and film users. Disk usage: 1.3 GB.

Windows System Requirements

  • *** IMPORTANT *** SynthEyes requires that your processor implements the AVX instruction set. (SynthEyes may notify you about this at startup.) You can also check Microsoft’s cpuinfo64 tool for detailed processor information. The AVX instructions were introduced over a decade ago, so this is not an unreasonable requirement. SynthEyes uses AVX for performance not possible with old CPUs; SynthEyes 1905 can be run on older CPUs.
  • 64-bit Intel version of Windows 10 or 11; Windows 8 or 7 might be usable (try the demo) but are obsolete and will not be supported by SynthEyes in the near future. Windows 10 version 1903 or later is required for full globalization support. Windows 10 is important for 360VR, because Win7 cannot decode 4K mp4 video. HEVC/h.265 support uses an optional Windows 10 download.
  • Intel or AMD “x64” processor with AVX (“Sandy Bridge” or later or comparable AMD, no Atom-based processors.) See above.
  • 2 GB RAM minimum. 4+ GB strongly recommended. 8-32 GB or more typical for pro, 360VR, and film users.
  • 1280×768 or larger display with OpenGL support. High-DPI displays supported (100% or 200% scaling), Windows 10 required for full functionality. Large multi-head configurations require graphics cards with sufficient memory. RED GPU decode assist requires 2+ GB video RAM.
  • 3-button mouse/trackball with middle scroll wheel/button. Adjust “Reverse Mouse Wheel” preference to taste (normally to achieve push-in/pull-back behavior). Use the “No middle-mouse button” preferences for tablets. While trackpads operate with SynthEyes, they are not efficient or ergonomically suitable for production tracking work.
  • Full keyboard with arrows and side number pad suggested for maximum efficiency.
  • Approximately 230 MB for a default install; 1.7 GB disk space with Tensorflow GPU overlay. Tutorials and other learning materials are online.
  • Warning: the “Nahimic” audio system (NahimicOSD.dll) has a parasitic video overlay that is problematic to a number of apps including SynthEyes, and may cause a hang or crash when entering the graph editor or perspective view. If so, disable or uninstall it. Quick directions that might work: Click the Windows icon, type “services”, right-click the Services app, click Run as Administrator. Scroll down to Nahimic, right-click and select Properties. Change the Startup type to Disabled. Click Apply, OK, etc. Your system may differ.

General

  • Keep the operating system up to date. While SynthEyes should work with older builds, sometimes OS vendors make changes that cause problems when SynthEyes is newer than an older operating system file.
  • A user familiar with general 3-D animation techniques such as keyframing.
  • A supported 3-D animation or compositing package to export paths and points to. Can be on a different machine or operating system.
  • Windows, Linux, and macOS licenses are separate. There is a cross-platform option for the Pro version. A seat license is not a floating license or for multiple users; read more about multiple installs. This helps keep SynthEyes affordable for everyone.

macOS X System Requirements

MacOS Catalina (10.15) or later.

  • *** IMPORTANT *** Cheese-grater Mac Pros from 2012 and earlier won’t work! SynthEyes requires the AVX instruction set. (SynthEyes may notify you about this at startup.) Look for AVX in the output of the terminal command “sysctl machdep.cpu.features”. If SynthEyes crashes at startup and you have an older processor, that is the problem. SynthEyes uses AVX for performance not possible with old CPUs. SynthEyes 1905 can be run on older CPUs.
  • MacOS Sonoma (14), Ventura (13), Monterey (12), Big Sur (11), Catalina (10.15). Big Sur or later required for Blackmagic RAW.
  • Universal build for Apple Silicon (M1/M2/etc) or Intel Mac with AVX support (“Sandy Bridge” or later), see above
  • 2 GB minimum RAM. 4+ GB strongly recommended. 8-32 GB or more typical for pro, 360VR, and film users. We highly recommend 16 GB or more for Apple Silicon due to the unified memory architecture.
  • 1280×768 or larger display with OpenGL support. “Retina” high-DPI displays supported (1:1 or 2:1 scaling). Large multi-head configurations require graphics cards with sufficient memory. RED GPU decode assist requires 2+ GB video RAM. Run SynthEyes on the same monitor as the menubar (movable in Display settings).
  • 3 button mouse/trackball with scroll wheel. Adjust “Reverse Mouse Wheel” preference to taste (normally to achieve push-in/pull-back behavior). Use the “No middle-mouse button” preferences for tablets. While trackpads and the “Magic Trackpad” operate with SynthEyes, they are not efficient or ergonomically suitable for production tracking work.
  • Full keyboard with arrows and side number pad suggested for maximum efficiency.
  • Approximately 200 MB disk space to install. Tutorials and other learning materials are online.

General

  • Keep the operating system up to date. While SynthEyes should work with older builds, sometimes OS vendors make changes that cause problems when SynthEyes is newer than an older operating system file.
  • Newer SynthEyes versions will necessarily have to leave older operating systems behind.
  • A user familiar with general 3-D animation techniques such as keyframing.
  • A supported 3-D animation or compositing package to export paths and points to. Can be on a different machine or operating system.
  • Windows, Linux, and macOS licenses are separate. There is a cross-platform option for the Pro version. A seat license is not a floating license or for multiple users; read more about multiple installs. This helps keep SynthEyes affordable for everyone.

Linux System Requirements

  • *** IMPORTANT *** SynthEyes requires that your processor implements the AVX instruction set. You can check this by looking for AVX in the flags line of the output of the command “cat /proc/cpuinfo”. If SynthEyes crashes at startup and you have an older processor, that is the problem. The AVX instructions were introduced over a decade ago, so this is not unreasonable. SynthEyes uses AVX for performance not possible with old CPUs. SynthEyes 1905 can be run on older CPUs.
  • Redhat/CentOS 7. Other Linux versions are likely to work but are not officially supported.
  • 64-bit Intel x64 architecture processor with AVX instruction set (“Sandy Bridge” or later or comparable AMD)
  • 2 GB minimum RAM. 4+ GB strongly recommended. 8-32 GB or more typical for pro, 360VR, and film users.
  • 1280×768 or larger display with OpenGL support. SynthEyes supports high-DPI monitors under manual control, though OS-drawn dialogs are unlikely to match. RED GPU decode assist requires 2+ GB video RAM.
  • Note: 32-bit 8bpc RGBA format is required. 30-bit color (10bpc) is not supported until it is better developed and supported in Linux and programming information for developers is available.
  • 3 button mouse/trackball with scroll wheel. Adjust “Reverse Mouse Wheel” preference to taste (normally to achieve push-in/pull-back behavior). Use the “No middle-mouse button” preferences for tablets. While trackpads operate with SynthEyes, they are not efficient or ergonomically suitable for production tracking work.
  • Full keyboard with arrows and side number pad suggested for maximum efficiency.
  • Approximately 415 MB for a default install; 1.3 GB disk space with Tensorflow GPU overlay. Tutorials and other learning materials are online.

General

  • Keep the operating system up to date within its overall release (eg CentOS 7 LTS). While SynthEyes should work with older builds, sometimes OS vendors make changes that cause problems when SynthEyes is newer than an older operating system file.
  • A user familiar with general 3-D animation techniques such as keyframing.
  • A supported 3-D animation or compositing package to export paths and points to. Can be on a different machine or operating system.
  • Windows, Linux, and macOS licenses are separate. There is a cross-platform option for the Pro version. A seat license is not a floating license or for multiple users; read more about multiple installs. This helps keep SynthEyes affordable for everyone.
  • In addition to image sequences, Linux supports only ARRI, Blackmagic BRAW, and RED R3D movies, not AVI, MOV, MP4, WMV, etc. movies.

 

 

New in version 2024.1

SynthEyes 2024.1 introduces the new “SynthEyes Advanced Lens Distortion” plugin for After Effects:

  • Directly supports all SynthEyes lens models, animated distortion, and off-center lenses.
  • Export 3D meshes and textures directly to After Effects with the AE version selector set to “Beta after 2024.1.”
  • Updated plugins for compatibility with Apple M1/M2/M3 processors and Multi-Frame Rendering on Windows and macOS.
  • Automatically reduces the number of exported tracker layers based on a user-defined limit.
  • Supports solver-side distortion, though not identical to Fusion and Nuke’s zero-pass workflow due to AE limitations.
  • Properly supports shots with non-square pixels without precomps.

New Build 2304 Highlights

  • New “industry-standard” lens models in the solver and image preprocessor: 4th-order Brown-Conrady, 6th-order anamorphic (including rotation and horizontal and vertical scaling), and fisheye lens models, in match-moving-friendly and compositing-friendly versions. (Anamorphic lens models unavailable in the Intro version.)
  • Animated lens model parameters. To achieve low-jitter results, parameters can be animated on specific key frames, with linear interpolation in between. The solver optimizes the key values to produce the best overall solution including the intermediate frames. This is a much better approach than filtering or solving every frame independently! Parameters can be animated on every frame, mostly to help decide where to put key frames, but keying on every frame should not be used for final solves.
  • New “anamorphic distance” solving, to accommodate anamorphic lenses with different horizontal and vertical entrance pupil (nodal point) locations. Anamorphic distance changes the basic perspective transform, addressing problems that cannot be corrected by any possible 2-D lens distortion. Supported in Alembic, Blender, FBX, Nuke, and USD(A) exporters by vertex caches for all meshes.
  • Seven new SynthEyes 2304 tutorials.
  • New Anamorphic Shots Guide.
  • New “Lens/True Field of View” script computes and displays the field of view and focal length of the original image itself on the horizontal centerline. Compare these results to the on-set measurements, not the solver outputs, which correspond to the initial undistorted image or the output of the image preprocessor after running Lens Workflow. (The basic pinhole relation is sketched after this list.)
  • New focus (distance) informational channel added to cameras in the Graph Editor to help with lens breathing shots. Scripts to load focus or other metadata into the channel from shot metadata or YAML files.
  • Major Nuke exporter update, including usage of the built-in Lens Distortion node (not STmaps) for 0-pass (overscan rendering), 1-pass, and 2-pass lens workflows. Support for Nuke’s beta “New 3-D” system. Auto-run exported .nk files, or paste them into existing comps.
  • Exports new distortion models (except fisheyes) to Fusion’s LensDistort node for 0-pass (overscan rendering), 1-pass, and 2-pass lens workflows. Option to include absolute mesh pathnames, for use with Deadline.
  • New “Keep World Size” control on the Advanced Solver Settings panel for special circumstances where direct expert manual control is desired.
  • Added “Force 8-bit ProRes” preference for M1 Minis to work around macOS Ventura bug.
  • Workaround for a Windows issue that caused lidar meshes to be displayed much slower than necessary.

