Vicon Walkthrough

About this walkthrough

This walkthrough was created by Ai Nakatani and originally located on the HVI lab wiki. This page (according to Ai's last notes) is still under construction.

Vicon Hardware

  • Datastation
    Hardware8.jpg

  • A host Workstation PC
    Hardware4.jpg

  • Camera units
    Hardware5.jpg Hardware6.jpg

  • Calibration objects
    • L-Frame
      Hardware3.jpg

    • Wand
      Hardware7.jpg

  • Markers
    Hardware1.jpg

Workflow

  1. Turn on the two Windows machines in Room 3850:
    1. sr-00153
    2. sr-00151
  2. Turn on the Datastation.
  3. Run one application on each machine:
    1. Tarsus2 on sr-00153
      Vicon1.gif
    2. VICONiQ2 (also known as Eclipse or Workstation) on sr-00151
  4. Verify that the Vicon DataStation is connected
  5. Position cameras to cover desired capture volume while maintaining appropriate overlapping regions. You may skip this step if cameras are already positioned correctly (they usually are).
  6. Set up the VICONiQ2 hierarchy to organize your motion capture data
  7. Turn on the projector using a remote control.
    Drag the VICONiQ2 window to the far left side of the screen to see it on the projector.
    Hardware2.jpg Hardware9.jpg

    If you can't see the Desktop on the projector screen, set the switch to 1 (as shown below) to connect the Datastation PC to the projector.
    Hardware10.jpg

  8. Calibrate the system
  9. Attach the desired marker set to the subject (please refer to "Where to place markers" on p.28 of the VICON System Manual).
  10. Capture a range of motion (ROM) trial
  11. Create a calibrated subject
  12. Capture your data
  13. Process and edit the data
  14. Save and export the data to other Vicon or third party applications
  15. Don't forget to turn off the projector and the Datastation before you leave!

Connecting to the Vicon DataStation

Once the VICONiQ2 program is running on sr-00151, verify that the Vicon DataStation is connected to the software. To connect to the DataStation, follow these steps:

  1. Click either the Capture or the Calibrate tab.
  2. On the lower left-hand side of the window, find the two round indicators located next to each other. One indicator will be labelled either DataStation Connected or DataStation Not Connected; the other will be labelled either Realtime Connected or Realtime Not Connected.
  3. If the labels indicate that you are not connected to the DataStation, click the button with the two triangles on the lower right-hand side of the window.

Vicon screenshot connect.JPG Vicon screenshot connected.JPG

Data Organization

  1. Go to Data Management
  2. If you are not working with an existing database, create a new database by following these steps:
    1. Click Eclipse > New Database....
    2. Select a location and enter the name for your new database.
    3. Select Generic Template .eni
    4. Click Create
  3. To create a new subject, click ViconIcon2.gif
  4. To create a new session in the subject, click ViconIcon3.gif
  5. A new trial is created automatically when you capture. Unless you name trials yourself, the system names them by incrementing a number at the end of the trial name (Trial001, Trial002, and so on; see the sketch at the end of this section).
  • Icons that appear in the hierarchy:
    • ViconIcon1.gif: Project
    • ViconIcon2.gif: Subject
    • ViconIcon3.gif: Session
    • ViconIcon4.gif: Trial
    • ViconIcon5.gif: Unprocessed video data; opens in the monitor window when clicked. File extension: .tvd
    • ViconIcon6.gif: Processed virtual 3D motion; opens in the Workspace window when clicked. File extension: .c3d
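
The auto-increment convention is easy to reproduce outside VICONiQ2, for example when pre-creating folders or checking which takes already exist. Below is a minimal Python sketch; the session directory path and the Trial prefix are illustrative assumptions, not anything mandated by the Vicon software.

```python
import re
from pathlib import Path

def next_trial_name(session_dir: str, prefix: str = "Trial") -> str:
    """Return the next auto-incremented trial name (Trial001, Trial002, ...)."""
    pattern = re.compile(rf"^{prefix}(\d+)$")
    numbers = [
        int(m.group(1))
        for entry in Path(session_dir).glob(f"{prefix}*")
        if (m := pattern.match(entry.stem))
    ]
    return f"{prefix}{max(numbers, default=0) + 1:03d}"

# If the session folder already contains Trial001.tvd and Trial002.tvd,
# this prints "Trial003". The path below is a hypothetical example.
print(next_trial_name("D:/MocapDB/SubjectA/Session1"))
```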

Calibration

Summary

  • Calibration determines each camera's location and orientation within the 3D Workspace so that the location of each marker can be computed.
  • Calibration should be performed frequently.
  • Two types of calibration must be done:
  1. Dynamic calibration
  2. Static calibration

If you reposition even one camera, follow these steps before calibration:

  1. Select Setup
    Vicon2.gif

  2. Select Mask tab on the right
  3. Click Start Recording Background
  4. Click Stop Recording Background

Dynamic calibration

Summary

  • Calculates the relative positions and orientations of the cameras
  • Linearises the cameras
  • Tool required: 240mm calibration wand

Steps

  1. Click Calibrate
    Vicon4.gif

  2. Change the viewing pane to Camera (right below Setup)
  3. Click the connect/disconnect button (two red arrowheads pointing at each other) to connect to the Datastation. The DataStation Connected and RealTime Connected indicators should turn from blue to green. If either turns yellow, check that the Datastation is turned on.
    Vicon12.gif

  4. Make sure 240_mm_Wand is selected
  5. Click on the first camera (1) and Shift-click on the last (12). This should show small camera views for all of the cameras
  6. Click Start Wand Wave
    Vicon8.gif

  7. Move the calibration wand throughout the whole volume (important!)
    1. Start on the outside of the volume, facing inwards
    2. Wave the wand in a vertical figure of eight in front of you while moving around the outside of the volume
    3. After a circuit, move inwards and continue spiralling
    4. Finish by crouching in the center, waving the wand in circles near the floor and spiralling up to head height
      Calibration screen-shot (you should be able to see it on the projector during the calibration)
      Vicon6.gif

      Camera positions
      CameraPosition.gif
  8. Click Stop Wand Wave
  9. The calibration results will be shown in Status Report (Awesome! is the best result, followed by Excellent!, Good, and then Bad). All of the cameras should be rated Excellent or better.
    Vicon19.gif

Static calibration

Summary

  • Calculates the origin or center of the capture volume and determines the orientation of the 3D Workspace
  • Tool required: L-Frame calibration object

Steps

  1. Place the L-Frame calibration object at the center of the capture volume, or wherever you'd like the origin of your floor to be.
    The side of the L-Frame with two markers at the end defines the X-axis, and the Z-axis is perpendicular to the floor (a conceptual sketch of how the markers define the coordinate frame follows these steps).
  2. Click Track L-Frame
    Vicon11.gif

  3. After a few seconds, click Set origin
    Vicon9.gif

  4. Now you can save out your calibration
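
For intuition about what Set Origin computes: the L-Frame markers give an origin point and two directions on the floor, from which an orthonormal 3D Workspace frame can be built. The sketch below is a conceptual illustration with made-up marker coordinates and marker roles; it is not VICONiQ2's actual algorithm.

```python
import numpy as np

def frame_from_lframe(origin, x_marker, other_marker):
    """Build an orthonormal frame from three L-Frame marker positions.

    origin       -- marker at the corner of the L-Frame
    x_marker     -- marker along the side taken as the X-axis
    other_marker -- marker along the other arm (used to find the floor plane)
    The marker roles are illustrative; see the system manual for the real layout.
    """
    origin = np.asarray(origin, dtype=float)
    x_axis = np.asarray(x_marker, dtype=float) - origin
    x_axis /= np.linalg.norm(x_axis)
    in_plane = np.asarray(other_marker, dtype=float) - origin
    z_axis = np.cross(x_axis, in_plane)      # perpendicular to the floor
    z_axis /= np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)        # completes a right-handed frame
    return origin, np.column_stack([x_axis, y_axis, z_axis])

# Hypothetical marker positions in millimetres:
origin, R = frame_from_lframe([0, 0, 0], [400, 10, 0], [15, 300, 0])
print(R)  # columns are the X, Y, Z axes of the new 3D Workspace
```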

Capturing a range of motion (ROM) trial

Summary

  • A range of motion (ROM) trial is used to adapt a basic/generic skeleton template (VST), which has generic marker positions and proportions, to the physical dimensions of the captured subject (VSK)
  • Basically, a VSK file is a customized VST file. One VST can be used to create any number of VSKs, each matched to a different subject's body proportions. The VSK file will make labelling much easier.
  • VST files can be created using VICONiQ's Modeling mode (see Create a VST file below)
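
As a rough mental model of the VST-to-VSK relationship, subject calibration adapts the generic template's segment lengths (among other parameters) to the proportions observed in the subject's ROM trial. The sketch below is purely conceptual; the segment names and numbers are made up and this is not the real VST/VSK schema.

```python
# Conceptual sketch: "customizing" a generic template (VST-like) into a
# subject-specific model (VSK-like) by scaling segment lengths.
generic_template = {          # hypothetical generic segment lengths in millimetres
    "upper_arm": 300.0,
    "forearm": 260.0,
    "thigh": 420.0,
    "shank": 400.0,
}

def calibrate_subject(template, measured_lengths):
    """Return a subject-specific copy of the template.

    measured_lengths: segment lengths estimated from the ROM trial
    (e.g. mean distance between the relevant labelled markers).
    Segments not measured keep the generic value.
    """
    return {name: measured_lengths.get(name, length)
            for name, length in template.items()}

subject_vsk = calibrate_subject(
    generic_template,
    {"upper_arm": 287.5, "forearm": 251.0, "thigh": 455.2, "shank": 431.8},
)
print(subject_vsk)
```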

Steps

  1. Go back to Data Management and make sure you are in the session you want to be in.
  2. Select Capture and then Live 3D workspace
  3. Type in a file name
  4. For Type, select 2D Camera Data but not Realtime Output
  5. Click Start to start capturing a subject's movement
    Vicon10.gif
  6. Have the subject start with a T-pose, rotate all joints (wrist, lower and upper arms, shoulders, back, neck, head, etc.) one by one, and end with a T-pose.
  7. Click Stop to finish capturing
  8. Click Load Into Post; doing so will automatically select Post Processing mode
    Vicon13.gif

Create a calibrated subject

Data Reconstruction

Summary

  • Reconstruction turns the 2D camera data from all cameras into one 3D data file (a minimal triangulation sketch follows the steps below)
  • Poor calibration produces poor 3D data

Steps

  1. Select the Pipeline tab. If you don't see the tab, select Post Processing first
  2. Double-click the CircleFit, Reconstruct, Trajectory option. If the option is not available, click the add button ViconIcon7.gif and then double-click CircleFit, Reconstruct, Trajectory.
  3. Set the two parameters as follows:
    1. Min. Cameras to Start Trajectory: 2
    2. Min. Cameras to Reconstruct: 2
      Vicon14.gif

  4. Then, select Show Advanced Parameters from the pop-up window. This should show additional parameters. Try the following:
    1. Delete single frame trajectories: Yes
    2. Min circle diameter: 1
    3. Max circle diameter: 30 to 100
    4. Circle fitting error: 1.3 to 1.5
    5. Minimum circle fitting quality: 0.45 to 0.5
      Vicon15.gif

      You may try different numbers depending on capture conditions and quality of the calibration

  5. Right-click on CircleFit, Reconstruct, Trajectory and select Run Selected Op
    Vicon16.gif

  6. Now you can see the markers in motion through the timeline.
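
For intuition about why Min. Cameras to Reconstruct is at least 2: each camera view of a marker contributes two linear constraints on its 3D position, and the position is the least-squares solution across all views. The sketch below is a generic DLT-style triangulation with hypothetical projection matrices, not the algorithm VICONiQ2 actually uses.

```python
import numpy as np

def triangulate(projections, image_points):
    """Least-squares triangulation of one marker from two or more cameras.

    projections  -- list of 3x4 camera projection matrices (from calibration)
    image_points -- list of (u, v) 2D marker centres, one per camera
    Each view adds two rows to a homogeneous linear system A X = 0; the answer
    is the right singular vector with the smallest singular value.
    """
    rows = []
    for P, (u, v) in zip(projections, image_points):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.vstack(rows))
    X = vt[-1]
    return X[:3] / X[3]                    # back from homogeneous coordinates

# Two hypothetical calibrated cameras looking at the same marker:
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1000.0], [0], [0]])])   # camera 1 m to the side
point = np.array([250.0, 300.0, 2000.0])                       # true 3D position (mm)
uv1 = P1 @ np.append(point, 1.0)
uv2 = P2 @ np.append(point, 1.0)
obs = [uv1[:2] / uv1[2], uv2[:2] / uv2[2]]
print(triangulate([P1, P2], obs))          # recovers approximately [250, 300, 2000]
```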

Labelling

Summary

  • Labelling is basically the assignment of specific names to specific markers; you need to label markers because VICONiQ doesn't know which marker belongs to which part of a human body
  • If the movement is simple, you usually only need to label markers in a single frame
  • There are 4 different modes for labelling, but you need to know only two:
  1. Single allows you to assign just one marker at a time
  2. Sequence allows you to sequentially select markers and identify which ones they are
  • There are also 4 different rules for how the label is assigned to the marker through time:
  1. Forward assigns the markers from the current frame
  2. Backward is the opposite of Forward
  3. Whole assigns the label over a marker's entire trajectory, for as long as the marker exists. Useful if markers don't swap or get confused with one another
  4. Range labels markers over a specified range of frames
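
To make the four rules concrete, here is a toy Python sketch of how a label could be propagated through one marker's frames under each rule. It is purely illustrative and has nothing to do with VICONiQ2's internal data structures.

```python
# Toy illustration of the four labelling rules. A trajectory is represented
# as a dict frame -> label (None = unlabelled); nothing here is Vicon's API.

def apply_label(trajectory, label, current_frame, rule, frame_range=None):
    frames = sorted(trajectory)
    if rule == "Forward":        # from the current frame to the end
        targets = [f for f in frames if f >= current_frame]
    elif rule == "Backward":     # from the current frame back to the start
        targets = [f for f in frames if f <= current_frame]
    elif rule == "Whole":        # every frame in which the marker exists
        targets = frames
    elif rule == "Range":        # an explicit frame range
        start, end = frame_range
        targets = [f for f in frames if start <= f <= end]
    else:
        raise ValueError(f"unknown rule: {rule}")
    for f in targets:
        trajectory[f] = label

traj = {f: None for f in range(1, 11)}       # marker visible in frames 1-10
apply_label(traj, "LFHD", current_frame=4, rule="Forward")
print(traj)   # frames 4-10 now carry the LFHD label
```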

Steps

  1. Select Post Processing and select Subjects
  2. Click Create Vicon Skeleton Template (VST)
  3. Select a VST file (for a full-body capture, choose iQ_HumanRTKM_V1.vst under C:\Program Files\Vicon\Models\VICONiQ2.0) and type in a name
    Vicon17.gif

  4. Select Labelling and click Sequence and Whole
    Vicon18.gif

  5. In 3D Workspace, manually label markers as follows (for the labels, please refer to "Where to place markers" on p.28 of VICON System Manual)
    1. Select a label (LFHD, RFHD, etc.) from the list
    2. Click a corresponding marker (white dot) in 3D Workspace
  6. Go to Pipeline
  7. Make sure to set the current frame to the frame that was labeled in the previous step.
  8. Right-click on Autolabel Range of Motion and select Run Selected Op (if the operation is not available in the list, add the operation by clicking ViconIcon7.gif) This will auto-label the rest of the frames in the ROM trial.
  9. Also run Delete Unlabelled Trajectories
  10. You can check trajectory continuity by looking at the Continuity Chart

Subject calibration

Steps

  1. Make sure you are in a T-pose frame. Select Subjects tab
  2. Make sure that the Active Subjects box is in Edit mode.
  3. Select G button and then T button. Now you should see a blue T on the timeline.
  4. Click a small button next to Calibrate Subject button.
  5. In the option window, set Calibration Quality to Medium. Also, if you set Autosave to Yes, the VSK will be saved automatically; otherwise, export the VSK manually later.
  6. Finally, click Calibrate Subject. You should see your "sticks and boxes" subject model (if you don't see this, go to the View Options tab on the left and make sure the subjects box is checked).

Capture

Summary

  • Capturing is done in the same way as the ROM (T-pose) trial.
  • Make sure that the range of motion (ROM) trial is of good quality. If not, re-calibrate the system and re-capture the ROM trial.

Steps

  1. Go back to Data Management and make sure you are in the session you want to be in.
  2. Select Capture and then Live 3D workspace
  3. Type in a file name
  4. For Type, select 2D Camera Data but not Realtime Output
  5. Click Start to start capturing a subject's movement
  6. Click Stop to finish capturing
  7. Click Load Into Post; doing so will automatically select Post Processing mode

Process and edit the data

Summary

  • At this point, you should have the VSK file created for the subject, so you do not need to manually label all the markers for each trial.

Steps

  1. Select the Pipeline tab. If you don't see the tab, select Post Processing first
  2. Double-click the Load Subject(s) option. Check whether the VSK file that you created in the subject calibration step is selected as Input File Name(s). If not, click the browse button and find the VSK file. Set the Use File Name option to Yes.
  3. Right-click on Load Subject(s) and select Run Selected Op.
  4. Double-click the CircleFit, Reconstruct, Trajectory option. If the option is not available, click the add button ViconIcon7.gif and then double-click CircleFit, Reconstruct, Trajectory.
  5. Set the two parameters as follows:
    1. Min. Cameras to Start Trajectory: 2
    2. Min. Cameras to Reconstruct: 2
  6. Then, select Show Advanced Parameters from the pop-up window. This should show additional parameters. Try the following:
    1. Delete single frame trajectories: Yes
    2. Min circle diameter: 1
    3. Max circle diameter: 30 to 100
    4. Circle fitting error: 1.3 to 1.5
    5. Minimum circle fitting quality: 0.45 to 0.5
  7. Right-click on CircleFit, Reconstruct, Trajectory and select Run Selected Op
  8. Also right-click on Trajectory Labeller and select Run Selected Op. This automatically labels all markers by using the VSK file.
  9. Check whether the markers are labelled correctly. If not, correct the labels using Single and Forward under Labelling. The Continuity Chart is again helpful.
  10. Go to Pipeline and run the following operations, in order, by right-clicking each one and selecting Run Selected Op:
    1. Trim Tails will remove unreliable trajectory data
    2. Filter Using a Butterworth Filter will remove frequency content (noise) that is inconsistent with marker motion (see the sketch after this list)
    3. Fill Gaps using Splines
    4. Kinematic Fit will fit the kinematic skeleton into the markers using the VSK file
    5. Fill Gaps using Kinematic Model will fill in missing data from occluded markers
    6. Delete Unlabelled Trajectories (note: run this operation only if you don't have any mislabeled trajectories, that is, Trajectory Labeller operation has been successful)
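
To see numerically what the spline gap-filling and Butterworth filtering steps do, here is a minimal SciPy sketch on a single synthetic marker coordinate. The sample rate, gap, filter order and 6 Hz cut-off are illustrative assumptions; VICONiQ2 uses its own parameters.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import butter, filtfilt

# Hypothetical marker X-coordinate sampled at 120 Hz with a short gap (NaNs),
# illustrating "Fill Gaps using Splines" followed by a Butterworth low-pass.
rate_hz = 120.0
t = np.arange(0, 2, 1 / rate_hz)
x = 100 * np.sin(2 * np.pi * 1.0 * t) + np.random.normal(0, 2, t.size)
x[100:110] = np.nan                       # simulate an occluded marker

# Cubic interpolation across the gap (a spline-style fill).
valid = ~np.isnan(x)
x_filled = interp1d(t[valid], x[valid], kind="cubic")(t)

# Zero-phase 4th-order Butterworth low-pass at 6 Hz (an illustrative cut-off).
b, a = butter(N=4, Wn=6.0 / (rate_hz / 2))
x_smooth = filtfilt(b, a, x_filled)
print(x_smooth[:5])
```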

Export the data

Steps

  1. Select the Pipeline tab under Post Processing
  2. Run one of the following export operations by right-clicking the operation and selecting Run Selected Op (a sketch for loading the exported data follows this list):
  • Export data to C3D file for use with MotionBuilder
  • Export motion to V-file for use with Maya or MotionBuilder
  • Export data to CSM file; this produces xyz marker coordinate values in a single file.
  • Export data to CSV file; this produces all joint and gait parameters, which you can load directly into Excel.
Note: if the operation you need to perform is not available, click on the add button ViconIcon7.gif to add the operation
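
Once exported, the CSV file can be read by anything that parses comma-separated values. Below is a minimal Python sketch for peeking at an export; the file name is a placeholder, and the exact header layout depends on your subject and export options.

```python
import csv

# Peek at the first few rows of an exported CSV. "trial001.csv" is a
# placeholder name; inspect your own export's header before parsing further.
with open("trial001.csv", newline="") as f:
    for i, row in enumerate(csv.reader(f)):
        print(row)
        if i == 4:          # show only the first five rows
            break
```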

Create a VST file

UNDER CONSTRUCTION

Summary

  • VST files can be created using VICONiQ's Modeling mode
  • You are creating bones and joints by creating segments.
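
The joint/segment hierarchy you build in Modeling mode is essentially a tree: each segment connects a parent joint to a new child joint. The sketch below illustrates that structure generically in Python; the names are hypothetical and this is not the VST file format.

```python
from dataclasses import dataclass, field

@dataclass
class Joint:
    """A joint in the template; joint_type mirrors the options seen in Modeling mode."""
    name: str
    joint_type: str = "Free"          # e.g. Free, Hinge, Rigid
    children: list = field(default_factory=list)

def add_segment(parent: Joint, child_name: str, joint_type: str = "Free") -> Joint:
    """Create a new joint and, implicitly, the segment connecting it to its parent."""
    child = Joint(child_name, joint_type)
    parent.children.append(child)
    return child

# Hypothetical mini-hierarchy: Root -> Thorax -> LeftUpperArm -> LeftForearm.
root = Joint("Root")
thorax = add_segment(root, "Thorax")
upper_arm = add_segment(thorax, "LeftUpperArm")
add_segment(upper_arm, "LeftForearm", joint_type="Hinge")   # elbow modelled as a hinge
print(root)
```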

Steps

  1. Capture a trial.
  2. After selecting Load Into Post, go to Pipeline and run CircleFit, Reconstruct, Trajectory and Export data to C3D file
  3. Go to Modeling
  4. Go to File and select Clear
  5. Import C3D file which you exported
  6. Choose No for T-Pose, and enter a single frame #.
  7. On the Outlier tab, click Root
  8. Click Manipulate and move a small box with three arrows. This is your first joint, so place it at an appropriate position. You can click-drag the screen to change the view.
  9. Make a segment that connects two joints by doing the following:
    1. Click Select and click the joint (Joint A)
    2. Click Segment and click anywhere to create a new joint (Joint B)
    3. Click off Segment. Now you see a new segment that connects Joint A to Joint B.
    4. Click Manipulate and move Joint B to an appropriate place. You may also want to change the joint type (Free, Hinge, Rigid, etc.)
  10. Repeat the previous step to create all the remaining segments
  11. You also need to create markers:
    1. Click Select and then Marker
    2. Click on the screen to create a marker. You'll see all newly created markers under ModelMarker in the Outlier. You should always rename markers so they are easy to identify later.
    3. Click off Marker
    4. Click Manipulate and align the marker you created with the real marker captured by the cameras.
  12. Repeat the previous step to create all the remaining markers.
  13. Go to File, select Save as and save your VST under the correct directory.

TIPS

  • Use Alt key and mouse to rotate the view

Glossary

Calibration: Method used to measure the relative locations of all the cameras in the motion capture volume.
Capture Volume: The actual amount of space in which you are able to capture data.
Datastation: Hardware which captures all camera and analogue data. This provides the link between the cameras and analogue devices and the PC running the Workstation software.
Subject: Person or object whose motion is to be captured
Trial: An act of data capture. Also refers to resulting C3D file.
Wand: Device used for dynamic calibration.
Workspace: 3D space in which motion will be captured
Workstation: Software that controls the Datastation. This is installed on a PC linked to the Datastation.

Related Documentation

Exact same version (VICONiQ2.0)

Newer version

Older version

Others
