Week: 24 – 30 May
Early this week, I tried to track several shots from the indie film project as personal practice, since I’m not signed up as a matchmover on this project. Last week, I noticed that the videos have a lot of motion blur, which made it difficult for Emma and Marianna to solve the tracking in 3DEqualizer. So I wanted to see whether I could do it in Blender using the knowledge I gained several weeks ago. After looking at all the videos again, I decided to try CC_0100 and CC_0200 first.
CC_0100
CC_0100 has a fast camera zoom at the beginning which causes very heavy blur. The first method I tried was the ‘Detect Features’ function, which automatically places markers on areas Blender thinks it can track for the whole duration of the video. Unfortunately, it failed to recognise the ground area, probably because of the blur during the zoom. I proceeded with the ‘Solve’ function anyway just to see if it could solve the camera movement with the automatic markers, but the result was very bad.
I had no other option but to place markers manually and track them frame by frame. I analysed the video again and noticed the image was sharp towards the end of the footage, so I decided to track backwards. There were actually many contrast areas where I could place markers, but most of them were small and disappeared completely during motion blur.
When skimming through the video, the only contrast areas that still had good visibility during motion blur were the road lines, so I placed several markers at the sharp corners of the lines.
I then tracked all the markers backwards frame by frame carefully, and stopped whenever a marker drifted from its original place because of the blurry frames. In those cases I placed the marker manually by estimating the correct position. The road lines made it quite easy to predict the locations even when blurred, because I could still see the lines’ corners.
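The manual placement during blurred frames is essentially extrapolation: assume the feature keeps moving at roughly the same speed and direction as in the last few sharp frames. A toy sketch of that idea (a hypothetical helper, not anything from Blender’s API):

```python
def predict_marker(prev2, prev1):
    """Predict the next 2D marker position assuming constant velocity.

    prev2, prev1: (x, y) marker positions at the two most recent
    tracked frames. Returns the extrapolated (x, y) for the next frame,
    which you would then nudge by eye against the blurred corner.
    """
    vx = prev1[0] - prev2[0]
    vy = prev1[1] - prev2[1]
    return (prev1[0] + vx, prev1[1] + vy)

# A road-line corner drifting right and slightly up between two frames:
guess = predict_marker((100.0, 50.0), (104.0, 49.0))
print(guess)  # (108.0, 48.0)
```

In practice the eye does this automatically; the point is that a marker hand-placed on a blurred frame is a guess anchored by the motion of the surrounding sharp frames.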
After several tries, correcting positions and removing bad markers, I managed to solve the camera movement with a solve error of 0.89 px. It’s not great, but that was the lowest I could get; any further clean-up broke the camera alignment.
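For context, the solve error is a reprojection error in pixels: how far, on average, each tracked 2D marker sits from where its reconstructed 3D point lands when projected back through the solved camera. A minimal sketch of that average (using RMS here; whether Blender uses RMS or a plain mean is an internal detail, and the residual values below are made up):

```python
import math

def solve_error(residuals):
    """Root-mean-square of per-marker reprojection residuals, in pixels.

    residuals: distances between each tracked 2D marker and the
    reprojection of its reconstructed 3D point.
    """
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

print(round(solve_error([0.5, 1.2, 0.9, 0.8]), 2))  # 0.89
```

So 0.89 means the solve disagrees with the hand-tracked markers by a bit under a pixel on average, which is why the alignment still looks acceptable even though sub-0.5 values are usually the goal.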
Video
CC_0200
Next I proceeded with the next video, CC_0200. This footage has a lot of camera movement, in both rotation and position. The same problems happened with this shot when using the auto-detected features, so I used the same tricks as with CC_0100 to place and track the markers manually.
This shot was multiple times harder to track compared to CC_0100.
It took me about 5 hours on this shot, and I finally managed to solve the camera with accurate camera movement, but with a very high solve error of 15.30 px. I could get a value lower than 1.0 by deleting high-deviation markers, but then the camera movement would no longer align properly. I guess I cannot rely on the numbers alone; probably because of the heavy vibration of the original camera, Blender considers the markers to have high deviation.
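The trade-off here is that dropping high-error tracks lowers the reported number, but each deleted track also removes a constraint on the solve; past a point the remaining markers no longer pin the camera down and the alignment breaks. A toy sketch of the filtering step (the track names and error values are invented, and this mimics Blender’s per-track average error only in spirit):

```python
def filter_tracks(tracks, max_error):
    """Keep only tracks whose average reprojection error is at or below
    a threshold.

    tracks: list of (name, average_error) pairs, loosely mimicking the
    per-track error Blender shows after a solve. Aggressive filtering
    lowers the overall solve error but can leave too few tracks to
    constrain the camera.
    """
    return [(name, err) for name, err in tracks if err <= max_error]

tracks = [("line_corner_1", 0.7), ("kerb", 14.2),
          ("line_corner_2", 1.1), ("pole", 22.5)]
print(filter_tracks(tracks, 2.0))
# [('line_corner_1', 0.7), ('line_corner_2', 1.1)]
```

On a vibrating, motion-blurred shot like this one, a high-error track can still be carrying real information about the camera shake, which is why the number alone is misleading.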
Video
I then sent both videos to our Discord server during today’s class (26th May). Luke said the tracks were good and asked if I’d like to pick another shot and help the team with the camera tracking.
I’m not really sure, because the tracking I did was done in Blender and not 3DEqualizer. I don’t remember much about 3DEqualizer since I only used it a few times during Term 1. I may need to re-watch Dom’s tutorials to recall the important technical aspects of the software. Still, I think this is a good opportunity to practise the software again, as 3DEqualizer is one of the industry standards for matchmove and tracking.
So I told Luke that I will try to track the same footage the other matchmovers are working on at the moment, as a backup, and see if I can get good results.