I came out to the back garden yesterday and found this lovable guy playing on the patio!
Another test for Blender using the camera tracking feature. I’m still practicing with it quite a bit and trying to explore its capabilities, since I still get quite excited by the results. Considering this was footage just shot on my iPhone 4S, it turned out pretty well. Blender currently only has camera presets for top-end cameras, so finding sensor sizes and focal lengths for the iPhone on the net can be tricky. However, I found a post over on Blenderartists where Malcando has been collecting settings, which worked perfectly.
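If you can only find a camera’s 35mm-equivalent focal length rather than its real one (or vice versa), the crop factor gets you between the two before you type the values into Blender’s solver. Here’s a minimal sketch; the iPhone 4S figures are approximate numbers I’m assuming for illustration, not values from Malcando’s thread:

```python
# Convert a real focal length to its 35mm equivalent via the crop factor.
FULL_FRAME_WIDTH_MM = 36.0  # width of a full-frame (35mm) sensor

def equivalent_focal_length(focal_mm: float, sensor_width_mm: float) -> float:
    """Scale the real focal length by the sensor's crop factor."""
    crop_factor = FULL_FRAME_WIDTH_MM / sensor_width_mm
    return focal_mm * crop_factor

# Approximate iPhone 4S camera specs (1/3.2" sensor) -- assumed, not exact.
print(round(equivalent_focal_length(4.28, 4.54), 1))  # ~33.9mm equivalent
```

The same arithmetic works in reverse: divide a quoted 35mm-equivalent focal length by the crop factor to get the real focal length the solver expects alongside the sensor width.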
I took the opportunity to test out my iRover rig too. It was a productive experience, since I’ve discovered a few things that need sorting out before I use him again, particularly his feet: they don’t seem to rotate around the Z axis without twisting. I had previously added IK/FK switches, but I don’t think he’s going to need any FK control for his feet, so I may simplify the rig and remove the FK controls. I also need to add some custom control bones to make the rig easier to pose, and I still have to work on his textures, particularly his eyes, which I intend to have as a digital-style display.
Here’s another motion track using Blender 2.62. I thought I would try something a bit more static to give a better example of how well the object moves with the camera. Another little lesson learnt with this one: I managed to figure out how to pull out a reflection pass to add to the comp. These tracking experiments are as much an exercise in using Blender’s node system as they are in testing out the tracking. Of course, I couldn’t just have a completely static object, so I had to throw a little bit of character animation in there as well!
This is the first relatively successful track I’ve managed to create with the new motion tracking features in Blender 2.62. I wasn’t really concentrating on the animation too much; I just gave the little dragon guy a bit of life to see how well he would sit among some live footage. I’m still getting the hang of it, as I haven’t done much 3D motion tracking in the past (plenty of 2D, though). I shot this footage on my iPhone at my desk, so the compression and motion blur on the clip aren’t overly conducive to tracking, but with a bit of fiddling and tweaking by hand I eventually got a decent track. I had downloaded various clips from the internet (if you search for ‘free tracking clips’ you’ll find a few places to download them from), but I was having trouble finding sensor sizes and focal lengths for those particular clips. I then decided to shoot some footage with my phone, since I already knew those details, and finally got this result. It’s far from perfect, but it’s a good start with a Blender feature that I can see fast becoming a powerful addition to my arsenal of VFX tools.