Precise 3D position/orientation tracking for a body-worn camera
It's not a new question, but the available technology keeps changing, so I'll ask it again, having not found any current answers online that are usable at a DIY, non-electrical-engineering level...
I'm working on a project in which a body-worn camera is tracked in space, then recreated as a virtual camera in the corresponding virtual 3D space (kind of like a lo-fi version of the cameras used on the "Avatar" set). This is to be done over 15-60 minute durations in an indoor area of about 30 x 30 feet.
What I need for this is tracking of the camera's position and orientation in space, precise and continuous (as much as I can get, but at least within an inch or two, and a few degrees, of the true values), to be either logged on the tracking device and/or transmitted to a computer on set.
Since it is body worn, it needs to be light and wireless.
Since it's on a moving body in a space with some objects in it, optical tracking isn't really a good option.
I looked at a wide range of solutions, from the Flock of Birds and the InertiaCube on the crazy-expensive side to SparkFun's IMU sensors, but frankly I'm a bit bewildered by the selection and don't know how to figure out which is the best to try first, given my needs. I'm really not looking to reinvent the wheel or rebuild everything from scratch, just to find a fairly reliable solution for this that costs no more than a few hundred dollars.
Software-wise, I want to be able to get 3D XYZ position and rotation values from it that I can use in real-time 3D software such as Max/MSP Jitter and the like.
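To make the data side concrete, here's a minimal sketch of what I'd expect the receiving end to look like, assuming the tracker streams each pose as a comma-separated "x,y,z,rx,ry,rz" text datagram over UDP (the packet format and port 9000 are just hypothetical placeholders, since the actual format would depend on the device chosen):

```python
import socket

def parse_pose(datagram: bytes):
    """Parse an 'x,y,z,rx,ry,rz' datagram into (position, rotation) lists."""
    values = [float(v) for v in datagram.decode("ascii").split(",")]
    if len(values) != 6:
        raise ValueError("expected 6 comma-separated floats")
    return values[:3], values[3:]  # XYZ position, Euler rotation in degrees

# Hypothetical listening loop on the on-set computer:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.bind(("0.0.0.0", 9000))
# while True:
#     data, _addr = sock.recvfrom(1024)
#     position, rotation = parse_pose(data)
#     # ...forward position/rotation to the virtual camera...
```

In Max/MSP the same stream could presumably be picked up with a udpreceive object instead, so whatever device can emit a simple packet like this should be usable.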
Any ideas, stories of previous experience with such setups, links, etc. would be extremely welcome.