Tracking the Move involves two major steps: image analysis and sensor fusion. Images from the PlayStation Eye are analyzed to locate the illuminated sphere that sits atop the controller (because the sphere is lit, it can be tracked even in complete darkness). Color segmentation finds the sphere's pixels in the image, and a projected sphere model is then fit to them to recover the sphere's 3D position. The results of the image analysis are fused with inertial data from a 3-axis accelerometer and 3-axis gyroscope, using a modified unscented Kalman filter, to estimate the controller's full position and orientation (LaViola and Marks 2010).
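The geometry behind the image-analysis step can be sketched with a simple pinhole-camera model: once the segmented sphere pixels are found, the blob's apparent radius gives depth (a sphere of known size looks smaller the farther away it is), and the blob's centroid gives the ray along which the sphere lies. The sketch below illustrates this idea only; the camera intrinsics, sphere radius, and function names are illustrative assumptions, not the PlayStation Eye's actual calibration or Sony's implementation, and the real system fits a full projected-sphere model rather than a simple disk.

```python
import numpy as np

# Illustrative (assumed) camera intrinsics and sphere size -- NOT the
# PlayStation Eye's real calibration values.
FOCAL_PX = 600.0          # focal length in pixels
CX, CY = 320.0, 240.0     # principal point (image center)
SPHERE_RADIUS_M = 0.0225  # physical sphere radius in meters (assumed)

def locate_sphere(mask):
    """Centroid and apparent radius of the segmented sphere pixels.

    `mask` is a boolean image where True marks pixels matching the
    sphere's color (the output of the color-segmentation step).
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    u, v = xs.mean(), ys.mean()
    # Treat the blob as a projected disk: area = pi * r^2.
    r = np.sqrt(xs.size / np.pi)
    return u, v, r

def sphere_position_3d(u, v, r):
    """Back-project the blob to a 3D position with a pinhole model."""
    z = FOCAL_PX * SPHERE_RADIUS_M / r   # depth from apparent size
    x = (u - CX) * z / FOCAL_PX          # ray through the centroid
    y = (v - CY) * z / FOCAL_PX
    return np.array([x, y, z])

# Synthetic test: a disk of radius 30 px centered at pixel (400, 200).
mask = np.zeros((480, 640), dtype=bool)
yy, xx = np.mgrid[0:480, 0:640]
mask[(xx - 400) ** 2 + (yy - 200) ** 2 <= 30 ** 2] = True

u, v, r = locate_sphere(mask)
pos = sphere_position_3d(u, v, r)
# With these assumed intrinsics, depth comes out near
# 600 * 0.0225 / 30 = 0.45 m.
```

Each camera frame yields one such position measurement; the Kalman filter then combines these measurements with the high-rate accelerometer and gyroscope readings to smooth the estimate and fill in orientation.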
The continual adoption of new technology has been a key factor in the growth of the video game industry. Advances in graphics, processing, display, and input technologies have both improved existing experiences and enabled new ones that appeal to a wider audience. Looking forward, there is every indication that the video game industry will continue to leverage new technology to help push the boundaries of play.
LaViola JJ Jr, Marks R. 2010. An introduction to 3D spatial interaction with video game motion controllers. Course presented at ACM SIGGRAPH 2010, Los Angeles.