I am working on a project to align real-world cameras with their virtual counterparts rendered in DirectX. This has gone fairly well: we currently have a test application with a real video camera looking at a flat wall, and the same environment recreated virtually. The real-world environment was measured (height of camera, distance to wall, height of wall, width of wall, FOV of camera) and these measurements were applied to the virtual (DirectX) environment. It lines up nicely, and tape placed on the wall to mark certain measurements lines up almost perfectly with its virtual counterpart.
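To make the setup concrete, this is the ideal pinhole model we are assuming relates the measured FOV to where things land on screen. The numbers below (FOV, resolution, distances) are placeholders, not our actual measurements:

```python
import math

def project_point(x, y, z, fov_h_deg, width, height):
    """Project a 3D point (camera at origin, looking down +z, units in feet)
    to pixel coordinates with an ideal pinhole model.
    fov_h_deg is the horizontal field of view in degrees."""
    f = (width / 2.0) / math.tan(math.radians(fov_h_deg) / 2.0)  # focal length in pixels
    u = width / 2.0 + f * (x / z)    # horizontal pixel
    v = height / 2.0 - f * (y / z)   # vertical pixel (image y grows downward)
    return u, v

# Placeholder numbers: 90-degree horizontal FOV, 640x480 image, wall 10 ft away.
# A tape mark 5 ft to the right of the camera axis, at camera height:
u, v = project_point(5.0, 0.0, 10.0, 90.0, 640, 480)
# f = 320 / tan(45 deg) = 320 px, so u = 320 + 320 * 0.5 = 480, v = 240
```

This is why the wall case lines up so cleanly: every wall point is at (roughly) the same depth z, so the mapping from wall position to pixel is uniform.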
This was our initial test; what we actually need to use as our "mapped space" is the floor. We have a 16-foot-square area of the floor taped out into an 8x8 grid of cells, each 2 feet square. At each corner of this grid will be one real camera. These real cameras need virtual counterparts so that we can map clicks in the virtual "floor grid" to actual locations in the real-world "floor grid". The locations don't have to be exact; we just need to know which cell was clicked on.

Translating our test from the flat wall to the floor, we measured everything again, but realized there is a problem when looking at the floor in the real camera view: the view of the floor is distorted compared to that of the wall. (By "looking at the floor" I am not referring to tilting the camera down, merely having the camera look straight ahead and observing the floor in the image the camera is displaying.) This, I believe, is due to the focal length of the camera. I have a very limited understanding of photography and lenses, so I am seeking any advice on how to compensate for or correct this distortion so that our virtual and real environments line up. I imagine this correction would be done by adjusting the DirectX code, but perhaps it is a fundamental issue that I am not aware of, given my limited understanding of cameras and lenses.
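Here is a small sketch of the effect I am describing, again with an ideal pinhole model and placeholder numbers (camera 4 ft above the floor, 90-degree horizontal FOV, 640x480 image), not our actual DirectX code or measurements:

```python
import math

# Placeholder assumptions: camera 4 ft above the floor, level and looking
# straight ahead, 90-degree horizontal FOV, 640x480 image.
CAM_HEIGHT = 4.0
F = (640 / 2.0) / math.tan(math.radians(90.0) / 2.0)  # focal length in pixels (= 320)
CY = 480 / 2.0                                        # image center row

def floor_row(z_ft):
    """Image row where a floor line z_ft feet ahead of the camera projects.
    The floor is CAM_HEIGHT below the optical axis, so v = CY + F * h / z."""
    return CY + F * CAM_HEIGHT / z_ft

def cell(x_ft, y_ft):
    """Which 2-ft cell of the 8x8 grid a floor point falls in
    (hypothetical helper; origin at one corner of the taped area)."""
    return int(x_ft // 2), int(y_ft // 2)

rows = [floor_row(z) for z in range(2, 18, 2)]            # grid lines 2..16 ft out
gaps = [rows[i] - rows[i + 1] for i in range(len(rows) - 1)]
# The rows approach CY (the horizon) and the gaps between successive grid
# lines shrink: near cells span many pixels while far cells bunch together,
# and the nearest lines can even project below the bottom of the frame.
```

Unlike the wall, the floor grid lines sit at many different depths from the camera, so equal 2-foot spacings on the floor map to unequal pixel spacings in the image; this is the "distortion" we are seeing.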
Any help and guidance with this issue would be most appreciated.