NavList:
A Community Devoted to the Preservation and Practice of Celestial Navigation and Other Methods of Traditional Wayfinding
From: Frank Reed
Date: 2009 Dec 13, 19:39 -0800
Douglas Denny, you wrote:
"This is an interesting technique.
The error difference using the Moon in your photographic lunar method could
most likely be horizontal parallax and other errors. Does the method of
translation of photographic star/Moon distance into RA and Dec incorporate
clearing the distance?"
Clearing the distance is automatic in a system like this. What you get is the Moon's topocentric RA and Dec. You then use whatever software you like to find a geographic location where the Moon would have exactly that topocentric position. Or, if we want to pretend we're doing some sort of 21st century version of 19th century lunars and we're trying to determine GMT (purely a fantasy), then you would vary GMT from your estimated position until the topocentric position measured from the photo matches.
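The GMT-recovery idea above can be sketched as a one-dimensional search: vary GMT until the predicted topocentric position of the Moon matches the one measured from the photo. This is a minimal illustration only; the `toy_moon_topo_ra` function below is a made-up linear stand-in for a real ephemeris (the Moon's RA advances very roughly half a degree per hour), and the solver is a plain bisection, not anyone's actual software.

```python
def toy_moon_topo_ra(gmt_hours):
    """Toy stand-in for a real ephemeris: topocentric RA of the Moon
    (degrees) as a linear function of GMT in hours. The ~0.55 deg/hr
    rate and the 100-degree offset are arbitrary illustration values;
    a real system would call a proper lunar ephemeris here."""
    return 100.0 + 0.55 * gmt_hours

def solve_gmt(observed_ra, lo=0.0, hi=24.0, tol=1e-6):
    """Bisect on GMT until the predicted topocentric RA matches the
    RA measured from the photograph. Works because the toy RA is
    monotonically increasing over the search interval."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if toy_moon_topo_ra(mid) < observed_ra:
            lo = mid  # predicted RA too small: the true GMT is later
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Recover the GMT that produces a given observed topocentric RA.
recovered = solve_gmt(toy_moon_topo_ra(13.5))
```

With a real ephemeris the same search works in two coordinates (RA and Dec, or the lunar distance itself), but the principle is identical: the Moon's motion against the stars is the clock.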
And you wrote:
"Of course it must clear the distance with only arc seconds of error. Refraction errors maybe?"
If you have enough stars, this is accounted for automatically. Refraction is just a non-linear distortion of the field of view, and for almost all altitudes the usual algorithms in these pattern-matching systems can handle it. The stars amount to "control points". If we have a few dozen of them scattered across a frame, we end up with a mapping that lets us take any pixel position and read off its celestial coordinates.

It's interesting to note that images taken at low angular elevations also yield an estimate of the vertical from this pattern-matched refraction data. So if we leave the Moon out of it, you could use three cameras, or more, to photograph the region three to ten degrees above the horizon continuously at widely separated azimuths. The refraction data will tell you how much each camera is inclined with respect to the horizon.

Then you have another camera, probably rigidly fixed to the others, photographing towards the zenith. The inclination data from the low cameras then tell you where the true zenith lies in the upward-pointing camera. The same pattern matching gives the RA and Dec of the zenith. And with a small amount of calculation, you're done. You've got a position fix. And the cameras can do this every few seconds all night long.
-FER
PS: On a related note, there's a "science" called digital photogrammetry, which combines digital images of things like buildings, shot from a few angles, to produce accurate three-dimensional digital models. Even five years ago, these folks were getting 0.1 arcminute accuracy using mid to high end (so-called "prosumer") digital cameras.
NavList message boards: www.fer3.com/arc
Or post by email to: NavList@fer3.com