NavList:
A Community Devoted to the Preservation and Practice of Celestial Navigation and Other Methods of Traditional Wayfinding
Re: Sun squash- was Green Flash and Longitude
From: Bill B
Date: 2006 Jan 20, 04:01 -0500
> Yup.
>
> Good catch Bill!

As the old saying goes, pick off the easy meat, then work on points for style. It was bogus, IMHO, from the get-go.

"He knew the focal length of the lens, the size of the sensor in the camera and the number of pixels across the image."

One point at a time:

"He knew the focal length of the lens"

Which matters, why? Perspective is a matter of distance, not focal length. An experiment--take a camera, use the longest (focal-length) lens you have or the highest zoom setting, and move back to include your house, mailbox, and other landmarks in the frame. Now (without moving) use the widest-angle lens or lowest zoom setting you have. Enlarge the portion of the wide-angle shot matching the telephoto image to the same size as the telephoto shot. The perspective will be exactly the same. Ditto for the horizon and an object at an "infinite" distance from the same location.

"...the size of the sensor in the camera..."

Again, so what? The size of the sensor vs. focal length/angle of view determines magnification, not perspective. The sensor is just a matrix/grid that determines resolution. Without going into depth about coverage, focal length, view angles, yadda yadda yadda: if the resolution is sufficient to provide enough pixels in the "grid" for accurate measurement, it's a done deal.

"...and the number of pixels across the image."

As you know by now, I do not mind repeating myself. So what? All the author proposes is using the sensor grid that transforms photons into 24-bit-deep (or more) pixels as a measuring device. (Sorry about too much information, but for this explanation bit depth does not matter much. 16+ million colors or 256 colors matter only if it would help the eye/software in resolving a color image. Sun and horizon are relatively high-contrast boundaries.) In the end, all that matters is that the sensor created a grid that we can calibrate/measure against known values (in this case, sun SD vs. elevation).

Is this idea new to the list? Nope.
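The calibration idea above can be sketched numerically. A minimal Python sketch, assuming a hypothetical 300 mm lens and 6.4 µm pixel pitch (neither figure is from the original discussion): the focal length and pixel pitch fix the angular size of one pixel, and the sun's known semi-diameter from the Almanac then lets you check, or directly calibrate, that scale.

```python
import math

def plate_scale_arcsec_per_px(focal_length_mm, pixel_pitch_um):
    """Angular size of one pixel on the sky, in arcseconds.

    A pixel of pitch p on a lens of focal length f subtends
    atan(p / f) radians -- roughly p / f for small angles.
    """
    pitch_mm = pixel_pitch_um / 1000.0
    return math.degrees(math.atan(pitch_mm / focal_length_mm)) * 3600.0

# Hypothetical camera: 300 mm lens, 6.4 micron pixels.
scale = plate_scale_arcsec_per_px(300.0, 6.4)  # arcsec per pixel

# The sun's semi-diameter is about 16.3' (978") in mid-January,
# so its disk should span roughly 2 * 978 / scale pixels.
sun_diameter_px = 2 * 978.0 / scale

# Conversely, measuring the sun's diameter in pixels on the photo
# calibrates the grid directly against a known angular value,
# absorbing any magnification uncertainty along the way:
measured_px = sun_diameter_px  # stand-in for a value read off the image
calibrated_scale = 2 * 978.0 / measured_px  # arcsec per pixel
```

The second step is the point Bill is making: once the grid is calibrated against something of known angular size, the nominal focal length and sensor specs drop out of the measurement.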
Frank suggested it in his digital photos of Chicago from the shores of Indiana. The only question might be lens/sensor distortions of the image.

That being said, I chose the obvious to pick on first. To Frank Swift: unless this was a not-so-clever hoax on your part, I don't support shooting the messenger (it keeps me alive ;-), and I apologize if you are accurately recounting a published article. Perhaps GPS is forbidden in the race and some sort of "GPS watch" was used, as the $30 radio-signal-controlled timepiece doesn't work in those waters? It just struck my funny bone: digital camera, computer with a cel nav program (Davis?), $$ professional image-editing software (Photoshop), Nautical Almanac data (and, in my mind's eye, pods on the mast etc. relaying data from one of the big two or three racing-software programs working with the boat's polar diagrams). The only thing missing was a radar gun.

Bill