Welcome › Forums › Technical Hardware Support › Camera Communication
This topic has 4 replies, 3 voices, and was last updated 2 years, 6 months ago by Anonymous.
April 29, 2022 at 4:56 PM · #1034 · Anonymous (Inactive)
Hello everyone, I have a question regarding the communication between the camera and RPi PCB.
I can see the hot-shoe precision geotag cable and the shutter release cable are soldered to a Lemo 4 pin connector. Then, it is connected to the “Green ID JR cable” in Fig 12.3-6, which is connected to the RPi PCB U$28 (GND), U$30 (Adafruit Pro Trinket 3V), and U$31 (NPN transistor).
So my question is: how are the time and location information embedded in each picture taken? To my understanding, the Trinket only works as an intervalometer. Wouldn't the camera need to communicate with a component capable of time synchronization, processing, and storage?
May 4, 2022 at 2:43 PM · #1035 · Anonymous (Inactive)
As far as I understand, the time and location information is not embedded in each picture at capture time. All data is logged on the SD card and can either be imported into the software you use (e.g. Agisoft Metashape) or embedded in the images afterwards with something like ExifTool!
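To make the ExifTool step concrete: EXIF stores a GPS coordinate as a degrees/minutes/seconds triplet of rational numbers plus a hemisphere reference letter. A minimal Python sketch of that conversion (illustrative only, not part of OpenMMS; the function name is made up here):

```python
from fractions import Fraction

def to_exif_dms(decimal_deg):
    """Convert a signed decimal-degree coordinate to the
    (degrees, minutes, seconds) rational triplet used by the
    EXIF GPSLatitude / GPSLongitude tags."""
    value = abs(decimal_deg)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = round((value - degrees - minutes / 60) * 3600, 4)
    # EXIF stores each component as a rational number.
    return (Fraction(degrees, 1),
            Fraction(minutes, 1),
            Fraction(seconds).limit_denominator(10000))

lat = 45.12345
d, m, s = to_exif_dms(lat)
ref = "N" if lat >= 0 else "S"
print(ref, d, m, s)  # N 45 7 1221/50  (i.e. 45° 7' 24.42")
```

The hemisphere letter goes into the separate GPSLatitudeRef / GPSLongitudeRef tags, which is why it is returned alongside the triplet rather than as a sign.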
May 5, 2022 at 6:03 AM · #1036 · Ryan B (Keymaster)
Hi, Zi.
Luka is correct: the time and position information for each image captured by the Sony camera is not embedded within the JPG files' EXIF metadata.

What happens is that the hot-shoe feedback sensor is wired into one of the Event Input pins on the Applanix APX-18 GNSS/INS sensor. Each time the camera captures an image, a simple signal is sent to and detected by the APX-18, and an Event (basically a timestamp) is recorded within the APX-18's binary .T04 file.

As part of the post-processing workflow, a camera events trajectory file is generated during the Applanix POSPac UAV data processing step (see section 2.5 of the documentation for details). This trajectory file contains the precise timestamps and the position and orientation estimates for each Event detected by the APX-18.

Within the 7_preprocess_images script, one of the operations performed is that the trajectory file Events are matched (simple sequential alignment) with the corresponding JPG files captured by the Sony camera during data collection, and a new trajectory file is generated (see section 2.9 of the documentation for details).
I hope this helps your understanding of how the exterior orientation parameters for each image captured by the Sony camera are estimated within the OpenMMS solution.
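The sequential alignment described above can be sketched as follows. This is a minimal illustration, not the actual 7_preprocess_images code; the field names and filenames are hypothetical:

```python
def match_events_to_images(events, image_names):
    """Pair trajectory Events with image filenames by capture order.

    `events` is a list of dicts (e.g. {"time": ..., "lat": ...})
    sorted by time; `image_names` is the sorted list of JPG files
    from the camera's SD card. The pairing is purely sequential:
    event i belongs to image i, so the two counts must agree.
    """
    if len(events) != len(image_names):
        raise ValueError(
            f"{len(events)} events vs {len(image_names)} images; "
            "a missed exposure or spurious trigger must be resolved first")
    return [dict(event, image=name)
            for event, name in zip(events, image_names)]

events = [{"time": 100.0, "lat": 45.0, "lon": -75.0},
          {"time": 102.5, "lat": 45.0001, "lon": -75.0002}]
images = ["DSC00001.JPG", "DSC00002.JPG"]
matched = match_events_to_images(events, images)
print(matched[0]["image"])  # DSC00001.JPG
```

The count check matters in practice: a single missed hot-shoe pulse or an extra trigger shifts every subsequent pairing, so a mismatch has to be resolved before the new trajectory file is written.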
Ryan
May 6, 2022 at 11:59 AM · #1044 · Anonymous (Inactive)
Thanks, Luka and Ryan. Your answers are very helpful; so the pictures require post-processing. This leads to another question: how does the Livox MID-40 record time and time-sync with the Sony a6000? Is it the same principle as with the Sony camera?
May 6, 2022 at 2:02 PM · #1045 · Anonymous (Inactive)
Hi!
Yes, in every case you need to post-process the data: you have to post-process the trajectory both for LiDAR data georeferencing and for the image coordinates!
Regarding the time sync: technically, the LiDAR and camera data do not have to be synced to each other to work; each has to be georeferenced.
They end up time-synced because the LiDAR and the camera each have to be precisely georeferenced, which means their data have to be recorded in reference to GPS time. So both are "synced" to GPS time.
As far as I know, the LiDAR is synced to GPS via a PPS signal, while the camera is synced to GPS via the hot shoe, i.e. the GPS receiver logs the moment of exposure in its data log.
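On GPS time specifically: receivers typically report a week number plus seconds of week, and GPS time runs ahead of UTC because it does not insert leap seconds (18 s ahead as of 2022). A minimal sketch of the conversion, assuming that fixed leap-second offset:

```python
from datetime import datetime, timedelta, timezone

# GPS time starts counting at the epoch 1980-01-06 00:00:00 UTC.
GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)

def gps_to_utc(week, seconds_of_week, leap_seconds=18):
    """Convert GPS time (week number + seconds of week) to UTC.

    GPS time is a continuous count from the 1980-01-06 epoch with no
    leap seconds, so UTC is obtained by subtracting the accumulated
    leap-second offset (18 s as of 2022).
    """
    return GPS_EPOCH + timedelta(weeks=week,
                                 seconds=seconds_of_week - leap_seconds)

# 18 s into GPS week 2208 lands exactly on the UTC week rollover
# (a Sunday at 00:00:00).
print(gps_to_utc(2208, 18.0))  # 2022-05-01 00:00:00+00:00
```

Both the LiDAR timestamps (PPS-disciplined) and the hot-shoe exposure Events end up on this common GPS time scale, which is what makes the two data streams comparable without any direct link between the sensors.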