Camera Communication


Viewing 4 reply threads
    • #1034
      Anonymous
      Inactive

      Hello everyone, I have a question regarding the communication between the camera and RPi PCB.

      I can see that the hot-shoe precision geotag cable and the shutter release cable are soldered to a Lemo 4-pin connector. This connector then attaches to the “Green ID JR cable” in Fig. 12.3-6, which connects to the RPi PCB at U$28 (GND), U$30 (Adafruit Pro Trinket 3V), and U$31 (NPN transistor).

      Therefore, my question is: how are the time and location information embedded in each picture taken? To my understanding, the Trinket only works as an intervalometer. Wouldn’t the camera need to communicate with a component capable of time synchronization, processing, and storage?

    • #1035
      Anonymous
      Inactive

      As far as I understand, the time and location information is not embedded in each picture. All data are logged on the SD card and can either be imported into the software you use (e.g. Agisoft Metashape) or embedded in the images with something like ExifTool!
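      As a rough illustration of the ExifTool route (the file paths and coordinate values here are made up, and this is just a sketch, not part of the OpenMMS tooling), logged positions can be written into a JPG’s EXIF GPS tags like this:

```python
# Sketch: write a logged GPS position into a JPG's EXIF via the ExifTool CLI.
# Assumes exiftool is installed on PATH; paths and coordinates are illustrative.
import subprocess

def gps_exif_args(lat, lon, alt):
    """Build the ExifTool arguments for the standard GPS EXIF tags."""
    return [
        "-overwrite_original",
        f"-GPSLatitude={abs(lat)}",
        "-GPSLatitudeRef=" + ("N" if lat >= 0 else "S"),
        f"-GPSLongitude={abs(lon)}",
        "-GPSLongitudeRef=" + ("E" if lon >= 0 else "W"),
        f"-GPSAltitude={abs(alt)}",
        "-GPSAltitudeRef=" + ("0" if alt >= 0 else "1"),  # 0 = above sea level
    ]

def embed_position(jpg_path, lat, lon, alt):
    """Write the logged position into the image's EXIF metadata, in place."""
    subprocess.run(["exiftool", *gps_exif_args(lat, lon, alt), jpg_path],
                   check=True)
```

      Software like Metashape will then read the camera positions straight from the EXIF, although importing the trajectory file directly is usually the more precise option.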

    • #1036
      Ryan B
      Keymaster

      Hi, Zi.

      Luka is correct: the time and position information for each image captured by the Sony camera is not embedded within the JPG files’ EXIF metadata.

      The hot-shoe feedback sensor is wired into one of the Event Input pins on the Applanix APX-18 GNSS/INS sensor. Each time the camera captures an image, a simple signal is sent to and detected by the APX-18, and an Event (basically a timestamp) is recorded within the APX-18’s binary .T04 file.

      As part of the post-processing workflow, a camera events trajectory file is generated during the Applanix POSPac UAV data processing step (see section 2.5 of the documentation for details). This trajectory file contains the precise timestamps and the position and orientation estimates for each Event detected by the APX-18. Within the 7_preprocess_images script, the trajectory file Events are matched (a simple sequential alignment) with the corresponding JPG files captured by the Sony camera during data collection, and a new trajectory file is generated (see section 2.9 of the documentation for details).
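      The sequential alignment I mentioned can be sketched roughly like this (the field names, file names, and function names below are my own illustration, not the actual 7_preprocess_images code):

```python
# Sketch of matching APX-18 trajectory Events to the Sony JPGs in capture
# order. Data layout is illustrative only, not the OpenMMS file formats.
from dataclasses import dataclass

@dataclass
class Event:
    gps_time: float   # seconds of GPS week
    lat: float
    lon: float
    alt: float
    roll: float
    pitch: float
    heading: float

def match_events_to_images(events, jpg_names):
    """Pair the k-th detected Event with the k-th captured image.

    Sequential alignment only works if no exposures or Events were
    dropped, which is why the counts are checked first.
    """
    if len(events) != len(jpg_names):
        raise ValueError(
            f"{len(events)} events vs {len(jpg_names)} images: cannot align"
        )
    # Sort both streams into capture order before pairing.
    events = sorted(events, key=lambda e: e.gps_time)
    jpg_names = sorted(jpg_names)
    return list(zip(jpg_names, events))
```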

      I hope this helps your understanding of how the exterior orientation parameters for each image captured by the Sony camera are estimated within the OpenMMS solution.

      Ryan

    • #1044
      Anonymous
      Inactive

      Thanks, Luka and Ryan, your answers are very helpful. So the pictures require post-processing. This leads to another question: how does the Livox MID-40 record time and time-sync with the Sony a6000? Is it the same principle as with the Sony camera?

    • #1045
      Anonymous
      Inactive

      Hi!

      Yes, in every case you need to post-process the data. You have to post-process the trajectory for LiDAR data georeferencing, and for the image coordinates too!

      Regarding the time sync: technically, the LiDAR and camera data do not have to be synced to each other to work; each just has to be georeferenced.

      They end up time-synced because the LiDAR and the camera each have to be precisely georeferenced independently, which means both have to be recorded in reference to GPS time. So they are both “synced” to GPS time.

      As for the mechanism, I think the LiDAR is synced to GPS via a PPS signal, while the camera is synced to GPS via the hot shoe, i.e. the GPS receiver logs the moment of exposure in its data log.
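      Since both sensor streams reference GPS time, relating lidar points to a camera exposure afterwards is just a timestamp lookup. A minimal sketch (the flat list of point timestamps and the window size are assumptions for illustration, not the actual OpenMMS data layout):

```python
# Sketch: find the lidar points recorded around a camera exposure, given
# that both timestamp streams are in the same GPS time base.
import bisect

def points_near_exposure(point_times, exposure_time, window=0.05):
    """Return the (lo, hi) index range of lidar point timestamps
    (a sorted list, in GPS seconds) within +/- window seconds of a
    camera exposure time."""
    lo = bisect.bisect_left(point_times, exposure_time - window)
    hi = bisect.bisect_right(point_times, exposure_time + window)
    return lo, hi
```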
