

Matrix Pilot Camera Targeting: Tree Test - Software Stabilized


Strategy: Have MatrixPilot / UAV DevBoard (UDB) point the camera roughly in the right direction.

Then take out the remaining jitter using software stabilization.


The above flight was flown on a day with 10 mph of wind and a lot of turbulence (I included the landing in the video to show the turbulence).


1. The UDB calculates its orientation 40 times / second using a 16-bit Direction Cosine Matrix (DCM).

2. I use Bill Premerlani's "High Bandwidth Dead Reckoning", which means MatrixPilot knows its absolute position 40 times / second by integrating the accelerometers. The accelerometer positions are corrected by allowing for the GPS delay and some of the GPS dynamics (GPS info arrives at least 1 second after the real event).

3. The camera code uses the above to compute the camera pointing toward the target location, 40 times / second (a rough sketch of this step is shown below).
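
As a rough illustration of step 3, here is a minimal sketch of the idea. This is not the actual MatrixPilot camera code: the type and function names are made up for this example, the axis conventions are assumed (x forward, y right, z down), and it uses floats rather than MatrixPilot's 16-bit fixed-point maths.

    /* Illustrative sketch only -- not the actual MatrixPilot camera code. */
    #include <math.h>

    #define RAD_TO_DEG 57.29578f

    typedef struct { float x, y, z; } Vector3;

    /* dcm[row][col] rotates a body-frame vector into the Earth frame.
     * To go the other way (Earth -> body) we multiply by its transpose. */
    static Vector3 earth_to_body(const float dcm[3][3], Vector3 e)
    {
        Vector3 b;
        b.x = dcm[0][0]*e.x + dcm[1][0]*e.y + dcm[2][0]*e.z;
        b.y = dcm[0][1]*e.x + dcm[1][1]*e.y + dcm[2][1]*e.z;
        b.z = dcm[0][2]*e.x + dcm[1][2]*e.y + dcm[2][2]*e.z;
        return b;
    }

    /* Called 40 times / second with the dead-reckoned plane position. */
    void camera_point_at(const float dcm[3][3], Vector3 plane_pos,
                         Vector3 target_pos, float *pan_deg, float *tilt_deg)
    {
        /* Line of sight to the target in Earth coordinates ... */
        Vector3 los = { target_pos.x - plane_pos.x,
                        target_pos.y - plane_pos.y,
                        target_pos.z - plane_pos.z };

        /* ... rotated into the plane's body frame using the DCM. */
        Vector3 b = earth_to_body(dcm, los);

        /* Convert to pan/tilt angles for the camera servos. */
        *pan_deg  = atan2f(b.y, b.x) * RAD_TO_DEG;
        *tilt_deg = atan2f(-b.z, sqrtf(b.x*b.x + b.y*b.y)) * RAD_TO_DEG;
    }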


The main issue at the moment is that I'm using a camera with progressive scan. This causes each frame of the image to be distorted when the camera rotates (accelerates) in a new direction.

Pitch servo resolution is 0.2 degrees, which translates into moving the picture by about 7 pixels. Ideally it would be at least 1/10th of that, e.g. 0.02 degrees.
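
For a rough sense of scale (assuming, purely for illustration, a 30 degree horizontal field of view and a frame about 1280 pixels wide): 1280 px / 30 deg is roughly 43 px per degree, so a 0.2 degree servo step shifts the image by roughly 7 to 9 pixels, while a 0.02 degree step would shift it by less than 1 pixel, which is why the finer resolution is wanted.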

Photos of the build of this project are here.
Main Wiki for MatrixPilot is here.

For reference the flight path, in autonomous mode, is shown below.


Comments

  • Yes Pete, I realise the adjustments are the hard part with DCM, Euler, or Quaternions. I was just saying the more basic parts could be adapted from software in the libraries already in use, so it's "only" the adjustments left. If I had a decent pan and tilt setup I would love to get my "hands" dirty with this.
    Inspirational
  • Developer
    @Ritichie: There are three common ways of representing the orientation of the plane with respect to the Earth. MatrixPilot uses a Direction Cosine Matrix. This makes it very easy to do the maths and the software. The DCM is a 3×3 matrix which describes how to rotate the plane's reference axes into the Earth reference axes (and back again with the inverse matrix); a small illustrative sketch of this appears after the comments below. If APM is using Euler angles (Yaw, Pitch, and Roll) as its primary model for orientation, then the software will be much harder to write. The third representation is Quaternions, but we won't discuss that now.
  • Very impressive, especially the coding.
    I LOVE your twinstar setup, very "if it fits, it works". Inspiring.

    APM already gives the bearing to the next waypoint, so I'm guessing a POI bearing would use the same code and then attitude adjustments (i.e. banking angle).
  • Developer
    @Thomas, yes I'm using the standard lens (about 30 degrees of view), and I have further cropped about 10-15 percent of the video away for the video above. The post-process software stabilization attempts to keep the centre of the image stable by moving how the picture is projected onto the canvas. This means that the centre stays steadier, but the outside of the video (the outer box of the image) moves around. This then distracts the eye a little, so I have cropped off the moving edges of the video. The overall result is a fairly small field of view. I originally planned to use full 1920 HD, so I would have had a lot of pixels left after all that. However I could not get that working with Cinelerra. So I reduced to 1240 * 720 (or something close to that), which I could then process with Cinelerra.
  • Developer
    @Marc,

    To include recognition and visual tracking, I would follow the work from PixHawk, using the UDB and MatrixPilot. The UDB and MatrixPilot would plug into their architecture as the IMU block. Then add another computer running Linux (they call that the Flight Controller), which receives high speed telemetry from the IMU (MatrixPilot), and add the camera as a USB device to the Linux computer (Flight Controller). PixHawk have most of the software and build environment already done for that. The communication protocol (MAVLink) is likely to work just fine, and will probably save memory and CPU cycles over what we do now. However we would have to up our baud rate from 19200 to the 57K range.
  • Very nice demonstration Peter, well done!
    Without that targeting software, my main goal would be sooo far away in the future... A real work of art! Now we just have to wait until someone writes a recognition and visual tracking application suitable for a dsPIC, and then this turns into extremely interesting stuff...

    /Marc
  • This is a great tool for Photosynth-ing small discrete sites. Can't wait to try it...
  • That is a really great development, and rather more useful than sparkling LEDs and aerobatic stunts ;-)
    I will implement the same feature on my GluonPilot/EasyStar/Kodak Zx1.
    @Pete: do you still use the standard lens on your camera, or do you use a wide-angle lens?
  • just.... just wow......

    any chance arducopter / pilot gonna get this function hehe
  • Moderator
    Nice one Pete well done
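
Following up on the DCM versus Euler discussion in the comments above, here is a small illustrative sketch (not MatrixPilot's actual code; it assumes the aerospace Z-Y-X convention and a matrix r[3][3] that rotates body-frame vectors into the Earth frame) of how yaw, pitch, and roll can be read straight out of the DCM:

    /* Illustrative only -- not MatrixPilot code.  Assumes the aerospace
     * Z-Y-X (yaw, pitch, roll) convention and a DCM r[3][3] that rotates
     * body-frame vectors into the Earth frame. */
    #include <math.h>

    void dcm_to_euler(const float r[3][3], float *roll, float *pitch, float *yaw)
    {
        *pitch = asinf(-r[2][0]);          /* rotation about the y axis */
        *roll  = atan2f(r[2][1], r[2][2]); /* rotation about the x axis */
        *yaw   = atan2f(r[1][0], r[0][0]); /* rotation about the z axis */
    }

Rotating vectors or composing rotations with Euler angles directly needs trigonometry at every step, which is part of why working in the matrix form keeps the maths and the software simpler.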