
Pegasus 2

  • Successful flight test last weekend: Pegasus 2's maiden flight

  • Had some compass issues, but a parameter reset fixed them

Antenna Tracker

  • Mech side: finalizing some mechanical components (hoping it will be finished by next week, as the remaining changes are small)

IMACS

  • Monitor mount status?

    • corner designs are done

    • working on an MDF prototype for the front plate that stops the monitor from falling out; need to procure self-tapping screws

  • Need a list of specific hardware parts so mech can start the layout and begin designing mounts/cases

  • Framework sponsorship?

Autonomy

Task 1: Minimum Viable Product

Goal: Explain what we have done so far and what we want to accomplish to achieve a minimum viable product (MVP)

What have we done so far? We integrated Geolocation and visualized the results with a KML file.

(Screenshots: image-20240924-171729.png, image-20240924-171749.png)

Geolocation maps pixels in the image to real-world coordinates. See our flowchart here: DRAFT 2024-2025 Airside system software architecture - Autonomy - WARG (atlassian.net).
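To make the pixel-to-world idea concrete, here is a minimal sketch under simplifying assumptions: a nadir (straight-down) camera, flat ground, level flight, and a plain pinhole model. The function name and intrinsics are illustrative placeholders, not our actual Geolocation module, which also accounts for camera and drone orientation.

```python
def pixel_to_ground(px, py, drone_x, drone_y, altitude_m,
                    fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Project an image pixel to flat-ground world coordinates.

    Assumes a nadir camera with a pinhole model and level flight.
    fx/fy are focal lengths in pixels and (cx, cy) is the principal
    point -- placeholder values, not our real calibration.
    """
    # Ground offset grows linearly with altitude (similar triangles).
    east_offset = altitude_m * (px - cx) / fx
    north_offset = altitude_m * (py - cy) / fy
    return drone_x + east_offset, drone_y + north_offset
```

For example, a pixel 100 px right of the image center at 50 m altitude with a 1000 px focal length maps to a point 5 m east of the drone.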

A bit about the flight test:

  • The drone was not simply hovering over the landing pad; we mixed Geolocation testing with pilot training, so the drone was moving around the area.

  • The drone was using the hardware from Houston. I believe this was the M9N GPS and the Pixhawk's internal IMU and altimeter.

  • The yellow dots are the detections of where the landing pad is in the world. They are within 3 meters.

We can create a script that reads the logs and generates a KML file that can be viewed in Google Earth.
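A minimal sketch of what that log-to-KML step could look like, assuming the detections have already been parsed from the logs into (name, latitude, longitude) tuples; our real log format and field names may differ.

```python
def detections_to_kml(detections):
    """Return a KML string with one Placemark per (name, lat, lon) tuple.

    Note: KML's <coordinates> element uses longitude,latitude order,
    the reverse of the usual lat/lon convention.
    """
    placemarks = "".join(
        f"<Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lat, lon in detections
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2">'
        f"<Document>{placemarks}</Document></kml>"
    )
```

Writing the returned string to a `.kml` file makes it directly openable in Google Earth.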

What will the MVP look like?

The RPi will power on when the drone is turned on, and the airside system will run on startup. The pilot will manually fly the drone over the area; the airside system will detect the hotspots and process them to find their real-world coordinates. It will save the information to the log files, and after the flight we can run a script to generate the KML file.
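One hedged sketch of how "run on startup" could be wired up, assuming the RPi runs systemd; the unit name, user, and path to the airside entry point are hypothetical placeholders, not our actual setup.

```ini
# /etc/systemd/system/airside.service  (hypothetical name and paths)
[Unit]
Description=WARG airside system (starts on boot)
After=network.target

[Service]
# Placeholder entry point; replace with the real airside launch command.
ExecStart=/usr/bin/python3 /home/pi/airside/main.py
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Enabling it once with `sudo systemctl enable airside.service` makes it start on every boot; `Restart=on-failure` gives us a crash-restart for free.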

What we plan to do to achieve the MVP:

  • Cluster Estimation

    • Total number of points needed for initial run

    • Total number of points needed for runs after initial run

    • Open up to discussion on what we can do to improve this method for clustering

  • Detect IR beacon

    • I (Mihir Gupta) am a strong advocate for using Classical CV for detecting IR beacons

    • Faster for us to get a working solution, so we can flight test sooner
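To anchor the cluster-estimation discussion above (minimum points before the initial run, points needed for later runs), here is a naive sketch in pure Python. The thresholds and the greedy centroid grouping are placeholders for discussion, not our actual method.

```python
# Assumed, illustrative thresholds -- open for discussion:
MIN_POINTS_INITIAL = 5   # detections required before the first run
CLUSTER_RADIUS_M = 3.0   # detections within 3 m are grouped together

def estimate_clusters(points, min_points=MIN_POINTS_INITIAL,
                      radius=CLUSTER_RADIUS_M):
    """Greedily group (x, y) detections and return cluster centroids."""
    if len(points) < min_points:
        return []  # not enough detections yet; wait for more
    clusters = []  # each entry is a list of member points
    for x, y in points:
        for members in clusters:
            # Compare against the running centroid of each cluster.
            mx = sum(p[0] for p in members) / len(members)
            my = sum(p[1] for p in members) / len(members)
            if (x - mx) ** 2 + (y - my) ** 2 <= radius ** 2:
                members.append((x, y))
                break
        else:
            clusters.append([(x, y)])
    return [
        (sum(p[0] for p in m) / len(m), sum(p[1] for p in m) / len(m))
        for m in clusters
    ]
```

One obvious improvement to discuss is replacing the greedy pass with a standard density-based method, and tuning the minimum-point counts separately for the initial run versus subsequent runs.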
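To illustrate why Classical CV is attractive for the IR beacon, here is a toy detector: with an IR filter on the camera, the beacon should dominate the frame as a bright blob, so simple thresholding plus a centroid is enough. The threshold value and the plain-list "image" are illustrative; real code would operate on OpenCV/NumPy frames.

```python
def detect_beacon(frame, threshold=200):
    """Return the (row, col) centroid of bright pixels, or None.

    frame is a 2D list of grayscale values (0-255). The threshold
    of 200 is an assumed starting point, to be tuned on real footage.
    """
    bright = [
        (r, c)
        for r, row in enumerate(frame)
        for c, value in enumerate(row)
        if value >= threshold
    ]
    if not bright:
        return None  # no beacon visible in this frame
    rows = sum(r for r, _ in bright) / len(bright)
    cols = sum(c for _, c in bright) / len(bright)
    return rows, cols
```

No training data, no model: that is the whole argument for Classical CV here, and it is why this path gets us to a flight test sooner.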

Open to hearing ideas for our system. From the conversations before this meeting there were a lot of ideas, which is good, but I wanted to present what we have done so far so that ideally we create ideas that branch off of what we've done rather than starting from scratch.

Other

Note from Ashish Agrahari: will the IR emitter and IR camera filter be delivered in time for use in the flight test? If they are not delivered, or IR does not seem reasonable for this flight test, we should maybe go with balloon detection using OpenCV.

It would also be good to go over the Autonomy roadmap (2024-09-21 Autonomy Roadmap) and get thoughts on our implementation for Tasks 1 and 2.
