Week 7 Update

Mechanical

  • Mechanical fabrication is complete; testing will begin this week
  • Pulleys and belts have been attached to the robot
  • The drive wheel has been attached along with a pulley
  • A universal motor mount has been created to allow various motors to be tested.
  • The stepper motor has been mounted with pulleys and a belt to drive the loading mechanism
Firing wheel

Wheel collar for mounting counter weight to shaft

Universal motor mount

Pulleys and belt for driving the firing motor

Stepper motor and belt

Stepper motor and lead screw loading mechanism

Electrical

  • A motor driver for the brushless DC motor has been donated by Advanced Motion Control (AMC); see the most recent update on this.
  • A USB-to-DB9 cable has been purchased, allowing the uC to interface with MATLAB.
  • The ATMega128 recognizes serial input from MATLAB, but clock timing issues are still preventing the microcontroller from reading the correct values: each bit of output from MATLAB is read as two bits wide and then shifted one bit to the left. This effectively reduces us from 8 bits per character to 3 and makes decoding difficult and unreliable. It could serve as a worst-case fallback, but it needs to be fixed (a MATLAB-side sketch of the link follows this list).
  • Two rotary encoders may be donated by US Digital.
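
The bit-doubling symptom above is consistent with a baud-rate mismatch caused by the clock problem. For reference, a minimal sketch of the MATLAB side of the link using the serial-object API; the COM port name and baud rate are placeholders and must match whatever the ATMega128 firmware is actually configured for:

    s = serial('COM3', 'BaudRate', 9600, ...          % placeholder port and baud
               'DataBits', 8, 'Parity', 'none', 'StopBits', 1);
    fopen(s);
    fwrite(s, uint8(65));   % send one command byte (the character 'A')
    fclose(s);
    delete(s);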

Software

Image Processing

  • The image setup routine has been streamlined and automated as much as possible.
  • Control points for the transforms are found in a user-specified region using the intersections of detected lines
  • The image is then cropped using detected vertical lines
  • Automating this allows for more robust image detection: if the camera is moved slightly, the software can still detect the correct control points for the image transforms and cropping.

Algorithm Overview

Lines are detected from a grayscale image using a Hough transform function. The function outputs coordinates of the endpoints of detected line segments.

Hough lines
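
As a rough sketch of this step (the input filename, the edge detector choice, and the peak/gap/length thresholds below are illustrative placeholders, not our tuned values), the Image Processing Toolbox Hough functions can be chained like this:

    gray  = rgb2gray(imread('gameboard.png'));      % placeholder camera frame
    edges = edge(gray, 'canny');                    % binary edge map
    [H, theta, rho] = hough(edges);                 % Hough accumulator
    peaks = houghpeaks(H, 20);                      % 20 strongest line candidates
    lines = houghlines(edges, theta, rho, peaks, ...
                       'FillGap', 10, 'MinLength', 40);
    % Each detected segment k exposes its endpoints as
    % lines(k).point1 and lines(k).point2 ([x y] in pixels).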

The user is asked to specify a region of interest (ROI) using the mouse. Two ROIs are specified, one for the left and one for the right side of the gameboard.

Region of interest
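
For reference, one simple way to grab the two regions with the mouse (getrect is just one option, and the variable names are ours); each box comes back as [xmin ymin width height]:

    imshow(gray);
    roiLeft  = getrect;    % user drags a box over the left side of the gameboard
    roiRight = getrect;    % and a second box over the right side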

The intersections of the detected lines are found using two successive cross products of the line endpoints, treated as homogeneous coordinates. The first cross product of a segment's endpoints gives the equation of the line through those points. A second cross product, taken between a vertical line and a horizontal line, gives their point of intersection (red points). The maximum and minimum intersections within each ROI are then found (green points), and the green points are used as the control points for the projective transform.

Intersections
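
The cross-product trick works because appending a 1 to each [x y] point puts it in homogeneous coordinates: the cross product of two points is the line through them, and the cross product of two lines is their intersection. A sketch with hypothetical endpoint variables p1, p2, q1, q2:

    % p1,p2: endpoints of a vertical segment; q1,q2: endpoints of a horizontal
    % segment (each a 1x2 [x y] pixel coordinate from the houghlines output).
    lineV = cross([p1 1], [p2 1]);   % line through p1 and p2 (homogeneous form)
    lineH = cross([q1 1], [q2 1]);   % line through q1 and q2
    pt    = cross(lineV, lineH);     % homogeneous intersection of the two lines
    pt    = pt(1:2) / pt(3);         % back to pixels (pt(3) is nonzero unless
                                     % the two lines are parallel)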

The projective transform is performed with the calculated control points. An affine transform is then performed using the control points at the top left, the top right, and the midpoint of the bottom edge.

Projective transform
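
A rough sketch of the two-stage warp using the toolbox control-point interface (cp2tform/imtransform); srcPts/dstPts and srcAff/dstAff are placeholder point matrices, not our actual control points:

    % Projective stage: the four green control points (4x2, one [x y] per row)
    % and where they should land in the rectified board image.
    tformProj = cp2tform(srcPts, dstPts, 'projective');
    warped    = imtransform(gray, tformProj);

    % Affine stage: three correspondences -- top left, top right, and the
    % midpoint of the bottom edge (3x2 each).
    tformAff  = cp2tform(srcAff, dstAff, 'affine');
    rectified = imtransform(warped, tformAff);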

The affine transform is performed and the image is ready for cropping.

Affine transform
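
The crop itself is then a one-liner; here xLeft and xRight stand in for the column positions of the detected left and right vertical lines:

    board = imcrop(rectified, [xLeft, 1, xRight - xLeft, size(rectified, 1)]);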

Current Challenges and Issues

  • The encoder from our HP printer didn't have a dedicated circuit, so we needed to source (purchase) a different one
  • The microcontroller has timing issues that affect the accuracy of serial communication
  • The ATMega128 was broken while modifying the fuse bits to correct the clock issue
  • The brushless DC motor driver has been donated, but we are unsure of the delivery date
  • The bearings ordered for the firing disk (needle) were not the proper type (radial); however, collars were made to allow them to work

Duties For The Following Week

Sean: Clean up and finish image processing
Curtis: Work on bearing issues and pulleys. Mount wheels for robot movement. Make guides for pucks. Refine any mechanical issues.
Eileen: Make Hall effect sensors to use as encoders for the firing and gantry motors. Get the contact switches working.
Cyrus: Integrate the new BLDC driver into the system. Sort out UART timing issues. Get microcontroller working again.