Cooperative-Robot-Nao

Project by Dinghuang Zhang

Please go to the branch Robot to see all details.

Hardware settings

Place the robot and the box so that the distance between them is 8 cm and the box is about as high as the robot's waist.

Software settings

  1. Power on the robot and press the chest button, then wait about 3 minutes until the robot says something.
  2. Open the folder named NaoMK7; this is the final version of the code, which includes the files listed below.
  3. If more keywords need to be added to the project, find the keyword library in Speech.py (see the sketch after the file list below).

- App.py: Starts the program and sets the target colour. If the surrounding light changes, adjust the colour threshold here.
- Main.py: The picking process, combining vision recognition.
- Detection.py: Vision recognition for the target hand and its centre point.
- Speech.py: Speech recognition.
- MoveArm.py: Location check; moves the arm according to Arm.py.
- Hand.py: Chooses a hand to pick and hold.
- Arm.py: Inverse kinematic values.
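
The sketches below are illustrative only and are not taken from this repository. The first shows how extra keywords could be registered with the NAO speech recognizer through the NAOqi Python SDK (ALSpeechRecognition); the robot IP, port, and keyword list are placeholder assumptions, and in this project the real keyword list lives in Speech.py.

```python
# Minimal sketch, assuming the NAOqi Python SDK is installed and the robot is
# reachable on the network.  ROBOT_IP and the vocabulary are placeholders.
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"   # replace with the robot's actual IP address
ROBOT_PORT = 9559           # default NAOqi port

asr = ALProxy("ALSpeechRecognition", ROBOT_IP, ROBOT_PORT)
asr.setLanguage("English")

# Register the keywords the robot should listen for (word spotting disabled).
asr.setVocabulary(["red", "green", "blue", "pick", "stop"], False)

# Start recognition; results appear under the "WordRecognized" ALMemory key.
asr.subscribe("CooperativeRobotNao")
```

The second sketch illustrates the kind of HSV colour threshold that App.py and Detection.py would rely on, and which needs re-tuning when the surrounding light changes. OpenCV 4 and the threshold values are assumptions for illustration, not the project's actual settings.

```python
# Minimal sketch, assuming OpenCV 4 and NumPy; the HSV bounds below are
# illustrative values for a red target, not the project's tuned thresholds.
import cv2
import numpy as np

def find_target_centre(bgr_image, lower_hsv=(0, 120, 70), upper_hsv=(10, 255, 255)):
    """Return the (x, y) centre of the largest blob inside the HSV range, or None."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```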

All of the code is also available on Google Drive at https://drive.google.com/open?id=1B2eXUXg5A04eGfd2kP-wmQQk89akots0