
ADEEPT AWR Wheeled Robot Manual

4WD smart robot kit compatible with Raspberry Pi 4 / 3 Model B+ / B, with OpenCV target tracking
Summary of Contents for the Adeept AWR Wheeled Robot

  • Page 2 Resources Links  RobotName: Adeept_AWR RobotURL: https://github.com/adeept/Adeept_AWR RobotGit: https://github.com/adeept/Adeept_AWR.git [Official Raspberry Pi website] https://www.raspberrypi.org/downloads/ [Official website] https://www.adeept.com/ [GitHub] https://github.com/adeept/Adeept_AWR [Image file and Documentation for structure assembly] https://www.adeept.com/learn/detail-35.html...
  • Page 3 Components List Acrylic Plates The acrylic plates are fragile, so handle them carefully during assembly to avoid breaking them. Each acrylic plate is covered with a layer of protective film; remove it first. Some holes in the acrylic may contain residue, so clean them out before use.
  • Page 4 Machinery Parts...
  • Page 5 Electronic Parts Motor x4, Raspberry Pi Camera x1, Servo x1, Wheel x4, Robot HAT x1, 18650 Battery Holder Set x1, Car Light x2, 3-Pin Wire x2...
  • Page 6 Adeept Ultrasonic Module x1, 3-Channel Tracking Module x1, 4-Pin Wire x1, 5-Pin Wire x1, 3-Pin Wire x2, Raspberry Pi Camera Ribbon x1...
  • Page 7 Tools Hex Wrench 2.0 mm x1, Cross Screwdriver x1, Cross Socket Wrench x1, Large Cross-head Screwdriver x1, Winding Pipe x1, Ribbon x1. Self-prepared Parts Requirements for the 18650 lithium battery: 18650 lithium batteries are required for normal operation of the robot, and their current output should be above 4 A.
  • Page 8: Table Of Contents

    Content  1. Premise....................................1 1.1 STEAM and Raspberry Pi..............................1 1.2 About The Documentation............................. 1  2. Raspberry Pi System Installation and Development Environment Establishment............3 2.1 Install An Operating System for The Raspberry Pi......................3 2.1.1 Method A: Write 'Raspbian' to The SD Card by Raspberry Pi Imager..............3 2.1.2 Method B: Download The Image File Raspbian and Write It to The SD Card Manually........
  • Page 9 14.1 Multi-threading Introduction............................. 75 14.2 Realization of Police Lights / Breathing Lights......................75 14.3 Warning Lights or Breathing Lights in Other Projects....................80  15 Real-Time Video Transmission............................81  16 Automatic Obstacle Avoidance............................85  17 Why OpenCV Uses Multi-threading to Process Video Frames..................88 17.1 Single Thread Processing of Video Frames.........................88 17.2 Multi-thread Processing of Video Frames........................89 ...
  • Page 10: Premise

    1. Premise   1.1 STEAM and Raspberry Pi STEAM stands for Science, Technology, Engineering, Arts, and Mathematics. It is a transdisciplinary, practice-oriented approach to education. As a board designed for computer-programming education, the Raspberry Pi has many advantages over other robot development boards.
  • Page 12: Raspberry Pi System Installation And Development Environment Establishment

    2. Raspberry Pi System Installation and  Development Environment Establishment 2.1 Install An Operating System for The Raspberry Pi 2.1.1 Method A: Write 'Raspbian' to The SD Card by Raspberry Pi Imager Raspberry Pi Imager is an SD card image-writing tool developed by the Raspberry Pi organization. Versions are available for different operating systems and it is quite easy to use;...
  • Page 13 ●Insert the SD card into the card reader and connect the card reader to your computer. ●Run the Raspberry Pi Imager, select CHOOSE OS -> Raspbian(other) -> Raspbian Full - A port of Debian with desktop and recommended applications. ●Click on CHOOSE SD CARD to select the SD card to write Raspbian Full to; please note that writing the image will erase any existing files on the SD card.
  • Page 14: Method B: Download The Image File Raspbian And Write It To The Sd Card Manually

    ●Do not remove the SD card after writing is completed; we will use it to configure SSH and the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi, and boot, the peripheral-free WiFi configuration may fail in the following steps. 2.1.2 Method B: Download The Image File Raspbian and Write It to The SD Card Manually ●Downloading the image file with Raspberry Pi Imager as in 2.1.1 can take a long time on a slow network in some regions.
  • Page 15 4. Download the image file `Raspbian`  - Torrent file:  [Raspbian - Raspbian Buster with desktop and recommended software]  - Zip file: [Raspbian - Raspbian Buster with desktop and recommended software]  5. Unzip the file; note that the path to the extracted `.img` file should contain only English characters, with no special ...
  • Page 16 ●On the Raspberry Pi website [Official Raspberry Pi website], go to Downloads -> Raspbian -> Raspbian Buster with desktop and recommended software, and click the torrent or zip file to download it. Unzip the file after downloading; note that the path to the extracted .img file should contain only English characters, with no special characters allowed;...
  • Page 17: Method C: Manually Download The Image File Provided By Us And Write It To The Sd Card (Not Recommended)

     ●Do not remove the SD card after writing is completed; we will use it to configure SSH and the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi, and boot it up, the peripheral-free WiFi configuration may fail in the following steps. 2.1.3 Method C: Manually Download The Image File Provided by Us and Write It to The SD Card (Not Recommended) ●...
  • Page 18 3. Install the `Raspberry Pi Imager` 4. Download the image file `Adeept_AWR` [Image file for the Adeept_AWR Robot] 5. Unzip the file; note that the path to the extracted `.img` file should contain only English characters, with no special characters allowed. 6.
  • Page 19: Enable Ssh Server Of Raspberry Pi

    will automatically erase any existing files on the SD card. ●Click on WRITE and wait for the writing to finish. ●Do not remove the SD card after writing is completed; we will use it to configure the WiFi connection later. Otherwise, if you remove the card, insert it into the Raspberry Pi, and boot it up, the peripheral-free WiFi configuration may fail in the following steps.
  • Page 20: Method A: Enable Ssh With Peripherals

    2.2.1 Method A: Enable SSH with Peripherals ●If you used 2.1.3 (manually downloading the image file we provide and writing it to the SD card) to write the Raspberry Pi operating system to the SD card, you do not need to follow this section to enable SSH, because the SSH service is already enabled in that image.
  • Page 21: Configure Wifi On Raspberry Pi

    ●If you haven't connected any monitor to the Raspberry Pi, follow these steps to enable SSH. 1. Do not remove the SD card after `Raspberry Pi Imager` writes the image file. 2. Create a file named `ssh` (in any directory), with no file extension. You may create an `ssh.txt` and delete the `.txt` (make sure that under Folder Options the box Hide extensions for known file types is unchecked).
  • Page 22 4. Fill in your own information for `Insert country code here`, `Name of your WiFi`, and `Password for your WiFi`. Pay attention to capitalization. Refer to the example below: ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev update_config=1 country=US network={ ssid="MyName" psk="12345678" } 5. Save and exit. Copy the `wpa_supplicant.conf` to the root directory of the SD card. 6.
  • Page 23: Log In To The Raspberry Pi And Install The App

    Power supply. ●The Motor HAT board of the Adeept Raspberry Pi robot can power the Raspberry Pi via the GPIO port. However, since installing software on the Raspberry Pi may take a long time, it is not recommended to power it from the batteries during this process. You may skip installing the Motor HAT board or camera during software installation;...
  • Page 24: Log Into Raspberry Pi (Linux Or Mac Os)

    of the Raspberry Pi, `raspberry` (pay attention to capitalization). Nothing appears on the screen while you type the password, but it is still being entered; press 'enter' when you finish typing. ●You have now logged into the Raspberry Pi. 3.2 Log into Raspberry Pi (Linux or Mac OS) ●Before connecting to the Raspberry Pi via SSH, you need to know the IP address of the Raspberry Pi; an example login command is shown below.
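For reference, a login from a Linux or macOS terminal looks like the following; the IP address here is only a placeholder for the one you found for your own Raspberry Pi:
ssh pi@192.168.3.157
# default user name: pi, default password: raspberry (nothing is echoed while you type it)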
  • Page 25: Download Program Of The Raspberry Pi Robot

    ●In the previous section you logged into the Raspberry Pi; now type the following command in the terminal window: sudo git clone https://github.com/adeept/Adeept_AWR.git ●Press 'enter' to start downloading the robot program from GitHub. It may take some time, please...
  • Page 26: Install Corresponding Dependent Libraries

    wait until it's done. 3.5 Install Corresponding Dependent Libraries ●Follow the steps below to install the libraries if you wrote the image file to the SD card following 2.1.1 (Write 'Raspbian' to the SD card with `Raspberry Pi Imager`) or 2.1.2 (Download the image file `Raspbian` and write it to the SD card manually); a typical installation command is sketched below.
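If you are using the Adeept_AWR repository downloaded in 3.4, the dependencies are typically installed by running its setup script. The exact script name and path below are assumptions; check the repository for the actual file:
sudo python3 Adeept_AWR/setup.py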
  • Page 27: Assembly And Precautions

    ●If it fails to open the page, log into the Raspberry Pi via SSH and type the command below to end the program that runs automatically at boot, so as to release resources; otherwise issues such as camera initialization failure or occupied ports may occur. sudo killall python3 ●Type the command below to run `webServer.py`: sudo python3 Adeept_AWR/server/webServer.py...
  • Page 28 If your servo does not return to the original position automatically, you can manually run the server.py file and then try to connect the servo. ●Preparations before Assembly Connect the Adeept Ultrasonic Module with 4-Pin wire.
  • Page 29 Please note that the end marked with a white strip is the signal input, and the end without the white strip is the signal output. Connect the WS2812 to the Adeept Robot HAT; the end marked with the white strip is the...
  • Page 30 Connect the Raspberry Pi Camera and the ribbon. Note: in the following steps, the Pi Camera should remain connected to the Raspberry Pi, and do not reverse the ribbon cable. Assemble the following components Raspberry Pi Camera x1 Long Raspberry Pi Camera Ribbon x1...
  • Page 31 ● Fix four M2.5x10+6 Copper Standoffs on Raspberry Pi. Effect diagram after assembling...
  • Page 32 1. Connect the 18650 Battery Holder Set to the Adeept Motor HAT.
  • Page 33 2. Put two 18650 batteries in 18650 Battery Holder Set according to the following method.
  • Page 34 3. Connect servos to Adeept Motor HAT.
  • Page 35 4. Before switching on, you need to insert the configured SD card into the Raspberry Pi. For details, please refer to the third chapter of this document. Otherwise, the servo will not rotate to the middle position after booting; if the SD card is not inserted, the servo needs to be rotated to the middle position manually.
  • Page 36 ●Body parts. 1. Fix Raspberry Pi Camera on Acrylic Plates Assemble the following components Effect diagram after assembling...
  • Page 37 2. Fix a rocker arm to the acrylic plate Assemble the following components Effect diagram after assembling...
  • Page 38 3. Assemble the camera. Assemble the following components Effect diagram after assembling...
  • Page 39 Fix a debugged servo to the acrylic plate. Assemble the following components Effect diagram after assembling...
  • Page 40 5. Fix the rocker arm on acrylic to the servo on acrylic. Assemble the following components Effect diagram after assembling...
  • Page 41 Assemble the following components Effect diagram after assembling...
  • Page 42 6. Fix motors to the acrylic Assemble the following components Effect diagram after assembling...
  • Page 43 7. Fix one section of the 18650 Battery Holder Set to acrylic Assemble the following components...
  • Page 44 Assemble the following components Effect diagram after assembling...
  • Page 45 Assemble the following components Effect diagram after assembling...
  • Page 46 8. Fix the 3-channel Tracking Module on the acrylic plate Assemble the following components Effect diagram after assembling...
  • Page 47 9. Fix the two car lights to the acrylic plate Assemble the following components Effect diagram after assembling...
  • Page 48 10. Fix Adeept Ultrasonic Module on the acrylic plate. Assemble the following components Effect diagram after assembling...
  • Page 50 11. Connect the Adeept Ultrasonic Module, Car Light, 3 Tracking Module Set and motor as shown below before assembling the body part. Assemble the following components...
  • Page 51 12. Assemble the body part. Assemble the following components Effect diagram after assembling...
  • Page 53 13. Connect the front part and the body part. Assemble the following components Effect diagram after assembling...
  • Page 55 14. Strengthen the body part. Assemble the following components Effect diagram after assembling...
  • Page 56 Assemble the following components Effect diagram after assembling...
  • Page 57 Assemble the following components Effect diagram after assembling...
  • Page 58 15. Install the wheels on the car. Assemble the following components Effect diagram after assembling...
  • Page 59: Tips For Structural Assemblage

    4.2 Tips for Structural Assemblage ●Since many servos are used in the product, the servo installation is critical for the robot. Before installing the rocker arm to the servo, you need to connect the servo to power and make the servo shaft rotate to the central position, so the rocker arm installed at the designated degree will be in the central position.
  • Page 60 ●You can also use a power lithium battery to power the Motor HAT; the Motor HAT supports a supply voltage below 15 V. ●You can use a USB cable to power the Motor HAT when installing the servo's rocker arm during structural assembly.
  • Page 61: Controlling Robot Via Web App

    5 Controlling Robot via WEB App  ●The WEB app is developed for common users to control the robot in an easier way. It's convenient to use WEB app; you may use it to wirelessly control the robot on any device with a web browser (Google Chrome was used for testing).
  • Page 62 ·`MOTION GET`: Motion detection function based on OpenCV. When objects move in the view of the camera, the program will circle the part in the `Video` window, and the LED light on the robot will show respective changes. ·`AUTO MATIC`: Obstacle avoidance function based on ultrasonic. When the ultrasonic module on the robot detects an obstacle, it will automatically turn left, and take a step backward before turning if it's too close to the obstacle.
  • Page 63: Common Problems And Solutions(Q&A)

    6 Common Problems and Solutions(Q&A)  ●Where to find the IP address of the Raspberry Pi? Before connecting the Raspberry Pi via SSH, you need to know the IP address of the Raspberry Pi. Check the Management interface for your router, or download the app `Network Scanner` -> search for a device named `RASPBERRY` or `Raspberry Pi Foundation` to get the IP address.
  • Page 64 ●The servo doesn't return to the central position when connected to the driver board. In general, the Raspberry Pi will auto run `webServer.py` when booting, and `webServer.py` will run and control the servo ports to send a signal of rotating to the central position. When assembling the servo, you can connect it to any servo port anytime.
  • Page 65 ●Motor movement direction is incorrect. Because motors come from different batches, the direction of rotation for the same signal may differ. We provide a setting in the program to adjust the motor's direction of rotation: you need to open move.py, as sketched below.
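The exact variable names differ between program versions, so the following is only a hypothetical sketch of the kind of setting to look for near the top of move.py: one direction flag per motor that you invert if that wheel spins the wrong way.
# hypothetical example: direction flags near the top of move.py
Motor_A_Direction = 1    # change to -1 if motor A turns the wrong way
Motor_B_Direction = 1    # change to -1 if motor B turns the wrong way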
  • Page 66: Set The Program To Start Automatically

    7 Set The Program to Start Automatically  7.1 Set The Specified Program to Run Automatically at Boot ●This section only introduces the auto-run method used by our products. If you need more information about auto-running programs on the Raspberry Pi, you can refer to the Auto-Run document from itechfy.
  • Page 67 ●For example, if we want to replace webServer.py with server.py, we only need to edit the following: Replace sudo python3 [RobotName]/server/webServer.py with sudo python3 [RobotName]/server/server.py ●Save and exit so that the robot will automatically run server.py instead of webServer.py the next time the robot is turned on.
  • Page 68: Remote Operation Of Raspberry Pi Via Mobaxterm

    8 Remote Operation of Raspberry Pi Via   MobaXterm ●To make daily use of the Raspberry Pi more convenient, we usually do not connect peripherals such as a mouse, keyboard, or monitor to it. Since our Raspberry Pi is installed inside the robot, controlling it with peripherals would seriously reduce the efficiency of programming and testing.
  • Page 69 When you connect to the Raspberry Pi again, you will need to enter the username and password if they were not saved; if the IP address of the Raspberry Pi has changed, you need to start a new session. ●After a successful login, the left column is replaced with a file-transfer panel, which lets you transfer files to and from the Raspberry Pi.
  • Page 70: How To Control Ws2812 Rgb Led

    9 How to Control WS2812 RGB LED   ●The WS2812 LED is a commonly used module on our robot products; there are three WS2812 LEDs on each module. Pay attention when connecting: the signal line is directional, so the signal led out of the Raspberry Pi must go into the WS2812's signal input.
  • Page 71 class LED:
    def __init__(self):
        self.LED_COUNT = ...        # Set to the total number of LED lights on the robot product
        self.LED_PIN = ...          # Set to the input pin number of the LED group
        self.LED_FREQ_HZ = 800000
        self.LED_DMA = ...
        self.LED_BRIGHTNESS = ...
        self.LED_INVERT = False
        self.LED_CHANNEL = ...
  • Page 72 ●Instantiate the object and call its method. The function colorWipe() takes three parameters, R, G, and B, corresponding to the brightness of the red, green, and blue primary colors. Each value ranges from 0 to 255; the larger the value, the brighter that color channel. A minimal self-contained sketch is shown below.
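As a minimal, self-contained sketch of the same idea using the rpi_ws281x library (the pin number, LED count, and brightness below are assumptions; use the values that match your board, and run the script with sudo):

from rpi_ws281x import Adafruit_NeoPixel, Color

# Assumed configuration values; adjust to match your robot
LED_COUNT = 16        # number of WS2812 LEDs
LED_PIN = 12          # GPIO pin driving the LED data line (must support PWM)
LED_FREQ_HZ = 800000  # LED signal frequency
LED_DMA = 10          # DMA channel used to generate the signal
LED_BRIGHTNESS = 255  # 0 (darkest) to 255 (brightest)
LED_INVERT = False
LED_CHANNEL = 0

strip = Adafruit_NeoPixel(LED_COUNT, LED_PIN, LED_FREQ_HZ, LED_DMA,
                          LED_INVERT, LED_BRIGHTNESS, LED_CHANNEL)
strip.begin()

def colorWipe(R, G, B):
    # Set every LED to the same color; R, G, B are each 0-255
    color = Color(R, G, B)
    for i in range(strip.numPixels()):
        strip.setPixelColor(i, color)
    strip.show()

colorWipe(255, 0, 0)  # all LEDs red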
  • Page 73: How To Control The Servo

    10 How to Control The Servo   10.1 Control The Steering Gear to Rotate to A Certain Angle ●Since a servo uses a PWM signal to control the rotation angle of a mechanism, it is one of the most commonly used modules on robot products.
  • Page 74: Control The Slow Motion Of The Steering Gear

    ●pwm.set_pwm(3, 0, 300) This method rotates a servo to a certain position; here 3 is the servo port number, which corresponds to the number printed on the Motor HAT driver board. When connecting the servo to the driver board, be careful not to reverse the ground, VCC, and signal wires: brown to black, red to red, yellow to yellow;... A minimal sketch of this call is shown below.
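A minimal sketch of this call using the Adafruit_PCA9685 library (the channel number and pulse value are taken from the example above; 50 Hz is the usual servo refresh rate):

import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()   # PCA9685 PWM driver on the Motor HAT
pwm.set_pwm_freq(50)               # standard 50 Hz servo signal

# Channel 3 on the driver board; 300 is the pulse length in ticks (out of 4096)
pwm.set_pwm(3, 0, 300)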
  • Page 75: Non-Blocking Control

    10.3 Non-blocking Control ●You can find the RPIservo.py file in the server folder of the robot program. Copy it into the same folder as the program you want to run, and then you can use this method in your program; a generic sketch of the idea follows. import RPIservo # Import a library that uses multiple threads to control the servo...
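The exact interface of RPIservo is not shown in this excerpt, so the following is only a generic sketch of the non-blocking idea: move the servo step by step in a background thread so the main program is never blocked (the channel, step size, delay, and pulse values are arbitrary examples, not the RPIservo API):

import threading
import time
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()
pwm.set_pwm_freq(50)

def move_slowly(channel, start, end, step=5, delay=0.05):
    # Sweep the servo from `start` to `end` pulse length in small steps
    direction = step if end > start else -step
    for value in range(start, end, direction):
        pwm.set_pwm(channel, 0, value)
        time.sleep(delay)
    pwm.set_pwm(channel, 0, end)

# Run the sweep in a background thread so the main thread is not blocked
t = threading.Thread(target=move_slowly, args=(3, 300, 400))
t.start()
print('The main program keeps running while the servo moves.')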
  • Page 76: How To Control Dc Motor

    11 How to Control DC Motor   ●If the Raspbian image version you installed is Raspbian Full provided by the official website, you do not need to install other dependent libraries. We only need to control the GPIO port of the Raspberry Pi for simple high and low levels and PWM to control the L298N chip on Motor HAT, thus controlling the direction and speed of the motor.
  • Page 77 def setup():   # GPIO initialization; the motors cannot be controlled without it
    global pwm_A, pwm_B
    GPIO.setwarnings(False)
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(Motor_A_EN, GPIO.OUT)
    GPIO.setup(Motor_B_EN, GPIO.OUT)
    GPIO.setup(Motor_A_Pin1, GPIO.OUT)
    GPIO.setup(Motor_A_Pin2, GPIO.OUT)
    GPIO.setup(Motor_B_Pin1, GPIO.OUT)
    GPIO.setup(Motor_B_Pin2, GPIO.OUT)
    motorStop()   # Avoid the motors starting to rotate automatically after initialization
    try:          # try is used here to avoid errors due to repeated PWM settings
        pwm_A = GPIO.PWM(Motor_A_EN, 1000)
        pwm_B = GPIO.PWM(Motor_B_EN, 1000)
    except:...
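The page is cut off before the drive functions themselves; below is a hedged sketch of what motor_A() and motorStop() typically look like with the pins and PWM objects set up above (the duty-cycle handling is an assumption and may differ from the shipped move.py):

def motor_A(direction, speed):
    # direction: 1 forward, -1 backward; speed: 0-100 (PWM duty cycle)
    if direction == 1:
        GPIO.output(Motor_A_Pin1, GPIO.HIGH)
        GPIO.output(Motor_A_Pin2, GPIO.LOW)
    else:
        GPIO.output(Motor_A_Pin1, GPIO.LOW)
        GPIO.output(Motor_A_Pin2, GPIO.HIGH)
    pwm_A.start(100)
    pwm_A.ChangeDutyCycle(speed)

def motorStop():
    # Pull all direction pins and both enable outputs low
    GPIO.output(Motor_A_Pin1, GPIO.LOW)
    GPIO.output(Motor_A_Pin2, GPIO.LOW)
    GPIO.output(Motor_B_Pin1, GPIO.LOW)
    GPIO.output(Motor_B_Pin2, GPIO.LOW)
    GPIO.output(Motor_A_EN, GPIO.LOW)
    GPIO.output(Motor_B_EN, GPIO.LOW)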
  • Page 78 # Control motors A and B to rotate at full speed for 3 seconds
motor_A(1, 100)
motor_B(1, 100)
time.sleep(3)
# Control motors A and B to rotate in the opposite direction at full speed for 3 seconds
motor_A(-1, 100)
motor_B(-1, 100)
time.sleep(3)
# Stop the motors on ports A and B
motorStop()
●The above code can be used to control the motor movement.
  • Page 79: Ultrasonic Module

    12 Ultrasonic Module   ●The camera used by our Raspberry Pi robot is monocular, so it cannot collect depth information. Therefore, many of our robot products use an ultrasonic ranging module to obtain depth information: it detects whether there is an obstacle in a certain direction and measures the distance to it. ●The principle of ultrasonic ranging is given by the following formula: distance = (elapsed time between emitting and receiving the sound wave) x (speed of sound, about 340 m/s) / 2. ●...
  • Page 80 GPIO.setmode(GPIO.BCM)
GPIO.setup(Tr, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(Ec, GPIO.IN)

def checkdist():
    GPIO.output(Tr, GPIO.HIGH)   # Set the trigger pin of the module high to emit the initial sound wave
    time.sleep(0.000015)
    GPIO.output(Tr, GPIO.LOW)
    while not GPIO.input(Ec):    # Wait while the module has not yet registered the outgoing sound wave
        pass
    t1 = time.time()             # Record the time when the initial sound wave is emitted...
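The excerpt stops partway through checkdist(); the remainder measures when the echo ends and converts the elapsed time into a distance, roughly as follows (these lines sit inside checkdist(), with the speed of sound taken as about 340 m/s):

    while GPIO.input(Ec):        # Wait until the reflected sound wave has been received
        pass
    t2 = time.time()             # Record the time when the reflected sound wave arrives
    return (t2 - t1) * 340 / 2   # Distance in metres: elapsed time x speed of sound / 2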
  • Page 81: Line Tracking

    13 Line Tracking   ●Some of our robot products are equipped with a three-channel infrared line-tracking module, which provides the robot's line-following function. The module contains three groups of sensors; each group consists of an infrared-emitting LED and an infrared phototransistor. The robot determines whether a line is detected by reading the infrared light intensity measured by each phototransistor.
  • Page 82 GPIO.setup(line_pin_middle, GPIO.IN)
GPIO.setup(line_pin_left, GPIO.IN)

def run():
    # Read the values of the three infrared phototransistors (0: no line detected, 1: line detected)
    # This routine takes a black line on a white background as an example
    status_right = GPIO.input(line_pin_right)
    status_middle = GPIO.input(line_pin_middle)
    status_left = GPIO.input(line_pin_left)
    # Detect whether the line-tracking module senses the line
    if status_middle == 1:
        # Control the robot to move forward...
  • Page 83 ●When your project needs the line-tracking function, you don't need to rewrite the above code. Just copy findline.py and move.py from the server folder of the robot program into the same folder as your own project, then use the following code (a usage sketch follows below): import findline findline.setup()
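A minimal usage sketch, assuming findline exposes setup() and run() as in the excerpt above, where run() reads the three sensors once and steers the robot accordingly:

import time
import findline

findline.setup()         # Initialize the tracking-module GPIO pins once
while True:
    findline.run()       # Read the three sensors and adjust the motors
    time.sleep(0.01)     # Short delay so the loop does not monopolize the CPU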
  • Page 84: Make A Police Light Or Breathing Light

    14 Make A Police Light or Breathing Light   14.1 Multi-threading Introduction ●This chapter introduces the use of multi-threading to achieve some effects with the WS2812 LEDs. Multi-threading is a commonly used technique in robot projects: because robots require real-time responsiveness, try not to block the main thread's communication while performing a task.
  • Page 85 Use the Threading module to create threads, inherit directly from threading.Thread, and then override the __init__ method and the run method class RobotLight(threading.Thread): __init__(self, *args, **kwargs): Here initialize some settings about LED lights self.LED_COUNT # Number of LED pixels. self.LED_PIN # GPIO pin connected to the pixels (18 uses PWM!).
  • Page 86 super(RobotLight, self).__init__(*args, **kwargs) self.__flag = threading.Event() self.__flag.clear() # Define functions which animate LEDs in various ways. setColor(self, R, G, B): Set the color of all lights color = Color(int(R),int(G),int(B)) range(self.strip.numPixels()): self.strip.setPixelColor(i, color) self.strip.show() setSomeColor(self, R, G, B, ID): Set the color of some lamps, the ID is the array of the serial number of this lamp color = Color(int(R),int(G),int(B)) #print(int(R),' ',int(G),' ',int(B)) self.strip.setPixelColor(i, color)
  • Page 87 Call this function to turn on the police light mode self.lightMode = 'police' self.resume() policeProcessing(self): The specific realization of the police light mode while self.lightMode == 'police': Blue flashes 3 times range(0,3): self.setSomeColor(0,0,255,[0,1,2,3,4,5,6,7,8,9,10,11]) time.sleep(0.05) self.setSomeColor(0,0,0,[0,1,2,3,4,5,6,7,8,9,10,11]) time.sleep(0.05) self.lightMode != 'police': break time.sleep(0.1) Red flashes 3 times range(0,3):...
  • Page 88 self.colorBreathB = B_input self.resume() breathProcessing(self): Specific realization method of breathing lamp while self.lightMode == 'breath': All lights gradually brighten range(0,self.breathSteps): self.lightMode != 'breath': break self.setColor(self.colorBreathR*i/self.breathSteps, self.colorBreathG*i/self.breathSteps, self.colorBreathB*i/self.breathSteps) time.sleep(0.03) All lights are getting darker range(0,self.breathSteps): self.lightMode != 'breath': break self.setColor(self.colorBreathR-(self.colorBreathR*i/self.breathSteps), self.colorBreathG-(self.colorBreathG*i/self.breathSteps), self.colorBreathB-(self.colorBreathB*i/self.breathSteps)) time.sleep(0.03) lightChange(self):...
  • Page 89: Warning Lights Or Breathing Lights In Other Projects

    run(self): Functions for multi-threaded tasks while self.__flag.wait() self.lightChange() pass __name__ == '__main__': RL=RobotLight() # Instantiate the object that controls the LED light RL.start() # Start thread Start breathing light mode and stop after 15 seconds RL.breath(70,70,255) time.sleep(15) RL.pause() Pause for 2 seconds time.sleep(2) Start the police light mode and stop after 15 seconds RL.police()
  • Page 90: Real-Time Video Transmission

    RL=robotLight.RobotLight() # Instantiate the object that controls the LED light RL.start() # Start thread Start breathing light mode and stop after 15 seconds RL.breath(70,70,255) time.sleep(15) RL.pause() Pause for 2 seconds time.sleep(2) Start the police light mode and stop after 15 seconds RL.police() time.sleep(15) RL.pause()
  • Page 91 ●This chapter does not introduce the OpenCV part yet; it only introduces how to see the real-time picture from the Raspberry Pi camera on other devices. ●First download the flask-video-streaming project onto the Raspberry Pi. You can clone it from GitHub, or download it on your computer and then copy it to the Raspberry Pi.
  • Page 92 ●Finally, uncomment the line that imports Camera from camera_pi: in `# from camera_pi import Camera`, delete the leading `#` (note that there is a space after the `#`; delete it as well). The changed code is as follows: from camera_pi import Camera ●The following is the complete code of the modified app.py:...
  • Page 93 mimetype='multipart/x-mixed-replace; boundary=frame') if __name__ == '__main__': app.run(host='0.0.0.0', threaded=True) ●After editing, press CTRL+X to exit the editor; when prompted whether to save the changes, press Y and then Enter to save them. ●Then you can run app.py: sudo python3 app.py ●Open a browser on a device on the same local area network as the Raspberry Pi (we used Google Chrome for testing), and enter the Raspberry Pi's IP address followed by the video-streaming port number 5000 in the address bar, as in the following example: 192.168.3.157:5000
  • Page 94: Automatic Obstacle Avoidance

    16 Automatic Obstacle Avoidance  ●The ultrasonic module of this product can only move up and down together with the camera; it cannot move left and right relative to the body, only together with the body. The obstacle-avoidance behaviour of this robot is therefore relatively simple: if there is an obstacle ahead, turn left; if the obstacle is too close, back up before turning; and if the obstacle is far away or there is no obstacle, move forward.
  • Page 95 # Initialize servo angle pwm = Adafruit_PCA9685.PCA9685() pwm.set_pwm_freq(50) pwm0_direction = pwm0_init = num_import_int('init_pwm0 = ') pwm0_max = pwm0_min = pwm0_pos = pwm0_init pwm1_direction = pwm1_init = num_import_int('init_pwm1 = ') pwm1_max = pwm1_min = pwm1_pos = pwm1_init pwm2_direction = pwm2_init = num_import_int('init_pwm2 = ') pwm2_max = pwm2_min = pwm2_pos = pwm2_init...
  • Page 96 # Change the scanned direction scanDir == 1: scanDir = elif scanDir == -1: scanDir = # Restore scanned location scanPos += scanDir*2 print(scanList) # If the distance of the nearest obstacle in front is less than the threshold min(scanList) < rangeKeep: # If the closest obstacle is on the left scanList.index(min(scanList)) == 0: # Then, turn right...
  • Page 97: Why Opencv Uses Multi-Threading To Process Video Frames

    17 Why OpenCV Uses Multi-threading to  Process Video Frames ●The OpenCV functionality is based on the GitHub project flask-video-streaming; we changed camera_opencv.py to perform the OpenCV-related operations. 17.1 Single Thread Processing of Video Frames ●First, we introduce single-threaded processing of video frames. Let's start with the simple case, so that you will understand why OpenCV uses multiple threads to process video frames.
  • Page 98: Multi-Thread Processing Of Video Frames

    make it abnormally stuck. 17.2 Multi-thread Processing of Video Frames ●Next, the process of multi-threaded processing of video frames is introduced. ●Process explanation: in order to improve the frame rate, we separate the analysis of each video frame from its acquisition and display, and place the analysis in a background thread that generates the drawing information.
  • Page 99 import threading import imutils class CVThread(threading.Thread): This class is used to process OpenCV analysis of video frames in the background. For more basic usage principles of the multi-threaded class, please refer to 14.2 __init__(self, *args, **kwargs): self.CVThreading = super(CVThread, self).__init__(*args, **kwargs) self.__flag = threading.Event() self.__flag.clear() mode(self, imgInput):...
  • Page 100 self.CVThreading = resume(self): Resume the thread self.__flag.set() run(self): Process video frames in the background thread while self.__flag.wait() self.CVThreading = self.doOpenCV(self.imgCV) class Camera(BaseCamera): video_source = 0 __init__(self): os.environ.get('OPENCV_CAMERA_SOURCE'): Camera.set_video_source(int(os.environ['OPENCV_CAMERA_SOURCE'])) super(Camera, self).__init__() @staticmethod set_video_source(source): Camera.video_source = source @staticmethod frames(): camera = cv2.VideoCapture(Camera.video_source) if not camera.isOpened(): raise...
  • Page 101 img = camera.read() cvt.CVThreading: If OpenCV is processing video frames, skip pass else: If OpenCV is not processing video frames, give the video frame processing thread a new video frame and resume the processing thread cvt.mode(img) cvt.resume() Draw elements on the screen img = cvt.elementDraw(img) # encode as a jpeg image and return it yield...
  • Page 102: Opencv Learn To Use Opencv

    18 OpenCV Learn to Use OpenCV  ●The real-time video transmission function comes from the open-source GitHub project flask-video-streaming, released under the MIT license. ●First, prepare two .py files in the same folder on the Raspberry Pi. The code is as follows: ·app.py #!/usr/bin/env python3 from...
  • Page 103 from thread import get_ident except ImportError: from _thread import get_ident class CameraEvent(object): """An Event-like class that signals all active clients when a new frame is available. """ __init__(self): self.events = {} wait(self): """Invoked from each client's thread to wait for the next frame.""" ident = get_ident() ident not in...
  • Page 104 """Invoked from each client's thread after a frame was processed.""" self.events[get_ident()][0].clear() class BaseCamera(object): thread = None # background thread that reads frames from camera frame = None # current frame is stored here by background thread last_access = # time of last client access to the camera event = CameraEvent() __init__(self): """Start the background camera thread if it isn't running yet."""...
  • Page 105 frame frames_iterator: BaseCamera.frame = frame BaseCamera.event.set() # send signal to clients time.sleep(0) # if there hasn't been any clients asking for frames in # the last 10 seconds then stop the thread time.time() - BaseCamera.last_access > 10: frames_iterator.close() print('Stopping camera thread due to inactivity.') break BaseCamera.thread = None...
  • Page 106: Using Opencv To Realize Color Recognition And Tracking

    19 Using OpenCV to Realize Color Recognition  and Tracking 19.1 Color Recognition and Color Space ●For the development preparation and operation of the OpenCV functions, please refer to chapter 18. ●Create camera_opencv.py in the same folder as the app.py and base_camera.py from chapter 18. The code related to the OpenCV color-tracking function introduced in this chapter is written in camera_opencv.py. ●For safety reasons, this routine does not control the motors or servo; it only outputs the OpenCV calculation results.
  • Page 107: Specific Code

    general process is as follows 19.3 Specific Code ●camera_opencv.py import os import cv2 from base_camera import BaseCamera import numpy as np...
  • Page 108 Set target color, HSV color space colorUpper = np.array([44, 255, 255]) colorLower = np.array([24, 100, 100]) font = cv2.FONT_HERSHEY_SIMPLEX class Camera(BaseCamera): video_source = 0 __init__(self): os.environ.get('OPENCV_CAMERA_SOURCE'): Camera.set_video_source(int(os.environ['OPENCV_CAMERA_SOURCE'])) super(Camera, self).__init__() @staticmethod set_video_source(source): Camera.video_source = source @staticmethod frames(): camera = cv2.VideoCapture(Camera.video_source) if not camera.isOpened(): raise RuntimeError('Could not start camera.')
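The excerpt skips the per-frame processing between opening the camera and the contour handling shown on the next page; a minimal sketch of that stage under the usual OpenCV approach (convert to HSV, build a mask from colorLower/colorUpper, clean it up, then extract contours; imutils is assumed to be imported at the top of the file, as in later chapters):

        while True:
            # Read a frame and build a mask of pixels inside the target HSV range
            _, img = camera.read()
            hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, colorLower, colorUpper)
            mask = cv2.erode(mask, None, iterations=2)
            mask = cv2.dilate(mask, None, iterations=2)
            # Find the contours of the masked regions; cnts is used on the next page
            cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)
            cnts = imutils.grab_contours(cnts)
            if len(cnts) > 0:
                # the next page continues here with c = max(cnts, key=cv2.contourArea)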
  • Page 109 c = max(cnts, key=cv2.contourArea) ((box_x, box_y), radius) = cv2.minEnclosingCircle(c) M = cv2.moments(c) center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"])) X = int(box_x) Y = int(box_y) Get the center point coordinates of the target color object and output print('Target color object detected') print('X:%d'%X) print('Y:%d'%Y) print('-------')
  • Page 110: Hsv Color Component Range In Opencv

    19.4 HSV Color Component Range in OpenCV The table on this page lists, for each color (black, grey, white, red, orange, yellow, green, cyan, blue, and purple), the minimum and maximum H, S, and V values used to segment that color in OpenCV's HSV color space. Red is the only color that spans two hue bands, with H_min 0|156 and H_max 10|180.
  • Page 111: Machine Line Tracking Based On Opencv

    20 Machine Line Tracking Based on OpenCV  20.1 Visual Line Inspection Process ●For the development preparation and operation of the OpenCV functions, please refer to chapter 18. ●Create camera_opencv.py in the same folder as the app.py and base_camera.py from chapter 18; the code related to the OpenCV visual line-tracking function introduced in this chapter is written in camera_opencv.py.
  • Page 112: Specific Code

    20.2 Specific Code
import os
import cv2
from base_camera import BaseCamera
import numpy as np
import time
import threading
import imutils

# Set the color of the line: 255 for a white line, 0 for a black line
lineColorSet = ...
# Set the position of the horizontal reference row; the larger the value, the lower the row,
# but it must not exceed the vertical resolution of the video (480 by default)
linePos = ...
class...
  • Page 113 # Convert the picture to grayscale and then binarize it (every pixel value becomes either 0 or 255)
img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
retval, img = cv2.threshold(img, 0, 255, cv2.THRESH_OTSU)
img = cv2.erode(img, None, iterations=6)   # Use erosion to remove noise
colorPos = img[linePos]                    # Get the array of pixel values in row linePos
try:...
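The excerpt ends at the try block; the usual next step is to locate where the line pixels sit in that reference row and take their midpoint as the steering reference. A rough sketch, assuming the names from the excerpt above:

        lineColorCount_Pos = np.where(colorPos == lineColorSet)[0]
        if lineColorCount_Pos.size:
            # Midpoint of the detected line pixels in the reference row
            left = lineColorCount_Pos[0]
            right = lineColorCount_Pos[-1]
            center = int((left + right) / 2)
            # Compare `center` with the frame center to decide whether to steer left or right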
  • Page 114: Create A Wifi Hotspot On The Raspberry Pi

    ●Under normal circumstances, if the robot program cannot connect to WiFi when it is turned on, it will automatically start a hotspot. You can then use your phone or computer to search for the WiFi network named Adeept; the default password is 12345678.
  • Page 115: Install Gui Dependent Item Under Window

    22 Install GUI Dependent Item under Window  ●Our older robot programs all provide a desktop GUI program to control the robot. The GUI program is written in Python, but this method has a higher barrier to entry and is not recommended for novices.
  • Page 116: How To Use Gui

    23 How to Use GUI  ●Because the web and the GUI are not connected, if you want to use the GUI to control the robot product, you need to manually run server.py. (The method is the same as manually running webserver.py, except that the object is changed to server.py).
  • Page 117: Control The Ws2812 Led Via Gui

    the standard program. If you are interested in this, you can try to expand it further. We will offer the installation and application methods of other functions in follow-up tutorials; please subscribe to our YouTube channel for more. ●Change LED Color: you can control the colors of the LEDs on the robot in real time by dragging these three sliders.
  • Page 118 import socket Some settings related to LED lights come from the WS281X routine Source Code:https://github.com/rpi-ws281x/rpi-ws281x-python/ LED_COUNT LED_PIN LED_FREQ_HZ 800000 LED_DMA LED_BRIGHTNESS = LED_INVERT False LED_CHANNEL Process arguments parser = argparse.ArgumentParser() parser.add_argument('-c', '--clear', action='store_true', help='clear the display on exit') args = parser.parse_args() Create NeoPixel object with appropriate configuration.
  • Page 119 tcpSerSock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR,1) tcpSerSock.bind(ADDR) tcpSerSock.listen(5) Start listening to the client connection, after the client connection is successful, start to receive the information sent from the client tcpCliSock, addr = tcpSerSock.accept() while True: data = '' Receive information from the client data = str(tcpCliSock.recv(BUFSIZ).decode()) if not data: continue...
  • Page 120 # Python can use Tkinter to quickly create GUI applications
import tkinter as tk

def lights_on():
    # Call this method to send the light-on command 'on'
    tcpClicSock.send(('on').encode())

def lights_off():
    # Call this method to send the light-off command 'off'
    tcpClicSock.send(('off').encode())

# Enter the IP address of the Raspberry Pi here
SERVER_IP = '192.168.3.35'
# Next is the configuration related to TCP communication. PORT is a defined port number; you can choose freely from 0-65535 (a number above 1023 is recommended), and it must be consistent with the port number...
  • Page 121 # Use Tkinter's Button method to define a button. The button is placed on the root window, its label is 'ON',
# its text color is #E1F5FE, its background color is #0277BD, and clicking it calls the lights_on() function
btn_on = tk.Button(root, width=8, text='ON', fg='#E1F5FE', bg='#0277BD', command=lights_on)
# Choose a location to place this button
btn_on.place(x=15, y=15)
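For completeness, a hedged sketch of the remaining client-side pieces implied by the excerpt above: connecting the TCP socket, creating the window, and entering the Tkinter main loop. The port number below is an assumption (it must match the one used by server.py, 10223 for the GUI program as noted later in this manual), and the window title and size are arbitrary:

import socket
import tkinter as tk

SERVER_IP = '192.168.3.35'    # IP address of the Raspberry Pi
SERVER_PORT = 10223           # must match the port number used by server.py

# Connect to the TCP server running on the robot
tcpClicSock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcpClicSock.connect((SERVER_IP, SERVER_PORT))

def lights_on():
    tcpClicSock.send('on'.encode())    # send the light-on command

root = tk.Tk()                # Create the main window
root.title('LED control')
root.geometry('180x60')

btn_on = tk.Button(root, width=8, text='ON', fg='#E1F5FE', bg='#0277BD', command=lights_on)
btn_on.place(x=15, y=15)

root.mainloop()               # Hand control over to the Tkinter event loop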
  • Page 122: Real-Time Video Transmission Based On Opencv

    25 Real-time Video Transmission Based on  OpenCV ●This chapter introduces real-time video transmission: the images collected by the camera can be transmitted elsewhere in real time, either to display them or to hand them to a host computer for machine-vision processing.
  • Page 123 IP = '192.168.3.11' Then initialize the camera, you can change these parameters according to your needs camera = picamera.PiCamera() camera.resolution = (640, 480) camera.framerate = 20 rawCapture = PiRGBArray(camera, size=(640, 480)) Here we instantiate the zmq object used to send the frame, using the tcp communication protocol, where 5555 is the port number The port number can be customized, as long as the port number of the sending end and the receiving end are the same context = zmq.Context()
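The sender's capture-and-send loop is cut off in this excerpt; a hedged sketch of how it usually continues (cv2, base64, and zmq are assumed to be imported at the top of RPiCam.py, and IP is the receiving computer's address set above; each frame is JPEG-compressed, base64-encoded, and sent over the PAIR socket):

footage_socket = context.socket(zmq.PAIR)
footage_socket.connect('tcp://%s:5555' % IP)   # connect to the receiving computer

for frame in camera.capture_continuous(rawCapture, format="bgr",
                                        use_video_port=True):
    image = frame.array
    # Compress the frame as JPEG, then base64-encode it for transmission
    _, buffer = cv2.imencode('.jpg', image)
    jpg_as_text = base64.b64encode(buffer)
    footage_socket.send(jpg_as_text)
    # Clear the stream in preparation for the next frame
    rawCapture.truncate(0)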
  • Page 124 Clear the stream in preparation for the next frame rawCapture.truncate(0) ● In the following, we explain the program on the receiving end. Since the libraries used here are cross-platform, PC.py can be run on a Windows computer or another Linux computer. ●PC.py : First import the required libraries import...
  • Page 125: Use Opencv To Process Video Frames On The Pc

    # Display the image
cv2.imshow("Stream", source)
# Generally, waitKey() should be used after imshow() to leave time for image drawing; otherwise the
# window will appear unresponsive and the image will not be displayed
cv2.waitKey(1)
●To run the programs, first run RPiCam.py on the Raspberry Pi and then PC.py on the PC to see the Raspberry Pi's real-time picture on the PC.
  • Page 126 footage_socket = context.socket(zmq.PAIR) footage_socket.bind('tcp://*:5555') while True: Received video frame data frame = footage_socket.recv_string() Decode and save it to the cache img = base64.b64decode(frame) Interpret a buffer as a 1-dimensional array npimg = np.frombuffer(img, dtype=np.uint8) Decode a one-dimensional array into an image source = cv2.imdecode(npimg, 1) Display image cv2.imshow("Stream", source)
  • Page 127 import numpy as np Here we instantiate the zmq object used to receive the frame Note that the port number needs to be consistent with the sender's context = zmq.Context() footage_socket = context.socket(zmq.PAIR) footage_socket.bind('tcp://*:5555') while True: Received video frame data frame = footage_socket.recv_string() Decode and save it to the cache img = base64.b64decode(frame)
  • Page 128 source = cv2.erode(source, None, iterations=6) Display image cv2.imshow("Stream", source) Generally, waitKey () should be used after imshow () to leave time for image drawing, otherwise the window will appear unresponsive and the image cannot be displayed cv2.waitKey(1)
  • Page 129: Enable Uart

    27 Enable UART  ●UART is a commonly used communication protocol between devices. Using UART, you can let MCUs such as an Arduino, STM32, or ESP32 communicate with the Raspberry Pi, which can make your robot more powerful. ●However, on some Raspberry Pis the UART that is enabled by default is not a full-featured UART, so you need to follow the steps below to enable the full-featured UART (a typical configuration snippet is shown below).
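For reference, the full-featured PL011 UART is typically exposed on the GPIO header by adding the following standard options to /boot/config.txt and rebooting (on older systems the Bluetooth overlay is named pi3-disable-bt instead):
enable_uart=1
dtoverlay=disable-bt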
  • Page 130: Uart Output On Gpio Pins

    for other purposes requires this default behaviour to be changed. On startup, systemd checks the Linux kernel command line for any console entries, and will use the console defined therein. To stop this behaviour, the serial console setting needs to be removed from command line. ●This can be done by using the raspi-config utility, or manually.
  • Page 131: Relevant Differences Between Pl011 And Mini Uart

    27.5 Relevant Differences Between PL011 and Mini UART ● The mini UART has smaller FIFOs. Combined with the lack of flow control, this makes it more prone to losing characters at higher baudrates. It is also generally less capable than the PL011, mainly due to its baud rate link to the VPU clock speed.
  • Page 132 ●It should be noted that the port number when using the WEB application is 5000, the port number when using the GUI program is 10223, and the port number when using the mobile APP is 10123. ●The controller on the left can control the robot to move back and forth, left and right, and the controller on the right can control other movements of the robot.
  • Page 133: Conclusion

    If you want to try our other products, you can visit our website: www.adeept.com For more product information, please visit: https://www.adeept.com/learn/ For more product latest video updates, please visit: https://www.adeept.com/video/ Thank you for using Adeept products.

This manual is also suitable for:

Picar-b
