
Pit Droid powered by NVIDIA Jetson Orin Nano


Introduction

In “Star Wars”, pit droids are maintenance robots designed to service podracers. They stand about 1.2 meters tall and fold up when not in use. Their comical behavior comes from being programmed with a sense of urgency but without enough processing power for complex tasks. Pit droids have appeared in several Star Wars movies and series, and they’re really cute and fun.

In this project, I’ll show you how to build a real-life Pit Droid and power it with an NVIDIA Jetson Orin Nano. The robot performs AI object detection and turns its head toward detected objects.

Pit Droid assembly

STL is short for stereolithography, a popular 3D printing technology; the STL file format is the standard input for 3D printing. Star Wars fan Dave Moog has created STL files for the Pit Droid and several other droids. If you want to purchase the files, check out his Robotics Division Shop on Etsy. Please do not share these files: a lot of effort goes into creating realistic printable 3D robots, so support the author by making a purchase. Droid Division Printing Club is a Facebook group where you can find many other robot builders; it’s a great community, and members can give you lots of advice on 3D printing.

I don’t have a 3D printer, so I ordered a printing service online after purchasing the files. 3D printing can be challenging, and you should get to know your printer before attempting advanced prints. A few weeks later, I received the 3D-printed parts and started assembling the robot.

All 3D printed parts must be sanded before being primed and painted. Sanding will smooth the edges of the part and remove small spots and lines from the surface of the part. To smooth the part and remove any sanding lines, use sandpaper of different grits. You can also use any electric sander, which will save you a lot of time. Once you are happy with the finished part you can apply a primer, which usually dries quickly and you can paint over it. It will most likely take several rounds of painting to achieve the desired result.

The great thing about the Pit Droid is that you can paint it different colors. I decided to paint mine a white/red combo. First I assembled the legs; the parts go together with screws. A good tip is to spray a little WD-40 on all the moving parts so they can move around more easily.

After assembling the body of the Pit Droid, I placed all the electronics in the head of the robot. The robot’s head has enough space to accommodate an NVIDIA Jetson Orin Nano, two servo motors, an LED display, and a webcam. The charging cable can be routed from the head to the back of the body so it is not directly visible.

The fully assembled Pit Droid can stand on its own. With movable arms and legs, you can easily pose it sitting or in other positions.

The next task was to make it see using the NVIDIA Jetson Orin Nano and move its head with the servo motors. The robot’s head also has an LED display that lights up while it is working.

NVIDIA Jetson Orin Nano setup

The NVIDIA Jetson Orin Nano Developer Kit is a powerful edge device for developing AI-powered robots, smart drones, and smart cameras. Compared with the original Jetson Nano, it delivers up to 80× the performance, with up to 40 TOPS of AI compute. The developer kit includes the Jetson Orin Nano board and a power supply. You’ll also need a microSD card, a computer monitor, a USB mouse, and a keyboard to start using it.

The Jetson Orin Nano Developer Kit SD card image can be found at the following link, and you can write the image to your microSD card using balenaEtcher. Setup is super simple; I suggest you take a look at NVIDIA’s official Getting Started with the Jetson Orin Nano Developer Kit guide. After successfully setting up, booting, and logging in, you should see the Ubuntu desktop.

The NVIDIA Jetson Orin Nano pin layout is shown in the figure below; these pins will be used to connect the LED display and the servo motors. The webcam used in this project connects via a USB port.

Pin setting is done using the Jetson-IO tool, which you can run in a terminal using the following command:

sudo /opt/nvidia/jetson-io/jetson-io.py

An interface will pop up showing the current 40-pin layout.

Select the option to manually configure the header pins and select the PWM option for pins 32 and 33.

For more information on the NVIDIA Jetson Orin pinout and other specifications, check out the Jetson Download Center and the Jetson Orin Nano Developer Kit Carrier Board Specification documentation.

LED light control

LED lights are placed on the front of the robot’s head. I wanted to make sure everything fit together nicely before adding a webcam in front of the lights.

The LED lights require three AAA batteries for power. To switch them on and off I used an Arduino relay.

The Arduino relay is connected to pins 9, 12 and 17 of the NVIDIA Jetson Orin Nano.

The battery, relays, and LED lights are connected to the NVIDIA Jetson Orin Nano in the following ways:

You can test the LED light with the following Python code: it switches the Arduino relay on and then off again after two seconds.

import RPi.GPIO as GPIO  # on Jetson, the Jetson.GPIO library provides this interface
from time import sleep

# set mode to BCM (Broadcom pin numbering)
GPIO.setmode(GPIO.BCM)

# define output pin (BCM 18 is board pin 12, where the relay signal is wired)
output_pin = 18

# GPIO setup
GPIO.setup(output_pin, GPIO.OUT)

# turn LED on
GPIO.output(output_pin, 1)
sleep(2)

# turn LED off
GPIO.output(output_pin, 0)
sleep(2)

# cleanup
GPIO.cleanup()

With the LED light working, I placed the webcam in front of the light and connected it to the NVIDIA Jetson Orin Nano’s USB port.

Servo Control

The servo motor is located inside the robot’s head and attached to its neck. One servo motor turns the head up and down, and the other turns the head left and right.

One digital servo is connected to NVIDIA Jetson Orin Nano pins 2, 6, and 32, and the other servo motor is connected to pins 4, 30, and 33.

You can test the motors and move them to a starting position. The servo is moved to different positions with the ChangeDutyCycle() call. Don’t forget to call stop() and cleanup() at the end.

import RPi.GPIO as GPIO
from time import sleep

# set mode to BOARD, pins are by numbers on board
GPIO.setmode(GPIO.BOARD)

# define output pin
output_pin = 33

# GPIO setup
GPIO.setup(output_pin, GPIO.OUT)

# start PWM at 50 Hz on the servo pin
servo = GPIO.PWM(output_pin, 50)
servo.start(0)
sleep(1)

# move head left
servo.ChangeDutyCycle(5)
sleep(1)

# move head right
servo.ChangeDutyCycle(10)
sleep(1)

# stop and cleanup
servo.stop()
GPIO.cleanup()
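The duty cycles 5 and 10 in the snippet above correspond to particular head angles. As a sketch (assuming a typical 50 Hz hobby servo where roughly a 0.5 ms pulse, i.e. 2.5% duty, is 0° and a 2.5 ms pulse, i.e. 12.5% duty, is 180°; your servo’s exact range may differ), a small helper can convert angles to duty cycles:

```python
def angle_to_duty(angle, min_duty=2.5, max_duty=12.5):
    """Map a servo angle in degrees (0-180) to a PWM duty cycle percentage.

    Assumes a standard 50 Hz hobby servo where min_duty corresponds to 0
    degrees and max_duty to 180 degrees.
    """
    angle = max(0, min(180, angle))  # clamp to the servo's mechanical range
    return min_duty + (max_duty - min_duty) * angle / 180.0

# centre the head, then the left/right positions used above
print(angle_to_duty(90))   # 7.5 -> centre
print(angle_to_duty(45))   # 5.0 -> left
print(angle_to_duty(135))  # 10.0 -> right
```

With a helper like this you can think in degrees instead of raw duty cycles, e.g. `servo.ChangeDutyCycle(angle_to_duty(45))`.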

Both servo motors apply the same logic to move the head up/down or left/right. In the video below you can see how head movement works.

Servo control quick test

By using the webcam and computer vision, the robot will be able to detect objects and position its head toward them.

Visual artificial intelligence (object detection)

To start learning about the NVIDIA Jetson Orin Nano and the possibilities of AI, I recommend the following GitHub repository: https://github.com/dusty-nv/jetson-inference

This repository provides many great examples that you can try out for image classification, object detection, or others. First, clone the repository locally.

git clone https://github.com/dusty-nv/jetson-inference

Enter the cloned folder:

cd jetson-inference

Next, run a submodule update to fetch all git submodules.

git submodule update --init

Create a build folder, enter it, and run cmake, make, make install, and ldconfig.

mkdir build
cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig

Now everything is ready to run an example. Execute detectnet.py from the Python examples folder to start real-time camera object detection.

 ./detectnet.py /dev/video0

The following video shows how to detect people in a live video source.

In the detectnet.py code, you’ll notice that each detection is logged with properties for the detected class and the location of the detected object.

-- Confidence: 0.746582
-- ClassID: 1
-- Left: 565
-- Top: 314.648
-- Right: 1173.75
-- Bottom: 719
-- Width: 608.75
-- Height: 404.352
-- Area: 246149
-- Center: (869.375, 516.824)

These properties let us control the robot. Once a person or another object is detected, we can turn on the LED light.

# detect objects in the image (with overlay)
detections = net.Detect(img, overlay=args.overlay)

# print the detections
print("detected {:d} objects in image".format(len(detections)))

lights = False

for detection in detections:
    if int(detection.ClassID) == 1:  # person is detected
        lights = True

if lights:
    GPIO.output(output_pin, 1)
else:
    GPIO.output(output_pin, 0)

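The loop above reacts to every detection; in practice you may also want to ignore low-confidence hits. Here is a minimal, hardware-free sketch of that decision logic, using a stand-in namedtuple instead of the real jetson-inference Detection class (the field names match the log output above; the 0.5 threshold is my own choice):

```python
from collections import namedtuple

# stand-in for jetson.inference's Detection (only the fields we use here)
Detection = namedtuple("Detection", ["ClassID", "Confidence"])

PERSON_CLASS_ID = 1  # class 1 is "person" in the default detection model

def person_detected(detections, min_confidence=0.5):
    """Return True if any detection is a person above the confidence threshold."""
    return any(d.ClassID == PERSON_CLASS_ID and d.Confidence >= min_confidence
               for d in detections)

print(person_detected([Detection(1, 0.75), Detection(3, 0.9)]))  # True
print(person_detected([Detection(1, 0.3)]))                      # False
```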
Similarly, by using the object’s position properties, we can drive the servos and point the head toward the detected object.
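One simple way to do this (a sketch of the idea, not the exact code from the robot) is to map the detection’s horizontal centre to a pan duty cycle between the left (5) and right (10) positions used earlier. The 1280-pixel frame width is an assumption based on the log output above:

```python
def center_to_duty(center_x, frame_width=1280, left_duty=5.0, right_duty=10.0):
    """Map the x coordinate of a detection centre to a pan-servo duty cycle."""
    x = max(0, min(frame_width, center_x))  # clamp to the frame
    fraction = x / frame_width              # 0.0 = left edge, 1.0 = right edge
    return left_duty + (right_duty - left_duty) * fraction

# detection centred at x = 869.375 in a 1280-px frame (as in the log above)
print(round(center_to_duty(869.375), 2))  # 8.4
```

The resulting value can be fed to `servo.ChangeDutyCycle()` to turn the head toward the detection; the same mapping applied to the vertical centre drives the tilt servo.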

Final words

At the Azure Lowlands (Utrecht, Netherlands) and Techorama (Antwerp, Belgium) conferences, my friend Sherry List and I introduced the Pit Droid and showed how it works. Special thanks to the conference organizers and attendees for their interest in this project!

I hope this project inspired you and taught you something new about computer vision and the Internet of Things! If you decide to build such a robot, feel free to reach out with any questions.

Don’t forget to recycle your robots and save the galaxy!


