Overview of the robot

In order to accomplish the tasks in the Home Mini Size category, our team built a compact and efficient robot designed to help the people around it. We integrated systems that perform the three basic tasks: the robot guide test, the follow me test and the pick and place test. Because our team always looks for solutions to problems that people currently face, this time we decided to help sick elderly people. Our robot can interact with them and help them take their medicines at a certain time. It can bring a specific drug depending on the day or time, together with a glass of water, to help elderly people who cannot move easily and need treatment.


From the beginning we wanted to create a robot with an original design, so we made many of the parts ourselves using 3D printing. Regarding the electrical part of the robot, we chose to have a single mainboard designed by us, which allowed us to integrate our own components. Needless to say, creating all these parts on our own significantly reduced the cost of producing such a robot, so it could be affordable to anyone if it ever goes into production. For the 3D printed components, we used Galaxy Black Prusament PETG from Prusa Research because of its high impact strength and durability.

By using an Amazon Echo and building a custom Alexa skill, our robot can interact with people, especially with those who cannot move easily. It can bring them a certain medicine and a glass of water in order to help them take their treatment.
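
As a rough illustration of how such a skill can be wired up, the sketch below uses the ask-sdk-core Python SDK. The intent name BringMedicineIntent and the send_to_robot() helper are hypothetical placeholders, not the exact names from our skill.

    # Hedged sketch of an Alexa skill intent handler for a medicine request.
    # "BringMedicineIntent" and send_to_robot() are hypothetical placeholders.
    from ask_sdk_core.skill_builder import SkillBuilder
    from ask_sdk_core.dispatch_components import AbstractRequestHandler
    from ask_sdk_core.utils import is_intent_name

    class BringMedicineHandler(AbstractRequestHandler):
        def can_handle(self, handler_input):
            return is_intent_name("BringMedicineIntent")(handler_input)

        def handle(self, handler_input):
            # A real handler would forward the request to the Raspberry Pi,
            # e.g. send_to_robot("bring_medicine").
            speech = "Okay, I am bringing your medicine and a glass of water."
            return handler_input.response_builder.speak(speech).response

    sb = SkillBuilder()
    sb.add_request_handler(BringMedicineHandler())
    handler = sb.lambda_handler()   # entry point when the skill is hosted on AWS Lambda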

This task is accomplished using a Logitech C920 web camera connected directly to the Raspberry Pi 4. Using an algorithm (presented in the software section), the robot follows the black cylinder. For the 90-degree turns, our team used the sensor from the previous task to determine the direction of the robot being followed.
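
A minimal sketch of this kind of follower is shown below: it thresholds dark pixels with OpenCV and steers towards the centre of the largest dark blob. The threshold value, frame size, gains and the drive() helper are illustrative assumptions, not the exact values from our software.

    # Hedged sketch: follow a dark cylinder seen by a USB camera (e.g. a C920).
    # Threshold, resolution and gains are assumptions; drive() is a hypothetical helper.
    import cv2

    cap = cv2.VideoCapture(0)                       # the C920 shows up as a standard V4L2 device
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Keep only very dark pixels (the black cylinder).
        _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            target = max(contours, key=cv2.contourArea)
            x, y, w, h = cv2.boundingRect(target)
            error = (x + w / 2) - 320               # horizontal offset from the image centre
            turn = 0.005 * error                    # proportional steering command
            forward = 0.3 if w < 200 else 0.0       # stop when the cylinder fills the view
            # drive(forward, turn)                  # hypothetical motion helper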

The web camera and the fixed sensor make it possible to detect a bottle. The robot stops in front of it at a distance of less than 100 mm. Using a 3D printed arm, the robot picks up the bottle and transports it to the indicated place. The arm is an open source project from EEzyrobots and uses four micro servomotors. We adapted it to our needs: we modified the base of the arm and designed a different gripper that allows it to pick up the bottle. We also used only three micro servomotors, because the purpose of the fourth was to rotate the whole arm, which is unnecessary in our case. The same arm is used to move the chair, and we plan to adapt it to move the mystery object as well.
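
The snippet below sketches one way to command the three arm servos from the Raspberry Pi using pigpio servo pulses; the GPIO numbers and pulse widths are illustrative assumptions rather than our calibrated positions.

    # Hedged sketch: driving the three arm servos with pigpio.
    # BCM pin numbers and pulse widths are assumptions, not our calibrated values.
    import time
    import pigpio

    SHOULDER, ELBOW, GRIPPER = 17, 27, 22       # assumed BCM pins
    pi = pigpio.pi()

    def move(pin, pulse_us):
        """Send a standard 500-2500 us servo pulse on the given pin."""
        pi.set_servo_pulsewidth(pin, max(500, min(2500, pulse_us)))

    def pick_bottle():
        move(GRIPPER, 1800)     # open the gripper
        move(SHOULDER, 1200)    # lower the arm towards the bottle
        move(ELBOW, 1600)
        time.sleep(0.8)
        move(GRIPPER, 1200)     # close around the bottle
        time.sleep(0.4)
        move(SHOULDER, 1700)    # lift

    pick_bottle()
    pi.stop()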

For this task, we used three distance sensors (Wenglor P1KY001), two of which are mounted on servomotors so they can detect objects in a larger area. The third sensor is fixed and is used to detect objects behind the robot. These sensors measure the distance to other objects by means of transit-time measurement; they have high accuracy and a high switching frequency, and they use a class 1 laser, which is safe for the human eye. To rotate the two mobile sensors, we used two goBILDA servomotors controlled from our mainboard. To move the robot to a certain area on the table, we used the motor encoders, the gyroscope and the accelerometer (for rotations). A sketch of the scanning idea follows below.
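
The code below sweeps one mobile sensor across an assumed ±60° arc and returns the bearing of the nearest object; set_servo_angle() and read_distance() are hypothetical helpers standing in for the real servo and sensor interfaces.

    # Hedged sketch of a servo-mounted distance-sensor sweep.
    # set_servo_angle() and read_distance() are hypothetical helpers.
    import time

    def scan(set_servo_angle, read_distance, step_deg=5, settle_s=0.05):
        """Sweep the sensor and return (angle_deg, distance_mm) of the nearest object."""
        nearest = (None, float("inf"))
        for angle in range(-60, 61, step_deg):      # assumed +/-60 degree sweep
            set_servo_angle(angle)
            time.sleep(settle_s)                    # let the servo settle before measuring
            d = read_distance()
            if d < nearest[1]:
                nearest = (angle, d)
        return nearest

    # Example (with real helpers in place):
    # angle, distance = scan(set_servo_angle, read_distance)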

Motion

Our team embraced the design of a circular robot with three omnidirectional wheels, even though its motion control is considerably more complicated than the classic control used for four-wheel or two-wheel robots. We chose this option because we wanted the robot to be able to move in any direction without having to change its orientation first. This type of robot also needs only three DC motors for movement, so its power consumption is lower than that of a four-wheel robot.

On a circular 3D printed base (210 mm diameter) we mounted three Pololu 50:1 Metal Gearmotors (12 V) using three supports fixed to the base with screws. The three drive motors are equipped with quadrature encoders, so we always know the position of the robot on the competition table.
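
For reference, the wheel speeds of a three-wheel omnidirectional robot follow directly from the desired body velocity. The sketch below illustrates this inverse kinematics, assuming wheels placed at 0°, 120° and 240° and a 105 mm centre-to-wheel radius (half of the 210 mm base); the exact angles and signs depend on how the motors are mounted and wired.

    # Minimal sketch of three-wheel omni inverse kinematics.
    # Wheel angles and the 105 mm centre-to-wheel radius are illustrative assumptions.
    import math

    WHEEL_ANGLES = [math.radians(a) for a in (0.0, 120.0, 240.0)]
    R = 0.105   # distance from the robot centre to each wheel, in metres

    def wheel_speeds(vx, vy, omega):
        """Linear speed of each wheel (m/s) for a desired body velocity
        (vx, vy in m/s) and rotation rate omega (rad/s)."""
        return [-math.sin(a) * vx + math.cos(a) * vy + R * omega
                for a in WHEEL_ANGLES]

    # Example: translate sideways at 0.2 m/s without rotating.
    print(wheel_speeds(0.2, 0.0, 0.0))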



The electronics part

The brain of our robot is a Raspberry Pi 4 (4 GB RAM), which is attached to our mainboard with pin headers. It controls two DC motor drivers (7 A Dual DC Motor Drivers from DFRobot). Each driver uses two digital pins for direction control and one PWM pin to control the speed of each motor. The drivers are powered from the main 12 V battery and, as a safety measure, we added a 20 A fuse for each driver. The logic voltage for the drivers is 3.3 V. The Raspberry Pi generates DMA-timed PWM for high accuracy.
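
A minimal sketch of driving one motor channel in this way is shown below, using the pigpio library, which times its PWM with DMA; the BCM pin numbers and the PWM frequency are assumptions for illustration (pigpio rounds the frequency to one of its supported values).

    # Hedged sketch: one direction pin plus one DMA-timed PWM pin per motor.
    # Pin numbers and the 8 kHz frequency are illustrative assumptions.
    import pigpio

    DIR_PIN, PWM_PIN = 5, 12            # assumed BCM pins wired to the driver
    pi = pigpio.pi()
    pi.set_mode(DIR_PIN, pigpio.OUTPUT)
    pi.set_PWM_frequency(PWM_PIN, 8000) # pigpio generates this PWM via DMA
    pi.set_PWM_range(PWM_PIN, 255)

    def set_motor(speed):
        """speed in [-1.0, 1.0]; the sign selects direction, the magnitude sets duty cycle."""
        pi.write(DIR_PIN, 1 if speed >= 0 else 0)
        pi.set_PWM_dutycycle(PWM_PIN, int(min(abs(speed), 1.0) * 255))

    set_motor(0.5)   # half speed forward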


Because we used all the available pins on the mainboard, we also integrated an Arduino UNO in case we want to add other components; in that case, the communication between the Arduino and the Raspberry Pi is serial. Because we need precise rotation of the robot, we use a gyroscope and an accelerometer, connected to the Raspberry Pi over the I2C protocol. Initially, we designed the mainboard with a dedicated analog-to-digital converter to read the values from SHARP distance sensors, but we abandoned this idea because the output of these sensors is non-linear and very hard to interpret. On the mainboard, we added eight servomotor ports with three pins each: GND, +5 V and PWM (software PWM generated by the Raspberry Pi).
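
The snippet below sketches how the IMU can be read over I2C from Python with smbus2, assuming, purely for illustration, an MPU-6050-class device at address 0x68; the sensor actually mounted on our mainboard may use different registers and scale factors.

    # Hedged sketch: reading a gyroscope/accelerometer over I2C with smbus2.
    # The MPU-6050 register map and address 0x68 are assumptions for illustration.
    from smbus2 import SMBus

    ADDR = 0x68

    def read_word(bus, reg):
        """Read a signed 16-bit big-endian register pair."""
        hi, lo = bus.read_i2c_block_data(ADDR, reg, 2)
        value = (hi << 8) | lo
        return value - 65536 if value > 32767 else value

    with SMBus(1) as bus:                            # I2C bus 1 on the Raspberry Pi
        bus.write_byte_data(ADDR, 0x6B, 0)           # wake the device from sleep
        gyro_z = read_word(bus, 0x47) / 131.0        # yaw rate in deg/s (+/-250 dps range)
        accel_x = read_word(bus, 0x3B) / 16384.0     # acceleration in g (+/-2 g range)
        print(gyro_z, accel_x)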

The robot is powered from a 12 V battery and has an emergency switch in case it needs to be turned off immediately. The Raspberry Pi is powered through a 12 V to 5 V, 2.5 A converter, and for the Alexa Echo we used a Pololu Adjustable 9-30 V Step-Up/Step-Down Voltage Regulator, which can provide 15 V continuously (the Amazon Echo requires 15 V to operate at normal parameters).

Because the servomotors draw more current than the Raspberry Pi's internal 5 V regulator can provide, we designed our own regulator, which can supply 5 V at 5 A. It is based on the LM2678 regulator, a 20 uH coil and several electrolytic capacitors for filtering. This circuit is mounted on the mainboard as a shield, using spacers. After reading the datasheet of every component, we designed the electrical schematics in Autodesk Eagle and then created the layout of our printed circuit board.

We chose to have the mainboard prototyped by a specialized PCB manufacturer, while the 5 V regulator board was made by us using the ultraviolet method.

The next step was the soldering process, which required extra attention because most of the components were SMDs. The last step of the electrical part was testing the functionality of every component. We faced some problems with the DC driver control, but after some measurements we adjusted the PWM signal so that the speed of each motor stays constant.
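
That adjustment amounts to closing a simple loop around the encoders. The sketch below shows a proportional correction of the duty cycle towards a target encoder speed; the gain, the target value and the read_encoder_speed()/set_pwm() helpers are hypothetical.

    # Hedged sketch: keep a motor's speed constant by nudging the PWM duty cycle.
    # read_encoder_speed() and set_pwm() are hypothetical helpers.
    import time

    def keep_speed_constant(read_encoder_speed, set_pwm, target=120.0, kp=0.02):
        """Proportionally correct the duty cycle so the encoder speed tracks the target."""
        duty = 0.4
        while True:
            measured = read_encoder_speed()          # encoder counts in the last interval
            duty += kp * (target - measured)         # push the duty cycle towards the target
            duty = max(0.0, min(1.0, duty))
            set_pwm(duty)
            time.sleep(0.02)

    # Example (with real helpers in place):
    # keep_speed_constant(read_encoder_speed, set_pwm)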


The layout of the mainboard in Autodesk Eagle



Ultraviolet lamp used to print the traces on the copper board


PCB before etching

The 3D design part

The programming part