myzharbot
MyzharBot - Your personal mobile robot
127 posts
News about the MyzharBot project: http://myzharbot.robot-home.it
myzharbot · 8 years ago
Text
This news is not directly related to the MyzharBot project, but it will eventually bring MyzharBot a big improvement. Let’s start from the beginning…
It was May 2017 when I bumped into a web page about a challenge organized by FLIR and the BeagleBoard.org Foundation. I know FLIR cameras very well, and it was immediately clear to me how useful they can be in robotic applications. The combination of a FLIR camera and the BeagleBone Blue SBC was the perfect starting point to put my ideas into practice, and so the project “SmarTC – Smart Thermal Camera” was born.
As a Computer Engineer specialized in Robotics, Computer Vision and Artificial Intelligence, I wanted to use the FLIR Lepton 3 LWIR camera to create an intelligent sensor capable of better understanding the environment where a robot operates. With an LWIR sensor the robot can better distinguish the obstacles in the environment and can tell “animated” (hot) objects from “inanimate” (cold) ones, reacting to their movements according to their classification. The sensor can also detect humans and animals, making robots safer. Furthermore, an LWIR sensor lets the robot operate safely in poor light, at night, or during a power outage. The combination of the FLIR Lepton 3 LWIR sensor and the computational power of the BeagleBone Blue board can thus result in a “smart thermal camera” useful for autonomous domestic robots or autonomous flying drones.
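To make the idea concrete, here is a hypothetical sketch of the hot/cold classification described above. It is an illustration, not my actual challenge code, and it assumes a radiometric 16-bit stream whose raw values map linearly to temperature in centikelvin (as the radiometric Lepton models provide in TLinear mode):

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Hypothetical "animated vs inanimate" classifier: threshold a radiometric
// 16-bit thermal frame around body temperature and extract the warm blobs.
std::vector<std::vector<cv::Point>> findWarmBlobs(const cv::Mat& raw16)
{
    const double kWarmThreshCK = 30315.0;   // ~30 °C in centikelvin
    cv::Mat mask = raw16 > kWarmThreshCK;   // CV_8U mask of "hot" pixels

    // Remove isolated noisy pixels before extracting blobs
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3)));

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    return contours;    // each contour is a candidate "animated" object
}
```

Classifying the resulting blobs by temperature range and tracking their motion frame by frame is then standard computer vision on the contours.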
I submitted my idea for the challenge and crossed my fingers.
Time passed, and on the 1st of August I received an email from the Judging Panel telling me that I was one of the five finalists of the contest. Within a few days I received a FLIR Lepton 3 thermal camera and a BeagleBone Blue SBC, and I started working on the project. I must say that getting the Lepton 3 up and running was not easy at all: I ran into a lot of trouble writing stable C/C++ code to acquire thermal images with the BeagleBone Blue. But with the help of previous work done on the earlier Lepton sensors, and a lot of patience from Jason Kridner (the father of the Beagle SBC family), I finally reached my goal: I had a thermal video stream in my hands.
The FLIR Lepton 3 interface consists of two buses: SPI, used to continuously transmit the thermal frames, and I2C, used to receive control commands and to send information about the status of the sensor. Everything is well described in two documents: “FLIR LEPTON® 3 Long Wave Infrared (LWIR) Datasheet” and “FLIR LEPTON® Software IDD”. Understanding the contents of both documents in detail is necessary to deal with the Lepton 3… I understood this after 10 days of panic.
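To give a concrete flavor of the SPI side, here is a minimal sketch that grabs a single VoSPI packet through the Linux spidev interface. It is not the code from my repository (which also handles segment synchronization and frame assembly), and the device node is an assumption that depends on how the sensor is wired to the BeagleBone Blue:

```cpp
// Reads one 164-byte VoSPI packet (4-byte header + 160 bytes of pixel data),
// assuming Raw14 mode without telemetry, SPI mode 3, max 20 MHz clock.
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <linux/spi/spidev.h>
#include <sys/ioctl.h>
#include <unistd.h>

constexpr size_t kVoSpiPktSize = 164;

int main()
{
    int fd = open("/dev/spidev1.0", O_RDWR);    // device node depends on wiring
    if (fd < 0) { perror("open"); return 1; }

    uint8_t mode = SPI_MODE_3;      // CPOL=1, CPHA=1, required by the Lepton
    uint32_t speed = 20000000;      // 20 MHz is the Lepton maximum
    ioctl(fd, SPI_IOC_WR_MODE, &mode);
    ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    uint8_t pkt[kVoSpiPktSize];
    spi_ioc_transfer xfer{};
    xfer.rx_buf = reinterpret_cast<uintptr_t>(pkt);
    xfer.len = kVoSpiPktSize;
    xfer.speed_hz = speed;
    xfer.bits_per_word = 8;

    if (ioctl(fd, SPI_IOC_MESSAGE(1), &xfer) < 0) { perror("xfer"); return 1; }

    if ((pkt[0] & 0x0F) == 0x0F) {
        printf("Discard packet, sensor not streaming yet\n");
    } else {
        // Packet number 0..59 within the current segment; on the Lepton 3 the
        // segment ID (1..4) is encoded in the header of packet number 20.
        printf("Video packet %d\n", pkt[1]);
    }
    close(fd);
    return 0;
}
```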
The steps of my work can be summed up in the following YouTube videos:
The final video, presenting the basis of the algorithm I intend to apply to robotics, is this one:
The source code I wrote for the challenge is publicly available on GitHub: https://github.com/Myzhar/Lepton3_BBB
The 3D printed case for the FLIR Lepton 3 sensor, which I designed myself, is available on Thingiverse.
FLIR Lepton 3 in its 3D printed case
3D printed case for the FLIR Lepton 3
3D printed case for the FLIR Lepton 3
FLIR Lepton 3 in its 3D printed case
3D printed case for the FLIR Lepton 3
SPI and I2C test
From now on I want to integrate all the knowledge I gained from this challenge into the MyzharBot project. MyzharBot will gain the ability to see temperature and will be able to navigate in the dark. Furthermore, I want to exploit the full potential of the BeagleBone Blue SBC, mainly its 9-axis IMU and its motor control capabilities.
As always, stay tuned if you want to know more…
“SmarTC – Smart Thermal Camera” won the FLIR Lepton 3 and BeagleBone Blue Challenge @flir @beagleboardorg +BeagleBoard.org
myzharbot · 8 years ago
Text
Real-Time Embedded Computer Vision? Nvidia Jetson TX2 + VisionWorks toolkit
Real-Time Embedded Computer Vision? Nvidia Jetson TX2 + VisionWorks toolkit @nvidia @NVIDIAEmbedded @NVIDIA_IT
Computer vision is a task that demands a lot of computational power. If you work with computer vision, you know that to reach very high frame rates you need to maximize the parallelism of your algorithms, and that massive parallelism is only achievable by moving processing to the GPU. Massive parallelism on embedded hardware was a dream for computer vision until Nvidia launched the Jetson TK1 board in 2014.…
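VisionWorks is Nvidia’s GPU-accelerated implementation of the OpenVX standard, in which you describe the whole processing pipeline once as a graph and let the runtime schedule it on the GPU. A minimal sketch using pure OpenVX 1.1 calls, which VisionWorks 1.6 implements (the image size and the Gaussian-plus-Sobel pipeline are only illustrative):

```cpp
#include <VX/vx.h>

int main()
{
    vx_context ctx = vxCreateContext();
    vx_graph graph = vxCreateGraph(ctx);

    vx_image input  = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_U8);
    vx_image smooth = vxCreateVirtualImage(graph, 640, 480, VX_DF_IMAGE_U8);
    vx_image gradx  = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_S16);
    vx_image grady  = vxCreateImage(ctx, 640, 480, VX_DF_IMAGE_S16);

    // Build the graph once: Gaussian smoothing followed by Sobel gradients
    vxGaussian3x3Node(graph, input, smooth);
    vxSobel3x3Node(graph, smooth, gradx, grady);

    if (vxVerifyGraph(graph) == VX_SUCCESS) {
        vxProcessGraph(graph);  // in a real application, once per camera frame
    }

    vxReleaseContext(&ctx);     // releases the graph and the images as well
    return 0;
}
```

The point of the graph model is that the runtime sees the entire pipeline up front, so it can fuse stages and keep the intermediate images on the GPU.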
View On WordPress
myzharbot · 8 years ago
Text
First boot of the Nvidia Jetson TX2 and CUDA test
First boot of the Nvidia Jetson TX2 and CUDA test @NVIDIAEmbedded @nvidia @NVIDIA_IT
Yesterday, March 7th, Nvidia launched the new Jetson TX2. In the launch post I described the differences from the previous Jetson and how the new Jetson TX2 can double the performance of the TX1.
Now it’s time to see how it really works, so let’s set the written word aside and watch the new video:
Stay tuned for the next…
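In the meantime, if you want to reproduce the basic CUDA check shown in the video, a minimal device-query sketch against the CUDA runtime bundled with JetPack could look like this (the values in the comments are what a TX2 should report, based on the published specs):

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA device found\n");
        return 1;
    }
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    std::printf("Device: %s\n", prop.name);                              // NVIDIA Tegra X2
    std::printf("Compute capability: %d.%d\n", prop.major, prop.minor);  // 6.2
    std::printf("Multiprocessors: %d\n", prop.multiProcessorCount);      // 2
    std::printf("Global memory: %.1f GB\n",
                prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));       // ~8 GB
    return 0;
}
```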
View On WordPress
myzharbot · 8 years ago
Text
Well, it’s been a while since I published the latest news about the MyzharBot project; what could be better than starting again with some really big news? Nvidia has presented the new Jetson module, the Nvidia Jetson TX2, and it’s ready to get on board MyzharBot.
What about the new “embedded monster” released by Nvidia? While the step from the Jetson TK1 to the Jetson TX1 was very big, this time Nvidia focused on continuity: the new Jetson TX2 is a module really similar to the Jetson TX1, with the same dimensions and, more importantly, pin-to-pin compatibility with its predecessor. We can keep using the carrier boards we already use with the Jetson TX1, but with much more computational power.
Nvidia Jetson TX2
So, what’s new with respect to the Jetson TX1? Sit down and read carefully, it’s impressive:
GPU: from Maxwell to Pascal, running at up to 1.3 GHz
CPU: a dual-core 64-bit Denver 2 (up to 2.0 GHz) added to the existing four 64-bit Cortex-A57 cores
Memory: from 4 GB 64-bit LPDDR4 to 8 GB 128-bit LPDDR4 (see the bandwidth note below)
Storage: eMMC doubled to 32 GB
Video encode: from 2160p @ 30 FPS to 2160p @ 60 FPS
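A quick back-of-the-envelope note on the memory line above, assuming the LPDDR4 data rates usually quoted for the two modules: the TX1’s 64-bit bus at 3200 MT/s moves 8 B × 3200 MT/s ≈ 25.6 GB/s, while the TX2’s 128-bit bus at 3733 MT/s moves 16 B × 3733 MT/s ≈ 59.7 GB/s. For GPU workloads, this more than doubled bandwidth matters at least as much as the doubled capacity.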
So what can we do with the new Jetson TX2? We can run 6 DNNs while performing 2x object tracking and decoding two 4K 30 FPS streams, compositing the result and encoding it into two H.265 streams… quite impressive!
Nvidia has not only worked on the hardware; new software is coming too. The new JetPack 3.0 will be released soon, and it is highly focused on Artificial Intelligence:
Deep Learning: TensorRT 1.0 GA, cuDNN 5.1, DIGITS Workflow
Computer Vision: VisionWorks 1.6, OpenCV4Tegra 2.4.13 (OpenCV 3 is available by compiling from source; Nvidia submitted a lot of Tegra improvements to the mainstream repository)
GPU Compute: CUDA 8.0.64, CUDA Libs
Multimedia: ISP Support, Camera Imaging, Video CODEC
More: ROS compatibility, OpenGL 4.5, OpenGL ES 3.2, OpenGL EGL 1.4, Vulkan 1.0.3, GStreamer, advanced developer tools… and much more
Connectivity has been improved as well:
3x USB 2
3x USB 3
PCIe 1×4 + 1×1 or 2×1 + 1×2
6x CSI cameras
Support for DSI display
Support for HDMI display
Digital MIC
Digital Speakers
Gigabit Ethernet
Wifi
5x I2C
3x SPI
2x CAN
Nvidia is focusing heavily on Artificial Intelligence for Robotics, Automotive and Security; the new Jetson TX2 will really speed up embedded applications that require high computational power.
Next, the unboxing video and a few photos of the development kit… new videos will follow in the coming days, exploring the CUDA and computer vision capabilities:
Nvidia Jetson TX2 Dev kit just unpacked
CSI Camera module
PCI express and SATA connectors
USB3, USB2, HDMI, SD Card and Wifi antennas
The dawn of the new Jetson... Nvidia Jetson TX2 has come @NVIDIAEmbedded @nvidia @NVIDIA_IT #JetsonTX2
myzharbot · 9 years ago
Text
A simple talk about Stereo Vision
A simple talk about Stereo Vision #OR16 @OfficineRobotic @NVIDIATegra @stereolabs3D
Last weekend, 21-22 May, the annual robotics event organized by the non-profit association Officine Robotiche was held in Rome. For the third consecutive year I gave an introductory talk dedicated to artificial vision applied to robotics, to give the audience a gentle approach to a subject that is too often taken for granted, when in reality it is much more complicated than it may seem.
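To give a taste of the topic, the core of every stereo pipeline is computing a disparity map from two rectified views and turning disparity into depth. A minimal sketch with OpenCV 3’s block matcher (the file names and matcher parameters are only examples):

```cpp
#include <opencv2/opencv.hpp>

int main()
{
    // Assumes the two views are already rectified (row-aligned epipolar lines);
    // in a real setup this requires cv::stereoCalibrate + cv::stereoRectify.
    cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
    cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);

    // Block matching: 64 disparity levels, 21x21 correlation window
    cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(64, 21);
    cv::Mat disp16;                 // fixed-point disparities (value * 16)
    bm->compute(left, right, disp16);

    cv::Mat disp8;
    disp16.convertTo(disp8, CV_8U, 255.0 / (64.0 * 16.0));
    cv::imshow("disparity", disp8);
    cv::waitKey(0);
    return 0;
}
```

Depth then follows from Z = f·B/d: the larger the disparity d, the closer the object, which is why the baseline B bounds the useful range of the sensor.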
View On WordPress
myzharbot · 9 years ago
Text
It’s been a long time since the latest post… the project has not stopped, but free time is scarce, so blog updates are confined to limited time slots.
Those who follow the news on Twitter, Facebook and YouTube know that this year too MyzharBot was demoed in Nvidia’s booth at the GTC 2016 conference at the beginning of May, illustrating the processing power of the amazing Nvidia Jetson TX1 applied to mobile robotics. It was a really tiring week, but it was a pleasure to see how such a “simple” project can attract so many GPU geniuses.
After GTC 2016, the next event in the timeline is Officine Robotiche 2016. MyzharBot will be exhibited in Rome on the 21st and 22nd of May, and it will be very different from the version presented at GTC.
During GTC the tracks showed a lot of limitations due to a bad design of the “toothed sprocket” that transmits power from the motors to the rubber tracks. The high friction between the teeth of the sprocket and the teeth of the track wastes a lot of power, and the robot struggles to turn in place on several kinds of surfaces.
So I decided to design two new sprockets and to make them by 3D printing… the new track will be ready in the next few days. This is the 3D rendering:
New track structure
I also decided to add a number of extra bearings to remove any other possible source of friction… a lot of new tests are yet to come.
When MyzharBot-v4 reaches good stability, I will publish the design of the new chassis and the new parts list… I started using Actobotics components since there is now an Italian importer, Steplab… really good components!
Despite all the problems listed above, MyzharBot-v4 works… and it finally works outdoors, as you can see from the new videos available on YouTube… of course it’s totally autonomous; MyzharBot is a robot, not a radio-controlled toy car 😉
Latest news… the project is going on
myzharbot · 9 years ago
Text
Nvidia Jetson TX1 as Access Point and a guide to configure ROS-OpenCV for Jetsons
Nvidia Jetson TX1 as AP and how to configure ROS-OpenCV @NVIDIATegra @rosorg @Jetsonhacks @raffaello86 @OSRFoundation
While making the final preparations for MyzharBot-v4’s participation in Nvidia’s upcoming GTC 2016 conference (San Jose, California, April 4-7, 2016), I realized that it would be very helpful to have two guides covering two basic operations for using the Nvidia Jetson TX1 on a robot:
Configure the Nvidia Jetson TX1 to act as a WiFi Access Point (a minimal sketch follows below)
Hack the ROS configuration to let…
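As a taste of the first guide, the access-point setup boils down to a handful of NetworkManager commands. A minimal sketch, assuming the Ubuntu image flashed by JetPack; the SSID, password and interface name are placeholders:

```sh
# Create a hotspot profile on the TX1 WiFi interface
nmcli con add type wifi ifname wlan0 con-name robot-ap autoconnect yes ssid MyzharBot
# Access-point mode, 2.4 GHz band, NAT + DHCP toward the clients
nmcli con modify robot-ap 802-11-wireless.mode ap 802-11-wireless.band bg ipv4.method shared
# WPA2-PSK protection
nmcli con modify robot-ap wifi-sec.key-mgmt wpa-psk wifi-sec.psk "ChangeMe123"
nmcli con up robot-ap
```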
View On WordPress
myzharbot · 9 years ago
Text
The power connector on the carrier board of the Nvidia Jetson TX1 is really solid and well made, but the strong vibrations of a tracked robot can compromise its stability, so I decided to modify it to make sure that MyzharBot-v4 has a really robust power connection.
A full description of the operation is available here.
[HowTo] Hacking Jetson TX1 power connector @NVIDIATegra @nvidia @NVIDIA_IT @OfficineRobotic
myzharbot · 9 years ago
Text
[Video] Testing the new uNavSB H-Bridge board for uNav Motor Controller
[Video] Testing the new uNavSB H-Bridge board for uNav Motor Controller @ST_World @KatodoStore @OfficineRobotic
The motors that move MyzharBot are really powerful, far too powerful for the motor drivers mounted on the little uBridge shield originally used by the amazing uNav Motor Control board made by Officine Robotiche.
For this reason Mauro Soligo (under “very little” pressure) created the new uNavSB carrier board, which uses the ST L6205 Dual H-Bridge motor driver, plenty of power for…
View On WordPress