#nvidiaembedded
Explore tagged Tumblr posts
shubhambhatt · 3 years ago
Photo
Tumblr media
Did it turn out ok? 🤔 Looks kind of sketchy 😑. #arduino #raspberrypi #nvidiaembedded #3dprinting #robotics #robot #automation #instructables #education #esp32 #esp8266 #artificialintelligence #machinelearning #ai #metaverse #nft #internetofthings #iot #design #engineering #innovation #programming #coding #science #experiment #design #sketching #drawthisinyourstyle #matlab #physics #indianroboticscommunity https://www.instagram.com/p/CYvSYGgL4bd/?utm_medium=tumblr
0 notes
ryracing · 6 years ago
Photo
Tumblr media
Excited to start moving my dev over to the Jetson Xavier platform, thank you Joe!! #adlinktech #nvidiaembedded #ryracing (at Austin, Texas) https://www.instagram.com/p/B1jjgwElEzX/?igshid=tniuq3aqbhtw
0 notes
astronautabby · 5 years ago
Photo
Tumblr media
#Sponsored Say hi to my new #AI! As an advocate for #stemeducation, I’m a huge fan of NVIDIA’s new Jetson Nano 2GB Developer Kit which provides hands-on practice with robotics and AI for young people at an accessible price point! Pre-order Now by clicking the #linkinbio! Big thanks @NVIDIAEmbedded for sending me their new #jetsonnano product! I had a ton of fun unboxing this. Which projects do you think I should make first with the Jetson Nano? #ad #AI #robotics #STEMlearning #womeninstem #nvidia #nasa #stemgirls #distancelearning https://www.instagram.com/p/CGYeuIjFWXb/?igshid=1ct1o8cqjx0kz
5 notes · View notes
ipv1 · 6 years ago
Photo
Tumblr media
@nvidiajetson @nvidiaembedded why oh why is it still running unity!? Why!!!! — view on Instagram http://bit.ly/2UxJx6p
0 notes
urielfanelli · 6 years ago
Text
RT @KeinPfusch: RT @seeedstudio: A tutorial on @Hacksterio explained how to run #Caffe on #Jetson Nano. 👉 https://t.co/wKXFPHOnHn Jetson Nano can run a wide variety of advanced networks, including the full native versions of popular ML frameworks like #TensorFlow, #PyTorch, Caffe @NVIDIAEmbedded https://t.co/WQRsMa055E
A tutorial on @Hacksterio explained how to run #Caffe on #JetsonNano.
View On WordPress
0 notes
kayawagner · 7 years ago
Text
E-Commerce Giants Select NVIDIA Jetson AGX Xavier for Next-Gen Delivery Robots
Leading China e-commerce companies JD.com and Meituan have selected the NVIDIA Jetson AGX Xavier platform to power their next-generation autonomous delivery robots.
Over the past decade, China’s e-commerce has grown to represent more than 40 percent of the world’s online transactions — accounting for hundreds of billions of dollars in trade annually. That’s larger than the combined total in the U.S., Japan, Germany, UK, and France, according to the McKinsey Global Institute.
To support this tremendous growth, China’s leading e-commerce companies are developing AI delivery robots to move goods to and from warehouses, speed up last-mile deliveries and deliver products to consumers and businesses.
Requirements for Next-Gen Delivery Robots
Next-generation delivery robots require massive computing performance in a small package. Various sensors, including multiple high-resolution cameras and lidar, must perceive the world around them to localize, path plan and move in complex, dynamic city environments. They need to identify — and respond to — pedestrians, cars, traffic lights, signs and other objects, all in real time.
Jetson AGX Xavier is perfect for that. High performance and energy efficient, it handles real-time processing of all of these computing tasks, so that delivery robots can perform safely and autonomously. Capable of up to 32 trillion operations per second, the module delivers the processing capability of a powerful workstation with greater than 10x the energy efficiency of its predecessor yet fits in the palm of your hand.
Since autonomous machines like delivery robots are software-defined, it’s easy to improve performance and add more functionalities through updates. The Jetson AGX Xavier platform comes with the NVIDIA JetPack SDK, a full AI development software solution, which includes the latest versions of CUDA, cuDNN and TensorRT, as well as high-level software code to simplify and accelerate development.
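As a rough, hypothetical illustration of the JetPack workflow mentioned above, the sketch below builds a TensorRT engine from an ONNX model in Python. The model and output paths are placeholders, and the calls assume a TensorRT 8.x-style Python API rather than the exact version bundled with Xavier-era JetPack releases.

```python
# Minimal sketch: build a TensorRT engine from an ONNX model on a Jetson.
# Assumes a TensorRT 8.x-style Python API; "model.onnx" is a placeholder path.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path="model.onnx"):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # Parse the ONNX graph into the TensorRT network definition.
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("ONNX parse failed")

    config = builder.create_builder_config()
    config.set_flag(trt.BuilderFlag.FP16)  # Jetson GPUs benefit from FP16
    return builder.build_serialized_network(network, config)

if __name__ == "__main__":
    engine_bytes = build_engine()
    with open("model.engine", "wb") as f:
        f.write(engine_bytes)
```

The serialized engine can then be deserialized at runtime for low-latency inference on the module, which is the pattern the JetPack tooling is built around.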
JD.com: Autonomously Moving Goods Into Neighborhoods
JD.com is China’s largest retailer, with more than 300 million active customers and the largest in-house e-commerce logistics network covering 99% of the population. It’s the first e-commerce company in the world to launch a fully-automated B2C fulfillment center and commercially deployed drone delivery. To satisfy growing delivery demands, JD.com began trials of autonomous delivery robots last year on university and business campuses.
JD’s delivery robot is outfitted with multiple high-definition sensors connected to a Jetson AGX Xavier to provide 360-degree real-time vision and perception processing for full situational awareness of the environment. By doing this, the delivery robot can easily navigate crowded streets, autonomously plan its route to customers, avoid obstacles, and recognize traffic lights.
“Our delivery robots need a platform that has significant computing power and can interface with different types of sensors to safely drive outdoors in dynamic and unstructured environments, along with interacting with humans,” said Qi Kong, who leads development of JD.com’s robotics efforts. “This is why we selected Jetson AGX Xavier — a first-of-its-kind platform for autonomous machines.”
Meituan: Enabling Food on the Move Through AI
Meituan Dianping is the world’s largest on-demand food delivery company. Combining the business models of Uber Eats, Yelp and Groupon, it works with more than 400,000 local businesses. Meituan Dianping launched its Xiaodai (meaning “pouch”) autonomous delivery vehicle to move meals from restaurants to consumers.
Xiaodai is being tested in three locations, including Joy City Mall in Beijing’s Chaoyang District, Lenovo’s offices in Shenzhen, and the city of Xiong’an.
While the delivery vehicle has a small size and battery, the amount of processing required for sensing, positioning and planning is the same as for large unmanned vehicles, and it relies heavily on high-performance computing from Jetson AGX Xavier.
“Unmanned delivery vehicles are vital to the development of the logistics industry, which can greatly improve distribution and delivery,” said Xiahua Xia, general manager at Meituan. “We’re looking forward to leveraging Jetson AGX Xavier’s powerful AI capabilities to advance our Xiaodai autonomous delivery robot.”
NVIDIA Jetson AGX Xavier Developer Kit Delivers
The NVIDIA Jetson AGX Xavier developer kit is available now for $1,299. Attendees of GTC China, where these delivery robots and more are on display this week, can purchase the kits onsite through local distributor Synnex.
Follow @NVIDIAEmbedded for all of the latest news in autonomous machines.
The post E-Commerce Giants Select NVIDIA Jetson AGX Xavier for Next-Gen Delivery Robots appeared first on The Official NVIDIA Blog.
E-Commerce Giants Select NVIDIA Jetson AGX Xavier for Next-Gen Delivery Robots published first on https://supergalaxyrom.tumblr.com
0 notes
myzharbot · 8 years ago
Text
Embedded Computer Vision Real Time? Nvidia Jetson TX2 + VisionWorks toolkit
Embedded Computer Vision Real Time? Nvidia Jetson TX2 + VisionWorks toolkit @nvidia @NVIDIAembedded @nvidia_IT
Computer vision is a computationally demanding task. If you work with computer vision, you know that reaching very high frame rates requires maximizing the parallelism of your algorithms, and that this kind of massive parallelism is only achievable by moving the computation to GPUs. Massively parallel embedded computing was a dream for computer vision until Nvidia launched the Jetson TK1 board in 2014.…
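As a concrete, hypothetical example of moving per-frame work onto the GPU, the sketch below uses OpenCV's CUDA module. It assumes an OpenCV build compiled with CUDA support (as is typical on Jetson boards); the video path and resize target are placeholders, not values from the original post.

```python
# Minimal sketch: offload per-frame resize/color conversion to the GPU
# via OpenCV's CUDA module. Assumes OpenCV was built with CUDA support
# (cv2.cuda available); "input.mp4" is a placeholder path.
import cv2

cap = cv2.VideoCapture("input.mp4")
gpu_frame = cv2.cuda_GpuMat()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gpu_frame.upload(frame)                          # host -> device copy
    gpu_small = cv2.cuda.resize(gpu_frame, (640, 360))
    gpu_gray = cv2.cuda.cvtColor(gpu_small, cv2.COLOR_BGR2GRAY)
    gray = gpu_gray.download()                       # device -> host copy
    # ... feed `gray` into the rest of the vision pipeline ...

cap.release()
```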
View On WordPress
0 notes
shubhambhatt · 3 years ago
Photo
Tumblr media
Gaganyaan crew module 🚀. The model is not fully accurate; I have added and removed a few details to fit all the electronics and make it 3D printable without supports. Let me know if you are interested in stuff like this, or what you would like to see next. I am working on the simulator part and will release the code and tutorial, maybe in a month. Gaganyaan is an Indian crewed orbital spacecraft intended to be the formative spacecraft of the Indian Human Spaceflight Programme. The spacecraft is being designed to carry three people, and a planned upgraded version will be equipped with rendezvous and docking capability. Download: https://grabcad.com/library/gaganyaan-crew-module-3d-printable-1 100% PLA, 0.3 mm #arduino #3dprinting #raspberrypi #electronics #embeddedsystems #robotics #robot #automation #education #artificialintelligence #ai #indianroboticscommunity #internetofthings #iot #ai #design #engineering #innovation #programming #coding #science #spacex #nasa #education #stem #hobby #nvidiaembedded #electronics #opensource #innovation #motivation #chandrayaan3 #experiment #isro https://www.instagram.com/p/CZ7FWGvvjzF/?utm_medium=tumblr
0 notes
shubhambhatt · 3 years ago
Photo
Tumblr media
Cube rover or Cute rover? 😉 I designed it last year. It is 3D printed and open source, uses a NodeMCU, and communicates over WiFi. Inspired by 👉 @astrobotictechnology. They uploaded some testing footage on their YouTube channel a while ago; please watch it to get an idea of how this rover works. I will post a tutorial and demo video soon. I have already shared the code and CAD files on GitHub if anyone would like to try it. CAD files: https://grabcad.com/library/cube-rover-by-astrobotic-1 #arduino #3dprinting #robotics #robot #automation #nasa #spacex #cubesat #cuberover #isro #instructables #hobby #education #ai #engineering #electronics #technology #internetofthings #iot #ai #design #engineering #innovation #programming #coding #science #nvidiaembedded #design #diy #esp32 #experiment https://www.instagram.com/saste.jugaad/p/CYY8Dp5PBxM/?utm_medium=tumblr
1 note · View note
ipv1 · 6 years ago
Photo
Tumblr media
@nvidiajetson @nvidiaembedded unboxed. Pretty small for what it's supposed to do. #artificialintelligence — view on Instagram http://bit.ly/2Zt3v6d
0 notes
kayawagner · 7 years ago
Text
Getting Brainy in Brisbane: NVIDIA Talks Robots, Research at ICRA
We’re bringing NVIDIA researchers — the brains behind our bots — to the International Conference on Robotics and Automation (ICRA) in Brisbane, Australia, from May 21-25. And they want to meet you.
Held annually since 1984, ICRA has become a premier forum for robotics researchers from across the globe to present their work.
The conference is a great opportunity to meet our team, go in-depth with our recent work shaping robotics research and development, and learn how NVIDIA GPUs and AI are powering the biggest advancements in autonomous machines.
From conference talks and poster sessions to two nights of meetups, ICRA will be chock-full of opportunities to connect with some of the sharpest minds in robotics and automation. You can score a deal on an NVIDIA Jetson TX2 Developer Kit, too.
Stop by ICRA stands 7 and 8 to sync with our recruiting team and learn more about careers at NVIDIA.
Get Some Face Time with the Brains Behind Our Bots
Two amazing new members of our robotics team will be at ICRA all week long.
Meet the NVIDIA researchers who are driving the latest robotic innovations.
Claire Delaunay, vice president of engineering, will be hosting evening meetups May 21 and 22 at the iconic Fox Hotel (more on that below). She’ll be joined by Dieter Fox, who heads up our robotics lab.
For more than a decade, Delaunay has led robotics teams at startups, research labs and big companies, including Google, where she was the program lead.
Most recently she co-founded Otto, which was acquired by Uber, where she served as director of engineering before coming to work with us to develop robotic solutions.
Fox joined NVIDIA to head our robotics research lab in Seattle. The goal of the lab is to develop the next generation of robots that can robustly manipulate the physical world and interact with people naturally.
He also runs the University of Washington Robotics and State Estimation Lab, where his research focuses on robotics with strong connections to AI, computer vision and machine learning.
Join Fox and his colleagues throughout the week at conference talks and poster sessions:
“Re3: Real-Time Recurrent Regression Networks for Visual Tracking of Generic Objects” – Robust object tracking requires knowledge and understanding of the object being tracked: its appearance, its motion and how it changes over time. A tracker must be able to modify its underlying model and adapt to new observations. Re3 is a real-time deep object tracker capable of incorporating temporal information into its model.
“SE3-Pose-Nets: Structured Deep Dynamics Models for Visuomotor Planning and Control” – This talk describes an approach to deep visuomotor control using structured deep dynamics models. Our deep dynamics model, a variant of SE3-Nets, learns a low-dimensional pose embedding for visuomotor control via an encoder-decoder structure.
“Synthetically Trained Neural Networks for Learning Human-Readable Plans from Real-World Demonstrations” – This talk presents a system to infer and execute a human-readable program from a real-world demonstration. It consists of a series of neural networks to perform perception, program generation and program execution. The networks are trained entirely in simulation, and the system is tested in the real world on the pick-and-place problem of stacking colored cubes using a Baxter robot.
Examples of object detection from image-centric domain randomization, showing the seven detected vertices.
We Want to Meet You at Our Meetups
After the conference on Monday and Tuesday evenings, our Jetson meetups at the iconic Fox Hotel will be the place to be.
Delaunay, Fox and other NVIDIA researchers — along with our developers and partners — will be on hand to connect with you over good drinks and great food.
We’ll have talks from NVIDIA Research and technology demos that you won’t want to miss. It’s a chance for you to listen, learn and connect with industry luminaries and peers in a fun, relaxed setting.
During the meetup, there’ll be special pricing on the Jetson TX2 Dev Kits for just AUD $599. Space is limited, so register today.
The Jetson TX2 Dev Kit is the best tool for all of your robot needs.
Follow @NVIDIAEmbedded and #brainsbehindthebots for all of the latest news.
The post Getting Brainy in Brisbane: NVIDIA Talks Robots, Research at ICRA appeared first on The Official NVIDIA Blog.
Getting Brainy in Brisbane: NVIDIA Talks Robots, Research at ICRA published first on https://supergalaxyrom.tumblr.com
0 notes
myzharbot · 8 years ago
Text
First boot of the Nvidia Jetson TX2 and CUDA test
First boot of the Nvidia Jetson TX2 and CUDA test @NVIDIAEmbedded @nvidia @NVIDIA_IT
Yesterday, March 7th, Nvidia launched the new Jetson TX2. In the launch post I described the differences from the previous Jetson and how the new Jetson TX2 can double the performance of the TX1.
Now it's time to see how it really performs, so let's set aside the written word and start watching the new video:
Stay tuned for the next…
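For readers who want a comparable first CUDA sanity check on their own freshly flashed board, here is a minimal sketch in Python. It assumes the pycuda package is installed on top of the JetPack CUDA toolkit and is not the exact test shown in the video.

```python
# Minimal sketch: query the Jetson's GPU from Python as a first CUDA
# sanity check. Assumes pycuda is installed; not the test from the video.
import pycuda.driver as cuda

cuda.init()
print("CUDA devices found:", cuda.Device.count())

dev = cuda.Device(0)
major, minor = dev.compute_capability()
print("Device name:        ", dev.name())
print("Compute capability: ", f"{major}.{minor}")
print("Total memory (MiB): ", dev.total_memory() // (1024 * 1024))
```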
View On WordPress
0 notes
myzharbot · 8 years ago
Text
Well, it's been a while since I published the latest news about the MyzharBot project, and what better way to start again than with some really big news? Nvidia has presented the new Jetson module, the Nvidia Jetson TX2, and it's ready to get on board MyzharBot.
So what about the new “embedded monster” released by Nvidia? While the step from the Jetson TK1 to the Jetson TX1 was very big, this time Nvidia focused on continuity: the new Jetson TX2 is a module very similar to the Jetson TX1, with the same dimensions and, more importantly, pin-to-pin compatibility with its predecessor. We can continue to use the carrier boards we already use with the Jetson TX1, but with much more computational power.
Nvidia Jetson TX2
So what's new with respect to the Jetson TX1… sit down and read carefully, it's impressive:
GPU: from Maxwell to Pascal, running at up to 1.3 GHz
CPU: a dual-core 64-bit Denver (up to 2.0 GHz) added alongside the existing quad-core 64-bit A57
Memory: from 4 GB 64-bit LPDDR4 to 8 GB 128-bit LPDDR4
Storage: eMMC doubled to 32 GB
Video encode: from 2160p @ 30 FPS to 2160p @ 60 FPS
So what can we do with the new Jetson TX2? We can run six DNNs while performing 2x object tracking, decoding two 4K 30 FPS streams, compositing the result and encoding it into two H.265 streams… quite impressive!
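To make the decode/encode side of that claim a little more concrete, the sketch below drives the Jetson's hardware H.265 encoder through GStreamer from Python. It is only a sketch under assumptions: the element names (omxh264dec, omxh265enc, nvvidconv) match TX2-era L4T releases and may differ on newer JetPacks, and the file names are placeholders.

```python
# Minimal sketch: transcode a file to H.265 using the Jetson's hardware
# encoder via GStreamer. Element names (nvvidconv, omxh264dec, omxh265enc)
# match TX2-era L4T releases and may differ on newer JetPacks; file names
# are placeholders.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "filesrc location=input.mp4 ! qtdemux ! h264parse ! omxh264dec ! "
    "nvvidconv ! omxh265enc ! h265parse ! matroskamux ! "
    "filesink location=output.mkv"
)
pipeline.set_state(Gst.State.PLAYING)

# Block until end-of-stream or an error is posted on the bus.
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```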
Nvidia has not only worked on the hardware; new software is coming too. The new JetPack 3.0 will be released soon, and it is highly focused on Artificial Intelligence:
Deep Learning: TensorRT 1.0 GA, cuDNN 5.1, DIGITS Workflow
Computer Vision: VisionWorks 1.6, OpenCV4Tegra 2.4.13 (OpenCV 3 is available by compiling from source; Nvidia has submitted many of its Tegra improvements upstream)
GPU Compute: CUDA 8.0.64, CUDA Libs
Multimedia: ISP Support, Camera Imaging, Video CODEC
More: ROS compatibility, OpenGL 4.5, OpenGL ES 3.2, OpenGL EGL 1.4, Vulkan 1.0.3, GStreamer, advanced developer tools… and much more
Connectivity has also been improved:
3x USB 2
3x USB 3
PCIe 1×4 + 1×1 or 2×1 + 1×2
6x CSI cameras
Support for DSI display
Support for HDMI display
Digital MIC
Digital Speakers
Gigabit Ethernet
Wifi
5x I2C
3x SPI
2x CAN
Nvidia is focusing heavily on Artificial Intelligence for Robotics, Automotive and Security, and the new Jetson TX2 will really speed up embedded applications that require high computational power.
Next, the unboxing video and a few photos of the development kit… new videos will follow in the coming days exploring the CUDA and Computer Vision capabilities:
Nvidia Jetson TX2 Dev kit just unpacked
CSI Camera module
PCI express and SATA connectors
USB3, USB2, HDMI, SD Card and Wifi antennas
The dawn of the new Jetson... Nvidia Jetson TX2 has come @NVIDIAEmbedded @nvidia @NVIDIA_IT #JetsonTX2 Well, it's been a while since I published the latest news about the MyzharBot project, and what better way to start again than with some really big news?
0 notes