Geek+ Debuts First Vision Only Robot Solution, in collaboration with Intel, to Empower Smart Logistics with Depth Vision

BEIJING, Nov. 4, 2024 /PRNewswire/ — Geek+, in collaboration with Intel, launched the world’s first Vision Only Robot Solution equipped with the Intel Visual Navigation Modules, driving the digital and intelligent transformation of the logistics industry with cutting-edge technology.

As the world’s first vision-only robot solution equipped with the Intel Visual Navigation Modules, the Geek+ Vision Only Robot Solution pairs depth vision perception, enabled by the Intel RealSense camera, with deep algorithmic innovations in V-SLAM positioning, composite detection networks, and robot following. This allows for highly accurate positioning, navigation, and obstacle avoidance, helping enterprises cope effectively with diverse and complex logistics scenarios while improving both efficiency and accuracy.

Advancing Depth Vision Perception

The Geek+ Vision Only Robot Solution is equipped with the Intel Visual Navigation Module, which integrates the Intel RealSense camera. This camera features a novel all-in-one design that enables all depth calculations to be performed directly within the device, resulting in low power consumption and independence from specific platforms or hardware. The Intel RealSense camera provides core support for various vision-based AI solutions, and when paired with a dedicated visual processor, it accelerates the machine learning process and shortens the deployment cycle of automation solutions.
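As a rough illustration of how on-device depth from a RealSense camera is typically consumed (a minimal sketch using Intel's pyrealsense2 Python bindings, not Geek+'s production integration), an application opens a depth stream and reads per-pixel distances computed directly on the device:

```python
# Minimal sketch: reading on-device depth from an Intel RealSense camera
# via the pyrealsense2 bindings. Illustrative only; not Geek+'s integration code.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Request a 640x480 depth stream at 30 FPS; depth is computed on the camera itself.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    if depth_frame:
        # Distance in metres at the image centre, read straight from the device's depth map.
        distance = depth_frame.get_distance(320, 240)
        print(f"Distance at image centre: {distance:.3f} m")
finally:
    pipeline.stop()
```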

Thanks to the Intel RealSense camera, the Geek+ Vision Only Robot can observe, understand, and learn from its environment. Highly accurate and consistent depth data allows it to recognize its surroundings precisely and interact with them reliably.

In addition to the camera, the Intel Visual Navigation Module also includes the Robotic Vision Hub, which contains components such as the Intel Core i7-1270P processor and connection modules. This provides reliable computational support for algorithms running on the Geek+ Vision Only Robot, including V-SLAM positioning, composite detection networks, and robot following, while enabling cloud-edge collaboration through high-speed networks.

Collaborating to Drive Algorithmic Innovation for Smart Logistics Upgrades

Building on the Intel Visual Navigation Module, Geek+ is working with Intel to engage in deep algorithmic innovation related to V-SLAM positioning, composite detection networks, and robot following:

V-SLAM Positioning Algorithm: Fuses multi-sensor data and various visual feature elements to generate composite maps, such as point feature maps, line feature maps, object maps, and special area maps, delivering highly reliable and precise positioning in complex and dynamic environments.

Composite Detection Network: Combines a traditional object detection network with a validation network, processing detection data from multiple dimensions to enhance accuracy and reduce the false detection rate.

Robot Following: By integrating modules such as personnel detection, re-identification, and visual target tracking, Geek+ has developed a flexible and efficient visual perception pipeline. Once the relative position between the target person and the AMR is determined, the local planning algorithm in RoboGo, Geek+’s self-developed standalone robot system, enables autonomous obstacle avoidance so that the AMR follows the target person smoothly (a simplified sketch of this pipeline follows the list).
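For readers unfamiliar with the robot-following flow described above, the Python sketch below shows the general idea: an appearance-based re-identification step picks the enrolled person out of the detected people, and the pixel position plus a depth reading is converted into a relative offset that a local planner could consume. The data structures, threshold, and pinhole-camera parameters are illustrative assumptions, not Geek+'s actual implementation or RoboGo's API.

```python
import math
from dataclasses import dataclass
from typing import List, Optional, Sequence

# Hypothetical structures and helpers for a person-following pipeline; names are illustrative.

@dataclass
class Detection:
    cx: float                    # bounding-box centre column, pixels
    cy: float                    # bounding-box centre row, pixels
    embedding: Sequence[float]   # appearance feature used for re-identification

def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def reidentify(detections: List[Detection],
               target_embedding: Sequence[float],
               threshold: float = 0.7) -> Optional[Detection]:
    """Pick the detection whose appearance embedding best matches the enrolled target."""
    best = max(detections,
               key=lambda d: cosine_similarity(d.embedding, target_embedding),
               default=None)
    if best and cosine_similarity(best.embedding, target_embedding) >= threshold:
        return best
    return None

def relative_position(det: Detection, depth_m: float,
                      fx: float = 600.0, cx0: float = 320.0) -> tuple:
    """Convert a pixel column plus a depth reading into a (forward, lateral) offset in metres,
    using a simple pinhole-camera model with assumed focal length fx and principal point cx0."""
    lateral = (det.cx - cx0) / fx * depth_m
    return depth_m, lateral

# Example: one detected person matching the enrolled target, roughly 2.5 m ahead.
detections = [Detection(cx=400.0, cy=240.0, embedding=[0.9, 0.1, 0.2])]
target = reidentify(detections, target_embedding=[0.88, 0.12, 0.18])
if target:
    forward, lateral = relative_position(target, depth_m=2.5)
    print(f"Target is {forward:.2f} m ahead and {lateral:.2f} m to the side")
    # In the described solution, this relative offset would then be handed to the
    # local planner (RoboGo) to generate a smooth following motion with obstacle avoidance.
```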

Leveraging the depth perception of the Intel Visual Navigation Module and the collaborative algorithmic innovation from both sides, the Geek+ Vision Only Robot ensures high precision and efficiency in environmental perception, positioning, and tracking. This facilitates logistics and warehousing service providers to drive the transformation toward smart logistics. The robots are expected to see widespread adoption in areas such as factory and warehouse transportation, helping customers build agile, digital, and intelligent supply chains.

Solomon Lee, VP of Product at Geek+, stated: “The Vision Only Robot Solution, developed in collaboration with Intel, effectively leverages the depth vision perception of the Intel RealSense camera. Combined with the deep algorithmic innovations from both sides, it helps customers boost business growth and efficiency, driving the digital and intelligent upgrade of smart logistics.”

Mark Yahiro, Vice President of Corporate Strategy and Ventures and General Manager of the RealSense Business Unit within Intel’s Corporate Strategy Office, remarked: “Highly accurate and consistent depth vision data is critical for AMRs to achieve environmental perception, significantly influencing their performance in positioning, navigation, and obstacle avoidance. Through our collaboration with Geek+, we are driving AMR innovations based on depth vision data, enabling logistics robots to deliver highly stable and accurate transport services in complex environments, thereby empowering agile, digital, and intelligent supply chains.”

Going forward, Geek+ will continue to strengthen its collaboration with Intel, driving technological innovations to develop more smart logistics solutions. This aims to empower our customers, unlock new quality productive forces in the logistics sector, and lead the future of smart logistics innovation.

The world’s first Vision Only Robots equipped with the Intel Visual Navigation Modules will make their debut at CeMAT 2024. We warmly welcome you to visit the Geek+ booth (Hall N1, A3-2) and join us in exploring the limitless possibilities of smart logistics.

Disclaimer: Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries.

View original content to download multimedia: https://www.prnewswire.com/apac/news-releases/geek-debuts-first-vision-only-robot-solution-in-collaboration-with-intel-to-empower-smart-logistics-with-depth-vision-302295006.html

SOURCE Geek+
