By 2028, over 50% of large enterprises are expected to adopt some form of intralogistics smart robots in their warehouse or manufacturing operations.[1]
Many of today’s multi-robot solutions fall short in unstructured environments, where robots are increasingly expected to operate independently of human oversight. At the core of these limitations is the inherent inability of proprietary systems to work seamlessly with external platforms, a result of restricted autonomy and inadequate data-sharing features.
One of the most significant challenges in robotics today lies in unifying autonomous systems from different manufacturers and getting them to collaborate effortlessly. Imagine drones produced by industry peers working in harmony with our TAURUS Unmanned Ground Vehicles (UGVs) on a coordinated search-and-rescue mission, controlled through a single central platform. This is the future we are enabling.
For nearly a decade, we have harnessed deep technologies such as AI, video analytics, and advanced robotics control protocols to develop more flexible and intelligent systems.
Our robotics reusable common module, a centralised control module for robotics systems, spearheaded by our Group Engineering Centre (GEC), represents a pivotal step toward achieving true multi-robot interoperability and enhancing synergy across autonomous platforms.
Traditionally, robotics applications have thrived in structured industrial settings. Building upon our expertise in autonomous solutions across air, land, and sea domains, we developed the module to extend these capabilities for wide-ranging, real-world environments.
Leveraging the transformative power of Generative AI (GenAI), our robotics reusable common module can quickly interpret and translate different manufacturers’ interface documentation and codes to generate interface modules – integrating these robots into its planning and management functions.
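The idea of machine-generated interface modules can be illustrated with a minimal sketch. The names here (`RobotInterface`, `MoveCommand`, `VendorXAdapter`) and the vendor payload format are illustrative assumptions, not the module's actual API: the point is that every generated adapter conforms to one common interface the planner can call, while the adapter body encodes vendor-specific conventions learned from that manufacturer's documentation.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Hypothetical common command format used by the central planner.
@dataclass
class MoveCommand:
    x: float       # metres
    y: float       # metres
    speed: float   # fraction of maximum speed, 0.0-1.0

class RobotInterface(ABC):
    """Uniform interface every generated adapter must implement."""
    @abstractmethod
    def move_to(self, cmd: MoveCommand) -> dict: ...

# Example of the kind of adapter a GenAI pipeline might emit after
# reading a vendor's interface documentation: it maps the common
# command onto that vendor's (hypothetical) JSON API payload.
class VendorXAdapter(RobotInterface):
    def move_to(self, cmd: MoveCommand) -> dict:
        # This fictional vendor expects millimetres and a 0-100 throttle.
        return {
            "op": "GOTO",
            "pos_mm": [int(cmd.x * 1000), int(cmd.y * 1000)],
            "throttle": min(100, int(cmd.speed * 100)),
        }
```

Once such an adapter exists, the planner issues only `MoveCommand` objects and never needs to know which manufacturer's robot is on the other end.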
Our robotics reusable common module applies GenAI to create reusable planning modules, empowering multiple robotic systems to autonomously navigate unfamiliar environments. With this capability, a single robot can map out the terrain, and our module will then convert this data into shareable floor plans for other connected robots to use. This functionality has broad applications, such as enhancing efficiency in search-and-rescue missions and revolutionising critical mission planning.
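The map-sharing step can be sketched as a simple occupancy-grid exchange. This is an assumption about the representation, not the module's actual data format: one robot's detected obstacles become a grid that the module republishes, and scans from several robots can be fused into a single shared floor plan.

```python
# Minimal sketch of map sharing: one robot's scan becomes an occupancy
# grid that the central module can distribute to every connected robot.
OBSTACLE, FREE = 1, 0

def build_shared_grid(width, height, obstacle_cells):
    """Convert one robot's detected obstacle cells into a shareable grid."""
    grid = [[FREE] * width for _ in range(height)]
    for row, col in obstacle_cells:
        grid[row][col] = OBSTACLE
    return grid

def merge_scans(grid_a, grid_b):
    """Fuse two robots' grids conservatively: any cell either robot saw
    as an obstacle is treated as an obstacle in the shared plan."""
    return [
        [OBSTACLE if OBSTACLE in (a, b) else FREE
         for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(grid_a, grid_b)
    ]
```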
Within the module, AI also enables dynamic task allocation based on each robot's unique capabilities. For instance, an aerial drone can intelligently assess an ongoing situation and then delegate tasks like data collection or obstacle navigation to ground-based UGVs. This shifts the burden of task assignment away from operators, allowing them to focus on high-level, mission-critical decisions.
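Capability-based task allocation can be sketched as follows. The greedy first-fit strategy and the capability names here are illustrative assumptions; the production module would weigh richer factors (battery, position, sensor payload), but the shape of the decision is the same: match each task's requirement against what each robot advertises.

```python
# Hedged sketch of capability-based task allocation: each task goes to
# the first idle robot that advertises the required capability.
def allocate_tasks(tasks, robots):
    """tasks: list of (task_name, required_capability) pairs.
    robots: dict mapping robot_id -> set of capability strings.
    Returns dict mapping task_name -> robot_id (None if no robot fits)."""
    assignment, busy = {}, set()
    for name, needed in tasks:
        assignment[name] = None
        for robot_id, capabilities in robots.items():
            if robot_id not in busy and needed in capabilities:
                assignment[name] = robot_id
                busy.add(robot_id)
                break
    return assignment
```

In the article's scenario, an aerial drone holding the survey capability would take the reconnaissance task, while data collection falls to a ground UGV, with no operator involvement in the matching.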
Essentially, our robotics reusable common module accelerates the holistic integration of diverse robotics platforms, automating previously manual programming tasks and boosting efficiency by up to 60%. Built on years of continuous refinement from previous projects, it delivers enhanced product features for superior performance.
Incorporating a large language model (LLM) agent into our module has created a more natural and intuitive way for users to interact with it. Beyond improving user experience, this advancement reduces the time and cost of system training, cuts down on human errors, and lowers the development effort needed to deploy our module across different domains. By simplifying interactions and enhancing efficiency, we are making multi-robot management more accessible and scalable.
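The role of the LLM agent can be illustrated with a minimal sketch: free-form operator text goes in, a structured mission command the planner can execute comes out. The `call_llm` function below is a placeholder for whatever model endpoint is actually used; a trivial keyword rule stands in so the sketch runs offline, and the JSON schema is an assumption for illustration only.

```python
import json

# Placeholder for the real model call: in deployment this would query
# an LLM endpoint. A trivial keyword rule stands in so the sketch runs
# offline and deterministically.
def call_llm(prompt: str) -> str:
    if "search" in prompt.lower():
        return json.dumps({"mission": "search", "area": "sector_7"})
    return json.dumps({"mission": "idle"})

def parse_operator_request(text: str) -> dict:
    """Ask the (stubbed) LLM for a JSON mission command and validate
    that it carries the field the planner requires."""
    reply = json.loads(call_llm(text))
    if "mission" not in reply:
        raise ValueError("LLM reply missing 'mission' field")
    return reply
```

Validating the model's output against an expected schema before handing it to the planner is what keeps a conversational front end safe to use for mission control.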
Our vision is to create a full-fledged agentic AI for robot planning and management, by including a learning component that allows it to understand user preferences over time and take actions that are aligned with them.
As we expand across industries and applications, our system will learn the underlying business logic specific to each domain, allowing smarter automation and deeper integration into different applications. This learning capability will be introduced in 2025, paving the way for AI that not only assists but adapts dynamically to user needs.
[1] Klappich, D. (2023, May 2). The Future of Robotics: Orchestrating the Heterogeneous Robot Fleet. Gartner.
Copyright © 2025 ST Engineering