Posts

Iván Hernández Dalas: Nebius and NVIDIA collaborate for physical AI cloud

Nebius has integrated the NVIDIA Physical AI Data Factory Blueprint into its global-scale AI infrastructure. | Credit: Nebius

Nebius Group N.V. and NVIDIA Corp. claim to have solved robotics development’s “three-computer problem,” where engineering teams waste up to 40% of their time stitching together incompatible systems. The companies have partnered to offer an integrated cloud platform that handles everything from AI training to edge deployment for robot developers. “Physical AI is going to be one of the defining technology shifts of this decade, and the teams building it today are being held back by infrastructure and tooling that was never designed for those workloads,” said Evan Helda, head of physical AI at Nebius. “Working with NVIDIA, we are building the execution layer for the entire physical AI ecosystem — so that any team, anywhere, can go from idea to deployed robot at the speed the market demands.” NVIDIA is presenting its annual GTC user conference this week in S...

Iván Hernández Dalas: Aetina shows 3D vision and enterprise generative AI at GTC 2026

Aetina is showing its latest integrated AI systems at GTC in San Jose, Calif. Source: Aetina Corp.

Aetina Corp., a leading global provider of edge AI systems, is demonstrating high-precision 3D vision technology and enterprise-grade generative AI agents this week at GTC 2026. The company said its systems use NVIDIA’s Blackwell architecture to enable robots and automated systems “to advance toward end-to-end autonomy.” In physical AI application scenarios, robots and automation require real-time, reliable spatial perception. They also need a “decision-making brain” capable of executing complex task reasoning at the edge for dynamic orchestration amid shifting tasks, exceptions, and changing operational priorities, said Aetina. The company claimed that its presence at GTC highlights its role as a key infrastructure enabler for both robotic and enterprise autonomy. Aetina said it provides the “eyes of perception” and the “brain of intelligence” required for autonomous systems, acc...

Iván Hernández Dalas: Graphene-based sensor to improve robot touch

Schematic showing the materials used in the sensor and the sensing array on a robotic manipulator. Figure from Multiscale-structured miniaturized 3D force sensors. Reproduced under a CC BY 4.0 licence.

Robots are becoming increasingly capable in vision and movement, yet touch remains one of their major weaknesses. Now, researchers have developed a miniature tactile sensor that could give robots something much closer to a human sense of touch. The technology, developed by researchers at the University of Cambridge, is based on liquid metal composites and graphene – a two-dimensional form of carbon. The ‘skin’ allows robots to detect not just how hard they are pressing on an object, but also the direction of applied forces, whether an object is slipping, and even how rough a surface is, at a scale small enough to rival the spatial resolution of human fingertips. Their results are reported in the journal Nature Materials. Human fingers rely on multiple types of mechanoreceptors to ...
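The excerpt describes a sensor that reports normal force, shear direction, and incipient slip. A minimal sketch of how such a 3D force reading might be interpreted — not the Cambridge team’s method; the Coulomb-friction slip test and the friction coefficient are illustrative assumptions:

```python
import math

def decompose_force(fx, fy, fz):
    """Split a 3D contact force into normal load, shear magnitude, and shear direction."""
    shear = math.hypot(fx, fy)        # tangential (shear) magnitude
    direction = math.atan2(fy, fx)    # shear direction in radians
    return fz, shear, direction

def slip_likely(normal, shear, mu=0.5):
    """Coulomb-friction heuristic: slip is likely once shear exceeds mu * normal load.

    mu is a hypothetical friction coefficient chosen for illustration.
    """
    return shear > mu * normal

# Example reading: light tangential load (0.3, 0.4) N, firm 2.0 N press
normal, shear, theta = decompose_force(0.3, 0.4, 2.0)
print(normal, shear, slip_likely(normal, shear))
```

A real sensing array would produce one such force vector per taxel, letting a controller localize where on the fingertip slip begins.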

Iván Hernández Dalas: Don’t miss Neuralink pioneer Noland Arbaugh’s keynote at the 2026 Robotics Summit

Noland Arbaugh is the first patient for the Neuralink brain-computer interface.

Arbaugh will be onstage at Robotics Summit & Expo 2026 in a closing keynote conversation with Steve Crowe, editorial director of The Robot Report. Arbaugh was injured in his early 20s after a shallow water diving accident that left him with tetraplegia. In January 2024, Arbaugh became the first human recipient of Neuralink’s investigational brain-computer interface implant as part of the company’s clinical trial. The Neuralink N1 implant has given Arbaugh agency and independence back in his life. The interface uses 64 flexible threads carrying 1,024 electrodes to record neural activity in the motor cortex and translate intended movement into computer control.

The Neuralink implant is designed to let a user control a computer or mobile device. | Credit: Neuralink

Arbaugh to demonstrate brain-computer implant onstage

At the Robotics Summit, Crowe and Arbaugh will discuss how the interface funct...
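The excerpt describes 1,024 electrode channels whose motor-cortex activity is translated into computer control. A toy sketch of the general idea behind such decoding — a plain linear readout with made-up weights, not Neuralink’s actual algorithm; in practice the weights would be fit during a calibration session:

```python
import random

N_CHANNELS = 1024  # electrode count reported for the N1 implant

random.seed(0)
# Hypothetical decoder weights mapping per-channel firing rates to (vx, vy)
# cursor velocity; stand-ins for weights learned from calibration data.
weights = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(N_CHANNELS)]

def decode_velocity(firing_rates):
    """Linear readout: cursor velocity is a weighted sum of channel firing rates."""
    vx = sum(w[0] * r for w, r in zip(weights, firing_rates))
    vy = sum(w[1] * r for w, r in zip(weights, firing_rates))
    return vx, vy

# One active channel for illustration: silence everywhere except channel 10
rates = [0.0] * N_CHANNELS
rates[10] = 5.0
vx, vy = decode_velocity(rates)
```

Real decoders are recalibrated regularly because recorded neural signals drift over time, but the core mapping from channel activity to intended movement is the same shape.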

Iván Hernández Dalas: Opentrons introduces dynamic simulation, visualization for AI-generated lab workflows

A researcher at Boston University’s DAMP Lab works with an Opentrons Flex robot. Credit: Opentrons Labworks Inc.

Pharmaceutical companies and research institutions are using artificial intelligence to design robotic experiments at scale, but they need to know if AI-generated instructions will execute correctly before handling valuable samples and reagents. Opentrons Labworks Inc. today announced Protocol Visualization for Opentrons Flex, a new simulation and visualization capability. The feature allows scientists to simulate and inspect robotic protocols in a dynamic virtual environment before running them on the Flex system. The interface enables users to observe each step of an automated workflow. “This capability gives researchers a dynamic way to simulate and inspect robotic execution before an experiment begins, creating a clearer bridge between computational design and physical laboratory workflows,” stated James Atwood, CEO of Opentrons. “As AI systems propose more experimen...
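The value of simulating a protocol first is that errors surface before any liquid moves. A toy dry-run validator — a generic illustration of the concept, not the Opentrons Protocol Visualization feature or its API — that walks transfer steps against tracked well volumes and collects problems instead of executing them:

```python
def dry_run(steps, well_volumes, max_pipette_ul=1000):
    """Simulate (source, destination, microliters) transfer steps.

    Tracks well volumes and records errors rather than raising, so an
    AI-generated protocol can be fully checked before a real run.
    All labware names and the pipette capacity here are hypothetical.
    """
    volumes = dict(well_volumes)
    errors = []
    for i, (src, dst, ul) in enumerate(steps):
        if ul > max_pipette_ul:
            errors.append(f"step {i}: {ul} uL exceeds pipette capacity")
        elif volumes.get(src, 0) < ul:
            errors.append(f"step {i}: well {src} has only {volumes.get(src, 0)} uL")
        else:
            volumes[src] -= ul
            volumes[dst] = volumes.get(dst, 0) + ul
    return volumes, errors

# The second step would overdraw well A1; the dry run catches it up front.
steps = [("A1", "B1", 200), ("A1", "B2", 900)]
final, errs = dry_run(steps, {"A1": 1000})
```

A production simulator also models labware positions, tip usage, and timing, but the pattern is the same: replay every instruction against virtual state and report what would fail.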

Iván Hernández Dalas: Humanoid robotics developers must address a wide range of applications

Humanoid robots such as Optimus need to evolve from industrial to more general applications. Source: Tesla

Humanoid robots are becoming an important part of our technological reality, opening up new horizons in science, industry, and everyday life. These machines promise to solve a wide range of problems, thanks to their designed adaptability to the environment. Sergey Lonshakov, founder of AIRA and the architect of the Robonomics project, recently spoke about the latest advances in humanoid robot development. He addressed the potential benefits and the challenges facing society with these next-generation systems.

When will humanoid robots be capable?

The Chinese-made Unitree G1 is learning to walk with reinforcement learning, which is very inspiring for me. Since 2015, I’ve been working on technical projects with a team of 15 people — three generations of ITMO University robotics graduates. Our goal is to develop hypotheses that could be of interest in the near future and try t...

Iván Hernández Dalas: Data security is the foundation of trust in physical AI

Cyber and data security are key concerns for physical AI such as this ANYmal inspection robot. Source: ANYbotics

If you follow the robotics industry, you have likely seen the wave of humanoids performing backflips, robot dogs navigating parkour, and robotic arms folding laundry. This pace of innovation is inspiring, and it is fascinating to see the impact of AI on physical machines. However, as we move technology from the controlled safety of the lab into the complexity of the real world, a recent security headline serves as a stark reminder for the broader industry. Reports recently surfaced regarding critical security flaws in consumer robot vacuums. Notably, a software engineer stumbled onto the vulnerability by accident, gaining full control over devices and accessing cameras and microphones to peer into private homes. While a vulnerability in a living room is a serious privacy concern, an autonomous robot in a chemical plant or a high-voltage power ...