In this new AI paradigm, robots often acquire skills from human demonstration, and building the vast datasets that learning requires is producing scenes reminiscent of dystopian science fiction. In Shanghai, a worker spent an entire week at a single grueling task: wearing a virtual reality headset and an exoskeleton, they opened and closed a microwave door hundreds of times a day, solely to train the humanoid robot standing beside them. The account, reported by Rest of World, paints a stark picture of the human effort involved. In North America, the robotics company Figure is pursuing a similarly ambitious data-gathering strategy. In September, it announced a partnership with Brookfield, an investment firm that manages some 100,000 residential units, with the stated objective of capturing "massive amounts" of real-world data across a wide range of household environments. Figure did not respond to inquiries about the specifics of the initiative, but the implication is clear: the intimate details of domestic life are poised to become raw material for training AI.
Just as our written and spoken words became the bedrock of training data for large language models, our physical movements are now set to follow, and this may prove a far worse bargain for humanity. The groundwork is already being laid. Aaron Prather, a seasoned roboticist, described a recent project with a delivery company in which employees wore movement-tracking sensors that recorded their every action as they handled packages; the data collected from those workers will train future robots. Building sophisticated humanoids will almost certainly require a large workforce of manual laborers serving as data collectors. Prather acknowledges the strangeness of the undertaking: "It’s going to be weird. No doubts about it." The pursuit of advanced automation is creating new forms of human labor that are both essential and, by their nature, peculiar.
Beyond data collection, tele-operation is another facet of the hidden human work behind robotics. While robotics companies ultimately aspire to build machines that perform tasks autonomously, many currently rely on human operators to control their robots remotely. Neo, a $20,000 humanoid robot from the startup 1X, is slated for delivery to homes this year, but its founder, Bernt Øivind Børnich, has been flexible about how autonomous it will actually be. If the robot gets stuck, or a customer asks for help with a particularly intricate task, a tele-operator at the company’s headquarters in Palo Alto, California, will take over, piloting the robot through its onboard cameras to iron clothes or unload the dishwasher.
This arrangement is not inherently exploitative, since 1X obtains customer consent before engaging tele-operation mode, but it fundamentally challenges our understanding of privacy: in a world where tele-operators can do chores inside our homes through robotic proxies, personal privacy as we know it will be irrevocably altered. And if these home humanoids are not genuinely autonomous, the model is better described as wage arbitrage. It recreates the dynamics of the gig economy with one crucial difference: physical tasks can now be outsourced to wherever labor is cheapest, regardless of geography.
This pattern of concealed human labor is not new. "AI-driven" content moderation on social media platforms often depends on workers in low-wage countries who are exposed to deeply disturbing content, and AI companies still rely on humans to assemble training data even for their most sophisticated models. Despite assurances that AI will eventually become self-sufficient by training on its own outputs, even the most advanced systems still require substantial human feedback to perform as intended.
These human workforces do not invalidate AI’s progress; the technology is far from vaporware. But when the human element stays invisible, the public consistently overestimates what the machines can actually do. That inflated perception benefits investors and feeds technological hype, yet it carries real consequences for society. Tesla’s marketing of its driver-assistance software as "Autopilot" is a case in point: the name fostered unrealistic public expectations about the system’s safety and capabilities, a distortion that a Miami jury recently found contributed to a fatal crash that killed a 22-year-old woman. Tesla was ordered to pay $240 million in damages.
The same potential for overstated autonomy looms over humanoid robots. If Jensen Huang’s vision of physical AI permeating our workplaces, homes, and public spaces proves accurate, then how we describe and scrutinize this technology matters enormously. Yet robotics companies remain as opaque about their training and tele-operation practices as AI firms are about their training data. If that opacity persists, we risk mistaking concealed human labor for genuine machine intelligence, and perceiving far more autonomy in these robots than actually exists. The ethical and societal implications of this hidden human work demand urgent attention and a commitment to far greater transparency.