The next step in transformative technology is already here, and the United States runs the risk of getting left behind.
The number of robotics inventions is steadily on the rise, and the U.S. military is already in on the action. A few years ago, Air Force drones surpassed 1 million combat hours. Hobbyists are using platforms like Arduino to build their own robots, and they're building them by the thousands. Tesla recently announced its intention to develop and market driverless cars by 2018. Last year, Chris Anderson quit his job as editor-in-chief of Wired Magazine to found and run a robotics company.
See also: 14 Robotics Breakthroughs From the Past Decade
Yet for all its momentum, robotics is at a crossroads. The industry faces a choice — one that you see again and again with transformative technologies. Will this technology essentially be closed, or will it be open?
What does it mean for a robot to be closed? Like any contemporary appliance, a closed robot is designed to perform a set task. It runs proprietary software and is no more amenable to casual tinkering than a dishwasher. The popular Roomba robotic vacuum cleaner and the first AIBO mechanical pet are closed in this sense.
Open robots are just the opposite. By definition, they invite contribution. An open robot has no predetermined function, runs third-party or even open-source software, and can be physically altered and extended without compromising performance.
Consumer robotics started off closed, which helps to explain why it has moved so slowly. A few years ago, only a handful of companies — Sony and iRobot, for instance — were in the business of making consumer robots or writing robot software. Any new device or functionality had to come down from the top.
When introduced to the market, Sony’s AIBO dog could only run two programs, both written by Sony. At one point, consumers managed to hack the AIBO and get it to do a wider variety of things. A vibrant community arose, trading ideas and AIBO code. That is, until Sony sued them for violating the copyright in its software.
First computer, an Apple IIe, at the South Pole in 1984. Image: NASA Mike
Compare the early days of personal computing, as described by Jonathan Zittrain in The Future of the Internet. Personal computers were designed to run any software, written by anyone. Indeed, many of the innovations or “killer apps” that popularized PCs came from amateur coders, not Apple or IBM. Consumers bought PCs not for what the machines did, but for what they might do.
The same is true of open robots. They become more valuable as use cases surface. (It can fold laundry! It can walk a dog!) That open robots are extensible — or "modular" — constitutes a second advantage. Versatile and useful robots are going to be expensive, and owners will want to upgrade rather than replace them.
Meanwhile, the technology continues to change. Let’s say there is a breakthrough in sensor technology or someone invents a new, more maneuverable gripper. The owner of a closed robot will have to wait for the next model to incorporate these technologies. The owner of an open robot can swap the sensor or gripper out.
The open model — best exemplified, for a time, by the Silicon Valley robotics incubator Willow Garage — is gaining momentum. Seven years ago, iRobot’s cofounder Colin Angle told The Economist that robots would be relatively dumb machines designed for a particular task. Robot vacuums will vacuum; robot pool cleaners will clean the pool.
Two years ago at the Consumer Electronics Show, the same company unveiled a robot called AVA designed to run third-party apps. Following a backlash over its copyright lawsuit, Sony released a software developer kit for AIBO, which continues to be used in classrooms and competitions. Microsoft recently gave open robotics a boost by developing a software development kit for its popular Kinect sensor. There is even an award-winning Robot App Store (disclosure: I help advise it).
The trouble with open platforms is that they open the manufacturer to a universe of potential lawsuits. If a robot is built to do anything, it can do something bad. If it can run any software, it can run buggy or malicious software. The next killer app could, well, kill someone.
Liability in a closed world is fairly straightforward. A Roomba is supposed to do one thing and do it safely. Say a Roomba causes an injury in the course of vacuuming the floor. Then iRobot generally will be held liable, as it built the hardware and wrote or licensed the software. If someone hacks the Roomba and uses it to reenact the video game Frogger on the streets of Austin, Texas (this really happened), or uses the Roomba for a baby rodeo (it's a thing), then iRobot can argue product misuse.
But what about in an open world? Open robots have no intended use. The hardware, the operating system and the individual software — any of which could be responsible for an accident — might each have a different author. Open-source software could have many authors. But plaintiffs will always sue the deep pockets. And courts could well place the burden on the defendants to sort it out.
I noted earlier that personal computers have been open from the start. They, too, have no dedicated purpose, run third-party software and are extensible (through USB ports). But you wouldn't sue Microsoft or Dell because Word froze and ate your term paper.
It turns out that judges dismissed early cases involving lost or corrupted data on the basis that the volatility of computers was common knowledge. These early precedents congealed over time practically to the point of intuition. Which, I would argue, is a good thing: People might not have gone into the business of making PCs if they could get sued any time something went wrong.
But there is one key difference between PCs and robots: The damage caused by home computers is intangible. The only casualties are bits. Courts were able to invoke doctrines such as economic loss, which provides that, in the absence of physical injury, a contracting party may recover no more than the value of the contract.
When software can touch you, however, the damage it causes is physical, and lawsuits can and do gain traction. Examples include plane crashes caused by navigation errors, the delivery of excessive levels of radiation in medical tests and "sudden acceleration," a charge it took a team of NASA scientists 10 months to clear Toyota's software of.
See also: Build Any Robot You Want, No Programming Required
Open robots combine, arguably for the first time, the versatility, complexity and collaborative ecosystem of a PC with potential for physical damage or injury. The same norms and legal expedients do not necessarily apply. In robotics, no less than in the context of computers or the Internet, the possibility that providers of a platform will be sued for what users do with their products may lead many to reconsider investing in the technology. At a minimum, robotics companies will have an incentive to pursue the slow, manageable route of closing their technology.
What we need is a narrow immunity, akin to what we see in general aviation, firearms and the Internet. In each case, Congress spotted a pattern that threatened an American industry and intervened, immunizing the companies that created the product for what consumers or others might do with it.
For many of the same reasons, we should consider immunizing the manufacturers of open robotic platforms for what users do with them — a kind of Section 230 immunity for robotics. You cannot sue Facebook over a defamatory wall post, nor can you immediately sue an Internet service for hosting copyrighted content.
Analogously, if someone adds a chainsaw to their AVA or downloads a dodgy app for their AR.Drone, it should not be possible to name iRobot or Parrot as a defendant. Otherwise, why would these companies take the chance of opening their products to third-party coders?
The time for action is now. We certainly shouldn't wait to intervene until this young industry is dead or dying, as we did with general aviation. (It was called the General Aviation Revitalization Act for a reason.) Several countries already have a head start in robotics, a higher bar to product liability litigation or both. The risk of waiting is that, by the time we sort this out, the United States will not be a comparatively serious player in a transformative technology for the first time since the steam engine.
Ryan Calo is a law professor at the University of Washington School of Law and a founding director of the University's Tech Policy Lab. This essay is adapted from his paper Open Robotics in the Maryland Law Review and a related write-up for the Cornell University Law School Legal Information Institute. You can follow Professor Calo on Twitter @rcalo.
Image: Yoshikazu Tsuno/AFP/Getty Images