by Craig Klugman, Ph.D.
In the near future:
Thank you, Ms. Riviera. It seems that we have all of your paperwork in order for your new job. The only thing left is your microchip. Please extend your left hand. This will only sting a little.
Tagging humans with microchips has long been a trope in fiction: The X-Files; Terminal Man; Total Recall; Johnny Mnemonic; South Park: Bigger, Longer & Uncut; Spider-Man 2; Mission: Impossible III; The Final Cut; and Strange Days, to name a few. But in the real world, we microchip (yes, it has become a verb) our cats and dogs, not employees and grandpa. At Epicenter, however, a high-tech office building in Sweden, people who work there can have a chip implanted. The implant contains a tiny RFID (radio-frequency identification) chip that lets the building recognize the employee. The chip will open doors, access photocopiers, and someday even allow employees to purchase food from the cafeteria. For now, the program is voluntary.
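To make the mechanics concrete: an implant like this is essentially a passive tag that reports a fixed identifier when energized by a nearby reader. The tag itself stores almost nothing; what the ID unlocks lives in the building's back-end systems. The sketch below is purely illustrative and describes no real system, Epicenter's or otherwise; the names `read_tag`, `EMPLOYEES`, and `request_door` are hypothetical stand-ins.

```python
# Illustrative sketch of RFID-based door access (hypothetical, not any real system).
# Key design point: the implant only carries an ID; permissions live server-side.

EMPLOYEES = {
    "04:A3:2F:18": {"name": "M. Riviera", "doors": {"lobby", "copy_room"}},
}

def read_tag() -> str:
    """Hypothetical reader API: returns the tag's unique ID when an
    implant is held near the door's antenna."""
    return "04:A3:2F:18"  # stand-in for real reader hardware

def request_door(door: str) -> bool:
    """Look up the presented tag and decide whether to unlock `door`."""
    tag_id = read_tag()
    employee = EMPLOYEES.get(tag_id)
    if employee is None:
        return False  # unknown tag: stay locked
    return door in employee["doors"]

if __name__ == "__main__":
    print("unlock" if request_door("lobby") else "stay locked")
```

Because the tag answers any compatible reader with the same ID, the privacy worries raised later in this piece (tracking, unauthorized readers) follow directly from the design.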
The era of implantable identification technology is upon us. One chip company extols the uses of its microchip: “…gentle implantation with minimal penetration force in most species at any life stage—from pocket pets to horses, from puppies and kittens to seniors.” Yes, you can chip your pet and your parent all with the same device.
In 1998, Kevin Warwick became the first “chipped” human. His smart office would recognize the chip and unlock the door, set the lighting to his preferred levels, turn on his favorite music, and prompt his computer to greet him. In other “experiments,” Warwick used his chip to control a robot hand across the Atlantic via the internet. When his partner was also chipped, he said that the two of them were able to share emotions and feelings through their electronics.
In 2004, the FDA approved VeriChip’s human microchips to carry medical history and identification information. Since then, the use of such chips has grown. An Ohio surveillance company chipped two of its employees. An artist implanted a chip in his hand that carries a small animation file. Even a nightclub in Barcelona offers its VIPs a chip that gives access to private lounges and acts as a bar tab.
Other human uses are helpful in a health and medicine context, for example, tracking Alzheimer’s patients, who can wander off and become confused and lost. In one New Jersey case, a police sergeant’s ID chip was read when he went to the hospital after crashing during a high-speed chase, allowing doctors to learn of his diabetes.
But we could also use these chips to track parolees, sex offenders, and suspected terrorists. Why stop there? One urban legend held that all immigrants (documented or not) would be chipped for tracking purposes. Chip a teenager, and parents can use a phone app to know his or her every move. Fear that your spouse is cheating? Chip him or her and see whether there are visits to seedy motels. Wondering if your employees are leaving early? Track their whereabouts through the chip.
The ethics literature on chipping humans is still young. It raises issues such as the risk of infection from implantation, privacy (being tracked, having your chip read by a rogue reader), and coercion into being chipped. Michael, McNamee, and Michael (2006) offer a framework for dealing with the ethical issues in human chipping. Many of their concerns revolve around unreliable technology, invasions of privacy, lack of choice, and inconvenience. They wrote that many of their scenarios were science fiction. Nearly ten years later, that fiction has become fact.
Some states have already passed laws banning forced (involuntary) human chipping: California, North Dakota, Oklahoma, and Wisconsin. Virginia and Georgia have also considered bills related to such a ban. In 2007, the AMA adopted Opinion 2.40, which says chips may be used for patient care, but only with informed consent and strong privacy protections in place, and which encourages physicians to research non-medical uses. It does not, however, argue against non-medical uses.
The tracking and other benefits may not be much different from carrying around a smartphone, a device that knows where we are, carries our identity, and can pay for goods. But a smartphone can be turned off, can be left at home, and doesn’t require a surgical procedure to install or remove. Is the potential convenience and safety (both personal health and national security) worth the risk to privacy and confidentiality? Time will tell, but sadly, as with most technological innovations that raise ethical concerns, we will probably adopt this one wholeheartedly once we get used to the idea. Sometimes what changes is that we develop regulations for the technology; other times we simply lose our fear of adopting something new. Putting something into your body, though, seems to cross a boundary. Beneath the skin may be the new frontier, but it is also a literal boundary, one that requires thoughtful regulation before knee-jerk laws or an open market take hold.
The only chip line I’m getting in anytime soon is the one that comes with fish.