Thousands of technology enthusiasts use it as the ultimate app, enabling them to lock and unlock their homes, cars, computers and mobile phones with a simple wave of a hand. But there’s a catch: they must have a microchip inserted into their bodies.

The idea may seem weird, and painful, but human microchipping appears to appeal not only to amateurs, who call themselves biohackers, but also to governments, police forces, medical authorities and security companies.

It involves using a hypodermic needle to inject an RFID (radio-frequency identification) microchip, the size of a grain of rice, usually into the person’s hand or wrist. The same kind of chip is used for tracking lost pets. The implant transmits a unique ID number that can be used to activate devices such as phones and locks, and can be linked to database records holding personal details such as names, addresses and health records.

RFID chips are already everywhere. If you have to swipe a card, your ID is encoded in the card’s magnetic stripe. If you touch the card to a reader, as with Myki, it carries an RFID chip whose number is linked to your details in the relevant database. The latest credit cards have both a stripe and an RFID chip.

Some RFID tags carry a tiny battery or other power source, enabling them to be read from hundreds of metres away without line of sight to a reader. As far as we know, this type cannot yet be made small enough to embed in humans.

Cybernetics scientist Dr Mark Gasson of the University of Reading, in Britain, became the first human to be infected with a computer virus, after injecting himself with a microchip in 2009 to control electronic devices in his office. The virus replicated itself onto the swipe cards of staff entering his building and infected the university’s database.

Nonetheless, Gasson and other scientists say a new world with mass populations of computerised people is imminent and inevitable. They say complex computing devices routinely implanted into humans for medical reasons already have the technology to enhance the abilities of healthy people.

“It has the potential to change the very essence of what it is to be human,” Gasson says. “It’s not possible to interact in society today in any meaningful way, without having a mobile phone. I think human implants will go along a similar route. It will be such a disadvantage not to have the implant that it will essentially not be optional.”
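The access model described above is deliberately simple: the chip broadcasts nothing but a serial number, and all the meaning lives in a backend database. The sketch below, in Python, shows that ID-and-lookup pattern with an in-memory SQLite database standing in for the backend; the reader callback, table layout and tag IDs are hypothetical, not taken from any real access-control product.

```python
# Minimal sketch of the ID-and-lookup model: the implant stores only a
# unique ID; the database decides what that ID is allowed to do.
import sqlite3

# In-memory database standing in for an access-control backend.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tags (uid TEXT PRIMARY KEY, owner TEXT, door_access INTEGER)")
db.execute("INSERT INTO tags VALUES ('04A1B2C3D4', 'J. Smith', 1)")

def on_tag_scanned(uid: str) -> bool:
    """Called when the reader reports a tag's unique ID."""
    row = db.execute(
        "SELECT owner, door_access FROM tags WHERE uid = ?", (uid,)
    ).fetchone()
    if row and row[1]:
        print(f"Unlocking door for {row[0]}")
        return True
    print("Unknown or unauthorised tag")
    return False

on_tag_scanned("04A1B2C3D4")   # known implant -> unlock
on_tag_scanned("FFFFFFFFFF")   # unknown tag -> refuse
```

Note that nothing personal travels over the air: reissuing or revoking access is just a database update, which is why the same grain-of-rice chip can open a door, log into a computer or identify a lost pet.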
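Gasson has not published his exact payload, but an attack of this shape can only work if the reader trusts whatever data the tag supplies. The standard illustration in the RFID-malware research literature is a tag holding an SQL injection string that a naively written backend executes when it logs the scan. The following is a hedged sketch of that failure mode, not Gasson’s actual code:

```python
# Illustration of how data stored on a tag can attack the system that
# reads it. If the backend pastes tag contents straight into SQL text,
# a malicious tag can smuggle in extra statements.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tags (uid TEXT)")

malicious_uid = "x'); DROP TABLE tags; --"  # payload stored on the tag

def log_scan_safely(uid: str) -> None:
    # GOOD: tag data is passed as a parameter, never as SQL.
    db.execute("INSERT INTO tags VALUES (?)", (uid,))

def log_scan_unsafely(uid: str) -> None:
    # BAD: tag data is concatenated into the SQL text. executescript
    # runs multiple statements, as many real backends will.
    db.executescript(f"INSERT INTO tags VALUES ('{uid}')")

log_scan_safely(malicious_uid)     # stored as an ordinary string
log_scan_unsafely(malicious_uid)   # executes the smuggled DROP TABLE

try:
    db.execute("SELECT COUNT(*) FROM tags")
except sqlite3.OperationalError as err:
    print("tags table destroyed:", err)
```

The point of the demonstration, as with Gasson’s, is that an implant is just another untrusted input: the danger lies in the software reading it, not in the chip itself.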

Last year the line between man and machine blurred further when Stanford University announced its scientists had created the first biological transistor, made entirely of genetic material. Stanford assistant professor of bioengineering Dr Drew Endy described the breakthrough as the final component needed for a biological computer that can operate within living cells and reprogram living systems.

Kevin Warwick, a professor of cybernetics at the University of Reading, has an electronic device in his body that interfaces with his nervous system, and had a simpler version implanted into his wife’s arm. Rudimentary signals passed between the two proved that purely electronic communication is possible between two human nervous systems. Via a computer link, Warwick’s implant lets him operate a robot arm on another continent; the robot mimics whatever hand and arm movements he makes with his natural arm. But the link with his wife’s nervous system is so rudimentary that he can tell only whether she has moved her arm.

Melbourne internet entrepreneur and free software activist Jonathan Oxley injected himself with a microchip in 2004, after obtaining the same kit that vets use for family pets. His Twitter account describes him as a cyborg in progress. Oxley uses the chip to operate his house locks and his computer, and says that after a decade inside his body it has caused no ill effects. “Now it’s just like any other part of me. I don’t even think about it,” he says.

The idea of electronic implants becoming widespread in humans concerns Dr Katina Michael, an associate professor at the University of Wollongong, who specialises in the socio-ethical implications of emerging technologies.