Society for brain integrity in Sweden
- Föreningen för hjärnans integritet i Sverige
“It is appallingly clear that our technology has surpassed our humanity. I hope that one day our humanity can surpass our technology.” ~ Albert Einstein
Should the fight in the evolutionary process be between intelligent machines and humans? It is only a matter of time. Within 10–20 years, online human brains will be able to destroy everything that evolution has achieved in 4.3 billion years. Nanotechnology is already well advanced and is achieving amazing things. And our higher consciousness seems to be left without any possibility of intervention.
Statement by the Swedish Defence Research Agency at the Chamber of Commerce in Stockholm, 21 March 2012
The art of creating machines with ethical behaviour is something that robot scientists and philosophers have started to study.
What part will man play in a future in which ethical decisions are also taken by machines?
When will a machine become a human being?
The question of the borderline between man and machine may seem trivial today, but it is in fact becoming more and more important.
In the future, robots will become more and more human-like, while man will become more and more robot-like, partly by means of various types of implants, sharing technology with robots, but also through a “socialization process” in which we will learn how to deal with robots. If it is still possible at all to talk about man as a distinct species, perhaps we will be obliged to face the most difficult of all questions:
Is there really a need for human beings?
Only the species at the top of the hierarchy will be exempt from having to justify its existence.
Henrik Carlsen has worked at FOI (the Swedish Defence Research Agency) for a little more than ten years.
He has a broad background in many of FOI’s fields of activity. In recent years his focus has been on:
1) Climate change, mainly questions of adaptation as well as connections with security, development, and geopolitics;
2) Technological development and social change. In the latter field Henrik Carlsen concentrates on possible dangers of future technologies, primarily the ethical aspects of what are usually known as “autonomous systems”, i.e. technical systems that in some sense make “their own” decisions.
Mind Control – Remote Neural Monitoring: Daniel Estulin and Magnus Olsson on Russia Today
This show, with the original title “Control mental. El sueño dorado de los dueños del mundo” (Mind control: the golden dream of the world’s masters), broadcast to some 10 million people, was one of the biggest victories so far for victims of implant technologies. It came thanks to Magnus Olsson, who, despite being victimized himself, worked hard for several years to expose one of the biggest human rights abuses of our time: connecting people, against their will and without their knowledge, to computers via implants the size of a few nanometers, leading to the complete destruction not only of their lives and health, but also of their personalities and identities.
Swedish Defence Research Agency
Miniaturization is one of the most powerful technology trends
Nanotechnology is providing new materials with new properties, and micro-system technology is providing sensors as small as grains of dust, antennas only one thousandth of a millimeter in size (for optical frequencies), and components so small that they can connect in nontrivial ways to the brain, shaking hands and talking with the brain’s minutest parts.
Sensors as small as dust. Or components that can connect themselves to the nervous system. Miniaturization is a powerful technology trend.
What can be done with a lot of money in U.S. laboratories is one thing; what can be realized in the form of cheap everyday objects is another.
The common-sense definition of miniaturization is that a component or system has become smaller while performing at least as well. Or it could be something built out of very small parts, resulting in a new or improved function.
Another important concept, in addition to nanotechnology, is microsystems technology.
The researchers are talking about components plugged into the nervous system that can talk to the brain. These components would help the soldier tap into his subconscious.
The soldier sees more than he thinks. The small components would capture the information that would otherwise be missed.
The boundary between technology and biology is blurring; biology and technology are converging.
One can imagine solutions that change the body’s biological systems, such as modifying its biochemistry to compensate for lack of sleep, or taking control of the nervous system of an animal, thereby creating biological UAVs (unmanned aerial vehicles).
Brain Probes: development of new nano, micro, genetic, optical, and electrical technologies making it possible to study an ever broader range of brain structures and functions in greater depth, and more rapidly than is currently possible.
Ethical, legal and social issues: The HBP (Human Brain Project) will raise important ethical, legal, social, political and philosophical issues, both about the research itself and about its potential applications. At the same time, its contributions to knowledge of the brain, cognition and behavior will have important philosophical and conceptual implications touching on basic concepts of what makes us human. Against this background, the HBP will include a major program of activities dedicated to ethical, legal and social issues. The program will bring together scholars in the brain sciences, social sciences, and the humanities to study and discuss relevant issues, and will use all available channels to encourage open, well-informed debate, to dissipate potential public concerns and to enhance appreciation of the potential benefits of the project’s work.
“What kind of privacy safeguards are needed if a machine can read your thoughts?”
“Invasive research on humans could involve inhuman or degrading procedures”
“Invasive research on humans could lead to new torture or inhumane punishment techniques”
Will cognition enhancers exacerbate differences between rich and poor? Or, instead, will they relegate social diversity to the status of historical artifact?
What happens if we deduce through neuroimaging the physiological basis for morality?
Or, and by the way, what happens to free will?” (Scientific American (Editorial), September 2003)
Autonomous behavior of robots: what degree of autonomy should we give to the robot… if uncontrolled robot actions can be dangerous to humans (assistance robotics)?
Or if we wish to deal with cases when the user’s will is “ethically” unacceptable (robots used for military purposes)? Are Asimov’s Laws adequate?
Is the possibility of self-replicating artificial entities a realistic threat?
Ontological status of “cyborgs” and A.I. creatures:
What is machine and what is human?
Who may be re-programmed?
Can we have dissidents in a world of replicants?
Scientists Warn of Ethical Battle Concerning Military Mind Control
Advances in neuroscience are closer than ever to becoming a reality, but scientists are warning the military – along with their peers – that with great power comes great responsibility
A future of brain-controlled tanks, automated attack drones and mind-reading interrogation techniques may arrive sooner than later, but advances in neuroscience that will usher in a new era of combat come with tough ethical implications for both the military and scientists responsible for the technology, according to one of the country’s leading bioethicists.
“Everybody agrees that conflict will be changed as new technologies are coming on,” says Jonathan Moreno, author of Mind Wars: Brain Science and the Military in the 21st Century. “But nobody knows where that technology is going.”
Moreno warns in an essay published in the science journal PLoS Biology Tuesday that the military’s interest in neuroscience advancements “generates a tension in its relationship with science.”
“The goals of national security and the goals of science may conflict. The latter employs rigorous standards of validation in the expansion of knowledge, while the former depends on the most promising deployable solutions for the defense of the nation,” he writes.
Much of neuroscience focuses on returning function to people with traumatic brain injuries, he says. Just as Albert Einstein didn’t know his special theory of relativity could one day be used to create a nuclear weapon, neuroscience research intended to heal could soon be used to harm.
“Neuroscientists may not consider how their work contributes to warfare,” he adds.
Moreno says there is a fine line between using neuroscience devices to allow an injured person to regain baseline functions and enhancing someone’s body to perform better than their natural body ever could.
“Where one draws that line is not obvious, and how one decides to cross that line is not easy. People will say ‘Why would we want to deny warfighters these advantages?’” he says.
Moreno isn’t the only one thinking about this. The Brookings Institution’s Peter Singer writes in his book, Wired for War: The Robotics Revolution and Conflict in the 21st Century, that “the Pentagon’s real-world record with things like the aboveground testing of atomic bombs, Agent Orange, and Gulf War syndrome certainly doesn’t inspire the greatest confidence among the first generation of soldiers involved [in brain enhancement research].”
The military, scientists and ethicists are increasingly wondering how neuroscience technology changes the battlefield. The staggering possibilities are further along than many think. There is already development on automated drones that are programmed to make their own decisions about who to kill within the rules of war. Other ideas that are closer than you think to becoming a military reality: tanks controlled from half a world away, memory erasures that could prevent PTSD, and “brain fingerprinting” that could be used to extract secrets from enemies.
Moreno foretold some of these developments when he first published Mind Wars in 2006, but not without trepidation.
“I was afraid I’d be dismissed as a paranoid schizophrenic when I first published the book,” he says. But then a funny thing happened: the Department of Defense and other military groups began holding panels on neurotechnology to determine how and when it should be used. “I was surprised how quickly the policy questions moved forward. Questions like: ‘Can we use autonomous attack drones?’ ‘Must there be a human being in the vehicle?’ ‘How much of a payload can it have?’ There are real questions coming up in the international legal community.”
All of those questions will have to be answered sooner than later, Moreno says, along with a host of others. Should soldiers have the right to refuse “experimental” brain implants? Will the military want to use some of this technology before science deems it safe?
“There’s a tremendous tension about this,” he says. “There’s a great feeling of responsibility that we push this stuff out so we’re ahead of our adversaries.”