
U.S. Military seeks to brainscan troops for “signs of betrayal”

Could brain scans become standard procedure to see which troops might commit insider attacks?

 

by The Sleuth Journal

 

The massive investment in neuroscience undertaken by the U.S. BRAIN Initiative and its sister initiative, the Human Brain Project, is increasingly taking a turn toward the examination of mental health.

In fact, hundreds of European scientists working on the project are threatening a boycott because of this direction. In their view, the initial directive was to focus on repairing organic injuries and disorders such as Parkinson’s, Alzheimer’s, and physical brain damage sustained in accidents. Post-Traumatic Stress Disorder would be one area that might involve the military.

However, there is a disturbing trend developing in law enforcement and medicine to use what has been learned about the human brain in order to adopt pre-crime systems and predictive behavior technology.

But could a brain scan become standard procedure to see which troops might be inclined to commit insider attacks?

U.S. troops overseas have been working alongside Iraqi and Afghan forces for years, but new interest is being taken in evaluating potential extremists who infiltrate to kill from within.

The number of these incidents is statistically low, as reported by Defense One, which cites the inside killing of “several troops in recent years.” But a former Army counterintelligence agent sees an opportunity to apply new technology that presumably can screen people for malicious intent. The system is called HandShake.

Here’s how the HandShake system works: A U.S. soldier would take, say, an Iraqi officer and outfit the subject with a special helmet that can both pick up electromagnetic signals (EEG) and perform functional near-infrared imaging (fNIR), which images blood-flow changes in the brain. The soldier would put the subject through a battery of tests including image recognition. Most of the pictures in the tests would be benign, but a few would contain scenes that a potential insider threat would remember, possibly including faces, locations or even bomb parts. The key is to select these images very, very carefully to cut down on potential false positives.
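
To make the test structure concrete, here is a minimal sketch of how such an image battery might be assembled: mostly benign pictures with a handful of carefully chosen probe images mixed in at random. The names, proportions, and structure below are assumptions for illustration only; the actual HandShake protocol has not been published.

```python
# Hypothetical sketch of a concealed-information image battery, based only on
# the description above. Names and proportions are illustrative, not the real
# HandShake protocol.
import random
from dataclasses import dataclass

@dataclass
class Stimulus:
    image_path: str
    is_probe: bool   # True for images only an insider threat should recognize

def build_test_deck(benign_images, probe_images, probe_ratio=0.1, seed=None):
    """Mix a small number of carefully chosen probe images (faces, locations,
    bomb parts) into a mostly benign deck, then shuffle the presentation order."""
    rng = random.Random(seed)
    n_probes = min(len(probe_images), max(1, int(len(benign_images) * probe_ratio)))
    deck = [Stimulus(path, False) for path in benign_images]
    deck += [Stimulus(path, True) for path in rng.sample(probe_images, n_probes)]
    rng.shuffle(deck)
    return deck
```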

When you recognize a picture that’s of emotional significance to you, your brain experiences a 200- to 500-millisecond hiccup, during which the electromagnetic activity measurably changes, detectable via EEG. The reaction, referred to as the P300 response, happens too fast for the test subject to control, so the subject can’t game the system.
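
For illustration only, the sketch below shows how one might score a single EEG epoch for a P300-like deviation in that post-stimulus window. Real systems use far more sophisticated signal processing; the sampling rate, window, and baseline comparison here are assumptions, not the HandShake pipeline.

```python
# Minimal sketch of scoring one EEG epoch for a P300-like response, assuming a
# fixed sampling rate and a simple baseline comparison.
import numpy as np

def p300_score(epoch, fs=250, pre_stim_s=0.2, window=(0.25, 0.50)):
    """`epoch` is a 1-D EEG trace starting `pre_stim_s` seconds before stimulus
    onset, sampled at `fs` Hz. Returns how far the mean amplitude in the
    post-stimulus window deviates from the pre-stimulus baseline, in units of
    baseline standard deviations."""
    epoch = np.asarray(epoch, dtype=float)
    onset = int(pre_stim_s * fs)
    baseline = epoch[:onset]
    lo = onset + int(window[0] * fs)
    hi = onset + int(window[1] * fs)
    post = epoch[lo:hi]
    return (post.mean() - baseline.mean()) / (baseline.std() + 1e-9)
```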

The fNIR readings back up the EEG numbers. Together, they speak not only to whether a subject is a traitor but also to how likely an individual is to act on potentially criminal or treasonous impulses. The system then runs all the data through what Veritas calls a Friend or Foe Algorithm. The output: the ability to pinpoint an insider’s threat potential with 80 to 90 percent accuracy, according to the company.
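
Veritas has not published its Friend or Foe Algorithm, so the following is only a hypothetical sketch of the general idea: fusing normalized EEG and fNIR recognition scores into a single 0-to-1 threat estimate. The weights and bias are invented for the example and are not calibrated values.

```python
# Hypothetical score fusion. The real "Friend or Foe Algorithm" is proprietary;
# the weights and bias below are placeholders.
import math

def friend_or_foe(eeg_score, fnir_score, w_eeg=1.0, w_fnir=1.0, bias=-2.0):
    """Combine EEG and fNIR recognition scores (assumed to be z-scored)
    into a 0-1 threat estimate via a logistic function."""
    z = w_eeg * eeg_score + w_fnir * fnir_score + bias
    return 1.0 / (1.0 + math.exp(-z))

# Example: a strong P300 deviation plus a corroborating fNIR reading.
print(friend_or_foe(eeg_score=3.2, fnir_score=2.5))  # ~0.98
```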

It’s obviously ironic that this system is intended to be used on people who never should have encountered the U.S. military in the first place, since the U.S. military arrived based on lies. Moreover, those flagged by such a system are clearly open to being tortured under the policies that have been established in the War on Terror world in which we live.

This system comes at an expense in excess of $1 million to deploy and $500,000 per month thereafter, per site, according to the company’s founder. Both the monetary and ethical costs should ensure that this technology never sees the light of day. However, the military-industrial complex has a proven track record of caring very little about either.

Note: The article below demonstrates how the biometric identification system in Afghanistan has already trickled down to the streets of America. If brain scanning technology is successful overseas, it is guaranteed to show up inside the United States. It’s already been proposed for air travel and other applications under the FAST system (Future Attribute Screening Technology). Additionally, with the increased war on whistleblowers, this would be a wonderful tool for employers to weed out those whose desire is not to undermine, but simply to expose criminality.

 

Real-Time Facial Recognition Offered to Police in New Program

 

by Nicholas West

Activist Post

 

The San Diego Police Department is reporting its involvement in the largest facial recognition program to date. It is something straight out of theaters of war, but it is set to hit the streets of America in the very near future if all goes according to plan.

Facial recognition technology, and the databases that catalog and store the results, is advancing at a pace that is difficult to contain. In 2006, the performance of face recognition algorithms was evaluated in the Face Recognition Grand Challenge (FRGC). High-resolution face images, 3-D face scans, and iris images were used in the tests. The results indicated that the new algorithms were 10 times more accurate than the face recognition algorithms of 2002 and 100 times more accurate than those of 1995. Some of the algorithms were able to outperform human participants in recognizing faces and could uniquely identify identical twins. (Source) And that was 2006.

One of the latest military-grade systems can now scan 36 million faces per second, or every face in the U.S. within 10 seconds. It is a technology that has trickled down from use in war zones like Afghanistan to catalog potential terrorists, to U.S. border control applications for combating illegal immigration, to FBI crime detection, to post-riot analysis, and right on down to establishing personal ID for a wide range of private companies.
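
Taking the quoted rate at face value, the arithmetic behind the “every face in the U.S. within 10 seconds” figure works out roughly as below; the population number is an assumed round figure, not a sourced statistic.

```python
# Back-of-the-envelope check of the claim above.
faces_per_second = 36_000_000
us_population = 320_000_000          # rough figure for illustration

print(us_population / faces_per_second)   # about 8.9 seconds, i.e. under 10
```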

One of the more troubling aspects of what the San Diego PD is looking to implement is that it will be used on people who are presumed innocent until proven guilty.

We reported in August that the new iPhones had an embedded code for a biometric kit including a fingerprint sensor among other features.
