Path: rocksolid2!def3!.POSTED!not-for-mail
From: guest@retrobbs.rocksolidbbs.com (Guest)
Newsgroups: rocksolid.shared.news
Subject: Skynet 0.1 alpha release
Date: Thu, 17 May 2018 13:05:33 -0400
Organization: Dancing elephants
Lines: 199
Message-ID: <pdkcov$brn$1@def3.retrobbs.com>
Reply-To: Guest <guest@retrobbs.rocksolidbbs.com>
NNTP-Posting-Host: def2.lan
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Trace: def3.retrobbs.com 1526576735 12151 192.168.1.235 (17 May 2018 17:05:35 GMT)
X-Complaints-To: usenet@def3.retrobbs.com
NNTP-Posting-Date: Thu, 17 May 2018 17:05:35 +0000 (UTC)
User-Agent: FUDforum 3.0.7
X-FUDforum: e2245c1d60cd2fa7de3270a53d877d47 <1523>

https://www.defenseone.com/technology/2017/12/pentagons-new-artificial-intelligence-already-hunting-terrorists/144742/

After less than eight months of development, the algorithms
are helping intel analysts exploit drone video over the
battlefield.

Earlier this month at an undisclosed location in the Middle
East, computers using special algorithms helped intelligence
analysts identify objects in a video feed from a small
ScanEagle drone over the battlefield.

A few days into the trials, the computer identified objects
-- people, cars, types of buildings -- correctly about 60
percent of the time. Just over a week on the job -- and a
handful of on-the-fly software updates later -- the
machine's accuracy improved to around 80 percent. Next
month, when its creators send the technology back to war
with more software and hardware updates, they believe it
will become even more accurate.

It's an early win for a small team of just 12 people who
started working on the project in April. Over the next year,
they plan to expand the project to help automate the
analysis of video feeds coming from large drones -- and
that's just the beginning.

"What we're setting the stage for is a future of
human-machine teaming," said Air Force Lt. Gen. John
N.T."Jack" Shanahan, director for defense intelligence for
warfighter support, the Pentagon general who is overseeing
the effort. Shanahan believes the concept will revolutionize
the way the military fights.

"This is not machines taking over," he said. "This is not a
technological solution to a technological problem. It's an
operational solution to an operational problem."

Called Project Maven, the effort right now is focusing on
helping U.S. Special Operations Command intelligence
analysts identify objects in video from small ScanEagle
drones.

In coming months, the team plans to put the algorithms in
the hands of more units with smaller tactical drones, before
expanding the project to larger, medium-altitude Predator
and Reaper drones by next summer.

Shanahan characterized the initial deployment this month as
"prototype warfare" -- meaning that officials had tempered
expectations. Over the course of about eight days, the team
refined the algorithm six times.

"This is maybe one of our most impressive achievements is
the idea of refinement to the algorithm," Shanahan said.

Think of it as getting a new update to a smartphone
application every day, each time improving its performance.

Before it deployed the technology, the team trained the
algorithms using thousands of hours of archived battlefield
video captured by drones in the Middle East. As it turned
out, that footage came from a different region than the one
where the Project Maven team deployed.

"Once you deploy it to a real location, it's flying against
a different environment than it was trained on," Shanahan
said. "Still works of course ... but it's just different
enough in this location, say that there's more scrub brush
or there's fewer buildings or there's animals running around
that we hadn't seen in certain videos. That is why it's so
important in the first five days of a real-world deployment
to optimize or refine the algorithm."

While the algorithm is trained to identify people, vehicles
and installations, it occasionally mischaracterizes an
object. It's then up to the intel analyst to correct the
machine, thus helping it learn.
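
In rough terms, that analyst-in-the-loop correction could
look like the sketch below (a conceptual illustration in
Python; the Detection type, the detect/review callables and
the class names are assumptions, not Maven's actual
software):

from dataclasses import dataclass
from typing import Callable, List, Tuple

CLASSES = ("person", "vehicle", "installation")

@dataclass
class Detection:
    box: Tuple[int, int, int, int]   # pixel box: x, y, width, height
    label: str                       # predicted class, one of CLASSES
    score: float                     # detector confidence

def review_frame(frame,
                 detect: Callable[[object], List[Detection]],
                 analyst_label: Callable[[object, Detection], str],
                 corrections: list) -> List[Detection]:
    """Run the detector on one frame and let an analyst fix mislabels.

    Corrected examples are collected so they can be folded into the
    next round of training -- the on-the-fly refinement described
    above.
    """
    detections = detect(frame)
    for det in detections:
        true_label = analyst_label(frame, det)  # confirm or relabel
        if true_label != det.label:
            corrections.append((frame, det, true_label))
    return detections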

The team has paired the Maven algorithm with a system called
Minotaur, a Navy and Marine Corps "correlation and
georegistration application." As Shanahan describes it,
Maven has the algorithm, which puts boxes on the video
screen, classifying an object and then tracking it. Then
using Minotaur, it gets a georegistration of the
coordinates, essentially displaying the location of the
object on a map.
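
A minimal sketch of that last step, assuming the
georegistration can be reduced to a per-frame
pixel-to-ground homography (an illustrative simplification,
not Minotaur's real interface):

def box_center(box):
    """Center pixel of an (x, y, width, height) detection box."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def pixel_to_latlon(px, py, H):
    """Map a pixel to ground coordinates with a 3x3 homography H,
    given as row-major nested lists, standing in for whatever
    georegistration the real system computes."""
    u = H[0][0] * px + H[0][1] * py + H[0][2]
    v = H[1][0] * px + H[1][1] * py + H[1][2]
    w = H[2][0] * px + H[2][1] * py + H[2][2]
    return (u / w, v / w)   # approximate (latitude, longitude)

# Usage: place each classified, tracked box on the map display.
# lat, lon = pixel_to_latlon(*box_center(detection_box), H_frame)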

"That's new, it's different and it's much needed for an
analyst because this was all being done manually in the
past," the general said.

"Having those things together is really increasing
situational awareness and starts the process of giving
analysts a little bit of time back -- which we hope will
become a lot of time back over time -- rather than just
having to stay glued to the video screen," Shanahan said.

After the Predator and Reaper video feeds get the
algorithms, the plan is to put them to work on Gorgon Stare,
a sophisticated, high-tech series of cameras carried by a
Reaper drone that can view entire towns.

"When you look at the data labeling that has to go on, the
algorithms that have to be trained and refined, that's
really what I would call the PhD-level problem that we have
up next," Shanahan said of pairing the algorithms with
Gorgon Stare.

Right now, the algorithms reside in the computers that
receive the video from the drones. At some point down the
road, the goal is to put the technology "at the edge" on the
drones themselves as well.

"The combination of those two is very powerful," Shanahan
said. "We see redundancy as important in a future world in
which you may lose the ability to communicate back to big
enterprises in the United States."

The algorithms use commercial technology, which has allowed
the project to move quickly -- lightning fast by government
standards.

"We are not trying to do something over in the department
that is already being done incredibly successfully in the
commercial world," Shanahan said.

Former Deputy Defense Secretary Bob Work stood up the
project in April. Two months later, the team received
funding from Congress, and six months later the first
algorithms were
used on the battlefield, delivering on a promise to reach
combat by year's end.

"We are learning lessons every day for the first time about
how do you actually integrate AI into Department of Defense
operationally fielded programs, not research and
development, not test beds, but capabilities that are being
used by warfighters day in and day out," Shanahan said.

A Change in Mindset

Even this early deployment has folks thinking about its
potential and talking about how this type of AI could change
the way an intel analyst or sensor operator does his or her
job.

"I expect a year from now, we'll see sensor operators and
analysts using it in a way that we never understood was
possible," Shanahan said.

While most of the Pentagon's intelligence directorate's work
is shrouded in secrecy for operational security reasons,
Shanahan and others have been openly talking about Project
Maven and the military potential for AI.

"I don't think honestly there is any aspect of Department of
Defense that is not ripe for introducing some type of AI and
machine learning into it," Shanahan said.

Military leaders are just beginning to talk about the
potential for artificial intelligence, largely as a way to
augment overburdened troops. In October, Gen. David
Goldfein, the Air Force chief of staff, laid out a vision
for teaming airmen with machines, particularly for
maintenance and logistics functions.

"I think that's the breakthrough for the department that
we're only beginning to understand today and it will grow
faster and achieve a lot more over the next year to two
years as we understand what this allows us to do," Shanahan
said of human-machine teaming.

Commercially, companies are using AI to predict when pumps
and turbines on ships will break, and for other types of
predictive maintenance.

"There's so much that industry is showing us is in the art
of the possible," Shanahan said. "That's what different
today. This is now being driven on the outside and we're
watching and learning how to play catch up fast whereas
opposed [to] 15 years ago the department was orchestrating a
lot of this from inside. The world is changing around us and
we're understanding how we need to keep up."

Getting people to think differently is among the most
difficult tasks at hand.

"What we're trying to do is set the conditions to build an
AI-ready culture," he said. "It's not easy. This is
uncomfortable. It's a very different way of thinking about
problems than we've used in the past. But the attitude is
out there.

"The younger people are more receptive to this and they're
ready to jump on board yesterday. They've been asking us:
What took you so long? At the same time we're beginning to
have people at the highest levels of the department start
talking about AI in new and different and encouraging ways."
Posted on: def2.i2p
