Army wants more efficiency out of quick-reaction capabilities

Army Brig. Gen. Harold "Harry" Greene, program executive officer for intelligence, electronic warfare and sensors, discusses sensors, the Distributed Common Ground System-Army, cloud computing and fiscal responsibilities.

Army Brig. Gen. Harold “Harry” Greene was named program executive officer for intelligence, electronic warfare and sensors this past summer. The program executive office has 112 programs and a portfolio of $4.3 billion in intelligence, electronic warfare and sensors (IEW&S) systems. Before this appointment, he was deputy commanding general of the Army Research, Development and Engineering Command (RDECOM) and senior commander of the Soldier Systems Center in Natick, Mass.

He spoke with Defense Systems Editor-in-Chief Barry Rosenberg about sensors, the Distributed Common Ground System-Army (DCGS-A), cloud computing and fiscal responsibilities.

DS: What’s at the top of your to-do list?

Greene: We’re at a unique point in time. We’re transitioning from operations in Iraq and Afghanistan, where we’ve done a tremendous amount of work to put quick-reaction capabilities (QRC) out there to supplement the capabilities of our soldiers, to a time when we may not have as many resources but still have to keep supporting soldiers in the field who are dealing with an adaptive enemy.

So the No. 1 thing on my to-do list is making sure we’re thinking about the future and framing a fiscally responsible vision for the future of all our programs, one that enables us to keep providing soldiers the capabilities we’ve delivered over the last few years. The real target I’ve set for my guys is the program objective memorandum bridge they’ll be doing in the late fall or early winter.

DS: So as you talk about QRCs in light of budget realities, what are the technologies/capabilities that you believe have the best chance of getting to theater?

Greene: Well, I think it’s a different challenge. The challenge we have is that we’ve put a tremendous amount of capability over there, and we did it leveraging overseas contingency operations funding, the funding previously known as supplementals. We now have to take those capabilities, get them into enduring funding streams and merge them into programs of record. Or the Army has to decide that those are capabilities that are either niche to the current environment or capabilities we don’t need.

DS: PEO IEW&S had some systems being evaluated at the Network Integration Exercise this past summer at Fort Bliss, Texas, and White Sands Missile Range, N.M., including a long-range sensor. What did you see?

Greene: We had a concept to take our Long Range Advance Scout Surveillance System, the LRAS3, which is a stand-alone system, and instead of having that system just report to the operator, we would actually network the systems together so we could pass targets off between locations. So we netted LRAS3s out there and let the soldiers experiment with them, and they did exactly that. They were able to pass targets between the field operating sites they had out there, and they liked it well enough that they asked us to bring it back again.
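For readers who want a concrete picture of the idea Greene describes, here is a minimal sketch of one sensor node serializing a target report and handing it to a peer site over the network. The message fields, node names and addresses are invented for illustration; the actual LRAS3 interface is not public.

```python
import json
import socket
import time
from dataclasses import dataclass, asdict

# Hypothetical target report; the real LRAS3 message format is not public.
@dataclass
class TargetReport:
    sensor_id: str      # which networked sensor generated the track
    lat: float          # target latitude, decimal degrees
    lon: float          # target longitude, decimal degrees
    observed_at: float  # Unix timestamp of the observation

def hand_off(report: TargetReport, peer_addr: tuple[str, int]) -> None:
    """Serialize a report and push it to a peer operating site over UDP."""
    payload = json.dumps(asdict(report)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, peer_addr)

# Example: pass a track from one field site to another (address is a
# documentation-only TEST-NET value, not a real endpoint).
hand_off(TargetReport("LRAS3-alpha", 32.38, -106.48, time.time()),
         ("192.0.2.10", 9999))
```

The point of the sketch is simply that once reports are structured and on a network, any peer site can consume them, which is what lets operators hand targets between locations.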

DS: They asked to bring them to the fall Network Integration Exercise?

Greene: Absolutely.

DS: We talked a little about sensors but not about sensor fusion. That’s what DCGS-A is all about. Where are you right now in sensor fusion and getting that data out to the people who need it?

Greene: Right now, we’re working through all the lessons we’ve learned on processing, exploitation and dissemination (PED) as enabled by the DCGS-A system. If you look at where we’ve been and where we’re going, we started out in the days before the Army was networked, when each sensor had its own ground station and its own PED architecture. Then, through a manual process, you brought the feeds to a human being who did the analysis and overlaid the various sensor inputs to develop understanding.

We then moved to a federated system where we had them on the network, but they were all still individual feeds and ground stations. And now we’re moving to really bring that full PED architecture together under DCGS-A, which enables us to fuse data much more efficiently. It greatly reduces the complexity for the analysts trying to do that job, as well as the footprint on the battlefield.
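As an illustration of what automated fusion buys over manually overlaying feeds, here is a toy correlation sketch: detections from different feeds are grouped into candidate tracks when they fall close together in space and time. The thresholds and field names are assumptions for the example, not DCGS-A internals.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    feed: str    # originating sensor feed
    lat: float   # decimal degrees
    lon: float   # decimal degrees
    t: float     # observation time, seconds

def fuse(detections: list[Detection],
         max_deg: float = 0.01, max_dt: float = 30.0) -> list[list[Detection]]:
    """Group detections that are close in space and time.

    A toy correlation rule: two detections belong together if they fall
    within about 0.01 degrees (roughly 1 km) and 30 seconds of each other.
    """
    groups: list[list[Detection]] = []
    for d in sorted(detections, key=lambda d: d.t):
        for g in groups:
            ref = g[-1]
            if (abs(d.lat - ref.lat) < max_deg
                    and abs(d.lon - ref.lon) < max_deg
                    and abs(d.t - ref.t) < max_dt):
                g.append(d)
                break
        else:
            groups.append([d])
    return groups

# Two feeds seeing the same object fuse into a single group.
print(fuse([Detection("EO", 32.380, -106.480, 0.0),
            Detection("SIGINT", 32.382, -106.479, 12.0)]))
```

In the old stovepiped model, that correlation step was a human overlaying separate displays; pulling the feeds into one architecture is what lets software do the first pass.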

Now one of the things we really have to work hard at is balancing what I tell my guys are our two customers. We have the user who actually uses our equipment on the battlefield. That’s customer No. 1. But we have to balance that against the other customer, and that is the taxpayer who provides the resources that enable that capability. And we really have to work at doing smart things.

And I think a big part of this migration as we go to the future is not just the technical portion; it’s also the logistics and sustainment portion. The rule of thumb is that 65 to 70 percent of the cost of a system is in its operation and sustainment. So we’ve done a tremendous amount of work on QRCs over the last few years. Right now, we have 42 QRCs on the battlefield, and our challenge is to develop sustainment strategies for those capabilities one at a time.
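To make that rule of thumb concrete, a quick back-of-the-envelope calculation: if operation and sustainment is 65 to 70 percent of total lifecycle cost, acquisition is only the remaining 30 to 35 percent. The $300 million acquisition figure below is invented for illustration.

```python
# Rule of thumb from above: O&S is 65 to 70 percent of total lifecycle cost.
# A system with a $300M acquisition bill (the remaining 30 to 35 percent)
# therefore implies a total lifecycle cost of roughly $857M to $1B.
acquisition = 300e6
for os_share in (0.65, 0.70):
    total = acquisition / (1 - os_share)
    print(f"O&S share {os_share:.0%}: lifecycle = ${total/1e6:.0f}M, "
          f"O&S = ${(total - acquisition)/1e6:.0f}M")
```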

DS: Moving back to DCGS, what do you think needs to be done to network with DCGS-Navy and Air Force DCGS?

Greene: Well, what we’re trying to do is get all of our data available in the cloud so that regardless of which platform you’re on, be it the Army’s, the Navy’s or the Air Force’s, you have access to all the data all the time and are able to fuse it for your individual purpose.

DS: And what would you say is the technical enabler that’s going to permit that?

Greene: Well, there are a number of enablers we need to work on. We need common data formats so we all have the same understanding of the data. We’re working through the challenges of sharing intelligence across multiple security levels; that’s inherent in all of those programs. Large datasets and data storage are a challenge, and we’ve taken the first step with our cloud work.
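Here is a small sketch of what a common data format means in practice: service-specific payloads are normalized into one shared record shape so every consumer reads them the same way. Both input shapes and all field names are assumptions for the example, not actual DCGS message formats.

```python
# Illustrative only: a common record that differently shaped service feeds
# are normalized into, so every DCGS variant reads the data the same way.
COMMON_FIELDS = ("source", "lat", "lon", "observed_at")

def normalize(raw: dict) -> dict:
    """Map a service-specific payload onto the shared record shape."""
    if "position" in raw:                      # e.g., one feed nests coordinates
        lat, lon = raw["position"]
        return {"source": raw["svc"], "lat": lat, "lon": lon,
                "observed_at": raw["time"]}
    return {k: raw[k] for k in COMMON_FIELDS}  # already in the common shape

print(normalize({"svc": "navy-feed", "position": (32.38, -106.48), "time": 1.0}))
print(normalize({"source": "army-feed", "lat": 32.38, "lon": -106.48,
                 "observed_at": 2.0}))
```

Once everything lands in the same shape, the cross-service fusion Greene describes becomes a data-access problem rather than a translation problem.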

And then the last one I would mention is the network. The network is obviously essential to sharing information, and we’re working our way through where we position data on the battlefield. The cloud is a great concept, but we have to actually get down to the technical details. Users really don’t care where the data resides as long as they get near-instantaneous access to it.

Our challenge is making sure the data is actually accessible over the network links we have, because sometimes that can become the limiting factor. Some of our units may not have as much pipe as they’d have, say, in garrison on an air base. So we need to look at where we position the data to enable users to discover it and bring it in for fusion.
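A toy decision rule makes the bandwidth point concrete: if a unit’s link cannot deliver a dataset within the time a user will tolerate, stage a copy forward rather than serving it from the rear. All numbers below are invented for illustration.

```python
# Toy data-placement rule for the bandwidth point above: if the link can't
# deliver a dataset within the acceptable wait, cache it forward instead of
# serving it from a rear cloud node. Figures are invented.
def place(dataset_mb: float, link_mbps: float, max_wait_s: float) -> str:
    transfer_s = dataset_mb * 8 / link_mbps  # MB to megabits, then seconds
    return "cache forward" if transfer_s > max_wait_s else "serve from rear"

print(place(dataset_mb=500, link_mbps=2, max_wait_s=60))    # thin tactical pipe
print(place(dataset_mb=500, link_mbps=100, max_wait_s=60))  # garrison-class pipe
```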