NASA’s autonomous rendezvous for satellite servicing


Gridium:        Hello everyone, and welcome to this conversation with Eugene Skelton of the Satellite Servicing Projects Division at NASA. Eugene focuses on Rendezvous and Proximity Operations on the Raven mission, which is an on-orbit relative navigation demonstration using International Space Station visiting vehicles.

This is to say, Eugene and team are exploring the technologies we need to fix satellites in space.

My name is Millen, and I’m with Gridium. Buildings use our software to fine-tune operations.

We’ll be talking about the fascinating technology being developed as part of the Raven mission, including sensors, machine vision algorithms and processing, and what it will take to perfect autonomous, satellite-to-satellite rendezvous in space.

Right, I’m excited to be speaking with you today Eugene, thanks for joining us!

Eugene:        Thanks for having me, it’s a pleasure to be here.

Gridium:        I first heard the Hubble Space Telescope repair story during our podcast with Dr. Ed Gibson, a former NASA astronaut who spent 84 days living in Skylab.

Let’s start there Eugene, what happened with Hubble?

Eugene:         Hubble was a vehicle that was originally designed to be serviced, but not in the way that was initially required. There are gyros and batteries on the vehicle that the designers anticipated would need to be replaced. But when Hubble was designed and built, there was a flaw in its optics, and once we got it on orbit and the issue was discovered, a servicing mission was required to install corrective optics on the telescope.

And so that was the first servicing mission for Hubble, and it was exciting. I did not participate in that one specifically; there have been 5 total Hubble servicing missions.

As I said, it was originally designed for servicing, and I participated in the final servicing mission for Hubble, where we replaced some gyros and instruments and had a very interesting technology demonstration objective that included, for the first time with Hubble, rendezvous and proximity operations in an automated sense.

Gridium:        And those 5 servicing missions for Hubble, did they inspire the Raven mission or is that part of the Raven mission?

Eugene:         Yeah, it inspired it in some way.

After the 4th servicing mission, a 5th servicing mission was planned, and then the Columbia accident tragedy happened—and Hubble still needed to be serviced. So NASA started a mission to robotically repair Hubble. It was, at the time, called the Hubble Robotic Servicing and Deorbit Mission—I believe—and through that process they identified one of the key risks for a mission like that: the ability to autonomously rendezvous and dock with a vehicle that had not been designed for that objective.

And eventually, the robotic mission was cancelled in favor of a manned servicing mission: Servicing Mission 4, which was actually the 5th servicing mission to Hubble but was named SM4.

And during that, like I said, we had a demonstration where we had 3 visible-wavelength cameras installed in the back of the orbiter payload bay. As the astronauts navigated the orbiter up to Hubble, those cameras were recording imagery of the Hubble vehicle, and we were using machine vision to determine Hubble's attitude and position relative to the Space Shuttle.

That was the genesis of Raven. We planned an ISS experiment after SM4 that eventually became a ground demonstration—it never reached flight—and through the success of that ground demonstration, which was named Argon, Raven emerged: an ISS demonstration that is now operational on the International Space Station.

That’s kind of the path of how we got from Hubble to Raven.

Gridium:        And, let’s hear a little bit about your path Eugene. How did you end up working at NASA on Autonomous Rendezvous technology and also on Raven?

Eugene:         I started on the Hubble Attitude Control System team.

Initially, after Columbia, we needed to find a way to extend Hubble's pointing capability and its life without the servicing, without the replacement equipment, that it would typically get—because the hardware was degrading and in need of replacement. Specifically, the gyros.

I worked on that team to develop what we called “Life Extension Attitude Control Systems,” which would maintain Hubble science on fewer gyros than it was designed for. Initially that was 2 gyros instead of 3, and eventually we got it down to 1 gyro instead of 3.

And after that completed, I supported the robotic servicing mission and eventually SM4—the 4th servicing mission—and the demonstration we did in the back of the orbiter payload bay. That's how I got onto the Raven program.

Gridium:        When you talk about the Raven program, can you describe for us its objective? I understand that there are 3 main objectives.

Eugene:         We have 5 science objectives. Primarily, it is to collect imagery of visiting vehicles—in visible, infrared, and flash LIDAR. We want to collect multiple visiting vehicle types: when we practice rendezvous and proximity operations with the machine-vision algorithms, we want to practice on multiple kinds of vehicles.

They look different; they have different surface properties in the way the cameras see them, different geometry, different lighting, different angles… so: multiple visiting vehicle types; multiple rendezvous of the same type of visiting vehicle; exercising real-time 6-degree-of-freedom relative estimation, which we call POSE; and autonomous operations, which is the ability to point the Raven sensor package autonomously at the visiting vehicle as it approaches.

Those are our main science objectives.

Gridium:        I know that it seems that we’re jumping into the story mid-way through here since Raven is operating on the International Space Station as we speak… but can you tell us a little bit about how we got here? What was the development cycle for this technology?

Eugene:         I’ve got to give a shout-out to our partners on the payload that hosts Raven—the DoD STP, the Department of Defense Space Test Program. They’re run out of Johnson Space Center in Houston. They have about a dozen experiments, including Raven, on this payload, which is called STP-H5—the H stands for Houston, and 5 is the number.

Raven is one of those and we collaborated with them to share the infrastructure that they had and to share the ride that they had allocated to get to the space station. So, we launched in February of 2017—that’s this year, 2017. The year has flown by!

And we were installed in the latter part of that month. We were able to turn on the instrument and even look at the vehicle that brought us up there, which was pretty neat, and watch the departure of that vehicle—the SpaceX CRS-10 Dragon.

And since then, we’ve been able to view multiple SpaceX vehicles as they’ve approached Space Station, multiple Cygnus vehicles—which are flown by Orbital ATK—and a couple of the Russian vehicles, Soyuz and Progress.

Gridium:        Before we dive into some of the experiments happening on Raven, I understand that you focus on Rendezvous and Proximity Operations. What is that?

Eugene:         Rendezvous and Proximity Ops is the technology or the discipline that would bring two vehicles together in space.

And you can do that either with people—man in the loop—or autonomously, where computers and machines drive the two vehicles using knowledge of their relative position and attitude.

I focus on the automated side, where we don’t have an astronaut onboard and the lag for an operator on the ground to control the vehicles would be too long for us to expect a good, safe rendezvous and capture. So we focus on bringing the vehicles together in orbit using optical sensors and other traditional sensors like gyros, star trackers, and GPS.

Gridium:        Yeah, I was listening to NASA’s “Houston, we have a podcast” over the weekend in preparation for our chat Eugene.

And they were talking about the Mars rover—the time it would take for a person to control one of the experiments, like a digging arm. The time delay is just way too long for a signal to travel from Earth to Mars and back to update the position of the arm, or for a person to remotely control the rover.

That’s fundamentally one of the challenges that your work overcomes, is that right?

Eugene:         Yeah, that’s right.

And if you think about it, on Mars the vehicle that’s doing the exploring is sitting on the planet and it’s not moving relative to that planet. When I do Rendezvous and Proximity Operations, I’m in an orbit with another vehicle, and the orbital dynamics are constantly moving those two vehicles relative to each other.

So, I don’t have time to stop and think about what I want to do as these two vehicles are moving relative to each other. We’re talking about vehicles that are many thousands of kilograms, and any momentum exchanged between them would be very, very undesirable.

Gridium:        Yeah. I can imagine that this means that the Raven mission hardware is impressive.

Can you tell us a little bit about what that looks like?

Eugene:         Yeah, so there’s a visible camera on there that has a motorized zoom lens. As the vehicle comes towards us, just like with a camera you might have here on the ground, you’re going to want to adjust the focal length and the focus so you can get a good, crisp image of that vehicle, so we can see its features as it closes from 100 meters all the way to 30 meters.
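
As a rough sketch of the zoom geometry Eugene describes (a simple pinhole-camera model with made-up numbers, not Raven's actual optics), the focal length needed to keep an approaching vehicle the same size on the sensor shrinks in proportion to its range:

```python
# Pinhole-camera sketch (illustrative numbers, not Raven's optics):
# image size on the sensor = focal length * target size / range,
# so holding the image size constant as range drops means zooming out.

def focal_length_mm(target_size_m: float, range_m: float,
                    image_size_mm: float) -> float:
    """Focal length that renders a target of the given physical size
    at the desired size on the sensor (thin-lens approximation)."""
    return image_size_mm * range_m / target_size_m

# A hypothetical 10 m vehicle held at 5 mm on the sensor:
for r in (100, 60, 30):
    print(f"range {r:>3} m -> f = {focal_length_mm(10, r, 5):.0f} mm")
# range 100 m -> f = 50 mm ... range 30 m -> f = 15 mm
```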

We want to be able to see features like solar arrays and blankets and other features that help us determine how those are moving relative to us. We also have an infrared camera on board and a 3D flash LIDAR. LIDAR is an instrument that sends light out into the environment, out into space, and starts a clock; the light hits the visiting vehicle and returns to the sensor, which measures the time of flight of that light and determines how far away the object is using the speed of light and that time measurement.
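
The time-of-flight principle Eugene describes reduces to one line of arithmetic. A minimal sketch, with an illustrative 667-nanosecond example:

```python
# Flash-LIDAR ranging: start a clock when the pulse leaves, stop it
# when the reflection returns. The round trip happens at the speed of
# light, and the one-way range is half the round-trip path.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """One-way distance to the target from the round-trip travel time."""
    return C * round_trip_s / 2.0

# A target about 100 m away returns the pulse in roughly 667 nanoseconds:
print(range_from_time_of_flight(667e-9))  # ~100.0 m
```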

Unlike your Home Depot laser range finder, which measures a single point, for us this creates a 256×256 point cloud. So we can see, across an area, what the features of the vehicle look like. What I kind of envision is a game I used to have as a kid—a square, like a 5×5 area, with a bunch of pins in it, pins or nails…

Gridium:        Oh yes.

Eugene:         …and you could push your hand into it and you would see the shape of your hand in 3D on the other side. That’s essentially what we get back out of the Flash LIDAR.
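
To make the pin-art analogy concrete: each of the flash LIDAR's 256×256 pixels reports a range, and that pixel's viewing direction turns the range into a 3D point. A minimal sketch, assuming a simple pinhole model with an illustrative focal length (not the real sensor's parameters):

```python
# Sketch: turn a 256x256 range image into a 3D point cloud. Each pixel
# measures a range along its own viewing ray; scaling the unit ray by
# the range gives an (x, y, z) point, one per pixel.
import numpy as np

def range_image_to_points(ranges: np.ndarray, focal_px: float) -> np.ndarray:
    """ranges: HxW array of measured distances (m);
    focal_px: focal length in pixels (assumed pinhole model)."""
    h, w = ranges.shape
    u, v = np.meshgrid(np.arange(w) - w / 2, np.arange(h) - h / 2)
    rays = np.stack([u, v, np.full_like(u, focal_px)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)  # unit view rays
    return rays * ranges[..., None]  # HxWx3 points

# A flat "plate" 100 m away, seen by every pixel:
cloud = range_image_to_points(np.full((256, 256), 100.0), 300.0)
print(cloud.shape)  # (256, 256, 3)
```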

Gridium:        So, the point cloud provides dimension?

Eugene:         That’s right.

Gridium:        Cool.

Eugene:         And then we have a two-axis gimbal which allows us to control the camera package on Raven, the instrument package.

So, as the vehicles move towards the Space Station, they’re going all around it from a relative perspective. ISS, for the most part, holds an LVLH-like attitude where Raven is always pointing down at the Earth—sometimes they change their attitude, but for the most part we’re fixed in the rotating orbital frame.

The vehicles, on the other hand, are dancing all around us as they do the rendezvous. Sometimes they’re coming up the nadir vector, sometimes they’re coming up the velocity vector. And so Raven has to change its pointing to track those vehicles as they get closer and as they’re doing their own rendezvous towards International Space Station.

Gridium:        Would you say the nadir vector, is that somehow from behind the ISS?

Eugene:         Oh, so nadir is the vector pointing at the Earth, right? It’s down.
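
Nadir and the rest of the LVLH ("local vertical, local horizontal") frame Eugene mentions fall straight out of a spacecraft's position and velocity vectors. A sketch with illustrative, ISS-like numbers:

```python
# Sketch of an LVLH-style frame: nadir points from the spacecraft
# toward the center of the Earth, computed here from inertial
# position and velocity vectors.
import numpy as np

def lvlh_axes(r: np.ndarray, v: np.ndarray):
    """Unit vectors of an LVLH-style frame from position r and velocity v.
    Returns (nadir, along_track, cross_track)."""
    nadir = -r / np.linalg.norm(r)              # "down," toward Earth
    h = np.cross(r, v)                          # orbit angular momentum
    cross_track = -h / np.linalg.norm(h)
    along_track = np.cross(cross_track, nadir)  # ~velocity direction
    return nadir, along_track, cross_track

# Illustrative circular orbit at an ISS-like altitude (km, km/s):
r = np.array([6778.0, 0.0, 0.0])
v = np.array([0.0, 7.66, 0.0])
for name, axis in zip(("nadir", "along", "cross"), lvlh_axes(r, v)):
    print(name, axis)
```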

Gridium:        Okay.

Eugene:         I was just going to say, there was one other key element which I forgot to mention, which is SpaceCube—a reconfigurable computer that we use on Raven and that is really the brains of the operation.

That’s where we run our machine vision algorithms, and the key is that it is reconfigurable from both a firmware and a software perspective.

Meaning that as we run these experiments, we get to modify these algorithms based on what we’re learning, upload them, and then, as another visiting vehicle comes, test those improvements to the algorithms.

Really an enabling technology for us.

Gridium:        Cool. Was that—I was about to ask if there’s been a particularly difficult piece of technology for you and your team to work out.

I wonder if that was the most difficult?

Eugene:         Certainly getting the algorithms to fit inside a computer that is tolerant to the space environment, specifically radiation, is a challenge.

And the computing capabilities that are available for space are far less capable than what we have on the ground right now. The joke I hear is that the Hubble computer is less powerful than your smartphone… and Raven has got SpaceCube, which is very powerful, but because it has to go through space qualification, it’s not as capable as things you can find at Best Buy this year.

And that’s part of the challenge: getting the algorithms to fit in the box. People on the ground are developing machine-vision algorithms for autonomous cars or drones, and they get to leverage the processing power of today, whereas we have to get ours to fit on this much smaller footprint for space.

Gridium:        I understand that there are still tons of other potential uses for some of this technology, like Restore-L, Orion or In-Space Assembly, is that right?

Eugene:         Yeah, that’s right.

So Restore-L is a technology demonstration mission that’s going to help us prove that the technologies and capabilities required for servicing a satellite are ready for primetime—ready to be provided to industry to foster a US satellite-servicing industry.

Also, Orion is being developed for manned flight, and we know we would like to provide the agency with the capability of doing autonomous rendezvous for that fleet of vehicles.

We’re going to be able to hand off the lessons learned and the technology that we’re developing for Raven to Orion. Similarly, there are missions like WFIRST that are considering servicing, or other enabling missions like In-Space Assembly—you can consider doing these missions with this technology now.

Gridium:        Let’s get back to Raven: how are you and your team defining success?

Eugene:         Success for us is being able to do POSE in real time. POSE is determining all 6 degrees of freedom: the position and the attitude of these visiting vehicles.

So, we want to do that in real-time on the vehicle. A lot of people have been able to demonstrate in post-processing that they can do POSE; but the challenge of course is doing it the first time that you see a vehicle, the first time that you attempt it in a flight environment. It’s kind of like knowing the answer beforehand if you get multiple attempts at it, and that’s really the challenge. And then we’ve been able to demonstrate that now with many of these vehicles.

The first time we see the vehicle, we’re able to do POSE on it, we’re able to track it very robustly.
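
For a flavor of what POSE involves, here is a minimal sketch of 6-degree-of-freedom pose recovery. It assumes the hard part, matching 3D feature points between a model of the visiting vehicle and the LIDAR point cloud, is already done, and it uses the classic SVD (Kabsch) best-fit solution rather than Raven's actual flight algorithms, which are more involved:

```python
# Sketch of 6-DOF pose recovery from matched 3D points using the
# classic SVD (Kabsch) best-fit rotation; the translation follows
# from the centroids. Not Raven's flight code, just the geometry.
import numpy as np

def estimate_pose(model_pts: np.ndarray, observed_pts: np.ndarray):
    """Best-fit R, t with observed ~= R @ model + t (Nx3 matched sets)."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # guard against reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = oc - R @ mc
    return R, t   # 3 DOF of attitude plus 3 DOF of position

# Toy check: rotate and translate some points, then recover the pose.
rng = np.random.default_rng(0)
model = rng.normal(size=(50, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
observed = model @ R_true.T + np.array([10.0, -2.0, 30.0])
R_est, t_est = estimate_pose(model, observed)
print(np.allclose(R_est, R_true), t_est)  # True, [10. -2. 30.]
```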

Gridium:        That’s awesome.

Eugene:         Yeah, so that’s one.

The other key for success for us is data sharing. We’re collecting this imagery and we are allowed to share the visible imagery with US Industry, so they can start to develop their own algorithms and train them so that one day they might be able to do it on orbit.

And there’s also the ability for the infrared and the LIDAR data to be shared through agreements with NASA. So that’s another way that we’re enabling and fostering this US industry of satellite servicing and other rendezvous and prox ops capabilities.

Gridium:        That’s really cool Eugene that the imagery and the data is available to companies back here on Earth!

Eugene:         Yeah, that’s right. And not only is the imagery available.

We’re actively transferring our technologies to interested US companies to jump-start a domestic servicing industry with a robust fleet of servicers.

In fact, we’re hosting our second industry day on January 30th of this coming year to facilitate that process.

Gridium:        Cool.

And one of the things I’m having to remind myself is, as we speak, Raven is attached to the International Space Station, which for the benefit of our audience—and I had to look this up—is travelling at 17,150 mph, 254 miles up in the sky.

Can you tell us what it was like to see all of your work go up? You know, I imagine you watched the SpaceX launch. What did that feel like?

Eugene:         Yeah, it was a little nerve-racking to watch the vehicle sitting on the pad. I went down to Kennedy, took my family down there. My kids are young—they were 4 and 6 at the time, and this was their first launch…

Gridium:        Wow.

Eugene:         …which was really exciting to have my son on my shoulders watching the vehicle go up in the air.

It was a day with low cloud cover—I think the ceiling was at 3,000 or 5,000 feet—and the vehicle went up, left the pad, and in a matter of 5 or 10 seconds it was gone. We could hear it for many minutes after that. It was really cool—actually, it was more impressive for me to see the way my kids were inspired by it, and I’ll cherish that forever, just as much as seeing all that hard work go up into space. And really, the payoff was being back at Goddard, powering up the payload, and seeing images come out for the first time.

We launched the vehicle with the sensors facing down into, basically, a plate—you couldn’t see anything but white. We have launch locks that had to release and a gimbal that had to actuate.

Seeing that come to life for the first time from the control room was actually more exciting than the launch because that was the payoff, for us.

Gridium:        Yeah, I see. And how are the experiments going? Is your team having fun?

Eugene:         Yeah, we’re having a lot of fun.

It’s very exciting in the Control Room. Typically our operation intervals happen very late at night, and on the weekend more often than not. So at 2am on a Saturday there’ll be a small group of us focused on pointing the sensor package and getting the POSE algorithms up online—and there are challenges with every vehicle.

The Russian vehicles, for instance, come up to Space Station very expediently. In a matter of 30 minutes, they go from the far reaches of where we can see them all the way to capture, and it’s a very challenging endeavor to keep up with them. It’s very exciting in the Control Room, calling out the marks where they’re at, so we can both adjust the algorithms to do their job and keep our eyes on them.

The American vehicles are more deliberate and give us more opportunities to tweak the algorithms in real time and really push the state of the art of what these algorithms can do—to stretch them in ways that we didn’t even initially intend. We have that great opportunity with this platform on orbit.

Gridium:        So, Eugene, 99% of satellites in space are not designed for rendezvous or servicing.

What do you think the future of satellite servicing will look like and how will your team’s algorithms and the work from Raven shape that future?

Eugene:         Yeah, I mean, you’re right.

Right now, the majority of the vehicles up there were not designed for rendezvous. They don’t have cooperative features on them that allow us to more readily see them and know where they’re at.

They also don’t have interfaces designed into their modules to make them easy to replace or repair. I think as we mature these technologies, that is going to become more prevalent in the design of vehicles, because it enhances their capability from a resiliency perspective, from an expansion-of-capabilities perspective, from a flexibility perspective—it allows them to be more robust and more resilient, and allows us to expand their capabilities post-launch as technologies improve on the ground.

And I think that’s going to be really enabling for us to push the science of space further.

Gridium:        What remains for the Raven mission and what are you and your team working on now?

Eugene:         Yeah, so the Raven mission will fly for another 12 to 14 months—February of 2019 is when STP-H5 is slated to be removed from ISS.

So, we’ll continue to do rendezvous monitoring and autonomous operations with these visiting vehicles. We should get—over the life of Raven, over the 2 years—25 visiting vehicles coming and going.

We’ll have a new series of vehicles coming soon—the manned U.S. fleet, the Boeing vehicle and the SpaceX vehicle—a new visiting vehicle type that we can practice our algorithms with, so it’ll be really neat.

Gridium:        Cool.

Okay, well we wish you Eugene and the rest of your team nothing but the best of luck. This has been a lot of fun!

And for more information, we invite the audience to check out the dedicated Raven mission page on the Satellite Servicing Projects Division section of NASA’s website.

Thanks again, Eugene.

Eugene:         Thanks for having me!

About Millen Paschich

Millen began his career at Cambridge Associates, trained in finance at SMU, and has an MBA from UCLA. Talk to him about bicycling, business, and green chile burritos.
