R.I.G V1.0: Super Enhanced Field of View

In 2015 I entered the NASA International Space Apps Challenge in Boston, where my team and I designed a wearable computer system for the EVA space suit.

We took a close look at the limitations of the current space suit and tried to find essential features that could increase safety, comfort, and adaptability for astronauts on spacewalks.

We found that, faced with constantly changing scenarios, an astronaut’s ability to react is often constrained by gear and utilities that don’t facilitate real-time feedback. Research began by looking at previous projects that tested the idea of augmenting the space suit with wearable computers. By far the most helpful was “A Wearable Computer for Support of Astronaut Extravehicular Activity” by Christopher E. Carr. We were also lucky enough to have Christopher as a mentor for the project.

Once we knew the shortcomings of the existing suits, it was easier to pinpoint the features we wanted to develop. We chose to focus on the astronaut’s lack of visibility, connectivity, and adaptability. Our solution was to design a wireless individual network that integrates multiple scalable, modular devices, allowing for on-demand display and transfer of real-time data to astronauts and mission crew.
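To make the “modular devices sharing real-time data” idea a little more concrete, here is a minimal sketch of what one module’s side of that network could look like. The message fields, the port number, and the UDP-broadcast transport are all assumptions for illustration, not the exact protocol we put together at the hackathon.

```python
# Hypothetical sketch: a suit module broadcasting a reading onto the suit's
# local wireless network. Field names, port, and UDP broadcast are
# illustrative assumptions, not the protocol we actually built.
import json
import socket
import time

SUIT_NET_PORT = 5005  # assumed port for the suit's local network


def broadcast_reading(module_id, metric, value):
    """Broadcast one timestamped reading to every listener on the suit network."""
    packet = json.dumps({
        "module": module_id,
        "metric": metric,
        "value": value,
        "timestamp": time.time(),
    }).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, ("255.255.255.255", SUIT_NET_PORT))


if __name__ == "__main__":
    # Dummy example: a module reporting a fixed battery reading once per second.
    while True:
        broadcast_reading("camera-module-01", "battery_pct", 87.0)
        time.sleep(1.0)
```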

The platform augments and expands the astronaut’s abilities during extravehicular activity by feeding data from modular devices through a central microcomputer to a heads-up display (HUD). The wearable microcomputer is an internally mounted device that interfaces with the current Display and Control Module for space suit life-support readings. It also acts as a hub and wireless router, allowing real-time data to flow between the astronaut and the mission crew (a rough sketch of what that hub loop might look like follows below).

In our prototype we used a Raspberry Pi for the mounted computer. The Raspberry Pi has OK processing power (enough for what we needed), WiFi capability, and Bluetooth connectivity with the additional USB dongles we used. (Full parts list at the bottom.) For the HUD we used two iPhones connected via TightVNC.

The external independent modules were designed to provide a universal power supply along with an Internet-of-Things base, extending real-time feedback from interfacing devices (cameras, sensors, etc.) to mission control and/or the astronauts. For the demo we just used another Raspberry Pi and a camera to give an example of what a module could look like.

Finally, for the actual helmet we used, are you ready… a Box O’ Joe from Dunkin’ Donuts for the hard part of the helmet, a face shield from safety equipment, car window tint for the visor, and an old winter jacket for the covering. This is what we had at the end!
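And here, as promised above, is a matching sketch of the hub side: the central Raspberry Pi collects whatever the modules broadcast and keeps a latest-values readout on its screen, which the HUD phones then mirror over TightVNC. The port and message format line up with the module sketch above and are just as hypothetical.

```python
# Hypothetical sketch of the hub side: the central Raspberry Pi collects
# readings broadcast by modules and keeps a "latest values" table. In the
# prototype, the HUD (two iPhones over TightVNC) simply mirrored this screen.
import json
import socket

SUIT_NET_PORT = 5005  # must match the port the modules broadcast on


def run_hub():
    """Collect module readings and keep a latest-values table for the HUD."""
    latest = {}  # (module, metric) -> most recent value
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", SUIT_NET_PORT))
        while True:
            data, _addr = sock.recvfrom(4096)
            try:
                msg = json.loads(data.decode("utf-8"))
                latest[(msg["module"], msg["metric"])] = msg["value"]
            except (ValueError, KeyError):
                continue  # ignore anything that isn't a well-formed reading
            # A plain text refresh is enough here, because the HUD phones
            # mirror the Pi's screen rather than parsing the data themselves.
            print("\033[2J\033[H", end="")  # clear the terminal
            for (module, metric), value in sorted(latest.items()):
                print(f"{module} {metric}: {value}")


if __name__ == "__main__":
    run_hub()
```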

We worked hard and fast, and thankfully it was enough to win 1st place at the 2015 NASA International Space Apps Challenge in Boston.
