P3: NASA Apollo Rich Audio Access

Objective

Develop a schema to store Apollo data and meta-data.

Background

Previous attempts have captured at most two channels (air-to-ground and the flight director loop); the other loops were previously unavailable. Our recent digitization efforts will make as many as 28 channels available, and the new schema should handle all of them.

Merit/Impact

Simultaneous access to all audio channels will offer a new perspective on the mission. Being able to selectively hear a subset of the channels will allow investigators to follow conversation chains. Critical features to support include the ability to move back and forth in time and across channels, especially given that the audio channels span roughly 10 days. Additionally, users will appreciate a feature that allows them to play multiple channels at the same time.
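The simultaneous-playback feature above could be sketched as follows, assuming each channel has been decoded into an array of audio samples. The channel names and sample values here are purely illustrative, not the actual Apollo loop inventory.

```python
# Hypothetical sketch: play a user-selected subset of channels at once by
# averaging their samples into one mono stream. Channel names are assumptions.

def mix_channels(channels, selected):
    """Average the samples of the selected channels into one mono stream."""
    streams = [channels[name] for name in selected]
    n = min(len(s) for s in streams)  # truncate to the shortest stream
    return [sum(s[i] for s in streams) / len(streams) for i in range(n)]

channels = {
    "air_to_ground":   [0.1, 0.2, 0.3, 0.4],
    "flight_director": [0.0, 0.4, 0.1, 0.2],
    "eecom":           [0.2, 0.0, 0.2, 0.0],
}

mixed = mix_channels(channels, ["air_to_ground", "flight_director"])
```

In a real player the mixing would happen on buffered audio frames rather than whole 10-day streams, but the selection logic is the same.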

Method

List of data channels

* Mission
* Controller Positions
* Mission Personnel
* Conversation
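One way the entities above might fit together is sketched below, using Python dataclasses as a stand-in for the eventual schema. All field names are assumptions to be refined in discussion.

```python
# A minimal relational sketch of the listed entities. Every time field is in
# MET seconds so records from different sources can be synchronized.
from dataclasses import dataclass, field

@dataclass
class Mission:
    name: str                 # e.g. "Apollo 11"
    launch_met_offset: float  # seconds between tape clock zero and liftoff

@dataclass
class ControllerPosition:
    callsign: str    # e.g. "EECOM" (illustrative)
    channel_id: int  # which of the ~28 digitized channels carries this loop

@dataclass
class Personnel:
    name: str
    position: str    # callsign of the position they staffed

@dataclass
class Conversation:
    start_met: float
    end_met: float
    channel_ids: list = field(default_factory=list)  # channels involved
    speakers: list = field(default_factory=list)     # Personnel names
```

A conversation chain can then be followed by querying Conversation records that share speakers or channels and ordering them by start_met.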

Additional Analysis

Additional analysis by algorithms will generate new (or automatically processed) meta-data. I am listing them here:

* Word Count Analysis
* Sentiment Analysis
* Conversation Turn Taking Analysis
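As one concrete example of these analyses, word counts could be computed per fixed-size MET window over a transcript. The transcript pairs and window size below are illustrative assumptions.

```python
# Hedged sketch of the word-count analysis: count words per MET window over a
# hypothetical transcript of (met_seconds, utterance) pairs.
from collections import Counter

def word_counts_by_window(transcript, window=60.0):
    """Return {window_start_met: word_count} for fixed-size MET windows."""
    counts = Counter()
    for met, text in transcript:
        bucket = int(met // window) * window  # start of this MET window
        counts[bucket] += len(text.split())
    return dict(counts)

transcript = [
    (10.0, "roger we copy"),
    (15.0, "standing by"),
    (70.0, "go for staging"),
]
```

Because the result is keyed by MET, it can feed a timeline widget directly; sentiment and turn-taking analyses could emit records keyed the same way.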

In general, we will tie these analyses to MET. In this way, we will be able to synchronize the analyses with the audio channels. I would like to show these analyses as add-on widgets on our main audio access board.
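Tying records to MET amounts to subtracting the liftoff instant from each wall-clock timestamp. A minimal sketch, using Apollo 11's liftoff time (1969-07-16 13:32:00 UTC) as the reference:

```python
# Convert a record's wall-clock (UTC) timestamp to MET seconds.
from datetime import datetime, timezone

LIFTOFF = datetime(1969, 7, 16, 13, 32, 0, tzinfo=timezone.utc)  # Apollo 11

def to_met(wall_clock):
    """Seconds of Mission Elapsed Time for a UTC datetime."""
    return (wall_clock - LIFTOFF).total_seconds()

t = datetime(1969, 7, 16, 13, 33, 30, tzinfo=timezone.utc)
```

Once every analysis record, audio sample, and media item carries an MET value, the widgets only need a shared playback cursor in MET seconds.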

Additional meta-data captured during the mission is also available:

* Pictures
* Videos

These will also be synced with MET and can be displayed in separate widgets (let's discuss).
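A media widget synced this way could look up the item nearest the current playback position. The media list below is a placeholder, assuming each item is stamped with an MET value.

```python
# Sketch of the media-widget lookup: given pictures/videos stamped with MET,
# find the item nearest the current playback position. Data is illustrative.
import bisect

media = [  # (met_seconds, item_id), kept sorted by MET
    (120.0, "photo_001"),
    (3600.0, "video_001"),
    (7200.0, "photo_002"),
]

def nearest_media(met):
    """Return the id of the media item closest in MET to the given time."""
    mets = [m for m, _ in media]
    i = bisect.bisect_left(mets, met)
    candidates = media[max(0, i - 1):i + 1]  # neighbors on either side
    return min(candidates, key=lambda item: abs(item[0] - met))[1]
```

The binary search keeps the lookup cheap even over 10 days of mission media.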

Results

Pending. Populate post discussion.

Conclusion

Pending. Populate post discussion.

Appendix

Mission Control