I have created a new Project Glass inspired video, which shows my system translating realtime conversations and displaying them as subtitles in your glasses. I can have a conversation with Elizabeth, who speaks Spanish to me, while I reply in English. I have never learnt Spanish, but using the glasses I can hold a full conversation. To make this possible I used:
- 2x Raspberry Pi running Debian Squeeze
- Vuzix 1200 Star
- Jawbone Mic
- Headset Mic
Follow after the break to find out how it was all put together.
The individual using the glasses wears the Vuzix 1200 Star, which connects to the S-Video connector on the first Raspberry Pi, and the Jawbone Bluetooth microphone, which pairs with a device such as a smartphone or tablet to provide a clean, noise-cancelled audio feed. The Bluetooth microphone streams what I say (and what it picks up around me) across the network. This speech is then recognised and passed through Microsoft's translation API, with a caching layer to improve performance for regularly used statements. The round trip through this API is the biggest source of delay in the subtitles. Once translated, the server passes the text back, and it is picked up by the Raspberry Pi driving the TV and glasses displays. Elizabeth uses a headset mic, but she could use her own Raspberry Pi, glasses and Jawbone microphone to have the same experience as I do.
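The caching layer mentioned above could be as simple as memoising translation results keyed on the input phrase, so repeated statements skip the slow API round trip entirely. Here is a minimal Python sketch of that idea; `translate_remote` is a hypothetical stand-in for the real Microsoft Translator call (the actual project would make an HTTP request there), and the call counter exists only to demonstrate the cache working.

```python
import functools

# Counter to show how many times the "remote" API is actually hit.
CALL_COUNT = {"n": 0}

def translate_remote(text, src="es", dest="en"):
    """Hypothetical stand-in for the Microsoft Translator API call.

    In the real system this would be a network request, which is the
    biggest source of subtitle delay; here a tiny phrasebook fakes it.
    """
    CALL_COUNT["n"] += 1
    phrasebook = {"hola": "hello", "gracias": "thank you"}
    return phrasebook.get(text.lower(), text)

@functools.lru_cache(maxsize=1024)
def translate_cached(text, src="es", dest="en"):
    """Cache regularly used statements so repeats avoid the API round trip."""
    return translate_remote(text, src, dest)

# Repeated phrases only cost one remote call.
print(translate_cached("hola"))   # first call goes to the "API"
print(translate_cached("hola"))   # second call is served from the cache
print(CALL_COUNT["n"])            # remote hit exactly once
```

A real deployment would want the cache keyed on the language pair as well (which `lru_cache` handles automatically here, since `src` and `dest` are part of the call signature) and possibly persisted to disk so common phrases survive a reboot.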