Ever since I first got my Google Glass in May I've been wanting to build a teleprompter for it. I compete in a lot of hackathons and present onstage a lot, and normally I read my speech off my iPad, but it's awkward to keep glancing down at it mid-presentation.

Unfortunately I didn't have any Android development experience, nor had I ever built much with Java; my skillset is largely C#/.NET (which I've been developing in since 2001, including working at Microsoft). Throughout 2013 I was busy building a startup, so I didn't have time to learn Glass development. But I heard about Xamarin's Glass support around the same time I was gearing up for another year of professional hackathoning, so I decided it was finally time to build the app I'd been thinking about for so long.

The AT&T developer summit hackathon took place in Las Vegas on January 4th and 5th, 2014. It was a huge hackathon with hundreds of hackers, over $100k of prizes, and a really great vibe. Over the course of 24 sleepless hours, I taught myself Glass development with Xamarin and built a very rough prototype of GlassPrompter.

GlassPrompter displays the text of what I'm supposed to say onscreen so I can read it onstage. But it doesn't just advance the text at a set pace: it runs speech recognition in the background and matches what I say against the script I've entered, scrolling the text to keep up with my actual speaking pace. And since it follows along as I speak, it can also advance my slides automatically when I reach the right spots.
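The post doesn't show GlassPrompter's actual code, but the core "follow along" idea can be sketched in a few lines: keep a cursor into the script's words and advance it whenever a recognized word matches something in a small lookahead window (the window makes the matcher tolerant of misheard or skipped words). Everything below, including the `ScriptFollower` name and the `lookahead` parameter, is a hypothetical illustration of the technique, not the app's implementation.

```python
import re

class ScriptFollower:
    """Tracks position in a prepared script as speech-recognized words arrive."""

    def __init__(self, script, lookahead=5):
        # Normalize the script to lowercase words for loose matching.
        self.words = re.findall(r"[a-z']+", script.lower())
        self.cursor = 0             # index of the next unspoken script word
        self.lookahead = lookahead  # how far ahead to search for a match

    def hear(self, recognized):
        """Feed a chunk of recognized speech; advance the cursor past matches."""
        for spoken in re.findall(r"[a-z']+", recognized.lower()):
            # Search a small window ahead so skipped or misrecognized words
            # don't stall the prompter.
            window = self.words[self.cursor:self.cursor + self.lookahead]
            if spoken in window:
                self.cursor += window.index(spoken) + 1
        return self.cursor

    def upcoming(self, n=8):
        """Return the next n script words to display onscreen."""
        return " ".join(self.words[self.cursor:self.cursor + n])

follower = ScriptFollower("thanks everyone for coming out tonight to see our demo")
follower.hear("thanks everyone for coming")
print(follower.upcoming(3))  # -> "out tonight to"
```

Advancing slides falls out of the same mechanism: tag certain word positions in the script as slide boundaries, and fire a slide-change event whenever the cursor crosses one.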

I've still got some work to do to get GlassPrompter where I'd like it to be, but it performed very well onstage and I got a lot of positive reactions from the audience. Unfortunately I didn't use any AT&T APIs and therefore didn't qualify for any of the prizes, but I'm very happy with how it turned out, and I look forward to using it when I present in the future! Check out the source on GitHub.