Exploring Google Tango

Project Tango is a line of Google smartphones and tablets that can accurately track their position in space using a new depth-sensing camera. Position tracking on a traditional smartphone is limited and carries a large margin of error, making it largely unreliable for this purpose. The unique camera on Project Tango devices allows them to “see the world in 3D,” capturing the distance from the camera to each point in the scene rather than a color for each pixel. By tracking how objects move within the camera’s frame, the devices can accurately estimate how the camera itself must have moved to produce that shift. This tracking ability opens up exciting new possibilities in a variety of fields.
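The core idea, estimating camera motion from how tracked 3D points shift between frames, can be illustrated with the classic Kabsch algorithm. This is a hedged sketch, not Tango's actual pipeline; the point correspondences here are simulated rather than coming from a real depth camera.

```python
# A sketch of depth-camera motion estimation (not Tango's actual algorithm):
# given the same physical points seen in two depth frames, the rigid camera
# motion can be recovered with the Kabsch algorithm.
import numpy as np

def estimate_motion(points_a, points_b):
    """Estimate rotation R and translation t with points_b ~= R @ points_a + t.

    points_a, points_b: (N, 3) arrays of matched 3D points from two frames.
    """
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    H = (points_a - ca).T @ (points_b - cb)     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t

# Simulate a small camera motion: rotate 10 degrees about z, then shift.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 3))          # points seen in frame 1
theta = np.radians(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0,              0,             1]])
t_true = np.array([0.1, -0.2, 0.05])
moved = pts @ R_true.T + t_true                 # same points in frame 2

R_est, t_est = estimate_motion(pts, moved)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

With noiseless correspondences the estimate is exact to floating-point precision; a real device must also handle mismatched and noisy points.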

“At Academic Technologies, we are very excited about Project Tango. It is extremely promising technology, and quite easy to program for,” says Ryan McGrail. Ryan has been leading the exploration of Project Tango at Notre Dame, and is in the early stages of understanding the device and its potential. “I am quickly discovering how I can apply these new features to the apps we create. It is a very well-built device, and lends itself to imaginative implementations in our apps,” he says. Ryan has also observed shortcomings of Tango. The 3D camera is not effective for scanning objects in the distance, or outdoor environments. If the camera is obscured, it attempts to interpret motion from a black image. This can lead to inaccurate data. “The good aspects of these devices far outweigh their problems,” Ryan says. He believes that the kinks observed in the developer version of Tango will be worked out before the devices hit the market.

Currently, Ryan is using Tango in a partnership with Notre Dame’s architecture program. They are developing a blueprint-reconstruction tool that would allow the user to recreate a space inside a Tango device by simply walking around the building. The 3D model created with Tango easily translates into blueprints of the space, and can be imported into architecture programs. Beyond this current project, Ryan sees vast potential for Project Tango at Notre Dame. “Tango can be used as a handheld frame to see the world… We could use the devices to map out buildings on campus, such as the Main Building. With the models saved to a few Tango devices, Notre Dame admissions officers could take them around the country and allow prospective students to ‘walk’ around the buildings for themselves.” Tango could also be utilized in the classroom, creating an opportunity for instructors to virtually bring students to a museum on the other side of the world, or for designers to see their work come to life. “The possibilities are truly endless, and this is just one of the simplest ways to implement Project Tango,” Ryan says. We look forward to continuing to explore the Google Tango technology, and will keep you updated as we progress.


You may be thinking this article is about the ins and outs of living at college in a dorm. Or you may be thinking it has something to do with snowboarding.  Unfortunately (or fortunately), this article is actually about a teaching tool called StoryboardThat.


StoryboardThat is a free (the freemium version allows you to create two storyboards per week), quick, and easy way to create storyboards. What is a storyboard? Well, according to Dictionary.com, a storyboard is “a panel or panels on which a sequence of sketches depict the significant changes of action and scene in a planned film, as for a movie, television show, or advertisement.” A storyboard can be very useful for engaging students in thinking through the flow of a story, and it gives them a chance to be creative with their own stories. It can also help communicate an idea through a story, enabling students to visualize it for themselves.

StoryboardThat even provides a collection of curated storyboards on a wide range of topics that is easily searched by tags. You can find storyboards ranging from Shakespeare to The Hunger Games (you can also scroll to the bottom of the homepage to find a listing of storyboard categories). So, if you want to present a topic in a new and different way to engage your students, check the curated storyboards first: someone may have already created one so you won’t have to.

You can communicate your storyboard by printing it, exporting it as an image, exporting it directly into PowerPoint, embedding it into a webpage, or sharing it with others through email, Facebook, or Twitter.

The next time you are looking for another way of helping your students visualize a topic, then you might want to just StoryboardThat.


Making it easy to create video

There’s a growing need for faculty to create video content quickly and easily, and there are many potential use cases:

  • Flipping your class by having students watch a video before coming to class
  • Answering student questions in a visual medium to enhance understanding
  • Creating content for distance education
  • Making training videos

Unfortunately, right now this is a pretty complex process: it requires studio space, a videographer, complex editing tools, specialized lighting, and so on. Turnaround times can also be lengthy. Right now there’s no good solution and no system in place to help us provide this service to faculty. So we built our beta lightboard back in March with the hope that people would see it and get excited about it. We Love Bright Ideas! It had the desired effect: the College of Science was all over it. We’re in the process of helping them build a full-scale version, which we hope to have operational by July 1. That may not meet everyone’s needs, though. It’s also not currently the most user-friendly setup and will require some handholding for users.

Enter the Penn State One Button Studio!

  • You plug in a flash drive and the system turns on.
  • You hit a button and the system starts recording.
  • You give your presentation.
  • You hit the button when you’re done.
  • The file is automatically saved to the flash drive as an mp4 which you can upload to Sakai, YouTube, Kaltura, etc.

We’re starting to take a look at this now because we feel it meets most of the requirements for video creation. It’s one of those 98% solutions. It may not be perfect for everyone but if it’s good for you, it’s really easy and really good. Look for more later this summer! http://onebutton.psu.edu/

Got Brain Activity Update

We ran an experiment yesterday with the NeuroSky MindWave Mobile headset and used the Puzzlebox Orbit helicopter for a portion of it.  Our goal was to better understand how the headset performs with live participants in an experimental setting.  We are still gathering the data, but we will publish the unscientific results on this blog in the near future.

Our experiment consisted of three separate tasks designed to test whether certain activities would enhance participants’ attention and focus.  Each task involved a Braingle.com memory test (the word test), and brainwave data was recorded for two of the tasks using NeuroSky’s Recorder app.  We used an iPad for each task: two of the tasks used it to record brainwave data, and the third used it to operate the Orbit helicopter.

We found five willing participants who had never used the NeuroSky headset or the Orbit helicopter.  With an N of 5, we won’t be able to derive too much from the results, but we will learn how to set up future experiments with the headset and helicopter, which will be invaluable.

So, stay tuned for our unscientific results!

Campus Cocoa Coding Consortium

by Jeffrey Hanrahan, Academic Technologies

Throughout the University of Notre Dame and Saint Mary’s College, there are departments that are using Xcode and iOS to develop mobile and workstation applications for internal use, teaching, learning and research.

In an effort to collectively identify the application developers and pool everyone’s knowledge and expertise, a group named Campus Cocoa Coding Consortium (C4) was formed at the University of Notre Dame.  The purpose of the group is to help each other learn how to develop applications in Xcode and iOS, talk out code problems, conceptualize processes, troubleshoot programming workflows and design human interface elements.

The C4 group is composed of faculty and staff from the University of Notre Dame and Saint Mary’s College.  Skill levels range from novice to expert, and knowledge is freely shared: you hear about technical issues and details you won’t find in the programming manuals, and every member has some specialty that the others get to learn.

There is a weekly brown-bag lunch meeting where discussions get very technical.  These are working meetings where code is written and tested.  The C4 project code resides on the web-based hosting service GitHub, where group members can access project files at any time, from anywhere.

So far, the group has provided assistance for a mobile application from Architecture and is currently working on a facial recognition mobile application.

If you are a faculty/staff member at the University of Notre Dame or Saint Mary’s College and would like to join the C4 group, you can contact John Slaughter at jslaught [AT] nd.edu.

Make Your Media More Interactive and Engaging

Have you ever wondered how to make your video accessible to people who are hard of hearing or whose native language is different? Have you ever wondered how your video can benefit visual learners, who absorb and recall information best by reading rather than listening? Have you ever wondered how to give your audience the freedom to search across your video and jump to the exact keywords that interest them? You are not alone; we have been wondering about that for a while. Fortunately, we now have a solution … the interactive transcript.

An interactive transcript gives the audience a new way to enjoy your media content. Similar to subtitles in many ways, an interactive transcript is displayed next to the audio/video source. As the audience hears the words being spoken, they see the matching words highlighted or underlined in the transcript. The entire transcript is clickable: users can click on any word and the audio/video starts playing from that exact point. They can also search the transcript and jump to the part that interests them. Interactive transcripts provide users a much richer experience with media content.


1. Click on the word “mammals”: the video jumps to that point and plays from there.
2. When “mammals” is spoken in the video, it is also highlighted in the transcript.
3. Search “fish”: all the parts that mention “fish” are highlighted in the transcript.
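The click-to-seek, highlight, and search behavior above can be sketched with nothing more than time-stamped words. The transcript data below is invented for illustration; a real service returns far richer timing metadata.

```python
# A minimal sketch of interactive-transcript logic, assuming each word
# carries a start time in seconds (the kind of timing data produced when
# text is synchronized with media).
from bisect import bisect_right

# (start_time_seconds, word) pairs for a hypothetical nature video
TRANSCRIPT = [
    (0.0, "Most"), (0.4, "mammals"), (0.9, "live"), (1.2, "on"),
    (1.5, "land,"), (2.0, "but"), (2.3, "some"), (2.7, "fish"),
    (3.1, "and"), (3.4, "mammals"), (3.9, "share"), (4.3, "the"),
    (4.6, "ocean."),
]

def word_at(playback_time):
    """Index of the word being spoken at playback_time, so the player
    can highlight it as the audio plays."""
    starts = [t for t, _ in TRANSCRIPT]
    return max(bisect_right(starts, playback_time) - 1, 0)

def seek_time(word_index):
    """Clicking a word jumps playback to that word's start time."""
    return TRANSCRIPT[word_index][0]

def search(keyword):
    """Start times of every word matching the keyword (case-insensitive,
    ignoring trailing punctuation)."""
    keyword = keyword.lower().strip(".,")
    return [t for t, w in TRANSCRIPT if w.lower().strip(".,") == keyword]

print(word_at(2.4))       # index of the word highlighted at 2.4 s
print(seek_time(1))       # clicking "mammals" seeks to its start time
print(search("mammals"))  # every occurrence of "mammals"
```

A real player simply wires these lookups to the media element's current time and seek controls.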

There are numerous interactive transcript providers; 3Play Media and SpeakerText CaptionBox are among the most popular. We ran a small pilot with 3Play Media’s service: we provided them audio/video files along with the transcripts, and they synchronized the text with the media using automated speech technology. The result has been very satisfying, except we find the price to be a bit high.

Could the price be a showstopper? If you are, like me, a strong believer in open source, you must be wondering whether something similar is available for free. Well, after some digging: voila! Pipwerks has already published APIs for adding captions and interactive transcripts to online videos, and it’s completely open source. Now it’s time to roll up our sleeves and climb onto the shoulders of giants. Thanks to EasyCaption and the Kaltura API, I was able to help an Irish Studies professor build a repository (Irish Stories) to collect and share Irish immigrants’ stories in a more interactive and engaging way. If you are interested, here is an example.  Please feel free to contact me at xiaojing.duan@nd.edu for any technical details.



We love bright ideas!

We love it even more when someone else has one and lets us use it.

Oftentimes you want to make a video to illustrate a point. Ideally it would feel like a discussion: you’re facing the viewer and explaining something to them, not facing a chalkboard or whiteboard with your back turned. It feels natural.


Let’s state right off the bat that this is not our idea. It’s called the lightboard, and it comes from Michael Peshkin at Northwestern University. The Lightboard Home Page is really incredible and gives you all the details you need to make your own: a parts list with numbers and links, diagrams, technical details, and more. It’s open source hardware, so he encourages you to make your own and experiment; just share what you’ve done.

We’re space-constrained: there’s very little available room on campus. While we think this is a great idea, we’ll need to show people how it works in order to get the funding and square footage required to make it a reality. Instead of a 4×8 board, we made one that was 3×4: still big enough to be useful, but small enough that we can find a place to demonstrate it. It’s also cheaper than a full-size unit.

Ordering the glass is pretty easy thanks to the well-detailed specifications. To build the frame, we worked out a design and ordered a bunch of 80/20 aluminum; assembly took a couple of hours.


The box actually said it was an erector set for adults.


We were too excited to worry about the lousy lighting!

There are LED lights underneath the edge of the glass that make the writing really pop. Part of the challenge in this project is illuminating the presenter without adding glare. Since the LED strip came in a 16-foot roll, I had about 12 feet left over, so I took the extra and stuck it on the glass.


I’d say it does a great job of illuminating the instructor.

I also took a few accent lights we had lying around and used them as a key light.


Obviously we need to work on the ambient light…

Not bad for a beta test!

Not bad for a beta test!

Overall, we’re thrilled with the effect. It’s really much more pronounced in person than it appears here. We still have a lot of tweaking and testing to do, but I think we’ve established the feasibility of the system.

Now we start showing this thing off and we’ll see if we can get a 12×15 room to really do this right!

Thanks to the following people:

  • Michael Peshkin at Northwestern University. Obviously!
  • Our colleague David Seidl for bringing it to our attention. He’s really interested in that whole maker-space culture thing. Apparently he saw it as a post on Hack A Day.
  • Tim Cichos in Notre Dame Learning Spaces who helped us engineer the frame.

Lecture Capture

You’re sitting in class, furiously taking notes in an attempt to keep up with the professor’s lecture. Even for the best note-takers, it’s impossible to write down everything. That’s where lecture capture systems come in. This technology allows instructors to record what happens in their classrooms and make it available to students afterward. Lecture audio, video, and visual aids are recorded and synced for on-demand playback. The material is searchable, allowing students to easily find a specific slide or diagram they struggled to understand in class. It has important implications for learning and online course development.



Notre Dame’s Echo 360 Interface

Lecture capture is a valuable educational resource because it moves content beyond the classroom. For example, if students are struggling to understand a concept, the professor can elaborate further on the topic from his or her office and make that video immediately available for students to watch before the next class meeting. This frees up time for professors to make class more interactive. It also promotes greater understanding of material by giving students an opportunity for content review and exploration. While Notre Dame currently provides access only to courses students are enrolled in, other universities organize lecture content so that students can browse material even for courses in which they are not enrolled. By bringing class content online, students get a better grasp of material and an opportunity for interdisciplinary learning.


With online learning becoming increasingly prevalent, lecture capture is an important foundational tool. When appropriate, content created through lecture capture may be used for online course development, which has important implications, including the opportunity for distance education. Notre Dame is currently teaching a distance education course in Santiago, Chile through the lecture capture system, an important project for exploring the potential of lecture capture in online learning. The main challenge with this content is intellectual property rights. At Notre Dame, the intellectual property rights of faculty are protected by the University’s existing policy, which covers the creation of educational material.


Lecture capture is still in its proof-of-concept phase at Notre Dame. The Academic Technologies group is comparing two vendors, Echo 360 and Mediasite, in two different classrooms in DeBartolo Hall. Four classes are currently using the system. The AT team hopes to make a recommendation by the Spring 2014 semester and scale the concept up to an official pilot, working with more faculty.


Please contact cbarbour@nd.edu with any questions or comments regarding lecture capture.



Hue Lights

Imagine waking up to the colors of a sunrise slowly filling the room. Philips Hue lights make this possible. The LED bulbs feature adjustable brightness and the ability to shift colors: using 11 LEDs in 3 different colors, Hue bulbs can create 16 million color combinations. The energy-efficient bulbs are meant to last up to 15 years and use 80% less power than traditional bulbs.

The Hue system connects to a central bridge using the open ZigBee Light Link wireless standard. The bridge can control up to 50 bulbs, allowing them to be managed remotely with the Hue app for iOS or Android. Colors can be chosen from the app’s premade “scenes,” which create lighting effects based on an image. Users can also create scenes from their personal photos, replicating the lighting of those images in their home: dragging the color picker across a photo selects a color from within the image. The app also offers “recipe” options, featuring hues of light designed to complement a state of mind, such as focus or relaxation. A timer can be added to each scene, setting the lights to brighten over time.

In addition to these household uses, Notre Dame is exploring how the Hue lights can be coded for further applications. They can be programmed to change color for various notifications, such as weather conditions or incoming emails and messages. The Hue lights also have the potential to work with the Microsoft Kinect, allowing the lights to shift color as the sensor detects movement.
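As a rough sketch of what that coding looks like: the Hue bridge exposes a local REST API where a bulb's state is set by PUTting JSON to the bridge. The bridge address, API username, and the exact color choices below are placeholder assumptions for illustration.

```python
# A sketch of driving Hue bulbs through the bridge's local REST API.
# BRIDGE_IP and USERNAME are placeholders; an authorized API username
# must first be created on your own bridge.
import json
from urllib.request import Request, urlopen

BRIDGE_IP = "192.168.1.2"  # placeholder: your bridge's address
USERNAME = "newdeveloper"  # placeholder: an authorized API user

def notification_state(kind):
    """Map a notification type to a light state. On the Hue API,
    hue ranges over 0-65535 and bri/sat over 0-254."""
    states = {
        "email":   {"on": True, "hue": 46920, "sat": 254, "bri": 200},  # blue
        "weather": {"on": True, "hue": 12750, "sat": 254, "bri": 200},  # yellow
        "alert":   {"on": True, "hue": 0,     "sat": 254, "bri": 254},  # red
    }
    return states[kind]

def set_light(light_id, state):
    """PUT the state to one bulb via the bridge (makes a network call)."""
    url = f"http://{BRIDGE_IP}/api/{USERNAME}/lights/{light_id}/state"
    req = Request(url, data=json.dumps(state).encode(), method="PUT")
    return urlopen(req)  # the bridge replies with a JSON success/error list

# e.g. turn light 1 blue when an email arrives:
# set_light(1, notification_state("email"))
print(notification_state("email"))
```

A notification script would watch a mailbox or weather feed and call `set_light` whenever an event fires.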

Sifteo Cubes

Siftables are… “sets of cookie-sized computers with motion sensing, neighbor detection, graphical display, and wireless communication. Siftables act in concert to form a single interface: users physically manipulate them—piling, grouping, sorting—to interact with digital information and media. Siftables provide a new platform on which to implement tangible games.”