Blog Category: Degree Project


One last Deaf Device Find & Share

By Erika Goering,

As evidenced by the unusually large number of find & share posts I’ve written this semester, it’s pretty clear that the world is looking for technological solutions to Deaf/hearing language barriers.

Here’s one last one that I found the other day:

It uses the same idea I had for directionality: each person in the conversation can easily read the text that’s directed toward them.
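Just to make the directionality idea concrete, here’s a tiny SwiftUI sketch of how I picture it working (this is my own illustration with placeholder names, not anything pulled from the actual device): the far side’s text is drawn rotated 180°, so whoever is sitting across the table reads it right-side up.

```swift
import SwiftUI

/// A minimal sketch of the "directional" layout idea:
/// the person across the table sees their half of the
/// conversation rotated 180°, so both readers see
/// upright text at the same time.
struct DirectionalConversationView: View {
    let messageForOtherPerson: String
    let messageForMe: String

    var body: some View {
        VStack {
            // Top half faces the person sitting across from me.
            Text(messageForOtherPerson)
                .font(.title2)
                .rotationEffect(.degrees(180))
                .frame(maxHeight: .infinity)

            Divider()

            // Bottom half faces me.
            Text(messageForMe)
                .font(.title2)
                .frame(maxHeight: .infinity)
        }
        .padding()
    }
}
```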

It’s even got a movie mode!

The unique thing about the Beethoven phone is that it also has functionality for blind users! There’s braille output for situations where reading a screen isn’t an option (or if, of course, the user is both blind and Deaf).

Braille for the blind!



Week 12: Refining Presentation

By Erika Goering,

I’ve spent some time adding detail and depth to my video. I’ve given the personas/scenarios visibly different environments (no more bland white background!) and different colored iPhones too.

I’ve also got some views of the app/phone and watch peripheral from an angle that I didn’t use before, so that adds some visual interest to the whole thing.

I’ll have to go into After Effects soon so I can add a voiceover (and captions) to everything.



Week 11: Relay Animation, Round 1

By Erika Goering,

This week was all about animating my scenarios to show exactly what Relay does and how it works. It was a relief to finally be able to demonstrate how these things were moving around in my head! There’s a lot of sliding elements and a few rotations to show how the whole thing works. It’ll help even more to have context shots that show how the device works directionally and spatially between people.

Here’s what I have so far. Keep in mind, this is just a rough draft of my video. But it gets the point across for now.

Directly exporting from Keynote screwed up the timing (for example, screens should transition quickly after selecting a mode), so I think I’m going to make some minor tweaks in After Effects this weekend. Doing that will allow me to add a voiceover too! (It’ll be captioned, of course.)

Next steps:

  • context shots!!
  • video timing tweaks (and narration voiceover/captions)
  • poster design
  • change the movie in the last scenario to something more appropriate for my new persona (she’s a kid, after all)



Week 10: Final UI Refinements

By Erika Goering,

Responding to the feedback from Monday, I spent some time exploring the colors and typefaces, with the intent to make everything feel more utilitarian and to give the whole thing more of a balance between masculine and feminine qualities. (The logotype was pretty darn feminine on its own, so I had to do something to offset that and make it less “cute.”)

I added a slideout menu on the side (that’s what those colored bars are), which will just slide the main menu screen on top of the current screen. That way, there’s no need for a back button of any kind on any device, iOS or Android.
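For anyone curious how that interaction could work under the hood, here’s a rough SwiftUI sketch (all the type names are my own placeholders; the real app isn’t built yet): the main menu slides in over whatever screen is showing, so dismissing the menu takes the place of a back button.

```swift
import SwiftUI

/// Rough sketch of the slide-out menu idea: the main menu
/// slides in on top of the current screen, so there's no
/// need for a dedicated back button.
struct RelayRootView: View {
    @State private var isMenuShowing = false

    var body: some View {
        ZStack(alignment: .leading) {
            VStack {
                // A stand-in for whatever screen the user is currently on.
                Button("Menu") {
                    withAnimation { isMenuShowing = true }
                }
                Spacer()
                Text("One-on-one mode")
                Spacer()
            }

            if isMenuShowing {
                // The main menu slides in over the current screen;
                // dismissing it takes the place of a back button.
                MainMenuView {
                    withAnimation { isMenuShowing = false }
                }
                .frame(width: 280)
                .frame(maxHeight: .infinity)
                .background(Color.white)
                .transition(.move(edge: .leading))
            }
        }
    }
}

// Placeholder menu, just to keep the sketch self-contained.
struct MainMenuView: View {
    let dismiss: () -> Void

    var body: some View {
        List {
            Button("One-on-one", action: dismiss)
            Button("Group", action: dismiss)
            Button("Movie", action: dismiss)
        }
    }
}
```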

I also added the ability to scroll through movie captions, so users can go back and catch something they might have missed. This is particularly handy if the user is lipreading, and someone onscreen mumbles or covers their mouth (or is facing away from the camera, etc.).
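Here’s a minimal sketch of that scroll-back behavior, again just an illustration with made-up caption lines: captions accumulate in a list that auto-scrolls to the newest line but still lets the user scroll up through the history.

```swift
import SwiftUI

/// Sketch of the scrollable caption history: instead of showing
/// only the newest line, captions accumulate in a list so the
/// user can scroll back and catch anything they missed.
struct CaptionHistoryView: View {
    // Hypothetical caption lines, newest last.
    @State private var captions: [String] = [
        "So how was the movie?",
        "[mumbling] ...couldn't believe the ending.",
        "Want to grab coffee after this?"
    ]

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                LazyVStack(alignment: .leading, spacing: 8) {
                    ForEach(Array(captions.enumerated()), id: \.offset) { index, line in
                        Text(line)
                            .id(index)
                    }
                }
                .padding()
            }
            // Jump to the newest caption when one arrives, but the
            // user can still scroll up through the history at any time.
            .onChange(of: captions.count) { count in
                withAnimation { proxy.scrollTo(count - 1, anchor: .bottom) }
            }
        }
    }
}
```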

 

And the peripherals have been updated to match the current style.

 

Next steps:

  • acquire someone’s iPhone and take some context shots!
  • start putting together some awesome Keynote animations!
  • start thinking about (and working on) the big poster



Week 9: Wrapping Up the Brand & UI

By Erika Goering,

In my branding/naming adventure, I decided to replace my Sign•ify idea with the name Relay, simply because it makes more sense. As Jessi mentioned a while ago (she was a HUGE help in naming my app), Sign•ify sounds like I’m making an app that converts speech into sign language, and not the other way around. So I dropped that idea and Relay was born.

Relay is my favorite choice because it embodies the back-and-forth nature of a conversation, as well as the act of translating back and forth between languages. Also, as the Deaf community knows, a relay service is one that involves a third party to translate/interpret telephone conversations between text and speech for Deaf or HoH people and the hearing people they talk to. My app idea is like a relay service for real life, without the extra person.

Here are some of the directions I explored:


Gist was another favorite of mine, because there is no way to directly translate between ASL and English. Some things will inevitably be lost in translation, but the important thing is that it gets the point across. (I also toyed with the idea of referring to the gesture-based armband as the Gisture band… but that started to get a bit silly, I think.)

Lingo was another idea that had some potential. It conveyed the idea of language, obviously, but it also conveyed portability and ease (thus the arrows and the “go” part of the word). This idea wasn’t the strongest by far, so I flip-flopped between Relay and Gist for a while. I think Relay carries the most meaning of the three, so that’s what I’m going with.

Along with my new branding, I’ve changed the colors I’m using. Goodbye, trendy design colors. Hello, tasteful complementary colors!

In other news, I’ve decided to bring back the watch peripheral for a couple of reasons:

  • The armband for gesture-based ASL recognition needs to be higher up the arm, to detect elbow movement as well as muscle flexing. This will make it difficult to include a display directly on the armband in a comfortable and usable location. (Who wants to look at their elbow all day?)
  • I’ve decided to make the watch peripheral optional, because I’m also adding Google Glass (or another head-mounted display) support. This way, the user can use whichever peripheral he or she is more comfortable with to view the translated messages. A head-mounted display also allows the user to maintain eye contact during the conversation, without looking down at a watch every time someone speaks.
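To show what I mean by “whichever peripheral,” here’s a little sketch that treats the watch and a head-mounted display as interchangeable outputs (these types are my own placeholders; no real hardware API is implied):

```swift
import Foundation

/// Sketch of treating the watch and a head-mounted display as
/// interchangeable output peripherals. The names here are
/// placeholders; none of this hardware exists yet.
protocol TranslationDisplay {
    func show(_ translatedText: String)
}

struct WatchDisplay: TranslationDisplay {
    func show(_ translatedText: String) {
        print("Watch face: \(translatedText)")
    }
}

struct HeadMountedDisplay: TranslationDisplay {
    func show(_ translatedText: String) {
        print("HUD overlay: \(translatedText)")
    }
}

struct RelaySession {
    /// Whichever peripheral the user is more comfortable with.
    var display: TranslationDisplay

    func deliver(_ translatedText: String) {
        display.show(translatedText)
    }
}

// Example: a user who prefers to keep eye contact picks the HMD.
let session = RelaySession(display: HeadMountedDisplay())
session.deliver("Nice to meet you!")
```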

Here’s what the two peripherals look like:

Next steps:

  • Start working on presenting this stuff! (in a poster and Keynote)
  • More context shots!



Week 7: Branding is hard

By Erika Goering,

I started applying my visual direction to all of my screens. Some things were brought up in critique, such as the emoticons being redundant for face-to-face communication.

 

I’ve been wracking my brain, trying to figure out branding. It’s been a really difficult process, and I can’t figure out how to stop overthinking it. (I’m overthinking how to stop overthinking, resulting in my mind imploding.)

 

After a few days of driving myself crazy, I finally came up with a few names:

  • sign•ify (or signify)
  • lingo
  • gist



Week 6: Wireframes & Visual Exploration

By Erika Goering,

Wireframes

One-on-one mode can utilize both speech and ASL input from the Deaf/HoH user.

 

Group mode can also be used in a one-on-one conversation from afar, such as across a long table in a conference room.

 

 

Visual Exploration

This is a textured look that isn’t necessarily skeuomorphic, but it still gives a sense of depth and tactility.

 

I was going for a whimsical look here, but it didn’t really work out the way I had hoped. Oh well. It doesn’t feel right for this project anyway.

 

As I passionately talked about in a previous post, flat UI design is the way to go. It’s honest and true to digital media. It’s the practical, “no BS” approach, and it just plain feels right.

Next steps:

  • Branding!
  • Refine visual design
  • In-context shots (to show people what’s been in my head this whole time)



Find & Share: More Peripherals!

By Erika Goering,

I’m getting rid of my infrared idea for the Deaf watch. Why? Because of this thing:


It’s an armband that can sense your gestures and finger movements by how your muscles flex. (We live in the future!)

Watch the demo video to see its true awesomeness.



Week 5: New Considerations: Hardware Peripherals

By Erika Goering,

An idea that I had originally been hesitant to try was introducing theoretical hardware into the project in order to make technology less invasive. By developing something wearable, I’m starting to make the technology less of a burden and a more normal part of the person’s daily life.

Another issue that was raised was the burden of requiring multiple hearing users to download the phone app in order to use “multi-device” mode. My challenge, then, became finding a way to eliminate the need for multiple devices. So I’m getting rid of “multi-device” mode and replacing it with a kind of broadcast mode, where the Deaf user is the only person with the app and hardware: the hardware handles input, and the app displays the translated text output for everyone to see. (Sketches of this are coming soon!)
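To make the difference from multi-device mode concrete, here’s a rough sketch of how I picture broadcast mode (every name here is a placeholder, since none of this is built): sign input comes from the armband on the Deaf user’s side only, and the translated English text is simply shown full-screen on that one phone for everyone else to read.

```swift
import Foundation

/// Sketch of broadcast mode: only the Deaf user carries the app
/// and hardware. The armband provides sign input, and the phone
/// shows the translated text for everyone around the table to read.
/// All of these types are placeholders for illustration.
struct ArmbandGestureFrame {
    let recognizedSign: String
}

struct BroadcastMode {
    /// Stand-in for the real ASL-to-English translation step.
    func translate(_ frames: [ArmbandGestureFrame]) -> String {
        frames.map { $0.recognizedSign }.joined(separator: " ")
    }

    /// One phone, one screen: the output is rendered large enough
    /// for every hearing participant to read, so nobody else needs
    /// to install anything.
    func broadcast(_ frames: [ArmbandGestureFrame]) {
        let sentence = translate(frames)
        print("FULL-SCREEN TEXT: \(sentence)")
    }
}

let mode = BroadcastMode()
mode.broadcast([
    ArmbandGestureFrame(recognizedSign: "COFFEE"),
    ArmbandGestureFrame(recognizedSign: "ANYONE"),
    ArmbandGestureFrame(recognizedSign: "WANT")
])
```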

I’ve also updated my scenarios:

  • One-on-one mode: Alison, who is hard-of-hearing, orders food at a noisy restaurant. The waiter asks her a question, such as, “Do you want fries or mashed potatoes?” Because of the app, she can understand and reply to the question with ease!
  • Multi-user mode: Felix, who is Deaf, is hanging out with a group of hearing friends at a coffee shop. They have a rather energetic, quick-paced chat about school, and they can all understand each other without the need for lip-reading or slowing down (they’re all snacking and drinking coffee anyway, which makes lip-reading more difficult by nature).
  • Movie mode: An as-yet-unnamed user goes to a movie theater without open captions. She is able to skip the awkward process of signing out a pair of closed-captioning glasses at the front desk, and instead queues up the movie’s subtitles on her phone. When the movie starts, she hits the “sync” button and places the phone in her cup holder for easy viewing.
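I haven’t nailed down exactly how the “sync” button works under the hood, but one simple assumption (mine, not a finished spec) is that tapping sync marks the moment the movie starts, and captions are then shown by elapsed time from that mark:

```swift
import Foundation

/// Sketch of movie mode's "sync" button under a simple assumption:
/// tapping sync marks the movie's start time, and captions are then
/// displayed based on elapsed time from that moment.
struct Caption {
    let startTime: TimeInterval  // seconds from the start of the movie
    let text: String
}

struct MovieSync {
    private(set) var syncDate: Date?

    /// Called when the user taps "sync" as the movie begins.
    mutating func sync() {
        syncDate = Date()
    }

    /// The caption that should be on screen right now, if any.
    func currentCaption(from captions: [Caption], at now: Date = Date()) -> Caption? {
        guard let syncDate else { return nil }
        let elapsed = now.timeIntervalSince(syncDate)
        return captions.last { $0.startTime <= elapsed }
    }
}

// Example usage with a couple of hypothetical caption cues.
var movie = MovieSync()
movie.sync()
let cues = [
    Caption(startTime: 0, text: "[dramatic music]"),
    Caption(startTime: 4.5, text: "Where were you last night?")
]
print(movie.currentCaption(from: cues)?.text ?? "(no caption yet)")
```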

Next step: branding and visual exploration. I kind of put that stuff on hold so I could focus on exploring the hardware a bit more. It’s time to get back on track!
