AirPods: The Most Accessible Headphones

Headphones are an often overlooked but essential piece of equipment for the blind. Using a screen reader in the privacy of your own home, in a quiet room, is a simple affair: you can just use the loudspeaker of your phone or computer. Add some environmental noise, head outside or dare to venture into a coffee shop, and the loudspeaker is no longer functional.

Headphones enable me to use my iPhone both indoors and out and about; I literally couldn't use my iPhone without them. Over the years I have therefore amassed a rather substantial collection, everything from a cheap pair of JVC earbuds up to a rather expensive pair of active noise-cancelling Bose. I am rarely seen without a pair of headphones and have them stuffed in every pocket and every bag.

I am constantly looking for the perfect pair of headphones, the pair that will make using my iPhone that much more accessible. Now I have found that elusive pair: the Apple AirPods.
The AirPods are Apple's truly wireless earbuds: two individual earpieces that fit snugly inside their own charging case.

They solve many of the problems a blind user has with headphones. Cables. Cables are a nightmare. Get them tangled in your pocket? Try untangling them when you can't see. It just takes that much longer, to the point where if I quickly need to access my phone I would prefer not to. The time taken to untangle the headphones ends up being greater than the time I needed to use the phone. So often I would either ignore a notification and vow to take a look when I got home, or place the phone close to my ear to listen. After all, with a screen reader the only way you get privacy is by using headphones. Imagine if all your texts were read aloud. That embarrassing one from your friend is even more embarrassing when everyone in the lift hears it too!

So the wireless nature of the AirPods truly makes them more accessible. I can just quickly and easily slip them in. No cables to untangle: just flip the lid of the charging case and they are in my ears for that quick check of my phone.

This brings me to another of my favourite accessibility features: using just one of the AirPods. When you rely on sound to understand what is happening around you, having one ear focused on the screen reader frees up the other for environmental noise. Handy when walking down the street, and handy at home or in a meeting. Previously, if I received a notification in a meeting and hadn't worn headphones upon entering, I was left with three options: ignore the message, go through the messy untangling process, or interrupt the flow of conversation by having everyone hear my notifications through the loudspeaker. Now I have a fourth option: just slip in one AirPod and I am away.

While out and about, another side effect of being blind is generally having only one hand free. To navigate around I use either my guide dog or a long cane. This leaves me no way to untangle headphones, so I would often go for the loudspeaker approach, gambling on the possibility of dropping my phone as I attempted to juggle it around with one hand.

Now I just slip one AirPod out of the case, pop it in my ear and activate Siri.

There is one other fantastic bonus of using a single earpiece: I double the battery life. Not to mention that whenever I remove them from the case they are fully charged.

The AirPods have truly increased the accessibility of my iPhone by enabling me to use it in more everyday situations. I no longer have to remove myself from a social space to use my phone; the AirPods are increasing my ability to be social.

They truly are the most accessible headphones.

IBM & CMU assisting in mobility

Mobility for the visually impaired is always difficult, from tasks as simple as heading to Starbucks for a coffee to jumping on a bus or grabbing a taxi. Let's take the first example: heading to Starbucks is certainly challenging when you are unable to see, but what about when you enter the store? Without sighted assistance, locating the counter or indeed finding somewhere to sit is challenging.

Therefore, any technology that aims to improve any of these mobility issues is always a step in the right direction. At the risk of this blog turning into IBM fandom, here is yet another project IBM are working on.

Along with Carnegie Mellon University, IBM have developed and open-sourced a smartphone application that can help you move from point A to point B.

The app, called NavCog, uses either voice or haptic feedback to guide you, and currently relies on beacons to work out where you are; a rough sketch of the general idea follows below.
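I have no visibility into NavCog's internals, but the core pattern of beacon-assisted guidance is straightforward: the phone continuously ranges nearby beacons, works out which one is closest, and announces the waypoint associated with it. Here is a minimal sketch using Apple's iBeacon ranging APIs; the UUID and the waypoint names are placeholders of my own, not NavCog's.

```swift
import CoreLocation
import AVFoundation

// A minimal sketch of beacon-assisted guidance, not NavCog's actual code.
class BeaconGuide: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let speech = AVSpeechSynthesizer()

    // Placeholder UUID; a real deployment uses whatever its beacons broadcast.
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "venue-beacons")

    // Hypothetical mapping from a beacon's (major, minor) pair to a waypoint.
    let waypoints = ["1-1": "main entrance", "1-2": "reception desk"]

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    // Called roughly once a second with the beacons currently in range.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        // Pick the beacon with the best distance estimate.
        let candidates = beacons.filter { $0.proximity != .unknown }
        guard let nearest = candidates.min(by: { $0.accuracy < $1.accuracy }),
              let waypoint = waypoints["\(nearest.major)-\(nearest.minor)"] else { return }
        // Speak the cue; a haptic tap pattern could be substituted here.
        speech.speak(AVSpeechUtterance(string: "Approaching \(waypoint)"))
    }
}
```

In a real system the waypoint table would be a full venue map and the readings would be smoothed before speaking, but the flow of range, resolve, announce is the essence of it.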

It is great to see the combination of beacons and haptic feedback used to aid navigation. Over four years ago I was pitching to just about every GPS manufacturer that this could be an interesting direction to head in. My ideas were vindicated when Apple announced the Apple Watch and it used the exact same haptic feedback system I had been proposing. Furthermore, the use of beacon technology for navigation is exactly what I pitched to British Airways a couple of years ago.

I proposed that beacons in Terminal 5 could be used not only to direct potential customers to shops, restaurants and gates, but also to aid visually impaired customers in navigating the terminal.

It is truly great to see all these ideas put together and finally implemented. We now just need a larger rollout of beacon technology!

This system could also be adapted to solve the indoor navigation problem. I was speaking with Google a year or so ago about how Project Tango could be utilised to achieve this; I imagined a haptic feedback device that could assist with real-time indoor navigation. After all, my guide dog may be able to avoid obstacles, but an empty chair is an obstacle to my guide dog!

Artificial Intelligence and accessibility

Over the past couple of weeks I have been fortunate enough to be exposed to some fantastic technology and ideas. Attending WiRED 2015 kickstarted my thought process on how artificial intelligence could be applied to accessible technology.

While attending the conference there were two ideas I wanted to pitch to people: emotion detection to facilitate social situations for the visually impaired, and facial recognition. I felt both these technologies could greatly improve an individual's ability to socialise. After chatting to a few people and pitching my ideas on how these systems could work from design, implementation and marketing standpoints, I managed to interest a few companies and institutions.

There is fantastic scope for these technologies and their assistive ability. I concentrated on the emotion detection system initially, as I feel it could have the greatest and speediest impact. I have framed the idea as a product for all, rather than a product specifically for the visually impaired, as I believe this to be key for mass market adoption, which in turn will significantly reduce the price and so lower the biggest initial barrier to any accessible product: price.

I am yet to find a partner to work with on facial recognition, but I recently read an article highlighting that IBM are working on this too. It really does seem, as time goes on, that IBM and I could be a great match!

I did also have a grander idea on accessibility while at the conference, and was delighted to see it referenced, yet again, by IBM: cognitive assistance. I have been batting around a few ideas on how accessibility could be personalised. After all, there are nuances in an individual's accessibility needs, so why not make the solutions just as nuanced? This could definitely be achieved through a cognitive accessibility assistant that has the capacity to learn.

An accessible system that is capable of learning could aid in tasks such as reading. It would be able to identify how an individual likes to read information and present it in that fashion. A nice example would be skim reading: being able to learn how to read a specific document for certain contextual references would be fantastic. This would certainly have assisted me greatly while at university; skim reading is absolutely a skill I miss.

I continue to be excited by what technology is enabling and how I can become part of the revolution of accessibility.

TICKR X & The 7 Minute Workout Accessibility

I am always looking for simple and effective ways to make workouts more accessible. It can often be difficult to monitor and track workouts, so I was very excited when Wahoo sent along the TICKR X. The TICKR X is a very capable device that can track a whole multitude of stats, from heart rate (HR) to body movement and more, but the data point I was most interested in was rep counting.

Utilising the TICKR X along with the Wahoo 7 minute workout app on my iPhone, all my reps could be counted automatically. No more writing it all down in an app afterwards, assuming I could even remember how many reps I performed of each exercise.

What is the 7 Minute Workout?

The 7 minute workout is a collection of 12 body weight exercises that you can complete anywhere, with no equipment needed. It has been shown to give results comparable to longer running or weights sessions. It comprises 12 work sets of 30 seconds each, followed by 10 seconds of rest. It is a quick and highly effective HIIT workout.

The Device

The TICKR X is one of those rare devices a blind user can take out of the box, configure and use without sighted assistance. You click the device onto one side of the strap, wrap the strap around your chest and clip it into the other side, ensuring the TICKR X is positioned in the middle of your chest. Do not worry if the device doesn't touch your skin: unlike other HR trackers, the HR sensors are located in the strap, not in the TICKR X unit itself.

To turn the device on, tap the TICKR X a couple of times; you are then ready to pair it with your phone, which is done inside the app.
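As an aside for the technically curious: the TICKR X talks to the phone over Bluetooth LE. Wahoo's rep-counting data uses their own protocol, but heart rate straps like this typically advertise the standard Bluetooth Heart Rate service, so a sketch of how an iOS app might discover such a strap could look like the following. The class and names are my own, purely illustrative.

```swift
import CoreBluetooth

// An illustrative sketch of finding a BLE heart rate strap; this is not
// Wahoo's code, and it does not cover their proprietary rep-counting data.
class HeartRateScanner: NSObject, CBCentralManagerDelegate {
    let heartRateService = CBUUID(string: "180D") // standard Heart Rate service
    var central: CBCentralManager!
    var strap: CBPeripheral? // keep a strong reference or the connection drops

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    // Scanning may only begin once Bluetooth reports powered on.
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [heartRateService], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        strap = peripheral
        central.stopScan()
        central.connect(peripheral, options: nil)
        // Next steps: discover service 180D and subscribe to the Heart Rate
        // Measurement characteristic (2A37) for live readings.
    }
}
```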

The App

The 7 minute workout app is highly accessible; Wahoo have done a fantastic job of labelling all the controls appropriately. It is simple to navigate the app and start an activity, and as the reps are counted automatically that is essentially all you have to do to use it. Start a workout and read your results; there is no manual inputting, it's all taken care of.

The app also uses a lot of audio feedback. For example, the different exercises and the start and rest periods are read aloud. It makes for a nice accessible experience.

How Does It Perform?

When I began my workout I was surprised it worked; it was a real wow moment as I heard the reps count up as I went about completing the workout. I was quickly put in my place on correct form, as the TICKR X wouldn't count reps with bad form. No longer can I cheat and do quick reps with poor form; I am now forced to go lower rep with correct form. While this affects my rep count, it does mean I am actually performing the exercise correctly! This was evident in push-up rotations, as the TICKR X wouldn't count a rep if I didn't perform the appropriate amount of rotation, which is common when going for speed. This correction of form isn't limited to simply not counting a rep: when in the plank position, if you begin to wane the app notifies you to watch your form!

Another slightly confusing aspect of the counting appears in exercises with an up-and-down movement, for example push-ups and triceps dips. The TICKR X counts the up and the down as two reps, whereas typically I would count each up-and-down as one rep. In itself this isn't actually an issue: the movements are counted consistently, so you can still compare your results and see improvement. If anything, it just makes it look like you can do twice the number of push-ups you used to be able to!
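Out of curiosity about why the up and the down each register, here is a toy version of movement counting, using the phone's own accelerometer rather than the strap. Wahoo's actual algorithm is proprietary and far more sophisticated; the thresholds here are invented for illustration. Each crossing of the threshold counts as one movement, which is exactly why an up-and-down exercise yields two.

```swift
import CoreMotion

// A toy movement counter, not Wahoo's algorithm. Thresholds are invented.
class RepCounter {
    let motion = CMMotionManager()
    var movements = 0
    private var inMotion = false

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0 // sample at 50 Hz
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let accel = data?.userAcceleration else { return }
            let magnitude = abs(accel.z) // vertical acceleration, in g
            if magnitude > 0.3 && !self.inMotion {
                self.inMotion = true
                self.movements += 1 // one threshold crossing = one movement
            } else if magnitude < 0.1 {
                self.inMotion = false // hysteresis stops noise double-counting
            }
        }
    }
}
```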

For me the real wow moment of the app came upon completing a full workout. When the workout is finished you are given a table with the rep results for each exercise, along with the HR for each exercise. This granularity was a fantastic reporting decision. Typically an average HR is given for a whole workout, whereas Wahoo have chosen to give you the HR breakdown of each exercise. This is important, as you are quickly able to identify the exercises you should be pushing harder on. I was able to see that I was sandbagging it a little on what I consider to be the easier exercises; perhaps unconsciously I was using them as a little rest period.

Like the other sections of the app, the reports are very accessible. Wahoo really have done a great job with accessibility for the blind.

Any Bad Points?

Yes: the in-app purchases. I was a little disappointed that a premium-priced piece of hardware like the TICKR X required multiple small in-app purchases to unlock full app functionality. It must be noted that this applies to the 7 minute workout app; there are other apps from Wahoo that work with the TICKR X, but I was focused on utilising the TICKR X for rep counting.

Overall

The TICKR X is wonderfully accessible with the accompanying 7 minute workout app. I would highly recommend it for a blind user. Its rep counting keeps you honest, and the reports highlight where you should be pushing harder. It is a definite buy for any blind user looking for an accessible and quick cardio workout!

IBM Serendipity

Two years ago, in the middle of my degree, I went to meet with IBM HR. The idea was to have a chat with them about my vision of an inclusive and accessible world through technology.

IBM stand at a fantastic point within the technology sector, where they have the ability to touch a huge number of organisations in wildly different fields. It was this very point that made me think IBM and I could be a perfect match.

There is a need for all technology to be inclusively designed, to enable everyone to have universal access: from mobile devices, to the Internet of Things, to transport. Indeed, it was IBM's Smarter Planet initiative that made me believe there was a way to make the world accessible through the advancement of new technologies.

I pitched to HR that I would be a wonderful fit as an accessibility evangelist, working with all manner of partners and focusing on how technology could be made inclusive: from advising on human interface interactions that have not only visual elements but auditory and haptic ones too, to communicating complex information in new and interesting ways. I continued by highlighting that the opportunity to interact with clients at an early stage would aid a universal design approach across all technology.

Indeed, it is this early-stage approach that has given me great success with Kickstarter. I often find projects in their very early stages and communicate with the team on how minor adjustments could improve accessibility, be it the addition of audible tones or changing a UI to take a blind user into account. I have also had great success with Fitbit and Drop scales: with both companies I advised on how to communicate information in different forms to increase accessibility. The added benefit of this change was a greater understanding by all users, not just those who cannot see.

I imagine a world where, as the next 1 billion people and 10 billion devices come online, there is no barrier to interaction, because these products and services have taken a universal approach from the beginning. It is also worth highlighting that this approach benefits all users, not just those who rely on accessibility. For example, a low vision user may be aided by high-contrast or night mode colour themes; these exact features also assist any individual using the device at night. The route to a truly intuitive and simple design can also be found by considering the needs of a blind user: if you can make a user interface or product that a blind user can operate, it truly is simple and intuitive.

It was during this conversation that I highlighted how important this approach is to all services and products. There should never be an assumption that a particular product or service will not be used by a particular demographic. To highlight this, I mentioned how I had used RunKeeper to learn to run solo outdoors. It would have been easy for RunKeeper to assume a blind person would never use their app; after all, what use would it be to a blind person? But thankfully they made no such assumption, and I was able to achieve what was once perceived as impossible: learning to run solo outdoors.

I continued by saying that this is why I wanted to work with IBM: I wanted to make sure every service and every product across all sectors became accessible. Just imagine the impact this could have, given the number of partners and clients IBM work with. With accessibility an assumed standard across the board, just imagine the impossible things that could be achieved in the next few years.

During the rest of the conversation, IBM HR mentioned they could imagine me starring in an IBM commercial, demonstrating what accessible technology can enable people to do. Well, fast forward two years and that opportunity arrived. IBM gave me a call and asked if I would like to be featured in a little video. I of course said yes, and the result is the video below.

In those past two years I have continued to try to make the world a more accessible place: advocating for universal design, working with many tech firms, and making countless public speaking appearances at large tech events. But I still feel I could do so much more. There is still a need for that evangelist role, and I am still a great fit. There is a real need to ensure universal design across the board; when that goal is achieved, countless people will be enabled to achieve the impossible.

Accessibility of Facebook on the desktop

Maximising the accessibility of a website is always of great importance. As well as developers improving accessibility at the source, there is a plethora of tools available that increase accessibility client-side. I always encourage the use of client-side tools, as they make for a richer, more seamless experience.

In the quick look video below I demonstrate the capabilities of Hacker Vision for Google Chrome. Hacker Vision is an extension capable of intelligently inverting websites, by which I mean it will only invert text and background elements rather than images. This is a vast improvement over, say, the invert feature built into Apple hardware: because the built-in invert operates at the hardware level, the entire screen is inverted, resulting in elements such as images being inverted too. Hacker Vision negates that and makes for a far more pleasant web experience for the low vision user.

The video also demonstrates the different forms of zoom available to the user, and quickly compares window zoom with font scaling. I believe font scaling to be incredibly powerful on the client side, and it is something I will touch on in a subsequent ThoughtCast.

I chose to demonstrate these features with Facebook, mainly because Facebook is often cited as having poor accessibility. I do not believe this to be true; a fairer assessment would be to say that Facebook is doing a reasonable job, and that it plays relatively well with client-side tools. However, it must be noted that these client-side solutions will work on any website, and in the case of Hacker Vision can even be tailored to invert only the websites you wish. Therefore a website that already has a dark theme need not be inverted.

VoiceOver on the Apple Watch

From 9to5Mac

Like Apple’s other products, Apple Watch will have a series of key accessibility features.
To access Accessibility Settings on the fly, users will triple-click the Digital Crown.
The Apple Watch will have a VoiceOver feature that can speak text that is displayed on the screen. Users will be able to scroll through text to be spoken using two fingers. VoiceOver can be enabled either by merely raising a wrist or by double tapping the display.
Users will also be able to zoom on the Apple Watch’s screen: double tap with two fingers to zoom, use two fingers to pan around, and double tap while dragging to adjust the zoom.
There will also be accessibility settings to reduce motion, control stereo audio balance, reduce transparency, switch to grayscale mode, disable system animations, and enable bold text.

Great to see confirmation that the Apple Watch will support VoiceOver. From the original demo I had hoped accessibility would be baked in. I am looking forward to another way to interact with my smartphone and the new possibilities that will enable, particularly the haptic navigation features, something I have been asking wearable companies to add for over two years.

Yearly Checkup

For the past 10 years I have been registered blind. I do retain a usable level of vision, but it degrades over time. My loss of vision generally goes unnoticed, thanks to the small adaptations I make on a daily basis.

The only time I am presented with a record of how much vision I have lost is the yearly checkup, where the routine eye tests highlight the degeneration. Saturday was this year's appointment.

Upon entering the examination room I sat down and took my first look at the eye chart. WHAT! I can't even read the top line; this is a notable reduction, even for me. After letting my eyes settle to the change in light I could make out the top line, and we moved on.

A quick test with the little torch showed I had a prescription change. As the optician adjusted for the change, we progressed to the next eye chart.

Again I was struck by the change in my vision: I was unable to read anything on the chart, not due to my prescription but due to contrast and lighting levels. I couldn't make out the letters, as they were simply blending into the background. After a little fiddling we found my new prescription, and we had a little chat about the state of my vision.

There has been a substantial pigment change, explaining my loss of contrast and colour definition. I had begun to notice this in my day-to-day life, mainly while using the computer. Around 80% of my computer usage now takes place under inverted colours. This makes a few applications tricky to use but overall allows me to stay proficient.

To date the adaptations I am making seem to be keeping pace with the loss of vision. So I am not too worried about the recent pigment changes. I will keep adapting and moving on.

Good News

After my previous post on the lack of funding for disabled students, I sent an email of complaint to my local college.

I received a response instantly: "Could you please come into the college so we can discuss this?" So on Monday I had a meeting with Mary from the disability department. We discussed what had happened on my previous visit; it turns out the finance department were mistaken.

There was indeed funding for accessibility. We discussed my needs and I highlighted that I simply needed an iPad. The college had their own idea: some custom Dolphin devices, two of them, one of which cost £1,600. I explained that this solution was incredibly expensive and that the iPad would achieve everything I needed at a smaller price point.

After around 30 minutes of negotiation I was left with the answer, “I will speak to my manager”. This morning I received an email giving the OK for an iPad. YAY!!!

The cherry on top? I bought an OpticBook 3600 on eBay for £38! The scanner arrived today; a quick test and it works like a charm.

So the previous cost has been reduced to an acceptable £38. This is far lower than even my dream scenario.

Now it's time to get back to compiling those post-processing tools.

College Bound?

After my interview last week for college I was told to nip in to talk with Learner Services to discuss financial support for disabled students.

Arriving at college I headed to Learner Services; unsurprisingly, they were unable to help and sent me over to finance. This did not go as planned…

After a five-minute conversation, which included going round in numerous circles, the verdict was simply:

“We do not guarantee any financial support to disabled students.”

But I need access to the books to complete my course, surely you will help fund that?

“We cannot guarantee that.”

The cherry on the cake? The course will have been running for over a month before they even make a decision. So I would have to start the course sans books, and I MAY get them at some point.

A little angry and disappointed, I headed home. I quickly got on task and began looking for a book scanning service. I shot off a few emails and waited for a response. After 20 minutes I got bored of waiting and decided to start ringing around.

First port of call: Action for Blind/RNIB. They were incredibly helpful and gave me the contact information for the Leeds University transcription service. After explaining my needs we got down to the nitty gritty of cost: £7.42 per page. For the roughly one thousand pages I require, that comes in at a whopping £7,420.

More than a little out of my price range. Shocked at the price, I returned to online research, and after a few calls I managed to find a private sector service that would do it for 30p a page plus VAT.

A little math shows it would cost (the scanning being roughly 1,000 pages × £0.30 = £300, plus VAT at 17.5% = £352.50):

Books: £70
Scanning: £352.50
iPad: £429
Total: £851.50

This is before any other supplementary scanning costs for other documentation. I simply cannot afford to pay that much for 2 books.

The solution? Well, thanks to my previous role I am reasonably tech savvy, so onto the shopping list goes a book scanner and a whole host of post-processing tools. The money saved will be negligible, but it will allow me to run a service at cost for other users who need books scanned at a reasonable price.