IBM Serendipity

Two years ago, in the middle of my degree, I went to meet with IBM HR. The idea was to have a chat with them about my vision of an inclusive and accessible world through technology.

IBM stand at a fantastic point within the technology sector, where they have the ability to touch a huge number of organizations in wildly different fields. It was this very point that made me think IBM and I could be a perfect match.

There is a need for all technology to be inclusively designed, to enable everyone to have universal access: from mobile devices, to the internet of things, to access to transport. Indeed, it was IBM’s Smarter Planet initiative that made me believe there was a way to make the world accessible through the advancement of new technologies.

I pitched to HR that I would be a wonderful fit for an accessibility evangelist role, working with all manner of partners and focussing on how technology could be made inclusive: from advising on human interface interactions that include not only visual elements but auditory and haptic ones, to communicating complex information in new and interesting ways. I continued by highlighting that the opportunity to interact with clients at an early stage would aid a universal design approach across all technology.

Indeed, it is this early-stage approach that has brought me great success with Kickstarter. I often find projects in their very early stages and talk with the team about how minor adjustments could improve accessibility, be it the addition of audible tones or changing a UI to take a blind user into account. I have also had great success with FitBit and Drop scales. With both companies I advised on how to communicate information in different forms to increase accessibility. The added benefit of this change was a greater understanding by all users, not just those who cannot see.

I imagine a world where, as the next 1 billion people and 10 billion devices come online, there is no barrier to interaction, because these products and services have taken a universal approach from the beginning. It is also worth highlighting that this approach can benefit all users, not just those who rely on accessibility. For example, a low vision user may be aided by contrasting or night mode colour themes; those exact features also assist any individual using the device at night. The route to a truly intuitive and simple design can also be achieved by designing for the needs of a blind user: if you can make a user interface or product that a blind user can utilise, it truly is simple and intuitive.

It was during this conversation that I highlighted how important this approach is to all services and products. There should never be an assumption that a particular product or service will not be utilised by a particular demographic. To highlight this I mentioned how I had utilised RunKeeper to learn to run solo outdoors. It would have been easy for RunKeeper to assume a blind person would not use their app; after all, what use would it be to a blind person? But thankfully they made no such assumption, and I was able to achieve what was once perceived impossible: to learn to run solo outdoors.

I continued by saying this is why I wanted to work with IBM: I wanted to make sure every service and every product across all sectors became accessible. Just imagine the impact this could achieve with the number of partners and clients IBM work with. With accessibility an assumed standard across the board, just imagine the impossible things that could be achieved in the next few years.

During the rest of the conversation IBM HR mentioned they could imagine me starring in an IBM commercial, demonstrating what accessible technology can enable people to do. Fast forward two years and that opportunity arrived. IBM gave me a call and asked if I would like to be featured in a little video. I of course said yes, and the result is the video below.

In those past two years I have continued to try and make the world a more accessible place by advocating for universal design, working with many tech firms and making countless public speaking appearances at large tech events. But I still feel I could do so much more; there is still a need for that evangelist role and I am still a great fit. There is a real need to ensure universal design across the board. When that goal is achieved, countless people will be enabled to achieve the impossible.

Looking forward to Apple TV OS

My favorite piece of technology in the living room is my Apple TV, and it is about to see a significant update. I love the Apple TV for two reasons: VoiceOver and Netflix. VoiceOver is fantastic at assisting in navigating the UI, as it reads aloud all the elements, and Netflix has a great library of audio described content.

However, it is limiting. I only access my media through Netflix, but I have a world of other media. I have numerous DVD and Blu-ray discs, all with great audio described content. The problem is how I access this media. For example, identifying the discs or navigating the menus are both challenging and require sighted assistance. There just isn’t a great accessible removable media device.

So the current solution is to rip these discs along with their audio described content and AirPlay them to my Apple TV. This allows me to use a screen reader to select the content I would like to listen to. But it shouldn’t be this hard, and I hope the new Apple TV can help in this respect.

For the first time the Apple TV has an SDK, meaning developers can create apps for the system. This brings with it the opportunity to access my other media through an accessible UI. This isn’t just hypothetical either, as Plex have already announced their intention to release on the platform.

There is, however, one caveat: opening up the Apple TV to an SDK, instead of requiring apps to be built under the strict UI guidelines of the past, gives developers free rein. With free rein comes the possibility that apps will no longer support VoiceOver, or, if they do, no guarantee that all elements of the UI will be labelled. However, this would merely be a software fix, and I am confident developers would be willing to ensure their apps are as accessible as possible.

There is also another exciting feature of the new Apple TV – Universal Voice Search. This would reduce my need to interact with the UI significantly: if I would like to watch the latest episode of a show or a movie, I can just issue the command to Siri. It was also recently announced that this feature would roll out as an API, meaning apps such as Plex would have access.

This really does excite me, as instead of asking for help to find the DVD I would like to watch, then having sighted assistance to select the correct audio track and start the film, I can do all this myself. A simple voice command will allow me independence in viewing media.

The new Apple TV will retain its much loved spot, as it remains the most accessible media viewing device for the living room.

Learn to code in a day?

Learning to code in a day: the premise seems a little far-fetched, so I was certainly intrigued by the event at Decoded in London.

With the breadth of possibilities in coding so large, the focus of the day was on the specifics of creating an app that incorporated location data. Even with this reduction in focus it seems like a mammoth task, especially considering the course is not aimed at people with previous coding experience. In fact, it is billed as helping newcomers obtain these skills in a day.

So, with zero prior experience, is it possible to enable someone to create a location-based app within a day? The quick answer is yes: everyone on the course successfully created one.

The day is broken into a few distinct learning methods: lectures, hands-on work and team tasks. These three methods enable participants to gain a rounded knowledge of coding. The introductory lecture is a whirlwind tour of the beginnings of coding; I was disappointed that this didn’t feature Alan Turing, but it was a whirlwind tour after all! This lecture also covered the technologies we would be utilising to create our app: HTML5, CSS and JavaScript.

We quickly moved over to the computers and began to create our app. This takes place within an online environment created by Decoded called Playto. The real power of Playto is its instant feedback: the environment is broken down into columns, so as you type into the editor column, a live view column updates alongside it. Seeing the results of what you are creating straight away is an incredibly powerful learning tool, as you can instantly gauge your progress. It is also worth noting that anyone can utilise Playto, not just participants on the course.

As the day progressed we were introduced to HTML and CSS and began to build the look of our website, with functionality reserved for after lunch. The functionality of the website, namely its location awareness, is accomplished through JavaScript and some backend tools that are beyond the scope of the single day course. This element was covered in another lecture, but it wasn’t something we created ourselves.

After lunch it was time to make our apps location aware. The premise was to make an app that would allow you to check in when within a certain radius of a location; if you were outside the specified radius, you would not be allowed to check in. This simple premise has a whole host of possibilities, and this was highlighted to me a few days later. A friend wanted to create an app that would have runners start at a set point, and every hour each runner would have to be beyond a particular radius; as the time increased, so did the radius. I realized that the app I created on coding in a day could easily be adapted to serve this function.
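To give a flavour of what we built, here is a minimal sketch in JavaScript of that kind of radius check. This is not the code from the day, and the checkpoint coordinates and radius are purely hypothetical; it simply shows the general shape of a location aware check-in.

```javascript
// Hypothetical checkpoint: a point in central London with a 500 metre radius.
const checkpoint = { lat: 51.5074, lon: -0.1278, radius: 500 };

// Distance between two latitude/longitude points using the haversine formula.
function distanceInMetres(lat1, lon1, lat2, lon2) {
  const R = 6371000; // Earth's radius in metres
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Ask the browser for the current position and decide whether a check-in is allowed.
navigator.geolocation.getCurrentPosition((position) => {
  const { latitude, longitude } = position.coords;
  const distance = distanceInMetres(latitude, longitude, checkpoint.lat, checkpoint.lon);
  if (distance <= checkpoint.radius) {
    console.log("Within range, check-in allowed.");
  } else {
    console.log("Too far to check in: " + Math.round(distance) + " metres from the checkpoint.");
  }
});
```

Growing the radius value over time is essentially all my friend’s expanding-radius running idea would need.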

To complete our coding task we were broken into two teams, with each team assigned a piece of coding research needed to finish the project. This was an interesting learning experience, as participants had the opportunity to communicate with team members in a way that previously may have been difficult and daunting. This is a fantastic skill that will transfer to the workplace and allow individuals to communicate with engineers and developers.

With the team tasks complete and everyone’s app functional, coding in a day was done. I realized just how empowering the day had been: in a single day everyone on the course had gained the skills and confidence to create something themselves and, importantly, the ability to communicate with the relevant teams in their workplace. Coding skills are rapidly being highlighted as essential, and perhaps courses like coding in a day should be too. They enable all team members to understand the process and language needed to communicate with development teams, which will truly become an essential skill as the workplace evolves.

For me personally, the course reinvigorated my interest in coding; I returned home and spent the next few days researching location services within iOS and playing with PHP. I look forward to seeing where I will be in a few months’ time and how much my own coding will have improved. It also reminded me of how much I enjoyed my previous career in the educational sector; facilitating others to learn was the truly gratifying part of my job.

The questions…

My favorite part of public speaking is the Q&A section at the end. It’s interesting to be challenged by people all the time; I especially like questions that start with “I know it’s personal but…”. These questions are usually challenging to answer and I do enjoy that. While it may sound scary to be stood on stage while possibly thousands of people stare and wait for an answer, it always leads to interesting trains of thought.

Recently, at an event for PwC, I was asked the question “What is your biggest dream?”. Looking back, and with time to think about it, while I answered the question honestly I didn’t feel I gave the justification as to why. It is, after all, the why that makes it interesting.

The question was “What is your dream?”. I responded with “to be VP of accessibility at a major tech company”, then went on to discuss my dreams within the realm of adventuring.

That doesn’t answer why I want to be VP at a major tech company. Well, that is because of a dream.

Access to information is essential for anyone’s advancement, from learning to simple day-to-day news gathering. For someone with sight loss it is immensely difficult. I can’t just pick up a book or magazine, or even go to a website and read the latest information. Essentially, the majority of traditional forms of information are beyond what I am capable of accessing. This can make education incredibly difficult and place the visually impaired at a severe disadvantage.

While studying for my degree this lack of access to information became incredibly apparent. While a facility did exist to make information accessible in an audio format, there was a substantial time delay, to the point where, if I stood any chance of finishing my degree, I would have to complete essays in two to three weeks. My dissertation was completed in eight; that was not by choice, it was a constraint imposed by access to information. I will quickly add, however, that I was four points off the highest possible mark!

But it is precisely this access to information I want to change. The mobile phone is truly a revolution in access to information, and that is where great change can take place. Android has the largest market share in terms of devices and could make an incredible global difference through accessibility. As the next billion people come online, imagine enabling a visually impaired person to access a book, the day’s news, or even a menu at a local restaurant for the first time. This is all possible by utilising screen reading technology and OCR through smartphone cameras.

What is needed is rapid improvement in accessibility features, vast improvements in universal design and a concentrated focus on inclusive, user-centric design.

And it is for precisely these reasons that I desire a senior role at a tech company: to help instigate that change and enable learning for all.

There is a place for this within other organizations too, and it was a topic of conversation with IBM. Any company with a large consulting role has a wonderful opportunity: an opportunity to touch hundreds if not thousands of companies throughout the world, to initiate these changes and work towards a more inclusive focus for technology and services.

So that’s the why; I just didn’t manage to condense it on stage on the day!

An accessible oven?

Continuing my foray into the kitchen, I am amassing an even larger collection of specific kitchen gadgets. With the new diet commencing, I had a need for omelettes. In an attempt to be a little healthier I use more whites than yolks, so to aid in splitting the whites from the yolks I purchased an egg separator. It works surprisingly well and acts as a reminder that there is often a solution to a problem; you just have to look for it.


It is often these gadgets created for very specific use cases that enable me to function in the kitchen. While never envisaged for use by the blind, their highly specialised function often makes them suitable for me.


I have found there are numerous gadgets that aid in the preparation of food but not in the actual cooking. I feel this is because the oven, hob and microwave don’t receive much focus in terms of specific use cases and, therefore, do not see large functional improvements.


Well, at least that was what I thought until I heard about the June oven. Through an integrated high definition camera, the June oven is able to identify what you are attempting to make and can suggest estimated cooking times. Right now it can identify chicken, steak, white fish, salmon, bacon, cookie dough, brownies, bagels, toast and burger buns. For a full breakdown of the oven’s capabilities, it’s worth checking out The Verge’s article.


The oven is also equipped with WiFi and a touch screen, and is able to live stream the cooking process. Along with its ability to estimate cooking times, it was the WiFi and touch screen that really stood out to me. With WiFi built in, it doesn’t seem a stretch of the imagination to be able to control the oven through a mobile app.


Imagine an oven I can control through an iPhone app: being able to set the temperature, have the oven identify that I want to cook a steak and have it suggest a cooking time!


This would literally be a game changer in the kitchen for me and open up incredible possibilities in what I am able to cook easily and independently. Pairing it with other technology in the kitchen, I can see myself being able to create high quality, healthy dishes for the first time in my life.


So June, I am here, I will be your beta tester. Let’s make an oven that can transform people’s lives.

Now anyone can bake?

Over the past few weeks I have become interested in advancing my baking and cookery skills. This introduces a number of obstacles as a blind individual; mainly, there are a lot of tasks that have the potential to hurt you!

I have begun to break down these tasks and will be covering them in a series of posts. For today though I would like to focus on weighing.

This is a surprisingly difficult task, from measuring out liquids to weighing items for baking and cooking. There are a few speaking kitchen scales out there, but as ever with products for the visually impaired they are grossly overpriced for their limited and often lacklustre feature set.

So I was incredibly excited when I found the Drop scales, especially with their slogan “Now anyone can bake”. I certainly fit into the anyone category, so I popped down to the local Apple Store and made a purchase with the idea of testing their accessibility. The Drop scale connects over Bluetooth to an iPad and displays the weight on screen; it also has a large array of features that walk you through baking and cooking specific items, as well as features such as automatically scaling the weights in recipes.

I thought this could be the perfect item for me: a feature-rich set of scales that would display the weight of an item on screen. VoiceOver could read the weight to me and these scales would solve a large kitchen problem.

Upon testing the app, VoiceOver works surprisingly well: a large number of the features can be read aloud and the buttons are well labelled. The problem came when I tested the scale’s core feature, weighing. The current weight is not a VoiceOver-selectable item and, therefore, cannot be read aloud.

It is worth highlighting that if you have low vision these scales will work well: the current weight is displayed in a white font on a black background. It is very high contrast and far superior to the small screens that usually accompany kitchen scales.

Not deterred by the scales’ current lack of VoiceOver support, I emailed Drop requesting that the current weight be made selectable by VoiceOver. I unfortunately received a boilerplate response saying it was something they may investigate in the future and thanking me for my patience.

This disappointed me more than the scales not working for me: the company’s lack of insight into an opportunity. The Drop scales are at price parity with other accessible scales, but are far more feature-rich. Therefore, if they were accessible, they could easily take a large chunk out of that market.

There is also the additional business case of the positive marketing they would receive from making this change. It would certainly bring them attention from the VI media as well as the mainstream media.

The business case for this change appears to make sense, and that is what is disappointing. As ever, making something accessible is way down the priority list, mainly because the company fails to see the positive impact such a change could make.

It would make a huge impact on individuals like myself, for whom it would solve a real problem, but it would also make an impact on their bottom line. The development cost of the change would easily be outweighed by the new market these scales would be opened up to, and by the press coverage. Companies need to stop seeing making a product or service accessible as a low priority and understand the positive business case for making the change.


Then perhaps the slogan “Now anyone can bake” would hold true.


Reading a book to my children

A wonderful article about Nas Campanella, a blind newsreader, over at Broadsheet.com:

Her studio is equipped with strategically placed Velcro patches – she operates her own panel – so she can recognise which buttons to push to air news grabs and mute or activate her mic. While she’s reading on air, that same electronic voice reads her copy down her headphones which she repeats a nanosecond later. In another ear the talking clock lets her know how much time she has left. The sound of her own voice is audible over the top of it all.

It reminded me of a problem I have in my own life: reading books to my children. I have often thought about using a tiny in-ear wireless headphone, such as the Earin, to solve this problem. It’s interesting to hear someone is using this approach on a daily basis in their working life. The article is also well worth a read, as Nas’s attitude is remarkable.

Dream to Reality

A few years ago I began to think of a few adventures I would love to embark on. I came up with three: The Pilgrimage, The Return and The Dream. Late last month I was fortunate enough for The Pilgrimage to become a reality.

The basic premise of The Pilgrimage was to pay homage to RunKeeper and visit a city close to my heart – NYC. The dream was to run from the HQ of RunKeeper in Boston to NYC, then compete in the NYC marathon. The idea of visiting the RunKeeper HQ was to thank them for where I am today; their app enabled me to believe running solo was possible. The reason for NYC? I spent a bit of time there while I could still see, so the city remains close to my heart.

The adventure was made possible by a few select companies, namely Twitter, PayPal and AirBnB. Little did I know that partnering with AirBnB would elevate the adventure so greatly.

I have decided to break the details of the adventure up into a little series of moments. Rather than detailing the adventure chronologically, I will highlight the memories that were forged and hopefully paint a picture of how I will remember the adventure.

It is worth noting at this point how great all the companies, hosts and especially my crew were in making this a reality. Even now, two weeks after my return, the experiences are difficult to comprehend. It became more than a run, and far more than the pilgrimage I had intended it to be.

The Bradley Timepiece, a watch for the blind (& sighted!)

Back when I had useful vision, I adored collecting watches. In particular I had a penchant for unique faces and unique ways of displaying the time. My collection varied from flashing LED watches from Tokyo Flash to a Breitling Navitimer Cosmonaute. So, when I lost my useful vision and had to begin buying talking watches, I was gutted. I had gone from a finely crafted Breitling to a cheap £40 piece of plastic (arguably the Tokyo Flash watches were cheap, but at least they were interesting).

The talking watches would break continuously; I would often forget to remove one when bathing my son and it would break. After this had happened three times, I just decided to give up on having a wristwatch. I resorted to using my phone as my new timepiece, but this had a number of drawbacks. I would have to remove it from my pocket to tell the time, it was far from subtle as the time would be read aloud, and it’s just not as cool as a watch!

Therefore, I was incredibly excited when I heard about The Bradley Timepiece, a watch billed as inclusive. It seemed interesting, and it had a unique way to tell the time, which harked back to my old collecting days! Instead of speaking the time aloud or vibrating the time, it relies on touch.

There are two ball bearings: one that runs in a groove around the rim of the watch, and another that runs around a small circle on the face. The ball bearings are moved around by magnetism, with the outer rim indicating hours and the face indicating minutes. To assist in telling the time, the numbers 1 through 12 are marked by raised lines on the face, with the number 12 having a small triangle to indicate it is the top of the face.


In order to tell the time you gently move your fingers across the face and rim to locate the ball bearings. Initially this proved slightly difficult, as I would accidentally move the bearings; however, a quick flick of the wrist returns them to the correct positions. The raised lines on the face greatly assist in telling the time. When I first began to use the watch I would locate one of the ball bearings, trace my finger out to the raised lines and count the lines around to 12, to give me a clear reading of the current time. After owning the watch for a few weeks, however, the time is far simpler to tell; it just took a little practice.

One of the main niggles of the watch is the strap. I chose the mesh stainless steel strap, and it has two problems. First, it isn’t made for slim wrists, and for a guy I have very slim wrists; even at the smallest setting the watch is a little loose. The other is how difficult the strap is to fasten: you initially have to hook the clasp over a small bar, click the clasp shut, then clip over another clasp. The initial hooking of the bar takes a lot of practice, so much that it becomes a little irritating. However, there are a number of other straps available, so this really came down to my choice of the metal strap.

The watch itself is beautiful, and using touch to tell the time is simply fantastic. I am now able to tell the time subtly. This is powerful: previously everyone in earshot knew I was checking the time, so in meetings or when giving a speech this simply wasn’t practical. Now I can subtly touch my wrist and get a sense of the time. I also receive a number of compliments about the watch, something that never happened when wearing a cheap plastic talking watch! Sighted people are often intrigued by the look of the watch and remark on just how beautiful it is. Who would have thought: a product that is inclusive for the visually impaired and looks great! That truly is rare.

The design execution of the watch is even carried through to its box, something that is often overlooked when designing for the visually impaired. Whilst I am unable to read braille or indeed see the included booklet, it is easy to tell a lot of thought has gone into every aspect of this watch.


Overall, this watch is a great buy. I would recommend it to anyone, sighted or non-sighted. It is a rare “out of the box” experience for the visually impaired: you can literally take this watch out of the box and instantly be able to use it. Setting the time is a breeze and intuitive; there is only one obvious option: pull out the crown, twist, and the bearings begin to move. I would put The Bradley Timepiece in the same league as the iPhone for its out of the box experience; it is simply that good.

Help a blind runner get from Boston to NYC!

This year I will finally run my first marathon, in NYC. Before I even ran I had dreamed of running NYC; it is also one of the cities I visited while I still had sight, so it always feels special whenever I return. It is also tantalisingly close to Boston, the birthplace of RunKeeper – the running app that made running solo outdoors as a blind runner possible.

So I had an idea: why not run from Boston to NYC, then compete in the NYC marathon? And to make it even more special, why not connect with people on social media to help me along the way? That is the plan, and I am reaching out to the internet to help make it happen!

The Adventure

The plan is to arrive in Boston around mid to late October and begin running an average of 30 miles a day for 10-13 days (the distance varies depending on the final route chosen). I plan to break the run into small chunks and connect with as many people as possible. I don’t expect people to run at a particular pace; I am happy to run, jog or walk. The idea is just to connect with as many people as possible to help me get to NYC.

I intend to produce as much real-time content as possible, from video, audio, photos, real-time GPS and health tracking, basically using any available technology to produce data while I run. I will also be maintaining a blog up to and during the adventure, so there will be a stream of content produced from the run: something I hope to share with everyone involved, and a keepsake to show my children when they are older.

Funding

As I am currently a student and will finish my degree just in time for this adventure, predictably I am skint! So I am looking at two possible avenues for funding this adventure: sponsorship and public speaking.

There are numerous opportunities for brands and companies to become involved, and I am open to all suggestions, so if you feel we could work together, or know someone who may be interested, please get in touch.

I believe this to be a great opportunity for brands: the adventure will run for around two weeks, so there will be substantial social media coverage. I also anticipate numerous other press appearances, and will be contacting all the agencies I have worked with in the past. In return for brands supporting my adventure, I would certainly make myself available for any press events, or public speaking at a conference of your choosing. To make this adventure a reality I really need help with travel, accommodation and a few pieces of equipment. If you can help with any of these, please do get in touch.

The other route is public speaking. I have experience of public speaking, and at recent conferences including IAB mobile and Google Think I was rated top speaker. I have also been invited to speak at technology companies including Twitter, PayPal, Google and more. As well as speaking at conferences, I also give smaller motivational speeches for corporate events. So if you have any need for a motivational public speaker, or would just like to hear my story, please get in touch and help me fund my new adventure.

“Simon tells a compelling story of a life transformed by two things: technology and a positive attitude. I’ve seen him speak twice. Each time he has inspired the audience and been the best performing speaker on the programme”.
Bruce Daisley, MD Twitter UK.

So who am I?

I am aware some people may not know who I am, so to save you reading my entire blog, here is a commercial I starred in for Carphone Warehouse. It gives a great outline of who I am and what I have achieved up to this point.