HOWTO change the font size in Safari on the iPad and iPhone

The ability to change font size can have an enormous impact on accessibility. Pinch and zoom is wonderful for this on iOS, but it introduces another problem: zoom too much and you have to scroll sideways as well as down to consume content.

There is, however, a little workaround. You can increase and decrease the font size on a per-site basis in Safari. This is done through bookmarks: you add two, one to increase the size and one to decrease it, and tap them until the font size suits you. Reloading the website will return the font to its original size.

To enable this feature, follow the steps below:

  1. In Safari, create a new bookmark. It can be for any website, as we will be editing it shortly
  2. Open your bookmarks, tap Edit, then select your new bookmark
  3. Change the title to either Increase Font or Decrease Font
  4. Copy the appropriate code from below into the link field
  5. Tap Save, then repeat so you have both increase and decrease font size bookmarks

Increase font size

javascript:var%20p=document.getElementsByTagName('*');for(i=0;i%3Cp.length;i++)%7Bif(p%5Bi%5D.style.fontSize)%7Bvar%20s=parseInt(p%5Bi%5D.style.fontSize.replace(%22px%22,%22%22));%7Delse%7Bvar%20s=12;%7Ds+=2;p%5Bi%5D.style.fontSize=s+%22px%22%7D

Decrease font size

javascript:var%20p=document.getElementsByTagName('*');for(i=0;i%3Cp.length;i++)%7Bif(p%5Bi%5D.style.fontSize)%7Bvar%20s=parseInt(p%5Bi%5D.style.fontSize.replace(%22px%22,%22%22));%7Delse%7Bvar%20s=12;%7Ds-=2;p%5Bi%5D.style.fontSize=s+%22px%22%7D
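
For the curious, here is the increase bookmarklet URL-decoded into readable form. It walks every element on the page, reads any inline font size (assuming a 12px starting point where none is set), adds 2px and writes the new size back. The decrease bookmark is identical except it subtracts 2.

// Decoded version of the increase bookmarklet above.
var p = document.getElementsByTagName('*');
for (var i = 0; i < p.length; i++) {
  var s;
  if (p[i].style.fontSize) {
    // The element has an inline font size: parse the pixel value.
    s = parseInt(p[i].style.fontSize.replace("px", ""));
  } else {
    // No inline size set: assume a 12px starting point.
    s = 12;
  }
  s += 2; // use s -= 2 for the decrease bookmark
  p[i].style.fontSize = s + "px";
}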

Now, whenever you need to adjust the font size on a website, simply tap the increase or decrease bookmark. This is a simple way to improve the accessibility of any website in Safari on the iPad or iPhone.

AirPods, The Most Accessible Headphones

Headphones are an often overlooked but essential piece of equipment for the blind. Using a screen reader in the privacy of your own home, in a quiet room, is a simple affair: you can just use the loudspeaker of your phone or computer. Add some environmental noise, head outside or dare to venture into a coffee shop, and the loudspeaker is no longer an option.

Headphones enable me to use my iPhone both indoors and out and about; I literally couldn't use my iPhone without headphones. Over the years I have therefore amassed a rather substantial collection, everything from a cheap pair of JVCs up to a rather expensive pair of active noise-cancelling Bose. I am rarely seen without a pair of headphones and have them stuffed in every pocket and every bag.

I am constantly looking for the perfect pair of headphones, the pair that will make using my iPhone that much more accessible. Now I have found that elusive pair: the Apple AirPods.

The AirPods are Apple's truly wireless earbuds: two single earpieces that fit snugly inside their own charging case.

They solve many of the problems a blind user has with headphones. Cables. Cables are a nightmare. Get them tangled in your pocket? Try untangling them when you can't see. It just takes that much longer, to the point where if I quickly needed to access my phone I would often prefer not to: the time taken to untangle the headphones ends up being greater than the time I needed to use the phone. So often I would either ignore a notification and vow to take a look when I got home, or place the phone close to my ear to listen. After all, with a screen reader the only way you get privacy is by using headphones. Imagine if all your texts were read aloud. That embarrassing one from your friend is even more embarrassing when everyone in the lift hears it too!

So the wireless nature of the AirPods truly makes them more accessible. I can just quickly and easily slip them in. No cables to untangle; just flip the lid of the storage case and they are in my ears for that quick check of my phone.

This brings me to one of my other favourite accessibility features: using only one of the AirPods. When you rely on sound to understand what is happening around you, having one ear focus on the screen reader frees up the other for environmental noise. Handy when walking down the street, and handy at home or in a meeting. Previously, if I received a notification in a meeting and hadn't worn headphones upon entering, I was left with three options: ignore the message, go through the messy untangle process, or interrupt the flow of conversation by having everyone hear my notifications through the loudspeaker. Now I have a fourth option: just slip in one AirPod and I am away.

While out and about, another side effect of being blind is generally having only one hand free. To navigate around I use either my guide dog or a long cane. This basically gives me no way to untangle headphones, so I would often go for the loudspeaker approach, gambling with the possibility of dropping my phone as I attempted to juggle it around with one hand.

Now I just slip one AirPod out of the case, pop it in my ear and activate Siri.

There is one other fantastic bonus of using one earpiece: I double the battery life. Not to mention that whenever I remove them from the case they are fully charged.

The AirPods truly have increased the accessibility of my iPhone by enabling me to use it in more everyday situations. I no longer have to remove myself from a social space to use my phone; these AirPods are increasing my social ability.

They truly are the most accessible headphones.

Thank goodness for technology

When my sight began to slip away, I feared losing so many things I love. After all, so much of our daily lives revolves around the ability to connect on a visual level.

My first love has always been technology, and just as touch screens were becoming commonplace, I was unable to see them. How could I possibly interact with technology that was so heavily visual? There wasn't even any tactility to the screen; it was a perfectly smooth piece of glass. No raised buttons to identify what I was pressing, no way to memorise an elaborate process of taps and clicks. I felt lost. Lost but not defeated; I clung steadfast to the belief that there must be a way to adapt this to make it work to my benefit.

There was an unforeseen advantage, and as a result an adaptability, to this. The migration to touch screens forced the industry to reimagine how we would interact with these devices. The result was Apple developing VoiceOver for the iPhone, a gesture-based screen reader. I didn't realise it at the time, but this would be my entry point to making the world accessible.

Now that my phone was equipped with the ability to read on-screen items aloud, it became indispensable. It would be my reading tool for university, with all my books converted to digital form and my phone reading them aloud. It would also become my window for interacting with the world at large: Facebook, Twitter and email all made accessible through this fantastic interface. It even allowed me to help my kids with their homework. It would creep into every aspect of my life, becoming more and more indispensable as the days wore on. The unforeseen disadvantage: battery anxiety. My phone was now an extension of me, filling in the gaps that my lack of sight had created.

With the constant creation of new and previously unthinkable technological advancements, I wonder whether my main assistive device will even be the phone. Looking ahead 5-10 years, I foresee a transitional period in the mechanics of interacting with our technology, one that will see a move away from typing onto screens and towards spoken language, with a natural migration to a screen-less future (or at least one without screens as we know them now). I believe that this technology is just on the horizon, and it is something I relish the thought of.

Accessibility – low-hanging fruit

There is a lot of low-hanging fruit ripe for the picking within the inclusive design realm. So in 2017, what fruit do I think is the ripest?

Dark mode. This one feature alone, implemented OS-wide, could make a huge difference to a substantial user base. Not only would it solve a problem for the visually impaired, for whom contrast is a major issue, but also for those with situational requirements where dark mode makes the most sense. Think late at night in bed: that white screen just makes your eyes ache.

So will there be an appetite for this in 2017? My gut says yes. If the rumours hold true and the iPhone moves to an AMOLED display, we will see the introduction of dark mode. This will have a wonderful knock-on effect of influencing design direction for a while. So not only will we see dark mode introduced at the OS level, but we will start to see a whole host of apps fall in line.

The dream scenario? For Apple to introduce a way for apps to toggle in and out of dark mode depending on a user's preferences. This might be a visually impaired user using the feature instead of invert colours, or perhaps a sighted user having dark mode set for specific time frames. I think this scenario is less likely than an OS-wide dark theme and waiting for app creators to fall in line, but we can dream.
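
As a rough sketch of that dream scenario in web terms, an app could follow a system-level preference along these lines. The media query is the standard mechanism browsers expose for this; the theme class name is an illustrative assumption.

// Sketch: a web app following a system-level dark mode preference.
var darkQuery = window.matchMedia('(prefers-color-scheme: dark)');

function applyTheme() {
  // 'dark-theme' is a hypothetical CSS hook for this example.
  document.body.classList.toggle('dark-theme', darkQuery.matches);
}

darkQuery.addListener(applyTheme); // re-apply when the user flips the setting
applyTheme();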

So let's see if that low-hanging fruit is finally picked this year.

Object avoidance for the blind

After running into a flagpole in the Namibian desert and a burnt-out car on the streets of Doncaster, I decided it was time to work on object detection. My previous challenges had all utilised very simple systems, and I wanted to stay within that simple communication paradigm for object detection.

Learning to train solo as a blind runner used two very simple inputs: distance and feeling underfoot. Combined, these inputs allowed me to learn to train solo along a 5 mile route. Objects were identified by me running into them and memorising where they were relative to an audible distance marker. I had reduced blind navigation to two simple elements, and that was enough to run. With one, well, two key assumptions: 1. I knew where all the obstacles were, and 2. there would be no new obstacles. I knew these assumptions were flawed, but I was happy to take on the risk.

Running through the desert solo made the exact same assumptions: I would be aware of all obstacles ahead of time, and there would be no surprise obstacles. This allowed for a very simple navigation system, as I had reduced the problem to one of bearing. As long as I knew the bearing I was running and could stick to it, I could navigate a desert. The system developed along with IBM used simple beeps to maintain bearing: silence denoted the correct bearing, a low tone meant I had drifted left, and a high tone that I had drifted right. Incredibly simple, but simple is all you need in these situations; an overload of sensors and data doesn't improve the system, it just pushes the process of understanding what is going on beyond comprehension. By reducing navigation to one simple communication point for the user, in this case me, I was able to navigate the desert solo.
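
To make that simplicity concrete, here is a minimal sketch of the feedback logic as I have described it. The tolerance and tone values are my own illustrative assumptions, not the actual parameters of the system built with IBM.

// Sketch of the bearing feedback: silence on course, a low tone for a
// left drift, a high tone for a right drift. All numbers are assumed.
var TOLERANCE_DEG = 5;  // window of silence around the target bearing
var LOW_TONE_HZ = 220;  // drifted left
var HIGH_TONE_HZ = 880; // drifted right

function bearingTone(currentDeg, targetDeg) {
  // Normalise the difference into the range -180..180 degrees.
  var diff = ((currentDeg - targetDeg + 540) % 360) - 180;
  if (Math.abs(diff) <= TOLERANCE_DEG) return null; // on bearing: silence
  return diff < 0 ? LOW_TONE_HZ : HIGH_TONE_HZ;     // left: low, right: high
}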

So where did it go wrong? Well, those key assumptions. The obstacles in this case were a flagpole and a rock field. The flagpole can be engineered out; with the rock field, however, we run into the complex-system problem. A highly granular descriptive system would not allow the end user to navigate such a rock field. It was a unique and specialised environment that required centimetre-accurate foot positioning; indeed, the correct way to navigate it would be to avoid it entirely!

But could we avoid that burnt-out car and flagpole? Yes, we could. Could we make it a simple system for the user to understand? Absolutely.

The simplest way to communicate an object within a visual field is haptically. It is highly intuitive for the end user, with vibration feedback instantly recognisable as an obstacle. For sensing, a tiny ultrasonic sensor is mounted at chest level. The chest was chosen as it always follows the direction of running. We had discounted a head mounting, as people often look in a different direction to the one they are moving in.

It is an incredibly simple system, but that is all it needs to be. The idea is to explore the minimal communication required for obstacle avoidance. In future revisions we intend to use multiple sensors, but we will be ever careful not to introduce complexity to the point that the simple communication system is disrupted. For example, it may be tempting to use a series of sensors all over the body; this, however, increases complexity and creates issues in differentiating between the different vibrations. Not to mention that human interpretation adds latency to the system, which may result in running into the very obstacle we are trying to avoid.
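
As a sketch of just how minimal that loop is, the whole system reduces to something like the following. The range threshold is an assumption for illustration, not the prototype's actual setting.

// Sketch of the single-sensor haptic loop: vibration on means an
// obstacle ahead, silence means the path is clear.
var OBSTACLE_RANGE_CM = 200; // assumed detection threshold

function onUltrasonicReading(distanceCm, setVibration) {
  // One sensor, one signal: drift left or right until the vibration stops.
  setVibration(distanceCm < OBSTACLE_RANGE_CM);
}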

This all sounds interesting, but does it work? Yes, yes it does. I was over in Munich recently to test an early prototype. Even with only one sensor I felt we were so close that I was tempted to test it while running. The immediacy of the system is incredible. It is totally intuitive that a vibration denotes an obstacle. Avoiding the obstacle is a simple case of drifting left or right until there is no vibration, then moving on by.

Below is a video of the device in action. I will continue to give updates on the development of the system, up until I give it a real workout at a packed city marathon, where I will run solo.

IBM & CMU assisting in mobility

Mobility for the visually impaired is always difficult, from tasks as simple as heading to Starbucks for a coffee, to jumping on a bus or grabbing a taxi. Let's take the first example: heading to Starbucks is certainly challenging when you are unable to see, but what about when you enter the store? Without sighted assistance, locating the counter, or indeed finding somewhere to sit, is challenging.

Therefore, any technology that aims to improve any of these mobility issues is a step in the right direction. At the risk of this blog turning into IBM fandom, it is yet another project IBM are working on.

Along with Carnegie Mellon University, IBM have developed and open-sourced a smartphone application that can help you move from point A to point B.

The app, called NavCog, utilises either voice or haptic feedback to aid in navigation, and currently uses beacons to assist in the navigation process.

It is great to see the combination of beacons and haptic feedback used to aid navigation. Over four years ago I was pitching to just about every GPS manufacturer that this could be an interesting direction to head in. My ideas seemed sound when Apple announced the Apple Watch and it used the exact same haptic feedback system I had been proposing. Furthermore, the use of beacon technology to navigate is exactly what I pitched to British Airways a couple of years ago.

I proposed that beacons in Terminal 5 could be used not only to direct potential customers to shops, restaurants and gates, but also to aid visually impaired customers in navigating the terminal.

It is truly great to see all these ideas put together and finally implemented. We now just need a larger rollout of beacon technology!

This system could also be adapted to solve the indoor navigation problem. I was speaking with Google a year or so ago about how Project Tango could be utilised to achieve this. I imagined a haptic feedback device that could assist in real-time indoor navigation. After all, my guide dog may be able to avoid obstacles, but an empty chair is an obstacle to my guide dog!

Artificial Intelligence and accessibility

Over the past couple of weeks I have been fortunate enough to be exposed to some fantastic technology and ideas. Attending WIRED 2015 kickstarted my thought process on how artificial intelligence could be applied to accessible technology.

While attending the conference there were two ideas I wanted to pitch to people: emotion detection to facilitate social situations for the visually impaired, and facial recognition. I felt both these technologies could greatly improve an individual's ability to socialise. After chatting to a few people and pitching my ideas on how these systems could work from a design, implementation and marketing front, I managed to interest a few companies and institutions.

There is fantastic scope for these technologies and their assistive ability. I concentrated on the emotion detection system initially, as I feel it could have the greatest and speediest impact. I have encapsulated the idea into a product for all, rather than a product specifically for the visually impaired, as I believe this to be key for mass-market adoption, which in turn will reduce the price significantly and lower that initial barrier to any accessible product: price.

I am yet to find a partner to work with on facial detection, but I recently read an article highlighting that IBM are working on this. It really does seem, as time goes on, that IBM and I could be a great match!

I also had a grander idea on accessibility while at the conference, and was delighted to see it referenced, yet again, by IBM: cognitive assistance. I have been batting around a few ideas on how accessibility could be personalised. After all, there are nuances in an individual's accessibility needs, so why not make the solutions just as nuanced? This could definitely be achieved through a cognitive accessibility assistant that has the capacity to learn.

An accessible system that is capable of learning could aid in such tasks as reading. It would be able to identify how an individual likes to read information and present it in that fashion. A nice example would be skim reading: being able to learn how to read a specific document for certain contextual references would be fantastic. This would certainly have assisted me greatly while at university; skim reading is absolutely a skill I miss.

I continue to be excited by what technology is enabling and how I can become part of the revolution of accessibility.

TICKR X & The 7 Minute Workout Accessibility

I am always looking for simple and effective ways to make workouts more accessible. It can often be difficult to monitor and track workouts, so I was very excited when Wahoo sent along the TICKR X. The TICKR X is a very capable device that can track a whole multitude of stats, from HR to body movement and more, but the data point I was most interested in was rep counting.

Utilising the TICKR X along with the Wahoo 7 minute workout app on my iPhone, all my reps could be automatically counted. No more writing it all down in an app afterwards, assuming I could even remember how many reps I had performed of each exercise.

What is the 7 minute workout?

The 7 minute workout is a collection of 12 body-weight exercises that you can complete anywhere, with no equipment needed, and it has been shown to give results comparable to longer running or weights sessions. It consists of 12 work sets of 30 seconds each, each followed by a 10 second rest. It is a quick and highly effective HIIT workout.

The Device

The device itself is one of those rare devices a blind user can take out of the box, configure and use without sighted assistance. You click the device onto one side of the strap, wrap the strap around your chest and clip it into the other side. Ensure the TICKR X is positioned in the middle of your chest, and do not worry if the device doesn't touch your skin: unlike other HR trackers, the HR sensors are located in the strap, not the actual TICKR X device.

To turn the device on, tap the TICKR X a couple of times; you are then ready to pair it with your phone, which is done inside the app.

The App

The 7 minute workout app is highly accessible; Wahoo have done a fantastic job of labelling all the elements appropriately. It is simple to navigate the app and start an activity, and as the reps are counted automatically, that is essentially all you have to do to use it. Start a workout and read your results; there is no manual inputting, it's all taken care of.

The app also uses a lot of audio for feedback. For example, the different exercises and start and rest sections are read aloud. It makes for a nice accessible experience.

How Does it Perform?

When I began my workout I was surprised it worked; it was a real wow moment as I heard the reps count up as I went about completing the workout. I was quickly put in my place on correct form, as the TICKR X wouldn't count reps with bad form. So no longer can I cheat and just do quick reps with poor form; I am now forced to go lower rep with correct form. While this affects my rep count, it does mean I am actually performing the exercise correctly! This was evident in push-up rotations, as the TICKR X wouldn't count a rep if I didn't perform the appropriate amount of rotation, which is common when going for speed. This correction of form isn't limited to simply not counting a rep: when in the plank position, if you begin to wane, the app notifies you to watch your form!

The one slightly confusing part of the counting system is in exercises that have an up and down movement, for example push-ups, triceps dips and so on. This is because the TICKR X counts the up and the down as two reps, whereas typically I would count each up and down as one rep. In itself this isn't actually an issue, as it correctly counts the movements and you are able to compare your results and see improvement, since the counting is consistent. If anything, it just makes it look like you can do twice the number of push-ups you used to be able to!

For me the real wow moment of the app came upon completing a full workout. When the workout is finished you are given a table with the rep results for each exercise, along with the HR for each exercise. This granularity was a fantastic reporting decision. Typically an average HR is given for a whole workout, whereas Wahoo have chosen to give you the HR breakdown of each exercise. This is important, as you are quickly able to identify the exercises you should be pushing harder on. I was able to see that I was sandbagging a little on what I consider to be the easier exercises; perhaps unconsciously I was using them as a little rest period.

Like the other sections of the app, the reports are very accessible. Wahoo really have done a great job in regards to accessibility for the blind.

Any Bad Points?

Yes, the in-app purchases. I was a little disappointed that a premium-priced piece of hardware like the TICKR X required multiple small in-app purchases to get full app functionality. It must be noted that this applies to the 7 minute workout challenge; there are other apps from Wahoo that work with the TICKR X, but I was focussed on utilising the TICKR X for rep counting.

Overall

The TICKR X is wonderfully accessible with the accompanying 7 minute workout challenge app, and I would highly recommend it for a blind user. Its rep counting keeps you honest, and the reports allow you to highlight where you should be pushing harder. It is a definite buy for any blind user looking for an accessible and quick cardio workout!

IBM Serendipity

Two years ago, in the middle of my degree, I went to meet with IBM HR. The idea was to have a chat with them about my vision of an inclusive and accessible world through technology.

IBM stand at a fantastic point within the technology sector, where they have the ability to touch a huge number of organisations in wildly different fields. It was this very point that made me think IBM and I could be a perfect match.

There is a need for all technology to be inclusively designed, to enable everyone to have universal access: from mobile devices, to the Internet of Things, to transport. Indeed, it was IBM's Smarter Planet initiative that made me believe there was a way to make the world accessible through the advancement of new technologies.

I pitched to HR that I would be a wonderful fit for an accessibility evangelist role, working with all manner of partners on how technology could be made inclusive: from advising on human interface interactions that have not only visual elements but auditory and haptic ones too, to communicating complex information in new and interesting ways. I continued by highlighting that the opportunity to interact with clients at an early stage would aid a universal design approach across all technology.

Indeed, it is this early-stage approach that has given me great success with Kickstarter. I often find projects in their very early stages and communicate with the team on how minor adjustments could improve accessibility, be it the addition of audible tones or changing a UI to take a blind user into account. I have also had great success with FitBit and Drop scales: with both companies I advised on how to communicate information in different forms to increase accessibility. The added benefit of this change was a greater understanding for all users, not just those who cannot see.

I imagine a world where, as the next 1 billion people and 10 billion devices come online, there is no barrier to interaction, because these products and services have taken a universal approach from the beginning. It is also worth highlighting that this approach can create benefits for all users, not just those who rely on accessibility. For example, a low vision user may be aided by contrasting or night mode colour themes; these exact features also assist any individual using the device at night. The route to a truly intuitive and simple design can also be found by considering the needs of a blind user: if you can make a user interface or product that a blind user can utilise, it truly is simple and intuitive.

It was during this conversation that I highlighted how important this approach is to all services and products. There should never be an assumption that a particular product or service will not be utilised by a particular demographic. To illustrate, I mentioned how I had utilised RunKeeper to learn to run solo outdoors. It would have been easy for RunKeeper to assume a blind person would not use their app; after all, what use would it be to a blind person? But thankfully they made no such assumption, and I was able to achieve what was once perceived impossible: learning to run solo outdoors.

I continued by saying that this is why I wanted to work with IBM: I wanted to make sure every service and every product across all sectors became accessible. Just imagine the impact this could achieve with the number of partners and clients IBM work with. With accessibility an assumed standard across the board, just imagine the impossible things that could be achieved in the next few years.

During the rest of the conversation, IBM HR mentioned they could imagine me starring in an IBM commercial, demonstrating what accessible technology can enable people to do. Well, fast forward two years and that opportunity arrived. IBM gave me a call and asked if I would like to be featured in a little video. I of course said yes, and the result is the video below.

In those two years I have continued to try to make the world a more accessible place: advocating for universal design, working with many tech firms, and making countless public speaking appearances at large tech events. But I still feel I could do so much more; there is still a need for that evangelist role, and I am still a great fit. There is a real need to ensure universal design across the board. When that goal is achieved, countless people will be enabled to achieve the impossible.

Looking forward to Apple TV OS

My favourite piece of technology in the living room is my Apple TV, and it is about to see a significant update. I love the Apple TV for two reasons: VoiceOver and Netflix. VoiceOver is fantastic at assisting in navigating the UI, as it reads aloud all the elements, and Netflix has fantastic audio-described content.

However, it is limiting. I can only access my media through Netflix, but I have a world of other media: numerous DVD and Blu-ray discs, all with great audio-described content. The problem is how I access this media. Identifying the discs and navigating the menus are both challenging and require sighted assistance. There just isn't a great accessible removable-media device.

So the current solution is to rip these discs, along with their audio-described content, and AirPlay them to my Apple TV. This allows me to use a screen reader to select the content I would like to listen to. But it shouldn't be this hard, and I hope the new Apple TV can help in this respect.

For the first time the Apple TV has an SDK, meaning developers can create apps for the system. This brings with it the opportunity to access my other media through an accessible UI. This isn't just hypothetical, either, as Plex have already announced their intention to release on the platform.

There is, however, one caveat. Opening the Apple TV up to an SDK, instead of requiring apps to be built under the strict UI guidelines of the past, gives developers free rein. With free rein comes the possibility of apps no longer supporting VoiceOver, or, if they do, no guarantee that all elements of the UI will be labelled. However, this would merely be a software fix, and I am confident developers will be willing to ensure their apps are as accessible as possible.

There is also another exciting feature of the new Apple TV: universal voice search. This will reduce my need to interact with the UI significantly; now, if I would like to watch the latest episode of a show or a movie, I can just issue the command to Siri. It was also recently announced that this feature will roll out as an API, meaning apps such as Plex will have access.

This really does excite me. Instead of asking for help to find the DVD I would like to watch, then needing sighted assistance to select the correct audio track and start the film, I can do all of this myself. A simple voice command will give me independence in viewing media.

The new Apple TV will retain its much-loved spot, as it remains the most accessible media-viewing device for the living room.