Professional Testers with Disabilities

An A11yChi Meetup on June 9, 2020 at 7:00 PM Eastern Time

CART Captioning Provided by Alternative Communication Services (ACS), LLC

This is being provided in a rough-draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


Santina: I don't know, maybe my clock is wrong, I have 4:27. Nick what do you think? Shall we get started or do you want to give it a couple more minutes?

Nick: Yeah, we can get started. So there's time at the end for Q & A.

Santina: Sure, great. FYI, feel free to ask any questions in the chat and do feel free to continue to introduce yourselves as well until the speakers start.

I won't be able to see chat because I'll be switching over to the presentation slide deck. So there we are, bear with me for that.

Announcements and introduction

Santina: Folks did ask about live captions. We have that link in the chat and Karen if you could place that in the chat again I'd greatly appreciate that. You can reach that with a web browser. This is the link.

Our captioning is sponsored by McDonald's, so thank you very much, McDonald's!

You can, of course, follow along with the slide show on your own device and look for Professional Testers with Disabilities. Just go to speakerdeck.com/a11ychi.

Of course keep the discussions going on Twitter, using the hashtag, A11yChi.

We are live streaming this on YouTube, so hello to everybody on YouTube. You can ask questions through chat on YouTube as well, and if you would like to follow along with us on YouTube, this is the website. We do post the recordings from our meet‑up groups on there as well, so if you'd like to refer back to this talk and/or if you want to send it to your friends please feel free to do that and I'm just screwing that up [LAUGHTER] all of a sudden.

Just a reminder if you prefer not to be in the recording, make sure that your camera is off, hold questions for the Q & A time after the presentation, the co‑organizers will monitor the chat for technical issues and questions. I don't know what's wrong with my mouse right now. Just want to take a pause for jobs?

If you have any jobs at your company that you're listing specifically in accessibility, or you're currently looking for a job, feel free to take yourself off mute and mention it, and/or feel free to drop it in the chat.

All right, if nobody is going to speak up, you still have the opportunity to drop that in the chat, and/or you can mention it afterwards as well. We do also want to do a brief mention of current events. Obviously, it concerns a lot of us, and especially as accessibility professionals we need to acknowledge there are folks in our community that are being affected, so we do encourage you to check out the hashtags #DisabledBlackTalk and #DisabledBlackLivesMatter. Both of those are trending on Twitter right now.

And with that, let's go ahead and introduce our speakers. Heather Burns: Heather's passion for accessibility started early, turning self-advocacy into a student-run organization to help others gain the accommodations they needed for academic success.

At Microsoft she worked on the accessibility of Windows and the Office family of products, becoming a co-inventor of UI Automation and of U.S. patents, and you can look up those numbers; they are very difficult and very long.

Heather is currently a full-time Accessibility Consultant for Equal Entry, a company founded by a former Microsoft accessibility peer, Thomas Logan, who is also on the line. Equal Entry provides clients with accessibility expertise through audits, training and expert witness testimony.

John G. Samuel: John G. is an award-winning business development leader with over 13 years of experience building strategic partnerships, designing profitable business models and transforming organizations around the world.

He's currently the Head of LCI Tech and responsible for launching the new technology services business for LCI, which is one of the largest employers of Americans who are blind. John has a proven record of accomplishment leading start-up initiatives, previously helping build Homestrings, a fintech platform focused on diaspora investments, and starting a highly successful joint venture in Cameroon for Aster, a global telecom infrastructure company.

And, Heather and John, please go ahead and share your screen and take it away.


Professional Testers with Disabilities

John: Awesome. Thank you so much for that introduction, and thank you all for inviting us to participate today at the Chicago accessibility meetup. My name is John Samuel and I am the Chief Innovation Architect at LCI Tech. We are a division of LCI, we are the largest employer of Americans who are blind, and our mission at LCI Tech is to create careers for people who are blind.

My connection with the disability community started when I was diagnosed with a degenerative eye condition called retinitis pigmentosa when I was a freshman in college. At the time I didn't know anyone who was blind, and when I got this news it was devastating; it really shook my world, and I was ashamed to tell anyone.

I moved on with my career and moved around the world, and in different places I would figure out different ways to accommodate myself, making the most of my limited vision and eyesight by using inverted colors and larger fonts, until eventually I got to the point where I had to use a screen reader.

Through the challenges that I was facing throughout my career, I realized the importance of accessibility, and so when I launched LCI Tech, I knew that accessibility was going to be something that I would need, and that's what we're doing.

But I'm joined here today by my friend and colleague, Heather Burns.

Heather: Thanks, John. So, kicking off the way we kicked off the morning: she/her are my pronouns, I'm in Charlotte, North Carolina, and while I didn't purchase it recently, I'm finding that a gift from Christmas, a jogging leash for my dog, is providing me lots of stress relief as we get some jogs in; she's just a lot nicer on that particular leash than others, so that is good! But as was kind of alluded to, my journey with accessibility started pretty early. In fact, I don't remember a time really before I knew about my learning disability.

So, when I struggled academically, when I had some difficulties, my parents were always great supporters, able to name what it is and to tell me, okay, that just means you've got to work harder, and you might need some accommodations in order to be successful.

So, throughout my academic career, I was allowed to have accommodations, extra time, but one of the controversial accommodations I received was the ability to use a computer.

Now, this was so I could do spell checking; I could rearrange sentences and craft essays using a computer a lot more efficiently than by hand. At the time, you've got to go back in the WayBack machine, we're talking Apple IIe, we're talking five-inch floppy disks [LAUGHTER] as far as storage, right? The teachers didn't have much experience with computers, and they were concerned that if computers got into the classroom, they wouldn't be able to identify what was unique student work.

So, it was really breaking ground with that, but I always kind of knew that a computer would help me. Then, when I graduated, in July 1993, The New Yorker published this cartoon: a depiction of a dog sitting at a computer keyboard, looking down at another dog, and the caption says, "On the Internet, nobody knows you're a dog."

This was done by Peter Steiner, and for me, I can't remember who cut that out and gave it to me, but I remember laminating it and holding on to it, because it was so powerful to me to think that, you know, on a lot of levels, I can pass, I can be social, I can choose when I disclose my learning disability, and if I stumble over it before I've told somebody about it I get very self-conscious.

If I run into it with a colleague that fully knows my background, I'm like, come on, you should know about this! So, it became very big for me, the power that a computer could provide to anyone with a disability, so that all disabilities could be made invisible and allow the individual the privilege and the honor of knowing that they get to choose when to disclose. I thought that was incredibly powerful, so it led me to a career in computer science and to Microsoft, and kept me passionate about accessibility.

So, that's kind of my back story. When we get into kind of why we're here, you know, I know that some of my friends that I see on the Zoom meetings, thank you for coming, have forwarded this to others that might not have the accessibility background so I'm just going to do a quick background for those that might not be as versed as others on accessibility, but really, including individuals with disabilities in product testing is really the only way to truly validate that a product or a website is in fact accessible.

You can use automated testing tools and I love them, but if you build a reliance on automated testing tools, you can create a culture where developers are coding to pass a test, and not to make it truly accessible.

If you say, okay, fine, we'll just give developers screen readers, we'll give them voice dictation technology, Dragon, others, the learning curve on those is pretty steep, so if you don't also include a significant amount of training, you end up with some unintended consequences. By that I mean I've seen at least twice in the last year and a half where a developer got hold of a screen reader, and because they were tabbing between all of the elements, they decided that everything needed a tab stop, and they were adding tab indexes to everything. They spent so much time doing that, trying to get the tab indexes in the right order and trying to do all of this stuff.

They didn't do the required work of connecting the label with the input and allowing programmatic access, so here they thought they were doing this great thing. It didn't help screen reader users at all, and they just aggravated, you know, the individuals that were using the keyboard: your power users, as well as those that are sighted but physically use the keyboard rather than the mouse.
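To make the contrast Heather describes concrete, here is a minimal, hypothetical TypeScript sketch (not from the talk; the element IDs and text are invented) of the difference between forcing extra tab stops onto static content and simply associating a label with its input so the field gets an accessible name:

```typescript
// Anti-pattern: giving static, non-interactive elements a tab stop.
// Screen reader users can already read this text with reading commands;
// the extra stop only slows keyboard users down.
const heading = document.createElement("h2");
heading.textContent = "Billing address";
heading.tabIndex = 0; // unnecessary tab stop

// Better: give the input an accessible name by associating a <label>.
// Inputs are natively focusable, so no tabindex is needed, and the
// label text is announced when the field receives focus.
const label = document.createElement("label");
label.htmlFor = "street"; // hypothetical id
label.textContent = "Street address";

const input = document.createElement("input");
input.id = "street";
input.type = "text";

document.body.append(label, input);
```

The label-for association in the second half is the piece the developers in Heather's story skipped, and it is the part that actually helps a screen reader user work with the form.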

Finally I just want to mention that there are some organizations that do a great job of hiring, they include individuals with disabilities on their team. We love seeing that, but that's not a replacement. Those team members, they grew up with the product. They know where all the mitigations are. They know where all the work‑arounds are, and they become expert power users, without even knowing it.

So, they're not a true testament to an individual with a disability being successful with your product, so I'm going to hand it back to John but before I do, I want to give just some visuals as far as when we talk about usability. John is going to go into this a little bit more but I did some internet searches and I found this company based in California, Interface Analysis Associates and the only reason why I'm highlighting them is because they were kind enough to publish their pictures of their testing room [LAUGHTER] So this is very much the typical testing room that you would see.

There's a desk that is a fixed‑height desk. There are chairs to allow you to adjust that. At most they can put paper on that desk, they can put a computer on that desk but visually what you're seeing is tons of cameras around the room, from the ceiling from the side, trying to get every angle to record when somebody enters the testing room what they do.

On the walls you see a reflection so you can see a mirror across one side and a reflection on the other side of the wall. There's another mirror.

When we flip over and look at the client observation room, you realize that one of those giant mirrors is actually a one-way mirror, so you can have clients and the facilitators sit in this back room. There's a huge monitor that presumably shows what the cameras are capturing, to allow you to kind of see what's going on within that testing environment. This is what I used at Microsoft; this is a very typical kind of experience.

So I wanted to provide those visuals, but I'll hand it off to John.

John: Thank you, Heather. Yeah, so when I got into this field, I started wondering why don't companies use people with disabilities in their usability testing and I started asking people and companies and they would tell me, "Oh, yeah we tried and it just didn't work out. We just couldn't find really great test subjects," and I started wondering and asking myself why are they not finding good test subjects.

I started thinking about it: these companies work 9:00 to 5:00, so they are looking out on the street trying to grab somebody who has a disability or who's blind during those hours, and I started wondering, the reason that person may be available is because they don't have a job, and maybe one of the reasons they don't have a job is because they don't necessarily have the technology skills to have a job, and so when they're trying to bring them in to do some usability testing, those folks just don't know how to use the system.

It just kind of stuck with me and I started thinking about it. What does an actual test environment look like? I'm happy that Heather was able to describe that, because that's what it is. You have to go to a test facility, you have to figure out transportation to that location, and then when you get there, the accommodations may not be accessible; they have these fixed desks and chairs and these systems.

I know for myself I had my own computer set up and customized to my own specifications, and a lot of people who use assistive technology have it the same way, so when you get somebody off the street to come into one of these sterile environments and they have to set up their system to their own specifications, that eats up a lot of testing time. I can see that being a major issue, and why you don't get great test subjects when you bring in people off the street.

Then I started thinking about what that testing experience looks like. Often it's a written document with "click here, click here," and when we're designing these types of testing scenarios, it's important to think about input devices and not make any assumptions, because you don't know what type of assistive technology or input devices people are using.

But, before we start to look at how to find test subjects for your usability testing, I think it's important to take a step back and take a look at why was LCI formed?

So, LCI was formed in 1936 by a local civic organization in Durham, North Carolina. The civic organization saw that on the streets of Durham there were a lot of people who were panhandling, and many of these folks had visual impairments, so what they wanted to do was create employment opportunities for these individuals.

And that was the formation of LC Industries, which is now called LCI.

When we started the business, the first line of business was creating furniture, if you remember the wicker-type chairs, and then we moved into mattresses. These types of organizations were popping up all over the country, and the Federal Government took notice and started purchasing products from them. Today, LCI manufactures over 2,500 products, we have a footprint in 11 states, we have eight manufacturing facilities, 50 retail locations, two distribution centers, and we employ over 800 people, 400 of whom are low vision or blind.

So, when we started to look at this opportunity at LCI, we really were focused on retail and distribution, but although manufacturing is a very noble profession, we realized that we needed to start thinking outside of it, because not everybody wants to go into that profession.

So in 2017 the company realized that it wanted to create technology-based jobs, and so we launched what is now called LCI Tech. The whole purpose of this was really to create those knowledge-based jobs and careers, and when we started looking at the statistics, and I figure that's what's really important, we really started making decisions based on the statistics that we had from the community. In 2016, 7.2 million people identified with some sort of visual impairment; 78% of them had a high school degree, 33% had some type of college or higher education, and only 15% had completed an undergraduate degree or higher. The real kicker was that only 29% had a full-time job. These numbers were just staggering, and we realized that we really had to do something. We had to do something, and we started looking at the barriers that were standing in the way of people pursuing these types of careers.

So we noticed that transportation has always been a barrier, as we know every day as we live in a new normal, but we also noticed that education was a big challenge, and also accessibility.

So I realized that we had to address the accessibility issue so we needed to find ways to do this.

The reason I'm talking about these stats and these barriers is because even though LCI had such a large number of employees, we had 400 people in our organization, and when we did some surveys I was really excited, because I looked at the results and saw that there were so many people with a college education and so many people who had great experience with computers, and I thought this was going to be the great foundation that we needed to launch LCI Tech.

But as I started doing the assessments of these individuals, as I started on the survey, I realized that a lot of them had gone to college and had these prior careers in technology before they lost their vision, but once they lost their vision, many of them did not continue with their education or continue using assistive technology. So we realized that we had to look externally for talent, and today we have 10 people on the team at LCI Tech, seven of whom are blind or low vision. But we had to look externally, and it took us some time to find them, so we did face challenges; although we're always looking, it's something we really had to work at, finding people as we build our organization.

With that I'm going to pass it back to Heather.

Heather: So really, I think what John's highlighting is the fact that finding the right participants is so key.

So, at Equal Entry a lot of our clientele are looking for a specific persona, right? When they're developing their software, their application or their website, they have in mind a specific persona, and I can use the example of Tableau. They're a company that is on our website and that I work with. They are an enterprise-level visualization product, they work with a large amount of data, and they are a business tool. Their primary persona is educated.

So they want to include participants with disabilities that meet that education level, that would be business professionals utilizing their product, and we need to make sure they're proficient with computers. If we have a customer, even if it's a retail website, and no, their main persona doesn't require a college education, they do still expect computer proficiency, because if they don't get that from a usability participant, then they're going to kind of unfortunately blame the user as opposed to their software. So we want to make sure we're giving them the most valuable feedback, feedback they can't really argue with, that they have to accept and go, "Oh, okay, yeah, we really need to address this, because even somebody that is highly proficient with a computer and proficient with their assistive technology hasn't been successful," and therefore they need to fix it.

The final component I'd say for our participant is willingness to try and fail. I think this is true for any usability study. You're putting people in a position, you're asking them to do something that's a little off the wall or a little new, outside their comfort zone. They really got to be willing to try and just accept that, "Yeah, it might not work and that's okay."

Then the final thing that we feel very strongly about at Equal Entry is that these participants should be paid. This is not something that you should be looking for volunteers for, because you are looking for someone who is educated, proficient with a computer, and proficient with assistive technology. They have a specialized skill. They're providing you with valuable insight. There should be some payment, some monetary kind of contribution back to them.

So, with that, once you find the right participants, what's worked? At Equal Entry we've been working with LCI for over a year now, and essentially what we found is that when we tried to do things ad hoc, we had a lot of technical setup issues. It was, "How do we do this again?" "Which technology did we need to join?" "How do we set up the meetings?" "How do we get the technology aligned?"

So we've now started having weekly office hours, and that's cut down on the amount of churn from a technology perspective. We also solely use Zoom. Zoom allows us to share the computer audio as well as the visual of the individual and their screen, so we get great facial expressions from the tester that we can pick up on immediately, and we can record the audio from the screen reader as well, and we'll give an example of that in just a second.

Then, at Equal Entry we're very specific about the fact that we write our scripts to be as inclusive as possible. As John was alluding to, rather than directions that say "click on," right, it's "activate," so they can choose which input device they're using in order to do that.

We also found, working specifically with LCI, that it was difficult for them to switch back and forth between the application that they were working with and the notes or the digital information we were providing as far as what the scripts said they should do, so we have found that it's better to have a facilitator verbally tell an individual with visual impairments, to have it read to them, so that they can take those actions without having to switch back and forth between applications.

So, with that, I want to pause and I'm going to show a 2.5 minute video, and this is kind of an early session that we did with LCI and Vanh is the person that was helping us here and what we're going to see is she's working with the Equal Entry website.

We had just done a refresh of our About page and we had a developer and a junior tester working on it that had given it the thumbs up, and then we had Vanh do a quick review.

For those following along, I'm going to quickly put the link to the full presentation back in the chat window, if I can find where the chat window is. Can I ask somebody else to... oh, here we go. I can't find it. It's lost on my screen somewhere, I apologize, but if somebody could resend the link to the actual PowerPoint if you want to follow along: the notes section of this slide, the presentation notes, has the full transcript of what you're about to hear, and if you are new to listening to a screen reader this is going to be very fast, so you might want to follow along with that transcript in order to be able to understand what's going on, but I'll also give you kind of a debrief afterwards.

So we all cross our fingers and hope that technology works for us today.


Begins 2.5 minute video

[Full screen of the Equal Entry Website, bottom of the Our Work page showing the Contact Us form]

Heather: Why don’t you move to the About page and find out a little bit more about Thomas Logan.

Vanh: OK

[Focus moving within the Equal Entry menu bar]

JAWS: Our Work | Equal Entry

Banner re..

Blog

About

Contact Us

About

Menu Menu

Vanh (in an almost whisper): About

[New page load]

JAWS: Title is About | Equal Entry

Banner region

Skip to content

Heading level 1

Site Navigation region

Visited link Home | Equal Entry

Services menu 

Our Work

About

Heading level

Our History

[Screen scrolls rapidly to the footer, image of Heather Burns shown above the footer]

Subscribe to our mailing list

Wrapping to top

[Screen scrolls back to the Equal Entry menu bar]

About 

Heading level 1

Heading level 2 Our History

Blank

Equal Entry was

Over the p..

[Screen scrolls rapidly to the footer, image of Heather Burns shown above the footer]

Main region end

Vanh: huh! Ok.

[Screen scrolls back to the Equal Entry menu bar]

JAWS: Our History

Heading

Heading

Blank

Contribute

Main region

Site navigation

Search 

Site Contact us

About Menu

Menu

About Menu

Leaving menus

| Equal Entry

Heather: Hold on. You were going quickly there. So

[screen scrolled down just a bit to show under the 2 paragraphs of History there is another heading “Our Team” and the top of an image]

Vanh: yeah

Heather: What just happened? So, I was hearing you, hearing about the Our History

Vanh: Uh-huh.

Heather: And then you went to the end of the page

JAWS: heading level 1

Vanh: So you said, go to the About page and look for Thomas

Heather: Yum-uh

JAWS: Heading Level 2 Our History

Equal Entry

Over the past six years, Equal Entry has grown into a global company, with team members joining us from throughout Europe, Asia, and the Americas. While we are fluent

Vanh: And then, I think I jumped back up. Ok hold on.

[Screen scrolls rapidly to the footer, image of Heather Burns shown above the footer]

JAWS: Main region end

[Screen scrolls back to the header Our History, also visible is the heading Our Team]

Our Equal Entry was founded in 2012 by accessibility expert Thomas Log...

Heather: So that, um, how did you get to that Main region end? What keypress were you using there?

Vanh: to get the Main Region end?

Heather: Yeah, cus you were reading, you, the focus was in the Our History and you're reading that text and then I am hearing an announcement of region end.

JAWS: Region End

Vanh: Oh, I arrowed down. Or, um, I was using the paragraph, uh, command, and moved JAWS too fast, that was all.

[Screen scrolls rapidly to the footer, image of Heather Burns shown above the footer]

Heather: OK, so

[Screen scrolls back to the header Our History, also visible is the heading Our Team]

JAWS: Equal Entry was founded in 2012 by a

Heather: Ok, keep going, I’m just trying to ah, follow along with what you're doing from a keyboard perspective. 

Vanh: Uh-huh.

Heather: So and here is the first time I really lost you.

Vanh: Ok sorry about that. So it looks like I am here and

JAWS: Heading Level 2 Our History

Vanh: That is the heading and there are 2

JAWS: Equal Entry was found

Over the past

[Screen scrolls rapidly to the footer, image of Heather Burns shown above the footer]

Main Region End

Vanh: paragraphs here.

[Screen scrolls back to the header Our History, also visible is the heading Our Team]

JAWS: Over the past six years, Equal Entry has 

Vanh: Describing the history of Equal Entry

Heather: Yes

Vanh: Looks like that’s it.

Heather: Ok I’m going to say, we’re, you might have just found a bug for us.

Ends 2.5 minute video


Heather: So visually, what you can see on the screen is the Equal Entry About page: Our History has the two paragraphs, and below it you can see that there is another heading that's just barely readable, Our Team, and then there are images, just the very top of Thomas Logan's head. When she was hitting that last part and going all the way to the end, you could see there was another image, and it was of me, so there's clearly content in there that she was missing.

What was identified with a little bit more investigation is that we had a misuse of aria-hidden that had misfired and was hiding the rest of the content from assistive technologies, so the automated accessibility testing that was done passed no problem, and our tester who was using a screen reader was tabbing to active elements and was able to find and read all of the active elements on the screen.

For a native screen reader user that is just trying to navigate the page, clearly they were hitting some significant problems. So it's a quick and easy fix, and it was only up for a little while, I promise. But it was fascinating to me, and I thought it was a great example of how you can do some very basic tests and get a good sense of what an individual would find, but then actually running through some very specific scenarios, the big key issues, and making sure people are successful is, as I said, that gold standard of making sure something is accessible.
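As a hedged illustration of the kind of bug Heather describes (this is not the actual Equal Entry markup; the IDs and text are invented), the TypeScript sketch below shows how aria-hidden on a wrapper silently removes everything inside it from the accessibility tree while the page still looks fine visually and still passes a tab-through check:

```typescript
// The attribute was meant for one decorative element but landed on a
// wrapper around real content, so every descendant is hidden from
// assistive technology.
const teamSection = document.createElement("section");
teamSection.id = "our-team"; // hypothetical id
teamSection.setAttribute("aria-hidden", "true");

const heading = document.createElement("h2");
heading.textContent = "Our Team";

const bio = document.createElement("p");
bio.textContent = "Thomas Logan founded Equal Entry in 2012.";

teamSection.append(heading, bio);
document.body.append(teamSection);

// Sighted users still see the section, and nothing inside it is
// focusable, so tabbing through active elements and many rule-based
// scans report no problem, while a screen reader reading down the page
// jumps straight past this content to the end of the main region.
```

This is exactly the gap Heather's debrief points to: a native screen reader user reading the page caught what the automated check and the tab-order review missed.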

[Video: So, why don't you move to] — it started to play again. Sorry about that.

Really, the lessons learned: sharing the screen with the computer audio is absolutely key. We've tried this with other technologies where I couldn't hear what the screen reader was saying, and it was all on the participant to communicate what they were experiencing in their own narrative, and it just wasn't as effective. I'm pretty versed in screen readers, so you can tell I was allowing Vanh to use her screen reader at her own speed, the way she wanted to use it, and I could follow it; with experience that's pretty easy to do and you can understand where she was on the screen, but you really need to have that computer audio.

Individual volume adjustment is something that we would love to see, because sometimes you want the screen reader to be slightly quieter and to pump up the volume of the individual that is participating, so that you can hear what they're saying when they're talking over their screen reader. We would love to be able to do that independent volume adjustment, and that's not there yet, so I'm putting in a plug for everybody to make that request to their technology vendors. And I've kind of alluded to this as far as what Zoom allows me: in the recording all you saw was Vanh's screen, but when I was participating I could see her facial expressions, and that is pretty critical, from surprise, frustration, confusion, right?

You heard her kind of go, "Huh? Oh!" And seeing her facial expressions helped me understand those vocal expressions.

If the facilitator isn't used to the speed of the screen reader, it is important to ask the participant to slow their screen reader down. Again, it's just that critical ability for the facilitator to hear exactly what the participant is hearing, to be able to make some judgments about what might be the problem and to articulate it.

The other thing, as you heard in the video, is understanding what keystrokes are being used, how they're choosing to navigate at that point, and why they're choosing to do that. So in general, our biggest takeaway, and I don't think we hit on this, but when we're talking about why companies don't do this type of testing, a lot of it has to do with the fact that they require that physical testing environment.

So we're big believers that we want participants using their own computers. We want participants using their own assistive technology that they've customized and set up.

So really, to host your own usability component and include individuals with disabilities, we really suggest investing the time to find the right participants that meet your requirements for your product or your website: what's your normal persona, or your target development persona, and then look at it from, okay, "If the individual has a disability, what would the requirements be? What would they look like?"

Are you willing to have somebody that's brand new to assistive technology or not? There are various times when we've had clients that are like, we are working on keyboard accessibility, we're not ready to work with screen readers yet, we're taking an interim step in our accessibility plans and we're starting here and then going to move on. So make sure you match the right participants, and then we're hitting on this.

Plan on remote participants, and that means the development environment has to allow an individual at their own home to access it using their personal equipment. Then budget the right amount of time, and I'm going to say multiple sessions, and cancel if you need to, because you never know: that first session you could have total technical difficulties just getting everything set up right to be able to get good data back. If you have multiple sessions, you don't feel rushed to get through everything, and you can just cancel the further sessions.

So, and then the final one is set expectations up front, and that is expectations with the product development team that is asking for this as well as with the participant. Oftentimes we would recommend doing this type of usability study after you've done a traditional accessibility evaluation, to determine where you think an individual is going to be successful and where they might have problems, setting that expectation up front, letting a participant know, "Okay, in general, this is an accessible website. There are a few areas where we're going to have some gotchas, so I'll try to warn you of those or avoid those, or just let you know we're going to purposely hit some of them."

Just having that expectation up front allows the participant to feel a little bit more comfortable about when they do hit a stumbling block. So, and with that, I think we've made it to our Q & A.

Santina: Awesome and folks feel free to take themselves off mute to ask as well. I'm switching over right now to our Slack. I know that we did have some questions crop up in chat, so bear with me while I locate that.

We had a question from Josh: "Are you testing on mobile display as well?"

Heather: So LCI has participated with doing Android as well as iOS testing with us. I will also say that I have traveled to LCI. I have the advantage of being in North Carolina with them, so I have traveled and been able to sit next to them. Honestly, I find the remote just as effective. Once we've gotten through all the hurdles with making sure that they have the right technology to cast their mobile device to the computer screen, and then share out that computer screen, we're off to the races. So yes, we've done iOS, Android, as well as websites and web applications.

I think we did one Windows client, so we've done a whole host of things with them.

Santina: We also had a question from Steve: "I realize this is very bespoke and everyone is different but is there a typical pricing range for facilitating testing in this way? If we wanted to make the case to do this, I feel the first question is going to be how much?"

Heather: That is an awesome point. We did not discuss ahead of time if we were talking pricing at this type of event, so I think I'm going to have to say contact John or I after? I don't know what protocol is for this.

John: I'll say one thing, Heather. One of the main things is we've always tried, at least from a usability perspective, to make it attainable for everybody, because at the end of the day we want to make sure that people are not skipping it because it's cost prohibitive, so we've been trying to make sure that it is reasonable for people and their budget. That's on the usability side; we try to do as much as we can.

Santina: Great, thank you. Folks, if you want to go ahead and put questions in chat either here on Zoom or on YouTube Live feed. We'll go ahead and answer those questions either way.

Heather: I'm catching up on the chat.

Santina: Looks like we have a question from Andrew: "How do you balance leading testers versus letting them organically way‑find the task? It seems there is a good amount of prep to even have the conversation."

Heather: That's an excellent question. So in the video that I showed, it's kind of, go to the About page and look up Thomas Logan. That's pretty broad, and I also typically start off any session with just a general explore, right?

I'm giving you a URL and telling you this is what we're working with today, "Why don't you just take a look around and figure out what this thing is." So you give somebody a URL you don't know if it's a web application, if it's a website so I often just let them explore a little bit. Again, you're hearing the screen reader and going and kind of hearing how they explore, which I find pretty valuable.

It is specifically when you have a site or an application that you know is not as accessible as it really should be that you end up having to guide a lot more. And we warn clients about that: you want a level of accessibility to be there in order for someone to be successful. Being able to show a solid scenario of somebody being successful is what we try for, and it's just kind of a carrot to put in front of clients, like, do good and then you'll get this, right? [LAUGHTER]

You'll have somebody be very successful and you'll have a recording of it, so it kind of goes both ways, but if it's a very accessible product then less leading is needed is the best way to say that.

Santina: Awesome. Next question: "Have you ever tried visiting the subject at their home or a neutral location?"

Heather: So I've met with LCI at their place of work. As far as going to the person's home, I can say yes, in my past. So, I showed the pictures of the usability testing facility; the first time that I, as a program manager, requested individuals with disabilities, we ran into lots of problems.

Getting people to the testing site, the testing site's accessibility, getting the accessibility software on the computer, essentially it was an abysmal failure so the next usability study I did at Microsoft, I said I want to give people computers and I'm going to pre‑load them with our beta software that's not out to the public and it's going to be locked down so they have access to it, but we'll install it in their house and to my manager's credit, they go okay, how many computers do you need? It's Microsoft, okay, fine!

So, here I am loading up brand new machines, still in their boxes, putting them in my car, and I'm like, somebody is going to stop me. Nope! So, I did put them in individuals' homes, and they were people that we had found through advocacy organizations for the blind, and honestly, we didn't get great data; of the participants, only one had computer skills advanced enough that they felt comfortable utilizing the system.

The idea was that they would have it at their home, and because of the ability to pre‑load the beta software, it had to be a pretty powerful machine at the time, it had to be locked down from a Microsoft privacy perspective, but we said, "Hey, you can install your own assistive technology and you can customize it and we'll give you lots of time with this."

Came back later and again only one participant had actually used it. Everybody else was like yeah, I turned it on. Didn't really, yeah, didn't use it in any way.

So, I have been to participants' houses. I don't really see a huge difference between somebody's comfort level being in their home versus their comfort level being at their work environment, and when working remotely, they could be at work or at home; it doesn't matter to me.

John: I have a quick question for Heather. I know that our schedules have been a little wacky the last several weeks when we started working from home but have you worked with the team at home?

Heather: Yeah.

John: How is it different?

Heather: Pretty much the same. So, I've worked with Vanh and several others both remotely and face-to-face, when they're at home and at LCI.

The key is having it be their own system, having their comfort level with their assistive technology, as you throw something new at them from a usability perspective.

John: Okay, thanks.

Santina: Kind of building off that we have a question of "How do you overcome a problem when testers," excuse me, "How do you overcome a problem of testers getting nervous when observed?"

Heather: That takes a little bit of time. It's just rapport, I think. So hopefully I'm a welcoming voice, and it's not too painful to listen to me as we go through this webinar [LAUGHTER]. But really, part of that is setting expectations early on with the participants: you've got to be willing to try and fail, because that's what we're looking for.

If you succeed beautifully, that's interesting data, but it's really interesting where and how you fail. So that's what we're going for; they've just got to be willing to do that, and be comforted by knowing that, hey, every failure that they make is a failure that the actual general public won't be making, because we're going to be able to fix it.

Santina: Just want to pause for a thank you that I just saw in the comments. "Heather, your excitement and passion is contagious. Thank you for this information and sharing your experiences. This is great!" And I agree with that so I just want to make sure that you knew that that was happening in the chat.

Another question that we have: "How long do you recommend your testing sessions be, per participant?"

Heather: We hold it to pretty much an hour or less, because at that point it gets draining. So the shorter sessions typically work better, and an hour block allows us to get through any technical issues and still get valuable time; any shorter, you might end up just having technical issues and not getting any data. So we block it for an hour, and if we get through, we can call it early and thank everybody for the great insight, and we're going to go run off and go fix things [LAUGHTER], or use the full hour but then cut it off and say, okay, we'll rejoin. And again, having multiple sessions really does help us to be able to pick up and revisit things.

John: One thing is that you also have consistency, Heather. You have a schedule for when you're going to be doing this testing, which makes it consistent for the team, which is great.

Heather: Definitely cuts down on the technical issues.

Santina: I do have more questions coming through with chat but Dan has his hand raised so I wanted to pause and allow Dan, go ahead and take yourself off mute and ask your question.

Dan: My question is, thanks. I do braille testing in my current job, which ends Monday. I've been advocating that people need to test with braille displays, because I know when I've done iOS testing of an app or a website, I've caught things where something might work with VoiceOver speech but not with braille, with a braille display.

Heather: I don't think we have done any specific braille testing with LCI, but again, of the participants that I talked about from early on in my Microsoft career, the one that was the most computer-proficient was a braille user, so he used a combination of JAWS as well as his braille display.

Dan: Yeah, that's what I do, JAWS and VoiceOver.

Heather: It's multiple inputs, right? The more advantages you have to get the information you need, so yeah, I love it, but I don't think we've done anything specific, and I think it would be interesting to look at a usability study where we just did braille without the assistance of the screen reader. It would be interesting to get some data points, as far as the general public, on how often a braille display is used in conjunction with a screen reader or in isolation, and then be able to build usability tests based on that.

Dan: You could look at the WebAIM website. They might have statistics there.

John: Yeah, and actually, everybody on the team, I think everyone on the team except myself has braille and they have multiple braille displays on the team so that's definitely something.

Heather: So they might have been using it without me realizing it and that's fine, if that's information that they have access to but they haven't specifically pointed out that they're using it.

Santina: Another question we have: "How many testers with/without disabilities do we have and do you analyze that together?"

Heather: Analyze the results together? That's, I think that he's referring to Equal Entry and LCI analyzing the test together?

Santina: I wonder, feel free to take yourself off mute to clarify. I read that as the people with and without disability analyzing those results together. That's just my interpretation of somebody else's questions.

C1: Yeah, that's my question, so I have experience testing with testers with disabilities so I just am thinking in the future, if I incorporate that then how would I analyze data from different groups of testers?

Heather: The best user scenarios are the ones that you're doing with both sets of groups. So, if you have a key scenario, logging into a web application, being able to search for something specific and find an article, read a full article, those are very basic tasks that you can give to any user, and you should see the same success rate between an individual that's part of a general usability study and participants with disabilities.

The goal would be to be able to say, okay, this scenario was not successful for anyone, this is confusing and we need to go back to the drawing board from a design perspective; or, okay, this class of user had no problem with it, but individuals with disabilities, be it a vision impairment, a mobility impairment, or something cognitive along those lines, were having failures here. As far as the number of participants, that really depends on the product and what data you want to get out of it, but from a cost perspective, somebody brought that up before, there's always cost, time, and the ability to take that feedback in. If I can get you wonderful results with 25 participants, but it's going to take me three months and we just missed the ship window, so you can't actually make any changes based on that feedback, then that's not a good study. So you really are trying to narrow that down to, well, how much time do I really have to be able to influence and be able to make changes?

So, sometimes, it's one. Right? One is better than none. [LAUGHTER] So that's the way I look at it. If I can get more, I will, but it's really when can I get them in, how can I do the test and get it back so the developers can make change.

C1: Okay, thank you.

Heather: Great question.

Santina: Next we have three questions. "So, do you also engage user testing at design or prototype phase?" "How many test scripts do you normally start with at initial stages and what's the best practice during," excuse me, "what's the best practice duration for a session?" "Are there any privacy consent requirements for recording video and audio?"

Heather: Okay, three parts. I found it in the chat, so—

Santina: If you'd like me to repeat any part while you're answering happy to do that as well.

Heather: I appreciate that but I think I found it. So I will say prototype phase is difficult because normally you have some very rough images that you can't interact with a screen reader with.

So it's a more theoretical kind of conversation at that level, and a reliance on traditional accessibility consulting knowledge to guide a development team. So I would say that the usability studies with individuals with disabilities come a little later in the process. There are usability studies done with prototypes, and it's normally paper-based or image-based stuff, so that is one area where you just have to have more of a conversation with a participant if you're going to do that.

How many test scripts do you normally start with at the initial phase? Again, it's how much time do we have and where the developers can be influenced. So, if you're just starting with a client, we typically want to say that a minimum of four scenarios really gives them a taste of what this would be like, and then you build from there. But there are other clients that start off with, no, no, no, there are 15 key things that we absolutely have to make sure are right, and we're going to have the different teams all lined up, so once you give us data we're going to go and fix it; and it's like, okay, we'll get through as much as we can, and then we set up those timings.

So I think that covered the first two, the last one being privacy consent. Absolutely. Yes. We have non-disclosure agreements with our clients, and we have a partner agreement with LCI that includes that non-disclosure, and we have the agreement between LCI and us for being able to record those videos and share those assets with our clients so that they can see the full experience, or we can cut various videos down to a specific issue for them.

Santina: Great. John, this one is directed specifically to you: "Do you have testers who can test with speech recognition technology?"

John: Yeah, so we also have people who use, what is it, Dragon NaturallySpeaking, so we do have people who use that, and we do offer that as well.

Santina: Great. That's all I'm showing, Nick and Karen, but let me check if there's a question in the chat. Do you see anything from Twitter or in the chat that I've missed? Anything on YouTube?

Nick: Nope. I don't see anything else.

Santina: Great.

We'll go ahead and just pause. Again, thank you for your wonderful insights, and thank you for taking the time to answer the questions, and also for your wonderfully prepared presentation. This was very insightful and helpful, just to learn about usability testing with folks with disabilities.

We have another question. What would you say are some of the biggest challenges in communicating discovered issues with developers?

Heather: Honestly, once we get to having a user with a disability and a recording of them struggling, it's an easy conversation for developers. When it's an issue that was identified through an automated testing tool or an accessibility consultant, I mean, I consider myself very educated and very adept, but they're going to still look at it and go, "Really? Is that true? I'm going to try to argue." When you have an individual with a disability that is truly struggling, and it's so obvious, they're very motivated to fix that.

So, we really haven't had a big challenge once it's identified by an individual with a disability. I think that's one of the most powerful components of this.

So, with that, John? Do you want to?

John: Yeah. For sure. You know, I think that our hope from this presentation is that you leave here seeing value in including professionals with disabilities in the usability testing process, along with some of the best practices that you may have learned from here; I think one of the ones that Heather really pointed out was the importance of allowing subjects to use their own technology and their own environment of choice. Those are some of the key things. And then also, there are organizations like LCI and others like ourselves across the country, and we urge you to tap into them to access these professionals, and it's important to partner for this type of testing, so hopefully you can take these away with you, and we really appreciate you spending this time with us.

Santina: Awesome. I think that we'll go ahead and close it out with this last question. We are almost at time for final thoughts.

We have a last question: "Do you have any tips for typical people to get used to testing with blind or low vision subjects?"

Heather: John do you want to take that one or do you want me to?

John: You can.

Heather: [LAUGHTER] Okay, so honestly, I don't know if it's any different than any other usability subjects as far as I would encourage you to be honest about yourself as far as disclosing letting the participant know what has worked well for you in the past, what hasn't, what might be some areas of concern, and being very open.

Specifically if you're talking with somebody that is raising their hand and asking to help improve the accessibility of a product, they are already eager and they want to be there to help. So if there are ways that they can help communicate that, then just be up front and honest, and have that dialogue. I don't know if that was overly helpful or not but that's how I would approach it. [LAUGHTER]

John: I think that answers a lot of things, right? There's things such as being authentic with others.

Santina: Absolutely. Just want to echo what everybody is already saying in the chat. This was a great session! Thank you both so much, Heather and John we really appreciate you joining us and sharing your insights with us!

Heather: I have truly enjoyed being here and getting to know the Chicago meet-up while we're dealing with the coronavirus and while everybody is changing the world. This will be part of my memories, and one of my big hopes is really that as we see more people working remotely, this is going to become more customary, society is going to accept it more, and that's just going to make life for individuals with disabilities so much easier.

Great! Thank you all; I appreciate the time.

John: Thank you, guys.

Santina: Thank you. All right, everybody just a reminder we will go ahead and post the recording on YouTube, once we close this out we'll also be adding a transcript to this as well, so thanks, everybody for joining us from across the globe!

Thanks for joining us and again follow us on Twitter. Have a good evening!