Episode Notes
Welcome back to another episode of AccessWorld, a podcast on digital inclusion and accessibility. In this episode, we take you to the exhibit floor at the annual AFB Leadership Conference to meet several of the exhibitors working in the digital inclusion space: Aira, Selvas BLV, and Vispero.
This is part one of two from the AFB Leadership Conference.
We’ve been on break the past month taking care of some housekeeping. Part of that work was moving to a new podcast hosting platform. After hitting our one-year milestone for the podcast, we wanted a host that would allow us a bit more flexibility. AFB is now on Pinecast.
AccessWorld is a production of the American Foundation for the Blind (AFB). The podcast is an extension of AccessWorld Magazine (https://www.afb.org/aw), a quarterly online magazine focused on building a more accessible world through digital inclusion.
To read the latest issue of AccessWorld, or to dive into our archive of back issues from the past 24 years, visit www.afb.org/aw.
About AccessWorld
AccessWorld is a production of the American Foundation for the Blind (AFB). Aaron Preece is the editor-in-chief of AccessWorld Magazine, an online publication celebrating its 25th anniversary this year of promoting digital inclusion and accessibility, and Tony Stephens leads communications for AFB.
About AFB
Founded in 1921, the American Foundation for the Blind creates equal opportunities and expands possibilities for people who are blind, deafblind, or have low vision through advocacy, thought leadership, and strategic partnerships. In addition to publishing AccessWorld and the Journal of Visual Impairment & Blindness (JVIB), AFB is also the proud steward of the Helen Keller Archive, which is available on the AFB website at www.afb.org.
AccessWorld Podcast, Episode 13 Transcript
Intro (00:00):
AFB, you are listening to AccessWorld, a podcast on digital inclusion and accessibility. AccessWorld is a production of the American Foundation for the Blind. Learn more at www.afb.org/aw.
Tony Stephens (00:30):
And welcome back, everybody, to another episode of AccessWorld, a podcast on digital inclusion and accessibility. AccessWorld is a podcast from the American Foundation for the Blind. To learn more about AFB, visit our website at www.afb.org. We've got an exciting episode for you here today. We just wrapped up our AFB Leadership Conference, and we are thrilled to have Aaron Preece, our editor-in-chief of AccessWorld, hitting the rounds at the conference and visiting all the exhibitor tables. We'll be hearing from Aira, Selvas BLV, and Vispero in this episode of AccessWorld. We've got more to come later this month, but enjoy the podcast for right now as we go to Aaron, who's on the floor at the 2024 AFB Leadership Conference.
AIRA
Aaron Preece (01:23):
Hi, this is Aaron Preece, and we're here at the AFBLC 2024 Exhibit Hall. We're going to be talking to Jenine from Aira, so I will pass it over to you to let us know about Aira for anybody that's not familiar.
Jenine Stanley (01:38):
Sure. Hi, Aaron. It's wonderful to be here at AFBLC. For people who are not familiar, Aira stands for Artificial Intelligence Remote Assistance, which really ties in with your conference theme here. We are a visual interpreting service, and that basically means that, like sign language interpreting, we do something similar with visual interpreting for blind and low vision people, or folks who just have difficulty with too much visual stimulation, et cetera. We have trained professional visual interpreters and a very secure platform, so we are able to provide you with descriptions and help you navigate just about anything. If you can do it, we can interpret it. We also have event services where we interpret public events or private corporate events; for example, we'll be providing visual interpreting for the vice presidential debate. Along with that, we also have an AI component that is called Access AI, just to make things really confusing.
(02:52):
That particular component does image recognition, identification, and description, but it has a different feature called Verify. You can verify the results of your AI description with an Aira visual interpreter, so they can take a look at the image, and if the description is correct, they'll just say the description is correct. If it's incorrect, if the AI hallucinated or got something wrong, then they're going to tell you what it got wrong, basically. This is really handy for things like gauges, dials, and controls, where AI sometimes gets very confused, and also health testing. The best example is a COVID test; AI does not read COVID tests well. So if you are using it to look at some type of test like that, you can verify it quickly with a visual interpreter, or you can also contact one of our live visual interpreters and talk to them one-on-one.
Aaron Preece (04:01):
Very cool. So it seems like that would be a great feature too, whenever you are in a situation where you need to recognize something and you want the verification, but you might not be in a situation where you can actually speak to someone. So it seems like you have a lot of flexibility there.
Jenine Stanley (04:15):
Exactly. Exactly. And we are also increasing our number of access partners, which means places where you can use Aira for free because the entity is sponsoring the time that you're using Aira. That includes places like airports all around the US and Canada, and I believe we have one in the UK, but it's definitely US and Canada, plus Starbucks, Target, Bank of America, and TD Bank, and we've got some big partners coming up. We're also in places like the entire state of Colorado: all of their state services. If you're using one of them, you can use Aira to help you navigate, including their state parks. So if you go to a beautiful state park in Colorado, you can use Aira to help you navigate that park.
Aaron Preece (05:07):
That is very cool, and those are very good places where you would really benefit from a visual interpreter, for sure. So I know that Aira is available through your phone's camera and that sort of thing. I know back in the day there were some smart glasses that you could use, and from what I understand there are smart glasses now that you can connect to in various ways. Could you give us some information on that?
Jenine Stanley (05:30):
Absolutely. So we right now are available on the Envision Glasses, and this is Google Glass. We are available through the Envision product, and that is called the Aira agent; you can get to it through the Envision Glasses. And we're also starting a beta program with the Meta Ray-Ban smart glasses. The way this works, you can sign up, and it's similar to when we rolled out our Access AI: we bring on board maybe 25 people at a time in a session, and so people will be added to the beta as we go. Right now we are working with the WhatsApp application to get into the Meta service and be able to use the camera from the glasses, so you're completely hands free, your phone's in your pocket, and you can actually use Aira. The program is pretty small right now, but it is growing. So if you have the Meta Ray-Bans and you'd like to get involved, you can go to the Aira app under the More tab, go into the smart glasses tab, and there's an application form right in there.
Aaron Preece (06:41):
Perfect. So anything else in particular that people should know about, or anything that you would want to bring up or want people to be aware of?
Jenine Stanley (06:50):
Sure. We have a program now, a new initiative called Build AI, and this is working with one of the major AI providers, one of the major GPT providers, and we are giving you 10 free minutes every half hour.
(07:04):
And you can choose to have that 10 minute session shared with our access partner or not. It's totally up to you to opt in and opt out, but the program is there if you'd like to sign up. It's for the United States only right now, but we're hoping to expand it once we get the initial testing done. And all of this is to craft a better AI for blind and low vision people so that we're including data that's important to us and relevant to us versus that big general block of data that goes into the GPT.
Aaron Preece (07:40):
For sure. I imagine especially when we're getting information in about how to describe things and what sort of information is important and that sort of thing.
Jenine Stanley (07:49):
Yeah, how to give directions. If we can teach the AI to give directions, we can probably teach anyone. And that's the joke. One of the jokes about Aira is you'll never hear us say it's over there, not going to hear that. So we're hoping to build that into the AI along with many other things that we have no idea what we're going to discover from this research, but we're excited to find out.
Aaron Preece (08:17):
Sounds very cool. One thing you mentioned is Aira in general being very secure, and you just said you're never going to hear "over there." One thing I'd say to people is, when I've used Aira, the interpreters are very professional. As far as I understand, people are trained to describe things to people who are blind or low vision, and it works very well.
Jenine Stanley (08:36):
They go through extensive training, and then some monitoring, and also continuing education. When something new comes out like the Meta glasses, or something that people are into like a sport or an event that's coming up, we try to provide them with as much information as we can. We also recently became HIPAA compliant, so if you're in a job that requires HIPAA compliance and you would like to use Aira as one of your access tools, you can now do that, and we'd be happy to talk to you about that.
Aaron Preece (09:08):
That is very cool. I feel like it's very rare to have access to that sort of thing in any circumstance, so that's amazing, and a lot of people I'm sure could benefit from that.
Jenine Stanley (09:18):
I think for folks, even getting live human volunteer help can be a big stumbling block at times.
Aaron Preece (09:27):
Alright, well thank you so much for speaking with us here at the AFBLC.
Jenine Stanley (09:34):
Sure, thank you. And let me give you our contact information for sure. You can go to our website to learn more and that's aira.io, or you can call our customer care team. They're amazing. They're available from 6:00 AM to 6:00 PM Pacific time, seven days a week. And that's 1-800-835-1934 or support@aira.io.
SELVAS BLV
Aaron Preece (10:02):
And we're at the Selvas BLV booth. So I'll turn it around here and hand it over to Antonio to talk about what Selvas BLV is doing and what products they have.
Antonio Guimaraes (10:13):
Hi Aaron, thanks for having us here, which is really great for me personally; I live in town, but we do travel all over the place and would've come anywhere to be with you guys. I am new with Selvas, newish: I started in February of this year, 2024, and when I came on board we were releasing the Braille eMotion. We released that at CSUN. That is our 40-cell smart braille display. It is a media player and note taker with connectivity to five Bluetooth devices and one USB device. It's a very versatile device, and I encourage folks to learn more about it and become familiar with its features, which are many. It does have speech output, so if I go here —
Selvas BLV Device (11:16):
File Manager
Antonio Guimaraes (11:18):
— that is connected to a Bluetooth speaker, so it also connects via Bluetooth to speakers, keyboards, and so on. We took many of the very useful applications from the BrailleSense 6, our 32-cell braille display flagship product, now in its sixth iteration. It's a 32-cell braille tablet with an LCD screen, braille input, and tons and tons of useful apps. It does email, and it integrates, right out of the file manager, with Dropbox, OneDrive, and Google Drive as well. And again, I would encourage folks to become familiar with this; I just wanted to be really concise. The BrailleSense 6 and its smaller sibling, a 20-cell braille display, are Google-certified Android devices, and you can go on the app store and download anything that's up there that we don't provide right out of the box.
(12:40):
The last device I want to talk about is our SensePlayer. It is about a year and a half old right now, in September of 2024. It is a media player with capabilities like internet radio, FM radio, and podcasts. It'll download and read books from Bookshare and other library sources. It does, or will very soon, have NFB Newsline capability, so you can download any publication there and read it from your SensePlayer. It's good to be here, right at home. My first AFB Leadership Conference was, I believe, in 2014 when it was in New York, and this is my second.
Aaron Preece (13:41):
Alright, so you mentioned the eMotion and then the BrailleSense 6. It seems like the BrailleSense 6 is for someone who's looking for a full tablet experience all in one device, since you said it's essentially an Android tablet, as far as I understand. And then the eMotion is more of a braille display, but it also has some built-in features. So it's for someone looking for something to connect to a device that also has other features built in, but more low key, or where you might be doing most of the work on a computer or another tablet or phone. Is that correct?
Antonio Guimaraes (14:22):
You can think of the Braille eMotion as our replacement for the Braille Edge. I wasn't a user of the Braille Edge myself, so I'm not completely familiar with it, but over here I've turned off speech. Looking at the braille, there's a file manager and a notepad, so you can take full braille notes in class or anywhere; then connectivity, a document reader, and a DAISY player, which will play via speech or which you can read on the braille display; a media player; it is also a sound recorder; and there's a series of functionality through library services like online DAISY, Bookshare, and so on. There's a Sense Bible here that we brought over from the BrailleSense 6, and when we go to the next release, it'll have guard mobile and the NFB Newsline and so on. I started with these kinds of devices with the Braille Lite, and it was pretty light, really. I used the Braille Lite quite a bit back in the late nineties, and this is like what the Braille Lite could be 20 years later.
Aaron Preece (15:52):
Gotcha. So it seems like it is very, very full featured; it's a full-featured note taker in itself in addition to being a braille display. So with the, sorry, the SensePlayer: it seems like it's internet connected and you have access to podcasts. If someone's going to transfer things over, in addition to using all the internet capabilities, what sort of file formats can it manage or support?
Antonio Guimaraes (16:33):
It'll read just about anything you can get your hands on: PDF, XML, HTML, BRL, BRF, DOC, text, things of that nature would be available for reading on here. And this one, and our other braille displays and tablets, can read files regardless of size. So if you get a 200-page PDF, on the BrailleSense 6 or here on the SensePlayer you can load it up and read it.
Aaron Preece (17:21):
Okay, so no problem with size. When it comes to the PDF side of things, how does it handle tagging with PDFs? Is it pretty good about following the structure as well as reading the text? Just curious about that.
Antonio Guimaraes (17:39):
Yeah, it will look at tagging and what's available there. I don't have personal experience with reading a lot of tagged PDFs with headings and so on, but it really should be picking that up and reading it for the user, letting the user navigate through it. It does with DAISY, and our document reader's pretty robust, so it'll do it there as well.
Aaron Preece (18:12):
Perfect. Yeah, I know at least on the PC, PDFs can sometimes be a fun experience, so to speak. That seems a great use of it, reading textbooks or any lengthy PDFs like that. So those are the devices. Anything on the horizon, or anything in particular that people should know or be aware of?
Antonio Guimaraes (18:32):
Nothing on the horizon. We continue to support our products. I think one thing that people end up finding surprising, and that we're proud of, is that we support our older devices. So if people have a U2 or a BookSense, something like that, give us a call and we will let you know whether we have a part or whether we support that product. That's something we do that we're really proud of for our users.
Aaron Preece (19:07):
That's very cool. And I know from what I've heard and what I've seen, these are very durable devices, things I know people have been using for many years, so that's great. I'm sure there are people who hold onto those devices, and it's great to know that they can still get support and that the lifespan is very long and continues. Well, that's all I have, unless you have anything else you want to say. But thank you so much for taking part in the podcast and doing an interview with us.
Antonio Guimaraes (19:45):
That's all I got as well. Look us up on Facebook, we're there, and online at selvasblv.com. That's S like Sam, E, L, V like Victor, A, S like Sam, BLV for blindness low vision, dot com.
VISPERO
Aaron Preece (20:12):
Thank you. Hello everyone. We are here at the AFBLC 2024 conference in the exhibit hall, and we're at the Vispero booth.
Ryan Jones (20:22):
And hi everyone, I'm Ryan Jones. I'm the Vice President of Software at Vispero, so I oversee everything related to software like JAWS and ZoomText, and all of our different products around enterprise accessibility and compliance.
Jeff Bazer (20:36):
Hi everybody, I'm Jeff Bazer, I'm the Central Sales Director. I cover several states for Vispero, from Texas all the way up here to Minnesota where we are now, and a few others, and I work closely with all of the state agencies, schools, libraries, and all of our distribution network in my states. So it's great to be here, Aaron. Appreciate it.
Aaron Preece (21:01):
Thank you. So I think we're going to start with Ryan. You're going to talk about JAWS, I believe.
Ryan Jones (21:06):
Yeah, I think one of the things that I've been sharing here at the conference, and participated in a session on yesterday, is really how AI is changing things and making us more productive as screen reader users, and some things in particular that we're doing with JAWS to really boost our ability to consume information and break down barriers that we've traditionally seen. One of the biggest barriers that we've always dealt with in screen reading is access to visual or graphical information, whether it's pictures on a webpage, or charts, or a screenshot in an email. Those were always things where we were very limited in what we could do with screen readers to interpret or provide access to that graphical information. But with AI technology, we're able to generate very detailed descriptions of graphical information. So a few months ago we put out a feature in JAWS called Picture Smart AI, which lets you, in the flow of your regular work, whenever you come across an image or something visual that you want described, press a couple of keyboard commands, and JAWS will generate a very detailed description.
(22:18):
You can then query that, so you can ask follow-up questions and get follow-up details about whatever it is that you want described. So that's a big boost from an AI perspective that we're doing with screen reading now. The other thing is that AI is really good at synthesizing large amounts of information. Think about all the times people get stuck trying to remember how to do something, whether it's in JAWS or ZoomText or even just something in Excel. For example, I was trying to hide columns in Excel not long ago and I could not remember how to do it, something I rarely do. When I went out on the internet and Google searched it, mostly what I was getting was mouse-driven explanations: click on this and drag this and so on. And that doesn't compute for someone who's using a screen reader.
(23:12):
So we built an AI-powered tool called FS Companion that lets you ask just regular questions about how to do things, whether it's with JAWS, like changing JAWS settings, or just working in office applications: how do I do this in Word, how do I add page numbers to a Word document? And it's going to give you uniquely trained information on how to do that. It's going to be keyboard information, it's not going to tell you to click on this or that, and it's going to have JAWS in mind as the tool you use to interact with the computer. So we've really been enjoying sharing those things. Picture Smart AI is already out in JAWS, and the FS Companion tool we're releasing this fall. It's just a lot of great stuff where AI is helping us break down barriers, and we've been enjoying sharing that with the folks here at the AFB conference.
Jeff Bazer (24:03):
Aaron, one of the things that has really helped us out a lot, and a lot of folks who are using our software, is the training we provide. We have a lot of resources for that, and many of them, in fact all of them, are free. So it's great, because you can get onto our site at FreedomScientific.com/training, which is where all of that lives; that's kind of the homepage of all the resources we offer. One of the things that's very nice is we do a number of webinars around features in JAWS, ZoomText, and Fusion, as well as our hardware. We have, of course, three lines of video magnification products and refreshable braille displays as well, and we cover things like that in those webinars, and then we archive all of them. We usually do them, I think, on the third Thursday of the month at noon Eastern time. The thing about signing up and attending live is that you can get ACVREP credit for attending, but then we archive all of them as well.
(25:14):
And we were even beginning to do this before COVID-19, so there's a lot of information up there. If you're looking to maybe learn more about Google Docs, or you need some of the new Teams keystrokes, things like that, or Gmail recently changed and the whole basic HTML view went away, so if you hadn't used the standard view in Gmail before, we did a two-part webinar to explain all of that and introduce people to it who hadn't used it before. So there's just so much information out there that folks might not otherwise know about. And I guess maybe some additional information, just so everybody knows kind of the whole background of Vispero. So Vispero is the umbrella company, or the, what do you call it, Ryan?
Ryan Jones (26:12):
Kind of the parent company of several brands.
Jeff Bazer (26:15):
That's exactly right. And so the brands underneath Vispero: of course Freedom Scientific, so we know about JAWS, ZoomText, Fusion, and the Focus line of refreshable braille displays, but also the video magnification products under Freedom Scientific. You might've heard of the Ruby line of portable magnifiers, the Topaz, the Onyx, all of those things. But we also have Optelec as part of Vispero, so if you've ever used a ClearView or the Compact line of portable video magnifiers, and also Enhanced Vision, so things like the Merlin, the DaVinci, the Merlin Mini, things like that. And our products are all over the place. I mean, the Veterans Administration, we have a contract with them for several of the Enhanced Vision products. These are mature lines of products that have been out for, geez, 25 and 30 years, that all came under the Vispero umbrella. So it's a large company, but we are very diversified, and we have folks who have been doing this a long time who can do some wonderful demonstrations to share how these products can help folks in many different environments.
Ryan Jones (27:33):
One of the other things we do is we have a consulting arm in Vispero called TPGi, or The Paciello Group Interactive. That's our group that works with other companies to help make their digital technology more accessible, whether it's website accessibility or smartphone app accessibility. One of the new areas we've been working in over the past few years is kiosk accessibility. If you've ever been out places, now you see kiosks everywhere, whether it's a restaurant or a grocery store or a bank, anywhere you find kiosks where you have to interact with a screen. And so we've been partnering with companies to help make those accessible. We even have a specific version of JAWS that's geared for kiosks; it can run on Android kiosks or Windows kiosks. And so we've partnered with companies like McDonald's here in the US and in some other countries as well, as just one example of helping companies make their kiosks accessible to someone who needs text-to-speech. That's kind of an example of the new world we live in, the technology-driven world, the self-service world that a lot of businesses are moving to. That's a huge area of focus for us, and we're really excited about the interest that we see from companies who understand that this is a problem for people who are blind and low vision, and we're partnering with them to help solve those problems.
Jeff Bazer (28:56):
And one of the things we're seeing as well is, once someone knows that, hey, this is a version of JAWS on this kiosk, it instantly kind of lowers the barrier a little bit. They say, wow, I've used JAWS for years, I can do this. And you sit down at a kiosk, you plug in your headphones, and away you go, and you find out that it's really pretty simple and straightforward to use once you figure out how to navigate around. It's been a great experience for lots of folks who have tried it.
Aaron Preece (29:28):
Very cool. And as you say, as computer and mobile accessibility has increased, it seems like kiosks have historically been left behind, so this is really cool to hear that you're working on it. So with the JAWS that's built for the kiosk, is it based on a touchscreen device, or are there keypads attached to those, or how does that work?
Ryan Jones (29:48):
Well, JAWS can work with a touchscreen or also a keypad. Our recommendation to companies in this space is to use a keypad, because, number one, many of the keypads have a headphone jack, so there's already a place to plug in headphones, and number two, just having physical buttons makes it easier to use for many people. But JAWS will work with either one. If you're using a touchscreen, we've implemented gestures; JAWS has had gestures in it for a long time, even just for laptops, but you don't see them used nearly as much as on mobile devices, of course. And the cool thing about the kiosk version is that JAWS recognizes whether you're doing a gesture on the touchscreen or using a keypad, and it will give you hints based on how you're interacting.
(30:37):
So if you're swiping, it's going to say swipe right to move to the next item, but if it notices you hit the right arrow on the keypad, it will say press the right arrow again to move to the next item. So it's giving you that context-sensitive tutor messaging to help you navigate, because we don't assume that everyone knows exactly how to use all this technology. The key for us here is giving help information to guide you through the process, whether you're ordering food or checking in with an airline, for example. We want to guide you through that process with JAWS.
Jeff Bazer (31:13):
And if you're able to use a smartphone, you can certainly use one of these because the gestures are very similar. You're swiping, you're double tapping, you're moving back. So it's great. I mean, it's already familiar to you if you've used your phone for years like many of us have. So it's a very user-friendly experience.
Aaron Preece (31:35):
That makes sense. For a lot of people, especially someone who's, say, new to blindness and still learning their technology, having all those kinds of hints would be very helpful. And you never know who will be using it, so being able to cover a wide range of experiences and abilities makes a lot of sense for a publicly facing device like that.
Ryan Jones (31:55):
Yeah, exactly. And most of these devices don't have a keyboard, so you don't have access to the myriad of keyboard commands that we might normally use to interact. That's kind of a good thing in a way, because it makes everything simple. You basically have four or five different gestures, or maybe keys on a keypad, and that's it. So it's a simple way of interacting with a kiosk.
Jeff Bazer (32:17):
Aaron, just so your listeners know, we're at the point right now, here at the conference, where we have just released the public beta of the 2025 versions of ZoomText, JAWS, and Fusion, all at the same time. So for any of your listeners who maybe haven't used our software for a while, or who want to try out the latest and greatest version, even if you're on the previous version now, you can get access to this: you can download it, try it, and see what you think. And of course we have a number of different programs for purchasing our software, but you can get in on a home license, something you're using at home, for as little as $95 a year with JAWS, for example, a little bit more for Fusion and a little bit less for ZoomText. Anytime we have a new version we're getting ready to release, it's a great time to come back and try it again if you haven't for a while, or to get pretty excited about a new version.
Ryan Jones (33:17):
Yeah, get updated, so you can use things like Picture Smart. That's the nice thing: if you're having to buy our software on your own, from your own budget, that annual subscription is just a few dollars a month basically, and you'll always have access to the current version. So when something like Picture Smart comes out, you immediately have access to it because you are always current. It's so important to be current, because all these things are changing around us. It used to be on a yearly basis; now things are changing on a monthly basis. Google's pushing updates, and Office, and we get new AI technologies like Picture Smart. The pace of change has really increased over the last couple of years.
Aaron Preece (33:57):
So, quick question. Jeff, you mentioned the training materials, and you both have talked about the large number of products that you have: the video magnifiers, the Focus braille displays, JAWS, ZoomText, Fusion. With the training materials and the webinars and that sort of thing, do those cover all of those types of devices, or are there separate categories for different products or product lines, or how does that work?
Jeff Bazer (34:24):
So we take new features or popular features in all of our products. To answer your question, yes, absolutely, there is hardware and software up there, and if you go and look at the archived webinars that are there, you can move by heading to get through all of them, or if you're looking for something specific, you can do a find, and likely it will be on there. We've broken them out into some different categories too; for example, I think Microsoft Office and then the Google suite we've kind of grouped together, but those webinars will cover all of our products.
Aaron Preece (35:03):
And then Ryan, when you were talking about Picture Smart, can you give us some details on how that works? Is it like a full screen picture? Can you narrow it down or how does that work exactly?
Ryan Jones (35:12):
You can actually do all of the above. So if you are on a webpage and you come across an image, and JAWS says "graphic" and reads you the alt text or something, you can have it describe that exact image and not the whole screen. But if you want the whole screen, it'll just take a screenshot and then describe the entirety of the screen. So it's based on what you want, to fit into the workflow of what you're doing. People are using it in all different ways. I use it to describe the whole screen sometimes when I think maybe there's an inaccessible error message on the screen that JAWS can't read because it wasn't tagged properly; I'll use Picture Smart and have it describe the whole screen, and then I might figure out, okay, that's why it's not reading, because there's some popover sitting on top of everything. But if I'm shopping on an online shopping site and I'm looking for sweaters, for example, and I want a really detailed description of the sweater I'm looking at, then I don't need it to describe the whole screen; I just want it to describe the picture that I'm on right now with JAWS. Or a PowerPoint slide, for example: if you're working in a PowerPoint and you want a description of the slide, it's just going to do that slide. So there are different keystrokes based on what you want described.
Jeff Bazer (36:28):
One of the interesting things that somebody might want to try, in addition to everything Ryan just said, is if you're watching a video online, maybe on YouTube for example, you can pause that video at any point. We all know there's dialogue on these videos a lot of the time, but oftentimes too there are lots and lots of things happening in the background: the scene, where are they, what's going on, that kind of thing. And it is absolutely amazing, the detailed description that Picture Smart AI will provide on the video. So you simply pause the video, and the best way to do it is to put that video in full screen mode on YouTube, and then you get into the Picture Smart area through the layered keystrokes: you'd go Insert plus Spacebar, hit P for Picture Smart, and then hit S for the screen, like Ryan was mentioning before. And if the video is on the screen, it will describe it at the point where you paused it. It's amazing.
Aaron Preece (37:34):
Yeah, it seems like AI in particular has been such a game changer recently, especially in the last couple of years; it's really kind of exploded onto the scene, and there are always new things coming out. Like you said, the pace of everything in general in tech is increasing a lot, and the pace of advancement in AI in particular seems really high right now.
Ryan Jones (37:58):
It's so fast. The hardest thing is just keeping up and knowing what to focus on. There are so many different things you could be doing, and we can't do all of them at once, so what order do you focus on? But based on the feedback that we get, picture description has obviously been such a longstanding barrier for those of us who are blind or low vision that breaking down that barrier was a no-brainer for us to start with.
Jeff Bazer (38:22):
And you mentioned this earlier, Ryan, but just to underscore it again: when you do get a description and you want to narrow it down, or you want to get in on some specific detail, there's a link at the bottom that you can actually click, and then you can ask more questions. You can type there in the edit field: tell me more about whatever specific element it described to you in the video, or in any kind of description that you get.
Ryan Jones (38:51):
Yeah, a chart is a good example of that. If you're looking at a graph, maybe you want to ask, where does the line cross the X axis, or at what value does it cross the Y axis? Those are detailed questions that the AI would be able to answer for you.
Aaron Preece (39:09):
Very cool. So any last things that our listeners should know about or anything you want people to particularly be aware of?
Jeff Bazer (39:19):
Well, I think if you want to get ahold of us for further information, my email address is J Bazer, that's J-B-A-Z-E-R, at Vispero, V-I-S-P-E-R-O, dot com: jbazer@vispero.com. If you just have a question on anything we spoke about, or you want a link to a specific webinar that maybe you haven't had a chance to find or you're not quite sure how to get to, we can do all of those things. We'll send you that information specifically, or if you just have questions on any products, that's what we're here for, and we're happy to help.
Ryan Jones (39:51):
And people can reach me at R Jones, R-J-O-N-E-S, at vispero.com: rjones@vispero.com. I'll be happy to help as well. I'd just always encourage people to keep up with what we're doing. We have so much happening, and we have updates to JAWS, for example, coming out every eight weeks, and it's not just bug fixes; it's actual new things in the software in many cases. So try to stay up to date, follow us on social media, and keep learning about what's going on and how it will make your life easier if you're using screen readers.
Jeff Bazer (40:21):
There is one other podcast that we do that could be very beneficial for your listeners, Aaron, if they haven't checked it out already, and that is FSCast, the Freedom Scientific podcast. You can Google that or find it on our pages as well. That's something else that we update about every six weeks or so.
Ryan Jones (40:40):
Every month we have a new episode coming out and we're talking about all kinds of things, not just our products, but just human interest stories, other things going on, learning about the lives of other people who are blind or low vision. So it's a variety of different topics that get covered.
Jeff Bazer (40:55):
Those are all archived as well, all the way back to 2006, so there's 260 or 270 episodes up there. You could binge listen for days. You sure could.
Aaron Preece (41:08):
Alright, well thank you both for taking part in this podcast. We really appreciate it. Thank you so much.
Outro (41:19):
And thanks again, Aaron, for hitting all the booths at the 2024 AFB Leadership Conference. We'll be dropping more episodes this month around the conference and the things that were happening there, so there's lots going on. Check out AccessWorld at www.afb.org/aw for the latest issue of AccessWorld Magazine. And again, thanks, everybody. We'll talk to you soon.