Accessibility Issues

  • 4 November 2023
  • 11 replies
  • 91 views

My wife has a degenerative neuronal disease and accesses her apps and communicates through an eye-movement-enabled tablet running Windows. I can put the Sonos app onto it and she can move the cursor with her eye gaze, but I can’t get the app to recognise that she wants to action an instruction, like choosing playlists, stopping and starting play, etc. Has anyone had this and knows of a solution?

 

Thanks

Corky Cordo




Unfortunately, I don’t. And Sonos has stopped ‘supporting’ the desktop apps, both Windows and Mac. At this point they’re only providing bug fixes to both apps. 

You may want to call Sonos Support directly to discuss it, as you might be able to consider this a ‘bug’, but it’s hard to see them dedicating potentially serious coding time/resources for what may be a single user case. 

Thanks Bruce. I noticed how everything sent me to my phone rather than the desktop when I recently added an additional speaker for my wife. I might check in with Sonos but, as you say, they might see it as a low priority. Having said that, the total accessibility market is many, many thousands in the UK alone. Spotify does it quite well and is her favourite streaming app for that reason, so I might find a way through by getting Spotify to recognise Sonos (as opposed to Sonos recognising and listing Spotify).

Anyway, thanks for responding.

 

I never thought about working with this sort of handicap. As she moves the cursor, do items, such as the Play button, acknowledge the mouse-over by highlighting? With other apps, how does she communicate a cursor ‘click’?

Hey Buzz, yes, the items light up but do nothing more, so nothing happens. In other apps (Spotify has an accessible version of their app, for example), the item lights up and, if she holds the cursor there, it actions. Thinking about it, most apps on the TV now work the same way: if someone highlights the Play button and does nothing for a few seconds, the app actions Play and it runs.

I don’t have Spotify installed on my PC, but the iPad App can send music to SONOS.

I’m going to try that. Thanks

 

After reading this how-to, I discovered the following SONOS controller interaction that might help your wife. Obviously I cannot try this with eye tracking, and overall I had a little trouble understanding the narrative.

My interpretation is that, after highlighting an item, the user can go to the control bar and select mouse actions. I don’t know exactly what “precise mouse” will do, but I noticed that if I highlight an entry in the “System”, “Now Playing”, or “Music Library” detail lines, then a right or left click yields a useful response.

I could highlight any of the controls at the top except Volume. Once highlighted, the control responded to a left mouse click. I was not able to adjust Volume, though I could Mute.


That’s great Buzz, thank you. I’m trying to resolve it this morning, so I’ll let you know if I get it working.

Hello Buzz and Bruce,

 

So, after lots of trial and error, I have it working - in a way. It’s taken most of the week.

 

It does work - using Spotify. I have to set the day up each morning using my wife’s phone. Once in Spotify, I go to ‘available devices’. Each day I have to tell it to forget the various Amazon devices around the house, then select one of the Sonos speakers and use the three little dots at the right-hand edge of the selected device to select Group. That switches to the Sonos app, where I group the speakers and start the session, and I’m then returned to Spotify with the system working as one - all the speakers operate as a single unit. From there I can go to the Spotify app on her tablet, which has eye recognition set up, and she can change music, search for things, play them and adjust the volume up and down. So long as her phone stays on the network, she can control it all from the eye recognition setup on her tablet.

At the end of each day, the system ‘forgets’ the entire setup and I have to do it again for her the next morning. If I set it up from my phone and switch to her tablet, the system stops working if I leave the building - and therefore the network - so it must be set up through her phone.
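If anyone more technically minded wants to try scripting that morning grouping step, there is an unofficial Python library called SoCo that can join Sonos speakers into a group. I haven’t tried it with this setup, so treat the following as a rough, untested sketch; the speaker name is a placeholder and the Spotify side would still need doing by hand.

```python
# Rough, untested sketch using the unofficial SoCo library (pip install soco).
# The coordinator name below is a placeholder - use the names shown in the Sonos app.
import soco

COORDINATOR_NAME = "Living Room"  # placeholder: the speaker the others should join

def group_all_speakers():
    zones = list(soco.discover() or [])
    if not zones:
        raise RuntimeError("No Sonos speakers found on this network")

    # Pick the coordinator by name, falling back to the first speaker discovered.
    coordinator = next(
        (z for z in zones if z.player_name == COORDINATOR_NAME), zones[0]
    )

    # Join every other speaker to the coordinator's group.
    for zone in zones:
        if zone is not coordinator:
            zone.join(coordinator)
    return coordinator

if __name__ == "__main__":
    master = group_all_speakers()
    print(f"Grouped all speakers under {master.player_name}")
```

Run each morning (or from a scheduled task), something like this might recreate the speaker group without the trip through the phone, although the Spotify ‘available devices’ step is a separate matter.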

 

As far as I can tell, her tablet is using the eye recognition software built into Windows. It is very capable in lots of ways but, as always, there are limitations.

Thank you both for taking the trouble and for your thoughtful responses. You helped a great deal and my wife has a big smile. She says her carers are enjoying having music as well, which is a bonus for some very hard-working people.

 

Best wishes,

Steve

 

 

I’m wondering if there is some sort of automation tool/keystroke emulator that can be used to simplify the common tasks she’s doing with Sonos.
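For instance, a small script that watches the pointer and issues a click after it has dwelled in one spot might get around the app only highlighting on hover. This is just a rough sketch using the pyautogui library; the dwell time and jitter tolerance are guesses that would need tuning, and the dwell-click option built into Windows Eye Control would be the first thing to try.

```python
# Rough sketch of a dwell-click helper (pip install pyautogui): if the pointer
# stays still for DWELL_SECONDS, send a left click so that apps which only
# highlight on hover still receive a real click. Values are guesses to tune.
import time
import pyautogui

DWELL_SECONDS = 2.0   # how long the pointer must stay still (assumption)
TOLERANCE_PX = 10     # how much jitter still counts as "still" (assumption)

def dwell_click_loop():
    anchor = pyautogui.position()
    dwell_start = time.monotonic()
    clicked = False

    while True:
        time.sleep(0.1)
        pos = pyautogui.position()
        moved = (abs(pos.x - anchor.x) > TOLERANCE_PX
                 or abs(pos.y - anchor.y) > TOLERANCE_PX)

        if moved:
            # Pointer moved: re-arm the dwell timer at the new position.
            anchor, dwell_start, clicked = pos, time.monotonic(), False
        elif not clicked and time.monotonic() - dwell_start >= DWELL_SECONDS:
            pyautogui.click()  # emulate the click the eye-gaze software isn't sending
            clicked = True     # don't click again until the pointer moves away

if __name__ == "__main__":
    dwell_click_loop()
```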