So, a blind person bought a Pinephone, wanting to put Mobian on it. Since Debian has given such good accessibility features on desktop, it should give just as much accessibility on mobile. Debian on mobile should have blind users' backs. Right?

Wrong. This is just what I've been saying for the past year or so. And now, for this person who has spent their hard-earned money on a Pinephone, it's too late. Now all they have is an expensive paperweight. There is an issue created for this, though.

#a11y #debian #mobian #mobile #foss #accessibility

@devinprater All respect to this, but Mobian devs, Phosh devs, GNOME devs, and so on are doing their best to deliver a product that is not yet ready to replace an Android device. They don't have the resources that Android devs have; they're not backed by Google.

So if you want to improve this situation, donate if you have money and spread the message. A crowdfunding campaign would probably help.

@lorabe This is but a symptom of a systemic issue of inaccessibility in FOSS. People can toss around blame all they want. I didn't even blame Pine for this. And yeah, users should read about stuff they're about to spend money on. But this user trusted the Debian, and thus the Mobian, community. But whatever. I'm stepping back from FOSS for the most part. I'll comment on it, but I'm not about to do more work when I'm basically alone in doing it.

@devinprater You are within your rights to complain, but that doesn't solve the problem; it prevents understanding.

It's quite easy to complain when you take the availability of funds and hired people for granted, but these programmers are giving their free time in good faith, and people don't seem to care or acknowledge their contributions.

I guess in this case I will side with the devs, but the best solution to all of this is to coordinate and collect money in order to actually fund development.

@lorabe Sure. As a blind person, I’ve tried putting myself out there, so that developers can work with me and other blind Linux users. But sure. I’m just yet another damn user taking advantage of poor developers who are just trying to enjoy something that isn’t their day job. Never mind that companies like System76 and the GNOME Foundation work on this full-time. But whatever. I won’t bother the developer gods with such lowly issues as the most disadvantaged group of people ever not being able to use the software they publish to the world.

@devinprater @lorabe
Phosh is mainly developed by Purism, not by Mobian, Debian, and especially not by Pine. The PinePhone costs a fraction of the Librem 5, in large part because they don't create the software for it. I paid four times as much hard-earned money for an L5 as you did, so that we can have that in the future.
Funding as well as volunteering are needed there.

@danielst @lorabe I’ll fund whoever will work on accessibility. Right now, that's just @storm and the Stormux team.

@devinprater @danielst @lorabe @storm
A big problem is that libre devs seem to just not want to learn about accessibility.
If you spend hours ricing your setup or arguing about languages, you can't claim to not have time to read up on accessibility.

Accessibility is also not something you add as an afterthought; just like security, you consider it from day 0, so you don't have to rebuild things from the ground up when it turns out your initial assumptions are incompatible with accessibility.
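To make the day-0 point concrete, here's a toy sketch (not any real toolkit's API; all names are hypothetical) of the difference between accessibility as a required input versus a retrofit:

```python
from dataclasses import dataclass

# Toy sketch: making the accessible name a *required* constructor argument
# means no widget can be created without one. A retrofit design would make
# it an optional field that most call sites silently leave empty.
@dataclass
class Widget:
    accessible_name: str  # mandatory from day 0, not an optional afterthought

@dataclass
class Button(Widget):
    pass

save = Button(accessible_name="Save document")
# A screen reader always has something to announce:
assert save.accessible_name == "Save document"
```

Forgetting the name is then a construction-time error rather than a silent gap a blind user discovers months later.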

When there is a problem, there are usually two ways. 1. Find someone to blame for the issue, which is very easy. 2. Find a solution to the problem, which is very hard. When the solution requires collective action, it needs a lot of patience to work with others, and then the solution may take a long time to build. So a lot of people choose option 1.
@devinprater @danielst @lorabe @storm

Good point. Although in theory there are already supposed to be organizations where accessibility dev is an explicit goal, like GNOME. Maybe a new organization would help move things along faster, but I really, really hope that GNOME gets its accessibility act together.
@devinprater @danielst @lorabe @storm

@csepp @praveen @danielst @lorabe @storm I kinda feel like they’ve forgotten about accessibility, but that's just me being cynical. But I wouldn't know how to even start an organization.

@devinprater @csepp @praveen @danielst @lorabe @storm GTK4 has pretty much a full revamp of the accessibility system (taking out ATK, to use AT-SPI2 directly). It probably seems like bikeshedding, but it's actually them training a team who actually know about accessibility on Debian; all that knowledge was lost during a failed inter-organisation migration when the funding disappeared.

@wizzwizz4 @devinprater @csepp @praveen @danielst @lorabe @storm Looking at other GNOME apps, I see some of them are transitioning to GTK 4. If nothing radical happens by the time GNOME 42 gets released, I'm afraid the a11y experience will rapidly degrade.

@pvagner @wizzwizz4 @devinprater @csepp @praveen @danielst @lorabe I do not use GNOME; haven't done so since GNOME 3 came out. Literally the only reason I use Orca is because it's the only GUI screen reader for Linux. My hope is that most of the apps I use will stick with GTK 2 and 3. I have heard quite awful things about GTK 4, and not just for a11y.

@storm @lorabe @danielst @devinprater @wizzwizz4 @csepp @pvagner I just donated 50 USD to fund "Software Optimizations: Accessibility support/screen reader support with phosh/gtk4" @purism I think we need to find ways to fund a11y support in GTK 4; sticking with older unsupported versions is not going to be sustainable long term.

@storm @lorabe @danielst @devinprater @wizzwizz4 @csepp @pvagner @purism I just talked to someone at Purism and they are positive about supporting it, as it aligns with their goals. They are asking me for a list of priorities. I suggested a screen reader, but if you all, who need this more than me, can create a prioritized list of accessibility features, then I can share it with them.

@praveen @storm @lorabe @danielst @devinprater @wizzwizz4 @csepp In relation to @purism #librem5: the most prominent and difficult-to-implement feature would be #accessibility aware touch input support. In order to be productive, we need to be able to explore the screen content before activating touch controls.

@pvagner @praveen @storm @lorabe @danielst @devinprater @csepp @purism What would the UI for that be like? "Single tap reads, double tap activates"? (Would there be a clicking noise when you tap something, or does it just read straight away?)

From what I can tell, the stuff I've described wouldn't be that hard to implement, assuming a correct AT-SPI2 implementation in the application. In Firefox, you'd be able to "see through walls" (be told about things in hidden tabs) until that bug is fixed.

@wizzwizz4 @praveen @storm @lorabe @danielst @devinprater @csepp @purism Single tap / touch / hover would read what's under the finger, if there is enough text / accessibility support within the underlying control. Double tap should activate. There should also be a way to assign other touch gestures to screen reader actions, such as text review commands.
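The "single tap reads, double tap activates" model above could be sketched roughly like this (a hypothetical sketch, not Phosh or GTK code; the 0.3 s double-tap window and the `control_at` hit-testing callback are assumptions):

```python
import time

DOUBLE_TAP_WINDOW = 0.3  # seconds; this threshold is an assumption

class ExploreByTouch:
    """Hypothetical sketch of 'single tap reads, double tap activates'."""

    def __init__(self, control_at):
        self.control_at = control_at  # callable (x, y) -> control label or None
        self._last = (None, 0.0)      # (control, timestamp of last tap)
        self.spoken = []              # what the screen reader would announce
        self.activated = []           # what was activated by a double tap

    def on_tap(self, x, y, now=None):
        now = time.monotonic() if now is None else now
        control = self.control_at(x, y)
        if control is None:
            return
        last_control, last_time = self._last
        if control == last_control and now - last_time <= DOUBLE_TAP_WINDOW:
            self.activated.append(control)  # second tap on same control: activate
        else:
            self.spoken.append(control)     # single tap / hover: read it aloud
        self._last = (control, now)

# Toy layout: a "WiFi" button occupies the left half of a 100x100 screen.
def layout(x, y):
    return "WiFi" if x < 50 else None

touch = ExploreByTouch(layout)
touch.on_tap(10, 10, now=0.0)  # announces "WiFi"
touch.on_tap(10, 10, now=0.2)  # within the window: activates it
```

A real implementation would also need to handle sliding the finger (continuous hit-testing) rather than only discrete taps.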

@pvagner @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
(1/4) While Purism is overwhelmed, understaffed and underfunded, I could actually imagine that GTK4 makes a11y simpler in the long run. Why? Purism created libhandy, now libadwaita in GTK4, providing consistent, complex, advanced, themeable controls, automatically adapting whole dialogs between mobile and desktop form factors.

@pvagner @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
(2/4) libadwaita controls know about their state, e.g. settings dialog knows it's currently in the WiFi sub-dialog, even if the menu is hidden on mobile. Apps using those controls automatically benefit from all improvements there, be it default gestures or screen reader integration.

@pvagner @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
(3/4) Question: Is single-tap-read, double-tap-activate really a good approach? It implies you have to tap around a lot to find stuff. With libadwaita it should be possible to do something like "read out top-level items". For gnome-settings, in desktop mode it would read "menu" and "content: WiFi", indicating that WiFi is the selected menu item.

@pvagner @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
(4/4) In the mobile view, only the menu or the content is visible, starting with the menu. Thus, it would instead directly read out the available items, possibly assisted by gestures, e.g. tap: stop and re-read the current item; swipe up: read the previous item; swipe down: continue reading; swipe right: select. It would then continue by reading the top-level items inside WiFi, either settings or groups of settings.
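The gesture-driven reading proposed in this post could be sketched as follows (hypothetical, not libadwaita API; the item names are made up):

```python
class LinearReader:
    """Hypothetical sketch of gesture-driven reading of top-level items."""

    def __init__(self, items):
        self.items = items
        self.index = 0
        self.announcements = []  # what would be spoken, in order

    def _announce(self):
        self.announcements.append(self.items[self.index])

    def tap(self):
        """Stop and re-read the current item."""
        self._announce()

    def swipe_up(self):
        """Read the previous item."""
        self.index = max(0, self.index - 1)
        self._announce()

    def swipe_down(self):
        """Continue reading with the next item."""
        self.index = min(len(self.items) - 1, self.index + 1)
        self._announce()

    def swipe_right(self):
        """Select the current item; a real reader would descend into it."""
        return self.items[self.index]

# Mobile view of a hypothetical settings menu: only the menu is visible.
reader = LinearReader(["WiFi", "Bluetooth", "Display", "Sound"])
reader.tap()         # "WiFi"
reader.swipe_down()  # "Bluetooth"
reader.swipe_down()  # "Display"
reader.swipe_up()    # "Bluetooth"
selected = reader.swipe_right()  # selects "Bluetooth"
```

Selecting an item would then re-run the same loop over that item's children (e.g. the groups of settings inside WiFi), exploiting the structural info libadwaita already has.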

@danielst @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism Hover or single tap to explore and double tap to activate is the typical interaction model on iOS and Android so far. I may very well be misunderstanding, but what you are suggesting reads the whole screen at once, and users would influence that reading.

@pvagner @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
Yes and no. Possibly, different mechanisms can complement each other. BUT I'd take advantage of structural info, which is possibly more available in libadwaita apps than in some frameworks. There might always be free-style content that needs exploring, but structured controls allow much more precise navigation, without possibly missing a weirdly placed item.

@pvagner @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism
For example, the label of an input field is not independent, but directly related.
In my example, you could also swipe down 5 times very quickly, to skip reading 5 items if you already have a clue. And of course a keyboard or speech recognition could be used to jump directly to relevant items. But the structural info "this is a menu" should help no matter what your favorite approach is.

@danielst @pvagner @praveen @storm @lorabe @devinprater @csepp @purism Yeah, it already does that; that's standard for AT-SPI2 screen-readers (i.e. Orca, the only AT-SPI2 screen-reader). This is talking about an additional behaviour for touchscreens.

@danielst @pvagner @praveen @storm @lorabe @devinprater @csepp @purism Though mapping *gestures* to “keyboard navigation” is an interesting idea. It probably shouldn't be the default (since it's different to the way everything else works), but it would be cool as an option, I think.

I think continuing this discussion at will be better; if any info we discussed here is missing there, please add it.
@pvagner @wizzwizz4 @storm @lorabe @devinprater @csepp @purism

@danielst @praveen @wizzwizz4 @storm @lorabe @devinprater @csepp @purism Imagine someone is reading the content aloud and you instruct them to repeat the last item, start from the beginning, and similar. Touching and tapping should hopefully allow us to either navigate to or directly touch the item we are interested in.

@danielst @pvagner @praveen @wizzwizz4 @storm @lorabe @csepp For this, usually one can slide their finger around the screen to explore other items. So, this is "exploring" the screen by touch.
