StrategyCar
#3 Social Justice in the Digital Age with Laura Kalbag

Laura Kalbag is the author of the book, Accessibility for Everyone. She is a designer from the UK, and one-third of a two-person-and-one-husky social enterprise called Indie that works for social justice in the digital age. At Indie, Laura follows the Ethical Design Manifesto and works on a web privacy tool called Better. In this episode, Laura discusses the intersections between UX, accessibility, and human rights.

Read Further

Ind.ie

Ethical Design Manifesto

Accessibility for Everyone

Guardian article: The missing link: why disabled people can’t afford to #DeleteFacebook

Guardian article: The punk rock internet – how DIY rebels are working to replace the tech giants


Transcript

[THEME MUSIC]

ALAINA WIENS: This is StrategyCar, the show that’s a road trip to a better web. I’m Alaina Wiens and today I’m bringing you Laura Kalbag, the author of the book, Accessibility for Everyone. If you flip to the back of that book, you’ll learn that Laura is a designer from the UK. She’s one-third of a two-person-and-one-husky social enterprise called Indie that works for social justice in the digital age. At Indie, Laura follows the Ethical Design Manifesto and works on a web privacy tool called Better. So, Laura, I’m so grateful to have you here because I’ve been really, really inspired by what I’ve read of your work and what I’ve learned about what you’re working on. So, thank you very much.

LAURA KALBAG: Well thank you for having me. That’s very kind of you to say.

ALAINA: While we’re getting started can we establish a little bit of setting? Where are you speaking with me from today?

LAURA: I am currently in my house. I’ve only been living here for a month and I am in Cork, just outside the city of Cork in Ireland.

ALAINA: And you’ve just moved.

LAURA: I’ve just moved.

ALAINA: So kind of you to be speaking with me while you’re getting settled. Do you feel like you’re all settled?

LAURA: Yeah, I think so. We’ve got beautiful spring weather. It’s been lovely and warm today and I really love it here. Ireland is treating me very well, aside from hay fever. [laughs]

ALAINA: [laughing] Everyone in my house is a mess right now. All the plants are opening. It’s beautiful, but it’s a little hard.

LAURA: Yeah, beautiful but dangerous right now.

[TRANSITION MUSIC]

ALAINA: Can you talk a little bit about your work and what you find yourself working on today?

LAURA: Yeah, sure. For the last, maybe about four or five years, I’ve been working at Indie, or rather as a part of Indie. There’s just two of us full time. And what happened was when Edward Snowden had his revelations and started talking about the fact that the NSA along with other organizations in other countries were essentially spying on their own citizens and collecting, hoovering up loads of data about them on the web, we stopped and went, “Well, hang on. How are they doing this?” They’re doing this because all of these companies on the web and on the Internet are collecting our data and handing it over to those governments. So, the companies themselves that are doing this, the Facebooks, the Twitters, the–all of the social media and many other organizations besides, they’re all collecting our data, too. And so, what we should be trying to do is also to fight that, because actually it’s going to be a bit easier than fighting the governments, but also we can create alternatives, because we are people who have worked in the tech industry for quite a long time. And so, we have the skills to try to create alternatives and try to help other people build alternatives to what we call unethical technology.

ALAINA: Gosh, so that feels really big, though, right? Like that’s, that’s a lot. How do you, within that frame, decide where to focus your energy and what work feels most important right now?

LAURA: We’ve kind of tried to split into three areas. Most of all we’re trying to do things that we think are manageable for us. Where can we actually have an impact with our set of skills and the people we know and the reach we can have? Because it’s all relative to that, really. And what we try to do is we go around, and we talk about the topic. We talk about “surveillance capitalism” as it’s called–making money from spying on people, actually. And so, we go around, we give talks, we talk to people in government, people who are writing regulation as well. The second thing we do is we’re actually working actively on alternatives, so we’re currently working on an initiative called the Indienet, which is building an alternative in the form of federated personal websites, which I can explain a bit more about later. And the third thing we’re doing is just trying to mitigate some of the existing harms caused by tracking all over the web. And so, we’ve actually built a tracker blocker, which is an app that works with Safari on Mac and on iOS and it stops the trackers when you’re trying to browse around the web. So, it just blocks them. And so that’s the other thing that we built. So, all of these things together: we’re trying to sort of affect things on a big level in terms of regulation and in terms of awareness. We’re trying to mitigate the existing harms of tracking by blocking all the trackers. And we’re trying to build alternatives so that we have a future we can actually move towards. Because right now it’s no good to tell people, “Oh, stop using Facebook because they’re affecting our democracy,” or, “Don’t use Twitter, use something else. Don’t use Instagram.” There’s nothing else we can go to that is a more ethical alternative. And so we need to work on more ethical alternatives and help other people build them as well.

ALAINA: And at the same time, you’re doing a lot of advocating for accessibility. I read your book and pulled some things out of that to kind of help me reframe the way that I look at the work that I’m trying to do. I work in a marketing department. I talk a lot about content and web and things from that side, but for a long time I felt like I didn’t have the vocabulary or the skill set to really advocate for accessibility in the right way. I knew it was important, but it didn’t feel like something that I could do much about. And I’ve been really trying to reframe that and think about how I can explain it to the people around me and inject it into more of the work that we do from the beginning, in sort of the way that you’ve talked about. I really liked how you framed it as designing for everyone, and I wonder if you could talk a little bit about the connection between making things accessible for everyone and designing for everyone, but also making sure that the tools that everyone is able to use are ethical. Do you see that there’s a connection between that work, or between those philosophies?

LAURA: Oh absolutely. So I would say that these both broadly come from my sense of trying to achieve social justice in a digital age. And so I’ve always worked in accessibility for as long as I’ve been building things for the web. And I’ve always cared about it. I was really lucky when I started out that, when I was learning about web development and web design and reading people’s blogs, they would often write about how to make what they were doing accessible, because they cared about it. And so I started picking up these things here and there, and over time learned enough to be able to produce a book out of it. I mean, by no means am I an industry expert. There are plenty of people who do accessibility 24/7. But the way that I try to integrate it into my work is: building things accessibly is a result of designing in an inclusive manner. So when you are trying to design and build and develop something, you’re looking at making it work for as many people as possible from the very beginning, rather than building something that works for you and your team and then perhaps adding on extras and accessible bits and things like that later on, when you find it doesn’t work for people with particular accessibility needs. Like, for example, there might be someone who uses a screen reader, which is assistive technology that reads the screen to you when you can’t see it visually. Or perhaps it could be something like providing a transcript as an alternative for someone who finds it difficult to hear the speech that’s going on. So it’s things like providing alternatives and making interfaces work for different types of assistive technology. If we do that from the beginning, if we integrate it from the beginning, we’re far more likely to understand how what we’re building affects a broad range of people, rather than sort of building something that’s just for a very narrow group of people and then realizing that we’ve excluded lots of them.

ALAINA: And when you’re talking about building something for a broad range of people and building it for everybody–I was reading through some of the articles that you sent me last night about that idea of the information and the access that people are giving to the social networks that they’re accessing, the web platforms that they’re accessing, and how that’s not really in their interest. It’s in the interest of a small few. And thinking about how we protect the interests of everyone in the way that they access that technology, in the same way that we think about building things for everyone–it’s just a really interesting connection that I’d never really thought about before.

LAURA: Yeah, so it’s interesting that nowadays, as an industry, we’re sort of starting to have more understanding that it’s bad to have technology that relies on extracting data from people. What this technology has resulted in is really products that are very manipulative. They try to addict us, because our data, our information, is so valuable to them, and us always being on their site is so valuable to them, that we have things like infinite scroll. We have things like the next video playing straight after the previous one on YouTube and Netflix. All these things are designed to keep us in, to keep us locked in. And what I’ve been seeing a lot of lately is this kind of idea of, “Oh, you’re addicted to your phone. You need to put it down. You need to walk away.” And really, I think that that’s victim blaming, because the reason why these things are addictive is because they’re being designed deliberately in that manner to manipulate us, to make us keep looking. What would happen if the technology we used didn’t have an end goal to extract as much information out of us as possible? I think that we would probably end up designing it very differently. And one of the articles that I sent you was this really great piece in The Guardian exploring how people with disabilities use social networks, because social networks have been completely vital for people who end up having to spend a lot of time at home. Maybe they have a physical disability, or maybe they have some kind of illness that means they can’t leave their bed or leave the house, and it can be very lonely and very isolating. And so social media has become incredibly important for enabling people to form communities and to keep in touch with each other, to be able to have a social life that they may not be able to have so easily outside of their home. And so this really brings up this situation whereby we’re deliberately addicting people who are more marginalized and more vulnerable, and we’re not giving them alternatives, so they’re left having to choose: “Do I want to use this social network? But then I also have to give up a huge amount of my personal information, which may in the future render me more vulnerable, as that kind of information is used by insurance companies and governments and in the sales of all kinds of things, and it could be used to discriminate against me–from something as little as advertising deciding I’m one type of person, to an insurance company discriminating against me because they know something about me that I didn’t necessarily want to share.” So, it’s kind of big stuff.

ALAINA: Yeah, and it’s fascinating to me to think about how so much of the conversation or the way that I’ve thought about accessibility so far has been making sure that everybody has equal ability to get to the same things. But beyond that it feels like what you’re saying is that it’s not just about making sure they can get to the things, but that once they have them we are being responsible with the way that we interact with them or allow them to access things. Does that feel right?

LAURA: To some degree. I’d say that really it’s ensuring that the technology that everybody has and that everybody gets to use is fair, both in that it’s not excluding anyone and in that it’s respecting people’s rights from the very beginning: respecting their privacy, respecting their security, treating them well from the very outset, not monetizing their information and their data. Because I think, I mean, people with accessibility needs are no different from people without accessibility needs in that we all deserve something fair from technology. None of us necessarily have the knowledge or the time to be signing up for privacy tools and sort of doing all kinds of things in order to protect ourselves. That requires a lot of research, a lot of information, and a huge amount of time. And, of course, more vulnerable people or more marginalized groups are less likely to have those resources and that time. But really, for all of us, we need to make sure that things start from a fair basis for everyone.

ALAINA: How do you think that resources and networks and platforms and all of the things like the Facebooks and the Twitters and Instagrams of the world might be able to be made available without depending on that monetized model? I’ve read some things that make the argument that, well, the reason that they have to monetize you and you become the product is because how else do you expect for this sort of service to be free? It has to be paid for in some way. Do you have any thoughts on how that model might be shifted?

LAURA: I think the problem with the existing platforms is usually that their journey at some point started out with them being given a lot of investment money from venture capital. And when a venture capitalist gives someone money, what they are investing in is an exit. What they’re investing in is the chance that they are going to make a very high return on their money when that company is either sold or goes public. And so from the very beginning those platforms have been set up in order to scale very quickly, make a lot of money very quickly, and that’s really not compatible with something that’s sustainable and that can survive on maybe a smaller amount of money per person, maybe a subscription model. But really one of the things that we’ve started thinking about is, how could these platforms and services be provided from the commons in some way? I mean, if we look at the way that we use social media, the way we express ourselves on social media, these have become our new everyday things. We use these services and these products more than we use the kinds of things we pay taxes for. We probably use them more than the roads, we probably use them more than our drinking water even. And so perhaps these could be funded from some kind of common money, whether it be on behalf of a government, be on behalf of a city, be a local service. Maybe it could be just subsidized in some manner. Or maybe there are different forms of monetization, things where perhaps you have a subscription model where the people who have the most amount of money and, like, want to buy pro tools and things like that will then be able to subsidize the cost of the platform for people who may not have the money. Because I think one of the important things to say is that everyone deserves equal access to technology and it’s not really fair if the price of your privacy and your security–if you have to be rich in order to get those things. I don’t think that’s cool.

ALAINA: I don’t either. And I want so badly to have everybody who needs to hear that hear it and understand it. And one of the things that I struggle with personally, even in just, like, the very small scope of influence that I’m trying to have in my community, is how to make the case for principles and ideas like these that feel very big. I don’t know how to connect those concepts with folks in a way that makes it feel like it’s imperative that they join the effort. And one of the things that I took from your book that kind of helped me think about making that case in a different way was when you were giving some advice about how to explain to people why accessibility is important and why you should bake these principles in from the beginning. It really struck me that the things that you listed were positive. They were very much about the benefits that these kinds of things can have, not about what bad things will happen if we don’t act this way. Did you have to work at framing things in that way, in a positive way, over time? Or was that just always how you looked at it? And I ask that because I find myself saying, like, “It’s really important we do this, or nobody is going to get the information that they need and no one will be connected to the resources that they need,” and that never seems to resonate in the same way that explaining the business case for accessibility to somebody does.

LAURA: Yeah, people can get a little defensive if you turn around and say to them, “We have to make this accessible because don’t you care about people?” And that’s one of the reasons why I often try to frame things in a positive way, because the second you do start talking about things negatively you can set people on the back foot, you can make them feel defensive, and you don’t want to imply to someone that they don’t care. It’s just that perhaps they don’t know about it; they haven’t had exposure to someone–a friend or family member, or themselves–having any accessibility needs, and so they don’t necessarily understand the impact that an inaccessible site can have on someone’s entire experience of the web. It is usually the difference between someone being able to access the site and just not being able to use it at all, which is quite an extreme difference. And so it is good to have positive ways to frame things. I’d say that I’m not a big fan of the “it will make you more money” case, just because I don’t think we should be thinking about the people we’re building technology for in that way. I think that it can be useful to say, “Look, you’re excluding a load of people. Let them in and you will make more money.” Hey, there is some truth to that. What I prefer to do is look at things like usability. I mean, by making something more accessible to people with very particular accessibility needs, you will tend to make a site more inclusive and more usable for everyone, because the majority of people will have some variation of that need. People may decide to use a website with a different input type; they might prefer to use keyboard navigation even if they don’t have an accessibility need for it. They might be like me: I don’t like watching videos. It’s not my thing. I don’t take in information very easily that way. I way prefer reading a transcript. That way I can skim it as quickly as I like, get the information I want, and go. And so, if you provide a transcript, that makes it easier for me as well. And so, we can make things far more usable if we embrace accessibility needs, and I think that’s a really good case to make, particularly as, I think, as an industry we understand experience more than we used to. That’s really something we understand–UX, user experience–and we understand the benefits that can bring to everyone. And so, I like to tack it on as something that’s really a part of that.

ALAINA: When I was looking at that ethical design pyramid that you’ve framed up, I feel like some of that was injected in there, that usability and that–the layers of considerations that we can make as we’re creating products and thinking about the ways that we put them out into the world. Can you talk a little bit about that pyramid and those principles and how that came to be?

LAURA: Yeah, so we wrote this pyramid, or a document, or a manifesto, that we call the Ethical Design Manifesto. Because we felt like it’s very easy to say, “Oh, that’s ethical,” or “that’s unethical,” but, really, we needed some points to help people understand, “What steps could I take in order to make my product more ethical?” And in its current version, we look at it as a pyramid where at the bottom we have the whole idea of embracing human rights and respecting human rights, because whatever else we do with the product, if we don’t respect human rights then we’re not respecting people. And there are some sort of more technical areas there, in that you want to make a product that is decentralized, or a product that is private, or free and open, or interoperable, secure, but really inclusivity and accessibility is one of those main points. That’s respecting people’s human rights. In the physical world, we are not supposed to, or allowed to, discriminate against people based on their accessibility needs, and so we shouldn’t be doing so in technology. And sustainability is also part of respecting human rights, because you are ensuring that your product will be there, and will be both economically and environmentally sustainable. The next layer up is human effort, and that’s something that we’re already very good at doing in technology: this idea of making things that are functional, convenient, and reliable. And that’s something we’ve been working on for a long time, because if we don’t have a product that is functional, convenient, and reliable, no one’s going to want to use it, or at least once someone’s used it or tried to use it, they’re not going to want to use it again. And then there’s that top layer: once you’ve made something that respects human rights and respects human effort, you’ve got a great product there. We need to respect human experience, because people are putting a lot of their time and effort into using our products and using our services. And so that’s when we can add on things that are delightful and cute and fun, but only if they do respect all of those things below. There is a bit of a danger in technology nowadays that we’re making things cute and delightful, we’re adding little cute doodles and we’re adding adorable little avatars and things like that, when we haven’t really thought about all of the stuff underneath it. What we’re really doing is creating design that is distracting–almost deliberately distracting people from what’s going on underneath, from what the real core of a product is. And so that’s why that pyramid is shown in that order. I think nowadays what we’re trying to do is refer to it more like a fruit, like an apple, where at its very core is the respect of human rights and having an ethical business model–a business model that does not rely on monetizing people’s information. And then the edible fleshy bit is respecting the human effort. And then the delight and the respect of the human experience is that beautiful, sort of lovely, delightful color and skin of the apple or the fruit, the thing that makes you really want to bite into it. It smells good, looks good, tastes good. So all of those things together will make an ethical product. Because the thing is, you could have another apple that looks beautiful and smells good and is really shiny and wonderful, but if the inside is rotten, if the core is rotten, you don’t want to eat that apple. And so if a product has a beautiful exterior but has a rotten core, we don’t want to use that product. So that’s one of the ways we’re trying to talk about it now.

ALAINA: I love that. It pulls in the ethics of it, but it also pulls in some really concrete vocabulary about how we can define what something better looks like. I’ve asked a few people now, “If we’re working toward a better web, what does a better web look like?” And I really like that framework for a web that has all of those considerations and layers baked in. I like that a lot.

LAURA: And that’s the thing, because it’s not any one thing that will make the web better. It is all of these different things working well together in harmony that’s really going to make a difference.

ALAINA: So, I’ve got a couple of last questions for you. I probably could ask you a million questions and take up your entire day, but I want to try and let you get back to that beautiful weather that you’ve got. I wanted to talk a little bit about, if you could remember it, when it was or where it was that this sort of calling became apparent to you. Did you always set out in your professional career to sort of advocate for these kinds of things or do you remember a time or a moment when it became clear that this is something that was important to you?

LAURA: I think personality-wise I’ve always been a bit like this. I’ve always been the person who got in trouble for having too many opinions, for trying to speak out about things that I thought were unjust or unfair. And maybe when I was younger, at school and things like that, maybe my sense of what was unjust and unfair was focused in the wrong place. But I always wanted to be a designer as well, from a very young age. I was really interested in advertising and how you could inspire people to feel a particular way with a particular aesthetic. That was how I very much saw it when I started out. And so these two threads in my life have kind of run alongside each other. I’ve always wanted to do design. I’ve always been that kind of person. And it was only really as I started working professionally and sort of understanding how design impacts our lives that I could start putting those things together. Because a lot of the time, I think, for a lot of us when you start working, what’s most important really is that you make enough money to be able to survive, and that’s what work is. And to some degree it’s a privilege to be able to see work as anything more than that, because not everyone can afford to speak up about things. Not everyone can afford to take risks that might get them fired, that might not get them any future clients. And I think I’ve always been at a comfortable enough point that I could get away with that stuff–that I knew that I could always get just one more client if I needed to, regardless of saying something outspoken on Twitter or writing a blog post that might wind a few people up. And so, I’ve tried to use that, and I continue to try to use that, in a way that I try to stay informed and listen to other people. I don’t just want to be a mouth. I want to be a pair of ears as well, [laughing] if that makes sense.

ALAINA: [laughing] You’ve got all of the things.

LAURA: Yeah, because it’s important to use your privilege to be able to speak out about the things that you see are unjust in the world and particularly to be able to speak out for people who may not be able to speak out in that same way for themselves.

ALAINA: So, when you think about what you’re going to work on next or what you’re working on at the moment even, what are you most excited about? What do you see coming up?

LAURA: I’m really pleased with the Indienet project that we’re working on now, the Indienet initiative. So, we’re working on a system of federated personal websites. This is essentially a system where you can make your own website very easily–you don’t have to have any ability to code or anything like that. And those websites can talk to each other in a way that allows you to be sociable and do the same kinds of things you can do on social media. I think one of the things that’s been really cool about it is that we’ve been working with the city of Ghent in Belgium, which is a very progressive city, because they want to look at funding this for their citizens, enabling their citizens to have their own space on the web and be able to talk to each other and talk to the city, funded by the city. Funded but not controlled by the city–it’s also important that these are not state-controlled websites. And so that’s what we’ve been working on, building it in a way that anyone can take it and use it. Any web host could take it and set it up for customers. Any individual who has the technical knowledge will be able to do it, but what we’re really hoping is that we can build a system that people can build their own services on top of. So they have an ethical base, something that’s very easy to use, very convenient, and inclusive and accessible from the very beginning, that other people can then build on and provide services around, too.

ALAINA: Wow, that is exciting.

LAURA: Yeah, I think it should be good.

ALAINA: It sounds like it. If someone wants to learn more about that or kind of follow along with the progress, where’s a good place to send someone?

LAURA: I would say the best place is to go to our website, which is Ind.ie. And you can also follow us on Mastodon–mastodon.social is the instance we’re on, and we’re @indie on there. We’re Indie on Twitter, and we also have a forum at forum.ind.ie where we kind of talk about a lot of different issues in the area. I try to post regular articles on the forum and on Twitter around this subject area, to kind of help people understand why we’re doing what we’re doing, because I think one of the important things is that we’re doing it for a reason. And we’re doing things in a very particular way because we want to be able to make things that are useful for a lot of people, not just for privacy nerds, essentially.

ALAINA: Thank you for working on all of this stuff.

LAURA: If I don’t, I don’t know who will do that for me. I want this for me. I want to be able to use it myself, so…

ALAINA: Alright, well, thank you so much.

LAURA: Oh, thank you for having me.

[TRANSITION MUSIC]

ALAINA: If you’d like to learn more about the topics that Laura was talking with us about today, I’ll be posting links to the articles we mentioned in the notes for this episode. Get future episodes wherever you get your podcasts, and in the meantime, you can join this conversation any time at strategycar.com.

[OUTRO MUSIC]