Margarita Quihuis: Envision a world that’s based on values rather than on utility

TechforGood | By: TRI | 2020-04-09

Tencent Research Institute: How do you evaluate the changes brought by internet products and services over the past few decades?

 

Margarita Quihuis: I think it has become apparent to almost everyone on this planet that the wave of technological innovation delivered by the internet has broadened how people talk about the internet itself. When we thought about its potential in the early '90s, we assumed it would serve many humanitarian purposes: it would bring people together, connect them in genuine and authentic ways, reunite families, and so on, things that were unimaginable before.

 

It was wonderful, but what we did not anticipate was the weaponization of technology. There was a lot of research in the early '90s saying, "People are going to treat technology as if it were another person." The consequence is that technology can influence you the way another human being can: it can persuade you to do things, which has since been borne out.

 

This technology has the potential to be very dangerous, and hardly anyone was talking about that at the time. Nobody was looking out for it or asking how we could prevent it from happening. It was hard to imagine that there would be bad actors doing bad things. When you look at the literature on influence and persuasion, there are inspiring world leaders, like Nelson Mandela, who persuade others to be better people, but there are also cult leaders who persuade people to do terrible things. The specific skills and tactics are identical whether they are used for good outcomes or bad ones.

 

I think it has become very apparent that technology has been used for very bad outcomes, and we just weren't aware of it. Because we weren't aware of it, we weren't able to assess the risk, and we did nothing to protect our social, political, and economic systems. That's why Stanford focuses on the ethics of technology and the ethics of design. We need to envision a world that's based on values rather than on utility, because we've been focused on profitability and return on investment to the exclusion of everything else. That has cost companies reputation and respect, which money cannot buy back.

 

Tencent Research Institute: As a behavior designer, how do you build ethical design into your products?

 

Margarita Quihuis: One way is to have a data standard: through measurement, you can detect whether a technology is turning out to be what you wanted. For instance, Twitter was designed by four men. Because of the way they move through the world, they don't feel the repercussions that women do. When a relationship goes bad, we can get trapped, harassed, and face the potential for violence. Our sensitivity to safety is very different from men's. Now imagine designing Twitter today with safety built in. If we were female engineers watching people harass or bully other women, we would probably come up with very different ideas for changing that behavior.

 

What we would do is build that standard into the development of new products and services. You would embed code that can detect whether people are being harassed or whether unusual patterns are emerging, so that if things are going in a direction you don't want, you can catch it early. It's not like 50 years ago, when we didn't have enough research on cigarettes and had to wait 30 years for the cancers to show up before we realized the harm.
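As a rough illustration of the idea, here is a minimal Python sketch of what such embedded measurement might look like; the metric, threshold, and numbers are invented for illustration and are not the lab's actual standard.

```python
# A minimal sketch of embedding a behavioral safety metric in a product:
# track harassment reports per active user and flag an upward trend early.
# The metric name and threshold below are hypothetical.

from dataclasses import dataclass


@dataclass
class DailyStats:
    active_users: int
    harassment_reports: int  # e.g. user reports flagged as abuse that day


def harassment_rate(stats: DailyStats) -> float:
    """Reports per 1,000 active users: the measurement behind the standard."""
    if stats.active_users == 0:
        return 0.0
    return 1000.0 * stats.harassment_reports / stats.active_users


def needs_early_intervention(history: list[DailyStats], threshold: float = 2.0) -> bool:
    """Catch the bad direction early: alert if the 7-day average rate exceeds the threshold."""
    recent = history[-7:]
    if not recent:
        return False
    avg = sum(harassment_rate(day) for day in recent) / len(recent)
    return avg > threshold


# Example: a week in which reports climb faster than usage.
week = [DailyStats(10_000, 5), DailyStats(11_000, 9), DailyStats(11_500, 14),
        DailyStats(12_000, 20), DailyStats(12_200, 26), DailyStats(12_400, 31),
        DailyStats(12_500, 40)]

if needs_early_intervention(week):
    print("Harassment rate trending up; review the feature before it ships widely.")
```

The point is not the particular threshold but that the signal is designed in from the start, so the team does not have to wait for harm to accumulate before noticing it.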

 

We have the Human-Centered Artificial Intelligence institute, and we're looking at whether AI algorithms are biased. The people who create the algorithms are human, and they're often not even aware that what they put into the machines, or the datasets themselves, might be biased. So the computer learns from biased datasets and produces biased outcomes. It's like our children: we wouldn't intentionally train them to behave badly. We want them to be polite, and we want them to be good citizens.
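To make the "biased data in, biased outcomes out" point concrete, here is a hypothetical Python sketch of a pre-training dataset audit; the groups, labels, and numbers are invented for illustration.

```python
# A hypothetical dataset audit: before training a model, compare outcome
# rates across groups in the labeled data. If the labels encode a historical
# skew, a model fit to them will tend to reproduce it.

from collections import defaultdict


def positive_rate_by_group(examples):
    """examples: iterable of (group, label) pairs with label in {0, 1}."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for group, label in examples:
        counts[group][0] += label
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}


# Toy historical hiring data: the label records who was hired in the past.
training_data = (
    [("group_a", 1)] * 70 + [("group_a", 0)] * 30
    + [("group_b", 1)] * 40 + [("group_b", 0)] * 60
)

rates = positive_rate_by_group(training_data)
print(rates)  # {'group_a': 0.7, 'group_b': 0.4}

# A model trained on these labels will tend to reproduce the 30-point gap
# unless the imbalance is surfaced and corrected, which is why auditing the
# data comes before training.
```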

 

Tencent Research Institute: What motivates the internet giants to practice Tech for Good?

 

Margarita Quihuis: A couple of things. One is external forces: governments issue antitrust regulations and crack down on technology companies, saying, "Look, you can't allow this kind of content on your platform." When a platform becomes an enabler of mass violence, governments will quickly force it to become safer.

 

But younger entrepreneurs genuinely want to do the right thing. They want to use technology to make the world a better place. I think if we give them the tools and methods, they will adopt them, because this younger generation is, if nothing else, much more values-oriented.

 

Our society is dealing with exponential threats: climate change, the polarization of society, the erosion of trust. If we can't trust each other, we can't cooperate and innovate to find solutions to these issues. The younger generation has realized that if we're too busy fighting each other and not talking to each other, we don't stand a chance of surviving.

 

Tencent Research Institute: In practicing Tech for Good, what's the biggest challenge: the details of how to do things right, or the mindset?

 

Margarita Quihuis: The mindset, yes. The first step is to get people's attention and the desire to do something. Right now, people feel that something must be done, but they don't know who should do it or what should be done, or they don't want to take any risk. So we have worked out a systematic innovation process to help them transform.

 

Tencent Research Institute: I wonder about the role of your lab. Do you prefer to work with the younger generation of internet builders, or to work independently on ethical design?

 

Margarita Quihuis: What we do is develop tools with the same methods and approaches. We work with entrepreneurs, but we also work with corporations, because if we look at the planet and ask who actually has the human and economic resources to find solutions, it's going to be companies. Much more of my time is spent talking to corporate executives who are open to digital transformation and innovation. I can help them with that, but I need them, in turn, to be good corporate citizens who create social impact.

 

For some corporations, given their positions in the market, there is meaningful work they can do that is economically sustainable, because the products still have to be profitable, while also addressing other issues. I'm more than happy to do something super cutting-edge with them.

 

This is a chance to do something really interesting and creative, something that requires courage. When I talk to executives that way, very honestly, they go, "Wow, now it's getting real." So to me, Peace Innovation is a sort of umbrella for moving people along in the direction they really want to go.

 

Tencent Research Institute: What do you think of the privacy issues around today's AI products? Do you have any solutions?

 

Margarita Quihuis: I don't know if you remember, but in the '90s Intel came out with a branding strategy called Intel Inside. Back then, personal computer buyers didn't pay attention to whether a machine had an AMD chip or an Intel chip. Then Intel ran a brand campaign in which a third party vetted its chips and gave a neutral evaluation, and consumers were willing to pay for the safer chips.

 

I see Apple moving in that direction with its marketing around privacy: their phones will protect users' privacy, their technology will protect privacy. So privacy is in some ways becoming a luxury, because such products are more expensive. The strategy has helped renew people's loyalty to Apple, and it has become a business model.

 

Tencent Research Institute: How do you understand Tech for Good?

Margarita Quihuis: I think we're talking about the same thing. We have always talked about ethics as constraints on people: "don't do this," "don't do that." But if you start from a blank piece of paper, what is the world we want to live in? Use that as the North Star. Do you want fresh air? Do you want to see flowers, birds, trees? You start by thinking about all the things you want, and then you can fill in that picture at higher and higher resolution until it becomes real. But if we start from what we don't want, there is nothing for engineers to build.

 

So if you want technology for good, you need to start with the outcomes in mind: radically imagine the world we want to live in in 5, 10, or 20 years, and walk toward that goal. It's not about having the technology first and then trying to find a purpose for it. What we need to do is figure out what our purpose is and convey it to our engineers. If a technology is just a tool without a purpose, discard it. We don't need it.
