Unscaled 4: The Minimum Virtuous Product

Airbnb, Uber, and Facebook have all had the public turn against them when their products caused damage. Even if the founders of these companies had good intentions, they could have made better use of data and AI to measure the impact of their products. It’s time to update the MVP acronym from Minimum Viable Product to Minimum Virtuous Product. Companies should strive to build in morally sound design principles from the start, with accountability, explainability, and transparency.

Show notes

(full transcript at bottom)

Hemant Taneja’s book Unscaled: How AI and a New Generation of Upstarts are Creating the Economy of the Future (referral fees will be donated to charity)

Hemant Taneja, managing director at General Catalyst

Ronda Scott, marketing partner at General Catalyst

Companies must consider the long-term consequences of what they build (2:43)

How does your mission fit into where innovation is going? (4:00)

Accountability, explainability, and transparency (4:28)

China’s positioning post-GDPR (4:35)

The importance of a diverse team (6:30)

What does diversity mean? (7:18)

Why regulation matters (8:19)

No zero-sum (9:30)

Credit Karma (10:39)

Spot, an AI for workplace harassment and discrimination reporting (11:30)

Amazon same-day delivery and unintended discrimination (13:58)

How do you catch these kinds of things before they cause damage? (14:29)

What can go wrong even if the algorithm does its job (15:04)

When and how should consumers hold companies accountable? (16:13)

The chain of accountability (18:40)


We want to hear from you

Please send us your comments, suggested topics, and listener questions for future All Turtles Podcast episodes. Season 2 is coming soon!

Email: hello@all-turtles.com

Twitter: @allturtlesco with hashtag #askAT

For more from All Turtles, follow us on Twitter, and subscribe to our newsletter on our website.




Phil Libin: The acronym MVP used to stand for Most Valuable Player. I don’t know anything about that because I don’t understand sports ball. More frequently in Silicon Valley it means the Minimum Viable Product. But we’ve kind of been calling bullshit on that. We actually think a better definition of MVP is the Minimum Virtuous Product.

Phil Libin: In this fourth episode of the Unscaled series from the All Turtles podcast, we’re going to talk with Hemant Taneja and Ronda Scott about how entrepreneurs can build in morally sound design principles right from the beginning of their product. How to know that your product is doing good for the world and not harm. And how to build in this idea of social responsibility right into your beta, right into your minimum product, the new definition of MVP.

Phil Libin: I’m Phil Libin, CEO and cofounder of All Turtles. Let’s have this conversation.

Phil Libin: Welcome back, Hemant and Ronda. So we’ve had a great discussion about Unscaled over the past few episodes of the series. We started out with this idea that a lot of the problems that we have with the internet right now, a lot of what feels broken, these problems have been around for a long time. They’ve been around since the beginning. It’s just that now they apply to everyone because of scale. Which kind of took us into this discussion of what is scale anyway. And about how before the whole world was wired up, you had to get big so that you could have impact. You needed to make a big company so that you could actually have a product that reached many people. You needed to often start making products for rich people and then get bigger, and bigger, and bigger so that you could make those products cheaper and cheaper, and let everyone experience them.

Phil Libin: But this whole model of having to get big before you can get to impact is actually completely stood on its head right now. Companies now can target a specific segment, be relatively small, hyper-focus on just having a great product team, use APIs, and platforms, and services, and rent everything else. They can get to really good product-market fit relatively quickly before they scale, which is a great superpower to have.

Phil Libin: But that also comes with a lot of problems, which is: if you’re moving fast and breaking things, how do you actually know when stuff is breaking? And so in the last segment we talked about these algorithmic canaries. These ways that companies must take into account the impact that their products are having, not just on their target customers but on the wider world. And actually saying, “Well, it’s the responsibility of the founders to do this. It isn’t somebody else. There are no magical adults in the room that are going to clean up your problems. You have to do it.”

Phil Libin: Which kind of takes us to this notion of the MVP, which in Silicon Valley used to mean Minimum Viable Product, but which I think now ought to mean Minimum Virtuous Product. How do you design virtue into the product from the beginning?

Hemant Taneja: Yeah, so as we in Silicon Valley started taking on more and more profound problems, we have to think about the long-term consequences of the businesses we’re building. At a very high level, it starts with thinking about: what if what we’re trying to build comes true, along with what else our peers are building in the Valley as well?

Hemant Taneja: One example I’ll walk you through is: If you go on 101 today, somebody is working on self-driving trucks. Somebody’s working on longevity. And somebody else is working on how do we make basic income work, because we’ve just resigned ourselves to the fact that there’ll be no jobs.

Hemant Taneja: But if all of that comes together … because each of those is a very noble cause … what is the world going to look like? If each of those projects succeeded, I think we’re going to have to go to four million truck drivers in this country and say, “We’ve got good news and bad news. Good news is that you’re going to live 30, 40, 50 years longer. Bad news is you’re not going to have a job or self-esteem, because we’re going to put you on a stipend for sustenance.”

Hemant Taneja: At an individual level for each of those projects, you’re not making the world a better place even if you were thinking you were in the context of your own mission.

Phil Libin: Right.

Hemant Taneja: So one thing is to think about broadly how does your mission fit into where innovation is going? And then you have to translate that into what does that mean for your product? A lot of the potent capabilities that we’ve talked about in the previous episodes around how we use AI and data, we need a framework for making sure we’re using it in a positive way.

Hemant Taneja: So the things I would mention. One: build your products with great regard for accountability, explainability, and transparency around how you’re using AI. Making sure that your core principles are correct. Making sure you’re making it transparent how you’re using the data. And explaining it to the end user so they know the choices they’re being given and why they’re being given those choices.

Phil Libin: And that’s a super important thing. And a massive change. This idea that you really do have to explain much better than we have been what’s actually going on and what we’re doing.

Phil Libin: Some of that legally because of things like GDPR, which I know a lot of people aren’t fans of. I think it’s great. The bureaucracy is going to be a bit onerous for a while. But I think the general philosophy of actually only collecting the data you need, and explaining why you need it, and getting permission is great.

Phil Libin: And it’s actually kind of a full employment act for product writers. There’s just going to be an army of people now whose specialty is writing explanations for why we’re asking for your contact list and that kind of stuff.

Hemant Taneja: By the way, not to interrupt but if you really believe that, you’re essentially saying China’s going to dominate the tech industry when you start thinking that way. Even though I agree with your core philosophy of GDPR, I think the question is where do you apply the principles of GDPR? Is it around data collection? Or is it around data use?

Phil Libin: You’re saying that because China doesn’t have this kind of stuff and so that’s more of a free-for-all. And you think that’ll make them move faster.

Hemant Taneja: Move faster, build better products, and [inaudible 00:05:50] in the end.

Phil Libin: Yeah.

Hemant Taneja: So there are consequences there as well as to how you apply the principles of GDPR.

Phil Libin: Right. But I think that’s a super fascinating discussion for a different topic. But this idea that you started with of a kind of first core principle is make things explainable …

Hemant Taneja: Yes.

Phil Libin: … and transparent. As explainable and transparent as possible. Yeah, I’m totally down with that. That feels like a good rule for if you’re going to make something now, how do you make it.

Phil Libin: I’d posit kind of a second rule for this, which to me almost seems like a shortcut. A quasi-magical shortcut, which is super hard to do. But if you can do this you’re already off to a pretty good start: work ridiculously harder than you think you have to in order to have a diverse team actually building your product.

Phil Libin: That doesn’t solve every problem but it’s a good step forward for many, many problems.

Hemant Taneja: To me, it may not be sufficient but it’s a necessary condition. You just need to be able to think about the impact your product and your company are going to have in all directions. And no matter how much you try to be empathetic to every part of society as an individual and as a founder or team of founders, it’s not enough. You really do need diverse points of view around the table to make sure that all those considerations are being properly covered.

Hemant Taneja: So I absolutely agree with that.

Phil Libin: How should founders do that? That’s easier said than done.

Hemant Taneja: Well, first thing is: What does diversity mean? And for us, diversity’s not just about male and female. I think it truly is about having the right stakeholders around the table from the different demographics that you’re thinking about for your business, and different generations. And so deliberately making sure you’re recruiting across whatever the dimensions of diversity are for your business is really important.

Hemant Taneja: And I’ve heard many stories of, “Well, we tried. We said for 60 days we’re only going to hire this profile of person versus not, and then we gave up.” I think there are always going to be trade-offs like that, but if you find yourself giving up and that’s your excuse every single time, then you’re not trying hard enough. There are definitely companies here that have done a good job of managing through those trade-offs and making sure the diverse points of view are around the table.

Ronda Scott: Some might say there’s kind of a third point there, to go back to our Scooter Wars conversation. And that is: part of your minimum virtuous product has to include regulation. It cannot flout regulation. It doesn’t mean that all regulation is good. It doesn’t mean that all regulation is sensible. But there is a framework, there is a reason why we have regulation. There is a reason why we have consumer safety boards and that sort of thing.

Ronda Scott: They’re not all bad. And so when you are thinking of a product, you want to think about your impact. You want to think about how you might comply, and if you don’t comply you want to think about a public-private partnership that maybe will move regulation towards where it should be.

Phil Libin: Yeah. Your product exists in a world that is much wider than just the target users of your product. So you have your customers and you have a responsibility to them. But then your product actually exists in a much broader world and you have responsibility to that entire world because you’ve chosen to put something into it.

Phil Libin: And thinking through having diverse viewpoints. Thinking through not flouting regulations willy-nilly, but thinking through the impact of those regulations. Trying to make things explainable. Those all seem like very good rules.

Phil Libin: I’ve got a fourth one, which I mentioned briefly in the last episode, which is no zero-sum. I think basically almost any product that tries to give an advantage to some group of people at the expense of another group of people is a poor application of technology, especially AI technology. That will scale very badly. You don’t want an arms race of people using your product versus people who aren’t.

Hemant Taneja: I’d completely agree with that. I think that’s a very good point. And this goes back to the example we were talking about, student loans for kids in college. That’s exactly the point. If you had considered that from the start, you would say, “Well, how do we solve for that?” so that zero-sum doesn’t exist. Because it will catch up to us in the long term. It’s not great for our business, just like it’s not great for society.

Hemant Taneja: I don’t think companies can endure by making short-term decisions they think are good for themselves that are not in the long-term interest of society. So the zero-sum game is a key point to that.

Ronda Scott: The good news is that we have the data and we have the capability to broaden our markets. Karma Credit for example. While there are some companies out there …

Hemant Taneja: Credit Karma.

Ronda Scott: … that they’re 100% … Credit Karma. Sorry. While there are some companies out there that are 100% focused on taking the cream of the crop, if they’re only insuring people who live in nice neighborhoods with low crime rates, or they’re only insuring the healthy, or only giving loans to the most creditworthy, there is huge opportunity … to use the phrase from earlier … going down market. And we have the ability to use data and algorithms, and we have to address those markets and make products that work for everyone.

Ronda Scott: There might be a couple of people left out of that. But if you take a more expansive view from the very beginning, you’re going to have the possibility of making a much bigger positive impact on society as a whole.

Phil Libin: It’s easy enough to do this from scratch, right? When you’re starting something new, to kind of pick problems that do this. Almost everything we work on at All Turtles fits into this mold. Spot, the new product that we just launched, which is an AI for workplace harassment and discrimination reporting, for example, kind of fundamentally has this built in.

Phil Libin: But it’s maybe an even more interesting question. What about companies that didn’t necessarily start with this but that are actually realizing their responsibility and trying to make good faith efforts. I’m thinking for example of Airbnb.

Phil Libin: Airbnb I think has been pretty open about some of the problems of people using their platform. Some racial discrimination. Some other types of discrimination that other companies would have said, “Look, we’re just a platform. It’s not our fault. It’s not our problem. This is what the users who use it want to do.” But Airbnb is taking the approach to actually say, “No, it actually is our responsibility and we’re going to identify it. We have the data to actually see. And we’re going to enforce policies that may not even actually be what the users want all the time, because the company is taking a more expansive moral view than their actual user base is.”

Hemant Taneja: That’s right. Airbnb’s a great example. I hope that Uber is also starting to head down that path.

Phil Libin: I hope so too. I think Uber’s a company that’s worth saving.

Hemant Taneja: Yeah. There’s leadership now and they’re hopefully taking a shot at it.

Phil Libin: Yeah.

Hemant Taneja: Even Facebook. I mean a lot of the issues that happened they’ve tried to make some swift moves. We’ll see if it’s starting to produce great results.

Hemant Taneja: But I’ve always believed that the founders at the helm of these companies are actually generally good people and they’re trying to do the right thing. I just think more deliberation, more intentionality about how we use data and AI in our products from the beginning would have been great. But hindsight’s always 20/20. The question is where do you go from here? And does that become a learning moment, not only for them but also for all of the new entrepreneurs that are starting to build businesses now?

Ronda Scott: Yeah, and I think the thing that we’ve realized is that our time to going off the rails is incredibly short. So …

Phil Libin: Yeah, that’s interesting.

Ronda Scott: I mean, we’re iterating faster but we’re also heading into the abyss even faster than we ever have before.

Phil Libin: Yeah.

Ronda Scott: So …

Phil Libin: And you can go from media darling with a very strong brand to you really just screw up a couple of story arcs and, yeah, you can hurt yourself much faster than you used to be able to.

Ronda Scott: That sounds like a PR problem but it’s not really. I mean … this is an old example from a couple of years ago … but when Amazon rolled out same-day delivery. And, hey, that’s great. I need something now. I can order it and it’ll be delivered to me in two hours. That’s fantastic. But it’s not fantastic if you’re delivering it to every single community in the Boston area except for an area of Roxbury, which happens to be a historically black neighborhood.

Hemant Taneja: I think that’s a good point. I think what happened there was …

Ronda Scott: They didn’t catch it. So the thing about …

Phil Libin: They didn’t catch it?

Ronda Scott: They did not catch it. The New York Times caught it.

Phil Libin: And now they could have. So as a first step it’s like, “How do we catch things?”

Ronda Scott: Right. The idea is that it went completely off the rails. They were trying to launch a new product, a new service. It seemed like it was a good thing but it assumes you …

Hemant Taneja: They didn’t launch it in Roxbury in Boston, which was one zip code that statistics would have said doesn’t have enough affluent customers for their service. But also they denied service to a part of the Boston area. And that’s not how we do business. And it wasn’t necessarily somebody’s fault intentionally.

Phil Libin: Right.

Hemant Taneja: But these unintended consequences are just as severe.

Ronda Scott: Yeah, the algorithm did its job. It found the …

Hemant Taneja: The most profitable customers and offered it to them.

Ronda Scott: Yes. And it offered it to them, which meant there were entire communities left out. And it didn’t just happen in Boston. It happened in Atlanta. It happened in Chicago. And there was no human oversight to that.

Phil Libin: Right.

Ronda Scott: So what do you need to do going forward with your minimum virtuous product? You need to think from the very beginning: what if it all goes wrong?
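The kind of automated pre-launch check being described here can be made concrete with a small sketch. The following Python is purely illustrative, with invented zip codes, groupings, and a threshold; it is not anything Amazon or the hosts describe actually using. The idea is simply to compare planned service coverage across demographic groups and route large gaps to a human reviewer before launch.

```python
# Hypothetical "algorithmic canary": before rolling out a service plan
# produced by a profit-optimizing model, audit its coverage across
# neighborhoods grouped by a sensitive attribute. All data is invented.

def coverage_by_group(zip_codes, served, group_of):
    """Return the fraction of zip codes served, per demographic group."""
    totals, hits = {}, {}
    for z in zip_codes:
        g = group_of[z]
        totals[g] = totals.get(g, 0) + 1
        hits[g] = hits.get(g, 0) + (1 if z in served else 0)
    return {g: hits[g] / totals[g] for g in totals}

def disparity_flags(rates, max_gap=0.2):
    """Flag groups whose coverage trails the best-served group by > max_gap."""
    best = max(rates.values())
    return [g for g, r in rates.items() if best - r > max_gap]

# Invented example: three zip codes, one predominantly minority zip
# excluded by the rollout plan.
zips = ["02115", "02119", "02138"]
served = {"02115", "02138"}  # rollout plan from the profit model
group = {"02115": "majority", "02138": "majority", "02119": "minority"}

rates = coverage_by_group(zips, served, group)
print(rates)                   # {'majority': 1.0, 'minority': 0.0}
print(disparity_flags(rates))  # ['minority'] -> needs human review
```

The `max_gap` threshold is the judgment call: it does not decide what is fair, it only guarantees that a human looks at the rollout plan before the algorithm's output quietly becomes policy, which is exactly the oversight that was missing in the Roxbury case.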

Phil Libin: Yeah. You have to think about people who aren’t your customers. I think in this case they were probably thinking, “Well, our customers are the people using this. And so, as long as they’re happy. That’s who our customers are. Customers are the people who have the service. The people who don’t have the service are by definition not our customers and therefore fall off the radar of consideration.”

Phil Libin: But I think maybe the mental shift is, yeah, if you’re making a product, you’ve got to think about everyone in the world that your product can affect. It’s not just the people who use it.

Phil Libin: That seems like a lot more work but it might be necessary. It might be the price you pay for having all of these unscaled superpowers of actually being able to launch things this quickly.

Hemant Taneja: It may seem like a lot of work early on. But given how quickly the right companies scale, how fast they get to scale, it also becomes a really big problem if you don’t address it early on, and it can be existential to your business.

Phil Libin: So what can consumers do to hold companies accountable? And in particular, how should companies deal with consumer complaints of this type? And are there cases, like in the Airbnb example, where actually the company may be explicitly going against what some of their consumers want because the companies taking a different moral view on things?

Hemant Taneja: On a visceral level, I don’t like putting this problem on the consumer’s head because I don’t think they have the complete picture.

Phil Libin: Right.

Hemant Taneja: They don’t really know why they’re being offered what they’re being offered, and they don’t really know how others are being treated in their context. So I think it’s actually difficult to make this a problem for the consumers.

Hemant Taneja: And by the way, that’s not how we solve it in the offline world either. This is where to me … and I know we’ll get into this later … the regulation and the software-defined regulation to see at a population level …

Phil Libin: Right.

Hemant Taneja: … how are these companies behaving. And going back to the measurement system and flagging that, I think that is a new governing function that needs to exist in society that does not exist currently.

Phil Libin: Great. I mean I think you’ve got kind of three parties. You’ve got the company, and the employees of the company, as kind of one group. You’ve got people, whether they’re customers or just people. And then you’ve got regulators. And I think the roles of all three of those are changing.

Ronda Scott: Yeah, but I do think there’s a role for the consumer. The consumer needs to understand the technology. They need to put in a little bit of work to understand the technology that they’re using and the services that they’re signing up for. And also, what they’re giving up when they use those services.

Ronda Scott: So, I don’t want to blame the victim here but I think as humans we need to be aware of the technology, the impact of the technology, and we need to be proactive about it. Whether that’s privacy, whether that’s election scamming, whether that’s leaving large swathes of people out in the cold when it comes to insurance or loans and access to financial services. We need to understand how this stuff works, and we need to be proactive about it, and we need to demand that our government do a better job of understanding what’s going on with the new technologies.

Phil Libin: Yeah. I think that’s right. And I think if we have a framework where we have a set of these algorithmic canaries, a set of these rules for how to make a minimum virtuous product, a framework, then you can have an escalating set of people that can catch problems.

Phil Libin: And so ideally, the problems are caught by the companies. That’s the best case scenario. They’re caught by the company and then they’re fixed.

Hemant Taneja: Self regulation is the best.

Phil Libin: Best case scenario. If that doesn’t happen, then maybe they’re caught by consumers. Maybe they’re caught by consumers, by journalists, by people who bring this up. And then there needs to be a framework for evaluating them, dealing with some of them, ignoring others, responding when appropriate, making changes. And then if all that fails, then there’s obviously a role for government and for regulators. And that’s changing as well.

Phil Libin: And we will be discussing all of these things in future episodes where we’ll talk about this new world of unscale giving new superpowers and new responsibilities to companies, to consumers, to governments. How do we make it all work? Really looking forward to the rest of that discussion.

Phil Libin: Terrific.

Hemant Taneja: Thanks, Phil.

Ronda Scott: Thanks.

Phil Libin: Thanks, guys.

Phil Libin: You’ve made it halfway through the Unscaled series. This podcast is a production of the All Turtles worldwide media empire. We recorded this episode at the cutting-edge Donatello Studios in San Francisco, California.

Phil Libin: All of our conference rooms are named after historical turtles and they’re alphabetical by floor. So on the first floor everything starts with an A, on the second floor B, and on the third floor C. And we’re on the fourth floor, so all conference rooms here start with a D. So if you have a meeting in Donatello, you know it’s going to be on the fourth floor. That’s just how clever we are.

Phil Libin: Thanks to Hemant Taneja and Ronda Scott at General Catalyst for getting halfway through the series with me. We look forward to the second half.

Phil Libin: If you have questions, comments, or suggestions for future episodes, send us an email at hello@all-turtles.com. We’d love to hear what you think we should be talking about.

Phil Libin: Thanks to everyone involved in the production of this series. Jim Metzendorf for editing, Marie McCoy-Thompson for production supervision and editorial management, Chris Ploeg for his audio expertise, Matt Ammerman for our theme music. Our freshly customized artwork is by Micah Rivera, Gabe Campodonico, and Carlos Rocafort the fourth, or Carlos Rocafort IV if that’s how you pronounce it.

Phil Libin: On behalf of the All Turtles team, this is Phil Libin. Thanks for listening. Join us for episode five called “Just the Right Amount of Personalization.”