Rob Goldman joined the growth team at Facebook in 2012 - the same year that News Feed ads launched. He became Facebook's VP of Ads and was responsible for more than 99% of their revenue until he left seven years later in 2019. Today, Rob and NFX Partner James Currier focus on the tragedy of measurement inside complex dynamic systems like Facebook, Twitter, or YouTube. Very few people in the world understand how to pick the right measurements, measure them accurately, and display them accurately to the people in the organization so that they can run the business based on them - and then, further, how to communicate them to the outside world in a way that people can understand and participate in the networks they are a part of. Read the NFX article here - https://www.nfx.com/post/rob-goldman/
James Currier:
So, today we have Rob Goldman, former head of ads for Facebook, responsible for 99% of their revenue. And he was there for seven years. And Rob is an old friend of mine from business school. And today we're going to be talking about a lot of different things, but one of the main things we're going to focus on, is we're going to be looking at the tragedy of measurement, because I think few of us understand how hard it is with these complex dynamic systems, like a Facebook or a Twitter or a YouTube to pick the measurements that you're going to be measuring, measure them accurately, display them accurately to the people in the organization so that they can run the business based on them. And then further, to communicate them to the outside world, in a way that they can understand and participate in the network that they are a part of.
James Currier:
And Rob, having been in the center of Facebook for so long, doing this on a daily, weekly, monthly basis, is in a very unique position to talk through how this actually works, and thus why we see the world functioning the way it does with all of us living on these networks like Twitter, and YouTube, and Facebook, worldwide. So Rob, let's talk a bit about Facebook, and Twitter, and YouTube, and some of these social media properties. You've had years of experience there, working in the deepest parts, and the most intense parts of those companies. And there are some challenges to running those businesses. There's operational challenges, technical challenges. And then of course there's the impact on society. I'd love to try to unpack some of the things that you've been able to see and deduce there. That would be interesting to startup founders, as they think about building and growing these organisms, as well as everyone who's concerned about how society moves and what the impact of "big tech companies" is.
Rob Goldman:
Yeah. Sure. I guess, probably like many startup founders, I came into it through the lens of metrics, and measurement and marketing.
James Currier:
Right. And you were the CEO of a company, you got acquired in 2014, and then they said, "Okay." What was the first thing that you were doing there, when you first got there?
Rob Goldman:
The company I was working on before I got in was basically measuring people's social voices, if you will, in terms of how wide they reach, and how much engagement they get. Trying to quantify aspects of
Rob Goldman (Pt (Completed 05/24/21) Page 1 of 19 Transcript by Rev.com
This transcript was exported on Jul 09, 2021 - view latest version here.
that, which is what most advertisers do when they run ads. So I was bringing that mentality to what I was doing. And I was brought in early on to focus on helping advertisers use the platform. So growing the number of advertisers using the platform, and after a little while I was leading a big part of the ads product infrastructure, and eventually led the whole ads team for a couple of years, towards the end of my tenure.
James Currier:
And what percentage of revenue was that?
Rob Goldman: 99%.
James Currier:
Okay. So that was in your bailiwick?
Rob Goldman:
Yeah. We also sold some VR goggles.
James Currier:
Okay. So 99%. Got it. So that gives us a sense.
Rob Goldman:
Yeah. And when you come at it from the advertising point of view, you think in terms of measurement, and I think that is at the heart of it. For people who don't work in these companies, or don't know people who do: they're just giant teams of people focusing on metrics, on understanding them and what drives them, and designing systems to improve them. And choosing the right ones is just so, so critical. I think we talked about it a little before, but that probably deserves whole podcasts of its own. But I think that's what you see when you get in there.
Rob Goldman:
And if you're on the outside imagining these kinds of selfish or manipulative corporations, you're surprised when you get into the center of it. And you just see these metrics, and the metrics really they're designed to be, or they're thought of as mirrors reflecting the needs of the people using the product. That's the way that the teams want them to function. They don't always function that way, and sometimes in catastrophically bad ways, but I think they're always thought of as being a good indication of somebody doing something right. If they expand and... I think at the heart of those are metrics around how much time, and how much engagement you spend on the platform.
James Currier:
And also, I think the picture is that when you get in there, and you're a founder, and you're just a person with the knowledge and the life experiences you have, you're surrounded by other people, all with jobs, all with their motivations, all with their KPIs that they need to hit. Everyone just trying to execute themselves. So this idea of some sort of one individual giant evil corporation, it's not. It's people striving to be the best they can in their job, with people they admire and people they like working with. And they've got a certain set of metrics that they either inherited, or they made up a year or two ago, and
now they are all interacting with each other, this network of people moving those numbers. And that's their job.
Rob Goldman:
And just to give you a sense of the scale, and just the massive growth of that platform from when I joined, under a billion, to now I think 3.6 or something at the last... If I'm not mistaken.
James Currier:
So 1 billion of revenue in a quarter, and now 3.6...
Rob Goldman:
No that's people. People network, 3.6 billion people. That's like... It is a massive expanding thing around the globe, and the things that were happening on it, and the interactions between people that it enabled and facilitated were just exponentially growing. And there were many teams focused on metrics where there were clearly some issues, and those were given to another team. So there was almost like, "Well, another team is worrying about customer service on this, but we're going to just focus on making it grow faster." And those things are a bad recipe, oftentimes really bad things grow that way.
James Currier:
And when you got there. How many people were at Facebook, and now how many people are at Facebook? Do you know?
Rob Goldman:
I don't know the answer to either of those questions. I'm guessing it's something like 3,000 to 25,000 plus, it's something like that. So a 10 X-ing of the company in the time I was there.
James Currier:
Well, I'm just trying to say, we've got this giant network of between a billion, and now 3.6 billion people. And also you got this growth of 3000 people to 30,000 people, let's say. The networks, they're both networks, and those two networks are interacting.
Rob Goldman:
Totally. And to the extent that it's possible, the way that they were designed to interact, internally anyway, was trying to task teams with a focus on these various bits. And if you have teams focused on something like time, or something like sessions, or something like... I don't know, engagements. Then you have this meta system you've set up, and people in the network react to that. So the kinds of people creating content see how the algorithm rewards certain behavior or punishes others, and you get what you've been seeing over the course of last year, which is this hyperbolized debate.
James Currier:
Do you feel as if the Facebook company, and maybe Twitter and YouTube, could decide to pursue different metrics that would perhaps not lead to such catastrophic problems? Or at this point, are they what they are, and it really can't change? Too many people's salaries, too many people's careers are dependent on moving the numbers up. The network has metastasized around these metrics and that's that.
Rob Goldman:
No, I'm an optimist. And I definitely believe that it could be changed and improved. And I think there are already really good techniques being developed all over the place, at the big companies, at the Facebooks, the Amazons, the Googles of the world right now. We saw several of them on the ad side, where there are certain... I don't know, what's the right way to describe it, data signatures or something, that usually indicate something is out of control, and you can correct for them by dialing them down. So for example, in an ads algorithm, a higher click-through rate maybe is viewed as better, up until, I don't know, a click-through rate of 15 or 20% or something. And maybe above that, something weird is going on. Maybe something that shouldn't be going on is going on.
Rob Goldman:
So maybe you want to take an extra special look at that sort of thing, and maybe even want to algorithmically correct for it. And that's the kind of thing you can do to prevent people from trying to hang out on the ends of the algorithms, in an adversarial behavioral way. So I think there are a lot of great techniques on the frontlines that could eventually be applied to the problem at a broader level, if we decide we've reached some consensus about what rules we want to set, what sort of things are in bounds or out of bounds. And those are starting to happen in the weirdest ways, and I think we'll probably have dozens of countries experimenting with different regulations and laws with social media, and we'll see how they all go. But I think we're about to go through a period of experimentation about the way society wants to try to affect these social systems.
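The guardrail Rob describes can be sketched in a few lines. This is a hypothetical illustration, not Facebook's actual ranking code; the 15% threshold and the hard cap are assumed values for the sake of the example:

```python
CTR_SUSPICION_THRESHOLD = 0.15  # illustrative; "above that, something weird is going on"

def ctr_ranking_signal(clicks: int, impressions: int) -> float:
    """Turn a raw click-through rate into a ranking signal, dampening outliers."""
    if impressions == 0:
        return 0.0
    ctr = clicks / impressions
    if ctr <= CTR_SUSPICION_THRESHOLD:
        # Normal regime: a higher CTR is simply better.
        return ctr
    # Suspicious regime: cap the credit at the threshold, so adversarial
    # behavior can't profit from "hanging out on the ends of the algorithm".
    return CTR_SUSPICION_THRESHOLD

def flagged_for_review(clicks: int, impressions: int) -> bool:
    """Flag implausibly high CTRs for a closer human or automated look."""
    return impressions > 0 and clicks / impressions > CTR_SUSPICION_THRESHOLD
```

The design choice is that the signal stays monotone up to the threshold and flat beyond it, so an attacker gains nothing by inflating clicks past that point; real systems would dial the response down more gradually and route flagged cases to review.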
James Currier:
I've got it. So we anticipate that there are going to be laws made, and then there are going to be Facebook teams, internal to the company, who are going to have to respond to those laws, so they can continue to operate in Sri Lanka or in India or wherever.
Rob Goldman: Exactly.
James Currier:
And those teams will then have communication amongst themselves as best they can. They'll be very busy, but they'll try to carve out time to say, "What are you learning? What are you trying? What are you..."
Rob Goldman:
Well, yeah. And we'll see what happens to the networks in those places. One of the things that's really interesting at the bottom of Facebook, and I think we talked about it a little last time, is identity: really all you need is an email address or a phone number. There's no real ID that's getting checked. You could imagine some countries might change that rule.
James Currier:
Sure. To start tightening that down. What are some of the things you can imagine changing for Facebook, or Twitter, or YouTube? One of the things that I was always thinking is that these are giant networks. We've moved from this place of one-to-many media, like radio and television, to many-to-many, and that's just a completely different animal. And we haven't really grokked what that means for us yet. But one of the things clearly is that the crazier voices, the voices that get clicked on, they get more share of voice than they did in the old world, for sure. The fringe ideas, the fringe voices too. But because this is a giant network, we can't necessarily fix it with a hierarchical response. So if this country is going to have this...
PART 1 OF 4 ENDS [00:11:04]
James Currier:
We can't necessarily fix it with a hierarchical response of, this country's going to have this law, or da, da, da, da. It's just going to be kind of a mismatch.
Rob Goldman:
Yeah, yeah. It's going to interact, yeah. It's going to be strange, the way it interacts. And I worry about, sometimes, the way it interacts. One of the things that scares me most is when countries take down the internet. And that's what happened in India last, or I guess it was two summers ago now, in Kashmir. It'll be interesting to see the directions they take.
Rob Goldman:
The way I was thinking about it when I was inside Facebook was really kind of led by the laws that we were looking at at the time, the GDPR type regulations that we were seeing in Europe. And I think the feeling there is that, if you paint a picture forward you'll have new forms of transparency required, and a few new forms of controls, maybe, required. So that people can control their own experiences a little bit more than maybe they can now. I think Facebook's actually pioneering that on the ad side.
Rob Goldman:
The basic idea is, there should be no data that Facebook collects about you that is used to show you ads that you can't see or control. You ought to be able to see it all, and you ought to be able to turn it off if you don't want it used that way. And that's a nice, simple idea. And I think you could require something like that. I think people feel spied upon. They feel like people are watching them and that their purchase behaviors are being noted.
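The principle Rob states here, that no data used for targeting should be invisible or uncontrollable, can be sketched as a simple data model. This is a hypothetical illustration; none of these names reflect Facebook's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class TargetingSignal:
    name: str             # shown to the user, e.g. "interested in cycling"
    source: str           # where it came from, also shown to the user
    enabled: bool = True  # the user can switch it off

@dataclass
class AdPreferences:
    signals: list = field(default_factory=list)

    def visible_to_user(self):
        # Everything used for targeting must be listable; nothing hidden.
        return [(s.name, s.source, s.enabled) for s in self.signals]

    def opt_out(self, name: str) -> None:
        for s in self.signals:
            if s.name == name:
                s.enabled = False

    def active_signals(self):
        # Only signals the user hasn't disabled may be used to pick ads.
        return [s.name for s in self.signals if s.enabled]
```

The invariant is that the ad-selection path only ever reads `active_signals()`, while the settings UI renders `visible_to_user()`, so seeing and controlling are built into the same structure rather than bolted on afterward.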
Rob Goldman:
And I think what's at the bottom of that problem is email addresses and phone numbers. People give away their email addresses and phone numbers all the time. And if you're a startup, especially in this current world that's happened since iOS 14.5, and you're thinking about building a relationship with your customer, you want to get their phone number or their email. It's the heart of the trade. And you give it up so many times to so many businesses all over the place. And it's aggregated, and bought and sold by lots of nefarious third parties where you don't really have the ability to get it and control it.
Rob Goldman:
And so you could imagine a transparency law, or an access law, that just lets you get control of those bits of your data back. And then one would hope, over time, people would get more comfortable, and the controls themselves would just get better. And so people could go turn things on or off on their experience, whether it's their search experience, or their video experience, or their social media experience, and just have more facility.
Rob Goldman:
And so, yeah, I'm looking for something more entertaining right now, or I'm really looking for something more for friends or family, or whatever. And then people will take a little more agency. The controls now are hard to use, or inaccessible, or nonexistent.
James Currier:
Right, for most platforms. And I think what you were saying is that Facebook has built them, they're just not used very often.
Rob Goldman:
Yeah, so ad preferences are the ones the ad team had worked on. And lots of teams have their preferences and care about them. I think there's also a lot of product managers who are thinking really hard about enabling better forms of control now. So I do think it's coming, and I think it's needed, and I think some countries will probably require it, and then we'll see how it works.
James Currier:
So you're saying that we are potentially on this next eve of maybe a web 3.0 where because we're going to redefine privacy, and our identity, and how data is exchanged, and what the rules should be around that, over the next two to three years, that we're going to get sort of a new flourishing of something. Potentially new openings to compete with Facebook, or Twitter, or YouTube, or something.
Rob Goldman:
Yeah, potentially. And potentially new models to give people back some feeling of agency over the way these experiences work for them.
James Currier:
Yeah. We've seen a lot of startups coming to us saying, "People care about their privacy, we're going to be the Facebook for people who want their privacy." And that never really works. It doesn't seem to be the main motivation. But it does feel as if, if you did find a new motivation to build a new graph, you should build it in a way that does all these things you're saying with privacy.
Rob Goldman:
Yeah, I mean, I think there are many people who don't mind personalized ads and would be happy to have them. And there are, I'm sure, many people who would rather turn them off because they feel, I don't know, that they're creepy or something. And it's not obvious to me that the platforms wouldn't actually be better off if people who don't want them don't get them. There's other ways. I just don't think they're going to go there until there's a law.
James Currier:
Interesting. And why not?
Rob Goldman:
I don't know, just the dynamics of competition, I guess.
James Currier: Okay. So-
Rob Goldman: Risk, taking a risk.
James Currier:
Got it. So a Twitter or a Facebook won't proactively do that, they won't develop a culture of stewardship enough to say, "We should get ahead of this. Let's start building some innovative, thoughtful things and testing them out." They're going to wait until competition or laws force them to. What's behind that mentality?
Rob Goldman:
I don't think they're waiting. My sense is that they're working really hard. And, I mean, I know when I was there I was working really hard at upgrading the infrastructure to be just much more aware of what's going on. The reality is, these systems were all built during periods of hypergrowth. And so it's not the cleanest infrastructure. And just knowing, for example, every single little bit of data that travels around the sort of backend systems of Facebook, which permissions came with it, when was it collected, what country. Those are systems that don't exist in most places.
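The lineage problem Rob describes, knowing for every bit of data which permissions came with it, when it was collected, and from what country, can be sketched as records that carry their collection metadata with them through every backend system. This is a hypothetical illustration of the idea, not Facebook's actual infrastructure:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Provenance:
    collected_at: datetime  # when the data was collected
    country: str            # jurisdiction it was collected in
    permissions: frozenset  # purposes consented to at collection time

@dataclass
class TaggedRecord:
    payload: dict            # the data itself
    provenance: Provenance   # travels with the record everywhere it goes

def allowed_for(record: TaggedRecord, purpose: str) -> bool:
    """Downstream systems may only use a record for purposes its
    original collection permissions cover."""
    return purpose in record.provenance.permissions
```

The point of the sketch is that permission checks become a local lookup on the record itself rather than a forensic reconstruction after the fact, which is exactly the flexibility retrofitting is meant to buy.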
Rob Goldman:
And I think they're really working hard to try to have those sorts of systems so they have the kind of flexibility to do this sort of work. And certainly, most startups don't have them. And yeah, you may even see a situation where they're required to be open-sourced so everyone can look at them somehow. I don't think they're waiting, I think they're leading in many ways, but down a pretty dark and unknown path.
James Currier:
Got it. Not a dark path, but just an unknown path. And so they're-
Rob Goldman: Yeah, unlit, yeah.
James Currier:
Unlit. So they're not moving at an incredible speed, because no one's really sure what the best solutions are. And there's a limited amount of time, a limited number of hands to build the new handles that you then want to pull to adjust things.
Rob Goldman:
Yeah.
James Currier:
Yeah. And talk about the sort of emotional reaction. Because the people who are running Facebook are humans, and they had emotional reactions to all the stuff that went down with the Trump election of 2016, and then the subsequent stuff, and the press really turning against big tech, in general, and Facebook in particular. I mean, talk to me about some of the ways people were feeling and they were reacting. Because I think that's fascinating, to try to think, they must be so proud of the machine that they've built, they must sense all this incredible power through their fingers as they have the ability to impact 3.6 billion people, but also maybe some frustration, or disappointment, or anger around sort of the narrative around the company, and whatnot.
Rob Goldman: Sure.
James Currier:
Defensive reactions, that, "Oh, no matter what we do, everyone's going to complain."
Rob Goldman:
Yeah, no, I guess, I mean, I think the main emotion that maybe, I think, people don't understand or hasn't been expressed is the magnitude of the challenges, and the tradeoffs, and the absolute passion on both sides of the question.
James Currier:
What's the question, Rob?
Rob Goldman:
Well, I run into people often who think Facebook should just X, and then X is, I don't know, get rid of hate speech, or get rid of misinformation, or ban PACs, or ban political ads. And it's easy to say, but what they haven't gone through is the exercise of, what would happen if Facebook did those things? And that's equally, if not more, troubling. Both sides of that are difficult. And I think you saw that, clearly, over the escalation from 2016 to 2020, ending in the takedown.
James Currier: Of Trump.
Rob Goldman:
Of Trump in January 2021. And that is so chilling, to see a public voice as important as that one silenced in that way. And yet, it may be even more chilling to imagine leaving it up in that circumstance. And so if you think about it, there's a lot of consternation about it now, I think, rightfully. It is just such a bad place for the platform to have to be. For any of these platforms to have to be, kind of in a very difficult spot. And I think you're starting to see what happens when you have big, important voices taken off the platform, which is, they're building a following outside of the platform.
Rob Goldman:
So just imagine what that would look like if there were voices being deplatformed regularly, which is what, I think, one side of the argument really would prefer. And imagine where those places are forming now being 10 times bigger. And is that better? Is that a better outcome? So they're just not easy, they're just very, very tricky, very hard answers. I think it's completely fair to say that people involved, during that time, never anticipated the kind of virulent manipulations that we saw, across the platform, all over the place.
Rob Goldman:
And I think seeing the way it was used to cause these terrible sorts of social bads made people even more concerned about what might happen if that change you were thinking would be easy and important to make was made. What would be the next-order effects of that? So it's like, at a time when waves are sloshing around the pool, it maybe is a good time not to make sudden changes.
James Currier:
Yeah, interesting. And so you hesitate, given the acrimony around-
Rob Goldman:
But there were many changes made, across the board, in order to respond to it. Some, I think, that worked really well, and others that didn't.
James Currier:
Right, and you had to take a guess. There's a comment that people didn't anticipate that it would be used in such a virulent, nasty way. That's a bit of a surprise to me.
Rob Goldman: Yeah.
James Currier:
Because you're looking at the data, in 2013, '14, '15, and there wasn't any sign that the storm was coming?
Rob Goldman:
Well, I mean, maybe someone was looking at that data, but I was not. I mean, I'm sure there are traces, and obviously, we went back forensically and shared all the traces that were found later. But no, there was nothing obvious or overt there. And I guess what I'd say is, it was certainly knowable. So it's not as though people who studied the way information systems had been manipulated in the past didn't have some idea that maybe these would also suffer from that sort of manipulation.
Rob Goldman:
So it's not as though the expert knowledge didn't exist in the world, but just, the expert knowledge didn't exist on the teams that needed it when they needed it. And it's not, in my view, sensible to expect those teams to know to ask for it. There's so many things they don't know they don't know. So what you want is a system that can take the expert knowledge and bring it to the team when they need it.
Rob Goldman:
And that was systemically what was wrong. And that's what was missing. And that's what we put in place with the transparency archive. So now those professors, and antiterrorist experts, and propagandist people can look at everything that's happening. And if they see something that looks wrong, they can call it out.
James Currier:
Mm-hmm (affirmative). And are they doing that today?
Rob Goldman:
That didn't happen before. They are. There's a lot of uproar, there's a lot of headlines generated, or there were, at least in 2020, that were generated from behaviors in the ad-
PART 2 OF 4 ENDS [00:22:04]
Rob Goldman:
Or at least in 2020, that were generated from behaviors in the ad archive. Which groups were spending on which ideas, you know? And there were take-downs, there were many take-downs. Many, many more take-downs in 2020 than there had been in the similar period of the cycle in 2016.
James Currier:
Because Facebook essentially crowdsourced the analysis of the data to help spot signal.
Rob Goldman:
Yes. Because people know what the Russian hackers like to do, but there's another group of, I don't know, hackers from a different country who behave in a different way that some expert in that country knows, and can look up and check.
James Currier:
They've got a hotline and they can call the team inside of Facebook and say, "Hey, check this out," and then-
Rob Goldman:
Then they can report it from the archive. And if it's a political ad, it'll get looked at, and those patterns can be found, and then those take-downs can happen. I mean, I'm sure there's other signals the Security Team looks at, of course. But I guess from the point of view of this very specific problem we're talking about, which was 2016 ads, there was a practical problem and a systemic one. The systemic one, I think, was the one we were trying to correct for there. I'm sure there are many other systemic problems, but the ones we tried to set out at that time, the principles by which we wanted to measure our own work and then be held to those, that was a way of trying to lead through that kind of unknown territory, this kind of dark cave or whatever, with a flashlight.
James Currier:
And how successful was that? I mean, who was on the committee or whatever, the group, to come up with those principles? Was that you and five other people? Was it you and-
Rob Goldman:
Well, I mean, yeah, some of them were kind of like team principles from the beginning, team culture. I mean, we're talking here just about the Ads Team. I'm not talking about Facebook more broadly, or its mission statement or founding principles. All of those things are super important, and also totally relevant to talk about, because they were deep in the Facebook culture, and the Facebook culture is a much bigger test tube than anything that was happening on the Ads Team. But there were sort of simple principles like we've been talking about here, you know? Access, transparency, control, the idea that small advertisers have the same tools and power as large advertisers, and kind of democratization of technology. All these things were kind of principles of the team.
Rob Goldman:
And then you can look at aspects of our product and say, "Well, we have a principle around transparency, but this part is still kind of hard to understand. It's sort of opaque. We could do better here." And people don't understand how their data gets into the system, and they don't like it. How can we give them better tools? How can we give them better controls? How can we make the preferences more sensible? You know? And those things can find their way onto roadmaps. And you can do good work by kind of following the principles down into the infrastructure. And I think that's what was happening and people, at least on the Ads Team when I was there... And I think people don't appreciate how long it takes, just how much work it is. The infrastructure is deep, and it's big, and it's being used every day by billions. And so to swap it out, kind of in the middle, is hard work. It's just work, but it's the work you need to do.
James Currier:
And this idea of coming up with principles and then trying to live by them, developing some laws for the team and then treating all of the advertisers equally, or treating everybody equally, that seems like a good idea.
Rob Goldman:
Well yeah. It's easy because you can delegate easily that way. And you can have people following the principles in their work, in lots of different areas, and feel reasonably good that you'll be able to understand what they're doing and why.
James Currier:
And what would make it hard to enforce those principles, or to agree on principles, or to get consensus in any organization about principles?
Rob Goldman:
Yeah. I mean, there was a healthy debate about the principles, and lots of groups were kind of consulted. It was a good sort of exercise, I think. And living them every day is sort of the case law of the contradictions, because they're often difficult to live by, because they contradict each other.
James Currier:
Like laws do in this society.
Rob Goldman:
Sure. Yeah, of course. Yeah. I mean, one of the basic questions deep, deep in this is freedom of expression versus safety, you know? Then you have to kind of make the trade-offs, and then you have sort of a judicial question of which principle wins out here. But I guess that was the exception, I think; it was easier to make a very long list of things that just didn't quite meet the principles, that we just have work to do on. And I think that was very motivating for everyone, and it was actually very easy to know how well you were doing by looking at how much was left, you know? And I think that was really productive. And I think if there is a good sort of story to tell out of this, it's that Facebook, and maybe companies like it who are facing similar challenges, and I think there are a handful, are just more prepared now, from an infrastructural level, to deal with them than they were when they cropped up five, six years ago.
James Currier:
Yeah. I mean, this is one of the arguments against breaking up a company like Facebook, just that the trust and safety mechanisms and the infrastructures have been tuned every week, every month, every year for the last many years, trying to fend off all the nuttiness that could go on.
Rob Goldman:
Mm-hmm (affirmative). Yeah. And seem to be getting better, yeah.
James Currier:
And they're getting better. I mean, what I think now goes past without anyone mentioning is the amount of stuff that does get cleaned off the platform that we all get protected from.
Rob Goldman:
Yeah, exactly. The take-downs per quarter is now public. I think there's like 1 million a day or something of fake account take-downs now happening, and that's being reported more transparently, and there's the transparency report now. So there's things happening, and I think progress is being made. But I think the quote is it's like 99 point some percent of the fake accounts are found before they ever get live, but that other fraction of a percent is out there causing all kinds of problems.
James Currier:
Right. And it's cumulative, because it's still there five months later. Yeah, interesting.
Introductory Voice:
You're listening to the NFX podcast. If you're enjoying this episode, feel free to rate and review our channel, and share this conversation with someone you think would benefit from these insights. Follow us on social @nfx and visit nfx.com for more content. And now back to the show.
James Currier:
What do you make of the Facebook... they're trying to call it "The Supreme Court" or something. What is the-
Rob Goldman:
Rob Goldman (Pt (Completed 05/24/21) Page 12 of 19 Transcript by Rev.com
This transcript was exported on Jul 09, 2021 - view latest version here.
I thought that was a fascinating decision, and I actually really liked it. I think a couple of things. First, I think that the Oversight Board is a good example of Facebook not waiting around for a law.
James Currier: Mm-hmm (affirmative).
Rob Goldman:
They're aware that it's weird and not comfortable for people inside the company to be making all the decisions. There isn't any structure, other than the laws out there, to use, and so they created this board. And so I think it's proactive, and I think it's risky and brave. And you were asking before, like, "Why won't they do it without being prodded?" Here's a case where they did. And I think the board said something really good, which is like, "You're in limbo here. You're being unclear. You can do whatever you want, you can take his account down forever, or you can take his account down for some temporary period, but you can't say nothing about the period, or the term, or the principles you're using to evaluate when to put it back." There was an arbitrariness that was being called out.
James Currier:
Whereas Twitter just declared, "Donald Trump is done forever."
Rob Goldman:
Yeah, which you could argue about, maybe you think from a free expression point of view, that's way worse. I'm not going to take sides there, but I think you can certainly appreciate the need for clarity, which I think was what the Oversight Board was calling out there, and I liked it. So I'm curious to see what happens.
James Currier:
Yeah, so they could deploy the Oversight Board as a new node in the network of Facebook, to help the network behave better.
Rob Goldman:
Yeah. Is that sort of like your ombudsman concept?
James Currier:
Yeah. We were talking about having an ombudsman, which used to exist in most newspapers, which was this sort of person who is charged to be in between the needs of society, and what's going on internally. And then they would have some powers, they would be there for a period of two years. It would be paid for by the company, but their job would be to be reporting to the outside what's going on inside, and to help effect change, knowing the details. Because change is hard, right? And you have to do a lot of work to make change, and so a lot of people say, "Oh, this company should do that." But the ombudsman is inside the company and knows what it would actually take to get that done, and what the timelines would be, and maybe have some deeper discussions with people about what the second order effects would be of a change or something, so they could have better judgment. But their job would be not to look after the shareholders, but rather to look after society, and what the best interests of the many are.
Rob Goldman:
Interesting. So, I mean, what's your read on the Oversight Board through the lens of the ombudsperson or whatever?
James Currier:
Well, my read on it is that the people on the Oversight Board don't appear to be people like you, Rob Goldman, who actually understand the technology, actually understand social media, understand product development, understand how tweaking the size of the Like button, and the color of the Like button, can dramatically change people's engagement. They're just not as qualified as many others, like yourself, would be to adjudicate wisely on things, number one.
James Currier:
Number two, I think the board takes too long. I mean, the number of things they actually review over a given period of time, given the speed with which the news cycle moves, and given the speed with which the network is growing, it's just mismatched. And it appeared to be more of a sort of a solve. It appeared to be a PR move, rather than a really earnest attempt to figure out if there's a way for the whole network to move better.
Rob Goldman:
But I mean, so you would compare it, from a structural point of view, as weaker than an ombudsperson, who would be an internal role?
James Currier:
Yeah. And weaker in the sense that it won't be as rapid, weaker in the sense that it won't have the insider knowledge about how things work, weaker in the sense that it won't be in on the inner conversations about the different personality types that might exist in a company, and how they all balance against each other. You know, so-and-so might be non-confrontational, and so-and-so might be sort of realpolitik. There are so many different personality types that different people have, and those things matter in terms of how people think about what they think the solution should be for problems. So I think an ombudsman being inside would understand that, whereas this committee of a lot of people is just not going to be as effective at understanding that, so.
Rob Goldman:
Mm-hmm (affirmative). Interesting.
James Currier:
That's my guess, but I mean, I'm glad they have it.
Rob Goldman:
Yeah. And what did you think of the decision on the Trump account?
James Currier:
See, this is the thing; I don't understand the second order effects, but somebody said, "Boy, it sure is quieter around here."
Rob Goldman:
I meant the Oversight Board's decision, like on the need for some clarity on the timeframe.
James Currier:
I think that's great. I think that's a good point, but I don't think that that's that big of a deal. It seems to me like that's kind of obvious that we should have that.
Rob Goldman: Yeah.
James Currier:
Like Twitter did, like just, "Here's the deal." In terms of me judging whether one decision is right or not, that's not really what I'm interested in. I'm interested in: what's the network design?
Rob Goldman:
Exactly. Yeah, I'm not trying to get at your opinion on the decision to put the Trump account up or down, but rather the Oversight Board and how it worked there.
James Currier:
Yeah. And I mean, I guess my point is, I think that's a good demonstration of how the Oversight Board is not particularly strong or powerful, because if that's their biggest decision, it's like, "Okay, well that was kind of small."
Rob Goldman: Anticlimactic yeah.
James Currier:
Anticlimactic. Yeah. So that would be my opinion. Now, look, I think the transparency [inaudible 00:32:49] that you've got to give all the data out is really where I look at network solutions for network problems. You're now opening up the data to let the network of analysts and the network of experts join the network, in a way. Join in influencing this-
PART 3 OF 4 ENDS [00:33:04]
James Currier:
... join the network, in a way, join in influencing this organism that everybody tries to sound smart by saying complex adaptive systems, right?
Rob Goldman: Yeah, yeah.
James Currier:
Joining this complex adaptive system in making it function better. And that is a network solution. I've often thought that Facebook should have a webpage where they list out all of the hotspots that we have, whether it's abortion or anti-vaxxers or Trump or whatever, Palestine, and sort of list them out,
and then list out all the things that we're kind of past, that were hot subjects two years ago. And look, those are all in blue; look at how much we've solved, guys. Like, Facebook, we're working our butts off to solve these problems, and yes, we have these hotspots here today, and yes, here are the metrics on how hot these spots are. And so, yeah, let's talk about them, but at least let's be transparent about where the hotspots are and what we're doing.
James Currier:
Like one time I had a conversation with you about the Russian situation, and you said, "Yeah, it's important, but it already happened, and Trump's now our president. We're now worrying about Pakistan and India. We have two nuclear powers, there's a lot of disinformation going out on Facebook, and we're trying to figure that out. And there's an election coming up in the next three or four months, we've got to get ahead of that." So you were already onto this, and it blew my mind. I was like, "Oh my God, these guys are dealing with these giant geopolitical issues, and none of us have any idea of all the work that Facebook is doing to try to protect the planet." And I would love to have a little more transparency around that, because then you could show us, "Oh, well, what's hot now is the Pakistan-India situation." Like, "Oh, I hadn't thought about that." Like, Pakistan-India was nowhere two months ago, and look, it moved up the chart, it's now number six. And this is how hot it is, based on... here's the data that's coming through around when the election's coming, whatever.
James Currier:
I always felt like that would be another network solution, more transparency and more communication to all of us who are part of the network, that we're participating.
Rob Goldman:
Yeah. It has to work that way, for the same reason I was describing with the Russian interference on the ads team; it's this broader problem of just speech in every country in the world. It's so nuanced. Every culture, every country, every subculture has their in-crowd and their out-crowd and their divisive issues, and it's just impossible to know even the depth and breadth of the American issues for American employees, much less the issues in every country all over the world. So it has to be open and transparent; it's just unworkable any other way, yeah.
James Currier:
It needs to be scalable.
Rob Goldman:
And that was very much the way we were feeling, I think, following the 2016 experience in the US. It was like, "Hey, there's an election somewhere in the world just about every week. Every month we have an election." So there's real work in other countries, cultures, communities, languages, systems that we don't fully get, and that's why it just has to be a solution so that the people in those places can look.
James Currier:
Right, and participate in the network they're in.
Rob Goldman:
Correct.
James Currier:
Because Facebook is in many of our communities, whether it's Facebook or Messenger or YouTube, it is the interface through which we are now making our policy, making our society. It's how we read what's going on. Interesting.
James Currier:
So what are you working on now, Rob Goldman?
Rob Goldman:
I'm dabbling right now. I did some work volunteering during the 2020 experience, and then since November I've been kind of almost like at school, in a way, just kind of learning about a couple of different areas I find interesting. Trying to understand kind of what's happening with technology and where it's going, and meeting people who are doing stuff and just learning from them and seeing where that leads.
James Currier:
That's an amazing time. So if you were to create a 500 million person social network, because there's probably a number of people in the world who could do that, you're one of them, what would be some of the principles you would want to build it on?
Rob Goldman:
Thanks for the confidence. I'm not sure I'm one who could.
James Currier:
No, you are. You have the technical understanding, you have the viral understanding, you have the human understanding, product design. You're one of the few, you really are.
Rob Goldman:
Thank you. I wish I had a great answer to that question, and you can tell I'm not spending enough time in my product journal, journaling up great ideas, because I don't already have an answer for it. But I'll say a couple things, I guess. At the heart of it, these networks that we're building, I think of them as places. And they have the characteristics of places, and what's cool about some places is there's just a really clear understanding of what happens there, how to behave there. And I think some of them are really, really positive. I don't know, depending on where your passions lie, you could say something like a sporting event, or something like a religious service, or something like that-
James Currier: Or Burning Man.
Rob Goldman:
Or Burning Man, maybe, and I think those things, they have certain rules and kind of illustrations of how people ought to behave, and they become kind of self-propagating. And I would like to see some of
those really positive demonstrations and examples take root and occupy some of the time in people's lives that's going to some of the other ones. I don't know exactly how or where that'll happen, but it's been interesting to watch some of the newer things kind of emerging.
Rob Goldman:
I thought the Clubhouse launch was pretty interesting to watch, and I'm seeing some smaller social systems that I'm looking at start to evolve, and it's neat. It's people connecting with each other on topics of common interest and it's really cool to see. I think everyone knows that's what the technology's capable of, and how to get that goodness out of it without the badness is the question of the day.
James Currier:
And so would a principle maybe be not to focus on dwell time or engagement?
Rob Goldman:
Yeah, no. I think at the heart of this is a tragedy of measurement. We just cannot measure how satisfied you feel, or whether you lived by your values today, and if that were measurable, then this technology would be so much better for us. So I think you're stuck with things that are measurable, and that lack of legibility is, I think, at the heart of a lot of the big problems, and I don't see how you get to a place where it's easier.
Rob Goldman:
Like, you can imagine, down one particular future tech path of BCI and EEG-type devices, measuring even more, and you could imagine that being a correction to these gross measurements now, of time or engagement, or you could imagine it being another curveball and sending us in the wrong direction. But I think at the heart of it is time, as bad as it is, and I think it is really bad. Like, if you just gave people everything they wanted and filled up their time with that, there are just a lot of bad things that would grow in the world. But it's very egalitarian, everyone has the same amount, and I think that's an important aspect of it. And if we could just find some better ones, maybe you'd get some better systems.
Rob Goldman:
But no, I think in the meantime, it's going to be left to the judgment of the people who build these systems to determine if this quality that they're expecting to see is there or not.
James Currier:
Very interesting, very interesting. Well, Rob Goldman, it is such a pleasure to have you on the NFX Podcast and just to spend time with you.
Rob Goldman:
It was a pleasure.