Why Mark Zuckerberg’s Leadership Failure was a Predictable Surprise

By David De Cremer

In today’s ever more connected global village, responsible leadership is of paramount importance, especially from businesses that build on digital platforms. In 2017, The European Business Review published an original article by De Cremer, Zhang, and De Schutter entitled “The Challenge of Leading Digital Platforms in Responsible Ways”, which pointed out the disasters that Facebook and other digital platforms could face because of their lack of responsible leadership, and why simply relying on legislation to tackle the ethical challenges of digital platforms is reactive at best. Today, at the request of The European Business Review, De Cremer provides an updated analysis of what they predicted in 2017 in light of the Cambridge Analytica controversy. The big question today is: how can Facebook, through Mark Zuckerberg’s leadership, prevent its users from being drawn into a realm of turmoil?


In our super-connected world, hardly anyone will have escaped the news that the most famous of social media platforms, Facebook, has misused personal information on a very large scale. According to an update on 4 April 2018, the data of as many as 87 million Facebook users may have been accessed. At the core of this privacy scandal is the fact that Facebook profiles were mined for data that was used to influence the U.S. and UK elections. Facebook gave academic scholars access to these profiles on the pledge that the data would be used for scholarly purposes only, but it soon became clear that this was not the case. It appears that scholars working with the now infamous British analytics firm Cambridge Analytica provided (knowingly or not) that company with the “holy grail” of big datasets, allowing it to use people’s personality profiles to gain the upper hand in several political battles across the globe.

No matter how much disbelief the world may express, this privacy scandal can be identified as a full-fledged predictable surprise. The reason is the lack of responsible leadership displayed by founders and companies that use digital platforms as their business strategy.

Most of us watched in disbelief as the evidence mounted that Facebook had allowed personal information to be used for political purposes, and eventually went into shock when it was revealed how many Facebook profiles had been data mined. Wasn’t Facebook, as a reputable and competent company, believed to be able to protect the privacy of its users? In fact, in June 2017, the founder and CEO of Facebook, Mark Zuckerberg, said in an interview with Freakonomics Radio: “of course, privacy is extremely important, and people engage and share their content and feel free to connect because they know that their privacy is going to be protected [on Facebook]”. No matter how much disbelief the world may express, this privacy scandal can be identified as a full-fledged predictable surprise. The reason is the lack of responsible leadership displayed by founders and companies that use digital platforms as their business strategy.

A predictable surprise is the emergence of a disastrous event that companies could have anticipated and prepared for (Bazerman & Watkins, 2008)¹. The problem, however, is that companies too often fail to invest sufficient time and resources in the short term to avoid negative consequences that may emerge in the future. As I will explain below, the reason Facebook failed to act more responsibly lies in its own psychological biases and hubris. And, unfortunately, this neglect of responsible behaviour has turned into a predictable surprise whose damage may take years to repair. Indeed, as Mark Zuckerberg indicated in a podcast interview with the media outlet Vox: “I wish I could solve all these issues in three months or six months, but I just think the reality is that solving some of these questions is just going to take a longer period of time.”

Is all this negative buzz and excitement unique to Facebook? Not really. It almost seems inherent to any company adopting a digital platform strategy. So, what is the problem with creating digital platforms to – in the famous words of many prominent Silicon Valley executives – improve people’s lives? Of course, trying to improve human lives is not a problem; the problem reveals itself when looking at the motivation of most entrepreneurs using this platform strategy. Last year, Zhang, De Schutter, and I already pointed out the disasters that Facebook and other digital platforms could face because of their lack of responsible leadership. Specifically, we wrote: “As with any business strategy that is new, revolutionary and aimed at raising growth quickly, the focus of platforms is directed primarily towards the product itself and less so at the possible long-term consequences at the societal level.” (p. 13)² And it is exactly this extreme focus on the product that gives Zuckerberg, and many others like him, a blind spot when it comes to taking on any sort of responsible leadership. How so?

The obsession with creating something unique, something the world had not seen before but that would change the lives of us all, makes many entrepreneurs in this industry see the value of their undertakings in the innovation itself, and not at all in its potential consequences.

I call it a blind spot because this obsession with creating something unique, something the world had not seen before but that would change the lives of us all, makes many entrepreneurs in this industry see the value of their undertakings in the innovation itself, and not at all in its potential consequences. It makes most entrepreneurs who pursue innovation as an end goal so focussed on their products that they forget about the human actors involved and the consequences that could affect them. Unfortunately, when creating a platform as innovative as Facebook, one also carries the responsibility for the consequences that the use of this platform will produce. And, importantly, one is responsible for those consequences – which will take place in the future – from the first day the platform goes live, not only when those consequences reveal themselves! This latter point, however, is not well understood, as illustrated by the fact that only today, after the facts are known, did Mark Zuckerberg note: “I started this place, I run it, I’m responsible for what happens here. I’m not looking to throw anyone else under the bus for mistakes that we’ve made here.”


Too often, however, the beauty and attractiveness of the innovative product – the digital platform in this case – draws the developer’s attention away from the true implications and consequences it could have for society further down the line. Yet creating a platform that is fuelled by information sharing creates a powerful situation that requires responsible leadership. As such, in 2017, we wrote: “This truth is especially the case for digital platforms, whose product focus is on working with information. As we all know, those who have information also have power. It is therefore suggested that with great power also comes great responsibility.” (p. 13) So, why then do we not see greater responsibility undertaken by these leaders? It is fair to say that being blind or naïve to the fact that, as the founder and CEO of Facebook, one is responsible for this is underpinned by a certain psychology.

An important problem is that Facebook and others in this business reasoned for a long time that the platform would regulate itself because (a) users’ own interest in connecting with others worldwide, and in sharing information as a way to do so, was regarded as a self-regulating mechanism, and (b) disclaimers including legal terms and conditions would induce a sense of responsible information-sharing practices (meaning that Facebook did not have to actively monitor this). As we all know now, unfortunately, it did not. Worse still, this rather naive approach to what it takes to run a digital platform in responsible ways repeats the same mistakes seen in the financial crisis. In fact, in our 2017 piece, we commented on this exact issue: “As was the case with financial markets, today’s businesses making use of digital platforms also seem to embrace a certain type of Adam Smith’s “invisible hand”. In other words, creating the platform is seen as the responsibility of the company whereas regulating its workings is more an issue of the platform users. Unfortunately, as the financial crisis showed that the invisible hand is more a fantasy than reality, in a similar way, digital platform businesses are increasingly being confronted with a host of unethical consequences due to a lack of responsible leadership and regulation.” (p. 13)

This belief in the self-regulating capacities of one’s beautiful and innovative platform design results in its founders not feeling obligated to lead and coordinate what happens on the platform. As we said in our 2017 piece, “One important thing that is common in these examples is that the respective company owners did not directly feel responsible for the rules of play. In other words, as demonstrated by these platform builders, responsible leadership was and is lacking.” (p. 14) One could perhaps maintain this point of view if the platform were small and the community of users transparent and accountable. However, Facebook prides itself on leading the world in social media use, with an audience of over 2 billion active users – meaning that the Facebook community is larger than any country in the world! Under those circumstances, clear guidance and leadership are needed. One is only deceiving oneself if one believes that such a mass of people can coordinate itself responsibly. For this reason, we advised companies using a digital platform strategy as follows: “[What is] important to take into account here is that if platforms increasingly grow bigger, collaboration between the different stakeholders will be less coordinated and diffusion of responsibility will emerge. If this is the case, the platform community will be unable to regulate themselves. Companies are therefore advised to be present in the platform community to facilitate and coordinate the exchanges that lead to trust and cooperation building systems.”

Once ethical dimensions are removed from the leadership equation, it is hard to get founders and CEOs to change their views in light of what is currently happening.

Nevertheless, once ethical dimensions are removed from the leadership equation, it is hard to get founders and CEOs to change their views in light of what is currently happening. As we (2017) noted: “In the US, Mark Zuckerberg has recently been criticised that verifiably false messages distributed on Facebook influenced the US presidential elections. In response to these accusations Zuckerberg noted that it was crazy that Facebook could influence the outcome of the presidential race and downplayed the amount of fake news out there.” (p. 14) To overcome such blindness, it takes a disaster that reveals, in a painstaking way, that one failed to take responsibility for the great powers one was given by the community of platform users. And, indeed, only now, with the benefit of hindsight, is Zuckerberg able to come to terms with his own ethical failures. As he noted on 4 April 2018, he “never should have referred to fake news as ‘crazy’ as he did during the 2016 U.S. Presidential election.”

The problem is that founders of these digital platforms do not see any personal responsibility for what happens once the platform is running, simply because they only see responsibility for their directly observable actions and fail to see that, because of their powerful position (collecting personal information from millions of people), their sense of duty and responsibility extends to the broader community of platform users. Now that this privacy crisis has surfaced, Mark Zuckerberg seems to understand this important message. As he indicated on 4 April 2018, in a call with journalists worldwide: “We didn’t take a broad enough view of what our responsibility was”. The fact that Facebook failed to take this broader perspective at the same time underscores the failure of its founder, which he also acknowledged by saying: “It was our mistake – it was my mistake.”

It thus seems that Zuckerberg has finally arrived at the realisation that Facebook should have begun, from its very start, with an ongoing awareness of his own personal responsibility as the primary driver of Facebook’s leadership in the digital community.

It thus seems that Zuckerberg has finally arrived at the realisation that Facebook should have begun, from its very start, with an ongoing awareness of his own personal responsibility as the primary driver of Facebook’s leadership in the digital community. It is good to have this acknowledged by a leader in the community, and hopefully it can serve as a learning opportunity for many of the other digital platforms out there. But be aware: the truth is that our human psychology can constrain even the best of intentions. When we engage in and commit to a new start-up or company that promises innovative and life-changing opportunities, the “innovation only” blind spot may kick in and take the necessity of responsible decision-making (as early on as possible) out of the leadership equation. It is because of this psychological bias that we need to be careful not to rely solely on legal disclaimers or box-ticking approaches that satisfy the minimal requirements set out by the law. Laws are reactive in nature. Indeed, governments usually create new laws in response to something that has gone wrong; hardly ever are new laws called into being as a proactive measure. And it is exactly this kind of proactive attitude that those leading digital platforms, where so many potential ethical dilemmas can surface, need to take. In fact, it has clearly been demonstrated that responsible decision-making requires a proactive way of leading.³ This kind of leadership makes leaders – to the extent that it is humanly possible – aware of the potential pitfalls that lie ahead, preparing them to anticipate and avoid potential ethical failures.

Feature photo: AFP


About the Author

David De Cremer is a Provost’s Chair and Professor in Management and Organisations at NUS Business School, National University of Singapore, and the founder and director of the corporate-sponsored Centre on AI Technology for Humankind at NUS Business School. Before moving to NUS, he held the KPMG endowed chair in management studies at Cambridge Judge Business School, where he remains an honorary fellow. He is also a fellow at St Edmund’s College, Cambridge University. He was named one of the world’s top 30 management gurus and speakers in 2020 by the organisation GlobalGurus, included in the 2021 Thinkers50 Radar list of 30 next-generation business thinkers, nominated for the Thinkers50 Distinguished 2021 Award for Digital Thinking (a bi-annual gala event that the Financial Times deemed the “Oscars of Management Thinking”), and included in the world’s top 2% of scientists (published by Stanford). He is a best-selling author: his book “Huawei: Leadership, Culture and Connectivity” (2018), co-authored with Tian Tao and Wu Chunbo, received global recognition, and his book “Leadership by Algorithm: Who Leads and Who Follows in the AI Era?” (2020) received critical acclaim worldwide, was named one of the 15 leadership books to read in summer 2020 by Wharton, and reached no. 1 on amazon.com in its Kindle edition. His latest book, “On the Emergence and Understanding of Asian Global Leadership”, was named management book of the month for July 2021 by De Gruyter. His website: www.daviddecremer.com

References

1. Bazerman, M.H., & Watkins, M. (2008). Predictable Surprises: The Disasters You Should Have Seen Coming, and How to Prevent Them. Harvard Business Review Press.

2. De Cremer, D., Zhang, J., & De Schutter, L. (2017). The challenge of responsible leadership in digital platforms. The European Business Review, July-August, 13-15.

3. De Cremer, D. (2013). The proactive leader: How to overcome procrastination and make a bold decision now. Palgrave Macmillan.

