Zucked, by Roger McNamee

SCRIBBLES  (Notes as I read)

-This book is written by an early investor in Facebook, Roger McNamee.  His intent is to warn about the unchecked power of the Facebook economy–how the current economic reward systems of Silicon Valley companies may not always have consumers’ best interests at heart.

-McNamee gives an interesting description of his life’s journey into venture investing.

-McNamee says that Silicon Valley can be best explained by two laws:

Law #1:  Moore’s Law–it originally stated that the number of transistors on an integrated circuit doubles every two years.  But the more modern/applicable interpretation is that the efficiency of an integrated circuit doubles every 18 to 24 months, which basically means that computing is getting faster and cheaper roughly every two years.

Law #2:  Metcalfe’s Law–the value of a network is proportional to the square of the number of nodes in the network.  So large networks have outsized (quadratic) value relative to small networks.

-These two laws together–cheap, powerful processing and network scale–have large economic implications.  It took 50 years to connect every computer to the same network (the Internet), and we are now experiencing the economic output of the network effects.
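The compounding of these two rules of thumb can be shown with a little arithmetic (a toy sketch: the doubling period and the n² value formula are just the laws as stated above, not measured data):

```python
# Metcalfe's Law: a network's value grows with the square of its node count.
def metcalfe_value(nodes):
    return nodes ** 2

# A network 10x the size is 100x as valuable, not 10x.
small, large = 1_000, 10_000
print(metcalfe_value(large) / metcalfe_value(small))  # → 100.0

# Moore's Law (rule of thumb): computing power doubles roughly every 2 years.
def moore_factor(years, doubling_period=2):
    return 2 ** (years / doubling_period)

print(moore_factor(10))  # → 32.0, i.e. ~32x the computing power in a decade
```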

-McNamee goes into a history of technology in Silicon Valley.  It’s an interesting sequence that summarizes the early tech days, the first Internet boom, and Web 2.0.  McNamee emphasizes how today’s Web 2.0 technologies, and the engineers who work on them, do not need a deep, system-level understanding of the underlying technology stack.  This enables a “Lean Startup” culture, where companies can develop minimum viable products (MVPs) quickly and test for market adoption.

-McNamee discusses the history of Facebook–from his early days as an investor and the journey the company makes.  This is very interesting to read–nothing in particular to summarize for me, but cool to have that insight from an early insider.

-The PayPal Mafia–this is a reference to people who worked at PayPal and later went on to found some of the most iconic companies in the valley.  The author believes that the “mafia” had an incredibly profound impact on the culture and priorities of the current tech industry.  You can read about the PayPal Mafia at this Wikipedia page.

-Trading privacy for convenience.  One point the author makes is that Facebook made it easy for people to give up their privacy because of the convenience it offered and the social benefits it conveyed.  There’s a quote in there that I liked:  “Convenience, it turns out, was the sweetener that led users to swallow a lot of poison.”  He’s specifically referring to Facebook Connect here–the feature of Facebook that allows users to log in on other websites/apps with their Facebook account.  That convenience came at a big privacy cost, McNamee argues, and he ties it, for example, to how it may have enabled Russian interference in the 2016 election.

-McNamee describes the incredibly complex background technology, algorithms, and economic systems behind the Facebook engine.  There is a lot there, and I am sure that I will be missing a lot, but I’ll try to get down the main points I remember.

Facebook has a very sophisticated AI system running behind its newsfeed feature.  Because of the economics of advertising on the FB platform, the goal for FB is to keep users engaged on the site for as long as possible.  The length of time on site and the amount of engagement over a given day/week/month (daily active users, monthly active users, etc.) increase both the effectiveness of advertising and how much FB can charge for ads.  None of this should be a surprise to anyone with basic knowledge of how Web 2.0 advertising works.

The biggest insight for me, though, is the techniques that FB and other social media platforms use to drive user engagement.  Basically, McNamee argues that FB is very good at leveraging basic human psychology to drive up user engagement.  For example, the specific shade of blue and the red of the notification badge are designed to induce neurological responses (e.g., serotonin and dopamine releases in the brain) that keep people engaged.  It’s something we have all experienced–seeing that red notification triggers a primal reward system in our brains by releasing one or more neurotransmitters.  In addition to these physiological tricks, FB relies on pushing content that appeals to our “lizard brain” psychology.  Our reptilian brain has evolved to respond strongly to fear and anger.  According to the author, FB knows that content that triggers these responses is inherently valuable because it provides an economic benefit: keeping people on the site longer.

This becomes problematic with the advent and proliferation of mobile.  Previously, using a website for a long time on a desktop was physically limiting–you had to pull up a chair or open your laptop and sit there to engage.  As the book says, with mobile, the real rival for your attention is not physical limitation; it’s sleep.  So, whether intentionally or not (I still can’t really tell from reading the book), Facebook’s internal technology and economic incentives steer FB’s newsfeed toward the things that keep engagement going–emotional appeals to our reptilian brain (fear, anger, etc.), filter bubbles (where users’ content is successively curated–either by themselves or by FB–to create an echo chamber where their opinions are reinforced), lookalike targeting (where ads are pushed to you based on users who are similar to you–not just in their likes and preferences, but in other data factors like location and time), and so on.
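None of FB’s actual ranking code is public, but the incentive structure described above–rank content by predicted engagement, which rewards fear and anger–can be sketched as a toy scoring function.  All weights, signal names, and posts here are invented for illustration:

```python
# Toy newsfeed ranker: scores posts by predicted engagement.
# All weights and signal names are made up; the real system is
# proprietary and vastly more complex.
POSTS = [
    {"title": "Local bake sale raises funds", "outrage": 0.1, "fear": 0.0, "similarity_to_user": 0.4},
    {"title": "THEY are coming for your rights!", "outrage": 0.9, "fear": 0.8, "similarity_to_user": 0.6},
    {"title": "New phone reviewed", "outrage": 0.2, "fear": 0.1, "similarity_to_user": 0.5},
]

def engagement_score(post):
    # Emotional triggers (the "lizard brain" signals) dominate the score,
    # so inflammatory content floats to the top of the feed.
    return 2.0 * post["outrage"] + 2.0 * post["fear"] + 1.0 * post["similarity_to_user"]

feed = sorted(POSTS, key=engagement_score, reverse=True)
for post in feed:
    print(f'{engagement_score(post):.1f}  {post["title"]}')
```

The point of the sketch is only that when the objective is engagement, the ranking function never needs to ask whether content is true or healthy–only whether it holds attention.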

So, why is this problematic, and how does it relate to Russian interference?  Well, what Russia was able to figure out was that by placing ads and starting groups on Facebook, they would be able to leverage the platform’s network effect and user engagement to push undermining information to millions of users.  How?  Well, say a Russian agent starts an anti-Hillary FB group under the guise of being a pro-Bernie group (this actually happened).  FB’s internal algorithms would suggest the group to people likely to join it.  You may have a few seeds that are Russian bots or Russian agents who start the group, but if you attract a few actual, unassuming Americans, the suggestion gets pushed to their networks.  So say the group then grows slowly, and one person from the group shares a fake news story promoted within the faux pro-Bernie group.  That news story gets shared to newsfeeds associated with the sharer.  FB knows that human psychology tends to trust news and stories from friends and family over verified and abstract sources.  So if you’re a friend of this person, you see him or her posting an article that looks true.  Maybe you share it, or maybe you don’t, but perhaps some percentage of the sharer’s network pushes the story.  The network effect, Metcalfe’s law, takes hold here, and the story can “go viral.”  Except the story is not true at all, and FB’s internal algorithms allowed it to propagate with impunity.  The actual Russian interference in the 2016 election is more complex than this, but this is a base example of how it would work.
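The sharing cascade described above is roughly a branching process: each share exposes the sharer’s friends, some fraction of whom re-share.  A minimal sketch–the friend count and re-share probability are arbitrary assumptions, not figures from the book:

```python
# Minimal cascade model of a story spreading through a social network.
# Assumes every user has the same number of friends and the same
# re-share probability; both numbers are arbitrary, chosen only to
# show how quickly exposure compounds.
def expected_reach(seeds, friends_per_user=150, share_prob=0.02, rounds=5):
    """Expected number of users exposed after a few rounds of sharing."""
    exposed = 0
    sharers = seeds
    for _ in range(rounds):
        newly_exposed = sharers * friends_per_user
        exposed += newly_exposed
        # A small fraction of the newly exposed re-share the story.
        sharers = newly_exposed * share_prob
    return exposed

# 150 friends x a 2% re-share rate = 3 new sharers per sharer, so a
# handful of seed accounts can reach hundreds of thousands of users.
print(f"{expected_reach(seeds=10):,.0f}")  # → 181,500
```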

According to the author, after the 2016 election he met with Senator Mark Warner to explain how FB was leveraged by Russians to influence the election.  Senator Warner and McNamee ultimately pegged a price tag of $150M that Russia might have spent on its campaign.  To put that in perspective, it’s less than the cost of an F-18 fighter jet.  McNamee makes an interesting point here–we spend orders of magnitude more money on defense than any other country, and our government data centers are some of the most secure on earth.  But we have this whole system in the private sector that enabled a foreign government to simply skip our fortifications and use the tools we had already provided.

Now, there is certainly a rebuttal to be made from the FB perspective, and I’d like to read something from that viewpoint after I finish this book.  A main point of the author seems to be that FB is being disingenuous when it says that it’s merely a “platform”–that is, as a platform, they are merely providing the service, and it’s up to the individual users to act ethically.  The author’s rebuttal to this seems twofold:  1. given the size and reach of FB’s network, they have left the small pond, and in the bigger waters come bigger responsibilities and an inability to hide behind that kind of logic; and 2. FB as a company should still be driven by corporate ethics, and thus, if FB knows that fake news is being spread rampantly through its network, it should curate that content away from its users even if it believes that it’s merely a platform (and not, for example, a publisher).  The author gives the example of Johnson & Johnson.  In 1982, someone put poison pills in Tylenol bottles (an interesting article on this is here).  Johnson & Johnson faced a crisis.  At that time, the bottles were not tamper-proof.  Johnson & Johnson decided to issue massive public warnings about the poison pills, pulled all Tylenol bottles from shelves, and worked with the FDA to develop tamper-proof packaging.  The investment cost the company $100M, but they were able to rebound and become a trusted company again.  McNamee uses this example to say that Johnson & Johnson could have taken the position that they were merely a “platform”–that they provide the bottles, and it’s up to the individual stores and communities to make sure they are not tampered with.  But they were guided by corporate ethics to sniff out bad acts that could become commonplace and go viral (sorry, the pun was there for the taking) with their bottles.

Imagine if FB decided to work with the FTC to develop regulations on fake news promulgation when they first discovered the problem.  It’s not too late, I suppose.
