I am currently the Managing Director of the Psychology of Technology Institute, a project of USC's Neely Center. Before that, I led data science, research, and product teams across Facebook focused on improving the societal impact of social media. I began that work working on policy-based content moderation, but learned the painful lesson that content moderation was a dead end and often a distraction from the product design and algorithmic changes that would truly make a difference.
We clearly did not solve things, but across many teams and individual efforts, we did launch:
• content-neutral algorithmic changes to reduce engagement incentives for important topics
• content-neutral reputation signals
• UI changes and subjective measurement to enable more value-aligned algorithms
• content-neutral break-the-glass measures for elections and crises
• removal of (some) anger from base incentives
• auditing of viral content internationally
• numerous other research studies, product changes, and experiments that advance our understanding.
Collectively, these changes measurably reduced misinformation, reduced content that experts consider dangerous, and improved the user experience. However, much work remains, and doing it well requires productive engagement from the outside world. The principles behind the changes we made apply not just to Facebook, but to a host of other companies that build online products where engagement and algorithms interact. I write about those principles on Substack.
Before working at Facebook, I had the privilege of a unique career across technology and academic roles. I helped co-found and build the initial algorithms for Ranker.com, a profitable publisher of crowdsourced lists that serves tens of millions of unique visitors monthly and employs 125+ people. I have a PhD in Social Psychology from the University of Southern California and have coauthored dozens of empirical articles about individual values, political opinions, polarization, and technology, which have collectively been cited thousands of times. I have worked with numerous non-profits fighting polarization. My work across tech and academia has been featured in the Wall Street Journal, the New York Times, The Atlantic, SXSW, and numerous other venues (see links).
My unique experience as both a technical leader within tech companies and a researcher on social impact was why Facebook recruited me to work on improving the platform. It enabled me to make the platform a bit better, but I also know there is much more work to be done, and I left to contribute the knowledge and experience I've gained toward a wider societal conversation about how we can improve technology's impact on society. That is why I joined the Psychology of Technology Institute, which enables me to be part of this wider conversation. If we share that goal and I can be of help in your efforts, please do not hesitate to get in touch (me at ravi iyer dt cm).
Use this form if you think we have complementary goals and would like to work together.