Fortune asked 25 of the sharpest minds to weigh in on the epic, disruptive, thrilling, terrifying, and fascinating ideas that will mold the next decade. The future is now.
Will paper money disappear? Will gene-tailored medicine transform how we treat disease? And will you finally trade in that juicy steak for “cell-grown” meat? As a new decade begins, it’s hard to think of an industry that doesn’t feel like it’s on the brink of a massive transformation.
Economy and Markets
1. Mariana Mazzucato
Business and government will—gasp!—work together again.
Warning signs that the capital markets aren’t functioning properly are everywhere. Take corporate profits. Long term, they’re up. But investment is down. Now look at the levels of income inequality and youth unemployment: Young, able-bodied people lack the skills to compete in the job market, while companies are desperate to find skilled workers.
According to Mariana Mazzucato, an economist and founding director of the UCL Institute for Innovation and Public Purpose in London, paradoxes like these point to the fact that the public and private sectors have lost their way. There was once a fairly smooth-running partnership between the two, in which publicly funded R&D helped propel us into the space race and, later, the computer and Internet ages.
Unless we rebuild those bonds, Mazzucato warns, innovation will dry up, growth and profits will suffer, and inequality will worsen.
Fortune: How should government and business be working together?
Mazzucato: The public sector isn’t just there to fix market failures. It’s also an investor of first resort, investing in some of the most uncertain, highly capital-intensive areas before businesses are willing to … The big point is that we need to focus on a more purposeful system that goes beyond shareholder value. And that requires a redesign of the governance systems of both the public and private sectors, and of how they relate to one another.
How optimistic are you that we can get there?
In 2019, we celebrated the 50th anniversary of going to the moon, which was basically a massive technological feat. Reflecting on that gives me hope. Humanity did something pretty extraordinary. Now think about the Apollo 11 mission. Apollo was not a left-wing or right-wing mission. It definitely was bipartisan, and it involved the public and private sectors. There were many companies, like Honeywell, Motorola, and General Electric, that were fundamental in getting us to the moon—with, of course, the massive directional power provided by NASA and the government. That’s the kind of arrangement we need today.
Your ideas have been cited by Republican Sen. Marco Rubio and freshman Democratic Rep. Alexandria Ocasio-Cortez. How do you get both the right and left to listen?
I’ve learned that as long as you’re talking about the long run and risk-taking, entrepreneurship, creativity, wealth creation—in a way that really brings in both business and the public sector—it ends up being a bipartisan narrative.—Interview by Bernhard Warner
Economist Mariana Mazzucato is the founding director of the University College London Institute for Innovation and Public Purpose.
2. Robert Shiller
The new age of economics is dawning.
The power of narratives in driving economic events will be studied more. Economics will become less mechanical, with more attention to storytelling and changing popular ideas. And it will give impetus toward trying to manipulate and manage narratives. This is something politicians do instinctively. Franklin D. Roosevelt in 1933 said that the only thing we have to fear is fear itself. That’s just one example. But before him, in the 1920s, Calvin Coolidge was always boosting the market. He thought that was the right thing for a President to do: instill confidence. But maybe not, because it ended badly with 1929 and the Great Depression.
Certain narratives are recurrent. Aristotle introduced the idea that machines might replace jobs over 2,000 years ago. Now we’re hearing that again. Automation is a buzzword from the 1950s. In fact, one downturn in 1957–58 was called an “automation recession” by some. In terms of today’s narrative, I think there is danger of a serious contraction, like we had 10 years ago. The stock market has reached new records, so some people are worried. But a lot of economic indicators remain strong. It’s a split. Nobody knows exactly what’s coming. It’s like 1929. Nothing in the air strongly suggested that change was coming—and suddenly it came.
Nobel Prize winner Robert J. Shiller is Sterling professor of economics at Yale.
3. Klaus Schwab
A “gold standard” of digital currencies will emerge.
Over the next decade there’s the potential for an entirely new form of money, “stablecoin.” If achieved, it could help include the world’s unbanked population and ensure a more stable financial system for all. Experimentation with blockchain in financial services has already led to the development of digital currencies like Bitcoin and Ethereum. But these remain ineffective and have proved prone to major fluctuations and misuse. Moreover, they are still hard to use in daily life, with few retailers accepting them as a form of payment. Libra, proposed by Facebook and backed by a consortium of other firms, conceptually might overcome some of those hurdles: It would be easy to use via a digital wallet on Facebook and would be stabilized by pegging it to a reserve basket of currencies (for more, see the feature story in this issue). But a “gold standard” of digital currencies has not emerged—yet. The real opportunity lies in major guarantors of the financial system, such as central banks and governments, committing to a supranational form of money. Such new currency could facilitate international payments and include those people and small businesses that are currently unbanked in the financial system.
Indeed, the real promise lies not in New York, London, Singapore, or Tokyo, where most people and businesses already have ample ways to conduct business and transfer money. It lies in helping those who are unbanked in countries like India, Indonesia, Ethiopia, or the DRC. A stablecoin could make financial inclusion real. It would represent the new frontier of money. There has not been anything as exciting since Bretton Woods.
Klaus Schwab is the founder and executive chairman of the World Economic Forum.
4. Andrew McAfee
Capitalism will save the planet (seriously).
In the Industrial Age, economies grew at the earth’s expense. Resource extraction correlated directly with wealth accumulation: More mining of metals, felling of forests, and burning of bitumen meant greater prosperity. Capitalism literally became a dirty word.
That’s changing, says Andrew McAfee, principal research scientist at the MIT Sloan School of Management. McAfee believes capitalism is partly the solution to its own ills. In the U.S. “we’re not using up the earth as much anymore. We’re using it less, even as our growth continues,” says McAfee. Pollution is, in the developed world, decreasing year over year. Electricity use has been effectively flat in America for about a decade even as growth continues. Companies are “locked in nasty competition” thanks to capitalism, McAfee says, and many are fighting to use fewer resources and less energy, which cost money.
At the same time, innovations in digital technologies are creating cleaner, more efficient alternatives to material goods. Consider the smartphone. How many fewer cameras and camcorders and answering machines and fax machines are being produced now? “I’m convinced that smartphones have actually let us tread more lightly on the planet,” he says.
That’s not to say humanity can be complacent. Without regulation, capitalism is “voracious,” McAfee says. “It will eat up sea otters and tigers and rhinos and blue whales if we let it.”
He thinks governments must protect struggling species and make polluting technologies more costly than green ones. They should also implement a carbon tax—or better yet, dividend—that would have businesses pay citizens based on the quantity of carbon dioxide the firms emit. “Properly configured and constrained, capitalism will not eat up the planet, it will actually let us take better care of it.”—Robert Hackett
Andrew McAfee is the cofounder and codirector of the MIT Initiative on the Digital Economy at the MIT Sloan School of Management, and the author of More from Less: The Surprising Story of How We Learned to Prosper While Using Fewer Resources – and What Happens Next.
5. Sir Paul Collier
Business will embrace its responsibility to save capitalism.
I believe in capitalism; it’s the only system in 10,000 years that has managed to lift living standards. But it doesn’t work on autopilot. Periodically it comes off the rails, and we’re living in a derailment.
Society needs to have lots of different organizations that are all morally load-bearing. Human beings naturally organize ourselves into groups, and the most important group we’ve got on earth is the firm. Firms have to be morally load-bearing, and for most of human history they have been.
But that aspect of capitalism came off the rails sometime around the 1970s, with Milton Friedman’s dictum that the sole purpose of a firm is to make profit for shareholders. That’s a fallacy of what firms are about. That’s why after kowtowing for 30 years, the Business Roundtable has finally found the courage to say, “We just don’t do that!” Now, either [promises to take a more holistic view of the purpose of business] just degenerate into happy talk, or we get serious about it. And I think there are enough anxieties and disaffection around that we have to get serious.
Measurement [of companies’ accomplishments] matters, because if you measure only one thing—profit—in the end you’ll get only one thing. If one of the purposes of a certain firm is to help people in Africa—which sometimes it should be—then the best short-term metrics are usually the jobs it creates in the country and the tax payments it makes to the government, and longer-term, whether the enterprise has stimulated other companies to establish themselves or grow.
Leadership is hugely important in setting an organization’s culture. If you look at the iconic Japanese companies, the chief executive and senior management dressed like ordinary workers, they ate in the canteen like ordinary workers, and so they were able to use the word “we” without people laughing. [For a CEO who wants to create that sense of commonality, that might mean] take a pay cut, travel in the same style as employees, eat with employees—essentially make some visible personal sacrifice.
There’s nothing automatic that saves our societies. We have to save our society ourselves: It’s our responsibility each time it comes off the rails. And this time we’ve been slow to put it right. If we don’t, we will dump on our children a society that’s really, seriously, a mess. So we all have that responsibility to play our part. The further up the system you are, the bigger the responsibility.—As told to Katherine Dunn
Sir Paul Collier is a professor of economics and public policy at Oxford’s Blavatnik School of Government and the author of The Future of Capitalism: Facing the New Anxieties, and The Bottom Billion: Why the Poorest Countries Are Failing and What Can Be Done About It.
6. Jennifer Doudna
Genomics will rewrite medicine—and prevention.
Jennifer Doudna may well be the queen of Crispr. The UC Berkeley professor and world-renowned biochemist is one of the pioneers behind the gene-editing technology, which could be used to fight conditions from cancer to blood disorders to many inherited diseases. But this genomic revolution also raises fundamental questions about ethics and the cost to consumers.
Fortune: What will gene editing and genomic sequencing look like 10 years from now?
Doudna: I think that 10 years from now, we’re likely to see much higher-quality predictions about people’s health outcomes based on their genes. Not only that, but increasingly we’ll see Crispr turn the entire field around, with genome editing being used for preventive health care, not just for treating or curing existing disease.
A year after reports that a Chinese doctor created gene-edited embryos, you wrote an essay for Nature calling urgently for ethical guidelines in genomics. What should that look like?
I certainly hope that over the coming decade we see an increasing global effort to put in place appropriate regulations for using genome editing, especially in applications that could have a very profound impact on everyone. And that’ll include not just human reproductive health, but frankly also agriculture, because I think that’s an area where there is a very large opportunity with genome editing, but one that also needs to be approached with caution.
When do you think the U.S. will approve the first Crispr-based medication?
I think it’ll be before 10 years out, at least the way things are going right now. I think it’s been incredibly exciting for those of us in the field to see recent announcements around developments using Crispr in treating cancer and in treating blood disorders like sickle-cell anemia.
Often, that technology needs to be individually tailored—an expensive prospect. Will there be an accounting for that?
One area that does need a lot more attention, and something I’m personally very committed to, is thinking about cost and access. I think increasing attention will be paid to: How do we afford genome editing? How do we make it accessible to as many people as possible globally? Personally, I think a lot of it will have to come from additional technological development—not so much on the Crispr side of things, but more with respect to how we manufacture the molecules that are used for gene editing and how we deliver these new medications.
In this decade will we see the science move out of the lab?
I suspect that within about five years it will be possible to make essentially any kind of change or edit to any genome, in any cell or organism, with precision. I think we’re really that close to being able to do that. Now, that’s in the laboratory. It’ll be maybe longer than that before it’s possible to make those kinds of genome edits in actual patients. The next step will be developing ways to effectively deliver these gene-editing tools. To me, that’s the next horizon.—Interview by Sy Mukherjee
Jennifer Doudna is a professor of chemistry and molecular and cell biology at the University of California at Berkeley and executive director of the Innovative Genomics Institute.
7. John Mackey
Cell-based meat will change the way you eat.
People are becoming more conscious about food. It’s so intimate. It’s part of people’s self-identity. Over the next decade, diets will become increasingly individualized—vegan, ketogenic, gluten-free—and also more tribalized; if you’re paleo, you’ll have an affiliation with people doing the same thing. A mass market still exists, but it’s shrinking. You can see that with traditional consumer packaged goods companies losing sales.
One thing that caught me by surprise with the growth in plant-based meats is that it isn’t driven primarily by health or even the ethics of eating animals. It’s primarily driven by millennials wanting to eat more sustainably. Historically, on a mass scale, that’s new. I wouldn’t call it mainstream yet, but it’s moved in from the fringes.
We’re in the most innovative cycle in history. There’s a massive amount of capital, and it’s easier now for any good idea to get financing and spread globally fairly quickly. One innovation that’s coming as a result is cell-based meat. In the long term, it’s going to be bigger than plant-based meats, which don’t taste like meat without being extremely processed. But cell-based meat—that is, meat grown from animal cells—could change the entire planet. That trend will break in the next decade. I feel certain it will. Imagine if it’s not only more ethical, or environmentally less harmful, but even cheaper. A different way of procuring animal foods than what we’ve done for all of humanity—that would change everything.
We’re always in a race against population growth and a lack of resources versus innovation like cell-based meat that could make us more productive. So far innovation has won. Will we continue to outrun it? It’s hard to say, but I tend to be an optimist.—As told to Beth Kowitt
John Mackey is the cofounder and CEO of Whole Foods.
8. Richie Etwaru
The 31st human right should be to own your medical data.
There is a fundamental lack of trust between consumers and corporations. The relationship is ill-designed in the terms and conditions agreement you have to sign for the most basic applications.
The mistrust isn’t surprising. Merchants may be mishandling data—and at the same time, they could argue that they have the agreement from the consumer to do that. After all, they signed a contract permitting the companies to do that, the thinking goes.
What we have invented, because of our belief that there should be better ways for consumers to engage, is something that transforms the “I Accept.” That could have applications for industries ranging from air travel to health care—and it feeds into the idea of a 31st human right: ownership of one’s own data, adding to the 30 rights adopted by the United Nations in 1948.
In the future, in industries such as health care and beyond, companies will not buy data to own; they’ll lease data for use. Data integrity decays very quickly: You’re not interested in a data lake, you’re interested in a data stream. The trust issue breaks down with every subsequent party in the supply chain. They share that data with the police, or the government, or my health insurance company, which then adjusts my actuarial profile because I’m not taking enough steps. Consumers are now waking up to what’s actually going on. Your ecosystem of trading partners has to be more trusted and more transparent, and not predatory toward the consumer’s data.
The 30 human rights don’t address the issue of data privacy—and it’s a vital one. There’s a social media company that can look at your activity over the past month and see where you are on the gender spectrum, and now they have the decision to determine whether or not to target you for certain products.
What’s really at stake here is, Don’t we want to stop this sort of mass manipulation?— As told to Sy Mukherjee
Richie Etwaru is the founder and CEO of Hu-manity.co.
9. Jamie Dimon
The future of work is skills—so stop worrying about degrees.
A four-year college degree is not the only path to a well-paying job. This outdated thinking is partially to blame for holding back America’s growth and blocking many people’s access to opportunity. We must consider more inclusive means of hiring the best and most talented people to meet the needs of our rapidly changing economy.
The reality is the future of work is about skills, not just degrees. To be clear, we continue to value college and advanced degrees, and there’s no question of their relevance. But the talent that fuels a global company like ours is increasingly diverse and includes people who do not have a four-year college education.
As technology changes the way we work, we must be better at providing pathways to good jobs that everyone—no matter their zip code or background—can access.
To start, this is only possible if businesses and educators work together, partnering to develop curriculums and apprenticeships that offer students on-the-job experience and training. In the Washington, D.C., area, this approach has taken root. Employers are working alongside high schools, community colleges, and universities to prepare students to fill well-paying technology jobs, including 30,000 open cybersecurity jobs in Northern Virginia alone.
Community colleges, which are an affordable and attainable option, exist in nearly every community, educate 13 million diverse students a year, and are often overlooked as a source of talent.
Last year, more than three-quarters of the U.S. jobs posted at JPMorgan Chase did not require a bachelor’s degree. Schools such as Columbus State Community College in Ohio are increasingly valuable resources for our company and many other employers, from technology to advanced manufacturing and health care. In the next decade, we must eliminate the stigma of community college.
Finally, with about 7 million job openings and 6 million unemployed workers in the U.S., people with criminal backgrounds deserve the same opportunity to obtain in-demand skills and good jobs as anyone else.
Returning citizens deserve a chance to secure a job at any company, including ours. We must eliminate barriers to their employment too, by increasing access to Pell Grants and financial aid, and dropping questions about criminal backgrounds from job applications. Hiring them and developing their skills is good for business and the right thing to do.
Jamie Dimon is the chairman and CEO of JPMorgan Chase.
10. Ruth Whippman
What will really lead to workplace equality? Men leaning out.
Though we’ve made some strides in workplace equality, what we’ve really done is say:
Women, your traditionally female norms aren’t as valuable or useful as men’s, so shape up. Lean in. Whatever men are doing and valuing is what we should all aspire to.
We’ve set up the cultural equation so that assertiveness is greater than deference, demanding is greater than listening. What we need to do is ask men to step back, listen more, and be humble. Maybe instead of telling women to stop apologizing, we need to encourage men to apologize more when they make mistakes!
The burden of self-improvement has been on women for the last decade. If we can encourage men to think of female norms as just as valuable as their default standard, we’ll take a big step toward equality. I hope companies will start taking responsibility for gender inequality, and as a society, we’ll start to focus on how men can start to make changes, instead of male norms dictating the standard behavior for all of us.—As told to Anne Sraders
Ruth Whippman is a British cultural critic living in the U.S. She is the author of America the Anxious.
11. Andrew Barnes
The 4-day workweek will make companies more productive.
What if there were one change companies could make to lessen their environmental impact, close the gender opportunity gap, improve employees’ mental health, and increase productivity—and what if all it took was taking a day off?
Andrew Barnes, the founder of a New Zealand estate-planning company, introduced a four-day workweek for his 240 employees in 2018. After a carefully managed trial period, Barnes found employee engagement had improved by 40%. He’s now made it his mission to get companies around the world to reimagine what they ask of their staffers.
The pitch is the hard part. “If I went to your company and said, ‘By restructuring, I can deliver you a 40% improvement in productivity,’ most CEOs would say yes immediately,” Barnes says. “If I walk in and say, ‘I want you to let your employees work less time,’…most people say, ‘Are you kidding?’”
The secret is rethinking how employees work during the four days of the week they’re still spending in the office. Barnes has found that workers will happily give up small talk and time spent on social media when the prize is an extra day away from their desks. And the benefits—to companies, economies, and societies—are enormous.
The system takes cars off the road during rush hour. Flexible work schedules help women stay on track to move into leadership positions, rather than dropping out of the workforce after having children. At Barnes’s company, employees maintained their job performance and reported a 7% decrease in stress levels and a 24% jump in satisfaction with work/life balance. Barnes cites German autoworkers’ 28-hour weeks—and a recent Microsoft Japan experiment that saw a four-day week boost sales by 40%—as examples of how the schedule can work across blue- and white-collar professions. “We have picked an arbitrary five days a week, and we’ve stuck to it. But the world’s changed,” Barnes says.—Emma Hinchliffe
Andrew Barnes is a New Zealand-based entrepreneur and philanthropist.
12. Melinda Gates
Women will alter the workforce—dramatically.
Throughout our history, the face of power and influence in the United States has been overwhelmingly white and male. Over the next decade, that will change.
Women—and, more important, women of all backgrounds—will increasingly be the ones making decisions, controlling resources, and shaping perspectives in all spheres of society. We’ll see this shift play out in homes, in workplaces, and across public life. It will lead to new narratives, products, and policies that reflect a much broader range of perspectives. And it will enable more women to fully participate in solving the challenges that will require our collective brainpower, like structural racism and rising inequality.
This shift will not happen by accident. It will require the concerted efforts of a broad coalition of Americans working together.
In addition to the activists and advocates who are already engaged on these issues, we’ll need to enlist new partners to turn up the pressure on the institutions that are enshrining the status quo.
We’ll need to fast-track women in high-impact sectors like tech and ensure that all women (not just white women or women from elite backgrounds) are able to enter and advance in these fields.
We’ll also need to bring down the barriers that most women encounter at some point in their careers, like norms around caregiving that mean they’re expected to do more work around the home and the pervasive sexual harassment and discrimination they face in the workplace.
When I think about what it means for a woman to exercise power and influence, I picture a CEO setting new strategies for her company, a fast-food worker successfully taking action against the boss who harasses her—or any woman, whether she works outside the home or not, sitting down with her partner to divide the household chores in a way that makes sense for their family.
Those interactions, multiplied every day across millions of women, will change everything.
Melinda Gates is cochair of the Bill & Melinda Gates Foundation and founder of Pivotal Ventures. She is the best-selling author of The Moment of Lift.
13. Geoff Colvin
Showing up will matter again.
In the 2020s people in developed economies will rediscover the value of physical presence—engaging with others face-to-face, eye-to-eye. The opposite trend, social isolation, has been building for decades, described chillingly in Robert Putnam’s 2000 bestseller, Bowling Alone. Since then, as the world has become more digital, the trend has accelerated. In a 2018 survey, U.S. teens said they prefer texting to talking in person. Other research finds that compared with previous generations at the same age, members of Gen Z are less likely to get together with friends in person, go to parties, go out with friends, or go on dates. Across age cohorts, our phones are crowding out in-person interaction.
The bill for such behavior is coming due. “Loneliness kills,” says Robert Waldinger of Harvard Medical School. “It’s as powerful as smoking or alcoholism.” Researchers find that social isolation increases the risk of heart disease by 29% and stroke by 32%. The U.K. has appointed a minister for loneliness.
Now a countertrend is taking shape. WeWork may have been a financial house of cards, but coworking spaces are a megatrend in commercial real estate, attracting millions of people who could work at home for free but instead pay to sit among fellow humans. Companies are encouraging or requiring employees to come back to the office because researchers find that creativity and innovation are group activities built on trust, and “there is no substitute for face-to-face interaction to build up this trust.”
The most thoughtful analyst of the trend away from, and back toward, in-person interaction is MIT’s Sherry Turkle, author of Alone Together and Reclaiming Conversation. Here’s what she told Fortune: “I see a historic trend to introduce more friction, to slow us down, to look up and talk to each other and to appreciate what only we as humans can give each other. The trend for the next decade: the embrace of what we don’t share with machines. Empathy. Vulnerability. The human-specific joy of the friction-filled life.”
Geoff Colvin is an author and longtime editor at Fortune.
14. Malala Yousafzai
Investing in girls’ education pays huge dividends.
Our world often fixates on a single story—the girl who went on strike for climate justice, the girl who spoke against gun violence, the girl who fought for her education. But impressive young women aren’t as rare as headlines suggest.
I was 11 years old when I started blogging for the BBC about life in Pakistan’s Swat Valley under Taliban control. But I only got the job after the girl who volunteered first turned it down when her father said no. I knew it then, and I see it today wherever I go: Courageous girls are not in short supply.
Indigenous girls in Brazil are fighting for their land and education. In Ethiopia, young women are hosting menstruation workshops to combat stigma and improve public health. Engineering students in Pakistan are competing as the country’s first all-female Formula race team. When given opportunities to learn and lead, girls show us again and again that they will.
Young women everywhere are speaking out, tackling local issues, and showing the next generation they can do the same. But even with enormous progress, far too many are unable to access quality education.
Today almost 1 billion girls lack the skills they need to succeed in the modern workforce. As technology continues to change how our world operates, girls in low-income countries are falling further behind.
In some families, girls as young as 8 years old are sent to work or married off to older men. Sometimes girls stay home because the journey to school is too dangerous, or because there are no female teachers. Violence and war force girls to flee their homes and miss years of school. In some areas, there simply is no school to attend.
Even when girls can attend school, some are not learning. Center for Global Development researchers found that only 48% of Indian girls with a fifth-grade education were literate; in Nigeria, the figure drops to 8%. In some developing countries, investment in education has not kept pace with the drive for access to it. Underfunded, overcrowded schools mean fewer trained teachers, fewer up-to-date textbooks, and fewer opportunities to access technology.
When girls go to school, the future is brighter for all of us. Last year, Malala Fund and the World Bank published research showing that if all girls completed 12 years of school, they would add up to $30 trillion to the global economy, closing workforce gaps and generating new jobs. More educated girls means more women driving innovation, holding seats in government, and running companies.
I want to help girls catch up, so they can take us forward. To get there, we need every individual, every company, and every country to recognize their potential and invest in it. Governments need to allocate more funds to education and change laws that disproportionately affect women and girls; their economies will thank them later. Experts recommend that developing countries, where the highest numbers of out-of-school girls live, spend 6% of GDP on education—but very few are meeting this target today.
Corporations understand that modern economies require educated women. Companies can invest in creating a pipeline of female talent by contributing a percentage of profits to improve and expand girls’ education. And each one of us can contribute by learning more about the issue, sharing girls’ stories, or donating to girls’ education organizations.
I am excited by the energy pouring into the fight for equality right now. It gives me hope that if we work together we will soon see all girls in school and more women in leadership roles. The world we build will be one where people aren’t surprised by exceptional girls—they’ll expect them.
Malala Yousafzai is a student, cofounder of Malala Fund, and the youngest person ever awarded the Nobel Peace Prize.
15. Sandy Speicher
Thinking big will redesign the world.
A couple decades ago, designing in a human-centered way was a new idea—and it created a major opportunity for leaders in business. Today, we all have things around us that are really well designed. It’s become baseline to consider how things look, feel, and act.
Now, we have to think even bigger about what we’re designing for. We have to ask: How does this thing I’ve designed affect a larger system? How does it affect society? How does it affect the planet? This is a new realm of not just the design of the thing, but the design of the system around the thing.
If you want to design something that takes better care of the planet, you’ve got to design the supply chain. You’ve got to design the chemistry involved in the materials you’re putting together. There’s a whole new set of knowledge and domains that actually need to be considered, and that designers can have a role in shaping.
We have to understand the diversity of many people’s needs in order to change the system. That means working together. And it means asking who is at the table doing the designing—and how we can ensure that people contribute to what is being designed for them. The next decade of design will require not just listening to people, but designing with people to help them shape the future they’d like to see.—As told to Maria Aspan
Sandy Speicher was named CEO of design firm IDEO in April. She’s the third CEO—and first woman to hold the role—in the company’s nearly 30-year history.
Tech and A.I.
16. Aileen Lee
Venture capital will transcend the valley.
For the first time, in 2019, this became part of the conversation between venture capitalists and startup founders: Where are you thinking of being based? Will you have one headquarters or two? Are you planning to be a distributed workforce from the beginning? The fact is, those types of decisions change how you build your culture and processes from the get-go. Because of what’s happening with open source code and Amazon Web Services [the cloud-computing infrastructure that powers many startups], more and more multibillion-dollar tech companies will be built outside Silicon Valley. There are some great areas like Seattle, Denver, Austin, Washington, D.C., and San Diego where you can live comfortably and send your kids to good schools. And there are already quite a few multibillion-dollar tech companies outside Silicon Valley.
I think you will see more regionally focused VC firms have success. And more Silicon Valley VCs will spend more time on airplanes. I think Zoom [the videoconferencing company] has something to do with this trend too. You can live—and work—anywhere.—As told to Michal Lev-Ram
Aileen Lee is a venture capitalist and founder of Cowboy Ventures. She coined the term “unicorn.”
17. Tristan Harris
Big tech won’t reign—it will be reined in.
“The matrix exists, it just doesn’t look like it did in the movie,” says Tristan Harris.
What the former Google design ethicist is conveying is the notion that we all live, as dystopian as it may sound, in a mock reality fabricated by machines. These machines constitute “the surveillance-attention economy,” as Harris calls it, a product of the growing cadre of companies and technologies that “profit off of renting access to manipulate us with increasing levels of precision.”
Facebook, Google, TikTok-owner ByteDance—Big Tech corporations with hands in data-siphoning and advertising-based business models—are building profiles of people so they can predict and influence human behavior. In essence, they’re creating virtual “voodoo dolls” they can poke, prod, and use to bewitch, Harris says. “They’re competing for a better way for a third party to manipulate your habits, your moods, subtle shifts in your identity, beliefs, or behavior.”
The harms are many. Harris lumps them together under the header of “human downgrading,” a phenomenon that includes a shortening of attention spans, diminishment of free will, and increasing incidences of polarization, isolation, and depression among the population. The apparatus ultimately “destroys our capacity to make sense of the world in an accurate and well-founded way that is critical for democracy.”
How to stave off self-destruction? Harris proposes implementing regulation that would force Big Tech to disassociate its profits from “the increasing capacity to control and shape human behavior.”
The proposal has precedent. Until the late ’70s and ’80s, energy utilities in the U.S. were almost purely incentivized to encourage overconsumption: The more people left the lights on, the more money electricity suppliers made. Policies were then put in place to decouple that profit motive from consequent wastefulness. Past a certain point, consumers would be charged steeper prices for their consumption, and some of that premium would go toward funding renewable energy sources. The approach had a dual effect: bolstering thriftiness and long-term energy solutions.
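The decoupling mechanism described above comes down to simple tiered arithmetic. The sketch below is purely illustrative—the rates, baseline, and function name are invented, not drawn from any real utility’s tariff: usage beyond a baseline allowance is billed at a premium, and the premium over the base rate is earmarked for renewables.

```python
# Hypothetical tiered-pricing sketch of the "decoupling" idea:
# usage beyond a baseline is billed at a premium rate, and the
# surcharge above the base rate funds renewable energy.
# All numbers are invented for the sake of the example.

BASE_RATE = 0.12      # $/kWh for usage up to the baseline
PREMIUM_RATE = 0.20   # $/kWh for usage beyond the baseline
BASELINE_KWH = 500    # monthly allowance billed at the base rate

def monthly_bill(kwh_used):
    """Return (total bill, portion earmarked for renewables)."""
    base_usage = min(kwh_used, BASELINE_KWH)
    excess = max(kwh_used - BASELINE_KWH, 0.0)
    bill = base_usage * BASE_RATE + excess * PREMIUM_RATE
    # Only the premium over the base rate on excess usage goes to renewables.
    renewables_fund = excess * (PREMIUM_RATE - BASE_RATE)
    return bill, renewables_fund

bill, fund = monthly_bill(700)   # 500*0.12 + 200*0.20 and 200*0.08
```

Under this structure, a heavy user pays more per marginal kilowatt-hour—discouraging waste—while the surcharge finances long-term alternatives, the dual effect the passage describes.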
Harris believes a similar policy will be needed to repair “the breakdown of society” that Big Tech is causing. These companies should be required to plow some of their profits into “regenerative” areas, Harris says. Some money could prop up investigative journalism, whose core business model Big Tech helped hollow out. Some could bankroll mental health and community-building initiatives. Still more could fund alternative tech products designed with the public interest in mind, like public utility social networks supported by Wikipedia-style nonprofit business models.
In this Matrix, so-called users are the ones being used. “Free is the most expensive business model we’ve ever created,” Harris says. Now we have to choose: “free” or freedom.—Robert Hackett
Tristan Harris is the director and a cofounder of the Center for Humane Technology. Earlier, he worked as a design ethicist at Google.
18. Geoff Colvin
The line between human and bot will disappear—and we’ll be fine with it.
When you’re text-chatting with Katie as you resolve a problem at a retail website, do you wonder whether Katie is a person or a bot? More important, do you care? Did it bother you that the 30-years-younger Robert De Niro in The Irishman was partly real and partly computer generated? Have you smiled at deepfake videos in which public figures seem, convincingly, to say outrageous things they never said?
The blurring of humanness is well underway and will accelerate in the 2020s. Living with indistinguishable humanoids—text, audio, and video versions, and just maybe physical—will become routine. The hard part is fully grasping how much better the technology will become. Just a few years ago those deepfake videos were difficult and expensive to make, and they still looked obviously doctored. Now high-quality, make-your-own-deepfake apps are available for free and getting better every day. Google demonstrated a convincing audio humanoid, Duplex, 18 months ago; this year fraudsters called a U.K. executive with a fake audio version of his boss so realistic that the executive followed its orders to send 200,000 pounds to the fraudsters’ account.
Video game makers scan thousands of athletes’ faces every year; today’s games aren’t quite indistinguishable from actual TV coverage, but it’s reasonable to think that within a decade they will be. As for the ultimate indistinguishable humanoid? Hanson Robotics in Hong Kong is developing Sophia, which it intends to make physically realistic and fully human—a “conscious, living machine.” Living? Really? Seems unlikely. What we can say with confidence is that the 2020s will be the decade in which we stop wondering if a human image, voice, or message is actually human. In many cases, we just won’t know. And we’ll be okay with that.
19. Beth Ford
The 2020s will connect rural America—or lose it.
Division is all around us today. Rural versus urban. Heartland versus coasts. Boomer versus Gen Z. Republican versus Democrat. To ensure prosperity a decade from now, we need connection. Literal connection, enabled by technology and investment, and human connection, enabled by all of us.
Today, 24 million Americans, 80% of them in rural areas, do not have access to high-speed Internet—the greatest enabler of human connection in our lifetime. It is to our times what electricity and transportation were to our grandparents.
In 10 years, all America must be connected. We must address accessibility in rural areas and affordability in urban areas. Less than 2% of the population provides the nation with safe and affordable food. The health of their communities is vital to the food security of the nation.
But today, one in four children in rural America lives in poverty. Over the past year, rural job growth was less than half the nationwide rate. More than 60% of new jobs were in metro areas, compared with 8% in rural areas. Nearly 45% of the 2017 deaths from heart disease in rural areas were deemed “potentially preventable,” compared with 18.5% in one of the urban classifications. Without the population to support large grocery stores, fresh food is less available in rural areas.
Farmers, both small and large, are the backbone of these communities. When they aren’t profitable, they can’t invest in education, health care, and the local economy. Today 60% of farmers say they don’t have enough connectivity to run their businesses; 78% do not have a choice of ISPs; and 60% say what they do have is slow. Modern agriculture relies on cutting-edge agtech and precision farming tools to boost production, address climate concerns, and improve sustainability.
You would think, given these statistics, coupled with the year they’ve had, farmers would look to the future with trepidation. But they are looking forward with a sense of action in mind—and so should we. In the coming decade, we will either connect rural America or risk losing it.
Beth Ford is the president and CEO of Land O’Lakes and is No. 31 on Fortune’s Most Powerful Women in Business list.
20. Joy Buolamwini
A.I. “hygiene” will determine the success of A.I.
Despite a current boom in artificial intelligence, today’s complicated mathematical systems still suffer an inherent flaw—their propensity, like their human creators, to fall prey to their own biases.
Recognizing that fact, explains Joy Buolamwini, the founder of the Algorithmic Justice League, is a crucial part of practicing good A.I. hygiene, a technology concept akin to continuously taking care of one’s health. A.I. systems that adapt and take action based on the data they ingest require constant tending and human oversight, especially if the systems end up failing to work as well on minority or marginalized groups not equally represented in the data sets.
Consider the facial-recognition technologies offered by companies like IBM and Microsoft that worked better on lighter-skinned men than on darker-skinned women. Buolamwini and colleague Timnit Gebru’s milestone research paper published in 2018 highlighted the bias problems, which resulted in both companies improving their systems to reduce the discrepancies. But despite the fixes, the systems still don’t work as well on women with darker skin, underscoring how companies must continuously monitor and adjust their A.I. systems as more people interact with them.
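The kind of monitoring this implies can be sketched in a few lines: compute a model’s accuracy separately for each demographic group and flag the gap between the best- and worst-served groups. The data, group labels, and function names below are invented for illustration; this is not the methodology of the Buolamwini–Gebru paper itself.

```python
# Toy subgroup audit: compare a model's accuracy across groups
# and measure the largest disparity. Data and labels are invented.
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Return {group: accuracy} over aligned label/prediction/group lists."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        total[g] += 1
        correct[g] += int(t == p)
    return {g: correct[g] / total[g] for g in total}

def max_accuracy_gap(scores):
    """Largest accuracy difference between any two groups."""
    vals = list(scores.values())
    return max(vals) - min(vals)

# The model is right 3 of 4 times for group A, but only 2 of 4 for group B.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 1, 1, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

scores = accuracy_by_group(y_true, y_pred, groups)
gap = max_accuracy_gap(scores)  # 0.75 - 0.50 = 0.25
```

Re-running an audit like this as new users interact with a system is one concrete form of the continuous “hygiene” Buolamwini describes.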
She thinks more companies need to consider whether it’s appropriate to use an A.I. system in the first place. If they do, they must track the system’s impact on different societal groups, because systems often function differently than expected.
Companies also need to be aware of their technology’s limitations and open to having an “active process and oversight and engagement with the people using these systems,” or opening up the “black box,” as she puts it.
Practicing good A.I. hygiene can help companies mitigate potential harms and bias, but it’s not something they can do once and consider themselves in the clear. It’s an ongoing process. Quips Buolamwini, “You wouldn’t shower once in 2020 and say you’re good.”—Jonathan Vanian
Joy Buolamwini is the founder of the Algorithmic Justice League.
21. Nir Eyal
Tech companies will finally decide it’s in their best interest to make devices less distracting.
When it comes to distraction, everything old is new again. Two millennia ago, Socrates complained about the corrupting influence of a new technology: the written word. It was those cursed books, not smartphones, that would capture our attention and cause akrasia—a tendency to act against our best judgment.
History’s lesson: As much as we want to believe our current technology is the sole cause of our lack of focus, it’s not. In every era and age, new creations have preyed on our brains’ cravings.
Of course, the speed and ubiquity of modern information have no precedent—and appear to have given rise to similarly unprecedented levels of distraction. Yet here’s what gives me hope that we’ll overcome the worst of this latest era of fractured attention: new technology. Yes, I appreciate the irony. But there’s real promise in new tech to help those most at risk of addiction and distraction.
I’m not a Pollyanna. I don’t expect, nor should we depend on, technology companies to make their products less engaging for the general public. But I do think our society is waking up to our obligations to help the most tech-vulnerable among us, especially children. Companies know this—and understand its potential to affect their bottom lines. That’s why the most recent Apple and Android operating systems include much more robust parental controls. Expect to see more of these efforts in the years to come.
And the rest of us? We need to rigorously manage our attention. In doing so, perhaps we can take comfort in knowing that we’re part of a 2,000-year-old tradition. Let our millennia-long fight against akrasia continue.
Nir Eyal is the author of Indistractable: How to Control Your Attention and Choose Your Life and blogs at NirAndFar.com.
22. Christopher Tonetti
Consumers should own—and be able to sell—their personal data.
Consumers spent the first part of this century losing control over their personal information. They traded it to Facebook, Google, and other big tech companies, which have built empires by amassing—and monetizing—their users’ data. European regulations and a new California law are starting to wrest back some data-privacy protections, but economist Christopher Tonetti has a more radical proposal: allow consumers to take back ownership of their data from the tech giants—then encourage them to sell it to many companies at once. Tonetti argues that this system would simultaneously protect privacy and help generate business innovation, including at some of those very same tech giants that would have to give up their data supremacy.
Fortune: Why do you advocate for consumers taking ownership of their data, rather than for preventing tech companies from collecting that data in the first place?
Tonetti: If the goal is to protect consumer privacy, just don’t collect the data. It won’t get hacked. It won’t get breached. If protecting privacy is your only goal, that’s the right answer. But the collection of data is allowing innovation and improvements in a lot of the products that we use—in the medical context, the education context, the technological context. We don’t want to prevent firms from being able to use data to create better products. But it’s unlikely that allowing firms to do whatever they want with the data they collect is going to result in the best we can do as a society.
Consumers owning their data gets to the best of both worlds. They can protect their own privacy [if they choose]. Or they can sell their data to many firms at the same time, which is going to spark innovation and quality improvement.
What are some examples of how this system would both promote business innovation and benefit consumers?
Imagine there are two hospitals both regularly taking scans of patients’ lungs to see whether there’s cancer, and building machine-learning algorithms from the scans. That’s something desirable: We want to have better predictions for cancer. Now, the hospitals could base their machine-learning algorithms on all of the patients treated by both institutions, and each would have more accurate algorithms than by only using their own patients. [But under the current system, they’re unlikely to do so, because] they don’t want their competitors to have access to their patients’ data. But if consumers own that data, they might sell it to several hospitals, to help improve the prediction algorithms everywhere.
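Tonetti’s hospital example can be made concrete with a toy model. The sketch below uses a nearest-centroid classifier on one-dimensional “scan scores”; the hospitals, data, and numbers are all invented, and real diagnostic models are far more complex. The point is only that centroids estimated from the combined data generalize better than either hospital’s alone, because each hospital’s small sample is skewed.

```python
# Toy illustration of pooled vs. separate training data.
# A nearest-centroid classifier assigns a point to the class
# whose training-set mean is closest. All data are invented.

def train(samples):
    """samples: list of (value, label). Returns per-class mean (centroid)."""
    sums, counts = {}, {}
    for x, y in samples:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda y: abs(x - centroids[y]))

def accuracy(centroids, test_set):
    return sum(predict(centroids, x) == y for x, y in test_set) / len(test_set)

# Each hospital's sample is small and skewed in a different direction.
hospital_a = [(0, 0), (1, 0), (6, 1)]
hospital_b = [(3, 0), (9, 1), (10, 1)]
test_set = [(3.5, 0), (6, 1), (1, 0), (9, 1)]

acc_a = accuracy(train(hospital_a), test_set)                    # 0.75
acc_b = accuracy(train(hospital_b), test_set)                    # 0.75
acc_pooled = accuracy(train(hospital_a + hospital_b), test_set)  # 1.0
```

Each hospital alone misclassifies a different test case; training on the pooled data corrects both—the gain that, in Tonetti’s argument, consumer-owned data could unlock by being sold to several institutions at once.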
Still, this seems like it would impose a huge individual burden on consumers. How would you make it simple and easy for us to hold and sell our data?
You can’t spend your entire day deciding who to sell to, at what price. Ultimately, the way I see this playing out is via a market with intermediaries that are acting on the consumers’ behalf, like financial advisers. I could imagine a consumer-oriented organization, like a Consumer Reports, offering a product where you install their app on your phone. You tell them the types of data that you feel comfortable sharing, the types of stuff you don’t want to share, and then the organization can collect what’s generated by your location data, things like that, and sell it in the marketplace, bundled together with other people’s information.
So what would be a reasonable price for my data?
I don’t think we have a good answer to that. There are intermediary firms that do collect a lot of data on you and then sell it to other firms. But currently, in many cases, [big tech] firms own it. They collect it, they didn’t have to pay a price to get it, and we don’t know what price they’d be willing to pay.
What’s the best model for this sort of brokerage system? Should these intermediary advisers or brokers be private companies, nonprofits, or government agencies?
Regulators have a really big role to play here, and they’re starting to figure it out. Europe came through with the GDPR, and California passed a Consumer Privacy Act that takes effect very soon [Jan. 1, 2020]. So regulators play a role [in getting] the property rights correct, meaning giving consumers ownership of their data. I don’t really see the government as setting the prices for [this information]. The government’s area of expertise is setting down property rights and then letting markets form. This market might not develop without some strong government intervention—but it’s just about getting the rules right and then letting the market work.—Interview by Maria Aspan
Christopher Tonetti is an economist and associate professor at Stanford Graduate School of Business.
23. Christiana Figueres
We’ll witness the end of the internal combustion engine era.
In the 2020s, we will see the beginning of the end of the internal combustion engine. That is quite remarkable because the entirety of our economic growth over the last 150 years has come on the back of this technology and the fossil fuels that feed it.
Over the past few years we have been investing heavily in electric and hydrogen-powered vehicles, and this is just going to accelerate exponentially over the next decade. Regulators at the city, state, and national levels are beginning to understand that the internal combustion engine is a polluting technology both globally and locally. Our cities are heavily polluted mostly because of the burning of fossil fuels, which contributes to almost 7 million deaths per year globally.
In addition, many—if not all—car manufacturers realize that the demand for low- or no-emission vehicles is increasing exponentially. Users are realizing that these vehicles can have all of the performance advantages of the internal combustion engine without the maintenance or the pollution; the vehicles can go from 0 to 60 miles per hour in 2.5 seconds. Vehicle-charging networks are extending their coverage, and battery costs dropped 13% just in 2019. Some have predicted cost parity between electric cars and gasoline-powered vehicles as soon as 2020. Ford recently announced that it is producing a fully electric Mustang SUV. Meanwhile, two-wheelers are shifting to electric: Drivers can go to a petrol station in India, find a wall of batteries that are all charged, exchange their empty battery in a few minutes, and take off.
By 2030, we will probably not be able to purchase a new vehicle with an internal combustion engine. We will still have a transition period of maybe 10 to 15 years during which both technologies will be on the road. But by 2030, I would like to see the internal combustion engine in a museum—a museum that duly honors the role that it has played in global economic development but makes it clear that such technology is now history.—As told to Rey Mashayekhi
Christiana Figueres was the executive secretary of the United Nations Framework Convention on Climate Change from 2010 to 2016.
24. Tony Fadell
Today’s waste will replace tomorrow’s plastic.
We designed our way into the plastics problem. Now we have to design our way out.
Some durable plastic is okay. But the fastest-growing use is in disposable packaging. Recycling isn’t the answer, nor are bio-plastics. Petroleum-based plastic stays in the environment for 500-plus years, and most bio-plastics will fully compost only in specific conditions. Plastic gets into our oceans, our food, our bodies.
We need a bio-inspired packaging material that disintegrates completely, no matter where it ends up. PHA (polyhydroxyalkanoates, a class of natural polyesters derived from bacterial fermentation) is one solution. It will degrade just like a leaf—quickly in any type of environment and into nature’s molecular building blocks: carbon dioxide, water, and nontoxic biomass. We’re learning to produce it with biowaste, like rancid olive oil.
Governments need to continue banning single-use plastics, and companies have to stop the greenwashing. It will take capital, but we could tackle this in four to seven years.—As told to Clay Chandler
Tony Fadell, principal at advisory firm Future Shape, is known as the “Father of the iPod.” He is also coinventor of the iPhone and cofounder of Nest.
25. Fred Krupp
Tech alone can’t save the planet—transparency is needed, too.
As the reality of climate change hits home, it’s easy to feel despair. But the pace of environmental innovation is accelerating too. We still need ambitious government policies, but new technology and increased transparency can speed progress and spur both business and government to deliver better results.
Business leaders know these changes are underway, and many have embraced them. In the Environmental Defense Fund’s second annual survey of 600 executives, more than 84% say they are confident that technological advances will have a positive effect on the way businesses impact the environment, especially analytics, automation, A.I., and sensors.
And because social media means that everybody gets a vote on whether your company is a responsible corporate citizen, more than 85% of those executives expect customers, employees, and investors to hold them more accountable for their impact on the environment.
After Hurricane Harvey, for example, EDF worked with a startup called Entanglement Technologies to measure air pollution near flooded petrochemical plants in Houston. With its portable technology, we quickly identified a plume of cancer-causing benzene in a community of 4,000 people. This real-time data allowed officials to identify potential health risks and prioritize resources.
EDF recently created a new subsidiary to launch MethaneSAT, an orbital mission designed to help citizens, companies, and governments locate, measure, and reduce emissions of methane, a potent greenhouse gas. Data from MethaneSAT will also be available to the public free of charge so that everyone will be able to hold businesses accountable, applauding progress or spotlighting laggards. Imagine a world in which thinking machines, handheld analyzers, and orbiting sensors empower an environmental revolution. We’ll see that in this decade.
Fred Krupp is the president of the Environmental Defense Fund, a U.S.-based nonprofit environmental advocacy group.
A version of this article appears in the January 2020 issue of Fortune with the headline “20 Ideas That Will Shape the 2020s.”