In less than three decades, a handful of companies have grown from dorm-room startups and garage projects into institutions that rival governments in wealth and influence. Tech giants like Amazon, Apple, Google, Meta, and Microsoft have embedded themselves in the infrastructure of modern life to a point where it’s difficult to imagine a world without them.
This astronomic growth raises a set of questions: How did they reach this scale so quickly? What trade-offs do we face for relying on them so heavily? And what ethical dilemmas emerge when so much power is concentrated in so few hands?
Instead of wading through clickbait or partisan content, Gale In Context: High School’s Big Tech topic page channels student curiosity into credible sources: biographies that humanize the founders, business histories that trace growth, and reporting that follows controversies and innovations from the industry’s early days to the present. The platform doesn’t dictate what to think—it equips students with evidence to form their own perspectives on the economic, social, and technological dynamics shaping the wider world.
Big Tech as a Tool for Critical Discussion
Let’s take a look at five of the most influential tech firms—valued at more than $13 trillion collectively—pairing a short profile of each with headline-grabbing innovations and controversies. Gale In Context: High School gives these case studies academic weight by supporting pedagogically rigorous research practices, ensuring classroom discussions stay rooted in evidence.
Amazon
Amazon began in 1994 as Jeff Bezos’s online bookstore, run from a garage in Seattle, but quickly expanded into electronics, household goods—and just about everything else. Unlike more narrowly focused rivals such as Apple or Meta, Amazon has grown into a behemoth spanning healthcare, logistics, media, and retail. Prime memberships secure customer loyalty with free shipping and streaming perks, while its acquisition of Whole Foods places it in direct competition with grocery retailers.
In 2021, Bezos stepped aside as CEO to devote more attention to his spaceflight venture, Blue Origin. His successor, Andy Jassy, built Amazon Web Services into the backbone of much of the internet, and now directs the company’s broader expansion.
Today, Amazon is valued at more than $2 trillion and employs over 1 million people worldwide. Its vast reach makes it both a driver of innovation and a lightning rod for criticism, particularly over the hidden costs of convenience.
Innovation and Controversy
Amazon transformed retail by pioneering one-click online shopping and building a supply chain that could deliver goods to most US households in two days or less. At a national scale, it brought a vast array of previously unavailable goods to rural households, making dependable home delivery a baseline service regardless of ZIP code. It also gave small and mid-size sellers access to a national storefront with built-in logistics that many would not otherwise have the capital to develop.
Behind Amazon’s fast delivery promises lies a sprawling network of warehouses where employees carry handheld scanners that track their every move, flagging “time off task” if they pause too long. Algorithms set the “pick rates” that dictate how quickly workers must retrieve items, quotas tied to serious injury rates nearly double those of other warehouse workers.
That tension came to a head in 2021, when warehouse employees in Bessemer, Alabama, tried to form a union. During the mail-in election, Amazon persuaded the US Postal Service to install a mailbox just outside the warehouse, directly under company surveillance cameras. Because workers were expected to deposit their union ballots there, labor officials later ruled that the mailbox’s placement created the impression that Amazon was monitoring votes.
Although the union drive failed, the National Labor Relations Board ruled that Amazon had improperly interfered with the election and ordered a re-vote.
Critical Thinking Questions:
- How might Amazon’s innovations in logistics be applied to solve challenges outside of retail, such as disaster relief or vaccine distribution?
- If Amazon’s wages and benefits outpace many competitors, does that justify the trade-off in working conditions? How should we weigh pay against dignity in the workplace?
Apple
When Steve Jobs and Steve Wozniak assembled the first Apple computer in 1976, they weren’t chasing corporate contracts or government research. They imagined a machine ordinary people could use, a vision realized in early models like the Apple II and the Macintosh that quickly set Apple apart and still defines the brand nearly 50 years later.
Apple’s direction has always reflected its leaders. Jobs’s flair for presentation turned product launches into cultural touchstones, while Wozniak’s engineering skills laid the foundation for early breakthroughs. Since 2011, Tim Cook has guided the company, focusing on supply chain efficiency, user privacy, and environmental commitments.
Today, Apple products anchor an ecosystem that runs from iPhones and Macs to Apple Music, iCloud, and the App Store. This integration has fueled staggering profitability, making Apple the first publicly traded US company valued at $3 trillion.
Innovation and Controversy
Apple’s most important breakthrough was the iPhone’s touchscreen design, which replaced physical keyboards and styluses with a simple swipe and tap. That change made advanced technology intuitive for a mass audience, lowering the barrier to using powerful digital tools and drawing millions of new users into mobile computing. That same design ethos extended into accessibility features—like screen readers, voice control, and text enlargement.
But the same design choices that made the iPhone revolutionary also gave Apple unusually tight control over the availability and cost of fixes. Independent repair shops have long argued that iPhones and MacBooks are deliberately difficult to fix outside Apple’s network, with restricted access to parts and diagnostic tools. Batteries, in particular, became a point of contention: users couldn’t easily replace them, leaving many dependent on Apple for costly service or forced into purchasing a new device.
The issue boiled over in 2017 with “batterygate,” when lawsuits revealed Apple had slowed older iPhones. The company said the move was meant to prevent shutdowns as batteries aged, but users saw it as proof of how little control consumers had over their own devices.
A $113 million settlement followed, and, under pressure, Apple launched its Self Service Repair program in 2022.
Critical Thinking Questions:
- How does “batterygate” reflect tensions between corporate strategy and consumer trust? Could transparency have prevented the backlash?
- Should laws require tech companies to provide affordable, accessible repair options, similar to how US law mandates automakers share repair information with independent mechanics, or how farmers have fought John Deere over tractor software locks?
Meta
Few companies have reshaped social life as quickly or dramatically as Meta. What began in Mark Zuckerberg’s Harvard dorm now draws nearly 3 billion monthly users across Facebook, Instagram, and WhatsApp, making it one of the primary conduits of modern communication.
The company’s business model is straightforward but immensely powerful: advertising. By offering its platforms for free, Meta draws billions of users whose data and engagement allow for highly targeted marketing. In 2024, more than 97% of Meta’s revenue still came from advertising, though the company has invested heavily in virtual and augmented reality through its Reality Labs division.
Those investments reflect Mark Zuckerberg’s push toward the “metaverse”—a proposed network of immersive digital spaces where people might work, socialize, or shop through avatars and headsets. The 2021 rebrand to Meta signaled this ambition, but the company’s identity and profits remain rooted in its core social media lineup, which has a market cap of nearly $2 trillion.
Innovation and Controversy
Facebook’s defining innovation came in 2006 with the launch of the News Feed, which turned scattered profile updates into a personalized, endlessly scrolling stream. The site soon became a primary source of news and connection for billions of users.
The News Feed’s design—ranking posts based on predicted engagement—helped Facebook grow into one of the most influential communication platforms in history. But the same algorithms that fueled that growth have also been tied to harmful outcomes.
In 2021, former Facebook product manager Frances Haugen leaked more than 22,000 internal documents to The Wall Street Journal and the SEC. The leaks—dubbed the Facebook Papers—revealed that Meta’s own employees had repeatedly warned executives about harms caused by its platforms.
One internal slide deck showed that Instagram made body image issues worse for 32% of teen girls. Another found that Facebook’s algorithm changes in 2018, designed to boost “meaningful social interactions,” instead rewarded outrage and misinformation. A company researcher concluded: “Our algorithms exploit the human brain’s attraction to divisiveness.”
Haugen testified before Congress that Facebook “consistently chose to optimize for its own interests, like making more money,” even when it knew its design choices endangered users or undermined democracy.
Critical Thinking Questions:
- Haugen’s leaks included statistics, researcher notes, and slide decks. How does having direct evidence differ from relying on outside critics or journalists? Should companies be legally required to release research on public harms?
- How might you redesign the News Feed to preserve its benefits while reducing its potential for harm?
Google/Alphabet
Google began in 1998 as a Stanford research project, when Larry Page and Sergey Brin built a search engine that ranked pages by relevance instead of just matching keywords. The experiment quickly eclipsed rivals like Excite and Yahoo!, and within a few years, Google had entered everyday speech both as a noun and a verb.
What made it profitable was advertising. By pairing search results with sponsored links—and later expanding through YouTube ads and the Display Network—Google transformed user queries into one of the most lucrative business models ever devised.
The 2015 creation of Alphabet, a new parent company, reorganized Google as a subsidiary alongside moonshot ventures like self-driving cars (Waymo) and biotech (Verily). Sundar Pichai, Google’s CEO since 2015, has steered the company toward cloud computing and AI.
Innovation and Controversy
When Google bought YouTube in 2006, it was already the web’s leading video site. A year later, YouTube launched the Partner Program, allowing creators to earn a share of advertising revenue. Since then, YouTube has grown from a hobbyist platform into the foundation of the “creator economy,” where individuals can build careers and entire industries around online video. As of 2023, more than 2.5 billion people (roughly 47% of internet users worldwide) turn to the platform each month for education and entertainment from their favorite creators.
In relying on ad revenue to empower creators, YouTube also had to ensure viewers stayed engaged. The Partner Program rewards watch time, and YouTube’s algorithms are tuned to maximize it. That feedback loop—ads funding creators, creators competing for attention, and algorithms serving what would keep people clicking—means that success on the platform often depends on producing more eye-catching or sensational content.
Because spectacle tends to draw more viewers, sensational videos perform better with the algorithm, reinforcing the incentive to push boundaries in pursuit of clicks. A study from UC Davis found that these same algorithms can funnel users toward conspiratorial or extremist material, even when they haven’t searched for it directly.
Journalists have described this as the “YouTube rabbit hole,” where the algorithm’s taste for sensational content lures people from mainstream coverage into the fringes.
Critical Thinking Questions
- How has the creator economy changed opportunities for individuals and small businesses worldwide?
- How might a business model driven by advertising revenue and watch time affect a platform’s moderation decisions?
Microsoft
Before it became a global giant, Microsoft started in 1975 as the project of Bill Gates and Paul Allen, who envisioned software as the future of computing. Their first breakthrough came with MS-DOS, licensed to IBM, which cemented Microsoft as the industry’s default provider. Windows followed in 1985 and soon dominated, powering the vast majority of PCs worldwide.
Licensing turned out to be a goldmine. Windows and Office became staples of corporate and personal computers, and by the 2000s, Microsoft had expanded into gaming with Xbox and hardware with the Surface tablet line. But as Apple surged in mobile and Google in search, the company was increasingly considered a fading giant.
The turning point came in 2014, when Satya Nadella stepped in as CEO. Rather than cling to Windows as the company’s core identity, he steered Microsoft toward its Azure cloud services. Coupled with strategic bets on AI, including a high-profile alliance with OpenAI, the company has re-emerged as a tech leader.
Innovation and Controversy
In the 1990s, Windows ran on roughly 90% of personal computers, standardizing the PC experience. This lowered barriers for new users and expanded access to computing in homes, schools, offices, and libraries worldwide. That ubiquity made digital skills a basic part of modern life and helped accelerate the spread of the internet itself.
That overwhelming reach caught the attention of regulators, and in 1998, the US Department of Justice, joined by 20 states, filed an antitrust lawsuit accusing Microsoft of using its dominance to stifle competition.
The central concern was Internet Explorer. Microsoft had tied the browser directly to Windows, ensuring that anyone buying a PC encountered it by default. Computer manufacturers who wanted to license Windows were pushed to highlight Internet Explorer and hide competitors like Netscape, while internet providers received incentives to promote Microsoft’s browser. Internal emails made the strategy plain, describing a plan to “cut off Netscape’s air supply.”
In 2000, a federal judge ruled that Microsoft violated antitrust law and initially ordered the company to be split into two. The breakup was overturned on appeal, but the 2001 settlement forced Microsoft to change its business practices and open parts of its software to outside scrutiny. Microsoft defended itself by arguing that Internet Explorer integration made internet access simpler and cheaper at a time when browsers were still sold separately.
In hindsight, the war over Internet Explorer looks almost quaint. The browser itself was retired in 2022, remembered as the tool most people used to download Chrome or Firefox.
Critical Thinking Questions:
- Consider the software you use every day—games, word processors, graphic design tools, web browsers, etc. How did Windows’ role as a common platform in the 1990s encourage the development of these programs and shape the way people work, learn, and play today?
- What unique challenges do regulators face when it comes to applying antitrust law to fast-moving digital markets compared to industries like oil, steel, or railroads?
High school is where students first begin to question the systems that shape their lives. If we ignore Big Tech as part of that conversation, we risk leaving them fluent in the tools but ignorant of the structures behind them. By making these companies a subject of evidence-based study, classrooms turn a passive, everyday experience into a lens for critical thinking and civic awareness.
Give your students more than surface-level tech literacy by reaching out to your Gale sales representative and asking how the Gale In Context: High School platform helps learners ask more thoughtful questions about the cost of convenience.