Minutes before the 1976 MLB trade deadline, a flurry of sales went down—only to be blocked by Commissioner Bowie Kuhn.
At roughly midseason every year, Major League Baseball teams swap players, with floundering teams often dealing stars for prospects. No MLB trade deadline period, however, rocked the sports world quite like 1976.
Minutes before the trade deadline on June 15, Oakland A’s owner Charles O. Finley sold stars Joe Rudi and Rollie Fingers to the Boston Red Sox for $1 million each and Vida Blue to the New York Yankees for $1.5 million, a move meant to avoid losing them for nothing in the sport’s first free-agency period.
Ron Fimrite of Sports Illustrated called it the “biggest sale of human flesh in the history of sports,” a deal unprecedented in MLB history for the stars and the amount of money involved.
Three days after the sale, MLB commissioner Bowie Kuhn infuriated longtime nemesis Finley by voiding the transactions, citing his best-interests-of-baseball power. Finley’s A’s, a juggernaut in the 1970s, were crippled when Rudi, Fingers and other standouts left the team after the season in free agency because the owner couldn’t afford them.
Since the advent of free agency—a ground-breaking right negotiated for players by MLB Players Association executive director Marvin Miller in late 1975—the sport has never been the same. Salaries skyrocketed and team-building became even more complex.
Babe Ruth Among Stars Sold
In April 1976, Finley began the dismantling of Oakland’s dynasty by trading star outfielder Reggie Jackson and pitcher Ken Holtzman to the Baltimore Orioles. His sale of stars in June angered fans of the A’s, the five-time defending American League West champions and World Series champs in 1972, 1973 and 1974.
“[Finley] can set up his cash on that [expletive] mound and come up here and cheer for his money,” an A’s fan at Oakland Coliseum told the San Francisco Examiner.
Other MLB owners were leery of the advent of full-fledged free agency, but none of them attempted to sell off their stars as Finley had.
Deals involving star players and substantial sums of money were not new in MLB: In January 1920, future Hall of Famer Babe Ruth was sold by the Boston Red Sox to the New York Yankees for $125,000 and $300,000 in loans. Eighteen years later, pitcher Dizzy Dean, another future Hall of Famer, was sent by the St. Louis Cardinals to the Chicago Cubs for $185,000 and three players. In the early 1930s, Philadelphia A’s manager/owner Connie Mack sold future Hall of Famers Mickey Cochrane, Al Simmons and Lefty Grove.
Because the A’s were playing host to Boston when Finley made the sale, Rudi and Fingers simply walked over to the Red Sox clubhouse to suit up for their next game. Before Kuhn ordered his return to Oakland, Fingers, who was inducted into the Baseball Hall of Fame in 1992, even had his picture taken in a Red Sox uniform on the field at Oakland Coliseum. But none of the three players in the sale ever played for his “new” team.
“If such transactions now and in the future were permitted, the door would be open wide to the buying of success by the more affluent clubs,” the commissioner stated. “Public suspicion would be aroused. Traditional and sound methods of player development and acquisition would be undermined, and our efforts to preserve competitive balance would be greatly impaired.”
“Village idiot!” Finley called the dour Kuhn after his decision voiding the players’ sale.
This midsummer baseball circus of 1976, capped by Finley’s $10 million lawsuit against the commissioner and MLB, served as fodder for days for the nation’s sportswriters:
“It has taken the creation of Finley’s First Annual Garage Sale, the passing of $3.5 million in small, unmarked bills and the appearance of coast-to-coast obituaries for his sport, but finally Kuhn has done something,” wrote Leigh Montville of the Boston Globe.
“Innovative, arrogant and, above all, flamboyant,” Kevin Lamb of the Chicago Daily News wrote of the attempted sales.
“It has been said of and by Charley Finley that he has a right to sell his ballplayers,” wrote Dick Young of the New York Daily News. “They are his property, and that is the American way, the capitalistic way. Not quite. When you join a men’s club, or a country club, you agree to abide by its rules.”
MLB Commissioner and A’s Owner Have Contentious Relationship
Kuhn and Finley had a gasoline-meets-matchstick relationship for years: In 1972, the commissioner ordered the frugal Finley to re-open contract talks with Blue, who had won the Cy Young Award as the American League’s best pitcher the previous season. During the 1973 World Series against the Mets, Kuhn demanded Finley reinstate infielder Mike Andrews, whom “Charley O” had released after he made two errors in a Game 2 loss.
With ace Blue, outfielder Rudi and relief pitcher Fingers back on the roster, the A’s finished second in the American League West in 1976. But, as Finley feared, Rudi (to the California Angels) and Fingers (San Diego Padres) left Oakland in free agency in the offseason. Blue lasted one more season with the A’s before he was traded to the San Francisco Giants. Oakland lost standout third baseman Sal Bando, catcher Gene Tenace and shortstop Bert Campaneris in free agency after the 1976 season as well, with Charley O getting nothing in return. The next season, the A’s lost 98 games, and Oakland made only one playoff appearance from 1977 to 1987.
As for that $10 million lawsuit, Oakland manager Chuck Tanner was confident his boss would prevail. “You should own the American League after you get through with this one,” he told Finley, according to the Oakland Tribune.
But Finley—who sold the A’s in 1980 and died in 1996—got nowhere in court. Ultimately, a U.S. Circuit Court of Appeals ruled in favor of Kuhn, who served as commissioner until 1984 and died in 2007. Charley O’s bitterness toward his antagonist lasted for years.
“All I can say is I think it’s a red-letter day for baseball,” Finley said after Kuhn announced his resignation. “He drove me out of baseball …”
As their numbers grew, women operators became a powerful force—for workers’ rights and even serving overseas in WWI.
In the earliest days of the telephone, people couldn’t dial one another directly. They needed an intermediary—a telephone operator—to manually relay their call on a central switchboard connected to subscribers’ wires. It was a crucial new service that helped a revolutionary new technology spread widely to the masses.
The idea originated in April 1877, when 40-year-old George W. Coy attended a lecture by Alexander Graham Bell. In it, the famous inventor demonstrated how he could converse with two colleagues—one 27 miles away, the other 38 miles—using a device he’d patented just the year before: the telephone. Coy, a Civil War veteran who worked in the telegraph business, soon made a deal with Bell to set up the first telephone exchange in the United States, a central switchboard that allowed anyone with a telephone to call or be called by anyone else who had one.
Coy’s telephone exchange, in New Haven, Connecticut, opened in 1878, with all of 21 clients, including the local police, post office and a drug store. Today, Coy is often cited as the world’s first telephone operator. But while Coy devised the switchboard for the exchange (improvising some parts using wire from women’s bustles!), he hired two boys to operate it. Louis Frost, the 17-year-old son of one of Coy’s business partners, was most likely the first operator.
That Coy would employ boys to do a job later associated mostly with girls and young women was only natural. Boys often worked at telegraph offices, while female telegraph operators were a rarity. That would continue into the early days of the telephone. But by the beginning of the 20th century, women began dominating the field. And as their numbers grew they became a powerful force—fighting for the right to join unions, striking for higher wages, even serving overseas in World War I.
It turned out there was a problem with male switchboard operators: The boys, often barely in their teens, couldn’t seem to behave themselves. They had a tendency to roughhouse. And “when some other diversion held their attention, they would leave a call unanswered for any length of time, and then return the impatient subscriber’s profanity with a few original oaths,” wrote Marion May Dilts in her 1941 book, The Telephone in a Changing World.
Hoping to find operators who’d be more attentive to their duties and not cuss out the customers, local phone companies began to recruit girls and young women. Often that meant going house to house, trying to persuade parents that telephone operator was a respectable job for their daughters.
As the number of telephones in the U.S. multiplied, so did the demand for operators. In 1910, there were 88,000 female telephone operators in the United States. By 1920, there were 178,000, and by 1930, 235,000.
In the telephone’s earliest days, one phone could be connected to another by wire, allowing their two owners to speak. While that may have seemed like a miracle at the time, it was clear that the telephone would be much more useful if any given phone could communicate with numerous phones. Telephone exchanges made that possible.
Each of the phones in a particular locale would be connected by wire to a central exchange. The owner of a telephone would call the exchange, and a switchboard operator would answer. The caller would give the operator the name of the person he or she wanted to speak with, and the operator would plug a patch cord into that person’s socket on the switchboard, connecting the two. Long-distance calls would require the local exchange to patch the call through to more distant exchanges, again through a series of cables. Later, as the exchanges added more and more customers, phones were assigned numbers, and callers could request to be connected that way.
Some early telephone operators worked at small, rural exchanges, their switchboards located in the local railroad station or the back of a general store. In cities, massive switchboards could have long rows of operators packed elbow to elbow.
Operators Were Subject to Strict Rules
At the busier boards, work could be frantic. Some operators took to wearing roller skates to get around. Otherwise, the dress code tended to be strict—long black dresses and no jewelry, for example. Operators were subject to numerous other rules, and spies sometimes monitored their calls on a device called a listening board. In 1899, when a 25-year-old San Francisco operator named Anna Byrne killed herself, the coroner held the phone company responsible: “I firmly believe that the espionage to which telephone girls are constantly subjected drives them to suicidal desperation. They are overworked; and no mercy is shown them when a slight offense is committed by a trivial infraction of the company’s rules.”
Many operators agreed. “The wonder is that more telephone girls don’t kill themselves,” a veteran operator told the San Francisco Examiner. “We are not allowed to speak even in a whisper to each other the nine hours we are on duty, much less smile, and to laugh out loud is the height of recklessness.” She said she’d once been forced to work 10 extra hours, without pay, for one brief giggle.
Companies often tried to control their operators’ personal lives, as well. “The unwritten rule was that she could not marry and would lose her job if she did,” noted Ellen Stern and Emily Gwathmey in their 1994 history, Once Upon a Telephone.
The pace of the work and the repressive rules that operators often had to put up with eventually led to dissension in the ranks. Phone companies discovered that their supposedly docile female workforces could only be pushed so far.
In April 1919, for example, some 8,000 operators walked off the job at the New England Telephone Company, all but shutting down phone service in Maine, Massachusetts, New Hampshire, Rhode Island and Vermont. Five days later, the company met their demands for higher wages and the right to bargain collectively.
The New England strikers may have been inspired by the more than 200 female telephone operators (out of 7,000 who applied) who’d served heroically in the First World War. The Signal Corps Female Telephone Operators Unit, informally known as the “Hello Girls,” began serving overseas in March 1918. Their mission was to facilitate communications between American, British and French troops on the Western front, serving not only as operators but often as translators.
The Hello Girls, along with women serving as nurses, ambulance drivers and in other jobs crucial to the war effort, are credited with helping President Woodrow Wilson drop his objection to women’s suffrage and endorse it in a 1918 speech to Congress. “We have made partners of the women in this war…” Wilson said. “Shall we admit them only to a partnership of suffering and sacrifice and toil and not to a partnership of privilege and right?”
The End of the Line?
With the coming of the 1930s, technology that allowed telephone users simply to dial another phone without the aid of an operator had become widespread. Phone companies took advantage of the moment to slash their workforces, and thousands of operators lost their jobs. By 1940, there were fewer than 200,000 in all.
In 2021, the Bureau of Labor Statistics reported a total of just 5,000 workers it classifies as “telephone operators” plus another 69,900 categorized as “switchboard operators including answering service.” And it expects more than 20 percent of those jobs to disappear by 2029.
Athens developed a system in which every free Athenian man had a vote in the Assembly.
In the late 6th century B.C., the Greek city-state of Athens began to lay the foundations for a new kind of political system. This demokratia, as it became known, was a direct democracy that gave political power to free male Athenian citizens rather than a ruling aristocratic class or dictator, which had largely been the norm in Athens for several hundred years before.
Athens’ demokratia, which lasted until 338 B.C., is one of the earliest known examples of democracy. And although recent scholarship has complicated the Eurocentric view that it was the first democracy, this ancient political system was extremely influential in the Mediterranean region. It inspired similar political systems in other Greek city-states and influenced the ancient Roman Republic.
The last tyrannos, or tyrant, to rule Athens was Hippias, who fled the city when Sparta invaded in 510 B.C. Two or three years later, an Athenian aristocrat named Cleisthenes helped introduce democratic reforms. Over the next several decades, subsequent reforms expanded this political system while also narrowing the definition of who counted as an Athenian citizen.
What was Cleisthenes’ motivation for initiating these changes? Unfortunately, “we don’t have any good contemporary historical Athenian sources that tell us what’s going on,” says Paul Cartledge, a classics professor at the University of Cambridge. After the 514 B.C. assassination of Hippias’ brother, Cleisthenes may have sensed there was growing public support for a system in which the city-state was not governed by an elite ruling class.
“Cleisthenes, I think probably partly for his own personal self-promotion, put himself forward as champion of the majority view, which was that we must have some form of popular, ‘people’ regime,” Cartledge says.
To participate in the demokratia, a person had to be free, male and Athenian. At the beginning of the democratic period, Athenian men had to have an Athenian father and a free mother. By the mid-5th century B.C., Athens changed the law so that only men with Athenian fathers and mothers could claim citizenship. Because there were no birth certificates (or DNA tests) to prove parentage, a young Athenian man’s political life began when his father introduced him at their local demos, or political unit, swearing that the young man was his son and bringing witnesses to attest to it, Cartledge says.
The Athenian democracy was direct, rather than representative, meaning that Athenian men themselves made up the Assembly. Because there were no population censuses, we don’t know exactly how many Athenian men there were in the 5th century B.C., but historians have commonly estimated the number to be around 30,000. Of those, around 5,000 might regularly attend Assembly meetings. In addition, Athenian men served on juries and were annually selected by lot to serve on the Council of 500.
There were other government positions that were in theory open to all Athenian men, although wealth and location played a large role in whether a man could take on a full-time government job or even make it to the Assembly to vote in the first place. Still, there were some positions that were only open to elites: the treasurers were always wealthy (ostensibly because wealthy men knew how to handle finances), and the 10 generals who occupied the top government office were always elite, well-known men.
Political Citizenship Remained Narrow
And then, of course, there were all the other people in Athens who were completely cut off from political participation.
Assuming that there were about 30,000 Athenian men when the city-state developed its democracy, historians estimate there were probably about 90,000 other people living in Athens. A sizable portion of these people would have been non-Athenians who were enslaved (by law, Athenians couldn’t enslave other Athenians). Others were “resident aliens” who were free and lived in Athens but didn’t meet the requirements for Athenian citizenship. The rest were Athenian women and children, neither of whom could join the Assembly.
Although these groups never gained the same political rights as Athenian men, there was some debate about whether they should be able to, says Josiah Ober, a classics professor at Stanford University.
“We know that the question of ‘could women be political beings?’ was debated,” he says. In 391 B.C., the Greek playwright Aristophanes wrote a comedy, Assemblywomen, in which women take over Athens’ government. “It’s meant to be funny in some ways, but there’s a serious thought behind it,” he says. Although Aristotle thought women weren’t psychologically fit for politics, Ober notes that Aristotle’s teacher, Plato, wrote in The Republic (circa 375 B.C.) that an ideal political system would include both women and men.
In addition, “there were moves several times in Athenian crisis history to…free large numbers of slaves to make them citizens, or at least make them resident aliens, on the argument that [Athens] needed more people who were full participants in the war effort,” Ober says. However, “these tended to get defeated.”
Athens’ democratic period also coincided with the city-state’s tightening of its control over what was originally a voluntary alliance of Greek city-states, but had now become an Athenian empire. The city-states had their own governments, some of which were influenced by Athens’ democratic system, but didn’t have any political power in Athens’ demokratia.
Athens’ democracy officially ended in 338 B.C., when Macedonia defeated the city-state in battle. One of the Athenian democracy’s major legacies was its influence on the Roman Republic, which lasted until 27 B.C. The Roman Republic took the idea of direct democracy and amended it to create a representative democracy—a form of government that Europeans and European colonists became interested in several centuries later.
The last game, a 24-0 win by the Super Bowl champion Pittsburgh Steelers in 1976, was played in a ‘surreal’ deluge.
From 1934 to 1976, the NFL’s preseason tradition included the Chicago Charities College All-Star Game, which pitted college stars against the league champion. The final game in the series, played mostly at Soldier Field in Chicago, was forgettable: a 24-0 victory by the Pittsburgh Steelers in a deluge.
As the torrents fell in that final game, many fans stormed the field, splashing and sliding on turf that resembled a lake. TV broadcaster Frank Gifford called it a “carnival,” and the game was mercifully ended in the third quarter. “Surreal” was how the Pittsburgh Post-Gazette described the finish.
But the series, conceived by a newspaper sports editor, was popular, often drawing more than 70,000 fans for a game. Attendance at the 1947 game was 105,840. (The game wasn’t played in 1974 because of the NFL players’ strike.)
“It was a fascinating series of games,“ says Jon Kendle, the director of archives and football information at the Pro Football Hall of Fame. “It’s something that a lot of people don’t know about, for as long as it took place. And it’s something that will really never happen again. The way the NFL is structured now, there’s just too much at stake for all parties involved.”
Sports Editor Arch Ward Founds Game in 1933
Times were different when Chicago Tribune sports editor Arch Ward came up with the all-star game idea in 1933. College football was king then. In 1926, 110,000 fans attended the Army-Navy game at Soldier Field—the formal dedication of the stadium. In the early 1930s, the NFL needed the all-star game.
Ward was as much promoter as he was reporter. He worked in public relations for Notre Dame football during two of legendary coach Knute Rockne’s unbeaten seasons, started the Golden Gloves boxing tournament, and in 1933 suggested Major League Baseball hold a midseason exhibition between the stars of the American and National leagues. The Midsummer Classic continues to this day.
In consultation with Chicago city leaders and George Halas of the Chicago Bears, Ward came up with a similar idea for football, pitting college all-stars against the NFL champions. That kind of game was not unusual in those days; in 1939, there were nine games between college players and NFL teams. Ward’s coup was getting the NFL to agree to allow the best players who had just left college to play the champions.
Ward decided proceeds from the game would be shared by Chicago-area charities. Thus began one of the greatest charitable efforts in sports history.
“Being a member of the College All-Stars was competition that as a kid I dreamed of,” says Pro Football Hall of Fame receiver Paul Warfield, who played in the game as an all-star and with the Cleveland Browns.
A panel of 30 sportswriters chose the first all-star team. The attendance was more impressive than the result, as 79,432 watched a scoreless tie on August 31, 1934. Ward’s column on the game dealt with a technological improvement: lights at Soldier Field. “The giant audience was able to follow with facility the details of line play,” he wrote.
The 1935 All-Stars’ roster included a Michigan player named Gerald Ford, who would become the nation’s 38th president. Attendance topped 100,000 at the 1942, 1947 and 1948 games. Games in 1943 and 1944 were moved to the nearby Northwestern campus in Evanston, Ill., to avoid a large gathering near downtown Chicago, considered a potential enemy target during World War II.
Jackie Robinson Plays in 1941 Game
In 1941, UCLA star Jackie Robinson, who would break Major League Baseball’s color barrier in 1947 with the Brooklyn Dodgers, scored a touchdown for the all-stars. College teams were integrated well before the NFL re-integrated in 1946.
In 42 games, the all-stars won nine and tied twice. Sammy Baugh guided the 1937 college team to its first win, 6-0 over the Green Bay Packers. Green Bay also lost in 1963, a game Vince Lombardi called his most embarrassing loss. The Packers were the first and last NFL team to lose to the all-stars.
Kendle recalls Pro Football Hall of Famer Dave Robinson, the Packers’ first-round draft pick in early December 1962, talking about being a part of the college team that beat Green Bay in 1963. “…the college players were in the locker room hooting and hollering and all of a sudden the trainer from the Packers walks in and yells out, ‘Robinson, Coach Lombardi said to get you. You’re a Packer now. Pack up your things,’” Kendle says.
Robinson sheepishly walked to the near-silent Green Bay locker room and found a corner to sit down. He told Kendle he could feel the eyes of Packers veterans piercing through him.
NFL Player’s Injury Prompts Rethink of Series
At the 1947 game, the Bears’ locker room was cooled on a 91-degree day by a newfangled amenity: air conditioning. In 1973, the Miami Dolphins, undefeated the previous season, crossed midfield only three times against the collegians. Miami coach Don Shula replaced starting quarterback Bob Griese with Earl Morrall to secure a 14-3 win.
Warfield played for the collegians in the 1964 game after he was a first-round pick of the Cleveland Browns. He had an outstanding rookie season, but he suffered a broken collarbone the next preseason when playing against the collegians.
“I was diving for the ball for a pass that was slightly overthrown,” Warfield says. “I came down on my elbow and almost instantaneously the defender fell on top of me.” He needed two surgeries to repair the injury and was limited to one game during the 1965 regular season.
That injury fostered quiet murmurings about the wisdom of the game. Ultimately, it didn’t make sense for a team’s best draft picks to miss training camp for two weeks to work for an all-star team. As NFL training and systems improved, the game became more one-sided. The Super Bowl champions won the final 12 games.
The 1976 deluge was perhaps the final signal that the game had run its course. The College All-Star Game was washed out by a torrent of rain and by the torrent of growth the NFL enjoys to this day.
“It was a nice idea when it started,” says longtime NFL writer Vito Stellino, who covered the final game for the Pittsburgh Post-Gazette. “But this was a combination of a perfect storm and a real storm that was too much to overcome.”
From superhyped decathlete Dave Johnson’s bronze-medal showing to gymnast McKayla Maroney’s slip, here’s when American performances didn’t meet heightened expectations.
Although they were favored to win gold at the Olympics, some Americans failed to even medal. In an especially cruel twist, two sprinters didn’t even make it to the quarterfinals at the 1972 Games in Munich.
1. Overhyped Decathlete Dave Johnson Settles for Bronze at 1992 Barcelona Games
In 1992, American decathletes Dan O’Brien and Dave Johnson starred in a $30 million Reebok ad campaign that made them sensations. But neither earned gold at the Olympics in Spain. O’Brien, the decathlon world champion, failed to make the U.S. Olympic team. Just before the Games, gold-medal favorite Johnson suffered a painful stress fracture in his right foot, an injury he kept secret lest he give competitors a psychological edge. “It hurt so much I could hardly walk on it,” Johnson said. He settled for the bronze.
2. Scheduling Mix-up Costs U.S. Sprinters at 1972 Munich Games
American sprinter Eddie Hart was heavily favored to win the 100-meter dash. Teammate Rey Robinson also was expected to earn a medal. But neither made it to the start line of the quarterfinals because of a scheduling snafu. U.S. track coach Stan Wright, working off a dated schedule, gave them the wrong start time. “I don’t know if I really understood what pain was, but that day I found out,” Hart recalled. “That hurt. There was no recourse, no second chance, no appeal.”
3. Mary Decker’s Fall at 1984 Los Angeles Games
In one of the more iconic photos in Olympic history, American middle-distance runner Mary Decker cried in pain after colliding with South African Zola Budd halfway through the women’s 3,000-meter final. The heavily favored Decker, who had lost a chance to medal during the U.S.-led boycott of the 1980 Moscow Olympics, led most of the race. But then she tangled with the barefooted Budd, fell and hurt her hip, and did not finish. Budd, competing for Great Britain, finished seventh. Decker’s response when Budd attempted to apologize after the race: “Don’t bother.”
4. Jim Ryun’s Protests Rejected at 1972 Munich Games
Ryun, the world-record holder in the mile, was favored in the 1,500-meter run after winning silver at the Olympics four years earlier. But he fell during his first qualifying heat after colliding with another runner. Spiked in both ankles during the race, he filed a protest, saying he was fouled, but it was rejected. “I felt I was running a very smart race from the standpoint of not getting in traffic,” he told reporters, “and the next thing I knew I was on the ground trying to recover my senses …”
5. Marion Jones Disappoints at 2004 Athens Games
At the 2000 Summer Olympics in Sydney, Australia, sprinter Jones earned three gold medals and two bronze, making her the first woman to medal five times in a single Games. She was expected to medal in Greece, too. But facing doping allegations, Jones placed fifth in the long jump, her only individual event, and her 4 X 100-meter relay team fumbled the baton in the finals and failed to place. In 2007, Jones admitted to using performance-enhancing drugs and was stripped of her Olympic medals.
6. NBA Stars Falter at 2004 Athens Games
With a roster that included veteran NBA stars Tim Duncan, Allen Iverson, Stephon Marbury and rookies LeBron James, Carmelo Anthony and Dwyane Wade, the U.S. men’s basketball team seemed like a good bet for the gold medal. But the Americans finished preliminary play with a 3-2 record, losing to Puerto Rico and Lithuania. The United States edged No. 1 seed Spain in the quarterfinals but lost 89-81 in the semifinal to eventual champion Argentina. The Americans settled for the bronze, prompting this headline in the Los Angeles Times: “U.S. Had Lots of Stars but No Real Team.”
7. Ryan Lochte Caught in a Lie at 2016 Rio de Janeiro Games
Lochte, one of the more decorated U.S. Olympic swimmers with 12 medals, was suspended for 10 months by USA Swimming during the 2016 Games after he and three teammates admitted lying about being robbed at gunpoint at a Rio gas station. Witnesses and surveillance video showed the intoxicated swimmers had vandalized a restroom, leading security guards to draw their weapons. Before the incident, Lochte was part of the gold-medal-winning 4 X 200-meter relay team; afterward, he finished fifth in the 200-meter individual medley—an event in which he was expected to medal. Charges made by Brazilian authorities against Lochte were eventually dropped.
8. Lolo Jones Stumbles in 100-meter Hurdles at 2008 Beijing Games
With just two hurdles left to go, the heavily favored Jones stumbled and finished seventh. Teammate Dawn Harper won the event, and Jones wept. “It’s the hurdles. You have to get over all 10 and if you can’t, you’re not meant to be the champion,” she told the New York Times. Four years later, at the Summer Olympics in London, she failed to earn a medal. At the 2014 Winter Olympics in Sochi, Russia, Jones competed for a U.S. bobsled team that finished 11th.
9. A Women’s Soccer Shocker at 2016 Rio de Janeiro Games
The Americans, the World Cup champions and three-time defending Olympic champs, were heavily favored. Their star-studded lineup included Alex Morgan, Carli Lloyd, Hope Solo and Megan Rapinoe. But a shocking loss to Sweden in the quarterfinals in a penalty shootout eliminated the Americans—the earliest exit for a U.S. team in six Olympics. Solo, who stalled for time before the final shootout kick by switching out her goalie gloves, stirred controversy afterward, calling the Swedes “a bunch of cowards” and stating that “unfortunately the better team didn’t win.”
10. Gymnast McKayla Maroney Slips at 2012 London Games
Heavily favored to win gold in the vault, Maroney nailed her first attempt, but the 16-year-old slipped during her second, landing on her rear and settling for silver behind Sandra Izbasa of Romania. Maroney’s disappointment showed on the podium, as she made a face with pursed lips that quickly became a viral meme with the saying “McKayla is not impressed.” During a visit to the White House with her teammates in the fall following the Games, Maroney posed with President Barack Obama—each made “The Face.”
In 1871, the Wisconsin town of Peshtigo burned to the ground, killing up to 2,500. But due to another event at the time, many have never heard about the disaster.
On the night of October 8, 1871, women snatched their children from their beds, men formed ad hoc fire brigades, and the terrified residents of Peshtigo, Wisconsin fled what would become the deadliest wildfire in American history. So why did the Peshtigo wildfire fade from national memory?
The story starts in a booming logging town surrounded by dense forests. The seemingly endless trees in close range of Lake Michigan sparked a brisk trade in logging that attracted immigrants from all over Europe, beginning in the 1840s. Thanks to its prime location near Chicago—the world’s largest lumber trade market at the time—Peshtigo prospered, felling trees for a rapidly expanding country that needed timber for its houses and new cities.
But Peshtigo’s trees proved to be its downfall.
The confluence of events that led to the devastating blaze started with “a low rumbling noise, like the distant approach of a train,” witnesses to the chaos later recalled. Soon, it became clear the town itself was being consumed by flames. Before townspeople had a chance to react, it was already too late. Survivors described a cyclone-like firestorm—a whirlwind that consumed everything around it.
The conditions were so extreme that people wondered whether the fire had been sparked by a comet (that theory has never been proven). A staggering 1.2 million acres—roughly the size of the state of Connecticut—burned that night.
Building after building ignited, and many burned before anyone could find their way out. Those who did make it to the river watched helplessly as their entire town burned to the ground. Cows and horses rushed into the river, too, creating a scene of anguish and chaos. Some who ran to the river drowned or died of hypothermia.
Those who made it to the next morning found only “a bleak, desolate prairie, the very location of the streets almost a matter of doubt.” A newspaper reporter wrote that “no vestige of human habitation remained, and the steaming, freezing, wretched group, crazed by their unutterable terror and despair…could but vaguely recognize one another in the murky light of day.”
The summer of 1871 was one of the driest on record. A 20th-century reconstruction conducted by the National Weather Service showed that after a long period of higher-than-usual temperatures and drought, a low-pressure front with cooler temperatures produced winds across the region. This whipped smaller fires into a giant conflagration.
Hundred-mile-per-hour winds stoked the fire even more, with cool air fanning the flames and causing a gigantic column of hot air to rise. This produced even more wind—a vicious cycle that turned a routine wildfire into an inferno.
Peshtigo’s logging industry was partially to blame for the disaster. In an era before responsible forest management practices, loggers simply stripped the land without regard for potential fire hazards they created. They dumped refuse from logging operations in large piles of tinder that became perfect fuel for the October 8 fire. And railroad operations cleared land using small fires, leaving piles of leftover wood behind them.
The town itself was a tinderbox waiting to ignite. Most of its structures were made of wood, as were its sidewalks. Even the streets were paved in wood chips.
Weather was the match that turned those dangerous conditions into an unprecedented fire. Smaller wildfires had raged in the area for days, but on the night of the 8th, winds whipped up and the flames reached Peshtigo. Between 500 and 800 people died in Peshtigo—half the town’s population—and between 1,200 and 2,400 people died in the region through northeastern Wisconsin and Upper Michigan. However, since the records of most of the communities ravaged by fire burned, too, it wasn’t possible to identify or count all the victims.
But something else happened the night of October 8—another fire, fueled by the same conditions, in nearby Chicago. The Great Chicago Fire left 100,000 people homeless, destroyed over 17,000 wooden structures and killed 300. Though it wasn’t as severe as the Peshtigo fire, the big city blaze dominated headlines and history books.
While the Wisconsin fire was overshadowed by the Chicago fire, it is still studied by forest managers and firefighters, who use it as an example of bad forestry practices and the power of unpredictable wildfires.
Another group hasn’t forgotten the fire, either: the residents of Peshtigo. The town was rebuilt after the fire and placed the remains of over 300 of its residents—many too charred to identify as men or women—in a mass grave.