We already knew that Barry Goldwater was a man of vision, but who until now recognized the clarity with which he managed to foresee, fifty-three years in advance, the election results of April 26, 2016?
Around 1970, Alexander Grothendieck, the greatest of all modern mathematicians and arguably the greatest mathematician of all time, announced — at the age of 42 — the official end of his research career. Another great mathematician once told me that he thought he knew why. Following two decades of discoveries and insights that, one after the other, stunned the mathematical world, Grothendieck had, for the first time, achieved an insight so unexpected and so consequential that he himself was stunned. Grothendieck had discovered his own mortality.
I am told that just a few hours ago, his vision proved accurate. But the notion of Grothendieck as a mortal seems hard to swallow. He dominated pure mathematics not just through the force of his ideas — ideas that seemed eons ahead of everyone else’s — but through the force of his personality. When, around 1960, he announced his audacious plan to solve the notoriously difficult Weil conjectures by first rewriting the foundations of geometry, dozens of superb mathematicians put the rest of their careers on hold to do their parts. The project’s final page count, including the twelve volumes known as SGA (Séminaire de Géométrie Algébrique) and the eight known as EGA (Éléments de Géométrie Algébrique), approached 10,000 pages. The force and clarity of Grothendieck’s unique vision scream forth from nearly every one of those pages, demanding that the reader see the mathematical world in a new and completely original way — a perspective that has proved not just compelling, but unspeakably powerful.
In Grothendieck, modesty would have been ridiculous, and he was never ridiculous. Here, in his own words — words that ring utterly true — is Grothendieck’s own assessment of how he stood apart (translated from French by Roy Lisker):
Most mathematicians take refuge within a specific conceptual framework, in a “Universe” which seemingly has been fixed for all time – basically the one they encountered “ready-made” at the time when they did their studies. They may be compared to the heirs of a beautiful and capacious mansion in which all the installations and interior decorating have already been done, with its living-rooms, its kitchens, its studios, its cookery and cutlery, with everything, in short, one needs to make or cook whatever one wishes. How this mansion has been constructed, laboriously over generations, and how and why this or that tool has been invented (as opposed to others which were not), why the rooms are disposed in just this fashion and not another – these are the kinds of questions which the heirs don’t dream of asking. It’s their “Universe”, it’s been given once and for all! It impresses one by virtue of its greatness (even though one rarely makes the tour of all the rooms), yet at the same time by its familiarity, and, above all, with its immutability.
Fifty years ago this Labor Day weekend, the presidential campaign of 1964 got underway in earnest. It is often said that Barry Goldwater “lost the election but won the Republican party” or even “lost the election but won the future” by nudging the center of either the party or the country several notches to the right.
I don’t see it. Where is the contemporary mainstream politician — Republican or otherwise — who would repeal the 1964 Civil Rights Act, or at least those provisions (Titles II and VII) that authorize Federal regulators to override private business decisions about whom to serve and whom to hire? Where is the contemporary mainstream politician who would sell the Tennessee Valley Authority? Or end all agricultural supports? If Goldwaterism is in fact ascendant, then how did entitlement spending, as a percentage of GDP, manage to grow for most of the past 20 years — even though Republicans controlled the House of Representatives for 16 of those 20? For that matter, how is it that after all those years of Republican control, the National Endowments for the Arts and the Humanities — two of the more noxious weeds to arise from the soil of the Goldwater defeat — continue to thrive?
In 1706, the British astronomer John Machin calculated π to 100 digits (by hand of course). His trick was to notice that π = 16A – 4B, where A and B are given by the arctangent series

A = arctan(1/5) = 1/5 – 1/(3·5³) + 1/(5·5⁵) – 1/(7·5⁷) + …

B = arctan(1/239) = 1/239 – 1/(3·239³) + 1/(5·239⁵) – …
If you’re computing by hand, this is an excellent discovery, because the series for A involves a lot of divisions by 5, which are a lot easier to calculate than, say, divisions by 7, and the series for B converges very fast, so just a few terms buys you a whole lot of accuracy. (Try using, say, just the first four terms of A and just the first term of B to see what I mean.)
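If you’d rather not do the experiment by hand, here is a minimal Python sketch; it takes for granted the standard identification A = arctan(1/5) and B = arctan(1/239) from Machin’s identity, and sums exactly the terms suggested above:

```python
import math

def arctan_series(x, terms):
    """Partial sum of the Gregory series arctan(x) = x - x^3/3 + x^5/5 - ..."""
    return sum((-1) ** k * x ** (2 * k + 1) / (2 * k + 1) for k in range(terms))

# Four terms of A = arctan(1/5), and a single term of B = arctan(1/239)
A = arctan_series(1 / 5, 4)
B = arctan_series(1 / 239, 1)
approx = 16 * A - 4 * B

print(approx)   # already agrees with math.pi to five decimal places
```

With just five easy terms, the error is already on the order of one part in a million, which is the point of Machin’s choice of series.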
Machin’s 100 digits were a substantial improvement over the 72 digits obtained just a little earlier by Abraham Sharp, using the far less efficient series

π = √12 · (1 – 1/(3·3) + 1/(5·3²) – 1/(7·3³) + …)
In 1729, a Frenchman named de Lagny got all the way to 127 digits, but, in the words of the scientist/engineer/philosopher/historian Petr Beckmann (of whom more later), de Lagny “sweated these digits out by Sharp’s series, and so exhibited more computational stamina than mathematical wits.”
Machin’s methods were ingenious, but no more ingenious — and certainly no more striking — than John Wallis’s 1655 discovery that

π/2 = (2/1) · (2/3) · (4/3) · (4/5) · (6/5) · (6/7) · (8/7) · (8/9) · …
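Striking, yes; useful for computation, no. A few lines of Python (a sketch, obviously nothing Wallis could have run) make plain just how slowly the product crawls toward π:

```python
import math

def wallis(pairs):
    """Partial Wallis product 2 * (2/1)(2/3)(4/3)(4/5)... over `pairs` pairs of factors."""
    p = 1.0
    for k in range(1, pairs + 1):
        p *= (2 * k) / (2 * k - 1) * (2 * k) / (2 * k + 1)
    return 2 * p

print(wallis(10))     # about 3.07 -- still far from pi
print(wallis(10000))  # ten thousand pairs of factors buy only about four correct digits
```

Compare that with the Machin computation above: four terms of one series and one of another already beat ten thousand Wallis factors.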
I’ve been reading about the passage of the 1957 Civil Rights Bill, which, in its original form, banned racial segregation in theaters, restaurants and hotels (though by the time it was passed, almost all of the content had been stripped out). There’s a part of this history that makes no sense to me and I’m wondering if someone can explain it.
Remember first that this was at a time when several southern states enforced laws that mandated segregation in theaters, restaurants and hotels.
It was also at a time when, as I understand it, the outcome of the legislative battle was very much in doubt, so that each side feared the worst and was eager to compromise. Supporters weren’t sure they could beat a filibuster, which meant the bill might never even come to a vote. Opponents feared a filibuster might be beaten and the bill passed without amendments.
Lyndon Johnson, the majority leader of the Senate, wanted above all else to avoid a major fight, and was eager to facilitate any compromise both sides could agree on. He floated several compromise proposals and actively solicited others, from legislators, attorneys, and everyone else he could think of.
In Master of the Senate, the third volume of his monumental multi-volume biography of Lyndon Johnson, Robert Caro describes a vast number of compromises that failed before the final, successful compromise passed.
Now here’s what astonishes me: Here you had all these lawyers and politicians, desperately trying to find a creative compromise — and yet, as far as I can tell, nobody ever proposed the compromise that seems (to me) to be obvious. The Republicans and northerners wanted mandatory integration. The southerners wanted to maintain mandatory segregation. The obvious compromise, I should think, would be to have neither — the northerners agree not to pass a federal law, and the southerners agree to repeal some state laws.
Fifty years ago today at 1:30 PM eastern standard time, a minor tragedy took the life of President John F. Kennedy. A little over an hour later, a major tragedy ensued, as Lyndon B. Johnson was sworn in to replace him.
If there is such a thing as evil, it lived in Lyndon Johnson, whose life was one long obsession with the accumulation and exercise of power. His biographer Robert Caro relates how, in college, Johnson engineered, by intimidation and deceit, a takeover of the Student Council partly so that he could, apparently for sport, force the removal of talented and hardworking students from the editorships of campus publications, replacing them with non-entities and reveling in the tragic aftermath as ousted incumbents (who had received small but urgently needed stipends for their work) were forced through financial hardship to drop out of school.
It was downhill from there. As President, Johnson presided over a misbegotten war in Southeast Asia — a whirlpool of destruction fed with lives and treasure — and an equally misbegotten “War on Poverty” that too often became a war on economic freedom, the only effective antidote to poverty the world has ever known.
The War on Poverty might have been more accurately termed a war to consolidate Johnson’s influence. Poor rural families got grants and loans to expand their farms — provided they stayed on the farms, where Johnson needed their votes. Job training, educational programs, small business loans — all were available as long as you lived your life in a way that suited Lyndon Johnson’s purposes.
As I work my way through Robert Caro’s monumental four-volume biography of Lyndon Johnson, I’m repeatedly astonished by Caro’s gargantuan appetite for detail on the one hand, and his near total incuriosity about the big picture on the other.
Case in point: We get almost 40 densely packed pages on the appropriations (eventually totaling $25 million) for the Marshall Ford Dam and another 30 or so on what a dramatic change the dam (and the electricity it brought) made in the lives of Texas Hill Country farm families. But unless I overlooked it, we’re never told how many of those farm families were affected — and are thus left with absolutely no basis for thinking about whether this dam was a good investment.
At another point, we’re told of a $1.8 million expenditure to bring electric lines to 2892 Hill Country farms. (This is, of course, over and above the cost of the dam, which presumably benefited many more than just these 2892.) This time, thankfully, we are at least told how many families are affected. But since the expenditure comes to $622 per family in a time and place when one dollar a day was a good wage, where there was no running water and very little communication with the outside world, and where the soil was bad and getting worse, this raises the question of whether that $622 could have been better spent relocating that family to a better place. (All the more so if we top off that $622 with the family’s pro rata share of the dam cost.) Caro never even acknowledges the question, pausing simply to celebrate the benefits of electricity, which, he seems to imply, were great and therefore (!) justified the expenditures.
Well, there are two ways you can get the benefits of electricity. The electricity can come to you, or you can go to it. Sometimes one way is better; sometimes the other. When conditions are as Caro describes them — with the land essentially worn out, starvation rampant, and everyone too poor to get a fresh start in, say, Austin — there’s a pretty good likelihood that the guy who could have helped you move, but instead spends a bundle to bring you an electric line, has something other than your best interests at heart.
I’m not far enough along to be sure of this, but after a little peeking ahead, it’s beginning to look like this is how Caro’s going to treat the Great Society also — hundreds of pages on the details of the legislation, hundreds more on the good it (allegedly) did, and not a single inquiry into how much more good somebody could have done with expenditures of that magnitude.
And then there’s this passage, which I feel compelled to assure you I am not making up:
Last week was not the first time the United States was transfixed by an act of terror. In 1964, three civil rights workers in Philadelphia, Mississippi were (quoting Wikipedia) “threatened, intimidated, beaten, shot, and buried by members of the Mississippi White Knights of the Ku Klux Klan, the Neshoba County Sheriff’s Office and the Philadelphia Police Department.” It took 44 days and an FBI-initiated act of torture to locate their bodies.
The FBI, in a nod to the theory of comparative advantage, subcontracted the torture to the Mafia, more specifically to the Colombo family associate Gregory Scarpa. Here’s the story as relayed by Selwyn Raab, the New York Times investigative reporter who covered the Mafia for 25 years:
[Scarpa] went down to Mississippi for the FBI and kidnapped a KKK guy agents were sure was involved in disposing of the bodies. The guy had an appliance store. Scarpa bought a TV and came back to the store to pick it up just as he was closing. The guy helps him carry the TV to his car parked in the back of the store. Scarpa knocks him out with a bop to the head, takes him off to the woods, beats him up, sticks a gun down his throat and says “I’m going to blow your head off”. The KKK guy realized he was Mafia and wasn’t kidding and told him where to look for the bodies.
(Source: Raab’s book Five Families, which is fascinating throughout. Raab says the story has been verified by “former law enforcement officials who asked for anonymity and lawyers who are aware of the circumstances”.)
The moral of the story is that torture sometimes works. Other times it doesn’t, eliciting either no information, or false information, or whatever “information” the victim believes the inquisitor wants to hear. I am almost 100% ignorant, and hence virtually 100% agnostic, about the relative frequency of these outcomes in those cases where the torturer is both skilled in his art and genuinely interested in eliciting the truth. I will be very glad if any educated reader can shed light on this question. I doubt that we’re likely to learn of any controlled experiments, but I’ll settle for sketchy data or even well-chosen anecdotes. Failing that, I’ll settle for plausibility arguments.
I am not one of the public intellectuals who were queried by The Atlantic (link might require subscription) as to which date most changed world history — but on the Internet, you can always spout off without an invitation.
It’s hard to argue with Freeman Dyson, who nominates the day an asteroid wiped out the dinosaurs, clearing the evolutionary path for the likes of you and me.
(Actually, it’s remarkably easy to argue with Freeman Dyson. I know this, having done so over tea in Princeton, many years ago. He made it very easy indeed, despite (or perhaps because of) the fact that he was 100% right and I was 100% wrong.)
At the opposite end of the intellectual spectrum, the standup comedian W. Kamau Bell, after lamenting that there’s no way he can get this right so he might as well punt, nominates the day Michael Jackson first performed the moonwalk on national TV. Unfortunately, his intent to give the most ridiculous possible answer is thwarted by one Neera Tanden of something called the Center for American Progress, who, with an apparently straight face, nominates August 26, 1920 (the day American women gained the right to vote) — an answer that begins by placing 20th century America at the center of the Universe and proceeds downhill from there.
Other 20th-century answers (the assassination of Archduke Ferdinand, Hitler’s invasion of the Soviet Union) are at least more serious, and I think that Anne-Marie Slaughter’s nomination of the still-very-recent-by-historical-standards signing of the Declaration of Independence on July 4, 1776 is even defensible. But then what about the Glorious Revolution of 1688, which arguably laid the political and intellectual groundwork that made the Declaration possible?
In early 1464, with Lancastrian rebellions breaking out all over England, King Edward IV found it prudent to raise an army. He therefore dispatched “commissions of array” to the twenty-two counties of southern England, each charged with rounding up the able-bodied men of the county and turning them into an army. In most cases, the county commission consisted of a half dozen or more men, including one great magnate. But Richard, Edward’s brother, inspired so much trust that he was appointed sole commissioner for nine counties — everything from Shropshire and Warwickshire through Somerset to Devon and Cornwall. Richard, in other words, was solely responsible for levying troops from a quarter of the realm. He was not yet twelve years old.
This makes me believe that my seventeen-year-old stepdaughter has too few chores.
I became a lifelong political convention junkie in 1972, the year that George McGovern secured the nomination with a brilliantly executed ploy that nobody saw coming until it was over, and that even the sainted Walter Cronkite mistakenly reported as a disaster.
I was 18 years old. Most of the Democratic convention was held in the wee hours of the morning, and I went sleepless following the battle on black and white TV, jumping up every few minutes to twirl the dial to another network. All realtime analysis came from the anchormen, and at the crucial moment, the anchormen had no idea what was happening.
I gave a series of four talks last week at Cato University; only the first of them was broadcast by C-SPAN, and you can watch it here. (The title was “The Greatest Story Ever Told”, meaning the story of economic growth.)
Much of this material will look familiar to those who have watched other videos recently posted in this space, but I think it comes together a little better in this one. The remaining lectures contained more in the way of new material, and I’m hoping to be able to post at least some video excerpts in the near future.
There were a lot of fabulous talks at this event by such luminaries as Tom Palmer (here and here) and the extraordinary Robert McDonald, who held the audience in thrall with his gripping three-part series on the history of the American revolution (not, unfortunately, online, even in part).
If you missed it, there’s always next year!
If you study economics, or statistics, or chemistry, or mathematical biology, or thermodynamics, you’re sure to encounter the notion of a Markov chain — a random process whose future depends probabilistically on the present, but not on the past. If you travel through New York City, randomly turning left or right at each corner, then you’re following a Markov process, because the probability that you’ll end up at Carnegie Hall depends on where you are now, not on how you got there.
But even if you work with Markov processes every day, you’re probably unaware of their origins in a dispute about free will, Christianity, and the Law of Large Numbers.
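For readers who like to see a definition in motion, here is a minimal sketch in Python. The two-state “weather” chain and its transition probabilities are invented purely for illustration; the point is that each step consults only the current state:

```python
# A toy Markov chain: the next state depends only on the current one.
# Transition probabilities are made up for illustration.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(dist):
    """Push a probability distribution over states forward one step."""
    out = {s: 0.0 for s in P}
    for s, p in dist.items():
        for t, q in P[s].items():
            out[t] += p * q
    return out

# Iterate from any starting distribution; it settles into the stationary one.
dist = {"sunny": 1.0, "rainy": 0.0}
for _ in range(100):
    dist = step(dist)

print(dist)  # approximately {'sunny': 5/6, 'rainy': 1/6}
```

That convergence to a stationary distribution, regardless of the starting point, is essentially the Law of Large Numbers result that Markov was after.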
Roughly 1500 died on the Titanic; according to Wikipedia, it would have cost about $16,000 to equip her with additional lifeboats sufficient to save them all. Call it $10 per life saved. The price level today is roughly 22 times what it was in 1912, so in today’s terms that’s $220 per life.
Now, if I were boarding a ship for a luxury cruise, and were offered the chance to pay an additional $220 for a guaranteed seat on a lifeboat in the event of a sinking, I’m quite sure I’d take a pass — and I’m quite sure so would virtually all of my fellow passengers. So if the Titanic had been designed to cross the ocean once and then spend the rest of her days in a museum, it would have been insane to equip her with extra lifeboats. But of course if the Titanic had been designed to cross the ocean once and then spend the rest of her days in a museum, it would have been insane to build her in the first place. So that’s not the right calculation.
The right calculation accounts for the fact that a single lifeboat provides security to passengers on multiple voyages. How many voyages? Well, the Titanic was intended to make the round trip between Europe and America every three weeks; that’s two voyages per three-week period. I’m not sure how long the sailing season was, but we know it was underway by mid-April (and perhaps earlier; it’s often mentioned that if the Titanic had been ready earlier she would have sailed earlier) so (assuming sailing conditions are roughly symmetric around the solstice) it must have lasted till at least mid-August. That’s time for five round trips at a minimum, and I’m guessing this is a quite conservative assumption.
If a lifeboat lasts a year, then, it does its job at least ten times. If it lasts five years (which is, I suspect, another quite conservative assumption), it does its job fifty times. Now we’re in the vicinity of $4 per passenger (and of course much less if my assumptions are indeed quite conservative).
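The arithmetic above is easy to check; here it is as a few lines of Python. The voyage count and lifeboat lifetime are, as the text says, conservative guesses rather than data:

```python
# Back-of-envelope check of the lifeboat arithmetic.
lifeboat_cost_1912 = 16_000      # dollars, per Wikipedia, for boats enough for all
lives = 1_500                    # roughly the number who died
inflation = 22                   # rough 1912 -> today price-level ratio

per_life_1912 = lifeboat_cost_1912 / lives   # ~ $10.67; the text rounds to $10
per_life_today = 10 * inflation              # $220 in today's terms

voyages_per_year = 10            # five round trips = ten crossings (a guess)
years = 5                        # assumed lifeboat lifetime (another guess)
per_passenger = per_life_today / (voyages_per_year * years)

print(per_passenger)  # 4.4 -- "in the vicinity of $4"
```

Every halving of either guess doubles the per-passenger figure, so the conclusion is only as conservative as those two assumptions.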
I continue to be bowled over daily by the high quality of the discussion at MathOverflow, and the prominence of many of the frequent participants. But this one was special:
A newbie poster asked for a pointer to a proof of the “de Rham-Weil” theorem. There’s a bit of ambiguity about what theorem this might refer to, but I had a pretty good idea of what the poster meant, so I responded that the earliest reference I know of is in Grothendieck’s 1957 Tohoku paper — which led another poster to ask if this meant de Rham and Weil had had nothing to do with it.
This triggered an appearance from the legendary Roger Godement (had he been lurking all this time?), now aged 91 and one of the last survivors of the extraordinary circle of French mathematicians who rewrote the foundations of topology and geometry in the mid-20th century and changed the look, feel and content of mathematics forever. I tend to think of them as gods and demigods. Godement’s indispensable Théorie des Faisceaux was my constant companion in late graduate school. And now he has emerged from retirement for the express purpose of chastising me:
I’ve been a little swamped lately and my daily blogging has fallen off. Until things get back to normal, I think I’ll fill the breach by reprinting a few of my old columns from Slate. Today’s entry is on “Why Jews Don’t Farm”.
In the 1890s, my Eastern European Jewish ancestors emigrated to an American Jewish farming community in Woodbine, N.J., where the millionaire philanthropist Baron de Hirsch provided land, tools, and training at one of the nation’s first agricultural colleges. But within a generation, the family had settled in Philadelphia where they became accountants, tailors, merchants, and eventually, lawyers and college professors.
De Hirsch had a vision of American Jews achieving economic liberation by working the land. If he’d had a better sense of history, he would have built not an agricultural college but a medical school, because for well over a millennium prior to the settlement of Woodbine, Jews had not been farmers—not in Palestine, not in the Muslim empire, not in Western Europe, not in Eastern Europe, not anywhere in the world.
You have to go back almost 2,000 years to find a time when Jews, like virtually every other identifiable group, were primarily an agricultural people. Around A.D. 200, Jews began to quit the land. By the seventh century, Jews had left their farms in large numbers to become craftsmen, artisans, merchants, and moneylenders—the only group to have given up on agriculture. Jewish participation in farming fell to about 10 percent through most of the world; even in Palestine it was only about 25 percent. Everyone else stayed on the farms.
(Even in the modern state of Israel, where agriculture has been an important component of the economy, it’s been a peculiarly capital-intensive form of agriculture, one that employed well under a quarter of the population at the height of the Kibbutz movement, and less than 3 percent of the population today.)
The obvious question is: Why? Why did Jews and only Jews take up urban occupations, and why did it happen so dramatically throughout the world? Two economic historians—Maristella Botticini (of Boston University and Universitá di Torino) and Zvi Eckstein (of Tel Aviv University and the University of Minnesota)—have recently been giving that question a lot of thought.
Today is the 200th birthday of Evariste Galois, who did not live to celebrate his 21st, but found time in his short 20 years to develop a circle of ideas that permeate modern mathematics. We know of these ideas because Galois spent the night of May 29, 1832 scribbling them furiously in a letter to a friend, in advance of the fatal duel he would fight the following morning. According to the great mathematician Hermann Weyl, “This letter, if judged by the novelty and profundity of ideas it contains, is perhaps the most substantial piece of writing in the whole literature of mankind.”
(If this were a less serious post, I might suggest that this famous letter was the first example of a Galois Correspondence.)
Now, two centuries later, every first year graduate student in mathematics spends a semester studying Galois Theory, and many devote their subsequent careers to its extensions and applications. Many of the greatest achievements of modern mathematics (for example, the solution to Fermat’s Last Theorem) are, at their core, elucidations of Galois’s 200-year-old insight.
Well, I was pretty young in 1963, probably too young to think about such matters. I remember having little interest in the Beatles, but being very aware that they were something very big. Everyone was aware of that. But unless I am mistaken, pretty much nobody realized that we were witnessing something really big and lasting. More generally, I doubt that anyone at the time had any inkling of the long-term significance of rock ‘n’ roll. We knew it was popular, but we had no idea it would change the world. I’m not sure that in 1963 anyone knew that it was possible for music to change the world.
This led to the more general question: How quickly are great cultural watersheds recognized for what they are? In the few areas I know something about, I think the answer is “usually pretty quickly”. I remember 1910 even less vividly than I remember 1963, but I am pretty sure that it wasn’t long between the appearance of The Love Song of J. Alfred Prufrock and the realization (at least among people who care about this sort of thing) that poetry had changed forever. In mathematics, at least in the past century (and I’m pretty sure for several centuries, or even millennia, before that), major paradigm shifts have generally been recognized very quickly. When a Serre or a Grothendieck upends the mathematical world, the mathematical world quickly knows it’s been upended.
When I review the blessings of my extraordinarily blessed life, this one always appears near the top of my list: I am an adult male who has never been to war. I have always assumed — without thinking about it too hard — that in the historical scheme of things, this is a great privilege, and a great rarity.
Am I right about that? Over the course of human history, what is your estimate of the fraction of males who have reached adulthood without participating in a military conflict?
(Obviously, there’s some fuzziness about what counts as military conflict. I’m thinking here not about the occasional street fighter, but about the guy living in mud and getting shot at for weeks at a time — or things equally dangerous/traumatic/uncomfortable.)
65 years ago today, the world changed. In his magnificent World War II memoir Quartered Safe Out Here, George MacDonald Fraser looks back on what might have been:
I led Nine Section for a time; leading or not, I was part of it. They were my mates, and to them I was bound by ties of duty, loyalty and honor… Could I say, yes, Grandarse or Nick or Forster were expendable, and should have died rather than the victims of Hiroshima? No, never. And the same goes for every Indian, American, Australian, African, Chinese and other soldier whose life was on the line in August, 1945. So [I'd have said]: drop the bomb.
And then I have another thought.
You see, I have a feeling that if—and I know it’s an impossible if—but if, on that sunny August morning, Nine Section had known all that we know now of Hiroshima and Nagasaki, and could have been shown the effect of that bombing, and if some voice from on high had said: “There — that can end the war for you, if you want. But it doesn’t have to happen, the alternative is that the war, as you’ve known it, goes on to a normal victorious conclusion, which may take some time, and if the past is anything to go by, some of you won’t reach the end of the road. Anyway, Malaya’s down that way … it’s up to you”, I think I know what would have happened. They would have cried “Aw, fook that!”, with one voice, and then they would have sat about, snarling, and lapsed into silence, and then someone would have said heavily, “Aye, weel” and got to his feet, and been asked “W’eer th’ ‘ell you gan, then?”, and given no reply, and at last, the rest would have got up, too, gathering their gear with moaning and foul language and ill-tempered harking back to the long dirty bloody miles from the Imphal boxes to the Sittang Bend and the iniquity of having to do it again, slinging their rifles and bickering about who was to go on point, and “Ah’s aboot ‘ed it, me!” and “You, ye bugger, ye’re knackered afower ye start, you!”, and “We’ll a’ git killed!”, and then they would have been moving south. Because that is the kind of men they were.
To my mild surprise, the face that generated the most controversy—in both comments and email—was that of Abraham Lincoln, who was born 201 years ago on this day. Readers pulled no punches. ScottN wrote: “Lincoln is on a different list I have: People Who Caused the Most Unnecessary Deaths.” Peter wrote: “[Lincoln] was a tyrant and a racist to boot.” And the consistently provocative and thoughtful Bob Murphy wrote:
I would love to hear your reasons for including Lincoln. I have the same misgivings as the other commenter above, though I was going to introduce them with levity. (E.g. “I know you like math, Steve, so is that why you included the guy who maximized the wartime deaths of Americans?”)
I replied to Bob (and others) by email, with some sketchy thoughts and a promise to blog about Lincoln sometime on or before his birthday. With the deadline looming, I realize that I have little to add to those sketchy thoughts. So here, with only some minor editing, is the email I sent to Bob Murphy:
Peter Leeson of George Mason University (currently visiting the University of Chicago) offers a new take on the medieval practice of “trial by ordeal”:
“For 400 years the most sophisticated persons in Europe decided difficult criminal cases by asking the defendant to thrust his arm into a cauldron of boiling water and fish out a ring. If his arm was unharmed, he was exonerated. If not, he was convicted.”
According to Leeson, this is less crazy than it sounds: As long as defendants believe (superstitiously) that ordeals yield accurate verdicts, guilty defendants always confess to avoid the ordeal. At the same time innocent defendants always opt for the ordeal—and are always acquitted, provided the priests cheat by (for example) substituting tepid for boiling water, or “sprinkling” a few gallons of cold holy water over the cauldron, or liberally redefining what counts as “unharmed”.
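What Leeson is describing is, in game-theory terms, a separating equilibrium, and its logic is mechanical enough to sketch in a few lines of Python. The payoff numbers below are invented purely for illustration:

```python
# Sketch of the separating-equilibrium logic behind trial by ordeal.
# Penalty values are invented; only their ordering matters.
CONFESS_PENALTY = 0.5   # cost of confessing and taking the known punishment
CONVICT_PENALTY = 1.0   # cost of being convicted by the ordeal

def chooses_ordeal(guilty, believes_ordeal_works=True):
    """A superstitious defendant opts for the ordeal iff he expects acquittal."""
    if believes_ordeal_works:
        expected_cost = CONVICT_PENALTY if guilty else 0.0  # God reveals the truth
    else:
        expected_cost = CONVICT_PENALTY  # a skeptic just expects to be boiled
    return expected_cost < CONFESS_PENALTY

def verdict(took_ordeal, priest_rigs=True):
    """The priest, knowing only the innocent volunteer, rigs the ordeal to acquit."""
    return "acquitted" if took_ordeal and priest_rigs else "confessed/convicted"

print(chooses_ordeal(guilty=True))   # False: the guilty confess
print(chooses_ordeal(guilty=False))  # True: the innocent take the (rigged) ordeal
```

The fragile step, as Leeson notes, is the belief: once defendants stop believing the ordeal works, the guilty volunteer too, and the mechanism stops sorting them.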
One hundred years ago today, Red Cloud, the last of the great Sioux warrior chiefs, died in peace on the Pine Ridge reservation at the age of 89. He was preceded in death by the way of life he fought so valiantly to preserve.
If there is such a thing as a just war, Red Cloud’s War of 1866 was more just than most. Black Kettle‘s village of peaceful Cheyenne had been recently and wantonly slaughtered by the Colorado militia under Colonel John Chivington at Sand Creek. (The survivors of this unhappy band would meet their deaths a few years later at Washita Creek, at the equally murderous hands of General George Armstrong Custer.) Against this background, the chiefs had been betrayed at Fort Laramie, where the government had summoned them to negotiate for the right to build roads through Indian territory. With the conference still in session and no agreement in sight, Colonel Henry Carrington and a force of 700 men arrived to build the Bozeman Trail.