Wednesday, June 18, 2008

The Rise, Fall And Return of Pluralism

By Peter F. Drucker, a professor of social science and management at Claremont Graduate University and a former president of the Society for the History of Technology. He is author, most recently, of "Management Challenges for the 21st Century," just out from HarperBusiness.

The history of society in the West during the last millennium can--without much oversimplification--be summed up in one phrase: The Rise, Fall and Rise of Pluralism.

By the year 1000 the West--that is, Europe north of the Mediterranean and west of Greek Orthodoxy--had become a startlingly new and distinct civilization and society, much later dubbed feudalism. At its core was the world's first, and all but invincible, fighting machine: the heavily armored knight fighting on horseback. What made possible fighting on horseback, and with it the armored knight, was the stirrup, an invention that had originated in Central Asia sometime around the year 600. The entire Old World had accepted the stirrup long before 1000; everybody riding a horse anywhere in the Old World rode with a stirrup.

But every other Old World civilization--Islam, India, China, Japan--rejected what the stirrup made possible: fighting on horseback. And the reason these civilizations rejected it, despite its tremendous military superiority, was that the armored knight on horseback had to be an autonomous power center beyond the control of central government. To support a single one of these fighting machines--the knight and his three to five horses and their attendants; the five or more squires (knights in training) necessitated by the profession's high casualty rate; the unspeakably expensive armor--required the economic output of 100 peasant families, that is, of some 500 people, about 50 times as many as were needed to support the best-equipped professional foot soldier, such as a Roman Legionnaire or a Japanese Samurai.

The knight exercised full political, economic and social control over the entire knightly enterprise, the fief. This, in short order, induced every other unit in medieval Western society--secular or religious--to become an autonomous power center, paying lip service to a central authority such as the pope or a king but yielding little else, certainly not taxes. These separate power centers included barons and counts, bishops and the enormously wealthy monasteries, free cities and craft guilds and, a few decades later, the early universities and countless trading monopolies.

By 1066, when William the Conqueror's victory brought feudalism to England, the West had become totally pluralist. And every group tried constantly to gain more autonomy and more power: political and social control of its members and of access to the privileges membership conferred, its own judiciary, its own fighting force, the right to coin its own money and so on. By 1200 these "special interests" had all but taken over. Every one of them pursued only its goals and was concerned only with its own aggrandizement, wealth and power. No one was concerned with the common good; and the capacity to make societywide policy was all but gone.

The reaction began in the 13th century in the religious sphere, when--feebly at first--the papacy tried, at two councils in Lyon, France, to reassert control over bishops and monasteries. It finally established that control at the Council of Trent in mid-16th century, by which time the pope and the Catholic Church had lost both England and Northern Europe to Protestantism. In the secular sphere, the counterattack against pluralism began 100 years later. The Long Bow--a Welsh invention perfected by the English--had by 1350 destroyed the knight's superiority on the battlefield. A few years later the cannon--adapting to military uses the powder the Chinese had invented for their fireworks--brought down the hitherto impregnable knight's castle.

From then on, for more than 500 years, Western history is the history of the advance of the national state as the sovereign, that is as the only power center in society. The process was very slow; the resistance of the entrenched "special interests" was enormous. It was not until 1648, for instance--in the Treaty of Westphalia, which ended Europe's Thirty Years War--that private armies were abolished, with the nation-state acquiring a monopoly on maintaining armies and on fighting wars. But the process was steady. Step by step, pluralist institutions lost their autonomy. By the end of the Napoleonic Wars--or shortly thereafter--the sovereign national state had triumphed everywhere in Europe. Even the clergy in European countries had become civil servants, controlled by the state, paid by the state and subject to the sovereign, whether king or parliament.

The one exception was the United States. Here pluralism survived--the main reason being America's almost unique religious diversity. And even in the U.S., religiously grounded pluralism was deprived of power by the separation of church and state. It is no accident that in sharp contrast to Continental Europe, no denominationally based party or movement has ever attracted more than marginal political support in the U.S.

By the middle of the last century, social and political theorists, including Hegel and the liberal political philosophers of England and America, proclaimed proudly that pluralism was dead beyond redemption. And at that very moment it came back to life. The first organization that had to have substantial power and substantial autonomy was the new business enterprise as it first arose, practically without precedent, between 1860 and 1870. It was followed in rapid order by a horde of other new institutions, scores of them by now, each requiring substantial autonomy and exercising considerable social control: the labor union, the civil service with its lifetime tenure, the hospital, the university. Each of them, like the pluralist institutions of 800 years ago, is a "special interest." Each needs--and fights for--its autonomy.

Not one of them is concerned with the common good. Consider what John L. Lewis, the powerful labor leader, said when FDR asked him to call off a coal miners' strike that threatened to cripple the war effort: "The president of the United States is paid to look after the interests of the nation; I am paid to look after the interest of the coal miners." That is only an especially blunt version of what the leaders of every one of today's "special interests" believe--and what their constituents pay them for. As happened 800 years ago, this new pluralism threatens to destroy the capacity to make policy--and with it social cohesion altogether--in all developed countries.

But there is one essential difference between today's social pluralism and that of 800 years ago. Then, the pluralist institutions--knights in armor, free cities, merchant guilds or "exempt" bishoprics--were based on property and power. Today's autonomous organization--business enterprise, labor union, university, hospital--is based on function. It derives its capacity to perform squarely from its narrow focus on its single function. The one major attempt to restore the power monopoly of the sovereign state, Stalin's Russia, collapsed primarily because none of its institutions, being deprived of the needed autonomy, could or did function--not even, it seems, the military, let alone businesses or hospitals.

Only yesterday most of the tasks today's organizations discharge were supposed to be done by the family. The family educated its members. It took care of the old and the sick. It found jobs for members who needed them. And not one of these jobs was actually done, as even the most cursory look at 19th-century family letters or family histories shows. These tasks can be accomplished only by a truly autonomous institution, independent of either the community or the state.

The challenge of the next millennium, or rather of the next century (we won't have a thousand years), is to preserve the autonomy of our institutions--and in some cases, like transnational business, autonomy over and beyond national sovereignties--while at the same time restoring the unity of the polity that we have all but lost, at least in peacetime. We can only hope this can be done--but so far no one yet knows how to do it. We do know that it will require something that is even less precedented than today's pluralism: the willingness and ability of each of today's institutions to maintain the focus on the narrow and specific function that gives them the capacity to perform, and yet the willingness and ability to work together and with political authority for the common good.

This is the enormous challenge the second millennium in the developed countries is bequeathing the third millennium.

The Rules of Executive Class

By PETER F. DRUCKER
June 1, 2004

An effective executive does not need to be a leader in the sense that the term is now most commonly used. Harry Truman did not have one ounce of charisma, for example, yet he was among the most effective chief executives in U.S. history. Some of the best business and nonprofit CEOs I've worked with over a 65-year consulting career were not stereotypical leaders. They ranged from extroverted to nearly reclusive, from easygoing to controlling, from generous to parsimonious. What made them all effective is that they followed the same eight practices:
• Ask "What needs to be done?" Failure to ask this question will render even the ablest executive ineffectual. Jack Welch realized that what needed to be done at General Electric when he took over as chief executive was not the overseas expansion he wanted to launch. It was getting rid of GE businesses that -- no matter how profitable -- could not be No. 1 or No. 2 in their industries.

• Ask "What is right for the enterprise?" Note that the question is not what's right for the shareholders, or the executives, or the employees. Those are all important constituencies who need to support a decision, or acquiesce in it, if the choice is to be effective. But if a decision isn't right for the enterprise as a whole, in the long run it won't be right for any of the individual stakeholders.

• Develop action plans. The action plan is a statement of intentions rather than a commitment. It should be revised often, because every success creates new opportunities. So does every failure. Napoleon allegedly said that no successful battle ever followed its plan. Yet Napoleon also planned every one of his battles, far more meticulously than any earlier general had done. Without an action plan, the executive becomes a prisoner of events.

• Take responsibility for decisions. This is particularly important when it comes to hiring or promoting people. If, after a person is promoted, the decision has not had the desired results, effective executives don't conclude that the person has failed to perform. They conclude, instead, that they themselves made a mistake. In a well-managed enterprise, it is understood that people who fail in a new job, especially after a promotion, may not be the ones to blame.

• Take responsibility for communicating. Effective executives make sure that both their action plans and their information needs are understood. Specifically, this means that they share their plans with and ask for comments from all their colleagues -- superiors, subordinates, and peers. At the same time, they let each person know what information they'll need to get the job done. The information flow from subordinate to boss is usually what gets the most attention. But executives need to pay equal attention to peers' and superiors' information needs.

• Focus on opportunities, not problems. In most companies, the first page of the monthly management report lists key problems. It's far wiser to list opportunities on the first page and leave problems for the second page. Unless there is a true catastrophe, problems are not discussed in management meetings until opportunities have been analyzed and properly dealt with.

• Make meetings productive. Every study of the executive workday has found that even junior executives and professionals are with other people -- that is, in a meeting of some sort -- more than half of every business day. Making a meeting productive takes a good deal of self-discipline. It requires that executives determine what kind of meeting is appropriate and then stick to that format. It's also necessary to terminate the meeting as soon as its specific purpose has been accomplished. Good executives don't raise another matter for discussion. They sum up and adjourn.

• Think and say "We." Effective executives know that they have ultimate responsibility, which can be neither shared nor delegated. But they have authority only because they have the trust of the organization. This means that they think of the needs and the opportunities of the organization before they think of their own needs and opportunities. This one may sound simple. It isn't, but it needs to be strictly observed.


I'm going to throw in one final, bonus practice. This one's so important that I'll elevate it to a rule: Listen first, speak last.

Mr. Drucker is a professor of social science and management at the Peter F. Drucker and Masatoshi Ito Graduate School of Management at Claremont Graduate University. This commentary is adapted from his article "What Makes an Effective Executive" in the June issue of the Harvard Business Review.

The American CEO

By PETER F. DRUCKER
December 30, 2004

CEOs have ultimate responsibility for the work of everybody else in their institution. But they also have work of their own -- and the study of management has so far paid little attention to it. It is the same work, whether the organization is a business enterprise, a nonprofit, a church, a school or university, a government agency; and whether it is large or small, world-wide or purely local. And it is work only CEOs can do, but also work which CEOs must do.

In any organization, regardless of its mission, the CEO is the link between the Inside, i.e., "the organization," and the Outside -- society, the economy, technology, markets, customers, the media, public opinion. Inside, there are only costs. Results are only on the outside. Indeed the modern organization (beginning with the Jesuit Order in 1536) was expressly created to have results on the outside, that is, to make a difference in its society or its economy.

The CEO's Tasks

To define the meaningful Outside of the organization is the CEO's first task. The definition is anything but easy, let alone obvious. For a particular bank, for instance, is the meaningful Outside the local market for commercial loans? Is it the national market for mutual funds? Or is it major industrial companies and their short-term credit needs? All three of these "outsides" deal with money and credit. And one cannot tell from the bank's published accounts, e.g., its balance sheet, on which of these "outsides" it concentrates. Each of them is a different business and requires a different organization, different people, different competencies and different definitions of results. Even the very biggest bank is unlikely to be a leader in all these "outsides." For which of these to concentrate on is a highly risky decision and one very hard to change or reverse. Only the CEO can make it. But also the CEO must make it. It is the first task of the CEO.

The second specific task of the CEO is to think through what information regarding the Outside is meaningful and needed for the organization, and then to work on getting it in usable form. Organized information has grown tremendously in the last hundred years. But the growth has been mainly in Inside information, e.g., in accounting. The computer has further accentuated this inside focus. As regards the Outside there has been an enormous growth in data -- beginning with Herbert Hoover in the 1920s (to whose work as secretary of commerce we largely owe the data on GNP, on productivity, and on standard of living). But few CEOs, whether in business, in nonprofits, or in government agencies have yet organized these data into systematic information for their own work.

One example: Every major maker of branded consumer goods knows that few things are as important as the values and the behavior of that great majority of consumers who are not buyers of the company's products, and especially information on major changes in the non-customers' values and habits. The data are largely available. But few consumer-goods manufacturers have so far converted them into organized information on which to base their decisions (one well-publicized exception is the Shell Petroleum group of companies). Again it is primarily the CEO who needs this information and whose work it is to organize getting it.

The definition of the institution's meaningful Outside, and of the information it needs, makes it possible to answer the key questions: "What is our business? What should it be? What should it not be?" The answers to these questions establish the boundaries within which an institution operates. And they are the foundation for the specific work of the CEO. Particularly:
• They enable the CEO to decide what results are meaningful for the institution.


This is particularly important, particularly critical, and particularly risky for institutions that lack the discipline of the "bottom line," that is, for non-businesses. And non-businesses constitute the great majority of organizations in every developed society. But even for businesses, the bottom line is not by itself adequate as a definition of results -- the same bottom line may have very differing meanings according to how an institution defines "meaningful results." To decide what results a given bottom line represents is a major job of the executive. It is not based on "facts" -- there are no facts about the future. It is not made well by intuition. It is a judgment. Again, only the CEO can make this judgment, but also the CEO must make it.

This judgment is so risky that all pre-modern economies tried to avoid making it. In fact, the Modern Enterprise -- the one major institutional innovation of the Modern Economy -- was in large part created as the systematic risk-taker and risk-sharer, thereby enabling the individual strictly to limit the personal risk of investing in future expectations.

By thus making possible these time decisions in very large numbers and on an enormous scale, the Enterprise can be said to be the one invention that created the Modern Economy -- far more so than any other invention, whether material or conceptual. With the invention of the Enterprise the Executive came into being as a distinct role and function, with one of his or her major tasks being the making of the decision between short-term yields and deferred expectations. Making this decision requires a good deal of very hard work on the part of the CEO. (Both Machiavelli's "Prince" and Shakespeare's "The Merchant of Venice," two Renaissance masterpieces the background of which is the emergence of the modern economy, are built around the challenge of this decision).
• The answers to the question "What is our business? And what should it be?" enable CEOs to decide what is meaningful information for the business and for themselves.


This too is a high-risk decision. That U.S. business executives, for instance in the '50s and '60s, decided (in many cases quite deliberately) that what was going on in Japan was not particularly meaningful information for them and their companies, explains in large part why the Japanese export push caught them so unawares and unprepared.

It is information about the Outside that needs the most work. For far too many institutions -- and not only businesses -- define Outside in large part as their direct competitors. Toy makers tend to define the Outside as their toy-maker competitors; a hospital as the two competing hospitals in the same suburb, and so on. But the most meaningful competitors for the toy maker are not other toy makers but other claimants on potential customers' disposable dollars. The most meaningful information about the toy maker's Outside is therefore what value the toy presents to the potential buyer. (Customer Research, in other words, may be more important than market research -- but also far more difficult).
• The CEO has to decide the priorities.


In any but a dying organization there are always far more tasks than there are available resources. But results are obtained only by concentration of resources, especially by concentration of the scarcest and most valuable resource, people with proven performance capacity.

There is constant pressure on every CEO to do a little bit of everything. That makes everybody happy but guarantees that there are no results. The CEO's most critical job -- also the CEO's most difficult job -- is to say "No." To do so is not just a matter of will power. It requires an inordinate amount of study and work -- work which only the CEO can do but again work which the CEO must do.
• The CEO places people into key positions. This, in the last analysis, determines the performance capacity of the institution.


Every organization says, "We have better people." But this is, of course, impossible. Once an organization grows beyond a handful of people, it is subject to statistics' most ruthless law: the law of the great number, which dictates that there is only "normal distribution." What differentiates organizations is whether they can make common people perform uncommon things -- and that depends primarily on whether people are being placed where their strengths can perform or whether, as is only too common, they are being placed for the absence of weakness. And nothing requires as much hard work as "people decisions." The only thing that requires even more time (and even more work) than putting people into a job is unmaking a wrong people decision. And again, critical people decisions only the CEO can make.

No Real Counterpart

The CEO is an American invention -- designed first by Alexander Hamilton in the Constitution in the earliest years of the Republic, and then transferred into the private sector in the form of Hamilton's own Bank of New York and of the Second Bank of the United States in Philadelphia. There is no real counterpart to the CEO in the management and organization of any other country. The German "Sprecher des Vorstands," the French "Administrateur Delegue," the British "Chairman," or the Japanese "President" are all quite different in their powers and in the limitations thereon.

The American CEO is, however, fast becoming a major U.S. export. Tony Blair and Gerhard Schroeder are trying to make over their countries' top political job in the image of the U.S. president. In business the CEO model is being adopted even faster all over the world, e.g., in the recent restructuring of Europe's largest industrial complex, the German Siemens Group. And what makes the American CEO unique is that he or she has distinct and specific work.

Mr. Drucker is the author, most recently, of "The Daily Drucker," just out from HarperBusiness. This is the first in a three-part series on management.

Drucker on Everything

Books on management are published by the hundreds each year, but for our money you can skip everything else and simply re-read Peter F. Drucker, who was the Shakespeare of the genre and who died Friday at his California home at age 95.

For 30 years, the immigrant from Austria graced these pages as a contributor, usually under the heading, "Drucker on Management." That was a typical piece of modesty, because the more accurate description of his work would have been Drucker on Everything. He was a student of human behavior in all its ways and means, and through his many books and articles he sought to explain how managers could get the most from themselves, their colleagues and their institutions.

The business world would surely be a better place if every manager were required to read his 1966 classic, "The Effective Executive." It includes pearls on time management, especially the necessity of carving out chunks of time for thinking and decisions, on how to manage a meeting, and on the importance of focusing not on what any job requires but on what every individual can contribute.

His achievements include anticipating the rise of the modern corporation, and then dissecting its strengths and weaknesses; predicting the challenge that Japan would pose to American business; describing the rise and importance of "the knowledge worker"; and defending profit-making as central to the business enterprise at a time, in the middle of the last century, when that was not a widely held proposition.

His final piece for us, "The American CEO," was published last December 30, and was billed as the first in a three-part series. He was too ill to complete the other parts, though it is a tribute to his stature and wisdom that readers kept sending us notes asking when the other articles would be published. We excerpt from some of his Journal articles nearby. R.I.P.

Peter Drucker is making a posthumous comeback.


It isn't happening in the U.S., where the Austrian-born management scholar spent much of his career until his death in 2005, at age 95. While most of Mr. Drucker's 39 books remain in print, they aren't fixtures on American best-seller lists, as they were a generation ago.





In China, however, Mr. Drucker is the man of the moment. In the past few years, devotees have created 14 Drucker academies, in Beijing, Shanghai, Xian and other Chinese cities. Their curriculum draws extensively on Mr. Drucker's writings, so thousands of students can quickly grasp the management essentials needed for China's booming economy.

Mr. Drucker's old-school values like integrity and humility play well in China, says Henry To, chief executive of the Drucker academies. Mr. Drucker spent much of his career as a consultant and professor studying big, well-known American companies. Based on their experience, he urged managers to set clear objectives, to value employees and customers, and to define their mission as more than just making a profit.

"When Drucker writes about leadership, he says that integrity must come first," Mr. To observes. "He says leaders need to listen to their employees and be followers, too. That matches our Confucian heritage."

Unlike many American management gurus, Mr. Drucker frequently stretched his precepts into nonprofit and governmental areas such as education and the Red Cross. That panoramic focus turns out to be well-suited for many Asian nations, where state policy and private-sector initiative are knit together more closely than in the U.S.

Mr. Drucker's magnum opus of the 1970s, "Management," was updated earlier this year by Joseph Maciariello, a longtime colleague of Mr. Drucker's at Claremont Graduate University in California. But his prominence has faded in the U.S., in part because his imagery often speaks to a different era.

His books sometimes recount stories from the 1940s, when Mr. Drucker was a consultant to legendary General Motors Chief Executive Alfred Sloan. Mr. Drucker's prime predated Google Inc., private-equity funds and other staples of today's most popular business authors.

At the Drucker academies in China, however, Mr. Drucker's fondness for business history is considered a virtue, not a fault. "I tell students: 'The truth will not be outdated,'" Mr. To says.

With China building up its manufacturing capacity at breakneck speed, Mr. To says, it's probably more useful for Chinese management students to examine U.S. industrial triumphs of past decades, rather than get distracted by the fanfare associated with various postindustrial ventures of today's America.

Mr. Drucker himself laid the groundwork for China's enthusiasm for his teachings, meeting in 2000 with leaders of the nonprofit Bright China Management Foundation to get the Drucker academies started. Last year, 6,000 Chinese managers graduated from the academies, says Bright China's chairman, Ming Lo Shao, adding he expects this year's tally to be 20% higher.

Other Asian countries also are embracing Mr. Drucker's work. Last week, Drucker enthusiasts from around the globe met at the Drucker School of Management in Claremont, Calif. They discussed their efforts, through various Drucker Societies and a university think tank called the Drucker Institute, to spread his ideas.

Some of the most detailed presentations came from boosters in South Korea and Japan. In Korea, chief executives of sizable companies meet periodically in book clubs to discuss Mr. Drucker's work and how it applies to their companies. Japanese devotees publish a journal called Civilization and Management that tries to apply Mr. Drucker's ideas to current-day problems.

By contrast, U.S. attendees at the conference seemed more inclined to look backward. They giggled about Mr. Drucker's ability to outsell "The Joy of Sex" in the 1970s. Former students and colleagues shared memories of their time with him. The phrase "We miss him" was heard repeatedly.

Bob Buford, chairman of the Drucker Institute, voiced concern that American business audiences tend to be faddish, rapidly switching their attention to whatever scholar or commentator seems freshest. That makes it harder to keep Mr. Drucker's work in the public consciousness at home.

Mr. Drucker's writing style -- which mixed anecdotes and precepts in a way that led some fans to describe him as a philosopher -- is out of step with the tastes at many leading business schools, where the preference is for conclusions based on large statistical studies.

In China, however, Mr. Drucker is in no danger of fading away. His boosters there, in addition to running the Drucker academies, have assembled full sets of his translated works and have donated them to major Chinese universities. Their hope is that Chinese students will come to these "Drucker libraries" in decades ahead for inspiration.

Sunday, May 25, 2008

Warren Buffett: U.S. Recession Will Be “Deeper and Last Longer”


BERLIN (Reuters) - The United States is already in a recession and it will be longer as well as deeper than many people expect, U.S. investor Warren Buffett said in an interview published in German magazine Der Spiegel on Saturday.

He said the United States was "already in recession" and added: "Perhaps not in the sense that economists would define it" with two consecutive quarters of negative growth.

"But the people are already feeling the effects," said Buffett, the world's richest man. "It will be deeper and last longer than many think."

But he said that won't stop him from investing in selected companies, adding that he remained interested in well-managed German family-owned companies.

"If the world were falling apart I'd still invest in companies," he said.

Buffett also renewed his criticism of derivatives trading.

"It's not right that hundreds of thousands of jobs are being eliminated, that entire industrial sectors in the real economy are being wiped out by financial bets even though the sectors are actually in good health."

Buffett complained about the lack of effective controls.

"That's the problem," he said. "You can't steer it, you can't regulate it anymore. You can't get the genie back in the bottle."

Tuesday, May 20, 2008

A Brief History of Jeans

On this date in 1873, Levi Strauss and Jacob Davis received a patent for the process of putting rivets in pants, and modern jeans were born. But that’s not the whole story.

We bet you think that jeans started as an American trend, specifically among gold miners in California. That’s not, however, exactly right: the history of jeans actually goes all the way back to eighteenth-century Italy. Genoese sailors of the time wore particularly snappy outfits made from denim; the word “jeans” comes from “Genoa.” For that matter, the word “denim” refers to a type of cotton cloth called “serge de Nîmes,” which literally means “cloth from Nîmes,” a town in the south of France.

What America deserves credit for is popularizing jeans, not inventing them. The first American jeans were made from slightly different fabrics than their European counterparts, but plantation labor eventually made cotton widely available in the States. By the time the Gold Rush started in 1848—not, as the NFL might have you believe, ’49—cotton denim jeans were the standard. But the miners didn’t pick up the trend until 1853, when one Loeb Strauss moved to San Francisco, changed his name to Levi (nobody knows why), and started selling his pants wholesale. (There’s another guy who doesn’t get credit nearly as often, although he deserves it: Jacob Davis, a tailor from Reno, Nevada, was the one who figured out how to put rivets in the corners of pants, and he collaborated with Strauss.) A hit among the miners, the jeans were sturdy enough to handle rough work and repeated washings. Strauss shrewdly capitalized on that fact. In 1886, Levi’s jeans even bore a leather label showing them being pulled between two horses to emphasize how durable they were.

Still, jeans remained the workwear of the rough and tumble West well into the mid-20th century. They started to trickle out to the general public in the 1930s, as Hollywood Westerns started sweeping across movie screens, introducing audiences to macho types sporting jeans as they lassoed cattle, slung guns, and engaged in other cowboyish activities. A decade later, another manly-man archetype picked up the trend: the World War II soldier, who often wore jeans and overalls when off the job. Finally, in the 1950s, teenagers and rebels, with or without causes, realized that jeans would make them look tough, aloof, and hardscrabble – without requiring them to actually do any of the dirty work of cowboys or soldiers. Once James Dean and Marlon Brando donned a couple pairs, there was no stopping the trend.