Wednesday, April 17, 2013

History of the Contemporary Period, in 7 Paragraphs

Last month I published a short book called Bridge to the Future, which is available in electronic form on Amazon and Barnes and Noble. One of the book's chapters provides strategies for teaching and understanding the full expanse of world history. Since we are living it now, and since most global history teachers are in the midst of teaching it, I decided I would post my account of the "contemporary" period (the historical period beginning somewhere between 1900 and 1914 and lasting until the present).

I hope this excerpt proves interesting and useful to you. Please post a comment with your thoughts!

As the world moved into the 20th century, the escalation of standing armies, advances in industrial weaponry and transportation, and competition for prestige and colonies poised Europe, Japan and the United States for total war. Imperialist arrangements and the consequences of the Great War altered political, social and economic conditions in Europe and opened the door to revolutions in Mexico, China, and Russia. Following a brief period of high growth during the 1920s, the world economy collapsed in a depression. This economic shock facilitated the rise of dictatorships in Spain, Italy and Germany, while Japanese aggression in the Pacific heightened tension among the major powers. Genocidal campaigns by the Germans and Japanese resulted in mass civilian casualties. The Holocaust in Europe resulted in the murder of more than six million Jews and millions of members of other groups that the Nazis targeted. WWII lasted more than six years and ended in an Allied victory, leaving the USA and the USSR as the sole superpowers vying for supremacy in a world clouded by fear of nuclear annihilation.
In the late 1940s, the competition between capitalist and communist frameworks for industrial development, and between democratic and dictatorial modes of political participation, became the dominant narrative of both local and international politics. New international bodies such as the UN, World Bank and IMF became forums for political posturing and economic intervention. The Nuremberg trials, which put Nazi leaders on trial for war crimes, helped establish a new precedent for international governance, and the military alliances of NATO and the Warsaw Pact drew a line between divergent political-economic models. While the industrialized world split between communist and capitalist camps, over 55 countries declared neutrality in the Cold War, and a wave of newly independent nations, freed from colonial rule, had to position themselves relative to these two poles.
The US and USSR intervened militarily and economically in affairs across the world, exercising a new kind of imperialism in the post-WWII era. At the same time, various conditions conspired to spark major democratization movements. While Latin America and Southeast Asia experienced brutal dictatorships from the 1950s through the 1980s, in part due to superpower sponsorship, by the 1990s most of these oppressive regimes had been replaced by more democratic cultures and forms of government. The independence movements in Africa and Asia succeeded in pushing out their European colonizers. This often left a power vacuum as the European empires, weakened by war, withdrew their imperial reach, along with investment, skilled bureaucrats, and legal structures. Newly independent countries faced an uphill climb to develop their economies so they could compete globally, and pressure from outside interventions and impoverished domestic populations frequently destabilized fragile governments. In the worst cases, brutal civil wars ravaged countries for decades, and genocides occurred in Cambodia, Bangladesh and Rwanda (to name only those with the most devastating casualty counts). Such conflicts, at one time fueled by superpower posturing and intervention, took on a different shape—some ending, others persisting—after the collapse of the Soviet Union in 1991.
New technologies, longer life expectancy, maturing democratic cultures and evolving economic systems all contributed to major changes in local and global culture, with changing gender roles representing one of the most important cultural shifts of the 20th century. Improved educational and employment opportunities for women delegitimized patriarchal systems and empowered women to go further, with women earning positions of power and influence in countries across the world. Nevertheless, vestigial biases against women and the challenges women face with the responsibilities of motherhood contribute to a persistent imbalance in gender representation at the higher levels of government and industry.
As the world reintegrated economically starting in the 1960s, the "Asian Tigers" of Singapore, South Korea, Taiwan and Hong Kong led an industrial boom in East Asia. A development model that involved incubating local industry and subsidizing high-tech exports allowed these countries to make phenomenal growth gains. By contrast, China lagged in growth under Mao, as the Communist regime, established in 1949, initiated a series of collectivization efforts ironically called "The Great Leap Forward." After Mao's death in 1976, the country began to open its economy to international trade, and in the later decades of the 20th century China followed elements of the Asian Tiger model, setting the stage to become the largest economy in the world early in the 21st century. This shift of power to the East, away from Europe and, to a lesser extent, away from the United States, has changed the political dynamics of the new multi-polar world of the 21st century.
The power of oil draws particular attention to the Middle East, where the young nation-state of Israel, formed in 1948, would become a lightning rod for revolutionaries, nationalists, radicals and operatives around the world. Israel's victory in the 1967 war with its Arab neighbors, and the subsequent Israeli occupation of surrounding territories, resulted in Palestinian displacement and deep resentment. At the same time, terrorist attacks and anti-Semitic declarations by Arab and Iranian groups have fueled an ongoing sense of Israeli insecurity. The Iranian revolution of 1979 produced an Islamic state in oil-rich Persia, and the rise of the demagogue Saddam Hussein led to three major wars in the region, two of which involved significant US military involvement.
In 2012, the world population passed seven billion. The UN projects that the population will continue to grow until sometime mid-century, when it will plateau somewhere above nine billion. While close to three billion people at the time of this writing live on less than two dollars per day, the fastest growing group of people will live a middle-class life in their home country. To sustain the levels of consumption these classes aspire to enjoy, we would need resources equivalent to four planet Earths.[1] Emissions from the burning of fossil fuels and massive deforestation have driven the level of carbon dioxide in the atmosphere well beyond the "safe" limit of 350 parts per million. As a result, rising sea levels and extreme weather events, including storms, floods and droughts, threaten social-political stability, agriculture, and biodiversity. We are living through a mass extinction event, the poisoning of oceans, land and air, and a scarcity of basic resources needed for human survival and happiness. To meet the need for ecological conservation and equitable human prosperity, calls for systemic reform emanate from books, films, political campaigns and public debates. Unfortunately, there is widespread disagreement about the nature of this reform. Among a range of possibilities, technological innovation, cultural transformation, and economic restructuring represent the three main areas of debate over how the human race will achieve sustainability in the age geologists have newly dubbed "the Anthropocene."



[1] http://www.popsci.com/environment/article/2012-10/daily-infographic-if-everyone-lived-american-how-many-earths-would-we-need

Thursday, April 4, 2013

Vouchers and the Community School

Free market thinking suggests that the solution to poor school quality is school choice and competition. Align incentives, and schools will be forced to develop and promote their competitive advantage to earn students and, therefore, their right to exist and, perhaps, grow.

Democratic, humanitarian thinking suggests the solution to poor school quality requires investing in our public schools--not "throwing money" at them, but really investing the time, energy, thought and resources to identify their needs, address them, and by extension, address the needs of the community.

I posit today that we don't need to choose. In fact, both camps are right.

What if we provided vouchers not to a few, but to all students? These vouchers could range from $500 to $15,000, depending on how far we want to go, but essentially they would be designed to give students and parents choice in where to pursue a vast array of academic and extracurricular learning. These vouchers could permit enrollment in a full-time college prep program, an after-school music program, an outdoors adventure summer camp, or a specialized engineering course. With these vouchers, private providers, regulated by an accreditation board, compliance agencies or private ratings agencies, but most importantly by the market, would compete for the business of students seeking the very best across an array of interests. Wouldn't that help produce a crop of new, excellent programs? Wouldn't that give new opportunities to kids?

Well, sure. You really can't argue that it wouldn't produce some great new programs and give some amount of choice. What you can argue, and the evidence bears it out, is that if vouchers replace public schools, they end up gutting schools as the center of a community and leaving behind many students and families who are not "educated consumers." That's the cost of vouchers, and why, in this thought experiment, I propose vouchers as only one half of the solution. Now for the other half:

Imagine that the vouchers previously discussed were used merely to supplement investments in strong, community-based schools. These schools would provide critical functions--counselors and academic advisors would support all local students in their academic and career planning, as well as their social and personal health and wellness concerns. In the lower grades, mandatory and excellent literacy and math classes would give young children the foundational skills to advance to higher grades, regardless of what specialization they might choose down the line. Classes in civics and government would be taught for middle school, high school and adult students, because spreading this knowledge is a public good, and we should guarantee it for the well-being of all members of the community.

The community school would house a medical clinic, apprenticeship classes, a library, a media center, tutoring and babysitting. It might have computer labs where students could do distance learning provided by institutions from around the world, and it might have an auditorium where visitors and presenters could educate large audiences.

The community school would provide the home for local sports teams and clubs, to help build local pride and relationships between neighbors. The community school might grant diplomas, or it might simply facilitate students earning diplomas elsewhere. Either way, the community school would serve primarily to support the children and families of its local community, preparing them to contribute to the social good and to achieve their dreams.

Imagine a school system where learning doesn't need to occur in the local school, but where it absolutely can. That seems like a system that leverages the best of American freedom and democracy, and provides the best model to ensure these national treasures persist for generations into the future.

Tuesday, April 2, 2013

Grad School and Career Planning

Should I go to grad school? Do I need the credentials? The network? What do I need to learn for the rest of my career? Would I be better off teaching it to myself with free and inexpensive resources, or through a degree-granting program? Do only the top-10 schools justify the opportunity cost of losing out on two years of more work experience? What if the program is free? What if it's part-time? What if it's only one year? What if I just traveled the world for a year: wouldn't that be a better education? What if I start my own business or non-profit: wouldn't that be a better education?

These are the questions that my peers and I ask ourselves all the time. Two years ago I was pretty sold on at least applying to MBA programs; however, once I started running my own business, it seemed like everything I needed to learn I was learning as I went. Not only that, but grad school is expensive! Well, not all of the programs out there, but the fact is that most grad programs involve assuming five to six figures of debt.

Right now I have friends who are at top business schools, who are in or have completed a variety of graduate programs, or who are considering whether or not to go. There seems to be no consensus on whether it is worth it, though the sagest advice always seems to be along the lines of "do what's best for you." What's interesting, and to many stressful, is that the economy seems to present so many unknowns now. What will be most valued in the labor market of the future? Do we all need technical skills? Creative skills? Management skills? Work experience? World-class networks? Passion?

I'm curious where others come down on this.

-Joe


Monday, April 1, 2013

Trivia, Information and Meaningful Learning


I find it interesting how much consensus is apparent in America's education discourse today. Much is said about how 21st century learning is all about creativity, critical thinking, curiosity and character. Everyone from Barack Obama on down has a growing interest in "STEM" (science, technology, engineering, math). While many private sector people are still excited about using data to personalize instruction, few people anywhere endorse the current regime of standardized testing. Finally, with the availability of flipped-classroom software, iPads, Kindles, Google and video games, everyone seems to want to do away with the textbook.

Yet it seems we still have a fundamentally unresolved issue. That is, what information do students need to learn, and how should they learn it? 

It's easy to say, as Tony Wagner does in Thomas Friedman's op-ed this weekend, that most of the information we teach students in school they will "never use," or that they can easily look it up online if and when they need it. But when you look more closely at the discrete nuggets of information we teach, it starts to look less trivial. Do we really think students should not learn basic information about American history or biology?

Indeed, the original decision to include information in the curriculum was independent of the core objectives of the oft-cursed industrial model of education. In fact, teaching information is based on a fundamental understanding about learning: information provides the schema for analyzing claims, creating ideas, and expressing oneself. Students don't need to learn all the information out there, but there is a minimal level of knowledge necessary for intellectual reasoning and core skill competency.

We should think twice before we launch a crusade to eliminate all the content from the curriculum. In fact, it may not matter so much what content we teach, so long as we (a) teach facts, (b) teach enough of them, (c) teach them effectively, and (d) don't let information instruction overwhelm the teaching of critical thinking, literacy and creativity.

Enter a new concept: multi-layered learning. In multi-layered learning, student experiences draw from content and skill instruction simultaneously. Good teachers have been doing this forever, but intentionally structuring pedagogy to couple skills and information offers greater promise for driving achievement in both. Multi-layered learning takes what would otherwise be trivia and makes it information relevant for application.

Two models are particularly effective in delivering multi-layered learning: game-based and project-based learning. Both models provide some level of structure with elements of choice. Both have a fundamental orientation to information while calling upon the learner to do something. Both are highly interactive, involve individual and group learning, and connect discrete skills and knowledge to larger learning objectives. Even mini-games that emphasize drill and practice are an important part of preparing students for 21st century challenges--these games can be a fun and effective way to teach foundational information without using class time, making learning in school all the more rigorous and meaningful.

Let's forget the idea that education will ever be easy. It won't. There is no panacea, short-cut or simple answer. Information, skills, and understandings all matter. If we can do more to build our instructional practice around multi-layered learning, we will have a better chance of engaging students in their learning and providing them all they need to succeed in the future.