Anne McCaffrey's Dragons

I've had a passing fascination with McCaffrey's books over the years, even as I never really dabbled in them. (I owned one book, Dragonflight, years ago.) I was always somewhat intimidated by the sheer size and scale of the series, and I was always more interested in SF than I was in fantasy (although I now realize that was a bit misguided). Anne McCaffrey was always an author I was aware of: one of the female authors alongside the Asimovs, Herberts and Heinleins in my high school library.

Yet in recent years, as I've been researching, I've come to realize that McCaffrey occupies an important role in the genre: she's an extremely successful female author, but she also writes (and is marketed) in such a way that she's an excellent gateway into the SF world for a huge range of readers.

Go read Anne McCaffrey's Dragons over on Kirkus Reviews.

Sources:

  • Trillion Year Spree: The History of Science Fiction, Brian Aldiss. Aldiss has some excellent points about McCaffrey's early works in his book, although she's mentioned sparingly.
  • Transformations: The Story of the Science Fiction Magazines from 1950-1970, Mike Ashley. Ashley provides some outstanding quotes and background into how McCaffrey got her start in the genre, and especially how she was aided by John W. Campbell Jr.
  • Gateways to Forever: The Story of the Science Fiction Magazines from 1970-1980, Mike Ashley. This book follows up Transformations, and likewise provides some good information on McCaffrey's work.
  • Partners in Wonder: Women and the Birth of Science Fiction 1926-1965, Eric Davin. This was a particularly good source, providing interesting background information that didn't appear anywhere else; it also helped shape my thinking about how McCaffrey got into writing in the first place and how she viewed her stories.
  • Survey of Science Fiction Literature, vol. 2, Frank Magill. There's an excellent review and overview of Dragonflight in this volume.
  • Dragonholder: The Life and Dreams of Anne McCaffrey, Todd McCaffrey. This was a particularly helpful source, but very poorly laid out and written. It's jumbled, and jumps from point to point, making it difficult to locate the right information.
  • The Routledge Companion to Science Fiction, edited by Mark Bould, Andrew M. Butler, Adam Roberts and Sherryl Vint. This text had some good background information.
  • ISFDB. As always, this is a particularly helpful site for figuring out when and where stories were published.

Looking Far into the Future: Olaf Stapledon

My latest post for the Kirkus Reviews Blog is now online! This time, we look at English author Olaf Stapledon and his legacy.

This wasn't the post I'd intended to write. Originally, this spot had been reserved for an examination of C.S. Lewis and his Out of the Silent Planet trilogy. As this series has progressed, I've been finding a curious evolution of the science fiction genre, something that will continue on. From Mary Shelley to Edgar Allan Poe, to Jules Verne and H.G. Wells, there's a fascinating story of connections: these authors found influences in one another, carrying ideas forward in time, changed somewhat by each author's own sensibilities. Following Wells, we find Olaf Stapledon, who by his own account was influenced by Wells's stories and, in turn, inspired future authors such as Sir Arthur C. Clarke. Lewis, I found, wrote in opposition to the two, and was largely out of place in my plans.

Stapledon was an interesting author, and the scale of his works and the themes behind them set him apart from just about everyone in the field, both at the time and since. Read Looking Far into the Future: Olaf Stapledon over on the Kirkus Reviews Blog!

Here's the sources that I used:

An Olaf Stapledon Reader, Olaf Stapledon, edited by Robert Crossley: This book contains an interesting series of articles on Stapledon and his writing, but of most interest are two letters that Stapledon wrote to famed science fiction author H.G. Wells, in which he talks about how Wells influenced him.

The Olaf Stapledon Online Archive: This site contains a fairly good biography of the author and an overview of some of his works, which provided a good starting point for the biographical elements of this piece.

Last and First Men / Last Men in London, Olaf Stapledon: This omnibus is a book that I picked up on a whim a couple of years ago; I read through Last and First Men. An interesting story, it was of particular use in understanding the scale and scope of Stapledon's efforts - a very different, but highly recommended, novel.

Arthur C. Clarke: The Authorized Biography, by Neil McAleer: This biography of Clarke helped to confirm that Clarke was influenced by Stapledon's works.

Survey of Science Fiction Literature, vols. 3 & 5, Frank Magill: As usual, this is a particularly useful resource for looking up critical interpretations and reviews of Stapledon's works.

The History of Science Fiction, by Adam Roberts: Roberts devotes an entire glowing section to Stapledon's legacy, shedding some light on the author and his influences.

Complicated History

A couple of months ago, I went to the Sullivan Museum and History Center at Norwich University for a talk by one of the history professors, Dr. Steven Sodergren, as part of an exhibit series on the Civil War. His talk was about the specific motivations of individuals on each side of the Civil War, refuting the idea that there was a uniform bloc of support behind either the Union or Confederate governments. Some Southern states, when it came time to vote on splitting from the United States, had only a narrow majority in favor: no more than 55-60% of the population supported the idea, leaving a substantial chunk in opposition.

The idea behind the talk was a sound one, taking on the very nature of taught history: it's not as simple as it's made out to be. History is a difficult topic to convey to a large audience: big, complicated and multi-faceted, and how the field is taught is just as enlightening a subject in its own right. The Civil War was never quite as clear cut when it came to the motivations of the soldiers on the field: according to Sodergren, it was a deeply personal and difficult choice for everyone who took up arms. More recently, a talk on VPR with Vermont historian Howard Coffin noted that enlistment numbers are telling: high initially, support dropped off following the first major battles, when bodies began to return home.

I recently presented a paper at the New England Historical Association, where I talked about Norwich University's efforts during the Battle of the Bulge. My panel's commentator noted that between the papers, there's the high-level view of history, with the strategy and the big decisions, and the ground-level view, with the individual soldiers fighting: my paper bridged the gap, telling the story of the Bulge through the soldiers who fought there, but also showing how their actions played into a much larger story. Their actions were far from singular: they spanned the entire command structure, from a Private First Class to a Major General. In our continued study of Norwich history, my wife and I have found soldiers who enlisted in foreign militaries prior to the United States' entry into the Second World War, while others were drafted.

A recent article in Slate caught my eye: How Space-Age Nostalgia Hobbles Our Future. Contrary to popular belief, public support for space exploration in the 1960s was far from universal. It's an interesting read, presenting a very contrary view to the supposed popularity of the Apollo program during the 1960s and 1970s. Far from the broad popular support that we perceive, the program's public approval rating only hit a majority around the time that Neil Armstrong and Buzz Aldrin landed on the moon, and individual accounts from around the country show a wide range of opinions as to the program's value. Support for the space programs also varied wildly by age group and, undoubtedly, by location as well.

Looking at political records from the time, there's also an important story in how Congress approved the program's funding: the public easily remembers President John F. Kennedy's speech at Rice University, but the reality of actually funding the space program was far more complicated, with competing national priorities. Even Kennedy's speech, while influential, isn't so clear cut: it was crafted in the aftermath of the failed Bay of Pigs Invasion, and was delivered in part to divert attention away from the administration's blunder.

A book that I particularly detest is Victor Davis Hanson's Carnage and Culture: Landmark Battles in the Rise of Western Power, an enormously popular, and reviled, book on the nature of culture and war: he argues that the very nature of democracy makes a standing military inherently stronger, because individual soldiers have a stake in their government and, by extension, their destiny. It's a very appealing, clear-cut assumption, and one that breaks down when one considers the enormous complexity inherent in a democratic nation: no sane person makes the decision to take up arms for their country lightly, and Hanson's text does a disservice to the historical community by simplifying a situation that shouldn't be simplified.

In a lot of ways, this falls under the same public mentality that spawned the Greatest Generation narrative of the Second World War and the Lost Cause line of thinking from the Civil War. Looking even further back into our nation's history, the War for Independence was likewise far from universally supported! Another specific example, from one of my instructors' talks, was the Boston Tea Party: essentially a rebranding, in an age of nostalgia, to smooth over the fact that the 'Destruction of the Tea' was committed by political radicals.

I often wonder, as I hear political reminiscing about the space age, or the Greatest Generation, or Lincoln's efforts, whether people understand that the rosy memories upon which we build the future are really nothing more than a shared fabrication, and why we reject the complicated story for something watered down to the point that it's contrary to the original message.

History is our most wonderful, complicated Mandelbrot set, one that continues to reveal new levels and stories. Dr. Sodergren's talk highlighted a key point in how we approach history: it becomes defined by its major outcomes, as opposed to the actions that led up to them, and increasingly, it feels as though the lessons we could learn are missed, overlooked or simply ignored.

Who knows, though? Maybe we need the simple stories.

Out of the Ashes: How an Irish Episcopal Priest Saved Norwich University

I've sold a new article to the Norwich Record, titled Out of the Ashes: How an Irish Episcopal Priest Saved Norwich University. This was one of the projects I was working on last fall, and shortly after the start of the new year, I submitted my final draft. The research phase was interesting: going through archives and piecing together the life of a rather interesting and multifaceted man who was a central, but forgotten, figure in Norwich University and local Vermont history.

When assigned this project, I was a little skeptical: what exactly were the links between the Episcopal Church and Norwich, and how would something like this be relevant to today's reader and Norwich alum? After reading up on Bourns, it became clear that he still has some interesting things to teach us.

Out of the Ashes: How an Irish Episcopal Priest Saved Norwich University

The year 1866 was a pivotal one for Norwich. In March, a fire destroyed the school’s primary building—the Old South Barracks—and the University’s future lay in jeopardy. The disaster represented the biggest challenge to date in Reverend Edward Bourns’ tenure as president, a career that had shepherded the young school through fifteen years of adversity, including hostilities from the citizens of Norwich and Hanover, crippling debt, and four years of civil war. Yet, under the immensely popular Irishman’s steadfast guidance and vision, the University would not only survive, but thrive.

NO ORDINARY MAN

Reverend Edward Bourns was well-equipped to run a college. A learned man, he not only held the office of president, but served on the faculty, teaching ancient languages and moral sciences. An ordained Episcopal priest, he held religious services on Sundays.

The reverend’s lack of military training in no way hindered his leadership abilities. Described by Adelbert Dewey as “a man of peace by profession, better versed in canon law than cannon balls,” he had nevertheless acquired “the swinging stride of the modern soldier.” An insatiable reader renowned for his “incisive and delicate wit,” it became a saying among the cadets “that no one could enter the doctor’s rooms on the briefest of errands and not depart wiser than he came.” An imposing presence at six foot two, Rev. Bourns was respected by all, and perfectly suited—both as a shrewd administrator and genial leader—to steer Norwich safely through perilous times.

Born October 29, 1801, in Dublin, Ireland, Bourns entered Trinity College in 1823, but put his education on hold to serve as a private tutor, completing his degree a decade later. Ellis’ History of Norwich University describes him as “a man of learning and acumen,” and at Dublin he won numerous book prizes for scholastic achievement.

From Dublin he moved to London, where he engaged his skills as a writer and reviewer, working alternately in the publishing industry and as a teacher. In 1837, he journeyed across the Atlantic to the United States, where he became acquainted with a fellow Irishman, the Reverend William DeLancey, Provost of the University of Pennsylvania. Shortly after, Bourns followed Reverend DeLancey (by then the Bishop of Western New York) to Geneva, where he enrolled at Hobart College, earning his MA and becoming an adjunct classics professor. By 1841, having received his LLD from Hobart, he was ordained Deacon of Geneva’s Trinity College. Four years later, after a short stint as a fully ordained priest, Dr. Bourns resigned his professorship at Hobart and left for Brooklyn, N.Y., where he taught ancient languages for five years.


You can read the full article here.

Everybody’s Going to the Moonbase

During a campaign stop in Florida in advance of the state's Republican primary, former Speaker of the House Newt Gingrich promised the moon and the stars to Florida voters: "By the end of my second term, we will have the first permanent base on the moon, and it will be American."

It's one of the few things I've heard from Gingrich that I've liked: returning to space with the full backing of the United States government. With a real perception that the United States has begun to fall behind other countries in space, and with NASA facing budget cutbacks and the loss of its most visible program, the Space Shuttle, it's a nice thing to hear, especially for those focused on US efforts in space. However, it's also an empty promise on Gingrich's part, designed simply to gain traction against his rival, Mitt Romney, in advance of the debates.

The Florida ‘Space Coast’ relies heavily on the infrastructure built up around NASA's launch facilities: the demise of the Apollo program in the 1970s led to massive layoffs, while the more recent Space Shuttle cancellation has further reduced demand for the highly skilled workforce that the industry requires. It's easy to see why Gingrich would propose such a program in Florida: it means hundreds of thousands of new, high-paying jobs. At the same time, however, it means a complete reversal of his personal philosophy, because it would require massive government programs and spending just to rebuild the space program to the point where reaching the moon, and establishing a logistical system to support it, would be possible. Once established, a permanent habitation on the lunar surface would be an expensive, ongoing effort to build, maintain, supply and staff.

United States space programs have an odd effect on domestic politics: Republicans, traditionally the supporters of limited or restrained government, back such programs because they are heavily tied to defense and national pride, while Democrats typically see the money going off-planet as something that could be used to help solve the numerous problems back on the ground. Gingrich, attempting to fulfill his own fantasies, would never get far with a right-of-center government looking (presumably) to bring down government spending, while the money left over would be fought over by those whose programs are being slashed.

The drive to go to the moon wasn't a whim of the U.S. public: it was the result of a carefully crafted argument for its existence: national security. The rockets that could take people and equipment into space were developed to support intercontinental ballistic missiles, a check against Soviet power growing in Europe and elsewhere in the world. A highly public and dramatic demonstration of the progression of U.S. technology, a space program capable of reaching the moon was a powerful indication of what the country could do. Certainly, if NASA could send people to walk around on the moon, the Soviet Union was well within reach of the U.S. Strategic Air Command and its nuclear arsenal.

NASA's budget began at a relatively small amount in 1958: $89 million ($488 million in 2007 dollars). It steadily grew from 0.1% of the federal budget to 2.29% following President Kennedy's speech at Rice University in 1962, then roughly doubled to 4.41% in 1966, during the height of the Gemini and Apollo programs, before steadily declining. By the time we landed on the moon in 1969, it was back down to 2.31%, or $4.2 billion ($21.1 billion today). As of 2007, NASA's budget was around $17 billion, but the equivalent of just 0.6% of the entire federal budget. With the entire economic health of the United States in question, it's a program that's largely seen as non-essential and expendable when it comes time to tighten the belt. To reach the moon, NASA would likely have to return to the spending levels of the 1960s: twice the budget that's currently on the books, sustained over years, while maintaining public engagement for the same amount of time.

Returning to the moon isn't something that can simply be picked up after forty years: it requires an entirely different mindset and mission stance than the low-earth-orbit work that's been done since the early 1980s. New rockets would need to be constructed, and an entirely new logistical support system would need to exist to support such a mission.

This is all before one asks the next question: why return to the moon, and why set up a permanent base on its surface? The original lunar missions were exploratory in nature, and the astronauts were the first people over the finish line in an international race. The Cold War is long since over, the United States has proved that it could reach the moon, and the American public returned to their lives back on Earth. A self-sustaining moon program cannot exist simply for the sake of its own existence, or as a show for the rest of the world. A graduated, strategic plan for going to the moon and beyond, for a concrete, supportable purpose, is the only way the United States will go beyond low Earth orbit.

There are potential resources in the skies above Earth. Asteroids contain a number of metals, and there's quite a bit of scientific knowledge to be gained, but somehow, I don't think that Gingrich had anything in mind other than restoring the glory days of the United States.

Gingrich isn't going to get far with this plan: Romney has already slammed him for it: "That's the kind of thing that's gotten this country into trouble in the first place." I disagree with Romney's assertion: going to the moon brought about quite a lot of technology and a sense of security. As Craig Nelson noted in his 2009 book, Rocket Men: The Epic Story of the First Men on the Moon, going to the moon was one of the great endeavors that makes the country worth defending. But there's a lot of competition for that sort of money, and I don't foresee a serious, government-backed program coming to fruition in the current economic climate.

Romney's words indicate that a space program under his administration would fare worse, and of the two, Gingrich's attitude is the better - if he were serious about it. Of course, if he were serious about it, it would raise serious questions about his self-proclaimed description as a 'Reagan-style conservative'. Either way, the Obama administration's move to build a space industry around private enterprise seems to me to be the best way to foster a sustainable American presence in space - something that seems far more in line with what a Republican administration would back.

Returning to space should be a priority for the country: it's a means to accomplish great things, from walking on another planet's surface, to discovering incredible things, to advancing the human race far beyond its imagination. At the same time, it's a way to sustain an advanced, highly skilled industry, which is something that will keep us in space even longer. Because of that, I don't believe it should be a political football, used simply to score a couple of percentage points.

2011 Reading Census

This year has been an interesting reading year for me, fluctuating between a bunch of really, really good books, a couple that sucked out any interest I had in reading at the time, and a number of books in between that I thought were fun reads. Here's what I got through in 2011:

1 - Grey, Jon Armstrong (1-8)
2 - The Dervish House, Ian McDonald (1-21)
3 - Hull Zero Three, Greg Bear (1-23)
4 - The Hunger Games, Suzanne Collins (2-1)
5 - The Lifecycle of Software Objects, Ted Chiang (2-4)
6 - At The Queen's Command, Michael A. Stackpole (2-19)
7 - Mossflower, Brian Jacques (2-20)
8 - Embedded, Dan Abnett (3-7)
9 - Kraken, China Mieville (3-9)
10 - Leviathan Wakes, James S. A. Corey (3-17)
11 - Little Fuzzy, H. Beam Piper (3-28)
12 - Fahrenheit 451 Graphic Novel, Ray Bradbury (4-13)
13 - Yarn, Jon Armstrong (4-13)
14 - Welcome to the Greenhouse, Gordon Van Gelder (4-19)
15 - Fuzzy Nation, John Scalzi (4-25)
16 - Spectyr, Philippa Ballantine (4-26)
17 - Soft Apocalypse, Will McIntosh (4-27)
18 - Blackout, Connie Willis (4-30)
19 - Locke & Key, Joe Hill (5-8)
20 - Catching Fire, Suzanne Collins (5-22)
21 - Deathless, Catherynne Valente (5-27)
22 - Embassytown, China Mieville (6-18)
23 - Hex, Allen M. Steele (7-2)
24 - The Gravity Pilot, M.M. Buckner (7-4)
25 - A Game of Thrones, George R. R. Martin (7-15)
26 - The Big Roads, Earl Swift (7-19)
27 - Spellbound, Blake Charlton (8-2)
28 - The Magician King, Lev Grossman (8-4)
29 - Bright's Passage, Josh Ritter (8-5)
30 - Grave Peril, Jim Butcher (8-13)
31 - Spook Country, William Gibson (9-6)
32 - Machine Man, Max Barry (9-10)
33 - Crisis in Zefra, Karl Schroeder (9-15)
34 - Halo: The Fall of Reach, Eric Nylund (10-1)
35 - Germline, T.C. McCarthy (10-5)
36 - The Windup Girl, Paolo Bacigalupi (10-16) (audio)
37 - Halo: Glasslands, Karen Traviss (10-29)
38 - Red Herring, Archer Mayor (10-20)
39 - Ganymede, Cherie Priest (11-11)
40 - Ender's Game, Orson Scott Card (11-20)
41 - Ready Player One, Ernie Cline (11-26)
42 - Open Season, Archer Mayor (12-5)
43 - Seed, Rob Ziegler (12-11)
44 - Rule 34, Charles Stross (12-??)

In the pipeline: X-Wing: Rogue Squadron, by Michael A. Stackpole, All You Need Is Kill by Hiroshi Sakurazaka, Learning to Eat Soup with a Knife: Counterinsurgency Lessons from Malaya and Vietnam by John A. Nagl and The Unforgiving Minute: A Soldier's Education by Craig M. Mullaney. Rogue Squadron is something I'm going to finish up sometime this weekend, and All You Need is Kill is somewhere behind that. The other two are a bit denser, and while they're interesting, they're taxing to get through.

Interestingly, this was the first year where I really read books electronically. I've dabbled with it in the past, ever since I bought an iPad, but this year I made the jump and read 7 books digitally: Grey, The Lifecycle of Software Objects, Embedded, Little Fuzzy, Crisis in Zefra, Ender's Game and Open Season. Add in A Game of Thrones, which I alternated between my paperback and an ecopy, and that's 19%: just under a fifth of my book pile existed on a hard drive somewhere, rather than on a bookshelf.

An interesting thing about eBooks: there's really only a single novel I read that I felt truly took advantage of the book's digital nature: Crisis in Zefra. This novel - a short novella, really - was published by the Canadian military, and incorporated a lot of data about new and upcoming technologies and trends. I was limited in that I was reading on a wifi-only iPad while away from the internet, which left me unable to click on the links scattered throughout the text explaining what the terms, technology and theory meant. This, I think, is where eBooks will eventually head: less passive reading experiences, and more immersive, interactive ones.

I've also been doing a bit more with book reviews, on a number of different sites: SF Signal, The Functional Nerds, Kirkus Reviews, and my own blog, with a total of 15 books (34%) read for review. I've written reviews for a number of these, but they are books that were given to me either by the website I wrote the review for, or by an author or publicist, even if a review wasn't necessarily expected or promised. Roughly a third of my reading this year was subsidized by someone else for review purposes. I had a bit of fun with those books, although my reviews weren't universally positive. The caveat, of course, is that the majority of my reading (29 books in all - 65%) was for my own pleasure, and a minor attempt to whittle down my own to-read list. I've got a feeling that I'll never destroy the growing pile.

I've always described myself as a science fiction fan rather than a fantasy one, yet in years past I've typically read more fantasy than science fiction. This year? I read 27 science fiction books (61%), 11 fantasy books (25%), 2 mystery novels (4.5%), 2 YA novels (4.5%), and 1 each of history and steampunk (2% each). This year was certainly more science fictional than years past, which I'm happy about.
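For the curious, those genre percentages fall straight out of the 44-book total; here's a quick sketch of the arithmetic (the post rounds to the nearest whole or half percent):

```python
# Genre counts from the 2011 reading list above (44 books total).
counts = {
    "science fiction": 27,
    "fantasy": 11,
    "mystery": 2,
    "YA": 2,
    "history": 1,
    "steampunk": 1,
}

total = sum(counts.values())  # 44

# Print each genre's share of the year's reading.
for genre, n in counts.items():
    print(f"{genre}: {n}/{total} = {100 * n / total:.1f}%")
```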

Interestingly, while I describe this year as being up and down, looking over the list as a whole, there are only four books that I really didn't like. I thought just under half (20) were good, while just under a quarter (10) were okay - decent, but nothing that really wowed me - and 10 books in all really blew me away (22%). Of the books I read this year, the most memorable were the really great ones, and of those, three really stood out: The Magician King, by Lev Grossman, Soft Apocalypse, by Will McIntosh, and The Dervish House, by Ian McDonald. (See my top 10 list for the full lineup of books that impressed me this year.) These are astonishing reads, and I really hope that The Magician King and Soft Apocalypse get the attention they deserve: Grossman has gained a considerable amount of acclaim, but McIntosh's first novel feels like it's flying under the radar a bit, the underdog of the year. If you haven't read it, I can't recommend it highly enough. The Dervish House was nominated for a Hugo, but somehow ended up at the bottom of the polls. Still, it's nice to see it nominated.

Of the really bad books, these stand out as the ones I had the most trouble getting through: Seed, by Rob Ziegler, The Gravity Pilot, by M.M. Buckner, Deathless, by Catherynne Valente, and Hex, by Allen M. Steele. I believe they stand out so much because they were all books I had high hopes for: Seed was lauded as the successor to Paolo Bacigalupi's The Windup Girl and utterly failed at that; The Gravity Pilot looked interesting and didn't work; Deathless was wonderfully written, but a book I simply couldn't get into; and Hex is part of Steele's Coyote universe, which started off so well and has fallen so far with this book. There were some others, like Jack Campbell's Beyond the Frontier: Dreadnaught, which was so abysmally written that I couldn't even get through the first chapter, and Sarah Hall's Daughters of the North, which I had a lot of trouble getting into and didn't finish.

Everything else in the middle was entertaining, including some excellent novels: Suzanne Collins' The Hunger Games was an excellent read, although the sequel was a bit too much of the same for my liking. I haven't reached #3, Mockingjay; I'm awaiting its release in paperback. China Mieville's Embassytown was interesting - a little flawed, but brilliant all the same - although I have to say that I liked Kraken quite a bit more. Leviathan Wakes was a lot of fun to read and a promising start to a new series, while John Scalzi's Fuzzy Nation was something I tore through in just a couple of hours on a plane. I finally got in on A Game of Thrones, and it lives up to the hype, somewhat. I even broke out of the SF/F genres and picked up the fantastic The Big Roads, by Earl Swift, a fascinating look at the construction of roadways in the US. Karen Traviss's entry into the Halo universe was also fantastic, and it's dragged me into that particular expanded world; I picked up several other Halo novels, which will likely get read next time I'm on a Halo kick. I re-read Mossflower after Brian Jacques passed away, as well as Ender's Game, and found that both books really lived up to my memories of them. Ernie Cline's Ready Player One was a fun, entertaining book, but lacking in other departments. Finally, I had a chance to go back and revisit Paolo Bacigalupi's The Windup Girl, which lives up to my first impressions wonderfully.

So, why quantify my enjoyment? I've been accused by people of taking things like this too seriously, of reviewing films or books that should be 'just for fun'. I've never subscribed to the 'turn your brain off while you read/watch/listen' train of thought, because I think it does a disservice to the author. Certainly, there are books and films I've done that with, enjoying them because they were written to be enjoyed. But why distill a year's worth of reading down into some easy statistics?

A couple of reasons: one, it helps me better understand my own interests by grounding them in reality. As mentioned, I firmly place myself in the science fiction camp, but over the past couple of years I've generally been surprised to find I've read more fantasy than science fiction. My interests are all over the place, and I don't generally remember at a glance what I've read as a whole. I was a little surprised that I hadn't finished more than a single history book this year, despite the intense work I did on various history projects: I've read portions of numerous historical texts, mainly about World War II and military history (including a couple that are still technically on the reading list), but never finished them, or needed to. This might also be me forgetting to stick a book onto the 'Read' list.

Reading is an important part of what I do. I typically read at night before I go to bed (increasingly on my iPad), or at the beginning of the day, when I can get through 10-15 pages while I'm waiting for my computer to load up at work. Weekends usually mean a lot of time to blow through something, and when I was on public transportation for two trips earlier this year, to Washington D.C. and Belgium, I read a lot: three books for each trip (for the DC trip, that was one book for the airplane, one for the second day on the train, and the third for the flight home, all in a couple of days). Better understanding my own reading habits helps me to read more, I think, and while it's not quantity over quality, I've got a massive backlog of books that I've bought. Looking over my list from this year, a total of 6 books - 13%! - came off of that list, which currently numbers around 100. These are all books that I've owned for more than a year, while a huge number of the books I picked up this year were released this year, which also comes as a bit of a surprise.

My thoughts going into 2012 are that I'll be whittling down the to-read list. There are a lot of books that I want to get to in the near future. Off the top of my head, I can think of a number that are edging up the list: George R.R. Martin's second entry in A Song of Ice and Fire, A Clash of Kings, is most certainly going to make it onto the list when the next season hits, and the entire X-Wing series by Michael Stackpole and Aaron Allston will get re-read prior to the next novel in the series, Mercy Kill. I also want to revisit Timothy Zahn's The Icarus Hunt. I've also been wanting to begin David Louis Edelman's Infoquake, finish out William Gibson's Bigend trilogy with Zero History, get into Neal Stephenson and Iain M. Banks, and generally blow through a bunch of paperbacks and history books that I've had for a couple of years. Hopefully, I'll be able to get through a portion of that, and hopefully, I'll slow down the growth of my own library - we're running out of shelf space (again).

It’s been a fun year, with a lot of good stories all around. It looks like 2012 will be just as much fun.

Space Exploration and the American Character

Historian Dr. Michael Robinson, of the University of Hartford, opened his talk with a William Faulkner quote that helped frame the 1961-1981 Key Moments in Human Spaceflight conference, held in Washington DC on April 26th and 27th: “The past is never dead. It's not even past.” The first talks of the day dealt extensively with the narrative and drive behind space travel and exploration, painting it as much a major cultural element within the United States as one of scientific discovery and military necessity. In a way, we went to space because it was something that we’ve always done as Americans.

The Past

Dr. Robinson started with a short story of a great endeavor that captured the imagination of the public: one that stirred up rivalry between nations on a global scale, advanced our scientific knowledge, and saw high tech equipment carry valiant explorers to the extremes. Several disasters followed, and the government pulled back its support, yielding part of the field to private companies. If asked, most people would say this describes the space race of the twentieth century, and while they would be right, what Robinson was talking about was the race for the North Pole. In 1909, American explorer Robert Peary claimed to have reached the North Pole, becoming the first known man to do so. While there are reasons both to doubt and to support Peary’s claim, Robinson made some interesting points in comparing the race for the North Pole to the space expeditions.

Robinson described a culture of exploration that has existed in the United States since its inception, but took pains to make a distinction between the frontier motif that has permeated science fiction and the realities that we’ve come to expect from going into orbit. Television shows have undoubtedly aided the excitement for space research and exploration, but they’ve incorporated elements that have great significance for American audiences: Star Trek, for example, was famously pitched as a ‘Wagon Train to the stars’, while Firefly has likewise been described as a ‘Western in space’, to say nothing of films like Outland, Star Wars, and numerous other examples. In his 2004 address that helped outline America’s space ambitions, President George W. Bush noted that “the desire to explore and understand is part of our character”. Other presidents have said similar things, and it’s clear that this rhetoric resonates with the American voter.

It makes sense, considering the United States’ history over the past centuries: Americans are all newcomers, and as Robinson said, the west was a place to settle. The arctic, and space, really aren’t, and the distinction matters. Historically, both space and the arctic have seen much smaller footprints of human interaction. Each is a difficult place to reach, and once people are there, an incredibly hostile environment that discourages casual visits.

The American West, on the other hand, is very different for the purposes of space-travel imagery. During the great westward migration of the 1800s, it was relatively cheap for a family to travel out to vast untapped territory: around $500. Additionally, once people reached the west, they found a place that readily supported human life, providing land, food, and raw materials. The American west was transformed by mass migration, which helped to vastly expand the U.S. economy during that time, while also leading to a massive expansion of the federal government and to the Civil War. Space, on the other hand, isn’t so forgiving, and like the arctic, doesn’t yield the benefits that the west provided.


The explorations into the arctic give us a sense of where space can go and how expectations from the public and the scientific community can come into line with one another. The polar explorations absolutely captured the imagination of the public: art exhibits toured the country, while one of the first science fiction novels, Frankenstein, was partially set in the far North. However, what we can learn from the arctic is fairly simple: we should abandon the idea of development in the short to mid term. Like the arctic, space is an extreme for human life, and the best lessons that we can glean for space will come from our past experiences with other such extremes: exploration in areas where people don’t usually go. This isn’t to say that people shouldn’t, or can’t, go to the ends of the Earth and beyond, but that we should prepare accordingly, in all respects.

The arctic provides a useful model for what our expectations should be for space, and some historical context for why we go into space at all. Nor should we discount the west and the country’s history of exploration and settlement as factors in going into space.

The Space Age

James Spiller, of SUNY Brockport, followed up with a talk about the frontier analogy in space travel, noting that the imagery conformed to people’s expectations, and that notable figures in the field, such as Wernher von Braun, liked the comparison because it helped to promote people’s interest in space. The west connected and resonated with a public that has a history and mythos of exploration. This runs deep in our metaphorical, cultural veins, linking the ideas of U.S. exceptionalism and individualism that came from the colonization of the American continent. The explorations to the west, to the arctic, and eventually to space came about because they appealed to our character: they were part of our identity.

The launch of Sputnik in 1957 undermined much of what Americans believed: not just on a technical level, it seemed to confirm that a country with vastly different values could do what we, with everything going for us, could not. In the aftermath of the launch, President Eisenhower moved slowly on an American response, to the great dismay of the public. It was a shock to the entire country, one that prompted fast action and pushed up the urgency for a red-blooded American to go into space. How could individual, exceptional Americans fall behind the socialists, whose values ran completely counter to our own? There had already been numerous examples of individuals who had conquered machines and territories, such as Charles Lindbergh and Robert Peary, and the Mercury astronauts followed in their footsteps. Indeed, for all of the reasons why the west feels important to Americans, the space program exemplified those traits in the people we selected to represent us in space.

Spiller noted that the frontier of the west seems to have vanished: the culture of the late 1960s and early 1970s fractured society and the idea of American exceptionalism. The Civil Rights movement discredited parts of it, while the United States seemed to lose its lead in the global economy as other countries overtook it. As a result, the message of space changed, looking not out, but in. President Ronald Reagan worked to revisit the message, as did President George H.W. Bush. There have been further changes since the first space missions: a new global threat that actively seeks to curtail modernism, terrorism, has preoccupied our attention and pushed our priorities elsewhere.

Going Forward

The last speaker was former NASA Historian Steven Dick, who looked at the relationship between exploration, discovery and science within human spaceflight, pointing out the distinctions between the three: exploration implies searching, discovery implies finding something, and science leads to explanation. The distinctions are important because they are fundamental to the rhetoric, he explained, and the last program to really accomplish all three was the Apollo program.

Going into the future, NASA appears to be at a crossroads, and its actions now will help to define where it goes from here. The original budget that put men on the moon was unsustainable, and as a result, NASA at the age of fifty is still constrained by actions taken when it was only twelve. The space shuttle was not a robust agent of exploration, discovery or science. Dick pointed out that where programs like Apollo and the Hubble Space Telescope have their dramatic top ten moments, the space shuttle really doesn’t, because it’s a truck: it was designed with indeterminate, multiple functions, ranging from a science platform to a delivery vehicle for satellites. This isn’t to discredit the advances made because of the shuttle, but when set against those other programs, it doesn’t quite compare. The space station, on the other hand, was well worth the money, but people don’t respond as well to pure science as they do to exploration; even Apollo demonstrated that science alone isn’t enough to sustain public interest.

As he put it, “exploration without science is lame, discovery without science is blind, and exploration without discovery or science is unfulfilled.” Going forward, any endeavor beyond our planet should encapsulate all three elements to capture the public’s imagination and make the effort to go beyond orbit worthwhile for all. Manned spaceflight can also accomplish something that robotic probes and satellites cannot, especially in fulfilling the frontier motif that helps to define our interest in going into space: it’s hard to embody the traits that have inspired people to go further when it’s someone, or something, else doing the exploring.

Space, the final frontier, is an apt way to frame how manned spaceflight programs are looked at, and it certainly captures the imagination of people around the world. While some of the direct imagery is misplaced, it's not a bad thing for people to latch onto, but it helps to remember the bigger, more realistic picture when it comes to the goals and expectations for space. NASA, going forward, will have to take some of these lessons to heart, reexamining its core mission and the goals that it's working to put forward. Nobody in the room doubted that the advances that have come about as a result of space travel were worth the cost and risks involved, and they want it all to continue far into the future. To do otherwise would mean giving up a significant part of who we are, because the traits that have come to define our exploration beyond the horizon, to the North and high above us, are elements worth celebrating: the drive to discover, to explore and to explain are all essential for the future.

Battle of the Bulge: Phase II

On December 17th, 1944, from what I can tell so far, the 100th Infantry Division was ordered to the Bastogne, Noville, and Bras areas to stop the sudden attack by German forces. The 28th Infantry Division found itself on its second day fighting for its survival as its entire divisional front came under attack, while a member of the division, 1st Lt. Carl Hughes of the 102nd Cavalry Recon Squadron, continued to make his way through enemy lines. The Battle of the Bulge was in full force in Belgium and Luxembourg, and would continue to rage for over a month.

The anniversary of the beginning of the battle saw the start of the second phase of my project documenting the Norwich University alumni who fought there. I had hoped to have finished the writing by this point, and while that hasn't happened yet, the research and collection of raw data has largely wound down. From the data that I was able to collect, I've assembled a list of just under 150 people from a variety of sources: publications, records, and mentions. Of those, 30 people are confirmed by sources as having been present at some point, another 73 might have been there based on their unit, 10 can be written off, and a further 30 may or may not have been there, with very little to go on other than a country reference.
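Keeping those running totals straight as the roster grows is easy to automate rather than count by hand. Here's a minimal sketch in Python, assuming the roster is a simple list of (name, category) pairs; the names and exact category labels are invented placeholders, not my actual data.

```python
from collections import Counter

# Hypothetical roster entries as (name, confidence category) pairs.
# The categories mirror the groupings described above; the names are invented.
roster = [
    ("Alum A", "confirmed"),
    ("Alum B", "possible by unit"),
    ("Alum C", "confirmed"),
    ("Alum D", "written off"),
    ("Alum E", "country reference only"),
]

# Count how many alumni fall into each confidence category.
tally = Counter(category for _, category in roster)
for category, count in tally.most_common():
    print(f"{category}: {count}")
```

Re-running this after each research session gives an up-to-date breakdown without re-counting the whole list.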

This collection of raw data has some additional bits of information that go along with each student: their rank, unit, whether they were wounded or killed, what medals they earned, and any other additional notes. As a whole, it's a wealth of information, but on its own it establishes only a handful of fixed points that lead to the next stage.

Raw data by itself is somewhat useless. I can tell you a number of things about Carl Hughes: he was a first lieutenant in the 28th Infantry Division with the 102nd Cavalry Recon Squadron, he graduated from Norwich in 1942, he received the Bronze and Silver Stars in addition to a Purple Heart, and he walked through enemy lines for three days following the attack when his unit was surrounded. The next step involves adding context to the situation.

Going unit by unit, this next step involves adding that context. With it, I've learned that the 28th Infantry Division took the first impact of the German advance on December 16th, along a 25 mile stretch that enveloped the division, and that from the 16th through the 22nd, the unit was involved in heavy fighting before pulling back to Neufchateau to reorganize. This additional layer helps to put the individual experiences of the soldiers into better context.

Records of students' individual experiences during the battle are rare; where they exist, I have a paragraph at most, or at the least a brief sentence indicating that an alum was present at some part of the battle. The additional information about what the units as a whole were up to helps to fill in the blanks and gives me a general idea of what any given student might have been doing at the time. Furthermore, the individual data points that Norwich students make up on the timeline help to sketch out a clearer understanding of how the battle worked: it was complicated, with numerous fronts, battles and units involved. Approaching the battle through the people who studied at Norwich also helps to demonstrate the role that Norwich itself played during the battle, much like I discovered with the Operation Overlord paper that I wrote in 2007. There was a collective Norwich experience that was widespread throughout the conflict.

This next step is far from done - quick passes through the Army historical blurbs allow me to pinpoint some key dates for units, and a second pass will help to fill in more detail for some of the larger units, such as the 2nd and 3rd Armored Divisions and the 17th Airborne Division, which seems to have a larger collection of Norwich men within it. With a codified timeline in place, the events of the battle can be put down in more detail, and a larger story of the Battle of the Bulge will emerge, seen through the eyes of the school's alumni.

It's an exciting bit of work as I gather more and more information on individual units and watch the battle emerge from the raw data points that I've collected. One thing is for sure so far: Norwich University was present on the front lines (and in one case, above them), and undoubtedly, given the notations, medals and units associated with these men, the school had some hand in the outcome of the battle, having provided a basis for the actions of the men who fought in 1944 and 1945.

The Start of Something Fantastic: SpaceX Orbits the Earth

Yesterday, at 10:43 in the morning, a Falcon 9 rocket carried a Dragon capsule into low earth orbit, where it circled the Earth twice before splashing down on target 500 miles off the coast of Southern California. This marked the first time that a private commercial firm has accomplished such a task, joining the handful of nations and agencies (the United States, Russia, China, India, Japan and the European Space Agency) that have launched and recovered an orbital spacecraft. By doing so, SpaceX has marked the start of a new age in space travel, one that is independent of governmental agencies. Even more astonishing, this comes from a company that was founded a mere eight years ago.

The rise of SpaceX (Space Exploration Technologies Corp.) comes during a time of stagnation in space exploration. The last manned mission to the moon occurred in 1972 with Apollo 17; Skylab crashed down to Earth in 1979, and Russia's Mir space station followed in 2001. The American space shuttle was first launched in 1981, ushering in its own age of scientific exploration by launching satellites, conducting repair and resupply missions, and generally serving as an orbital laboratory for scientific projects; its own mission ends early next year with two final flights. Finally, the International Space Station, a testament to international cooperation and scientific endeavor, was first launched in 1998 and is scheduled for completion next year. In spite of the numerous accomplishments that NASA and other space agencies have achieved over the last three decades, their efforts have gone largely unappreciated by the general public and political establishment, who see them as a waste of money and time on behalf of the people.

Space and operations in orbit are something that will continue in the near future, and the introduction of a commercial firm will help to sustain the human presence there. Commercial firms also have the ability to break the monopoly that governments hold on space operations, opening up access to Earth's orbit not only for people, but for additional platforms for business and travel.

That future is still far off, and will require an immense amount of preparation, coordination and regulation in order to become fully viable, safe and profitable for interested parties. There is a growing problem with debris in Earth's orbit, which has caused collisions and dangerous conditions for astronauts and hardware, and the expense of trips into orbit remains high (SpaceX charges upwards of $43.5 million for up to 3,000 kg). Space is still out of reach for the general public, and like any big, complicated, dangerous activity, it'll take a while for prices to come down to a more affordable level, and for an entire industry to take form to support it.
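Those published figures make the back-of-the-envelope arithmetic easy: dividing the quoted price by the maximum payload gives a floor for the per-kilogram cost. A quick sketch, using only the numbers mentioned above:

```python
# Cost per kilogram at the quoted rate: $43.5 million for up to 3,000 kg.
launch_price_usd = 43_500_000
max_payload_kg = 3_000

cost_per_kg = launch_price_usd / max_payload_kg
print(f"${cost_per_kg:,.0f} per kg")  # $14,500 per kg, even at full capacity
```

At over fourteen thousand dollars a kilogram, the gap between today's prices and anything resembling mass-market access is plain.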

A forward-thinking and rational commercial future for space allows for a great deal of independence. Companies won’t be constrained by political whims and budget shortfalls, but by economic pressure to succeed amongst a pack of competitors. There will need to be a regulatory body to keep the conduct of these companies in check (and when you’re talking about the risks of spaceflight, that oversight is something that will be needed), but companies will be able to expand and explore new possibilities and ventures much faster than a governmental body, and the advances that they find and create can translate into new opportunities for those of us on the ground.

As the United States grapples with economic problems, space should be the next frontier for U.S. business interests. Long-term efforts in space bring the possibility of incredible exploration, scientific discovery, mineral wealth and virtually unlimited room for growth and real estate. The United States maintains a massive advantage over other countries, and would do well to foster the development of a space industry at home to help better its own economy (think of the skilled labor and jobs that such a line of work requires) and to bring humanity further toward the stars.

The futures that have long been seen in science fiction novels, television shows and movies are still a long way off, but the steps taken by SpaceX, and the other private firms that are sure to follow over the next couple of decades make me hopeful for the futures that we might have in orbit and beyond. Maybe, just maybe, within my lifetime, I'll be able to look down on the Earth and smile.

Defending Korea & Continual Conflict

Korea has been at the forefront of the news over the past couple of weeks as violence has escalated between the North and the South, following a North Korean shelling of the South Korean island of Yeonpyeong in response to a series of planned military exercises. The violence had already been rising: earlier this year, an attack on a South Korean warship left almost fifty sailors dead. Indeed, the two countries look like an increasingly volatile powder keg, with a new South Korean leader and the expected promotion of Kim Jong-il's son, Kim Jong-un, at some point in the near future. With almost 30,000 American soldiers stationed just south of the 38th parallel, an outbreak of war in the country would heavily impact the United States. With two major conflicts on their way out the door, the prospect of another confrontation abroad is a sobering one.

In 1952, Dwight Eisenhower was elected the 34th President of the United States, having campaigned on Korea, Communism and Corruption. Despite having led the Allied militaries against Adolf Hitler in Europe, Eisenhower sought to bring down the national budget on a platform of fiscal conservatism, making deep cuts in military spending and adopting a new philosophy and approach to the United States' presence in the world.

After visiting Korea, Eisenhower sought to bring the United States' involvement to an end, and with a cease-fire (although no formal resolution) to the conflict, was able to fulfill a major campaign goal that aligned with his beliefs: the United States did not, and could not, fight every battle across the world with a massive standing army, ready to engage in more conflicts like the one just waged in Korea. His 'New Look' plan, approved in 1953, allowed the U.S. to use technology and America's atomic stockpile as a means to deter the Soviet Union from open aggression against the U.S. The policy was designed to rein in defense spending on a massive conventional force by relying instead on a more technologically oriented one that wasn't necessarily required to do anything but exist.

The recent troubles in Korea bring to mind some of the issues that have been playing out in the political and military scenes recently. As the country begins to move in the direction of less spending (at least the attitude is there, somewhat), repairing the fiscal situation of the United States will require something along the lines of what Eisenhower envisioned for the country half a century ago: reductions on all fronts, including military spending. His policies were engineered out of the fear that the country's financial footing had a corresponding impact on its national security standing in the world. America, with a growing economy, population and budget, could face major problems as it was; with the added burden of continual fighting abroad in conflicts similar to Korea, the country’s stability could be at risk.

In a large way, the series of conflicts that followed September 11th fall right into what Eisenhower feared for the country: exceedingly high defense budgets for expensive wars in which the United States has gotten its hands dirty in areas perceived to threaten the country’s security. Eisenhower pushed against full American engagement in Vietnam, and it wasn’t until after his term in office that the conflict escalated over several administrations; from that point, the U.S. stayed out of major engagements until 1991, with Operations Desert Shield and Desert Storm. There, the theory of technological warfare as a superior form of conventional warfare was validated: against the 372 coalition soldiers killed as a result of the conflict, around 30,000 enemy soldiers died.

The fights in Afghanistan and Iraq are different: the U.S. has been slow to adapt to the new environment of warfare, plunging in with certain assumptions and coming out with an entirely different experience than was expected. Continual fighting in small conflicts will cause further problems for the country, especially if such conflicts are not properly understood and the ways to fight them are imperfectly realized.

The Eisenhower administration’s plans to deter attacks against the country worked, in part. The threat of massive retaliation faced its biggest test in October of 1962, during the Cuban Missile Crisis, which demonstrated that the threat of assured destruction of both countries (not to mention everyone else caught in the crossfire or downwind) was enough to force both players to back down. At the same time, deterrence hasn’t been able to prevent warfare outright: Vietnam was a war in which the nuclear issue was largely side-stepped and would cause problems years down the road, while American involvement in places such as Haiti, Panama, Somalia and other smaller countries and conflicts has not decreased, although their significance doesn’t approach the scale of the current fights in Iraq and Afghanistan.

The Iraq and Afghanistan wars are both abnormal conflicts, and two fights that set some frightening precedents for the future. Already in their 9th year, the combined conflicts have cost an estimated $1.1 trillion, running counter to the vision that Eisenhower had hoped for and doing precisely what he feared such battles would do to the country’s financial standing. What conflicts does the United States have in store, if it can enter into a war-like state whenever it sees reason to do so?

The prospect of renewed war in Korea only adds to the fears of a continued lack of restraint when it comes to spending. Political elements in the United States have called for fiscal restraint, but the exception seems to be the money that pours in for the military. While front-line soldiers need that financial support to accomplish the mission in front of them, the country needs to adopt a mindset of reducing the need for that money in the first place: avoiding costly confrontations across the world by recognizing which conflicts should be fought. The practice of deterrence will likely not work in this new environment of war: multinational political groups are harder to deter. Deterrence in Eisenhower’s day was the best means to contain spending and effectively protect the country from those who wished it harm. In the present day, we need to do much the same: figure out the best way to defend the country, with oversight and restraint.

Should the tensions between North and South Korea break into open conflict, the United States will likely have some hand in the issue, and we could find ourselves in a third major conflict at a time when we can't afford to become entangled.

Hardwired Historian

As I've begun work on the Battle of the Bulge project, I've found that there have been some major changes in how I'm able to research the event since the spring of 2007, when I did a similar project on the Normandy invasion. Since then, computers have become smaller, Norwich University has built a campus-wide wireless network, and the information available in databases has grown.

Over the past couple of weeks, I've been poring over books and file folders, hunting for references to soldiers in a set number of units, as well as dates, locations, and specific references to the Battle of the Bulge itself. Four years ago, I brought along a notepad and a couple of pens (or pencils, when I was up in the University Archives), and wrote down every reference that I could find, even the tangential students who might have been in the right area at the right time.

Fast forward to 2010, and the options have changed. Rather than taking a notepad and pen with me, I've been carrying my iPad and iPhone, on which I jot down information as I find it. Slowly, as the lists grow, I'm planning on moving the information into a spreadsheet. While I do this, I've tapped into the wireless network, and as I come across soldiers in various units, I've discovered that running a quick check against a unit's history online can help me determine whether a soldier is someone I can use, because his unit was present at the battle, or whether he was somewhere else at the time, either because the unit hadn't yet arrived or was in another theater of operations altogether.

The move to electronic recording likewise has the benefit of letting me copy and paste my results directly into a spreadsheet, rather than adding the extra step of transcribing my handwritten notes (no small task!). Data now moves between two mediums rather than three (original, handwritten, and computer), keeping the information that I transpose intact far more easily than before.

The next step is something I'm thinking of trying: integrating this with Google Docs, which would let me keep my data online, accessible from any number of locations. Unfortunately, this isn't very practical on an iPad (I can't easily tab between apps, and I don't have the internet at home), but for some of the research work it seems like an excellent tool, especially when working with others. In this case, my girlfriend is helping out with some things, and the ability to update the same piece of data, without redundancies, would be helpful when our gathered data is combined.
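Combining two researchers' notes without redundancies might look something like the following sketch in Python: records are keyed on a (name, unit) pair, so a second contributor's notes update an existing entry instead of duplicating it. All names and fields here are invented examples, not my actual data.

```python
def merge_notes(*note_lists):
    """Merge several lists of record dicts, keyed on (name, unit)."""
    merged = {}
    for records in note_lists:
        for record in records:
            key = (record["name"], record["unit"])
            # Later records fill in or update fields rather than adding a new row.
            merged.setdefault(key, {}).update(record)
    return list(merged.values())

my_notes = [{"name": "Alum A", "unit": "28th ID", "rank": "1st Lt."}]
her_notes = [
    {"name": "Alum A", "unit": "28th ID", "medals": "Silver Star"},
    {"name": "Alum B", "unit": "17th Abn", "rank": "Capt."},
]

combined = merge_notes(my_notes, her_notes)
# Two unique people; Alum A's rank and medals end up in one record.
```

A shared spreadsheet would handle most of this, but having the merge rule written down keeps two people's lists from quietly drifting apart.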

What I’m hoping is that the move to computers, rather than handwritten notes, will let me be more efficient, and thus quicker, with the research that I’m working on. The amount of information that I need to go through is considerable: there are something like five thousand additional files on deceased students, not to mention the information on the units and the after action reports that exist.

This also covers the first large phase of the research: gathering all of the raw data that I’ll need to form the basis of the project. The next step, actually distilling and then writing the report, is already digital: I can’t think of a time when I haven’t used a computer to type up a project. Those advantages are well known, and something I know works.

The Battle of the Bulge

In 2007, shortly after I finished college, I went overseas to France to help provide the Norwich University side of things for the battlefield staff ride that we took. The D-Day study (which is partially documented here in the archives) was the final paper that I wrote for my undergraduate coursework. Back in May of 2007, I realized that this was something that I found interesting, and noted that I could easily expand this sort of research to encompass other elements of the European Theater of Operations.

I've largely kept things under my hat lately, but now that I've started, it's something that I can talk about more freely. While I'm not expanding my D-Day paper, I've been asked by Norwich to write another one, and to consult on an upcoming staff ride. This time around, I'll be focusing on the Norwich University students who fought at the Battle of the Bulge at the end of 1944.

The battle, largely regarded as the last credible offensive on the part of the Germans during the Allied advance towards Germany, was a massive coordinated counterattack that surrounded U.S. forces and slowed Allied efforts to end the war. As in Normandy, Norwich students fought and died there, occupying a number of positions within the U.S. military.

This is a project that I'm very eager to return to, and the research phase has me very excited. The project will come in a couple of phases. The first, which I've started, is the research element. I'll be specifically targeting several archives and sources here at Norwich, starting with the yearbooks (a memorial edition from 1947 was what I tackled today, with very good results) and the Norwich University Record, the alumni paper; these two sources have provided an incredible amount of information. Two archives up on campus should supply additional detail and allow me to draw up a roster of possible participants in the battle. From there, cross-checking each soldier's unit against the historical record and the actions of that unit will help weed out the people who couldn't possibly have been there: Student X was in Unit Y, but Unit Y didn't arrive in the area until day Z, which was after the battle, for example.
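The cross-check described above amounts to a simple date filter over the roster. Here's a minimal sketch of the logic; all the names, units, and arrival dates are invented for illustration:

```python
from datetime import date

# Battle window for the Ardennes offensive (Dec 16, 1944 - Jan 25, 1945).
BATTLE_START = date(1944, 12, 16)
BATTLE_END = date(1945, 1, 25)

# Hypothetical dates on which each unit arrived in the battle area.
unit_arrivals = {
    "Unit Y": date(1945, 2, 10),   # arrived after the battle had ended
    "Unit Q": date(1944, 12, 20),  # arrived mid-battle
}

# Hypothetical roster drawn from the yearbooks and alumni records.
roster = [
    {"name": "Student X", "unit": "Unit Y"},
    {"name": "Student W", "unit": "Unit Q"},
]

def possibly_present(soldier):
    """A soldier could have fought in the battle only if his unit
    reached the area on or before the battle's final day."""
    arrival = unit_arrivals.get(soldier["unit"])
    return arrival is not None and arrival <= BATTLE_END

# Student X is weeded out (Unit Y arrived too late); Student W remains.
candidates = [s["name"] for s in roster if possibly_present(s)]
print(candidates)
```

The same filter could be tightened further, for example by also excluding units that departed the area before the battle began, once those dates are confirmed in the unit histories.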

Running parallel to this will be research into the battle itself, looking for specific dates, people and unit actions: the story into which the Norwich personnel will be placed. The people I am looking at offer a small and unique window into how the battle went.

Once the research phase is over, the writing will begin; I'm planning on starting around November and finishing by December. January through March/April is a little more fluid, but I'm guessing that I will be editing, fine-tuning and researching small details for the paper, while preparing presentations for the actual staff ride, which will take place in May of next year. Needless to say, I'm flattered by and excited for this entire project.

This style of research makes a lot of sense to me, because I can connect the actions of soldiers in the field to an institution steeped in history, link those actions to the overall mission of the school, and provide historical context and concrete examples of where graduates have changed the world through their actions. (And some of these soldiers accomplished incredible things, helping to see through the successes of various operations throughout Norwich’s history.)

The Dawn of the Nuclear Age

Today marks the 65th anniversary of the detonation of 'Little Boy', the first nuclear weapon in the U.S. arsenal to be used as an act of war, one that changed the world upon its use. The bomb, which was followed by 'Fat Man' on August 9th, caused casualties in the hundreds of thousands, with effects lasting far into the present day. The United States marked a change in policy today when Ambassador John Roos attended the remembrance ceremony. The onset of nuclear warfare marked a massive change in the structure and hierarchy of the world.

The culmination of the Manhattan Project and the subsequent introduction of nuclear arms into the U.S. arsenal was the result of years of work and research on the part of the United States, and one that remains fiercely debated to this day. The first, and only, use of the weapons over Japan sparked much attention, but was in and of itself a single element in a larger strategy used to extend U.S. military power abroad. Earlier bombing runs, notably after the Army Air Forces' switch from conventional to incendiary explosives over Japan, yielded results similar to the Hiroshima and Nagasaki bombings: military targets were attacked directly, and civilian casualties ran into the hundreds of thousands. Nuclear warheads are notable in and of themselves because of their sheer destructive force, and the ease with which an opposing force can destroy a comparable target relative to prior methods. Previously, such destruction required a large bombing force over enemy territory, where planes were susceptible to anti-aircraft fire and mechanical breakdown. Now, a single aircraft could do the same.

It would seem that, with the smaller force and ease of destruction, nuclear warfare would be an inevitable end to civilization as we know it. Large military forces require far more expenditure, logistics and manpower to accomplish their goals, with steep casualty costs, as seen in the losses among airmen who ventured over Axis-occupied territory during the Second World War. But this misses the point, I believe, of the ease of destruction so often predicted by science fiction authors. The scary thing isn't the bomb itself, but the system that deploys it. The Second World War industrialized warfare to an incredible degree out of military necessity, and as a nation almost untouched by war on its own borders, the United States found itself with the manpower, equipment and weapons with which to enforce its will across the world. When the Soviet Union joined the nuclear club, it acted as a balance of power, but one that trod upon very uneasy ground, as the potential for nuclear warfare grew immensely and teetered on the edge at moments like the Cuban Missile Crisis.

Fortunately, the feared apocalypse never came to pass: cooler heads prevailed, and a strategy designed to deter, rather than to destroy outright, was implemented. But the introduction of nuclear weapons demonstrated that the balance of power had changed in a profound way: nations could enforce their will through the threat of force, and advances in science and technology sustained that strategy on the part of the countries involved in the Cold War. In a real sense, with such advances, the world became a truly global, interconnected place, and affairs that had once been inconsequential became important to the world as a whole.

The nuclear age arguably began not with the Japan bombings, but earlier, as military strategy sought ways to end the threat to U.S. interests. In doing so, unprecedented measures were undertaken: cities were destroyed, in what may be the closest thing to the apocalypse of speculative fiction the world has ever seen, and the aftermath makes for some almost surreal accounts. It is no wonder that people believed the world would end with a flash of light, and it is discomforting to realize that this threat is ongoing: the Cold War has since ended, but the danger of nuclear weapons will persist as long as such weapons exist, and will undoubtedly continue to influence those who look towards the future. What remains is for policy makers and strategists to determine the true risk, and whether the stakes are high enough.

The Original Mad Scientist: Nikola Tesla

When looking at the roots of the modern world, one need look no further than one man, Nikola Tesla, for a notable example. A bright mind from the start, Tesla defined the term 'genius', demonstrating an early ability for innovation and invention, and he would later go on to enlighten the world: literally.

Born in January of 1856 in Croatia (then part of the Austrian Empire), Tesla's intelligence and intellect exhibited themselves at an early age. In his autobiography, My Inventions, he noted that he "suffered from a peculiar affliction due to the appearance of images, often accompanied by strong flashes of light, which marred the sight of real objects and interfered with my thought and action. They were pictures of things and scenes which I had really seen, never of those I imagined." He attributed this ability to strongly conceptualize and visualize as a key element in how he was able to invent, though early on he was frightened by it. From an early age, he began to invent various objects: a hook to catch frogs and air-powered guns; he dismantled clocks, and at one point fixed a fire engine's hose during a demonstration to the town.

Following this, he attended the Higher Real Gymnasium in Karlovac, finishing his time there in three years instead of the four generally required. After he had finished, Tesla was stricken with cholera. The incident encouraged his parents to send him to school for science and engineering, where they had previously hoped that he would join the clergy. Recovering, Tesla was permitted to join the Austrian Polytechnic in Graz in 1875, where he excelled and became further interested in physics and engineering, particularly in motors, an early step in his work on alternating current.

In 1880, he relocated to Prague, Bohemia, to study at the Charles-Ferdinand University, before realizing that his academic pursuits were putting a strain on his parents. Leaving the school, he sought work at the National Telephone Company, before moving in 1882 to Paris, where he worked on electrical equipment for the Continental Edison Company. Two years later, he travelled to the United States, seeking to work for Thomas Edison. In a letter of recommendation, Charles Batchelor, a former employer and friend of Tesla's, wrote: "I know two great men and you are one of them; the other is this young man". It was a rather positive start to a relationship that would quickly sour. Tesla went to work for Edison, who had promised him $50,000 to upgrade and repair generators, but shortly after the work was done, Edison claimed that he had been making a joke, and Tesla, furious, left the company.

Tesla then formed his own company, Tesla Electric Light & Manufacturing, where he began to work on his method of alternating current, which he believed was far cheaper and safer than the direct current in use across the world by that point, but due to issues with the company, he was soon removed. In 1888, he began work under George Westinghouse at the Westinghouse Electric & Manufacturing Company, where he continued his work on alternating current and studied what were later understood to be X-rays. Over the next several years, Tesla continued his work in electronics and physics.

During this time, he and Edison became adversaries, with Edison invested in his direct current technology, while Tesla and Westinghouse backed alternating current. Edison mounted a public campaign against AC power, touting accidents and the fact that AC power was used for the first electric chair. The tide turned, however, when Westinghouse's company was awarded a contract to harness the power of Niagara Falls to generate electricity, resulting in a positive, highly public and practical test of AC power, while the 1893 Chicago World's Fair likewise utilized Tesla's power system in a highly public fashion. The result was a shift from DC power to AC power, which allowed for a far greater transmission range, and over the next century, DC distribution was phased out.

In 1899, Tesla moved to Colorado Springs (where he was portrayed by David Bowie in Christopher Nolan's film The Prestige), and there continued his experiments with electricity. He devised several methods for transmitting power wirelessly, and in 1900, with funds from J.P. Morgan, he began work on the Wardenclyffe Tower, a wireless transmission tower. The tower was completed, but the project ran short of funds and was eventually discontinued. Tesla lost several patents around the same time, and in the years that followed, he continued his scientific research, designing such things as a directed energy weapon, but found little support for his plans. In the last decades of his life, he suffered from mental illness, and he passed away in 1943 at his home in New York City.

Tesla is a figure who has captured the imagination of the geek community, but he is at the same time someone almost single-handedly responsible for the system of power transmission that covers the nation, a necessity of modern life. In fiction, he has been portrayed several times (the aforementioned appearance in The Prestige is a good example), and he is known for his intellect and forward thinking in science and technology. Several of his inventions, such as a death ray and wireless power, still belong to the science fictional realm.

What is most astonishing, reading over Tesla's 36-page autobiography, is his ability to conceive of projects and carry them out, understanding them almost completely. He appears to have had a very rare gift, one that borders on the supernatural, or, to some, some sort of mental condition that allowed him unprecedented abilities. In the truest sense of how I see geekdom, Tesla fits all of the marks: a textbook case of following a passion extensively, and changing the world in the process.

Defining Geek History

Before looking at exactly what 'geek history' is, the term must be defined, both to give it relevance and to delimit the content that should be examined. With those elements in mind, an examination of the history behind geekdom becomes much easier, and it also allows one to consider how exactly geek history is in any way important.

A couple of years ago, Ben Nugent published a book titled American Nerd: The Story of My People, a short book that was part biography, part history and part cultural examination. While I wasn't particularly impressed with the book as a whole, there were a number of very good ideas there, particularly in how he defined a geek or nerd. It boils down to a fairly simple concept: a geek/nerd (minus the social connotations) is someone who is extremely passionate about a given subject, learning all that they can about it. They tend to be readers, and because of this attention, there's a tendency to miss out on some social elements that most people take for granted. The subject itself doesn't necessarily matter, and I've generally assumed that geeks/nerds gravitate towards the science fiction / fantasy realms because the content is more appealing.

By this definition, education, literacy and attention to detail are paramount, defining elements of how geeks and nerds are identified. In a country where education often seems to count against an individual, it's all the more important to understand the role that such things play with the public, and to recognize the importance of individuals in the past, and how their actions and knowledge have helped to define the present that we know today.

In a large way, looking at geekdom in history is akin to looking at the major historical figures who had the largest impact through their contributions of conception, rather than just action. These are people who helped develop ideas at a number of different stages, formulating designs, concepts or plans, or helping to see some major undertaking through. With the geek definition in mind, such people also tend to be very hands-on with their work, directly involved in their projects, or single-handedly putting something together that changes how people think about the world afterwards. In some cases, the person is simple to pick out: the author of a notable book, or the director of a film. In other instances, where science and industry are involved, this is more difficult, given the collaborative nature of those projects.

Looking at geek history, then, is looking at the people who changed the future because of their ideas, rather than predominantly through implementing those changes themselves. These creators were instrumental in putting things in place that changed how people interact with the world, and in addition to examining the people behind the advances, it's also important to look at how their works, whether inventions, novels, films or even events, helped to transform the world into a much different place.

Geek history largely comes down to the history of knowledge and ideas. Given the general rise in popularity of geek things, I tend to think of this style of history as one that looks to the past hundred to hundred and fifty years, simply because of its proximity to the modern day and its relevance to the modern geek movement. However, there are elements of this line of thinking that extend far further into the past, mixing science and social histories that can likely go back to the beginning of the examination of thought itself.

The study and appreciation of the modern geek movement should look at the roots and elements that make up the modern geek, from the tools that we use to the entertainment that we soak up to the way that we think and approach the world. It's far more than the stereotypes; it's everything that makes up those stereotypes.

Geek History Month

The website Asylum.com wrote yesterday that August should be Geek History Month, a time to examine the history in all things geek. It's not a website that I have any experience with: their brief announcement had been retweeted by several of the people I follow, and it seems like a good idea.

Geek things seem to be on the rise, from the movies and books that have become increasingly popular with mainstream audiences, to the President of the United States dropping references into photo ops and speeches. Reading the news every day, I feel like I am reading stories of advances, events and situations that could only exist in a science fictional universe, and a dedicated month (while somewhat silly) looking over some of the people, events and works that have created the world we live in seems like a good thing, and a good excuse to write about it.

A society where geeks, and more importantly, their passion for knowledge, science, literature and technology, are valued is something to be treasured indeed. A love for knowledge is something that drives people to achieve great things. In the past century, there has been a remarkable boom in technology, science, and literature that has completely redefined our understanding of the universe, and our very existence.

Last month, I brought my iPad along to my grandmother's, as I'd loaded some pictures from my brother's wedding onto it and wanted to demonstrate what it could do; she had been talking about alternatives to her current internet system, WebTV, which has become increasingly outdated. When I left, I remembered that she had been born in the 1930s, when radio reigned supreme, and that since that time she has seen much in the way of technology: from the first atomic bombs to the first men in space and on the moon, from computers that once filled a room to ones that fit in the hand, and from the first films on the silver screen to digital theaters' ability to bring just about anything to life.

Looking back at the history of the twentieth century, it seems that much of what happened over that time was the product of advancements in knowledge, and of the people who pursued knowledge, took risks, and sought to entertain, and who, along the way, defined our nation and our world by their actions. It's entirely appropriate that these achievements be looked back upon, as everything that has happened in the past has influenced the present and beyond, creating the geeks of today.

The To Read List

With a couple of books finished and out of the way, it’s time to move along with the next book on the reading list. Currently, I’m reading a couple of books for online assignments, and after that, there’s a couple of more, which I haven’t started yet.

Now that I have a very portable computer, I decided to walk around the apartment and see exactly how long my To-Read list really is. And it’s pretty long…

Currently Reading:

How To Live Safely In A Science Fictional Universe, Charles Yu

Last and First Men, Olaf Stapledon

At Bat:

Ambassadors from Earth, Jay Gallentine

Footprints In The Dust, Various

Whirlwind, Barrett Tillman

Stories, Neil Gaiman, ed.

Infoquake, David Louis Edelman

Kraken, China Mieville

River Of Gods, Ian McDonald

Masked, Lou Anders

The Left Hand of Darkness, Ursula K. Le Guin

Snow Crash, Neal Stephenson

Nights of Villjamur, Mark Charan Newton

The City and the City, China Mieville

Next Up:

Shadowbridge, Gregory Frost

The Dervish House, Ian McDonald

Johannes Cabal the Necromancer, Jonathan Howard

Woken Furies, Richard K Morgan

Avandari's Ring, Arthur Peterson

The Sheriff of Yrnameer, Michael Rubens

The Lovely Bones, Alice Sebold

The Legend of Sigurd and Gudrun, JRR Tolkien

The Machinery of Light, David J. Williams

I’ll get to these… Sometime.

Makers, Cory Doctorow

Stardust, Neil Gaiman

World War Z, Max Brooks

Use of Weapons, Iain M. Banks

The Player of Games, Iain M. Banks

Matterhorn, Karl Marlantes

The Gun Seller, Hugh Laurie

Miles from Nowhere, Nami Mun

The Day of Battle, Rick Atkinson

The Battle for Spain, Antony Beevor

D-Day, Antony Beevor

War Made New, Max Boot

Fatal Decision, Carlo D'Este

The Big Burn, Timothy Egan

Race of the Century, Julie Fenster

The Sling and the Stone, Thomas Hammes

1959, Fred Kaplan

The Power Makers, Maury Klein

The Echo of Battle, Brian Linn

Paris 1919, Margaret Macmillan

Triumph Forsaken, Mark Moyar

Combat Jump, Ed Ruggero

The People's Tycoon, Steven Watts

Grave Peril, Jim Butcher

Summer Knight, Jim Butcher

Death Masks, Jim Butcher

Blood Rites, Jim Butcher

Dead Beat, Jim Butcher

Small Favor, Jim Butcher

The Amber Wizard, David Forbes

Good Omens, Neil Gaiman and Terry Pratchett

The White Mountains, John Christopher

Dead Until Dark, Charlaine Harris

Inherit the Stars, James Hogan

A Game of Thrones, George R.R. Martin

A Storm of Swords, George R.R. Martin

A Feast for Crows, George R.R. Martin

A Clash of Kings, George R.R. Martin

Trading in Danger, Elizabeth Moon

Command Decision, Elizabeth Moon

Marque and Reprisal, Elizabeth Moon

His Majesty's Dragon, Naomi Novik

Revelation Space, Alastair Reynolds

Redemption Ark, Alastair Reynolds

Red Mars, Kim Stanley Robinson

The Ghost Brigades, John Scalzi

The Last Colony, John Scalzi

Zoe's Tale, John Scalzi

Atonement, Ian McEwan

The Book on the Bookshelf, Henry Petroski

How to Build Your Own Spaceship, Piers Bizony

Vampire Taxonomy, Meredith Woerner

Edison's Eve, Gaby Wood

Dry Storeroom No.1, Richard Fortey

Hot, Flat and Crowded, Thomas Friedman

The Purpose of the Past, Gordon Wood

The Fourth of July

Fireworks and cookouts, along with the red, white and blue that symbolizes our country, characterize July 4th every year. At the same time, it serves as a good moment to reflect on the creation of the country in which we live. The founding of the country is becoming shrouded in myth, with its own set of misconceptions and relatively unknown happenings, which makes the constant 'Happy Birthday America' status and Twitter updates that I've seen all day somewhat humorous.

When looking at the founding of the country, the 4th is an obvious holiday to examine, for it was the signing of the Declaration of Independence that formally seceded the United States from the United Kingdom, and represented the first time that the colonies stood as a country on their own. However, the founding of the country is something that has happened numerous times throughout our history, and at points, I wonder if the 4th is really a celebration of the beginnings of America, or something else entirely.

If looking at the founding of the country, it is also best to remember that the Europeans who came here weren't the first. The numerous tribes of Native Americans have been on this landmass for thousands of years, presumably since the end of the last ice age, when the glacial sheets receded and isolated the continent. They came down through North America and into Central and South America, creating their own vast civilizations. Vikings under Leif Eriksson landed in Newfoundland, Canada around 985-1008, but later abandoned the settlement. It was not until October 12th, 1492 that Christopher Columbus, with the three ships under his command, the Santa Maria, the Niña and the Pinta, reached the Bahamas, believing that he had reached the Indies, before continuing down towards Cuba and Haiti. Return trips were mounted in the years following his expedition, and soon Europeans were traveling to the newly discovered landmass in larger expeditions. The new world was named 'America', after the Italian explorer Amerigo Vespucci, who recognized that it was not Asia, but a large landmass lying between Europe and Asia. John Cabot, commissioned by Henry VII of England, was among the first Europeans of this era to reach mainland North America, while others charted more and more of the new world.

Looking forward three hundred years, the secession of the United States was preceded by decades of events and mismanagement by the colonies' British overlords, who taxed the colonies to help offset the massive expenditures of war and government abroad. Various measures, such as the Stamp Act, the Molasses Act, the Quartering Act and the tax on tea, fanned the flames of irritation against the British government, inciting riots and protests. The famous Boston Tea Party occurred in 1773, after the British government aided the failing East India Company with the Tea Act, prompting a riot and protest on the part of the Boston merchants. War began two years later in 1775, but clearly the seeds of discontent had been laid far earlier, bringing about the colonies' declaration of independence. In August of 1775, the colonies were declared to be in rebellion; by October of 1781, the British had surrendered, opting by March of the following year not to continue the war, and in November of 1782, the United Kingdom recognized the independence of the United States.

The Continental Congress had begun work on a permanent form of government as far back as 1776, and in March of 1781, as the war was ending, the Articles of Confederation became effective, setting up a government that granted responsibilities but almost no authority to carry them out. A current of distrust of a stronger central government ultimately crippled the Congress: it could not regulate commerce, negotiate treaties, declare war or raise an army, create a currency, or maintain a judicial branch, and it had no head of government separate from the Congress itself. While there were upsides to this arrangement, the government was unable to govern effectively, and a series of crises arose that threatened the stability of the nation. Shays' Rebellion provides a good example: western Massachusetts went into open revolt in 1786 when the legislature failed to provide debt relief. This was but a single example of the times, and advocates of a stronger centralized government increasingly demanded a revision of the Articles of Confederation, for a government that could regulate interstate and international commerce, raise revenue for the country, and field a single army to confront threats. The Constitutional Convention that followed sparked numerous debates over the rights of the states versus the federal government (antifederalists versus federalists, respectively). Despite the intense debate, the Continental Congress closed down on October 10th, 1788, and on March 4th, 1789, the new Congress convened and a new federal government was born, with George Washington (who believed that the Constitution would only last about twenty years) as its first president. In every way, this was the date on which the United States that we know today was formed.

This story of the birth of the United States and of 'America' the concept is important to remember, not only for the sequence of events, each built upon the last, but for their significance in relation to one another. Popular ideology nowadays seems to contort many of the lessons that can be learned from this formative period. British rule was thrown off because of an apathetic and overbearing monarchy that failed to represent the interests of the colonies, not simply because of the taxes levied upon them. To hear senators and public representatives claim that the colonists rebelled simply because of a tax upon tea belies the complicated nature of American independence, the lessons learned in the years afterwards about the failure of a weak centralized government, and the simple fact that the Constitution was not the direct product of the American Revolution but a work in progress, of sorts. America itself has had a series of births and rebirths, and the Declaration of Independence was but one such moment in the history of the nation, concept and location. Still, July 4th is as good a time as any to celebrate the process, and the existence of the nation itself.

Rick Atkinson & History

This summer’s entry in the Todd Lecture series at Norwich University was Pulitzer Prize-winning author Rick Atkinson, former reporter for the Washington Post and author of several books, most notably An Army at Dawn, about the invasion of North Africa (which won him the Pulitzer in history), and more recently The Day of Battle, about the invasion of Italy, both part of his epic trilogy on the liberation of Europe. In an already cluttered field of works on the Second World War in Europe, Atkinson’s books stand out immensely as some of the best on the conflict, and the third book, for which he’s completed the research and is now outlining and writing, will be out in a couple of years, and will undoubtedly be a gripping read.

Atkinson spoke about a topic important and relevant to the history graduates before him: the value of narrative history, and more specifically, the need for a writer to recognize the value of a story within the heady analysis and synthesis of an argument. Personally, I find the division and outright snobbery of most academic circles frustrating, especially when it comes to popular and commercial nonfiction. Within history is a plethora of stories, values, themes and lessons to be breathed, learned and valued, and an essential part of education is bringing the message across to the reader or general audience in a way that lets them comprehend and relate to the contents of any historical text.

Commercial nonfiction has its good and bad elements. Bringing anything to a general audience can water down an argument, and the balance between good stories and good history is one that has to be struck finely. Some authors do this well, and from what I’ve read of Atkinson’s books, he has done just that.

Mainstream history is important. It is what helps to bring the lessons and analysis of the past to the people, and a population that reads and learns from their historians is a population that can intelligently call upon the past to make decisions for the future by comparing their current surroundings to similar happenings in the past. More than ever, this is important, and Atkinson’s talk and follow-up questions help to drive this point home.

Atkinson’s books are in the unique category of bridging the divide between academic and popular reading, and he noted that he refused to believe that history needed to be dry, uninteresting and irrelevant. History should not be relegated to academic circles alone; it should be foremost in the thoughts of the American population.

History is important, not just because of the lessons that are learned from it, but because of the mindset that is required to comprehend it. History is not merely a record of events gone past, but the interpretation and story that those events tell. What is required from those who examine the field is an understanding of how a large number of events, political and societal movements, and individuals come together in a sort of perfect storm to create the past. Much of this is cause and effect, and contrary to popular belief, the past holds no ready answers for the future: it is the understanding of how such events occur, within their individual contexts, that allows for the proper mindset to grasp how similar happenings might occur in the future, and how to prepare for what is to come.

Atkinson’s talk was a good one for students to hear, and different approaches to history are simply the nature of the field. The Military History students who graduated last week have a large number of options open to them, and Atkinson’s talk (and his own stature as a historian) demonstrated that a doctorate isn’t the only way to make a living in the field.

You can watch Mr. Atkinson's talk here.

General Barksdale Hamlett


In 1965, Major General Ernest Harmon retired as the 19th President of Norwich University after a 15-year career in higher education, having presided over one of the largest growth periods in the University's history during the post-war boom, which brought the University a number of new facilities and buildings that still stand today. In his place, General Barksdale Hamlett became the 20th president of the University, after a career that spanned three decades in the United States Army. He attained the rank of four-star general during a volatile time for the United States, serving through the Cuban Missile Crisis and the escalation of the Vietnam War.

Following a major heart attack that nearly killed him in 1964, Hamlett retired from the military, and in 1965 took the reins of Norwich University.(1) In the aftermath of Harmon's rapid growth of the school, numerous issues caused problems. Declining enrollment in the Corps of Cadets was beginning to impact the school's budget, while Hamlett's plan to double the school's endowment from $3.5 million to $6 million progressed slowly, as alumni failed to contribute as much as hoped. Just a week after taking office in July of 1965, Hamlett called alumni support for the school's future "disappointing", noting that only 32.5% of the alumni base had actually contributed to the $500,000 raised at that point.(2)

In light of the financial issues the school faced, Hamlett began to lay the groundwork for what would eventually become massive changes to the school. In January, after only six months on the job, he issued long-range plans for the school to look into integrating a non-military component and student population, noting: "I told the trustees flat out that if you can't accept change, you better prepare yourself for bankruptcy."(3) Additionally, he moved to acquire Vermont College (the acquisition occurred in 1972), a civilian school located in Montpelier, Vermont, with a predominantly female population.(4) In one administration, the roots of the modern makeup of the school were planted, representing a fairly bold vision for the future of the University, with major changes to a largely traditional offering. At that point, Norwich was one of only three schools that were still entirely military in nature.

Currently, Norwich University has a large student population of both military and civilian lifestyle students, although the relationship with Vermont College was dissolved in 2001. Additionally, the school admitted women two years after Hamlett stepped down, and two years before the federal service academies did. Looking at the Hamlett administration, it's fairly clear that there are a number of parallels with the present state of the University.

With the 2008 collapse in global markets, Norwich, like numerous other schools, faced budget problems, which in turn pointed toward solutions for both the University's future and its current difficulties. In 1966, the school's future was in serious doubt, and the University made several drastic changes to its makeup that carry through to the present day: admitting civilians and women to the student body and acquiring Vermont College, which opened the school up to new markets and helped to increase enrollment.

The current problems facing the school have brought some employee cuts, but also a major change in the way the school does business, looking to increase student satisfaction, and thus retention, among students who might otherwise leave. With new dorms and buildings under construction or recently completed, the school is on track for a good recovery, with changes in place to help keep it functioning for years to come. With the 2019 bicentennial coming up, the future of the school appears secure, but the lesson remains: while Norwich has faced significant problems in the past, the option to implement drastic changes, while keeping core values at the heart of the school, should remain open to those in charge of its future.

Hamlett's changes have remained at the school to this day, and have ultimately proved a strong addition to the Norwich experience, allowing students to choose between lifestyles while also learning from the other side of the equation. His arrival brought an 'emphasis on academic enrichment'(5), something that likewise remains to this day, and despite fears that the school would lose its character, his administration demonstrated the central core of the school's focus: educating practical citizens for the future.

1 - 'Hamlett Inaugurated as 20th President', Burlington Free Press, October 26, 1965
2 - 'Norwich Alumni Help Called Disappointing', Burlington Free Press, July 1965
3 - 'Cadets No Longer Submit to Petty Rules; Top Military Schools Have to Ease Rules to Stay in Business', New York Times, May 31, 1972
4 - 'Non Military Students at Norwich?', Times Argus, January 25, 1966
5 - 'Hamlett Inaugurated as 20th President', Burlington Free Press, October 26, 1965