Street Fighter Mathematics

The American Mathematical Society is celebrating April 2010 as Math Awareness Month, and this year's theme is Math and Sports. This was just the excuse I needed to write about the role of mathematics in what I consider an emerging sport: competitive video gaming. Video games haven't yet won popular acceptance as a sport, but the particular type of video game I'll be focusing on does have a real-world analog that is a “sport” in the classical sense of the word. The “fighting game” genre, exemplified by Capcom's Street Fighter series, is the video game counterpart to mixed martial arts competitions. Despite the fantasy indulgences of video games, the mathematical concepts discussed here extend to real martial arts and a variety of other sports.

The basis for this discussion is a branch of mathematics referred to as Game Theory. Game Theory has a wide range of applications, from economics and politics to evolutionary biology. At the heart of Game Theory are games, which have several key components. First, a game needs one or more players. Second, there need to be choices available to these players. Finally, there needs to be a set of rules determining the pay-off for each player – such as whether a player wins or loses. The focus of Game Theory is on strategies: the guidelines a player uses for making choices in the game to obtain the most favorable outcome.

Ask any competitive player about the Game Theory of fighting games and they'll tell you that the genre essentially boils down to Rock, Paper, Scissors. In Rock, Paper, Scissors, each of the two players simultaneously makes a hand gesture for one of the three choices. Rock beats Scissors, Scissors beats Paper, and Paper beats Rock. The goal of Rock, Paper, Scissors is to guess which gesture your opponent is going to throw and to play the gesture that beats it. Keep in mind that your opponent is going through the same line of reasoning, trying to guess which move you think they're going to play so that they can play the move that beats your move!

Mathematically, games like Rock, Paper, Scissors are represented by what's called a pay-off matrix. For a 2-player game, the choices for player 1 might be represented by the rows of the matrix and the choices for player 2 by the columns. The cells of the matrix contain a pair of numbers representing the pay-off to each player. For Rock, Paper, Scissors, we can describe the pay-off by using a 1 if the player wins, a -1 if the player loses, and a 0 if the players tie. This type of game is what we call a zero-sum game, because every win for one player corresponds to a loss for the other player. The pay-off matrix would look like this:

              P2: Rock   P2: Paper   P2: Scissors
P1: Rock      (0,0)      (-1,1)      (1,-1)
P1: Paper     (1,-1)     (0,0)       (-1,1)
P1: Scissors  (-1,1)     (1,-1)      (0,0)

For example, consider the case where player 1 throws Rock and player 2 throws Scissors. We look up the row associated with Rock for player 1 (row 1) and the column associated with Scissors for player 2 (column 3). The cell at row 1, column 3 contains the ordered pair (1,-1), which signifies that player 1 gets a win (+1) and player 2 gets a loss (-1). Furthermore, this game is symmetric, because the options and pay-offs are the same for each player. Rock, Paper, Scissors is often used as a sort of “coin toss” because each player has equal odds of winning. If both players throw a random choice each round, player 1 should win about 1/3 of the time, player 2 should win about 1/3 of the time, and the remaining 1/3 of rounds end in a tie. Rock, Paper, Scissors is a perfectly balanced game; the only way to win consistently is to anticipate your opponent's move in advance.
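To make the pay-off structure concrete, here is a minimal Python sketch (the names are my own, not from any library) that encodes the matrix above and simulates random play, where player 1's wins should settle near 1/3:

```python
import random

# Pay-off to player 1 for each (p1_move, p2_move) pair; the game is
# zero-sum, so player 2's pay-off is the negation.
MOVES = ["rock", "paper", "scissors"]
PAYOFF = {
    ("rock", "rock"): 0, ("rock", "paper"): -1, ("rock", "scissors"): 1,
    ("paper", "rock"): 1, ("paper", "paper"): 0, ("paper", "scissors"): -1,
    ("scissors", "rock"): -1, ("scissors", "paper"): 1, ("scissors", "scissors"): 0,
}

def play_round(p1_move, p2_move):
    """Return (p1_payoff, p2_payoff) for one round."""
    p1 = PAYOFF[(p1_move, p2_move)]
    return p1, -p1

# Simulate many rounds of uniformly random play: wins, losses and ties
# should each converge toward 1/3.
random.seed(0)
results = [play_round(random.choice(MOVES), random.choice(MOVES))[0]
           for _ in range(30000)]
print(results.count(1) / len(results))   # ~1/3 player 1 wins
```

Note how the symmetry of the game shows up directly in the code: swapping the two arguments of `play_round` just swaps the two pay-offs.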

Fighting games are much more complex than Rock, Paper, Scissors, due in part to the large number of choices available to each player. For a game like Street Fighter II, with an 8-way joystick and 6 buttons, there are 576 possible input states at any given moment: 9 stick positions (8 directions plus neutral) times 2^6 button combinations. This is far too many choices for me to list in a table, so for the moment let's focus on a simpler example. Consider a very simple fighting game with only 3 moves: an overhead attack, a high attack, and a low attack. These three attacks are analogous to the choices in Rock, Paper, Scissors: an overhead attack beats a low attack, a low attack beats a high attack, and a high attack beats an overhead attack. The pay-off matrix for this hypothetical fighting game might look like this:

              P2: Overhead   P2: High    P2: Low
P1: Overhead  (0,0)          (-10,10)    (10,-10)
P1: High      (10,-10)       (0,0)       (-10,10)
P1: Low       (-10,10)       (10,-10)    (0,0)

Fighting games normally determine the result of a match by giving each player a set amount of health, which is reduced when that player is hit by the opponent. Perhaps each player starts with 100 health and each attack deducts 10 health from the other player. However, it's important to think of these values not as health added or subtracted but as the net benefit to each player. Dealing damage brings the opponent closer to losing and consequently brings the dealer that much closer to winning. You might think of this net benefit as a game of “tug-of-war”, where each move is a step in the direction of winning. Even the simple act of changing position in a fighting game can benefit one player or the other!

Given that different characters have different strengths and weaknesses, another level of Rock, Paper, Scissors tends to emerge at the character selection level. For example, we might see a game where long range characters tend to have an advantage over short range characters, medium range characters tend to have an advantage over long range characters, and short range characters tend to have an advantage over medium range characters. However, the characters might not all be perfectly balanced with one another.

Indeed, it is often the case that some characters are better or worse than others when you look at all possible match-ups. In fighting games, these are referred to as tier rankings. Consider this example for Street Fighter IV: each character plays against each other character for 10 games, and the number of wins is recorded in the corresponding cell of a match-up table. The ranking is determined by how often a character wins across all possible opponents. In Street Fighter IV, a player choosing Sagat is at an advantage against any given character, while one choosing Dan is at a disadvantage against any given character.
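The ranking procedure described above can be sketched in a few lines of Python. The win counts below are invented for illustration and are not real Street Fighter IV match-up data; only the method (summing each character's wins across all opponents) reflects the text:

```python
# Hypothetical match-up table: wins out of 10 games for the row character
# against the column character (invented numbers, for illustration only).
matchups = {
    "Sagat": {"Ryu": 6, "Dan": 8},
    "Ryu":   {"Sagat": 4, "Dan": 7},
    "Dan":   {"Sagat": 2, "Ryu": 3},
}

def tier_ranking(table):
    """Rank characters by total wins across all of their match-ups."""
    totals = {name: sum(row.values()) for name, row in table.items()}
    return sorted(totals, key=totals.get, reverse=True)

print(tier_ranking(matchups))  # ['Sagat', 'Ryu', 'Dan']
```

With these made-up numbers, Sagat totals 14 wins, Ryu 11, and Dan 5, so the ranking comes out top to bottom as the text suggests.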

In these examples so far, the rules of the game work the same for both players – the pay-off matrices are symmetric. The ability to choose different characters in a fighting game leads us away from symmetric pay-offs into asymmetric pay-offs. Let's consider a variation of our 3-move fighting game in which the pay-offs are a little bit different:

              P2: Overhead   P2: High    P2: Low
P1: Overhead  (-5,5)         (-10,10)    (10,-10)
P1: High      (10,-10)       (0,0)       (-5,5)
P1: Low       (-15,15)       (10,-10)    (5,-5)

Here, player 2 has a stronger overhead attack, but at the expense of a weaker low attack. A question that a game theorist would ask about such a game is “what is the optimal strategy for each player under these conditions?” One way of answering it is to apply the minimax rule: each player is assumed to play so as to minimize the maximum potential loss while maximizing the potential gain. Let's take a look at the best-case and worst-case scenarios for each player:

                 P2: Overhead   P2: High    P2: Low    Min Loss for P1   Max Win for P1
P1: Overhead     (-5,5)         (-10,10)    (10,-10)   -10               10
P1: High         (10,-10)       (0,0)       (-5,5)     -5                10
P1: Low          (-15,15)       (10,-10)    (5,-5)     -15               10
Min Loss for P2  -10            -10         -10
Max Win for P2   15             10          5

Player 2 stands to lose the same amount no matter what player 1 plays (-10), but stands to gain the most by playing overhead. We're presented with a situation where it is to player 2's benefit to use the overhead attack. Player 1 stands to win the same amount no matter what player 2 chooses (10), but has the least to lose by going with the high attack (-5). Knowing that player 2 stands to benefit from using the overhead, player 1 can compensate by playing the high attack.
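The minimax bookkeeping in the table above can be reproduced mechanically. Here is a Python sketch (my own arrangement) using the asymmetric pay-off matrix from the text, stored as pay-offs to player 1:

```python
# Pay-offs to player 1; the game is zero-sum, so player 2's pay-off in
# each cell is the negation of player 1's.
moves = ["overhead", "high", "low"]
payoff_p1 = [
    [-5, -10, 10],   # P1 overhead vs P2 overhead / high / low
    [10,   0, -5],   # P1 high
    [-15, 10,  5],   # P1 low
]

# Worst case (minimum) and best case (maximum) for player 1, per row.
p1_min = [min(row) for row in payoff_p1]
p1_max = [max(row) for row in payoff_p1]

# Player 2's pay-offs are the negated columns of the same matrix.
p2_min = [min(-payoff_p1[r][c] for r in range(3)) for c in range(3)]
p2_max = [max(-payoff_p1[r][c] for r in range(3)) for c in range(3)]

print(p1_min, p1_max)  # [-10, -5, -15] [10, 10, 10]
print(p2_min, p2_max)  # [-10, -10, -10] [15, 10, 5]
```

The output matches the margin columns of the table: player 1's safest row is high (worst case -5), and player 2's largest upside is the overhead column (best case 15).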

However, the picture changes once we consider repeated plays. If both players always adhered purely to their respective optimal strategies, player 1 would consistently win by playing high against player 2's overhead. If player 2 realizes that player 1 is predominantly playing high, it's to his advantage to “mix it up” and play low once in a while to punish player 1 for using a pure strategy. In practice, what we often find in these games are mixed strategies, in which the players choose the different attacks with varying probabilities. Player 1 won't always play high and player 2 won't always play overhead, but they might tend to play these more often than the other choices.
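The expected pay-off under a pair of mixed strategies is just a probability-weighted sum over the cells of the matrix. A Python sketch, using the asymmetric matrix from the text and illustrative (not equilibrium) probability vectors:

```python
# Pay-offs to player 1 from the asymmetric 3-move game in the text.
payoff_p1 = [
    [-5, -10, 10],   # P1 overhead
    [10,   0, -5],   # P1 high
    [-15, 10,  5],   # P1 low
]

def expected_payoff(p1_probs, p2_probs):
    """Average pay-off to player 1 under the given mixed strategies."""
    return sum(p1_probs[i] * p2_probs[j] * payoff_p1[i][j]
               for i in range(3) for j in range(3))

# Illustrative mixes only: player 1 leans on high, player 2 leans on
# overhead, as the text suggests, but still plays everything sometimes.
p1_mix = [0.2, 0.6, 0.2]   # overhead, high, low
p2_mix = [0.6, 0.2, 0.2]
print(expected_payoff(p1_mix, p2_mix))   # ≈ 1.2, slightly favoring player 1
```

A pure strategy is just a mix with probability 1 on one move, so the same function recovers single-cell pay-offs, e.g. `expected_payoff([0, 1, 0], [1, 0, 0])` gives 10 for high versus overhead.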

The study of how players optimize their strategies is a fascinating area of mathematical research. The Nash equilibrium is a pair of mixed strategies, one for each player, such that neither player stands to benefit from changing strategy while the other keeps theirs the same. A closely related concept is the notion of an evolutionarily stable strategy (ESS): once an ESS is adopted by the majority of a population, it cannot be invaded by any alternative strategy.

What impresses me most about playing fighting games is the way players tend to converge on the optimal strategies. Finding the optimal mixed strategy for a two-player bimatrix game (one with different pay-offs for the two players) is known to be a computationally hard problem. The most efficient method (that I know of) for finding a Nash equilibrium is the Lemke-Howson algorithm, which takes exponentially longer to carry out as the number of choices increases. Yet somehow “pro” players seem to naturally discover these strategies despite the huge number of choices in a typical fighting game. In many ways, this process is much like the emergence of evolutionarily stable strategies: the players with bad strategies lose, the players with good strategies win, and the winners gradually converge on an optimal strategy based on the behavior of the other players in the pool. Evolution seems to be quite proficient at solving optimization problems.

At this point I've probably raised more questions than I've answered, but I see this as a “good thing”. There's a wealth of beautiful mathematics taking place in these games and just shedding some light on these problems is all I could hope for. Some of the problems discussed here are ones that will keep me searching for answers in years to come. The next time you find yourself in an arcade (one of the few that remain), take a moment to stop and watch a game of Street Fighter. I hope you'll find it as awe-inspiring as I do.

Core Standards for Mathematics Feedback

What follows below is the feedback I provided on the proposed Core Standards for Mathematics.  These represent my own opinions on the direction mathematics reform should take.  As far as I know, the changes I propose have not been sufficiently supported by research.  However, I hope I may provide a fresh perspective on the direction the mathematics curriculum should take to address some of the existing problems.

I'd like to start with some background about myself, to provide a context for this critique. I was an accelerated math student in high school and majored in Mathematics in college. These views are my subjective interpretations of the mathematics curriculum as I experienced it, and the areas of mathematics in college that I felt unprepared for. My objection to the proposed Core Standards is that I do not view them as reforming these areas where I felt unprepared. Instead, all the Core Standards seem to do is set in stone the same failing curriculum I experienced as a student.

Overall I feel that the Standards for Practice expressed on pages 4-5 are solid goals for mathematics education to strive for. However, my major critique of the Core Standards is that I do not think they are sufficient to meet these goals. Two of the Standards for Practice strike me as being unsupported by the proposed curriculum: (3) Construct viable arguments and critique the reasoning of others and (5) Use appropriate tools strategically. The former requires a solid foundation in logic, set theory and critical thinking, while the latter requires an introduction to computation science. The standards that follow do little to reinforce these skills.

In order for students to construct and critique arguments, they must first know the basic structure of a logical argument. How can students be expected to give valid arguments when the definitions of validity, soundness, completeness, and consistency are omitted from the mathematics standards? The only objective that seemed to actually address this was in the High School Algebra Standards: “Understand that to solve an equation algebraically, one makes logical deductions from the equality asserted by the equation”. I respect that this objective is included, but I don't think the curriculum leading up to it adequately supports it. When I was first exposed to college-level mathematics, the notation system of formal logic was used extensively by my professors. My education leading up to that point had not covered this notation, and it felt like I was learning a second language in addition to the mathematical concepts. The notions of formal logic are not complicated, but earlier exposure to these ideas would have made me more prepared for college mathematics.

In my opinion, the K-8 standards are too focused on number and computation. The objectives covered in the K-8 curriculum reflect a notion of number consistent with the view of mathematics in the early 1800s. In 1889, when Peano's axioms were introduced, the mathematical notion of a “natural number” changed, and since the early 1900s the natural numbers have been redefined in terms of set theory. Students need a concept of number that is consistent with the modern set-theoretic constructions. The pen-and-paper computations covered in elementary school are valuable, but have in many instances been replaced by technology. It's not enough for students to know how to perform addition; the modern student also needs to know that the operation of addition is analogous to joining sets. In mathematics, number theory is built upon the foundation of set theory. Focusing on numbers before sets is illogical considering the hierarchy of mathematical knowledge.

Set theory, in turn, is based on logic, which is also missing in action. Several of the objectives mention making logical conclusions about problems, but where's the logic? The mathematical definitions of “and”, “or”, “xor”, “not”, “nor”, “nand”, “implies”, “for every”, “there exists” and “proves” are absent from the standards. The relationship between the basic operations of logic and those of arithmetic needs to be thoroughly established. This is not only important from a mathematical standpoint; it is essential to learning how computational technology works. The operations of arithmetic can be built from the basic building blocks of logic, and that is how computers manage to produce the calculations that they do. All students will work with technology in some manner or another, and developing an understanding of how that technology works will make them more effective users of technology.
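As a concrete illustration of arithmetic built from logic, here is a sketch of a ripple-carry adder in Python, using only XOR, AND, and OR as primitive operations (the function names are my own). This is essentially how hardware adds integers:

```python
# A one-bit half adder: XOR gives the sum bit, AND gives the carry bit.
def half_adder(a, b):
    return a ^ b, a & b          # (sum, carry)

# A full adder chains two half adders and ORs the carries together.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2           # (sum, carry_out)

def add_bits(x_bits, y_bits):
    """Add two equal-length little-endian bit lists with a ripple carry."""
    out, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 3 (binary 011) + 6 (binary 110) = 9 (binary 1001),
# with bits listed least-significant first.
print(add_bits([1, 1, 0], [0, 1, 1]))  # [1, 0, 0, 1]
```

Nothing here "knows" arithmetic; addition emerges entirely from the logical connectives, which is exactly the relationship between logic and arithmetic argued for above.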

The focus of the K-8 curriculum should be to develop students' understanding of formal axiomatic systems. Mathematics should be presented in the form of a game: the rules of the game determine the patterns that are produced in the process. Too much focus on the outcomes underemphasizes the importance of the rules in the first place. Algebra, in particular, requires students to rewrite expressions using the properties of numbers. The failure of the curriculum is that students have no prior experience with substitution systems; the algebra student is essentially thrown into the game without knowing the rules. It's no wonder that algebra presents such a challenge for mathematics education.

In the High School Standards, my main objection is to the separation between the general curriculum and the STEM curriculum. Most of the objectives labeled as STEM objectives should be included in the general curriculum. STEM students need a more thorough picture of mathematics than the one presented here. As an example, complex numbers are treated as a STEM-only topic in the standards, yet the Fundamental Theorem of Algebra depends on the field of complex numbers and all students should be exposed to this result. Students preparing for STEM need to go beyond complex numbers to constructions such as dual numbers and quaternions, which would help prepare them to acquire a more general notion of Clifford algebras in college. These tools may seem obscure to traditional educators, but they are essential tools for physicists and engineers. Quaternions have several useful properties that make them ideal for modeling rotations, and dual quaternions can be used to represent rigid transformations.
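To make the quaternion suggestion concrete, here is a minimal sketch of rotating a 3D vector by quaternion conjugation, v' = q v q⁻¹ (my own illustration; quaternions are stored as (w, x, y, z) tuples):

```python
import math

def quat_mul(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(v, axis, angle):
    """Rotate vector v by `angle` radians about the unit vector `axis`."""
    half = angle / 2.0
    s = math.sin(half)
    q = (math.cos(half), axis[0]*s, axis[1]*s, axis[2]*s)
    q_inv = (q[0], -q[1], -q[2], -q[3])   # conjugate = inverse for unit q
    w, x, y, z = quat_mul(quat_mul(q, (0.0,) + tuple(v)), q_inv)
    return (x, y, z)

# Rotating the x-axis 90 degrees about the z-axis yields the y-axis.
print(rotate((1, 0, 0), (0, 0, 1), math.pi / 2))
```

This is the kind of computation the standards could build toward: the half-angle in the rotation quaternion is exactly the sort of property that rewards knowing the algebra rather than memorizing a formula.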

One of my pet peeves about high school mathematics is that the picture of physics presented there is radically different from the picture presented in college. As an example, consider the following algebra problem: “Joe and Sue live 10 miles apart. Joe heads towards Sue's house at 5mph and Sue heads towards Joe's house at 3mph. If they both leave at the same time, how long until they meet?” Questions like this are usually represented as linear equations, like “3x+5x=10”. However, this gives students the false impression that it is an accurate model of velocity in the real world. The fact of the matter is that this equation is only reasonable for small numbers; a change in the numbers in the problem could invalidate the model. Consider the modified question: “Joe and Sue live 10 light-years apart. Joe heads towards Sue's house at .8c and Sue heads towards Joe's house at .9c. If they both leave at the same time, how long until they meet?” Under these figures, a linear model of time is no longer accurate. Time slows for Joe and Sue relative to a stationary observer, and the question of “how long until they meet” is more complicated than it appears on the surface. If they each start a stopwatch at the moment of departure, their stopwatches will show different readings when they meet. Students in a STEM course of study need to understand that velocity in space-time is fixed at the speed of light, and that what we perceive as motion is a rotation of this space-time vector.
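The relativistic version of the problem can be worked out numerically. A Python sketch (my own worked example) of the stationary-frame meeting time and each traveler's time-dilated stopwatch, with velocities as fractions of c, distances in light-years, and times in years:

```python
import math

distance = 10.0          # light-years
v_joe, v_sue = 0.8, 0.9  # fractions of c

# In the stationary frame the gap still closes at v_joe + v_sue,
# so the meeting time in that frame is the familiar linear answer.
t_frame = distance / (v_joe + v_sue)

# But each traveler's own stopwatch runs slow by the Lorentz factor,
# so the proper times they record differ from each other.
def proper_time(t, v):
    return t * math.sqrt(1.0 - v * v)

print(round(t_frame, 2))                       # ~5.88 years, stationary observer
print(round(proper_time(t_frame, v_joe), 2))   # ~3.53 years on Joe's stopwatch
print(round(proper_time(t_frame, v_sue), 2))   # ~2.56 years on Sue's stopwatch
```

The single linear answer splits into three different elapsed times, which is precisely the point: “how long until they meet” depends on whose clock you ask.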

There are several other topics that I consider important for STEM students that go beyond the standards. The study of triangles in geometry needs to extend to ordered triangles, and the linear algebra needed to manipulate such triangles. The notion of “angle” as presented in the high school curriculum is insufficient. Students need to start thinking about angles as they relate to the dot product (inner product) of vectors. The trigonometric functions of sine and cosine need to be connected to complex exponents by way of Euler's formula: e^(ix) = cos x + i sin x. The properties of logarithms also need to be explicitly covered in the curriculum. Covering these notions in the high school curriculum would make students better prepared for subsequent studies of calculus and geometry.

Some of these suggestions may seem like obscure areas of mathematics, but my argument is that they shouldn't be. If the purpose of K-8 is to develop an understanding of what formal axiomatic systems are, then the focus of 9-12 should be on discovering the useful properties that result from the standardly accepted axioms. I once conducted a job interview where a mathematics student, like me, was applying for a software engineering position, also like me. My employer had brought me into the interview to determine whether or not the candidate was ready to apply his mathematical expertise to computer programming. During the interview, the candidate mentioned quaternions as one of his areas of interest. In the software developed at this company, orientations and rotations are routinely stored and manipulated as quaternions. When I asked how he would use quaternions to compute a rotation, he was stumped. He became extremely interested at that point and was eager to learn more about the technique. My question had revealed that the abstract mathematics he was familiar with had a real purpose behind it – a practical use within the field of computer science. It's this kind of disconnection between abstract and applied mathematics that seems to be one of the major problems with mathematics education.

Abstraction plays a large role in mathematics, and it's usually the use of mathematics in other disciplines that connects it to students' real-world experiences. I was fortunate enough to have learned most of my mathematical knowledge in the context of computer science. Mathematics and computer science share a good deal of common ground, and working with computers has become an essential skill for career-readiness in modern times. Learning how technology works adds to one's ability to use that technology effectively. The Standards for Practice call for students to use computational tools proficiently, but the lack of standards addressing how that technology works will hinder the attainment of that goal. When students use computers or calculators to produce computations, they need to know that they are not working with “real numbers”: not all real numbers are computable. Until the mathematics curriculum prepares students to tackle such notions, students will not be college or career ready.
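A quick Python illustration of the point that computers are not working with the real numbers: binary floating point can only represent a finite subset of the rationals, so even a sum as simple as 0.1 + 0.2 is not exactly 0.3, while exact rational arithmetic behaves the way the algebra suggests.

```python
from fractions import Fraction

# Binary floating point cannot represent 0.1 or 0.2 exactly, so the
# familiar identity fails at machine precision.
print(0.1 + 0.2 == 0.3)   # False
print(0.1 + 0.2)          # 0.30000000000000004

# Exact rational arithmetic recovers the identity.
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```

A student who has seen this once is far less likely to treat a calculator's output as the mathematical truth rather than an approximation of it.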

One of the considerations for the Core Standards is that they address 21st century skills. Reading the mathematics standards, I do not think that this consideration has been met. The mathematics included in this curriculum is dated and fails to address the advances made in the past century. The Core Standards for math focus on applying known algorithms to problems with known solutions. A 21st century education needs to focus on creating and analyzing algorithms. Students not only need to know algorithms for solving mathematics problems, but need to be able to think critically about the efficiency of those algorithms. The Core Standards are a step in the wrong direction in this regard. All these standards will accomplish is to cater mathematics education to the specific problems they cover. Teachers will teach to the test, and the critical thinking skills hinted at in the Standards for Practice will be lost in the assessment process.

I appreciate that these standards are chosen based on evidence from educational research. However, I think that the research supporting these standards is biased by the currently existing assessments. The evidence showing that American students are behind in math means there is still a gap between the standards in place and the skills students need to be college and career ready. A fixed set of national standards is not a viable solution to the problem. What the education system needs is a solid framework for an experimentation cycle in which standards are continually tested and revised to meet the changing needs of students.

Denialist Misrepresentations of Math and Evolution

This is so me right now.

I generally try to avoid flamebait, but I saw this article linked off of Twitter.  I should have stopped reading after the first section where it's clear that the author is a troll.    Evolution and science denialism aside, the misrepresentation of mathematics in the article is inexcusable.

After attacking Darwin and scientific thought in general (an appeal to emotion), he proceeds to a second-hand quote from a philosopher on the subject of "fallacies". It's kind of ironic that the inclusion of this quote serves as an appeal to authority.

Next, he goes into intelligent design saying:

we could find incontrovertible evidence that reality, matter, life, has been designed, but that interpretation of the evidence would be discarded because naturalism dictates the exclusion of anything which might lead outside of a naturalistic explanation.

This is absolutely false.  Scientific theories are necessarily falsifiable.  If the evidence implied a "design", that's what the scientific theory would be.  The fact is that the evidence points to the contrary.  Biology shows a picture of  "unintelligent design", consistent with a process of genetic mutations occurring over time.  The naturalistic explanation is the one that the evidence supports.

Then he claims that Gödel's Incompleteness Theorem proves this.

He managed to get Gödel's basic background information right, but incorrectly describes the Incompleteness Theorem.

From the article:

  1. Any system that is consistent is incomplete.
  2. The consistency of axioms (axioms=assumptions that cannot be proven) cannot be proved from within the system.

The real Incompleteness Theorems:

  1. Any effectively generated theory capable of expressing elementary arithmetic cannot be both consistent and complete. In particular, for any consistent, effectively generated formal theory that proves certain basic arithmetic truths, there is an arithmetical statement that is true, but not provable in the theory (Kleene 1967, p. 250).
  2. For any formal effectively generated theory T including basic arithmetical truths and also certain truths about formal provability, T includes a statement of its own consistency if and only if T is inconsistent.

Notice how the part about "basic arithmetic" is conveniently left out of the definition?  That's because the author doesn't want you to know that there can exist axiomatic systems which are both complete and consistent.  First-order predicate logic was proven to be both complete and consistent by none other than Gödel himself.  Furthermore, saying that the Incompleteness Theorem "utterly [destroyed] atheist Bertrand Russell’s logical system for arithmetic" doesn't give Russell the credit he deserves.  Gödel's technique was based on the same idea as Russell's Paradox to begin with.  Despite its incompleteness, the development of Russell's work into Zermelo-Fraenkel set theory was an important building block in the foundation of later mathematics.  By referring to him as "atheist Bertrand Russell", it's clear that the author is more concerned about religion than the actual mathematics.

Next we have a very weak analogy.  He describes three items on a table and says:

Now draw a circle around those items.  We will call everything in that circle a system.  Gödel’s first theorem tells us that nothing in the circle can explain itself without referring to something outside the circle.

It's true that Gödel's theorem succeeded in "stepping out of basic arithmetic", but here's where that omitted condition of a "formal system capable of basic arithmetic" comes into play.   Are a half-full cup of coffee, a fishing pole and a jacket capable of arithmetic?  If the answer is no, then Gödel's theorem doesn't apply.  Capable of self reference?  Maybe if the coffee mug says "I'm a half full cup of coffee" on it.

The analogy of a computer is a much better example.  Computer programs are capable of basic arithmetic.  What Gödel's theorem implies for computers is that there exist certain programs which are computationally irreducible.  The only way to determine the output of such a program is to run it.   If we think of Nature like a computer program, the only way to be certain of the future "output" is to let Nature run its course.   This result does not prevent science from making conjectures about the structure of  Nature, but requires that science adopt a Black-box testing procedure which entails experimentation and observation.  There are certainly unanswerable questions in science, such as the precise position and momentum of elementary particles, but evolution isn't one of them.   The evidence for evolution is incontrovertible.

The final section shifts the analogy to the universe, and the claim is that what's outside the universe is unknowable. Just because we can't see what's outside the universe (which would be white-box testing) doesn't mean we can't make and test hypotheses about it as a "black box". The Many-worlds interpretation of quantum theory is one such example, predicting that our universe is but one of many possible universes. Similarly, M-theory predicts the existence of hidden dimensions beyond space and time. Just because some questions are unanswerable doesn't mean all questions are.

The article ends by claiming that evolution and naturalism are "fallaciously circular", but here's the real circular fallacy:

  1. Author misinterprets Gödel's theorem to imply that all axiomatic systems are incomplete or inconsistent.
  2. Author mistakenly assumes that science is an axiomatic system.
  3. Based on this misinterpretation, author concludes that science must be incomplete or inconsistent.
  4. Since author concludes that complete scientific knowledge is incomplete or inconsistent, author ceases to look for empirical evidence of scientific claims.
  5. Since author ceases to look for evidence, author does not find any evidence.
  6. Since author does not find any evidence, author concludes that scientific knowledge is incomplete.
  7. As a consequence, the author's incomplete knowledge becomes a self-fulfilling prophecy.

This whole article is a Proof by Intimidation.   The "average Joe" doesn't know enough about contemporary math and science to go through and verify each detail.  The use of mathematics vocabulary in the article is deliberately being used to distract the reader from the real issue -- the overwhelming evidence for evolution.   The references to Gödel's Incompleteness Theorem are nothing more than a red herring, and the author even misstates the theorem to boot.

The Misunderstood Generation

I picked up Mark Bauerlein's The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Or, Don't Trust Anyone Under 30) (DG) yesterday and have been up all night reading it. Not because I enjoyed it, but because it made me angry. I should have anticipated this, considering how I'm 27 and the sub-title of the book is “Don't Trust Anyone Under 30”. I'm a part of the generation Bauerlein is talking about, and I consider this book a biased pseudo-scientific misrepresentation of myself and my peers.

It's important to note that I probably represent a fringe case within the generation. I read regularly and tend towards non-fiction literature. The fact that I bought this book in the first place is evidence that I'm an outlier. I play several musical instruments, saxophone and guitar being my favorites. I taught myself how to program in high school and designed websites for local businesses. I started out as a Math Major in college, but eventually double majored in Mathematics and Psychology because I was fascinated with learning how the human mind works. After graduating, I pursued another love of mine, video games, and landed a job as a programmer at a game studio. After a few years, I decided that I wanted to make video games that fostered the development of critical thinking skills. I enrolled myself in graduate school and started teaching remedial math. I'm also a complete technophile, love the latest gadgets and gizmos, and can't stand more than a day without being connected to the Net.

Bauerlein talks quite negatively of video games, and I don't think this criticism is well founded. There's a substantial amount of mathematics that can be found in video games. Gamer communities like the Elitist Jerks (http://elitistjerks.com) use spreadsheets and simulation programs to mathematically optimize stats and equipment in World of Warcraft. These massively multiplayer online games are complex mathematical systems, complete with virtual economies and social interaction. “Casual” players might not experience the same depth of content, but the “hardcore” players participate in a substantial amount of meta-gaming and often reflexively analyze their performance to foster continued improvement. I think it's unfair to devalue competitive video gaming as simply a leisure activity; I consider such play to be equally as intellectual as playing Chess or Go. I would also note the considerable amount of mathematics, science, and art involved in making the video game itself. From my personal experience, learning to play and create video games directly contributed to my interest in math, science and engineering. There are a myriad of video games that are trivial and superficial, but there are also games I would call “higher art” that challenged my perceptions about storytelling in an interactive medium. Bauerlein doesn't even address the topic of video games as “higher art”. He treats the entire medium as if it were completely devoid of any social value altogether.

What kinds of media does Bauerlein suggest in video games' place? A variety of gems including Harry Potter, Dante, Milton, A Christmas Carol, Rush Limbaugh, Fox News, and the Bible. Bauerlein tries to portray the problem as a culture war, but these repeated references to religiously themed works also reveal an ideological difference. They were probably intended as generic books and news sources, but the choices show a pattern of right-wing religious bias. The whole argument is framed as a dichotomy between the conservative-religious-elders and the liberal-secular-youth, the latter personified by technology. It appears that Bauerlein is more upset about students not reading his culturally biased list of literature than he is about the real faults of our nation's education system.

These are bold claims, but there are good reasons to be skeptical of DG. The information is all second-hand, and no new research is presented. The data that is presented is not even organized into a coherent framework: a series of disconnected statistics are piled on, one after the other, with no consistency in procedure. In isolation, they each sound like reasonable results. However, the data is mostly tangential to the central thesis about the role of technology in producing these trends. The argument gradually turns into “proof by verbosity”, focusing largely on differences in cultural and ideological values, which are not scientifically falsifiable hypotheses to begin with. The book repeatedly appeals to “tradition” as an authority, as if the previous generation had some mysterious source of ancient wisdom. Science is conducted in the open. Clinging to ideas out of tradition alone is not the way to foster progress.

There are a couple of points in particular that seem suspect. First, there is the inconsistency between falling rates of factual recall and rising averages on IQ tests. Memorization skill and intelligence are two entirely separate constructs, and the obvious explanation for this phenomenon is that the collection of information worth memorizing has changed while general problem-solving ability hasn't. Second, the largest drop in the included performance statistics seems to take place after the turn of the millennium, which is suspicious given the 2001 passage of the No Child Left Behind Act. It's difficult to compare data from before and after a major legislative change that mandates changes in how student performance is assessed and how teachers teach. There is not enough data here to rule out other changes in the educational process as an alternative explanation. In a scientific study, the data should speak for itself. The data presented in DG shows that there is a significant need for improvement in education, but it's not enough to indict technology as the singular cause of the problem.

Another point worth making is that DG suffers from a combination of selection and actor-observer biases. In defending Generation M, I'm partially guilty of this myself. I'm an intellectual person and tend to associate with like-minded people, so I have a tendency to generalize the behaviors of my peer group to the generation as a whole. I think Bauerlein is guilty of this too: he probably associates with intellectual types and may therefore incorrectly generalize that intellectualism to his generation as a whole. The second fallacy is the tendency to attribute observed behaviors to personality traits instead of to the situation. As Bauerlein acknowledges, it's not unusual for teens to go through a rebellious phase, and their technology usage might just be an expression of this. Consider another option: what if Gen M-ers are being honest when they say the information they're being taught isn't relevant to their lives? These questions certainly merit additional consideration.

This is a commercial product intended to sell copies, not a peer-reviewed study in a scientific journal. The reviews on the book's cover are all from popular media sources rather than the scientific community. Some of Bauerlein's statistics are certainly interesting, but I don't think they demonstrate anything close to a causal relationship between technology usage and intelligence. He doesn't bother to define “intelligence” and tends to use it interchangeably with “knowledge”. I would also have liked to see an effort to normalize the data and plot it over time against technology usage rates. He cites plenty of sources showing a deficiency in these skills, but there are still too many external factors to point to technology as the source of the problem. The fact that learners process web information differently than print materials just shows that the two mediums need different approaches.

The language of the book is highly emotionally charged and employs numerous stereotypical persuasive devices. It identifies a common enemy for readers to rally against, uses cultural references to which older readers will relate closely, and tries to make readers feel like part of something larger than themselves. Even the choice of title and cover art seems designed to trigger an emotional response rather than promote rational, intelligent discourse. I found it particularly interesting how Bauerlein tries to present jazz as a higher art form in opposition to modern rap and rock. The irony is that jazz was all about “breaking the rules”, reversing established chord progressions, and it eventually laid the foundation for the very modern music that Bauerlein seems to despise so thoroughly.

During my undergraduate study, there were times when I found myself relearning subjects from new perspectives. Gödel's Incompleteness Theorem completely changed how I thought about mathematics and computing. The theorem states that any consistent formal axiomatic system capable of expressing basic arithmetic will contain statements that are neither provable nor disprovable within that system. Math ceased being about prescribed procedures and memorization and turned into an exploration of how different sets of hypothetical rules might behave. It stopped being about blindly following the rules and instead tolerated the bending, or even breaking, of them. Part of me wishes math could have been that way from the beginning. I would like to hand the “past me” a variety of different rule sets and let him explore how they work in a controlled environment. That's precisely why I think games have such potential as an educational medium. They don't need to be video games; board, card, dice, and pen-and-paper games have beautiful and complex mathematical structures lurking just below the surface of their rules.
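For readers who want the precise version, the first incompleteness theorem is usually stated along these lines (my paraphrase; the hedges about consistency and arithmetic strength are essential and easy to omit):

```latex
% Gödel's First Incompleteness Theorem (1931), paraphrased:
% Let $T$ be a consistent, effectively axiomatized formal theory
% strong enough to express elementary arithmetic. Then there is a
% sentence $G_T$ in the language of $T$ such that
\[
  T \nvdash G_T
  \qquad \text{and} \qquad
  T \nvdash \neg G_T .
\]
```

In other words, $G_T$ is true-but-unprovable relative to $T$, and no amount of adding axioms escapes the pattern, since the strengthened system has its own Gödel sentence.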

My active rejection of traditional values is different from the passive indifference implied by DG. I might be a statistical anomaly in this cohort, but I don't think I'm alone. Bauerlein might reject the notion that the problem is in the situation and not in the students, but my experience showed me that many things presented as “facts” in middle and high school were quickly replaced by better models in college. Newtonian Physics became M-theory, Math became Meta-Math, and Technology Use evolved into Software Engineering. DG suggests that the curriculum is not “hard” enough, so maybe we just need to stop diluting the truth. I wish I had been taught Logic and Set Theory in grade school. I want “past me” to be allowed the opportunity to build a solid foundation for the “real” math I'll encounter in the “real” world. I don't want to “learn the wrong way now, learn the right way in college”. Why should I trust an authority figure that routinely hides the truth from me because “it's too hard”?