Wake up Virginia District 4! Stop the hate!

/rant on

Earlier this week, the Virginian Pilot published an article entitled Forbes versus LeGrow: In God, only one trusts. Rather than focusing on the candidates’ stances on political issues, the article focuses solely on the candidates’ differing religious beliefs. Not only is this coverage thoroughly distasteful, but some of the comments added by readers demonstrate a sickening level of ignorance and intolerance. Voters in Virginia’s 4th Congressional District need to look past religion this November. To do otherwise is to reinforce a culture of bigotry and hate that has plagued this great nation for far too long.

Allow me to start by correcting Mr. Forbes on the language used in the Declaration of Independence. The “Creator” referred to in the Declaration of Independence is not “God” as used in the Christian sense of the word. Rather, the word “Creator” is used here as a metaphor for “Nature”. The Treaty of Tripoli clarifies this, explicitly stating that the US “is not, in any sense, founded on the Christian religion”.

Secondly, Mr. Forbes swore an oath to uphold the Constitution of the United States. Included in the 1st Amendment of the Constitution is the following:

Congress shall make no law respecting an establishment of religion

Mr. Forbes has sponsored two bills which, if passed, would violate this Amendment:

  • H.Con.Res.274 attempts to reaffirm “In God We Trust” as a national motto
  • H.Res.397 falsely characterizes the founding of this nation as being religious in nature

By proposing this legislation, Mr. Forbes has made it clear that he has no intention to adhere to his oath to uphold the Constitution and is therefore unfit to hold office. Mr. Forbes also started a “Congressional Prayer Caucus”, further blurring the line between church and state.

That’s not to mention the fact that Mr. Forbes also participated in Glenn Beck’s rally on 8-28-10, an event which was coincidentally held on the anniversary of Martin Luther King Jr.’s famous “I have a dream” speech and at the same location. Mr. Forbes’ attendance at this event is an implicit endorsement of Beck’s platform. The issues with such an endorsement are too numerous to list here, so instead I’ll point to this clip from The Colbert Report and leave it at that.

The real issue that I want to address here is the reactions from the VA Democrats in the Pilot article. State Delegate Lionell Spruill says “I can’t take him to churches as an atheist… That would hurt me.” Really? Contrary to popular belief, atheists do not spontaneously combust upon entering churches. Spruill is not in danger of being physically hurt by bringing Dr. LeGrow into a church. Instead, the issue seems to be that Spruill is afraid that supporting an atheist for office may harm his chances for re-election. This behavior is inconsistent with the Democratic party’s platform, which says that the party is committed to “[e]nding racial, ethnic, and religious profiling” (emphasis mine). Heck, even the Republican platform condemns this type of behavior. The US Constitution explicitly states that “no religious Test shall ever be required as a Qualification to any Office or public Trust under the United States”. It doesn’t get much clearer than that.

Part of the problem is with constituents like Rev. Jake Manley Sr., who says in the Pilot article: “I could not vote for a man who doesn’t believe in some power higher than his.” Really, this is just a euphemism for “I could not vote for an atheist”. This is religious profiling, and not even very subtle at that! If he had instead said “I could not vote for a Black/White/Mexican/Asian/Christian/Jew/Muslim” there would be public outrage! But for some reason, people think it’s okay to engage in blatant discrimination against atheists. It’s not.

My message to my fellow voters in VA-4 is to not let religion cloud your vision this November. Here we have an opportunity to replace an incumbent who has ignored his Congressional oath with a doctor who cares about providing people with medical care, better education, and a clean energy future. Vote with reason; your nation needs it right now.

/rant off

Rationalize This!

It’s been a while since I blogged, so I thought I’d take a moment to talk about a couple of interesting math problems that came up in a conversation I, @SuburbanLion, had on Twitter with @MathGuide, @RepublicOfMath, and @GMichaelGuy about a month ago. The topic of discussion was the role of rationalization problems in Algebra II and whether or not the current curriculum addresses the “conceptual core” of these problems.

A common example that one would see in an Algebra II course is:

  • Rationalize the denominator of

        \[\frac{1}{\sqrt{2}}\]

Or simply:

  • Rationalize

        \[\frac{1}{\sqrt{2}}\]

The expected answer for this problem can be obtained by multiplying the numerator and denominator both by \(\sqrt{2}\) to get \(\frac{\sqrt{2}}{2}\). Students might be assigned dozens of such problems in Algebra II. The question we should be asking is “Why?”.

I suspect that the obvious answer is historical tradition. These math problems have been passed down from generation to generation as “standard Algebra II problems”, and even the new Core Standards includes “rewrite expressions using radicals and rational exponents” as an objective. Instead of treating these problems as a “means to an end”, these problems have become something of an end in themselves. Algebra II students learn to rationalize expressions because that’s what they’re going to be tested on. End of story.

The real reason for having these problems in Algebra II goes deeper than that. The Core Standards hits on this reason (at least partially) with one of the additional objectives: “Understand that rational expressions form a system analogous to the rational numbers, closed under addition, subtraction, and division by a nonzero rational expression.” While this objective is not specifically talking about radical expressions, the core concept is the same. More concisely, we might say that the important point of rationalizing radical expressions is to come to the conclusion that

    \[\mathbb{Q}+\mathbb{Q}\sqrt{p}\]

is a field.
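To see concretely what that means, here is a minimal sketch (my own, in plain Python with exact rational arithmetic) of \(\mathbb{Q}+\mathbb{Q}\sqrt{2}\): sums and products stay in the set, and every nonzero element has an inverse in the set, which is exactly what rationalizing a denominator produces.

    from fractions import Fraction as F

    # Numbers of the form a + b*sqrt(2) with rational a and b.
    class QSqrt2:
        def __init__(self, a, b):
            self.a, self.b = F(a), F(b)

        def __add__(self, other):
            return QSqrt2(self.a + other.a, self.b + other.b)

        def __mul__(self, other):
            # (a + b*r)(c + d*r) = ac + 2bd + (ad + bc)*r, using r^2 = 2
            return QSqrt2(self.a * other.a + 2 * self.b * other.b,
                          self.a * other.b + self.b * other.a)

        def inverse(self):
            # a^2 - 2b^2 is nonzero for any nonzero element, since sqrt(2) is irrational
            d = self.a ** 2 - 2 * self.b ** 2
            return QSqrt2(self.a / d, -self.b / d)

        def __repr__(self):
            return f"{self.a} + {self.b}*sqrt(2)"

    print(QSqrt2(0, 1).inverse())  # 0 + 1/2*sqrt(2), i.e. the textbook answer to 1/sqrt(2)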

Ironically, the 93-page Core Standards does not even mention the word “field”, even though this is essentially the core concept that students are learning about. Students shouldn’t be rationalizing expressions just for the sake of rationalizing expressions; they should be exploring the intermediate steps between the field of rational numbers and the field of algebraic numbers. By the way, the phrase “algebraic number” doesn’t appear in the Core Standards either!

To make things interesting, G. Michael Guy presented a couple of rationalization problems that are typically not included in Algebra II problem sets. These are actually really great examples of the potential complexity involved in rationalization problems.

Let’s warm up with the first one, which is significantly easier:

  • Rationalize

        \[\frac{1}{1+\sqrt[3]{2}}\]

This one has an elegantly simple solution using the sum of cubes factorization:

    \[a^{3}+b^{3} = (a + b)\cdot(a^{2}-ab+b^{2})\]

Making the substitutions \(a = 1\), \(b = \sqrt[3]{2}\), we can take advantage of this product to rationalize the denominator:

    \[\frac{1}{1+\sqrt[3]{2}}\cdot\frac{1-\sqrt[3]{2}+\sqrt[3]{2}^{2}}{1-\sqrt[3]{2}+\sqrt[3]{2}^{2}} = \frac{1-\sqrt[3]{2}+\sqrt[3]{2}^{2}}{1+\sqrt[3]{2}^{3}} = \frac{1-\sqrt[3]{2}+\sqrt[3]{2}^{2}}{3}\]

This solution is simple enough that it could almost pass for an Algebra II problem. Rationalization problems in Algebra II are something of a gimmick: the problems chosen are special cases designed to have an easy answer. The methods taught in the textbook will solve the given problems, but they don’t generalize well to the larger class of problems. The techniques used in Algebra II provide very little help in rationalizing an expression like \(\frac{1}{1+\sqrt[3]{5}+\sqrt[3]{3}^{2}+\sqrt[3]{2}}\)!

Before we attempt something like this, let’s go back to \(\frac{1}{1+\sqrt[3]{2}}\) and come up with a method that will generalize well. This is where easy problems come in handy as a test bed for discovering the broader patterns. The general case of rationalizing \(\frac{1}{x_{0}+x_{1}\sqrt[3]{2}+x_{2}\sqrt[3]{2}^{2}}\) can reasonably be done by hand.

Rationalization of 1/(1+2^(1/3)) by hand

The method used here is not likely to be seen in a typical Algebra II classroom, as it relies on concepts from Linear Algebra which typically aren’t addressed until later. Seems a little backwards if you ask me. The trick here is to think of the product

    \[(x_{0}+x_{1}\sqrt[3]{2}+x_{2}\sqrt[3]{2}^{2})(y_{0}+y_{1}\sqrt[3]{2}+y_{2}\sqrt[3]{2}^{2})\]

as the product of a matrix \(M_{x}\) and a vector \(\overrightarrow{y}\). Rationalizing is then a matter of solving:

    \[M_{x}\cdot\overrightarrow{y} = [1, 0, 0]^{T}\]

Solving this equation is then simply a matter of multiplying both sides by \(M_{x}^{-1}\). This method extends nicely to even harder problems, including \(\frac{1}{1+\sqrt[3]{5}+\sqrt[3]{3}^{2}+\sqrt[3]{2}}\). Inverting a 27 by 27 matrix is not something I’d want to be doing by hand, so this is where computers come in handy. Here’s an algorithm for rationalizing \(\frac{1}{1+\sqrt[3]{5}+\sqrt[3]{3}^{2}+\sqrt[3]{2}}\) in Sage. Compare this with Wolfram|Alpha’s result.
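As a rough illustration of the matrix method, here is a pure-Python sketch (my own, not the linked Sage code) for the easier case of \(\frac{1}{x_{0}+x_{1}\sqrt[3]{2}+x_{2}\sqrt[3]{2}^{2}}\); the 27 by 27 problem is the same construction over a bigger basis.

    from fractions import Fraction as F

    # Multiplication by x0 + x1*c + x2*c^2, where c = 2^(1/3), acts linearly on
    # the coefficient vector (y0, y1, y2).  Using c^3 = 2, the matrix is:
    def mul_matrix(x0, x1, x2):
        return [[F(x0), 2 * F(x2), 2 * F(x1)],   # coefficient of 1 in the product
                [F(x1), F(x0),     2 * F(x2)],   # coefficient of c
                [F(x2), F(x1),     F(x0)]]       # coefficient of c^2

    def solve3(M, b):
        """Gaussian elimination on an exact 3x3 system M*y = b."""
        A = [row[:] + [v] for row, v in zip(M, b)]
        for i in range(3):
            p = next(r for r in range(i, 3) if A[r][i] != 0)  # find a pivot row
            A[i], A[p] = A[p], A[i]
            A[i] = [v / A[i][i] for v in A[i]]                # scale pivot to 1
            for r in range(3):
                if r != i:                                    # clear the column
                    A[r] = [v - A[r][i] * w for v, w in zip(A[r], A[i])]
        return [row[3] for row in A]

    # Rationalize 1/(1 + c): solve M_x * y = (1, 0, 0).
    y = solve3(mul_matrix(1, 1, 0), [F(1), F(0), F(0)])
    print([str(v) for v in y])  # ['1/3', '-1/3', '1/3'], i.e. (1 - c + c^2)/3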

I’m a strong believer that computing should play a larger role in mathematics education than it presently does. Not only should the curriculum address the fact that the algebraic numbers form a field, but also that all algebraic numbers are computable numbers. By shifting the focus of discussion from solving problems to finding an algorithm for solving those problems, we can reveal a better picture of the mathematics behind the problem. Simple problems worked out by hand play an important role in the process of designing an algorithm, but an ability to generalize the solution should be the larger goal. If the rationalization problems that students are completing by the dozens do not lead the student in the direction of a general solution, then those problems are not doing their job. Perhaps some “harder” problems are necessary to encourage that generalization.

For the record, “computable numbers” are not referenced once in the Core Standards. It’s hard to give students a 21st century education when the math curriculum is trapped in the early 1900s.

Critical Thinking Chat With Howard Rheingold

Rheingold’s Model of Critical Thinking

Last night I sat in on a conversation with Howard Rheingold on Critical Thinking and have been perusing some of his many videos online.  I learned about the event through the #edchat community on Twitter and followed along on the accompanying #hrchat tag throughout the event. I wanted to take this opportunity to reflect on what I learned out of the experience.

I think Howard’s perspective on critical thinking is concisely summarized by what he describes as 21st Century Literacies. He defines “literacies” as a combination of skills and community.  While slightly different from the classical definition of “literacy”, which refers to a person’s ability to read and write, I think this is a powerful metaphor for the direction that education needs to take.  Borrowing the words of Lawrence Lessig, it moves the conversation about critical thinking from a “read only” culture to a “read-write” culture.  Indeed, I think this transition is very important and agree with Howard on the necessity of his five 21st Century Literacies:

#1: Attention

In order for any kind of learning to take place, the learner must focus on the knowledge to be acquired.  Indeed, if I had not been paying attention to the discussion then I wouldn’t have anything to be writing about here!  One of the things I liked about Howard’s approach is that he doesn’t seem to view social networking as a plague on the classroom, but rather recognizes that there can be a balance in attention between focus on a single task and multi-tasking.  Those attending the discussion know that in addition to the audio discussion taking place, there was also a chat room overflowing with information and a Twitter feed.  Attention works two ways: information is “read” from various sources and a “write” process takes place where attention is re-focused as necessary.

#2: Participation

I think one of the key features of critical thinking is that it is an active process, not a passive one.  It’s not sufficient for me to simply watch and listen; it is necessary for me to participate in the discussion.  If I were not to put my own thoughts forward in this post, I would be denied the experience of finding out how others react to what I might have to say. If I get something wrong here, I hope that someone out there will correct me!  The “read” process of participation, the act of reading or listening to information, is essential, but the “write” process of producing new information is equally important.

#3: Cooperation

Like Howard, I value the importance of critical thinking in our culture.  By myself alone, I don’t have the authority to make the cultural changes necessary for critical thinking to permeate society at large.  Cooperation makes it possible for like-minded individuals to achieve more than any one could do on his/her own.  The important first step is to come in social contact with these individuals to foster collaboration.  This is essentially the “read” component of cooperation.  Had it not been for my exposure to Twitter, I would likely have never known about the powerful learning community that exists there.  The “write” component of cooperation is the collaboration of multiple individuals to produce rich content like the discussion I experienced yesterday.

#4: Critical Consumption (Crap Detection)

The literacy of critical consumption, or “crap detection”, seemed to be the major focal point of Howard’s talk.  This is certainly a valuable skill in the 21st century as there is a lot of “crap” out there on the Internet.  The focus of Howard’s “Crap Detection 101” is on determining the validity of Internet sources.  The #hrchat discussion opened with a barrage of “fake” resources.  The process of determining which sources are reliable and which ones are not is mostly a “read” process.  Looking up the information provider in a search engine or querying WHOIS for the domain holder are some of the ways to identify where the information is coming from.  There’s also a “write” component to critical consumption: the reliability of sources is also determined by a vetting process where other individuals vouch for the accuracy of a source.  As Howard notes, this vetting process for Internet sources is not quite as reliable as it may have been for print sources.  The Internet does not contain a centralized review board that attempts to validate every webpage, blog and tweet.

#5: Network Awareness

Out of the 5 literacies, this is arguably the most vaguely defined.  Howard himself has suggested that this literacy may need a better name.  This notion of a “network” is a powerful concept in a variety of disciplines.  In mathematics, we have the notion of graphs, which consist of multiple points connected by edges.  In computing, we have digital networks, which are multiple computers connected by data transmissions.  In sociology, we have social networks, where multiple individuals are connected through communication.  In biology, we have neural networks, which form the brain.  I think what Howard is getting at is that in order for critical thinking to take place, it is necessary to understand the context that learning is taking place in.  Learners in the 21st Century are connected through both digital and social networks, and understanding the architecture of those networks is necessary for actively participating in them.  I feel that the reason this is so important is that it is a foundation for metacognition, the skill of critical thinkers to “think about thinking”.  However, this literacy also contains a community of connections, and it’s the relation between ourselves and our connections that allows critical thinking to take place.  One of the points that Howard emphasizes is that, like the Internet, this learning network is decentralized.  There is no one single authority overseeing the entire network.

I’d like to propose an alternative term for this literacy: Swarm Intelligence.  Swarm Intelligence is much like the phenomenon seen in ant colonies.  Each member of the swarm acts autonomously based on its own sensory information.  However, as large numbers of these individuals group together, the swarm displays higher-level behavior than any one individual.  Swarms are decentralized, self-organizing, intelligent agents.  Each agent “reads” information locally through senses and “writes” information locally by making decisions, but complex patterns emerge in the behavior of large numbers of these agents.  Swarm Intelligence allows for phenomena like “smart mobs” to take place.  I think the important point to be made about swarms is that the global behavior of the swarm can be changed radically by introducing “viral” changes in the behavior of individuals.  Knowing how changes in one’s local behavior will affect the global behavior of the community is an essential part of critical thinking.

Refining Rheingold’s Model

While I agree with the necessity of these 5 literacies, my main critique is that I feel there are 2 important literacies which are absent from this list.  In the discussion of Internet searches, Howard identifies two steps in the process of knowledge acquisition when using a search engine.  The first is to frame the question to come up with the search terms, and the second is to verify the legitimacy of the results.  My critique of this is that I feel this is only half the process of what I consider critical thinking to be.  In addition to searching for the answer and verifying the reputation of the discovered sources, I think that critical thinking also entails the process of making logical deductions from this information and scientifically testing the results.  I’d also argue that these processes are not just skills, but “literacies” in Howard’s definition of the term.

Proposed #6: Logical Reasoning

Much of Howard’s discussion focuses on how to distinguish between reliable and unreliable information.  However, I think that this underemphasizes the importance of making the correct logical deductions from the information which has been determined to be reliable.  To draw an analogy, “Critical Consumption” is to the “soundness” of an argument as “Logical Reasoning” is to the “validity” of an argument.  This is more than just “Participation” in that there is a clear distinction between what is true and what is false.  The simple act of participation is sufficient in the creation of art and music, but in fields like mathematics and philosophy there is a need for creation to be strictly restrained within the rules of the system.  Logical reasoning is a skill that is used by an individual, but there is also a need for these logical deductions to be verified by the community.  One can “write” a proof, but it doesn’t qualify as a proof until others “read” and verify it.  The importance of community involvement is evident in the history of mathematics and philosophy, but at the same time the truth of this information is independent of the author’s reputation.  It’s not true by an authority, but rather it is true because of the abstract nature of truth itself.

Proposed #7: Scientific Experimentation

The scientific method is arguably one of the most rigorous approaches to distinguishing truth from fiction.  To quote Steven Schafersman: “Critical thinking can be described as the scientific method applied by ordinary people to the ordinary world.”  I’ve chosen to label this literacy as “Scientific Experimentation”, to make a slight distinction between it and the “scientific method”.  The scientific method is a specific skill, whereas what I’m attempting to describe is the combination of this skill with a community of experts.  “Scientific Experimentation Literacy” is a cycle of experimentation, evaluation, publication and review that produces new scientific knowledge.  I feel that understanding how this process works is a crucial part of critical thinking.  Many steps in the scientific method are covered by the previous literacies.  Observation requires both “Attention” and “Critical Consumption”.  Forming a hypothesis is an act of “Participation”.  Experimentation requires an understanding of “Network Awareness”, as the experimenter has a local influence on the phenomena being measured.  Data analysis requires an act of “Logical Reasoning”.  Peer review is an act of “Cooperation”.  Where I feel this literacy differs is in the act of forming conclusions.  While determining the legitimacy of sources is an important literacy, one is still ultimately taking the author’s word for it.  As the old adage goes: “A lie is a lie even if everyone believes it and the truth is the truth even if no one believes it”.  In contrast to logical arguments, scientific truth is not something abstract that can be known a priori.  The scientific experiment is the last line of defense for a critical thinker.  A well designed scientific experiment can prove even the most credible sources wrong.  Even the Standard Model of Physics is subject to revision, as we’re beginning to see in experiments conducted with high energy particle accelerators.  This is different than “Network Awareness”, or “Swarm Intelligence”, in that there is a centralized authority on the true information, which we call Nature.  The scientific experiment “reads” data from Nature, and “writes” conclusions which are then verified by independent reproduction in the community.

What’s the point of all these literacies?

One of the more vaguely answered questions in the #HRchat with Howard is how teachers should encourage critical thinking in the classroom. The general response was that critical thinking skills were implicit in everyday activities.  Tuesday’s #edchat topic was on how teachers can assess critical thinking in the classroom, which also seemed to remain an open question.  While some may insist that critical thinking is not something that can be taught or assessed, I think that identifying these “critical thinking literacies” may provide a scaffold for allowing teachers to better address these problems.  Each of these literacies does cover certain skills which teachers can provide guidance on and possibly measure results in.  However, as Howard points out, it’s the combination of these skills with learning communities that allows critical thinking to emerge.  By providing a combination of instruction in critical thinking skills with a community that supports the critical thinking process, maybe we as teachers can help a generation of critical thinkers to flourish.


An Open Letter To Barack Obama On National Day of Prayer

Dear President Barack Obama,

I’m writing today to urge you to reconsider your position on the National Day of Prayer. I was most displeased to hear that you will continue to acknowledge the National Day of Prayer, despite the recent Supreme Court Ruling of its unconstitutionality. I feel that Judge Crabb’s ruling in this case was correct. While the White House argues that this ruling does not prevent you from issuing a Presidential Proclamation recognizing this day, doing so ostracizes a significant body of your constituents and contradicts the spirit of the Constitution.

I would like to say that I supported you in the 2008 election. During your campaign, you presented yourself as a man of reason and principle. Having moved from California to Virginia earlier that year, I felt like my vote made a difference for the first time in my life. This feeling was reaffirmed during your acceptance speech when you specifically thanked “non-believers” among other groups. As an atheist, this was the first time I had heard any President speak of “non-believers” in a positive light. I felt a glimmer of hope that CHANGE was possible.

Since then, that glimmer of hope has been gradually dying out. You promised to end the war in the Middle East, were awarded the Nobel Peace Prize, and yet the military occupation continues. You promised to end the arrests of medical marijuana users acting in accordance with state laws, and yet the DEA raids have continued. Dreams of single-payer health care were reduced to hopes for a public-option, and eventually turned into “be happy you got any health care reform at all”. You promised an environmentally conscious energy policy, but shifted your stance to support offshore drilling about a month before the BP oil spill. You continue to proclaim support for ending “Don’t Ask, Don’t Tell”, but I’m beginning to doubt that this will go through either.

In the times of old, people prayed for things that were beyond their control. People prayed for rain in periods of drought. When that didn’t work, they offered virgin sacrifices. Nowadays these practices are largely obsolete. Instead of praying for rain, we build aqueducts and irrigation systems. Instead of praying for the sick to improve in health, we intervene with medical treatment. While some people continue to pray in times of desperation, I am not one of them. For myself and others like me, the act of prayer is considered an ineffective method for bringing about change. Actions consistently provide better outcomes than prayers. This is my request to you: instead of a Day of Prayer, proclaim May 6th as a Day of Action. The American people didn’t elect you to office to “pray for change”, they elected you to “act for change”.

Make no mistake, such a declaration would undoubtedly draw heat from the religious community. Bear in mind that we atheists suffer through this discrimination every day of our lives. Hate mail and death threats are no strangers to atheists who speak their minds. The separation of church and state is one of the founding principles of this nation, set forth in the Constitution that you have sworn to uphold, and I hope that you can set aside your personal views to uphold the rights of the “non-believers” who helped elect you to your present position. To pursue an appeal of Judge Crabb’s decision is a waste of government resources. There are more pressing matters that need your attention.

Please Mr. President, use May 6th to bring us a real moment of “peace and goodwill” by withdrawing our nation’s troops from their posts overseas. Prove your commitment to treat everyone with “dignity and respect” by ending “Don’t Ask, Don’t Tell”. Show that the right “to love one another” extends to everyone, including those in the LGBT community, by making a motion to repeal the “Defense of Marriage Act”. Set an example for what it means “to understand one another” by not alienating non-believers with a “Day of Prayer”. Do these and show that CHANGE is not beyond our control.

Street Fighter Mathematics

The American Mathematical Society is celebrating April 2010 as Math Awareness Month and the theme for this year is Math and Sports. This was just the excuse I needed to write about the role of mathematics in what I consider an emerging sport: competitive video gaming. Video games haven’t yet reached popular acceptance as a sport, but the particular type of video game that I’ll be focusing on does have a real world analog which is a “sport” in the classical sense of the word. The “fighting game” genre, exemplified by Capcom‘s Street Fighter series, is the video game counterpart to mixed martial arts competitions. Despite the fantasy indulgences of video games, the mathematical concepts discussed here can be extended to real martial arts and a variety of other sports.

The basis for this discussion is a branch of mathematics referred to as Game Theory. Game Theory has a wide range of applications from economics and politics to evolutionary biology. At the heart of Game Theory are games, which have several key components. First, a game needs one or more players. Second, there need to be choices available to these players. Finally, there needs to be a set of rules determining the pay-off for each player – such as whether a player wins or loses. The focus of Game Theory is on strategies, which are guidelines that a player uses for making choices in the game to obtain the most favorable outcome.

Ask any competitive player about the Game Theory of fighting games and they’ll tell you that the genre essentially boils down to Rock, Paper, Scissors. In Rock, Paper, Scissors, each of the 2 players simultaneously makes a hand gesture for one of the three choices. Rock beats Scissors, Scissors beats Paper, and Paper beats Rock. The goal of Rock, Paper, Scissors is to guess which gesture your opponent is going to throw and to play the gesture that beats it. Keep in mind that your opponent is going through the same line of reasoning, trying to guess which move you think they’re going to play so that they can play the move that beats your move!

Mathematically, games like Rock, Paper, Scissors are represented by what’s called a pay-off matrix. For a 2 player game, the choices for player 1 might be represented by different rows in the matrix and the choices for player 2 would be the different columns. The cells of the matrix contain a pair of numbers representing the pay-off for each player. For Rock, Paper, Scissors, we can describe the pay-off by using a 1 if the player wins, a -1 if the player loses, and a 0 if the players tie. This type of game is what we call a Zero-sum game, because every win for one player corresponds to a loss for the other player. The pay-off matrix would look like this:

              P2: Rock  P2: Paper  P2: Scissors
P1: Rock      (0,0)     (-1,1)     (1,-1)
P1: Paper     (1,-1)    (0,0)      (-1,1)
P1: Scissors  (-1,1)    (1,-1)     (0,0)

For example, consider the case where player 1 throws Rock and player 2 throws Scissors. We look for the row associated with Rock for player 1, row 1, and the column associated with Scissors for player 2, column 3. The contents of the cell at row 1 column 3 is the ordered pair (1,-1), which signifies that player 1 gets a win (+1) and player 2 gets a loss (-1). Furthermore, this game is symmetric because the options and pay-offs are the same for each player. Rock, Paper, Scissors is often used as a sort of “coin toss” because each player has equal odds of winning. If both players throw a random choice each round, player 1 should win about 1/3 of the time, player 2 should win about 1/3 of the time, and the remaining 1/3 of rounds result in a tie. Rock, Paper, Scissors is a perfectly balanced game. The only way of winning consistently is to be able to anticipate your opponent’s move in advance.
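As a quick sanity check on those odds, here is a small simulation sketch (plain Python; the payoff values follow the matrix above):

    import random

    # Payoff to player 1: PAYOFF[i][j], rows and columns ordered Rock, Paper, Scissors.
    PAYOFF = [[ 0, -1,  1],
              [ 1,  0, -1],
              [-1,  1,  0]]

    def simulate(rounds=100_000):
        wins = losses = ties = 0
        for _ in range(rounds):
            outcome = PAYOFF[random.randrange(3)][random.randrange(3)]
            if outcome > 0:
                wins += 1
            elif outcome < 0:
                losses += 1
            else:
                ties += 1
        return wins / rounds, losses / rounds, ties / rounds

    print(simulate())  # all three proportions come out near 1/3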

Fighting games are much more complex than Rock, Paper, Scissors, due in part to the large number of choices available to each player. For a game like Street Fighter II, with an 8-way joystick and 6 buttons, there are 576 possible inputs at any given point in time (9 joystick states, counting neutral, times 2^6 = 64 button combinations). This is far too many choices for me to list in a table, so for the moment let’s focus on a simpler example. Consider a very simple fighting game with only 3 moves: an overhead attack, a high attack, and a low attack. These three attacks are analogous to the choices in Rock, Paper, Scissors. An overhead attack beats a low attack, a low attack beats a high attack, and a high attack beats an overhead attack. The pay-off matrix for this hypothetical fighting game might look like this:

              P2: Overhead  P2: High  P2: Low
P1: Overhead  (0,0)         (-10,10)  (10,-10)
P1: High      (10,-10)      (0,0)     (-10,10)
P1: Low       (-10,10)      (10,-10)  (0,0)

Normally, fighting games determine the result of a match by starting each player with a set amount of health, which is reduced when hit by the opponent. Perhaps each player starts with 100 health and each attack deducts 10 health from the other player. However, it’s important not to think of these values as adding or subtracting health but as the net benefit towards that player. Dealing damage to the opponent moves the opponent closer to losing and consequently moves the dealer that much closer to winning. You might think of this net benefit like “tug-of-war”, where each move is a step in the direction of winning. Even the simple act of changing position in a fighting game can have benefits towards one player or another!

Given that different characters have different strengths and weaknesses, another level of Rock, Paper, Scissors tends to emerge at the character selection level. For example, we might see a game where long range characters tend to have an advantage over short range characters, medium range characters tend to have an advantage over long range characters, and short range characters tend to have an advantage over medium range characters. However, the characters might not all be perfectly balanced with one another.

Indeed, it is often the case that some characters are better or worse than others when you look at all possible match-ups. In fighting games, these are referred to as Tier Rankings. Consider this example for Street Fighter IV. Each character plays 10 games against each other character, and the number of wins is recorded in the corresponding cell in the match-up table. The ranking is determined by how often a character wins across all possible opponents. In Street Fighter IV, playing Sagat is at an advantage against any given character, while Dan is at a disadvantage against any given character.

In these examples so far, the rules of the game work the same for both players – the pay-off matrices are symmetric. The ability to choose different characters in a fighting game leads us away from symmetric pay-offs into asymmetric pay-offs. Let’s consider a variation of our 3-move fighting game in which the pay-offs are a little bit different:

              P2: Overhead  P2: High  P2: Low
P1: Overhead  (-5,5)        (-10,10)  (10,-10)
P1: High      (10,-10)      (0,0)     (-5,5)
P1: Low       (-15,15)      (10,-10)  (5,-5)

Here, player 2 has a stronger overhead attack but at the expense of a weaker low attack. A question that a Game Theorist would ask in such a game is “what is the optimal strategy for each player under these conditions?”. One way of looking at this is by applying the Minimax rule. Each player is assumed to minimize potential losses then maximize potential gains. Let’s take a look at the best and worst case scenario for each player:

                 P2: Overhead  P2: High  P2: Low   Min Loss for P1  Max Win for P1
P1: Overhead     (-5,5)        (-10,10)  (10,-10)  -10              10
P1: High         (10,-10)      (0,0)     (-5,5)    -5               10
P1: Low          (-15,15)      (10,-10)  (5,-5)    -15              10
Min Loss for P2  -10           -10       -10
Max Win for P2   15            10        5

Player 2 stands to lose the same no matter what player 1 plays (-10), but stands to gain the most by playing overhead. We’re presented with a situation where it is now to player 2’s benefit to use the overhead attack. Player 1 stands to win the same no matter what player 2 chooses (10), but he has the least to lose by going with the high attack (-5). Knowing that player 2 stands to benefit from using the overhead, player 1 can compensate for this by playing high attack.

However, the picture changes as we add repeated plays. If both players always adhered purely to their respective optimal strategies, player 1 would consistently win by playing high against player 2’s overhead. If player 2 realizes that player 1 is predominantly playing high, it’s to his advantage to “mix it up” and play low once in a while to punish player 1 for using a pure strategy. In practice, what we often find in these games are mixed strategies in which the players choose the different attacks with varying probabilities. Player 1 won’t always play high and player 2 won’t always play overhead, but they might tend to play these more often than the other choices.

The study of how players optimize their strategies is a fascinating area of mathematical research. A Nash Equilibrium is a pair of mixed strategies for player 1 and player 2 such that neither player stands to benefit from changing their strategy while the other keeps theirs the same. A closely related concept is the notion of an Evolutionarily Stable Strategy (ESS). Once an ESS is adopted by the majority of the population, it will not be invaded by any alternative strategy.
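As it happens, the asymmetric example above is still zero-sum (the payoffs in every cell cancel out), so player 1’s equilibrium mixture can be found with a single linear program rather than a general bimatrix solver. Here is a minimal sketch using SciPy:

    import numpy as np
    from scipy.optimize import linprog

    # Payoffs to player 1 in the asymmetric 3-move game.  Rows are P1's
    # Overhead/High/Low, columns are P2's.  Player 2's payoffs are just -A.
    A = np.array([[ -5, -10,  10],
                  [ 10,   0,  -5],
                  [-15,  10,   5]])

    # Variables: p0, p1, p2 (P1's mixed strategy) and v (the game value).
    # Maximize v subject to (A^T p)_j >= v for every column j, sum(p) = 1, p >= 0.
    # linprog minimizes, so we minimize -v.
    c = np.array([0, 0, 0, -1.0])
    A_ub = np.hstack([-A.T, np.ones((3, 1))])  # rows encode v - (A^T p)_j <= 0
    b_ub = np.zeros(3)
    A_eq = np.array([[1, 1, 1, 0.0]])
    b_eq = np.array([1.0])
    bounds = [(0, None)] * 3 + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    p, v = res.x[:3], res.x[3]
    print(p, v)  # roughly (0.20, 0.54, 0.26) for P1, game value 4/7 per exchange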

What impresses me most about playing fighting games is the way players tend to converge on the optimal strategies. Finding the optimal mixed strategy for a two-player bi-matrix game (one with different pay-offs for different players) is known to be a computationally hard problem. The most efficient method (that I know of) for finding a Nash Equilibrium is the Lemke-Howson Algorithm, which takes exponentially longer to carry out as the number of choices increases. Yet somehow “pro” players seem to naturally discover these strategies despite the huge number of choices in a typical fighting game. In many ways, this process is much like the emergence of evolutionarily stable strategies. The players with bad strategies lose, the players with good strategies win, and the winners gradually converge on an optimal strategy based on the behavior of the other players in the pool. Evolution seems to be quite proficient at solving optimization problems.

At this point I’ve probably raised more questions than I’ve answered, but I see this as a “good thing”. There’s a wealth of beautiful mathematics taking place in these games and just shedding some light on these problems is all I could hope for. Some of the problems discussed here are ones that will keep me searching for answers in years to come. The next time you find yourself in an arcade (one of the few that remain), take a moment to stop and watch a game of Street Fighter. I hope you’ll find it as awe-inspiring as I do.

Core Standards for Mathematics Feedback

What follows below is the feedback I provided on the proposed Core Standards for Mathematics.  These represent my own opinions on the direction mathematics reform should take.  As far as I know, the changes I propose have not been sufficiently supported by research.  However, I hope I may provide a fresh perspective on the direction the mathematics curriculum should take to address some of the existing problems.

I’d like to start with some background about myself, to provide a context for this critique. I was an accelerated math student in high school and majored in Mathematics in college. These views are my subjective interpretations of the mathematics curriculum as I experienced it, and the areas of mathematics in college that I felt unprepared for. My objection to the proposed Core Standards is that I do not view them as reforming these areas where I felt unprepared. Instead, all the Core Standards seem to do is set in stone the same failing curriculum I experienced as a student.

Overall I feel that the Standards for Practice expressed on pages 4-5 are solid goals for mathematics education to strive for. However, my major critique of the Core Standards is that I do not think they are sufficient to meet these goals. Two of the Standards for Practice strike me as being unsupported by the proposed curriculum: (3) Construct viable arguments and critique the reasoning of others and (5) Use appropriate tools strategically. The former requires a solid foundation in logic, set theory and critical thinking, while the latter requires an introduction to computation science. The standards that follow do little to reinforce these skills.

In order for students to construct and critique arguments, the students must first know the basic structure of a logical argument. How can students be expected to give valid arguments when the definitions of validity, soundness, completeness, and consistency are omitted from the mathematics standards? The only objective that seemed to actually address this was in the High School Algebra Standards: “Understand that to solve an equation algebraically, one makes logical deductions from the equality asserted by the equation”. I respect that this objective is included, but don’t think that the curriculum leading up to it adequately supports it. When I was first exposed to college level mathematics, the notation system of formal logic was used extensively by my professors. My education leading up to that point had not covered this notation, and it felt like I was learning a second language in addition to the mathematical concepts. The notions of formal logic are not complicated, but earlier exposure to these ideas would have made me more prepared for college mathematics.

In my opinion, the K-8 standards are too focused on number and computation. The objectives covered in the K-8 curriculum reflect a notion of numbers consistent with the view of mathematics in the early 1800s. In 1889, when Peano’s Axioms were introduced, the mathematical notion of a “natural number” changed. The “natural numbers” have been redefined in terms of set theory since the 1900s. Students need to have a concept of numbers that is consistent with the modern set-theoretic constructions. The pen-and-paper computations covered in elementary school are valuable, but have in many instances been replaced by technology. It’s not enough for students to know how to perform addition, but the modern student needs to also know that the operation of addition is analogous with joining sets. In mathematics, number theory is built upon the foundation of set theory. Focusing on numbers before sets is illogical considering the hierarchy of mathematical knowledge.

Set theory is also based on logic, which is also missing in action. Several of the objectives mention making logical conclusions about problems, but where’s the logic? The mathematical definitions of “and”, “or”, “xor”, “not”, “nor”, “nand”, “implies”, “for every”, “there exists” and “proves” are absent from the standards. The relationship between the basic operations of logic and those of arithmetic needs to be thoroughly established. This is not only important from a mathematical standpoint, but it is essential to learning how computational technology works. The operations of arithmetic can be built from the basic building blocks of logic, and that is how computers manage to produce the calculations that they do. All students will work with technology in some manner or another, and developing an understanding of how that technology works will make them more effective users of technology.
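To make that concrete, here is a minimal sketch (plain Python, with a hypothetical 8-bit width) of addition assembled from nothing but AND, OR, and XOR:

    # A one-bit full adder built only from logical operations.
    def full_adder(a, b, carry_in):
        s = a ^ b ^ carry_in                         # sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit
        return s, carry_out

    def add(x, y, bits=8):
        """Add two integers by chaining full adders, one per bit."""
        result, carry = 0, 0
        for i in range(bits):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(add(19, 23))  # 42, computed without ever applying '+' to the operands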

The focus of the K-8 curriculum should be to develop students’ understanding of formal axiomatic systems. Mathematics should be presented in the form of a game. The rules of the game determine the patterns that are produced in the process. Too much of a focus on the outcomes underemphasizes the importance of the rules to begin with. Algebra, in particular, requires students to rewrite expressions using the properties of numbers. The failure of the curriculum is that students have no prior experience with substitution systems. The algebra student is essentially thrown into the game without knowing the rules. It’s no wonder that algebra presents such a challenge for mathematics education.
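For illustration, a toy substitution system can be played like a game long before formal algebra. The rules below are made up for the example, but the activity of applying rewrite rules and watching what patterns emerge is the point:

    # A toy substitution system: repeatedly rewrite a string with fixed rules.
    RULES = {"A": "AB", "B": "A"}

    def step(s):
        return "".join(RULES.get(ch, ch) for ch in s)

    s = "A"
    for _ in range(5):
        print(s)
        s = step(s)
    # A, AB, ABA, ABAAB, ABAABABA: the string lengths follow the Fibonacci numbers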

In the High School Standards, my main objection is to the separation between the general curriculum and the STEM curriculum. Most of the objectives labeled as STEM objectives should be included in the general curriculum. STEM students need a more thorough picture of mathematics than that presented here. As an example, complex numbers are treated as a STEM-only topic in the standards. The Fundamental Theorem of Algebra depends on the field of complex numbers and all students should be exposed to this result. Students preparing for STEM need to go beyond complex numbers to constructions such as dual numbers and quaternions which would help prepare students to acquire a more general notion of Clifford algebras in college. These tools may seem obscure to traditional educators, but are essential tools to physicists and engineers. Quaternions have several useful properties that make them ideal for modeling rotations, and dual quaternions can be used to represent rigid transformations.

One of my pet peeves about high school mathematics is that the picture of physics presented there was radically different than the picture presented in college. As an example, consider the following algebra problem: “Joe and Sue live 10 miles apart. Joe heads towards Sue’s house at 5mph and Sue heads towards Joe’s house at 3mph. If they both leave at the same time, how long until they meet?” Questions like this are usually represented as linear equations, like “3x+5x=10”. However, this gives students the false impression that this is an accurate model of velocity in the real world. The fact of the matter is that this equation is only reasonable for small numbers. A change in the numbers included in this problem could invalidate the model. Consider the modified question, “Joe and Sue live 10 light-years apart. Joe heads towards Sue’s house at .8c and Sue heads towards Joe’s house at .9c. If they both leave at the same time, how long until they meet?” Under these figures, a linear model of time is no longer accurate. Time slows for Joe and Sue relative to a stationary observer, and the question of “how long until they meet” is more complicated than it appears on the surface. If they each start a stopwatch at the moment of departure, their clocks will have different readings when they meet. Students in a STEM course of study need to understand that velocity in space-time is fixed at the speed of light, and what we perceive as motion is a rotation of this space-time vector.
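For the curious, the relativistic arithmetic works out roughly as follows (a sketch in units where c = 1, with distances in light-years and times in years):

    import math

    # Numbers from the modified problem, all measured in the "stationary" frame.
    d, v_joe, v_sue = 10.0, 0.8, 0.9

    t = d / (v_joe + v_sue)  # meeting time in the stationary frame: ~5.88 years

    gamma = lambda v: 1 / math.sqrt(1 - v ** 2)
    tau_joe = t / gamma(v_joe)  # Joe's stopwatch reads ~3.53 years
    tau_sue = t / gamma(v_sue)  # Sue's stopwatch reads ~2.56 years

    print(t, tau_joe, tau_sue)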

There are several other topics that I consider important for STEM students that go beyond the standards. The study of triangles in geometry needs to extend to ordered triangles, and the linear algebra needed to manipulate such triangles. The notion of “angle” as presented in the high school curriculum is insufficient. Students need to start thinking about angles as they relate to the dot product (inner product) of vectors. The trigonometric functions of sine and cosine need to be connected to complex exponents by way of Euler’s formula: “e^{ix} = cos x + i sin x”. The properties of logarithms also need to be explicitly covered in the curriculum. Covering these notions in the high school curriculum would make students better prepared for subsequent studies of calculus and geometry.
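This is also where computers make exploration cheap; a one-off numerical check of Euler’s formula might look like:

    import cmath, math

    x = 0.75  # any angle in radians
    lhs = cmath.exp(1j * x)
    rhs = complex(math.cos(x), math.sin(x))
    print(lhs, rhs, abs(lhs - rhs) < 1e-12)  # the two sides agree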

Some of these suggestions may seem like obscure areas of mathematics, but my argument is that they shouldn’t be. If the purpose of K-8 is to develop an understanding of what formal axiomatic systems are, then the focus of 9-12 should be on discovering the useful properties that result from the standardly accepted axioms. I once conducted a job interview where a mathematics student, like me, was applying for a software engineering position, also like me. My employer had brought me into the interview to determine whether or not the candidate was ready to apply his mathematical expertise to computer programming. During the interview, the candidate mentioned quaternions as one of his areas of interest. In the software developed at this company, orientations and rotations are routinely stored and manipulated as quaternions. When I asked how the candidate would use quaternions to compute a rotation, he was stumped. He also became extremely interested in the interview at that point and was eager to learn more about the technique. My question had revealed that the abstract mathematics he was familiar with had a real purpose behind it – a practical use within the field of computer science. It’s this kind of disconnection between abstract and applied mathematics that seems to be one of the major problems with mathematics education.
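For what it’s worth, the interview answer boils down to conjugation: a rotation by angle θ about a unit axis is computed as q v q*, where q is built from the half-angle. Here is a minimal sketch (NumPy; the helper names are my own):

    import numpy as np

    def quat_mul(q, r):
        """Hamilton product of quaternions stored as (w, x, y, z)."""
        w1, x1, y1, z1 = q
        w2, x2, y2, z2 = r
        return np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])

    def rotate(v, axis, angle):
        """Rotate vector v about a unit axis by angle, via v' = q v q*."""
        q = np.concatenate([[np.cos(angle / 2)],
                            np.sin(angle / 2) * np.asarray(axis, dtype=float)])
        q_conj = q * np.array([1, -1, -1, -1])
        v_quat = np.concatenate([[0.0], v])
        return quat_mul(quat_mul(q, v_quat), q_conj)[1:]

    print(rotate(np.array([1.0, 0.0, 0.0]), [0, 0, 1], np.pi / 2))  # ~ (0, 1, 0)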

Abstraction plays a large role in mathematics and it’s usually the use of mathematics in other disciplines which connects it to students’ real world experiences. I was fortunate enough to have learned most of my mathematical knowledge in the context of computer science. Mathematics and computer science share a good deal of common ground. Furthermore, working with computers has become an essential skill for career-readiness in modern times. Learning how technology works adds to one’s ability to effectively use that technology. When the Standards for Practice call for students to use computational tools proficiently, the lack of standards addressing how that technology works will hinder the attainment of that goal. When students use computers or calculators to produce computations, they need to know that they are not working with “real numbers”. Not all real numbers are computable. Until the mathematics curriculum prepares students to tackle such notions, students will not be college or career ready.

One of the considerations for the Core Standards is that they address 21st century skills. Reading the mathematics standards, I do not think that this consideration has been met. The mathematics included in this curriculum is dated and fails to address the advances made in the past century. The Core Standards for math focus on applying known algorithms to problems with known solutions. A 21st century education needs to focus on creating and analyzing algorithms. Students not only need to know algorithms for solving mathematics problems, but need to be able to think critically about the efficiency of those algorithms. The Core Standards are a step in the wrong direction in this regard. All these standards will accomplish is that mathematics education will be catered to address the specific problems covered by the standards. Teachers will teach to the test and the critical thinking skills hinted at in the Standards for Practice will be lost in the assessment process.

I appreciate that these standards are chosen based on evidence from educational research. However, I think that the research supporting these standards is biased by the currently existing assessments. The evidence showing that American students are behind in math means there is still a gap between the standards in place and the skills students need to be college and career ready. A fixed set of national standards is not a viable solution to the problem. What the education system needs is a solid framework for an experimentation cycle in which standards are continually tested and revised to meet the changing needs of students.

Denialist Misrepresentations of Math and Evolution

This is so me right now.

I generally try to avoid flamebait, but I saw this article linked off of Twitter.  I should have stopped reading after the first section where it’s clear that the author is a troll.    Evolution and science denialism aside, the misrepresentation of mathematics in the article is inexcusable.

After attacking Darwin and scientific thought in general (an appeal to emotion), he proceeds into a second-hand quote from a philosopher on the subject of “fallacies”.  It’s kind of ironic that the inclusion of this quote would serve as an appeal to authority.

Next, he goes into intelligent design saying:

we could find incontrovertible evidence that reality, matter, life, has been designed, but that interpretation of the evidence would be discarded because naturalism dictates the exclusion of anything which might lead outside of a naturalistic explanation.

This is absolutely false.  Scientific theories are necessarily falsifiable.  If the evidence implied a “design”, that’s what the scientific theory would be.  The fact is that the evidence points to the contrary.  Biology shows a picture of  “unintelligent design”, consistent with a process of genetic mutations occurring over time.  The naturalistic explanation is the one that the evidence supports.

Then he claims that Gödel’s Incompleteness Theorem proves this.

He managed to get Gödel’s basic background information right, but incorrectly describes the Incompleteness Theorem.

From the article:

  1. Any system that is consistent is incomplete.
  2. The consistency of axioms (axioms=assumptions that cannot be proven) cannot be proved from within the system.

The real Incompleteness Theorems:

  1. Any effectively generated theory capable of expressing elementary arithmetic cannot be both consistent and complete.  In particular, for any consistent, effectively generated formal theory that proves certain basic arithmetic truths, there is an arithmetical statement that is true, but not provable in the theory (Kleene 1967, p. 250).
  2. For any formal effectively generated theory T including basic arithmetical truths and also certain truths about formal provability, T includes a statement of its own consistency if and only if T is inconsistent.

Notice how the part about “basic arithmetic” is conveniently left out of the definition?  That’s because the author doesn’t want you to know that there can exist axiomatic systems which are both complete and consistent.  First-order predicate logic was proven to be both complete and consistent by none other than Gödel himself.  Furthermore, saying that the Incompleteness Theorem “utterly [destroyed] atheist Bertrand Russell’s logical system for arithmetic” doesn’t give Russell the credit he deserves.  Gödel’s technique was based on the same idea as Russell’s Paradox to begin with.  Despite its incompleteness, the development of Russell’s work into Zermelo-Fraenkel set theory was an important building block in the foundation of later mathematics.  By referring to him as “atheist Bertrand Russell”, it’s clear that the author is more concerned about religion than the actual mathematics.

Next we have a very weak analogy.  He describes three items on a table and says:

Now draw a circle around those items.  We will call everything in that circle a system.  Gödel’s first theorem tells us that nothing in the circle can explain itself without referring to something outside the circle.

It’s true that Gödel’s theorem succeeded in “stepping out of basic arithmetic”, but here’s where that omitted condition of a “formal system capable of basic arithmetic” comes into play.   Are a half-full cup of coffee, a fishing pole and a jacket capable of arithmetic?  If the answer is no, then Gödel’s theorem doesn’t apply.  Capable of self reference?  Maybe if the coffee mug says “I’m a half full cup of coffee” on it.

The analogy of a computer is a much better example.  Computer programs are capable of basic arithmetic.  What Gödel’s theorem implies for computers is that there exist certain programs which are computationally irreducible.  The only way to determine the output of such a program is to run it.   If we think of Nature like a computer program, the only way to be certain of the future “output” is to let Nature run its course.   This result does not prevent science from making conjectures about the structure of  Nature, but requires that science adopt a Black-box testing procedure which entails experimentation and observation.  There are certainly unanswerable questions in science, such as the precise position and momentum of elementary particles, but evolution isn’t one of them.   The evidence for evolution is incontrovertible.
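Wolfram’s Rule 30 cellular automaton is the stock example of this kind of irreducibility: no known shortcut beats simply running it. Here is a minimal sketch (plain Python, with wrap-around edges):

    # Rule 30: each new cell is left XOR (center OR right).
    def rule30(cells):
        n = len(cells)
        return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
                for i in range(n)]

    row = [0] * 31
    row[15] = 1  # a single live cell in the middle
    for _ in range(16):
        print("".join("#" if c else "." for c in row))
        row = rule30(row)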

The final section shifts the analogy to the universe, and the claim is that what’s outside the universe is unknowable. Just because we can’t see what’s outside the universe, which would be white-box testing, doesn’t mean we can’t make and test hypotheses about it as a “black-box”. The Many-worlds interpretation of quantum theory is one such example, which predicts that our universe is but one of many possible universes. Similarly, M-theory predicts the existence of hidden dimensions beyond space and time. Just because some questions are unanswerable doesn’t mean all questions are.

The article ends by claiming that evolution and naturalism are “fallaciously circular”, but here’s the real circular fallacy:

  1. Author misinterprets Gödel’s theorem to imply that all axiomatic systems are incomplete or inconsistent.
  2. Author mistakenly assumes that science is an axiomatic system.
  3. Based on this misinterpretation, author concludes that science must be incomplete or inconsistent.
  4. Since author concludes that scientific knowledge must be incomplete or inconsistent, author ceases to look for empirical evidence of scientific claims.
  5. Since author ceases to look for evidence, author does not find any evidence.
  6. Since author does not find any evidence, author concludes that scientific knowledge is incomplete.
  7. As a consequence, the author’s incomplete knowledge becomes a self-fulfilling prophecy.

This whole article is a Proof by Intimidation.  The “average Joe” doesn’t know enough about contemporary math and science to go through and verify each detail.  The mathematical vocabulary is deliberately deployed to distract the reader from the real issue: the overwhelming evidence for evolution.  The references to Gödel’s Incompleteness Theorem are nothing more than a red herring, and the author even misstates the theorem to boot.

Guild Wars Solo Me/D

So I noticed a new comment on my old Guild Wars Me/D video, and it enticed me to play again.  Unfortunately, the build in the video had been completely changed by a recent patch.  Sand Shards no longer deals damage on misses, but instead causes an AoE DoT effect when it ends.  Whereas the old version pumped out a lot of damage when there were multiple foes to miss, the new version has lower damage output and causes the AI to scatter even more easily.  Thus, I set out to see if I could make a new solo build with the same profession combination.

The survivability of the old build was still pretty much intact, but I no longer needed the self-blind of Signet of Midnight due to the change in Sand Shards functionality.  I went over to the Zaishen arena and started to play around.  What I found was that I could keep myself alive without blind by using a combination of Mystic Regeneration, Mirage Cloak and Armor of Sanctity.  I changed the elite skill to Signet of Illusions and reallocated my stat points.  This opened up a variety of other options for damage.  After some experimentation, I was able to down IWAY with the following combination:

Signet of Illusions – makes all skills use Illusion attribute (16)

Mystic Regeneration, Mirage Cloak, Armor of Sanctity – keep me alive

Channeling – energy management

Mystic Twister, Dust Cloak – additional damage

Faithful Intervention – just another enchantment

Template Code: OQpCAcwjQl9ue5uZ3pb51EA

With the proof of concept working, I decided to give it a whirl in PvE.  I went to my old Sunspear rep farming spot, east out of Camp Hojanu in Barbarous Shore.  Right outside the town are several packs of Heket which deal mostly physical damage.  Since the packs were only 4 instead of 8 like IWAY, Channeling wasn’t returning quite enough energy to cover the enchantment maintenance.  I switched out Faithful Intervention for Auspicious Incantation for more energy.  With the energy problem solved, I didn’t really need the blind effect on Dust Cloak anymore and switched it out for Heart of Holy Flame to add burning instead.  The only problem with this farming spot is the Blue Tongue Heket monks that spawn randomly in the melee packs.  I had some success using Backfire on them, but they could still take a while to kill.  I decided it would be better to just avoid them altogether.

Template Code: OQpCAcwjQl9ueFdc3x701EA

Having Signet of Illusions leaves plenty of room for variation in the build, as it can use any skills at an effective attribute of 16.  I found that keeping Armor of Sanctity up was more important than Mirage Cloak in most cases.  The damage from letting Mirage Cloak drop and the synergy with Auspicious Incantation made Mirage Cloak worth keeping, but it might be possible to do without it in some locations.  A Dervish primary could probably use a similar setup and might be able to get away without the energy management skills my Mesmer needed, although Fast Casting is kind of nice against the Heket because they like to interrupt.  Not bad for a proof of concept, but more damage and a speed boost would be nice.

Anyways, good luck and happy hunting!

Bleach Bicameralism

This article is just for fun and is not targeted toward an audience unfamiliar with the Bleach series. However, if you’re a fan of Tite Kubo’s Bleach and have never heard of Julian Jaynes and the Bicameral Mind, I’m hoping that this will provide an entertaining introduction to this daring psychological hypothesis.

[spoiler alert!]

I originally saw the first episode of Bleach on Cartoon Network and have been delightfully following the series ever since. It’s about an orange-haired teenager named Ichigo, who becomes a Shinigami, which roughly translates as “death god”, something like the Grim Reaper in Western tradition, whose job is to send departed souls to the afterlife in “Soul Society”. Some of these lost souls turn into “Hollows”, evil spirits which accumulate power by consuming other lost souls and occasionally turn to attack humans. This serves as a never-ending source of conflict for the wide cast of Shinigami, who fight off evil in extravagant action sequences. Each of the colorful characters is complemented with a unique weapon called a Zanpakutou, which is considered to be a manifestation of the wielder’s soul in the form of a sword.

This relationship between the Shinigami and the Zanpakutou has several qualities that remind me of Julian Jaynes’s Bicameral Mind. The Shinigami are portrayed as conscious actors, in a Jaynesian sense, while the Zanpakutou represent their unconscious instincts to fight and kill. A recurring theme in the series is that Ichigo’s instincts tend to take over in times of severe distress, but he gradually improves at harnessing the Zanpakutou consciously to control the amount of devastation unleashed. The universe of Bleach is one of fiction, but much like Jaynes treats the language of the Iliad as a metaphor for the mind of the ancient Greeks, might modern fiction also serve as a metaphor for modern social perceptions of consciousness? I’m going to focus primarily on the bicameral nature of the Shinigami-Zanpakutou relationship, but I’d note that Ichigo represents a slightly more complex model that still has the potential to revert to this bicameral state.

The first thing to note is that Jaynes’s model of consciousness is not the same as “awareness” as the word is commonly used in language, but rather refers to something a bit more technical. There are four key features of Jaynes Consciousness (J-Con): (1) an analog “I”, (2) a metaphor “me”, (3) inner narrative, and (4) introspective mind-space. These four features enable an individual to “test” potential behaviors in the mind-space before trying them out in the real world. In contrast, an “unconscious” being acts instinctively and is immediately focused on the “here and now”. The reason I think Bleach is a great example of J-Con is that Ichigo’s Hollow form personifies the “unconscious” mind and poses a stark contrast to the behavior of Ichigo while he is “conscious”.

Ichigo’s consciousness normally resides in his human body, but when he becomes a Shinigami, his consciousness separates from his physical body. His analog “I” and metaphor “me” are manifested in his Shinigami form. Shinigami can influence their environment, including damage and destruction, and can also be influenced by their environment, including injury and death. Ichigo is often portrayed narrating fights, consciously breaking apart his opponent’s fighting style. When Ichigo’s Hollow takes over, he doesn’t bother so much with reading his opponent; he just attacks relentlessly with no concern for how much damage is caused. Ichigo’s conscious mind strives to suppress and control this instinct so that he may use its power to protect his friends. Ichigo’s internal mind-space is depicted visually at various points in the series: his inner world resembles a sideways metropolis. In one of my favorite episodes, Ichigo literally fights against his Hollow self within this inner world.

Now that I’ve established what J-Con is, the next thing I need to define is the bicameral mind. Jaynes argues that prior to the development of J-Con, human beings behaved according to auditory hallucinations originating from the right hemisphere of the brain, which were often perceived as the voices of “gods” or “ancestors” commanding the individual to act. This mode of thinking is very similar to the behavior of schizophrenics in modern times. In hypnosis, the analog “I” gives up its power to an outside authority and the body follows that source’s commands. In schizophrenia and the bicameral mind, this authority is a hallucination.

In Bleach, the Zanpakutou often calls out to its Shinigami master through dreams. Captain Hitsugaya, for example, had a recurring dream of an icy dragon calling out to him, but he could not hear its name; it was when he finally heard the name that he became a Shinigami. The Zanpakutou is often portrayed as its own person, but resides within the soul of its Shinigami. Shinigami become more powerful by communicating with the Zanpakutou, and when Shinigami and Zanpakutou fight as one, they come closest to meeting their full potential.

While not part of the manga, episode 255 of the anime involves a fight between Ichigo and Muramasa, a Zanpakutou with powers of hypnosis. Zangetsu, Ichigo’s Zanpakutou, speaks to him:

“Ichigo. Feel him.” “Zangetsu?” “His hypnosis no longer works on me. I shall be your eyes. But for this to work, we must truly communicate with one another as master and Zanpakutou.” “I understand, old man.” (Bleach ep255)

The fight with Muramasa starts to turn around, and when Ichigo gains the upper hand, he explains the change to Muramasa:

“I finally understand what he’s been trying to tell me.” “What?” “We have to acknowledge each other’s existence and accept one another. That’s how Zanpakutou and Shinigami are supposed to interact.” (Bleach ep255)

I feel that this communication between Shinigami and Zanpakutou is much like the bicameral state of mind described by Jaynes. When communicating with Zangetsu, Ichigo does not descend entirely into instinctive behavior, as he does in Hollow form, but rather becomes aware of these instincts and uses them to achieve his goals. Much like bicameral humans followed the commands of auditory hallucinations, Ichigo enters a state of mind where Zangetsu dictates his actions. The wall separating Zangetsu from Ichigo’s analog “I” dissolves to the point where the two parts of his mind act as one. In fights such as the one with Muramasa, the part that is Zangetsu dictates the behavior while the part that is Ichigo listens and obeys. This bicameral state is where Ichigo’s power is greatest.

In The Origin of Consciousness, Jaynes finds support for this theory in the language of the Iliad, in which the gods dictate the behavior of the actors. In contrast, the Odyssey presents actors who behave of their own accord. Jaynes argues that this change reflects the development of J-Con taking place in that period. If anything is to be learned from Bleach, it’s that modern culture acknowledges both modes of thinking. While humans generally exhibit behavior consistent with J-Con, the bicameral state is still partially accessible to the mind. As human beings, we need to accept that we have certain instincts. Consciousness provides us with the power to observe these instincts and choose when and how they manifest themselves.

Ichigo’s story suggests that although humans are still capable of this bicameral state, there are risks associated with entering it. Ichigo’s Hollow self and his Zanpakutou are closely related. In relying on his Zanpakutou’s powers, Ichigo runs the risk of his instincts taking over. While Ichigo obtains power by descending into a bicameral-like state, he needs to make sure that he doesn’t completely relinquish conscious control over his actions. Hollow Ichigo says that he is the “horse” and Ichigo is the “king”, but if Ichigo were to let his guard down, he would be quick to “take the crown” (Bleach manga 221). In essence, Hollow Ichigo represents what would happen if Ichigo descended completely into bicameralism. In bicameral individuals, the hallucination is the “king” and the self is the “horse”.