What is Abstract Algebra?
I’ve recently begun a study of abstract algebra (also known as modern algebra) using the text A Book of Abstract Algebra by Charles C. Pinter.
I’ll give you a brief introduction to abstract algebra by way of an example. I will write two simple proofs. The first is a proof about addition of real numbers. The second is a proof about the exclusive or operation on Boolean¹ values. The two propositions look similar; however, the initial proofs will look completely different from one another. Then I’ll identify some common traits of the two “algebraic structures” and show you how the same proof can be applied to both problems. This is one of the things abstract algebra is about: identifying common traits between different “algebraic structures”.
Addition of Reals
Let’s consider the addition operation on the set of real numbers and prove that if a + b = a + c, then b = c. This can be accomplished in several simple steps:

1. a + b = a + c (given)
2. (−a) + (a + b) = (−a) + (a + c) (add −a, the inverse of a, to both sides)
3. ((−a) + a) + b = ((−a) + a) + c (associativity of addition)
4. 0 + b = 0 + c (a number combined with its inverse is 0)
5. b = c (0 added to a number is that number)
Exclusive Or
Now let’s consider the ⊕ operation (exclusive or) on the Boolean values and prove that a ⊕ b = a ⊕ c implies b = c. I hope you can see the structure of this proposition is identical to the previous one.
I will prove this statement by case analysis and the use of the truth table for ⊕ (I use the symbol T for True and the symbol F for False):

| a | b | a ⊕ b |
|---|---|-------|
| T | T | F |
| T | F | T |
| F | T | T |
| F | F | F |
To prove this we must consider every case. Let us set v = a ⊕ b = a ⊕ c. We must consider that a can be T or F and that v can be T or F. So in total we have four cases to prove:

- First Case: a = T and v = T. By substituting the value of a we get T ⊕ b = T ⊕ c = T. By consulting the truth table we can see that this can only be true if b = F and c = F. Therefore b = c.
- Second Case: a = T and v = F. By substituting the value of a we get T ⊕ b = T ⊕ c = F. By consulting the truth table we can see that this can only be true if b = T and c = T. Therefore b = c.
- Third Case: a = F and v = T. By substituting the value of a we get F ⊕ b = F ⊕ c = T. By consulting the truth table we can see that this can only be true if b = T and c = T. Therefore b = c.
- Fourth Case: a = F and v = F. By substituting the value of a we get F ⊕ b = F ⊕ c = F. By consulting the truth table we can see that this can only be true if b = F and c = F. Therefore b = c.
We have considered every possible case, and in each case we have concluded that b = c.
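The case analysis above can also be verified mechanically. Here is a small Python sketch that enumerates all the combinations, using Python’s `^` operator as exclusive or on `bool` values:

```python
from itertools import product

# Exhaustively verify: for Booleans, a XOR b == a XOR c implies b == c.
for a, b, c in product([True, False], repeat=3):
    if (a ^ b) == (a ^ c):
        assert b == c, "cancellation failed"
print("b = c holds in every case")
```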
Comparison of Proofs
So I have proved that for addition of reals, a + b = a + c implies b = c, and that for the exclusive or operation on Boolean values, a ⊕ b = a ⊕ c implies b = c. It is less than elegant, however, that the proofs look completely different. Is it possible to identify some common traits between the two scenarios and then use this information to write one proof that works for both? Indeed it is, and that is what abstract algebra allows us to do.
So what are the traits? I’ll use the first example to call out some of the necessary traits of addition of reals that made the first proof work.
- Addition is a rule that is closed with respect to the reals. This means that no matter what two real numbers we add, we end up with a result that is also a real number. We never end up with a result that is not a real. (For example, we never end up with a complex number of the form a + bi where b ≠ 0.)
- In Step 2 of the proof we relied on the fact that every real number has an “opposite”. In this case the opposite of a real number with respect to addition is the negative of that number. In general we call this the inverse of an element.
- In Step 3 of the proof we relied on the fact that addition of real numbers is associative.
- In Step 4 of the proof we relied on the fact that when a real number is combined with its inverse we get some known value. This known value we generally call the identity element. In this case the identity element is 0.
- In Step 5 of the proof we relied on the fact that any real number added to zero (the identity element) equals the original number.
Well, let’s see if we can apply these assumptions to the second proposition and then see if we can reuse the structure of the original proof for the second proof. I didn’t precisely define everything in the list above, so I’ll just naively try to fulfill the same constraints for Booleans and exclusive or:
- Exclusive or is a rule that is closed with respect to Booleans. So we meet this assumption.
- What is the opposite of a Boolean? We might assume the opposite of T is F and the opposite of F is T. That’s what we normally think of. Right?
- Exclusive or is associative. (The proof is trivial by case analysis and left to the reader.)
- What happens when we combine an element and its opposite: T ⊕ F = T and F ⊕ T = T. Therefore we can pick our identity element to be T since it is the result of combining opposites.
- Now what happens when we combine the identity element with another element? T ⊕ T = F. Oops. This isn’t what we wanted. To mirror our addition example we should be able to exclusive or any Boolean with the identity element and get back the original element.
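The associativity of exclusive or was left as an exercise above; here is a small Python sketch that carries out the case analysis mechanically:

```python
from itertools import product

# Case analysis for associativity of exclusive or:
# (a XOR b) XOR c == a XOR (b XOR c) for all eight combinations.
for a, b, c in product([True, False], repeat=3):
    assert (a ^ b) ^ c == a ^ (b ^ c)
print("exclusive or is associative")
```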
So something went wrong. What is it? Well, I naively chose how to calculate the inverse of an element and also what the identity element is. It turns out that the inverse of an element depends on the operation and on the value of the identity element. The inverse of a real number with respect to addition is its negative, and the identity element is 0.
The inverse of any real number x (except 0) with respect to multiplication is its reciprocal 1/x, and the identity element is 1. We can see this by noticing that x · (1/x) = 1 for all real numbers except 0.
So let’s be a little more precise. First I’ll introduce some nomenclature. We use the symbol ∗ to stand for any operation when we aren’t talking about any specific operation. The inverse of an element a we generically write as a⁻¹. The identity element we generically write as e. When choosing an inverse function and a neutral element we insist that the following holds:
e ∗ a = a (The identity element combined with another element results in the other element)
a ∗ a⁻¹ = a⁻¹ ∗ a = e (The combination of an element and its inverse is always the identity element)
If you examine the truth table you’ll see that the only way to fulfill these two requirements is to choose e = F and a⁻¹ = a (every element is its own inverse). Let’s double check.
We see that the identity element, F, combines with the other elements properly: F ⊕ T = T and F ⊕ F = F,
and each element combined with its inverse (itself) equals the identity element: T ⊕ T = F and F ⊕ F = F.
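These two checks are easy to script. A minimal Python sketch, using `False` for F and `True` for T:

```python
# Double-check the forced choices: identity e = F (False),
# and every Boolean is its own inverse under XOR.
e = False
for a in (True, False):
    assert e ^ a == a   # identity law: e XOR a == a
    assert a ^ a == e   # inverse law: a XOR a == e
print("e = F works as the identity; each element is its own inverse")
```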
Now we can redo the second proof:

1. a ⊕ b = a ⊕ c (given)
2. a ⊕ (a ⊕ b) = a ⊕ (a ⊕ c) (combine both sides with a, the inverse of a)
3. (a ⊕ a) ⊕ b = (a ⊕ a) ⊕ c (associativity)
4. F ⊕ b = F ⊕ c (an element combined with its inverse is the identity element)
5. b = c (the identity element combined with an element results in that element)
On line 3, remember that the inverse of a is a itself and then consult the truth table to see that T ⊕ T = F and F ⊕ F = F. So no matter what a is, we know the result of a ⊕ a is F. On line 4, remember that we’ve already shown that F combined with any element results in that element.
We have now proven the second proposition using a proof whose structure is identical to the first proof.
Does This Proof Work for Every Set and Every Operation?
This proof only works when the constraints from the previous sections apply to the set of elements and the operation being studied. For example, it is not the case for the set of Booleans that a ∨ b = a ∨ c implies b = c. (The symbol ∨ is the logical or operator.) Why not? It turns out that there is no inverse function that applies. While we might be safe in picking F as the identity element, there is no element a⁻¹ for which T ∨ a⁻¹ = F.
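A short Python sketch makes the failure concrete by searching for a counterexample to cancellation under logical or:

```python
from itertools import product

# Search for a counterexample to cancellation for logical or:
# values with (a or b) == (a or c) but b != c.
counterexamples = [(a, b, c)
                   for a, b, c in product([True, False], repeat=3)
                   if (a or b) == (a or c) and b != c]
assert counterexamples  # cancellation fails for logical or
print(counterexamples)  # e.g. a = True, b = True, c = False
```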
The Generic Proof
It turns out that this proof works for any set of elements, A, and any operation², ∗, if the following is true:
- The operation ∗ is associative.
- There is an element e such that e ∗ a = a for every element a.
- Every element a has an inverse a⁻¹ such that a ∗ a⁻¹ = a⁻¹ ∗ a = e.
If all these rules are true then we call the pair ⟨A, ∗⟩ a group. A group is one of the fundamental algebraic structures. I’m on Chapter 8 of the text and we are still just discussing groups. The two groups discussed in this post are ⟨ℝ, +⟩ and ⟨B, ⊕⟩, where ℝ is the set of real numbers and B is the set of Boolean values.
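For a finite set, the group axioms can be checked by brute force. Here is a hypothetical Python helper (the name `is_group` and its structure are my own, not from the text) that tests closure, associativity, identity, and inverses, and confirms that the Booleans with exclusive or form a group while the Booleans with logical or do not:

```python
from itertools import product

def is_group(elements, op):
    """Check the group axioms for a finite set under a binary operation."""
    # Closure: the result of op must stay inside the set.
    if any(op(a, b) not in elements for a, b in product(elements, repeat=2)):
        return False
    # Associativity: (a * b) * c == a * (b * c).
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a, b, c in product(elements, repeat=3)):
        return False
    # Identity: some e with e * a == a * e == a for every a.
    identity = next((e for e in elements
                     if all(op(e, a) == a and op(a, e) == a for a in elements)),
                    None)
    if identity is None:
        return False
    # Inverses: every a has some b with a * b == b * a == identity.
    return all(any(op(a, b) == identity and op(b, a) == identity
                   for b in elements)
               for a in elements)

print(is_group({True, False}, lambda a, b: a ^ b))   # the XOR group: True
print(is_group({True, False}, lambda a, b: a or b))  # fails: no inverses
```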
As you might guess, the generic proof looks like the following:

1. a ∗ b = a ∗ c (given)
2. a⁻¹ ∗ (a ∗ b) = a⁻¹ ∗ (a ∗ c) (combine both sides with a⁻¹)
3. (a⁻¹ ∗ a) ∗ b = (a⁻¹ ∗ a) ∗ c (associativity)
4. e ∗ b = e ∗ c (an element combined with its inverse is the identity element)
5. b = c (the identity element combined with an element results in that element)
¹ My text editor was informing me that I should capitalize Boolean. I Googled this and found confirmation. The odd thing is that the first hit was from Eric Lippert of Fabulous Adventures in Coding, a blog I’ve subscribed to for years, although I don’t recall this particular post. If you are a C# developer I highly recommend his blog. ↩

² Note, I haven’t really defined operation. Formally, if A is a set, then an operation ∗ on A is a rule which assigns to each ordered pair (a, b) of elements of A exactly one element a ∗ b. ↩