# March 9: Bitwise operations in a weakly-typed programming language

The average uninitiated non-programmer hears the word “computer” and thinks of **zeros and ones…**

…forgivably, and for good reason! As the most basic unit of information, **zeroes and ones are the building blocks of digital communication.** But why? This blog post will explain:

- what makes 0 and 1 special and different from other numbers, and why they’re so important
- how one manipulates zeroes and ones in a computer programming language (and why)
- some practical examples of **bit manipulation** and its limitations in a dynamically-typed programming language (like JavaScript)

# Background: Computers have fingers

We all know and love zero and one through **decimal (base-ten) numbers** — the kind we all become familiar with in grade school. Humans favor decimal numbers because we have ten fingers. Computers don’t have human fingers, but they do have something else: **switches.**

All computers are essentially just **huge interconnected bundles of on/off switches,** dancing together in harmony to create magical things like food delivery and cat GIFs. **Think of switches as computer fingers.** Just like humans, computers’ favored system for counting and representing numbers relies on what’s familiar and intuitive to them: switches. Consider the example below to see the same number represented in binary and in decimal:

A decimal digit has ten states (zero through nine), and each position represents a power of ten: the first digit of a decimal number is multiplied by 1, the second by 10, the third by 100, and so on. A binary digit has two states (zero and one), and each position represents a power of two: the first digit of a binary number is multiplied by 1, the second by 2, the third by 4, and so on.
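To make this concrete, here is a quick sketch in JavaScript showing the same number written both ways (binary literals and `toString(2)` are built into the language):

```javascript
// The same quantity, written in decimal and in binary.
const decimalEleven = 11;      // 1 * 10  +  1 * 1
const binaryEleven = 0b1011;   // 1 * 8  +  0 * 4  +  1 * 2  +  1 * 1

console.log(decimalEleven === binaryEleven);  // true (same number)
console.log((11).toString(2));                // "1011" (render as binary)
console.log(parseInt('1011', 2));             // 11 (parse binary back to decimal)
```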

Binary can be used to represent not just an integer but a **state:** *on/off,* *true/false,* *yes/no,* or any other state of *boolean* (one or the other) logic. This flexibility is one reason why binary numbers are so fundamental to the transfer and processing of digital information, and we’ll see a practical application of this below!

# You’ve come a long way, baby

Feast your eyes on the world’s first commercial microcomputer, the Altair 8800:

Modern computers can do a lot more than the 8800 thanks to things like memory, displays, operating systems, and, of course, the Internet: all cheap and available. But, just like the 8800, they’re just bundles of switches — on a much bigger scale, but speakers of the same language: binary.

In reality, high-level languages like JavaScript, Ruby, Python, and C don’t make any more sense to a computer than English. Nevertheless, programs written in languages like these work because they’re *translated* into zeroes and ones by a *compiler* (or interpreter): a computer diplomat. But this translation takes time, sometimes an awfully long time, and nobody has time to twiddle their thumbs waiting for a computer program!

Bitwise (binary) operations are such an important subject, even for modern software engineers, because they translate almost directly into single machine instructions; the time required to translate and execute them is a fraction of that of higher-level constructs like `if`, `for`, and `while`. Computers are native speakers of binary, so it doesn’t take much effort to translate bitwise operations for them — for the same reason it’s a lot easier to translate “sombrero” into Spanish than “pork-pie hat.”

A solution that makes good use of bitwise operations can be faster (sometimes *much* faster) than a solution to the same problem that doesn’t. That’s why **an aspiring software engineer can count on seeing at least one interview question on bitwise operations.**

Some miscellaneous notes before continuing: eight *bits* (a *bit* is a single binary digit) make a *byte,* and a bit may be *toggled* (changed) from *set* (1) to *clear* (0) and back again. Some processors also chunk bits together into groups called *words* of varying sizes (depending on the processor); the term comes up most often in discussions of processor architecture and compatibility.
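As a minimal sketch, here is how a single bit at position `n` can be set, cleared, toggled, and tested in JavaScript (using the operators introduced in the next section; the values here are arbitrary):

```javascript
const n = 2;                    // bit position (0 = least significant)
let bits = 0b1001;              // bits 0 and 3 start out set

bits |= (1 << n);               // set bit n:    0b1101
console.log(bits.toString(2));  // "1101"

bits &= ~(1 << n);              // clear bit n:  0b1001
console.log(bits.toString(2));  // "1001"

bits ^= (1 << n);               // toggle bit n: 0b1101 again
console.log((bits & (1 << n)) !== 0);  // true (bit n is set)
```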

# She’s got a type

Understanding bitwise operations can be frustrating at first, so I’ve created a repository with a very simple web app to help visualize the **five most common and useful bitwise operations** (plus a sixth that isn’t so important, included for reasons that will become clear below):

- **AND** `&` sets each bit in the result *only* if both bits are set in the operands
- **OR** `|` sets each bit in the result if the bit in question is set in *either* of the operands
- **XOR** `^` sets each bit in the result only if the bit is set in exactly *one* (not both or neither) of the operands
- **LSH** `<<` shifts bits to the left, filling in zeroes on the right
- **RSH** `>>` shifts bits to the right; note that in JavaScript, `>>` copies the *sign bit* in on the left, while its cousin `>>>` fills in zeroes
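To see these in action, here is a small sketch (the operands are arbitrary; `toString(2)` renders each result as binary):

```javascript
const a = 0b1100;  // 12
const b = 0b1010;  // 10

console.log((a & b).toString(2));   // "1000"  (bits set in both)
console.log((a | b).toString(2));   // "1110"  (bits set in either)
console.log((a ^ b).toString(2));   // "110"   (bits set in exactly one)
console.log((a << 1).toString(2));  // "11000" (shifting left doubles the value)
console.log((a >> 1).toString(2));  // "110"   (shifting right halves it)
```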

An interesting case is **NOT** `~`, which toggles every bit in a number… or at least, that’s what you would expect, right? In reality, it only behaves this predictably in a *strongly-typed* language like C, where primitive variables must be declared with types like `signed`, `unsigned`, or `float`. JavaScript isn’t a language like this: it’s *weakly-* and *dynamically-typed,* storing every number as a 64-bit floating-point value with bits reserved for its sign and decimal position, and its bitwise operators first convert their operands to 32-bit *signed* integers. As a result, bitwise NOT exhibits rather unexpected behavior: in JavaScript, `~x` yields exactly `-(x + 1)` (the two’s-complement consequence of flipping all 32 bits), which isn’t terribly interesting or useful at all.
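A quick sketch of that behavior:

```javascript
// JavaScript converts the operand of ~ to a 32-bit signed integer,
// flips all 32 bits, and reads the result back in two's complement.
console.log(~5);   // -6, i.e. -(5 + 1)
console.log(~0);   // -1 (all 32 bits set reads back as -1)
console.log(~-1);  // 0

// The identity holds for any 32-bit integer x:
const x = 42;
console.log(~x === -(x + 1));  // true
```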

# Conclusion: Fun and profit

At this point you may say to yourself, **“Okay, so?”**

Remember from above that bitwise operations are fast because they require minimal translation. That means operations like the ones below are **blazing fast** — many times faster than equivalent methods using packages or objects like `Math`:

```javascript
// Bitwise min/max: -( x < y ) is -1 (all bits set) when x < y, else 0,
// so the mask selects either x or y.
function min( x, y ) { return y ^ ( ( x ^ y ) & -( x < y ) ); }
function max( x, y ) { return x ^ ( ( x ^ y ) & -( x < y ) ); }

// XOR of two integers is negative exactly when their sign bits differ.
function oppositeSigns( x, y ) { return ( x ^ y ) < 0; }

// A power of two has exactly one bit set, so x & ( x - 1 ) clears it.
function isPowerOfTwo( x ) { return !!( x && !( x & ( x - 1 ) ) ); }
```

And the possibilities get even more mind-blowing! One of my favorite pages on the whole World Wide Web is this list of bit-twiddling hacks by Sean Eron Anderson of Stanford University. Feast your eyes on the method below, adapted here for JavaScript from the original C, which counts the number of bits set in a number **with a single assignment and just a handful of operations:**

```javascript
// Parallel bit count ("population count"), adapted from Sean Eron
// Anderson's Bit Twiddling Hacks. The masks 0xDB6DB6DB and 0x49249249
// slice the number into 3-bit groups.
function countBits( number ) {
  // Each 3-bit group of `counter` now holds the count of set bits in
  // the corresponding 3-bit group of `number`.
  let counter = number - ( ( number >>> 1 ) & 0xDB6DB6DB )
                       - ( ( number >>> 2 ) & 0x49249249 );

  // Sum adjacent groups, then fold them together with % 63. The
  // zero-fill shifts and the final >>> 0 keep the intermediate values
  // unsigned, since JavaScript's & and ~ work on signed 32-bit integers.
  return ( ( ( counter + ( counter >>> 3 ) ) & 0xC71C71C7 ) >>> 0 ) % 63;
}
```

It’s difficult to see the practical application of this beyond math and morbid curiosity. But remember from above that **bits can be used to represent state **and not just numbers!

Let’s imagine a class that requires dozens and dozens of boolean (true/false) flags. In JavaScript and many other languages, each boolean variable occupies at least a whole byte of memory (and often more, once bookkeeping is counted). There’s no reason to waste all that precious memory by creating a `thisFlag`, `thatFlag`, `anotherFlag`, etc. for what is essentially just a zero or one. Instead, why not create a single `binaryFlags` variable that stores the flags together as one number, with each flag represented by a particular bit?
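Here is a minimal sketch of that idea (the flag names and bit positions are hypothetical):

```javascript
// Hypothetical flags: each one is a single bit in one number.
const IS_VISIBLE  = 1 << 0;  // 0b001
const IS_ACTIVE   = 1 << 1;  // 0b010
const IS_SELECTED = 1 << 2;  // 0b100

let binaryFlags = 0;                      // all flags clear

binaryFlags |= IS_VISIBLE | IS_SELECTED;  // set two flags at once
binaryFlags &= ~IS_SELECTED;              // clear one flag
binaryFlags ^= IS_ACTIVE;                 // toggle another

console.log((binaryFlags & IS_VISIBLE) !== 0);   // true
console.log((binaryFlags & IS_ACTIVE) !== 0);    // true
console.log((binaryFlags & IS_SELECTED) !== 0);  // false
```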

Bitwise operations frustrate and flummox new engineers because of how abstract they can seem, but their speed and efficiency make them invaluable. If you find yourself juggling booleans and it’s slowing you down in a situation where memory is limited, bitwise operations may just give you the edge you need!