Archive for March, 2015

From time to time (most recently, November 2012) I find myself pondering or discussing what I sometimes refer to as “the next big thing,” or, to save myself from potential copyright issues, “the next sizable shift.” In the past couple of weeks I found myself there again, in a discussion with close friends about when the first true artificial intelligence would be created.

Vague, lofty, and loose definitions that are simple on the surface often tend toward something much more complex and difficult to comprehend. For instance, your response to “how many stars are in the universe?” may be something like “a lot,” or “more than you can count in a lifetime,” or “100 octillion,” but each of those answers leads us to something much deeper and more thought-provoking. In the case of my AI discussion, it became less about when it would happen and more about what would be required to make it happen.

And there are caveats. How do we define “true AI?” For me, a true AI is an entity that can serve as a replacement for real human intelligence. It doesn’t require robots or constructs bent on world domination; it is simply a program, or computer, or whatever, that can think, process, and learn in the same way that a human can. So then, the question is: what will make that possible? I’d argue it comes down to two things: data and power.

Data is very important to us humans, and has become increasingly important in recent years. It is gathered, farmed, and sold in very large quantities, and access to ever more digital storage space is something that pretty much everybody is accustomed to now. So, at least as it relates to AI, we know we need a lot of data…but how much? Let’s call it an infinite amount.

In some theoretical instances in mathematics, a quantity is so large that it is treated as infinite. Let’s pretend for a moment that infinity is a number, and not a mathematical concept. Now consider the decimal equivalent of the ratio 1/x. If x is 100,000, 1/x = 0.00001; if x is 100,000,000,000,000, 1/x = 0.00000000000001. What, then, if x = infinity? The idea here is that the larger the denominator becomes (in other words, the closer it gets to infinity), the closer the resulting fraction gets to zero. Now, in reality we know this can’t be achieved: we can’t make a number so large that it is infinite, or a fraction so small that it is zero (aside from cases where the ratio is 0/n, which is always 0 as long as n is not zero). You can’t just make a number into a mathematical concept…but you can see where the logic is headed. More on this later.
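That shrinking-ratio idea is easy to watch happen in a few lines of Python. This is just a toy illustration of the paragraph above; the particular exponents, and the underflow example at the end, are my own choices:

```python
# As x grows toward infinity, 1/x shrinks toward zero without ever reaching it.
for exponent in (5, 14, 50, 100):
    x = 10 ** exponent
    print(f"x = 10^{exponent}  ->  1/x = {1 / x}")

# A 64-bit float eventually underflows to exactly 0.0, even though the
# true ratio 1/x is never zero for any finite x.
print(1 / 10 ** 400)  # 0.0 (float underflow, not a true zero)
```

The last line is the computer’s version of the paradox: the math says 1/x is never zero, but a finite machine eventually rounds it there anyway.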

What about power? How much power (specifically, processing power) would be required to make an AI possible? I’d argue that we already have it, although I don’t know for sure. I am merely guessing that pooling resources between the big data players (Google, Apple, Amazon, Microsoft) would result in enough power to process a very, very large amount of data. The issue comes not in the amount of power, but in harnessing it in a single instance. The big data players don’t really like each other, and neither do their ecosystems. So, pooling resources isn’t really an option, regardless of the potential benefit to (or in spite of the demise of) mankind.

So, the gobs and gobs of data that we need—it all needs to live in a single ecosystem for this thing to work…maybe you should ask Siri, or Google Voice, or Cortana, or Echo what they think. Then, using some large amount of power, an intelligent program (or collection of programs) would need to be able to harness the information and process it in a way that makes the program seem human. A true digital assistant would function a lot like your actual assistant: it would schedule meetings, order lunch, or provide you with information based on your input. But over time, it would learn that you hate Friday afternoon meetings, you don’t like sweet pickles on your sandwiches, and you don’t care about basketball and otherwise don’t want to hear about it. Sure, no big deal.
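To caricature that “learning your preferences” behavior, here is a deliberately tiny sketch. Everything in it (the `ToyAssistant` class, the rejection threshold of 3) is hypothetical; no real assistant works this simply, but it shows the shape of the idea:

```python
from collections import Counter

class ToyAssistant:
    """Hypothetical sketch: stop suggesting things the user keeps rejecting."""

    def __init__(self, threshold=3):
        self.rejections = Counter()  # how many times each item was turned down
        self.threshold = threshold

    def record_rejection(self, item):
        self.rejections[item] += 1

    def should_suggest(self, item):
        # Suggest only items rejected fewer than `threshold` times.
        return self.rejections[item] < self.threshold

bot = ToyAssistant()
for _ in range(3):
    bot.record_rejection("Friday afternoon meeting")

print(bot.should_suggest("Friday afternoon meeting"))  # False
print(bot.should_suggest("Tuesday morning meeting"))   # True
```

The gap between this counter and a program that *understands why* you hate Friday meetings is, of course, the entire problem.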

Since infinity is a concept, and data storage is measured in quantities, one of the ecosystems will legitimately have more data than the others, and therefore be better equipped to handle that information…but which big data player will win?

In some sense, it doesn’t matter. Our ability to fathom quantities is not nearly as important as our ability to make computations with them. For instance, Graham’s Number (G) is so massively large that it requires a special notation (Knuth’s up-arrow notation) even to begin to express it. But because it serves as “an upper bound on the solution to a certain problem in Ramsey theory,” we know that calculations can lead us there in spite of our inability to truly comprehend its size.
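That special notation, Knuth’s up-arrow notation, is defined by a simple recursion, even though the values it names explode almost immediately. A minimal sketch, safe only for tiny inputs (Graham’s number itself starts at 3↑↑↑↑3 and is utterly out of reach):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ↑^n b: one arrow (n=1) is exponentiation,
    and each extra arrow iterates the operation one level below.
    Only usable for tiny inputs; values grow absurdly fast."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3↑↑3 = 3^(3^3) = 7625597484987
```

Even 3↑↑↑3 is a power tower of 3s over seven trillion levels tall, which is why the notation matters: the rule fits in five lines while the number it names does not fit in the universe.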

You might then say that G should be considered infinitely large. Perhaps that is the quantity of megabytes of data required as the basis of an AI. I’m comfortable thinking of it that way. But then, infinity itself is infinitely larger than Graham’s number. In mathematics, “infinities” can actually come in different sizes. Consider the set of all whole numbers (1, 2, 3, and so on) and compare it to the set of all integers (0, -1, 1, -2, 2, etc.). Both sets are infinite, and it would be easy enough to guess that the set of all integers is double the size of the set of whole numbers; in fact, the two can be paired off one-to-one, so they turn out to be the same size of infinity. The set of all real numbers, though? That’s another thing altogether: no such pairing exists, so it is a genuinely larger infinity. The idea is that you can have different degrees, or magnitudes, or sizes, of infinity.
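That one-to-one pairing between whole numbers and integers is a standard construction, and it is concrete enough to write down. The function name here is mine; the zig-zag pattern (0, -1, 1, -2, 2, …) is the classic one:

```python
def nth_integer(n):
    """Pair whole number n (1, 2, 3, ...) with a distinct integer.
    Odd n map to 0, 1, 2, ... and even n map to -1, -2, -3, ...
    Every integer gets hit exactly once, so the sets match one-to-one."""
    return n // 2 if n % 2 == 1 else -(n // 2)

print([nth_integer(n) for n in range(1, 8)])  # [0, -1, 1, -2, 2, -3, 3]
```

Since the list never repeats and never misses an integer, counting the integers takes no “more” infinity than counting 1, 2, 3, … — which is exactly why the reals, which defeat every such pairing, count as a bigger infinity.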

So then, the goal of the ultimate ecosystem is to have all the data: the thing that represents the whole. If you have all the data, you don’t need to worry about measuring it. In the near future, the ecosystems of the big data players will contain a mind-numbingly large amount of data; in some sense, infinite amounts (or at least, beyond comprehension). To that end, the ecosystem that develops the first true AI will be the one built on a library of data, infinite in size, of a greater magnitude than the other infinite collections of data.

Putting it all together—that is a different problem entirely. If I knew how, I surely wouldn’t be writing it down.

Thanks as always; be well and stay tuned.