1. The Talk
Last Updated 2014-04-03 12:56:28 EDT
1.1. Prologue : The Players (4min)
APIs, as both an ecosystem and an economy, are at a crossroads. The last few years have shown continued growth in both providers and consumers, and it's easy to see Moore's Law in effect in the API Ecosystem.
However, to sustain this level of exponential growth, this level of scaling in the ecosystem, we need to come up with ways to also scale the API Economy. All indications are that demand will continue to grow.
But hardly a day goes by when we don't see evidence that the API economy, the "engine of execution," is buckling under the strain.
outages, security breaches, identity theft, gov security
This is the logical outcome of a system over-taxed, under-powered, and inadequate to the task. Demand is outstripping supply and the system can’t keep up.
The way to solve this problem is to start thinking in new ways and this talk is an attempt to get us to do that; to kick-start the creative process.
And to do that, I want to look back: review where we are, recap how we got here, and see what inspiration we can find along the way as we work to get beyond this roadblock onto the next phase of scalable APIs.
It was 25 years ago this month that Tim Berners-Lee, then barely over 30 years old, completed the written proposal that outlined the ideas behind the World Wide Web.
"I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and—ta-da!—the World Wide Web." - Tim Berners-Lee
The Internet, TCP/IP, and DNS had been around for years but it was up to TimBL (as he is known online) to put it all together and create what we all now think of as the ubiquitous Web.
There are some in this room that have lived their entire lives knowing the Web was always there. And, as I look around the room, I can see a few who can recall when the Internet itself was still just an idea.
Around the same time that Sir Tim was working on his proposal at CERN, the mathematician John H. Conway, along with several others, completed the book ATLAS of Finite Groups; a landmark tome that cataloged the known set and characteristics of a series of complex symmetry groups critical to the understanding of both chemistry and physics, including quantum theory.
I find it fascinating that, at the same time Berners-Lee was working to create an infinitely large Web of documents, Conway and his colleagues released a giant physical book measuring over 420 mm x 300 mm, or roughly 17 x 11 inches.
Then nearly 50, Conway had already published close to half a dozen books on math and number theory, each considered a high-water mark in the field. However, it was an article that appeared in the October 1970 issue of Scientific American that vaulted Conway to fame; an article to which we will return shortly.
Finally, while Sir Tim was busy toiling on his proposal and the math world was struggling to incorporate the work of John Conway into quantum physics,
Netherlands native Theo Jansen was working out the details of his idea to use computers to evolve the design of creatures that would independently walk the windy shores of the Netherlands near The Hague, in an effort to arrest the erosion of the country's beaches.
This idea becomes the centerpiece of Jansen's life as he spawns dozens of creatures - some 10 meters in height - that are now a recognizable sight on the beaches near The Hague each summer.
all three images
At the time, none of these three men had met each other. However, like the Web that Sir Tim envisions, they are all connected. They share a common bond in a man long dead; an incredible scientist, thinker, and personality who died in 1957, only two years after Sir Tim was born. It is this man,
build slide of some of his ideas/inventions
The idea of shaped charges to initiate the chain reaction of atomic bombs, the mathematical basis of quantum mechanics, game theory and economics, the Monte Carlo simulation, Mutually Assured Destruction theory, self-replicating robots, and even the merge sort.
photo of John Von Neumann
John von Neumann, who, in the years after his work on the US atomic bomb, establishes the theories and concepts that govern the very computers Sir Tim and Jansen use to do their work, and who is the source of a key mathematical concept central to Conway's contribution to our story.
These are just some of the ideas that came from one man and this is where our search for creative approaches to scaling the API Economy begins.
1.2. Act I : Von Neumann (6min)
map of europe showing hungary highlighted
On a cold December morning in 1903, Neumann Janos is born to an aristocratic family in Budapest, Hungary. The first of three sons in a prominent family, Janos showed early aptitude and received careful instruction. As a 6-year-old, he could divide two 8-digit numbers in his head, and by the age of 8 he was familiar with differential and integral calculus.
He started high school at age 10 and was tutored in advanced calculus at age 15. By 19 he'd published two major mathematical papers; by 22 he'd completed his PhD in Mathematics, minoring in experimental physics as well as chemistry. He even simultaneously earned a diploma in chemical engineering.
He was a busy young man, but not dour. Known for his quick wit and love for bawdy jokes, Janos was often the life of the party.
shot of well-dressed von Neumann
He was also known for his sartorial splendor. Legend has it that, during a vigorous defense of his mathematics PhD, one committee member asked young Janos, "Pray, who is the candidate’s tailor?"
shot of Princeton
After teaching only a few years in Europe, Janos accepted an invitation to a research post at Princeton University in the United States. Upon emigrating to the US, Janos changed his name: Neumann Janos would, from now on, be known as John von Neumann.
build of child and man
Princeton of the 1930s was awash in amazing talent. During his tenure there, von Neumann interacted with the likes of Turing, Teller, Einstein, Gödel, Dirac, Oppenheimer, and even Vannevar Bush, whose memex concept made him a forefather of the modern Web.
group shot or collection of mug shots
And while his contributions to the fields of mathematics and engineering are too numerous to count, there are two items I want to focus on today.
His work in describing computing architecture and his concept of cellular automata. These two contributions touch directly on how we can scale our API Economy.
pic of the two
While at Princeton, von Neumann met briefly with Alan Turing in the late 30s. It was at this time that von Neumann was exposed to Turing's notion of a "Universal Turing Machine": the theoretical basis for a computer that can read both the problem description and the data for that problem from the same place, described by Turing as an "infinite tape."
pic of moore school
A decade later, von Neumann was working with the Moore School of Electrical Engineering at the University of Pennsylvania on the EDVAC project. During this time he wrote a paper describing a computer architecture in which the data and the program are both stored in the computer's memory in the same address space. He used this same approach when working on the ENIAC project. In fact, von Neumann was responsible for the original Op Codes — the programming instructions — of the ENIAC.
pic of von Neumann w/ the ENIAC
BTW - Grace Hopper would later design the first high-level language compiler (the A-0 system, built for the UNIVAC) and help create the COBOL family of computer languages.
pic of grace hopper w/ ENIAC
shot of VN architecture model
And it is this model, what we now call the von Neumann Architecture, that we all rely upon today to build and operate our computers: from mainframes to desktops, to tablets, to hand-helds, and even wearables. Their success is due to von Neumann.
build of mainframe, desktop, tablet, hand-held, and wearable computers
But, it is important to point out that what we know as the "von Neumann Architecture" was not the only view of computing that he gave us.
There was another model, one that does not require a central processor, that von Neumann first envisioned while working with Stanislaw Ulam on the Manhattan Project in the 1940s.
shot of the two (actually includes Feynman)
The only publication of this idea appeared in a work published after von Neumann's death: Theory of Self-Reproducing Automata.
What makes von Neumann’s cellular automata so intriguing is that it is an almost complete opposite of the computing model we all use today. Automata don’t rely on a central processing unit (a CPU). They also do not operate on a linear programming model.
You don’t program automata in the sense we all know today.
automata example grid
Instead, you create a set of basic rules, feed these rules to a collection of cells in a grid, and the results emerge as the cells express the rules you provide them.
Stephen Wolfram wrote about cellular automata in his book A New Kind of Science, published in 2002. And they are central to his Wolfram Alpha web service today.
The exploration of cellular automata has been carried forward over the last fifty years by dozens of researchers.
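To make "rules fed to a grid of cells" concrete, here is a sketch (my own illustration, not material from the talk) of the simplest case: the one-dimensional "elementary" automata Wolfram catalogs. The 8 bits of a rule number completely determine the behavior, yet a rule like 30 produces patterns no one could predict just by reading the rule.

```python
# A minimal elementary cellular automaton. The rule number's 8 bits
# give the next state for each possible 3-cell neighborhood.
# (Grid size, seed, and generation count are arbitrary choices.)

def step(cells, rule=30):
    """Apply one generation of the given elementary CA rule."""
    n = len(cells)
    nxt = []
    for i in range(n):
        # Neighborhood bits: left, self, right (wrapping at the edges).
        pattern = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        nxt.append((rule >> pattern) & 1)
    return nxt

# Start with a single live cell and watch the pattern emerge.
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells, rule=30)
```

The whole "program" is the rule number; everything else emerges.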
At the heart of all of this work is the notion that a set of simple rules, carefully drawn, and implemented over an entire collection of tiny independent machines …
can provide more computing power than any giant mechanical brain or collection of von Neumann-based machines.
And it is this alternate source of computation and deduction that I think we need in order to get past our current API roadblock to the next level of APIs.
same as previous slide
But I’m getting a bit ahead of myself here. Wolfram is not the first to be captivated by von Neumann’s automata.
The most famous implementation of von Neumann's alternate computing model was first brought to life in the 1960s by another brilliant mathematician.
And that’s where our story leads next.
1.3. Act II : Conway (5min)
map highlighting the UK
On a cold morning in December 1937, while von Neumann is settling into his work at Princeton, Cyril and Agnes Conway are celebrating the birth of their third child and first son, John, in Liverpool, England.
These two men — John von Neumann and John H. Conway — share more than just given names. Like von Neumann, the Conway lad shows an incredible aptitude and curiosity at an early age. At age four, Conway is able to recite the powers of two. And by age 11 he had already decided to devote his life to mathematics.
However, unlike the aristocratic Neumanns, the Conways live in working-class Liverpool where Cyril works as a chemist’s assistant. And the young John Conway spends most of his formative years under the shadow of WW II England.
Conway earned his BA in Mathematics in 1959 from the University of Cambridge and completed his PhD work there in 1964. It was during his years at Cambridge that Conway became fascinated with the game of Backgammon, spending long hours playing anyone who walked by.
Conway focused most of his life's work on Number Theory, and his love of games led him to make significant contributions to Combinatorial Game Theory, or CGT.
His 1976 book On Numbers and Games, also known by its acronym ONAG, describes a set of games known as "partizan" games: two-player games of pure strategy, where chance and the roll of the dice play no part. Conway, along with several others, authored a series of books on strategies for this class of games called Winning Ways for your Mathematical Plays.
The material from the ONAG book caught the eye of Donald Knuth (author of The Art of Computer Programming). It was Knuth who came up with the name "Surreal Numbers" to describe the numbers underlying Conway's game theories, and Knuth even wrote a short novel about them with the same name in the 1970s.
But it was several years before Surreal Numbers, before Knuth and ONAG, that Conway made his mark in popular culture.
same slide as earlier
In the October 1970 issue of Scientific American, Martin Gardner wrote an article describing a simple game Conway called "Life".
This month we consider Conway’s latest brainchild, a fantastic solitaire pastime he calls "life". - Martin Gardner, 1970
The game was played out on an empty grid and had deceptively simple rules of Survival, Deaths, and Births.
(don’t read, just show)
Survivals. Every counter with two or three neighboring counters survives for the next generation.
Deaths. Each counter with four or more neighbors dies (is removed) from overpopulation. Every counter with one neighbor or none dies from isolation.
Births. Each empty cell adjacent to exactly three neighbors—no more, no fewer—is a birth cell. A counter is placed on it at the next move.
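Gardner's three rules translate almost line for line into code. Here is a minimal sketch (the grid representation and the seed pattern are my own choices, not from the article):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Life; `live` is a set of (x, y) cells."""
    # Count, for every cell, how many live neighbors it has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in counts.items()
        # Births: an empty cell with exactly three neighbors.
        # Survivals: a live cell with two or three neighbors.
        # Deaths (isolation or overpopulation) simply fall out of the set.
        if n == 3 or (n == 2 and cell in live)
    }

# The "blinker" oscillates between a row and a column of three counters.
blinker = {(0, 1), (1, 1), (2, 1)}
print(life_step(blinker))
```

The entire game engine fits in one function; the richness is in what emerges from it.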
Though originally designed to be played as a solitaire game, interest in (and impatience with) the outcome spurred the creation of a computer program to execute the rules in succession.
For long-lived populations such as this one Conway sometimes uses a PDP-7 computer with a screen on which he can observe the changes. - Martin Gardner, 1970
It was soon discovered that initial "states" of the game had varying but consistent outcomes, and that recognizable shapes would appear and re-appear through successive generations. These could be classified into groups:
from the LifeWiki site:
These are the very automata that von Neumann envisioned in the 1950s. Conway has not only brought them to life, he has exposed an entire world of creatures that live and die as automata. It is more than a literary coincidence that Conway is following up on one of von Neumann's ideas.
same as earlier slide
In fact, in 1986, not long after publishing the ATLAS of Finite Groups, Conway left Cambridge to accept a research position at Princeton University, where he still occupies the von Neumann Chair of Mathematics.
Now, as entertaining as the "Game of Life" can be, it is what is behind the game that fascinates me. Conway has boiled an entire class of games down to a few simple rules and categories. He has shown us that, essentially, we can use these simple rules to create "computers on paper": engines that play out the logic inherent in the rules themselves.
Even more intriguing to me is the notion that we, as humans, cannot always see what the results will be. The logic is expressed in such a succinct and compressed form (the board, the rules, the initial state) that our own brains cannot easily penetrate the details. Instead, we must discover the outcome in real time. The results, invisible to us at the start, emerge over time.
We've not yet gotten used to this idea when creating our own APIs. The notion that we can't see the end before we start is common in real life but, for almost all of us, totally unacceptable when creating computer systems.
When it comes to computing, we want to know each and every step along the way, and we must be assured of how this journey will end before we even start.
However, there is a man who uses computers in this novel way; he allows the computer to take him to places unknown, to tell him which road to take, which turns to make. And it has led him on an extraordinary journey of discovery and creativity.
And that is the next chapter in our story of scaling APIs.
1.4. Act III : Jansen (4min)
map showing netherlands highlighted
Around the time Conway was settling into his position at Princeton, 40-year-old Dutch-born Theo Jansen was contemplating his "next big thing."
Just a few years earlier, while studying Physics at Delft University, he had dazzled, in point of fact panicked, both Delft and Paris with his flying UFOs.
And, after completing his "Painting Machine", which used light sensors to mechanically paint a room scene, Jansen was looking for something new.
And it was a book by the evolutionary biologist Richard Dawkins (The Blind Watchmaker) that gave him the inspiration he was looking for.
And his idea was a stunner:
"I started with the idea that I could make wind-powered “animals” that would live on the beach and build up the dunes to save us from rising sea levels in the coming century." - Theo Jansen
Jansen not only set out to design and build his own creatures, he also set himself some incredible constraints in doing so.
He would use only cheap, readily available materials and no electronics of any kind, rely only on wind power, and make them as self-sufficient and independent as possible.
There are so many things I could talk about, but the one thing in particular I want to mention today is the process by which Jansen designed the "legs" for his creatures.
He discovered early on that simple wheels would not work in the beach sand. He needed something different, something that wouldn't sink into the soft sand near the dunes or lose traction on the hardened sand near the waterline.
And Jansen’s tool of choice in solving this problem was brilliant.
slide of PC computer
Jansen wrote a computer program that would start with a random design for "legs" and, given the set of rules he devised, would continually cycle through birth, death, and mutation in a search for the optimal "sand leg."
"The inherited thrashing movements were copied and, mixed with mutations, distributed to a subsequent generation of once more 200 creatures." - Theo Jansen.
Just like Conway before him (and Wolfram after him) Jansen was using a simple set of rules to allow a computer to come up with an optimal solution. Jansen was using automata.
And this is what Jansen’s computer came up with.
Using this as a starting point, Jansen set about creating all sorts of creatures over the next 30 years to explore the many aspects and challenges of "sand life," including the ability to keep from wandering into the surf, to store up wind power for use when the beaches are calm, and to reverse direction when the beach runs out.
All from a simple set of constraints and a limited collection of inexpensive working materials.
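A loop in the spirit of Jansen's process (copy the best walkers, mix in mutations, repeat) can be sketched in a few lines. Everything here is illustrative: the fitness function is a stand-in for Jansen's sand-walking simulation, and only the population of 200 echoes his description.

```python
import random

def evolve(fitness, genome_len=11, pop_size=200, generations=100, seed=1):
    """Mutation-only evolution: rank, keep the best, copy with noise."""
    rng = random.Random(seed)
    # Generation zero: completely random "leg" proportions.
    population = [[rng.uniform(0.1, 5.0) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Rank designs by how well they "walk"; keep the top tenth.
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 10]
        # Survivors are copied, with small mutations, into the next
        # generation of 200 creatures.
        population = [[g + rng.gauss(0, 0.05) for g in rng.choice(survivors)]
                      for _ in range(pop_size)]
    return max(population, key=fitness)

# Stand-in fitness: reward proportions close to an arbitrary target.
# (Jansen's real fitness test was walking performance on sand.)
target = [1.0] * 11
best = evolve(lambda legs: -sum((g - t) ** 2 for g, t in zip(legs, target)))
```

No one programs the answer; the rules of selection and mutation find it.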
His incredible journey is chronicled in his book "De Grote Fantast" or "The Great Pretender." I highly recommend this book even if you are not interested in the mechanics of his creations.
jansen, conway, neumann
So Jansen takes us back to where we began. He has brought to life the very idea that von Neumann described over fifty years ago: using the computer to evolve a physical apparatus in the way Conway outlined with his Game of Life.
1.5. Epilogue : The Next Big Thing (4min)
From von Neumann’s model of computing.
from previous slide
To Conway’s automata.
from previous slide
To Jansen’s sand creatures.
from previous slide
What we see here is a line of succession from idea to worked concept to living example. And it’s just the start.
One of the last things von Neumann worked on was the notion of exploring space using machines. Machines that are able to navigate on their own, land somewhere, and use the available raw materials to replicate themselves and launch to further reaches of space.
In fact, these are referred to as von Neumann Probes, and there is quite a bit of literature, even commercial work, in this area.
These probes are predicated on the notion that space travel takes eons and only machines can stretch a mission that far into the future. A future we ourselves cannot see.
This is the story line of Arthur C. Clarke's 2001: A Space Odyssey. The monolith is a von Neumann Probe.
And in David Brin’s novel Existence there are thousands of von Neumann probes found all over the galaxy.
But what does this mean for us? For those of us trying to scale APIs beyond 15B connected devices to maybe 40B or more in the next decade?
It means we need to re-think our models of what an API is and how it works.
We need to think creatively about how we can use:
Conway’s game of life …
Wolfram’s rules catalog …
and Jansen's evolution algorithms …
to create a "new form of API" that, like Jansen’s creatures, can safely roam the Web independently.
And there is ample evidence of this work going on in various quarters.
In her book, Complexity: A Guided Tour, Melanie Mitchell of the Santa Fe Institute describes one of her "genetic programming" projects: the Soda Can Robot.
The software spawns 100 robots to wander the grid picking up cans; the top two performers "mate" by splicing their trail histories together and then spawn another 100 offspring from this "DNA." In this way, the program eventually generates creatures that travel the grid along the most efficient path possible. All through a random start, splicing, and multiple generations; no direct programming of the robots.
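The shape of Mitchell's loop can be sketched as well. Her actual can-collecting simulation is too long to reproduce here, so a toy fitness function stands in for it; the "mate by splicing" step is the point:

```python
import random

def evolve_by_splicing(fitness, genome_len=40, pop_size=100,
                       generations=60, seed=2):
    """Top two performers "mate": genomes spliced at a random point,
    with occasional mutations, to produce the next 100 offspring."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        mom, dad = population[0], population[1]
        children = []
        for _ in range(pop_size):
            cut = rng.randrange(genome_len)      # splice point
            child = mom[:cut] + dad[cut:]        # the "DNA" splice
            for i in range(genome_len):          # rare mutations
                if rng.random() < 0.02:
                    child[i] ^= 1
            children.append(child)
        population = children
    return max(population, key=fitness)

# Stand-in fitness: count of 1-bits. (The real project scores how many
# cans a robot's strategy collects on the grid.)
best = evolve_by_splicing(fitness=sum)
```

Again: nobody writes the winning strategy; selection and splicing discover it.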
This is the same kind of work Conway and Jansen are doing.
And it is very much NOT like the work of Google and IBM with their statistically-based learning models.
Predicting what humans want or will do is nothing like learning. Although great strides have been made in this statistical space, it has its limits. And, according to Google's Peter Norvig, we may be reaching a major roadblock.
"As we gain more data, how much better does our system get? It’s still improving—but we are getting to the point where we get less benefit than we did in the past.” - Peter Norvig.
We should not be surprised by this. We saw that the initial wave of the Web was carried by the human-curated, hierarchical indexes of Yahoo and others.
But it wasn't long before Google learned that it was not the content that held value but the links between pages that had real worth. Google realized that value emerged when people linked things together!
This notion of value in the links has its roots in what is called the Power Law, commonly known as the Long Tail.
It is the Power Law that is behind what are known as Scale-Free Networks. And our best living example of a scale-free network today is…
show both TimBL and WWW slides from the top section
The World Wide Web; first defined 25 years ago this month.
So, aside from the obvious, what is the theme running through all the material I talked about today? It is this:
slide set for "small"
And not just "small" as in microservices,
or even nanobots
No. I mean "small" as in small code or even better …
letters NO CODE
Leonard Richardson, creator of the Richardson Maturity Model puts it this way:
http://www.foolscapcon.org/wp-content/uploads/2010/03/Leonard-Richardson.jpg "The maze is a metaphor for hypermedia applications in general." - Leonard Richardson
He says this, not just because the hypermedia Web is like a Maze.
But because, like a maze, the hypermedia Web can be traversed using simple rules. Like Conway’s rules.
Here are all the rules needed to traverse a simple 2D maze of any size:
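One classic rule set of this kind is the right-hand wall follower: keep your right hand on the wall and you will reach the exit of any simply connected maze. A sketch (the maze below is an arbitrary example, not from the slide):

```python
MAZE = [
    "#########",
    "#S..#...#",
    "##..#.#.#",
    "#...#.#.#",
    "#.###.#.#",
    "#.....#E#",
    "#########",
]

def solve(maze):
    """Right-hand rule: try to turn right, else go straight, else turn
    left, else turn back. That is the entire rule set."""
    grid = [list(row) for row in maze]
    r, c = next((i, row.index("S")) for i, row in enumerate(grid) if "S" in row)
    dr, dc = 0, 1                                  # start facing east
    steps = 0
    while grid[r][c] != "E":
        # Headings in priority order: right of current heading,
        # straight ahead, left, and back the way we came.
        for ndr, ndc in ((dc, -dr), (dr, dc), (-dc, dr), (-dr, -dc)):
            if grid[r + ndr][c + ndc] != "#":
                dr, dc, r, c = ndr, ndc, r + ndr, c + ndc
                break
        steps += 1
        assert steps < 10_000, "not a simply connected maze"
    return steps

print(solve(MAZE))   # number of moves from S to E
```

Four lines of rules, no map, no plan; the route emerges from the walk.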
This is what I mean by "small." If we are to scale the API Economy to the next level,
if we are to embrace the Scale-Free nature of the World Wide Web….
…then we have to stop doing this:
and start doing more of this:
so we can let go of this limitation:
and embrace this opportunity:
It's up to us; the people in this room, everyone you work with, and anyone who may be looking at this talk in the future.
We can scale APIs to the next level. But, as the ancient philosopher Lao-Tzu said over 2000 years ago…
And frankly, I’m tired of standing around waiting.
So, let’s take that first step toward our future right now — today.
I want to see what's down that road; what the future holds for us and for our "new form of APIs."
Because, once we start on the journey, there is no telling where it might lead.
2. Loose Notes
What follows are loose notes for the talk
2.1. General Thoughts
TJ uses computers to evolve a solution to his problem. Evolve. His programs are based on requirements ("legs need to work on sand", etc.), desired properties (need to hold up over time, etc.), and constraints (only materials are PVC tubes, pot-stickers, and cable ties). Requirements, Properties, Constraints. See Fielding. See nature(?).
2.2. Machines Times Networks
For most of the last fifty years, our computing efforts have been focused on programming individual machines.
With the exception of a handful of initiatives to program parallel implementations, we’ve not yet progressed to the point of programming the networks themselves.
There are a number of reasons for this, but one of the key elements missing from most programming efforts is the ability to deal with multiple time streams.
Almost all our programming work ignores the role of times. Not a single thing called time, but multiple times.
Machines operate on a single time stream. Networks operate on multiple time streams.
Machines operate on the laws of physics as they are described by Newton.
Networks operate on the laws of physics as they are described by Einstein.
Pat Helland (Data inside/out)
2.3. John H. Conway
Conway’s parents were Agnes Boyce and Cyril Horton Conway. He was born in Liverpool. He became interested in mathematics at a very early age and his mother recalled that he could recite the powers of two when he was four years old. At the age of eleven his ambition was to become a mathematician.
After leaving secondary school, Conway entered Gonville and Caius College, Cambridge to study mathematics. He was awarded his BA in 1959 and began to undertake research in number theory supervised by Harold Davenport. Having solved the open problem posed by Davenport on writing numbers as the sums of fifth powers, Conway began to become interested in infinite ordinals. It appears that his interest in games began during his years studying at Cambridge, where he became an avid backgammon player, spending hours playing the game in the common room. He was awarded his doctorate in 1964 and was appointed as College Fellow and Lecturer in Mathematics at the University of Cambridge.
He left Cambridge in 1986 to take up the appointment to the John von Neumann Chair of Mathematics at Princeton University.
Conway resides in Princeton, New Jersey. He has seven children by various marriages, three grandchildren and four great-grandchildren. He has been married three times; his first wife was Eileen, and his second wife was Larissa. He has been married to his third wife, Diana, since 2001.
2.4. Charles Eames
2.5. Christopher Alexander
2.6. Von Neumann
Born Neumann Janos (family name first) on a cold December morning in 1903 in Budapest, Hungary, John showed evidence of being a math prodigy at an early age. He attended high school at age 10, was studying advanced calculus at 15, and earned his PhD in Mathematics by age 22. He then taught mathematics at the University of Berlin from 1926-1930, making him one of the youngest scholars in residence in its history. It was in 1930 that von Neumann left Europe to take a teaching post at Princeton University in the US.
Died in Feb 1957.
involvement w/ Goedel’s incompleteness theorems
the father of Quantum Mechanics (w/ Dirac) in 1932 The Mathematical Foundations of Quantum Mechanics
founded the field of Game Theory (identified zero-sum games in a paper in 1928)
Theory of Games and Economic Behavior, 1944
key contributor to the Atomic bomb project at Los Alamos
credited with describing the theory of Mutually Assured Destruction (MAD)
helped develop the Monte Carlo method of using a series of random numbers to approximate results
designed and helped implement the first computers where both data and program were stored in memory (EDVAC/ENIAC)
designed the first op codes for ENIAC
Credited (by Donald Knuth) w/ the first merge-sort algorithm
created the field of cellular automata: Theory of Self-Reproducing Automata (published posthumously) [worked out on paper!]
precedes the discovery of the structure of DNA
described the theoretical basis for the first computer virus (1949)
had a habit of reading books while driving; lots of accidents!
2.7. Theo Jansen
Shapeways 3D printing: http://www.shapeways.com/shops/theojansen
Event Schedule: http://www.strandbeest.com/events.php
TED Talk: http://www.strandbeest.com/film_ted.php
Theo Jansen email: email@example.com
influenced by Richard Dawkins' The Blind Watchmaker
1990 Theo begins building strandbeest. Same year WWW is approved by CERN
"The walls between art and engineering exist only in our minds." Theo Jansen
2.7.1. Ability to calculate quickly
Two bicyclists start twenty miles apart and head toward each other, each going at a steady rate of 10 mph. At the same time a fly that travels at a steady 15 mph starts from the front wheel of the southbound bicycle and flies to the front wheel of the northbound one, then turns around and flies to the front wheel of the southbound one again, and continues in this manner till he is crushed between the two front wheels. Question: what total distance did the fly cover? The slow way to find the answer is to calculate what distance the fly covers on the first, northbound, leg of the trip, then on the second, southbound, leg, then on the third, etc., etc., and, finally, to sum the infinite series so obtained. The quick way is to observe that the bicycles meet exactly one hour after their start, so that the fly had just an hour for his travels; the answer must therefore be 15 miles. When the question was put to von Neumann, he solved it in an instant, and thereby disappointed the questioner: "Oh, you must have heard the trick before!" "What trick?" asked von Neumann, "All I did was sum the infinite series."
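The "slow way" in the anecdote can actually be carried out numerically in a few lines; a small sketch summing the fly's legs until the bicycles meet:

```python
def fly_distance(gap=20.0, bike_speed=10.0, fly_speed=15.0, eps=1e-12):
    """The slow way: sum the fly's zig-zag legs one at a time."""
    total = 0.0
    while gap > eps:
        # The fly starts each leg at one bicycle, so the leg covers the
        # current gap at the combined closing speed.
        leg_time = gap / (fly_speed + bike_speed)
        total += fly_speed * leg_time
        # Meanwhile both bicycles kept moving toward each other.
        gap -= 2 * bike_speed * leg_time
    return total

# The quick way: the bicycles meet after 20 / (10 + 10) = 1 hour, so
# the fly simply flies for one hour at 15 mph.
quick = 15.0 * (20.0 / (10.0 + 10.0))

print(fly_distance(), quick)
```

The geometric series and the one-hour shortcut agree, at 15 miles.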
Stuart Russell, Norvig’s co-author of AI: A Modern Approach
2.10. Other Notes
John Von Neumann had the idea of machines that fly into space, stop at some point, duplicate themselves using local materials, then continue on. Get more on this story.
Von Neumann’s (Shannon’s?) idea of a machine w/o any supervisor (Grid stuff)
Wolfram’s Filters are like Conway, right?
Organic life (Melissa Marshall?) (Soda Can Bot)
Hofstadter's work on brains and AI (not statistical AI)
2.11. Computer-based life (aninmachina)
Work on ways to define, describe machine life in a computer (similar to Strandbeest).