Decentralized deployment architecture

All deployments currently imply a central authority, whether a company or a developer running some kind of master system with control over the rest of the network. This implies control structures that align with human hierarchy, a thing we've already placed in legacy.

In order to subvert existing human hierarchies, we need technology that doesn't operate the way humans do... well, not in their whole being. We need to look to organic chemistry. A virus, an infection, something that can replicate itself and spread. Less the negative connotations of disease and more the biological process of cell replication. Yes, we'll still have to have central listings of some form. The internet has DNS, indicating where things are located across the distributed network (as does Tor). And we'll need repositories of some kind to ensure the authenticity of the packages (GitHub, RPMs and whatnot). But by intentionally creating an organic deployment methodology, we can avoid the emergence of control.

Now it's not that order is a bad thing. Surely order has given rise to the great (and fading) civilizations we find ourselves in today (whatever the day is; we'll make new ones if these constructs fail us). But nature can break through and change society where code wrapped in business logic cannot. We need to liberate and decentralize; so less Twitter (distributed as it may be, it is the only broker of tweets) and more email (truly distributed throughout the world, with no ownership of the standard).

 

Traditional Deployments (modern day)

Products live in containers next to each other without knowledge of who's who.

Modern deployments use Docker or some other form of virtualization. They effectively take the ins and outs of an application, package them up, and then slap them in "the cloud" (AWS... all of them). Once there, when you buy into a product, the vendor is basically just spinning up a containerized instance for you on AWS and charging you overhead. SaaS works because you are lazy (we all are) and we like the simplicity of someone else taking out our garbage rather than having to haul it to the dump every week.
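To make the pattern concrete, here's a minimal sketch using the Docker SDK for Python; the image name and environment variable are hypothetical stand-ins for whatever a given vendor actually ships:

```python
# Minimal sketch of SaaS-style tenancy: one containerized instance per
# customer. "vendor/product:latest" and TENANT_ID are hypothetical.
import docker

client = docker.from_env()

def provision_tenant(customer_id: str):
    """Spin up a dedicated containerized instance for one paying customer."""
    return client.containers.run(
        "vendor/product:latest",            # the vendor's product image
        name=f"tenant-{customer_id}",
        environment={"TENANT_ID": customer_id},
        detach=True,                        # run in the background
    )

provision_tenant("acme-co")                 # new customer, new container
```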

Unfortunately, this puts us in the passenger seat of innovation. By giving up the routine of the garbage, we're beholden to the collection and disposal patterns of the garbage men. We don't need to worry ourselves with innovation; they'll ensure trash A gets to dump site B. Now turn off your brain and start printing money on your existing workflows.

 

Grabbing control

By taking control, we are taking on risk (in the short term) in order to gain creative direction in the future. Perhaps we no longer want to shill garbage; perhaps we want to recycle the methane off-gassing of shallowly dug dump sites while utilizing solar-powered vehicles. We now have that level of creative control. But unlike this fictitious scenario, we don't have to worry ourselves with the mechanisms involved; it's technology, and containerized technology at that. We can do (and scale) anything.

Containerizing, by which here I mean using separate hardware per client, is the way to increase innovation as well as reduce risk. We can keep existing systems in place with existing client bases while driving forward with new, innovative clients that have unique needs, without worrying about corrupting the successes of yesteryear.

But what if we wanted to take containerization a step further? What if we containerized different aspects of existing clients, enabling us to innovate more rapidly even within existing client bases? This must come at the expense of complexity and risk, right? Not if we start designing our deployments and products like living things.
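As a loose sketch of those two levels of isolation (the hosts and aspect names below are invented for illustration, not a real topology):

```python
# Two levels of isolation: each client on its own hardware, and each
# aspect of a client's product in its own container on that hardware.
deployments = {
    "client-a": {"host": "client-a.example.org",
                 "aspects": ["web", "reports"]},                      # stable
    "client-b": {"host": "client-b.example.org",
                 "aspects": ["web", "reports", "experimental-api"]},  # innovating
}

for client, spec in deployments.items():
    for aspect in spec["aspects"]:
        # one container per aspect, confined to that client's own server,
        # so an experiment for client-b can't corrupt client-a
        print(f"deploy '{aspect}' container on {spec['host']}")
```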

 

Deploying an organic compound

Agency of websites

We need to start thinking of our applications and their deployment as living things. We aren't just creating something people see on the web; we're creating engagement, uniting people across barriers and putting them in front of the information they need to live better lives. These experiences with a technology produce, in effect, agency.

While this is not an autonomous, living, thinking organism, it is the first step toward the evolution of more complex beings. Mapping the internet to evolutionary processes, we are describing single-celled organisms. They can't do much on their own, but they are alive, and they mostly just react to the stimuli the world prods them with.

When we enter input, we get a reaction, and so on.
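Just to put the analogy in code (a toy, not any real system):

```python
# A toy single-celled application: no memory, no autonomy, it only
# reacts to whatever stimulus it is prodded with.
def cell(stimulus: str) -> str:
    reactions = {"poke": "recoil", "feed": "grow"}
    return reactions.get(stimulus, "ignore")

print(cell("feed"))   # input in, reaction out, nothing more
```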

The fun thing about living systems, and about designing ours as such, is that single-celled organisms don't remain that way for long. Interestingly, as we move up through biology, complex structures are often formed from simpler patterns repeated thousands of times.

 

Organically patterned deployments

Simple forms complex structures

An organically patterned deployment is a system that has the capability to set itself up again. I'm not talking about traditional containerization, where there's a host of some kind doing the replication. I mean a system that is able to set itself up, and then set up replicants of itself which are fully functioning duplicates. A deployment methodology that patterns itself more off nature than our legacy human hierarchies.
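A minimal sketch of what "setting itself up again" could look like, assuming the repo URL, target host, and install script below as placeholders (the real mechanism would be whatever the project's own install routine is):

```python
# The running system imprints a fully functioning duplicate of itself on
# a fresh server by handing over its own "DNA": the git repo plus the
# install script it was born from. REPO and install.sh are hypothetical.
import subprocess

REPO = "https://github.com/example/organism.git"

def replicate(target_host: str):
    """Set up a complete, independent copy of this system on target_host."""
    subprocess.run(
        ["ssh", target_host,
         f"git clone {REPO} organism && cd organism && ./install.sh"],
        check=True,
    )
```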

Cell division is the biological process by which a cell grows and splits off a functional replica of itself, once the initial cell structure has reached an apex through the consumption of resources. For plants, those resources are sunlight, water, and minerals. For system deployments, they could be usage, users, and data.

After enough usage, users, and data have accumulated in a single system, the system follows the natural biological process (because we are creating life) by which it's time to grow. This could mean more servers, more RAM, more processors, more storage, etc. In the case of a deployment methodology, though, it means the system will imprint itself upon a new server.
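Expressed as code, the apex could be a simple threshold check; the numbers here are invented, and replicate() is the sketch from the previous section:

```python
# Division trigger: once any resource the cell "consumes" is saturated,
# it's time to grow by imprinting onto a new server. Thresholds invented.
THRESHOLDS = {"requests_per_day": 1_000_000, "users": 10_000, "data_gb": 500}

def should_divide(metrics: dict) -> bool:
    """Has this cell consumed enough resources to reach its apex?"""
    return any(metrics[key] >= limit for key, limit in THRESHOLDS.items())

if should_divide({"requests_per_day": 1_200_000, "users": 4_000, "data_gb": 80}):
    replicate("cell-b.example.org")   # see the replication sketch above
```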

 

Replication

Creating Life

The git repo has instructions built in that describe how to set up a new copy of itself. Once the new copy has been established, the new cell can call home to the old one and ask for additional instructions. This is where we start to mirror nature in creating complexity from organized simplicity. The new cell (Cell B) is told that it was created because aspects of Cell A were becoming overburdened and needed to migrate. Maybe too many people are using function X, which can be split off into a dedicated function on Cell B. Cell A then transfers everything needed to accomplish function X onto Cell B.
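A hedged sketch of that call-home handshake, with invented endpoint names (the architecture only stipulates that the cells talk to each other over web services):

```python
# Cell B, freshly set up, phones Cell A for instructions: which
# overburdened function should it take over, and what does it need to
# run it? The /divide/* endpoints are hypothetical.
import requests

PARENT = "https://cell-a.example.org"

def call_home():
    # Ask the parent why we were created and what to specialize in.
    instructions = requests.get(f"{PARENT}/divide/instructions").json()
    function_x = instructions["function"]            # e.g. "gradebook"
    # The parent transfers everything needed to run function X here.
    export = requests.get(f"{PARENT}/divide/export/{function_x}")
    with open(f"{function_x}.tar.gz", "wb") as f:
        f.write(export.content)
    return function_x
```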

Because it's nature and we're talking about small, replicated tasks over time, it is important to note that in the evolutionary process, Cell B could (technically) handle all the tasks of Cell A. The point of the division is to create a new, more complex organism that's able to delegate functions of the being more effectively. As Cell A and Cell B expand in capabilities, users, data, and overall usage, the process could be repeated, forming a new organism that's made up of the simpler parts.

 

My work

This is the direction we're taking ELMS Learning Network, the educational technology platform that I help drive. The system will be treated like a living thing. Through the replication of simple tasks (a system that sets itself up) and the instructions to set itself up again (its DNA) included in the repo, we can start to form fractal deployments of ELMSLN. A fractal deployment could be a copy of ELMSLN that is actually several copies working together to form one cohesive suite of systems. Say there are 13 systems in ELMSLN, each a piece of discrete functionality, that use RESTful web services to talk to each other. These could then live on 13 different servers, since they are already served from 13 different domains.

The organic fractal component to the ELMSLN architecture would then suggest that 13 different copies of ELMSLN could be stitched together seamlessly to act as one system for its user base. These systems would be stood up via cell division from the original deployment. Using SSH keys and automated DNS reconfiguration, we could have the Cell A deployment set the rest of the network up automatically. It could, in effect, mutate from a single-celled organism into a more complex one. And because it knows how it set itself up, and the rest of itself knows how it set itself up (meta), we could put in place security procedures and mechanisms that let the network act proactively. We could start to program the single-celled organism to perform increasingly autonomous tasks, which then propagate to the rest of the organism to produce exponentially more autonomy.
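Sketched out, with an illustrative subset of the 13 systems and a stubbed-out DNS hook (neither is the real ELMSLN manifest or API):

```python
# Cell A mutating into the full network: one server per ELMSLN subsystem,
# each imprinted via replicate() above and registered under its own domain.
SYSTEMS = ["courses", "media", "analytics"]      # illustrative; ELMSLN has 13

def register_dns(host: str):
    """Stub for whatever automated DNS reconfiguration is available."""
    print(f"DNS: pointing {host} at its freshly imprinted server")

def divide_network(base_domain: str):
    for system in SYSTEMS:
        host = f"{system}.{base_domain}"
        replicate(host)          # SSH in and set up a new cell (see above)
        register_dns(host)       # stitch it into the one cohesive suite
```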

 

At that point, we're no longer just creating software; we're creating life.