Infrastructure As Code

Posted on 2020-02-19 in misc • 3 min read

Last week I reached another milestone of sorts on my journey as a developer. I took a bit of a deep dive into the AWS Cloud Development Kit (CDK). CDK is a development framework that allows a developer to stand up cloud infrastructure “stacks” by writing code in any one of a handful of popular programming languages. Not surprisingly, my personal favorite - Python - gets a seat at the table.
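To make that concrete, here is a minimal sketch of what a CDK app in Python can look like - a single stack that declares one versioned S3 bucket. The stack and bucket names are made up for illustration, and it assumes the CDK v1 Python packages (aws-cdk.core and aws-cdk.aws-s3) are installed:

```python
# A minimal sketch of a CDK app in Python (CDK v1 era). Names are illustrative.
from aws_cdk import core
from aws_cdk import aws_s3 as s3


class StaticSiteStack(core.Stack):
    """One small stack: a single versioned S3 bucket."""

    def __init__(self, scope: core.Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Declaring the resource in code is all it takes; `cdk synth` turns it
        # into a CloudFormation template and `cdk deploy` provisions it.
        s3.Bucket(
            self,
            "SiteBucket",
            versioned=True,
            removal_policy=core.RemovalPolicy.DESTROY,
        )


app = core.App()
StaticSiteStack(app, "static-site")
app.synth()
```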

One general trend in software development is towards the “single-threaded” software development team (AWS’s term, I can’t take credit for it). This refers to the notion of having developers run the majority of the tests and manage infrastructure deployments, in addition to writing the actual code. This development pattern typically includes a CI/CD pipeline that automatically releases new code versions to production once they pass the automated test suite. Configuration-management tools like Ansible, Chef, SaltStack, and Puppet handle the provisioning side, while CI services like Travis CI handle the automated build-and-test side, and together they serve this purpose as part of an integrated stack.
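As a toy illustration of that “deploy only if the tests pass” gate - and nothing more - here is a hedged sketch in Python. Real pipelines express this declaratively in their own config files; the stack name my-app-prod is a placeholder I made up:

```python
# Illustrative only: a bare-bones "gate" script that mimics what a CI/CD
# pipeline does -- run the test suite, and only deploy if everything passes.
import subprocess
import sys


def main() -> int:
    # Step 1: run the automated test suite.
    tests = subprocess.run(["pytest", "-q"])
    if tests.returncode != 0:
        print("Tests failed; refusing to deploy.")
        return tests.returncode

    # Step 2: tests passed, so ship the new version of the stack.
    # "my-app-prod" is a hypothetical stack name.
    deploy = subprocess.run(
        ["cdk", "deploy", "my-app-prod", "--require-approval", "never"]
    )
    return deploy.returncode


if __name__ == "__main__":
    sys.exit(main())
```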

This approach to SaaS delivery is great for all the same reasons that scripted instructions on the command line will always run rings around pointing and clicking on a screen with a mouse. Better yet, a scripted approach is traceable, auditable, and repeatable. DevOps automation is great for all the same reasons that git is the best tool for source control and Docker has revolutionized environment management:

  • you can copy and reuse the automation code on the next project
  • commands are scripted in plain text, which is lightweight and transparent
  • the code is self-documenting because every step of the process is written out, in order
  • workflows can be version controlled with branching and “time travel”

An interesting parallel to the physical world

After getting fully stuck into the CDK tutorials and workshops, my brain was sufficiently stretched and ready for some restorative rest and relaxation. After logging off, I realized there was an even bigger parallel to the AEC industry. When we think of DevOps and deployment automation, we talk of infrastructure in terms of networking, servers, GPUs, and object storage. However, well before computing infrastructure became ‘a thing’, we had the concept of infrastructure in the form of physical assets like roads, bridges, and tunnels. The light bulb switched on when I realized that there is a similar movement afoot to think of these physical assets, too, in terms of infrastructure as code.

For example, as digital twins become more and more commonplace, they will require code to turn sensor data into actionable information and dashboard visualizations. We are also seeing computational geometry used as a tool for quickly evaluating multiple design options and rapidly iterating through “what if” scenarios and optioneering. These tools typically take the form of visual programming, but a full-on programming language is lurking just beneath the surface - IronPython, in the case of Dynamo. And that’s to say nothing of the 800 lb gorilla that is AI and machine learning. These heavy-duty algorithms and models run on multiple layers of complex code and are poised to disrupt AEC as much as, if not more than, other market sectors.
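As a rough sketch of that first point, here is what the “sensor readings in, actionable information out” step of a digital twin might look like in Python. The field names, alert threshold, and sensor IDs are all invented for illustration:

```python
# Toy example of turning raw sensor readings into something actionable.
# The field names, threshold, and sensor IDs are invented for illustration.
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List


@dataclass
class Reading:
    sensor_id: str
    strain: float  # hypothetical strain-gauge reading, in microstrain


def summarize(readings: List[Reading], alert_threshold: float = 400.0) -> Dict:
    """Roll raw readings up into a dashboard-friendly summary."""
    worst = max(readings, key=lambda r: r.strain)
    return {
        "average_strain": round(mean(r.strain for r in readings), 1),
        "worst_sensor": worst.sensor_id,
        "needs_inspection": worst.strain > alert_threshold,
    }


if __name__ == "__main__":
    sample = [Reading("girder-01", 210.0), Reading("girder-02", 455.5)]
    print(summarize(sample))  # flags girder-02 for inspection
```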

Conclusion

It’s been said that there is huge potential for tech disruption in AEC because we’ve been much slower at transitioning to digitalization than other sectors have. Even though we might be last to the table, we can borrow concepts from other industries - such as the practice of DevOps for “just-in-time” software delivery - to realize quick gains in safety, quality, and efficiency with relatively low effort. That’s why I’m happy spending my time in a POSIX shell, picking through a forest of seemingly random characters and indentations. It’s true that this is a window to the past worlds of mainframes and time sharing. But it’s also true that this is a preview of the future.

AWS