In Part 1 we connected to the cloud provider and created a resource template from a virtual system pattern. In Part 2 we created an application blueprint from the resource template, mapped the application components to the blueprint, and then created a new application environment from the blueprint, which provisioned the new virtual system. Once the environment is created and the agents come online, you can execute your normal application deployment process on the newly provisioned environment.
In Part 3, we will go one step further and explore what it would take to satisfy the ultimate goal: a true self-service environment creation mechanism for developers. Let's take a closer look at the use case.
The promise of cloud is that you can have readily available systems at the drop of a hat, or at least much, much faster than ever before. As a developer, I have some new code that I want to test in an isolated environment (we will explore some of the subtle but challenging details behind this idea at the end). It would be awesome if I could go to a portal somewhere, request that a new environment be provisioned, and have my application deployed to it without any knowledge of SCO or UrbanCode Deploy. Well, this capability exists today.
To begin with, SmartCloud Orchestrator has a robust business process engine that allows you to create self-service capabilities with no need to understand what is under the covers. I have no hands-on experience with it, but I have seen the results. You can create processes and human tasks that can be executed from the SCO website, and then categorize your various self-service processes.
The nice part is that you have at your disposal a full development environment and runtime that can leverage existing programming concepts. Of course, we will have to take advantage of the UrbanCode Deploy REST API or command line to drive a deployment process.
Before going on, I want to confess that I have not had the opportunity to get this entire flow working from A to Z; I haven't had access to an SCO environment and enough free time to make it all work. However, I am putting it out there because I believe this is doable.
To satisfy our desire for a fully provisioned environment with an application deployed, we need to set up a process that can do the job. We can use a generic process to get our work done. There is a REST API call that can kick off a generic process, so our SCO self-service process can use it to drive the whole flow. In principle, our generic process can look something like this:
The first step is to provision the environment. This step requires the environment name, the application name, and the blueprint name. These must be passed into the process, so you need to create process properties that can be referenced by this step. NOTE: When we provisioned an environment using the UrbanCode Deploy GUI, it asked us for information that the cloud provider needs. I am not sure how that info is passed here. There is a new command line option called getBlueprintNodePropertiesTemplate, and its description says that it "returns a JSON template of the properties required to provision a blueprint." This template would need to be retrieved and populated to ensure that all of the information is passed to the environment creation process. You might need to extend the create environment step to interrogate the blueprint, get the necessary properties, and ensure they are all populated. If anyone out there has tried this, let me know what you find.
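To make that idea concrete, here is a sketch of how the JSON template returned by getBlueprintNodePropertiesTemplate might be populated and validated before provisioning. I have not seen the real output, so the template shape below is purely an assumption for illustration; inspect actual output from your server first.

```python
import json

# A trimmed guess at what getBlueprintNodePropertiesTemplate might
# return -- the exact shape is an assumption; check real output first.
template_json = """
{
  "nodes": [
    {"name": "WebNode",
     "properties": [{"name": "flavor", "value": ""},
                    {"name": "imageId", "value": ""}]},
    {"name": "DbNode",
     "properties": [{"name": "flavor", "value": ""}]}
  ]
}
"""

def fill_template(template, values):
    """Populate each node property from a {node: {prop: value}} dict
    and collect any property left blank, so the self-service process
    can stop before provisioning fails downstream."""
    missing = []
    for node in template["nodes"]:
        for prop in node["properties"]:
            prop["value"] = values.get(node["name"], {}).get(prop["name"], "")
            if not prop["value"]:
                missing.append(f'{node["name"]}/{prop["name"]}')
    return template, missing

template, missing = fill_template(
    json.loads(template_json),
    {"WebNode": {"flavor": "m1.small", "imageId": "img-1234"},
     "DbNode": {}},  # DbNode flavor deliberately left out
)
print("missing:", missing)
```

The point of collecting the missing list is that the self-service request can be rejected up front with a clear message, rather than discovering a half-populated blueprint mid-provision.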
The other challenge is that we need to ensure the environment name is unique. There is an option in this step to make the environment name unique, but the plug-in step simply appends a random string to the end of the name. This poses a problem for the next step.
Step two is to wait for the environment to be provisioned. We need to wait for the resources (agents) that come online once the provisioned nodes are spun up. If you remember, the agent names follow a pattern. However, if we allow the previous step to make the environment name unique, we will not be able to predict the agent names. Therefore, our self-service call to this process needs to specify the environment name and ensure it is unique.
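One way around the random-suffix problem is to generate the unique name on the caller's side, deterministically, so the agent names remain predictable. This sketch assumes the agent naming pattern is `<environment>-<node>`, which is only an illustration; adjust it to whatever convention your virtual system pattern actually uses.

```python
import re
from datetime import datetime, timezone

def unique_environment_name(base):
    """Make the environment name unique on the caller's side with a
    timestamp, instead of letting the provisioning step append a
    random string we cannot reconstruct later."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    return f"{base}-{stamp}"

def expected_agent_names(env_name, node_names):
    """Derive the agent names that should come online, assuming the
    hypothetical '<environment>-<node>' pattern."""
    return [f"{env_name}-{node}" for node in node_names]

env = unique_environment_name("jpetstore-dev")   # hypothetical base name
agents = expected_agent_names(env, ["WebNode", "DbNode"])
assert re.fullmatch(r"jpetstore-dev-\d{14}", env)
print(agents)
```

A timestamp is not a collision-proof guarantee of uniqueness, but it is reproducible from logs, which is exactly what the random suffix takes away.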
Second, we need to somehow determine how many agents to wait for and their exact names. This is a challenge, and as of right now I am not sure how I would solve it. It would most likely require a new plug-in that can interrogate a blueprint, get the names of the agent prototypes, and then construct the list of agents to wait for. Again, some plug-in development is required here.
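Whatever produces the list of agent names, the waiting itself is a simple polling loop. Here is a sketch where the actual online check is passed in as a callable (in a real plug-in it would wrap an agent-status lookup via the REST API or udclient, which is hypothetical wiring here), so the loop stays testable without a live server.

```python
import time

def wait_for_agents(agent_names, is_online, timeout=1800, interval=30):
    """Poll until every expected agent reports online, or raise if
    the timeout expires. `is_online(name)` returns True once the
    agent has connected."""
    deadline = time.monotonic() + timeout
    pending = set(agent_names)
    while True:
        pending = {a for a in pending if not is_online(a)}
        if not pending:
            return True
        if time.monotonic() >= deadline:
            raise TimeoutError(f"agents never came online: {sorted(pending)}")
        time.sleep(interval)

# Stubbed check for illustration: each agent "comes online" on its
# second poll.
polls = {"env-WebNode": 0, "env-DbNode": 0}
def fake_is_online(name):
    polls[name] += 1
    return polls[name] >= 2

ok = wait_for_agents(["env-WebNode", "env-DbNode"], fake_is_online,
                     timeout=5, interval=0)
print(ok)
```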
Once the agents have come up, we can deploy our application. This step is easy enough: we can call a well-known deploy process for the application to do the installation. But there is another challenge here. Deployments can either be done using a snapshot, or you have to specify the version of each component. Snapshots are easy, but remember our original idea: a developer has some new code that he or she wants to test. Typically, snapshots are not created until a group of components have been tested together. So we have a choice. We can have the environment provisioned from an existing snapshot and then manually add our own updates to test, or we can provide some mechanism to update or copy an existing snapshot to include a baseline plus the new code. This could take many forms, and a thorough examination of the use case would be required to develop a solution. This also may not be the use case we care about.
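Either way, the deploy step boils down to building one request body that carries either a snapshot or explicit component versions. The field names below follow the requestApplicationProcess JSON template as best I recall; treat them as assumptions and verify them against your own server before relying on them.

```python
import json

def build_deploy_request(application, environment, process,
                         snapshot=None, versions=None):
    """Build an application-process request body with exactly one of
    a snapshot name or a {component: version} mapping. Field names
    are assumptions modeled on the udclient JSON template."""
    if (snapshot is None) == (versions is None):
        raise ValueError("specify exactly one of snapshot or versions")
    body = {
        "application": application,
        "environment": environment,
        "applicationProcess": process,
    }
    if snapshot is not None:
        body["snapshot"] = snapshot
    else:
        body["versions"] = [
            {"version": ver, "component": comp}
            for comp, ver in versions.items()
        ]
    return body

# Hypothetical application, environment, process, and version names.
deploy_body = build_deploy_request(
    "JPetStore", "jpetstore-dev-001", "Deploy All",
    versions={"web": "1.4", "db": "1.2"},
)
print(json.dumps(deploy_body, indent=2))
```

The mutual-exclusion check mirrors the snapshot-versus-versions choice discussed above: the self-service front end can expose it as a single toggle and the process never has to guess.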
An alternative is to go a much more custom route and write an application that does all of this instead of relying on existing plug-in steps. The REST API and command line client are very rich, and we can ultimately get at and produce all of the information we need. It is nice to rely on existing capabilities, and UrbanCode Deploy processes are more robust and flexible than a custom solution. But as we have seen above, this effort has enough nuances requiring plug-in extensions or new plug-ins that it might make sense to go the fully custom application route.
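For a taste of what that custom route looks like, here is a sketch of building (but not sending) an authenticated request against the application-process REST endpoint that udclient uses. The endpoint path is my best recollection, and the server URL, user, token, and names are all placeholders; check your server's REST documentation before depending on any of them.

```python
import json
from base64 import b64encode
from urllib import request as urlrequest

UCD_URL = "https://ucd.example.com:8443"   # hypothetical server URL

def make_process_request(body, user, token):
    """Build an authenticated PUT request for the application-process
    endpoint. The path below is an assumption based on what udclient
    appears to call -- verify it against your server's REST docs."""
    auth = b64encode(f"{user}:{token}".encode()).decode()
    return urlrequest.Request(
        UCD_URL + "/cli/applicationProcessRequest/request",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Basic " + auth},
        method="PUT",
    )

req = make_process_request(
    {"application": "JPetStore",            # hypothetical names
     "environment": "jpetstore-dev-001",
     "applicationProcess": "Deploy All",
     "snapshot": "baseline-2014-06"},
    "admin", "my-token",
)
print(req.method, req.full_url)
# To actually send it: urlrequest.urlopen(req)  (needs a live server)
```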
Happy self-service!!! Let me know if anyone takes on this challenge.