Database-as-a-Service

Recently, my colleague Dimitri and I delivered a Proof of Concept (PoC) with vRealize Automation 7 at a customer.

During the PoC we proved a few different use cases, such as deploying a Linux virtual machine with a JBoss application through an integration with XLDeploy, and deploying a multi-tier SharePoint environment.

Another important use case was called ‘Database-as-a-Service’ (DBaaS).

First: Define Database-as-a-Service (DBaaS)

First we had a discussion about the definition of Database-as-a-Service. On the VMware Solution Exchange there is a download which claims to deliver Database-as-a-Service: it provides automated installation and configuration of Microsoft SQL Server as a database service (DB-as-a-Service). In our opinion, and that of the customer, this is not Database-as-a-Service. Of course this is all a semantic discussion, but let me explain.

The ‘as-a-Service’ addition to an item or solution generally determines the demarcation of responsibilities. So Infrastructure-as-a-Service delivers a pool of resources on which a customer can install virtual machines. The customer does not care what type of storage or which servers are used, and does not have to maintain the underlying hardware. With Software-as-a-Service the customer gets an instance of the required software which delivers the functionality the customer needs. The customer does not care which hardware, operating system, etc. the software runs on, and does not have to maintain the underlying hardware, operating system or software.

So when we use this to define Database-as-a-Service, the customer can request a database. Once it is delivered, the customer can do whatever they want with the database, but they do not care on which hardware it runs, which operating system it uses or which database application is used. Most importantly, they do not need to maintain the hardware, operating system or database application. They just get a database … OK, in real life it may be crucial to know which database application is running (SQL Server, Postgres, etc.) for compatibility purposes, so SQL-as-a-Service may be a better name, but we will call it Database-as-a-Service for now.

So with Database-as-a-Service a customer should be able to request a new database on an existing database server.

How do we do that? Well, first of all you need one or more database servers on which we can deploy new databases. In this case we used a Microsoft SQL Server 2014 SP1 instance.

Running the script inside the guest OS with vRealize Orchestrator

Next we need a script which creates a database on an existing SQL Server instance. We used the PowerShell script below:
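A minimal sketch of such a script is shown below. The bracketed tokens ([DatabaseName], [SystemTableSize], and so on) are the placeholders that the Guest Script Manager replaces with the workflow inputs before the script runs in the guest; the file paths, filegroup name and module name are illustrative assumptions, not the exact script from the PoC:

```powershell
# Sketch of a create-database script for the Guest Script Manager.
# The bracketed tokens are substituted by vRealize Orchestrator before
# the script is executed inside the guest OS. Paths are illustrative.
Import-Module SqlServer   # or SQLPS on older SQL Server 2014 hosts

$sql = @'
CREATE DATABASE [DatabaseName]
ON PRIMARY
    (NAME = N'[DatabaseName]_sys', FILENAME = N'C:\SQLData\[DatabaseName]_sys.mdf',
     SIZE = [SystemTableSize]MB),
FILEGROUP AppData
    (NAME = N'[DatabaseName]_app', FILENAME = N'C:\SQLData\[DatabaseName]_app.ndf',
     SIZE = [AppTableSize]MB, MAXSIZE = [AppTableMaxSize]MB)
LOG ON
    (NAME = N'[DatabaseName]_log', FILENAME = N'C:\SQLData\[DatabaseName]_log.ldf',
     SIZE = [LogFileSize]MB)
'@

# Run the generated T-SQL against the target instance.
Invoke-Sqlcmd -ServerInstance '[ComputerName]' -Query $sql
```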

We are going to run this script using the VM Guest Script Manager. The Guest Script Manager allows you to add, edit and delete Bash, batch or PowerShell scripts and run them inside the virtual machine guest OS using VMware Tools. You can download the Guest Script Manager package from the VMware Development Center. Once downloaded, import the package into your Orchestrator instance.

Now we need to add this script to vRealize Orchestrator by running the ‘Guest Script Manager/Script Management/Add script configuration‘ workflow. Name your imported script ‘Create a new SQL database code‘ (I added ‘code’ to the name to distinguish between the code and the actual workflow). Select ‘PowerShell’ as the script type and paste the script into the script content window. Set a timeout value (I used 30 seconds) and a refresh interval.

How to call the script?

Now that we have imported the script, we need to create a workflow to run it. This is the workflow we’re going to call as an XaaS service from vRealize Automation. Besides calling the script, this workflow also collects the input values required to create the database.

So create a new workflow called ‘Create a new SQL Database‘.

  • Drag and drop the workflow element ‘Run script in VM guest’ onto your workflow canvas.
  • Click the Setup button at the top right to set the workflow parameters.
  • Set the ‘scriptConfiguration’ mapping type to ‘Value’, click the input value and type the name of the script (‘Create a new SQL database code’) in the filter. Select it.
  • Set the ‘scriptVariables‘ mapping type to ‘Value‘ but do not set an input value.
  • Because I do not want end users to know the username and password used to connect to the SQL Server, I enter them here as static values:
    • Set the ‘vmUsername’ mapping type to ‘Value‘ and enter the username of a user with access to your SQL Server instance (in my case the local Administrator).
    • Set the ‘vmPassword‘ mapping type to ‘Value‘ and provide the password for this user.
  • Click ‘Promote‘.

The result should look like this.

Map the workflow inputs to the script variables

The next step is to map the script variables to workflow inputs so we can provide a database name, size, etc. at request time. As you have probably seen, the script has the following input variables:

  • [DatabaseName]
  • [ComputerName]
  • [SystemTableSize] in megabytes
  • [AppTableSize] in megabytes
  • [AppTableMaxSize] in megabytes
  • [LogFileSize] in megabytes

For each of these variables we need to add an action element to the workflow that maps a workflow input to the script variable, which results in an input prompt when the workflow runs. To do this:

  • Map DatabaseName:
    • Drag and drop an action element on the canvas and place it before ‘Run script in VM guest’.
    • In the search field, choose the ‘addScriptVariableToArrayOfScriptVariables‘ action.
    • Click the Setup button at the top right to set the workflow parameters.
      • Rename “stringToReplace” to ‘DatabaseNameVar‘.
      • Set its mapping type to ‘Value‘.
      • Select ‘Input value’ and set it to ‘[DatabaseName]’.
      • Rename “replacingString” to ‘DatabaseName’.
      • Leave its mapping type on ‘Input‘.
      • Set the output to ‘scriptVariables‘.
      • Click ‘Promote’.
  • Repeat the same steps for the remaining variables, changing only the names:
    • ComputerName: rename “stringToReplace” to ‘ComputerNameVar‘ with input value ‘[ComputerName]’, and “replacingString” to ‘ComputerName’.
    • SystemTableSize: rename “stringToReplace” to ‘SystemTableSizeVar‘ with input value ‘[SystemTableSize]’, and “replacingString” to ‘SystemTableSize‘.
    • AppTableSize: rename “stringToReplace” to ‘AppTableSizeVar‘ with input value ‘[AppTableSize]’, and “replacingString” to ‘AppTableSize‘.
    • AppTableMaxSize: rename “stringToReplace” to ‘AppTableMaxSizeVar‘ with input value ‘[AppTableMaxSize]’, and “replacingString” to ‘AppTableMaxSize‘.
    • LogFileSize: rename “stringToReplace” to ‘LogFileSizeVar‘ with input value ‘[LogFileSize]’, and “replacingString” to ‘LogFileSize’.

Each mapping should look like this (Example: DatabaseName):

Your completed workflow should look like this:
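Conceptually, each of these action elements appends a (stringToReplace, replacingString) pair to the scriptVariables array, and the ‘Run script in VM guest’ workflow replaces each placeholder in the script body before sending it to the guest OS. A minimal PowerShell sketch of that substitution logic (the variable names and values here are illustrative, not the vRO implementation):

```powershell
# Illustrative sketch of what the scriptVariables mechanism does:
# each entry maps a placeholder in the script body to a runtime value.
$scriptVariables = @(
    @{ StringToReplace = '[DatabaseName]'; ReplacingString = 'CustomerDB01' },
    @{ StringToReplace = '[ComputerName]'; ReplacingString = 'SQLSERVER01' },
    @{ StringToReplace = '[LogFileSize]';  ReplacingString = '256' }
)

# The script content as stored in the Guest Script Manager, with placeholders.
$scriptContent = 'CREATE DATABASE [DatabaseName] ... LOG ON (SIZE = [LogFileSize]MB)'

# Replace every placeholder before the script runs in the guest.
foreach ($var in $scriptVariables) {
    $scriptContent = $scriptContent.Replace($var.StringToReplace, $var.ReplacingString)
}

$scriptContent   # -> CREATE DATABASE CustomerDB01 ... LOG ON (SIZE = 256MB)
```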

The only thing left to do is to create an XaaS service in vRealize Automation that calls the workflow you just created, and design a nice input form. To do this, go to ‘vRealize Automation/Design/XaaS Blueprints‘, select ‘New Blueprint‘ and select the Orchestrator workflow you’ve just created.

Give it a name, e.g. ‘Create a new SQL Server database‘, and a version (1.0.0). Design a nice, user-friendly input form: for example, rename the AppTableMaxSize input to something friendlier like ‘What would you like as a maximum application table size?’.

Besides that, I also created a dropdown list for the SQL Server input, to prevent users from having to guess the correct SQL Server. You can of course populate this list from an external source, or query for every VM with SQL in its name.
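Such a query could be sketched with PowerCLI, for example. This is a hypothetical helper, assuming VMware PowerCLI is installed, a connection to vCenter, and a naming convention where database servers contain ‘SQL’ in the VM name:

```powershell
# Hypothetical helper: list powered-on VMs whose name contains 'SQL',
# e.g. to feed a dropdown of candidate database servers.
# Assumes VMware PowerCLI is installed; the vCenter address is illustrative.
Connect-VIServer -Server 'vcenter.example.local'

Get-VM |
    Where-Object { $_.Name -match 'SQL' -and $_.PowerState -eq 'PoweredOn' } |
    Select-Object -ExpandProperty Name
```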

Next, publish the XaaS blueprint, set the correct entitlements, and you have yourself a shiny new Database-as-a-Service (DBaaS). You can now request a new database on an existing SQL Server.

And the result is this.


Special thanks to Dimitri who invested a lot of time researching and perfecting this Database-as-a-Service!