
Gregg Stewart's Blog


Git journey

Posted by Gregg_Stewart Employee Jul 31, 2015

In my last blog post I outlined some automation tasks I was interested in moving forward with. I had made progress on some of the customizations needed for the log analysis work (ELK), but I have since postponed that project to finish setting up a Chef environment. I had been wanting to set up a Chef environment for a while, partly because of the integration CA Release Automation has for it, and partly because I found myself getting to a point in my ELK customization work where I might want to begin using Maven and other development tools. In addition to those two motivating factors, I wanted to begin migrating some of my software installations, currently defined in CA Server Automation "Applications" and "Service Templates", to Chef, or at least to stop writing new "Applications" in CA Server Automation and write them in Chef instead. The CA Server Automation Self Service Portal has been great, and I probably wouldn't completely replace all of the Applications. But there are times when I would like to easily install various components after a server has already been provisioned, and I feel Chef is a better fit for that.


So... how did we get from all of that to a "Git journey"? Well, once I began investigating Chef I realized and learned a few things:

  • There are tools called chef-solo and chef-zero that appear to be able to use the same cookbooks designed for chef-server. In this case I'll probably use Git, so that provisioned systems install ChefDK, pull the chef-repo from Git, and install various components without involving the Chef server (see the sketch after this list).
  • I thought that I would use Git as a place to back up the chef-server related repo data.
  • After reading various cookbook development articles I learned that it is good practice to treat your cookbooks like any other source files that you want to version.
  • Plus, it makes sense to store the customization work that I'm doing for the log analysis piece in Git.

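As a minimal sketch of what that chef-zero/local-mode step could look like on a freshly provisioned node (the repo URL, clone path, and run list below are placeholders I made up for illustration, and the commands assume ChefDK is already installed):

    # Pull the chef-repo onto the node (hypothetical URL and path)
    git clone https://github.example.com/orgName/chef-repo.git /opt/chef-repo
    cd /opt/chef-repo

    # Run chef-client in local mode (chef-zero), so no Chef server is involved.
    # The run list names a recipe that exists in this repo's cookbooks directory.
    chef-client --local-mode --runlist 'recipe[maven]'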

Along the way in my Git journey I have found a couple of helpful resources. So far I found:

 

Here are some of the challenges I've had so far with Git:

  • A local Git repo can be anywhere on your machine, and it appears that each one is configured (or not) with certain settings. I don't yet know the commands to list all of the existing configuration, but it looks like it is tracked via the .git/config file.
  • I've seen terms like origin, master, HEAD, index, etc. that I don't fully understand yet.
  • I've experienced 403 errors when doing a git push -u origin master. This was resolved by updating my .git/config so that the url in my remote "origin" section was set to:
    • https://<user>:<password>@github...url/orgName/repoName.git
    • But there is no way that I'll do this on a permanent basis. I'm not sure why it didn't prompt for a user/password; if I can't figure that out I'll look into different git push options. I've seen something about HTTPS vs SSH, and I've also read the Git credential-store documentation, but even that documentation says the credentials are stored in a file that is not encrypted and is only protected by filesystem permissions (there's a sketch of the credential-helper commands after this list).
  • I made one branch (master) and one tag (default-repo). The default-repo tag is a clone of https://github.com/opscode/chef-repo.git, and the master branch is default-repo plus a cookbook/maven folder (which I grabbed from Chef Supermarket). But then I noticed that the README.md included text at the very top saying the repo was deprecated and recommending the "chef generate repo" command instead. So now I wanted to update both my master branch and my default-repo tag to use that as their base.
    • I first focused on getting just a copy of the tagged repo, which wasn't working out, so I found a post on Stack Overflow explaining that git clone gives you the whole repository and that after the clone you need to run 'git checkout tags/<tag name>'. I confirmed that the directory structure was the same as the default-repo tag.
    • Then I updated the directory contents with the chef-repo from ChefDK, but I have had problems getting it pushed to the remote Git server. I'm still not entirely sure how to do this; based on some articles I've seen so far it looks like I might need to rebase something or other (a rough command sketch follows this list). While mentioning some of this to Adam Lewandowski, he pointed out that Git has a UI tool that can be used to more easily manage local repositories. I think I'll give that a try.
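
Since a couple of these challenges come back to not knowing what a repo is actually configured with and how credentials get handled, here is a small sketch of the standard commands I plan to try next. It reflects my current understanding rather than something I've fully verified:

    # List the settings Git is actually using; --local restricts it to this repo's .git/config
    git config --list
    git config --local --list

    # Show where the remote "origin" points without editing .git/config by hand
    git remote -v

    # Instead of embedding user:password in the remote URL, a credential helper can be used.
    # 'store' writes credentials to an unencrypted file (~/.git-credentials), while
    # 'cache' only keeps them in memory for a limited time (one hour here):
    git config --global credential.helper 'cache --timeout=3600'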
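
And for straightening out the branch and tag, this is roughly the sequence I have in mind, pieced together from the Stack Overflow post and the "chef generate repo" recommendation. Treat it as a sketch under my current assumptions (remote named origin, work happening on master, placeholder URL), not a confirmed procedure:

    # A clone brings down the whole repository, including its tags
    git clone https://github.example.com/orgName/repoName.git
    cd repoName

    # Inspect the tagged snapshot (this leaves you on a detached HEAD)
    git checkout tags/default-repo

    # Rebuild master on top of the layout produced by ChefDK's generator
    git checkout master
    chef generate repo /tmp/chef-repo-base      # generate the recommended base layout elsewhere
    # ...copy the generated layout over this working tree, then commit and push...
    git add -A
    git commit -m "Adopt the 'chef generate repo' layout"
    git push -u origin master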

The beginning

Posted by Gregg_Stewart Employee Jul 31, 2015

For a long time I have felt that automation is the key to being more productive and efficient. About 18 months ago I began using one of CA Server Automation's provisioning features, the Self Service Portal. I felt its user interface (a Liferay portal) was nicer than the Self Service Reservation Manager (SSRM), the ability to install software didn't require the CA Client Automation integration, and the Application/Service Template model it uses was a step in the right direction for re-usability.

 

Eighteen months later I've provisioned over a thousand test servers for my teammates and myself using approximately 100 different service templates. These service templates consist of probably 30+ different applications, and various versions of those applications, for Windows and Linux. Even though I have had some challenges along the way, and sometimes the jobs fail for various reasons (disk space on the SA server is too low, VC template assignment problems, etc.), I can't imagine not having this tool. It makes it so much easier to stand up systems with the software I need to do testing. Whether it's Oracle on Linux, a WebSphere Application Server, or a CABI standalone or N-Tier set of systems, it's all a few clicks and 2 to 3 hours away. Don't get me wrong, the tool isn't perfect. I would really like to see a much lighter-weight provisioning tool, and an application/service template model that supports selecting application versions when a user opens the service template to submit a provisioning job. It is possible to offer this feature in the existing setup, and I have, but it requires scripting on the backend and the install steps end up being managed by the script rather than by the application definitions.

 

So, what's next? There are so many things one can do. Some of the things that come to mind are:

  • Automate some of the tasks that a support engineer has to do, like restoring databases and modifying an application's configuration to use that database (if needed). I was thinking of using CA Process Automation for tasks of this nature.
  • Download logs and analyze them.
  • Work on a log analysis tool. There are tools that already exist, like ELK (Elasticsearch, Logstash, and Kibana), which is what currently interests me for log analysis. There are some customizations needed for it to do the things I would want it to do, but those customizations are minor compared to the amount of work needed to build something similar to the fine work already made available by the open source community for these three components. And testing the customizations on test systems before rolling out major releases sounds like a use case for CA Release Automation.
  • Automate the setup of development machines with the various software components needed in order to securely and successfully step through a given version of a product's source code. The security aspect is fairly consistent, but the components used by different versions of a product vary: Maven version X, Node version Y, and so on. Having an automated system to handle all of the necessary dependencies and build state switches, so that at any given point in time you could easily work on a given version on a single machine, would be sweet. To some degree the service templates could get me a portion of what is needed for these types of setups, but I think that will only get me so far. The plan is to achieve the remaining use cases with Chef.

 

Viva Automation!
