Chef Updates

Uncover our latest and greatest product updates

6 Fundamentals of Chef Workflow that you should know

Chef Workflow is a tool built by Chef for Continuous Delivery of applications and infrastructure. It provides facilities for automated testing and deployment. Chef Workflow uses a shared pipeline model: every change has to pass through predefined pipeline phases before it is released, and these phases are customizable. So when you push your changes to Chef Workflow, your code is tested automatically, reviewed (manually, of course) and then delivered.

1. Chef Workflow Pipelines

A pipeline is a series of automated and manual quality gates that take software changes from development to delivery. A Chef Workflow pipeline is made up of six stages: Verify, Build, Acceptance, Union, Rehearsal and Delivered.

Each project has its own Verify, Build and Acceptance stages; these stages run tests on the source code. Union, Rehearsal and Delivered are part of the Shared Pipeline; they test the releasable artifacts.

2. Project Pipeline

The Project Pipeline consists of the stages that are unique to each project. A developer has control only up to the Project Pipeline: here you push your change, someone reviews and approves it, and then the code is shipped to the Shared Pipeline.

Verify Stage:
The Verify stage runs automatically when someone submits a change. It is made up of the following phases:
- Lint: Identifies stylistic problems in your source code
- Syntax: Checks that the code can be parsed
- Unit: Runs unit tests

Build Stage:
When a change is approved, Chef Workflow merges the change into the pipeline's target branch and triggers the Build stage. The Build stage runs the lint, syntax and unit phases from the Verify stage again, because the target branch may have moved ahead since the Verify stage ran on the change. The Build stage also has some additional phases:
- Quality: Runs additional test suites. Tests that are too time consuming for the Verify stage can be put here instead
- Security: Security tests as well as functional test suites can be added here
- Publish: Produces the potentially releasable artifacts and makes them available to the rest of the pipeline

Acceptance Stage:
Up to the Build stage, the pipeline analyzes the source code. From the Acceptance stage onwards, it analyzes the artifacts produced in the Build stage. As the name suggests, Acceptance is the stage where the team decides whether the change should go into production or not. There are four phases in the Acceptance stage:
- Provision: Provisions the infrastructure needed to test the artifacts
- Deploy: Deploys the artifacts to that infrastructure
- Smoke: Runs smoke tests, which should be short-running
- Functional: Runs functional tests to ensure that the change meets the business requirements

Each of these phases is ultimately just a recipe in the project's build cookbook (covered in section 4), as the sketch below illustrates.
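To make that concrete, here is a minimal, hypothetical sketch of a unit phase recipe. The workspace attribute path and the use of ChefSpec/RSpec are assumptions for illustration, not the canonical implementation:

# .delivery/build-cookbook/recipes/unit.rb -- hypothetical sketch
# Run the project's unit tests from the pipeline's checkout of the change.
repo_dir = node['delivery']['workspace']['repo'] # assumed workspace attribute

execute 'run-unit-tests' do
  command 'chef exec rspec spec/' # assumes ChefSpec/RSpec tests under spec/
  cwd repo_dir
end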
3. Shared Pipeline

The Shared Pipeline is automated by Chef Workflow. It runs the tests for every stage, tests the cookbook or application by provisioning VMs, and, if all stages pass, merges the code into the desired branch.

Union Stage:
A project usually doesn't work in isolation; it depends on several other projects, e.g. one cookbook can depend on several other cookbooks. The purpose of the Union stage is to analyze the impact of your change on the whole system, so the tests here exercise the interactions between interdependent projects. At times, your change may pass the Acceptance stage and fail during the Union stage. In that case a discussion is required to find the right fix, which may be in your project or in some other dependent project. The phases of the Union stage, and of the remaining stages, are the same: provision, deploy, smoke and functional.

Rehearsal Stage:
This stage is triggered if all phases of the Union stage pass. Its purpose is to gain confidence in your change: it repeats the same process as the Union stage in a different environment, much like a pre-production environment.

Delivered Stage:
This is the final stage, and its definition can vary according to your requirements. It could mean deploying your changes and making them live, or publishing a set of artifacts for your customers.

4. Phases

Each pipeline stage consists of a number of phases. These phases are customizable: what happens in each phase is defined in the build cookbooks, where each phase is configured with a recipe. Build cookbooks can also define what kind of artifacts to build and where to store them (see the publish sketch at the end of this post).

5. Chef Workflow Components

The build cookbooks reside on the Chef server and decide what happens in each phase. Each build node is registered with the Chef server, and the phase jobs run on the build nodes. It's better to have at least three build nodes so that the lint, syntax and unit phases can run in parallel. For each deployable stage of Chef Workflow (Acceptance, Union, Rehearsal and Delivered), there is a web-accessible server where you can verify the changes pushed through the pipeline; the servers are named after their stages.

6. Infrastructure

Workstation: This is your working environment. Here you clone a project from the Delivery server and push your changes through the Chef Workflow pipelines.

Delivery Server: This is like a GitHub repository: it is where you clone your projects from and push your changes to.

Chef Server: All the build cookbooks for the different phases are hosted on the Chef server. These cookbooks run on the build nodes registered with the Chef server.
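As promised in section 4, here is a hedged sketch of a publish phase recipe that treats the cookbook itself as the releasable artifact and uploads it to the Chef server. The cookbook name and paths are illustrative assumptions; delivery-truck (covered in the next post) ships a full-featured publish recipe:

# .delivery/build-cookbook/recipes/publish.rb -- hypothetical sketch
# Publish the tested cookbook so the Shared Pipeline stages can consume it.
repo_dir = node['delivery']['workspace']['repo'] # assumed workspace attribute

execute 'upload-cookbook' do
  # 'my_cookbook' and the -o cookbook path are illustrative
  command "knife cookbook upload my_cookbook -o #{::File.dirname(repo_dir)}"
end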

Aziro Marketing


How to Deploy Projects using Chef Automate in 8 easy steps

The previous blog gave an introduction to the Chef Automate workflow. This blog explains how to use Chef Automate for deploying a project.

Prerequisite: Delivery CLI Tool
This is the command line tool required for running all Automate commands. It can be used to set up and execute phase jobs as well as interact with a Chef Automate server. The Delivery CLI tool comes as a part of ChefDK: https://downloads.chef.io/chef-dk

Chef Automate Setup
This first step does the basic configuration: it configures the workstation to talk to the Automate server. The following command is used for the setup:

$ delivery setup --server=12.34.56.78 --user <user> --org <organization>

This creates a ~/.delivery/cli.toml file which contains the configuration. The contents of cli.toml look like this:

git_port = "8989"
organization = "aziro"
pipeline = "master"
server = "12.34.56.78"
user = "nimishas"

Creating a new project
Chef Automate uses projects to organize work across multiple teams. You can create one project for each major component of your system. Each project has its own Git repository; Chef Automate can host the Git repository for you, or you can connect Automate to an existing project, such as one on GitHub or Atlassian Bitbucket. In this blog, you get starter code from GitHub but host your project in Automate's Git repository. On the Chef Automate web UI, projects are listed under their Organization (in our setup, the Organization is "Clogeny").

Please follow the steps given below:

1. Get a Delivery token, which is used for authentication:
$ delivery token

2. Clone your project from Git:
$ git clone <repository-url>

3. Go to your project directory:
$ cd <project-directory>

4. Run delivery init:
$ delivery init --project <project-name>

delivery init does the following:
- Creates a project in Chef Automate, which includes a new Git repository hosted on the Automate server.
- Creates a default pipeline whose target branch is master.
- Initializes the master branch in Automate's Git repo from the existing master branch that you just cloned (delivery init detects whether there is an existing Git repository).
- Creates a branch named add-delivery-config, which is based off of master.
- Creates the .delivery directory and adds to it a build cookbook and a configuration file (config.json).
- Submits the change for review.
- Opens the Automate web UI and starts the Verify stage.

Monitor the Verify stage
In the previous step, delivery init opened the Automate web UI and started the Verify stage. You can see the phases of the Verify stage (Unit, Lint and Syntax) running on the interface. In this example run, the Unit and Lint phases failed, so the code has to be modified to fix those issues. You can get the details of a failed phase by clicking on it. After fixing the issues, all phases should pass.

Pushing changes to an existing project on Chef Automate
Once your project exists in Automate's Git repository and you want to add changes to it, follow these steps:

1. Clone your project from Automate:
$ delivery clone <project>

2. Create a new branch for your changes:
$ git checkout -b <branch-name>

3. Make changes and commit them (don't push):
$ git add <files>
$ git commit -m "commit message"

4. Submit the change to Automate:
$ delivery review

Similar to delivery init, a browser window opens to show the pipeline in Chef Automate. The Verify stage is automatically triggered and runs the unit, lint and syntax phases.
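Delivery-truck's unit recipe runs ChefSpec (see the Delivery Truck section later in this post), so fixing a failed Unit phase usually means fixing the project's spec files. For reference, here is a minimal, hypothetical ChefSpec test; the cookbook name my_cookbook and the platform/version are illustrative:

# spec/unit/recipes/default_spec.rb -- hypothetical example
require 'chefspec'

describe 'my_cookbook::default' do
  let(:chef_run) do
    # Converge the default recipe on a simulated Ubuntu node
    ChefSpec::SoloRunner.new(platform: 'ubuntu', version: '16.04')
                        .converge(described_recipe)
  end

  it 'converges without raising an error' do
    expect { chef_run }.to_not raise_error
  end
end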
View your pushed changes on the Automate portal
All the commits you make appear under your project on the Automate portal.

Build Cookbooks
Build cookbooks define what happens at each phase of Chef Automate, and they can be written to suit the project's requirements. Recall the Chef Automate phases described in the previous post, then take a look at the .delivery directory created by the delivery init command and try to map its contents to those phases:

C:\USERS\CHEF\Automate-workstation\some_project\.delivery
│   config.json
│
└───build-cookbook
    │   .kitchen.yml
    │   Berksfile
    │   chefignore
    │   LICENSE
    │   metadata.rb
    │   README.md
    │
    ├───data_bags
    │   └───keys
    │           delivery_builder_keys.json
    │
    ├───recipes
    │       default.rb
    │       deploy.rb
    │       functional.rb
    │       lint.rb
    │       provision.rb
    │       publish.rb
    │       quality.rb
    │       security.rb
    │       smoke.rb
    │       syntax.rb
    │       unit.rb
    │
    ├───secrets
    │       fakey-mcfakerton
    │
    └───test
        └───fixtures
            └───cookbooks
                └───test
                    │   metadata.rb
                    │
                    └───recipes
                            default.rb

Under the build-cookbook folder you can see a separate recipe for each phase (lint, syntax, unit and so on). The same recipe runs for a phase no matter which stage it is running in; e.g. the same syntax recipe runs at both the Verify and Build stages.

Delivery Truck
If you don't want to write all the build cookbook recipes yourself, you can use the delivery-truck cookbook, which provides predefined phase recipes. For example, its unit recipe runs ChefSpec and its lint recipe runs Foodcritic and RuboCop. The delivery-truck cookbook is already added as a dependency of your Automate project; you can verify this in the metadata.rb of your project's build cookbook. All the build cookbook recipes include the implementation provided by delivery-truck by default (see the sketch after this section for one way to extend them).

Configure Build Cookbooks
The delivery init command creates the .delivery\config.json file in your project directory. This file configures the behavior of your build cookbook. Its contents are explained below:

{
  "version": "2",
  "build_cookbook": {
    "git": "https://github.com/opscode-cookbooks/delivery-truck.git",
    "branch": "master",
    "name": "delivery-truck"
  },
  "skip_phases": [
    "smoke",
    "security",
    "quality"
  ],
  "delivery-truck": {
    "lint": {
      "foodcritic": {
        "ignore_rules": ["FC009", "FC011", "FC031", "FC045"]
      }
    }
  }
}

The build_cookbook section specifies the Git repo of the build cookbook. You can also give a local path to your build cookbook instead, like this:

{
  "version": "2",
  "build_cookbook": {
    "name": "build-cookbook",
    "path": ".delivery/build-cookbook"
  },
  "skip_phases": [],
  "build_nodes": {},
  "dependencies": []
}

You can define which phases to skip (though it's advisable not to skip any), and which Foodcritic rules to ignore. Refer to the delivery-truck documentation for more ways to customize its behavior.

Note: You need to commit the .delivery\config.json file into Automate after making changes to it.
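Since every build cookbook recipe includes the delivery-truck implementation by default, customizing a phase is often just a matter of appending resources after the include. Below is a hedged sketch of an extended functional recipe; the script path and workspace attribute are illustrative assumptions:

# .delivery/build-cookbook/recipes/functional.rb -- hypothetical sketch
# Keep the delivery-truck default behavior...
include_recipe 'delivery-truck::functional'

# ...then run the project's own functional test script (illustrative path).
execute 'project-functional-tests' do
  command './scripts/functional_tests.sh'
  cwd node['delivery']['workspace']['repo'] # assumed workspace attribute
end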
Approve Changes
Once you are done pushing all your changes to Automate, the next step is to review and approve them. This is a manual step in which your team approves the change and moves it to the Build stage. Reviewers can add comments by clicking on the patch set. Once all the review comments are addressed, the change can be approved by clicking the "Approve" button at the top of the patch set. The Approve action merges your branch into the master branch and deletes your branch.

Deliver the change
After approval, the change moves to the Build stage, where your source code is verified by the various phases, and then to the Acceptance stage, where the resulting artifacts are verified. Once the Acceptance stage has passed, the team decides whether the change should be delivered. The change is delivered by clicking the "Deliver" button. After that, the process moves through the Union, Rehearsal and Delivered stages, where the Provision, Deploy, Smoke and Functional phases run in each stage. The Union, Rehearsal and Delivered stages form a shared pipeline, where all the projects that make up the entire system come together; all three stages have the same phases. Once all stages have passed, the Automate UI shows the change as delivered.

The definition of "delivered" varies according to the project's requirements. It could mean deploying your changes and making them live, or publishing a set of artifacts for the customers.

References:
https://learn.chef.io/automate/

Aziro Marketing


How to secure sensitive data using Chef Vault

Data Bags vs Chef Vault

Chef provides two solutions for securing sensitive data. One is encrypted data bags, which we have been using for a long time. This blog, however, talks about a more secure option built on top of data bags: Chef Vault. The basic idea for keeping your secrets safe is the same in both, namely to encrypt the data. But an encrypted data bag item can be decrypted on any server that has the shared secret key. With Chef Vault, the secret is encrypted separately with the public key of each server the data is meant for, so only those servers (using their private keys) can decrypt it. That's why Chef Vault is considered more secure.

Chef Vault

Chef Vault is a gem that saves your data in an encrypted form. You have to install this gem on your workstation for encrypting data, and on all the nodes where you will be decrypting the data.

$ gem install chef-vault

Command Line

Previously the commands used to look like this:

$ knife encrypt create [VAULT] [ITEM] [VALUES]
$ knife decrypt [VAULT] [ITEM] [VALUES]

These have been deprecated. The new command structure looks like this:

$ knife vault [SUBCOMMAND] [VAULT] [ITEM] [VALUES] --mode MODE --search SEARCH --admins ADMINS

Command Options

--mode: Possible values are solo and client. It's very important to specify the mode because it decides where the encrypted data bag is saved. If you have a Chef server, the mode is client; otherwise it is solo. For solo mode you need to set data_bag_path in your knife.rb file, which is where data bags are stored on your local workstation (see the knife.rb sketch below).

--search: As mentioned earlier, a Chef Vault item can be decrypted only by the servers it is meant for, and this option is what enforces that. In --search you specify a Solr search query, e.g. --search "role:webserver". In that case only the servers with the role webserver will be able to decrypt the vault data.

--admins: Here you specify the admin users who can decrypt the Chef Vault data.

--json: Instead of specifying each option separately, you can use a JSON file that specifies the mode, search and admins.
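These options can also be given defaults in knife.rb so you don't have to repeat them on every command. A minimal sketch, assuming chef-vault reads knife[:vault_mode] and knife[:vault_admins] from your configuration (the option names are assumptions; verify them against your chef-vault version's README):

# knife.rb -- hypothetical additions for chef-vault defaults
knife[:vault_mode]   = 'solo'                  # 'client' when a Chef server is used
knife[:vault_admins] = ['admin1', 'admin2']    # assumed default for --admins
data_bag_path '/home/user/chef-repo/data_bags' # required in solo mode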
Vault Create

$ knife vault create secrets database '{"username": "root", "password": "mypassword"}' -S "role:dbserver" -A "admin1,admin2"

This creates a vault named secrets and puts an item named database in it with the given username and password values. The data is encrypted for the clients matching role:dbserver and for the admins admin1 and admin2. If the mode is client, you can see the created vault under data bags on your Chef server; in solo mode, the vault is created at the path specified by data_bag_path in knife.rb. Make sure that a node with role dbserver, or the users admin1 or admin2, exist before you create the vault: only then is the vault data encrypted with the public keys of the matching nodes and users. Otherwise, the vault data can't be decrypted on any node.

Vault Update

$ knife vault update secrets database '{"username": "new_user", "password": "newpassword"}' -S "role:dbserver1" -A "admin1,admin3"

The vault item's values and its options can be modified with this command.

Vault Remove

$ knife vault remove secrets database '{"username": "root", "password": "mypassword"}'

This removes the username and password values from the item database in the vault secrets. We can also remove just some admins from the item's list of admins:

$ knife vault remove secrets database -A "admin1,admin2"

Vault Delete

$ knife vault delete secrets database

This deletes the item database from the vault secrets.

Vault Show

$ knife vault show secrets
$ knife vault show secrets database
$ knife vault show secrets database "username,password"

Vault in recipes

Once the vault items have been created, the next step is to use that data in our recipes. The code below shows how:

# Install the chef-vault gem into Chef's embedded Ruby.
# compile_time true ensures the gem is available before the
# require below runs during the recipe's compile phase.
chef_gem 'chef-vault' do
  compile_time true
end

require 'chef-vault'

# Load the item "database" from the vault "secrets" and read a value.
item = ChefVault::Item.load("secrets", "database")
item["password"]

References
https://www.chef.io/blog/2013/09/19/managing-secrets-with-chef-vault/
https://github.com/Nordstrom/chef-vault
https://github.com/Nordstrom/chef-vault/blob/master/KNIFE_EXAMPLES.md

Aziro Marketing

