GitHub Pages is a fast and easy way to share information about you, your organization, or your project with the rest of the world, without worrying about server infrastructure, configuration or databases. All you need to do is create a Pages project or branch in your GitHub repository that will serve your website.
The BIG feature that we announced last week has just left the testing phase and has been merged to the DEV branch. If all goes well, it should be live in about one month.
It is now possible to resynchronize the repository with your provider. Buddy uses webhooks to fetch information about changes in GitHub, Bitbucket and GitLab repositories. Sometimes, for reasons beyond our control, the webhooks did not reach our service and the changes were not reflected in Buddy. Clicking Resynchronize Repo in the Code tab will fetch the latest version of the repository and trigger dependent pipelines.
When adding a new deployment action to a pipeline that has already been executed, you will be asked whether you want to deploy the files from scratch. This way there won’t be any problems with badly calculated changesets on the first execution.
Some time ago we made some improvements to action validators and accidentally introduced a bad regex that prevented adding actions containing an underscore. Fixed.
There was a bug in one of the external libraries used to send the execution status to GitHub which caused recurrent pipelines to crash. We reported and fixed it on the very same day [applause].
If there are more than 100 files in the changeset, their diffs are not displayed by default and you need to expand them manually. Sometimes there was a problem with that. Fixed.
[Buddy GO] Fixed a bug with the wrong number of concurrent executions available for Enterprise licenses. If you’re a Premium user and still have this problem, please refresh your license at $URL_TO_STANDALONE/payments/license.
Of all supported Buddy integrations, one of the most popular is Amazon Web Services (AWS). There are currently 5 AWS services supported by dedicated actions, with more to come in the future (depending on community feedback):
Deploy to Amazon S3
Together with Elastic Beanstalk and EC2, S3 is one of the most successful Amazon services. A lot of web developers use S3 to store static files (sometimes entire static pages). With Buddy, you can define a pipeline that will automatically upload repository files to the selected S3 bucket on push to branch. You can add additional actions that will process your files prior to the deployment, e.g. Grunt or Gulp.
Deploy to Elastic Beanstalk
AWS Elastic Beanstalk is a classic PaaS service, similar to Heroku or Google App Engine. It lets developers upload, build and serve their code, leaving infrastructure configuration and scaling on the AWS side.
Deployment to Elastic Beanstalk is based on uploading a zip file with the application code. Buddy lets you create a pipeline that will upload the package automatically on push, on demand, or recurrently at a given time. Just like with S3, you can add build and test actions before the deployment.
You can also Dockerize your apps first and deploy them to Elastic Beanstalk later:
Amazon EC2 Container Registry (ECR) is a Docker image registry for AWS developers.
To build a Docker image, you need a repository with the application code and a dockerfile with instructions on how the image should be built. You can configure Buddy to build a new Docker image and push it to ECR on every push to the selected branch. It’s also possible to add an SSH action that will pull the Docker image on the selected host and launch the new version of the application.
You can read more about building Docker images in this article.
Run Lambda function
AWS Lambda allows developers to run code without provisioning servers. It scales to the size of the application and executes only when required, from a couple of requests per day to thousands of operations per second. With Buddy, you can automate the deployment of Lambda functions to S3 buckets or trigger the functions with a dedicated Lambda action.
Let’s assume we have a web application with a backend executed in Lambda functions. Both backend and frontend are stored in the same repository. The pipeline for that could look like this:
Build and test application on every push to repo
If all tests have passed, use a Lambda function to perform a backup
Deploy application frontend to SFTP server
Upload updated backend files to Amazon S3 with another Lambda function
Deploy to CodeDeploy
CodeDeploy is a tool that facilitates application deployment to EC2 instances and custom user servers. The whole process boils down to uploading the app to CodeDeploy via S3, with the rest of the operations handled entirely by AWS.
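CodeDeploy expects the uploaded package to contain an appspec.yml file that tells AWS where the files should go and which lifecycle scripts to run. A minimal sketch (the destination path and script name are placeholders, not taken from this post):

```yaml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/my-app   # placeholder target directory
hooks:
  ApplicationStart:
    - location: scripts/start.sh   # placeholder script inside the package
      timeout: 60
```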
Let’s assume we have a Node.js project that requires performing Gulp tasks and tests before the deployment. With Buddy, you can create a pipeline that will, for example, execute all of the above once a day at midnight – provided there were any changes at all in the repository. If one of the actions fails, a notification with the error description will be sent to the selected Slack channel.
Appendix: AWS Policies required by Buddy
In order to make the AWS actions work properly, you need to define the policies for them. You can find the complete list of policies here.
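For illustration, an IAM policy granting the permissions a typical S3 deployment needs might look like the JSON below (the bucket name is a placeholder; the authoritative list of actions required by Buddy is the one linked above):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ]
    }
  ]
}
```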
If you have two or more integrations of one type (e.g. AWS), the list of integrated items in the action settings (e.g. S3 buckets in Deployment to S3) will now display the IDs of their parent integration.
Every AWS action now displays the list of policies it requires. You can find the full list here.
Fixed a bug with uploading files from the filesystem to a relative path on the SFTP server.
With Buddy you can build your own Docker images with source code from GitHub, Bitbucket and GitLab (or Buddy) repositories. In this release we’ve added context path support for Docker image builds, which is a good opportunity to show how to automate the delivery workflow for dockerized applications.
Docker app workflow
A dockerized app keeps its build instructions in a dockerfile stored in the repository alongside the source code. The delivery workflow looks like this:
Changes to application code
Build Docker image
Push Docker image to registry
SSH to the application server
Docker pull from registry
Stop current Docker container
Start new version
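The build-and-push half of these steps can be sketched as a shell script. Everything concrete here is an assumption for illustration (image name, server, and container name are placeholders); the run helper only prints each command while DRY_RUN is set, so the sketch can be reviewed without Docker installed:

```shell
#!/bin/sh
# Sketch of the build-and-push steps; all names below are placeholders.
IMAGE="registry.example.com/myorg/myapp"
TAG="${1:-latest}"
DRY_RUN="${DRY_RUN:-1}"   # unset DRY_RUN to actually execute the commands

run() {
  if [ -n "$DRY_RUN" ]; then echo "+ $*"; else "$@"; fi
}

# If the dockerfile is not in the root of the build context, point to it
# with -f and pass the context path as the last argument.
run docker build -f Dockerfile -t "$IMAGE:$TAG" .
run docker push "$IMAGE:$TAG"

# The remaining steps run on the application server over SSH:
run ssh deploy@myserver "docker pull $IMAGE:$TAG && docker stop myapp && docker rm myapp && docker run -d --name myapp $IMAGE:$TAG"
```

Unsetting DRY_RUN turns the trace into a real deployment, which is a common way to test such scripts before wiring them into a pipeline.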
Buddy lets you save time and money by automating these steps so that you can focus purely on application coding.
Automating Docker build with Buddy
In this section we’ll show you how to automate the build and push to the registry. As a result, on every change in the repository a new version of the application image will appear in the registry. You can fork this repository and use it as a template, or use your own:
Create a new project in Buddy and select the repository with the application
Add a new pipeline, assign it to the Master branch, and set the trigger mode to On every push
Add the Docker image action and select the dockerfile and context path that will be used for building the application.
Note: Usually the dockerfile is located in the directory in which the image is built. However, sometimes the code structure requires the dockerfile to be moved somewhere else. In this case, you need to provide the context path.
Select the registry to which the image will be delivered (Docker Hub, Amazon ECR, Private)
You can use Environment Variables to define the tag with which the image will be pushed, for example $BUDDY_EXECUTION_ID or $BUDDY_EXECUTION_REVISION. This way every execution will create a new version of the image in the registry with an adequate tag.
(Optional) Depending on the host you build your image on, you may want to use different variables to define the image values. These arguments are available upon clicking More options.
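As an example, a short script could derive the tag from those variables; the fallback value and the 7-character shortening are our own conventions for this sketch, not something Buddy requires:

```shell
#!/bin/sh
# Pick an image tag from Buddy's execution variables, falling back to
# "manual" when the script runs outside a pipeline (fallback is our choice).
TAG="${BUDDY_EXECUTION_REVISION:-${BUDDY_EXECUTION_ID:-manual}}"

# Shorten a full commit hash to the conventional 7 characters.
SHORT_TAG=$(printf '%s' "$TAG" | cut -c1-7)
echo "image tag: $SHORT_TAG"
```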
Once you add the action, every push to the repository will automatically build a new Docker image and push it to the registry, with no manual steps required. You can also run the action manually:
Adding tests before Docker build
The principal rule of Continuous Deployment is that your code is always tested before it is deployed to the production server. This also applies to the code that you use to build your images. With Buddy you can automatically test every change pushed to the repository by adding a build action before building the Docker image:
Add Node.js at the beginning of the pipeline and define your tests. You can install any missing dependencies in the Packages & Setup tab
When ready, make a push or run the pipeline manually and watch Buddy execute your tests.
Re-running the Docker container after the build
The application is tested and the image is built. At this moment the image is usually run on a server. This too can be handled with Buddy:
Add the SSH action at the end of the pipeline, provide your server details, and enter the commands that will pull the new image and restart the container on your server.
PHPUnit is a popular programmer-oriented testing framework for PHP. You can use it to test your code before the deployment so that you know the copy on your server is free of errors.
Configuring tests and delivery with Buddy
There’s still a wide range of developers who haven’t heard of Continuous Integration or don’t know how to introduce it to their process. Buddy lets you set up the whole workflow in 10-15 minutes (either through the GUI or YAML) and easily scale and expand it once your projects grow bigger.
Create a new project, choose your Git provider, and select your PHP repository.
Add a new pipeline and switch the trigger mode to On every push
Add the PHPUnit action. You can install any missing modules in the Packages & Setup tab
Add FTP/SFTP file transfer action and configure your deployment
You can also add the SSH action and use it, e.g., to run composer install on your server.
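The PHPUnit action assumes the repository contains a PHPUnit configuration. A minimal phpunit.xml, assuming Composer autoloading and tests kept in a tests/ directory (both of which are assumptions, not details from this post), could look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<phpunit bootstrap="vendor/autoload.php" colors="true">
  <testsuites>
    <testsuite name="unit">
      <directory>tests</directory>
    </testsuite>
  </testsuites>
</phpunit>
```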
This way, every time you make a push to the selected branch, Buddy will automatically test and deploy the changes to the server and run the desired SSH commands:
If your tests require a database to run, e.g. MySQL, you can easily activate it in the Services tab:
A good practice is to add a conditional notification at the end that will send you a message in case the tests or the deployment fail, for example to your Slack channel:
Deploy files to Rackspace Cloud Files with the new file transfer action!
In order to deploy to Rackspace Cloud Files you need to provide your username and access token generated in your Rackspace account.