Guide to Bitbucket Pipeline for Advanced CI/CD Workflows

Mert Mengü
10 min read · Oct 4, 2024

Bitbucket Pipelines is a powerful tool for automating CI/CD workflows, integrated directly into Bitbucket. In this guide, we’ll explore advanced techniques, best practices, and practical examples to help you master Bitbucket Pipelines.

Why Use Bitbucket Pipelines?

1. Built-in CI/CD: Natively integrated into Bitbucket for seamless automation.

2. Customizable Workflows: Supports a range of triggers including pushes, pull requests, and merges.

3. Docker-Based: Run pipelines in isolated Docker containers.

4. Scalability: Flexible parallel steps, caching, and deployment strategies.

Pipeline Basics: YAML Structure

Bitbucket Pipelines configuration is defined in a bitbucket-pipelines.yml file located at the root of your repository. Here’s a simple example:

image: node:20

pipelines:
  default:
    - step:
        name: Install and Test
        caches:
          - node
        script:
          - npm install
          - npm test

Key Components:
• image: Specifies the Docker image for the pipeline.

• pipelines: Lists the pipeline events (e.g., default, branches, pull-requests).

• step: Individual jobs in the pipeline.

• caches: Cache resources (e.g., dependencies) to speed up builds.

Advanced Topics in Bitbucket Pipelines


1. Pipeline Triggers and Events
Bitbucket Pipelines offers flexible triggers, allowing you to define workflows
based on branch pushes, pull requests, tags, and custom events.

pipelines:
  branches:
    develop:
      - step: *run_tests
    master:
      - step: *build_deploy
  pull-requests:
    '**':
      - step: *run_tests
  tags:
    v*.*.*:
      - step: *deploy_production

• Branches: Pipelines can be triggered on specific branches (develop, master, etc.).

• Pull Requests: Trigger specific jobs when a PR is created.

• Tags: Use version tags to trigger production deployments.

The *run_tests, *build_deploy, and *deploy_production references are YAML aliases that point to reusable step definitions, as sketched below (see also "Reusable YAML Snippets" later in this guide).
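For those aliases to resolve, the corresponding steps must be anchored somewhere in the same file, typically under definitions. A minimal sketch, with step names and scripts of my own choosing (illustrative, not from the original configuration):

definitions:
  steps:
    - step: &run_tests
        name: Run Tests
        script:
          - npm install
          - npm test
    - step: &build_deploy
        name: Build and Deploy
        script:
          - npm run build
          - ./deploy.sh
    - step: &deploy_production
        name: Deploy to Production
        deployment: production
        script:
          - ./deploy.sh production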

2. Defining Multiple Steps and Parallel Execution


Steps define the individual jobs in the pipeline. You can run steps
sequentially or in parallel. Parallel steps speed up pipelines by running jobs
concurrently.

pipelines:
  default:
    - parallel:
        - step:
            name: "Run Unit Tests"
            script:
              - npm run test:unit
        - step:
            name: "Run Integration Tests"
            script:
              - npm run test:integration
    - step:
        name: "Build and Deploy"
        script:
          - npm run build
          - ./deploy.sh

• Parallel Steps: Both unit and integration tests are run simultaneously,
reducing the overall pipeline time.

• Sequential Step: The build and deploy step runs only after the parallel tests complete.

3. Advanced Parallelism: Conditional Matrix Builds
Bitbucket also allows conditional parallel execution, which can be useful in
large repositories where specific configurations or tests might not always
need to run.

pipelines:
  branches:
    master:
      - parallel:
          - step: &run_tests
              name: Run Node.js 14 Tests
              script:
                - npm install
                - npm test
              condition:
                changesets:
                  includePaths:
                    - "**/*.js"
          - step:
              name: Run Python 3.9 Tests
              script:
                - pip install -r requirements.txt
                - pytest
              condition:
                changesets:
                  includePaths:
                    - "**/*.py"

• Why: Optimizes build times by running tests only when relevant changes
occur.

4. Custom Caching
Caching speeds up pipeline execution by storing dependencies between
runs.

definitions:
  caches:
    node: ~/.npm

pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install

• In this example, the ~/.npm directory is cached, reducing installation time on subsequent pipeline runs.
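If you want the cache to refresh automatically when dependencies change, Bitbucket also supports caches keyed on file contents. A short sketch, assuming a lockfile-keyed cache (the cache name and paths are illustrative; check the current Pipelines caching docs for exact behavior):

definitions:
  caches:
    npm-deps:
      key:
        files:
          - package-lock.json   # cache is rebuilt whenever the lockfile changes
      path: node_modules

pipelines:
  default:
    - step:
        caches:
          - npm-deps
        script:
          - npm ci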

5. Deployment Pipelines and Environments


Bitbucket Pipelines supports multi-environment deployments using
variables and deployment steps.

pipelines:
  branches:
    master:
      - step:
          script:
            - echo "Deploying to Production"
            - ./deploy.sh production
      - step:
          name: "Notify Deployment"
          script:
            - curl -X POST -d '{"text":"Production deployed!"}' $SLACK_WEBHOOK_URL

• Environment-Specific Deployments: The script triggers deployment to production and notifies via Slack.

• Slack Integration: Use environment variables for notifications and other integrations.

Pipeline Optimization and Best Practices


1. Docker and Custom Images
By default, Bitbucket Pipelines runs inside Docker containers. You can
define custom images or use public ones.

image: my-custom-image:latest

• Repository Variables: Store secrets as environment variables in Bitbucket and reference them securely.

• Built-in Support for Environments: Bitbucket offers built-in support for development, staging, and production environments.
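Images can also be set per step, so different jobs in the same pipeline can run in different toolchains. A brief sketch (the image tags and lint command are illustrative):

image: node:20               # default image for every step

pipelines:
  default:
    - step:
        name: Test (Node)
        script:
          - npm test
    - step:
        name: Lint pipeline config (Python tooling)
        image: python:3.12   # overrides the default image for this step only
        script:
          - pip install yamllint
          - yamllint bitbucket-pipelines.yml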

2. Conditional Steps
You can use conditionals in Bitbucket Pipelines to control when certain steps
are executed.

script:
  - |
    if [[ "$BITBUCKET_BRANCH" == "master" ]]; then
      ./deploy.sh production
    else
      ./deploy.sh staging
    fi

• Branch-Specific Logic: This example deploys to production only when the pipeline is triggered on the master branch, while all other branches deploy to staging.

Conditional execution also lets a step run only when the commits in the pipeline touch specific paths.

pipelines:
  default:
    - step:
        name: Build
        script:
          - echo "Building..."
    - step:
        name: Deploy
        script:
          - echo "Deploying..."
        condition:
          changesets:
            includePaths:
              - "src/**"

• This allows deployment steps to execute only if changes are detected in
specific directories.

3. Managing Artifacts
Artifacts are files generated by your pipeline that you can access after the
pipeline completes.

pipelines:
  default:
    - step:
        name: "Build"
        script:
          - npm run build
        artifacts:
          - dist/**  # Store all files in the dist directory

• Use artifacts to share build outputs (like logs, reports, or binaries) between steps or with your team, as in the sketch below.
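To illustrate, a follow-up step can consume the dist/ output produced by the Build step without rebuilding it (the deploy script here is a placeholder):

pipelines:
  default:
    - step:
        name: "Build"
        script:
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: "Deploy"
        script:
          - ls dist/            # artifacts from the previous step are restored here
          - ./deploy.sh dist/   # placeholder deployment script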

4. Reusable YAML Snippets

DRY (Don’t Repeat Yourself) is a key principle in software development, and Bitbucket Pipelines supports reusable YAML snippets to reduce duplication.
definitions:
  steps:
    - step: &run_tests
        name: Run Tests
        script:
          - npm run test

pipelines:
  default:
    - step: *run_tests
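Anchored steps can also be partially overridden where they are reused, via YAML merge keys. A small sketch, assuming you only want to change the display name on one branch:

pipelines:
  branches:
    develop:
      - step:
          <<: *run_tests             # reuse everything from the anchored step
          name: Run Tests (develop)  # ...but override this one field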

Docker Options in Bitbucket Pipelines


Bitbucket Pipelines natively support Docker to run builds inside isolated
containers. You can configure Docker in two ways: run services (e.g., Redis,
Postgres) or specify Docker options like CPU, memory limits, or custom
container configurations.

Running Services like Redis or PostgreSQL:

pipelines:
  default:
    - step:
        name: Build and Test
        services:
          - postgres
          - redis
        script:
          - npm install
          - npm test

definitions:
  services:
    postgres:
      image: postgres:13
      variables:
        POSTGRES_DB: 'testdb'
        POSTGRES_USER: 'postgres'
        POSTGRES_PASSWORD: 'password'
    redis:
      image: redis:6
      variables:
        REDIS_PASSWORD: 'mypassword'

• Services: Redis and Postgres are spun up as Docker containers and available to your test suite on localhost (see the snippet below).

• Custom Variables: Environment variables like database names, users, and passwords are passed into the containers.
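From the step’s script, the services are reachable on localhost at their default ports, so the test suite can use ordinary connection strings. A brief sketch (the variable names are illustrative):

pipelines:
  default:
    - step:
        name: Build and Test
        services:
          - postgres
          - redis
        script:
          - export DATABASE_URL="postgresql://postgres:password@127.0.0.1:5432/testdb"
          - export REDIS_URL="redis://127.0.0.1:6379"
          - npm install
          - npm test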

Customizing CPU, Memory, and Docker Resource Limits


You can define resource limits for your Docker containers to better control
build performance.

pipelines:
  default:
    - step:
        name: Test with Limited Resources
        size: 2x
        services:
          - redis
        script:
          - npm test

definitions:
  services:
    redis:
      image: redis:6
      cpu: 1024    # 1 CPU core
      memory: 512  # 512 MB of RAM

In this example:

• size: 2x: Allocates twice the default CPU/memory for the pipeline step.

• cpu and memory: Set specific resource limits for Redis to avoid overuse.

Additional Features to Optimize Bitbucket Pipelines

1. Docker-in-Docker (DinD) Support


You can build and push Docker images within a Bitbucket Pipeline by using
Docker-in-Docker.

image: docker:20.10

pipelines:
  default:
    - step:
        name: Build and Push Docker Image
        services:
          - docker
        script:
          - docker build -t my-image .
          - docker tag my-image myrepo/my-image:latest
          - docker push myrepo/my-image:latest

definitions:
  services:
    docker:
      image: docker:20.10-dind

• Docker-in-Docker (DinD) service allows running Docker commands inside Docker containers, enabling you to build and push images.
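Repeated image builds can also reuse layers between runs through the predefined docker cache, which usually shortens DinD builds. A brief sketch:

pipelines:
  default:
    - step:
        name: Build with Layer Cache
        services:
          - docker
        caches:
          - docker   # predefined cache that keeps Docker layers between runs
        script:
          - docker build -t my-image .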
2. Custom Docker Registries
If you’re using a private Docker registry, you can authenticate it directly in
your pipeline.

image:
  name: myregistry.com/my-image:latest
  username: $DOCKER_USERNAME
  password: $DOCKER_PASSWORD

pipelines:
  default:
    - step:
        name: Build Docker Image
        services:
          - docker
        script:
          - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD myregistry.com
          - docker build -t my-image .
          - docker push myregistry.com/my-image

3. Secret Management with Deployments


Bitbucket’s Deployments feature allows you to define environment-specific
variables. Here’s an example of how you can use secrets based on
deployments:

pipelines:
  branches:
    main:
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - echo $PRODUCTION_API_KEY
            - ./deploy.sh
  pull-requests:
    '**':
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - echo $STAGING_API_KEY
            - ./deploy.sh

• deployment: staging or deployment: production: Bitbucket automatically loads secrets from the “Deployments” section for the specific environment.

• You can securely manage API keys, database passwords, and other sensitive info under the “Deployments” tab for each environment. These variables are accessible within the pipeline only for the associated environment.

Use Case:

In scenarios where you need different configurations for various stages like
development, staging, and production, Deployments in Bitbucket make it
easy to segregate and protect secrets for each environment.

Example in Production:

pipelines:
  branches:
    production:
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - echo $PRODUCTION_API_KEY
            - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD
            - ./deploy.sh

• Here, it pulls secrets specifically for production during the deployment phase, keeping staging, dev, or other environment configurations separate.

Benefits:

• Security: Keeps sensitive data protected and segregated per environment.

• Ease of Management: Centralized control over environment-specific secrets across your pipeline.

By combining secrets with Deployment environments, you ensure a secure and streamlined process, significantly enhancing your CI/CD pipeline workflows.

4. Scheduling Pipelines
You can schedule your pipelines to run at specified times, allowing for
regular checks or automated tasks like nightly builds.

pipelines:
  custom:
    nightly-build:
      - step:
          name: Nightly Build
          script:
            - echo "Running nightly build"

Pipelines under the custom section don’t run on pushes; you give them a cron-style schedule from the repository’s Pipelines page (Schedules), trigger them manually, or create the schedule via the REST API.
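If you prefer automation over the UI, a schedule can also be created through Bitbucket Cloud’s REST API. This is a hedged sketch: the endpoint and field names follow my understanding of the pipelines_config/schedules resource, and the workspace, repo slug, and credentials are placeholders, so verify against the current API documentation before relying on it:

curl -X POST \
  -u "$BITBUCKET_USERNAME:$BITBUCKET_APP_PASSWORD" \
  -H "Content-Type: application/json" \
  "https://api.bitbucket.org/2.0/repositories/<workspace>/<repo-slug>/pipelines_config/schedules/" \
  -d '{
        "enabled": true,
        "cron_pattern": "0 3 * * *",
        "target": {
          "type": "pipeline_ref_target",
          "ref_type": "branch",
          "ref_name": "main",
          "selector": { "type": "custom", "pattern": "nightly-build" }
        }
      }'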

5. Pipes Marketplace & Using Different Pipes Per Step

Bitbucket’s Pipes Marketplace provides pre-built “pipes” for many common tasks, like deploying to Kubernetes, sending Slack notifications, and integrating with third-party services.

Below is a Bitbucket Pipelines configuration that installs npm packages, deploys to AWS S3, and invalidates the CloudFront cache, all with one base image and two pipes.

Example Pipeline Configuration

image: node:20  # Base image for the pipeline

pipelines:
  default:
    - step:
        name: Build, Deploy to S3, and Validate
        deployment: production
        script:
          - echo "Installing npm packages..."
          - npm install
          - echo "Building the application..."
          - npm run build
          - echo "Deploying to S3..."
          # Pipes run as script entries. The names below are Atlassian's AWS pipes;
          # pin versions that match the current Marketplace listings.
          - pipe: atlassian/aws-s3-deploy:1.1.0
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              S3_BUCKET: 'your-s3-bucket-name'
              LOCAL_PATH: 'dist'
          - pipe: atlassian/aws-cloudfront-invalidate:0.6.0
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              DISTRIBUTION_ID: 'your-cloudfront-distribution-id'

• Pipes reduce the complexity of writing custom scripts for integrations.

• Base Image: The pipeline uses a Node.js image to run npm commands.

• Script Section: This section installs dependencies, builds the application, and prepares for deployment.

• Pipes Section: The S3 Deploy pipe uploads your built application files to the specified S3 bucket.

• The CloudFront Invalidation pipe then invalidates the CloudFront distribution cache to ensure that the latest version of your application is served to users.

This configuration efficiently integrates the build, deploy, and validation steps in one streamlined process.
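Other marketplace pipes follow the same pattern. For example, a notification step using Atlassian’s Slack pipe might look like this (the pipe version is illustrative, and the webhook URL comes from a repository variable):

- step:
    name: Notify Team
    script:
      - pipe: atlassian/slack-notify:2.2.0
        variables:
          WEBHOOK_URL: $SLACK_WEBHOOK_URL
          MESSAGE: 'S3 deployment completed'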
6. Monitoring and Logging
Monitoring pipeline execution is crucial for ensuring reliability and
efficiency. Bitbucket Pipelines provides several options to monitor the
performance of your CI/CD workflows and log execution details.

Options for Monitoring

• Built-in Logs: Bitbucket provides a logging mechanism that captures the output of every step in the pipeline. You can access these logs directly in the Bitbucket UI under the “Pipelines” section.

• External Monitoring Services: For more advanced monitoring, you can integrate with external services like Datadog, Prometheus, or Grafana. These tools allow for real-time monitoring and alerting based on custom metrics.

Example: the built-in BITBUCKET_EXIT_CODE variable, available in after-script sections, reports whether the step’s commands succeeded.

- step:
    name: Notify on Failure
    script:
      - npm test   # the commands whose failure we want to report on
    after-script:
      - |
        if [ "$BITBUCKET_EXIT_CODE" != "0" ]; then
          curl -X POST -H 'Content-Type: application/json' --data '{"text":"Pipeline failed!"}' $SLACK_WEBHOOK_URL
        fi

• Custom Metrics: You can log specific metrics from your builds or tests to a centralized logging platform like the ELK Stack (Elasticsearch, Logstash, and Kibana) or Splunk.

Example:

- step:
    name: Log Metrics
    script:
      - echo "Logging custom metrics"
      # BITBUCKET_BUILD_DURATION is not a default Pipelines variable, so export it
      # yourself (or substitute a value you track); METRICS_ENDPOINT is a placeholder.
      - curl -X POST -d "{\"metric\": \"build_time\", \"value\": \"$BITBUCKET_BUILD_DURATION\"}" $METRICS_ENDPOINT

7. Integration with Issue Tracking


Integrating your CI/CD workflows with issue tracking systems, such as Jira,
can streamline the development process by automating updates based on
pipeline status.

Automating Updates to Issue Trackers


You can configure Bitbucket Pipelines to update issues in Jira based on the
results of your builds or deployments. This integration helps maintain a
clear status of development tasks.

- step:
    name: Update Jira Issue
    script:
      - ./run-build.sh   # placeholder for the commands being reported on
    after-script:
      # BITBUCKET_EXIT_CODE is 0 when every script command succeeded. The Jira base
      # URL, credentials, issue key, and transition id below are placeholders.
      - >
        if [ "$BITBUCKET_EXIT_CODE" == "0" ]; then
        curl -X POST -H "Content-Type: application/json"
        -u "$JIRA_USER:$JIRA_API_TOKEN"
        --data '{"transition":{"id":"31"}}'
        "$JIRA_BASE_URL/rest/api/3/issue/$JIRA_ISSUE_KEY/transitions";
        fi

• When the step’s commands succeed, the after-script sends a request to Jira’s transitions endpoint to move the issue to the next status (for example, “Done”).

8. Security Scans
Integrating security checks into your Bitbucket Pipelines helps ensure that
vulnerabilities are caught early in the CI/CD process, reducing the risk of
deploying insecure code. With the use of third-party tools like Snyk, you can
easily automate security scanning as part of your pipeline configuration.

By incorporating the snyk/snyk-scan pipe into your pipeline, you can


automatically scan for security vulnerabilities in your dependencies. Here's
an example of how to add a Snyk security scan step to your pipeline:

pipelines:
  default:
    - step:
        name: Snyk Security Scan
        script:
          - pipe: snyk/snyk-scan:0.5.0
            variables:
              SNYK_TOKEN: $SNYK_TOKEN

• SNYK_TOKEN: This is your Snyk authentication token, stored securely as a pipeline environment variable.

Adding security scans to your pipeline ensures that code vulnerabilities are
identified and addressed during the development cycle, maintaining your
project’s security standards over time.

9. Monorepo Support
Monorepos allow you to maintain multiple projects or services within a
single repository. With Bitbucket Pipelines, you can configure workflows to
run tests and builds for each project in parallel, ensuring efficient CI/CD
operations across different parts of the repository.

To efficiently manage a monorepo, you can define parallel steps in your


pipeline that target specific directories. For example, let’s assume a
repository contains both frontend and backend applications. You can create a
pipeline like this:

pipelines:
  default:
    - step:
        name: Install Root Dependencies
        script:
          - npm install
    - parallel:
        - step:
            name: Test Frontend
            caches:
              - node
            script:
              - cd frontend
              - npm install
              - npm test
        - step:
            name: Test Backend
            caches:
              - node
            script:
              - cd backend
              - npm install
              - npm test

• The Install Root Dependencies step runs first, ensuring that shared
dependencies are installed.

• Then, two parallel steps ( Test Frontend and Test Backend ) execute
concurrently, each testing the relevant parts of the monorepo.

By structuring your pipeline this way, you can reduce the time required to
test multiple components within a monorepo while keeping workflows
efficient and manageable.
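You can go a step further and combine this layout with the changesets conditions shown earlier, so each project’s tests run only when its own files change (the paths are illustrative):

pipelines:
  default:
    - parallel:
        - step:
            name: Test Frontend
            condition:
              changesets:
                includePaths:
                  - "frontend/**"
            script:
              - cd frontend
              - npm install
              - npm test
        - step:
            name: Test Backend
            condition:
              changesets:
                includePaths:
                  - "backend/**"
            script:
              - cd backend
              - npm install
              - npm test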

10. Self-Hosted Runners


For teams that require greater control over their CI/CD infrastructure,
Bitbucket Pipelines provides the option to run pipelines on self-hosted
runners. This feature is particularly useful in scenarios where you need to
execute builds on custom hardware, within specific network environments,
or with more powerful machines than the default Bitbucket cloud runners.
To define and use a self-hosted runner, you first need to install the runner on
your own infrastructure. After that, you can configure the runner in your
pipeline definition as follows:

pipelines:
  default:
    - step:
        name: Build on Self-Hosted Runner
        runs-on:
          # self.hosted is required; the remaining labels must match the labels you
          # assigned when registering the runner (these are examples).
          - self.hosted
          - linux
          - my-self-hosted-runner
        script:
          - npm install
          - npm test

• runs-on: Selects a runner by label. The self.hosted label is mandatory, and the remaining labels (for example linux and my-self-hosted-runner) must match the labels you chose when registering the runner in your workspace or repository settings.

• Because the runner executes on your own infrastructure, you control the machine size, the installed tooling, and the network environment available to the build.

Self-hosted runners offer the flexibility to tailor your build environments


while ensuring compliance with internal policies or meeting specific
performance needs.

Conclusion
Bitbucket Pipelines offers a powerful and flexible CI/CD solution, seamlessly
integrated with Bitbucket. By leveraging advanced features such as
conditional steps, caching, Docker-based execution, and environment-
specific secrets management, teams can automate their development
workflows effectively. Additionally, integrating with external monitoring
services and issue trackers enhances visibility and responsiveness to
deployment changes. By mastering these capabilities, teams can streamline
their processes, improve code quality, and accelerate delivery cycles,
ultimately driving greater efficiency and success in their software
development lifecycle.