Guide to Bitbucket Pipelines for Advanced CI/CD Workflows
by Mert Mengü
image: node:18   # any public Docker image works here; node:18 is an illustrative choice

pipelines:
  default:
    - step:
        name: Install and Test
        caches:
          - node
        script:
          - npm install
          - npm test
Key Components:
• image: Specifies the Docker image the pipeline runs in.
• step: A unit of work with its own name, script, and optional caches.
• caches: Reuses downloaded dependencies (here the node cache) between runs.
• script: The shell commands executed in order; the step fails on the first non-zero exit code.
Pipelines can also be keyed to branches, pull requests, and tags, so different steps run depending on what triggered the build:

pipelines:
  branches:
    develop:
      - step: *run_tests
    master:
      - step: *build_deploy
  pull-requests:
    '**':
      - step: *run_tests
  tags:
    v*.*.*:
      - step: *deploy_production
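The *run_tests, *build_deploy, and *deploy_production references are YAML aliases; they only resolve if matching anchors are declared earlier in the same file. A minimal sketch of what those definitions might look like (the step names and scripts here are illustrative, not from the original file):

definitions:
  steps:
    - step: &run_tests
        name: Run Tests
        script:
          - npm install
          - npm test
    - step: &build_deploy
        name: Build and Deploy
        script:
          - npm run build
          - ./deploy.sh
    - step: &deploy_production
        name: Deploy to Production
        script:
          - ./deploy.sh production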
Parallel steps let independent jobs run at the same time, followed by sequential steps that depend on them:

pipelines:
  default:
    - parallel:
        - step:
            name: "Run Unit Tests"
            script:
              - npm run test:unit
        - step:
            name: "Run Integration Tests"
            script:
              - npm run test:integration
    - step:
        name: "Build and Deploy"
        script:
          - npm run build
          - ./deploy.sh
• Parallel Steps: Both unit and integration tests are run simultaneously,
reducing the overall pipeline time.
• Sequential Step: The build and deploy step runs only after the parallel
tests complete.
3. Advanced Parallelism: Conditional Matrix Builds
Bitbucket also allows conditional parallel execution, which can be useful in
large repositories where specific configurations or tests might not always
need to run.
pipelines:
  branches:
    master:
      - parallel:
          - step: &run_tests
              name: Run Node.js 14 Tests
              script:
                - npm install
                - npm test
              condition:
                changesets:
                  includePaths:
                    - "**/*.js"
          - step:
              name: Run Python 3.9 Tests
              script:
                - pip install -r requirements.txt
                - pytest
              condition:
                changesets:
                  includePaths:
                    - "**/*.py"
• Why: Optimizes build times by running tests only when relevant changes
occur.
4. Custom Caching
Caching speeds up pipeline execution by storing dependencies between
runs.
definitions:
  caches:
    node: ~/.npm

pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install
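The same definitions block can hold several custom caches, one per tool. A sketch using the conventional cache locations for pip and Gradle (the paths and step are illustrative):

definitions:
  caches:
    pip: ~/.cache/pip
    gradle: ~/.gradle/caches

pipelines:
  default:
    - step:
        caches:
          - pip
        script:
          - pip install -r requirements.txt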
Steps can also be chained, for example deploying and then sending a notification:

pipelines:
  branches:
    master:
      - step:
          script:
            - echo "Deploying to Production"
            - ./deploy.sh production
      - step:
          name: "Notify Deployment"
          script:
            - curl -X POST -d '{"text":"Production deployed!"}' $SLACK_WEBHOOK_URL
You can also run every step in an image of your own by setting it at the top of the file:

image: my-custom-image:latest
2. Conditional Steps
You can use conditionals in Bitbucket Pipelines to control when certain steps
are executed.
script:
  - >
    if [[ "$BITBUCKET_BRANCH" == "master" ]]; then
      ./deploy.sh production;
    else
      ./deploy.sh staging;
    fi
Bitbucket also supports step-level conditions, which skip a step entirely unless the condition is met, such as changes to particular files:
pipelines:
  default:
    - step:
        name: Build
        script:
          - echo "Building..."
    - step:
        name: Deploy
        script:
          - echo "Deploying..."
        condition:
          changesets:
            includePaths:
              - "src/**"
• This allows deployment steps to execute only if changes are detected in
specific directories.
3. Managing Artifacts
Artifacts are files generated by your pipeline that you can access after the
pipeline completes.
pipelines:
  default:
    - step:
        name: "Build"
        script:
          - npm run build
        artifacts:
          - dist/**  # Store all files in the dist directory
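Artifacts declared in one step are automatically restored into the working directory of later steps, so a follow-up step can consume the built files directly. A sketch of such a step (the deploy script is illustrative):

    - step:
        name: "Deploy"
        script:
          - ls dist/       # files produced by the Build step are restored here
          - ./deploy.sh dist/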
A step anchored elsewhere in the file can be reused anywhere an alias is allowed:

pipelines:
  default:
    - step: *run_tests
Service containers run databases and other dependencies alongside your build:

pipelines:
  default:
    - step:
        name: Build and Test
        services:
          - postgres
          - redis
        script:
          - npm install
          - npm test

definitions:
  services:
    postgres:
      image: postgres:13
      variables:
        POSTGRES_DB: 'testdb'
        POSTGRES_USER: 'postgres'
        POSTGRES_PASSWORD: 'password'
    redis:
      image: redis:6
      variables:
        REDIS_PASSWORD: 'mypassword'
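Service containers share the build container's network, so they are reachable on localhost at their usual ports. Assuming the definitions above, the test step could wire them up like this (the connection-string variable names are illustrative):

script:
  - export DATABASE_URL="postgres://postgres:password@localhost:5432/testdb"
  - export REDIS_URL="redis://localhost:6379"
  - npm test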
pipelines:
  default:
    - step:
        name: Test with Limited Resources
        size: 2x
        services:
          - redis
        script:
          - npm test

definitions:
  services:
    redis:
      image: redis:6
      cpu: 1024    # 1 CPU core
      memory: 512  # 512 MB of RAM
In this example:
• size: 2x: Allocates twice the default CPU/memory for the pipeline step.
• cpu and memory: Set specific resource limits for Redis to avoid overuse.
Additional Features to Optimize Bitbucket Pipelines
To build and publish Docker images, enable the Docker-in-Docker service:

image: docker:20.10

pipelines:
  default:
    - step:
        name: Build and Push Docker Image
        services:
          - docker
        script:
          - docker build -t my-image .
          - docker tag my-image myrepo/my-image:latest
          - docker push myrepo/my-image:latest

definitions:
  services:
    docker:
      image: docker:20.10-dind
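For traceable releases, you can tag images with Bitbucket's built-in $BITBUCKET_COMMIT variable instead of latest; for example:

          - docker build -t myrepo/my-image:$BITBUCKET_COMMIT .
          - docker push myrepo/my-image:$BITBUCKET_COMMIT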
You can also base the pipeline on an image from a private registry:

image: myregistry.com/my-image:latest

pipelines:
  default:
    - step:
        name: Build Docker Image
        script:
          - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD myregistry.com
          - docker build -t my-image .
          - docker push myregistry.com/my-image
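Note that Bitbucket has to pull that private base image before any script runs, so the pull credentials belong on the image key itself; a sketch using the same variables:

image:
  name: myregistry.com/my-image:latest
  username: $DOCKER_USERNAME
  password: $DOCKER_PASSWORD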
Deployment environments tie steps to environment-specific variables and deployment tracking:

pipelines:
  branches:
    main:
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - echo $PRODUCTION_API_KEY
            - ./deploy.sh
  pull-requests:
    '**':
      - step:
          name: Deploy to Staging
          deployment: staging
          script:
            - echo $STAGING_API_KEY
            - ./deploy.sh
• You can securely manage API keys, database passwords, and other
sensitive info under the “Deployments” tab for each environment. These
variables are accessible within the pipeline only for the associated
environment.
Use Case:
In scenarios where you need different configurations for various stages like
development, staging, and production, Deployments in Bitbucket make it
easy to segregate and protect secrets for each environment.
Example in Production:
pipelines:
  branches:
    production:
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - echo $PRODUCTION_API_KEY
            - docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD
            - ./deploy.sh
Benefits:
• Secrets are scoped to a single environment, so a staging build can never read production keys.
• Each environment gets its own deployment history in the Bitbucket UI, showing what was deployed and when.
• Deployment permissions can restrict who is allowed to promote changes to sensitive environments.
4. Scheduling Pipelines
You can schedule pipelines to run at specified times, allowing for regular checks or automated tasks like nightly builds. Define the pipeline under custom, then create the schedule itself under Repository settings → Pipelines → Schedules:
pipelines:
  custom:
    nightly-build:
      - step:
          name: Nightly Build
          script:
            - echo "Running nightly build"
Pipes package common deployment tasks as reusable steps that you invoke from a script. For example, deploying a built site to AWS S3 and then invalidating the CloudFront cache (the pipe versions shown are illustrative; check each pipe's page for the latest):

image: node:18

pipelines:
  default:
    - step:
        name: Build, Deploy to S3, and Invalidate CloudFront
        deployment: production
        script:
          - echo "Installing npm packages..."
          - npm install
          - echo "Building the application..."
          - npm run build
          - echo "Deploying to S3..."
          - pipe: atlassian/aws-s3-deploy:1.1.0
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              S3_BUCKET: $S3_BUCKET                          # your S3 bucket name
              LOCAL_PATH: 'dist'
          - pipe: atlassian/aws-cloudfront-invalidate:0.6.0
            variables:
              AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
              AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
              AWS_DEFAULT_REGION: 'us-east-1'
              DISTRIBUTION_ID: $CLOUD_FRONT_DISTRIBUTION_ID  # your CloudFront distribution ID
• Base Image: The pipeline uses a Node.js image to run the npm commands.
• Pipes Section: The S3 deploy pipe uploads your built application files to the specified S3 bucket, and the CloudFront invalidation pipe then clears the CDN cache so the new files are served immediately.
Example (the after-script section runs whether the step passed or failed, and Bitbucket exposes the step's result there as $BITBUCKET_EXIT_CODE):

- step:
    name: Notify on Failure
    script:
      - npm run build
    after-script:
      - >
        if [ "$BITBUCKET_EXIT_CODE" != "0" ]; then
          curl -X POST -H 'Content-Type: application/json' --data '{"text":"Pipeline failed!"}' $SLACK_WEBHOOK_URL;
        fi
• Custom Metrics: You can log specific metrics from your builds or tests to
a centralized logging platform like ELK Stack (Elasticsearch, Logstash,
and Kibana) or Splunk.
Example (here $METRICS_ENDPOINT is a placeholder for your logging platform's ingest URL):

- step:
    name: Log Metrics
    script:
      - echo "Logging custom metrics"
      - curl -X POST -d "{\"metric\": \"build_time\", \"value\": \"$BITBUCKET_BUILD_DURATION\"}" $METRICS_ENDPOINT
Similarly, a successful step can update a Jira issue (here $JIRA_ISSUE_URL is a placeholder for the issue's REST endpoint):

- step:
    name: Update Jira Issue
    script:
      - npm test
    after-script:
      - >
        if [ "$BITBUCKET_EXIT_CODE" = "0" ]; then
          curl -X POST -H "Content-Type: application/json" --data '{"fields":{"status":{"name":"Done"}}}' "$JIRA_ISSUE_URL";
        fi
8. Security Scans
Integrating security checks into your Bitbucket Pipelines helps ensure that
vulnerabilities are caught early in the CI/CD process, reducing the risk of
deploying insecure code. With the use of third-party tools like Snyk, you can
easily automate security scanning as part of your pipeline configuration.
pipelines:
  default:
    - step:
        name: Snyk Security Scan
        script:
          - pipe: snyk/snyk-scan:0.5.0
            variables:
              SNYK_TOKEN: $SNYK_TOKEN
Adding security scans to your pipeline ensures that code vulnerabilities are
identified and addressed during the development cycle, maintaining your
project’s security standards over time.
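If you just need a dependency check without a third-party account, npm's built-in auditor can serve as a lighter-weight gate (the failure threshold is illustrative):

pipelines:
  default:
    - step:
        name: Dependency Audit
        script:
          - npm audit --audit-level=high  # exits non-zero on high or critical advisories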
9. Monorepo Support
Monorepos allow you to maintain multiple projects or services within a
single repository. With Bitbucket Pipelines, you can configure workflows to
run tests and builds for each project in parallel, ensuring efficient CI/CD
operations across different parts of the repository.
pipelines:
  default:
    - step:
        name: Install Root Dependencies
        script:
          - npm install
    - parallel:
        - step:
            name: Test Frontend
            caches:
              - node
            script:
              - cd frontend
              - npm install
              - npm test
        - step:
            name: Test Backend
            caches:
              - node
            script:
              - cd backend
              - npm install
              - npm test
• The Install Root Dependencies step runs first, ensuring that shared
dependencies are installed.
• Then, two parallel steps (Test Frontend and Test Backend) execute concurrently, each testing the relevant part of the monorepo.
By structuring your pipeline this way, you can reduce the time required to
test multiple components within a monorepo while keeping workflows
efficient and manageable.
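This layout pairs naturally with the changesets conditions shown earlier, so each half of the monorepo is tested only when its own files change. A sketch for the frontend step (the paths are illustrative):

        - step:
            name: Test Frontend
            condition:
              changesets:
                includePaths:
                  - "frontend/**"
            script:
              - cd frontend
              - npm install
              - npm test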
Self-hosted runners let you execute steps on your own infrastructure. Runners are registered, and given labels, under your repository or workspace settings rather than in the YAML file; a step then selects a runner by listing labels under runs-on, which must include self.hosted:

pipelines:
  default:
    - step:
        name: Build on Self-Hosted Runner
        runs-on:
          - 'self.hosted'
          - 'my-label'  # a custom label assigned when registering the runner
        script:
          - npm install
          - npm test

• runs-on: The step runs only on a runner carrying all of the listed labels.
• self.hosted: A required label that routes the step away from Bitbucket's cloud runners to your own.
Conclusion
Bitbucket Pipelines offers a powerful and flexible CI/CD solution, seamlessly
integrated with Bitbucket. By leveraging advanced features such as
conditional steps, caching, Docker-based execution, and environment-
specific secrets management, teams can automate their development
workflows effectively. Additionally, integrating with external monitoring
services and issue trackers enhances visibility and responsiveness to
deployment changes. By mastering these capabilities, teams can streamline
their processes, improve code quality, and accelerate delivery cycles,
ultimately driving greater efficiency and success in their software
development lifecycle.