Differences Between Jenkins Scripted and Declarative Pipeline
Last Updated: 28 Mar, 2025
Jenkins Pipeline is a suite of Jenkins plugins that lets teams define their build, test, and deployment process as code, so that every new change is built, tested, and deployed without manual intervention.
This guide explains the two types of Jenkins Pipelines—Scripted and Declarative, their use cases, and best practices to build an efficient CI/CD pipeline.
Scripted vs. Declarative Pipeline
Choosing between Scripted Pipeline and Declarative Pipeline often depends on the project requirements, developer expertise, and the flexibility needed in the pipeline configuration.
| Aspect | Scripted Pipeline | Declarative Pipeline |
|---|---|---|
| Syntax | Uses Groovy scripting. | Uses a structured Groovy-based DSL. |
| Flexibility | Highly customizable. | Easier to use, but less flexible. |
| Error Handling | Must be handled manually (e.g., try/catch). | Built-in error handling (e.g., post blocks). |
| Code Reusability | More reusable and modular. | Less focus on reuse. |
| Readability | Can become complex. | More readable and beginner-friendly. |
| Best for | Advanced, highly customized workflows. | Standard CI/CD tasks. |
Scripted Pipeline
The Scripted Pipeline is the original pipeline syntax in Jenkins and is based on the Groovy scripting language. It offers fine-grained flexibility and control over the pipeline, but implementing complex logic requires a deeper understanding of Groovy scripting.
node {
    stage('Build Step') {
        // Build the application
        sh 'mvn clean install'
    }
    stage('Test Step') {
        // Run tests
        sh 'mvn test'
    }
    stage('Deploy Step') {
        // Deploy the application
        sh 'deploy.sh'
    }
}
How does it work?
- Build Step: Runs the build process using Maven.
- Test Step: Executes unit tests to verify the code.
- Deploy Step: Deploys the application.
When to use a Scripted Pipeline?
- If your pipeline requires custom logic, parallel execution, or complex error handling (a try/catch sketch follows this list)
- When your team is comfortable writing Groovy scripts
- If you need greater flexibility and control over execution
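Because Scripted Pipelines are plain Groovy, error handling is written by hand with try/catch/finally. A minimal sketch, assuming a Maven build (the commands and messages are illustrative, not from the article):
node {
    stage('Build') {
        try {
            sh 'mvn clean install'
        } catch (err) {
            // Mark the build as failed and re-throw so Jenkins stops the pipeline
            currentBuild.result = 'FAILURE'
            throw err
        } finally {
            // Runs whether the build succeeded or failed (cleanup, log archiving, etc.)
            echo "Build stage finished with result: ${currentBuild.result ?: 'SUCCESS'}"
        }
    }
}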
Declarative Pipeline
The Declarative Pipeline is a more recent addition to Jenkins and provides a more structured, simpler syntax than the Scripted Pipeline. It uses a Domain-Specific Language (DSL) based on Groovy that is designed to be intuitive and readable. Declarative Pipelines are defined inside a pipeline block and follow a fixed, predefined structure.
pipeline {
    agent any
    stages {
        stage('Build Step') {
            steps {
                // Build the application
                sh 'mvn clean install'
            }
        }
        stage('Test Step') {
            steps {
                // Run tests
                sh 'mvn test'
            }
        }
        stage('Deploy Step') {
            steps {
                // Deploy the application
                sh 'deploy.sh'
            }
        }
    }
}
What makes it different?
This Declarative Pipeline example includes the following sections:
- agent any: Specifies that the pipeline should run on any available Jenkins agent (alternative agent settings are sketched after this list).
- stages: Defines the stages of the pipeline — Build, Test, and Deploy.
- steps: Contains the commands that should be executed for each stage.
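Besides agent any, the agent directive can also pin a pipeline to specific nodes. A small sketch, assuming an agent labelled 'linux' exists in your Jenkins installation (the label name is an example, not from the article):
pipeline {
    // Run only on agents that carry the 'linux' label
    agent { label 'linux' }
    stages {
        stage('Build Step') {
            steps {
                sh 'mvn clean install'
            }
        }
    }
}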
When to use a Declarative Pipeline?
This type of pipeline is best for teams looking for a simple, easy way to automate deployments:
- If you want a simpler, more readable pipeline
- For standard CI/CD workflows
- When you need quick setup with minimal coding.
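The built-in error handling mentioned in the comparison table comes largely from the post section, which runs steps depending on the build result. A minimal sketch (the echo messages are placeholders for real notifications):
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean install'
            }
        }
    }
    post {
        success {
            echo 'Build succeeded'   // e.g. send a notification here
        }
        failure {
            echo 'Build failed'      // runs automatically on any failure
        }
        always {
            echo 'Pipeline finished' // cleanup, archiving, etc.
        }
    }
}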
Example: Automating a Web App Deployment
A common DevOps workflow involves building, testing and deploying a Java web application.
Declarative Pipeline Example for Web App Deployment
A common use case in Jenkins pipelines is to automate the build, test, and deployment process for a web application. Here's how you can structure the pipeline using Declarative syntax:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean package' // Build the Java application
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test' // Run unit tests
            }
        }
        stage('Deploy') {
            steps {
                sh 'ansible-playbook -i hosts deploy.yml' // Deploy using Ansible
            }
        }
    }
}
This pipeline will:
- Build the Java application using Maven.
- Run tests to check for errors.
- Deploy the application using Ansible.
Why Use Declarative Here?
- Simple and easy to maintain
- Ideal for standard build-test-deploy workflows
Advanced Scripted Pipeline Example
If your project requires parallel execution or advanced logic, a Scripted Pipeline is a better choice.
node {
    stage('Build') {
        sh 'mvn clean package' // Build the application
    }
    stage('Test') {
        parallel(
            "unit-tests": {
                sh 'mvn test' // Run unit tests
            },
            "performance-tests": {
                sh 'jmeter -n -t test_plan.jmx' // Run performance tests
            }
        )
    }
    stage('Deploy') {
        withCredentials([usernamePassword(credentialsId: 'ansible-credentials', usernameVariable: 'ANSIBLE_USER', passwordVariable: 'ANSIBLE_PASS')]) {
            // Single quotes: the shell expands the variables, so the secrets are not interpolated into the Groovy string
            sh 'ansible-playbook -i hosts -u "$ANSIBLE_USER" -e "ansible_password=$ANSIBLE_PASS" -v deploy.yml'
        }
    }
}
In this Scripted Pipeline, the Test stage runs unit and performance tests in parallel, and Deploy uses withCredentials to securely pass Ansible credentials.
Why Choose a Scripted Pipeline for This?
- Runs unit tests and performance tests in parallel (faster execution).
- Uses secure credentials for deployment.
- Can handle dynamic workflows.
If you need a highly customizable CI/CD pipeline, a Scripted Pipeline is the way to go.
When to Choose Scripted vs Declarative Pipeline?
Use Scripted Pipeline When:
- You need full control over execution.
- Your workflow requires dynamic behavior, parallel execution, or custom logic.
- Your team is comfortable writing Groovy scripts.
Use Declarative Pipeline When:
- You prefer simplicity and readability.
- Your pipeline follows a standard CI/CD flow (Build, Test, and Deploy).
- You want easier setup with minimal coding.
Best Practices for Jenkins Pipelines
A well-structured Jenkins pipeline can significantly improve your CI/CD workflow by making it more efficient, scalable, and secure. Here are some best practices to follow:
1. Break Pipelines into Smaller Stages
Instead of cramming everything into one long script, divide the pipeline into clear stages like Build, Test, and Deploy:
- Build: Compile the code and create necessary artifacts.
- Test: Run different types of tests, such as unit tests and integration tests.
- Deploy: Push the application to staging or production environments.
Keeping these steps separate makes the pipeline easier to understand, debug, and maintain.
2. Speed Up Builds with Parallel Execution
Running tasks one after another can slow things down. Instead, take advantage of parallel execution:
- Run different test types (unit tests, UI tests, API tests) at the same time.
- Deploy to multiple environments simultaneously instead of sequentially.
Jenkins supports parallel execution, helping reduce waiting times and speeding up feedback loops.
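In Declarative syntax, parallel execution is expressed with a parallel block of nested stages. A short sketch, assuming two Maven-based test suites (the stage names and commands are illustrative):
pipeline {
    agent any
    stages {
        stage('Tests') {
            parallel {
                stage('Unit Tests') {
                    steps {
                        sh 'mvn test'
                    }
                }
                stage('Integration Tests') {
                    steps {
                        sh 'mvn verify'
                    }
                }
            }
        }
    }
}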
3. Automate Test Execution
Manual testing slows down development and introduces human errors. Instead, automate testing by triggering builds whenever:
- A developer pushes new code.
- A pull request is created.
- A scheduled job runs at fixed intervals (e.g., nightly builds).
This ensures that issues are caught early rather than after deployment.
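In a Declarative Pipeline, the polling and scheduled cases can be declared directly in the Jenkinsfile with the triggers directive (pull-request triggers usually come from the SCM plugin instead). A sketch, assuming SCM polling plus a nightly build:
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // check for new commits roughly every 5 minutes
        cron('H 2 * * *')        // nightly build around 2 AM
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
    }
}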
4. Use Reporting Tools for Better Visibility
Test results and logs can be hard to track without proper reporting. Use Jenkins plugins to visualize and analyze test results:
- JUnit & TestNG: Generate structured test reports.
- Allure Reports: Create detailed reports with history tracking.
- SonarQube: Run static code analysis to maintain code quality.
Good reporting helps teams quickly spot and fix problems instead of digging through log files.
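Test reports are typically published from a post section so results are collected even when a stage fails. A minimal sketch, assuming the JUnit plugin is installed and Maven Surefire writes reports to its default path:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
    }
    post {
        always {
            // Publish JUnit-format results (path assumes Maven Surefire defaults)
            junit '**/target/surefire-reports/*.xml'
        }
    }
}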
5. Protect Sensitive Information
Jenkins often needs API keys, passwords, and other credentials to connect with external systems. To keep these secure:
- Store them in Jenkins Credentials Manager, not in code.
- Use secrets management tools like HashiCorp Vault or AWS Secrets Manager.
- Limit access using role-based permissions in Jenkins.
This prevents sensitive data from getting exposed accidentally.
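In Declarative Pipelines, entries from the Credentials Manager can be bound to environment variables with the credentials() helper (the Scripted example earlier used withCredentials for the same purpose). A sketch, assuming a username/password credential stored under the ID 'ansible-credentials':
pipeline {
    agent any
    environment {
        // For username/password credentials Jenkins also exposes
        // DEPLOY_CREDS_USR and DEPLOY_CREDS_PSW automatically
        DEPLOY_CREDS = credentials('ansible-credentials')
    }
    stages {
        stage('Deploy') {
            steps {
                // Single quotes so the shell, not Groovy, expands the secrets
                sh 'ansible-playbook -i hosts -u "$DEPLOY_CREDS_USR" -e "ansible_password=$DEPLOY_CREDS_PSW" deploy.yml'
            }
        }
    }
}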
6. Keep Jenkins Running Smoothly
Jenkins can slow down if not properly maintained. To keep it running efficiently:
- Make sure the Jenkins server has enough CPU, RAM, and disk space.
- Regularly delete old builds to free up storage.
- Use Jenkins agents to distribute workloads and avoid overloading the master node.
- Set up monitoring tools like Prometheus or Grafana to track performance.
A well-optimized Jenkins setup reduces downtime and build failures, keeping development on track.
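The "delete old builds" point above can be automated per pipeline with the options directive. A minimal sketch (the retention count and timeout are arbitrary examples):
pipeline {
    agent any
    options {
        buildDiscarder(logRotator(numToKeepStr: '20'))  // keep only the last 20 builds
        timeout(time: 30, unit: 'MINUTES')              // abort builds that hang
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
    }
}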
Which Pipeline Should You Choose?
- Use Scripted Pipelines when you need full control, parallel execution, or advanced logic
- Use Declarative Pipelines for simpler, standard CI/CD workflows
Conclusion
Both Scripted and Declarative Pipelines are powerful tools for automating the CI/CD process in Jenkins. The choice between the two largely depends on your project’s complexity, the level of control required, and your team's familiarity with Groovy scripting.
For developers seeking ease of use and readability, Declarative Pipeline provides a more structured, intuitive approach, while Scripted Pipeline offers greater flexibility for more complex, custom workflows. By understanding the differences and strengths of each, you can make an informed decision on which pipeline type best fits your needs.