As your team’s CI/CD needs grow, managing complex YAML and ensuring fast feedback can become challenging. That’s why we’re excited to announce a fundamental reimagining of how workflows are designed and implemented in Bitbucket Pipelines, designed to help your teams scale.
This change is centred on one major new feature: triggers – a powerful way to automate and orchestrate builds, tests, and deployments with simple, event-driven rules.
With triggers, you define conditions that are evaluated when certain events occur. When those triggers are activated, they can be configured to run as many pipelines as you need in parallel. Just set your conditions, and Bitbucket does the rest.
What’s new – “triggers” and “conditions”
Top-level triggers property
Triggers let you declare event-driven rules that reference one or more custom pipelines, which execute in response to certain repository events when their conditions are met. By separating the defining of your pipelines from the executing of them, triggers provide a streamlined way to maintain separation of concerns. They are implemented via a new top-level triggers property.
Condition property on triggers
Each trigger defined in your bitbucket-pipelines.yml contains one or more condition blocks. Each block references a set of custom pipelines that will execute if its condition evaluates to true when the corresponding trigger fires.
Logical expressions + glob()
When writing conditions, you can combine operators such as &&, ||, (), >, <, <=, >=, ==, != and ! with comparisons and pattern matching via the glob() function. This allows highly precise and flexible control over conditions, making it easy to define complex rules that match specific patterns and logical criteria.
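For instance, a condition can combine a glob() match with a negated comparison. This is an illustrative sketch only – the branch names are hypothetical, and the BITBUCKET_BRANCH variable is the one used in the full example later in this post:

```yaml
# Sketch: combining glob() matching with logical operators
triggers:
  repository-push:
    # Match pushes to any release/* branch, except the hypothetical release/legacy
    - condition: glob(BITBUCKET_BRANCH, "release/*") && BITBUCKET_BRANCH != "release/legacy"
      pipelines:
        - unit-tests
```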
Stay tuned:
Keep your eyes open for updates coming in the next few days regarding how you can also use this new condition syntax on individual steps within your pipelines for fine-grained control of how your workflows execute at runtime.
Parallel fan-out
When a specified condition matches, all referenced pipelines run simultaneously. There is no “first-match wins” scenario as there was with the previous pipelines execution model; every matching pipeline executes concurrently.
Pull request pipelines
We listened to your feedback that the existing pull request pipelines lack a critical feature – the ability to trigger based on the destination branch. You can now achieve this using triggers. Additionally, each pipeline run in parallel against a pull request will report its own separate build status.
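As a minimal sketch of destination-branch triggering, a pullrequest-push trigger can condition solely on the BITBUCKET_PR_DESTINATION_BRANCH variable (both names appear in the full example later in this post):

```yaml
# Sketch: run a pipeline only for pull requests targeting main,
# regardless of the source branch
triggers:
  pullrequest-push:
    - condition: BITBUCKET_PR_DESTINATION_BRANCH == "main"
      pipelines:
        - unit-tests
```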
Key differences vs. traditional selectors (branches, pull-requests, default etc)
| Aspect | Traditional selectors | Triggers |
| --- | --- | --- |
| Matching behaviour | First match runs | All matching entries run; pipelines fan out in parallel |
| Condition complexity | Basic patterns on branches/tags | Logical operators, comparisons, and glob() |
| Multiple pipelines per event | Not supported | Explicit list; all run in parallel |
Why these changes matter
Composable CI/CD
These changes are designed to support you in breaking down large, monolithic pipelines into smaller, focused, and reusable units that can be independently developed and maintained to create flexible and efficient workflows.
Separate build status
Provides a distinct build status for each pipeline, enhancing observability. It allows teams to monitor the progress and health of each pipeline separately, making it easier to identify issues quickly and improve overall reliability and transparency of the build process.
Getting started with triggers
Define a triggers section at the top level in your bitbucket-pipelines.yml. Each event type can list multiple condition blocks. When a condition evaluates to true, all of its referenced custom pipelines run in parallel.
Example
triggers:
  repository-push:
    - condition: BITBUCKET_BRANCH == "main"
      pipelines:
        - unit-tests
        - scan
        - deploy
  pullrequest-push:
    - condition: glob(BITBUCKET_BRANCH, "feature/*") && BITBUCKET_PR_DESTINATION_BRANCH == "main"
      pipelines:
        - unit-tests
        - lint
    - condition: glob(BITBUCKET_BRANCH, "hotfix/*")
      pipelines:
        - unit-tests

pipelines:
  custom:
    unit-tests:
      - step:
          name: Unit Tests
          script:
            - npm ci
            - npm test
    lint:
      - step:
          name: Lint
          script:
            - npm run lint
    scan:
      - step:
          name: Scan
          script:
            - ./scan.sh
    deploy:
      - step:
          name: Deploy
          script:
            - ./deploy.sh
Supported trigger types:
- repository-push
  Evaluated when a commit is pushed to any branch, or when a git tag is created.
- pullrequest-push
  Evaluated when a pull request is created or updated.
Note:
As you might expect, this list of triggers is just the start. We are planning to support a whole range of additional triggers in the near future to turbocharge your SDLC automation workflows.
For further details, see triggers documentation.
Build Status Updates
We are also improving how build statuses appear in order to better handle this new parallel execution model. Previously, multiple pipelines sometimes shared the same build status, resulting in fewer build statuses than expected in the pull request view. Now, we assign a unique build status to each pipeline and have clarified the naming to specify each one clearly.
Example
A commit generated four pipelines: one from the default section of bitbucket-pipelines.yml and three from the custom pipelines section.
Before:
Custom pipelines shared the same commit status
After:
All pipelines have a unique status, with meaningful names
Things to know
- Only custom pipelines can be referenced in triggers.
- Imported pipelines are supported but must still be defined within the custom section (see documentation).
- A maximum of 20 conditions per trigger type is supported.
- A maximum of 100 pipelines can be initiated from a single trigger.
- The new triggers workflow operates alongside the legacy selector-based workflow. If both match, pipelines from both mechanisms will run. There is no plan to deprecate the legacy selector-based workflow; however, we strongly recommend teams begin gradually migrating to the new syntax as they iterate on their pipeline workflows.
- triggers are supported in Dynamic Pipelines. Triggers and their associated custom pipelines must be returned from Dynamic Pipelines providers in order to be evaluated for execution.
Documentation for this functionality will soon be updated under Pipeline start conditions.
Feedback
We’re always keen to hear your thoughts and feedback – if you have any questions or suggestions on this feature, feel free to share via the Pipelines Community Space.