Dynamically Show the Overall Progress of Running Jenkins Flows


Everyone knows Jenkins, right? And I think nobody doesn’t love Jenkins. Maybe it’s not the fastest or the fanciest, but it’s really easy to get started with, even for rookies, thanks to its short learning curve. What’s more, it has a great ecosystem of plugins and add-ons, which significantly extends its capabilities, and it is easy to customize. It can be configured to build code, create Docker containers, run tons of tests, push to staging/production, and so on.

Jenkins has become an indispensable tool for Continuous Integration (aka CI) and DevOps. With the help of Jenkins-Job-Builder (aka JJB), the configurations of Jenkins jobs/flows, written in YAML (or JSON) format, can be maintained and updated through Git, which makes it easy to track every change to every job/flow.
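For instance, a minimal JJB job definition looks roughly like this (the job name, repository URL, and shell commands are made up for illustration):

```yaml
# A minimal Jenkins-Job-Builder job definition (illustrative names).
# Running `jenkins-jobs update` against this file creates/updates the job.
- job:
    name: example-build-job
    description: 'Build the project and run unit tests'
    scm:
      - git:
          url: https://example.com/project.git
          branches:
            - master
    builders:
      - shell: |
          make build
          make test
```

Because this definition is just a text file, it can live in Git alongside the code it builds, and every job change goes through normal code review.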

However, there are some issues regarding scaling and performance, which isn’t so unusual: Jenkins is built as a CI tool, yet it also needs CI for itself.

There are other cool solutions such as Travis CI and Circle CI, both hosted services that don’t require any maintenance on our side.


Ordinarily, when a build flow is running, we want to track and dynamically show its real-time status. There is already a plugin, the Build Graph View Plugin, which computes a graph of related builds starting from the first job up to the current one, and renders it as a graph.

However, that plugin lives entirely inside Jenkins and provides no standalone daemon, which makes it hard to customize and integrate into your own dashboard. What’s more, it cannot display the whole flow graph until all the sub-jobs/pipelines finish. So it is quite hard for developers, testers, and operations engineers to monitor the overall progress of a running flow.

So I wrote a standalone web service named reflatus, short for real-time Jenkins flow status.

A Demo Flow

For how to use and configure this tool, please refer to the project wiki.

What it can NOT do

Reflatus only has a static parser, which can NOT parse the dedicated DSL defined by the Build Flow plugin. For the reasons, please refer to the FAQ.

Instead, an extra YAML file is needed to explicitly define the build workflows (aka build pipelines). More info can be found in the Configuration section.
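As a rough idea of what such a file conveys, a flow definition simply lists the jobs that make up a pipeline. Note this is a hypothetical sketch only; the keys and names below are illustrative, and the actual schema is documented in the Configuration section of the project wiki.

```yaml
# Hypothetical sketch, NOT the actual reflatus schema -- see the
# Configuration section of the project wiki for the real format.
flows:
  - name: example-flow
    jobs:
      - build
      - unit-test
      - deploy-staging
```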


  • Why not add/use a parser to handle the dedicated DSL defined by the Build Flow plugin?

    If such a parser existed, there would be no need to manually add an extra YAML file. In practice, however, this feature is quite complex to implement. Even setting aside complicated build flow combinations, the name of a build job/pipeline can be determined dynamically by trigger parameters, environment variables, or an explicit name, and the same applies to build job/pipeline parameters. All of this adds workload and complexity to the tool. It is for this reason that I discarded this feature.
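    To illustrate why static parsing is impractical, consider a Build Flow DSL (Groovy) snippet in which the downstream job is only known at run time; the parameter and job names here are made up:

    ```groovy
    // Build Flow DSL (Groovy). The job name comes from a build parameter,
    // so a static parser cannot know which job will actually be triggered.
    def target = params["TARGET_JOB"]          // e.g. set by the triggering job
    build(target, BRANCH: params["BRANCH"])    // job name resolved at run time

    // Job names can also come from environment variables:
    build(build.environment["DOWNSTREAM_JOB"])
    ```

    A static parser sees only the variable names, not the jobs they resolve to, which is why reflatus asks for an explicit YAML definition instead.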