How to Stop Your Developers from Jumping Ship | @DevOpsSummit #Agile #DevOps

For developers working across a broken workflow, the frustration can be felt acutely

We are experiencing a ‘crisis of engagement,' according to a survey by management consultancy Gallup, with a staggering 87 percent of employees worldwide disengaged from their jobs. While there are many factors that can influence an employee's contentment - from support to compensation to job security - a common grievance in software development is disempowerment, where talented individuals feel hampered by their working environment.

Operating within a flawed system is infuriating in any profession. But for developers working across a broken workflow, the frustration can be felt more acutely. One of the major bones of contention is that developers find themselves working in a way that conflicts with their training and established best practices, one that is not conducive to Agile and DevOps working methodologies and prevents them from optimizing their tools of choice.

A major contributor to an unproductive working environment is a fragmented toolchain, a regular occurrence at organizations worldwide. Many are adopting a best-of-breed approach to tool procurement without a holistic investment strategy. Individual specialists, teams and disciplines specify a tool (or version thereof) as and when required - e.g., a project manager in CA Clarity PPM, a tester in HP ALM, a QA engineer in HPE QC, a developer in JIRA and so on - with little consideration of how it may impact the rest of the software lifecycle.

While it is logical to equip employees with the best tools for the job, this ad hoc and unsynchronized approach is littered with time bombs that could explode at any moment, negatively impacting a developer's satisfaction and, ultimately, an organization's software delivery capability. So what can be done to stop developers from jumping ship to join Facebook, Netflix, et al.?

Grounds for divorce
Any investment strategy must consider software lifecycle integration (SLI); otherwise an organization is quietly laying the foundation for an unproductive environment that will come back to haunt it. Most tools in the software lifecycle are not designed to integrate with third-party tools, meaning they do not naturally communicate with each other, and project-critical data typically becomes siloed within them.

The result is a workflow with little-to-no visibility, traceability or governance across the lifecycle, and stakeholders who cannot share information with one another, greatly impacting collaboration. Such a dynamic can frustrate a developer in several ways, all of which are avoidable. Here are just a few of the issues caused by a disconnected toolchain:

  • Agile and DevOps blasphemy: Collaboration, interaction and information are the triumvirate of Agile and DevOps. The flow of artifacts between tools can be considered more important than the tools themselves. As both methodologies are widely accepted as industry standards, it's understandable that most developers will be looking to apply the principles in their day-to-day work within their teams. But if their tool is isolated and they're unable to share or receive vital data at the click of a button, they will become discontented as their training is wasted and their skill set underutilized.
  • Developers just want to...develop: Developers are genuinely passionate about their work. They're problem solvers who love tackling issues with code, taking great pleasure in working with their colleagues to build and maintain projects, drive innovation, challenge the status quo and, significantly, create value for their company. But it's a tough job; intensive coding is an all-encompassing specialty that requires undivided attention. They don't - and shouldn't have to - spend valuable hours lost in boring non-value administrivia (status meetings, emails, data entry, spreadsheet merging, etc.). Such tasks can be minimized, or even removed completely. If they're not, don't be surprised if your developers seek new pastures.
  • Friend or foe: If a tool isn't implemented correctly and working in unison with the rest of the lifecycle, it can quickly become a burden or, worse, a foe. Isolating a quality tool can quickly strip it of its capabilities and functionality, as developers spend more time fighting the tool than harnessing its powers. Benign issues become malignant, and great, engaged developers become frustrated and disengaged.
  • Rotten apple: As the adage goes, ‘one rotten apple spoils the whole bunch.' Team unity, especially in software development, is integral to Agile and DevOps principles. A strong software lifecycle needs all players in the game to be happy in their roles or balls will be dropped. If a developer is disenchanted by her role within the team because of inefficient processes and procedures, it's only a matter of time before her colleagues become demoralized too. Beware the domino effect.

Pain relief
Addressing a fragmented software lifecycle is the first step in creating harmony among your software development and delivery personnel. An integrated software value stream has many benefits, but at its heart, it's all about removing waste and getting rid of all the annoying work that plagues a developer's day.

What do we mean by ‘software value stream'? A value stream is a notion borrowed from Lean manufacturing; it's the sequence of activities an organization undertakes to deliver a customer request, focusing on both the production flow from raw material to end-product and the design flow from concept to realization.

Looking at software development from a value stream perspective puts the emphasis on creation of customer value, rather than simply looking at these activities as a process. It's all about the ‘Big Picture' - improving the whole process, not just the parts - to minimize waste and ensure customers get exactly what they asked for.

How does an integrated value stream boost employee engagement? The troubles experienced by a leading global bank during its four-year Agile and DevOps transformation illustrate the point perfectly. Given the nature of its business, it's critical that all information is consistent across all the bank's systems, yet it lacked an automated flow of information between tools.

This meant that developers were still rekeying data from one tool to another, spending up to two hours a day on duplicate entry. Not only was this a huge waste of valuable labor hours - costing the business up to $10 million annually in lost value - but developers often got so frustrated that they would leave. This case study is painfully familiar and, again, entirely avoidable.
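
As a rough back-of-the-envelope check, the arithmetic holds up: two hours a day of rekeying across a few hundred developers quickly adds up to millions in lost value. The sketch below is illustrative only - the headcount, working days and fully loaded hourly cost are assumptions, not figures from the bank's case study; only the two hours a day comes from the story above.

```python
# Back-of-the-envelope estimate of the cost of duplicate data entry.
# All inputs except hours_lost_per_day are illustrative assumptions,
# not figures from the case study.

developers = 250              # assumed number of affected developers
hours_lost_per_day = 2        # duplicate entry reported above
working_days_per_year = 230   # assumed working days per developer
loaded_hourly_cost = 90       # assumed fully loaded cost per developer-hour, USD

annual_hours_lost = developers * hours_lost_per_day * working_days_per_year
annual_cost = annual_hours_lost * loaded_hourly_cost

print(f"Hours lost per year: {annual_hours_lost:,}")   # 115,000
print(f"Approximate annual cost: ${annual_cost:,}")    # $10,350,000
```

Plug in your own headcount and rates and the scale of the waste in your organization becomes just as visible.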

A Developer's Paradise
An integrated software value stream yields many benefits for developers (and the rest of the software lifecycle), including:

  • Information flows automatically across teams without costly manual intervention and oversight, removing non-value work and bottlenecks (see the sketch after this list).
  • Collaboration happens within the work, rather than in email or disconnected tools.
  • Reports and analytics emerge from a holistic view of all the artifacts.
  • Traceability and governance become a natural act rather than an expensive manual process.
  • Visibility into the value stream enables managers to understand project status and optimize processes, meaning developers have the freedom and insight to build extraordinary software.
  • A modular Agile toolchain means forward-thinking developers can plug in new tools (or new versions of their existing tools) to experiment with new ideas and innovations.
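
To make the first of these points concrete, here is a minimal sketch of what automated information flow between two tools can look like. The URLs, endpoints and field names are hypothetical placeholders rather than the API of any particular tracker, and a production integration - whether built in-house or provided by an integration platform - would also handle authentication, field mapping, conflict resolution and retries.

```python
"""Illustrative sketch: mirror status changes from one tracker into another,
so nobody has to rekey them by hand.

The REST endpoints and field names below are hypothetical placeholders,
not the API of any specific tool.
"""
import time
import requests

SOURCE_URL = "https://tracker-a.example.com/api/issues"  # assumed source tracker
TARGET_URL = "https://tracker-b.example.com/api/items"   # assumed target tracker


def fetch_recently_updated(since_iso: str) -> list[dict]:
    """Ask the source tracker for artifacts changed since the given timestamp."""
    resp = requests.get(SOURCE_URL, params={"updated_since": since_iso}, timeout=10)
    resp.raise_for_status()
    return resp.json()


def mirror_status(issue: dict) -> None:
    """Push the artifact's id and status into the target tool to keep both in sync."""
    payload = {"external_id": issue["id"], "status": issue["status"]}
    resp = requests.put(f"{TARGET_URL}/{issue['id']}", json=payload, timeout=10)
    resp.raise_for_status()


def run(poll_seconds: int = 60) -> None:
    """Poll the source tracker and mirror every change automatically."""
    last_poll = "1970-01-01T00:00:00Z"
    while True:
        for issue in fetch_recently_updated(last_poll):
            mirror_status(issue)
        last_poll = time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime())
        time.sleep(poll_seconds)


if __name__ == "__main__":
    run()
```

However it is implemented, the effect is the same: project-critical data stops being siloed in individual tools, and developers stop paying the rekeying tax described above.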

As a result, these organizations experience:

  • Faster time-to-market
  • Reduced development cost
  • Ability to add more features, more efficiently, while improving quality and reducing risk
  • Ability to leverage their software development capability to bring differentiating value to their businesses

And perhaps most importantly, it results in high developer engagement and job satisfaction: the fuel behind any software-driven organization that's looking to transform its business in a digital world.

More Stories By John Rauser

John Rauser is the IT Manager at Tasktop Technologies, a global enterprise software company. He also serves as VP of Operations on the board of the Project Management Institute - Canadian West Coast Chapter, providing leadership and expertise on technology issues. He has a passion for discussing the business impacts of technology and analyzing strategies for managing IT.
