Applying Agile Approaches to Traditional Processes | @DevOpsSummit #Scrum #Agile #DevOps

Beyond Software Development: Applying Agile Approaches to Traditional Processes

While we understand Agile as a means to accelerate innovation, manage uncertainty and cope with ambiguity, many are inclined to think that it conflicts with the objectives of traditional engineering projects, such as building a highway, skyscraper or power plant. These are plan-driven and predictive projects that seek to avoid any uncertainty. This type of thinking, however, is short-sighted.

Agile approaches are valuable in controlling uncertainty because they constrain the complexity that stems from it. For example, iterative cycles place a limit on how far planning can go and how much work can be taken on at one time. What if those same constraints could help control a different kind of complexity -- the type that undermines classic plan-driven projects?

Followers of Lean production are familiar with the idea that limiting work in progress ("WIP") allows a system to operate with a steady flow of work, thereby increasing its overall efficiency and effectiveness. The goals of Lean are galvanized further by Agile practices, because any technique that limits WIP should prove to be valuable, especially in situations where the sheer magnitude of work in progress is the primary source of complexity.
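Little's Law from queueing theory makes the WIP-flow relationship concrete: average lead time equals average WIP divided by throughput. A minimal sketch of the arithmetic, using invented numbers purely for illustration:

```python
def average_lead_time(avg_wip: float, throughput_per_week: float) -> float:
    """Little's Law: lead time = WIP / throughput."""
    return avg_wip / throughput_per_week

# A team that finishes 5 items per week:
print(average_lead_time(30, 5))  # 30 items in flight -> 6.0 weeks per item
print(average_lead_time(10, 5))  # WIP capped at 10  -> 2.0 weeks per item
```

The team's capacity is unchanged in both cases; only the WIP limit differs. Capping WIP shortens the time any single item spends in the system, which is exactly the steady-flow effect Lean aims for.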

The work of Simon Orrell bears this out. His clients' use of Scrum in major engineering projects has been effective in managing complexity by controlling and limiting WIP. Embedded inside a traditionally managed waterfall project, practices such as multi-disciplinary teams, iterative cycles and backlog grooming provide natural constraints on WIP and a framework in which to prioritize and work collaboratively.

Plan-driven projects may have little margin for error or tolerance for change, but that does not place them at odds with Agile. Although Agile practices originated elsewhere, they provide a useful set of tools for managing problems in the predictive domain.

Agile in Software Development
Software development has confounded traditional planning and management systems due to the high level of ambiguity involved in the work. Uncertainty abounds for many reasons:

  1. We have incomplete information about markets, customers and requirements.
  2. We need to accommodate constantly changing needs and desires.
  3. The product's design is difficult to nail down.
  4. We're not sure about the best software architecture.
  5. We don't know how long it will take to build.

Without a method to manage the ambiguity and uncertainty, waste pervades. Many of us have lived through software projects that "finish," only to find that the result is not even what the customer asked for.

Agile approaches have been very effective because they break up work and provide a structure within which it is acceptable to fail. In software, practices such as iterative cycles and continuous feedback loops control and constrain the risks of ambiguity: if a feature was not executed correctly, we can't get too far along before the problem surfaces and can be corrected.

Project Management in Engineering
Traditional engineering projects, on the other hand, have few of the same uncertainty risks that software development projects face:

  1. We have very precise understanding of the finished product.
  2. The customer knows exactly what they need.
  3. Designs are based largely on precedent.
  4. We can create very clear architecture plans to build from.
  5. We know how long it should take to build.

Rather, many of the challenges in engineering projects stem from the sheer magnitude of the work being performed. The problem here is not the planning but the coordinating, i.e., ensuring that the work from each stage meshes with the work in the next. The size and complexity of the project means that there are a vast number of dependent tasks that need to be performed across a very broad range of activities.

Work is typically assigned by a project manager to highly specialized teams, and each of those will kick off their own set of tasks. A large amount of work across many problem domains is therefore started at once, and it can remain open indefinitely. Teams may not be able to share information or resources, and they lack the system-level visibility needed to make decisions and use information and resources effectively.

Cascading failure is also a huge risk. Knowing every detail about what work needs to be performed does not help us recover when something fails. In other words, there is an extreme problem with WIP overload. Some of the symptoms include:

  1. Very long running tasks
  2. Large number of handoffs
  3. Domino effect when dependencies are misaligned
  4. Team silos and lack of knowledge integration
  5. Difficulty focusing on what is important

The fifth problem is probably the most damning. We've all been in this situation: Faced with such a staggering amount of work, how do you determine what needs to be done right now? How do you prioritize a work item when everyone is telling you that their work item is urgent? And how do you manage your work effectively when you don't have insight into what's happening at the system level? This is where Agile shines.
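The domino effect from misaligned dependencies can be made concrete with a small sketch. Given a map of which tasks depend on which, a breadth-first walk shows everything transitively blocked when one task fails. The task names here are invented examples, not taken from Orrell's project:

```python
from collections import defaultdict, deque

def blocked_by_failure(dependencies: dict, failed_task: str) -> set:
    """Return every task transitively blocked when one task fails.

    `dependencies` maps each task to the list of tasks it depends on.
    """
    # Invert the map: for each task, which tasks depend on it?
    dependents = defaultdict(set)
    for task, deps in dependencies.items():
        for dep in deps:
            dependents[dep].add(task)

    # Breadth-first walk outward from the failed task.
    blocked, queue = set(), deque([failed_task])
    while queue:
        current = queue.popleft()
        for child in dependents[current]:
            if child not in blocked:
                blocked.add(child)
                queue.append(child)
    return blocked

deps = {
    "foundation": [],
    "structure": ["foundation"],
    "piping": ["structure"],
    "electrical": ["structure"],
    "commissioning": ["piping", "electrical"],
}
print(sorted(blocked_by_failure(deps, "structure")))
# -> ['commissioning', 'electrical', 'piping']
```

One slipped task near the root of the dependency graph stalls everything downstream, which is why limiting how much dependent work is open at once matters so much at scale.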

Applying Agile in Engineering Projects
Orrell's success in applying Scrum to execute the engineering phase for a natural gas processing plant demonstrated the effectiveness of using Agile to control WIP complexity. This played out in several ways:

1. With a multidisciplinary, matrixed team of operational specialists, WIP was managed by consensus. Team members were given the opportunity to understand and manage their work at a system level, not just a functional level. Cross-functional knowledge could be integrated and shared.

2. Scrum ceremonies such as daily stand-up, sprint planning and retrospectives provided continuous insight into how the project was performing and helped the teams make better decisions about what to work on. Progress could be determined empirically and objectively. Teams were able to truly know the state of their work with each iteration. With this information, focus could be maintained on understanding and completing high-priority items in the backlog.

3. Team members pulled tasks from a backlog groomed and refined by the project manager and the cross-discipline team. Backlog items were created out of traditional project artifacts -- schedule progress, roadblocks to deliverables in the schedule, risks from the risk log, process requirements, infrastructure requirements and so on. Techniques such as mind-mapping were used to discover blockers and knowledge gaps so the team could work together to determine how to solve them.

4. The product increment for each sprint included iterating on the 3D model of the plant that was to be built, and completing specific deliverables like engineering drawings, procurement contracts and purchase orders needed for construction to begin.

The result was a new way of working that allowed for delivery ahead of schedule, under budget and with less stress for everyone involved.

By building out a backlog from project artifacts and using a multidisciplinary, iterative approach to prioritization, a natural structure for controlling WIP emerges and the complexity of large projects can be reduced.
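The backlog-from-artifacts approach above can be sketched as a simple priority queue. The item names, sources and priority numbers below are invented for illustration; the point is that heterogeneous artifacts (schedule items, risk-log entries, requirements) collapse into one ordered list the team pulls from:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class BacklogItem:
    priority: int                        # lower number = more urgent
    title: str = field(compare=False)
    source: str = field(compare=False)   # which project artifact it came from

# Build one backlog from several kinds of project artifacts.
backlog = [
    BacklogItem(2, "Issue purchase order for compressors", "schedule"),
    BacklogItem(1, "Resolve piping/electrical clash in 3D model", "risk log"),
    BacklogItem(3, "Draft infrastructure requirements doc", "process requirements"),
]
heapq.heapify(backlog)

# The team always pulls the highest-priority item next.
next_item = heapq.heappop(backlog)
print(next_item.title)  # -> Resolve piping/electrical clash in 3D model
```

Because items carry their originating artifact, the traditional plan and the Agile backlog stay linked: grooming the backlog is just re-scoring this queue as the schedule and risk log evolve.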

Moving Towards Co-existence
As we head towards a world where predictive project management techniques and Agile approaches are used together, there will be several key obstacles to overcome:

1. When Agile teams grow and become distributed they need to adopt their own artifact-based collaboration tools to manage their work. In doing so they will need to find ways to integrate their tools with existing processes and make their work visible to all stakeholders in the organization. Without these tools and integrations, work will get bogged down with meetings, spreadsheets, documents and manual processes.

2. Similarly, as the methodology is scaled out to include all phases of a project, there will be cultural and organizational issues to deal with - the same issues as when scaling Agile in software organizations: increased complexity, reduced visibility and resistance to the agile mindset, and so on.

3. Probably the largest obstacle will be convincing people to use Agile approaches in the first place. Getting the buy-in at the executive level to manage the delivery of expensive and significant projects like factories and plants in new ways could prove to be difficult.

Nevertheless, Agile approaches such as Scrum have proven valuable not just in reducing uncertainty from the unknown, but also in coping with the unmanageability of too many knowns. We need to continue moving forward with understanding how to integrate plan-driven projects with Agile practices in the world outside software.

As Orrell noted in Agile Coaching: Wisdom from Practitioners, "the success of organizations, their products, and projects lies in their teams and how they work together." To that end, the coexistence of plan-based and Agile projects offers a wealth of new opportunities to improve and evolve our organizations.

More Stories By John Rauser

John Rauser is the IT Manager at Tasktop Technologies, a global enterprise software company. He also serves as VP Operations at the board of the Project Management Institute - Canadian West Coast Chapter, providing leadership and expertise on technology issues. He has a passion for discussing the business impacts of technology and analyzing strategies for managing IT.
