Software development process
A software development process is a structured approach to developing software. It typically divides an overall effort into smaller steps or sub-processes that are intended to ensure high-quality results. The process may describe specific deliverables – artifacts to be created and completed.[1]
Although not strictly limited to it, software development process often refers to the high-level process that governs the development of a software system from its beginning to its end of life – known as a methodology, model or framework. The system development life cycle (SDLC) describes the typical phases that a development effort goes through from the beginning to the end of life for a system – including a software system. A methodology prescribes how engineers go about their work in order to move the system through its life cycle. A methodology is a classification of processes or a blueprint for a process that is devised for the SDLC. For example, many processes can be classified as a spiral model.
Software process and software quality are closely interrelated; some unexpected facets and effects have been observed in practice.[2]
Methodology
The SDLC drives the definition of a methodology in that a methodology must address the phases of the SDLC. Generally, a methodology is designed to result in a high-quality system that meets or exceeds expectations (requirements) and is delivered on time and within budget even though computer systems can be complex and integrate disparate components.[3] Various methodologies have been devised, including waterfall, spiral, agile, rapid prototyping, incremental, and synchronize and stabilize.[4]
A major difference between methodologies is the degree to which the phases are sequential vs. iterative. Agile methodologies, such as XP and scrum, focus on lightweight processes that allow for rapid changes.[5] Iterative methodologies, such as Rational Unified Process and dynamic systems development method, focus on stabilizing project scope and iteratively expanding or improving products. Sequential or big-design-up-front (BDUF) models, such as waterfall, focus on complete and correct planning to guide larger projects and limit risks to successful and predictable results.[6] Anamorphic development is guided by project scope and adaptive iterations. In scrum,[7] for example, a single user story goes through all the phases of the SDLC within a two-week sprint. By contrast, in the waterfall methodology, every business requirement[citation needed] is translated into feature/functional descriptions, which are then all implemented, typically over a period of months or longer.[citation needed]
A project can include both a project life cycle (PLC) and an SDLC, which describe different activities. According to Taylor (2004), "the project life cycle encompasses all the activities of the project, while the systems development life cycle focuses on realizing the product requirements".[8]
History
The term SDLC is often used as an abbreviated version of SDLC methodology. Further, some use SDLC and traditional SDLC to mean the waterfall methodology.
According to Elliott (2004), SDLC "originated in the 1960s, to develop large scale functional business systems in an age of large scale business conglomerates. Information systems activities revolved around heavy data processing and number crunching routines".[9] The structured systems analysis and design method (SSADM) was produced for the UK government Office of Government Commerce in the 1980s. Ever since, according to Elliott (2004), "the traditional life cycle approaches to systems development have been increasingly replaced with alternative approaches and frameworks, which attempted to overcome some of the inherent deficiencies of the traditional SDLC".[9] The main idea of the SDLC has been "to pursue the development of information systems in a very deliberate, structured and methodical way, requiring each stage of the life cycle––from the inception of the idea to delivery of the final system––to be carried out rigidly and sequentially"[9] within the context of the framework being applied.
Other methodologies were devised later:
- 1970s
- Structured programming since 1969
- Cap Gemini SDM (System Development Methodology), originally from PANDATA; the first English translation was published in 1974
- 1980s
- Structured systems analysis and design method (SSADM) from 1980 onwards
- Information Requirement Analysis/Soft systems methodology
- 1990s
- Object-oriented programming (OOP), developed in the early 1960s, became a dominant programming approach during the mid-1990s
- Rapid application development (RAD), since 1991
- Dynamic systems development method (DSDM), since 1994
- Scrum, since 1995
- Team software process, since 1998
- Rational Unified Process (RUP), maintained by IBM since 1998
- Extreme programming, since 1999
- 2000s
- Agile Unified Process (AUP) maintained since 2005 by Scott Ambler
- Disciplined agile delivery (DAD), which supersedes AUP
- 2010s
- Scaled Agile Framework (SAFe)
- Large-Scale Scrum (LeSS)
- DevOps
Since DSDM in 1994, all of the methodologies on the above list except RUP have been agile methodologies, yet many organizations, especially governments, still use pre-agile processes (often waterfall or similar).
Examples
The following are notable methodologies, ordered roughly by popularity.
- Agile
Agile software development refers to a group of frameworks based on iterative development, where requirements and solutions evolve via collaboration between self-organizing cross-functional teams. The term was coined in 2001 when the Agile Manifesto was formulated.
- Waterfall
The waterfall model is a sequential development approach, in which development flows one-way (like a waterfall) through the SDLC phases.
- Spiral
In 1988, Barry Boehm published a software system development spiral model, which combines key aspects of the waterfall model and rapid prototyping in an effort to capture the advantages of top-down and bottom-up concepts. It emphasizes a key area that many felt had been neglected by other methodologies: deliberate iterative risk analysis, particularly suited to large-scale complex systems.
- Incremental
Various methods combine linear and iterative methodologies, with the primary objective of reducing inherent project risk by breaking a project into smaller segments and providing more ease-of-change during the development process.
- Prototyping
Software prototyping is about creating prototypes, i.e. incomplete versions of the software program being developed.
- Rapid
Rapid application development (RAD) is a methodology which favors iterative development and the rapid construction of prototypes instead of large amounts of up-front planning. The "planning" of software developed using RAD is interleaved with writing the software itself. The lack of extensive pre-planning generally allows software to be written much faster and makes it easier to change requirements.
- Shape Up
Shape Up is a software development approach introduced by Basecamp in 2018. It is a set of principles and techniques that Basecamp developed internally to overcome the problem of projects dragging on with no clear end. Its primary target audience is remote teams. Shape Up has no estimation and velocity tracking, backlogs, or sprints, unlike waterfall, agile, or scrum. Instead, those concepts are replaced with appetite, betting, and cycles. As of 2022, besides Basecamp, notable organizations that have adopted Shape Up include UserVoice and Block.[10][11]
- Chaos
Chaos model has one main rule: always resolve the most important issue first.
- Incremental funding
Incremental funding methodology - an iterative approach.
- Lightweight
Lightweight methodology - a general term for methods that only have a few rules and practices.
- Structured systems analysis and design
Structured systems analysis and design method - a specific version of waterfall.
- Slow programming
Slow programming, part of the larger slow movement, emphasizes careful and gradual work with minimal or no time pressure. It aims to avoid bugs and overly quick release schedules.
- V-Model
V-Model (software development) - an extension of the waterfall model.
- Unified Process
Unified Process (UP) is an iterative software development methodology framework, based on Unified Modeling Language (UML). UP organizes the development of software into four phases, each consisting of one or more executable iterations of the software at that stage of development: inception, elaboration, construction, and transition.
Comparison
The waterfall model describes the SDLC phases such that each builds on the result of the previous one.[12][13][14][15] Not every project requires that the phases be sequential. For relatively simple projects, phases may be combined or overlapping.[12] Alternative methodologies to waterfall are described and compared below.[16]
| | Waterfall | RAD | Open source | OOP | JAD | Prototyping | End User |
|---|---|---|---|---|---|---|---|
| Control | Formal | MIS | Weak | Standards | Joint | User | User |
| Time frame | Long | Short | Medium | Any | Medium | Short | Short |
| Users | Many | Few | Few | Varies | Few | One or two | One |
| MIS staff | Many | Few | Hundreds | Split | Few | One or two | None |
| Transaction/DSS | Transaction | Both | Both | Both | DSS | DSS | DSS |
| Interface | Minimal | Minimal | Weak | Windows | Crucial | Crucial | Crucial |
| Documentation and training | Vital | Limited | Internal | In Objects | Limited | Weak | None |
| Integrity and security | Vital | Vital | Unknown | In Objects | Limited | Weak | Weak |
| Reusability | Limited | Some | Maybe | Vital | Limited | Weak | None |
Process meta-models
Some process models are abstract descriptions for evaluating, comparing, and improving the specific process adopted by an organization.
- ISO/IEC 12207
ISO/IEC 12207 is the international standard describing the method to select, implement, and monitor the life cycle for software.
- Capability Maturity Model Integration
The Capability Maturity Model Integration (CMMI) is one of the leading models and is based on best practices. Independent assessments grade organizations on how well they follow their defined processes, not on the quality of those processes or the software produced. CMMI has replaced CMM.
- ISO 9000
ISO 9000 describes standards for a formally organized process to manufacture a product and the methods of managing and monitoring progress. Although the standard was originally created for the manufacturing sector, ISO 9000 standards have been applied to software development as well. Like CMMI, certification with ISO 9000 does not guarantee the quality of the end result, only that formalized business processes have been followed.
- ISO/IEC 15504
ISO/IEC 15504 Information technology—Process assessment, a.k.a. Software Process Improvement Capability Determination (SPICE), is a framework for the assessment of software processes. This standard is aimed at setting out a clear model for process comparison. SPICE is used much like CMMI. It models processes to manage, control, guide, and monitor software development. This model is then used to measure what a development organization or project team actually does during software development. This information is analyzed to identify weaknesses and drive improvement. It also identifies strengths that can be continued or integrated into common practice for that organization or team.
- ISO/IEC 24744
ISO/IEC 24744 Software Engineering—Metamodel for Development Methodologies, is a powertype-based metamodel for software development methodologies.
- Soft systems methodology
Soft systems methodology is a general method for improving management processes.
- Method engineering
Method engineering is a general method for improving information system processes.
References
- ^ "Selecting a development approach" (PDF). Centers for Medicare & Medicaid Services (CMS) Office of Information Service. United States Department of Health and Human Services (HHS). March 27, 2008 [Original Issuance: February 17, 2005]. Archived from the original (PDF) on June 20, 2012. Retrieved October 27, 2008.
- ^ Suryanarayana, Girish (2015). "Software Process versus Design Quality: Tug of War?". IEEE Software. 32 (4): 7–11. doi:10.1109/MS.2015.87.
- ^ "Systems Development Life Cycle". FOLDOC. Retrieved June 14, 2013.
- ^ "Software Development Life Cycle (SDLC)" (PDF). softwarelifecyclepros.com. May 2012. Retrieved June 26, 2025.
- ^ "SDLC Overview: Models & Methodologies". Retrieved December 12, 2021.
- ^ Arden, Trevor (1991). Information technology applications. London: Pitman. ISBN 978-0-273-03470-4.
- ^ "What is Scrum?". December 24, 2019.
- ^ Taylor, James (2004). Managing Information Technology Projects. p. 39.
- ^ a b c Geoffrey Elliott (2004). Global Business Information Technology: an integrated systems approach. Pearson Education. p. 87.
- ^ "Foreword by Jason Fried | Shape Up". basecamp.com. Retrieved September 11, 2022.
- ^ "Is Shape Up just a nice theory?". Curious Lab. Retrieved September 12, 2022.
- ^ a b US Department of Justice (2003). INFORMATION RESOURCES MANAGEMENT Chapter 1. Introduction.
- ^ Everatt, G.D.; McLeod, R Jr (2007). "Chapter 2: The Software Development Life Cycle". Software Testing: Testing Across the Entire Software Development Life Cycle. John Wiley & Sons. pp. 29–58. ISBN 9780470146347.
- ^ Unhelkar, B. (2016). The Art of Agile Practice: A Composite Approach for Projects and Organizations. CRC Press. pp. 56–59. ISBN 9781439851197.
- ^ Land, S.K.; Smith, D.B.; Walz, J.W. (2012). Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards. John Wiley & Sons. pp. 341–3. ISBN 9780470289952.
- ^ Post, G. & Anderson, D. (2006). Management information systems: Solving business problems with information technology (4th ed.). New York: McGraw-Hill Irwin.
External links
- Selecting a development approach Archived January 2, 2019, at the Wayback Machine at cms.hhs.gov.
- Gerhard Fischer, "The Software Technology of the 21st Century: From Software Reuse to Collaborative Software Design" Archived September 15, 2009, at the Wayback Machine, 2001
Software development process
Overview
Definition and Scope
The software development process refers to a structured set of activities, methods, and practices that organizations and teams employ to plan, create, test, deploy, and maintain software systems in a systematic manner.[7] This framework ensures that software is developed efficiently, meeting user needs while managing risks and resources effectively. According to the ISO/IEC/IEEE 12207:2017 standard, it encompasses processes for the acquisition, supply, development, operation, maintenance, and disposal of software products or services, providing a common terminology and structure applicable across various software-centric systems.[8]

The scope of the software development process is bounded by the technical and engineering aspects of software creation, typically from initial planning and requirements elicitation through to deployment and ongoing maintenance, but it excludes non-technical elements such as post-deployment legal compliance, marketing strategies, or business operations unrelated to the software itself.[9] It applies to both custom-built software tailored for specific needs and off-the-shelf solutions that may involve adaptation or integration, covering standalone applications as well as embedded software within larger systems.[8] This boundary emphasizes repeatable engineering practices over one-off project executions, allowing for scalability across projects while aligning with broader system engineering contexts when software is part of integrated hardware-software environments.[9]

Key components of the software development process include core activities such as requirements analysis, design, coding, testing, integration, and deployment; supporting artifacts like requirements specifications, design documents, source code repositories, and test reports; defined roles for participants including developers, testers, project managers, and stakeholders; and expected outcomes such as reliable, functional software products that satisfy defined criteria. These elements interact through defined workflows to produce verifiable results, with activities often interleaved to address technical, collaborative, and administrative needs.[10]

In distinction from the broader software lifecycle, which represents a specific instance of applying processes to a single project from inception to retirement, the software development process focuses on the reusable, standardized framework of methods and practices that can be tailored and repeated across multiple projects to promote consistency and improvement.[8] This repeatable nature enables organizations to define, control, and refine their approaches over time, separate from the unique timeline or events of any individual lifecycle.[9]

Importance and Role in Software Engineering
Formalized software development processes are essential for reducing the inherent risks in software projects, where industry analyses show that up to 70% of initiatives fail or face significant challenges due to poor planning and unstructured execution. By establishing clear stages and checkpoints, these processes enhance predictability in timelines and deliverables, enable better cost estimation and control, and ultimately boost stakeholder satisfaction through consistent quality outcomes. For instance, structured methodologies have been linked to success rates improving from as low as 15% in traditional ad-hoc efforts to over 40% in disciplined environments, as evidenced by comparative studies on development practices.[11]

Within the broader discipline of software engineering, formalized processes serve as the backbone for applying core engineering principles, such as modularity—which breaks systems into independent components—and reusability, which allows code and designs to be leveraged across projects for efficiency and scalability. These processes foster interdisciplinary collaboration by defining roles, communication protocols, and integration points for diverse teams, including developers, testers, and domain experts. Moreover, they ensure alignment with business objectives by incorporating requirements analysis and iterative feedback loops that tie technical decisions to strategic goals, as outlined in established software engineering standards.[12][13]

The economic implications of robust software development processes are profound, contributing to a global software market that generated approximately $945 billion in revenue in 2023 and continues to drive innovation across critical sectors like finance, healthcare, and artificial intelligence. Effective processes not only sustain this market's growth by minimizing waste and accelerating time-to-market but also enable the creation of reliable systems that underpin digital transformation in these industries. In contrast, ad-hoc development approaches heighten risks, leading to technical debt that accumulates from shortcuts and incomplete implementations, exacerbating security vulnerabilities through unaddressed flaws, and creating ongoing maintenance burdens that can inflate costs by up to 30% over time.[14][15][16]

Historical Evolution
Origins and Early Models (Pre-1980s)
The origins of structured software development processes can be traced to the mid-20th century, emerging from the practices of hardware engineering and early scientific computing. The ENIAC, completed in 1945 as the first programmable general-purpose electronic digital computer, required manual reconfiguration through physical wiring and switch settings for each program, highlighting the ad hoc nature of initial programming that blended hardware manipulation with computational tasks.[17] This approach, rooted in wartime ballistics calculations, laid the groundwork for recognizing the need for systematic methods as computing shifted toward more complex, reusable instructions.[18]

In the 1950s, efforts to manage large-scale programming introduced the first explicit models for software production. Herbert D. Benington presented a stagewise process in 1956 during a symposium on advanced programming methods, describing the development of the SAGE air defense system as involving sequential phases: operational planning, program design, coding, testing, and information distribution.[19] This linear, documentation-driven approach emphasized dividing labor and documenting each stage to handle the scale of military projects, serving as a precursor to later models without formal iteration.[20]

The 1960s intensified the push for disciplined processes amid growing project complexities, exemplified by IBM's System/360 announcement in 1964, which demanded compatible software across a family of computers and exposed severe development challenges, including staff disarray and delays in operating systems like OS/360.[21] The airline industry's SABRE reservation system, deployed in 1964 after years of overruns, further illustrated these issues, as its massive scale—handling real-time bookings—revealed inadequacies in ad hoc coding practices.[22] These strains culminated in the 1968 NATO Conference on Software Engineering in Garmisch, Germany, where participants coined the term "software crisis" to describe widespread cost overruns, delivery delays, and maintenance difficulties in large systems, prompting calls for engineering-like rigor.[23]

A pivotal contribution came from Edsger W. Dijkstra's 1968 critique of unstructured programming, particularly the "goto" statement, which he argued led to unreadable "spaghetti code"; he advocated instead for structured control flows using sequence, selection, and iteration to enhance clarity and verifiability.[24] This emphasis on modularity influenced early process thinking. In 1970, Winston W. Royce formalized a linear model in his paper "Managing the Development of Large Software Systems," depicting a cascading sequence of requirements, design, implementation, verification, and maintenance, tailored for documentation-heavy projects like defense systems, though Royce himself noted risks in its rigidity without feedback loops.[25] These pre-1980s foundations addressed the escalating demands of computing but underscored the limitations of sequential approaches in dynamic environments.

Modern Developments and Shifts (1980s-Present)
In the 1980s and 1990s, software development processes began transitioning from rigid, linear models toward more iterative approaches that incorporated risk management and evolving paradigms like object-oriented programming (OOP). Barry Boehm introduced the Spiral Model in 1986 as a risk-driven framework, where development proceeds through iterative cycles of planning, risk analysis, engineering, and evaluation, allowing for progressive refinement based on identified uncertainties rather than upfront specification. This model addressed limitations in earlier sequential methods by explicitly prioritizing risk assessment at each iteration, influencing subsequent processes to integrate feedback loops for handling complexity in large-scale projects. Concurrently, the rise of OOP, exemplified by Smalltalk developed at Xerox PARC in the 1970s and widely adopted in the 1980s, reshaped process designs by emphasizing modularity, encapsulation, and prototyping, which encouraged iterative experimentation and reuse in software architecture.[26][27]

The early 2000s marked a pivotal shift with the publication of the Agile Manifesto in 2001, which emerged as a direct response to the perceived inflexibility of plan-driven methodologies, advocating for adaptive practices that prioritize customer value and responsiveness. The Manifesto outlines four core values: individuals and interactions over processes and tools, working software over comprehensive documentation, customer collaboration over contract negotiation, and responding to change over following a plan. This philosophy gained traction through frameworks like Scrum, first formalized by Ken Schwaber and Jeff Sutherland in a 1995 paper presenting it as an iterative, incremental process for managing complex projects, but it surged in popularity after the Manifesto's release as organizations sought faster delivery cycles. By promoting self-organizing teams and short iterations, these developments fostered a broader move away from exhaustive upfront planning toward empirical process control.[28][29][30]

From the 2010s onward, integration of DevOps practices further accelerated this evolution, blending development and operations to enable continuous integration and delivery (CI/CD), with roots tracing to collaborative efforts around 2007-2008 that matured into widespread adoption by the mid-2010s. DevOps emphasized automation, shared responsibility, and rapid feedback to shorten release cycles, often building on Agile foundations to support continuous deployment in dynamic environments. The launch of Amazon Web Services (AWS) in 2006 exemplified cloud computing's role in this shift, providing on-demand infrastructure that decoupled development from hardware constraints, enabling scalable testing, deployment, and global distribution while reducing time-to-market. More recently, AI and machine learning tools have automated aspects of coding, testing, and maintenance, such as code generation and anomaly detection, enhancing efficiency in adaptive processes.[31][32][33]

Overall, these developments reflect a fundamental trend from plan-driven processes, which relied on detailed upfront specifications, to adaptive ones that embrace uncertainty through iteration and collaboration, as articulated in analyses of methodological evolution.[34] By 2023, this shift was evident in adoption data, with 71% of organizations using Agile practices, often in hybrid forms combining traditional and iterative elements to suit varying project scales.[35]

Development Methodologies
Traditional Sequential Models
The Waterfall model, a foundational sequential methodology in software development, was introduced by Winston W. Royce in his 1970 paper on managing large software systems.[25] It structures the process into distinct, linear phases executed in strict order: system requirements analysis, software requirements definition, preliminary and detailed design, coding and debugging, integration and testing, and finally deployment and maintenance, with each phase building upon the deliverables of the previous one. Although often interpreted as strictly linear, Royce recommended iterative elements and feedback to mitigate risks.[36] This approach emphasizes upfront planning and documentation, making it particularly suitable for projects with stable, well-defined requirements where predictability is paramount, such as in embedded systems development.[37]

The V-Model emerged in the 1980s as an extension of the Waterfall model, incorporating a graphical representation that pairs each development phase on the left side (verification) with a corresponding testing phase on the right side (validation) to ensure systematic quality assurance throughout the lifecycle.[38] For instance, requirements analysis is verified against acceptance testing, while detailed design aligns with unit testing, promoting early defect detection and traceability in safety-critical applications like automotive software.[39] This pairing reinforces the sequential nature but integrates testing as an integral counterpart to each step, rather than a post-development activity.

Traditional sequential models excel in environments requiring extensive documentation and compliance, such as regulatory sectors; for example, the U.S. Food and Drug Administration (FDA) references a waterfall-like structure in its design control guidance for medical device software, where phases must be sequentially documented to meet traceability and audit requirements.[40] Their strengths include enhanced predictability and risk management for fixed-scope projects, facilitating clear milestones and resource allocation.[41] However, a key drawback is their inflexibility to requirement changes once a phase is completed, often leading to costly rework if project needs evolve.[42]

In terms of estimation, cycle time in these models is calculated as the sum of individual phase durations with no overlaps, giving a straightforward formula for the total project timeline: $T = \sum_{i=1}^{n} t_i$, where $T$ is the overall cycle time and $t_i$ represents the duration of phase $i$. This metric supports budgeting in stable projects but assumes accurate upfront predictions, underscoring the models' reliance on initial planning accuracy.[43]
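To make the additive estimate concrete, here is a minimal sketch that sums phase durations for a waterfall-style plan; the phase names and week counts are hypothetical illustrations, not figures drawn from the sources cited above.

```python
# Illustrative waterfall cycle-time estimate: T = t_1 + t_2 + ... + t_n,
# assuming strictly sequential phases with no overlap.
# Phase names and durations (in weeks) are hypothetical.
phases = {
    "requirements": 4,
    "design": 6,
    "implementation": 10,
    "verification": 5,
    "deployment": 2,
}

total_weeks = sum(phases.values())  # the sum in the formula above

for name, weeks in phases.items():
    print(f"{name:>14}: {weeks} weeks")
print(f"{'total':>14}: {total_weeks} weeks")  # 27 weeks for these figures
```

For these hypothetical durations the estimate is 27 weeks; any overlap between phases, which the sequential model excludes by assumption, would invalidate the simple sum.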
Iterative and Agile Approaches

Iterative and agile approaches represent a shift from rigid, linear processes to flexible, feedback-driven methods that emphasize incremental development, continuous improvement, and adaptation to changing requirements. Unlike traditional sequential models, which often struggle with late-stage changes and risk accumulation due to upfront planning, iterative methods build software in cycles, allowing for early detection and mitigation of issues. These approaches prioritize delivering functional increments regularly, fostering collaboration and responsiveness in dynamic environments.

The Spiral model, introduced by Barry Boehm in 1986, integrates iterative prototyping with systematic risk analysis to guide software development. It structures the process into repeating cycles, each comprising four quadrants: determining objectives, alternatives, and constraints; evaluating options and identifying risks; developing and verifying prototypes or products; and planning the next iteration. This risk-driven framework is particularly suited for large, complex projects where uncertainties are high, as it explicitly addresses potential pitfalls before committing resources. Boehm's model has influenced subsequent adaptive methodologies by highlighting the need for ongoing evaluation and adjustment.

The Agile Manifesto, authored by a group of software developers in 2001, outlines four core values—individuals and interactions over processes and tools, working software over comprehensive documentation, customer collaboration over contract negotiation, and responding to change over following a plan—and supports them with 12 principles. These principles emphasize customer satisfaction through early and continuous delivery of valuable software, welcoming changing requirements even late in development, frequent delivery of working software, close daily cooperation between business stakeholders and developers, motivated individuals supported by the work environment, face-to-face conversation as the most efficient information exchange, working software as the primary measure of progress, sustainable development pace, continuous attention to technical excellence and good design, simplicity in maximizing work not done, self-organizing teams, and regular reflection for improved effectiveness. The manifesto's principles have become foundational for modern software practices, promoting adaptability and quality.

Within the Agile umbrella, Scrum provides a structured framework for implementing these principles through defined roles, events, and artifacts. Key roles include the Product Owner, who manages the product backlog and prioritizes features; the Scrum Master, who facilitates the process and removes impediments; and the Development Team, a cross-functional group responsible for delivering increments. Scrum organizes work into fixed-length sprints (typically 2-4 weeks), featuring events such as sprint planning, daily stand-ups for progress synchronization, sprint reviews for stakeholder feedback, and retrospectives for process improvement. This framework enables teams to deliver potentially shippable product increments at the end of each sprint, enhancing predictability and alignment.

Kanban, developed by David J. Anderson in the early 2000s as an evolution of lean manufacturing principles applied to knowledge work, focuses on visualizing workflow and limiting work in progress to optimize flow efficiency. It uses a Kanban board to represent tasks in columns such as "To Do," "In Progress," and "Done," allowing teams to pull work as capacity permits rather than pushing predefined assignments. By emphasizing continuous delivery without fixed iterations, Kanban reduces bottlenecks and improves throughput, making it ideal for maintenance or support teams where priorities shift frequently (see the sketch at the end of this section).

Lean software development, popularized by Mary and Tom Poppendieck in their 2003 book, adapts lean manufacturing concepts to software by focusing on delivering value while eliminating waste. Core principles include eliminating waste (such as unnecessary features or delays), amplifying learning through feedback loops, deciding as late as possible to defer commitments, delivering as fast as possible via small batches, empowering teams for decision-making, building integrity with automated testing, and optimizing the whole system over subsystems. In practice, Lean has been widely adopted in startups for creating minimum viable products (MVPs) that validate ideas quickly with minimal resources, enabling rapid iteration based on user feedback.

Adopting iterative and agile approaches yields significant benefits, including higher customer satisfaction; for instance, 93% of organizations using Agile report improvements in this area according to the 17th State of Agile Report. These methods also accelerate delivery, with 71% of respondents noting faster time-to-market. However, challenges arise in scaling to large teams, such as coordination across multiple units, managing dependencies, and maintaining consistency in practices, often requiring frameworks like SAFe or LeSS to address inter-team communication and alignment. Despite these hurdles, the emphasis on empiricism and adaptation has made iterative and agile methods dominant in contemporary software development.
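To ground the pull-based mechanics described above, the following is a minimal sketch of a Kanban board that enforces a work-in-progress limit; the column names, the WIP limit of 2, and the task items are illustrative assumptions, not anything prescribed by Anderson's method.

```python
# Minimal Kanban board sketch: tasks are pulled into "In Progress" only
# while the WIP limit leaves capacity, mirroring pull-based flow control.
# Column names, WIP limit, and task items are hypothetical.
from collections import deque

WIP_LIMIT = 2

todo = deque(["design login page", "fix bug #42", "write API docs"])
in_progress: list[str] = []
done: list[str] = []

def pull_next() -> bool:
    """Pull one task from To Do if WIP capacity allows."""
    if todo and len(in_progress) < WIP_LIMIT:
        in_progress.append(todo.popleft())
        return True
    return False  # blocked: nothing to do, or WIP limit reached

def finish(task: str) -> None:
    """Move a task from In Progress to Done, freeing WIP capacity."""
    in_progress.remove(task)
    done.append(task)

pull_next()               # pulls "design login page"
pull_next()               # pulls "fix bug #42"
assert not pull_next()    # third pull is blocked by WIP_LIMIT
finish("fix bug #42")     # completing work frees capacity
assert pull_next()        # now "write API docs" can be pulled
print(in_progress, done)
```

The WIP limit is the main tuning lever in such a board: a lower limit surfaces bottlenecks sooner at the cost of occasional idle capacity, while a higher limit hides queuing behind apparent busyness.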
Comparison of Methodologies

Various software development methodologies differ in their approach to managing complexity, change, and delivery, with key criteria including flexibility (ability to accommodate requirements changes), documentation level (extent of upfront and ongoing records), suitability for team size (scalability for small vs. large groups), risk handling (mechanisms for identifying and mitigating uncertainties), and time-to-market (speed of delivering functional software).[44]

The Waterfall model, a linear sequential process, offers low flexibility as changes require restarting phases, but it emphasizes high documentation through structured requirements and design documents, making it suitable for small to medium teams in stable environments with well-defined needs.[45] In contrast, the Spiral model incorporates iterative cycles with explicit risk analysis, providing moderate to high flexibility and effective risk handling via prototyping, though it demands expertise and can be costly for larger teams due to repeated evaluations.[46] Agile methodologies, such as Scrum, prioritize high flexibility and iterative delivery with minimal initial documentation, excelling in risk handling through continuous feedback but often suiting smaller, co-located teams better, as scaling can introduce coordination challenges.[44]

Empirical studies highlight trade-offs in outcomes; for instance, according to the Standish Group CHAOS Report (2020), Agile projects are approximately three times more likely to succeed than Waterfall projects, with success rates of 39% for Agile versus 11% for Waterfall, and reduced time-to-market by enabling incremental releases that address risks early.[47] Waterfall's rigid structure suits projects with fixed requirements, like embedded systems integrated with hardware, while Agile is preferable for dynamic domains such as web applications where user needs evolve rapidly.[45] The Spiral model bridges these by balancing predictability with adaptability, ideal for high-risk projects like large-scale defense software, though its complexity limits use in time-constrained scenarios.[46]

| Methodology | Pros | Cons |
|---|---|---|
| Waterfall | High documentation and clear milestones for tracking progress<br>Suitable for small teams and projects with stable requirements<br>Low risk in predictable environments due to sequential validation[44] | Low flexibility; changes are costly and disruptive<br>Longer time-to-market as testing occurs late<br>Poor risk handling for uncertain projects, leading to higher failure rates (e.g., 59% vs. 11% for Agile per Standish Group CHAOS Report 2020)[47] |
| Spiral | Strong risk handling through iterative prototyping and evaluation<br>Moderate flexibility allows incorporation of feedback across cycles<br>Balances documentation with adaptability for medium to large teams[46] | Higher costs from repeated risk analysis and prototypes<br>Requires expert teams for effective risk identification<br>Slower time-to-market due to multiple iterations |
| Agile | High flexibility and rapid time-to-market via short iterations<br>Effective risk mitigation through continuous integration and stakeholder involvement<br>Scales to various team sizes with frameworks like SAFe, though best for smaller groups initially[44] | Lower documentation can lead to knowledge gaps in large teams<br>Potential for scope creep without disciplined practices<br>Less suitable for highly regulated projects needing extensive upfront compliance |
