Software design
Software design is the process of conceptualizing how a software system will work before it is implemented or modified.[1] Software design also refers to the direct result of the design process – the concepts of how the software will work, which may be formally documented or maintained less formally, including via oral tradition.
The design process enables a designer to model aspects of a software system before it exists, with the intent of making the effort of writing the code more efficient. Creativity, past experience, a sense of what makes "good" software, and a commitment to quality are success factors for a competent design.
A software design can be compared to an architect's plan for a house. High-level plans represent the totality of the house (e.g., a three-dimensional rendering of the house). Lower-level plans provide guidance for constructing each detail (e.g., the plumbing layout). Similarly, the software design model provides a variety of views of the proposed software solution.
Part of the overall process
In terms of the waterfall development process, software design is the activity that occurs after requirements analysis and before coding.[2] Requirements analysis determines what the system needs to do without determining how it will do it, and thus, multiple designs can be imagined that satisfy the requirements. The design can be created while coding, without a plan or requirements analysis,[3] but for more complex projects this is less feasible. Completing a design prior to coding allows for multidisciplinary designers and subject-matter experts to collaborate with programmers to produce software that is useful and technically sound.
Sometimes, a simulation or prototype is created to model the system in an effort to determine a valid and good design.
Code as design
A common point of confusion with the term design in software is that the process applies at multiple levels of abstraction such as a high-level software architecture and lower-level components, functions and algorithms. A relatively formal process may occur at high levels of abstraction but at lower levels, the design process is almost always less formal where the only artifact of design may be the code itself. To the extent that this is true, software design refers to the design of the design. Edsger W. Dijkstra referred to this layering of semantic levels as the "radical novelty" of computer programming,[4] and Donald Knuth used his experience writing TeX to describe the futility of attempting to design a program prior to implementing it:
TeX would have been a complete failure if I had merely specified it and not participated fully in its initial implementation. The process of implementation constantly led me to unanticipated questions and to new insights about how the original specifications could be improved.[5]
Artifacts
A design process may include the production of artifacts such as flow charts, use cases, pseudocode, Unified Modeling Language models, and other fundamental modeling concepts. For user-centered software, design may involve user experience design, yielding a storyboard to help determine those specifications. Documentation may be reviewed to allow constraints, specifications and even requirements to be adjusted prior to coding.
Iterative design
Software systems inherently deal with uncertainties, and the size of software components can significantly influence a system's outcomes, both positively and negatively. Neal Ford and Mark Richards propose an iterative approach to address the challenge of identifying and right-sizing components. This method emphasizes continuous refinement as teams develop a more nuanced understanding of system behavior and requirements.[6]
The approach typically involves a cycle with several stages:[6]
- A high-level partitioning strategy is established, often categorized as technical or domain-based. Guidelines for the smallest meaningful deployable unit, referred to as "quanta," are defined. While these foundational decisions are made early, they may be revisited later in the cycle if necessary.
- Initial components are identified based on the established strategy.
- Requirements are assigned to the identified components.
- The roles and responsibilities of each component are analyzed to ensure clarity and minimize overlap.
- Architectural characteristics, such as scalability, fault tolerance, and maintainability, are evaluated.
- Components may be restructured based on feedback from development teams.
This cycle serves as a general framework and can be adapted to different domains.
Design principles
Design principles enable a software engineer to navigate the design process. Davis[7] suggested a set of principles, refined over time, as follows:
- The design process should not suffer from "tunnel vision"
- A good designer should consider alternative approaches, judging each based on the requirements of the problem and the resources available to do the job.
- The design should be traceable to the analysis model
- Because a single element of the design model can often be traced back to multiple requirements, it is necessary to have a means for tracking how requirements have been satisfied by the design model.
- The design should not reinvent the wheel
- Systems are constructed using a set of design patterns, many of which have likely been encountered before. These patterns should always be chosen as an alternative to reinvention. Time is short and resources are limited; design time should be invested in representing (truly new) ideas by integrating patterns that already exist (when applicable).
- The design should "minimize the intellectual distance" between the software and the problem as it exists in the real world
- That is, the structure of the software design should, whenever possible, mimic the structure of the problem domain.
- The design should exhibit uniformity and integration
- A design is uniform if it appears fully coherent. In order to achieve this outcome, rules of style and format should be defined for a design team before design work begins. A design is integrated if care is taken in defining interfaces between design components.
- The design should be structured to accommodate change
- The design concepts discussed in the next section enable a design to achieve this principle.
- The design should be structured to degrade gently, even when aberrant data, events, or operating conditions are encountered
- Well-designed software should never "bomb"; it should be designed to accommodate unusual circumstances, and if it must terminate processing, it should do so in a graceful manner.
- Design is not coding, coding is not design
- Even when detailed procedural designs are created for program components, the level of abstraction of the design model is higher than the source code. The only design decisions made at the coding level should address the small implementation details that enable the procedural design to be coded.
- The design should be assessed for quality as it is being created, not after the fact
- A variety of design concepts and design measures are available to assist the designer in assessing quality throughout the development process.
- The design should be reviewed to minimize conceptual (semantic) errors
- There is sometimes a tendency to focus on minutiae when the design is reviewed, missing the forest for the trees. A design team should ensure that major conceptual elements of the design (omissions, ambiguity, inconsistency) have been addressed before worrying about the syntax of the design model.
Design concepts
[edit]Design concepts provide a designer with a foundation from which more sophisticated methods can be applied. Design concepts include:
- Abstraction
- Reducing the information content of a concept or an observable phenomenon, typically to retain only information that is relevant for a particular purpose. It is an act of representing essential features without including the background details or explanations.
- Architecture
- The overall structure of the software and the ways in which that structure provides conceptual integrity for a system. Good software architecture will yield a good return on investment with respect to the desired outcome of the project, e.g. in terms of performance, quality, schedule and cost.
- Control hierarchy
- A program structure that represents the organization of a program component and implies a hierarchy of control.
- Data structure
- Representing the logical relationship between elements of data.
- Design pattern
- A designer may identify a design aspect of the system that has been solved in the past. The reuse of such patterns can increase software development velocity.[8]
- Information hiding
- Modules should be specified and designed so that information contained within a module is inaccessible to other modules that have no need for such information (see the sketch after this list).
- Modularity
- Dividing the solution into parts (modules).
- Refinement
- The process of elaboration. A hierarchy is developed by decomposing a macroscopic statement of function in a step-wise fashion until programming language statements are reached. In each step, one or several instructions of a given program are decomposed into more detailed instructions. Abstraction and refinement are complementary concepts.
- Software procedure
- Focuses on the processing of each module individually.
- Structural partitioning
- The program structure can be divided horizontally and vertically. Horizontal partitions define separate branches of modular hierarchy for each major program function. Vertical partitioning suggests that control and work should be distributed top-down in the program structure.
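To make information hiding concrete, here is a minimal Python sketch (the class and method names are hypothetical, invented for illustration): the module exposes a small public interface while its internal data representation stays inaccessible to clients.

class TemperatureLog:
    """Hypothetical module: its internal representation is hidden from clients."""

    def __init__(self) -> None:
        self._readings: list[float] = []  # internal detail; clients never touch it

    def record(self, celsius: float) -> None:
        """Public interface: store one reading."""
        self._readings.append(celsius)

    def average(self) -> float:
        """Public interface: a derived value; how it is computed stays hidden."""
        return sum(self._readings) / len(self._readings)


# Callers depend only on record() and average(), so the internal list could be
# swapped for a ring buffer or a database without changing any client code.
log = TemperatureLog()
log.record(21.5)
log.record(23.0)
print(log.average())  # 22.25

Because clients cannot reach the internal list directly, a change of representation stays local to the module, which is the payoff information hiding promises.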
Grady Booch mentions abstraction, encapsulation, modularization, and hierarchy as fundamental software design principles.[9] The phrase principles of hierarchy, abstraction, modularization, and encapsulation (PHAME) refers to these principles.[10]
Design considerations
There are many aspects to consider in the design of software. The importance given to each should reflect the goals the software is being created to meet. Notable aspects include:
- Compatibility
- The software is able to operate with other products that are designed for interoperability. For example, a piece of software may be backward-compatible with an older version of itself.
- Extensibility
- New capabilities can be added to the software without major changes to the underlying architecture.
- Fault-tolerance
- The software is resistant to and able to recover from component failure.
- Maintainability
- A measure of how easily bug fixes or functional modifications can be accomplished. High maintainability can be the product of modularity and extensibility.
- Modularity
- The resulting software comprises well-defined, independent components, which leads to better maintainability. The components can then be implemented and tested in isolation before being integrated to form the desired software system. This allows division of work in a software development project.
- Overhead
- The resources the software consumes beyond those strictly needed to meet system requirements.
- Performance
- The software performs its tasks within a time-frame that is acceptable for the user, and does not require too much memory.
- Portability
- The software should be usable across a number of different conditions and environments.
- Reliability
- The software is able to perform a required function under stated conditions for a specified period of time.
- Reusability
- The ability to use some or all of the aspects of the preexisting software in other projects with little to no modification.
- Robustness
- The software is able to operate under stress or tolerate unpredictable or invalid input. For example, it can be designed with resilience to low memory conditions.
- Scalability
- The software adapts well to increasing data, added features, or a growing number of users. According to Marc Brooker: "a system is scalable in the range where marginal cost of additional workload is nearly constant." Serverless technologies fit this definition, although total cost of ownership, not just infrastructure cost, must be considered.[11]
- Security
- The software is able to withstand and resist hostile acts and influences.
- Usability
- The software user interface must be usable for its target user/audience. Default values for the parameters must be chosen so that they are a good choice for the majority of the users.[12]
Modeling language
A modeling language can be used to express information, knowledge or systems in a structure that is defined by a consistent set of rules. These rules are used for interpretation of the components within the structure. A modeling language can be graphical or textual. Examples of graphical modeling languages for software design include:
- Architecture description language (ADL)
- A language used to describe and represent the software architecture of a software system.
- Business Process Modeling Notation (BPMN)
- An example of a process modeling language.
- EXPRESS and EXPRESS-G (ISO 10303-11)
- An international standard general-purpose data modeling language.
- Extended Enterprise Modeling Language (EEML)
- Commonly used for business process modeling across a number of layers.
- Flowchart
- Schematic representations of algorithms or other step-wise processes.
- Fundamental Modeling Concepts (FMC)
- A modeling language for software-intensive systems.
- IDEF
- A family of modeling languages, the most notable of which include IDEF0 for functional modeling, IDEF1X for information modeling, and IDEF5 for modeling ontologies.
- Jackson Structured Programming (JSP)
- A method for structured programming based on correspondences between data stream structure and program structure.
- LePUS3
- An object-oriented visual Design Description Language and a formal specification language that is suitable primarily for modeling large object-oriented (Java, C++, C#) programs and design patterns.
- Unified Modeling Language (UML)
- A general modeling language to describe software both structurally and behaviorally. It has a graphical notation and allows for extension with a UML profile.
- Alloy
- A general-purpose specification language for expressing complex structural constraints and behavior in a software system. It provides a concise language based on first-order relational logic.
- Systems Modeling Language (SysML)
- A general-purpose modeling language for systems engineering.
See also
- Aspect-oriented software development – Programming paradigm
- Design rationale – Explicit listing of design decisions
- Graphic design – Interdisciplinary branch of design and fine arts
- Interaction design – Specialization of design focused on the experience users have of a product or service
- Icon design – Genre of graphic design
- Outline of software – Topical guide to software
- Outline of software development – Overview of and topical guide to software development
- Outline of software engineering – Overview of and topical guide to software engineering
- Search-based software engineering – Application of metaheuristic search techniques to software engineering
- Software Design Description – Written design description of a software product
- Software development – Creation and maintenance of software
- User experience – Human interaction with a particular product, system or service
- User interface design – Planned operator–machine interaction
- Web design – Creation and maintenance of websites
- Zero One Infinity – Software design rule
References
- ^ Ralph, P. and Wand, Y. (2009). A proposal for a formal definition of the design concept. In Lyytinen, K., Loucopoulos, P., Mylopoulos, J., and Robinson, W., editors, Design Requirements Workshop (LNBIP 14), pp. 103–136. Springer-Verlag, p. 109. doi:10.1007/978-3-540-92966-6_6.
- ^ Freeman, Peter; David Hart (2004). "A Science of design for software-intensive systems". Communications of the ACM. 47 (8): 19–21 [20]. doi:10.1145/1012037.1012054. S2CID 14331332.
- ^ Ralph, P. and Wand, Y. (2009). A Proposal for a Formal Definition of the Design Concept. In Lyytinen, K., Loucopoulos, P., Mylopoulos, J., and Robinson, W. (eds.), Design Requirements Engineering: A Ten-Year Perspective. Springer-Verlag, pp. 103–136.
- ^ Dijkstra, E. W. (1988). "On the cruelty of really teaching computing science". Retrieved 2014-01-10.
- ^ Knuth, Donald E. (1989). "Notes on the Errors of TeX" (PDF).
- ^ a b Richards, Mark; Ford, Neal (2020). Fundamentals of Software Architecture: An Engineering Approach. O'Reilly Media. ISBN 978-1492043454.
- ^ Davis, A. (1995). 201 Principles of Software Development. McGraw-Hill.
- ^ Judith Bishop. "C# 3.0 Design Patterns: Use the Power of C# 3.0 to Solve Real-World Problems". C# Books from O'Reilly Media. Retrieved 2012-05-15. "If you want to speed up the development of your .NET applications, you're ready for C# design patterns -- elegant, accepted and proven ways to tackle common programming problems."
- ^ Booch, Grady; et al. (2004). Object-Oriented Analysis and Design with Applications (3rd ed.). MA, US: Addison Wesley. ISBN 0-201-89551-X. Retrieved 30 January 2015.
- ^ Suryanarayana, Girish (November 2014). Refactoring for Software Design Smells. Morgan Kaufmann. p. 258. ISBN 978-0128013977.
- ^ Building Serverless Applications on Knative. O'Reilly Media. ISBN 9781098142049.
- ^ Carroll, John, ed. (1995). Scenario-Based Design: Envisioning Work and Technology in System Development. New York: John Wiley & Sons. ISBN 0471076597.
- ^ Bell, Michael (2008). "Introduction to Service-Oriented Modeling". Service-Oriented Modeling: Service Analysis, Design, and Architecture. Wiley & Sons. ISBN 978-0-470-14111-3.
- ^ Roger S. Pressman (2001). Software Engineering: A Practitioner's Approach. McGraw-Hill. ISBN 0-07-365578-3.
Software design
Introduction
Definition and Scope
Software design is the process of defining the architecture, components, interfaces, and data for a software system to satisfy specified requirements.[3] This activity transforms high-level requirements into a detailed blueprint that guides subsequent development phases, serving both as a process and a model for representing the system's structure.[3] According to IEEE Std 1016-2009, it establishes the information content and organization of software design descriptions (SDDs), which communicate design decisions among stakeholders.[4]

The scope of software design encompasses high-level architectural design, which outlines the overall system structure; detailed component design, which specifies individual modules; and user interface design, which addresses interaction elements.[3] It occurs after requirements analysis and before construction in the software development lifecycle, distinguishing it from coding, which implements the design, and testing, which verifies it.[3] While iterative in nature, software design focuses on conceptual planning rather than execution or validation.[3]

Key characteristics of software design include abstraction, which simplifies complex systems by emphasizing essential features while suppressing irrelevant details; modularity, which divides the system into independent, interchangeable components with defined interfaces to enhance maintainability; and decomposition, which breaks down the system hierarchically into manageable subcomponents.[3] These principles enable the management of complexity in large-scale software projects.[3] For example, software design turns vague requirements—such as "a system for processing user data"—into a concrete blueprint specifying database schemas, API interfaces, and modular services, thereby facilitating efficient implementation.[3]

Historical Development
The origins of software design as a disciplined practice trace back to the 1960s, amid growing concerns over the "software crisis" characterized by escalating costs, delays, and reliability issues in large-scale projects. At the 1968 NATO Conference on Software Engineering in Garmisch, Germany, participants, including prominent figures like Edsger Dijkstra and Friedrich L. Bauer, highlighted the crisis's severity, noting that software development was failing to meet schedules, budgets, and quality expectations for systems like IBM's OS/360, which required over 5,000 person-years and exceeded estimates by factors of 2.5 to 4. This event spurred calls for systematic design approaches to manage complexity, emphasizing modularity and structured methodologies over ad hoc coding.[5]

A pivotal advancement came with the introduction of structured programming in the late 1960s, which advocated for clear, hierarchical control structures to replace unstructured jumps like the goto statement. In his influential 1968 letter "Go To Statement Considered Harmful," published in Communications of the ACM, Edsger Dijkstra argued that unrestricted use of goto led to unmaintainable code, promoting instead sequence, selection, and iteration as foundational elements; this work, building on the 1966 Böhm-Jacopini theorem, laid the theoretical groundwork for verifiable program design and influenced languages like Pascal.

The 1970s saw the emergence of modular design principles, focusing on decomposition to enhance maintainability and reusability. David Parnas' 1972 paper "On the Criteria to Be Used in Decomposing Systems into Modules," in Communications of the ACM, introduced the information hiding principle, which recommends clustering related data and operations into modules while concealing implementation details behind well-defined interfaces to minimize ripple effects from changes. This approach contrasted with earlier hierarchical decompositions based on functionality alone, proving more effective in experiments with systems like a keyword-in-context indexing program.

From the 1980s to the 1990s, object-oriented design gained prominence, shifting focus from procedural modules to encapsulating data and behavior in objects for better abstraction and inheritance. Key contributions came from Grady Booch, James Rumbaugh, and Ivar Jacobson—known as the "three amigos"—who developed complementary notations: Booch's method (1980s, emphasizing design), Rumbaugh's Object Modeling Technique (OMT, 1991, for analysis), and Jacobson's objectory process (1992, with use cases). Their collaborative efforts culminated in the Unified Modeling Language (UML) 1.0, submitted to the Object Management Group (OMG) in 1997 and standardized that year, providing a visual notation for specifying, visualizing, and documenting object-oriented systems. The 1994 book Design Patterns: Elements of Reusable Object-Oriented Software by Erich Gamma, Richard Helm, Ralph Johnson, and John Vlissides further solidified object-oriented practices by cataloging 23 reusable solutions to common design problems, such as the Singleton for ensuring unique instances and the Observer for event notification; this "Gang of Four" work, published by Addison-Wesley, drew from architectural patterns and became a cornerstone for scalable software architectures.
Entering the 2000s, software design evolved toward more flexible paradigms, with the 2001 Agile Manifesto—drafted by 17 practitioners including Kent Beck and Jeff Sutherland—prioritizing iterative development, customer collaboration, and responsive change over rigid planning, influencing methodologies like Scrum and Extreme Programming to integrate design incrementally. Concurrently, service-oriented architecture (SOA) rose in the early 2000s, leveraging web services standards like SOAP and WSDL to compose loosely coupled, interoperable services across enterprises, as formalized in the 2006 OASIS SOA Reference Model. By the post-2010 era, microservices architecture refined SOA's ideas into fine-grained, independently deployable services, popularized by pioneers like Adrian Cockcroft at Netflix and articulated in James Lewis and Martin Fowler's 2014 analysis, enabling scalable, cloud-native designs for high-traffic applications.[6][7]

The mid-2010s introduced containerization and orchestration technologies that transformed deployment and scaling practices. Docker, released in 2013, popularized container-based virtualization for packaging applications and dependencies, while Kubernetes, originally developed by Google and open-sourced in 2014, became the de facto standard for orchestrating containerized workloads across clusters, facilitating resilient and automated infrastructure management.[8][9] These advancements supported the DevOps movement, which gained traction in the 2010s by integrating development and operations through continuous integration/continuous delivery (CI/CD) pipelines to accelerate release cycles and improve reliability.[10] Serverless computing emerged around 2014 with platforms like AWS Lambda, allowing developers to focus on code without managing underlying servers, promoting event-driven architectures and fine-grained scalability.[11]

By the late 2010s and into the 2020s, artificial intelligence and machine learning integrated deeply into software design, with tools like GitHub Copilot (launched in 2021) using large language models to assist in generating code, architectures, and even design patterns, marking a shift toward AI-augmented human-centered design processes as of 2025.[12]

Design Process
General Process
The general process of software design involves transforming requirements into a structured blueprint for implementation through phases such as architectural design and detailed design. This process can follow sequential models like waterfall, emphasizing documentation and verification, or iterative approaches with feedback loops for refinement, depending on the development methodology. The process begins after requirements analysis and focuses on creating representations of the system from high-level to low-level, often incorporating adaptations to align with evolving needs.[13][14]

The primary phases include architectural design, which establishes the overall system structure by partitioning the software into major subsystems or modules; detailed design, which specifies the internal workings of individual components such as algorithms, data structures, and processing logic; and interface design, which defines the interactions between components, users, and external systems to ensure seamless communication. In architectural design, the system is decomposed to identify key elements and their relationships, often using patterns like layered or client-server architectures. Detailed design refines these elements by elaborating functional behaviors and control flows, while interface design focuses on defining protocols, data exchanges, and user interactions to promote usability and interoperability.[13][14]

Inputs to this process primarily consist of the outputs from requirements analysis, such as functional and non-functional specifications, use cases, and stakeholder needs, which provide the foundation for design decisions. Outputs include comprehensive design documents outlining the system architecture, component specifications, and interface protocols, along with prototypes to validate early concepts. These artifacts serve as bridges to the implementation phase, ensuring traceability back to initial requirements.[13][14]

Key activities encompass the decomposition of the system into manageable modules, the allocation of specific responsibilities to each module to optimize cohesion and minimize coupling, and ongoing verification through reviews, inspections, and simulations to confirm adherence to requirements. Decomposition involves breaking down high-level functions into hierarchical layers, while responsibility allocation assigns behaviors and data handling to appropriate components. Verification ensures design completeness and consistency, often via walkthroughs or formal checks against the requirements specification.[13][14]

Representation during this process relies on tools such as flowcharts for visualizing control flows, pseudocode for outlining algorithmic logic without implementation details, and entity-relationship diagrams for modeling data dependencies and structures. These tools facilitate clear communication of design intent among stakeholders and developers. For instance, in transforming user needs for an inventory management system—such as tracking stock levels and generating reports—the process might yield a hierarchical module breakdown, with top-level modules for user interface, business logic, and data storage, each further decomposed and verified against the requirements.[13][14]

Requirements Analysis
Requirements analysis precedes the software design process; in this phase the needs of stakeholders are identified, documented, and analyzed to form the basis for design activities. This phase ensures that the software system aligns with user expectations, business objectives, and technical constraints by systematically gathering and refining requirements. It provides clear inputs, such as functional specifications and performance criteria, that guide architectural and implementation decisions.[15]

Requirements are categorized into functional, non-functional, and constraints to comprehensively capture system expectations. Functional requirements specify the services the system must provide, including behaviors, user interactions, and responses to inputs, such as processing data or generating reports.[13] Non-functional requirements define quality attributes and constraints on system operation, encompassing performance metrics like response time, reliability standards, security measures, and usability features.[15] Constraints represent external limitations, including budget allocations, regulatory compliance, hardware compatibility, and technology standards that bound the design space.[13]

Elicitation techniques are employed to gather requirements from diverse stakeholders, ensuring completeness and relevance. Interviews involve structured or open-ended discussions with users and experts to uncover explicit and implicit needs, often conducted jointly between customers and developers.[15] Surveys distribute questionnaires to a wider audience for quantitative insights into preferences and priorities.[13] Use case modeling documents scenarios of system interactions with actors, deriving detailed functional requirements from business goals, such as outlining user authentication flows.[13] Prototyping builds preliminary models to validate assumptions and elicit feedback, particularly for ambiguous interfaces.[15]

Specification follows elicitation, organizing requirements into verifiable formats like use cases or structured documents to minimize misinterpretation. Validation methods confirm the requirements' accuracy, completeness, and feasibility through systematic checks. Traceability matrices link requirements to sources, designs, and tests, enabling impact analysis and ensuring no gaps in coverage.[13] Reviews, including walkthroughs and inspections, involve stakeholders in evaluating documents for consistency, clarity, and alignment with objectives.[15]

Challenges in requirements analysis include resolving ambiguity and managing conflicting stakeholder needs, which can lead to costly rework if unaddressed. Ambiguity arises from vague language or incomplete descriptions, potentially causing developers and users to interpret requirements differently; this is mitigated by defining precise terms in glossaries and using formal notation.[13] Conflicting requirements emerge when stakeholders hold divergent priorities—such as users emphasizing usability versus managers focusing on cost—and must be resolved through negotiation and prioritization techniques like voting or trade-off analysis.[15] For example, in developing a patient management system, requirements analysis might derive use cases from business goals like efficient appointment scheduling, specifying functional needs for calendar integration and non-functional constraints on data privacy under regulations like HIPAA.[13]

Iterative and Agile Approaches
Iterative design in software engineering involves repeating cycles of prototyping, evaluation, and refinement to progressively improve software components, such as user interface elements, allowing designers to incorporate feedback and address issues early in the development process.[16] This approach contrasts with linear methods by emphasizing incremental enhancements based on user testing and stakeholder input, often applied to specific modules like UI prototypes to ensure usability and functionality align with evolving needs.[17] Pioneered in models like Barry Boehm's spiral model, iterative design integrates risk analysis into each cycle to mitigate potential flaws before full implementation.[17]

Agile methodologies extend iterative principles into broader software design practices, promoting adaptive processes through frameworks like Scrum and Extreme Programming (XP). In Scrum, design emerges during fixed-duration sprints, where teams collaborate to refine architectures based on prioritized requirements and retrospectives, minimizing upfront planning in favor of responsive adjustments. XP, developed by Kent Beck, advocates emergent design, where initial simple structures evolve through refactoring and test-driven development, ensuring designs remain flexible to changing specifications without extensive preliminary documentation.[18] These methods align with the Agile Manifesto's values of responding to change over following a rigid plan, fostering collaboration and customer involvement throughout design iterations.[19]

The benefits of iterative and agile approaches include significant risk reduction via early validation of design assumptions, as prototypes allow teams to identify and resolve issues before substantial resource investment, compared to traditional methods.[20] Tools like story mapping facilitate this by visualizing user journeys and prioritizing design features, enabling focused iterations that enhance adaptability and stakeholder satisfaction. Key concepts such as time-boxing constrain design iterations to fixed periods—typically 2–4 weeks in Scrum sprints or DSDM timeboxes—to maintain momentum and deliver tangible progress, while continuous integration ensures frequent merging and automated testing of design changes to detect integration issues promptly.[21][22]

A representative example is evolutionary prototyping in agile teams, where an initial functional prototype is iteratively built upon with user feedback, refining core designs like system interfaces to better meet requirements without discarding prior work, as seen in risk-based models that combine prototyping with mitigation strategies.[23] This technique supports ongoing adaptation, ensuring the final software design evolves robustly in dynamic environments.[23]

Design Artifacts and Representation
Artifacts
Software design artifacts are the tangible outputs produced during the design phase to document, communicate, and guide the development of software systems. These include various diagrams and descriptions that represent the structure, behavior, and interactions of the software at different levels of abstraction. They serve as essential intermediaries between high-level requirements and implementation details, enabling stakeholders to visualize and validate the proposed design before coding begins.[2]

Common types of software design artifacts encompass architectural diagrams, data flow diagrams, sequence diagrams, and class diagrams. Architectural diagrams, such as layer diagrams, illustrate the high-level structure of the system by depicting components, their relationships, and deployment configurations, providing an overview of how the software is organized into layers or modules.[24] Data flow diagrams model the movement of data through processes, external entities, and data stores, highlighting inputs, outputs, and transformations without delving into control logic.[25] Sequence diagrams capture the dynamic interactions between objects or components over time, showing message exchanges in a chronological order to represent behavioral flows.[26] Class diagrams define the static structure of object-oriented systems by outlining classes, attributes, methods, and associations, forming the foundation for code generation in many projects.[26]

The primary purposes of these artifacts are to act as a blueprint for developers during implementation, facilitate design reviews by allowing peer evaluation and feedback, and ensure traceability to requirements by linking design elements back to specified needs. As a blueprint, they guide coding efforts by clarifying intended architectures and interfaces, reducing ambiguity and errors in development.[2] They support reviews by providing a shared visual or descriptive medium for stakeholders to assess completeness, consistency, and feasibility.[27] Traceability is achieved through explicit mappings that demonstrate how each design artifact addresses specific requirements, aiding in verification, change management, and compliance.[27]

A key standard governing software design artifacts is IEEE 1016-2009, which specifies the required information content and organization for software design descriptions (SDDs). This standard outlines views such as logical, process, and data perspectives, ensuring that SDDs comprehensively capture design rationale, assumptions, and dependencies for maintainable documentation.[4]

The evolution of software design artifacts has progressed from primarily textual specifications in the mid-20th century to sophisticated visual models supported by digital tools.
In the 1960s and 1970s, amid the software crisis, early efforts focused on structured textual documents and basic diagrams like flowcharts to formalize designs.[28] Methodologies such as Structured Analysis and Design Technique (SADT), developed between 1969 and 1973, introduced more visual representations, including data flow diagrams, enabled by emerging computer-aided software engineering (CASE) tools.[28][29] The 1990s marked a shift to standardized visual modeling with the advent of the Unified Modeling Language (UML) in 1997, which integrated diverse diagram types into a cohesive framework, facilitated by digital tools like Rational Rose.[28] Today, artifacts leverage integrated development environments (IDEs) and collaborative platforms for automated generation and real-time updates, emphasizing agility while retaining traceability.[28]

For instance, in a web application, a component diagram might depict module interactions such as the user interface component connecting to a business logic component via an API interface, with the latter interfacing with a database component for data persistence, illustrating dependencies and deployment boundaries.[30] These diagrams, often created using modeling languages like UML, help developers understand integration points without implementation specifics.

Modeling Languages
Modeling languages provide standardized notations for representing software designs, enabling visual and textual articulation of system structures, behaviors, and processes. The Unified Modeling Language (UML), initially developed by Grady Booch, Ivar Jacobson, and James Rumbaugh and standardized by the Object Management Group (OMG), is a widely adopted graphical modeling language that supports the specification, visualization, construction, and documentation of software systems, particularly those involving distributed objects. UML encompasses various diagram types, including use case diagrams for capturing functional requirements from user perspectives, activity diagrams for modeling workflow sequences and decisions, and state machine diagrams for depicting object state transitions and behaviors over time.[26] These diagrams facilitate a common vocabulary among stakeholders, reducing ambiguity in design communication.[31]

Beyond UML, specialized modeling languages address domain-specific needs in software design. The Systems Modeling Language (SysML), an extension of UML standardized by the OMG, is tailored for systems engineering, supporting the modeling of complex interdisciplinary systems through diagrams for requirements, structure, behavior, and parametrics. Similarly, the Business Process Model and Notation (BPMN), also from the OMG, offers a graphical notation for specifying business processes, emphasizing executable workflows with elements like events, gateways, and tasks to bridge business analysis and technical implementation.[32]

Textual alternatives to graphical modeling include domain-specific languages (DSLs), which are specialized languages designed to express solutions concisely within a particular application domain, often using simple syntax for configuration or behavior definition.[33] YAML (YAML Ain't Markup Language), a human-readable data serialization format, serves as a DSL for software configuration modeling, allowing declarative descriptions of structures like application settings or deployment parameters without procedural code. For instance, YAML files can define hierarchical data such as service dependencies in container orchestration, promoting readability and ease of parsing in tools.

These modeling languages offer key advantages, including standardization that ensures interoperability across tools and teams, as seen in UML's role in fostering consistent design practices.[31] They also enable automation, such as forward engineering where models generate executable code, reducing development time and errors by automating boilerplate implementation from high-level specifications.[34] A practical example is UML class diagrams, which visually represent object-oriented relationships, attributes, operations, and inheritance hierarchies—such as a base "Vehicle" class extending to "Car" and "Truck" subclasses—to clarify static design elements before coding.
Core Principles and Patterns
Design Principles
Design principles in software design serve as foundational heuristics that guide architects and developers in creating systems that are maintainable, scalable, and adaptable to change. These principles emphasize structuring software to manage complexity by promoting clarity, reducing dependencies, and facilitating evolution without widespread disruption. Originating from early efforts in structured programming and object-oriented paradigms, they provide prescriptive rules to evaluate and refine design decisions throughout the development lifecycle.[35]

Central to these principles are modularity, abstraction, and cohesion, which form the bedrock of effective software organization. Modularity advocates decomposing a system into discrete, self-contained modules that encapsulate specific functionalities, thereby improving flexibility, comprehensibility, and reusability; this approach, rooted in information hiding, allows changes within a module to remain isolated from the rest of the system.[36] Abstraction complements modularity by concealing unnecessary implementation details, enabling developers to interact with higher-level interfaces that reveal only essential behaviors and data, thus simplifying reasoning about complex systems.[37] Cohesion ensures that the elements within a module—such as functions or classes—are tightly related and focused on a single, well-defined purpose, minimizing internal fragmentation and enhancing the module's reliability.[38] These concepts trace their historical basis to structured design methodologies, particularly the work of Edward Yourdon and Larry Constantine, who in 1979 formalized metrics for cohesion and coupling to quantify module interdependence and internal unity, arguing that high cohesion paired with low coupling yields more robust architectures.[39]

Building on this foundation, the SOLID principles, articulated by Robert C. Martin in 2000, offer a cohesive framework specifically for object-oriented design, addressing common pitfalls in class and interface organization to foster extensibility and stability.[40] The Single Responsibility Principle (SRP) states that a class or module should have only one reason to change, meaning it ought to encapsulate a single, well-defined responsibility to avoid entanglement of unrelated concerns.[40] For instance, in a payroll system, separating user interface logic from salary calculation into distinct classes prevents modifications to display formatting from inadvertently affecting computation accuracy. The Open-Closed Principle (OCP) posits that software entities should be open for extension—such as adding new behaviors via inheritance or composition—but closed for modification, ensuring existing code remains unaltered when incorporating enhancements.[40] This is exemplified in plugin architectures where new features extend a core framework without altering its codebase. The Liskov Substitution Principle (LSP) requires that objects of a derived class must be substitutable for objects of the base class without altering the program's correctness, preserving behavioral expectations in inheritance hierarchies.[40] A violation occurs if a subclass of a "Bird" class, intended for flying behaviors, includes a "Penguin" that cannot fly, breaking assumptions in code expecting uniform flight capability.
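The Bird/Penguin violation can be made concrete in a short Python sketch (the class names are hypothetical, taken from the example above): code written against the base class breaks when handed the subclass, which is precisely what the LSP forbids.

class Bird:
    def fly(self) -> str:
        return "flying"


class Sparrow(Bird):
    pass  # substitutable: inherits fly() and honors the base-class contract


class Penguin(Bird):
    def fly(self) -> str:  # violates LSP: breaks the contract callers rely on
        raise NotImplementedError("penguins cannot fly")


def launch(bird: Bird) -> None:
    print(bird.fly())  # written against Bird; assumes any Bird can fly


launch(Sparrow())  # works
try:
    launch(Penguin())  # substituting the subclass changes program correctness
except NotImplementedError as err:
    print(f"LSP violation: {err}")

A common repair is to restructure the hierarchy, for example introducing a separate flying-bird abstraction so that only genuinely flight-capable classes promise fly().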
The Interface Segregation Principle (ISP) advises creating smaller, client-specific interfaces over large, general ones, preventing classes from depending on unused methods and reducing coupling.[40] For example, instead of a monolithic "Printer" interface with print, scan, and fax methods, segregate into separate interfaces so a basic printer class need not implement irrelevant scanning functions. Finally, the Dependency Inversion Principle (DIP) inverts traditional dependencies by having high-level modules depend on abstractions rather than concrete implementations, while abstractions themselves do not depend on details, promoting loose coupling through dependency injection.[40] In practice, a service layer might depend on an abstract repository interface rather than a specific database class, allowing seamless swaps between SQL and NoSQL storage without refactoring the service.

Applying these principles, including SRP to avoid "god classes" that centralize multiple responsibilities—such as a monolithic controller handling authentication, validation, and data persistence—results in designs that are easier to maintain and scale, as changes are localized and testing is more targeted.[40] Overall, adherence to modularity, abstraction, cohesion, and SOLID ensures software evolves efficiently in response to new requirements, reducing long-term costs and errors.[38]
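The repository example from the Dependency Inversion discussion above can be sketched in a few lines of Python (all names are hypothetical, not a specific library's API): the service depends only on an abstract repository, so storage backends can be swapped without refactoring it.

from abc import ABC, abstractmethod


class UserRepository(ABC):
    """Abstraction the high-level module depends on."""

    @abstractmethod
    def find(self, user_id: int) -> dict: ...


class SqlUserRepository(UserRepository):
    def find(self, user_id: int) -> dict:
        return {"id": user_id, "source": "sql"}  # stand-in for a real SQL query


class DocumentUserRepository(UserRepository):
    def find(self, user_id: int) -> dict:
        return {"id": user_id, "source": "documents"}  # stand-in for a NoSQL lookup


class UserService:
    """High-level module: receives the abstraction via dependency injection."""

    def __init__(self, repo: UserRepository) -> None:
        self._repo = repo

    def display_name(self, user_id: int) -> str:
        return f"user-{self._repo.find(user_id)['id']}"


# Storage backends swap freely; UserService itself is never edited.
print(UserService(SqlUserRepository()).display_name(7))
print(UserService(DocumentUserRepository()).display_name(7))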
Design Concepts
Software design concepts encompass the foundational abstractions and paradigms that structure software systems, emphasizing how components interact and represent real-world entities or processes. These concepts provide the philosophical underpinnings for translating requirements into modular, maintainable architectures, distinct from specific principles or patterns by focusing on broad organizational strategies rather than prescriptive rules.

Core concepts in software design include encapsulation, inheritance, and polymorphism, which are particularly prominent in object-oriented approaches. Encapsulation, also known as information hiding, involves bundling data and operations within a module while restricting direct access to internal details, thereby promoting modularity and reducing complexity in system changes.[41] Inheritance enables the reuse of existing structures by allowing new components to extend or specialize base ones, fostering hierarchical organization and code economy in designs. Polymorphism ensures that entities with similar interfaces can be substituted interchangeably, supporting uniform treatment of diverse implementations and enhancing flexibility.[42]

Design paradigms represent overarching approaches to organizing software, each emphasizing different balances between data and behavior. The procedural paradigm structures software as sequences of imperative steps within procedures, prioritizing control flow and step-by-step execution for straightforward, linear problem-solving. Object-oriented design models systems around objects that encapsulate state and behavior, leveraging relationships like inheritance to mimic real-world interactions. Functional design treats computation as the composition of pure functions without mutable state, focusing on mathematical transformations to achieve predictability and composability. Aspect-oriented design extends these by modularizing cross-cutting concerns, such as logging or security, that span multiple components, allowing cleaner separation from primary logic.

A key distinction in design concepts lies between data-focused and behavior-focused modeling. Entity modeling emphasizes the structure and relationships of persistent data elements, such as through entity-relationship diagrams, to define the informational backbone of a system. In contrast, behavioral specifications capture dynamic interactions and state transitions, outlining how entities respond to events over time to ensure system responsiveness.

Trade-offs in component interactions are central to robust design, particularly the balance between coupling and cohesion. Coupling measures the degree of interdependence between modules, where low coupling minimizes ripple effects from changes by limiting direct connections.[39] Cohesion assesses the unity within a module, with high cohesion ensuring that elements collaborate toward a single, well-defined purpose to enhance reliability and ease of maintenance.[39] Designers aim for high cohesion paired with low coupling to optimize modularity.

For instance, polymorphism in object-oriented design allows interchangeable implementations, such as defining a base "Shape" interface with a "draw" method that subclasses like "Circle" and "Rectangle" implement differently; client code can then invoke "draw" on any shape object without knowing its specific type, promoting extensibility.
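The Shape example above can be rendered as a minimal Python sketch (the classes are hypothetical, following the names in the text); the client loop calls draw without knowing the concrete type.

import math
from abc import ABC, abstractmethod


class Shape(ABC):
    @abstractmethod
    def draw(self) -> str: ...


class Circle(Shape):
    def __init__(self, radius: float) -> None:
        self.radius = radius

    def draw(self) -> str:
        return f"circle with area {math.pi * self.radius ** 2:.2f}"


class Rectangle(Shape):
    def __init__(self, width: float, height: float) -> None:
        self.width, self.height = width, height

    def draw(self) -> str:
        return f"rectangle with area {self.width * self.height:.2f}"


# Client code invokes draw() on any Shape without knowing the concrete type.
for shape in (Circle(1.0), Rectangle(2.0, 3.0)):
    print(shape.draw())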
Design Patterns
Design patterns provide proven, reusable solutions to frequently occurring problems in software design, promoting flexibility, maintainability, and reusability in object-oriented systems.[43] These patterns encapsulate best practices distilled from real-world experience, allowing developers to address design challenges without reinventing solutions. The seminal work on design patterns, published in 1994, catalogs 23 core patterns that form the foundation of modern software architecture.[43]

The patterns are organized into three primary categories based on their intent: creational, structural, and behavioral. Creational patterns focus on object creation mechanisms, abstracting the instantiation process to make systems independent of how objects are created, composed, and represented; examples include Singleton, which ensures a class has only one instance and provides global access to it, and Factory Method, which defines an interface for creating objects but allows subclasses to decide which class to instantiate.[43] Structural patterns deal with class and object composition, establishing relationships that simplify the structure of large systems while keeping them flexible; notable ones are Adapter, which allows incompatible interfaces to work together by wrapping an existing class, and Decorator, which adds responsibilities to objects dynamically without affecting other objects. Behavioral patterns address communication between objects, assigning responsibilities among them to achieve flexible and reusable designs; examples include Observer, which defines a one-to-many dependency for event notification, and Strategy, which enables algorithms to vary independently from clients using them.[43]

Each of the 23 Gang of Four (GoF) patterns follows a structured description, including intent (the problem it solves), motivation (why it's needed), applicability (when to use it), structure (UML class diagram), participants (key classes and their roles), collaborations (how participants interact), consequences (trade-offs and benefits), implementation considerations (coding guidelines), sample code (pseudocode examples), known uses (real-world applications), and related patterns (connections to others).[43] This format ensures patterns are not rigid templates but adaptable blueprints, with consequences highlighting benefits like increased flexibility alongside potential drawbacks such as added complexity.[43]

Modern extensions build on these foundations, adapting patterns to contemporary domains like distributed systems and cloud computing.
The Pattern-Oriented Software Architecture (POSA) series extends GoF patterns to higher-level architectural concerns, such as concurrent and networked systems, introducing patterns like Broker for distributing responsibilities across processes and Half-Sync/Half-Async for handling layered communication.[44] In cloud-native environments, patterns like Circuit Breaker enhance resilience by detecting failures and preventing cascading issues in microservices; the pattern operates in three states—closed (normal operation), open (blocking calls after failures), and half-open (testing recovery)—to avoid overwhelming faulty services.[45]

Selecting an appropriate design pattern requires matching the pattern's context and applicability to the specific problem, ensuring it addresses the design intent without introducing unnecessary abstraction or over-engineering.[43] Patterns should be applied judiciously, considering trade-offs like performance overhead or increased coupling, and evaluated against the system's non-functional requirements such as scalability and maintainability.

A representative example is the Observer pattern, commonly used in event-driven systems to notify multiple objects of state changes in a subject. The pattern involves a Subject maintaining a list of Observer dependencies, with methods to attach, detach, and notify observers; ConcreteSubjects track state and broadcast updates, while ConcreteObservers define update reactions. This decouples subjects from observers, allowing dynamic addition or removal without modifying the subject.[43]

The UML structure for Observer can be sketched as follows:
+------------------+        +-------------------+
| Subject          |<>------| Observer          |
+------------------+        +-------------------+
| -observers: List |        | +update(): void   |
| +attach(obs: Obs)|        +-------------------+
| +detach(obs: Obs)|                  ^
| +notify(): void  |                  |
+------------------+                  |
         ^                            |
         |                            |
+------------------+        +-------------------+
| ConcreteSubject  |        | ConcreteObserver  |
+------------------+        +-------------------+
| -state: int      |        | +update()         |
| +getState(): int |        +-------------------+
| +setState(s: int)|
+------------------+
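A minimal Python rendering of this structure might look as follows; it is an illustrative sketch, not the GoF sample code, and for brevity the ConcreteSubject role is folded into a single Subject class.

class Subject:
    """Plays both the Subject and ConcreteSubject roles from the diagram."""

    def __init__(self) -> None:
        self._observers: list = []
        self._state: int = 0

    def attach(self, observer) -> None:
        self._observers.append(observer)

    def detach(self, observer) -> None:
        self._observers.remove(observer)

    def notify(self) -> None:
        for observer in self._observers:
            observer.update(self)

    def get_state(self) -> int:
        return self._state

    def set_state(self, state: int) -> None:
        self._state = state
        self.notify()  # every state change is broadcast to attached observers


class ConcreteObserver:
    def __init__(self, name: str) -> None:
        self.name = name

    def update(self, subject: Subject) -> None:
        print(f"{self.name} saw state {subject.get_state()}")


subject = Subject()
subject.attach(ConcreteObserver("a"))
subject.attach(ConcreteObserver("b"))
subject.set_state(42)  # both observers are notified; the subject never names them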
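The Circuit Breaker pattern described earlier can likewise be sketched as a small state machine. The following Python code is an illustrative sketch with hypothetical thresholds and no retry or metrics logic, showing only the closed, open, and half-open transitions:

import time


class CircuitBreaker:
    """Illustrative three-state breaker: closed -> open -> half-open -> closed."""

    def __init__(self, failure_threshold: int = 3, reset_timeout: float = 30.0) -> None:
        self.failure_threshold = failure_threshold  # hypothetical default
        self.reset_timeout = reset_timeout          # hypothetical default, in seconds
        self.failures = 0
        self.state = "closed"
        self.opened_at = 0.0

    def call(self, operation):
        if self.state == "open":
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: call blocked")
            self.state = "half-open"  # timeout elapsed: probe for recovery
        try:
            result = operation()
        except Exception:
            self.failures += 1
            if self.state == "half-open" or self.failures >= self.failure_threshold:
                self.state = "open"  # stop sending traffic to the faulty service
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        self.state = "closed"  # a success closes the breaker again
        return result


# Usage sketch: wrap each outbound call, e.g. breaker.call(lambda: remote_request()).
breaker = CircuitBreaker()
print(breaker.call(lambda: "ok"))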
