From Abstract Concepts to Tangible Value: Software Architecture in Modern IT Systems

Georges Lteif

Software Engineer

Last Updated on April 21, 2022.

1. Overview

Software design and architecture are two very elusive concepts; even Wikipedia’s entries (ref. architecture, design) are somewhat fuzzy and do not clearly distinguish between the two.

The Agile manifesto’s statement on architecture and design is especially brief and raises more questions than answers.


The best architectures, requirements, and designs emerge from self-organizing teams.

The most common definition of software architecture is: those aspects of a system that are important and hard to change. As for software design, the standard definition is: selecting the correct elements or components for the solution.

But what are those aspects and elements? What criteria have been used to rank their importance or appropriateness? Is everything that is hard to change (the programming language, for example) part of the software’s architecture?

Most sources and experts are silent on these issues.

If we cannot clearly define the terms we use and what they mean (poor ontology and semantics), we are unlikely to examine them properly, let alone optimize and maintain them in the best possible manner.

This brings us to the objective of this article: to define and explore what solution architecture means and to compare and contrast it with solution design.

The ideas presented here are inspired by a series of lectures on the fundamentals of Systems Engineering on MIT OpenCourseWare. I believe they have the required rigour and universality to form a solid foundation for a discussion on architecture in the world of software.



3. Challenges of Defining Solution Architecture

Why are these two concepts (solution architecture and design) so elusive? There are two main reasons I can think of.

  • The first is that architecture and design fade away at small scales and low levels of complexity. In that context, the technical decisions that define them are trivial, and the two become indistinguishable from one another.

When you think of scale, think of the Great Wall of China or the Great Pyramids of Giza — two massive but otherwise homogeneous structures. On the other hand, living organisms are the best and probably most extreme example when you think of complexity.

If you inspect a simple program that captures information from the user via the command-line interface and stores it in a file, there are very few concepts that you need to invent or design choices you need to make. It is hard to surface and expose such a program’s architectural or design traits.
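
To make this concrete, here is a minimal sketch of such a program (the file name and fields are purely illustrative); there is essentially nothing in it that one could call architecture:

```python
# A minimal command-line program: prompt the user, append the answers to a file.
# At this scale there are almost no architectural decisions to make.

def main() -> None:
    name = input("Name: ")
    email = input("Email: ")
    # The only real "design choice" here is the storage format (plain text).
    with open("contacts.txt", "a", encoding="utf-8") as f:
        f.write(f"{name},{email}\n")

if __name__ == "__main__":
    main()
```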

  • The second reason is the maturity of many of the concepts of Information Technology and their implementations.

Let’s say you have a requirement to persist structured information. Your first thought would undoubtedly be to use a relational database that supports standard SQL. Your design choices will consist solely of selecting the brand and perhaps a few other parameters like the server size on which the database will be hosted.

Minimal forethought would need to be given to the architecture or design of the database engine itself. This concept (the reliable and easily accessible information store) and its implementation (the relational database) were invented long ago and are now usable off the shelf.
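
As a sketch, assuming SQLite is the (arbitrary) brand choice, the entire "design" boils down to picking the engine and a schema; the engine's internals never enter the picture:

```python
import sqlite3

# The concept (a reliable, queryable information store) comes off the shelf;
# our design work is reduced to selecting the engine and defining a schema.
conn = sqlite3.connect("orders.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, client TEXT, amount REAL)"
)
conn.execute("INSERT INTO orders (client, amount) VALUES (?, ?)", ("ACME", 99.5))
conn.commit()
for row in conn.execute("SELECT id, client, amount FROM orders"):
    print(row)
conn.close()
```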


Small-scale solutions are still ubiquitous today, but the norm for big enterprises is large and complex projects. The significance of software architecture and design becomes apparent under these conditions.

Performance, functionality, quality, constraints (typically on schedule and budget), or any other stakeholder requirements bring architecture and design to the forefront.

To have a clear idea of what solution architecture is, we need to explain concept creation and selection. A prerequisite to that is a close familiarity with the generation and delivery of business value.


4. Creating Business Value

In a previous article on engineering superior production processes, we divided the sources of business value into two broad categories, each describing a condition under which a product has value.

The below diagram illustrates the value creation process.

Creating Business Value

Naturally, everything starts with stakeholders having business needs. These can be vague and ambiguous, and for any project to succeed, well-defined and very specific objectives must be derived from them.

Creating concrete attainable goals from (nebulous) business needs is called Value Identification.

Now that we have clear objectives to work on, delivery processes will be set in motion to create artefacts that consume raw information and produce services. At the artefact’s core are transformation processes that act on the input to produce an output that has some utility (value) to the stakeholders.

This series of steps is known as Value Delivery.

Information and IT

We have used raw information (such as client orders) instead of something more generic (like raw material which encompasses everything including information) simply because this website is about software and the existential purpose of software programs is to process information.

Similarly, and for the same reasons, we used services (such as order preparation, payment, and delivery), although goods and services would have worked equally well.

Finally, a Value Proposition indicates to prospective clients why they should use your services instead of your competitor’s.

The significance of your value proposition is determined by two factors:

  • Your solution’s capacity to resolve all your customer’s issues.

Now that we have some of the essential tools in place, we will examine a concrete (and ancient) example of an engineering problem that requires architecting a decent solution.

This example will provide a pathway to understanding concept creation and selection before finally taking a deep dive into solution architecture.


5. Case Study: Architecting a Communications System

5.1 Problem Definition

As a solution architect or systems engineer, imagine that you have to solve the age-old problem of information transmission across large distances.

The problem details are as follows: a centrally-located command and control centre requires near-real-time intelligence from geographically remote monitoring outposts.

Your job is to design a fast, reliable, and secure communications system within two years and for a budget of a few million dollars.

Below is an executive summary of the task.

  • Business need:
    • An organisation requires the implementation of a fast, secure, and reliable intelligence transmission system from outposts in remote areas to a central server in headquarters.
  • Stakeholders:
    • The organisation’s clients, competitors, suppliers, owners, and employees can all influence the organisation’s progress and, as such, are considered its stakeholders.
  • Project constraints are:
    • A two-year timeframe to launch
    • A total budget of $5M
  • Requirements list: the main highlights of such a system are:
    • The communication channels between the monitoring stations and the headquarters provide near-real-time, encrypted, and reliable transfer of valuable data.
    • The addition of new monitoring stations does not result in system changes in other stations or headquarters.
    • The system operates reliably under different weather conditions with 99.99% data accuracy.
    • The operations team requires no more than basic IT engineering skills to operate the system.
    • The accuracy of the information relayed to the centre should be at least 90%.
  • Value generation and delivery occur when:
    • The system satisfies the stakeholders’ needs.
    • The system in place is of high quality and low cost, and is safe, easy to operate, and easy to maintain.
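
One way to capture this executive summary as structured data is sketched below. The split between shall (mandatory) and should (nice-to-have) statements is an assumption made here purely for illustration; it anticipates the concept-selection discussion later in the article:

```python
# Hypothetical structured capture of the case study's requirements.
# Which requirements are "shall" vs "should" is an assumption for illustration.
requirements = {
    "shall": [
        "Near-real-time, encrypted, and reliable transfer of data to headquarters",
        "Adding a monitoring station requires no changes to other stations or headquarters",
        "Operates reliably under different weather conditions with 99.99% data accuracy",
    ],
    "should": [
        "Operable by staff with basic IT engineering skills",
        "Information relayed to the centre is at least 90% accurate",
    ],
    "constraints": {"timeframe_years": 2, "budget_usd": 5_000_000},
}
```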

5.2 Creating an Architectural Model

Having established some key requirements of the communications system we want to design, we can now model the system (using a systems modelling language like OPM) as follows:

Engineering a Communications Solution

The diagram contains entities, states, functions, and forms which we describe below:

  • Entities are objects that constitute the system’s building blocks.
    • The first entity we are concerned with is Valuable Information or Intelligence.
    • The second entity is a property of the Valuable Information object, which is its Availability to the headquarters.
  • States are the specific configurations that a property of an entity can take.
    • Intelligence can be in one of two states: 1) unknown or not yet available, or 2) available.
  • Functions applied to entities can transform them from one state to another.
    • Information Availing is a function that transfers intelligence between the two states mentioned above. We can identify three different instances of this function: transport, mirroring, and forecasting.
    • Transport is the traditional method of communication where data is physically transported via a medium like radio waves.
    • Mirroring is the transportation of entangled qubits to form quantum communication networks.
    • Forecasting is another method for “transporting” information from the future rather than from other locations, using mathematical models and historical data.
  • Forms are a particular type of entity, differing from the ones mentioned earlier in that these are concrete systems while the main ones (Valuable Information and Availability) are abstractions. Forms are physical systems that deliver the functionality in question.
    • Traditional communication systems are based on radio links (wires, antennas, and satellite links); this is the most mature and well-known of the three options.
    • Quantum networks involve quantum processors and qubits. However, at the time of writing, their feasibility as practical solutions that can be deployed and operated in the field is questionable.
    • Mathematical models use machine learning or statistical models to predict and forecast future events. Depending on the model’s prediction power, this solution can be viable.

Entities, states, functions, and forms allow us to conceptually represent what the system should do, and how it would do it.
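
The model can also be roughly encoded as data structures. The sketch below follows the entities, states, functions, and forms of the case study; the class definitions themselves are purely illustrative:

```python
from dataclasses import dataclass

# A rough encoding of the architectural model: an operand (entity) with states,
# a function that moves it between states, and candidate forms that deliver it.

@dataclass
class Entity:
    name: str
    states: list[str]

@dataclass
class Function:
    name: str              # e.g. "Information Availing"
    operand: Entity
    from_state: str
    to_state: str
    instances: list[str]   # e.g. transport, mirroring, forecasting

@dataclass
class Form:
    name: str              # e.g. "Radio links"
    delivers: str          # which function instance it embodies

intelligence = Entity("Valuable Information", ["unknown", "available"])
availing = Function("Information Availing", intelligence, "unknown", "available",
                    ["transport", "mirroring", "forecasting"])
forms = [Form("Radio links", "transport"),
         Form("Quantum network", "mirroring"),
         Form("Mathematical models", "forecasting")]
```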

In the next paragraphs, we will use this example as our reference to explain how architecture is created.


6. Solution Architecture

6.1 The Two Stages of Architecture

Solution architecture consists of two stages, concept creation and concept selection.

  1. Concept creation is the process of generating potential solutions and determining what works and what doesn’t. The architectural model presented in the previous paragraph is the outcome of a concept creation activity.
  2. Concept Selection: When looking at potential solutions, technical feasibility along with a score of other attributes is examined, and solutions that are not acceptable are discarded. There are a few things to consider when deciding whether to adopt or discard a specific solution:
    1. Constraints such as physical dimensions (size, weight, volume), or budget and timeframe
    2. Quality such as reliability, maintainability, safety, scalability and many others.
    3. Performance such as throughput, uptime, and capacity.
    4. Operational Requirements such as the skills required of operations staff and how much automation is involved vs manual operation.
    5. Side-effects such as environmental pollution or waste.

6.2 Concept Creation

If we take a closer look at the case study on the communications system, every combination of function, major form, and specialized form is a potentially distinct solution to our problem.

For example, Transport/Radio Links/Satellite Links is one solution, while Forecasting/Mathematical Models/AI is another.

These combinations of function, form, and specialization are called concepts.


Concept: a system vision that embodies working principles and a mapping from function to form.
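
A minimal sketch of this definition, assuming we represent a concept as a function/major-form/specialization triple taken from the case study:

```python
from typing import NamedTuple

# A concept as a simple function-to-form mapping, following the definition above.
# The specific triples come from the case study; the structure is illustrative.
class Concept(NamedTuple):
    function: str
    major_form: str
    specialization: str

candidates = [
    Concept("Transport", "Radio links", "Satellite links"),
    Concept("Mirroring", "Quantum network", "Entangled qubits"),
    Concept("Forecasting", "Mathematical models", "AI"),
]
```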

The process that generates those solutions is referred to as Concept Creation and consists of the following steps (refer to the case study on the communications system):

  • First, an operand is identified (Intelligence). This entity is what the stakeholders care about as it offers a certain utility in its final and processed form; it is the source from which business value can be created.
  • Next, a suitable operation (availing the intelligence to the desired party) is located to move the operand between its different states. This transformation of the operand produces value.
  • Then, a major form is specified to supply the operation. Radio waves and radio links are examples of a major form. By modulating radio waves at the sender and demodulating them at the receiver, information is moved across distant locations and becomes available for storage and analysis.
  • Finally, the supporting functions and secondary forms are identified. This procedure is called logical decomposition, an important topic which we will come to later. Here, minor forms are derived from the major form as specialized instances. Examples are antennas, satellites, and wireless routers, all of which use the physics behind radio waves to produce radio communication. Supporting functions are derived from the main function.

6.3 Logical Decomposition

The NASA Systems Engineering Handbook describes an interesting process for approaching design problems called Logical Decomposition. The process is heavily oriented towards systems engineering and had to be adapted to suit software design problems, but the essence remains the same.

On a high level, the process consists of two major stages that are iteratively applied to produce an optimal solution.

In the first stage, high-level requirements are established, from which a high-level design is produced.

In the next stage, specific technical choices are made, requiring the gathering of additional, albeit low-level, requirements. This time, a low-level design is produced from these requirements.

The low-level design is validated to ensure it satisfies both levels of requirements.

Logical Decomposition — NASA Systems Engineering Handbook

The details of each step are as follows:

  • Step 1: High-level requirements are gathered from the stakeholders and established.
  • Step 2: A High-level Solution Design is produced from these requirements.
  • Step 3: Logical Decomposition is now performed. The functions and forms of the High-level Solution Design are decomposed, on one of the following bases, into smaller, more specialized components and subfunctions:
    • Logical Flow
    • Functional Flow
    • Data Flow
    • Behavioural
    • States and modes
  • Depending on the system’s complexity, there may be one or more decomposition levels.
  • Step 4: Now that technical design choices have been made, the second round of requirements gathering needs to be completed. Here, the requirements are low-level and delve into the specific details of the subfunctions and specialised forms.
  • Step 5: A failure to produce a working design might trigger a revision of the low-level requirements. Perhaps they need to be relaxed or even fundamentally changed.
  • Step 6: The design is then assessed on a functional basis. If it satisfies all stakeholder requirements, follows industry best practices, and passes regulatory mandates, it is adopted. Otherwise, the high-level design is altered and the cycle starts again.
  • Step 7: A failure to produce a working design that passes functional tests might prompt a revision of the high-level requirements. A new concept might be tried.

The central theme of this approach is the iterative process that cycles through the alternative solutions until an optimal one is produced. It is also acceptable to simultaneously test out variations of the low-level design (through simulations or POCs, for example) to identify the best combination.
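
The loop can be rendered schematically as code. The sketch below is a toy version: the design variants, requirement figures, and check functions are all invented to make the control flow concrete and are not part of the case study:

```python
# A toy, runnable rendering of the iterative loop: enumerate design variants for
# the chosen concept and keep the first one that passes both the low-level check
# (Step 5) and the functional assessment against high-level requirements (Step 6).

HIGH_LEVEL_REQS = {"latency_ms": 500, "availability": 0.9999}

DESIGN_VARIANTS = [  # hypothetical low-level designs for the "radio links" concept
    {"name": "single ground relay", "latency_ms": 900, "availability": 0.999},
    {"name": "satellite uplink",    "latency_ms": 400, "availability": 0.9999},
]

def meets_low_level_checks(design):       # Step 5: is the design workable at all?
    return design["availability"] >= 0.999

def meets_functional_reqs(design, reqs):  # Step 6: does it satisfy the high-level requirements?
    return (design["latency_ms"] <= reqs["latency_ms"]
            and design["availability"] >= reqs["availability"])

chosen = next(
    (d for d in DESIGN_VARIANTS
     if meets_low_level_checks(d) and meets_functional_reqs(d, HIGH_LEVEL_REQS)),
    None,
)
print(chosen["name"] if chosen else "revise the high-level requirements (Step 7)")
```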


Before we end this section, a few words on the decomposition process itself (Step 3 above), why it is needed, and what would be an appropriate depth and breadth of its levels.

Decomposition is a method for dealing with the limits of our cognitive abilities. In a famous article published in 1956, George A. Miller argued that our minds can faithfully retain around seven chunks of data on average and make correct absolute judgments on problems with around seven elements (like estimating the number of beans on a floor).

If you have a system composed of many components, you can organize them logically or functionally in a hierarchy where any level in this hierarchy contains around seven elements.

Another straightforward application that I find very useful is keeping a PowerPoint slide to no more than seven lines, with each sentence around seven words.

6.4 Form-Function Mapping

Function-form mapping is an essential step in any design exercise. Here are its fundamental ideas:

  • A usable design should be detailed enough that we can validate its capacity to satisfy the business needs. The uncovered details remove ambiguity and ensure that the setup is sound.
  • Logical decomposition breaks down all primary forms and functions into secondary forms and supporting functions. There can be multiple levels of decomposition depending on the size of the solution.
  • A mapping of function to form is then performed. Because this mapping may not be one-to-one, high interconnectivity and second-order responses can generate a lot of complexity; modifying the system to add or update functionality then becomes challenging.
  • This complexity is exacerbated further when the various form-to-form connections are considered as well.

The below diagram takes us back to the communications system case study.

Form-Function Mapping

On the left-hand side, we can see how the main Information Availing function is decomposed into a primary part responsible for generating forecasts, while secondary supporting functions cover the administrative side of the system: user setup and maintenance; data collection, processing, and storage; and reporting.

On the right-hand side, we have the submodules of the system. At its core is the statistical model responsible for generating forecasts, while other modules such as Active Directory provide user access and security functions.
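
A rough sketch of this mapping follows. The statistical model and Active Directory come from the example above; the database, ingestion service, and reporting module are hypothetical forms added to show a mapping that is not one-to-one:

```python
from collections import Counter

# Function-to-form mapping for the forecasting concept. Some forms serve several
# subfunctions, which is exactly where coupling and complexity creep in.
mapping = {
    "generate forecasts":          ["statistical model"],
    "user setup and maintenance":  ["Active Directory"],
    "data collection and storage": ["database", "ingestion service"],
    "reporting":                   ["reporting module", "database"],
}

# Forms that appear under more than one subfunction are shared components;
# changing them can ripple across otherwise unrelated functionality.
usage_counts = Counter(form for forms in mapping.values() for form in forms)
shared = [form for form, count in usage_counts.items() if count > 1]
print(shared)  # -> ['database']
```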

6.5 Architecture and Complexity

The subfunctions and secondary forms must be numerous if rich functionality is desired; otherwise, the product will offer only the bare essentials.

Client tastes have become increasingly sophisticated, and decision-making has moved from technical to business people. Products must offer a wide gamut of supporting functionality to compete.

Complex systems with many parts and plenty of functionality can be costly. Optimizing the design is crucial to keep the costs down, but this cost-saving exercise has its price.

Design optimization would result in high interconnectivity between functions and forms to decrease the overall number of parts and make the best use of existing functionality by leveraging reusable components and processes.

The increased interconnectivity, coupled with a large number of components, may create second-order, self-reinforcing feedback loops. These emergent properties of the system make its behaviour hard to predict under unfamiliar circumstances.

Optimization brings complexity which leads to fragility, and the system architecture’s job is to manage that complexity.

The relationship between complexity, optimization, and a highly-connected system with a large number of parts is not restricted to software systems but can also be observed in the global economy and political system (ref. Antifragile by Nassim Taleb).

The remarkably efficient global communication and transport system that has emerged in the last couple of decades has incredibly optimized the world economy.

The major producers of primary goods such as gas, wheat, and computer chips have grown larger and larger until only a handful of them have come to dominate the world market.

This optimization of resources and capacity has resulted in cheaper and better goods at the cost of increasing fragility.

6.6 Concept Selection

Concept selection is the application of decision analysis to select an optimal solution from a candidate list. Below are some of its key ideas:

  • The stakeholder requirements have been divided into mandatory (in the form of shall statements) and nice-to-have (the should statements). The group of compulsory requirements allows us to discard solutions that do not fit early on. On the other hand, solutions that satisfy a range or subset of optional constraints present viable alternatives. Concept selection tries to identify the optimal alternative to be adopted.
  • Selecting the best candidate is not a trivial problem, and many tools have been created to help solve this issue. Some examples are the Pugh matrix, utility functions, and the Six Thinking Hats. However, the general approach is simple and consists of selecting a representative set of criteria against which the different alternatives can be compared (see the sketch below). Judgment biases are inevitable, but methods for taming them exist.
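
The sketch below shows a minimal weighted-scoring approach in the spirit of a Pugh matrix or utility function; the criteria, weights, and scores are invented for illustration only:

```python
# Weighted scoring of the candidate concepts. Weights and scores are made up.
criteria = {"feasibility": 0.4, "cost": 0.3, "reliability": 0.2, "operability": 0.1}

concepts = {
    "Transport / Radio links":           {"feasibility": 9, "cost": 6, "reliability": 8, "operability": 7},
    "Mirroring / Quantum network":       {"feasibility": 2, "cost": 2, "reliability": 5, "operability": 3},
    "Forecasting / Mathematical models": {"feasibility": 6, "cost": 8, "reliability": 5, "operability": 6},
}

def utility(scores: dict[str, float]) -> float:
    # Weighted sum of the criterion scores.
    return sum(criteria[c] * scores[c] for c in criteria)

for name in sorted(concepts, key=lambda n: utility(concepts[n]), reverse=True):
    print(f"{name}: {utility(concepts[name]):.1f}")
```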

Any selection method you use should satisfy the following constraints to be usable:

  • Robustness: the results are repeatable and minor variations in the selection criteria or decision-making groups do not radically alter the outcome.
  • Accuracy and reliability: the solution space has been fully explored, and no great solution has gone unnoticed. The method must also allow two weak solutions to be combined and assessed should they produce a stronger candidate.

7. Architecture vs Design

There is often the question of how architecture and design are related and whether or not they mean the same thing. While there is considerable overlap between solution architecture and design, they are two different things.

As we have seen, architecture creates and selects the conceptual models of the solution before moving to decomposition, form-to-function mapping, and establishing the system’s key design variables and operating parameters.

On the other hand, design instantiates the concept by selecting the correct values for the design parameters and optimizing the final selection.

In the communications system example, the architect selects the concept of radio communications while the design engineers determine the antenna’s dimensions, weight, materials, power, mode of transport, assembly, and geographical distribution.
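
One way to picture the split, as a hedged sketch: the architectural decision fixes the concept, while design fills in the parameter values. The parameter names below are illustrative only:

```python
from dataclasses import dataclass

# Architecture fixes the concept; design fixes the values of its parameters.
ARCHITECTURAL_DECISION = "radio communications"   # chosen during concept selection

@dataclass
class AntennaDesign:          # design-level parameters for that concept
    height_m: float
    power_w: float
    material: str

design = AntennaDesign(height_m=12.0, power_w=250.0, material="aluminium")
print(ARCHITECTURAL_DECISION, design)
```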


8. Final Words

A few decades ago, architecture was simple enough. Solutions consisted of a few applications, and these were faithfully modelled in the traditional Model-View-Controller blueprint.

This architecture can now be considered severely outdated; any modern solution has to make room for single sign-on, analytics, automated testing, and continuous delivery at a minimum. Modern architecture needs to allow for that.

Architecture also involves intensive decision-making activities and can significantly impact influential players, making it more crucial to have a decision-making framework that helps avoid emotionally-driven choices with adverse strategic consequences.

Making the right decisions early on can avoid significant redesign efforts at the later stages of any implementation.

It is sometimes tempting to give in to management pressure to reduce design efforts for the sake of cost or time. Likewise, pressure can be applied by experts who happen to be influential people in the organisation to force specific views.

Therefore, sound architecture also relies on striking a balance between what’s practical and what works. We hope the ideas in this article help you find that balance.


