Patterns - Clean Architecture



In the ever-evolving world of software development, maintaining code quality and structure is paramount. The human brain can only juggle a few paradigms at a time to handle complexity effectively, and in the haste to launch products quickly, this is easy to overlook. However, even in the competitive landscape of project building, where speed to market is critical, one could argue that long-term success hinges on the stability, depth and richness of product features. This depth brings complexity, and with it, the necessity for maintenance.

Thus, one of the most critical aspects of any software development project is ensuring the maintainability of the codebase and enhancing the developer experience around it. Once the initial excitement of developing a proof of concept, prototype, or minimum viable product subsides, the software should be straightforward to adjust and maintain.

To use a naive analogy, constructing any software product is like building a house. It is not just about creating strong walls; over time, these walls will face more stress and load than initially planned, and eventually their core foundation may become outdated (changes of technology, construction paradigms, security issues, etc.). An effective construction architecture is therefore modular, allowing any worker to easily refurbish, replace, or rebuild any component without significant cost. This adaptability ensures the system can handle increasing demands and evolving requirements efficiently.

In the same vein, a well-structured codebase ensures robustness, scalability, and ease of maintenance, allowing new features to be integrated seamlessly without the fear of inducing feature regressions, and keeping the boundaries of the system's various constructs very clear.

The market is replete with projects that began with great promise but quickly lost their competitive edge. Poorly designed codebases lead to technical debt, making it difficult to implement new features and steering leaders away from visionary activities toward financing ones, just to pay for maintenance, bug fixes, and the onboarding of new developers. The result is slower development cycles, a lack of essential or timely features, reduced funding efficiency (amount invested vs. outcome, eventually driving investors away altogether), waning developer interest, and ultimately the project's failure in the face of more resilient competition.


The Clean Architecture paradigm is a concept popularized by Robert C. Martin in the software development ecosystem. It is really a mix of age-old architectural principles found across and within various industries, under various names. It is a development pattern that modularizes software through distinct layers, each with a scoped responsibility. It emphasizes focusing on the project's core objectives and integrating domain concepts into the technical realm, all to make software easier to understand, develop, test and maintain, and more resilient and adaptable. In short, it is an attempt to make the developer's life easier and the construct tameable over time, which ultimately means cheaper projects over their lifetime.

These approaches to system design are often overlooked due to product lifecycle pressures, short-term cost-cutting measures, and the allure of new tools and technologies. Developers readily brag about wielding the latest hammers: web frameworks like Angular or React, which shaped the front-end development paradigm of the 2010s, DeFi rollup protocols, and more. And with reason: the "React Days" and other community-based events of the kind bring real hype to the development space and make one employable. These amazing tools tend to offer broad usage spectrums and come with their own practice spaces (e.g. "this is the way you do it with Angular"), where one easily forgets the basis of development culture, complexity management, and instead leans on the framework's tooling, understandably, in the face of the ever-growing complexity of software development.

The goal is to keep the core of the application independent of specific frameworks, user interfaces, databases, and external systems.

The modular approach to managing complexity pushes this (relational) complexity out of the system design and into independent layers, where each layer of the architecture:

  • has a life of its own (no adhesion)
  • showcases very clear responsibilities and boundaries
  • exposes an official interaction surface (modular means segregation, which means communication)
  • importantly, is loosely coupled to any particular tech stack subject to constant evolution (which translates to effective instability, e.g. a framework API on a short release cycle, evolving every 6 months).

Hence the line of conduct is as follows:

  1. Build the core functionality of the application with stable tools, namely the programming language and algorithms.
  2. Ensure that any external systems and tools, effectively beyond the team's control, are kept plug-and-play (such as UI frameworks, dependencies, and database technologies).

This approach allows your team to maintain control and flexibility, adapting to changes without significant disruption.
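As a minimal sketch of that plug-and-play posture (all names here are illustrative, not from any specific codebase), the core can own a small contract that interchangeable adapters implement:

```typescript
// The core owns this contract; infrastructure adapters implement it.
interface UserStore {
  save(id: string, name: string): void;
  find(id: string): string | undefined;
}

// One interchangeable adapter: an in-memory store (fine for tests or prototypes).
class InMemoryUserStore implements UserStore {
  private users = new Map<string, string>();
  save(id: string, name: string): void { this.users.set(id, name); }
  find(id: string): string | undefined { return this.users.get(id); }
}

// The core logic depends only on the contract, so the storage
// technology stays plug-and-play.
class RegisterUser {
  constructor(private store: UserStore) {}
  execute(id: string, name: string): void {
    if (!name.trim()) throw new Error("a name is required");
    this.store.save(id, name);
  }
}
```

Swapping the in-memory store for a REST-backed or IndexedDB-backed one simply means adding another `UserStore` implementation; `RegisterUser` never changes.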

While the above may sound like an exciting motto, let's discuss the core principles of the pattern, explore its most common layered structure, and see a practical approach to front-end software design.

The principles

This modularization pattern is grounded in several core principles that ensure the longevity, maintainability, and flexibility of software systems. These principles help create a robust architecture that stands the test of time and adapts to evolving requirements. Here are the key principles:

Independent of the User Interface (UI) One of the fundamental tenets of Clean Architecture is that the business logic of an application should not depend on its user interface. By decoupling the UI from the core logic, you can easily change or update the user interface without affecting the underlying business rules. This separation allows multiple user interfaces (e.g. web, mobile, desktop, or even cross-application setups such as micro front-ends) to interact with the same core logic, enhancing the software's versatility and adaptability.
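To illustrate with a deliberately tiny, hypothetical example: the same core logic can feed several renderers without knowing about any of them.

```typescript
// Core logic: knows nothing about how its result will be displayed.
function greeting(name: string): string {
  return `Hello, ${name}!`;
}

// Two independent "UIs" consuming the same core logic.
function renderAsText(name: string): string {
  return greeting(name);             // e.g. a terminal or mobile client
}
function renderAsHtml(name: string): string {
  return `<p>${greeting(name)}</p>`; // e.g. a web client
}
```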

Independence from Frameworks The pattern promotes independence from specific frameworks. Frameworks come and go, and they often evolve rapidly, leading to potential obsolescence or major changes that could impact your codebase and product. I have personally worked with teams with such an extended adhesion surface that a framework version update took a team three weeks to stabilize across multiple products. That is truly expensive for any enterprise. By ensuring that your core logic and business rules are not tightly coupled to any particular framework, you can swap frameworks out or update them with minimal disruption. This principle provides long-term stability and reduces the risk associated with framework dependencies.

Independent of Data Sources The core business logic should also be independent of the database. Data storage and retrieval mechanisms are critical, but they should not dictate the structure or functionality of the core logic. By abstracting database interactions, you can switch databases or modify data schemas in response to any particular constraint without impacting the core application. This abstraction facilitates easier migrations and scalability options, ensuring that your software remains flexible in handling data.

Independent of Any External Agency External systems, such as APIs, libraries, and services, should not dictate the structure of your core business logic; they are merely details enhancing it. Clean Architecture advocates an isolated core that interacts with external systems through well-defined interfaces. This isolation ensures that changes or failures in external systems do not ripple through and destabilize the applicative core. It also enhances security and stability by limiting the exposure of the core logic to external dependencies.

Testability A critical advantage of the pattern is its emphasis on testability. By designing software with clear boundaries and separated concerns, it becomes possible to write unit tests for individual components without requiring the entire system to be operational. This modular approach makes it easier to test business rules, use cases, and other core logic independently, leading to more reliable and maintainable code. Testability ensures that changes can be made confidently, knowing that the tests will catch any regressions or issues.
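A sketch of what this buys you in practice (the repository, SKU and tax rule are invented for the example): a use case can be exercised against a fake data layer, with no database or network in sight.

```typescript
// A narrow contract lets tests substitute a fake for the real data layer.
interface PriceRepository {
  getPrice(sku: string): number;
}

// The use case under test: a pure business rule, no I/O involved.
function totalWithTax(repo: PriceRepository, sku: string, qty: number): number {
  const price = repo.getPrice(sku);
  return Math.round(price * qty * 1.2 * 100) / 100; // hypothetical 20% tax rule
}

// A fake repository: no database or network needed to exercise the rule.
const fakeRepo: PriceRepository = { getPrice: () => 10 };
```

A unit test can now assert `totalWithTax(fakeRepo, "sku-1", 3) === 36` without standing up any infrastructure.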

Best Practices

The approach draws strongly on DDD (Domain-Driven Design), which brings restrictions and good practices around the application's domain code. It also promotes following the S.O.L.I.D. principles (Single Responsibility, Open-Closed, Liskov Substitution, Interface Segregation and Dependency Inversion).
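Of the five, Dependency Inversion is the one this pattern leans on most heavily. A minimal sketch (all names are illustrative):

```typescript
// Dependency Inversion: the high-level policy owns the abstraction...
interface Notifier {
  notify(message: string): void;
}

class OrderService {
  constructor(private notifier: Notifier) {}
  placeOrder(id: string): void {
    // ...business rules would run here...
    this.notifier.notify(`order ${id} placed`);
  }
}

// ...while the low-level detail implements it, not the other way around.
class RecordingNotifier implements Notifier {
  public sent: string[] = [];
  notify(message: string): void { this.sent.push(message); }
}
```

`OrderService` never imports a mailer, a toast library, or a push SDK; those details plug into the `Notifier` abstraction it defines.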

The benefits

Beyond the opportunities mentioned above, the paradigm's most significant advantage is the separation of concerns. It involves dividing a software application into distinct sections, each addressing a separate concern or aspect of the application's functionality. The immediate benefits:

  • Reduced Propagation Cost: This metric measures how costly a change in system code is, that is, the size of the avalanche triggered by a change in code. It describes the proportion of system files that are directly or indirectly connected to each other. The higher the result, the more costly a change to the application, which eventually induces the following point.

  • Improved Maintainability: By isolating different aspects of the application, such as business logic, user interface, and data access, it becomes easier to locate and fix bugs. Maintenance becomes more straightforward because changes in one part of the application do not affect others. This isolation minimizes the risk of introducing errors and regressions when making modifications.

  • Enhanced Scalability: Each layer or module can be developed, tested, and deployed independently, enabling teams to scale the application more efficiently. This modularity supports adding new features or components without disrupting existing functionality.

  • Greater Flexibility: It ensures that the core business logic is decoupled from specific technologies or frameworks. This decoupling provides the flexibility to switch technologies, such as migrating to a new database or adopting a new user interface framework, without significant rewrites of the core application. Adaptability is crucial in a rapidly changing technological landscape.

  • Easier Testing: With clearly defined boundaries and responsibilities, testing becomes more manageable. Unit tests can target individual components or layers in isolation, ensuring that each part works correctly. This directly leads to more reliable and maintainable code, as issues can be identified and resolved early in the development process.

  • Improved Team Collaboration: Pushed to the limit, for inner or cross products, different teams should be able to work on distinct layers or modules simultaneously without stepping on each other's toes, much like microservicing the various parts of an application. This division of labor fosters parallel development and faster delivery of features, which induces the following point.

  • Micro front-ends: The paradigm has garnered significant attention as client applications continue to grow in size and complexity. This approach involves modularizing front-end monoliths, not just into various modules, but into fully-fledged applications themselves. By doing so, front-end development teams can isolate responsibility scopes and enhance collaboration, allowing different teams to work on distinct parts of the front-end architecture, ultimately accelerating the development process and not spreading in-house developers' business know-how too thin. This paradigm is akin to microservices in the backend, where each micro front-end application operates independently but integrates seamlessly with others through well-defined surface contracts, such as APIs. There, thanks to layer segregation, teams across mono- or multi-repos can refer to common notions (e.g. domain entities) and services (e.g. shared utilities in the data or UI layer) to remain consistent and avoid code duplication. For more micro-front-end reading, see here.


Front End Layers Layout

A user interface application connects a user to data. That is the ultimate goal. Any appreciation of how elevated the development practice is comes down to how this is done.

Clean Architecture principles can be effectively applied to front-end development by organizing the code into distinct layers, each with a specific responsibility. Here are the layers we handle on a daily basis across our applications. Note that this list is non-exhaustive and adaptable to specific use cases. From a top-down approach:

The Short version

The User 🧑🏻‍💻

---The Front End---

  • ⬇ the ui layer - rendering the ui
  • ⬇ the presentation layer - logic as to which ui to render
  • ⬇ the domain layer - core front-end business logic
  • ⬇ the data layer - logic handling data sources management
  • ⬇ the gateway layer - logic as to how data is fetched
  • ⬇ the middleware layer - eg. inbound/outbound interceptors, monitors, etc.


---The Internet Network--- 🌐

---The Back End / Micro Service---

  • ⬇ the middleware layer - eg. inbound/outbound interceptors, authenticators, etc.
  • ⬇ the controller layer - API interaction surface
  • ⬇ the domain layer - core back-end business logic
  • ⬇ the data layer - logic handling data sources management
  • ⬇ the gateway layer - drivers to data sources


---The Data---


Or to put it sequentially:

  1. The component asks the presenter what to render,
  2. The presenter knows what data to present, and asks the use cases for it,
  3. The use case knows how to compose the requested data, and consolidates it from one or multiple repositories,
  4. The repository knows where the data is, and uses the gateway as abstract request tooling,
  5. The gateway fetches the data from the source, normalizes it, applies control policies and returns it.
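The five steps above can be sketched end to end. Everything here is illustrative: the gateway is stubbed where a real HTTP call would live, and the names are invented for the example.

```typescript
// 5. Gateway: fetches and normalizes raw data (stubbed data source here).
class UserGateway {
  fetchUser(id: string): { id: string; name: string } {
    return { id, name: "Ada" }; // stand-in for an HTTP call
  }
}

// 4. Repository: knows where the data is, delegates transport to the gateway.
class UserRepository {
  constructor(private gateway: UserGateway) {}
  getUser(id: string) { return this.gateway.fetchUser(id); }
}

// 3. Use case: composes the requested data from one or more repositories.
class GetUserProfile {
  constructor(private repo: UserRepository) {}
  execute(id: string) { return this.repo.getUser(id); }
}

// 2. Presenter: knows what data to present and in which shape.
class UserPresenter {
  constructor(private getProfile: GetUserProfile) {}
  present(id: string): string {
    const user = this.getProfile.execute(id);
    return `Profile of ${user.name}`;
  }
}

// 1. The component asks the presenter what to render.
const presenter = new UserPresenter(
  new GetUserProfile(new UserRepository(new UserGateway()))
);
const view = presenter.present("u1"); // the string a component would render
```

Each class only knows the layer directly beneath it, which is exactly what keeps the avalanche of a change small.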

Or to put it with profiling:

  1. Frameworks: UI Layer
  • Protagonists: Components
  • Data units: DPOs (Data Presentation Objects)
  • Purpose: This layer is responsible for the UI framework used to present an interface to the user. It handles component lifecycles, change detection, and data display (visual libraries, etc).
  • Characteristics: Minimal or framework-only logic; plug-and-play aspects of the framework. Ensures that components are rendered correctly and efficiently.
  • Example: Framework components (e.g. React components or Angular directives) that render some D3JS UI elements and update the view on user interaction and/or data changes.
  2. Presentation Layer
  • Protagonists: Presenters
  • Data units: DPOs (Data Presentation Objects) and PLOs (Presentation Lifecycle Objects).
  • Purpose: This layer contains the logic for UI presentation, deciding when and what should be displayed to the user by the components.
  • Characteristics: The presentation logic manages the state of the UI layer, updating the screens based on user interactions or other events in a logical manner. This involves the application's behavior regarding data fetching states, error management, data manipulation, and more.
  • Example: Presenter classes in TypeScript using PLOs and DPOs to control what the components ingest and display.
  3. Domain Layer
  • Protagonists: Use cases, Interactors
  • Data units: DBOs.
  • Purpose: The layer encapsulates the various business structures/notions used by the application and the core logic managing them, i.e. the business rules translated to the technical realm.
  • Characteristics: Includes DBOs (Data Business Objects, the so-called "Entities") and business know-how logic in individual classes/functions (then called "UseCases") or re-grouped and categorized around a given notion (then called "Interactors"). This layer is ideally pure language, independent from external dependencies, and purely focused on business rules and logic.
  • Example: TypeScript objects representing business entities, and methods that implement applicative use cases such as user authentication or order processing.
  4. Data Layer
  • Protagonists: Repositories
  • Data units: DTOs.
  • Purpose: This layer handles the logic for managing data, including where the data shall live (the various data sources) and its validation.
  • Characteristics: Abstracts data storage, retrieval and validation (e.g. caching policy); provides a consistent interface for data access from the domain layer, handles data conversion (e.g. DTO to DBO), and more.
  • Example: Repository classes that interact with abstract data sources, encapsulating the details of logical data management.
  5. Frameworks: Gateway Layer
  • Protagonists: Gateways
  • Data units: DTOs.
  • Purpose: Acts as an intermediary between the data layer and the network or data sources; abstracts and normalizes data.
  • Characteristics: High framework or API adhesion; a purely technical layer.
  • Example: API clients or middleware that fetch data from multiple sources (e.g. REST APIs, GraphQL endpoints, the browser's LocalStorage API) and transform it into a standard format fed to the data layer.
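As a small illustration of that normalization role (the payload shapes are invented for the example), a gateway can map differently-shaped raw payloads from several sources onto the single DTO shape the data layer expects:

```typescript
// The normalized DTO shape the data layer expects from the gateway.
interface UserDto { id: string; name: string; }

// Raw payloads, as two different sources might return them.
type RestPayload = { user_id: string; user_name: string };
type StoragePayload = { key: string; value: string };

// The gateway hides source differences behind one normalized output.
class NormalizingUserGateway {
  fromRest(raw: RestPayload): UserDto {
    return { id: raw.user_id, name: raw.user_name };
  }
  fromStorage(raw: StoragePayload): UserDto {
    return { id: raw.key, name: raw.value };
  }
}
```

The data layer above it only ever sees `UserDto`, so adding a GraphQL or LocalStorage source never leaks new shapes upward.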

In Schematics:

(schematic representation of the layered front-end architecture)


Done!

Thanks, and congratulations for reading this to the end. We hope this article brings a little clarity to the Clean Architecture pattern and how to use it.