This post is part of the Testing Series, an overview of testing strategies for Elixir applications.

Poor testing has a direct impact on culture, leading to a decline in professional standards, a shift towards expedient solutions, and overall disengagement among team members. This situation is often exacerbated by a leadership gap, where leaders who foster poor testing practices cannot fully grasp the root causes of project failures, thereby increasing workplace stress and anxiety.

  • Impact on Professional Integrity: The erosion of standards leads to compromises as deadline pressures increase, generating dissatisfaction and a loss of pride.

  • Cultural Shift Towards Short-Term Fixes: Poor testing perpetuates a cycle of quick fixes that fail to address underlying issues, leading to recurring regressions and patchwork solutions.

  • Overall Disengagement: When individuals see the same issues recurring without resolution, their engagement, their commitment to the organization, and their enthusiasm for the project's goals wane.

  • Persistent Anxiety and Stress: Unstable software leads to a state of anxiety, with team members worrying about the repercussions of undetected errors and being perpetually unprepared for system failures.

  • Decreased Trust and Cohesion: Skepticism bred by recurrent issues erodes trust within the team, fracturing team unity, decreasing collaboration, and leading to a workplace where individuals feel isolated or unsupported.

  • Avoidance of Innovation: A risk-averse atmosphere, stemming from the fear of introducing new errors, leads individuals to avoid potentially beneficial changes.

  • Heightened Frustration: Regular disruptions caused by inadequate testing lead to a reactive work culture dominated by frustration, overshadowing the fulfillment derived from creative and constructive work.

Naive leaders often react to a stagnating project by insisting on accelerating the pace of output, inadvertently reinforcing poor testing practices. This approach increases the frequency of bugs and system failures, and intensifies the spiral of anxiety, frustration, and disengagement among team members.

Below are some of my takeaways for implementing good testing practices in a Phoenix/Elixir project.

Layered Testing

Test at multiple levels to ensure business rules are enforced consistently across all stages of the application:

  • Changeset Validation validates data against predefined schemas and business rules at the foundational level.

  • Database Service Integrity focuses on data protection and integrity to ensure that only validated and correct data is committed to the database.

  • UI Behavior Shaping guides and restricts user actions according to business rules at the interface level, enhancing user experience by preventing invalid operations.
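The first two layers can be sketched in a single test module. This is a minimal, illustrative example, assuming a hypothetical MyApp.Accounts context with a User schema, a unique index on email, and the errors_on/1 helper that Phoenix generates in its DataCase; none of these names come from a real codebase.

```elixir
defmodule MyApp.AccountsTest do
  use MyApp.DataCase, async: true

  alias MyApp.Accounts
  alias MyApp.Accounts.User

  # Layer 1: changeset validation enforces the rule before the database
  # is ever touched.
  test "changeset requires an email" do
    changeset = User.changeset(%User{}, %{email: nil})
    refute changeset.valid?
    assert %{email: ["can't be blank"]} = errors_on(changeset)
  end

  # Layer 2: the database unique index backs the same business rule,
  # surfaced as a changeset error via unique_constraint/2.
  test "duplicate emails are rejected at the database level" do
    {:ok, _user} = Accounts.create_user(%{email: "a@example.com"})
    {:error, changeset} = Accounts.create_user(%{email: "a@example.com"})
    assert %{email: ["has already been taken"]} = errors_on(changeset)
  end
end
```

The third layer, UI behavior, is exercised separately through LiveView or controller tests, asserting that the interface refuses to submit invalid data in the first place.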

Taking Responsibility

Testing is not just about confirming whether something works; it is about safeguarding the assumptions embedded in the code. Tests validate the code, but more importantly, they prevent regressions, acting as a critical component in maintaining software integrity over time.
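A concrete way to see this: a test can pin down an assumption rather than merely exercise a function. The sketch below uses a hypothetical Slug module (invented for illustration) and runs standalone with `elixir slug_test.exs`.

```elixir
ExUnit.start()

defmodule Slug do
  # Hypothetical helper; the assumption worth safeguarding is that
  # slugs never contain spaces or uppercase letters, because URLs
  # elsewhere in the system are built from them.
  def slugify(title) do
    title
    |> String.downcase()
    |> String.replace(~r/\s+/, "-")
  end
end

defmodule SlugTest do
  use ExUnit.Case, async: true

  # This test does more than confirm slugify/1 "works": it encodes the
  # assumption that generated URLs stay stable. If a refactor breaks
  # that assumption, this test fails before production does.
  test "slugs are lowercase and hyphen-separated" do
    assert Slug.slugify("Hello  Elixir World") == "hello-elixir-world"
  end
end
```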

Using Metaprogramming

Leverage Elixir’s metaprogramming capabilities to create a domain-specific language (DSL) tailored specifically for testing, organizing and encapsulating the best practices for testing the application.

  • DataCase: Integrates Factory and Ecto for handling database interactions.
  • ConnCase: Builds on DataCase, encapsulating best practices for testing Phoenix LiveViews and other web interactions.
  • ModelCase: Validates the application’s data structures and integrity constraints, including logic for testing schemas and changesets.
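The mechanism behind these case modules is ExUnit.CaseTemplate, which Phoenix's own generators use: the `using` block is a macro that injects a curated set of aliases, imports, and helpers into every test that opts in. A sketch of a DataCase in this style, assuming a hypothetical MyApp application and a MyApp.Factory module (e.g., built on ExMachina):

```elixir
defmodule MyApp.DataCase do
  use ExUnit.CaseTemplate

  # Everything quoted here is compiled into each test module that
  # declares `use MyApp.DataCase` -- the DSL's single entry point.
  using do
    quote do
      alias MyApp.Repo

      import Ecto
      import Ecto.Changeset
      import Ecto.Query
      import MyApp.DataCase
      import MyApp.Factory
    end
  end

  # Each test gets its own sandboxed database connection, rolled back
  # when the test exits, so tests stay isolated and can run async.
  setup tags do
    pid = Ecto.Adapters.SQL.Sandbox.start_owner!(MyApp.Repo, shared: not tags[:async])
    on_exit(fn -> Ecto.Adapters.SQL.Sandbox.stop_owner(pid) end)
    :ok
  end
end
```

With this in place, a test file needs only `use MyApp.DataCase, async: true` to inherit the whole setup, keeping the best practices in one module instead of copy-pasted boilerplate.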

Focusing on DAMP (Descriptive And Meaningful Phrases)

Testing inherently involves some redundancy, some of which should be abstracted into the DSL. The tests themselves, however, need to be self-contained and prioritize readability. Writing tests that are both descriptive and meaningful ensures future developers can quickly grasp what is being tested and why, without diving into the codebase or external documentation.
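In practice, DAMP mostly means spending words on `describe` and `test` names. The example below uses an invented Cart module purely to contrast a descriptive name with a generic one; it runs standalone with `elixir cart_test.exs`.

```elixir
ExUnit.start()

defmodule Cart do
  # Hypothetical pricing rule: orders of ten or more items get 10% off.
  def total(unit_price, quantity) when quantity >= 10,
    do: unit_price * quantity * 0.9

  def total(unit_price, quantity), do: unit_price * quantity
end

defmodule CartTest do
  use ExUnit.Case, async: true

  # DAMP: the test names state the business rule in full sentences,
  # not "test discount 1" / "test discount 2".
  describe "total/2" do
    test "orders of ten or more items receive a 10% bulk discount" do
      assert Cart.total(100, 10) == 900.0
    end

    test "orders below ten items pay full price" do
      assert Cart.total(100, 9) == 900
    end
  end
end
```

A failing run then reads as a sentence about a broken business rule, which is exactly the documentation future developers need.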

Avoiding Fragile Tests

Creating resilient tests that focus on functionality rather than superficial or non-functional aspects of the application is crucial.

  • Focus on Behavior, Not Implementation: Opt for black-box testing methods that concentrate on inputs and expected outputs.

  • Do Not Test Cosmetic Changes: Avoid tests for UI elements that don’t impact functionality.

  • Use Abstraction Wisely: Abstract implementation details into well-defined fixtures or helper functions to maintain test clarity and relevance.
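These three points can be seen in one LiveView test. The sketch below assumes a hypothetical counter LiveView mounted at /counter in a MyAppWeb application; the route, module names, and button label are all illustrative.

```elixir
defmodule MyAppWeb.CounterLiveTest do
  use MyAppWeb.ConnCase, async: true

  import Phoenix.LiveViewTest

  # Brittle alternative (avoid): asserting on markup and CSS classes
  # couples the test to cosmetic details that change for styling reasons:
  #   assert html =~ ~s(<span class="text-xl font-bold">1</span>)

  test "clicking Increment raises the displayed count", %{conn: conn} do
    {:ok, view, _html} = live(conn, "/counter")

    # Behavior-focused: drive the UI through its events and assert on
    # the user-visible outcome, not the markup around it.
    assert view
           |> element("button", "Increment")
           |> render_click() =~ "Count: 1"
  end
end
```

The `element/3` and `render_click/1` helpers keep the interaction at the level of "what the user does and sees", so a redesign of the button's styling leaves the test green while a genuine behavioral regression still fails it.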

By adhering to these principles, tests will be more robust and maintainable, ensuring that the testing suite remains an asset as code evolves and scales.