Iteration, or Insanity?

Mar 2, 2026

Exploring the high cost of repeating failure, in the hope of finding success

Executive summary

Innovation in digital channels can easily fall into cycles of "insanity": repeating failed marketing campaigns or AB tests in the hope that external factors were to blame for poor results.

Whilst re-running experiments to validate findings is scientifically sound, repeating them in the hope of finding a positive result from a flat experiment is an ineffective commercial strategy that erodes impact and return on investment.

This article explores why organisations find themselves running insane experimentation programmes, the temptation of writing off poor results due to external factors and how a supposedly “scientific approach” can be used to justify repeating the same tests over and over in the hope of reaching a different outcome.

We explore how organisations must progress from "insanity" to iteration: a mindset of constant evolution and improvement in experimentation, underpinned by unbiased data analytics and customer insight, as opposed to simply repeating past experiments in the hope of reaching a different result.

Ultimately, commercial impact is not achieved by doing the same thing twice and praying for a different result; it is found by ensuring the disciplined evolution of your response to real customer frustration.

The challenge

It is a commonly held notion that the definition of insanity is doing the same thing over and over while expecting a different outcome.

Based on that definition, many digital innovation programmes are insane.

These programmes of activity are characterised by following the same "best practices" and repeating the same experiments and campaigns time and again while expecting, or more precisely hoping for, a different result.

This challenge of insanity is particularly pronounced in both conversion optimisation and marketing investments.

An organisation may re-run a marketing campaign despite poor user engagement and low return on investment, in the hope that poor performance was due not to the campaign itself but to an external factor outside of their control.

In the realm of conversion optimisation, teams may invest heavily in a single AB test, producing a data-backed user experience signed off by half a dozen senior stakeholders. But if the test loses (or worse, fails to have any impact) it is tempting to blame external variables, such as a change in the marketing mix or the wider trading period. As with the marketing example, organisations hope that by re-running the test, sometimes multiple times, they will see a different result and therefore be able to declare a winning experience.

The commercial impact of this approach is significant

In the overwhelming majority of cases the repeated campaign or experience delivers the same result. Not only does this slow commercial impact, but it can also hold back the entire innovation agenda.

The temptation of insanity

From a scientific perspective, the desire to re-run a campaign or customer experience is understandable. External factors do impact digital experimentation programmes and there are legitimate instances where a repeat of previous activity is appropriate, such as an analytical error or a genuine anomaly in the data, or even a significant macro-economic factor.

The difference between iteration and insanity is the purpose of the repeated activity

In iterative programmes of experimentation, the objective is not to obtain a different result, but rather to validate the initial findings. Organisations that repeat experiences simply to validate results rather than change them are always in a better position to adjust and evolve their investments in response to real data and customer insight - because no single experiment has to win, only the experimentation programme must be successful.
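This distinction can be made concrete: a validation re-run asks whether a second sample agrees with the first, not whether the result can be coaxed into significance. Below is a minimal sketch using a standard two-proportion z-test; all conversion figures are illustrative, not from the article.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test comparing conversion rates.

    Returns the absolute difference in rates and the p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b - p_a, p_value

# Illustrative numbers: the original run and a validation re-run.
lift1, p1 = two_proportion_z(480, 10_000, 510, 10_000)  # first run: flat
lift2, p2 = two_proportion_z(465, 10_000, 490, 10_000)  # re-run: also flat
```

If both runs return high p-values, the flat result is validated; the iterative response is to ask why the experience had no impact, not to run it a third time hoping the answer changes.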

By comparison, insanity is tempting because it masquerades as being logical and informed by data but, at its core, amounts to little more than a hope that the first test was somehow wrong.

Simply repeating past experiments is also tempting because it increases testing velocity; after all, re-running a test is still running a test. Many organisations and agencies focus less on the commercial result of their activity and more on its volume: "we want to increase testing to 100 tests per month!"

On paper, this approach makes sense, but in practice it doesn't deliver impact. Organisations find themselves investing heavily in repeating the same ineffective experiments, trying to find winning experiments by any means and shifting their focus away from delivering impact to delivering volume.

The importance of iteration

Putting aside rare technical errors, there is rarely a genuine need to re-run a failed campaign or AB test, but there is always a need to iterate on it.

Imagine you tested an alternative landing page experience backed by data and insight. On paper, the test should have won, as the creative responded to real customer understanding. But in reality, it had no impact.

An iterative mindset accepts the results for what they are, but rather than repeating the experience to achieve a different result, focuses on understanding why the experience didn't have the desired impact, as an avenue to identifying the next step in the experimentation programme.

This deeper analysis might reveal that while the content within the new landing page was of high quality, a lack of clear anchor links or encouragement for the user to engage meant that less than 30% of users ever saw it. But for the users who did see the content, the page drove a 10% uplift in conversion. The failure wasn't the page itself, but the fact that the impactful components were too hard to find.

This data informs an iteration: for example, keeping the core content but pivoting the primary above-the-fold CTA from "Buy Now" to "Find Out More" (or similar), anchor-linking users down the page to the relevant information.
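The arithmetic behind this example is worth making explicit: if only 30% of visitors see content that lifts their conversion by 10%, the blended uplift across all visitors is roughly 30% × 10% = 3%, which can easily read as "flat" in a noisy test. A short sketch, with assumed figures (10,000 visitors, 5% baseline conversion):

```python
# Illustrative (assumed) figures for the landing-page example:
# only ~30% of variant visitors scrolled to the new content.
visitors = 10_000
engaged_share = 0.30
baseline_cr = 0.050       # control conversion rate
engaged_uplift = 0.10     # +10% relative CR for users who saw the content

engaged = int(visitors * engaged_share)
not_engaged = visitors - engaged

# Engaged users convert 10% better; everyone else behaves like control.
variant_conversions = (engaged * baseline_cr * (1 + engaged_uplift)
                       + not_engaged * baseline_cr)
variant_cr = variant_conversions / visitors

overall_uplift = variant_cr / baseline_cr - 1
# Blended uplift ≈ engaged_share * engaged_uplift, i.e. about 3%.
```

The diluted 3% blended effect is exactly why segment-level analysis, rather than a re-run, reveals the next iteration.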

Commercial impact is achieved through this constant evolution based on data and customer understanding. You won’t get it right the first time, but if you keep at it, you will get it right eventually.

The cost of failing to act

An iterative approach to experimentation can be off-putting. It feels slow and complicated, requiring constant data analysis and ongoing hypothesis creation. Unfortunately, this is true - it is a more complicated proposition to run an iterative programme of innovation rather than an insane one.

But, without an iterative mindset, your investments in experimentation cannot deliver impact.

Failing to adopt iteration means repeating the same experiences and rehashing the same hypotheses time and again, never moving forward. This leads to a decline in return on investment, stagnant growth, and the loss of market share to the competition.

What you can do about it

Migrating from insanity to iteration requires three deliberate shifts in the operating model:

Slowing testing velocity

You don’t need to run 50 experiments a month if none of them drive impact. Unless you are an enterprise-scale organisation with millions of visits per week, you do not need to run dozens of experiments at once; a smaller number of highly focused, evidence-informed experiments will deliver greater impact.
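A quick sample-size calculation shows why velocity has natural limits. Using the standard two-proportion approximation (all figures illustrative), detecting a 5% relative lift on a 5% baseline conversion rate takes on the order of 120,000 visitors per arm, which most sites cannot supply to dozens of concurrent tests:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(base_cr, rel_uplift, alpha=0.05, power=0.8):
    """Approximate visitors needed per arm for a two-proportion test."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)   # critical value for the test
    z_b = nd.inv_cdf(power)           # critical value for the power target
    delta = base_cr * rel_uplift      # absolute difference to detect
    p_bar = base_cr * (1 + rel_uplift / 2)  # average rate across arms
    return ceil(2 * (z_a + z_b) ** 2 * p_bar * (1 - p_bar) / delta ** 2)

n = sample_size_per_arm(0.05, 0.05)  # 5% CR, detect a 5% relative lift
```

With finite traffic, every re-run of a flat test is sample size taken away from a new, better-informed hypothesis.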

Defining the measurement strategy

Be specific about what you are hoping to achieve, whether it's conversion growth or email sign-ups, and ensure you segment reporting to capture the behaviour of users who actually engage, not just everyone who lands on the page or everyone who saw a campaign ad.
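As a sketch of what segmented reporting means in practice, the snippet below (with hypothetical event rows and field names) reports conversion both for everyone who reached the page and for the engaged segment only:

```python
# Hypothetical event rows: (user_id, engaged_with_content, converted).
events = [
    ("u1", True, True), ("u2", True, False), ("u3", False, False),
    ("u4", False, False), ("u5", True, True), ("u6", False, True),
]

def conversion_rate(rows):
    """Share of rows that converted; 0.0 for an empty segment."""
    return sum(r[2] for r in rows) / len(rows) if rows else 0.0

all_users = conversion_rate(events)                      # everyone on the page
engaged = conversion_rate([r for r in events if r[1]])   # engaged segment only
```

Reporting both numbers side by side is what surfaces cases like the landing-page example, where the engaged segment wins while the blended figure looks flat.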

Align experimentation with customer insight

Ensure activity is aligned with scientific hypotheses, backed by data and customer insight. "We think this... which we'll know when we see..." is not an effective structure for commercial impact (as we will explore in a later article).

Conclusion

Innovation in digital channels is too often treated as a volume game: more tests, more campaigns, more activity and inevitably, more cost. 

But volume without evolution is both inefficient and ineffective. Re-running past experiments in the hope of finding a different result leads organisations to invest resources in activities where there is no evidence of return, not only holding back commercial impact but leading to a loss of faith in experimentation itself.

To progress from insanity to iteration, organisations must establish the discipline of asking why: why didn't the test win? Why was campaign engagement poor? Why do customers fail to buy?

This can be complex and time consuming, however, it is also the only way to move from guesswork to an experimentation programme that drives substantive commercial impact. 

Real growth isn't found by doing the same thing twice and praying for a different result; it's found by addressing the real causes of customer frustration and evolving your response and customer experience through an iterative, evidence-based experimentation programme.

 

Our mission

To combine expertise in data, insight and the scientific method, working with ambitious digital organisations to challenge, inform and support teams to deliver the greatest commercial impact from every investment in digital channels.

A passion for data… An obsession with impact


Get in touch

Unlock the impact of every digital decision, without compromise.

How to contact us:

Complete the enquiry form

Find us on LinkedIn
