Treating creative as a core, measurable lever — not an afterthought — is rapidly becoming the default playbook for app growth teams. That was the central message from Megan Evans, Creative Director at ConsultMyApp, in an App Talk interview recorded at App Promotion Summit London 2025. Evans urged teams to approach creative work as a specialised, data‑driven discipline that deserves the same strategic rigour as acquisition or product decisions, and to build systems that find and scale what actually converts on mobile.
That system begins with disciplined testing. Evans warned against making too many simultaneous changes and recommended isolating variables so teams can reliably attribute lifts to specific creative elements. This mirrors established A/B testing best practice, which advises running mutually exclusive test groups long enough to reach statistical significance, documenting hypotheses, and avoiding overlapping experiments that contaminate results.
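The "long enough to reach statistical significance" check is usually a two-proportion z-test on control versus variant conversion rates. As a minimal sketch (the function name, example numbers, and 95% threshold are illustrative, not from the interview):

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the z-statistic comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Example: variant converts 5.5% vs the control's 5.0% on 20,000 users each.
z = two_proportion_z_test(1000, 20000, 1100, 20000)
significant = abs(z) > 1.96  # two-sided test at the 95% confidence level
```

In this example the lift clears the 95% bar (z ≈ 2.24); with smaller samples the same 0.5-point gap would not, which is why stopping tests early produces the noisy "what-worked-that-time" calls the article warns about.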
Notably, the highest‑performing creative is often the least expected. “Sometimes it’s the one you don’t want that wins, which is difficult for brands to understand,” Evans told Business of Apps at the summit. Industry measurement supports that claim: a major 2025 industry report analysing more than a million video creatives shows that a small share of assets drive a disproportionate amount of results, and that emotional storytelling and user‑centred signals frequently correlate with installs and longer‑term retention. Those findings help explain why “ugly” or off‑brand ads — rougher, more authentic executions that resonate with audiences — can outpace polished, on‑brand spots in conversion.
Creative optimisation is not limited to paid ads; it must extend across the entire user journey. Store listing assets such as icons, poster frames and the first screenshots form an instant impression that can decide installs within seconds, so paid creative and app‑store creative should be aligned rather than treated as separate experiments. That alignment reduces drop‑off between click and install and makes it easier to scale winning creative across acquisition and retention channels.
At the same time, teams face accelerating production demands. Evans and others advocate using AI to remove repetitive work so human teams can concentrate on strategic choices. “Instead of spending time on things that can be automated through AI, we’re now focusing on the parts that need the human brain,” she said. This approach echoes the industry’s evolving view of agentic AI: tools can accelerate ideation, speed prototyping and automate routine edits while leaving critical judgement, ethical safeguards and narrative craft to humans.
Frameworks for how to operationalise these ideas are already in circulation. Growth consultancies recommend broad early ideation followed by disciplined experiments: allocate a dedicated test budget, limit concurrent variants to keep tests interpretable, scale validated winners with significant spend, and monitor creative fatigue so cadence can be adapted by channel. Platforms’ reporting also highlights the need to diversify testing to guard against over‑reliance on a handful of top creators and to segment audiences so the right creative meets the right users.
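Fatigue monitoring in particular lends itself to a simple automated rule. One common pattern, sketched here with an illustrative rolling-average heuristic (window size and drop threshold are assumptions a team would tune per channel), is to flag a creative once its recent click-through rate falls well below its own peak:

```python
def is_fatigued(daily_ctr: list[float], window: int = 7,
                drop_threshold: float = 0.7) -> bool:
    """Flag a creative as fatigued when its latest rolling-average CTR
    falls below drop_threshold times its best rolling average so far."""
    if len(daily_ctr) < 2 * window:
        return False  # not enough history to judge
    averages = [sum(daily_ctr[i:i + window]) / window
                for i in range(len(daily_ctr) - window + 1)]
    return averages[-1] < drop_threshold * max(averages[:-1])

# A creative whose daily CTR decays from ~2% toward ~1% gets flagged:
ctrs = [0.020] * 7 + [0.018, 0.016, 0.014, 0.012, 0.011, 0.010, 0.010]
fatigued = is_fatigued(ctrs)  # True
```

A rule like this keeps the fatigue decision mechanical, so refresh cadence can differ by channel without relying on anyone's gut feel.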
For practitioners, the practical checklist is familiar but often ignored: change one element at a time, run tests long enough to reach significance, capture secondary metrics as well as headline conversion, and document both hypotheses and outcomes so learnings compound across campaigns. These simple controls distinguish repeatable optimisation from noisy, anecdotal “what‑worked‑that‑time” decisions.
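"Run tests long enough" can be planned up front by computing the sample size needed to detect a minimum lift. A standard power-analysis approximation, sketched below with assumed defaults of 95% confidence and 80% power (the numbers are illustrative, not from the article):

```python
import math

def sample_size_per_variant(base_rate: float, min_lift: float) -> int:
    """Approximate users needed per variant to detect a relative lift
    in conversion rate with a two-sided z-test (alpha=0.05, power=0.80)."""
    z_alpha = 1.96  # critical z for alpha = 0.05, two-sided
    z_beta = 0.84   # z for 80% power
    p1 = base_rate
    p2 = base_rate * (1 + min_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a 10% relative lift on a 5% baseline needs roughly 31,000
# users per variant:
n = sample_size_per_variant(0.05, 0.10)
```

The takeaway matches the checklist: subtle creative lifts need far more traffic than teams expect, so limiting concurrent variants is what keeps each test adequately powered.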
The organisational challenge is cultural as much as technical. Creative optimisation requires cross‑functional collaboration between creative, product and analytics teams and a willingness from brand stakeholders to embrace unexpected winners. When that alignment happens, teams can reallocate time and budget away from repetitive production and into strategic experiments — using AI where it accelerates output but stopping short of ceding creative judgement.
Ultimately, success looks less like crafting ever‑prettier ads and more like building a system that discovers what resonates, validates it rigorously, and scales it intelligently across the funnel. As recent industry analysis makes clear, a small set of creatives will often deliver outsized returns — but finding them takes methodical testing, cross‑channel consistency and judicious use of automation to let human insight do what machines cannot.
📌 Reference Map:
- Paragraph 1 – [1], [2]
- Paragraph 2 – [1], [4]
- Paragraph 3 – [1], [3], [2]
- Paragraph 4 – [1], [7], [2]
- Paragraph 5 – [1], [5]
- Paragraph 6 – [6], [3], [1]
- Paragraph 7 – [4], [6], [1]
- Paragraph 8 – [5], [6], [1]
- Paragraph 9 – [3], [1], [2]
Source: Noah Wire Services