edrone — engineering the aha moment
Marketing Automation · AI features
Senior Product Designer
2025 – 2026
AI email generator — research, design, analytics setup, and iterative improvement

the context
The problem
New edrone users weren't seeing ROI fast enough — and without a clear early win, they churned before the platform had a chance to prove its value. Research data pointed to a specific blocker: preparing an email campaign took too long. So users kept putting it off, and the longer they waited, the less likely they were to start at all.
The approach
We had the data — the question was how to act on it. The goal was to remove the effort barrier entirely: get users to their first sent campaign as quickly as possible, with as little friction as possible. And then measure everything.
my role
I owned this project end-to-end — from shaping the initial concept based on existing research, through design and iteration, to setting up the full analytics layer in Amplitude myself. That last part mattered a lot: I didn't just hand off specs and wait for results. I built the measurement system that let us see, in real time, exactly how users were moving through the feature.
I also ran regular feedback sessions with users throughout — making sure the iterations we shipped were grounded in what people were actually experiencing, not just what the numbers suggested.
the process
research foundation
This project didn't start with a blank slate. Previous research had already identified the pattern — new users who didn't send a campaign within their first weeks rarely went on to become active customers. The time-to-first-campaign metric was directly tied to retention. That gave us a clear, specific problem to design against rather than a vague mandate to "improve activation".
AI email generator — v1
The first version was designed around one idea: zero effort for the user. The generator scraped branding directly from the customer's website — colours, fonts, products — and used it to create ready-to-send email campaigns tied to upcoming holidays in their calendar. No copy to write, no template to configure. Just a campaign, ready to go.
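The scraping idea can be sketched in a few lines. The snippet below is an illustrative stand-in, not edrone's production pipeline: it pulls hex colours and font families out of a page's inline styles, which is the simplest version of the "brand extraction" step described above.

```python
import re
from html.parser import HTMLParser

# Matches 3- or 6-digit hex colours, e.g. #fff or #1A2B3C.
HEX_COLOUR = re.compile(r"#(?:[0-9a-fA-F]{3}){1,2}\b")


class BrandScraper(HTMLParser):
    """Collects hex colours and font families from inline style attributes."""

    def __init__(self):
        super().__init__()
        self.colours: set[str] = set()
        self.fonts: set[str] = set()

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "")
        self.colours.update(c.lower() for c in HEX_COLOUR.findall(style))
        for decl in style.split(";"):
            if decl.strip().startswith("font-family"):
                self.fonts.add(decl.split(":", 1)[1].strip())


def extract_branding(html: str) -> dict:
    """Return the colours and fonts found in a page's inline styles."""
    scraper = BrandScraper()
    scraper.feed(html)
    return {"colours": sorted(scraper.colours), "fonts": sorted(scraper.fonts)}
```

A real implementation would also walk linked stylesheets and product imagery; the point here is only that brand context can be recovered mechanically, so the user never has to configure it.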
The design challenge was trust — how do you present AI-generated content in a way that feels reliable enough to send without heavy editing? I focused on transparency: showing users exactly what was pulled from their site and making it easy to review before sending.
analytics setup
I instrumented the entire feature flow with Amplitude events myself — every step, every action, every drop-off point. This wasn't something handed off to a data team. Having full ownership of the measurement layer meant I could move fast: define what we needed to know, set it up, and start reading the data without waiting on anyone else.
The dashboards gave us real-time visibility into how customers were actually moving through the generator — where they engaged, where they hesitated, and where they dropped off. That signal drove every subsequent decision.
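Conceptually, the measurement layer boils down to a named funnel and per-step conversion rates. The event names below are illustrative, not edrone's actual Amplitude taxonomy, but they show the shape of what the dashboards computed:

```python
from collections import Counter

# Illustrative event taxonomy -- not edrone's actual Amplitude schema.
FUNNEL = [
    "generator_opened",
    "branding_previewed",
    "email_generated",
    "email_edited",
    "campaign_sent",
]


def funnel_report(events: list[str]) -> list[tuple[str, float]]:
    """Conversion at each funnel step, relative to the first step."""
    counts = Counter(events)
    base = counts[FUNNEL[0]] or 1  # avoid division by zero on empty data
    return [(step, round(counts[step] / base, 2)) for step in FUNNEL]
```

Reading the output is straightforward: the step with the largest drop from its predecessor is where users hesitate, and that is where the next iteration should focus.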

iteration — promptable emails (v2)
Usage data from v1 confirmed the direction was right — but it also showed us where users wanted more control. The second iteration introduced promptable emails: users could now guide the AI with their own input, shaping the content without having to write it from scratch. Low effort, higher flexibility.
This wasn't a hunch — it was a direct response to what the data and the feedback sessions were telling us. v2 was scoped and prioritised based on actual evidence of where v1 fell short.

feedback sessions & continuous improvement
Throughout the project I ran regular feedback sessions with users — checking what was working, what felt off, and what they wished the tool could do. Alongside the Amplitude data, these sessions gave us a complete picture: the numbers told us what was happening, the conversations told us why.
Usage of the feature grew steadily over time — not as a spike after launch, but as a consistent upward trend that reflected real adoption rather than novelty.

impact & results
Faster time to value
The generator directly removed the blocker that was keeping new users from sending their first campaign — cutting the time and effort required from hours to minutes. For a product where early activation drives long-term retention, that's the metric that matters.
Data-driven iteration
By setting up the Amplitude tracking myself, every decision about what to iterate — and what to leave alone — was grounded in real usage data. v2 wasn't a guess. It was a direct response to what the dashboards and feedback sessions were telling us.
Growing adoption
The feature saw consistent usage growth after launch — and kept that momentum through both iterations. Adoption was organic rather than forced, which was the real signal that the generator was solving a genuine problem, not just something users tried once and forgot about.