Flex Tier
Unified the shared and serverless cluster tiers into a single experience through Growth-launched experiments.
2024
Growth
Product design
Stakeholder management
Cross-team collaboration
2 Designers
1 Researcher
3 Engineers
1 Product manager
1 Data analyst
Current tier offerings force users into significant tradeoffs:
Shared clusters don’t scale easily
Serverless clusters limit access to key data developer platform capabilities and carry difficult-to-predict, operations-based pricing
Dedicated clusters can be far more advanced than most self-serve users need
How might we provide a better middle ground offering that offers predictable pricing, supports light elasticity, and provides access to modern data platform features?
We teamed up with user research to validate 3 key factors for this new tier offering:
Value proposition
Pricing model
Naming
We used a range of research and analysis techniques to identify the top option, which was then tested via a painted-door growth experiment:
9 Competitive product audits
7 Rapid moderated A/B pricing-model tests
15 Unmoderated A/B name tests
11 Moderated naming and value-proposition interviews
With the Flex cluster tier's name, pricing, and value proposition aligned through internal iteration, we ran a painted-door experiment that produced statistically significant results.
Clearly defined the Flex cluster tier's value proposition as a middle-ground offering: more capabilities than the free tier, at a lower cost than its dedicated counterpart.
Made billing information more digestible and easier to compare against other tier offerings. Added expandable pricing details for more granular information, emphasizing the $30/month maximum.
Mapped out messaging and CTAs so existing users could automatically upgrade shared or serverless clusters to the new Flex tier. Also supported upgrades to dedicated clusters for further paid conversion.
We partnered with data analysis to run an A/B test on 60% of total cluster traffic, monitoring paid cluster creation on the starter page, upgrade page, and advanced configuration page. Secondary metrics included overall deployments, churn, and drop-off rates. The experiment proved successful, and the new experience was then rolled out to 95%, and finally 100%, of users.
Paid cluster creation
Paid cluster upgrades
Organizations with >10% MRR change
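As a sketch of how a readout like this is typically evaluated: a painted-door A/B test comparing paid-conversion rates between control and variant can be checked for statistical significance with a two-proportion z-test. The counts below are hypothetical placeholders for illustration, not the experiment's actual numbers.

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test.

    conv_*: conversions (e.g. paid cluster creations) in each arm
    n_*:    users exposed to each arm
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)              # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: control vs. Flex-tier variant
z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=168, n_b=4000)
significant = p < 0.05   # conventional threshold for significance
```

The same check would be repeated per surface (starter page, upgrade page, advanced configuration) and for the secondary metrics before expanding the rollout.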
Design craft & mentorship
Partnering with a senior designer strengthened my ability to document decisions, align across milestones, and collaborate on ideation, visualization, and edge case thinking.
Designing under evolving inputs
Working in parallel with research under shifting requirements demanded flexibility: balancing iteration against rework, and establishing foundations that could adapt as new insights (e.g., pricing, naming, flows) emerged.
Pragmatic system design with engineering
Early attempts to support all edge cases revealed feasibility and UX tradeoffs. By aligning with engineering and product, we prioritized high-impact scenarios and handled unsupported cases through clear messaging and constraints.
Cross-functional alignment & growth mindset
Collaborated closely with research, PM, marketing, billing, and data teams to align on experiments and measure impact, iterating on results to improve adoption and conversion outcomes.