How we increased content platform traffic by 3000% with A/B testing
To stay ahead, Funda keeps exploring new ways to improve and personalise the user experience. In this blog post, Data Analyst Naomi Smulders shares how A/B testing and behavioural tracking helped refine our advertising audiences. Read how Naomi and her team achieved these results through data and experimentation.
At Funda, a significant part of our revenue is generated by the advertising possibilities on our platforms. To be an attractive advertising space, we need a thorough understanding of our audiences to help our advertisers target the right customers. After all, a personalised advertisement that targets a customer’s underlying interest or demographic is more effective than its generic contextual counterpart.
Therefore, to match each ad to the right customer, we incorporate as much information as possible about our customers’ interests and behaviours from the different sources available to us. One such source is customer behavioural tracking across our platform through Segment. This approach allows us to enrich user profiles and build more accurate advertising audiences.
CDP for understanding your customers’ interests
At Funda, we use Twilio’s Segment for our web tracking. This Customer Data Platform (CDP) is the central place to collect all information about our customers, and we use the profiles and audiences it creates to feed into output and content services such as email, push notifications, advertising and marketing.
For anyone unfamiliar with tracking or Segment, here is a brief explanation: Segment is a CDP and builds user profiles and audiences by collecting and unifying data from every customer interaction across our platforms and mobile apps. Most commonly it tracks page views and clicks, which are all captured in pre-defined events. Each event includes contextual information about the website element and user interaction, as well as general metadata about the user, e.g. device type or referral source. Segment also captures a user identifier to help map the customer journey, always in compliance with GDPR and only for users that have accepted cookies.
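To make the event structure more concrete, here is a minimal sketch of what a single tracked interaction could look like, using Segment’s Python server-side library. In reality our web tracking runs client-side, and the event name, properties and identifiers below are illustrative rather than Funda’s actual tracking plan.

```python
# Illustrative Segment track call (segment-analytics-python). Event and
# property names below are hypothetical, not Funda's real tracking plan.
import segment.analytics as analytics

analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"  # placeholder

# A pre-defined event capturing a click on a content article teaser, with
# contextual information about the element and general metadata about the user.
analytics.track(
    user_id="user-123",  # only captured for users that have accepted cookies
    event="Content Article Clicked",
    properties={
        "article_category": "sustainability",
        "component": "entry_point_object_detail_page",
    },
    context={
        "device": {"type": "mobile"},
        "page": {"referrer": "https://www.google.com/"},
    },
)
analytics.flush()  # send any queued events before the script exits
```

Each such tracked interaction adds another step to the mapped customer journey.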
All these mapped user journeys are then aggregated into user profiles that can be enriched with computed attributes and traits, such as frequent searches, visit frequency or search area. From these profiles, audiences can be defined based on these user behaviours or traits, e.g. 'Users that have searched weekly for a home in a big city'. All audiences are updated in real time based on user behaviour and are synced with our Google Ads platform.
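As a rough illustration of the behavioural rule behind such an audience, the sketch below derives the example audience from raw search events in plain Python. In practice this logic is configured inside Segment as computed traits and audience definitions rather than hand-written code, and the event and property names here are assumptions.

```python
# Rough, plain-Python approximation of an audience such as 'Users that have
# searched weekly for a home in a big city'. In Segment this is configured as
# computed traits and audiences; event and property names are hypothetical.
from collections import defaultdict
from datetime import datetime, timedelta, timezone

BIG_CITIES = {"Amsterdam", "Rotterdam", "Den Haag", "Utrecht"}

def build_audience(events, weeks=4):
    """Return user ids with big-city searches in at least `weeks` distinct ISO weeks recently.

    Assumes each event dict carries a timezone-aware `timestamp`.
    """
    now = datetime.now(timezone.utc)
    weeks_seen = defaultdict(set)  # user_id -> ISO week numbers with a qualifying search
    for e in events:
        if e["event"] != "Search Performed":
            continue
        if e["properties"].get("city") not in BIG_CITIES:
            continue
        if now - e["timestamp"] > timedelta(weeks=weeks):
            continue  # only consider the recent window
        weeks_seen[e["user_id"]].add(e["timestamp"].isocalendar()[1])
    return {uid for uid, wk in weeks_seen.items() if len(wk) >= weeks}
```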
Content drives audience creation
One of the most important inputs to the user profiles is the interaction with the content articles on our platform (/meer-weten/). These articles cover a wide range of topics around housing and the real estate market and can really pinpoint the particular interests of the customer. For example, two of the most popular categories are articles on sustainability and on remodelling a house. Combining these gives a user profile of someone who is likely to be interested in making their home more sustainable. Knowing exactly who these customers are is of great value for advertising partners that want to advertise a sustainable product, such as heat pumps.
See also: Inside Funda's data pipeline: how we structure and manage our data
Interactions with the content platform therefore highlight key interests of our customers. However, this part of the platform is only found by a limited number of them. To get more value we ran an A/B test to see if we could increase traffic to the content platform, without distracting customers too much from their usual searching journey.
Online experimentation
Pre-analysis and behavioural context
We hypothesised that for the content platform to be discovered more easily, we would have to make entry points available at the points in the user journey where the customer might benefit most from this information. At the same time, guiding the user away to another part of the platform should not harm other important behavioural indicators and KPIs, such as a contact request to a real estate agent by a customer who might be interested in buying a house.
Based on previous user research, we opted to run the experiment on the Object Detail Page. This page displays all the details of a specific object (house or apartment), such as price, size, construction year and energy label. On this page, users are already in a learning and information-processing mindset as they browse the detailed information about a particular property. Offering additional information through entry points to the content platform would therefore not feel out of place or be distracting.
Hypothesis
We summarised these considerations in our hypothesis, following the standard form ‘By [design change], [main metric] will [measurable effect], because [behavioural/psychological reason].’ Our hypothesis was as follows: 'By providing entry points to the content platform at the end of the Object Detail Page, traffic towards the content platform will increase, because users are already in an information-processing mindset and will explore their interests and curiosities further on the content platform.'
Metrics
Our decision (main) metric was the traffic to the content platform, measured by the Click-Through Rate (CTR) to the content pages. Secondary to the main metric, we also tracked some guidance metrics such as the CTR of surrounding components. These secondary metrics were used to verify that users were not just clicking because of the novelty effect but were behaving with an intentional interest. Lastly, we set some guardrail metrics, which are the KPIs and behavioural indicators that could not be negatively impacted by the changes in our experiment. The most important of these was the lead generation metric, as any changes to the design of the webpage should not negatively affect leads.
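As a simplified illustration, the decision, guidance and guardrail metrics can all be thought of as simple rates over the users exposed to a variant. The sketch below uses hypothetical counts and metric names; the real metrics are computed from our tracked events.

```python
# Simplified sketch of the three metric types as per-variant rates.
# Counts and metric names are hypothetical, purely for illustration.
def rate(numerator: int, denominator: int) -> float:
    return numerator / denominator if denominator else 0.0

variant_counts = {
    "exposed_users": 51_800,
    "content_platform_clicks": 1_425,       # decision metric numerator
    "surrounding_component_clicks": 3_950,  # guidance metric numerator
    "agent_contact_requests": 980,          # guardrail metric numerator (leads)
}

metrics = {
    "decision_ctr_content": rate(variant_counts["content_platform_clicks"],
                                 variant_counts["exposed_users"]),
    "guidance_ctr_surrounding": rate(variant_counts["surrounding_component_clicks"],
                                     variant_counts["exposed_users"]),
    "guardrail_lead_rate": rate(variant_counts["agent_contact_requests"],
                                variant_counts["exposed_users"]),
}
print(metrics)
```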
To make sure all these metrics could be analysed with statistical rigour, we ran a power calculation to determine the runtime of the experiment and the traffic allocation.
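For the curious, the sketch below shows the type of power calculation involved, assuming the decision metric is a proportion (the CTR). The baseline rate, minimum detectable effect and traffic figures are illustrative and not Funda’s actual numbers.

```python
# Minimal sketch of a power calculation for a proportion metric (CTR).
# Baseline rate, minimum detectable effect and traffic numbers are illustrative.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_ctr = 0.02          # assumed current CTR to the content platform
mde_relative = 0.10          # smallest relative uplift worth detecting (10%)
target_ctr = baseline_ctr * (1 + mde_relative)

effect_size = proportion_effectsize(target_ctr, baseline_ctr)
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.8, alternative="two-sided"
)

daily_visitors = 500_000     # hypothetical Object Detail Page traffic
traffic_share = 0.10         # share of traffic allocated to the experiment
runtime_days = (2 * n_per_variant) / (daily_visitors * traffic_share)
print(f"{n_per_variant:,.0f} users per variant, ~{runtime_days:.1f} days runtime")
```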
Results
We ran the experiment for two weeks on 10% of traffic and found a statistically significant winning variant: a positive effect on the CTR to the content platform, without a negative effect on any of the guardrail metrics. The new entry points were therefore quickly implemented on the platform.
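As an illustration of how such a result could be evaluated, the sketch below applies a two-proportion z-test to made-up click counts for the control and the variant; our actual analysis setup may differ.

```python
# Sketch of a two-proportion z-test on the decision metric (CTR).
# The click and visitor counts are made up for illustration.
from statsmodels.stats.proportion import proportions_ztest

clicks = [1_180, 1_425]      # clicks to the content platform: [control, variant]
visitors = [52_000, 51_800]  # exposed users per variant

z_stat, p_value = proportions_ztest(count=clicks, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Statistically significant difference in CTR between the variants.")
```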
Implementation impact
After implementation, we measured the effect on traffic from the Object Detail Page to the content platform. Within a month, traffic to the articles explicitly linked by the entry points increased by 3000%, a result that was very happily received by the content marketers.
This also means that we now have a better understanding of the interests of a far larger portion of our customer base. The click behaviour on the entry points and the consequent browsing of the content platform enrich the profiles of many customers with previously unknown information. Consequently, it is now easier to target these customers with curated advertising. Moreover, we now have a bigger target audience with these interests, which makes the platform a more attractive advertising space.
Next steps
This case highlights the benefits of a multidisciplinary approach to optimising our platform for all our clients: customers and advertisers alike. Furthermore, it shows how user research, data and experimentation keep our decision-making data-driven.
In turn, it opens new possibilities for other teams at Funda. Not only can the advertising team benefit from the refined audiences, but also the content team itself. When adding content to the platform, the content team can also make use of the same audiences to personalise content.
In addition, there is an opportunity for the content team to keep optimising the placement of the entry points. From the above experiment we learned that the Object Detail Page is a good location, but this can be refined further: where on this long page is the additional information most wanted and most effective? The initial experiment placed the entry points at the bottom of the page; follow-up experiments will test other locations, such as the middle section or a dedicated sidebar. More generally, we encourage all product, marketing and advertising teams to follow a similar optimisation approach.
In conclusion, this case has shown that experimentation and A/B testing not only give us better insight into the differences between the users on our platform, and thereby the ability to personalise, but also provide a data-driven way to validate general User Experience (UX) improvements that benefit all users of our platform.
See also: An intern's research at Funda: using GNNs to recommend houses
Question?
Do you have a burning question for Naomi after reading this blog? Feel free to reach out to her via email.