Some of you have asked, “How do you decide on making changes to Scoop.it?” We felt this was an interesting opportunity to share the answer openly.
First, let me start by saying that it’s a process that has evolved to become much more complex now that millions of people use Scoop.it every week. In the beginning, we could let our vision and intuition guide us, but now we have a responsibility to you, the Scoop.it community, who have decided to use this service as your daily content curation hub.
Sometimes decisions are easy: when you asked for curated newsletter capability on our feedback forum, it was just a matter of planning it together with the right resources and partner. It can take some time (bear with us…), but the decisions are simple. Sometimes it’s a question of vision: we have strong values and a vision for what we feel content curation and the interest graph should stand for, and that, of course, continues to guide us, just as when we recently experimented by launching Read.it.
At the UX (user experience) level, though, things can be more difficult: not so much for the inspiration and the big ideas, but for the little details that can have a big impact. Should this button be at the top or the bottom? Left or right? Should we give users one main option and a bunch of secondary ones, or should we highlight the three that are used most often? Did we make that feature visible enough? Or is it too prominently displayed and annoying? A lot of these questions don’t have good or bad answers you can easily guess: you have to try things out to find out.
To deal with that, we’ve increasingly been relying on A/B testing our user interface. A method for optimizing websites that Google has used extensively since its early days, A/B testing means that instead of making a change for all of you, we first implement it for a randomly selected group of Scoopiteers (group B) and measure the results against the majority of people unaffected by the change (group A). By comparing the results, we can see whether the change has the desired impact before rolling it out to everyone. (In fact, we take this further: beyond comparing A vs. B, we also segment users according to how long they’ve been using Scoop.it.)
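To make the mechanics concrete, here is a minimal sketch of how deterministic A/B bucketing is often implemented (an illustration with made-up names and fractions, not Scoop.it’s actual code): each user is hashed into group A or B, so the same user always lands in the same bucket and sees a consistent experience for the lifetime of the experiment.

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, b_fraction: float = 0.1) -> str:
    """Deterministically assign a user to group A (control) or B (variant).

    Illustrative sketch only. Hashing the experiment name together with the
    user id gives a stable, uniform value per (experiment, user) pair, so a
    user can be in B for one test and A for another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a number in [0, 1) and compare
    # against the fraction of users we want in the variant group.
    ratio = int(digest[:8], 16) / 0x100000000
    return "B" if ratio < b_fraction else "A"

# The same user always gets the same bucket for a given experiment,
# so their interface doesn't flip back and forth between variants.
assert assign_bucket("user-42", "share-buttons") == \
       assign_bucket("user-42", "share-buttons")
```

Segmenting by tenure, as mentioned above, would then just mean slicing the measured results by each user’s signup date before comparing A to B.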
Sometimes we’re disappointed: a change that seemed to make sense ends up having no impact or even sometimes a negative one. Sometimes results are spectacular.
A recent example is how we tested the position of the sharing buttons on the scoop pages. Again, this might seem like a trivial issue, but we feel we owe you the best platform for your content curation, and that means making sure your scoops can be easily reshared and amplified by your visitors. So for us, finding out whether we could increase the number of visitors who share your content when they land on your scoop pages was an important question. If we can, it means more visibility for you, and of course it also helps us grow.
Until recently, here’s how the scoop page was structured and here’s how it is now following a conclusive test.
The historical reason for grouping all sharing options behind a common “Share” button was that we wanted to give you the choice to share on various platforms: we support several different options. The change we tested isolated the three networks that generate the most referral traffic to Scoop.it pages (Facebook, Twitter and LinkedIn) as first-level options, grouping the others behind a “+” sign. Our rationale: with the other sharing options tucked behind a discreet “+” button, the concept of sharing is communicated well enough by the Facebook, Twitter and LinkedIn buttons themselves (so no need for a button that says “Share”), and visitors who are likely to share the scoop will primarily use one of these three networks.
Results: a 273% increase in sharing actions per page impression. Yes, visitors are now close to 4x more likely to share your scoops!
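For clarity on the arithmetic (our own back-of-the-envelope check, with an illustrative baseline rate, not additional data): a 273% increase means the new rate is the old rate multiplied by 1 + 2.73 = 3.73, hence “close to 4x”.

```python
# Back-of-the-envelope check of the headline number. The baseline
# rate below is hypothetical; only the 273% figure comes from the test.
old_rate = 0.010                    # hypothetical shares per page impression
increase = 2.73                     # a 273% increase
new_rate = old_rate * (1 + increase)
print(round(new_rate / old_rate, 2))  # -> 3.73
```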
This might look trivial (especially now that we have the results…), but some examples are not. When we introduced the insight feature back in December, we felt strongly about it: content curation is a means of expression, and giving an opinion on curated content is an important way to add value for your audience. But we wanted to understand whether this would come at a cost by making the whole publishing process more complex, especially for new users. So here’s a test we ran:
On the left you can see the current Scoop.it window, while on the right is a simpler version, with the insight removed, that we tested with new users of Scoop.it. Less is more, right? So we’d expect the simpler version to perform better, wouldn’t we? As it turns out, both windows performed identically: the number of users who end up publishing a scoop is the same whether they’re in the A or the B group.
You might have noticed other changes we’ve made as a result of similar conclusive tests, such as the format of the “New Post” button, the direct access to curated topics in the margin of the dashboard, etc. The only web services that stop changing are the dead ones, so we plan to keep running tests like these for small iterations of our UX while we keep executing on our vision to bring you the best platform we can build.

One thing, though: while we look at data, crunch numbers and derive percentages, it is critical to us that we not be blinded by the analytics and stop listening to your feedback. So please, don’t refrain from sending us remarks, critiques or encouragement: however much we A/B test, nothing will ever replace human interaction and community feedback. And we’d also like to open this up to your ideas: if there are design changes you feel are needed to help the whole Scoop.it community grow faster, please tell us in the comments.