On June 11th, we started working on a project we call Reviewers.

We hand-picked ten reviewers who agreed to volunteer their time to review four proposals a month. We expect each review to take 30 minutes, so the commitment is two hours per month per reviewer.

Reviewers is a three-month project that will finish at the end of August, when we will evaluate what went well, what didn't, and determine the fate of the project.

We are now one month into Reviewers. This post covers our intentions, tangible progress, and next steps for Reviewers.

Our Intentions

Over the last six years, reviewing proposals has been done entirely by our in-house staff. When we started the website in 2011, Denny and I reviewed (and often wrote) all proposals. As we scaled the platform we needed more help with reviews, so we hired employees to join our team full-time. When project volume is low, this is an acceptable way to ensure scientific accuracy and high quality.

As we started to host more projects, the experience for each scientist began to deviate from what we felt was optimal. Ideally, every scientist coming through our system would have the same experience as the scientists who were onboarded by a founder. When I was supporting scientists one-on-one, it was easy for me to be efficient, prioritize accessibility, and verify the scientific integrity of individual proposals.

Taking long-term sustainability and scalability into account, we feel it is impossible to accomplish our mission of democratizing the research process so anyone can science if we do not address the bottleneck in our review system.

Build a system to scale to 100x projects

We will never expand beyond our current scale if our review system depends on three staff members reviewing all proposals. And what happens when those three people are gone? For the long-term sustainability of the platform, we need to ensure it can survive beyond the founders and current staff. For scalability, we need a system of reviewers that can grow with an influx of 10x or 100x new proposals.

Standardize objective measures for quality

The value of our community is directly correlated with the quality of the scientific research projects we support. We think we can do better in terms of efficiency, accessibility, and scientific integrity. With a larger pool of reviewers, we can ensure proposals are returned within 48 hours (and possibly faster). With a distributed review team, the volunteers' first goal will be to coach scientists on the clarity of their proposals; in the long term, having volunteers across the globe is a first step toward accepting proposals in multiple languages. And with reviewers whose expertise spans multiple fields, we can verify the scientific integrity of proposals.

Open up governance to the community

"You have to remember that nothing remains the same. It’s always going to change. The whole world keeps changing, we keep changing, things in our lives keep changing. Nothing remains the same."

The more we think about the long-term sustainability of the community, the more I think about prioritizing our foundation. The foundation we build today shapes our community's long-term voyage. We want to incentivize the leaders of our community to always serve scientists first and foremost. The community needs to stay relevant to make it into the next century. Just as the Internet changed everything about how science can be communicated, we expect future revolutionary technologies to change how science is done. Our community needs to be ready for this.

Reviewers is a step in the right direction, opening up a constant dialogue between the founders and the community. The intention of Reviewers is to move us toward our mission, standardize our definition of what good science is, and begin the evolution of community governance.

Tangible Progress

We have a working version of our Reviewers product live for all our reviewers. This section covers the subjective decisions made by the founders and what we have learned in the first month. Below is a screenshot of our review interface.

https://d3t9s8cdqyboc5.cloudfront.net/images?path=5/1JvYoQTRWu6NVts0UG37_Review | Temporal dynamics of mutation accumulation in trees of the Chernobyl Exclusion Zone | Experiment 2018-07-24 10-23-07.png&width=650&height=