Research Observatory
What
A research hub that coordinates deliberation data, standardized evaluations, and researcher-practitioner collaboration. It provides secure access to historical process data, optimizes the limited research time available with participants, and develops comparative evaluation frameworks for the field.
Why
Overcome the bottlenecks of scarce participant and researcher time, enable comparative analysis across processes, improve research quality, accelerate field learning, and make the most of limited observation opportunities.
Problem Definition
Deliberation research faces two critical bottlenecks: (1) hard constraints on participant time available for evaluation and testing innovations, and (2) limits on researcher time and coordination. Researchers with novel evaluation ideas cannot feasibly work with all practitioners. The field lacks standardized evaluation methodologies, making comparative analysis impossible. Historical data is scattered and inaccessible. No system optimizes the allocation of scarce research opportunities or facilitates researcher-practitioner collaboration at scale.
Definition of Success
Create a functioning observatory serving 50+ deliberative processes annually within 2 years, with researchers choosing Observatory resources as their primary data source and practitioners adopting its standardized evaluation frameworks. For the deliberation field as a whole: enable comparative research across 100+ historical processes; reduce researcher coordination time by 70%; raise research quality by establishing standardized metrics; and save ~$1M annually now lost to duplicated or incompatible research efforts.
Requirements
- Core Observatory Functions: secure data repository for citizens’ assembly data; research opportunity allocation system (‘telescope time’ coordinator); and researcher-practitioner matchmaking platform.
- Data & Evaluation: standardized evaluation methodologies and metrics; automated data gathering tools and protocols; historical data migration and cleaning capabilities; and privacy-compliant data sharing agreements.
- Key Challenges: ensuring data privacy and participant consent; balancing open access with security needs; standardizing across diverse cultural contexts; and establishing incentives for researchers and practitioners to share data.
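To make the Data & Evaluation and "telescope time" requirements concrete, here is a minimal sketch of what a standardized evaluation record and a participant-time allocator could look like. All class names, fields, and the greedy priority rule are illustrative assumptions, not an existing Observatory design.

```python
from dataclasses import dataclass

# Hypothetical standardized evaluation record: one metric observation
# from one deliberative process, collected with a named instrument so
# results are comparable across processes.
@dataclass
class EvaluationRecord:
    process_id: str   # e.g. an identifier from a shared process registry
    metric: str       # standardized metric name, e.g. "perceived_fairness"
    value: float
    instrument: str   # survey or protocol used to collect the metric

# Hypothetical research request competing for scarce participant time.
@dataclass
class ResearchRequest:
    researcher: str
    minutes_needed: int   # participant time the study would consume
    priority: float       # e.g. a review-panel score; higher = more urgent

def allocate_participant_time(requests, budget_minutes):
    """Grant requests in descending priority order until the
    participant-time budget for a process is exhausted (greedy)."""
    granted, remaining = [], budget_minutes
    for req in sorted(requests, key=lambda r: r.priority, reverse=True):
        if req.minutes_needed <= remaining:
            granted.append(req)
            remaining -= req.minutes_needed
    return granted, remaining

requests = [
    ResearchRequest("lab_a", minutes_needed=30, priority=0.9),
    ResearchRequest("lab_b", minutes_needed=45, priority=0.7),
    ResearchRequest("lab_c", minutes_needed=20, priority=0.8),
]
granted, left = allocate_participant_time(requests, budget_minutes=60)
print([r.researcher for r in granted], left)  # → ['lab_a', 'lab_c'] 10
```

A real allocator would likely need fairness constraints across research teams and per-participant consent checks; the greedy rule above only illustrates the coordination problem the "telescope time" system is meant to solve.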
Existing Limitations
Researchers work in silos without coordination. Data from past processes is often lost or inaccessible. Practitioner-researcher connections happen ad-hoc through personal networks. Limited participant time is allocated inefficiently across competing research needs.
Milestones
- Establish founding partnership with research foundation
- Deploy MVP data repository with 10 historical processes
- First coordinated multi-process research study
- 50+ processes using standardized evaluations
- Self-sustainability through grants and partnerships
Starting Points
- Partner Organizations: The New Democracy Foundation (existing research foundation), MosaicLab (practitioner), Unify America (practitioner).
- Researchers: Ali Cirone (LSE), Bailey Flanigan (MIT).
- Technical Infrastructure: build on existing research data repositories like Harvard Dataverse, UK Data Service.
- Standards: adapt from related fields like CONSORT for clinical trials, STROBE for observational studies.
- Existing Work: OECD’s Database of Representative Deliberative Processes, Participedia platform.