Metrics & Evidence
How we measure reach and impact
Baseline established: January 2026
This page documents how usage data is collected for the HashimaXR Learning Resource. It provides transparency about what we measure, what we cannot measure, and how this evidence may be used in research impact assessment.
📊 Analytics Baseline
Start date: January 2026
Platform: Netlify Analytics (cookieless, privacy-focused)
Data retention: Rolling 30-day window (Netlify default)
Historical exports: Monthly CSV exports archived for longitudinal analysis
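Because Netlify retains only a rolling 30-day window, longitudinal analysis depends on stitching the archived monthly CSV exports together. A minimal sketch of that stitching, assuming a simplified export schema (the column names `date`, `page`, `pageviews` and the overlap handling are illustrative assumptions, not Netlify's actual export format):

```python
import csv
import io

def merge_exports(exports):
    """Merge monthly CSV exports, deduplicating rows that appear in
    overlapping 30-day windows (keyed on date + page)."""
    seen = {}
    for text in exports:
        for row in csv.DictReader(io.StringIO(text)):
            # Later exports win for a given (date, page) key.
            seen[(row["date"], row["page"])] = row
    return sorted(seen.values(), key=lambda r: (r["date"], r["page"]))

# Two hypothetical exports whose 30-day windows overlap on 2026-01-25.
jan = "date,page,pageviews\n2026-01-20,/,40\n2026-01-25,/metrics,12\n"
feb = "date,page,pageviews\n2026-01-25,/metrics,12\n2026-02-03,/,55\n"

rows = merge_exports([jan, feb])
for r in rows:
    print(r["date"], r["page"], r["pageviews"])
```

The overlapping row appears once in the merged series, so month-on-month figures are not double-counted.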
What We Measure
Website Analytics
Netlify Analytics provides aggregate, cookieless metrics that do not track individual visitors or require consent banners.
| Metric | What It Tells Us | Limitations |
|---|---|---|
| Page views | Total requests for each page | Includes bots; no unique visitor count |
| Top pages | Most accessed content | Cannot distinguish new from returning visitors |
| Bandwidth | Data transfer volume | Proxy for engagement depth |
| Referrers | Where visitors come from | Many referrers hidden by browser privacy |
What We Cannot Measure
Cookieless analytics cannot provide: unique visitor counts, session duration, user journeys, return visit rates, or individual-level tracking. This is by design: we prioritise visitor privacy over granular metrics.
Evaluation Surveys
Voluntary surveys provide qualitative and quantitative evidence of learning outcomes. Three survey instruments are available:
| Survey | Target Audience | Data Collected |
|---|---|---|
| Learner Survey | Adult learners (18+) | Pre/post self-assessment, learning context, feedback |
| Educator Feedback | Teachers, lecturers | Implementation context, observed outcomes, recommendations |
| Practitioner Feedback | Heritage professionals | Professional relevance, application to practice |
All surveys are versioned (e.g., "Learner Survey v2.0, January 2026") to maintain data provenance across updates.
Adoption Reports
The Quick Adoption Report provides a lightweight mechanism for educators and institutions to register their use of the resource without completing a full survey. This creates verifiable signals of institutional reach.
How Evidence Will Be Used
Research Impact Assessment (REF 2029)
Aggregated metrics and survey findings may be included in impact case study documentation submitted to the UK Research Excellence Framework. Evidence submitted may include:
- Total page views and geographic distribution of access
- Number of survey responses and aggregated findings
- Documented institutional adoptions
- Testimonials (with explicit consent)
Individual survey responses are never identifiable in published findings. Only aggregated, anonymised data is reported.
Data Provenance
To ensure evidence integrity for research assessment:
- Analytics: Monthly CSV exports, each timestamped
- Survey data: Stored in Netlify Forms with submission timestamps
- Version control: All survey instruments are versioned
- Changelog: Site changes documented in Version History
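One way to make the archived exports tamper-evident is to record a checksum manifest alongside each file. This is an illustrative technique, not a documented part of the project's workflow; the filename and fields shown are assumptions:

```python
import hashlib
import json
import tempfile
from pathlib import Path

def manifest_entry(path: Path) -> dict:
    """Return a provenance record for one archived export:
    filename, size in bytes, and SHA-256 digest of the contents."""
    data = path.read_bytes()
    return {
        "file": path.name,
        "bytes": len(data),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

# Demo with a throwaway file standing in for a monthly export
# (hypothetical filename).
tmp = Path(tempfile.mkdtemp())
export = tmp / "netlify-analytics-2026-01.csv"
export.write_text("date,page,pageviews\n2026-01-20,/,40\n")

entry = manifest_entry(export)
print(json.dumps(entry, indent=2))
```

Re-running the digest at assessment time and comparing it against the stored manifest confirms the archived export has not changed since collection.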
Questions?
For questions about data collection or research impact documentation, contact the project lead:
Dr Christopher Gerteis
Senior Lecturer in Modern Japanese History
SOAS University of London
cg24@soas.ac.uk