Why clarity matters in sustainability case studies
Sustainability case studies are a common way to show progress and attract customers. When carbon or energy performance is included, unclear presentation can create the impression of greenwashing. Clear reporting preserves credibility, supports internal decision making, and reduces legal and reputational risk.
Core elements every credible case study must include
A single affirmative claim about CO2 or energy without supporting detail is the most common source of confusion. At a minimum, a credible case study should offer the following elements so readers can assess the validity of its results.
- Scope and boundaries: Describe which parts of the product or service are included. Are you reporting emissions for a single page view, a feature, a whole application, or a supply chain item?
- Time period: State the dates the measurement covers and whether the result is a snapshot or an average over a period.
- Metric and units: Use standard units such as kilograms CO2e per unit of service and explain what a unit means in context.
- Methodology: Name the calculation method or standard used and summarise key steps. If you used a custom model, provide the model logic and assumptions.
- Assumptions and exclusions: List what you assumed and what was deliberately excluded. Common items to call out include third party services, user device energy, and embodied emissions.
- Baseline or comparator: Explain what the result is compared to. Is it an absolute reduction from a previous period, a change versus a specific competitor configuration, or an efficiency improvement per unit?
- Uncertainty and sensitivity: Report an uncertainty range and a short sensitivity analysis for the inputs that matter most.
- Verification or review: Say whether the data or method has been externally verified, peer reviewed, or internally audited.
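One way to make these elements operational is to treat them as a structured record that a publishing workflow can check. The sketch below is a hypothetical Python illustration; the field names and the check are illustrative, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class CaseStudyDisclosure:
    """Hypothetical record of the minimum disclosure elements (illustrative names)."""
    scope: str                      # included parts of the product or service
    period: str                     # dates covered; snapshot or average
    metric: str                     # e.g. "kg CO2e per 1000 sessions"
    methodology: str                # named standard or summary of a custom model
    assumptions: list = field(default_factory=list)
    exclusions: list = field(default_factory=list)
    baseline: str = ""              # what the result is compared to
    uncertainty: str = ""           # range and its main drivers
    verification: str = ""          # external, peer, or internal review

    def missing_elements(self) -> list:
        """Return the names of disclosure fields left empty."""
        return [name for name, value in vars(self).items() if not value]
```

A pre-publication check could, for example, refuse to publish any case study whose missing_elements() list is non-empty.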
How to present CO2 results without misleading readers
Presentation choices shape how claims are interpreted. Use these practical rules when you design the narrative and visuals for results.
Name the primary metric first
Start with the precise metric rather than a marketing statement. For example, lead with kilograms CO2e per user session and then add context such as the activity that defines a session.
Display boundaries prominently
Place a concise summary of scope and important inclusions or exclusions near the headline metric. Readers should not have to hunt through a long appendix to learn whether device energy or cloud emissions were counted.
Use consistent comparators
When you compare before and after numbers use the same measurement method and the same boundaries. If any aspect of the measurement changed between periods, explain how that affects comparability and, where possible, recalculate prior results to the new method.
Quantify uncertainty
Numerical results without uncertainty communicate false precision. Provide a plausible range and a short explanation of the largest sources of uncertainty such as activity data quality or emission factors.
Avoid relative language without context
Claims like "lower" or "more efficient" are meaningful only when anchored to a specified metric and baseline. Replace vague phrases with explicit statements such as "18 kilograms CO2e per 1000 requests, compared to 24 kilograms CO2e per 1000 requests three months earlier."
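The arithmetic behind such a statement is simple, and automating it removes one source of inconsistency. The helper below is a minimal sketch, assuming the convention of reporting both absolute values and the percent change they imply; the function name and wording are illustrative.

```python
def describe_change(current: float, baseline: float, unit: str) -> str:
    """Render a comparison as an explicit statement with absolute values
    and the percent change relative to the stated baseline."""
    pct = (baseline - current) / baseline * 100
    return (f"{current:g} {unit} compared to {baseline:g} {unit}, "
            f"a {pct:.0f} percent reduction")

# The example from the text: 18 versus 24 kg CO2e per 1000 requests
statement = describe_change(18, 24, "kg CO2e per 1000 requests")
```

Keeping the baseline as an explicit function argument also makes it harder to publish a percentage with no stated comparator.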
Separate operational performance from offsets
If you used offsets or certificates to claim neutrality, present operational emissions and offsetting actions in separate sections. Do not fold net claims into performance metrics without clear labeling and documentation of the offset type and timing.
What to avoid in wording and visuals
Certain common choices create a greenwashing risk even when the underlying data are accurate. Avoid the following.
- Headlines that imply full life cycle coverage when the study only covers operational activity
- Percent improvements without absolute values or baseline dates
- Cherry picked time windows that make short term variability look like structural change
- Using icons or badges that suggest third party certification unless such certification exists and is current
Two practical case study templates
The templates below can be adapted to different audiences. Each template explains what to put in the short narrative and what to include in a technical appendix.
Template A: Customer facing summary
Use this format when the audience is customers, sales teams, or the general public. Keep the main page concise and link to the technical appendix.
- Headline metric: One sentence stating the metric, unit, and time period
- One line scope: Which activities are included and any major exclusions
- Key result: Absolute value and percent change against an explicit baseline
- What changed: Two to three short points explaining interventions and why they reduced emissions
- Credibility flags: Mention verification, standards used, and a link to the appendix
In the technical appendix provide raw input data ranges, calculation steps, emission factors used, assumptions, and an uncertainty statement.
Template B: Technical report summary
Use this format for sustainability teams, procurement reviewers, and auditors who need reproducibility.
- Executive summary: Single paragraph with headline metric and scope
- System boundary diagram: A short textual or visual diagram describing included components
- Data sources: Table of datasets, timestamps, and provenance
- Calculation method: The step by step logic, equations if needed, and emission factors with versions
- Sensitivity analysis: Show how key inputs change the final result
- Verification: Describe any internal checks and provide links to external reviews if available
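A common form of sensitivity analysis is one-at-a-time variation: perturb each input by a fixed relative swing while holding the others fixed, and report how the final metric moves. The sketch below assumes a deliberately simple hypothetical model (emissions as energy times grid factor times PUE); all figures are illustrative.

```python
def one_at_a_time_sensitivity(inputs, compute, swing=0.10):
    """For each input, vary it by +/- swing while holding the others fixed,
    and return the base result plus each input's effect on the metric."""
    base = compute(inputs)
    ranges = {}
    for name, value in inputs.items():
        lo = compute({**inputs, name: value * (1 - swing)})
        hi = compute({**inputs, name: value * (1 + swing)})
        ranges[name] = (min(lo, hi) - base, max(lo, hi) - base)
    return base, ranges

# Hypothetical model: kg CO2e = energy_kwh * grid_factor * pue
inputs = {"energy_kwh": 1200.0, "grid_factor": 0.4, "pue": 1.5}
compute = lambda p: p["energy_kwh"] * p["grid_factor"] * p["pue"]
base, ranges = one_at_a_time_sensitivity(inputs, compute)
```

Sorting the resulting ranges by magnitude gives the tornado-style ordering that tells readers which inputs matter most.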
How to report third party and device emissions
Many teams underreport emissions by omitting third party services or end user device energy. If these items are material to the activity under study, include them or state why they were excluded. When you include third party services, document the data provider, the age of data, and the method for apportioning shared services to the studied activity.
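Apportionment of a shared service usually reduces to allocating its total emissions in proportion to the studied activity's share of usage. The sketch below uses request volume as the allocation key; the figures and the choice of key are hypothetical, and the right key depends on what drives the shared service's load.

```python
def apportion_shared_emissions(total_kg: float, activity_units: float,
                               all_units: float) -> float:
    """Allocate a shared service's total emissions to the studied activity
    in proportion to its share of total usage (here, request volume)."""
    if all_units <= 0:
        raise ValueError("total usage must be positive")
    return total_kg * (activity_units / all_units)

# Hypothetical: a shared CDN emitted 500 kg CO2e over the period; the studied
# application generated 2 million of the 50 million requests it served.
share = apportion_shared_emissions(500.0, 2_000_000, 50_000_000)
```

Whatever allocation key is used, it should be named in the appendix alongside the data provider and the age of the data, as described above.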
Visuals that support clarity
Choose visuals that make assumptions visible rather than hiding them under styling. Good choices include a small table of assumptions, an annotated bar chart that shows absolute values and uncertainty ranges, and a callout box for the most important caveat. Avoid using stacked area charts that can hide which category drives the change unless each layer is clearly labeled and explained.
Checklist to run before publishing
Before publishing run a brief credibility check with these questions.
- Is the scope stated within the first visible screen of the case study?
- Is the metric expressed in standard units and accompanied by absolute values?
- Are the baseline and time period clearly identified?
- Are the major assumptions and exclusions listed and justified?
- Is uncertainty quantified and described?
- Is any offsetting separated from operational results and fully documented?
- Is there at least one internal or external review that confirms the approach is consistent with recognized standards?
Examples of short illustrative narratives
The short narratives below are illustrative. They are anonymized and intended to show phrasing that is transparent and verifiable.
Example for a customer story
We reduced server energy per user session by 22 percent between January and June 2025. Measurement covers compute and data transfer for the core application only. Results are expressed as kilograms CO2e per 1000 sessions and are calculated using the GHG Protocol operational control approach and region specific emission factors. See the technical appendix for data tables, emission factors, and uncertainty ranges.
Example for a technical appendix opening paragraph
This appendix documents the inputs and calculations used to produce the headline metric. Boundaries exclude embodied emissions and third party analytics. Activity data were sourced from server logs and cloud provider billing. Emission factors are from the referenced database and reflect grid mix for the data center region during the measurement period. A Monte Carlo sensitivity on the three largest inputs yields a 90 percent confidence interval of plus or minus 15 percent on the headline value.
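A Monte Carlo sensitivity of this kind can be run with nothing beyond a random number generator: sample each input around its central value, recompute the headline metric, and read the interval off the sampled distribution. The sketch below is illustrative only; the model, the relative standard deviations, and the resulting interval width are invented for the example and will not reproduce the plus or minus 15 percent figure above.

```python
import random

def monte_carlo_interval(compute, inputs, rel_sd, n=10_000, seed=42):
    """Sample each input from a normal distribution around its central value
    (relative standard deviations are assumptions) and return the central
    result with an approximate 90 percent interval from the samples."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        sample = {k: rng.gauss(v, v * rel_sd[k]) for k, v in inputs.items()}
        results.append(compute(sample))
    results.sort()
    return compute(inputs), results[int(0.05 * n)], results[int(0.95 * n)]

# Hypothetical model: headline kg CO2e = energy_kwh * grid_factor * pue
inputs = {"energy_kwh": 1200.0, "grid_factor": 0.4, "pue": 1.5}
rel_sd = {"energy_kwh": 0.05, "grid_factor": 0.08, "pue": 0.03}
central, lo, hi = monte_carlo_interval(
    lambda p: p["energy_kwh"] * p["grid_factor"] * p["pue"], inputs, rel_sd)
```

Reporting the interval as a range around the central value, together with which inputs were sampled and how, is what lets a reviewer reproduce the stated confidence interval.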
Next steps for teams publishing case studies
Adopt a consistent template, publish a technical appendix, and build a lightweight internal review that checks for scope clarity and uncertainty disclosure. Over time a consistent approach will make claims easier to compare and harder to contest.