Tokenizing the Environmental Attributes of Liquefied Natural Gas

PAYNE INSTITUTE COMMENTARY SERIES: COMMENTARY

By Brad Handler and Mark McCurdy

July 9, 2025

EXECUTIVE SUMMARY

Efforts to combat greenhouse gas emissions are spawning the development of systems to track, record and share the carbon footprint and other environmental attributes of various operations.

Methane emissions from the natural gas industry are one such area. This is driven in part by regulations coming into force in the European Union, which will prescribe maximum levels of methane emissions associated with the production of liquefied natural gas (LNG) entering the region.

Many of the systems currently in place to track and transact environmental attributes, for example existing carbon offset registries, are inadequate, in part because they are susceptible to manipulation or error: they rely on manual entry that can be changed by those managing the registries. A system that can support the legal requirements and scale of a global LNG market, i.e., one able to readily manage the varying needs of international buyers, producers, regulators, and financial institutions, needs to be digital, with entries that are immutable and independently verified, and must be based on independently established measurement standards and protocols.

An ecosystem of actors has formed to address the “value chain” of digital methane emissions tracking. One set are measurement service providers that help determine emissions at different points across the LNG value chain. This paper, however, focuses on a set of service providers that:

  • aggregates and synthesizes data, pulling together independently measured elements (e.g. production volumes and methane emissions) into a single information system;
  • establishes that the gas and the processes used to derive/calculate the environmental attributes conform with independent standards (e.g., methane emissions intensity, the use of mass balance calculations);
  • “tokenizes” the attributes (including the environmental attributes) of the LNG, which is to say they create digital units that act as a form of record of these environmental attributes or that can become financial instruments/securities.

This paper is laid out as follows. The Introduction offers background, including the advent of low methane emissions-designated gas in the U.S., some of the regulation that has emerged in the European Union (and that is being considered in select Asian markets), and earlier efforts at digitizing the environmental attributes of such gas. This will help illustrate the data/information requirements for LNG sales in these markets. The Measurement section describes the state of measuring and establishing reportable fugitive methane emissions at each level of the LNG supply chain. The Data Tokenization section begins by reviewing the data aggregation process and then describes the systems now in place to create and provide digitized registry services for Environmental Attributes (Certificates) of the natural gas and to create “tokens” that enable and facilitate transactions of those attributes; the section also discusses how tokenization is being taken a step further toward financially regulated securitization and the potential impact and benefits of such securitization to the industry. The Conclusion considers attributes of tokenization that likely best serve climate mitigation efforts going forward.

INTRODUCTION

Background

Producing natural gas with fewer fugitive methane emissions has roots in best commercial practice: reducing fugitive methane has allowed gas producers to sell that methane (gas) instead, supporting the necessary investment in hardware and systems.

However, formalizing a designation of gas as having been produced with low fugitive emissions, also referred to as having low methane emissions intensity, only began in 2019 in the U.S. as one facet of a system that denoted the gas as being responsibly sourced. (The “Responsibly Sourced” label included consideration of other social and environmentally favorable practices, for example producing gas with responsible water management and strong societal protections for indigenous communities.) Such gas was also referred to as “Certified” and “Differentiated.”

Adoption of practices to have gas designated as responsibly sourced grew rapidly, to over 30%[1] of all U.S. gas produced. This was concentrated in the gas-rich basins of Appalachia and the Haynesville[2], although there was penetration in the Permian basin and elsewhere.

Unlike some of the more societally oriented attributes associated with responsibly sourced gas, the determination of methane intensity was always intended to be quantitatively based. Recognizing the challenges of accurately measuring fugitive methane emissions, such quantification efforts initially relied primarily on the use of emissions factors, determined by the Environmental Protection Agency (EPA), for specific equipment at the well site. Recent field measurements (virtually all the studies over the last few years) have found, however, that the emissions factors meaningfully understate the emissions[3]. As knowledge has grown around accurately estimating emissions and technologies have emerged that lower monitoring costs at the well site and make more data available (e.g., from satellites launched by independent parties[4]), operators are increasingly deriving their well-site methane emissions estimates through direct measurement (still with some contribution from emissions factors as necessary).
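The underlying arithmetic is straightforward: methane emitted divided by methane produced, with each source contributing either an equipment-count-times-emissions-factor estimate or a direct measurement. The sketch below illustrates this blended roll-up; all equipment names, factors, and volumes are invented placeholders, not actual EPA figures.

```python
# Illustrative methane-intensity roll-up for a single well site.
# All emissions factors and volumes are made-up placeholders.

# Equipment-level estimates: count * emissions factor (kg CH4/yr each)
emission_factors = {"pneumatic_controller": 120.0, "tank_vent": 300.0}
equipment_counts = {"pneumatic_controller": 8, "tank_vent": 2}

factor_based_kg = sum(
    equipment_counts[eq] * ef for eq, ef in emission_factors.items()
)

# Directly measured sources (e.g., from continuous monitors), kg CH4/yr
measured_kg = 1_450.0

# Annual production expressed as kg of methane throughput
produced_kg_ch4 = 2_000_000.0

intensity = (factor_based_kg + measured_kg) / produced_kg_ch4
print(f"Methane intensity: {intensity:.4%}")
```

As direct measurement displaces factors, the `measured_kg` share of the numerator grows and the credibility of the resulting intensity figure improves.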

Early Tokenization

Tokens such as Fiùtur’s Methane Performance Certifications (MPCs) and EarnDLT’s Certified Environmental Tokens (CETs) emerged in 2022 and were conceived as a way for producers to get premium pricing on low methane emissions gas. Tokens gave producers the flexibility to sell their gas’ environmental attributes (i.e., selling the attributes unbundled from the underlying gas) while also improving the transparency of documentation for “bundled” sales.

Uptake of such tokens was lackluster. Although the voluntary nature of the market was no doubt a factor, the weak interest was also attributed to uncertainty about climate impact, given varying standards, and to the heavy reliance on emissions estimation noted above, which undermined the credibility of low emissions claims.

Demand for Low Methane Emissions Gas

As a driver of demand for low emissions gas, the EU has been the most important source. Demand initially came through voluntary corporate action, including large long-term offtake agreements for LNG produced with differentiated gas, such as those signed by Uniper with Southwestern[5] and Engie with NextDecade[6]. EU legislation as of August 2024, however, points to a significant expansion of the practice. The legislation includes a path to purchasing lower methane emissions intensity natural gas beginning in 2030, requires measurement, reporting, and verification (MRV) at the beginning of 2027[7], and imposes initial data reporting obligations this year[8]. The EU consumes 330 billion standard cubic meters per year[9] of natural gas, or ~8%[10] of the world’s total, and imports 80%[11] of it from outside its borders.

Other importing countries, including Japan and Korea, have also taken steps to mandate buying low methane intensity natural gas. Japan and Korea have also joined with exporters in Australia and the U.S. through the Coalition for LNG Emission Abatement toward Net-zero (CLEAN) initiative. The initiative is a commitment to work together to collect information on the status of methane emission management and emission reduction efforts for LNG projects through established best practices[12].

Standards and Documentation

In line with the commitments by these importers, industry organizations have begun to stipulate physical and reporting requirements on a low methane intensity basis. An example is the International Group of Liquefied Natural Gas Importers (GIIGNL), a third party made up of industry leaders[13], which has set a framework for establishing a standard for low carbon LNG. GIIGNL draws upon the International Organization for Standardization (ISO) to establish standards and certify the actions taken by LNG participants. It is important to note that GIIGNL addresses attributes beyond methane; the characteristics in the organization’s differentiated carbon footprint life cycle include the CO2 content of field gas, flaring and venting behavior, the emissions associated with power supply as required, carbon capture (if relevant), transportation distances, and carrier design and fuel used for shipping.

The need to represent the methane emissions intensity to buyers through the entire value chain of the production of LNG is the impetus for systems that can accurately capture and represent such emissions. Specialized platforms have emerged that can deliver this representation. The systems work by creating a “digital twin” of the produced LNG volumes, which includes all the available data regarding the characteristics of the asset, including but not limited to provenance, energy and water intensity of production, emissions measurements and factors of the site (and thereby allocated to the natural gas volumes). That digital twin is then “tokenized” to capture the characteristics of the natural gas in an immutable and distributed fashion.

Figure 1 outlines four steps involved in the process, from measuring emissions data to creating and securitizing the digital asset, along with the relevant parties that facilitate each function. Indicated in blue is the section where an overview of each step is provided. The sections that follow discuss these steps in greater detail.

Figure 1: Data Management Steps to Creating and Trading “Digital” Natural Gas. Source: Payne Institute

MEASUREMENT OF LNG CARBON FOOTPRINT

The production of LNG includes three broad steps: production and processing of natural gas, transportation of that gas to a liquefaction facility (distribution), and the liquefaction of the gas. The LNG is then transported (shipped) to (customers’) regasification terminals. The quantification of emissions in each phase varies in terms of how much is determined through direct measurement vs. emissions factors, based on the producers’ investment in measurement and practical considerations. The greenhouse gas and methane quantification methods employed at each stage of the supply chain are discussed below.

Production Stage

Historically, “measurement” of GHG emissions at the wellhead relied primarily on emissions factors that had been set by the EPA for specific pieces of equipment onsite, as noted in the Introduction. Over time, and with technological development that has lowered costs and increased access to data acquisition, best practice now includes integrating multiple sources of measurement — including in some cases continuous monitoring through fixtures at the well site, flyovers and satellites — paired with sophisticated data analysis techniques to interpret the measurements[14]. An independent service industry has emerged providing third-party methane emissions measurement services.

At the same time, industry consortia and others have stepped in to provide guidance and support for operators as they establish practices for both measurement and methane reduction verification. An example of the former is the Oil and Gas Methane Partnership (OGMP) 2.0[15]. OGMP 2.0’s membership encompasses over 140 companies from more than 70 countries, including the world’s biggest oil and gas producers[16]. The group has set requirements for members to reach a 45% reduction in methane emissions from 2015 estimated levels by 2025, and a 60-75% reduction by 2030. Alternatively, upstream members can strive for a collective average emissions intensity target of 0.25% by 2025.

OGMP 2.0 has established a range of five measurement (and reporting) practice quality levels. This offers a window into the investment commitment of members, who are required to reach Level 4 or 5 within three years of joining for operated assets and within five years for non-operated assets.

  • Level 1: Typically used by companies that have not undertaken any methane emissions source mapping and whose information is very limited. Reporting at this level does not differentiate between categories of emissions or source types. Emissions are quantified based on a high-level factor, assuming specific assets/ventures are comparable to others.
  • Level 2: Requires categorizing emissions sources (venting, fugitive losses, flaring, energy/combustion, other). Each category is quantified using generic emissions factors but can use more advanced quantification.
  • Level 3: Involves estimation at the specific asset level using generic, source-specific emissions factors.
  • Level 4: Requires source-level measurement and sampling to establish specific emissions and activity factors for estimation. Simulation tools to establish emissions factors can be used “when appropriate”.
  • Level 5: Level 4 plus site-level measurements to improve confidence in source-level measurements.
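The two OGMP 2.0 target paths described above (a 45% absolute reduction from the 2015 baseline, or a 0.25% upstream intensity) lend themselves to a simple either/or compliance check. The sketch below uses invented operator figures; the function name and inputs are illustrative, not OGMP’s own tooling.

```python
def meets_ogmp_2025_target(baseline_2015_kg: float,
                           current_kg: float,
                           produced_kg: float) -> bool:
    """True if either the 45%-reduction path or the 0.25% upstream
    intensity path is satisfied (the 2025 targets as described in
    the text). Inputs are kg CH4, with production as kg throughput."""
    reduction_ok = current_kg <= 0.55 * baseline_2015_kg
    intensity_ok = (current_kg / produced_kg) <= 0.0025
    return reduction_ok or intensity_ok

# Hypothetical operator: only 40% below baseline, but 0.2% intensity
print(meets_ogmp_2025_target(10_000.0, 6_000.0, 3_000_000.0))
```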

Another example of an advisory consortium is the Energy Emissions Modelling and Data Lab (EEMDL), which is developing tools to facilitate accurate estimates of emissions at natural gas production sites, including emissions event duration, and of emissions from shipping LNG[17].

Figure 2: EEMDL Activities. Source EEMDL[18]

Transmission Stage

With 3 million miles of natural gas pipelines in the United States[19], 1,700 compressor stations[20] and 1,400 interconnects[21], it is extremely complicated to track natural gas movement — and therefore the methane emissions — of any given natural gas molecule. Yet that will be required of LNG producers if they are to be able to ascribe methane emissions intensity to their cargoes.

The lack of comprehensive measurement systems across this pipeline network means that such tracking still involves estimation. Industry actors are, however, getting more sophisticated in their estimating processes. Liquefied natural gas company Cheniere offers an important example. The company’s recent study captured 138 unique pathways for its sourcing of natural gas for its liquefaction facilities in Corpus Christi, Texas and Cameron Parish, Louisiana[22], as depicted in Figure 3. It developed a unique emissions intensity for each pathway, using algorithms to determine the flowpath of the gas purchased and leveraging publicly available emissions data as well as data collected directly from its suppliers to inform the emissions intensities[23]. For suppliers that are natural gas producers, Cheniere’s model estimates emissions based upon its internal transaction system and data on contracted capacities (counterparty pipeline capacity rights and volumes), which then inform, via the algorithm, the production basin, whether the gas went through processing, and the unique set of pipeline miles and compressor stations the gas flowed through from producing region to one of Cheniere’s liquefaction terminals. For suppliers that are not producers, the model’s algorithm estimates gas origin based on the likely physical flow from one or more producing regions to the purchase location, using data sourced from electronic bulletin boards (EBBs) on scheduled flows.

  a) Pathways relevant to Cameron Parish, Louisiana. b) Pathways relevant to Corpus Christi, Texas.

Figure 3: Distribution GHG Emissions Physical Flow. Source: Cheniere Energy[24]

 

A basin-average production emissions profile is modeled when gas is purchased from non-producer counterparties or when facility-level emissions data are not available, though the latter is rare. The model applies basin-specific emissions intensity profiles based on EPA Greenhouse Gas Reporting Program (GHGRP) data for the latest available year. For transmission pipelines and compressor stations, the unique facilities are modeled where data exist (either publicly reported through the GHGRP or provided directly to Cheniere during an annual supplier data collection). When a pipeline or compressor station does not have facility-specific data available, a U.S. average emissions profile is applied based on GHGRP data[25].
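A pathway model of this kind can be sketched as a sum over segments: start from the production-stage intensity, then add each pipeline and compressor-station segment, taking facility-specific values where reported and a default average otherwise. All names and numbers below are invented placeholders, not Cheniere’s model or GHGRP values.

```python
# Placeholder default when a facility has no reported data
US_AVG_SEGMENT_INTENSITY = 0.0004  # kg CH4 per kg gas, illustrative

def pathway_intensity(production_intensity, segments, facility_data):
    """Sum the production-stage intensity plus each pipeline/compressor
    segment, falling back to a U.S.-average profile when a facility
    has no reported data. All values are illustrative."""
    total = production_intensity
    for facility_id in segments:
        total += facility_data.get(facility_id, US_AVG_SEGMENT_INTENSITY)
    return total

# Hypothetical pathway: basin -> two pipelines -> one compressor station;
# pipeline_B has no reported data, so the default average is applied.
reported = {"pipeline_A": 0.0003, "compressor_7": 0.0006}
print(pathway_intensity(0.0015,
                        ["pipeline_A", "pipeline_B", "compressor_7"],
                        reported))
```

Running this per pathway, and weighting pathways by purchased volume, yields the kind of pathway-level intensity spread Cheniere reports.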

Through this process, Cheniere found that high-emissions pathways had up to 6x higher emission intensities than low ones, and that individual pathways from basin to liquefaction facility can vary in emission intensity by up to 99%[26].

LNG Production and Shipment Stages

GHG emissions from the liquefaction process are generated from the power used to cool the gas. Emissions are estimated based on emissions factors, with variation in factors primarily dependent upon the hydrogen sulfide content, i.e., the classification of the gas as sweet or sour. GHG emissions from storage are minimal due to the heavily insulated tanks and the lack of systematic venting once in storage[27].

Transportation-related GHG emissions, such as via vessels and barges, are calculated through emissions factors that are specific to the type of transit, fuel, and molecule mix. The emissions factors are set by the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy[28].
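Shipping-stage accounting of this emissions-factor form reduces to fuel burned times a per-fuel factor. The factors below are placeholders for illustration, not the published DOE/EERE values.

```python
# Placeholder factors, kg CO2e per tonne of fuel burned — illustrative
# only, not the DOE/EERE published values.
FUEL_FACTORS = {"lng_boiloff": 2_750.0, "marine_diesel": 3_200.0}

def voyage_emissions(fuel_use_tonnes: dict) -> float:
    """Total voyage GHG emissions (kg CO2e) given tonnes of each
    fuel consumed in transit."""
    return sum(FUEL_FACTORS[f] * t for f, t in fuel_use_tonnes.items())

# Hypothetical LNG carrier voyage burning boil-off plus some diesel
print(voyage_emissions({"lng_boiloff": 900.0, "marine_diesel": 150.0}))
```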

 

DATA TOKENIZATION

Data Ingestion, Aggregation and Synthesis

Measurements taken in the field (and emissions that are still based on estimates) must be sent to service providers’ data management platforms to allow for synthesis, analysis and ultimately token creation of the gas’ environmental attributes. This data ingestion is increasingly being facilitated through automation, which increases efficiency and transparency and reduces error (by eliminating human data input). Remote, offline devices (agnostic to the sensor) can be configured to link smart meters and IoT sensors at assets (like equipment at well sites) to data management sources and to automatically transmit time-stamped data collected by field measurement devices directly to data aggregation platforms.

These same firms, including CarbonAi, CleanConnect.ai (part of its ProveZero offering)[29], Context Labs (Decarbonization-as-a-Service)[30] and Triangle Digital (AssetOS)[31], then aggregate the disparate elements of data, whether measured or estimated, into a data management platform. This “ties together” the various attributes of a given produced volume of gas — for example, it associates the measured/estimated methane emissions with production data and allows the calculation of outcomes such as GHG/methane intensity certifications (see Figure 4).

Figure 4: Automated Data Input and Aggregation. Source: CarbonAi[32]

Such calculation can be done according to the standards set by various methodologies; in other words, the calculation of GHG reductions (or emissions intensity reductions) can be done according to the requirements of approved methodologies of carbon market registries (both voluntary and regulatory) or other standards such as OGMP 2.0 or ISO 14067.

The data aggregator and the entity doing the calculations/data synthesis do not have to be the same (different service providers in fact have differing views on whether best practices allow for these functions to be conducted by the same party). Either way, good governance practices suggest that the data and the calculations should be verified by an independent party. Examples include Context Labs’ formal partnership with KPMG in which KPMG provides such verification[33] and Fiùtur’s automated, rule-based/role-based data verification mechanism (see Figure 5)[34]. That verification is tracked and becomes part of the “digital version” of the natural gas volume.

Figure 5: Fiùtur’s process to establish “Digital Fuels”. Source: Fiùtur

Mass Balance and Chain of Custody

It is worth noting that the data synthesis/analysis includes steps that (can) specifically align with the fact that LNG cargoes are comprised of multiple sources (and pathways) of natural gas. At least two of the profiled data synthesizers/tokenizers, CleanConnect.ai and Context Labs, integrate mass balance calculations to establish the carbon footprint for chain of custody[35]. This synthesis is consistent with the principles of a “Trace and Claim” system that follows the gas as it moves through its production steps and changes ownership[36]. Although using mass balance is not currently required in carbon footprint reporting standards, it is thought likely that future iterations of such standards will do so[37] and at least one voluntary standard, the International Sustainability & Carbon Certification (ISCC) does require it[38].
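Because a cargo blends gas from multiple sources, a mass-balance approach amounts to a weighted average: each input batch contributes its intensity in proportion to its share of the blended mass. A minimal sketch, with invented figures:

```python
def blended_intensity(batches):
    """Mass-balance a cargo's methane intensity from its input
    batches, given as (mass_tonnes, intensity) pairs. The blend's
    intensity is the mass-weighted average. Illustrative only."""
    total_mass = sum(m for m, _ in batches)
    return sum(m * i for m, i in batches) / total_mass

# Hypothetical cargo blended from three sources of differing intensity
cargo = [(60_000.0, 0.0010), (30_000.0, 0.0020), (10_000.0, 0.0040)]
print(f"{blended_intensity(cargo):.4%}")
```

In a trace-and-claim system, the same arithmetic runs at each custody transfer, so the blended figure remains reconcilable against the inputs all the way down the chain.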

Digital Twinning/Tokenization

The aggregation, synthesis and calculation (and verification/validation) of attribute data for natural gas/LNG volumes is part of a process of creating a “digital twin” of that volume. In other words, digital twinning captures a record of all the attributes — physical, environmental, and financial (depending on platform) — such that there is a digital representation of that volume from production to delivery to the end user[39].

These providers then create “tokens” through the accumulation of digital twins for a fixed amount of natural gas; these tokens are generally referred to as Environmental Attribute Certificates, or EACs[40] and function as an environmental accounting mechanism.

The tokens are created by uploading encrypted data onto a network sync domain via a smart contract. Sync domains sequence (i.e., establish the order of) and timestamp data onto the registry’s network; they are servers through which a party connects to a distributed ledger to access its information/assets and to reach counterparties for transaction processing.
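The sequencing role described above, and the immutability it supports, can be illustrated with a minimal hash-chained log: each entry commits to the previous entry’s hash, so any after-the-fact edit breaks every subsequent link. This is a toy using Python’s hashlib, not any provider’s actual implementation (timestamps are omitted for brevity).

```python
import hashlib
import json

def append_entry(ledger: list, payload: dict) -> None:
    """Append a sequenced, hash-chained entry; tampering with any
    earlier entry invalidates every later hash. Toy illustration."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    body = {"seq": len(ledger), "prev": prev_hash, "payload": payload}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    ledger.append({**body, "hash": digest})

def verify(ledger: list) -> bool:
    """Recompute every hash and check the chain links."""
    prev = "0" * 64
    for i, e in enumerate(ledger):
        body = {"seq": i, "prev": prev, "payload": e["payload"]}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["prev"] != prev or e["hash"] != digest:
            return False
        prev = digest
    return True

ledger = []
append_entry(ledger, {"volume_mmbtu": 10_000, "ch4_intensity": 0.0012})
append_entry(ledger, {"event": "custody_transfer", "to": "buyer_x"})
print(verify(ledger))
```

Production sync domains add distribution, permissioning, and smart-contract logic on top, but the core tamper-evidence property is the same.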

Several providers stress that their tokens include data synthesis that conforms with reporting standards (this is an extension of the data synthesis that conforms with given protocols, as discussed in the previous section). EarnDLT’s current tokens, Quantified Emissions Tokens (QETs)[41], for example, ensure that the measurement protocols conform to designated upstream and midstream reporting standards (e.g., OGMP 2.0, Veritas, ISO 14067), supply chain standards (GIIGNL, ISCC, GREET, etc.), and ESG/GHG reporting standards (GRI, SBTi, ISO 14064-1)[42].

Tokens are stored in distributed ledgers to ensure data is immutable and made available to relevant constituencies (including buyers, regulators, financing and accounting institutions that may support a buyer or seller, and others). Relatedly, the providers generally offer “registry” services, in which they can catalogue and “manage” the handling of the EACs, including tracking their “status”, i.e., where along the value chain the gas currently resides, and retiring them (once they have been used). Further, some providers offer trading platforms to execute commercial transactions.

Although the companies mentioned in this section provide similar and overlapping services, there are some distinctions. One is the form of distributed ledger technology used for their tokens, which is discussed in detail in the next sub-section. Another is how many of the discrete steps each provides. This ranges from CleanConnect.ai, which is a “one-stop shop”, to those that are more specialized. CleanConnect.ai offers its own sensing hardware line under the Minerva brand name[43], although it can work with others’ hardware as well. Then, as noted above, the company provides data ingestion, aggregation and synthesis, assessing the environmental attribute data against the ISO 14067 standard. And it issues tokens and offers both registry services and a trading platform (“ProveZero”) to trade its EACs[44]. Other firms are somewhat more specialized.

There has been some “teaming up” in the space to provide end-to-end services through tokenization, such as between CarbonAi and Fiùtur[45]. And some token providers are teaming up with trading services. To offer one example, Context Labs stores its EACs on its CLEAR (Context Labs Environmental Attribute Registry) path, and they are designed to be traded on a marketplace of the buyer’s choosing. But Context Labs also has a relationship with EarnDLT to take existing certificates from Project Canary, from the upstream segment of the energy value chain, and include them in an End-to-End Pathway Environmental Attribute issued by Context Labs[46].

Use/Trading (Distributed Ledger Technologies)

Tokenization service providers use different forms of distributed ledger technologies (DLTs). DLTs allow for the operation of transactions without the need for a central authority (which traditionally would be a bank or government). Blockchain is the most well-known example of a DLT.

Within blockchain, servers can be segmented into public and private domains. In a public domain, the blockchain is transparent, meaning transactions can be seen by anyone, although, importantly, details pertaining to the transaction can be kept private (subject to encryption) and the wallet addresses are pseudonymous. CleanConnect.ai’s ProveZero is an example of a tokenization platform using a public blockchain.

In private, or permissioned, blockchain domains, the data owner has more control over access to information. However, private blockchains lack the access and transparency that makes the environmental attribute certificate data widely auditable. Context Labs uses this form of blockchain, for example.

Some tokenizers are using systems to manage the privacy required for some constituents while retaining the advantages of being on a public blockchain. Fiùtur’s tokenization infrastructure utilizes the Canton Network, a “public network of permissioned subnets”[47]. In other words, it offers privacy (like a private blockchain) but on a public network. This manifests in two primary differences between it and purely public blockchain-enabled platforms. The first is the ability to enable selective visibility of active contract sets: each party involved in a transaction is entitled to view only the parts distinctly defined by privacy conditions. The data owner grants permission to access their data at their discretion; in other words, it is a far more flexible structure than public blockchain networks offer.

The second difference is that because Canton is a “network of networks”, it allows interoperability between participants’ own blockchain systems; in other words, the network allows different systems to connect.

The company argues that both attributes are instrumental for meaningful engagement by the financial services industry, because financial firms have (1) distinct privacy limitations that preclude the use of standard blockchain networks (such as Ethereum) and (2) proprietary smart contract networks. The Canton Network is thus designed to let financial institutions work with environmental attribute tokens in ways they would otherwise struggle with because of data privacy concerns. These institutions will be able to integrate the climate impact of this low emissions gas/LNG into their portfolios, be it for specific loans (e.g., where the interest rate is tied to specific climate-related performance) or at a portfolio-wide level, as they document portfolio decarbonization commitments[48].

As noted above, blockchain systems are just one form of DLT. EarnDLT offers an example of a tokenizer using Hedera’s Hashgraph network, which claims efficiencies in processing and energy use due to a different architecture. Traditional blockchains organize data into sequential blocks linked in a chain and therefore must eliminate transactions that might create a competing chain. Hedera Hashgraph instead utilizes a Directed Acyclic Graph (DAG) structure, in which transactions can be incorporated (woven together into the whole), allowing for more parallel processing. It claims to offer the same privacy/privileging capacity as the Canton Network described above[49].

As noted, the Fiùtur and EarnDLT systems allow for privacy and permissioning of access to data, which is critical to engaging key constituencies, including sources of capital (financial firms) and participants across the value chain. Such versatility might be enhanced if the oversight/governance surrounding the process were taken further still.

From Tokens to Financial Securities

Most registries currently are “unqualified custodians” and the tokens themselves are produced into a voluntary market. Triangle Digital, another distributed ledger platform utilizing blockchain technology, aggregates both financial and ESG data under one asset and is designed to convert a token into a financially regulated security that can be used across the financial services industry (see Figure 6). The company notes that in addition to the financial services applications mentioned above, securitization allows the tokens to function as collateral and thus creates a basis for the financial industry to lend against; securitization can also foster more trading and investment in the tokens themselves[50].

Triangle’s Bermuda subsidiary is licensed to conduct its digital asset business by the Bermuda Monetary Authority. The Bermuda body requires supervisory procedures to create securities, including Know Your Customer (KYC)/Anti-Money Laundering (AML) compliance, financial analysis, audits, issuance, and listing. Triangle is pursuing regulation from the U.S. Financial Industry Regulatory Authority (FINRA); FINRA approval would legitimize the digital assets, making them tradable through U.S. institutions[51]. In Triangle’s vision, the components of a regulated security include that issuing entities have Written Supervisory Procedures (WSP), follow KYC/AML practices, have minimum capital requirements, and have their practices overseen by the National Futures Association; that custodians of the securities be “qualified” in the same way as those for other financial securities; and that the National Institute of Standards and Technology oversee a National Certification Board of methodologies and protocols for these digital assets[52].

Figure 6: Climate and operating data supply chain. Source: Triangle Digital[53]

CONCLUSION

Many systems currently in place to track environmental attributes/climate-related data, including those employed in some carbon markets, are inadequately robust to satisfy the verification requirements associated with legislation and formal (country and company) commitments tied to climate mitigation. Instead, a new architecture for recording emissions intensity and determining compliance will be required that does not allow for manipulation after-the-fact and does allow for scrutiny by multiple parties, including government/regulators, industry and the financial sector, while offering data privacy as demanded by the commercial parties involved.

Nowhere is demand for this transparency, immutability and ability to assure compliant practices clearer than in LNG, for which the European Union has established the requirement for reporting (and thus allowing for verification of) methane emissions intensity of its imports beginning in 2027. But a world that is to faithfully progress towards minimizing its carbon emissions will need to apply such architecture much more broadly than in just the EU and natural gas.

At the same time, and with the same systems, a robust documentation architecture that is fully digitized on distributed ledgers, along with product development procedures that comply with regulated (governance) policies akin to those applied to financial securities, also allows for the productization of environmental attributes like methane emissions intensity. Such productization, like the MPC and QET instruments discussed in this paper that allow for transactions of these attributes (unbundled from the underlying natural gas), holds the potential to significantly broaden the universe of participants and thus to dramatically increase the funds available to be put towards climate mitigation activities.

That there are varying approaches to data capture, processing and digitization/tokenization (and, in particular, variance in the DLTs used to transact and register tokens) seems consistent with a nascent industry. It isn’t obvious that one system or process must “win out”, although we offer some observations. First, adequate “checks” on data and calculations, i.e., as part of governance processes, are very important; the protocols must stand up to reasonable scrutiny. Second, automated approaches to data handling (subject to independent verification, spot checks, etc.) appear to be key to establishing the integrity of the data; manually entered data is likely too prone to error (or manipulation) to satisfy reasonable standards, as well as being inefficient. Third, the financial services industry has distinct and differing privacy commitments and concerns that appear to conflict with the public nature of many blockchain networks. And fourth, national and even supranational entities are likely to be engaged in registering and approving environmental attribute data (e.g., LNG importers in the EU will have to register the methane emissions intensity of their purchased product with the EU); these entities may well use different architectures, i.e., different types of blockchain, for that purpose.

Irrespective of the approaches settled on by the industry for data management and tokenization, however, the capacity to reliably track, verify and disseminate environmental attribute information exists and holds tremendous potential for these attributes to be used to meet decarbonization goals. As countries/regions and corporations prime themselves for massive growth in the volume of climate mitigation activities, it is time that formal processes for managing the data accompanying those efforts follow suit.


ACKNOWLEDGEMENTS

The authors gratefully acknowledge representatives of the following firms, for their teaching and support: CarbonAi, Cheniere, CleanConnect.ai, Context Labs, EarnDLT, Fiùtur and Triangle Digital.

References

[1] MiQ. Certified Gas: A Cost-Effective Path to Emission Reduction

[2] S&P Global. Record volume of certified gas hits US markets after strong commitments in 2021

[3] Natural Gas Supply Collective. Comparison of Natural Gas Certification Programs

[4] Fiùtur. Fiuturx.com/Solution

[5] LNG Journal. Uniper signs agreement US natural gas supplies from LNG feed-gas player Southwestern Energy. 6/21/22

[6] NextDecade. NextDecade and ENGIE Execute 1.75 MTPA LNG Sale and Purchase Agreement. 5/2/22.

[7] International Energy Agency (IEA). EU Methane Regulations – Policies

[8] Clean Air Task Force. Harnessing Data-Driven Accountability: How ‘Following the Money’ Can Track Fossil Fuels Across the Supply Chain. 2/25/25

[9] European Commission. Quarterly Report on European Gas Markets, Vol. 17, Issue 4.

[10] Enerdynamics. Natural Gas Consumption: Global Outlook 2024-2034

[11] U.S. Energy Information Administration (EIA). Europe relies primarily on imports to meet its natural gas needs

[12] Japan Organization for Metals and Energy Security (JOGMEC). JOGMEC published the Results of the CLEAN initiative, “CLEAN Annual Report” at the LNG Producer-Consumer Conference 2024 : News Releases

[13] International Group of Liquefied Natural Gas Importers (GIIGNL). MRV-and-GHG-Neutral-Framework-1.pdf

[14] Project Canary. Homepage

[15] Oil and Gas Methane Partnership 2.0 (OGMP2.0). Starters Guide

[16] OGMP 2.0. Membership

[17] Energy Emissions Modelling and Data Lab (EEMDL). Tools and Resources

[18] EEMDL. Homepage

[19] U.S. Energy Information Administration (EIA). Natural gas pipelines

[20] The Elliott Group. The US Natural Gas Compression Infrastructure: Opportunities for Efficiency Improvements

[21] Interstate Natural Gas Association of America (INGAA). The Interstate Natural Gas Transmission System: Scale, Physical Complexity and Business Model

[22] ACS Sustainable Chemistry & Engineering. S.A. Roman-White et al. Gas Pathing: Improved Greenhouse Gas Emission Estimates of Liquefied Natural Gas Exports through Enhanced Supply Chain Resolution. 11/25/24

[23] Ibid.

[24] Ibid.

[25] Ibid.

[26] Ibid.

[27] American Petroleum Institute (API). GHG Emissions from LNG Operations

[28] Ibid.

[29] CleanConnect.ai. ProveZero

[30] Context Labs. Solutions-DaaS

[31] Triangle Digital. Asset OS Platform

[32] CarbonAi. Homepage

[33] KPMG. KPMG U.S., Context Labs Announce Alliance to Enhance Credibility, Rigor of Environmental Measurement and Reporting. 4/26/23

[34] Fiùtur. Trust Infrastructure for the Transition Economy. 11/20/23

[35] Under mass balance, different sources of natural gas may be physically mixed, but are segregated on a bookkeeping basis. This allows some of the physical product to be sold as having a certain emissions footprint but prevents over-counting of how much product meets that standard. Mass balance must follow the physical flow of the material throughout the supply chain.
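The bookkeeping described above can be illustrated with a minimal sketch. The class, method names, and volumes below are hypothetical and do not reflect any provider’s actual system; the sketch simply shows how certified and conventional gas can be commingled physically while a ledger caps the volume that may be sold as certified, preventing over-counting.

```python
# Illustrative mass-balance bookkeeping (hypothetical; not any provider's API).
# Certified and conventional gas share the same physical stream, but sales
# may carry the "certified" attribute only up to the certified volume injected.

class MassBalanceLedger:
    def __init__(self):
        self.certified_in = 0.0   # certified volume injected (e.g., MMBtu)
        self.certified_out = 0.0  # certified volume already claimed on sale

    def inject(self, volume, certified):
        # Record an injection; only certified volumes add to the claimable pool.
        if certified:
            self.certified_in += volume

    def sell_certified(self, volume):
        # A sale may carry the certified attribute only if enough
        # certified volume remains unclaimed in the ledger.
        available = self.certified_in - self.certified_out
        if volume > available:
            raise ValueError(
                f"over-claiming: only {available} certified units remain")
        self.certified_out += volume

ledger = MassBalanceLedger()
ledger.inject(100.0, certified=True)   # certified production
ledger.inject(400.0, certified=False)  # conventional production, same pipe
ledger.sell_certified(80.0)            # allowed: 100 certified units available
# ledger.sell_certified(30.0)          # would raise: only 20 units remain
```

A real implementation would also track the physical flow of the commingled material across the supply chain, as the footnote notes mass balance requires.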

[36] Clean Air Task Force. Harnessing Data-Driven Accountability: How ‘Following the Money’ Can Track Fossil Fuels Across the Supply Chain. 2/25/25

[37] Carbon Trust. Meet customer calls for lower carbon products through a mass balance approach. 6/9/25

[38] International Sustainability & Carbon Certification (ISCC). ISCC EU Mass Balance Guidance Document Version 1.0.

[39] Of the canvassed purveyors mentioned in this section, CleanConnect.ai appears to have interpreted digital twinning more expansively. It creates a digital mirror of the operation/site as well as the volume.

[40] Note that some providers use proprietary nomenclature for their tokens; for example, Fiùtur produces a Digital Commodity Unit (DCU).

[41] QETs replaced the CETs cited earlier in this report. As the name implies, QETs are meant to represent Environmental Attribute Certificates that are measurement-informed as opposed to CETs, which relied (heavily) on emissions factors such as those indicated in the EPA’s Greenhouse Gas Reporting protocols.

[42] EarnDLT. Unlock value in environmental attributes data

[43] CleanConnect.ai. Minerva Sensor Fusion systems

[44] CleanConnect.ai. ProveZero

[45] Fiùtur. Fiùtur and CarbonAi Empower Global Markets with “Digital Integrity” for Energy Transition Claims

[46] Context Labs. Context Labs and EarnDLT Announce Collaboration to Generate Quantified Emissions Attributes at Scale through integrating Context Labs Decarbonization as a Service (DaaS™) Platform with EarnDLT’s Quantified Emissions Tokens®. 8/15/24

[47] Canton. Canton Network: A Network of Networks for Smart Contract Applications

[48] Fiùtur. Fiùtur and Digital Asset Announce Tokenization of Energy Supply Chain Attributes; March 5, 2025

[49] Hedera. Hedera: A Public Hashgraph; Network and Governing Council

[50] Triangle Digital. Triangulating Data, Assets & Finance

[51] Triangle Digital. Regulated Carbon Credits Policy; Triangulating Climate, Data & Finance

[52] Triangle Digital. Triangulating Data, Assets & Finance

[53] Triangle Digital. Digital Twinning

ABOUT THE AUTHORS

Brad Handler, Payne Institute Program Director, Energy Finance Lab, and Researcher
Brad Handler is a researcher and heads the Payne Institute’s Energy Finance Lab. He is also the Principal and Founder of Energy Transition Research LLC. He has recently had articles published in the Financial Times, Washington Post, Nasdaq.com, Petroleum Economist, Transition Economist, WorldOil, POWER Magazine, The Conversation and The Hill. Brad is a former Wall Street Equity Research Analyst with 20 years’ experience covering the Oilfield Services & Drilling (OFS) sector at firms including Jefferies and Credit Suisse. He has an M.B.A. from the Kellogg School of Management at Northwestern University and a B.A. in Economics from Johns Hopkins University.

Mark McCurdy, MS Mineral and Energy Economics, Colorado School of Mines
Mark McCurdy is a student researcher at the Payne Institute. He is pursuing his M.S. in Mineral and Energy Economics at the Colorado School of Mines.

ABOUT THE PAYNE INSTITUTE

The mission of the Payne Institute at Colorado School of Mines is to provide world-class scientific insights, helping to inform and shape public policy on earth resources, energy, and environment. The Institute was established with an endowment from Jim and Arlene Payne and seeks to link the strong scientific and engineering research and expertise at Mines with issues related to public policy and national security.

The Payne Institute Commentary Series offers independent insights and research on a wide range of topics related to energy, natural resources, and environmental policy. The series comprises three categories: Viewpoints, Essays, and Working Papers.

Visit us at www.payneinstitute.mines.edu


DISCLAIMER: The opinions, beliefs, and viewpoints expressed in this article are solely those of the author and do not reflect the opinions, beliefs, viewpoints, or official policies of the Payne Institute or the Colorado School of Mines.