
Quickstart

Get set up with the onchain data center


To easily feed the Load Network docs to your favourite LLM, grab the compressed knowledge (aka LLM.txt) file from Load Network.

Let's make it easy to get going with Load Network. In this doc, we'll go through the simplest ways to use Load across the most common use cases:

Upload data

The easiest way to upload data to Load Network is to use a bundling service. Bundling services cover upload costs on your behalf, and feel just like using a web2 API.

The recommended testnet bundling service endpoints are:

  • upload.onchain.rs (upload)

  • resolver.bot (retrieve), e.g. https://gateway.load.rs/bundle/0x5eef8d0f9a71bbee9a566430e6b093f916900b7d6d91d34e5641768db4ee3ef7/0

Instantiate an uploader in the bundler-upload-sdk using this endpoint and the public testnet API key:

API_KEY=d025e132382aea412f4256049c13d0e92d5c64095d1c88e1f5de7652966b69af

Limits are in place for the public testnet bundler. For production use at scale, we recommend running your own bundling service as explained here, or cloning this example repo to avoid copy-pasting. The Rust Bundler SDK makes it possible for developers to spin up their own bundling services with support for large bundles.

Full upload example

import { BundlerSDK } from 'bundler-upload-sdk';
import { readFile } from 'fs/promises';
import 'dotenv/config';

// Point the SDK at the public testnet bundler; the API key is read from .env
const bundler = new BundlerSDK('https://upload.onchain.rs/', process.env.API_KEY);

async function main() {
  try {
    const fileBuffer = await readFile('files/hearts.gif');
    // upload() takes an array of { file, tags } objects and returns the bundle tx hash
    const txHash = await bundler.upload([
      {
        file: fileBuffer,
        tags: {
          'content-type': 'image/gif',
        }
      }
    ]);
    // The uploaded file is retrievable by bundle tx hash and item index
    console.log(`https://resolver.bot/bundle/${txHash}/0`);
  } catch (error) {
    console.error('Upload failed:', error.message);
    process.exit(1);
  }
}

main().catch(error => {
  console.error('Unhandled error:', error);
  process.exit(1);
});

Need to upload a huge amount of data?

The above example posts data in a single Load Network base layer tx. This is limited by Load's block size, so uploads top out at about 8 MB.

For practically unlimited upload sizes, you can use the large bundles spec (0xbabe2) to submit data in chunks. Chunks can even be uploaded in parallel, making large bundles a performant way to handle big uploads - see the sketch below.
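As a rough, non-authoritative sketch of the parallel-chunk pattern (the uploadChunk helper, the /chunk endpoint, and the x-chunk-index header below are hypothetical placeholders - consult the 0xbabe2 large bundles spec for the real chunk format and endpoints):

import { readFile } from 'fs/promises';

const CHUNK_SIZE = 4 * 1024 * 1024; // 4 MB: comfortably under the ~8 MB base layer cap

// Hypothetical helper: POST one chunk to a large-bundle endpoint.
// The real 0xbabe2 API differs; this only illustrates the upload pattern.
async function uploadChunk(chunk, index) {
  const res = await fetch('https://upload.onchain.rs/chunk', { // placeholder URL
    method: 'POST',
    headers: {
      'content-type': 'application/octet-stream',
      'x-chunk-index': String(index), // placeholder header
    },
    body: chunk,
  });
  return res.json();
}

const data = await readFile('files/big-video.mp4');

// Split the file into independent chunks...
const chunks = [];
for (let i = 0; i < data.length; i += CHUNK_SIZE) {
  chunks.push(data.subarray(i, i + CHUNK_SIZE));
}

// ...and upload them in parallel
const receipts = await Promise.all(chunks.map((chunk, i) => uploadChunk(chunk, i)));
console.log(`uploaded ${receipts.length} chunks`);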

Integrating ledger storage

Chains like Avalanche, Metis and RSS3 use Load Network as a decentralized archive node. This works by feeding all new and historical blocks to an archiving service you can run yourself, pointed at your network's RPC. Clone the archiver repo here.

As well as storing all real-time and historical data, Load Network can be used to reconstruct full chain state, effectively replicating exactly what archive nodes do, but with a decentralized storage layer underneath. Read here to learn how.

Using Load DA

With 125 MB/s data throughput and long-term data guarantees, Load Network can handle DA for every known L2, with 99.8% room to spare.

Right now there are 4 ways you can integrate Load Network for DA:

  1. As a blob storage layer for EigenDA

  2. As a DA layer for Dymension RollApps

  3. As an OP-Stack rollup

  4. DIY

DIY docs are a work in progress, but the commit to add support for Load Network in Dymension can be used as a guide to implement Load DA elsewhere.

Work with us to use Load DA for your chain - get in touch to get onboarded.

Migrate from another storage layer

If your data is already on another storage layer like IPFS, Filecoin, Swarm or AWS S3, you can use specialized importer tools to migrate.

AWS S3

The Load S3 SDK provides a 1:1 compatible development interface for applications using AWS S3 for storage, keeping method names and parameters intact so the only change should be one line: the import.
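As a sketch of what that one-line change can look like, assuming an AWS SDK v3 style client (the load-s3-sdk package name below is an illustrative placeholder, not the confirmed package):

import { readFile } from 'fs/promises';

// before: the stock AWS SDK v3 client
// import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

// after: the Load S3 drop-in (placeholder package name)
import { S3Client, PutObjectCommand } from 'load-s3-sdk';

const s3 = new S3Client({ region: 'eu-west-1' });

// Method names and parameters are unchanged from AWS S3
await s3.send(new PutObjectCommand({
  Bucket: 'my-bucket',
  Key: 'hearts.gif',
  Body: await readFile('files/hearts.gif'),
}));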

Filecoin / IPFS

The load-lassie import tool is the recommended way to easily migrate data stored via Filecoin or IPFS.

Just provide the CID you want to import to the API, e.g.:

https://lassie.load.rs/import/<CID>

The importer is also self-hostable and further documented here.
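To trigger an import from a script, a plain HTTP request is enough (a minimal sketch; the CID is just an example and the response shape isn't specified here, so the logging is illustrative):

// Trigger a Filecoin/IPFS import by CID via the hosted load-lassie API
const cid = 'bafybeigdyrzt5sfp7udm7hu76uh7y26nf3efuylqabf3oclgtqy55fbzdi'; // example CID
const res = await fetch(`https://lassie.load.rs/import/${cid}`);
console.log(res.status, await res.text());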

Swarm

Switching from Swarm to Load is as simple as changing the gateway you already use to resolve content from Swarm:

before: https://api.gateway.ethswarm.org/bzz/<hash>

after: https://swarm.load.rs/bzz/<hash>

The first time Load's Swarm gateway sees a new hash, it uploads the content to Load Network and serves it directly for subsequent calls. This effectively makes your Swarm data permanent on Load while maintaining the same hash.
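In code, the switch is nothing more than the base URL (a minimal sketch; the hash is a placeholder for your own Swarm reference):

// Same bzz path, different gateway: the first request triggers the upload to Load Network
const hash = '<your-swarm-reference>';
const res = await fetch(`https://swarm.load.rs/bzz/${hash}`);
console.log(res.status, res.headers.get('content-type'));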
