How was your experience with IPFS?


Discussion

Not OP, but I developed an OTA update system for electric motorbikes, along with sharing of their log files - it works well for this use case as I have a free 5GB account with web3.storage.

Skimmed your article, and yes, it's built by ETH maximalists, so I'm unsure of the reliability going forward, but it's great riding on the back of the whole DeFi wave for sharing relatively small files around in a distributed manner.

I imagine your app just talks to that backend for the files?

I bake the updates and push to IPFS.

The bikes grab OTA updates from IPFS and push their log files back to it.

Firebase is used to share the CIDs and some metadata.
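To give a rough idea of what gets shared, the Firebase side only carries a small pointer record since the payload itself lives on IPFS. A minimal sketch, with field names that are my own invention rather than the actual schema:

```
// Hypothetical shape of the metadata record shared via Firebase.
// Only the CID plus a few descriptive fields are stored; the bike
// fetches the actual bytes from IPFS using the CID.
const otaRecord = {
  cid: 'bafy...',          // IPFS CID of the firmware image (placeholder)
  type: 'ota-update',      // or 'log-file' for uploads coming back from the bike
  version: '1.4.2',        // example version string
  sizeBytes: 18734592,     // example size
  publishedAt: Date.now(),
};
```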

I intend to switch from Firebase to Nostr in the v2 release, and possibly Blossom.

Seemed great at first, but then I hit issues with files taking longer than acceptable to load, or not showing up at all.

Same as mine then, many years ago.

I've been using it on and off for years and it has massively improved in terms of speed. IPFS is basically like Dropbox now in terms of the time from upload to download availability.

Protocol Labs did raise almost half a billion, so some of it must have been put to use.

It's like a lot of things in technology: hurdles can be overcome, up to a point. Do you use any particular tools for your IPFS tasks that you feel make all the difference?

I'm forced to use the ipfs-car CLI now that web3.storage (and nft.storage) have made their process a bit more restricted. I avoid using their NodeJS libraries wherever possible since they change / break too often. https://web3.storage/docs/concepts/car/#ipfs-car

Previously you could shove any data you wanted at an upload endpoint and it would kindly return the IPFS CID; now you need to generate the CID and some other file-specific info yourself prior to uploading.

Thankfully getting data is still straightforward and can be requested anonymously from any number of IPFS gateways.
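For the read side, something as simple as a plain HTTPS request to a public gateway works; a minimal sketch (the gateway choice and CID are placeholders, and any gateway such as ipfs.io or dweb.link will do):

```
const axios = require('axios');

// Fetch a file from a public IPFS gateway by CID.
async function fetchFromGateway(cid, gateway = 'https://ipfs.io') {
  const response = await axios.get(`${gateway}/ipfs/${cid}`, {
    responseType: 'arraybuffer', // binary-safe for firmware images / logs
  });
  return Buffer.from(response.data);
}
```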

Happy to share NodeJS snippets if you want a primer, since it's a bit tedious to work through their docs.

That would be interesting to see - yes please, as I'd like to have a look into these things.

Unless you have a prior project with IPFS I would simply use Blossom. But since you asked:

# Step 1: Get filesize and hash using ipfs-car. The hash is the CID.

```

const { exec } = require('child_process');

async function getFileHashAndSize(file) {
  return new Promise((resolve, reject) => {
    const npxPath = '/usr/local/bin/npx'; // Full path to npx

    exec(`${npxPath} ipfs-car hash ${file}`, (error, stdout, stderr) => {
      if (error) {
        return reject(new Error(`Error getting file hash: ${error.message}`));
      }
      const hashOutput = stdout.trim();
      const hash = hashOutput.split(' ')[0];

      exec(`wc -c < ${file}`, (error, stdout, stderr) => {
        if (error) {
          console.error(`Error executing wc command: ${stderr}`);
          return reject(new Error(`Error getting file size: ${error.message}`));
        }
        const size = stdout.trim();
        resolve({ hash, size });
      });
    });
  });
}

```

# Step 2: Create JSON to upload to endpoint using acquired hash and filesize

```

function createStoreJson({ hash, size }) {
  try {
    if (!hash || !size) {
      throw new Error('Invalid hash or size');
    }
    return {
      tasks: [
        [
          'store/add',
          'did:key:{StorageRepoID}',
          {
            link: {
              '/': hash,
            },
            size: parseInt(size, 10),
          },
        ],
      ],
    };
  } catch (error) {
    throw new Error(`Error creating JSON: ${error.message}`);
  }
}

```

# Step 3: Post the JSON (not the file) to the web3.storage bridge endpoint. The response tells you whether the data already exists or whether you have to upload the file as well.

```

const axios = require('axios');

// WEB3_STORAGE.ENDPOINT = https://up.web3.storage/bridge
async function postFun(json) {
  try {
    const response = await axios.post(WEB3_STORAGE.ENDPOINT, json, {
      headers: {
        'X-Auth-Secret': WEB3_STORAGE.SECRET,
        'Authorization': WEB3_STORAGE.AUTH,
        'Content-Type': 'application/json',
      },
    });
    return response.data;
  } catch (error) {
    let errorMsg = `Error posting to ${WEB3_STORAGE.ENDPOINT}: ${error.message}`;
    if (error.response) {
      errorMsg += `\nResponse status: ${error.response.status}`;
      errorMsg += `\nResponse data: ${JSON.stringify(error.response.data)}`;
    }
    throw new Error(errorMsg);
  }
}

```

# Step 4: Upload the file if step 3 says it isn't already stored. The endpoint (putUrl in this case) is provided by the previous step, and is ALWAYS an AWS endpoint 😀

I've left the JSDoc comment here since there are a few more parameters.

```

const fs = require('fs');
const axios = require('axios');

/**
 * Function to upload a file using a PUT request.
 * @param {Object} params - The parameters for the PUT request.
 * @param {string} params.putUrl - The URL to send the PUT request to.
 * @param {number} params.size - The size of the file in bytes.
 * @param {string} params.checksumSha256 - The SHA-256 checksum of the file.
 * @param {string} params.filename - The path to the file to be uploaded.
 * @returns {Promise} - An object containing the status and data of the response.
 */
async function putFun({ putUrl, size, checksumSha256, filename }) {
  try {
    const response = await axios.put(putUrl, fs.createReadStream(filename), {
      headers: {
        'Content-Length': size,
        'x-amz-checksum-sha256': checksumSha256,
      },
      maxBodyLength: Infinity,
    });
    return { status: response.status, data: response.data };
  } catch (error) {
    let errorMsg = `Error putting file: ${error.message}`;
    if (error.response) {
      errorMsg += `\nResponse status: ${error.response.status}`;
      errorMsg += `\nResponse data: ${error.response.data}`;
    }
    throw new Error(errorMsg);
  }
}

```
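Tied together, the flow ends up roughly like the sketch below. The parsing of the bridge response is an assumption on my part - inspect the actual payload you get back in step 3 and adjust the extraction, since the receipt structure may differ from what I've guessed here.

```

// Rough end-to-end sketch using the functions from steps 1-4:
// hash locally, register the CID with the bridge, then upload the
// bytes only if web3.storage doesn't already hold them.
async function uploadToWeb3Storage(filename) {
  const { hash, size } = await getFileHashAndSize(filename);
  const json = createStoreJson({ hash, size });
  const result = await postFun(json);

  // NOTE: this extraction is an assumption about the bridge's receipt
  // shape -- check the real response from step 3 and adjust as needed.
  const out = Array.isArray(result) ? result[0]?.p?.out?.ok : undefined;
  if (out && out.status === 'upload') {
    await putFun({
      putUrl: out.url,
      size,
      checksumSha256: out.headers?.['x-amz-checksum-sha256'],
      filename,
    });
  }

  return hash; // the CID to publish via Firebase, Nostr, etc.
}

```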

Thanks, that seems pretty comprehensive, and to be honest I'll probably opt for Blossom in future projects. I'm getting more and more engrossed in Nostr at the minute.