Is there a way to register a legacy webpage on #nostr ?

#askNostr

For example, I register my webpage by injecting some #nostr code in the #html header, and then embed the same code in a new #nostr event.

Why ?

Once I register my page on nostr, it snapshots the URL (not the content), and then any comments on nostr can be reflected on my legacy website!
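A minimal sketch of what that pairing could look like. To be clear, no NIP defines this today: the kind number (34128) and the meta tag name ("nostr-site") are made-up placeholders for illustration.

```typescript
// Hypothetical sketch: kind 34128 and the "nostr-site" meta tag name
// are invented here, not real protocol.
type NostrEvent = {
  kind: number;
  pubkey: string;
  tags: string[][];
  content: string;
};

// The registration event: it snapshots the URL, not the page content.
function buildRegistrationEvent(url: string, pubkey: string): NostrEvent {
  return {
    kind: 34128,        // assumed kind for "legacy page registration"
    pubkey,
    tags: [["r", url]], // "r" (reference) tags are common in nostr events
    content: "",        // no content snapshot, URL only
  };
}

// The matching code injected into the page's <head>, so clients can
// check that the page and the event claim the same key.
function buildHeadTag(pubkey: string): string {
  return `<meta name="nostr-site" content="${pubkey}">`;
}
```

A client that sees the event could fetch the URL, read the tag, and confirm both sides point at the same pubkey before attaching comments to the page.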

nostr:nprofile1qqs8hhhhhc3dmrje73squpz255ape7t448w86f7ltqemca7m0p99spgpzemhxue69uhkzat5dqhxummnw3erztnrdakj7qgmwaehxw309a3ksun0de5kxmr99ej8gmmwdahzucm0d5hsz8rhwden5te0vdhh2mn5wf5k2uewve5kzar2v9nzucm0d5hsxh4ddm nostr:nprofile1qqsrx4k7vxeev3unrn5ty9qt9w4cxlsgzrqw752mh6fduqjgqs9chhgpzamhxue69uhhyetvv9ujuurjd9kkzmpwdejhgtcpz4mhxue69uhhyetvv9ujuerpd46hxtnfduhszyrhwden5te0dehhxarj9ekk7mf0mpwca6

This appears to be an alternative approach to nostrifying legacy content. Some of the benefits:

- Websites can stay wherever they are; no need to migrate content to nostr and unnecessarily fill up relays.

- Websites can benefit from key nostr features: social interactions and #zaps.

- Websites can offer nostr login, such that any comment on the website automatically reflects in nostr clients.


Discussion

I think you could deploy a script on your site to display events from a specific npub, so the website itself is a very basic nostr client that fetches the events and displays them as a website. Is that what you mean?
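The subscription part of that "basic nostr client" is small. A sketch of the standard NIP-01 messages involved, assuming the relay URL and author key are filled in by the site owner:

```typescript
// Build the NIP-01 subscription request the page would send over a
// relay WebSocket to fetch one author's notes (kind 1).
function buildReq(subId: string, authorHex: string): string {
  // ["REQ", <subscription id>, <filter>] per NIP-01
  return JSON.stringify([
    "REQ",
    subId,
    { kinds: [1], authors: [authorHex], limit: 20 },
  ]);
}

// Pull the note text out of an incoming relay frame, or null if the
// frame is something else (EOSE, NOTICE, ...).
function extractNote(frame: string): string | null {
  const msg = JSON.parse(frame);
  return msg[0] === "EVENT" ? (msg[2].content as string) : null;
}
```

The page opens a WebSocket to a relay, sends `buildReq(...)`, and renders whatever `extractNote` returns for each incoming frame.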

Yes, but that is very slow. The 1-second rule: if a page doesn't load in 2 seconds, it might never load.

There should be some caching, maybe a cron job that runs every 5 minutes and refreshes the cache, so the site loads instantly but has data at most 5 minutes old. You could reduce the cron interval even further...
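A sketch of that cache, with the cron trigger and the actual relay fetch left abstract:

```typescript
// Cron-refreshed cache: page views never wait on a relay round-trip;
// they see data at most one refresh interval old.
class FeedCache {
  private value = "";

  // Called by the cron job (e.g. every 5 minutes). The slow relay
  // fetch happens here, off the request path.
  refresh(fetchFeed: () => string): void {
    this.value = fetchFeed();
  }

  // Called on every page view: instant, just returns the cached HTML.
  get(): string {
    return this.value;
  }
}
```

The design choice is that staleness is bounded by the cron interval, while page-load latency stays constant regardless of relay speed.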

Doesn't work. I have been using npub.pro for almost a year; it is great, but not fast enough.

How about a Nostr NIP for websites? Your "relay" would serve that website as its only event. So your relay (server) keeps your website and a list of other websites and their last known addresses. And a specialized Nostr client (web browser) just looks it up on the relays, finds your relay address, and gets your website.

Any reason this can't work?

That is the problem: relays are slow and unreliable. The legacy web is 100x faster than nostr.

And why should we expect a billion websites to migrate to nostr? Nostr should meet them where they are, just like podcasts in #fountain.

The relays wouldn't relay other people's websites, just their locations. So it would be kinda like a decentralized DNS.
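A sketch of that pointer-event idea; the kind number and tag names here are invented for illustration, not from any NIP:

```typescript
// "Decentralized DNS" reading: relays carry small pointer events
// mapping a domain to its last known address, not the site itself.
// kind 34777, "d", and "url" are assumed names, not real protocol.
type PointerEvent = { kind: number; tags: string[][] };

function buildPointer(domain: string, address: string): PointerEvent {
  return { kind: 34777, tags: [["d", domain], ["url", address]] };
}

// Scan fetched pointer events for a domain's last known address.
function resolve(events: PointerEvent[], domain: string): string | null {
  for (const ev of events) {
    const d = ev.tags.find((t) => t[0] === "d");
    if (d && d[1] === domain) {
      const u = ev.tags.find((t) => t[0] === "url");
      if (u) return u[1];
    }
  }
  return null;
}
```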

Yesss, that makes sense. But I crave social interactions and zaps on my legacy site, not DNS.

No point solving a problem that has already been solved!