what the fuck was their plan exactly? >one random day >chimp out and invade Israel >kill 1000 israelis >rape some jewesses >take some hostages >hold territory for like 4 days >retreat ...
Thread: https://boards.4chan.org/pol/thread/469061917

#barbarians #animals
This a real statue? In a public area, where it can be viewed?
Norse paganism is replacing Christianity. What do you do?
Thread: https://boards.4chan.org/pol/thread/468984512

“Hear me and rejoice! You have had the privilege of being saved by the Great Titan. You may think this is suffering. No... it is salvation. The universal scales tip toward balance because of your sacrifice. Smile... for even in death, you have become children of Thanos.” – Avengers: Infinity War (2018)
Result of an orientating session with ChatGPT:
When dealing with a growing and changing index on a decentralized system like IPFS, you need a data structure that can efficiently handle updates, additions, and deletions. Several data structures are well-suited for such a use case:
### 1. **Merkle Trees and DAGs (Directed Acyclic Graphs)**
IPFS itself is built on Merkle DAGs, which allow efficient and verifiable updates: every node is content-addressed, so changing any chunk produces a new root CID while unchanged subtrees can be reused, preserving the integrity and traceability of changes.
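To make the idea concrete, here is a toy sketch (plain `hashlib`, not the actual IPFS multihash/DAG format) of how a Merkle root is derived from chunk hashes, so that editing any single chunk yields a different root:

```python
import hashlib

def merkle_root(leaf_hashes):
    # Pairwise-hash each level until a single root digest remains.
    level = list(leaf_hashes)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0].hex()

# Any edit to a chunk propagates up to a different root.
chunks = [b"chunk-0", b"chunk-1", b"chunk-2"]
print(merkle_root([hashlib.sha256(c).digest() for c in chunks]))
```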
### 2. **CRDTs (Conflict-Free Replicated Data Types)**
CRDTs are designed for distributed systems where multiple replicas can independently update the data structure and later converge to a consistent state. These are particularly useful for maintaining an index in a decentralized manner.
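As a minimal sketch of the converge-by-merge idea, here is a grow-only set (G-Set), one of the simplest CRDTs; the CIDs are placeholders:

```python
class GSet:
    # Grow-only set CRDT: replicas add items locally and merge by set union,
    # so every replica converges to the same state regardless of message order.
    def __init__(self):
        self.items = set()

    def add(self, item):
        self.items.add(item)

    def merge(self, other):
        self.items |= other.items

# Two replicas index different (placeholder) CIDs independently...
a, b = GSet(), GSet()
a.add("QmChunkA")
b.add("QmChunkB")
# ...and agree after exchanging state in either order.
a.merge(b)
b.merge(a)
assert a.items == b.items == {"QmChunkA", "QmChunkB"}
```

A G-Set only supports additions; handling deletions would need a richer type such as an OR-Set.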
### 3. **Append-Only Logs**
An append-only log can record changes to the index over time, which can then be aggregated and searched. This approach makes it easy to maintain an evolving dataset.
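A minimal in-memory sketch of that pattern, with hypothetical `put`/`del` operations standing in for real index updates:

```python
class AppendOnlyLog:
    # Entries are only ever appended; the current index is derived
    # by replaying every operation from the beginning.
    def __init__(self):
        self.entries = []

    def append(self, op, key, value=None):
        self.entries.append({"op": op, "key": key, "value": value})

    def replay(self):
        index = {}
        for e in self.entries:
            if e["op"] == "put":
                index[e["key"]] = e["value"]
            elif e["op"] == "del":  # deletions become tombstone entries
                index.pop(e["key"], None)
        return index

log = AppendOnlyLog()
log.append("put", "doc1", "QmCid1")
log.append("put", "doc2", "QmCid2")
log.append("del", "doc1")
print(log.replay())  # {'doc2': 'QmCid2'}
```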
### 4. **IPLD (InterPlanetary Linked Data)**
IPLD is a data model for representing complex data structures on IPFS. It allows you to create linked data structures that can evolve over time.
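As a rough, hand-rolled illustration (placeholder CIDs, plain dicts rather than the real IPLD codecs), an IPLD-style node references other blocks through `{"/": <cid>}` links, so an index can evolve by re-linking instead of rewriting:

```python
# Sketch of IPLD-style linking with placeholder CIDs; real IPLD data would
# be stored through the DAG API (e.g. `ipfs dag put`) so links are traversable.
index_root = {
    "version": 2,
    "entries": {"/": "QmEntriesCid"},    # link to the block holding the entries
    "previous": {"/": "QmPrevRootCid"},  # link to the prior index version
}
print(index_root["entries"]["/"])
```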
### Detailed Approach Using IPLD and Merkle DAGs
#### **Using Merkle DAGs for a Dynamic Index**
1. **Initial Setup:**
- Split the index into manageable chunks.
- Store each chunk on IPFS and get the CIDs.
- Create a Merkle DAG where each node points to the chunks (leaves).
2. **Updating the Index:**
- When updating, create new chunks if necessary.
- Update the Merkle DAG to point to the new chunks.
- Publish the new root CID of the DAG on Nostr.
#### **Implementation Steps:**
##### 1. Chunking and Storing on IPFS
```python
import ipfshttpclient

# Connect to a local IPFS daemon over its HTTP API.
client = ipfshttpclient.connect('/dns/localhost/tcp/5001/http')

def chunk_and_store_index(index_data, chunk_size=1024):
    # Split the index into fixed-size chunks and store each one on IPFS;
    # add_str returns the CID of every chunk.
    chunks = [index_data[i:i + chunk_size] for i in range(0, len(index_data), chunk_size)]
    cids = [client.add_str(chunk) for chunk in chunks]
    return cids
```
##### 2. Creating a Merkle DAG
```python
import json

def create_merkle_dag(cids):
    # Store a root node linking to all chunk CIDs. The root is itself
    # content-addressed, so any change to the chunk list yields a new root CID.
    dag = {'chunks': cids}
    dag_cid = client.add_str(json.dumps(dag))
    return dag_cid
```
##### 3. Updating the Index
```python
def update_index(new_data, existing_cids, chunk_size=1024):
    # Chunk and store the new data, then rebuild the root node so it
    # links to both the existing and the new chunks.
    new_chunks = [new_data[i:i + chunk_size] for i in range(0, len(new_data), chunk_size)]
    new_cids = [client.add_str(chunk) for chunk in new_chunks]
    all_cids = existing_cids + new_cids
    return create_merkle_dag(all_cids)
```
##### 4. Publishing Metadata on Nostr
```python
import nostr
from nostr.event import Event

def publish_index_metadata(dag_cid):
    # NOTE: `nostr.Client` is a generic placeholder here; an actual library
    # such as python-nostr uses a RelayManager plus key-signed events instead.
    relay_url = 'wss://your.nostr.relay'
    nostr_client = nostr.Client()
    nostr_client.connect(relay_url)
    # Nostr event content must be a string, so serialize the metadata as JSON.
    metadata_event = Event(kind=1, content=json.dumps({'dag_cid': dag_cid}))
    nostr_client.publish_event(metadata_event)

# Example usage
index_data = "Your initial large index data..."
cids = chunk_and_store_index(index_data)
dag_cid = create_merkle_dag(cids)
publish_index_metadata(dag_cid)
```
##### 5. Fetching and Assembling the Index on Clients
```python
def fetch_index_metadata():
    # Same placeholder client API as above: fetch the most recent
    # kind-1 metadata event and extract the root CID from its content.
    relay_url = 'wss://your.nostr.relay'
    nostr_client = nostr.Client()
    nostr_client.connect(relay_url)
    events = nostr_client.get_events(kind=1, limit=1)  # latest event of kind=1
    if events:
        return json.loads(events[0].content)['dag_cid']  # content is a JSON string
    return None

def fetch_and_assemble_index(dag_cid):
    # `client` here is the IPFS client defined in step 1.
    dag = json.loads(client.cat(dag_cid).decode('utf-8'))
    index_data = ""
    for cid in dag['chunks']:
        index_data += client.cat(cid).decode('utf-8')
    return index_data

# Example usage
dag_cid = fetch_index_metadata()
if dag_cid:
    index_data = fetch_and_assemble_index(dag_cid)
    print(index_data)
```
### Summary
- **Chunking and Storing**: Split the index into chunks and store them on IPFS.
- **Merkle DAG**: Use a Merkle DAG to link the chunks, providing an updatable structure.
- **Publishing Metadata**: Use Nostr to share the latest DAG root CID.
- **Client-Side Fetching**: Clients fetch the DAG and reconstruct the index.
This approach allows the index to grow and change dynamically while ensuring that clients can always access the latest version in a decentralized manner. Using IPLD and Merkle DAGs aligns well with the principles of IPFS, providing a scalable and efficient way to manage large, evolving datasets.
I'm from the RFC-age... – https://www.rfc-editor.org/standards#IS
And the #AMOC hasn't even broken yet...
https://www.vox.com/climate/24099459/amoc-atlantic-meridional-overturning-circulation-freeze-europe
They took your money 🤡
git is open source... (do you mean GitHub?)
BREAKING: FORMER ARMY OFFICER CONFIRMS ALIENS EXIST ... 1) CONFIRMS THEY EXIST 2)CONFIRMS WE HAVE INTERACTED WITH THEM FOR A LONG TIME 3) CONFIRMS WE HAVE TREATIES WITH THEM. 4) CONFIRMS OTHER COUNTRI...
Thread: https://boards.4chan.org/pol/thread/468782724

I know... (I am an alien from outer space)
There's a great need for a Linus-type benevolent dictator for life (BDFL).
Which one of you slanted eyes mother fucker brought these fuckers here
Thread: https://boards.4chan.org/pol/thread/468759602

WOW! #WTF?
Android is not an OS (there's Linux underneath). Plus, /e/OS is an Android fork (?)
All the statements in the note are based on pseudoscientific hearsay. Read one of the other comments that also addresses this point.
Actually I stated the opposite concerning her intelligence (that was the whole point of my note)
I suspect you're a bubble...

