IPZN, a new ZeroNet based on IPFS

in ipfs •  5 years ago 


https://gitlab.com/ipzn/ipzn

IPZN is a new ZeroNet based on IPFS, which means it uses the IPFS (libp2p) protocol and the IPLD data structure.
For compatibility, it also supports the ZeroNet protocol and the ZeroNet data structure (filesystem).

Once all users have moved to IPZN, the ZeroNet protocol may be abandoned.

Pros and cons of using IPFS as infrastructure.

Pros:

  • Everything is efficiently stored. IPFS uses a Merkle-DAG structure and addresses everything by multihash, and large files are split into small chunks.
  • Supports DHT and many other routing methods.
  • Immutable: as said above, everything is content-addressed, so it is immutable.
  • Modularized, with better code architecture.
  • Used by enterprises, which means governments will hesitate to block it because of the economic impact. By comparison, ZeroNet and BitTorrent can be blocked without a second thought, because they are treated as simply illegal.

Cons:

  • Due to content addressing, others can see what you are looking for, so pirating becomes dangerous.
    • Solution: use Tor or I2P.

IPZN's purpose is to be a general-purpose d-app platform/framework that makes d-app development easier.

Which IPFS to use?

Currently, go-ipfs's pubsub is not finished and js-ipfs's DHT is not done, so we have to wait for them.

IPZN will support operating IPFS in two ways: calling js-ipfs directly through its Node.js API (making js-ipfs a dependency), and using js-ipfs-http-client to control an external IPFS daemon.
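A rough sketch of how this dual mode could look, assuming a small abstraction layer. The `IpfsBackend` interface and `inMemoryBackend` helper below are hypothetical names of mine, not real js-ipfs APIs:

```typescript
// Hypothetical abstraction so IPZN code doesn't care whether IPFS runs
// in-process (js-ipfs as a dependency) or as an external daemon
// (controlled via js-ipfs-http-client).

interface IpfsBackend {
  add(data: Uint8Array): Promise<string>;  // returns a CID/multihash string
  cat(cid: string): Promise<Uint8Array>;
}

// In-memory stand-in used for demonstration; a real build would wrap
// js-ipfs or js-ipfs-http-client behind the same interface.
function inMemoryBackend(): IpfsBackend {
  const store = new Map<string, Uint8Array>();
  let counter = 0;
  return {
    async add(data) {
      const cid = `fake-cid-${counter++}`;
      store.set(cid, data);
      return cid;
    },
    async cat(cid) {
      const data = store.get(cid);
      if (data === undefined) throw new Error(`block not found: ${cid}`);
      return data;
    },
  };
}
```

The rest of IPZN would only ever see `IpfsBackend`, so switching between the embedded node and the external daemon becomes a configuration choice.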

Programming language to use

I think we should use TypeScript, but I haven't learned it yet.

IPZN works like git.

It has a metadata chain; you can think of it as a chain of content.json files.
Every site has a metadata chain, and each metadata entry is an IPLD block (aka DAG object), which contains:

  • Data
    • Real data
      • Previous metadata multihash
      • Timestamp
      • Multihash of site directory root object
    • Public key
    • Signature
  • (No links in this ipld object)
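As a sketch, the structure above might be typed like this. The field names are my guesses, not the project's actual schema, and the `sign` callback stands in for real signing:

```typescript
// Hypothetical typing of one metadata entry in the chain. No links in
// the IPLD object itself: `prev` and `root` are plain multihash strings.

interface MetadataEntry {
  data: {
    prev: string | null;  // multihash of the previous metadata entry
    timestamp: number;    // when this entry was created
    root: string;         // multihash of the site directory root object
  };
  publicKey: string;      // multiformat-encoded public key
  signature: string;      // signature over the serialized `data`
}

// Build the next entry in the chain; `sign` stands in for real signing.
function makeMetadata(
  prev: string | null,
  root: string,
  publicKey: string,
  sign: (payload: string) => string,
): MetadataEntry {
  const data = { prev, timestamp: Date.now(), root };
  return { data, publicKey, signature: sign(JSON.stringify(data)) };
}
```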

When you modify a site, the modification is added to a pending list.
Then you flush the modifications: all objects are added to IPLD via ipfs.add (not MFS), and you get a new metadata entry.
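The modify/flush cycle might be sketched as follows. This is an assumed design: `addToIpfs` stands in for `ipfs.add`, and the "root" here is just an object listing the child CIDs rather than a real directory DAG:

```typescript
// Sketch of the pending-list workflow: modifications accumulate locally
// and only reach IPFS on flush.

class PendingSite {
  private pending = new Map<string, Uint8Array>(); // path -> new content
  head: string | null = null;                      // latest root multihash

  modify(path: string, content: Uint8Array): void {
    this.pending.set(path, content);               // nothing hits IPFS yet
  }

  async flush(addToIpfs: (d: Uint8Array) => Promise<string>): Promise<string> {
    const cids: string[] = [];
    for (const content of this.pending.values()) {
      cids.push(await addToIpfs(content));         // add each object (not MFS)
    }
    this.pending.clear();
    // A real implementation would rebuild the directory DAG; here the
    // "root" is simply an object listing the child CIDs.
    const root = await addToIpfs(new TextEncoder().encode(cids.join("\n")));
    this.head = root;  // a new metadata entry would point at this root
    return root;
  }
}
```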

A user's contents are stored in "sites" that have metadata chains too, one for each site respectively.

User contents are propagated with pubsub (WIP).

Via the API, an IPZN site can query the list of known user-content sites.

We won't automatically import user content into an SQLite database, except for compatibility.

Why is importing data into SQLite bad?

Building a large index site, e.g. the search engine Horizon, takes hours, especially on an HDD. Although the effect seems insignificant on small sites, it still hurts performance badly.
For example, suppose we build a decentralized Facebook. Every second a new post arrives, the IPFS client receives the content and adds it to the immutable content storage (LevelDB); the content won't be modified until garbage collection.
With automatic SQLite importing, we need a second step after receiving each message.
And because an SQLite database is not designed for immutable data (post content can be changed, for example), the database may become malformed under concurrent operations.

Solution

Query IPFS directly and aggregate the data client-side.
In the future, we should use Go for better performance.
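A minimal sketch of that approach, assuming posts are fetched as IPLD objects and aggregated in memory. The `Post` shape and the `fetchPost` callback are hypothetical; `fetchPost` stands in for an ipfs.dag.get-style lookup:

```typescript
// Sketch: instead of maintaining an SQLite index, resolve the wanted
// objects from IPFS and aggregate them client-side.

interface Post { author: string; timestamp: number; body: string; }

async function latestPosts(
  cids: string[],
  fetchPost: (cid: string) => Promise<Post>,
  limit: number,
): Promise<Post[]> {
  const posts = await Promise.all(cids.map((cid) => fetchPost(cid))); // fetch in parallel
  return posts
    .sort((a, b) => b.timestamp - a.timestamp)                        // newest first
    .slice(0, limit);
}
```

No second storage layer is involved: the only persistent store is IPFS's own immutable block store.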

Compatibility with ZeroNet

Compatibility is only partial; full compatibility is not possible.

We need multiformats for crypto operations (encrypt, decrypt, and sign), since no crypto function stays secure forever. EdDSA is better than what Bitcoin uses.

In IPZN, the public key and signature are represented as multiformats, but when receiving from or sending to ZeroNet, we transform them to the plain format.

What to sign?

IPZN, at a low level, doesn't care about user content, so data is just binary. However, we can't give ZeroNet a content.json with a field full of base-encoded random bytes.
So if a site wants compatibility, it needs basic info. In IPZN, the basic info is stored as BSON, but the content to be signed is JSON. The same applies to user-content sites.
Content is given to and received from ZeroNet as JSON.
Sign in JSON, store in BSON.

Note that no data is stored directly in the metadata, only hashes and a timestamp. The content.json prepared for ZeroNet corresponds to a BSON-encoded object in the site's root directory.

ZeroNet -> Json --Transform--> Bson -> IPZN (-> Dapp)

IPZN -> Bson --Transform--> Json -> ZeroNet

IPZN -> Bson -> Json -> Sign, Bson -> Store

The above applies only to sites that want compatibility.
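The sign-and-store pipeline can be sketched like this. A real build would use an actual BSON library; `toBson`/`fromBson` below are stand-ins so only the shape of the flow is visible, and `sign`/`verify` stand in for real EdDSA operations:

```typescript
// Sketch of "sign in JSON, store in BSON": the signature always covers
// the JSON serialization, so ZeroNet peers can verify the same bytes
// they exchange, while IPZN persists the BSON form.

type Content = Record<string, unknown>;

// Stand-ins for a real BSON codec.
const toBson = (c: Content): Uint8Array => new TextEncoder().encode(JSON.stringify(c));
const fromBson = (b: Uint8Array): Content => JSON.parse(new TextDecoder().decode(b));

function signAndStore(c: Content, sign: (payload: string) => string) {
  const json = JSON.stringify(c);                       // sign the JSON form...
  return { stored: toBson(c), signature: sign(json) };  // ...store the BSON form
}

function verifyStored(
  stored: Uint8Array,
  signature: string,
  verify: (payload: string, sig: string) => boolean,
): boolean {
  const json = JSON.stringify(fromBson(stored));  // back to JSON to verify
  return verify(json, signature);
}
```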

Backend

We should not make the frontend do heavy work, because the network can be very slow when IPZN is running on a VPS, as I have experienced.

So, if things can be done with a single connection, that's good.

Two types of backend, so far:

  • Node.js, sandboxed by vm2
  • Python; I don't know whether its sandboxing is secure enough

Pre-render

An optional Node.js pre-render engine using headless Chrome, for crawlers like Google.

It's necessary in order to be indexed by Google.

Blockchain - Consensus and incentivization

  • Proof of Work

This wastes resources and makes the network centralized; unacceptable to me.

  • Proof of Stake: the more money you have, the higher the probability that you are selected to verify a block and earn money.

It makes the rich richer and the poor poorer, so it's still not good.

  • Delegated Proof of Stake: holders vote for delegates, and the more votes you get, the higher the probability that you are selected to verify a block and earn money.

This is the best method so far, I think.
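The step the three schemes differ on is how the next verifier is chosen. A hedged sketch of the DPoS-style choice, as generic stake-weighted random selection rather than IPZN's actual consensus code (`rand` is injectable so the behavior is testable):

```typescript
// Generic DPoS-style selection sketch: a delegate's chance of being
// picked to verify the next block is proportional to the votes
// delegated to it.

function pickVerifier(
  votes: Map<string, number>,        // delegate -> delegated vote weight
  rand: () => number = Math.random,  // injectable for determinism
): string {
  const total = [...votes.values()].reduce((sum, v) => sum + v, 0);
  let r = rand() * total;            // a point on the [0, total) line
  for (const [delegate, v] of votes) {
    r -= v;                          // walk the weighted segments
    if (r <= 0) return delegate;
  }
  // Floating-point edge case: fall back to the last delegate.
  return [...votes.keys()][votes.size - 1];
}
```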

Why do we need a blockchain for the dWeb?

You may say it's unnecessary, but imagine you have some resources that only a few people need: there aren't enough peers to start downloading from another computer.
So while you can download popular films quickly with BitTorrent, you can't download rare content at all.

To make others seed your file when they don't want it themselves, you have to pay for storage and retrieval.

One day, when you also provide file storage to others, you can earn coins too.

That is the transfer of value, which incentivizes peers to do things.

So what am I going to do?

Besides Filecoin for storage and Steemit for quality content, what's missing?

  • Quality search results: SearchCoin
  • Real, quality users: decentralized verification

And with blockchain, we can also:

  • A decentralized autonomous organization (DAO), making IPZN's development and management completely decentralized

SearchCoin

A centralized dWeb search engine can be easily banned, but we really need a search engine for the dWeb.

I finally found a way to build a decentralized dWeb search engine, and of course, it's not just YaCy.

It is something like a combination of Filecoin and Steemit.

When you search for something, the request is broadcast to all nodes that provide a search service, and they return results to you. If you are satisfied with a result, you vote for it, and its providers earn money.
Results from different nodes are mixed and sorted by the nodes' weight/wealth.
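The mixing step might look like this sketch, where each provider's results are weighted by its stake/reputation. The weighting formula (score times provider weight, summed per URL) is my assumption, not a spec:

```typescript
// Sketch of merging SearchCoin results: each provider node returns
// scored results, and the client merges them weighted by the provider's
// stake/reputation, then sorts best-first.

interface ProviderResult { url: string; score: number; }
interface Provider { weight: number; results: ProviderResult[]; }

function mergeResults(providers: Provider[]): ProviderResult[] {
  const merged = new Map<string, number>();
  for (const p of providers) {
    for (const r of p.results) {
      merged.set(r.url, (merged.get(r.url) ?? 0) + r.score * p.weight);
    }
  }
  return [...merged.entries()]
    .map(([url, score]) => ({ url, score }))
    .sort((a, b) => b.score - a.score);
}
```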

Lightweight search engine/index

Another type of decentralized search engine; this one uses less space.

Or it's just a better ZeroSites/index: every node can add a description for a site, and entries are sorted by a non-blockchain reputation, namely the number of peers a user has.

Decentralized verification

The DAO and the stakeholders in the delegated proof-of-stake mechanism give the requester a challenge (e.g. a computer-generated captcha), and the verifiers vote to decide whether to accept the requester.
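The accept/reject decision might be sketched as a weighted threshold vote. The 50% threshold is an assumed parameter, not something the design specifies:

```typescript
// Sketch of the verification vote: stakeholders vote on the requester's
// challenge response, and the requester is accepted when the approving
// vote weight exceeds a threshold share of the total.

interface Vote { voterWeight: number; approve: boolean; }

function isAccepted(votes: Vote[], threshold = 0.5): boolean {
  const total = votes.reduce((sum, v) => sum + v.voterWeight, 0);
  if (total === 0) return false; // no votes cast -> reject
  const approved = votes
    .filter((v) => v.approve)
    .reduce((sum, v) => sum + v.voterWeight, 0);
  return approved / total > threshold;
}
```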

Compared to ZeroID, it is, after all, still more decentralized.
