/ipfs/anon ID: 269956 IPFS + Bittorrent Index May 25, 2018, 1:15 p.m. No.9 >>23 >>24 >>27 >>28 >>29

As we all know, we need IPFS file index sites like the ones BitTorrent has. Eventually I might make a Python GUI/CLI program for creating torrents and IPFS hashes (don't expect anything soon). I need a desktop GUI wrapper so people can submit content to the decentralized index I designed.

 

It uses Bitmessage as the submission method and IPFS for database syncing. The content is completely separated from any frontend. Anyone can spin up or shut down frontends, whether hosted locally or publicly on a VPS. The only thing that has to stay up is the admin's Bitmessage monitoring/publishing program. Since neither new submissions nor the database depend on IP addresses, and Bitmessage messages stay in the network for days, it can be hosted from any computer and can hop from computer to computer with very little impact on the system. One of the goals is to function in an IP-address-independent way.

 

For the database I initially thought of using a blockchain to sync the data. Then I realized a few spammers could irreversibly (short of a hard fork) bloat the chain to inconvenient sizes. Plus I couldn't delete genuinely illegal content like CP, which would be a big problem. So I turned to IPNS. Each new submission through Bitmessage becomes a new self-describing JSON object file. This has the advantage that all previously synced users keep sharing all previous content even as the database changes. The admin can also delete JSON records and sync that with peers. For this, peers run a sync script that periodically (every 5 or 10 minutes) checks IPNS for an updated address. The script uses ipfs ls to diff the previous sync's content against the current content and downloads/pins/unpins/deletes files accordingly. This creates a distributed one-way sync over IPFS. The coolest part, though, is that the program also converts the synced JSON records and inserts them into (or deletes them from) SQLite (or any other SQL DB) so the index can easily be used from any program, like a web frontend. Of course everything would be modular to support multiple databases and frontends.
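
To make that concrete, here's a minimal sketch of what the peer-side sync loop might look like. The record fields (uid, title, magnet, ipfs) and the idea that each file is named <uid>.json are my assumptions, not a spec; the ipfs commands (name resolve, ls, cat, pin add, pin rm) are the real go-ipfs CLI, and the diff is done with Python sets rather than the diff tool:

```python
#!/usr/bin/env python3
"""Peer-side one-way sync sketch.

Assumes a running go-ipfs daemon with the `ipfs` CLI on PATH.
Hypothetical: INDEX_IPNS, the records table schema, and that each
record file is named <uid>.json.
"""
import json
import sqlite3
import subprocess
import time

INDEX_IPNS = "/ipns/QmExampleIndexKey"   # hypothetical index name
SYNC_INTERVAL = 600                      # check every 10 minutes

def ipfs(*args):
    return subprocess.check_output(("ipfs",) + args, text=True)

def list_records(root):
    """Map file name -> CID for every JSON record under the index root."""
    records = {}
    for line in ipfs("ls", root).splitlines():
        cid, _size, name = line.split(None, 2)
        records[name] = cid
    return records

def sync_once(conn, prev):
    root = ipfs("name", "resolve", INDEX_IPNS).strip()
    cur = list_records(root)
    for name in set(cur) - set(prev):        # new records
        record = json.loads(ipfs("cat", cur[name]))
        ipfs("pin", "add", cur[name])        # keep seeding the record
        conn.execute(
            "INSERT OR REPLACE INTO records (uid, title, magnet, ipfs) "
            "VALUES (?, ?, ?, ?)",
            (record["uid"], record["title"], record["magnet"], record["ipfs"]),
        )
    for name in set(prev) - set(cur):        # records the admin deleted
        ipfs("pin", "rm", prev[name])
        conn.execute("DELETE FROM records WHERE uid = ?",
                     (name.removesuffix(".json"),))
    conn.commit()
    return cur

if __name__ == "__main__":
    conn = sqlite3.connect("index.sqlite")
    conn.execute("CREATE TABLE IF NOT EXISTS records "
                 "(uid TEXT PRIMARY KEY, title TEXT, magnet TEXT, ipfs TEXT)")
    state = {}
    while True:
        state = sync_once(conn, state)
        time.sleep(SYNC_INTERVAL)
```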

 

So there are three components to the system. First, the desktop GUI/CLI that uses Bitmessage and IPFS (both started as daemons and accessed through their APIs) and lets users create torrents and IPFS hashes to submit to the system over Bitmessage. To start, I'm thinking of a simple Python Qt GUI. Index sites would provide submit.html, report.html, modify.html, and remove.html files that the program lists and serves on localhost (http://localhost/<my-program-name>/<index>/submit.html). Clicking one opens the file in your web browser, where you fill out the form (Alpaca.js would be used for easy form generation and validation of user metadata, like an enum of categories). The webpage would POST the form input as JSON to the Python web server, which would verify it against a standard JSON Schema file provided by the index, do the necessary operations (create the torrent and IPFS hashes from the provided file/directory), and finally send it over Bitmessage to the index's address.
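
Sketching the back half of that flow (validate, hash, send), with hypothetical index/user addresses and a hypothetical "submit" subject line; the jsonschema call, the ipfs flags, and PyBitmessage's XML-RPC API are real, but check them against your installed versions:

```python
"""Submission-side sketch. Hypothetical: the index/user addresses, the
"submit" subject line, and the form_data fields. Assumes `pip install
jsonschema`, the `ipfs` CLI on PATH, and PyBitmessage's XML-RPC API
(credentials come from keys.dat, default port 8442).
"""
import base64
import json
import subprocess
import xmlrpc.client

import jsonschema

BM_API = xmlrpc.client.ServerProxy("http://user:pass@127.0.0.1:8442/")
INDEX_ADDRESS = "BM-ExampleIndexAddress"   # hypothetical
MY_ADDRESS = "BM-MySubmissionAddress"      # hypothetical

def b64(s):
    # PyBitmessage's API expects base64-encoded subject and body.
    return base64.b64encode(s.encode()).decode()

def submit(form_data, path, schema):
    # 1. Verify the POSTed form input against the schema the index ships.
    jsonschema.validate(form_data, schema)
    # 2. Add the file/directory to IPFS; -Q prints only the root CID.
    cid = subprocess.check_output(["ipfs", "add", "-Q", "-r", path],
                                  text=True).strip()
    form_data["ipfs"] = cid
    # (torrent creation would slot in here as well)
    # 3. Send the record to the index's address over bitmessage.
    BM_API.sendMessage(INDEX_ADDRESS, MY_ADDRESS,
                       b64("submit"), b64(json.dumps(form_data)))
```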

 

Second, the index admin program monitors a Bitmessage address for new content, processes submissions (assigning and tracking UIDs, usernames, etc.), writes/removes them to/from JSON files, and publishes the updated directory to IPNS. It would have a priority queue so certain Bitmessage addresses get processed first (deleting is faster than adding) and could optionally enforce permissions based on the Bitmessage address. One index might want to give users the ability to modify their own submissions while another reserves modification for admins/mods. It would broadcast a Bitmessage message every time it accepts, rejects, or modifies/removes records, to give users the status of their request.
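
A hedged sketch of that admin loop. The request fields (op, uid, record) and the records/ directory layout are hypothetical; the PyBitmessage calls (getAllInboxMessages, trashMessage) and ipfs name publish are the actual APIs:

```python
"""Admin-side sketch: drain the index's bitmessage inbox through a
priority queue, update the JSON record files, and republish to IPNS.
Assumes the same PyBitmessage XML-RPC setup and ipfs CLI as above.
"""
import base64
import heapq
import json
import os
import subprocess
import xmlrpc.client

BM_API = xmlrpc.client.ServerProxy("http://user:pass@127.0.0.1:8442/")
ADMIN_ADDRESS = "BM-ExampleIndexAddress"   # hypothetical index address
RECORDS_DIR = "records"                    # one JSON file per record

def decode(msg):
    return json.loads(base64.b64decode(msg["message"]))

def priority_for(msg):
    # Deletes are cheap, so process them first.
    return 0 if decode(msg)["op"] == "remove" else 1

def apply_request(request):
    path = os.path.join(RECORDS_DIR, request["uid"] + ".json")
    if request["op"] == "remove":
        os.remove(path)
    else:  # add or modify
        with open(path, "w") as f:
            json.dump(request["record"], f)

def process_inbox():
    inbox = json.loads(BM_API.getAllInboxMessages())["inboxMessages"]
    queue = [(priority_for(m), i, m) for i, m in enumerate(inbox)
             if m["toAddress"] == ADMIN_ADDRESS]
    heapq.heapify(queue)
    while queue:
        _, _, msg = heapq.heappop(queue)
        # ... permission checks keyed on msg["fromAddress"] would go here ...
        apply_request(decode(msg))
        BM_API.trashMessage(msg["msgid"])
    # Re-add the records directory and point the index's IPNS name at it.
    root = subprocess.check_output(["ipfs", "add", "-r", "-Q", RECORDS_DIR],
                                   text=True).strip()
    subprocess.run(["ipfs", "name", "publish", root], check=True)
```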

 

(continued below)

/ipfs/anon ID: 269956 May 25, 2018, 1:15 p.m. No.10 >>39

Third, a synchronization program that peers use to download and share the JSON files via IPFS and to build and keep in sync a local SQL database. Each index would have its own SQLite file. For frontends, indexes have plenty of options. All frontends should of course be easily user-installable so as to have as many local users as possible. I'm split on whether to build a modular system where frontend links are included in the Python program (users go to http://localhost/<my-program-name>/<index>) or to leave the frontend implementation to the index admin. Either way, all data would be queried from SQLite, either through the index's own program or through something like Datasette (which turns SQLite into a JSON API). Web-facing frontends can do whatever they want to improve performance.
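
With Datasette, for example, a frontend doesn't even need its own server code; something like this would query a synced index (the records table and title column are the hypothetical schema from the sync sketch above):

```python
"""Frontend sketch via Datasette (`pip install datasette requests`).
Start the JSON API with:  datasette index.sqlite --port 8001
"""
import requests

# Datasette exposes each table at /<database>/<table>.json;
# _shape=objects returns rows as dicts, column__contains filters.
resp = requests.get(
    "http://127.0.0.1:8001/index/records.json",
    params={"title__contains": "anime", "_shape": "objects"},
)
for row in resp.json()["rows"]:
    print(row["title"], row["magnet"])
```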

 

What I'm envisioning is a framework for decentralized index sites to use. There would be a set of basic universal rules, but how each index is run is completely up to the index admin. I want to give them as much freedom as possible. Want a closed, whitelist-only index? Fine. Want an open index where anyone can submit or modify/delete? Fine. Want an index where anyone can submit but only admins/mods can modify or remove? Also fine.
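
Those policy knobs could just be a per-index config the admin program reads at startup; a hypothetical example (none of these keys are implemented, they just make the policy space concrete):

```python
# Hypothetical per-index policy the admin program could load at startup.
POLICY = {
    "submit": "anyone",           # or "whitelist"
    "modify": "owner_or_admin",   # or "anyone", "admin_only"
    "remove": "admin_only",
    "whitelist": [                # bitmessage addresses; hypothetical values
        "BM-TrustedContributor1",
        "BM-TrustedContributor2",
    ],
}
```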

 

I'm pretty confident this setup could work reliably. Like I said, I plan on making it track multiple indexes so anyone can start their own database if they want. I plan on making an anime-focused index to start us off. Frontends could even combine multiple SQLite databases to merge content from multiple indexes.
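
That last part could be as simple as SQLite's ATTACH; a sketch, assuming each index's sync script produced its own .sqlite file with the same (hypothetical) records schema:

```python
"""Sketch: merging two synced indexes with SQLite's ATTACH."""
import sqlite3

conn = sqlite3.connect("anime_index.sqlite")
conn.execute("ATTACH DATABASE 'other_index.sqlite' AS other")
rows = conn.execute(
    "SELECT title, magnet FROM records "
    "UNION ALL "
    "SELECT title, magnet FROM other.records"
).fetchall()
for title, magnet in rows:
    print(title, magnet)
```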