That sounds like a recipe for a massively bloated blockchain. Are posts at least compressed in some way before being added to the chain? If Steem is meant to be comparable to Reddit, it should be prepared to handle similar amounts of data. According to a source I found, one month of Reddit comments is about 30 GB, so how is Steem expected to handle that much data (actually more than that, given the overhead of including cryptographic proof with every comment, upvote, and edit) without becoming semi-centralized, i.e. limited to a small group of big miners?
Also, how are edits handled? Is a complete copy added to the blockchain each time an edit is made, or just the changes?
source: https://www.reddit.com/r/datasets/comments/3bxlg7/i_have_every_publicly_available_reddit_comment/
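To make the storage question concrete, here's a rough sketch of the two strategies the edit question implies. This is purely illustrative, not how Steem actually stores revisions: it compares a full compressed copy of a hypothetical edited post against a unified diff of the edit, using Python's standard `zlib` and `difflib` modules.

```python
import difflib
import zlib

# Hypothetical post body: 20 lines standing in for a long comment.
v1_lines = [f"Paragraph {i}: some discussion of how Steem stores content.\n"
            for i in range(20)]
v2_lines = list(v1_lines)
# The "edit" changes a single line in the middle of the post.
v2_lines[10] = "Paragraph 10: an edited sentence replacing the original.\n"

v2 = "".join(v2_lines)

# Option A: store a full (compressed) copy of the new revision.
full_copy = zlib.compress(v2.encode("utf-8"))

# Option B: store only a unified diff against the previous revision.
patch = "".join(difflib.unified_diff(v1_lines, v2_lines,
                                     fromfile="v1", tofile="v2"))

print(f"full text:       {len(v2.encode('utf-8'))} bytes")
print(f"compressed copy: {len(full_copy)} bytes")
print(f"diff only:       {len(patch.encode('utf-8'))} bytes")
```

For a small edit to a long post, both the compressed copy and the diff come out well under the raw size, and the diff scales with the size of the change rather than the size of the post. Whichever the chain actually does determines how fast edit history bloats it.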
I believe the whitepaper discusses scalability; supposedly it can scale to at least Reddit levels with current technology and specs.
I don't know about edits. Good question.
@hell2o, I was wondering the same thing about edits. Did you ever find the answer?