Freedom requires BOTH radical transparency and radical privacy
This post is a response to Dan's post, which in my opinion poses a falsely framed question built on "or" when it is really about "and". Dan asks: "Does Freedom Require Radical Transparency or Radical Privacy?". His post has some baked-in assumptions which I don't think are universal, for example the assumption that I want to know every detail about my neighbor's life. I actually don't think I need to know every detail about my neighbor. Why not?
The reason is that I recognize my own bias. If I know too much about a particular person, then, because I too am a person, I may become biased in my dealings with them. This bias could result in discrimination on an irrational basis (humans are not rational), and in other cases it could result in persecution.
What does a world look like when everyone can rate everyone else and anyone can Google the life history of anyone else?
Consider a world where nothing is forgotten, where every good or bad thing you ever did in your entire life is logged and searchable on Google. Imagine that even the definitions of good and bad behavior are always changing, so behaviors which were considered good or neutral when you did them could later be re-classified as bad. Every decision you ever make must be weighed against how current and future attitudes might judge that decision.
By everything I literally mean everything: the food you eat, how many times a week you exercise, who your friends are, who you marry, how you have sex. Nothing is off limits or too sacred to be judged by the masses. In other words, your entire life is up for peer review, continuously, and against a standard you cannot know in advance. How can you know what your grandkids will think of your decisions today?
Total transparency is beautiful to the wealthiest in society
The idea of total transparency sounds most beautiful to the wealthiest in society. When I use the word "wealth" I'm not speaking in the narrow sense, where the richer you are the more on top you are. By wealth I mean not just net worth but your traits: everything from being born into the right family, as the right gender and race, being born straight, being born with a predisposition to high IQ, and so on. When the genetic lottery favored you, you have the least to lose in a totally transparent world, because your demographic isn't the most hated or disadvantaged. Those who have the least to lose and the most to gain from a transparent society are whoever is rich and famous in the current society, and a lot of what determines that is mere luck.
Would a gay Jew like to live in a transparent society? Just because society is transparent doesn't mean people don't discriminate or aren't biased. What about transsexuals? Does transparency protect them or help them get hired? Transparency will favor those who fit in the most, who are the most normal, who have the most beautiful traits and/or behaviors according to the current beauty standard(s). Total transparency in essence makes life a beauty pageant where some people are simply born to look good while others are born to look bad, with little to nothing that can be done about it.
This opens a Pandora's box of problems, but there are some bright sides
On the bright side, people would probably adopt much more radically conservative behavior. People who have a lot to lose already take fewer risks. If a person has, for example, a reputation to lose, a lot of money to lose, a family to lose, and so on, they cannot afford to take as many risks. At the same time there are people born with none of this, who have little to nothing to lose, so they take more or most of the risks and yet rarely get the rewards. In a sense it's a sort of lottery, and more transparency doesn't change the fact that luck is the ultimate determiner; it actually has the dramatic effect of making luck possibly the only factor that can allow someone to be socially mobile.
Today a person can take big risks, get big rewards, and we call it "pulling yourself up by your bootstraps". These private actors do things which perhaps their parents would be ashamed of, or which their church considers sinful, but they manage to escape their life of poverty and over time become dignified members of their communities. In a world of total transparency, how would anyone ever escape those circumstances again? If every decision from birth is judged by the most conservative members of the community, then hard work and risk taking may actually be discouraged in favor of conformity.
My preliminary conclusion
- I don't actually want to know everything about my neighbor's life or all the details of every action. I want the algorithms or AI to know, while my neighbor's privacy is maintained (human access should be restricted). This way my own bias will not get in the way of making the best possible decisions from the information that exists.
- I want the scores or results from these algorithms to be released in such a way that they can be used in my own algorithms to support my decisions, but I do not want to know the details of every behavior which makes up a particular score. Do I need to know everything about a person to determine if they are trustworthy in certain situations? No, I don't, because algorithms can already take scores and produce a level of trust without any human studying a person (see the sketch after this list). Humans do not have a good track record in these positions of judging the lives of others.
- I want maximum privacy and transparency at the same time, as only having both is the path to true freedom. It's not a question of whether we should let everyone see everything we do and read our minds 24/7 in order to free ourselves, versus letting everyone mask everything they do so that we have no clue what anyone is thinking. Freedom requires that we have the best possible information from which to make a decision, but it does not require that a human being access that information or act as judge.
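To make the second point concrete, here is a minimal sketch (all score names, weights, and thresholds are hypothetical) of how opaque component scores could feed a trust decision without any human ever seeing the raw behaviors behind them:

```python
# Minimal sketch with hypothetical score names and weights: combining opaque
# component scores into a trust decision without exposing underlying behaviors.

def trust_score(scores: dict, weights: dict) -> float:
    """Weighted average of component scores, each assumed to be in [0, 1]."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

def trustworthy_for_trade(scores: dict, threshold: float = 0.7) -> bool:
    # My own algorithm decides which components matter for this situation.
    weights = {"reliability": 0.5, "financial": 0.3, "disputes": 0.2}
    return trust_score(scores, weights) >= threshold

# A counterparty's published scores, with no behavioral detail attached.
print(trustworthy_for_trade({"reliability": 0.92, "financial": 0.65, "disputes": 0.80}))  # True
```

The point of the sketch is only that the decision consumes aggregates, never the life history that produced them.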
The problem with Dan's world is that it relies too much on the human element. The faith in the human element is the same faith behind democracy. There is nothing in Dan's radical transparency to prevent mob rule, and mob rule never has justice. Why? Because humans cannot truly be moral, as humans are biased, flawed, and irrational. These flaws might not make as much of a difference when all aspects of your life aren't being judged, but when they are, those flaws are magnified.
The solution, in my opinion, comes from putting the AI or machines in the position to access our life history and not putting people in that position. It is damaging to the human psyche either to be under constant judgment by other people or to be put in the position of constantly judging other people. Neither is something humans are accustomed to on this scale. Artificial intelligence can know everything about everyone without the risk of bias, discrimination, or persecution, provided that we can debias the AI and also restrict access to the data via, for instance, homomorphic encryption.
So my conclusion is that we should aim for maximum privacy and maximum transparency: a transparent world where the all-seeing eye is AI, not our neighbors. The less I know about you, the better for both of us, as long as the machines know us both.
I really agree with you here and I'm glad you decided to gather your thoughts into a full post response. I hadn't thought of it exactly the way you present it here, but it has inspired me.
I really like data being available but scrubbed in some way. For example, it's really useful to have huge amounts of medical data, but it's not so great if that data is tied to individual identities. I also favor one-way functions which reduce the dimensionality and/or granularity (detail) of information, perhaps also redacting some of it; a rough sketch of what I mean follows.
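The sketch below (field names and bucket sizes are just illustrative) runs an identity through a keyed one-way hash and coarsens detailed values, so a record stays useful for research but is harder to tie back to a person:

```python
# Illustrative scrubbing sketch: pseudonymize identity with a keyed one-way hash,
# reduce granularity of detailed fields, and redact what isn't needed.
import hashlib
import hmac

SECRET_SALT = b"replace-with-a-random-secret"  # held only by the data custodian

def pseudonymize(identity: str) -> str:
    """Keyed one-way hash: lets records be linked, but is not reversible in practice."""
    return hmac.new(SECRET_SALT, identity.encode(), hashlib.sha256).hexdigest()[:16]

def coarsen_age(age: int) -> str:
    """Reduce granularity: an exact age becomes a 10-year bucket."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

record = {"name": "Alice Example", "age": 34, "diagnosis": "asthma"}
scrubbed = {
    "patient_id": pseudonymize(record["name"]),   # one-way pseudonym
    "age_range": coarsen_age(record["age"]),      # coarsened detail
    "diagnosis": record["diagnosis"],             # kept; name and exact age dropped
}
print(scrubbed)
```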
On a blockchain it's really important to know what all the accounts are doing, but it is great that we don't necessarily know who is doing it. Or, put the other way around, it's good that when I see someone on the street I don't know what their blockchain account or public key is.
Regarding @dan's imagined world of totalitarian transparency, I think this is one of the most original thoughts you have on it:
Exactly. As I said in one of my comments, in some way we are all abnormal. We may be the most average person and still have a couple of weird quirks. That's what a norm is, not "this is what a normal person is like" but "here are the things that, between most people, we find acceptable".
We still need norms, and anyway we cannot get rid of them, but forcing people to expose their abnormalities will result in a favoring of the "positive" average (the average of only positive things).
In my opinion true freedom comes from having both privacy (between humans) and transparency (to the AI/machines). That is, if the AI is collaboratively developed, decentralized, and trending toward being unbiased. I do not assume bias will be removed overnight, just as I don't assume any software will be free of bugs, but I do assume that if we design it so that over time (through iterative improvement) the trend is toward less and less biased AI, we will eventually reach a point where we are satisfied with some minimum level of bias, and it will certainly be better than humans. Humans don't typically become less biased as they get older, or as they are exposed to more knowledge; some humans even ignore the latest knowledge if it conflicts with their world views/feelings.
Good point, it could work in theory. I wouldn't rule it out outright, but I would need to see it in action.
I agree with the first half of your argument. Humans have bias and so we are likely to judge others unfairly. It's also a lot to ask of the masses, who have already had to follow the whims of those with more power.
I don't trust AI because AI will be programmed by computers, or eventually will program itself based on who the hell knows what... it's like the AI is biased by its programming.
I believe transparency and privacy should be two optional ideals that everyone has the right to pursue. I would like to see the "nation" become more of a body that represents certain ideals and has little desire to influence what is outside of its territory, focused instead on truly embracing its ideals and showing the rest of the world the results. We could have societies that embrace total transparency and others that embrace total privacy, with many different flavors in between. The same could go for the use of technology: some fully embracing certain technologies, others embracing other technologies, and some rejecting technology altogether. The individual could become part of any society they wished according to their own principles. Any nation that wanted to go the AI route would have to program the AI to stay within its own borders.
The question is, how do we build such a world and how do we encourage peaceful coexistence between these nations? It's just an ideal, a direction I think we should move in. The point is, the culture has to change for the better in order for the policy or implementation to change for the better.
Anyway, I also made a reply to Dan's post :-D
https://steemit.com/steemit/@whatamidoing/on-creating-a-more-accepting-society-a-response-to-dan-s-post-on-radical-transparency
The type of AI I'm speaking about is narrow, known as autonomous agents, although I do not restrict my solution to narrow AI. If we really could develop an AGI in a safe way, which could be our best friend, our priest, our mentor, our adviser, our moral calculator, our lawyer, then in my opinion the benefits of having this decision support far outweigh the risks of having it. The issue with AGI is that it's a challenge to build a safe version of it, particularly because of the risk of nation states putting nationalist agendas into any instance of it. Nation states have enemies, have agendas beyond defending human rights, and have a tendency toward bias. The Terminator, we have to remember, is not a movie about a decentralized AI built on a blockchain which the whole world collaboratively programmed, trained, and developed in a transparent fashion; it was about an AI developed in secret, in a classified setting, with closed source, without any transparency, and with nationalism as the bias.
This bias in the story is what led to the robots concluding that all humans were the enemy, instead of just the opposing nation the United States was at war with. My idea doesn't work in the same way. The AI would be totally transparent, developed to work with blockchain technology, open source, and collaboratively developed, with a very low barrier to entry for anyone to contribute to training or debiasing it.
Would it work? Maybe the first instance of this sort of AI will not work, but the idea is that iterative improvement can take place, where every future generation becomes slightly less biased, slightly smarter, slightly more beneficial, slightly improved overall according to globally agreed upon criteria.
My suggestion is we maintain privacy between humans, but have complete transparency to our AIs, our machines.
I'm not against the technology. I just don't think we are mature enough to implement it in a responsible way. Maybe one day though.
Could this kind of AI be easily deactivated if it didn't serve its purpose properly or if problems arose?
The point of the technology is to evolve what we are, so as to make us mature enough to be responsible. Without the technology I don't believe there is a chance. Transhumanism is about being better than human, specifically because humans cause a lot of unnecessary misery.
An AI which exists decentralized on a blockchain is under the control of all the humans who keep the blockchain running; it's controlled by the consensus mechanism. If problems arose, then just as we saw with TheDAO, we would simply see a fork, with resources directed away from the problem AI.
When we live outside of the power-based, fear-inducing structures and learn trust and empathy, we are capable of such goodness on a grand scale. The problem is not human nature but the cycle we are caught in. I don't expect I will convince you there is a better way, but I hope you stay open to the possibility that we can do even better without relegating the job of overseer to machines. If not, come meet my friends one day ;-)
Great insightful post as always Dana. Loved reading your perspective.
Regards Nainaz
#thealliance
I just don't understand how it's not bloody obvious that stripping people of their privacy (whether they like it or not) is a gross violation of human rights.
You are terribly wrong about the conclusion in your last paragraph. It won't be just the machines knowing about you and me; after all, whom do the machines ultimately serve?
Right, other (more powerful) humans.
You say not to put people there and leave it to the machines. This is not possible.
But let's say it is; regardless, the self-censoring that comes from knowing you are being watched will manifest, whether it is a machine or a human doing the spying.
They said peer-to-peer electronic cash was not possible. A central bank was always supposed to be necessary. We were supposed to always have to trust centralized institutions, because decentralized institutions weren't considered possible. Perhaps those who think it is impossible are wrong?
Can an algorithm violate human rights? Perhaps, but an algorithm cannot "know" anything; it merely computes. It's not able to violate your privacy. This is particularly true if the right cryptography is used (homomorphic encryption), which renders it impossible by mathematics/physics for your privacy to be violated, as the information being computed on by the AI is never decipherable.
I think you don't understand the nature of computation or of machines. Computation is already something which occurs in non-humans, and even in the non-living. For example, rocks, cloud formations, and many other natural things could be said to involve computation. My idea is to keep no secrets from computation itself (think pancomputationalism) but merely to restrict human computation, as privacy applies to human beings, not to nature itself or non-human computation.
If you are worried about trees, plants, and non-human animals spying on you, then you are missing out on the ability to benefit from their computation. The universe can be said to be a computational system from which we keep no secrets.
Anyone who creates the AI has, and will have, the ability to dictate pretty much every aspect of society. Surely the person in control of the AI will be tempted to exploit it for his or her self-interest. I don't believe we can ever have a situation where a machine is on totally neutral ground.
Which nation controls cryptocurrency? This is the whole point behind the decentralization of AI. If you believe a machine can never be neutral, what is the point of decentralization? Neutrality can only be achieved through decentralization, which is why I tend to agree with you if the AI is developed in secret, or by just a few companies under pressure and coercion from their national governments.
I totally agree with the idea of decentralization, which helps it be neutral. However, ordinary people like you and me do not have the time or the money to investigate the companies that develop AI technology. While blockchain has its advantages compared to the traditional banking system, it's still susceptible to a 51% mining attack. It doesn't matter if you have decentralization if the majority of the mining power is coming from the same place.
Maybe I'm not normal if I'm posting on Steemit, on topics like this, and thinking as I do, and that is perhaps the issue with total transparency. Do we want to promote maximum normalcy in thought and behavior?
What do you mean by maximum normalcy in thought and behavior?
Since the crowd is God, it decides whether you are an angel or a demon, whether you are a sinner and what counts as a sin, and it defines this without necessarily telling you what it has decided. At any time you can be kicked out of the in-group because your thoughts and behaviors don't conform to it.
So by normal, I mean you have to be the most normal person in the group to guarantee your own safety from being shunned. Being shunned under total transparency is social death. No one will talk to you; your friends and family will all have to turn their backs on you or risk being shunned themselves.
Remember, we aren't talking about the sort of shunning in history where eventually people forgive and forget. We are talking about blockchain-enhanced permanent shunning, where there is a blockchain which never forgets and people who never forgive, so the shunning is permanent, for life.
Interesting post. Can I ask who you think should or would write/run the AI?
Just one person or a "committee"? And for what purpose?
I smell a 'dictatorship' of sorts?
Everyone should, which is exactly the point. You cannot trust a single company to produce unbiased AI, or to debias all AI. You cannot expect a company operating in one country to be free from bias. You can't expect a group of only American programmers, all from the same or similar backgrounds, to produce something unbiased.
But this is exactly why I am in favor of decentralized development of AI.
I would like to program the AI to stay out of my business ;-P I'm OK with my neighbor knowing my business IF I CHOOSE to share it, which ideally I will... but maybe not.
Decentralised AI sounds OK in theory, but I'm concerned about what happens once AI is prevalent and someone figures out how to "buy up" all the bits (think MSM, buying up all the small fish over the last 50 yrs).
The only solution I can see is ANON...... I'm working on being a Crypto-Zombie.
The only 'people' who need to know me are my doctor (if I had one)....maybe the taxman (but I could debate that :-) ) but privacy is #1 IMHO
What is a crypto-zombie? I keep seeing governments like South Korea and others mention that phrase. Maybe you should fully define it.
I am professionally working on decentralizing AI because I think it is actually a human right to have it and one of the best things we can do. That said, I don't know how to debias AI, I don't know what is or isn't moral, but with a decentralized AI, I can always ask.
The other great challenge, beyond decentralizing AI, is to find a way to make homomorphic encryption feasible and practical. Currently it is still mostly theoretical; due to performance issues it has not yet been made truly practical. If we can achieve practical homomorphic encryption, then that technique alone will settle this debate: by way of cryptography we could have both radical privacy and radical transparency at the same time, and that, in my opinion, is what we should be aiming for.
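To give a feel for what "computing on data you cannot read" means, here is a toy illustration, not a practical or secure scheme: unpadded "textbook" RSA happens to be multiplicatively homomorphic, so one specific operation can be carried out on ciphertexts alone. Fully homomorphic encryption generalizes this to arbitrary computation, which is exactly where the performance problems come in.

```python
# Toy homomorphism demo with tiny, insecure parameters; for illustration only.
p, q = 61, 53                        # demo primes
n = p * q                            # modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 12, 7
c1, c2 = encrypt(a), encrypt(b)

# The "machine" multiplies ciphertexts without ever seeing a or b.
c_product = (c1 * c2) % n

assert decrypt(c_product) == (a * b) % n
print(decrypt(c_product))  # 84
```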
I thought I started the use of C.Z. right here on steem haha maybe it's catching on? :-)
To me it means I have crypto, I buy coffee & pay rent with it, etc. I can convert to any 'coin' / fiat as a merchant demands.
But I am NOT registered anywhere, with anyone (esp govs / bankers)
I exist yet I do not....a zombie ... free of FIAT, free of GOV, free of systems of control
I have no probs re AI as a way to help/assist/aid humankind.....just leave out the personal identification.
I want to be free, not tattooed (digitally) with a number, controlled by a 'master'
Basically this all boils down to a choice - conceptually as follows:
- a FEAR based system where we have a "controller" looking after us and our interests (prob to 'their' advantage per current world / NWO)
- a TRUTH based system where we look after self & those around us (per Our New World => ONW)
I think governments are watching you because the phrase "Bitcoin Zombie" was used: http://fortune.com/2017/12/12/south-korea-cyptocurrency-trading/
I thought I heard the phrase before and now I realize it came from you.
Cool
I agree with you. I don't want to know everything about my surroundings. There are things, normal things, personal habits, that can push me away, and I'd rather not know about these little secrets. As you know, everyone has some skeleton in the closet, and importantly, these are past actions. Life is a river; it is human nature to change. Everyone makes mistakes, and then they change and don't want to think about them. So do I: I'm not the same man I was a few years ago. Some things I regret and want to correct, or at least not repeat. This is my experience, and I don't want it to become public knowledge.
Your thoughts are beautiful!
Do you believe that companies like Facebook and Google are handling the privacy/transparency issue correctly? Arguably, Facebook and Google know everything about most of their users (based on searches and posts). But both companies have a tremendous amount of security between your neighbor and your information.
However, both companies will sell your information "anonymously" to vendors in order to cover their costs.
Let me paint the nightmare scenario of Dan Larimer's ideal of "radical transparency". The nightmare is that it will just create another pyramid, far worse than the pyramid we currently have. Think of a sort of beauty contest, where traits which people cannot change are used to rank them in society; a contest to see who can be the most normal, where the top of the new pyramid is occupied by the people who appear the most normal.
So what happens to minorities? I guess if you weren't born into the right family (a good family) with the right traits (genetics), the right race (social position/caste), the right religion (whatever your parents or religious teachings taught you), then you're, for lack of a better word, fucked. Everyone can simply judge your behavior through their own cultural traditions, and if you're not considered normal then you are now a minority.
So what happens to the abnormals, the minorities, those deemed not to meet the current beauty standard? Will they be allowed to have jobs? To have apartments? To get married? To do anything? Who will associate with the undesirables? As much as Dan Larimer promotes coordinated shunning as a punishment, and as effective as it is, it is a bit too effective when you have a technology which can never forget a sin or transgression, going back to when you were a child.
Does that look like something better than what we currently have, or worse? Here is something interesting to note (it might just be a pattern I notice and no one else does): why aren't any minorities getting behind radical transparency? It's almost always the most normal person in the class, or the prom king and queen, who want total transparency, but where is the backing from the misfits?
This could be because it merely creates a new pyramid which puts the normals on top and the misfits at the bottom. A new social hierarchy is always going to form because humans obsessively rank and judge each other on instinct. To pretend this will not happen is naive, and in my own post I assume that it will happen, which is why I think we need to let unbiased AI be the only entity in our society trusted with all our secrets. AI does not by default have the unconscious biases, the irrationality, or the ignorance which humans tend to have, and which would otherwise be involved in the analysis and judgment of your most private data.
That is a brilliant reply!!
Thank you.
That's the crux here. Can we do it? I don't think so..
If the blockchain can be decentralized, then AI can be both decentralized and debiased in the same way currency can.
Thank you so much for sharing. #resteem & UPVOTE DONE.
@dana-edwards Thanks for the article and your comment on my reply to Dan's article. This made me think about a way of framing the dynamics between privacy and transparency, and I have written this up in response to your and Dan's articles. Would be interested in your thoughts on this.
Here is the article: https://steemit.com/governance/@voxxov/freedom-transparency-and-privacy
An excellent response to Dan's post and lots to think about again. Thanks also for the support for my position in the comments.
I definitely agree with you about the benefit of transparency to the socially wealthy as opposed to the rest of us, and in regard to the current ability some have to do terrible things and still achieve success in our world as things stand. I accept that with humanity we will always have some immoral elements within our society, and that we should aim to limit these as much as possible through collective vigilance, but I fear the unknown with AI. Maybe during such discussion we might decide it is the right route, or we may find an outcome with fewer unknowns.
Again, lots to think about. :)
I'm not trying to be a smart-arse, but it is clear that you have a bias towards AI in terms of its ability to lead the way in this area, and you would like to see such a process implemented because you feel it is the best route forward. I would suggest that perhaps you should propose a discussion towards discovering whether all of us feel this is the best route forward.
The difference is that with the AI option you can determine whether you want AI assistance or not, how much to trust the AI, what values to configure it with, and so on. The solution from Dan's article is not opt-in and not voluntary, because radical transparency doesn't give us a choice to join or not join; we would have to develop AI just to live in that environment with no privacy, due to the massive increase in complexity that extreme transparency and an unforgiving blockchain would bring.
If you would rather take your chances when anything you say or do can turn the crowd against you, this is of course the option you have. I just want people to have more options than to leave things up to chance.
I agree with you in your disagreement with dan. That post seemed like it was written by the government, and was quite unsettling when you know how it may play into the agendas of tomorrow.
But giving all the power to an AI is not any better, and in fact, I believe it to be what they want.
You assume AI is some centralized thing controlled by "them" which you'll be forced to give power to. I never said to give any power to the AI; I said to allow people the option of consulting with AI. In other words, I do not promote, for example, government-built AI to which everyone must then hand all decision making. I promote decentralized personal AI, where each of us has our own decision-support network to recommend decisions to us. It cannot decide for you, it cannot control or manipulate you; all it can do is help you decide. When it is decentralized and done right, each individual gets to decide how much or how little to trust the AI, and only people who trust it completely will have it making decisions on their behalf.
Transparency pushed on everyone is a different deal. Not everyone is prepared to live under that, because not everyone is familiar with life in a small town. Personally, if I have to be transparent I would trust my machines and my AI more than any person, provided the AI has exactly my morals because I selected it. That AI, which I can trust and be transparent to, could see everything, be my character witness, and solve the same problem Dan wants to solve using the crowd. The difference is the AI can solve the problem without loss of privacy, without unnecessary suffering, and it can actually improve my ability to be moral (Dan's transparency does nothing to help people be better).
It is easy to punish people in the short term for being bad (or just stupid), but it is very hard to help people become increasingly less bad over time. The issue I have with Dan's solution is that it's more of the same of what we have, where you just make it easier to punish "wrongdoers" and find fault, but offer absolutely no help for people to improve, be moral, or avoid being punished. In other words, it's a new trap, with the blockchain capturing every mistake for later punishment. My idea is to help people reduce the risk of being caught in these traps in the first place by using technology to help produce wiser decisions.