Born in 2017, TokenData is the result of in-depth reflection and long-term work. Here's our story.
When we first heard about the Brooklyn Project in 2015, it felt like everything we knew had collapsed. We dug into the Blockchain subject and felt like a new world was opening its doors to us. We also discovered the Ethereum project and started to follow & support all the extraordinary programs that happened in 2016.
Since then, we have been a committed team dedicated to collecting, processing & analyzing all the data we can find on the Blockchain ecosystem, building a decision-making platform powered by Machine Learning.
A Billion-Dollar Problem
Over the past two years, funding through the Blockchain has grown at an unprecedented pace, and this phenomenon has helped make the Blockchain ecosystem one of the deepest funding pools in the world: the blockchain market raised almost US$7.6 billion in 2017, while venture capital funds invested only $3.6 billion in early-stage projects over the same period.
While this sector offers some of the best opportunities, investors & participants are exposed to a significant risk of losing their investment in this ecosystem: almost half the companies that planned an ICO in 2017 put their plans on hold or abandoned them, and some of the tokens traded in significant volumes on crypto exchanges turned out to be frauds.
This means that a significant number of investors have lost their money. These potentially high losses essentially stem from the lack of visibility on the fundamentals of projects, and from the absence of checks and balances, or even bargaining power, for investors when entrepreneurs fail to deliver on their promises.
As long as the tools available to investors for distinguishing promising projects from the rest do not allow a thorough analysis of the fundamentals, the risk will remain high. For all these reasons, it is essential to provide investors with instruments that clearly separate truly promising projects from unworthy ones.
TokenData's Solution: the Power of Machine Learning
This is why we came up with TokenData: we aim to give decision makers all the tools they need to get the right feel for a project, a company or a market, and the blockchain ecosystem seemed the best place to demonstrate our ability to improve the state of the art.
TokenData combines publicly known data, financial data and alternative data. Compiling all this data allows the construction of an unprecedented Data Lake to create new decision models that match an ever-changing world. Learn more about all the data available in our solution on p.26 of the white paper.
For you to understand the scope of our solution, we need to quickly explain the ins and outs of Big Data & Machine Learning.
We live in a world where data production & storage is constant. While this data was originally limited to raw financial and economic data, it is now enriched by so-called alternative data, disseminated through the democratization of the Internet, connected objects & high-speed connections, all producing ever more data to store in real time. However, even larger organizations are only beginning to understand the strategic impact of this data (PwC has estimated that revenues from the marketing of data could reach US$300 billion by 2019).
However, data is only valuable if properly presented. In other words, it must be consolidated into subsets to ensure that it is usable.
This is why the main challenge for data providers such as ourselves is to ensure that the verification, cleaning and ordering of the data has been completed.
In this context, we are working on automating the preparation and cleaning of the data collected in our Data Lake. This is where Machine Learning first comes into play. Preparation is usually carried out in two steps using Machine Learning algorithms:
- A so-called processing and conversion phase; for example, the vectorization and extraction of key data from a pitch deck to turn it into a usable data set;
- A so-called cleaning and verification phase; for example, when a data point appears "abnormal" compared with the same variable collected from other sources, we can discard or correct it, and then verify the overall consistency of the data set.
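To give a flavor of the cleaning phase, here is a minimal illustrative sketch: a z-score filter that flags an observation as "abnormal" when it deviates strongly from the same variable reported by other sources. The threshold and the pure-statistics approach are assumptions for the sake of the example; the production pipeline described above uses Machine Learning and is not detailed in this post.

```python
# Illustrative cross-source cleaning: discard observations that deviate
# strongly from the mean of the same variable across all sources.
from statistics import mean, stdev

def clean_cross_source(values, z_threshold=3.0):
    """Keep observations within z_threshold standard deviations of the
    cross-source mean; return (kept, discarded)."""
    mu, sigma = mean(values), stdev(values)
    kept, discarded = [], []
    for v in values:
        if sigma == 0 or abs(v - mu) / sigma <= z_threshold:
            kept.append(v)
        else:
            discarded.append(v)
    return kept, discarded

# Example: a token price reported by five sources, one clearly abnormal
kept, discarded = clean_cross_source([1.02, 1.01, 0.99, 1.00, 9.75],
                                     z_threshold=1.5)
# discarded == [9.75]
```

Once the abnormal observation is removed, the overall consistency of the data set can be verified against the remaining sources.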
Then, we use other Machine Learning techniques to create so-called alternative indicators for our users.
For example, at the time of writing we are working on two different indexes for blockchain companies:
A Scam Index that predicts, as early as the publication of the ICO's white paper, the probability that the ICO is a fraud. Learn more about this Scam Index on p.26 of our white paper.
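As an illustration only, a fraud probability of this kind could be produced by a logistic model over red-flag features extracted from the white paper. The feature names, weights and bias below are invented for the example and are not the actual Scam Index model:

```python
# Toy logistic scorer for white-paper red flags (illustrative only).
import math

# Weights a trained model might assign (positive = riskier); hypothetical.
WEIGHTS = {
    "guaranteed_returns": 2.1,   # promises of guaranteed profit
    "anonymous_team": 1.4,       # no verifiable team identities
    "plagiarism_score": 3.0,     # share of text copied from other papers
}
BIAS = -2.5

def scam_index(features):
    """Return a fraud probability in [0, 1] via the logistic function."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

low = scam_index({"plagiarism_score": 0.05})                  # ~0.09
high = scam_index({"guaranteed_returns": 1, "anonymous_team": 1,
                   "plagiarism_score": 0.8})                  # ~0.97
```

In practice such weights would be learned from labeled examples of fraudulent and legitimate ICOs rather than set by hand.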
A Risk Index that helps define the speculative risk associated with a token. Thanks to our rating system, we can communicate to our users how high the risk of default associated with a token is. This index is closely related to the token's underlying: the stronger the underlying, the lower the risk associated with buying the token (and vice versa). Updated regularly, this index takes into account criteria such as the execution of the roadmap against the pre-defined schedule, community sentiment toward the company on social networks, the volumes traded on the different exchanges, and the arrivals and departures of project contributors (employees and contractors).
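The criteria listed for the Risk Index can be sketched as a weighted composite score. The weights, scales and 0–100 range below are assumptions for illustration; the actual rating system is not described in this post:

```python
# Illustrative composite Risk Index from the four criteria above.
def risk_index(roadmap_execution, sentiment, volume_score, contributor_churn):
    """All inputs normalized to [0, 1]; higher result = higher risk.
    roadmap_execution: share of milestones delivered on schedule
    sentiment: community sentiment on social networks (1 = positive)
    volume_score: liquidity across exchanges (1 = deep, stable volume)
    contributor_churn: departures relative to team size (1 = heavy churn)
    """
    # Hypothetical weights: strong fundamentals lower risk, churn raises it.
    safety = 0.35 * roadmap_execution + 0.25 * sentiment + 0.20 * volume_score
    risk = 1.0 - safety + 0.20 * contributor_churn
    return round(100 * min(max(risk, 0.0), 1.0), 1)

solid = risk_index(roadmap_execution=0.9, sentiment=0.8,
                   volume_score=0.7, contributor_churn=0.1)   # 36.5
shaky = risk_index(roadmap_execution=0.2, sentiment=0.3,
                   volume_score=0.2, contributor_churn=0.7)   # 95.5
```

Because the inputs are updated regularly (roadmap progress, sentiment, volumes, team changes), the score naturally moves over time with the token's underlying.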
Use Case
The financial industry is, by definition, willing to pay very significant sums to acquire large amounts of relevant data. The more relevant the data available to investors, the more likely they are to improve their investment strategy. A simple adjustment of an investment decision model through the input of new data can have significant monetary consequences.
And what's amazing about our solution is that we offer investors this new source of alpha so that they will be able **to improve their analysis processes & concentrate their efforts on improving the efficiency of their models** (instead of spending time collecting & structuring data).
We are aware that our main clients (i.e. hedge funds & private investors) can be reluctant to adopt new decision models when they have been working with the same methodology for years.
That's why we always suggest that they first use our fundamental data in their own decision models before switching to our alternative indicators for investment choices; this way, they can evaluate how the accuracy of our decision models surpasses their own on the same data.
TokenData's Dream Team
Our team is a worldwide group of entrepreneurs, tech experts & legal advisors who all share the same vision: creating new decision models based on data, not on pre-biased human hypotheses. Each of our full-time dedicated members brings their own expertise & personality, as well as a unique set of skills, to TokenData's mission.
- Alexis Berthoud - Chief Executive Officer
- Agathe Jambu Merlin - Chief Operating Officer
- Steed Monteiro - Chief Technical Officer
- Ethan Sebban - Chief Data Officer
- Yann Pringault - Full-Stack developer
- Max Huang - Legendary Senior Data Scientist
- Michal Monselise - Academic Senior Data Scientist
- Partha Sen - Crypto Senior Data Scientist
- Yijing Li - Genius Junior Data Scientist
- Sophie Gervais - Marketing Specialist
- Jonathan Nabais - Marketing Specialist
Legal Advisory Team:
- Sophie Vermeille - ICO Structuring Specialist
- Émilie de Vaucresson - IT & Intellectual Property
- Nicolas Rouiller - FINMA Specialist
- Isabelle Chauvet - Legal Fiscal Specialist
If you want to join us in our adventure, we still have some positions open on AngelList (https://angel.co/tokendata).
We're also very excited to announce here, for the first time, a unique partnership with a research team from a top US university that will help us change the world.
We are currently at the beginning of our adventure, please take the opportunity to discover our website & our White Paper which includes our roadmap and more details on our solution.
We are reachable on Telegram any time to answer all the questions you may have.
Follow us on
- Facebook : https://www.facebook.com/TokenData-1884067325220498/
- Medium : https://medium.com/@tokendataICO
- Telegram Announcements : https://t.me/joinchat/AAAAAEnpmczvxh-EC90FtQ
- Telegram Chat : https://t.me/joinchat/H6goKw7qGXzszoJMA2Owlg
- Linkedin : https://www.linkedin.com/company/tokendata-ai/
- YouTube : https://www.youtube.com/channel/UC-B5rIX1eLR3elOjyZ3nG6w
- Twitter : https://twitter.com/tokendataai
- Reddit : https://www.reddit.com/user/tokendataico
- Discord : https://discordapp.com/channels/484700860094349312