Before any analysis can be carried out, we need to define data sources, then collect and cleanse the data. Data must be "trusted", or it can hardly represent the reality we want to understand.
Even if the data is trusted, who can guarantee that the resulting insights are equally trustworthy? An ordinary person would probably notice if Artificial Intelligence (AI) gave bad digital advice about banking, online shopping, or cheap flights. But there is a growing number of children with cognitive disabilities (autism, Asperger syndrome) who are expected, in the future, to rely largely on digital labor (DL) assistance. In an autistic person's adult life, digital advice delivered through wearable devices will replace the parents who today persistently remind these children of their everyday routines and journeys.
The requirements for AI and DL must be reviewed now, so that people with cognitive disabilities are protected in the digital ecosystem. Big Data and robust models are part of the solution: if a dataset is big enough, individual errors will not distort the common trend and, consequently, the final insight. So what is the measure for considering data in a given domain "big" enough?
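To make that intuition concrete, here is a minimal sketch (the values, error rate, and function name are my own illustrative assumptions, not from any real dataset): it simulates measurements of a known trend, corrupts a small fraction of them, and shows that a robust estimate stabilizes near the true value as the sample grows.

```python
import random
import statistics

def estimate_trend(n_samples, error_rate=0.05, true_value=10.0, seed=42):
    """Simulate n_samples noisy measurements of true_value, where a
    fraction (error_rate) are corrupted outliers, and return a robust
    estimate of the underlying trend. All parameters are illustrative."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_samples):
        if rng.random() < error_rate:
            data.append(rng.uniform(-1000, 1000))   # untrusted, corrupted record
        else:
            data.append(rng.gauss(true_value, 1.0))  # trusted measurement with noise
    # The median tolerates a small share of outliers, unlike the mean.
    return statistics.median(data)

for n in (100, 10_000, 1_000_000):
    print(f"n={n:>9,}: estimate = {estimate_trend(n):.3f}")
```

Using the median rather than the mean here is the "robust model" half of the argument: a small share of corrupted records can drag the mean arbitrarily far off, while the median stays on the trend, and the estimate tightens as the sample grows.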