Not only is data arriving faster and in higher volumes, but it is also arriving messier. Such “non-Euclidean domains” can be pictured as complicated graphs composed of data points with specified relationships or dependencies to other data points. Deep learning research is now working hard to figure out how to approach these data-as-spaghetti sources through graph neural networks, or GNNs. With so much happening in this emerging field, this survey paper topped the list of most-saved articles in users’ collections on arXiv.org, so something must be afoot in this area. The survey also summarizes open source code, benchmark datasets, and model evaluations to help you start untangling this exciting new approach to machine learning.
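The central idea behind most GNNs — nodes repeatedly exchanging and aggregating feature information along graph edges — can be illustrated with a minimal sketch. This is a generic mean-aggregation layer in plain NumPy, not code from the survey; the function name and the choice of mean aggregation are illustrative assumptions:

```python
import numpy as np

def gnn_layer(adj, features, weights):
    """One simplified message-passing step: each node averages its
    neighbors' feature vectors (including its own), then applies a
    learned linear transform and a ReLU nonlinearity."""
    # Add self-loops so each node keeps its own features.
    adj_hat = adj + np.eye(adj.shape[0])
    # Row-normalize: divide each node's summed messages by its degree.
    deg = adj_hat.sum(axis=1, keepdims=True)
    messages = (adj_hat @ features) / deg
    # Linear transform followed by ReLU.
    return np.maximum(messages @ weights, 0.0)

# Tiny graph: 3 nodes, with edges 0-1 and 1-2.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
features = np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [1.0, 1.0]])
weights = np.eye(2)  # identity transform, purely for illustration

out = gnn_layer(adj, features, weights)
print(out.shape)  # (3, 2): one updated feature vector per node
```

Stacking several such layers lets information propagate across multi-hop neighborhoods, which is what allows GNNs to capture the dependencies between distant data points that the paragraph above describes.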