Hi @steemchiller,
I have been working on the deep learning curation bot, and I wasn't happy with the first version I created (it was alright at predicting article values, but not great). So I've gone back to add more complex features, and this time I'm using historic data on the author and the top/median curators. One problem I've run into is that data collection is really slow due to RPC errors. I realize I could avoid the RPC errors by setting up my own private node, but I don't know how much storage that takes, and wanted to ask you since you're a witness and run your own node. Do you have any advice on setting up a node, if it's even possible on my computer? (If not, perhaps I could invest in cloud storage, as long as it's not extremely expensive.)
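To give a rough idea of the bottleneck, here is a minimal sketch of the kind of fetch loop involved: serial HTTP requests, each retried with backoff when the node errors out (the endpoint, method, and limits are just placeholders, not my exact code):

```python
import json
import time

import requests

NODE = "https://api.steemit.com"  # public node; placeholder endpoint

def rpc(method, params, retries=5):
    """One JSON-RPC call over HTTP, with exponential backoff on errors."""
    payload = {"jsonrpc": "2.0", "method": method, "params": params, "id": 1}
    for attempt in range(retries):
        try:
            r = requests.post(NODE, data=json.dumps(payload), timeout=10)
            r.raise_for_status()
            body = r.json()
            if "error" in body:
                raise RuntimeError(body["error"])
            return body["result"]
        except Exception:
            time.sleep(2 ** attempt)  # 1s, 2s, 4s, ... between retries
    raise RuntimeError(f"{method} failed after {retries} retries")

# One author's history alone means many serial round trips like this one
history = rpc("condenser_api.get_account_history", ["cmp2020", -1, 1000])
```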
Thanks for your help!
@cmp2020
Hi, the requirements for running your own node depend on the APIs you want to use. In general, you won't get far with a 500 GB disk anymore; for most common cases I would recommend at least 1 TB of SSD space. The amount of available RAM is less important (32 GB would be good, but even 16 GB currently works without any issues).
If you also plan to run something like a Hivemind node (in case you are using the Bridge API methods), which receives blocks from a regular full node, you will need even more disk space, of course.
So, the first thing to do would be to find out which APIs your current (and maybe upcoming) projects will require. Basically, it's always more fun to work with a fast SSD (ideally NVMe). If you decide to run your own node, I recommend using WebSocket connections instead of normal HTTP requests, because they perform much better when running many requests in a row ;)
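A minimal sketch of what that can look like from Python, assuming your node's webserver exposes a WebSocket endpoint on the default port 8090 (the `websocket-client` package provides `create_connection`):

```python
import json
from websocket import create_connection  # pip install websocket-client

# Assumes a local node listening on the default webserver port 8090
ws = create_connection("ws://127.0.0.1:8090")

# Reuse one connection for many requests instead of one HTTP round trip each
for block_num in range(60_000_000, 60_000_100):
    ws.send(json.dumps({
        "jsonrpc": "2.0",
        "method": "condenser_api.get_block",
        "params": [block_num],
        "id": block_num,
    }))
    block = json.loads(ws.recv())["result"]
    # ... extract author/curator features here ...

ws.close()
```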
Here are some links that might help you to get started faster:
https://files.steem.fans
https://github.com/steemfans/steem-docker-ex
Thanks for your response! Unfortunately, my computer maxes out at 16 GB of RAM, so I think I will have to find a different approach until I have a proof of concept showing it would be worthwhile to invest in either cloud computing or a computer with more RAM. I did buy 2 TB of external storage today, though. I would appreciate any recommendations on how to download data faster with what I have. I am considering just cutting the historic data for now and working with the other features I've added.
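In the meantime, the rough interim plan is to parallelize the HTTP requests with a small thread pool and collect failures for a later retry pass (the endpoint, pool size, and block range below are just placeholders):

```python
import json
from concurrent.futures import ThreadPoolExecutor

import requests

NODE = "https://api.steemit.com"  # public endpoint; placeholder

def get_block(num):
    """One JSON-RPC call; errors return None so the caller can retry later."""
    payload = {"jsonrpc": "2.0", "method": "condenser_api.get_block",
               "params": [num], "id": num}
    try:
        r = requests.post(NODE, data=json.dumps(payload), timeout=10)
        r.raise_for_status()
        return num, r.json().get("result")
    except Exception:
        return num, None

# A small pool keeps throughput up without hammering the public node
with ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(get_block, range(60_000_000, 60_000_200)))

failed = [n for n, b in results.items() if b is None]  # retry these later
```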
Thanks again!
Edit
We decided to try setting up a node on our extra computer (which we hardly use) with 16 GB of RAM and the external storage I bought. So far we haven't run into any problems, so we might succeed at setting up the node. My dad and I didn't think it would work because we're running it on WSL, but so far it seems to be fine.
Yet another edit
We got it working and have started syncing the blockchain. Thanks for your help! I hope to post an update on the AI soon ;)