RE: Using sklearn I am getting memory errors, is there any way to use batching?


Keras == "very few lines of code"

I bookmarked this video a while back for future reference. When you mentioned the batch-processing issue, it reminded me of this presentation.

A possible batch-processing solution is shown at 17:32, using the model.fit() function.
I assume model.fit() is a wrapper around some heavy parallel routines that split your training rows into manageable chunks and batch-process them on the GPU as kernel functions. That saves you from writing the for loops yourself, since .fit() handles the vectorization and/or compiled loops in GPU space. Python loops are tremendously slow; you can vectorize as much as possible with NumPy arrays, but translating for loops into vectorized code is a little tricky. Two sketches of both points follow below.
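Here is a minimal sketch of the batch_size idea in model.fit(). It assumes tf.keras, a small made-up binary-classification model, and random NumPy arrays standing in for your real training data; the layer sizes and batch size are illustrative, not a recommendation.

```python
import numpy as np
from tensorflow import keras

# Hypothetical toy data standing in for the real training set.
X = np.random.rand(100_000, 20).astype("float32")
y = np.random.randint(0, 2, size=(100_000,))

# A tiny illustrative model.
model = keras.Sequential([
    keras.layers.Input(shape=(20,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# batch_size tells fit() to slice the data into mini-batches of 256 rows,
# so only one batch at a time needs to be resident on the GPU.
model.fit(X, y, epochs=5, batch_size=256)
```

If the full arrays do not even fit in RAM, fit() also accepts a generator or a tf.data.Dataset, which streams batches from disk instead.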
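And a small sketch of the vectorization point: the same sum of squares written as a pure-Python loop and as a single NumPy call. The array name and size are made up for illustration.

```python
import numpy as np

x = np.random.rand(1_000_000)

# Slow: pure-Python loop, one interpreter round trip per element.
total = 0.0
for value in x:
    total += value * value

# Fast: the whole loop runs inside compiled NumPy code.
total_vectorized = np.dot(x, x)
```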

Keras does all the grunt work.

Keras is great for prototyping.

Keras == "very few lines of code"
