Recently I was thinking about this: our most efficient "everyday use" sorting algorithms run in the neighborhood of O(n log n). Now what if it were possible to create a "smart" Bogosort that takes into account the ballpark of the data it is sorting, uses those estimates to "guess" the index of each randomly selected element, and then goes through the list to "sweep up" whatever the guesses got wrong?
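Here is a minimal sketch of what I have in mind, assuming the values are roughly uniformly spread between their min and max (the function name and the insertion-sort cleanup pass are just my own illustrative choices, not an established algorithm):

```python
import random

def guess_and_sweep_sort(data):
    """Rough sketch of the 'guess the index, then sweep up' idea.

    Assumes the values are spread fairly evenly between their min and max,
    so a simple interpolation gives a decent index guess; an insertion-sort
    pass then cleans up whatever the guesses got wrong.
    """
    n = len(data)
    if n < 2:
        return list(data)

    lo, hi = min(data), max(data)
    if lo == hi:                      # all values equal, already "sorted"
        return list(data)

    # Guess phase: drop each value into the slot its ballpark suggests.
    slots = [[] for _ in range(n)]
    for x in data:
        i = int((x - lo) / (hi - lo) * (n - 1))   # guessed index
        slots[i].append(x)
    guessed = [x for slot in slots for x in slot]

    # Sweep phase: insertion sort fixes anything the guesses misplaced.
    for i in range(1, n):
        x, j = guessed[i], i - 1
        while j >= 0 and guessed[j] > x:
            guessed[j + 1] = guessed[j]
            j -= 1
        guessed[j + 1] = x
    return guessed

if __name__ == "__main__":
    data = [random.randint(0, 1000) for _ in range(20)]
    print(guess_and_sweep_sort(data))
```

This is basically the scatter phase of an interpolation/bucket-style sort plus an insertion-sort cleanup: when the data really is close to uniform, most guesses land near the right spot and the sweep is cheap, but when the distribution is skewed the cleanup pass ends up doing most of the work.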
Also, most of the time you use a sorting algorithm, it is to repeatedly sort data coming from a single source. What if you processed that output with machine learning to detect patterns and developed a bogosort variant that plays off those patterns?
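As a toy version of that idea, here is a sketch that skips real machine learning and just memorizes an empirical CDF from one batch of a source's output, then reuses it to guess positions in the next batch (the skewed sample source and all function names are made up for illustration):

```python
import bisect
import random

def learn_placement(history_sample):
    """Build a crude 'learned' placement function from previously seen data.

    Stands in for the machine-learning step: it memorizes a sorted sample
    and uses its empirical CDF to guess where a new value belongs.
    """
    sample = sorted(history_sample)

    def guess_index(x, n):
        # Fraction of past values below x, scaled to an index in [0, n-1].
        frac = bisect.bisect_left(sample, x) / len(sample)
        return min(n - 1, int(frac * n))

    return guess_index

def sort_with_learned_guesses(data, guess_index):
    """Scatter values into their guessed slots, then insertion-sort to clean up."""
    n = len(data)
    slots = [[] for _ in range(n)]
    for x in data:
        slots[guess_index(x, n)].append(x)
    out = [x for slot in slots for x in slot]
    for i in range(1, n):             # sweep-up pass
        x, j = out[i], i - 1
        while j >= 0 and out[j] > x:
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = x
    return out

if __name__ == "__main__":
    # The "single source" keeps producing similarly skewed data:
    # learn the shape from one batch, reuse the guesses on the next.
    def source():
        return [int(random.expovariate(0.01)) for _ in range(50)]

    guess = learn_placement(source())
    print(sort_with_learned_guesses(source(), guess))
```

The point of the sketch is that the "pattern" being learned is really the source's distribution; anything that predicts a value's final rank well makes the guess phase accurate and the sweep cheap.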
Would it be possible to achieve better efficiency? Possibly. Would it be easy? No, because you would need to create a new sorting algorithm for every "input type" of data you are working with. I could only see this being practical at a very large scale.
Please let me know what you think!