The sorting function is somewhat complex but cheap to compute: it only depends on two parameters, the students' scores and the class indexes.
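To show what I mean, here is a toy sketch; `score_assignment` and its weighting formula are just placeholders for my real logic, the only point is that the objective takes exactly these two inputs:

```python
from typing import Sequence

def score_assignment(scores: Sequence[float], class_indexes: Sequence[int]) -> float:
    """Placeholder objective: weight each student's score by its class index.
    My real formula differs, but it also depends only on these two inputs."""
    return sum(s * (idx + 1) for s, idx in zip(scores, class_indexes))

def best_assignment(scores: Sequence[float], candidates) -> tuple:
    """Pick the candidate index tuple that maximizes the objective."""
    return max(candidates, key=lambda idxs: score_assignment(scores, idxs))
```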
I don't need to store all possible solutions in memory. I once ran a MapReduce test to sort huge numbers stored separately across 100 GB of text files, and that workload didn't fit in memory either. My case feels similar, but I have no idea how to handle a much bigger number of combinations. I know that number is beyond brute-force computation; it's an extreme example. I can simplify my problem to a smaller number, e.g. one whose enumerated combinations might total about 19 GB of data, and then it comes back to the same question.
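For reference, this is roughly the out-of-core pattern I used, written here as a plain external merge sort rather than real MapReduce; the chunk size and the one-number-per-line format are assumptions:

```python
import heapq
import os
import tempfile

CHUNK = 1_000_000  # lines sorted in memory at a time; an assumption, tune to RAM

def external_sort(in_path: str, out_path: str) -> None:
    """Sort a huge file of one-number-per-line records without loading it all."""
    runs = []
    with open(in_path) as f:
        while True:
            chunk = [line for _, line in zip(range(CHUNK), f)]
            if not chunk:
                break
            # normalize a possibly missing final newline, then sort this chunk
            chunk = [ln if ln.endswith("\n") else ln + "\n" for ln in chunk]
            chunk.sort(key=int)
            tmp = tempfile.NamedTemporaryFile("w", delete=False, suffix=".run")
            tmp.writelines(chunk)
            tmp.close()
            runs.append(tmp.name)
    # k-way merge of the sorted runs back into one output file
    files = [open(r) for r in runs]
    with open(out_path, "w") as out:
        out.writelines(heapq.merge(*files, key=int))
    for fh, r in zip(files, runs):
        fh.close()
        os.remove(r)
```

Only one chunk plus the merge heads ever live in memory, which is why the 100 GB test worked at all.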
I could set up 8 nested for-loops to dump every combination's indexes into multiple files on disk and then apply MapReduce to the problem, although that feels stupid. A sketch of what I mean is below.
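Rather than literal nested loops, `itertools.product` generates the same index tuples lazily, so nothing is held in memory and the output can be sharded for MapReduce; the ranges and shard size here are placeholder values, not my real dimensions:

```python
import itertools

N_STUDENTS = 8          # depth of the nested loops; placeholder value
N_CLASSES = 10          # range of each loop; placeholder value
PER_FILE = 5_000_000    # index tuples per output shard; placeholder value

def dump_combinations(prefix: str) -> None:
    """Equivalent to 8 nested for-loops, but lazy: tuples are generated one at
    a time and streamed into sharded files that MapReduce can pick up later."""
    combos = itertools.product(range(N_CLASSES), repeat=N_STUDENTS)
    for shard in itertools.count():
        batch = itertools.islice(combos, PER_FILE)
        first = next(batch, None)
        if first is None:
            break  # generator exhausted: every combination has been written
        with open(f"{prefix}_{shard:05d}.txt", "w") as f:
            for idxs in itertools.chain([first], batch):
                f.write(",".join(map(str, idxs)) + "\n")
```

This fixes the memory side, but of course it does nothing about the sheer count of combinations, which is my real problem.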
But it's hard to extend that workflow to data that is too big to compute at all, like the case I mentioned earlier. I was thinking there might be some way to get rid of this big number; that's why I came here looking for better tools. I have no idea about the Optimal Stopping Strategy, but I can check it out anyway.
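If I understand the suggestion right, the textbook form is the secretary problem's 1/e rule: observe the first ~37% of a stream without committing, then accept the first candidate that beats everything seen so far. A toy sketch under that assumption (this is the classic rule, not necessarily exactly what was suggested to me):

```python
import math
from typing import Iterable, Optional, Tuple

def one_over_e_stop(scores: Iterable[float], n: int) -> Optional[Tuple[int, float]]:
    """Secretary-problem rule: skip the first n/e candidates while remembering
    the best score seen, then accept the first later candidate beating it."""
    cutoff = int(n / math.e)
    best_seen = float("-inf")
    for i, s in enumerate(scores):
        if i < cutoff:
            best_seen = max(best_seen, s)   # observation phase: never accept
        elif s > best_seen:
            return i, s                     # first score beating the benchmark
    return None                             # no later candidate won
```

The appeal for my case would be that it never materializes the full combination space; it only needs each candidate's score as it streams by.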
I'm sorry to have taken up so much of everyone's energy. Let's stop here.