Why do the training passes have different learning curves for the same data?
Each training pass begins with a different random set of initial weights in the neural network. These weights are the values adjusted during training, and the learning curve plots the error produced by the current set of weights as training progresses. Because each pass starts from a different random initialization, the learning curves differ, not only in their starting error but in their overall shape, since the training process is reducing the error from different initial conditions.
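A minimal sketch of this effect, using a hypothetical tiny one-hidden-layer network written with NumPy (the architecture, learning rate, and XOR data are illustrative assumptions, not taken from any particular framework): training twice on the same data with different random seeds produces two different learning curves.

```python
import numpy as np

def train(seed, X, y, epochs=50, lr=0.5):
    """Train a tiny 1-hidden-layer net; return the per-epoch MSE (the learning curve)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(2, 4))   # random initial weights
    W2 = rng.normal(scale=0.5, size=(4, 1))
    curve = []
    for _ in range(epochs):
        h = np.tanh(X @ W1)                   # forward pass
        pred = h @ W2
        err = pred - y
        curve.append(float(np.mean(err ** 2)))
        dW2 = h.T @ err / len(X)              # backpropagation
        dh = err @ W2.T * (1 - h ** 2)
        dW1 = X.T @ dh / len(X)
        W1 -= lr * dW1                        # gradient descent step
        W2 -= lr * dW2
    return curve

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])        # XOR targets

curve_a = train(seed=0, X=X, y=y)
curve_b = train(seed=1, X=X, y=y)
# Same data, different random starting weights: the curves start at
# different error values and follow different trajectories.
print(curve_a[0] != curve_b[0])
```

Plotting `curve_a` and `curve_b` side by side would show the behavior described above: two curves with different starting points and shapes converging from different initial conditions.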