How is the Markov assumption related to a finite state machine?
In a sense, the state of a finite state machine has some memory of the path taken to reach it: it would not be in its current state without a particular series of transitions having occurred, so its state also depends on previous states. But finite state machines can easily model statements like "if event A has ever happened, then never enter states X, Y, Z," while the Markov case seems more like "if event A happened, decrease the probability of entering state X." So, are finite state machines limiting cases of Markov processes, where the probabilities are always either zero or one? How are the two ideas related?

• BrooksMoses | 2 points, written 3 months ago

"So, are finite state machines limiting cases of Markov processes, where probabilities are always either zero or one?"

As I understand it, yes.

• wuch | 2 points, written 3 months ago

For a fixed input string, the probabilities are either zero, one, or both. Because if we get into the same state a second time and the terminal is different
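As a rough illustration of the "probabilities are zero or one" reading, here is a minimal sketch (not from the thread; the state names, input symbols, and transition tables are invented) of the same three-state system written once as a deterministic finite state machine and once as an ordinary Markov chain:

```python
import random

STATES = ["start", "seen_A", "locked_out"]

# DFA view: for each (state, input symbol) there is exactly one next state,
# i.e. the conditional "transition probability" is always 0 or 1.
# This encodes "if event A has ever happened, never enter state X":
# once "seen_A" is reached, symbol "x" leads to "locked_out" forever.
dfa = {
    ("start",      "a"): "seen_A",
    ("start",      "x"): "start",
    ("seen_A",     "a"): "seen_A",
    ("seen_A",     "x"): "locked_out",
    ("locked_out", "a"): "locked_out",
    ("locked_out", "x"): "locked_out",
}

# Markov-chain view: from each state the next state is drawn from a
# probability distribution (no input symbols, just chance).
markov = {
    "start":      {"start": 0.7, "seen_A": 0.3, "locked_out": 0.0},
    "seen_A":     {"start": 0.0, "seen_A": 0.8, "locked_out": 0.2},
    "locked_out": {"start": 0.0, "seen_A": 0.0, "locked_out": 1.0},
}

def run_dfa(symbols, state="start"):
    """Deterministic: the path is fully determined by the input string."""
    for s in symbols:
        state = dfa[(state, s)]
    return state

def run_markov(steps, state="start"):
    """Stochastic: each next state is sampled from the current row."""
    for _ in range(steps):
        nxt, probs = zip(*markov[state].items())
        state = random.choices(nxt, weights=probs)[0]
    return state

if __name__ == "__main__":
    print(run_dfa("aax"))   # always "locked_out" for this input
    print(run_markov(10))   # varies from run to run
```

In the DFA table, the probability of each transition, conditioned on the current state and input symbol, is implicitly 0 or 1, which is the limiting case the question asks about; the Markov table drops the input and replaces that determinism with a probability distribution per row. Both are "memoryless" in the same sense: the next state depends only on the current state (and, for the DFA, the current symbol), not on how that state was reached.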