Transitional Probabilities, Prediction and Backward Associations in Language
A core cognitive mechanism involved in language comprehension is the processing of sequences of stimuli. Statistical learning supports sequence learning by enabling the computation of transitional probabilities (TPs). A TP corresponds to the probability of encountering two successive events in a sequence. While forward TPs (FTPs: the probability of encountering B after A in an AB sequence) have been studied extensively, backward TPs (BTPs: the probability of encountering A before B) remain elusive, as few studies have investigated how they are learned. Moreover, these investigations relied only on offline measures, thus leaving aside the question of how BTPs are used in real-time sequential processing, such as language. We therefore attempted to synthesize the evolution of statistical concepts in language processing, from co-occurrences in semantic priming to TPs, and to highlight the lack of studies on BTPs in the context of sequential processing.
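As a minimal formal sketch (the notation is illustrative and not drawn from a particular study), the FTP and BTP of a pair AB can be estimated from event frequencies in a corpus:

\[ \mathrm{FTP}(AB) = P(B \mid A) = \frac{\mathrm{freq}(AB)}{\mathrm{freq}(A)}, \qquad \mathrm{BTP}(AB) = P(A \mid B) = \frac{\mathrm{freq}(AB)}{\mathrm{freq}(B)} \]

For instance, in a hypothetical corpus where the pair AB occurs 30 times, A occurs 60 times, and B occurs 40 times, FTP(AB) = 30/60 = 0.50 whereas BTP(AB) = 30/40 = 0.75, illustrating that the two measures can dissociate for the same pair.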