Stochastic context-free grammars (SCFGs) are often used to represent the syntax of natural languages. Most algorithms for learning them require storing and repeatedly processing a corpus of sentences; the memory and computational demands of such algorithms are ill-suited to embedded agents such as mobile robots. Two algorithms are presented that incrementally learn the parameters of a stochastic context-free grammar as sentences are observed. Both require a fixed amount of space regardless of the number of sentences observed. Despite using less information than the inside-outside algorithm, both algorithms perform almost as well.
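The fixed-space property claimed above can be illustrated with a minimal sketch. This is not the paper's algorithms: it assumes some parser supplies expected rule-usage counts for each observed sentence, and the `IncrementalSCFG` class and its interface are inventions for illustration. The learner keeps only one running count per grammar rule, so its memory is constant in the number of sentences observed.

```python
class IncrementalSCFG:
    """Sketch of fixed-space incremental estimation of SCFG rule
    probabilities (illustrative only; not the algorithms of the paper)."""

    def __init__(self, rules):
        # rules: list of (lhs, rhs) pairs, e.g. ("S", ("NP", "VP")).
        # One smoothed running count per rule is all the state we keep.
        self.counts = {r: 1.0 for r in rules}

    def observe(self, rule_usage):
        # rule_usage: dict mapping rules to (expected) usage counts
        # obtained from parsing a single sentence. The corpus itself
        # is never stored.
        for rule, c in rule_usage.items():
            self.counts[rule] += c

    def probability(self, rule):
        # Normalize the rule's count over all rules that share its
        # left-hand side, yielding a proper conditional probability.
        lhs = rule[0]
        total = sum(c for (l, _), c in self.counts.items() if l == lhs)
        return self.counts[rule] / total

rules = [("S", ("NP", "VP")), ("S", ("VP",))]
g = IncrementalSCFG(rules)
g.observe({("S", ("NP", "VP")): 3.0})
print(round(g.probability(("S", ("NP", "VP"))), 2))  # prints 0.8
```

Unlike batch re-estimation (e.g. inside-outside over a stored corpus), each sentence is processed once and discarded, which is the constraint the abstract motivates for embedded agents.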