How memory stores what we learn of a language offers clues about how we make inferences about language when we communicate.
If our brains stored information the way computers do, learning a language would take about 1.5 megabytes of memory, roughly the space a computer needs to save a digital image, according to researchers Francis Mollica of the University of Rochester and Steven Piantadosi of the University of California, Berkeley.
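For scale, the 1.5-megabyte figure can be converted to bits with a quick back-of-the-envelope calculation (this assumes the conventional 1 MB = 1024 × 1024 bytes; the exact total is not taken from the study):

```python
# How many bits is 1.5 megabytes?
MEGABYTE_BYTES = 1024 * 1024  # conventional binary megabyte
BITS_PER_BYTE = 8

bits = 1.5 * MEGABYTE_BYTES * BITS_PER_BYTE
print(f"{bits:,.0f} bits")  # 12,582,912 bits
```

That works out to roughly 12.6 million bits of information.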
The researchers estimated how much information we use to learn semantics, grammar rules, word choice, and other features of language, calculating the number of bits needed to encode the possible ways each feature could be represented. Most of that information goes to word meanings, which suggests theories of language learning should focus on meaning rather than on areas like grammatical structure. For grammatical structure alone, the researchers found roughly 10^210 possible representations, more than the number of atoms in the universe. Humans must have powerful inference methods to reason through so many possibilities, they noted.
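The bit-counting logic above can be sketched with the article's figures: picking out one outcome among N equally likely possibilities takes log2(N) bits. The 10^80 figure for atoms in the universe is a commonly cited rough estimate, not a number from the study, and the calculation here is only an illustration of the counting principle:

```python
import math

# Hypothetical illustration of bit-counting, using figures from the article.
GRAMMAR_POSSIBILITIES = 10 ** 210   # possible grammar representations
ATOMS_IN_UNIVERSE = 10 ** 80        # commonly cited rough estimate

print(GRAMMAR_POSSIBILITIES > ATOMS_IN_UNIVERSE)  # True

# Bits needed to single out one grammar among 10**210 possibilities:
bits_for_grammar = 210 * math.log2(10)  # log2(10**210), avoiding a huge intermediate
print(f"{bits_for_grammar:.0f} bits")   # about 698 bits
```

The comparison shows why an astronomical space of candidate grammars is still compatible with a modest information budget: even 10^210 possibilities can be indexed with only about 700 bits, so the hard part is the inference, not the storage.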
The research could help evaluate theories of how we process meaning and learn. The study had limitations: for example, to simplify the model of learning, it relied on estimates of the size of the adult vocabulary.