New research suggests that when one part of our language network breaks down, the system can reroute itself.
Tools from graph theory, a branch of mathematics, can be applied to human memory, says Michael S. Vitevitch. The aim is to understand how words are stored in memory, which may help explain why many patients are able to recover language skills after a stroke or other brain trauma.
Michael S. Vitevitch is an associate professor of psychology at the University of Kansas and one of 146 scientists from 20 academic departments affiliated with the university’s Life Span Institute, one of the largest research and development programs in the USA for the prevention and treatment of developmental disabilities.
Vitevitch’s work, which employs tools used by physicists and computer scientists to map the complex system of words in the human brain, will be published in the April issue of the Journal of Speech, Language, and Hearing Research.
“Think of the diagram of flights you see in an in-flight magazine,” Vitevitch said. “In bad weather, one or two airports may be shut down but the entire system doesn’t come to a halt. You can take out parts of the system but other parts pick up the slack.”
A cognitive psychologist, Vitevitch has long studied the mental lexicon – how words are stored and retrieved in the human brain. Though a dictionary arranges words alphabetically, research suggests that the brain organizes words differently – by sound, by meaning, or by a combination of the two.
Intrigued by a new area called “the science of networks,” Vitevitch recently turned to the work of scientists in other disciplines who are using graph theory to illustrate how other complex systems work.
“Graph theory and network science are mathematical ways of looking at things – whether ecosystems, flight patterns, the Internet or social interactions,” Vitevitch said. “Some systems are randomly assembled and some are very structured, like atoms in a crystal.”
Using the network analysis and illustration program Pajek, Vitevitch entered a database of about 20,000 English words, slightly more than the average vocabulary of an adult native English speaker. Nodes in the network represented individual words. A link connected two nodes if the words were “phonological neighbors” – words that differ by a single sound. For example, the nodes hat, cut, cap and scat were all connected to the node cat.
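The neighbor-linking rule described above can be sketched in a few lines of Python. This is a toy illustration, not Vitevitch’s actual method: it uses letters as a stand-in for phonemes (his network was built from phonological transcriptions of a 20,000-word database), and the word list here is a small hypothetical lexicon chosen for the article’s examples.

```python
from collections import defaultdict

def is_neighbor(a: str, b: str) -> bool:
    """True if a and b differ by exactly one substitution, insertion,
    or deletion of a single symbol (letters approximate phonemes here)."""
    if a == b:
        return False
    if abs(len(a) - len(b)) > 1:
        return False
    if len(a) == len(b):
        # same length: exactly one position may differ (e.g. cat / hat)
        return sum(x != y for x, y in zip(a, b)) == 1
    # lengths differ by one: deleting one symbol from the longer word
    # must yield the shorter word (e.g. scat -> cat)
    if len(a) > len(b):
        a, b = b, a
    return any(b[:i] + b[i + 1:] == a for i in range(len(b)))

def build_network(words):
    """Adjacency list linking every word to its phonological neighbors."""
    graph = defaultdict(set)
    for i, w in enumerate(words):
        for v in words[i + 1:]:
            if is_neighbor(w, v):
                graph[w].add(v)
                graph[v].add(w)
    return graph

# A toy lexicon built around the article's example words.
words = ["cat", "hat", "cut", "cap", "scat", "cot", "coat"]
net = build_network(words)
print(sorted(net["cat"]))
```

In this toy lexicon, cat is the hub: every other word links to it, while most words have only a couple of connections of their own.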
What Vitevitch found is that a few nodes had many connections, but most had only a handful.
“This disparity is a good thing,” Vitevitch said. “Short cuts are available from one end of this huge system to another.”
Returning to the analogy of airline travel, Vitevitch said the ability to bypass major hubs enables the system as a whole to keep functioning “even when things slow down.”
The short paths through the network and the tight clustering of its nodes show that the mental lexicon has “small-world characteristics,” popularly known as “six degrees of separation.” This now well-known idea, traced to social-psychology experiments of the late 1960s, holds that it is a small world: any two people on the planet are separated by a chain of only about six acquaintances.
“The small-world phenomenon has been the subject of much attention from mathematicians, physicists and computer scientists who study networks among collaborators, friends or Web pages on the Internet,” Vitevitch said. “Small-world networks tend to resist damage and disease and are very good at spreading information.”
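The two signatures of a small-world network mentioned here – short paths between distant nodes and tight local clustering – can each be measured with a few lines of code. The sketch below is a minimal illustration on a hypothetical four-word graph (the function names and the toy lexicon are this example’s own, not from Vitevitch’s study): breadth-first search counts the hops between two words, and the clustering coefficient measures how many of a word’s neighbors are themselves neighbors.

```python
from collections import deque

def shortest_path_len(graph, src, dst):
    """Number of hops between two words via BFS; None if disconnected."""
    seen = {src}
    frontier = deque([(src, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node == dst:
            return dist
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None

def clustering(graph, node):
    """Fraction of a node's neighbor pairs that are directly linked."""
    nbrs = list(graph.get(node, ()))
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in graph[nbrs[i]])
    return 2 * links / (k * (k - 1))

# A hypothetical mini-lexicon: cat, hat and cap form a tight cluster,
# and cup hangs off the edge of it.
lexicon = {
    "cat": {"hat", "cap"},
    "hat": {"cat", "cap"},
    "cap": {"cat", "hat", "cup"},
    "cup": {"cap"},
}
print(shortest_path_len(lexicon, "cat", "cup"))  # hops from cat to cup
print(clustering(lexicon, "cat"))                # how clustered cat's neighborhood is
```

A small-world network combines a high clustering coefficient, as in cat’s fully linked neighborhood here, with short average path lengths across the whole graph.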
The graph perspective holds promise for further work on the mental lexicon, Vitevitch said, helping researchers understand language development and processing. The small-world quality of language may also explain why word processing is so robust: “You may lose an individual connection but the system doesn’t entirely collapse.”