The current issue of “The Economist” (March 6-12, 2010) contains an article entitled “The many voices of the web” that is quite an interesting read. It is about translating the content of the web. I still see many libraries struggling to understand that they even need to provide systems that support multilingual content; all too often they feel it isn’t a critical issue for their community of users. But, as this article points out, “the web connects over a billion people but it is fragmented by language.”
In the United States, for example, surveys show that roughly three-quarters of the population speak only English. Yet, as the article points out, there is rapid growth in web content that exists only in other languages, from Japanese and Chinese to Arabic and many more. The ability to access that information is going to become more and more important to research, to critical thinking, and to forming a fully rounded understanding of the complex world in which we live.
The article goes on to describe efforts underway both to provide quick manual translations of web content and to develop new software for automated translation. Libraries, as both the keepers of the human record and the portal that provides access to that record, must keep a close eye on this technology and examine how it can be utilized in fulfilling their mission. It is no longer enough to think solely of providing access in a user’s native language; it is increasingly the library’s responsibility to break down the fragmentation that exists between silos of language-specific content by providing dynamic access to translated versions of that digital content. To do that, we must once again think about the scalability of the task before us and recognize that technology must be an essential, though not the sole, part of the answer.
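To make “dynamic access to translated versions” a little more concrete, here is a minimal sketch of one way a discovery interface might translate a record’s description on request and cache the result. The `translate_text` helper is a hypothetical stand-in for whatever machine-translation service a library actually licenses; none of the names below refer to a specific product or to anything described in the article.

```python
# Illustrative sketch only: serve a catalog record's description in the
# reader's preferred language, translating on demand and caching the result.
# translate_text() is a hypothetical placeholder for a real machine-translation
# service (commercial API, open-source engine, etc.).

from functools import lru_cache


def translate_text(text: str, target_lang: str) -> str:
    """Placeholder: call your machine-translation provider here."""
    raise NotImplementedError("Plug in a real MT service")


@lru_cache(maxsize=10_000)
def localized_description(record_id: str, description: str,
                          source_lang: str, target_lang: str) -> str:
    """Return the record description in the reader's language,
    translating only when source and target languages differ."""
    if source_lang == target_lang:
        return description
    return translate_text(description, target_lang)


# Usage: a Japanese-only record viewed by an English-speaking reader would be
# translated once, then served from the cache on subsequent requests, e.g.
# localized_description("rec-123", "ウェブの多言語化について", "ja", "en")
```

The point of the sketch is simply that translation can happen at the moment of access rather than requiring every record to be translated and stored in advance, which is what makes the task scale.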
I recommend reading the article. It is a reminder of some of the more interesting challenges we face in the days and years ahead.