Thursday, October 8, 2015

Another perspective on ProQuest buying the Ex Libris Group.

The dust has settled a bit and I’ve had the opportunity to talk to senior executives at both ProQuest and Ex Libris Group about the recent announcement that Ex Libris has been acquired by ProQuest.  Now it’s time for us to sit back and start analyzing what has just happened to a couple of the major suppliers of library automation, and by any measure, this was a BIG event.  

I wrote a series of posts about Library Service Platforms several years back (2012). They apparently met a real need in the profession, as those posts have been viewed over 40,000 times as of today. The first post in that series is still very valid, but much of what I said about the companies in subsequent posts has since changed. Of the companies I wrote about then: VTLS was sold to Innovative; Kuali/OLE has gone through massive changes in structure and backing (it's open source, but not totally, at least not by the classical definition); WorldShare by OCLC has matured a great deal, but the organization behind it is still convulsing with changes under the new OCLC leadership; and finally, there is Sierra by Innovative, which now seems to be in a very questionable spot.  

In fact, when it comes to Innovative, I'm predicting that we'll see ownership changes at that company as soon as they can be arranged.  You simply don't force out the CEO on a day's notice and install a new CEO from the equity-owner company with any plan other than finding out how fast you can sell the company.  The problem for Innovative (and I told this to the previous CEO shortly after he arrived at the company) is that they've stayed with the old architecture way too long.  Now whoever buys the company will face the massive task of totally rewriting and/or developing a new platform with a true multi-tenant, cloud-based architecture, i.e. a truly competitive Library Service Platform (see this post for a definition).  That's a sizeable task that is slow, costly and has a target market of shrinking size.  My guess is the previous CEO was pushing to make that investment, and when the equity owners looked at what it would cost against the return on investment, they decided to pursue another path with their money.  Does part of that sound familiar?  Yes, and it serves as an excellent segue back to the ProQuest / Ex Libris announcement.

There have already been a couple of excellent posts published that analyze this acquisition in some detail; they do so quite well and are generally very fair.  If you haven't read the post by Marshall Breeding and the post by Roger Schonfeld, I'd certainly recommend you do so.  

Trying not to repeat what Marshall and Roger have said, here's where my views differ importantly from theirs:

  1. ILS’s vs LSP’s.  Integrated Library Systems (ILS’s), even when hosted in the cloud, and Library Service Platforms (LSP’s) are radically different architectures with huge implications for the future of library technology and thus libraries.  I detailed all this in a post, I’ve already mentioned a couple of times, but it’s worth saying again, multi-tenant software is the future.  Simply hosting multiple virtual instances of an ILS is not an LSP and will not get you where you need to go in another 3-10 years.  It simply won’t. If you go down that path you’re going to eventually get left behind -- way behind.  If you choose that path, understand it’s only good for the short term. (See my post on the “coming divide” for a full explanation).  I would also take serious exception with Schonfeld’s belief that libraries may not need this kind of technology in the future because they’re resources are becoming increasingly digital.  While the latter is true, it doesn’t make the former true.  Most libraries still have massive print collections and as a recent article in the NY Times described, we’re seeing publishers printing more books each year as the e-book business has seemingly hit a plateau, at least for now.  Library management systems will be around for a long time to come.
  2. Content-neutrality.  Let's not lose sight of the fact that we've lost another "content-neutral" discovery vendor as a result of this acquisition.  That's not a good thing for libraries, although most librarians ignore this reality.  In the end, I believe they'll regret doing so.  We've had yet another check-and-balance removed from our supply chain.  This post explains why content neutrality is so important and why its loss carries a potentially high price for libraries.  So, in this regard, this is not good news.  OCLC, with their WorldCat offering, remains our only content-neutral discovery solution at this point outside of open source solutions (which don't have an aggregated metadata database like Primo Central, which provides important functionality for libraries).
  3. Equity Ownership.  Ex Libris is no longer held by equity investors.  It's no secret that I'm not a fan of equity ownership of major suppliers to libraries.  I understand how equity ownership works, and I've detailed my related concerns previously in this post.  Yes, Ex Libris did well under equity ownership for the very reasons I outlined in my post.  But the fact remains, they could have done even better and invested even more in their products and services had they not been sending so much of their profit to the equity owners.  I'm hoping that with that aspect of the ownership now removed from the equation, we'll see some accelerated product development in some much-needed areas, like the discovery system, course management system integration tools, and other needed product areas.
  4. Intota’s Future.  Despite what company executives will tell you, Intota has been languishing and a full product has never been released into the marketplace.  That reality has come at a steep price for ProQuest, as other companies now own large portions of the targeted high-end LSP market.  Of course one of those products was Alma by Ex Libris, now part of the ProQuest holdings.  So there is plenty of speculation that Alma will become the premier offering and Intota will eventually fade away entirely or the functionality that exists will be merged with Alma.  Certainly that’s possible although company executives deny that and insist the choices will remain.  However, I think there might well be another outcome.   Alma has long been aimed primarily for the academic, corporate and national library markets.  Which leaves public libraries and smaller academics thirsty for some competition in LSP offerings tailored more to their specific needs.  They really only have OCLC’s WorldShare at this point and I can easily see ProQuest re-aiming Intota towards those markets.  However, if I was betting, over the long-term, I'd go with Intota slowly merging with Alma and there being only one platform left, although possibly with two names to accent the different markets being served.
  5. Primo vs. Summon Discovery Systems.  As Marshall pointed out in his post, these products both have large and very devoted installed bases.  Neither product will disappear anytime soon, although pure business logic will dictate that over time, they will slowly meld together from the core outward until they are one.  But this will take many, many years, and I'd agree with Roger Schonfeld that the future of discovery systems in general is more questionable than the future of these two product offerings in particular.
  6. Will Ex Libris remain a separate company?  Yes, for now, I think that's a safe bet.  But it's important to look at ProQuest's acquisition history here and to note that over time, other companies that have been purchased have been slowly absorbed (remember Serials Solutions?), with only the product names remaining as vestiges of those firms.  But for now, yes, it makes total sense for these organizations to largely remain separate, at least until company cultures are merged, operations are merged and everything is stabilized.
  7. What’s EBSCO’s next move?  Good question.  Clearly both Ebsco and ProQuest are trying to assemble end-to-end technology solutions for libraries.  Ebsco needs an LSP in their offerings.  They might be working on one behind the scenes.  Many people are speculating that buying Innovative or Sirsi/Dynix could be a step in that direction.  It could be, but as I outlined above, it’s a very problematic one because neither firm’s products are multi-tenant architecture needed for a real Library Service Platform.  So, a total rewrite would be required for them to turn that offering into the needed solution. Ebsco has a real challenge in front of them.  

What’s the bottom line here? I personally have a lot of respect for both of these companies and their teams. From a business point of view, it is a very good move.  Library automation is a tough and challenging field.  These companies have very smart people at the helm. Right now, they have all the right people saying all the right things.  But that’s normal at this stage of an acquisition.  What will matter is what actually happens in the weeks and months ahead.  So stay tuned.  Walking the talk is much harder to do.

Wednesday, July 15, 2015

Why, oh why, do so many librarians continue to chain themselves to the past??

Ask yourself a question:  Do you believe that the only way we in libraries convey and create knowledge is through reading? Via books? By the written word??  Do you have any doubt that when most people think of books, the term “library” is somewhere nearby in their thoughts? 

I doubt you answered any of the above with a “yes”.  (If you did, please contact me separately because we really need to talk!)

So if you don’t think that way, why, OH WHY do we continue to allow our libraries, our services, our very cause for existence, to be repeatedly tied to the idea that reading is the sole purpose of libraries?!?!?!  Why do we so blatantly reinforce that image?

Now, let me state the obvious here.  There is no question that books have long been, and will continue to be, a major vehicle for the transmission of knowledge, whether of fact or of your favorite author's latest work of fiction.  However, in all these cases can we agree that the goal is the creation of new knowledge?   

David Lankes reminds us in “The Atlas of New Librarianship” that: “The mission of librarians is to improve society through facilitating knowledge creation in their communities.” 

Our mission - knowledge creation.  I agree with that statement.  (If you’re wondering what I’m defining as knowledge, refer to this article, Section 6.)  

But let’s remember all the additional forms which knowledge exists in, is created, curated and transmitted in today’s world: Video, photography, sound, software, data sets, webcasts,  geo-location files, collaborative rooms, our communities of users/members and yes, librarians!

Why is this so hard for us to take in and act upon?  You say it’s not?  Well then, please consider the following:

  • Library Promotion.  It's summertime, so look to your nearby public library's promotions.  I'm just guessing that the featured event is summer reading.  OK, I can even agree with this, but where are the programs for using a camera to tell a story, for exploring virtual reality as a pathway to debate Plato, for working in collaboration with other children to achieve a knowledge goal?  The list goes on and on.  Yet our focus is where?  Books….
  • Face to the World - Physical.  Here’s a photograph of the outside of the parking garage of a Midwestern public library.  I’d say that’s a pretty clear statement of what they think they’re about and it does a great job of reinforcing that the library is all about….. books.

  • Face to the World - Virtual.  Take a look at most any library's website, discovery system or OPAC and apply a really critical eye to it.  (Better yet, get one of your users to sit down beside you so you can see it through their eyes.)  Ask them:  What does it say to you about how your library provides access to existing knowledge?  In what forms or media types?  I suggest you take a look at Harvard Library's interface as an example of how to do it well; it features a listing in the left column of "books, all databases, articles/journals, news, audio/video, images, archives/manuscripts, dissertations and data."  Nice.  
  • Knowledge Creation Tools.  What tools does that site provide to assist you in creating new knowledge?  Do you feel that you can create new knowledge remotely, or must you go to the library to do it?  Can you use any device you want in accessing/creating knowledge?  Can you do it from any location you want?  The answers to those questions will tell you a great deal. 
  • Signage/Services.  One of my pet peeves at most libraries is services with names like "reading lists".  REALLY?  Have you seen a professor who only uses reading to instruct, to teach, to engage students?  Most use lectures (now frequently recorded and online), PowerPoints, webcasts, podcasts, digital content, labs where students must collaborate, writing assignments and, oh yes, some articles and books.  But how many other sensory inputs/stimuli did they also use?  Did we not assist in providing access to all of those?  So WHY do we only talk about the "read" portion?!?!?!?  At least, let's agree to change "reading lists" to "resource lists", and to rename anything else that is so named as well.
As librarians we need to move away from branding that ties libraries solely to printed materials.  We are not just about books or journals; we're about knowledge, and the containers that knowledge comes in are far greater in number than just the printed forms.  

Please, unchain your library from the past.

Tuesday, May 19, 2015

The next step on the path of building a Knowledge Creation Platform

One of the photographs hanging in my office features a quote from Buckminster Fuller:  
"You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete."  
It's a line of thinking I adhere to frequently. So it'll be no surprise to those of you who have followed this blog that one of my pet projects is not to try to perfect existing discovery solutions, but rather to build a Knowledge Creation Platform. For a starting description of that concept, see this article.

Now combine that thinking with something I heard back in 2011, when I attended a Charleston Conference session titled "New Initiatives in Open Research" featuring Cliff Lynch and the late Lee Dirks. Cliff said:  
“If you do the math, you will find horrifying numbers, something like a scientific paper is published every minute or two.  It means you’re buried.  It means everybody’s buried.”  
That fact really stuck with me.

Today, what I see and experience in the University environment is the pace of knowledge creation becoming so intense and so fast that our current tools for researching, building and expressing new knowledge are outdated. And thus the existing processes we’ve automated are equally outdated. This strikes me as a set of problems in need of a very large fix.

So I want to introduce what I think is a very exciting step forward in beginning to address those issues.  What we’re doing is starting a new initiative with the high tech firm, Exaptive.  Here is a video describing their product, with an introduction to the initiative. I encourage you to watch it before reading any further.

Ok, you’ve watched it and now you’re back? Great.

Let me fill you in on the thinking behind this announcement.  Consistent with what I've said in articles and related blog posts, I want to provide our library users/members with some substantially different capabilities than those they get from a generic search tool like Google Scholar.  As noted in my writings, most discovery/search tools query repositories or databases that contain existing knowledge metadata and content.

Certainly that’s very valuable. But those existing tools make no real provision for analyzing this existing knowledge or drawing correlations between data sources, nor do they suggest overlaps, visualize the results or allow the user to easily bring together the people behind the knowledge found.  The existing tools do not give the capability to start building new knowledge, only to find existing knowledge.  However the Exaptive product moves us down the path towards knowledge creation and more.  The implications of this are really far reaching.

For example, the first project we're moving forward on is one in which a researcher is looking for a concept, but one that over the centuries has existed in many cultures and languages and under many different terms.  Our researcher knows English and several other languages, but not all the languages in which that concept might be expressed.  So what he is looking for is a tool that will function as an "authority file," if you will, that'll essentially provide "see" and "see also" references across those many languages.  By using known taxonomies, linked data, and library-related and accessible authority files, we hope to be able to analyze the data sources, then visualize the results to show the overlap and correlations that exist between them.  We believe that when data sources are analyzed this way, the Exaptive product should provide tremendous new insight into the topic and field of study.

Another exciting part of the Exaptive product is the ability to create what are called "cognitive networks," groups composed of the researchers responsible for creating the research and research data found and utilized in the analysis phase.  Unlike social networks, where you have to slowly and manually build your connections or friends, the cognitive network is built automatically as researchers explore, select, filter and analyze the data they need.  The result is that these cognitive networks facilitate a researcher's work instead of adding to it.  This cognitive network can become a set of collaborators or peers that, if willing, could be focused on analyzing, vetting and refining the new knowledge from inception to dissemination.  (Yes, obviously, trust plays a huge part in this and must be dealt with as part of the model.)  It's a model that would be more capable of scaling to incorporate the vast amounts of research being conducted today, and it would increase the speed of dissemination of the new knowledge that results, because it isn't dependent upon the publication of physical artifacts such as papers or books (although that could still be done).  Rather, it would accommodate knowledge being born digitally which, once vetted by the cognitive network, could quickly be disseminated to others for them to continue the cycle and build upon further.  Think about how powerful that could be in creating new knowledge!

The Exaptive product, when coupled with what we've already got in place at the OU Libraries (our discovery system, open journal/open access publication system, repositories and more), will allow us to move further and faster in helping to evolve ideas into new knowledge.

One thing I need to say at this point is that doing this is both a technological challenge and a change management challenge.  If this project is successful, it has the capability to remarkably change the engagement and knowledge creation experience for many people.  To smooth this process, we will need to educate our users/members on the new needs we're addressing and on how and why this is a major step forward.  If you've read articles/books about how to do change management, you'll know one of the best ways to do this is to work with thought leaders on our campus and in our community and provide them with the extra support needed to learn and use these new tools, ensuring they are very successful in doing so.  If we do, it's a win-win for all involved.  It puts users on the front edge of research and dissemination in their field, and it gives us a success case to point towards as we talk with others and try to inspire them.  

Of course, those who wish to work in isolation can continue to do so, even with this new model.  However, bringing multi-disciplinary and multi-faceted viewpoints to the table throughout the lifetime of an idea adds new value, helping to make ideas substantially more valuable and more applicable in the end.  We already see the health sciences field moving in this direction, because they so clearly understand the inter-connected nature of the organs of the human body and the need to bring researchers together as ideas are developed.  The Open Science Framework is another model where collaboration and shared data sets occur early in the research process.

As I said above, there are lots of implications for new models of knowledge creation based on this initiative.  Existing culture and change are two of the largest challenges early in the process.  But first, we’re going to focus on getting the technological foundations in place and then see what we can do.   Stay tuned!

NOTE:  Those at the University of Oklahoma interested in having a departmental demonstration of this technology and/or meeting with key project team members, should contact me at: carl(dot)grant(at)ou(dot)edu

Wednesday, May 13, 2015

How Can Libraries Find the Money To Make Big Changes? (Part 3)

Over the last two posts, we’ve looked at how libraries can find the money within existing resources in order to fund big changes.  In the first post we looked at strategic plans and in the second post we looked at the use of metrics to measure progress against that strategic plan.  Now, in this final post, we want to step back and look at the efficiency and effectiveness of our current operations as reflected by their internal workflows.

Let’s face it, a great deal of what is done in libraries has been done for a long time.  Even if we’ve automated the workflow along the way, it was likely put into place 5-15 years ago and has rarely been reevaluated for efficiency or effectiveness since that time (unless you’ve implemented the metrics/analysis discussed in post two of this series). Yet reevaluation needs to be done on a regular and recurring basis.  

The process of reevaluating workflows presents you with a golden opportunity, because it's a great time to think really big about what a workflow would look like if you could design it without restriction.  So the first step in this process is to think about and design the "perfect" workflow.  I call this "setting the destination," because it serves as a description of where you ultimately want to end up.  What's the perfect way for your staff to order new resources?  For metadata to be created?  For users/members of your library to find the resources they need, utilize them, cite them?  What do those workflows look like?  This is blue-sky thinking, and it needs to involve those on your team who have the ability to think creatively, but also those who understand the intricacies of the current workflow.  Be certain to document what the team comes up with in this step, because you'll come back and visit it again and again as new versions of products, with new functionality, become available for implementation.

In the second step, you return to reality and, either through use cases or flowcharts, describe how the workflow is actually being done today.  Again, use cases or flowcharts should document the workflow.  When your teams do this, they will quickly realize there are many things being done that no longer need to be done.  Those are candidates for immediate removal.  It's not unusual to find a 10% boost in productivity from this step alone. 

The third step is to redesign the workflow and the workflows associated with it.  A very good time to undertake this kind of redesign is, for instance, when implementing a new Library Service Platform, because that technology touches so many areas of library operations and those products give you an excellent opportunity to streamline a lot of workflows.  

As we all know, in the past the library was primarily a print-based operation, and over the years, as licensed content and electronic content became part of library services, entire new workflows and procedures were created to accommodate each.  Over time, many of the steps in those workflows replicated, or very closely emulated, steps in handling one of the other parallel areas.  Most libraries find that when they perform workflow analysis they can see areas where these steps can now be combined in order to free team members to address new, more challenging and interesting work that needs to be done.  

In doing workflow analysis, it's important to know that most people do not inherently know how to do workflow analysis.  They know how to do the workflow.  So it takes time to train people to do analysis: the process of pulling apart a workflow and rethinking it.  I've found that giving them a series of questions to ask themselves, and others, at each step helps them do this.  Here are some things to be sure to do:

  • Make sure to include everyone who is involved in each step of a workflow, from the beginning to the end.  
  • When looking at a workflow, identify everything that comes into the workflow (the inputs) and everything that flows out of the workflow (the outputs).
  • Here are some of the specific questions to be asked about each step in the workflow: 

    • Who receives the outputs? 
    • What does the next person do with the output you give them? 
    • How do the quantity and quality of the outputs affect what they do?  (Do steps get skipped when quantity is high?  If quality is low, what do they have to do over or fix before they can move the work to the next step?) 
    • Who verifies the quality of the work, and how? 
    • When something different from what is expected happens, how is it handled?  What's the workflow then?  How does it change?  Be sure to document these processes as well.
    • Look for places in the workflow where there is waiting, moving and/or repetition.  Try to find ways to eliminate these, for instance by doing steps in parallel where possible. 
    • Identify the places in the workflow where there are complexities, bottlenecks and frustration.  Eliminate those.
  • Once you have done that, then ask these questions:
    • How many people does it currently take to complete the workflow?  What number should it take?  List that number and work towards it.
    • Consider, and document, what technology is needed to support the workflow and what it must functionally achieve.
    • What skills and expertise are needed to both perform and manage the workflow?
    • Then ask if those skills/expertise/positions exist in the organization currently.
    • Use the answers to the above to describe the positions needed, then compare them to the ones you have.  Any discrepancies will need a plan to address and resolve them.
    • Based on the outputs of the total workflow mapping, what are the jobs that: a) remain the same, b) will be modified and/or c) will be new?
    • Prepare a plan, to be shared with your team, addressing what training the team will receive for those new jobs, how, and when.
    • Consult with team members so they understand where they're being aimed and what will be done to ensure they are successful in the jobs they will hold.  Repeat as necessary (which is frequently).
    • Throughout this process, Library Administration must make clear that the workflows are being examined for ways to be more efficient and effective in light of the changing environments, and that they know they have talented people and simply want to optimize their work.  
    • Be sure to link the new workflow back to the overall goals of the library.  Team members must see and understand the linkage and what value is added as a result.

  • Next, decide what will be measured to evaluate the workflow, and how:
    • Can those measures/metrics be directly linked to the goals of the library?
    • If so, document which specific measure they are linked to.
    • If you can’t link them, re-examine why you’re doing this function and find a way to eliminate it.
  • It’s important as part of this workflow to note the distinction between what are called “core” workflows and those that are “support” workflows.  Core workflows are the ones that deliver value directly to end-users.  Support workflows enable core workflows (such as training, approvals, purchasing, etc).  Obviously, core workflows might be improved, but you want to be certain to only increase the value delivered to the end-users, not to diminish or delete it.  Support workflows are open to considerable revision for obvious reasons.
  • When the new workflow is introduced:
    • Explain, educate, communicate.  Repeat as needed.
    • Do trial runs, analyze the results/problems and make adjustments to resolve those issues.
    • Only then do you implement the new, more effective and efficient workflow.

So there you have it.  I've used all the steps described in this post and the previous two on this subject.  They've worked for my organizations, and I believe they'll work for yours as well.  When done, you should find that you have more people and financial resources at your disposal to fund those new ideas that have been sitting and waiting on your "to-do" list.

Thursday, April 2, 2015

How Can Libraries Find the Money To Make Big Changes? (Part 2)

In the first post of this series on finding money to finance change in libraries, we looked at the importance of active strategic plans, at the level of the parent organization, the library, and all levels within the library.  As noted in that post, obtaining new revenue via that pathway is a longer-term approach, one that will reap big dividends in the end but that proceeds gradually as results happen, data is generated and confidence grows.  So now let's take a look at a first step you can take in the shorter term. 

One of the most likely places to find resources to fuel new ideas is within your existing resources.  How?  Especially when you're already working full speed and still feel like you're falling behind?  The answer is by using metrics.

Metrics are a complicated subject, and understanding them fully requires more space than I can cover in this blog post.  However, let me cover a few topics at a high level to get you started, and I'll also provide a link to a post you can read if you want to know more.  

It’s important to understand that what metrics do is provide measures that will tell you and your organization if you’re moving in the right direction.  Are you  achieving the goals set forth in your strategic plan?  Of course to do that, what’s being measured must be in alignment and support the goals set of the strategic plan.  In other words, metrics provide focus.  Second, they provide accountability.  Now I’ve noted that’s a term a lot of people in libraries treat like a skunk on a hiking trail, i.e. they turnaround and head the other way.  But the result is the same, you backup and make no forward progress.   Accountability isn’t to assign blame, it’s to determine if you’re doing the right things and if not, to determine what should be done and to get moving in that direction.  It needs to be accompanied by an attitude of “failure is part of learning”, so while you don’t want to repeat the same mistakes twice, making mistakes is how we figure out the right way to achieve a goal.  An oft quoted Thomas Edison once said about the light bulb: “I have not failed.  I’ve just found 10,000 ways that won’t work.”  We need to apply this more frequently in libraries and we need to more willingly share the ways that don’t work so we can all more efficiently focus on finding the ways that do work.

Once you've determined what is to be measured, the results must be shared and used by all of management.  While many libraries internally operate like a landscape of farm silos, each run by its own farmer, the reality is that libraries are more tightly woven than fine silk linens, and only by working together do we produce a tapestry worthy of display.  Management teams need to schedule regular reviews of the metrics, have an open conversation about where this positions them relative to the goals of the strategic plan, and determine what adjustments need to be made.  If you want to read a bit more about metrics, here is a good, quick overview.
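As a minimal sketch of what such a quarterly review can look like in practice (the metric names, targets, and numbers below are hypothetical, not drawn from any real library), each measure from the strategic plan is simply compared against its target, and the resulting report becomes the agenda for the management team's conversation:

```python
# A hypothetical quarterly metrics review: each measure from the strategic
# plan is compared against its target. "higher_is_better" handles measures
# where a lower number is the goal (e.g. turnaround time in days).
metrics = [
    # (name, target, actual, higher_is_better)
    ("instruction_sessions", 120, 134, True),
    ("ill_turnaround_days", 3, 5, False),
    ("gate_count", 50000, 48750, True),
]

def review(metrics):
    """Return a dict mapping each metric to 'on track' or 'needs attention'."""
    report = {}
    for name, target, actual, higher_is_better in metrics:
        on_track = actual >= target if higher_is_better else actual <= target
        report[name] = "on track" if on_track else "needs attention"
    return report

for name, status in review(metrics).items():
    print(f"{name}: {status}")
```

The point of the sketch is that the review is mechanical once targets exist; the hard, human work is agreeing on the targets and acting on the "needs attention" lines.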

Another tool for freeing up resources is to conduct a true competitive analysis of the Library's services.  I'd strongly recommend you convene a sub-committee of your end users (students, faculty, staff, community members, etc.) to have this done.  You need to remove bias and colored glasses from the perspectives taken and understand, from the end user's/community member's point of view, what they see as the advantages and disadvantages of your various services.  OCLC sometimes does this for us with the periodic "Perceptions" reports, but unfortunately, the last of those was in 2010.  A half decade later, one could rightly question the continued validity of the assessments made there, so plan only on looking at those to understand what questions you might want to ask and how to ask them.  The scope should be all end-user/community-facing services: discovery, reference, liaison, circulation, ebooks, inter-library loan, etc.  Find out where the users go to get those services right now, and specifically ask about the results of those services.  In other words, don't ask where else they go to borrow a book from another library; ask how they obtain materials to read.  Look into how often they use those services.  Do they find them easier to use or more difficult?  Less costly or more costly?  Faster or slower?  You need a comprehensive but unbiased look.  When they report back, read the assessment with an open mind, because you'll have been handed a treasure chest of facts.  If they tell you a service your library provides is inferior and little used, you have a candidate for elimination.  If they tell you it's competitive, they'll likely also have told you why, and what you can amplify to make it even more competitive.  Don't hesitate to discontinue an inferior, little-used service, because it is consuming resources you're wasting.
They can be redirected to support new services that will hopefully allow you to generate greater value for your end users/community.  If you want to read an excellent book on how to do this, read Blue Ocean Strategy.  You'll find my review of that work in this post.

In the next and final post in this series, we’ll look at efficiency and effectiveness and the methodologies to use in finding those within your current operations.

Monday, February 16, 2015

How Can Libraries Find the Money To Make Big Changes? (Part 1)

My last blog post, "If information has become ubiquitous due to the Internet, can librarians do the same?", caused a number of people to ask: "Wonderful ideas, but how do we pay for all of this?" It's a very fair question.  Unfortunately, it is one that frequently causes librarians, in response, to throw their hands in the air as if it's hopeless.

Many librarians indicate that what they hear from their administrators is: "Do more with less," and, feeling totally maxed out just trying to do what they're doing now, they simply can't wrap their heads around the idea of doing even more without more resources.  However, I've always said that when confronted with that directive, we need to hear: "Be more efficient, be more effective."  I still maintain that position.

So, over the next several posts, let me share some ideas about how you and your organization can go about doing that.  Are these ideas easy?  No, of course not.  Are they quick?  It varies.  Some are quicker than others, but combined into a packaged approach, you'll be able to show results early on and well into the future: results that will help your library clearly establish its value to the communities it serves.

1.  Strategic Plans. When a librarian reads my ideas and asks how we pay for them, my first response is to ask whether the University has a current strategic plan published on the University website. (Here is ours at OU.) Then I'll ask the librarian(s) if they know what it says and whether they can, in fact, tell me something about it. More often than not, at best, I'll encounter a blank stare or a mumbled response that they think they've seen it but really can't remember anything it says. Then I'll ask if the Library has a current strategic plan and whether it's on their website. (I'll frequently already know the answer, as I'll have checked. The results of that checking are, shall we say, grim.  Here's ours at OU Libraries.) If they have one, I'll ask how often it's reviewed during the year to ensure progress is being made on the goals/objectives that were set. All of which is a strong indicator of why a library is not performing well and/or is not being recognized for its contributed value to the University.

It is staggering to me that a Dean/Director of Libraries, in today's funding environment, can expect to have a compelling and positive discussion about Library finances if they can't sit down with their administration and directly show how the Library's Strategic Plan supports and contributes to the goals of the University's Strategic Plan.  Not only show it, but do so in documented and measurable terms!

For example, if the University's plan calls for higher student retention, higher matriculation rates, or the creation of a new degree or program, the Library's strategic plan needs to have goals and objectives showing what the library is going to do to support achieving them.  When those goals are achieved (and hopefully exceeded), it gives powerful support for why the University needs to continue to support the library at least at its current funding levels.  If the University's goals have been exceeded, it makes a powerful case for the benefits to be shared with the Library.  Of course, this is not a quick path to more revenue.  It will take at least a year, and possibly longer, before it starts to pay off.  However, it is likely to strongly support a case for not cutting the Library's budget, if these linkages are drawn using metrics that make the case.

2.  Organizational Support of the Strategic Plan and Accountability.  When talking to librarians about their Library's strategic plans, I all too often hear that it's an administrative exercise: once done, it's dropped in a drawer and forgotten until the next round of the exercise.  That's a terrible mistake.  A strategic plan should be a living document, and it can serve in multiple ways to help build discipline that allows the organization to achieve large goals and stay on a sustained high-performance track.  Creating objectives from the department level all the way down to the team-member level can do that.  Most strategic plans state only goals.  While this isn't the preferred route, it is frequently what results, because people dislike accountability.  Yet any Dean/Director and/or department manager worth their pay should take the goals of the Library's Strategic Plan and turn them into objectives for their departments and team members.  What's the difference between a goal and an objective?  According to this reference:
“Goals are long-term aims that you want to accomplish. Objectives are concrete attainments that can be achieved by following a certain number of steps… Goals have the word ‘go’ in it. Your goals should go forward in a specific direction. Objectives have the word ‘object’ in it. Objects are concrete. They are something that you can hold in your hand. Because of this, your objectives can be clearly outlined with timelines, budgets, and personnel needs. Every area of each objective should be firm. Unfortunately, there is no set way in which to measure the accomplishment of your goals. You may feel that you are closer, but since goals are de facto nebulous, you can never say for sure that you have definitively achieved them. Objectives can be measured. For example, ‘I want to accomplish x in y amount of time’ becomes ‘Did I accomplish x in y amount of time?’ This can easily be answered in a yes or no form.”    
If objectives are defined and linked to the Library's (and thus the University's) strategic plans, then when a performance period is over, both the team member and the manager should be able to sit down and ask: "Did we accomplish this or not?"  Sure, there will likely be a conversation about why something wasn't achieved if that is the case, but at least everyone knows what the expected result should be.
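To make the goal-versus-objective distinction concrete, here is a minimal sketch (the objective, numbers, and dates are hypothetical): an objective carries a countable target and a deadline, so the performance-review question reduces to a yes/no check.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Objective:
    """A measurable objective: a concrete, countable target with a deadline."""
    description: str
    target: int    # e.g. number of orientation sessions to deliver
    achieved: int  # actual count at review time
    deadline: date

    def accomplished(self) -> bool:
        # The yes/no question asked at the performance review.
        return self.achieved >= self.target

# Hypothetical objective supporting a student-retention goal in the plan
obj = Objective(
    description="Deliver first-year library orientation sessions",
    target=40,
    achieved=42,
    deadline=date(2015, 5, 31),
)
print("Did we accomplish this?", "Yes" if obj.accomplished() else "No")
```

A goal ("support student retention") has no such check; the objective derived from it does, which is exactly what makes the quarterly conversation possible.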

Furthermore, that meeting shouldn't happen just once a year.  A better practice is to ensure that each team member has a quarterly meeting with their manager to confirm that progress toward the goals is being made.  If the goals are no longer valid, this is a perfect opportunity to revisit and revise them, rather than waiting until the next annual evaluation cycle.

It's also important that the entire organization receive regular reports about how it is doing in achieving the plan.  A quarterly meeting, led by the Dean/Director, is a good vehicle for this.  So is a written report that can be distributed across the community to show the value being created for it.  (Here is ours at OU Libraries.)

Performing the steps above positions the head of the Library to take real, measurable results into their next meeting with the funding authorities, results that document the Library's value in achieving and aligning with the goals of the parent organization.

That will also put in place a much firmer foundation for finding the money to make big changes than I see many libraries currently using.  

(Next time, we’ll talk about increasing efficiency and effectiveness by realizing that what brought you here, won’t take you there.)

Tuesday, January 27, 2015

If information has become ubiquitous due to the Internet, can librarians do the same?
After my last blog post on library branding, I had an engaging exchange with a good friend who often says things that cause me to pause and think. That conversation was about what constitutes "expertise" in today's information environment.  Then, over the holiday break, I read a recent book, "Virtual Unreality: Just Because the Internet Told You, How Do You Know It's True?" by Charles Seife.  Finally, during that same break, while visiting another friend, who had recently written and self-published a book, he told me that while he was researching it, his very knowledgeable librarian, using substantial library resources, couldn't find anything he hadn't already found using Google or Google Scholar.  In my thinking, all of these dialogues converged.  Here's why.

A point most everyone, including librarians, agrees upon is that, due to easy accessibility, information today is truly ubiquitous in our environment.  Tapping or talking into our mobile devices readily retrieves information, and increasingly we can form our inquiries in normal conversational language.  The answers are delivered to us in mere moments.  It's fast, it's easy. Who needs a library, or even a librarian, anymore?  As a librarian, I know I'm not alone in having encountered numerous college and university administrators who have said this.  Nor am I alone in being given, at parties or in airports or on airplanes, upon explaining that I'm a librarian, the sad, sorrowful look and the question: "With eBooks and Google, aren't libraries and librarians a thing of the past?"  That's when I know I have someone in front of me who needs a major update on librarianship.  (Not that it's really their fault. As I've long said, librarians do not excel at articulating their value-add.)

Yes, information is ubiquitous today, but here is the problem: so is so-called "expertise".  Senator Daniel Patrick Moynihan once said, in an oft-quoted statement: "Everyone is entitled to their own opinion, but not everyone is entitled to their own facts." Unfortunately, today that no longer seems to be true.  As Seife documents in the book mentioned above, the criteria for holding "expertise" have been substantially lowered.  Today, you can be an "expert" by being a celebrity, by being rich (and buying a think tank to generate "facts" that support your position), or just by being very persistent and vocal in making your position well known.  You can say just about anything online, and if you get a big enough following of people to read and repeat what you've said, you've by default earned the title of "expert".  Social media, blogs, podcasts, and today's TV media all permit, promote, and foster the creation of so-called experts, most of whom would not pass previous generations' criteria for that term. The use of research and/or facts to support positions, particularly research and facts that have passed through the tests normally applied to scholarship, has become entirely secondary, if required at all.

We also know that we're facing a population in which concentration is becoming rare.  Multi-tasking has become a way of life, as have our mobile devices. Soon those devices won't even require carrying, because they'll be strapped to, or embedded in, our bodies, further exacerbating the problem.  Thoughts have become messages limited to 140 characters on Twitter, or videos limited to 3 minutes on YouTube, 15 seconds on Instagram, or even 6 seconds (a definite trend there!).  We know Facebook and Google are using profiling to place us in silos in order to increase their ad sales.  However, those silos also result in us no longer thoughtfully exploring ideas or positions, particularly those that might conflict with our own points of view.  As a result, we end up with a society, community, and campus where we only read what we agree with, and where we count on trending Tweets or friends to tell us what we think we need to know from the overwhelming, and ubiquitous, information flow.  It's a very difficult environment, one where simplicity triumphs over sophistication.

Now, let’s get back to libraries and librarianship, because it is this environment we’ve just described that gives librarianship the opportunity to create new, real and sustaining value.  However, as with so many opportunities, it also requires change. 

My previous post pointed out that librarians have not been diligent in keeping their brand up-to-date.  We’ve let the word “books” be our brand for way too long.  That was OK when libraries were THE place to go get information and most of it was made accessible in a book format.  However, that day is long gone.  

This has been compounded by the fact that when we did adopt new information tools, we lagged on the adoption curve, and thus when they were finally introduced, we all too often, in a rush, simply tried to fully emulate the original tool ("look at our new search tool, it's just like Google!").  When we did that, we did not take the time to make clear the differentiating values librarianship provided: deep Web searching, alternate points of view, appropriateness, authoritativeness, and authenticity.  This resulted in the commoditization of the new tool, and as a result, it was quickly discounted ("why do I need to go to the library? I can search Google and I'll find more"). These problems were further exacerbated by the rise of mobile devices.  Librarians tended to simply push their Google-like interfaces (although frequently dumbed down) onto those devices. Now lacking face-to-face interaction with the user, librarians easily became one with the technology. The result?  Librarians became identified with their technology, and the total package was commoditized.  That is where too many libraries still are today, and why so many of us have ended up in those painful discussions about our profession and its future viability.

Leading librarians saw what was happening and decided they had to adapt, and so they defined a new pathway, one that allowed the value of librarians to be affirmed and even expanded.  While not yet in the majority, their examples are now solidifying and offering solid answers to the questions asked in those discussions.  The results work to ensure that expertise is seen as something that must be earned and measured by established academic criteria, not simply by creative marketing.

If we look at the recently built Hunt Library at North Carolina State, the newly announced library planned at Temple University in Philadelphia, or institutions that are beginning to transform their existing facilities, like the University of Oklahoma Libraries, we can see recurring themes emerging in phrases like: collaborative workspaces, intellectual commons and crossroads, knowledge creation, innovation centers, and entrepreneurial centers.  In other words, they are places where ideas come together, intersect, and are examined, analyzed, and improved.  This is done under the guidance of people who have earned the title "expert" through the normal channels of academic rigor and peer review, sometimes face-to-face, sometimes virtually, using librarians' new investments in technology to support this exchange.  As a result, librarians are increasingly able to be where their users are located and to add new and demonstrable value to the knowledge creation and supply chain.  Our goal has to be to make the value of librarianship as ubiquitous as information.

(In an upcoming blog post, I’ll talk about ways to fund the retraining of librarians and the reshaping of facilities to support these new pathways.)