Monday, October 29, 2012

Impressions of the new library services platforms - Part 3 - Intota by Serials Solutions

Note:  If you haven't read the first post in this series, I'd recommend clicking here and doing so before reading the following post.  That first post sets some definitions in place that are used in the analysis and comparisons that follow.


ProQuest and its business unit, Serials Solutions, are another very good company with a long history of providing libraries with solid products backed by very good service.  Serials Solutions' product in the library services platform arena is called Intota(TM).   The company has backed this product with very solid people who have long track records in the field and a strong understanding of the needs of libraries.  From an external point of view, it appears they are assigning all the resources needed to produce a serious new contender for those libraries wishing to procure a new library services platform.


Intota is a totally new product, written from the ground up, and appears to meet the criteria (see my introductory post) to be called a true cloud-computing solution.   Serials Solutions staff say it will offer true multi-tenant software operation, shared-data capabilities, and support for a powerful analytics engine.  Plans also exist for multiple data centers, including international locations, within the next year.  Of course, these are verbal assurances at this point, so libraries considering this solution should verify all of them before signing a contract.

In my analysis, Intota was developed by ProQuest as a response to competitive offerings (from OCLC, Ex Libris, Innovative) of complete discovery-to-back-room systems, with their tighter, cleaner integration of all the processes involved.  Had ProQuest/Serials Solutions continued solely with Summon and the 360 offerings, they would have found themselves at a disadvantage in the marketplace, so they responded by announcing Intota.  The consequence of this response-driven decision is that Intota is the latest entrant among the library services platform offerings and therefore has the least functionality to show at this point in time.  The advantage of being the latest is that what is being shown looks very promising and features some creative thinking and well-thought-out integration of the workflows and processes that occur in the back rooms of all libraries.

Intota is based on the premise that libraries are managing today’s collection with yesterday's tools, that the nature of the collection has changed, and that users want to be self-sufficient. Intota focuses on workflows, system maintenance and assessment so librarians can showcase their value. Overall, Intota is a total reconceptualization of library management systems, providing functionality focused on selection, acquisitions, description (cataloging), fulfillment, knowledgebase and discovery.

Library Considerations

As mentioned above, Intota is being written from the ground up.   This is both good and bad. The company has hired many extremely talented people for this product, and they won't be tied to historical code while inventing it.  At the same time, a from-scratch rewrite creates a much higher probability that some of the forgotten complexities that underlie library back-room processes will be missed, so functionality may not meet initial expectations, or the product may prove exceptionally prone to bugs while all of those details get worked in.

It would also appear, based on Serials Solutions' reported selection of development partners, that Intota is intended to be a product with broad appeal across all types and sizes of libraries.  Of course, this also means there is some risk involved in using Intota, for both libraries and the company.

One of the risks of trying to appeal to many different types and sizes of libraries is that the functionality can be a bit thin, depending on the type of library and when the product is installed. For any organization developing a new product while trying to quickly address a very broad market, there is the potential to disappoint those involved in the early installations. This approach simply stretches developer resources very thin across many demands and can produce thin functionality and/or low-quality code, resulting in some very unhappy early adopters.  For an organization building a totally new product, the way to avoid this is to focus on similar types of customers initially and, as those customers become successful, branch out to address the other types of libraries. Unfortunately, neither librarians in libraries nor sales managers in companies are a very patient lot.   As a result, librarians who buy into the vision of the product and want to get on board early (possibly for financial reasons, or because their existing system is reaching end-of-life) can find the delay in receiving needed functionality a very frustrating experience.  Likewise, sales managers in companies, particularly those with business backgrounds who don’t understand libraries, will press account managers for closed sales. Thus sites that should wait to sign up, for either their own reasons or company reasons, get rushed to “close the deal”.  When that happens, installations can be problematic, and frustrations can mount to the point that librarians bail out of the agreement, which is bad for all concerned.
This is a place where librarians, in order to protect themselves and their libraries, need to press the company to be very clear about when their outstanding needs will be addressed.  They should put a delivery schedule, with associated penalties for failure to deliver, in the body of the contract to ensure it is more than just a “handshake agreement”.

As for the openness of Intota, what is being promised is a suite of documented, open APIs.  As Serials Solutions staff note in their presentations, this is not something new for them; they’ve been doing this with their other products for quite some time.  (See my blog post on APIs to understand how you can position your library to have a successful experience with APIs in an open-platform environment.)  However, it’s too early to provide any firm analysis here; we’ll simply have to wait until this product is further along before we can judge it.

One of the real advantages of Intota is that it represents a total approach, covering discovery to the back room. As a result, it offers tightly integrated processes, workflows and data.  It will allow librarians to smash through the silos that exist among technical services, reader services and other departments in so many libraries.

However, as a librarian I’m terribly uncomfortable with a library locking itself into buying so much of its content, assets and tools from the same supplier.  This incurs a real risk that I think far too many librarians are simply ignoring, at considerable peril: it removes competition, and too many checks and balances, from the supply chain, which can easily lead to abuses (albeit very subtle ones at times) by the supplier.  In counterbalance, one must note that ProQuest has long been a very reputable company (although not without blemish; see my recent post on data ownership for an example of what can go wrong for libraries in these kinds of scenarios).  Plus, as we all know, companies sometimes change ownership, and new ownership may, or may not, bring new approaches and views.  Customers must exercise caution here, and again I would note that many items should be very carefully spelled out in contracts to avoid future problems.

As for availability, while Serials Solutions is currently signing up and working with test partners for Intota, it should be noted that the product is not expected to be complete until late 2013, whereas many of the competing offerings are already largely complete and being installed at sites.  Furthermore, as anyone involved with a major software development effort knows, projected completion dates for an entirely new product almost always slide, sometimes only by weeks, more often by months, especially for a product of this size and complexity.

Bottom Line?

As noted, Intota appears to be another totally new system designed from the ground up. As a result, it will offer highly efficient back-room workflows.  At the same time, because it is new, it will, like many new products, suffer from more bugs.   In addition, because of its relatively late entry into the market, its functionality will be thin for a while, until it catches up with competitive offerings.  However, the functionality sounds very promising and is already showing some very creative, fresh thinking.   The product is also clearly trying to leverage the other assets in the Cambridge Information Group portfolio by integrating them directly into Intota's workflows.  For libraries that are customers of those companies and products, this could be a very positive step.  For those who are not, the pressure will be on to move from the products currently used to these more tightly integrated ones.  Librarians will have to decide if the loss of checks and balances in their supply chain is worth that integration, and I want to stress again that this should be thought about very carefully.   I will also note that Serials Solutions' focus on assessment and analytics will be very appealing to librarians and will offer major steps forward in offering new, proactive services to users.

At this point, the data center locations and security level certification(s) are unknown and thus, the ability of this solution to scale and be secure must also be carefully analyzed, as those developments are unveiled.  The caveats I noted in my introductory post about security clearly apply when considering this product.   

As noted above, Serials Solutions is a business unit within ProQuest, and ProQuest is owned by the Cambridge Information Group (CIG), a private, family-held corporation.  So this company has the advantage of being able to make long-term investments and decisions that will benefit its customers, without the same pressure private-equity-owned firms face to generate increasing short-term sales numbers.  CIG also brings impressive financial resources to the table that will enable it to continue making substantial investments, and to pursue additional mergers and acquisitions that could offer advantages to libraries in the future.  Furthermore, unlike companies owned by private equity firms, there is a higher probability that more of the profits made on sales will be directed back into the organization (although not as fully as with open source platforms or a collaborative organization).

Librarians should know that buying products and services from this company represents a reasonably sound investment both for the profession and their organization.  Given the caveats expressed above, particularly for a brand new product, if libraries considering Intota appropriately protect themselves in the contracting stage, they’re likely to find this product well designed to accommodate the rapidly changing needs of users and libraries for many, many years to come.  

NOTE:  This is one post in a series.  All the posts are listed below:

1. Introduction 
2. Sierra by Innovative 
3. Intota by Serials Solutions (this post)
4. Worldshare by OCLC
5. OLE by Kuali
6. Alma by Ex Libris
6a. Ex Libris and Golden Gate Capital
7. Open Skies by VTLS

Wednesday, October 24, 2012

Impressions of the new library service platforms - Part 2 - Sierra by Innovative

Note:  If you haven’t read the first post in this series, I’d recommend clicking here and doing so before reading the following post.  That first post sets some foundational definitions in place that are used in the comparisons that follow.


Innovative Interfaces is one of the oldest, best known, most stable, successful and respected names in library automation.   They’ve long been known for producing good products backed by good service and there are many, many libraries that rely on them for those very reasons.  They’re also well known for typically carrying a higher priced product (they would say because they offer higher value) and for charging their customers additional fees for any other add-on products and/or services.  


Sierra is the Innovative entry into the library services platform arena and represents a different approach from that taken by many of the other library services platform providers.  Innovative’s approach is to largely repackage their previous product, Millennium: move it to a new open source database (PostgreSQL), use a new open source indexing engine (Lucene), add some new open APIs, open up some of the existing APIs, update the interface and add some new functional modules.  The totality of this package is called Sierra, and it can be had either as software-as-a-service (SaaS) or as a local install.   (That fact alone means it is not a true cloud-computing solution.)  Whether all of this is enough new componentry to qualify as a “new” product, I’ll leave to you.

Considerations for Libraries

I’ll point out there is some logic and some risks in the approach taken by Innovative, both for libraries as their customers and for Innovative as a company.  Let’s start by looking at the logic, starting with libraries.  

Many libraries understand they are currently in a situation where their primary focus needs to be on meeting end-user or library member needs.  They need to do this by moving quickly and showing real and substantial progress.   If this happens, they are more assured of seeing improved funding and support in their community of users.  So, given that they are facing limited financial and staff resources, many libraries have to make a choice concerning where they will focus their resources in the short term – i.e. on the back room efficiencies, or on user-facing service improvements, many of which today only partially depend on the library automation system.  While there is no disagreement that improving the back room efficiencies will also improve the user facing services, the net short-term gain may not equal the cost of conversion to a new system and/or the re-engineering of those back room processes right now.   So many libraries decide to defer those improvements until later.  

Is this the right choice?  That depends on your library and what it needs to accomplish.  Clearly, given the number of sales of Sierra, many librarians are deciding to go this route. For Innovative, the logic is that they can point out that there is no loss of existing functionality (since they aren’t redeveloping it), minimal additional training is needed (which is also less demanding on company staff resources), customers convert quickly from the old product to the updated one, and, because the product is not being rewritten from the ground up as other providers have chosen to do, there are fewer bugs, less documentation to write, less testing to be done, and so on.   All of which sounds good, right?  Well, before deciding, let’s look at the risks this approach incurs, again both for libraries and for Innovative.

The risks for libraries come in several parts.  First, because Sierra has not been rewritten on a true multi-tenant architecture (see my introductory post for the definition I’m using), it will likely take longer for your library to get access to new versions of the software if you’re hosted, because Innovative will need to update each implementation separately.  (If you’re not using the SaaS hosting option, you will still bear the cost of paying your staff to do the version upgrades, a task the SaaS version relieves you of.)  Furthermore, because this approach doesn’t optimize the efficiency of the hardware the software runs on, it will keep the routine costs of running the hardware and software higher than for providers utilizing the newer, true cloud-computing, multi-tenant architecture, and thus will keep the prices customers pay Innovative higher.  It also means Innovative will have to support multiple versions of the software, another cost that customers ultimately bear (and one which those providers offering a true cloud-computing solution will avoid).  If you’re running the product locally, it also appears there might be some substantial hardware investments needed to run Sierra.  Here is a recent proposal on the Web that shows some of the costs that might be involved for a library.

With regard to the software, the totally rewritten and re-engineered products (WorldShare, Intota, Alma, OLE) provide more integrated and streamlined workflows and thus are far more efficient for those libraries that are rapidly adding support for digital collections.  These more efficient workflows mean you can take existing people and financial resources and reallocate them to new user-facing services.   However, if your provider doesn’t offer these new integrated workflows (which are not the same as configurable workflows), as is the case with Sierra, this is an advantage your library will not realize.  Again, this may not matter to your library at this point in time.  It is up to you to determine whether the work and cost of converting to the newer, more efficient systems is worth the efficiencies you’ll gain. Almost certainly, in the long run, it would be.  However, many libraries need to deal with the short term first, and there the picture is not always as clear.  Sometimes this is an acceptable risk; it depends on the situation at your library.

Another place where the Sierra architecture appears not to be as cutting-edge as the competitive offerings is in the system's ability to truly aggregate data between libraries and to offer analytic services driven from the aggregated data.   However, it should be noted that Sierra does offer some excellent reporting tools, a feature that has long been a plus for the Innovative product.   These new tools include a new “Reporter” module designed for the power user: a powerful reporting tool allowing users to select fields and compose reports with relative ease (some training required).  The data used to drive this module is copied nightly and includes the “core” ILS data.  Another tool is the “Decision Center”, intended for end-users, typically collection managers.   It appears to rely primarily on canned reports, but it can be run dynamically, with reports produced for instant use and analysis.

However, these reporting tools are offered primarily for use with data from the library or consortium running on the system.  Aggregation beyond this (such as would be required to compare your library to peer institutions across the country) involves additional steps to upload the data to Innovative’s data center and to run the reports there. (Some call this the “hosted paradox”: all the machines hosting customers sit in the same room, but the data can’t easily be shared between those systems without a lot of additional work.)  The analytic tools for running reports appear to be limited to those written by Innovative, whereas other library services platform providers are offering access to far more powerful tools such as Hadoop and/or Oracle business analytics tools.
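To make concrete what "aggregation" means here, a purely illustrative Python sketch follows. The library names and circulation figures are invented for the example, and this is not any vendor's actual tooling: the point is simply that a peer comparison only becomes possible once each library's figures have been pooled in one place.

```python
# Illustrative sketch of cross-library aggregation; the library names
# and figures are invented for the example.
from statistics import mean

# Each hosted system normally holds only its own data; aggregation means
# pooling these per-library figures in one place.
pooled = {
    "Library A": {"circulation_per_user": 12.4},
    "Library B": {"circulation_per_user": 8.1},
    "Library C": {"circulation_per_user": 10.3},
}


def peer_difference(aggregated, library):
    """How far one library sits from the mean of the whole pool."""
    values = [d["circulation_per_user"] for d in aggregated.values()]
    return aggregated[library]["circulation_per_user"] - mean(values)


print(round(peer_difference(pooled, "Library A"), 2))  # 2.13, above the peer mean
```

Without the pooling step, each system can only report on its own slice of the data, which is exactly the extra upload-and-run work described above.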

Now, let’s look at the risks for Innovative.  First, because this product is not a true multi-tenant architecture, Innovative faces higher costs to operate it as a SaaS solution than competitors offering a true cloud-computing architecture (for those running Sierra as a local installation, the cost of buying hardware is one they must continue to bear directly).   Of course, hardware costs continue to drop, so the impact of this cost may be deferred long enough for Innovative to develop a true multi-tenant solution, but unless that is already underway, it would be many years down the road.  So, ultimately, there is a risk that they will be out-performed and out-priced by their competitors over the long term.

In addition, because they don’t offer the same level of data aggregation as other providers, libraries using this approach will be less likely to be able to offer analytic-driven services to users (or at least services based on as wide a range of aggregated data). 

Another risk for Innovative is that, as libraries put new user-facing services into place and continue down that path, they’ll need improved back-room efficiencies in order to continue delivering state-of-the-art services and the staffing to support them.  Sierra, as presently architected, seems unlikely to be able to match competitor offerings in this regard, so it is probable that Sierra will begin to appear a less viable option over the long term.  Counterbalancing this for Innovative, in the short term, is the fact that the company owners will be able to maximize the profit they earn (since they haven’t had to invest in a total rewrite of the core product), but this approach will potentially lower profitability in the long term.  (Here I’ll note that I’ve done a previous post about equity ownership, the likelihood of such owners keeping a company for the long term, and the implications all of this carries for libraries.  If you haven’t read that post, you might want to do so.)

Next, let’s return to the customer perspective and look at the level of “openness” offered by Sierra.   It is interesting to note that Innovative doesn’t describe Sierra as an “open platform”; instead they talk about “open development”.  One should investigate this terminology carefully and be certain to understand the difference between the two.  Sierra is clearly providing customers with access to more of the system APIs and is promising to deliver new APIs that will give access to additional data and services.  These are encouraging steps.  Innovative literature talks about a developer community coming soon, to be called the “Sierra Developer Sandbox”.  Again, these are positive steps and should be recognized as such.  Is “open development” simply an attempt at marketing differentiation, or is it really the same as what others are calling “open platforms”?  At present this is a bit unclear, but given Innovative’s long-time reputation for “black box” solutions, one would be well served to do some thoughtful due diligence in this area.


At least for the short term, Sierra will prove an entirely viable option for many libraries.  Libraries that want to move to a hosted environment will be able to do so (although Innovative has long offered hosting for many of their products, including Millennium).  The product is available right now and offers a total range of library functionality, even though that level of functionality will force libraries to maintain the current staffing levels and redundancies that exist in back-room processes.   Analytic-driven services based on large aggregates of data, as offered by competing solutions, are still in their infancy, so again, this is a place where many libraries can wait in the short term. Multi-tenancy, while important for offering solutions with lower costs and higher efficiencies, is probably not a major concern for Innovative libraries, or they wouldn’t have chosen Innovative in the first place.

One must recognize that Innovative, in taking this approach, has probably studied their customers' needs closely and feels the offering meets those needs.  However, it must be understood that this approach represents “staying the course” at a time when many libraries are undergoing very rapid change and major upgrades.  Depending upon where your library is in dealing with change, Sierra may, or may not, be a solution that works well for your library.

NOTE:  This is one post in a series.  All the posts are listed below:

1. Introduction
2. Sierra by Innovative (this post)
3. Intota by Serials Solutions
4. Worldshare by OCLC
5. OLE by Kuali
6. Alma by Ex Libris
6a. Ex Libris and Golden Gate Capital
7. Open Skies by VTLS

Monday, October 22, 2012

Impressions of the new library service platforms - Part 1

Introduction and definitions/descriptions

At the last ALA Conference in Anaheim back in June 2012, I looked at all the major new library service platforms.  These included:  OCLC’s Worldshare (which, it was announced at that same conference, EBSCO will couple with EDS to provide a complete library services platform), Serials Solutions’ Intota, Innovative’s Sierra, Ex Libris’s Alma and Kuali’s OLE.  Since that time, VTLS has announced a new platform, tentatively called “Open Skies”.   With the exception of VTLS’s offering, which was not yet available at that conference, I kicked the tires, peeked under the hoods, and took a few test drives of each of these new platforms.   In other words, I did all the usual routines any potential buyer at an ALA conference would do.   I walked away with a lot of impressions and thoughts about these new product offerings.  Many people routinely ask me about these systems, and thinking that others might also benefit, I thought I’d write up some of my impressions and perceptions.  So over the next week (or two, since life seems a bit hectic these days), I’ll be posting my personal analysis of each of these platforms.  As I’ve noted, these represent my thoughts and impressions, and I’ll welcome comments and/or factual corrections from anyone who might want to submit them.

However, before getting underway with that analysis, I want to share some definitions I’ll be using as a framework.  This is necessary because there are some important distinctions to be made between these library service platforms based on these definitions.  I’ll be analyzing these platforms using the following definitions/descriptions:

  1. SaaS.  This stands for Software as a Service and really should be viewed primarily as a different way of delivering software services.  The major difference is that with SaaS, you’re using a remotely hosted machine instead of a locally installed one.  Coupled with that, the company hosting the machine takes on the responsibility for maintaining the system, so library staff are freed from that set of tasks.

  2. Cloud Computing.    This term, as noted by the Gartner Group in October 2010, moved into the “Peak of Inflated Expectations”, where it has remained in the latest survey (2012).  The reason is that it has been overused in marketing and has become an all-inclusive remedy for whatever ails your library.    There is actually an agreed-upon definition from the National Institute of Standards and Technology, which states that a cloud computing system supports the following:

    • On-demand self-service: A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service’s provider.
    • Broad network access: Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).
    • Resource pooling: The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location-independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or data center). Examples of resources include storage, processing, memory, network bandwidth, and virtual machines.
    • Rapid elasticity: Capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.
    • Measured service: Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be managed, controlled, and reported providing transparency for both the provider and consumer of the utilized service."

    Of course, in reading this definition, it seems aimed at describing consumer-facing applications more than those aimed at the organizations in between the cloud service and the end-user, which, of course, is where libraries are more typically found.   Still, even among the products being marketed directly to libraries, you can find wide variations in how the term “cloud computing” is being used and which parts of the definition providers are applying to their offerings.  One of the most frequent “stretches” of the definition is when a firm markets its SaaS service as a “cloud computing” solution, while other organizations are truly bringing major new functionality and new software architecture to the market under the same “cloud computing” description (although you will frequently also find the term “Webscale” bundled into those descriptions, but NOT always).
  3. Multi-tenant software.  This has to be one of the most frequently misunderstood concepts of cloud computing.  While briefly mentioned in the definition above, a “light” definition can be derived from WhatIs.com, which states (emphasis is my own):
     “Multi-tenancy is an architecture in which a single instance of a software application serves multiple customers. Each customer is called a tenant. Tenants may be given the ability to customize some parts of the application, such as color of the user interface or business rules, but they cannot customize the application's code.  Multi-tenancy can be economical because software development and maintenance costs are shared. It can be contrasted with single-tenancy, an architecture in which each customer has their own software instance and may be given access to code. With a multi-tenancy architecture, the provider only has to make updates once. With a single-tenancy architecture, the provider has to touch multiple instances of the software in order to make updates.”
    This has incredibly important implications for you as a customer.  Restating the above: your supplier can run a far more efficient operation, i.e., it will likely take fewer computer resources than systems running in a SaaS architecture, which should ultimately translate into lower costs to your library for this type of technology.  Another reason costs should be lower: if a supplier is supporting all its customers (for a working number, let’s say 500) from one software instance, then when it upgrades that instance to the latest version, all 500 customers are upgraded at the same time.   If a supplier is using one instance of the software per customer, even if hosted in a SaaS architecture, it has to upgrade each instance individually.  You’re probably already familiar with what that means in terms of waiting for an upgrade and the overhead it creates.  It is costly overhead that is not eliminated unless the software architecture is truly multi-tenant.   As we’ll see in the days ahead, some of these new systems are multi-tenant, and some aren’t.
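The upgrade-cost difference is the easiest part of this to picture in code. Below is a minimal, purely illustrative Python sketch, not any vendor's actual architecture (the 500-customer figure is just the working number used above): a single-tenant provider must touch one instance per customer to roll out a new version, while a multi-tenant provider upgrades every tenant in one operation.

```python
# Purely illustrative sketch of the upgrade-cost difference;
# not any vendor's actual architecture.

class SingleTenantProvider:
    """One software instance per customer; every upgrade touches each one."""

    def __init__(self, customers):
        # Each customer gets its own instance, tracked here by version.
        self.instances = {customer: "v1" for customer in customers}

    def upgrade(self, version):
        operations = 0
        for customer in self.instances:
            self.instances[customer] = version  # one maintenance task per customer
            operations += 1
        return operations


class MultiTenantProvider:
    """A single shared instance serves every customer (tenant)."""

    def __init__(self, customers):
        self.version = "v1"
        # Tenants keep per-tenant configuration, not per-tenant code.
        self.tenant_settings = {customer: {} for customer in customers}

    def upgrade(self, version):
        self.version = version  # every tenant is upgraded at once
        return 1


customers = ["library-%03d" % i for i in range(500)]
print(SingleTenantProvider(customers).upgrade("v2"))  # 500 separate operations
print(MultiTenantProvider(customers).upgrade("v2"))   # 1 operation
```

The point of the sketch is the loop: with 500 customers, the single-tenant provider performs 500 upgrade operations while the multi-tenant provider performs one, and that overhead recurs with every release.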

  3. Security certifications.  The security of these new systems is a complex and important topic.  Without a secure cloud-computing or SaaS system, you are potentially increasing your library's exposure to legal liability.  As a result, when procuring a new cloud-computing or SaaS library management system, you, along with your legal and procurement people, should make sure the supplier meets some certified standard of security.  Note: most certifications apply only to the data center, so they may provide no assurance that data leaving the data center and traversing the larger net is transferred in an encrypted, secure manner.  That is something you should check separately as part of your procurement.  In the analysis of these new systems I'll do in the days ahead, I'll only be asking which security certification(s) the data center has met.  Those typically consist of one or more of the following (although you should also carefully note that some providers indicated they have NO data center security certifications):

    • ISO/IEC 27001.  This standard is focused on security aspects and thus is the most appropriate for addressing your security concerns.  (NOTE: SAS 70 and SSAE 16, mentioned below, are focused more on quality issues, which can include security, so they're also useful, but you should know the focus is different.)  The Wikipedia entry on ISO 27001 says in part:

      “ISO/IEC 27001 requires that management:
      • Systematically examine the organization's information security risks, taking account of the threats, vulnerabilities, and impacts;
      • Design and implement a coherent and comprehensive suite of information security controls and/or other forms of risk treatment (such as risk avoidance or risk transfer) to address those risks that are deemed unacceptable; and
      • Adopt an overarching management process to ensure that the information security controls continue to meet the organization's information security needs on an ongoing basis.”

      Compliance with the above can be audited by companies that specialize in this type of work, and you can request a copy of the certification (although do not expect to see a copy of the detailed assessment, as fulfilling that request would itself compromise the security of the system).  Also note that the certification should be for the data center where your data will be hosted, because these certifications are location-specific.

    • SAS 70 (NOTE: SAS 70 has now been superseded by SSAE 16; however, you might encounter either in asking for a security certification).  This standard, written in 1992, was originally designed for auditing accounting service organizations, and it examines a service organization's controls and processes.  Per the SAS 70 website, certification to this standard "represents that a service organization has been through an in-depth audit of their control objectives and control activities, which often include controls over information technology and related processes."  The newer SSAE 16 dates from 2010, and while at first examination it may not seem applicable, in fact, just like SAS 70, it examines controls applicable to service organizations and even has a related guide (SOC 1) that applies to organizations providing computing services to a customer.  See this blog post for more details.
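To make the upgrade argument from definition 2 concrete, here is a minimal, purely illustrative sketch in Python.  The class and method names are invented for this example, and the figure of 500 customers is the working number used above; none of this represents any vendor's actual software.

```python
class MultiTenantApp:
    """One shared software instance serving many tenants."""
    def __init__(self, version):
        self.version = version
        self.tenants = {}  # tenant name -> per-tenant settings

    def add_tenant(self, name, ui_color="default"):
        # Tenants may customize settings (e.g., UI color or business
        # rules), but they all share the same code and version.
        self.tenants[name] = {"ui_color": ui_color}

    def upgrade(self, new_version):
        # One upgrade operation; every tenant sees it at once.
        self.version = new_version


class SingleTenantApp:
    """One instance per customer; each must be upgraded on its own."""
    def __init__(self, version):
        self.version = version


# Multi-tenant: 500 libraries, a single upgrade operation.
shared = MultiTenantApp("1.0")
for i in range(500):
    shared.add_tenant(f"library-{i}")
shared.upgrade("2.0")

# Single-tenant: 500 separate instances, 500 upgrade operations.
instances = [SingleTenantApp("1.0") for _ in range(500)]
for app in instances:  # the provider must touch every instance
    app.version = "2.0"
```

The point of the sketch is the last two stanzas: the multi-tenant provider performs one upgrade regardless of customer count, while the single-tenant provider's upgrade work grows linearly with the number of customers — which is exactly the cost overhead described above.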
OK, with these important definitions/descriptions out of the way, my subsequent posts will begin the analysis of each of the new platforms.  I hope you’ll find them useful.  

NOTE:  This is one post in a series.  All the posts are listed below:

1. Introduction (this post)
2. Sierra by Innovative
3. Intota by Serials Solutions
4. Worldshare by OCLC
5. OLE by Kuali
6. Alma by Ex Libris
6a. Ex Libris and Golden Gate Capital
7. Open Skies by VTLS