Posted by: mcgratha | January 5, 2011

Impact of Open Source

Continuing the theme of the Future of ECM … trend #10 …

In the current economic climate, where many organisations are slashing their budgets and de-scoping new projects to focus on the essentials (dropping many of the extras and “bells and whistles”), open source ECM vendors are stepping up to offer a viable and cost-effective alternative to the mainstream ECM vendors. From a cost perspective, I see open source having three main advantages over mainstream ECM vendors:

1)     Less expensive – Over a 5-year period, it is perfectly plausible for an open source solution to come in at half the (software) price of a mainstream ECM vendor;

2)     No capital costs – Open source vendors typically receive their revenue from an annual subscription for technical support (against an agreed Service Level Agreement), plus maintenance and software updates. This means that there is minimal upfront investment, as costs are met from a predictable, annual operating expense as opposed to capital expense;

3)     Simpler cost models – Unlike most of the mainstream ECM vendors, which typically charge for their software based on a variety of factors and permutations (such as named users, per server/CPU, or per optional module), open source vendors usually charge a single price that applies across the entire product suite, leaving customers free to mix and match: no hidden extras, no surprises, no “number of users” price ceilings, etc.
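The difference between the two cost profiles can be sketched with a quick calculation. All figures below are hypothetical assumptions, chosen purely to illustrate the shape of the two models, not real vendor pricing:

```python
# Illustrative 5-year software cost comparison between a perpetual-licence
# ECM model (upfront capital plus annual maintenance) and an open source
# subscription model. All figures are invented for illustration.

def five_year_cost_perpetual(licence, maintenance_rate, years=5):
    """Upfront licence fee plus annual maintenance as a fraction of licence."""
    return licence + licence * maintenance_rate * years

def five_year_cost_subscription(annual_subscription, years=5):
    """No capital cost: a flat annual subscription covering support,
    maintenance and software updates."""
    return annual_subscription * years

mainstream = five_year_cost_perpetual(licence=500_000, maintenance_rate=0.20)
open_source = five_year_cost_subscription(annual_subscription=100_000)

print(f"Mainstream (perpetual):     £{mainstream:,.0f}")   # £1,000,000
print(f"Open source (subscription): £{open_source:,.0f}")  # £500,000
```

Under these assumed numbers, the subscription model comes in at half the 5-year software cost, with the spend spread evenly as operating expense rather than concentrated upfront as capital expense.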

The majority of the open source vendors have to date primarily focused on web content management, which has limited their impact on the mainstream ECM vendors. However, I expect this to change significantly over the coming five years as open source vendors start to offer a wider ECM portfolio of products, bringing them into more direct competition with the mainstream ECM vendors. Indeed, the open source vendor Alfresco is already there, offering a very compelling and competitive ECM suite.

This continual rise of open source will certainly impact the mainstream ECM vendors. For example:

  • They will seek to re-position themselves as more than just “ECM”;
  • The licence costs for core ECM products/services will certainly need to drop with more innovative commercial licence models being introduced – see my blog ECM as a commodity.
Posted by: mcgratha | January 4, 2011

ECM as a commodity

Continuing the theme of the Future of ECM … trend #9 …

Dropping into the infrastructure layer

With the onset of a more service-oriented approach to content management, and with less differentiation between ECM vendors around core content functionality, we expect ECM platforms to drop into the infrastructure layer, becoming a core and essential part of every IT infrastructure. This is evident from the nature of recent acquisitions in the ECM marketplace (see my blog Market consolidation below), with infrastructure players making a much bigger move into the market. For example, HP acquiring Tower Software, IBM acquiring FileNet, Oracle acquiring Stellent, EMC acquiring Documentum, SAP partnering with Open Text, and Microsoft pushing relentlessly forward with SharePoint. This drive is forcing a greater commoditisation of ECM.

Switching from a technical to a business discipline

However, for ECM to truly become a commodity, it will need to be available on-demand and charged to organisations based on the functionality and services that they use. In this way, for the most part, organisations will only need to be concerned about the business implementation of ECM rather than worrying about what features and functionality it provides or how it will be implemented behind the scenes. I believe that the knock-on ramifications of this last point will have a really important and positive impact on ECM going forwards, potentially doubling the success rate of ECM deployments.

Why? Because from the outset of an ECM programme it will enable organisations to focus on what they want to achieve in three to five years’ time, carefully defining the strategy required to get them there, in addition to identifying the appropriate metrics that will define and measure success.

Many organisations spend a disproportionate amount of time on the short-term evaluation and selection of an ECM product versus the time spent planning and focusing on the details of the longer-term business problems that need to be solved. Interestingly, when it comes to Microsoft SharePoint, the converse is almost true, as organisations tend to spend very little time on the evaluation and selection process, but unfortunately also spend very little time on the business problems that need to be solved. Although Microsoft SharePoint is arguably a step towards “ECM for the masses”, it is also often mistakenly seen as a piece of technology to deploy and forget, rather than something that requires just as much planning and control as any other ECM product in order to be successful.

However, in general, as the core functionality of ECM becomes commoditised, it will enable ECM to effectively switch from a technical to a business discipline, giving organisations the freedom to really focus on:

  • The needs of the business;
  • The roles and responsibilities that will be required (many of which will be new to the organisation);
  • Information Governance;
  • Training and business change;
  • The user experience.

These are all the kind of factors that will make the ECM deployment and roll-out a success. As the core functionality of ECM becomes a commodity, it will give organisations more time to take the “M” in ECM more seriously.

ECM reforms to something bigger

There are significant overlaps in terms of core functionality between the different disciplines of ECM, as illustrated in the diagram below (acknowledging that different people will have a different view on what disciplines actually make up ECM).

ECM Disciplines

I believe that many of the distinctions between the different disciplines will become increasingly blurred, morphing into more of a collection of information management services, and over time, terms like “document management” and “web content management” as distinct disciplines will fade as relevant marketing terms. In fact, we also see the demise of “ECM” as an overall term, as it reforms into something much broader, progressively incorporating many of the technologies that sit adjacent to a typical ECM implementation.

I believe that over the next five years, ECM vendors are likely to start re-positioning themselves, aligning more to Information Management than specifically “ECM”.

Posted by: mcgratha | January 3, 2011

Market consolidation

Continuing the theme of the Future of ECM … trend #8 …

The ECM market has gone through a series of frenzied acquisitions over the past decade as each of the leading ECM vendors assembled its arsenal of core ECM components. I’ve attempted to illustrate the main acquisitions by the leading ECM vendors in the diagram below. Note: this is obviously not an exhaustive list, and several vendors (for example, HP) are missing … there wasn’t enough “real estate” in my diagram to fit them all in.

ECM Market Consolidation

The pace of acquisitions does seem to have finally slowed down and we are now in the closing stage of market consolidation for what we have traditionally viewed as ECM.

Over the next five years, we can expect a new wave of acquisitions as ECM expands its remit into the larger Information Management marketplace, augmenting its core product portfolio by acquiring further technologies that are adjacent to ECM. The smaller ECM vendors will probably be increasingly squeezed, especially as core ECM functionality becomes a commodity and drops into the infrastructure stack (see my blog ECM as a commodity) and licence cost reductions take hold. Those smaller ECM vendors that survive will probably need to focus more on niche markets or on specific vertical solutions.

Posted by: mcgratha | January 2, 2011

Semantonomics

Continuing the theme of the Future of ECM … trend #7 …

In this future trend of ECM, I talk about how semantic technology is likely to play an increasing role in facilitating the discovery of information, connecting it with the people that need it, often before they even know that they need it, thereby giving it greater value. This introduces the concept of semantonomics (semantic economics), the art of deriving value from information.

Infoglut

As the volume of digitally available information continues to grow, the problems associated with “information overload” are becoming more prevalent. In everyday life, people have more and more information pushed at them from every angle. Even if all the information were perfectly classified and ordered, the reality is that people just don’t have time to actively parse through it all. The volumes are simply prohibitive: reading and sifting through vast quantities of information to find the content of interest is impractical. In fact, the method most people use to survive this “infoglut” is to ignore most of it. This can be a particular problem on mobile devices, which have much smaller screens and where the information that is relevant and important to you really does need to be displayed in a manner that catches your attention quickly.

So we have a problem: a significant amount of the information produced in most organisations is not read by all of the people who ought to be reading it, or who would certainly benefit from reading it. This can fundamentally de-value the information. Therefore, in order to maximise the value of information, it needs to be targeted at and connected to a qualified audience.

It is worth noting that social collaboration tools are also likely to play an essential and overlapping role here in order to dynamically discover people’s interests and skills from analysis of social activity, providing increased intelligence in order to be able to connect relevant information to them. This is discussed in the ‘connecting the dots’ section of my blog The Collaborative Office.

Discover

Performing a Google search for the term “Oasis” will return over 4.5 million results, including the Oasis clothing store, Oasis the rock band, the web standards body the Organization for the Advancement of Structured Information Standards (OASIS), the Oasis Beauty and Day Spa, in addition to an oasis as a fertile spot in the middle of a desert. If I were a producer in the record industry and specifically interested in Oasis the rock band, then the vast majority of the search results would be irrelevant to me. I could, of course, add additional keywords to refine the search query, but fundamentally the search engine doesn’t understand the meaning of “Oasis”, nor what it means in my individual context, which will always limit the quality and relevance of the search results.

In the next evolution of the web, web 3.0 (described in my blog Evolutionary Road), semantic technology will mature, enabling the meaning and context of information (such as documents, web pages and blogs) to be truly understood by rendering an insight into the relationships between words, and the ambiguities that words and phrases can sometimes present. This will facilitate a web of connected data that has been semantically enriched with sufficient metadata to enable machines to interpret it, permitting them to find, share and integrate information more easily and automatically. For example, a semantic search might be “I need to find somewhere for lunch for myself and my 3 year old son, preferably not too noisy, I’ve got a budget of £30 and need to be back home by 3pm, summarise my options?”

Semantic technology represents a complete step change in terms of how we currently search and discover information on the web. It is envisaged that every user will have a unique web profile, tailored based on their browsing experience, interaction on social collaboration and networking sites, with different weightings given to information that is more of interest to them, etc. This means that different people will get different search results, even though they might search for exactly the same thing.

Semantic technology will play an increasing role in facilitating the discovery of information, connecting it with the people that need it, often before they even know that they need it, thereby giving it greater value. There will be a much greater focus on the concept of semantonomics (semantic economics), the art of deriving value from information.

The underlying technologies of the semantic web, such as RDF (Resource Description Framework – the data model), OWL (Web Ontology Language – the relationships between terms), and SPARQL (SPARQL Protocol and RDF Query Language – the query language), make the semantic search example above possible by allowing information to be read across the web by machines. However, there are a number of obstacles to be overcome before the semantic web reaches a tipping point where it can go mainstream. For example, there is a lot of work to be done to enrich data as RDF and to create detailed ontologies and rules around the data. Even within a single organisation, let alone the wider web (with over 30 billion web pages), this can be a considerable task.
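The core idea behind RDF is simple: facts are stored as subject–predicate–object triples, and queries are patterns matched against those triples. The toy sketch below models this in plain Python (a real implementation would use an RDF library and SPARQL; the entities and facts are invented to echo the “Oasis” ambiguity example):

```python
# A toy illustration of the RDF model: facts stored as
# (subject, predicate, object) triples, queried by pattern matching.
# Distinct entities get distinct identifiers, so the rock band and the
# standards body are no longer conflated by a shared keyword.

triples = [
    ("Oasis",      "type",       "RockBand"),
    ("Oasis",      "formedIn",   "Manchester"),
    ("OASIS",      "type",       "StandardsBody"),
    ("Wonderwall", "recordedBy", "Oasis"),
]

def match(pattern, store):
    """Return triples matching a pattern; None acts as a wildcard,
    playing the role of a variable in a SPARQL triple pattern."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which subjects are rock bands?" -- disambiguating the two "Oasis" entities
print(match((None, "type", "RockBand"), triples))
# [('Oasis', 'type', 'RockBand')]
```

The equivalent SPARQL would be a SELECT query with the triple pattern `?s a :RockBand`; the point is that queries operate on meaning (typed entities and relationships) rather than on keyword co-occurrence.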

Nevertheless, inroads have been made through the use of text mining/analytics software, which can automatically parse, say, a document, identify key concepts, context, meaning and entities (people, places, events), and categorise these according to a taxonomy, enriching the document with intelligent, semantic metadata. This represents a transformational step in discovering the business value in “unstructured” information, connecting it with the people that need it.
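The categorisation step can be sketched in miniature. Production text-analytics tools use linguistic parsing and statistical models; the tiny term-matching classifier below, with invented taxonomy categories and term lists, just shows the basic shape of mapping document content onto a taxonomy:

```python
# A much-simplified sketch of taxonomy-based auto-classification:
# scan a document for known terms and tag it with the matching
# taxonomy categories. Categories and term lists are invented
# for illustration.

taxonomy = {
    "Finance": {"invoice", "budget", "audit"},
    "Legal":   {"contract", "litigation", "compliance"},
}

def classify(text, taxonomy):
    """Return the sorted taxonomy categories whose terms appear in the text."""
    words = set(text.lower().split())
    return sorted(cat for cat, terms in taxonomy.items() if words & terms)

doc = "The audit flagged a compliance gap in the supplier contract"
print(classify(doc, taxonomy))  # ['Finance', 'Legal']
```

The resulting category tags become semantic metadata attached to the document, which is what makes later discovery (faceted search, dynamic taxonomies) possible over previously unstructured content.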

Visualise and explore

An essential aspect of information discovery is to be able to visualise and explore the information in a manner that brings it to life and maximises its value. Significant advances are being made in this field all the time, three examples of which are:

  • Microsoft Pivot (www.microsoft.com/silverlight/pivotviewer) is an innovative, highly visual user interface (powered by Silverlight) to allow you to explore and arrange large collections of information, discovering patterns and relationships between information that would otherwise be difficult to spot through standard browsing techniques. An excellent demonstration of Pivot can be viewed at www.ted.com/talks/lang/eng/gary_flake_is_pivot_a_turning_point_for_web_exploration.html.
  • Google Squared (www.google.com/squared) is the first significant effort by Google to understand and extract information from across the web about a particular term/phrase, teasing out structure from unstructured data, looking at the semantics and relationships between information, and presenting a summary of what it has discovered about the selected term/phrase (currently in a table-like format). It still has quite a way to go in order to be genuinely useful; however, it is an interesting start;
  • Concept Searching (www.conceptsearching.com) provides functionality to automatically identify and extract concepts from content, intelligently classifying the content and dynamically building a taxonomy over it. Although primarily focused on SharePoint, it is also feasible to point it at documents in a file system (that perhaps won’t be migrated into an ECM), automatically classify the documents and dynamically build a taxonomy on top of them, greatly facilitating much more effective information discovery. Active Navigation (www.activenav.com) provides a similar tool.
Posted by: mcgratha | January 2, 2011

The Collaborative Office

Continuing the theme of the Future of ECM … trend #6 …

Rapid uptake of social media

Social Media is not a fad, it represents a fundamental shift in the way that we communicate. This is evident if you consider that as of December 2010, 7 out of the top 20 most visited web sites in the UK were social networking or web 2.0 sites (source www.alexa.com). This same pattern, the rise of social networks, is repeated internationally in countries where broadband is widely available.

The Rise of Social Media

Value as a collaboration tool

As can be seen from the diagram above, there has been a massive uptake of social networking and collaboration in the consumer space. This momentum is now fuelling an increasing trend to apply social networking practices and tools in the enterprise, as they are proving to be a really effective way to facilitate smarter, more agile and rapid collaboration and knowledge sharing with colleagues horizontally across the organisation, with no hierarchy or geographical barriers and little need for prior relationships to already be in place.

Social collaboration tools provide a really quick and easy means to:

  • Put out requests for information that might otherwise be difficult and/or time consuming to ascertain;
  • Explore topics/concepts/opinions, getting the insight and experiences of many people across the organisation, often avoiding the same problems being solved repeatedly by multiple groups;
  • Find people with common skills and interests across the organisation with whom you might never have had the opportunity to ‘meet’ otherwise (or even know existed);
  • Keep up to date in a non-invasive manner with what is happening in your areas of interest across the organisation, as it is perfectly feasible to be a passive on-looker, contributing whenever you want (and have time).

With the proliferation of high-speed broadband in recent years, there has been a marked increase in the number of organisations that offer their staff more flexible working arrangements, enabling them to spend some of their time working from home instead of commuting to the office. Whilst this is generally seen as a positive step, it does mean that it can now often be more difficult to catch up with colleagues from both a personal and business perspective, and those accidental conversations, when someone sparks an idea as you meet them at the coffee station, are becoming rarer. However, social collaboration tools are helping to counteract this by facilitating a virtual “coffee station”, with the added advantage that there are hundreds of people in that coffee station listening to you who can spark ideas.

Traditional collaboration tools focus on information worker efficiency. Social collaboration tools go further, focusing on innovation and change, and promoting a culture of knowledge sharing across the organisation. Lew Platt, former CEO of Hewlett-Packard once said “If HP knew what HP knows, we would be three times as profitable“. Social collaboration tools provide a means to tap into knowledge pools and leverage more of the vast amount of knowledge that most organisations have, too often locked away amongst its staff. Such tools are enabling us to move towards a truly collaborative office.

Content, content everywhere, what are we going to do with it?

However, the growing use of social media as an internal team communication and collaboration tool raises an interesting challenge – what do we do with all the new and different types of content that are being generated? We’re producing more content than ever before and, where there is identified business value (for example, to provide evidence of the chain of collaboration that led to a business decision), it will be necessary to manage, secure, distribute and store/archive much of the content that is generated from social tools.

However, much of this content is typically created in a multitude of social tools that are often outside the remit of the corporate ECM. For example, Generation Y (born from the late 1970s to the early 2000s) use email as a communication mechanism far less than older generations, preferring social media instead. This younger generation coming into the workplace will demand tools that reflect the way they are used to collaborating on the web – and this is less and less by email, and more with social collaboration and networking tools.

As such, many organisations have a problem in that a growing proportion of their content, which used to be managed, secured and made easily accessible within ECM, is “disappearing”, held in multiple different applications and silos outside of ECM, using different interfaces and different access controls.

This problem is further exacerbated where externally hosted social media tools are used by employees within the organisation (perhaps out of necessity, because the organisation hasn’t provided its employees with the tools that they need) to share information and collaborate on ideas. It is not just traditional “documents” that are being held in external social networking sites, but also the conversations and discussions between colleagues collaborating on different topics, which can represent very valuable information held outside of the organisation. There is an inherent danger in having lots of business information, much of it confidential and valuable, held in external social networking sites; it can be hacked, or simply accessed by ex-employees who have gone to a competitor. In addition, as an organisation, you don’t own the content that is posted to these external sites, and so it can be taken away at any time.

The momentum behind social media is too big for ECM vendors to ignore. As such, a key trend that is emerging is that ECM vendors are significantly enhancing their products (potentially leading to further acquisitions) to incorporate web 2.0 and social media tools, embracing all of this new business related information that is being generated from social media and drawing it into ECM and securing it.

Therefore, organisations that build a social media strategy leveraging social-enabled ECM solutions can expect much of this information to flow back under ECM control again. However, care will need to be taken when implementing policies regarding what social content to manage and archive into the ECM: if all social “conversations” are stored in perpetuity, then this is likely to change user behaviour, perhaps inhibiting the more natural, free-flowing conversations and collaboration contributions between users. It will be necessary to get the balance right.

There will, of course, always be business information held outside of ECM, but we are likely to see this information becoming much more accessible from within ECM in the future. Gartner predicts that by 2016, social technologies will be integrated with most business applications.

Connecting the dots

As more social collaborative content is brought under the remit of ECM or is at least accessible to ECM, we can expect the use of Social Analytics to really start kicking in over the coming years. Social Analytical tools mine and analyse the social content that is being created within an organisation, making intelligent connections between people and content, uncovering the patterns of interactions within a social network and driving value from them. For example, based on dynamically analysing social content and interactions (i.e. not drawing on information that people explicitly say about themselves, in say, their profiles), it could tell you who has got skills or knowledge in a certain topic and who would be a good point of contact if you needed information on a specific topic.
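The core mechanic of inferring expertise from activity, rather than from self-declared profiles, can be sketched very simply. The names, messages and topic below are invented for illustration; real social analytics tools apply far richer linguistic and network analysis:

```python
# A minimal sketch of the social-analytics idea: infer who knows about
# a topic from what they actually post, rather than from what their
# profile says. All authors and messages are invented for illustration.

from collections import Counter

messages = [
    ("alice", "Anyone stuck on SPARQL query optimisation? Happy to help"),
    ("bob",   "Uploading the Q3 budget spreadsheet"),
    ("alice", "Wrote up our SPARQL federation findings on the wiki"),
    ("carol", "SPARQL endpoints can be slow; cache aggressively"),
]

def experts_on(topic, messages):
    """Rank people by how often they discuss a given topic."""
    counts = Counter(author for author, text in messages
                     if topic.lower() in text.lower())
    return counts.most_common()

print(experts_on("sparql", messages))  # [('alice', 2), ('carol', 1)]
```

Even this crude frequency count surfaces a ranked list of candidate contacts for a topic, which is exactly the kind of dynamically derived connection between people and content that the paragraph above describes.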

With the ability to find people (with the right skills and knowledge) just as easily as information, it is likely that collaborative teams will be increasingly made up of people with “weak links” between them, but who have been drawn together into a virtual team to do a job. This is analogous to taking the Service Orientated Architecture (SOA) approach for building applications and applying it to people/resources, “Service Orientated Resources”.

Social Analytical tools can visualise a social network, showing the numbers of connections between participants, the strength of connections, and in some cases, the volume of interactions (such as e-mail, and phone). In this context, this can also give insight into how work is actually done across the organisation as compared to the traditional organisational hierarchy around the division of labour.

A good example of where social analytics has already been used is BioMedExperts (www.biomedexperts.com), a free, online social networking community that brings biomedical scientists and researchers together and allows them to collaborate online. Since its launch in April 2008, it has built over 330,000 registered users from over 3,500 institutions in more than 190 countries, becoming the world’s fastest growing scientific social network. It visualises the network of professional relationships between 1.8 million researchers, automatically generated from co-author information drawn from millions of publications published in over 20,000 journals, allowing scientists and researchers across organisations to share data and collaborate in far more innovative ways than has previously been possible. Patterns and relationships amongst researchers are automatically inferred, making it possible not only to identify the strengths of an author’s research, but also to rapidly understand the social contexts in which that research was developed.

Predictive information delivery

The usage of social analytical tools is expected to become much more widespread. However, when combined with advances in semantic technology (see my blog Semantonomics), we think this could go one step further by incorporating predictive information delivery. For example, let’s say that you are using a social collaboration tool and you are seeking information on a particular topic. Various people respond to your request and a series of social interactions ensue. By dynamically analysing the interactions, the semantics of what you are looking for can be understood and the software tool can:

  1. Seek out the information you need, not only from content available within your organisation but also from external sources across the web, and push the information to you;
  2. Make connections to people and information sources that perhaps you hadn’t previously considered or were aware of, offering different angles and perspectives to your original information request;
  3. Predict what you are likely to need next – if you are looking for X now, then you are likely to want Y later, so it seeks and collates information on Y now, to have it ready for you.
Posted by: mcgratha | January 2, 2011

Customer engagement

Continuing the theme of the Future of ECM … trend #5 …

Social CRM

And so it starts. Reminiscent of when we started putting an “e” in front of everything back in 1999 (such as eCommerce, eBusiness, eBay), it now looks like there is going to be an influx of “social” everything. Social CRM (Customer Relationship Management) kicks things off on the business front.

Social CRM is about utilising social media tools to listen and interact with customers, engaging with them at a more personal level across a multitude of online touch points. It introduces a new angle on top of traditional CRM, in that instead of just dealing with and managing customer data, transactions and money, Social CRM deals with conversations and relationships with the customer. Social CRM focuses on a strategy for customer engagement, not managing customer data.

Prior to buying a product or a service, customers are increasingly influenced by their online peers, with reviews, comments and observations made by others online about a company or its products heavily factoring into the purchasing decision. As such, it is very important that companies monitor and react to their brand perception online. However, Social CRM is not about trying to respond to the vast amount of information that is posted about them across every customer contact point (such as blogs, forums, tweets, Facebook, etc). That would not be scalable; they would quickly become overwhelmed and switch into “fire-fighting” mode, and it just wouldn’t make sense.

The real value will come from being able to extract common patterns across all social touch points, identifying shared problems and requirements, collaborating with customers to help give them what they want and improving the overall customer experience. This will hopefully lead them to leave positive comments about their experience online which others will read and be influenced by.

So how is this relevant to the future of ECM? Many of the same social tools as discussed in my blog The Collaborative Office will also be required to deliver Social CRM. It is just a different focus in how they are used. This focus isn’t just from an external context, but also from an internal perspective as Social CRM business activities are likely to lead to the generation of lots of new content, such as sharing ideas for innovation, generating brand awareness and visibility, capturing direct and indirect feedback on social networks and communities, creation of new content to aid sales and help customer service (for example, “how to” videos), polls/voting, surveys, etc.

It will accelerate the need for a much greater fusion between CRM, social computing and ECM (specifically the web content management side, see my blog Evolutionary Road).

Sentiment Analysis

The uptake of Social CRM, and the associated social digital marketing activities that accompany it, has created a need in the market to be able to monitor and track the effectiveness of customer engagement initiatives and brand perception. This need has been met through what is largely referred to as Sentiment Analysis, which analyses content across many sources on the Internet (sometimes referred to as listening posts – e.g. Twitter, Facebook) to derive the public sentiment (typically positive, neutral or negative) about a certain brand or topic, enabling companies to understand their brand perception and take early action. Sentiment Analysis tools can provide significant value-add around areas such as campaign launches, new product launches, brand monitoring and measuring, competitive intelligence, and crisis management. For example, sentiment analysis might be run against a specific product that a company sells, capturing information on how that product is perceived by consumers. Based on this information, the company might make changes to its web site to address common issues/concerns raised by consumers. Running the sentiment analysis again, say, a month later, will identify whether the issues/concerns have been addressed.
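At its simplest, the mechanic behind such tools is a lexicon-based scorer: count positive and negative words in each post and take the balance. Real tools handle negation, sarcasm, context and vastly larger lexicons; the word lists and posts below are invented purely to illustrate the idea:

```python
# A bare-bones lexicon-based sentiment scorer. Each post is classified
# by the balance of positive vs negative lexicon words it contains.
# Lexicons and example posts are invented for illustration.

POSITIVE = {"love", "great", "excellent", "fast"}
NEGATIVE = {"hate", "broken", "slow", "awful"}

def sentiment(post):
    """Return 'positive', 'negative' or 'neutral' for a single post."""
    words = set(post.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "love the new product and the great battery life",
    "delivery was slow and the box arrived broken",
    "it works",
]
print([sentiment(p) for p in posts])  # ['positive', 'negative', 'neutral']
```

Aggregating these per-post labels over time (e.g. the share of negative posts each week) gives the before/after comparison described above, showing whether changes made in response to consumer concerns have shifted overall sentiment.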

Some companies (such as Gatorade and Dell) have set up Social Media Command Centres, akin to a “war room”, to monitor conversations about their brands in real-time across social media channels, and quickly act upon the feedback received.

I expect that Sentiment Analysis tools will be increasingly used hand in hand with web content management and social computing tools within an overall ECM suite.

This blog explores the concept of being able to record a value in the balance sheet for both the information within an ECM and the knowledge amongst staff (especially with the increasing uptake of social computing tools used internally within organisations).

People are our greatest asset. This is something that you will hear most organisations say, and the value of this human capital is maximised by fostering an environment where people can share and re-use their knowledge and skills. This point was elegantly put by Andrew Carnegie of U.S. Steel in 1919 when he said “The only irreplaceable capital an organisation possesses is the knowledge and ability of its people. The productivity of that capital depends on how effectively people share their competence with those who can use it”. Indeed, the importance of human capital is often alluded to within the annual reports (in the ‘Risk Factors’ section) of many organisations, with statements along the lines of “Our performance is substantially dependent on the performance of our executive officers and key employees, the loss of whose services could significantly harm our business”.

Nevertheless, despite its acknowledged importance, the value of human capital is not captured as an asset in the balance sheet. Human capital is instead captured as a cost (salaries) and partly recognised in the valuation of goodwill. Traditionally, buildings, equipment, infrastructure, etc., have been viewed as a sufficient reflection of an organisation’s assets. However, given the increasing strategic investment that companies are making in information systems to foster and maximise the knowledge of their people, the omission of the value of human capital from the balance sheet is becoming more evident.

The problem is that the unique qualities that reflect the true value of human capital are intangible, and will vary greatly per industry type and per employee role, notwithstanding the union issues that would ensue from some groups being perceived as less valuable than others. Ultimately, though, there is no incentive to resolve this: the concept of human capital accounting (know-how, capabilities and knowledge) is not recognised by tax authorities and therefore has only academic utility.

You could, of course, contend that the ECM systems themselves also have an intrinsic value that ought to be captured in the balance sheet. They encapsulate a significant proportion of the documented knowledge within organisations, and their value as an asset could arguably be just as important as that of physical assets. However, the problem is again that their value is intangible.

It is hard to envisage how it would be feasible for companies to list human capital and information as assets in their balance sheet. However, if it were feasible then it would make the business case and ROI calculations for ECM very interesting. For example, if the introduction of an ECM system enables staff to become more efficient (such as less time spent trying to find information), then does that make them more valuable and does that mean that their value on the balance sheet goes up?

A similar concept was explored by George Parapadakis in his blog, considering what it might mean if it were possible to measure the relative value of documents, in real-time, within an ECM.

Posted by: mcgratha | December 30, 2010

Evolutionary Road

Continuing the theme of the Future of ECM … trend #4 …

Web sites, powered by Web Content Management systems, have already evolved from the initial “static / brochure ware” web sites into more dynamic sites that personalise the information that is delivered to the end-user. However, this initial “web 1.0” evolution has predominantly been a one-way conversation, with information being pushed from the web site to the end-user.

The next “web 2.0” evolution that is well underway involves injecting social computing and collaboration functionality into web content management, introducing “conversational” web sites that offer a far more engaging and interactive media-rich experience for end-users. Depending on the nature of the site, there can be huge value in harnessing the collective intelligence/wisdom of end-users (crowdsourcing), allowing them to participate in the site rather than simply visiting it passively. This next evolution will take hold and peak over the next five years, and will form the basis from which to measure and attribute metrics around the value of the inter-relationships between people, process and content. An overlapping discussion on how organisations will leverage social computing to better engage with their customers is covered in my blog Customer Engagement.

Web Evolution

The evolution into “web 3.0” is still in its early stages, with a vision of opening up and semantically enabling a revamped, ubiquitous web where data, people and appliances are all inter-connected. An essential aspect of this vision is making data, distributed across the web or indeed within organisations, readable by machines (by adding semantics to the data that machines can interpret). In this manner, the web could conceptually be thought of as one “very large database” where all data is connected and accessible. The evolution here is towards an intelligent web, and the implications of this are huge. For example, governments collate huge volumes of data every day across many different agencies, and by seamlessly connecting this data, far greater intelligence can be inferred, enabling national security threats to be flagged and acted upon far earlier than was previously possible. Another example would be a merger or acquisition between two financial institutions, where it would not be necessary to go through a lengthy IT systems integration process before a common view of customer data was available; the semantically enabled customer data would connect and find itself. The concepts around the discovery of information using semantic technology are discussed in my blog Semantonomics.
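
The “one very large database” idea can be illustrated with a deliberately tiny sketch. This is pure Python, not real semantic-web tooling, and all identifiers and names are invented: two independently built data sets, expressed as subject–predicate–object triples, are joined without any up-front schema integration, purely through shared machine-readable facts.

```python
# Facts from two independent sources, as subject-predicate-object triples.
# (All identifiers and names here are invented for illustration.)
bank_a = [
    ("cust:1001", "hasName", "J. Smith"),
    ("cust:1001", "hasAccount", "acct:555"),
]
bank_b = [
    ("client:77", "hasName", "J. Smith"),
    ("client:77", "hasMortgage", "loan:9"),
]

def query(triples, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Merge the two datasets and link records that share a name -- a crude
# stand-in for the semantic matching (e.g. owl:sameAs assertions) that
# would connect real data.
merged = bank_a + bank_b
names = {}
for subj, _, name in query(merged, p="hasName"):
    names.setdefault(name, []).append(subj)
same_person = {n: ids for n, ids in names.items() if len(ids) > 1}
print(same_person)  # {'J. Smith': ['cust:1001', 'client:77']}
```

Once the two customer records are linked, a single pattern query over the merged triples answers questions neither bank's system could answer alone – the essence of the merger scenario above.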

Web 4.0 will firmly build on its predecessor, introducing semi-cognitive applications that take full advantage of a semantic web, and which will probably eventually run on a web operating system (WebOS). They are likely to display some level of memory, judgment and reasoning, and the ability to act and communicate autonomously. As a simple example, consider an intelligent semantic web agent that could be called upon to, say, “find me all research on topic X across our internal data sources, cross-referencing against all available data on the web, especially by our competitors, summarise the results and keep me updated every day by 3pm with what you find”. Such an agent could run continuously in the background, learning from your information preferences, and automatically pushing the highly relevant and personalised information that it finds to you. This could include an amalgamation of different types of information, such as TV programs, music, and consumables like clothes and holidays.

As observed by Nova Spivack (an evangelist on the next phase of the web’s development, www.novaspivack.com), there would seem to be a “back-end / front-end” pattern in how the web is evolving:

  • Web 1.0 primarily focused on getting the back-end infrastructure in place to support and drive the development of the Web, with web 2.0 primarily focusing on engaging user participation through front-end applications and user interfaces;
  • This “back-end / front-end” pattern is likely to repeat itself again, with web 3.0 focusing on the back-end activities to open up and build a semantic web that can be readable by machines, with web 4.0 capitalising on the new semantic web at the front-end through the next level of powerful, innovative and potentially cognitive applications and user interfaces.

Posted by: mcgratha | December 22, 2010

The spawning of a new construction boom

Continuing the theme of the Future of ECM … trend #3 …

The majority of the development and configuration work that is done when implementing ECM solutions is traditionally done with a specific ECM product in mind. With the introduction of the Content Management Interoperability Services (CMIS) standard, I believe that this status quo will be completely turned on its head within five years.

Although there will always be a need to have people who are specialised in the configuration of a specific ECM product and who know that product really well, I believe that the biggest proportion of development work going forwards will involve work that is independent of the underlying ECM product (where the ECM is a “black-box”). There is a whole new construction boom on the horizon that involves the assembly of ECM solutions, largely broken down into four areas:

  • ECM “Apps” – Analogous to iPhone apps, these represent a library of generic, pre-built components, each providing a specific piece of content management functionality that will work with any CMIS-enabled ECM product. They will enable large aspects of an overall ECM solution to be assembled from building blocks, cherry-picked to either extend/improve on existing functionality provided by the ECM product or indeed provide new functionality. Some of the ECM and social computing vendors have already made advances in this area, notably Alfresco Forge (http://forge.alfresco.com) and Jive, which intends to launch a Jive Apps Market in Q1 2011. Going forwards, I believe that a general “Apps” market for ECM will evolve, driven by developer communities and independent of individual products;
  • Accelerators – Perhaps as a specific form of ECM “App”, I anticipate the development of accelerators/adapters for enterprise line of business systems in order to enable them to take advantage of the organisation’s underlying CMIS-compliant ECM system – see my blog Making the “E” in ECM actually mean something. For example, the accelerators could provide the line of business application with records management capability by integrating the business application with the ECM to provide either in-place records management of data/documents (i.e. leaving them within the business application but relinquishing control of their future life-cycle to the ECM system) or moving/archiving them for storage within the ECM system. Another example might be to enable information stored within CMIS-compliant ECM repositories (such as legal contracts for a customer) to be accessible from within the business application without leaving its user interface;
  • User Interface – In the majority of cases, the user interface application that comes out-of-the-box with the specific ECM product is the user interface (perhaps skinned to give some personalised corporate look & feel) that organisations use to access their content. Looking forwards, I anticipate the introduction of a range of more compelling, sophisticated and highly intuitive interfaces, developed by third parties to work with any CMIS-compliant ECM – see my blog The seduction of the species. This will enable organisations to capitalise on user interface applications that they feel are best suited to their organisation, rather than being constrained to the user interface application that comes with their ECM system. There are some third parties that have started to address this opportunity such as Generis (www.generiscorp.com);
  • Composite Content Applications – There will be a surge in the number of vertically or horizontally focused applications that deal with a specific business process requirement (Gartner calls these Composite Content Applications), developed to run completely independently of the underlying ECM. The development of such applications could be simplified by leveraging the OpenCMIS client libraries (http://incubator.apache.org/chemistry/opencmis.html) in combination with the ECM Apps, Accelerators and User Interfaces (as discussed in the points above), to push/pull information to/from a variety of CMIS-enabled repositories, reflecting different permutations of use cases.
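
To give a flavour of why CMIS makes repository-independent development feasible, the sketch below parses a simplified CMIS AtomPub entry of the kind a compliant repository returns for a document. The namespaces are the real CMIS 1.0 ones, but the entry is hand-trimmed (a real entry wraps the properties in a cmisra:object element and carries many more properties) and the document values are invented.

```python
import xml.etree.ElementTree as ET

# Hand-trimmed illustration of a CMIS AtomPub document entry. The
# namespaces are real CMIS 1.0 namespaces; the values are invented.
ENTRY = """\
<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:cmis="http://docs.oasis-open.org/ns/cmis/core/200908/">
  <title>contract-acme.pdf</title>
  <cmis:properties>
    <cmis:propertyId propertyDefinitionId="cmis:objectId">
      <cmis:value>workspace://SpacesStore/abc-123</cmis:value>
    </cmis:propertyId>
    <cmis:propertyString propertyDefinitionId="cmis:name">
      <cmis:value>contract-acme.pdf</cmis:value>
    </cmis:propertyString>
  </cmis:properties>
</entry>
"""

NS = {"atom": "http://www.w3.org/2005/Atom",
      "cmis": "http://docs.oasis-open.org/ns/cmis/core/200908/"}

def cmis_properties(entry_xml):
    """Extract propertyDefinitionId -> value pairs from a CMIS entry."""
    root = ET.fromstring(entry_xml)
    props = {}
    for prop in root.find("cmis:properties", NS):
        value = prop.find("cmis:value", NS)
        props[prop.get("propertyDefinitionId")] = (
            value.text if value is not None else None)
    return props

props = cmis_properties(ENTRY)
print(props["cmis:objectId"])  # workspace://SpacesStore/abc-123
```

Because every CMIS-compliant repository exposes the same property model, code like this works unchanged regardless of which vendor's ECM produced the entry – which is precisely what makes the "black-box" assembly model above viable.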

Some of the key implications of these trends on how future ECM solutions will be implemented are:

  • The approach naturally lends itself for deployment into the cloud;
  • It will enable ECM solutions to be built and deployed far more quickly, by utilising pre-built components that can be assembled to provide a tighter fit to requirements out-of-the-box;
  • It will promote ECM at a wider strategic level across the organisation, as the ECM system will be leveraged by a greater number of line of business systems.

I would love to hear other people’s views of the impact that CMIS will have in the ECM marketplace.

Posted by: mcgratha | December 21, 2010

The seduction of the species

Continuing the theme of the Future of ECM … trend #2 …

Experience is everything

User participation is absolutely essential to the successful deployment of an ECM solution. Without it, most fail or only achieve moderate success. This might seem like an obvious point, but is one that is often not fully appreciated.

There are many approaches to drive and improve user participation, with those that involve a ‘carrot’ rather than a ‘stick’ usually achieving the best long term, sustainable success. Seeking inspiration for where ECM is going in the future, I’ve looked at the consumer world where there has been a massive and viral up-take of consumer devices (such as iPhone, iPad) and social networking applications. These devices and applications have inspired users and created a desire to participate and communicate.

To replicate this level of participation into the workplace requires a shift from providing users with a purely functional interface for ECM to providing them with an “experience”. It is important to seduce users, immersing them into an interface that is cool, sleek and sophisticated, creating an environment where users really want to participate with minimum effort required, and invoking a positive change in user behaviour around how they create, use and share information.

To this end, I believe that there will be a step change in user interfaces for ECM over the coming five years, creating a much more intuitive, rich and compelling experience for users. For example:

  • Rich Internet Application interfaces – The native web client applications that come with ECM systems will gradually evolve into more sophisticated Rich Internet Applications (RIAs) based on platforms such as Adobe Flex/Flash, Microsoft Silverlight, Curl, Asynchronous JavaScript and XML (AJAX) and HTML5 (when released). RIAs allow end-users to interact with a web application as they would with a desktop application, avoiding the sluggish page response times and incessant screen refreshes associated with traditional web applications. Google Docs is an example of an RIA, enabling the creation and editing of documents, presentations and spreadsheets online. The web applications available from the Google Chrome web store present another interesting example, as their user interface resembles an iPad/iPhone application (see https://chrome.google.com/webstore). This is reflective of a growing trend, as many companies are finding that their iPad applications offer their customers better usability than their websites, and are therefore changing their websites to leverage design principles from iPad applications. Until standards are agreed for RIAs (or until one RIA platform begins to dominate), ECM vendors are likely to align their user interfaces with one of the RIA platforms;
  • Mobility – Smartphones (such as RIM BlackBerry, Apple iPhone, Android phones) are used by an increasing percentage of people as their interface device of choice for conducting business. This is a trend that is echoed in the consumer space where, for example, 200 million people access Facebook via mobile devices, and mobile users are generally shown to be twice as active as non-mobile users. From a business perspective, there will be a greater demand to access business-related documents, stored within ECM systems, from smartphones whilst on the move and away from the traditional desktop. This trend has already started, with many of the leading ECM vendors in the marketplace releasing BlackBerry and iPhone applications that provide secure end-to-end communication between mobile devices and their ECM systems (for example, Open Text ‘Everywhere’);
  • Revolutionary interfaces – Sophisticated but highly intuitive interfaces will become increasingly available. For example:
    • Touch-screen – New user interface applications for ECM are likely to be heavily influenced by what has been learned from touch screen interfaces on mobile devices and tablets. For example, Microsoft Surface (www.microsoft.com/surface) is a revolutionary multi-touch computer with a horizontal user interface (looks like a coffee table) that responds to natural hand gestures and real-world objects;
    • 3D – User interfaces that enable information to be navigated and manipulated using intuitive gestures and movements, akin to the Hollywood movie ‘Minority Report’, are now technically feasible and have been showcased at http://singularityhub.com/2010/07/02. There are also parallels to be drawn from the gaming industry (for example, Nintendo Wii, Microsoft Kinect for Xbox).

Many of the next generation user interface applications for ECM will be developed by third parties, rather than directly by ECM vendors. This will further accelerate the introduction of new sophisticated but highly intuitive interfaces, seducing users in the workplace to maximise their participation in information sharing and re-use.
