A Portfolio Approach to Developing Workflows and Processes

One of the common pitfalls I see with process optimization projects is that they tend to focus on one specific process at a time.  This may be fine when you are just starting out, or working with informal processes, but as the number of complex processes grows it is important to step back and look at things from an overall portfolio perspective.  In many cases processes overlap or are interrelated.  I most often see this in finance processes because they are so common in all organizations.  Something like a Check Request process should be a standard, well defined process, but it is often part of a number of other process flows.  It is easy to ignore the fact that the same steps and activities are followed elsewhere, but that leads to a lot of extra work for the process designers and administrators, as well as non-standard activities for your process workers to follow.

To overcome this, it is important to consider the following points when analyzing and designing the process:

  • Is there a natural collection of steps or activities?
  • Are these steps also done to support another process?
  • Is a different group or department responsible for those steps?

While performing the process analysis and design, some activities may form a natural grouping.  It could be a set of steps that are referred to under a particular label and are likely to be assigned to or processed by a specific group of users.  For large, complex workflows, it may be a good idea to make that grouping a sub-process that can be referred to as a single unit.  It is important to talk to the stakeholders that perform those tasks and understand whether those same tasks or processes are also performed to support another process.  If they are, then it is better to design a standard sub-process that is called from the other processes than to build the specific steps into each process.
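As a toy illustration of the design principle (this is plain Python, not an actual SharePoint workflow definition, and all of the function names are hypothetical), the shared sub-process is defined once and invoked from each parent process rather than duplicating its steps:

```python
# Hypothetical sketch: a shared Check Request sub-process defined once
# and invoked from multiple parent workflows, instead of each workflow
# re-implementing the approve/log/process steps.

def check_request_subprocess(request):
    """Standard Check Request steps: approve, log, then process."""
    return [f"approve:{request}", f"log:{request}", f"process:{request}"]

def expense_reimbursement_workflow(request):
    # Parent-specific step, then the shared sub-process.
    return [f"validate-receipts:{request}"] + check_request_subprocess(request)

def vendor_payment_workflow(request):
    # A second parent process reusing the same sub-process.
    return [f"match-invoice:{request}"] + check_request_subprocess(request)
```

A change to the shared steps (say, adding a compliance check) then flows into every calling process automatically, which is exactly the maintenance benefit described above.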

The Check Request example I mentioned before is one that I have seen come up in more than one organization.  There is a set of common steps where requests have to be approved, logged, and then processed.  Standard compliance activities are another example of a common central process that may be leveraged by a number of other processes.  Sometimes these opportunities present themselves early, but other times you have to dig to identify these sub-processes.  In cases where multiple groups are involved, the main process owner or stakeholder may not fully understand the details of how every step is executed, so it is important to interview the actual process participants to understand what they are doing and what other processes may use those steps.  Within the Check Request example, it is unlikely that a process owner in Operations understands all of the various corporate activities that may generate a check request; they only understand that it is part of their one process.  By talking to the process workers in Finance, the other perspective can be considered.

By taking a Portfolio Approach in this case, you can potentially make real improvements that extend the process design and automation benefits not just to the one process, but to multiple processes across the entire organization.  Those processes will also get easier to expand and manage as they can leverage common sub-processes and existing functionality.

Keys to Long Term SharePoint Stability and Success

Recently I have been called into a few environments where the customers were having serious performance problems or had features that were no longer working.  It really drove home the point that Capacity Planning should really be Capacity Management, as Microsoft now refers to it in their Capacity Management and Sizing Overview guidance for SharePoint 2010.  These environments also tend to have other issues with patching and large unused content databases.

The Keys below will help establish long term success for your SharePoint environment.

Initial Design and Planning

The planning and design work that typically goes on prior to an implementation is based heavily on assumptions and the understanding of current requirements.  In any environment where an application like SharePoint takes off, those assumptions change quickly, the needs of the business evolve, and therefore all of those requirements change.  In many cases, though, the SharePoint farm topology is not changed and can no longer meet those needs.  With the current state of IT, many resources are stretched and do not have time to make major changes to the system, but in many cases a few proactive changes would remove some of the ongoing system support and troubleshooting effort.

Continued Monitoring

Every system needs regular monitoring.  The frequency and depth of the reviews depends on how complicated the implementation is, but below I have listed out some generic topics that can be reviewed.

Quarterly Review

  • Memory and CPU Utilization
  • Patches – Review new patches and install if appropriate

Semi-Annual Review

  • Review Content Databases – Number of Site Collections per Content DB and the size of each Content Database
  • Search Index Health – Number of items in the index, length of the crawls
  • Average and Peak Usage Stats – Review the average and peak user stats and add hardware if needed.
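To make the content database review concrete, here is a minimal sketch in Python of how such a check might be scripted.  The threshold values are placeholders chosen for illustration, not official SharePoint limits, and in a real farm the inputs would come from your administration tooling rather than a hard-coded list:

```python
# Illustrative sketch of a semi-annual content database review:
# flag databases that exceed example thresholds so they can be
# split or archived. Threshold values are placeholders only.

MAX_DB_SIZE_GB = 100          # example threshold, not an official limit
MAX_SITE_COLLECTIONS = 50     # example threshold, not an official limit

def review_content_dbs(dbs):
    """dbs: list of (name, size_gb, site_collection_count) tuples."""
    flagged = []
    for name, size_gb, sc_count in dbs:
        if size_gb > MAX_DB_SIZE_GB or sc_count > MAX_SITE_COLLECTIONS:
            flagged.append(name)
    return flagged
```

Running a simple report like this twice a year makes growth trends visible long before they turn into the performance problems described above.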

In addition, in some cases new features are enabled or leveraged months after the initial implementation.  If, for example, you are going to use SharePoint to host your BI solutions, additional capacity may be needed.  If the system was initially designed for a fairly basic Intranet and BI capabilities are added later, the system may not be able to keep up.


Patch management also contributes to keeping your SharePoint installation stable and high performing over time.  Installing Service Packs or the bi-monthly Cumulative Updates can be difficult in some environments where maintenance windows are tight, but these patches will also help keep services running smoothly and bugs at bay.  I worked in one environment where at least three major issues were all resolved with previously released patches.  Unfortunately a lot of time was spent troubleshooting needlessly.

Prune the Hedges

Most information stores get bloated over time; SharePoint is not immune to this.  IT groups have been fighting this for years with shared drives and mail servers.  It is important to have good retention policies in place to make sure you are keeping the right content, but also getting rid of the stale content.  At the very least you can implement an archiving solution that moves the content to cheaper storage while keeping it accessible.


Following these recommendations will greatly increase your chances for maintaining a highly capable, well performing environment.

White Paper – MS Office Compatibility with SharePoint

For those that have been involved with SharePoint for a while, the document title “Good, Better, Best” should ring some bells.  There was a document with that title published with the last few SharePoint versions that compared the various MS Office suites going back to Office 2000 and detailed how they interacted with SharePoint features.  An updated version has been published in the form of a 37 page white paper called Business Productivity at Its Best.

This document is important because the desktop hardware and software refresh cycles at most companies are much different than those for servers.  I have worked with a few clients in the past few months that still have a significant number of users on Office 2003, for example.  When providing them SharePoint 2010 guidance, the information needs to match what they can expect in their environment with the products they have.  This document provides excellent information that may help justify accelerating the upgrade to Office 2010 or, if not, will help illustrate what features will not be available (e.g., co-authoring).  The white paper also includes information on the mobile and offline capabilities, which are becoming much more important in deployment projects.

I highly recommend this document to anyone planning or engaged in a deployment or upgrade to SharePoint 2010. 

Keys to Planning For SharePoint Search

Of the core planning areas in SharePoint, Search seems to get very little attention.  I find this surprising since one of the most common complaints from SharePoint implementations (or any Enterprise Information System) is that users cannot find “anything” in the system.  Most of the environments I go into have Search configured in only a very basic sense; a search site was created and crawling is scheduled.  To really get value out of the search features, and to greatly enhance the end user experience, additional planning and configuration is required.  Like many of the high level planning topics, the goals and plan should be reviewed at least once a year to ensure that the current goals and expectations are aligned.

Evaluate the Content

Not all content is equal.  The format of the content (File Type, Lists, Web Sites), the freshness of the content, and the purpose of the content can vary dramatically and may need to be handled differently.  Evaluate the types of content in the system currently along with any new types of content expected to be added in the short term.

Three examples of the different types of content could include:  Enterprise Content Management, Help Desk, and Project content.

Enterprise Content Management – Large ECM data stores primarily include structured data used throughout the organization.  Examples could include things like Sales Orders, Purchase Orders, or anything that is used by multiple departments throughout the organization.  Search tends to be very important to large data stores like this since the content is not easy to browse.

Help Desk Content – In most medium to large organizations there is content used to support the Help Desk functions spread through multiple sites, and perhaps multiple systems.  Some content might be document based, some might be in SharePoint lists for things like FAQs, or other formats.  While a regular keyword search may bring back relevant content, this is a great example of a case where a custom Search Scope could be created to narrow down the content locations or types that are queried against.

Project Data – For organizations that manage a large number of projects through SharePoint, it is possible to identify a unique type of content that is stored throughout all of the sites, and create Search Scopes that can return that specific type of project content.

Determine Search Goals

The next step is to interview the stakeholders including a number of the end users to determine what their expectations are as well as how they think they would approach search with the system.

Identify the Content They Search For – Try to determine the content they most frequently work with as well as the types of content they most frequently have to search for.  Typically users know where their frequently accessed content is located.  Unless there are thousands of items, they may not need search for that content.  For content in other areas, they are likely less familiar with the structure and rely on search.

Identify Search Types – Try to identify the types of searches they do (standard, or advanced with keywords), and how likely they would be to match a search against a specific search scope.

Identify Level of Patience – While it would be great if you only ever received a single search item in the result set, and it was the perfect match, that is not realistic.  Try to have the users express what their expectations are and what their true level of patience is.  Will they ever look at the second page of results?  Will they quickly just do a different search?  Will they use any of the search refinements?

Identify Location of the Search – In addition, try to have them identify WHERE they initiate a search from.  Do they navigate to the general area of the sites, do they try to search directly from the front page, or do they navigate to different search centers depending on what they are looking for?  Depending on the results, the end users may even be able to take advantage of the advanced configuration without training.

Develop a Search Design Plan

The next step is to develop your Search Design Plan.  It is important to review all of the different types of content along with the stakeholder’s goals and expectations to ensure that they are addressed.

Define Content Sources – Start the design by identifying all of the content sources you want to include in the index.

Identify Search Scopes – Next, try to identify Search Scopes that represent the logical boundaries for your content.  This, more than anything, can lead to the highest level of relevancy because you are limiting the areas of the index that are being searched against.
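Conceptually, a scope is just a rule that restricts which indexed items a query runs against.  The idea can be sketched generically in Python (this models the concept only; SharePoint's actual scope rules are configured, not coded, and all names here are hypothetical):

```python
# Conceptual sketch: a search scope as a predicate that limits which
# indexed items a query is evaluated against, boosting relevancy by
# excluding unrelated areas of the index.

def make_scope(path_prefix):
    """Build a scope rule matching items under a given path."""
    return lambda item: item["path"].startswith(path_prefix)

def scoped_search(index, scope, term):
    """Return paths of items inside the scope whose title matches."""
    return [item["path"] for item in index
            if scope(item) and term in item["title"].lower()]

index = [
    {"path": "/helpdesk/faq1", "title": "VPN FAQ"},
    {"path": "/projects/p1",   "title": "VPN rollout plan"},
]
helpdesk_scope = make_scope("/helpdesk/")
```

A query for "vpn" against the Help Desk scope returns only the FAQ, even though project content also matches the keyword, which is the relevancy gain described above.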

Keywords and Best Bets – For the common search terms or critical company specific terms, it is important to try and identify the Keywords, Best Bets and Synonyms to help surface the content.
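The mechanism behind Keywords and Best Bets can be sketched as a small lookup layer in Python (an illustration of the concept, not SharePoint's implementation; the keyword data is invented for the example):

```python
# Illustrative sketch of Keywords/Best Bets: curated results for key
# terms are surfaced ahead of organic results, and synonyms resolve
# to the same keyword entry.

KEYWORDS = {
    "benefits": {"best_bets": ["/hr/benefits-overview"],
                 "synonyms": ["401k", "insurance"]},
}

def resolve_keyword(term):
    """Return curated Best Bets for a term or any of its synonyms."""
    term = term.lower()
    for keyword, entry in KEYWORDS.items():
        if term == keyword or term in entry["synonyms"]:
            return entry["best_bets"]
    return []

def search_with_best_bets(term, organic_results):
    # Best Bets are pinned above whatever the index returns.
    return resolve_keyword(term) + organic_results
```

The value is that company-specific vocabulary ("401k", internal project names) reliably surfaces the authoritative page even when the ranking engine would not.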

Custom Search or Result Pages – Try to identify any areas where a custom search or results page would be beneficial.  Both the search and results pages can be fully customized.  There are a number of situations where simple modifications can provide a big payback.

Enterprise Search

With proper planning these features can be extended to provide a powerful Enterprise Search experience.  To make that transition, the planning needs to extend beyond just your SharePoint content to content stored in other systems, file shares, web sites, and email systems.  As the scope of search increases, so does the number of items in the index, and therefore the amount of planning and configuration that needs to happen to deliver relevant results.

This is also where the topic of Federated Search typically comes up, since it is often valuable to provide results from other search systems in a separate result set instead of adding them to the local index and returning them as part of the main results.

Review Search Metrics and Reports

Both MOSS 2007 and SharePoint Server 2010 come equipped with Search metrics and reports that make it possible to analyze the current usage and effectiveness of search.  If you have not reviewed these before, there are bound to be some surprises, both with the number of searches executed and with what people are searching for.  The information can help you understand what people frequently search for, as well as which results they click through versus executing another search.  This information can be used to tune the search results for those keywords, and Best Bets can be configured for key content.  I have always relied on this information when reviewing end user needs for an existing environment.
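The kind of analysis these reports support can be sketched generically.  Assuming a hypothetical query log of (query, clicked) pairs rather than SharePoint's actual report format, the interesting outputs are the most frequent queries and the ones users abandoned without clicking a result:

```python
# Hypothetical sketch of mining a search query log: surface the most
# frequent queries and the ones abandoned without a click-through,
# which are prime candidates for Best Bets or relevancy tuning.

from collections import Counter

def analyze_query_log(log):
    """log: list of (query, clicked) tuples."""
    counts = Counter(query for query, _ in log)
    abandoned = {query for query, clicked in log if not clicked}
    return counts.most_common(3), abandoned
```

A frequent query that users routinely abandon is exactly the case where a curated Best Bet pays off.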


The keys to planning for SharePoint Search include Evaluating the Content, Determining the organization’s Search Goals, Developing a Plan, and Reviewing Search Reports.  Following through on these steps will greatly increase the likelihood that the system will deliver reliable and relevant content back to the end users.  Enhancing the search experience will have tremendous payback for any organization with mission critical content in the system and can be the critical point that makes the difference between a successful and an unsuccessful enterprise information system.

SharePoint 2010 Migration versus Upgrade

No doubt there are hundreds of companies currently reviewing or planning the move to SharePoint 2010 from previous versions.  The excitement and business interest around the technology is very encouraging.  Before picking an “upgrade” path though, I would encourage teams to compare the Migration paths in addition to the normal Upgrade paths. 

Migration Path

Taking a Migration path means you are going to build a new farm or environment and then move content to it.  There are many advantages to this approach; here are some to consider.

Restructure Information Architecture and Design – The migration path gives you the ability to learn from past mistakes or to make changes to better suit the current set of requirements.  This opportunity does not come up often, so if changes need to be made, this is a good time to enact those changes.  Changes may include restructuring the web application and site collection topology as well as Taxonomies.

Incremental Move – Since you are moving from one farm to another, it does not have to be completed all at once; you have the option of breaking the content down into smaller units that can be moved one at a time.  This is especially valuable in cases where custom applications might have been built that cannot be upgraded without rework.

Take Advantage of New Features – There are some situations where after an upgrade it may be difficult or at least require more work in order to take advantage of all of the new features.  One great example of this is the new Claims Based Authentication model.  In the past though, I have seen other issues which were traced back to compatibility issues with site definitions.  Given the fundamental changes to the Publishing features in 2010, I expect there to be issues when people take an Upgrade path. 

The Downsides – There are costs to taking this approach.  It may take more time than an in-place upgrade and it definitely takes more planning.  There may also be software costs for migration tools which can help automate a migration.

Upgrade Path

The upgrade path may mean you are using the same hardware, or at the very least you are moving the Content Databases, which keeps the Application and Site Collection topology intact.  The main advantages include:

Quicker and Easier – It should not take as much planning, nor take as long to complete. 

The Downsides – All of your existing Site Topology, Security, and general organization issues are moved to the new platform.  The system may be difficult to administer and maintain.

Choosing the Correct Path

Both paths have their pros and cons and offer unique opportunities.  The right path is the path that helps your team meet its objectives within the constraints given.  Most IT leaders push for the regular Upgrade path because it is less complicated within the scope of the short term project, but they do not have the full picture of the opportunity cost or the long term impact.  It is the implementation team’s responsibility to educate the decision makers as much as possible so that the best decision can be reached.

Unlocking The Potential of SharePoint With Remote Access

I have been putting a lot of work lately into some general 2010 presentations, and one of the messages I’m really concentrating on is Connect anywhere, from any device, with any browser.  I think this is a critical ingredient in enabling SharePoint to be a great Unified Business Collaboration Platform.  It comes down to removing all of the unnecessary hurdles to accessing and working with the content in a secure manner.  While there are some technical changes that were made, such as better support for mobile devices and true standards support allowing cross-browser consistency, the real change for me is philosophical.  Until recently I have been somewhat uncomfortable opening up internal portals and content to the internet.  My biggest fear has always been network security.  Gateways like ISA, now Forefront Threat Management Gateway (TMG), are not new, but many IT groups have had difficulty implementing or managing them effectively, so I have not pushed for them in many of my implementations unless they were needed to fulfill a requirement.  With the Connect anywhere, from any device, with any browser mantra in mind, that will no longer be the case.  I will now advocate for secure access to be provided wherever possible without the requirement to connect via VPN or be on the local network.  This enhanced access will ease collaboration for remote workers and has the potential to speed up the collaboration process in addition to making it richer.

If your organization is not sure how to implement or configure one of the gateway applications, seek out experts in the form of consultants or community members that are able to help.  Your users will thank you for it!

Content Aggregation in SharePoint 2010

One of the areas I continue to see companies struggle with is sharing and reusing content.  Many will cram everything into a single site collection, or worse yet a single web, so that everything is in one spot and available.  A little over a year ago I wrote an article titled “The Importance of a Content Syndication and Aggregation Plan” where I covered a few of the methods used in the 2007 version.  There were a lot of limitations with the out of the box toolset, many relating to the site topology boundaries.  The 2010 version brings a few important improvements that I think will benefit most companies.

Capabilities in SharePoint 2010

Things are vastly improved in 2010.  All of the features from 2007 are still available, and some of them have been improved.  They include:

  • Content Query Web Part (CQWP) – with SharePoint Server
  • DataView Web Part (DVWP)
  • RSS
  • Content Deployment – with SharePoint Server
  • Content Types (promote common definition and reuse)
  • Custom Web Parts

Some of the new features include:

Calendar Overlays – One of the most common requests I have received in the past is to aggregate multiple calendars.  Whether it is for doing a roll-up of the teams or groups in a department or for pulling together dates across multiple projects, people need to be able to merge calendars.  With 2010 you now have the ability to aggregate up to ten calendars in a single view.  These calendars can be in any site collection, and can even come from Exchange.  This makes it easy, for example, to pull together a main company calendar with the entries being managed by different departments like HR and Marketing, or across divisions.

Check out Bjorn Furuknap’s blog for details.  He did a great walkthrough on how to View Multiple Calendars in SharePoint 2010 here.
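The merge behavior of an overlay can be sketched in a few lines of Python (a conceptual model only, mirroring the ten-calendar limit mentioned above; the event structure is invented for the example):

```python
# Illustrative sketch of a calendar overlay: merge events from up to
# ten source calendars into a single chronological view.

MAX_OVERLAYS = 10  # the ten-calendar limit noted above

def overlay_calendars(calendars):
    """calendars: list of event lists; each event is (iso_date, title)."""
    if len(calendars) > MAX_OVERLAYS:
        raise ValueError("Calendar overlays are limited to ten sources")
    merged = [event for cal in calendars for event in cal]
    return sorted(merged)  # ISO dates sort chronologically as strings
```

Each source calendar keeps its own entries and owners; the overlay is just a merged read-only view, which is why distributed ownership (HR managing one calendar, Marketing another) works so cleanly.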

Managed Metadata Service – As the name suggests, this is a centralized service that enables you to manage your terms and content types.  Previously Content Types were bound to a specific site collection and working with them in more than one site collection meant manually keeping them in sync.  Having content described consistently across an organization makes aggregating that content a whole lot easier.

Here is the documentation on TechNet for the Managed Metadata Service.

A Word of Caution

An important word of caution: just because a feature is supported does not mean it will meet your needs.  In previous versions some of the aggregation features that were available, like the ability to connect to another list in the site collection with a DVWP or CQWP, worked fine with a few data sources but did not scale well.  I have heard of cases where hundreds of data sources need to be aggregated into a central source.  Be sure to perform both functional and performance testing to ensure that a feature meets your needs, and if not, look to a custom solution that fine-tunes the process and perhaps implements a special caching mechanism.
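One shape such a caching mechanism could take is sketched below in Python (a generic illustration under invented names, not a SharePoint API): each source is fetched at most once per time-to-live window instead of being queried on every page load.

```python
# Hypothetical sketch of a caching layer for roll-ups over many data
# sources: results are fetched at most once per TTL window, so a page
# load does not hit hundreds of sources directly.

import time

class CachedAggregator:
    def __init__(self, fetchers, ttl_seconds=300):
        self.fetchers = fetchers   # source name -> callable returning items
        self.ttl = ttl_seconds
        self.cache = {}            # source name -> (timestamp, items)

    def aggregate(self, now=None):
        """Return items from all sources, refreshing stale entries only."""
        now = time.time() if now is None else now
        results = []
        for name, fetch in self.fetchers.items():
            cached = self.cache.get(name)
            if cached is None or now - cached[0] > self.ttl:
                self.cache[name] = (now, fetch())
            results.extend(self.cache[name][1])
        return results
```

The trade-off is staleness versus load: a five-minute TTL means a roll-up page can serve thousands of views while each underlying list is queried only a few times an hour.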

Think About Content During Topology Planning

As part of the site topology planning or review, it is important to think about the content that will be stored and where else it might need to be used.  Make sure that the team is familiar with the different aggregation or roll-up options and the pros and cons of each.
