SharePoint Gone Wild: When Governance Lacks Discoverability | Part 3

Part three of the SharePoint Gone Wild series

If you missed any previous part of this series, you can read them here.

In the series so far, we've talked about how accountability, quality, appropriateness and restrictions have all been key drivers for why we need to focus on governance for Microsoft SharePoint within our organizations.

To continue this series, I want to focus on another key area called Discoverability. What does this term mean? It is simply how easily users can find the content they need within SharePoint. Unfortunately, this also happens to be one of the biggest pain points within organizations today.

Now, you would think that with an enterprise search engine, the managed metadata service application, content types with site columns, and versioning, this wouldn't be so difficult. In reality, however, these features can often make finding content harder than searching a file system, because the search crawlers do such a good job of surfacing everything!

The biggest problem with search surfacing everything is that old data comes up in the search and, as discussed in my post on restrictions, unsecured data also bubbles up to the surface and creates noise for users who shouldn't see it.

Unfortunately, although managed metadata is great when applied to list items, it skews results when some list items carry metadata and others don't: tagged items appear when search filters are applied, while untagged items vanish. Unless you enforce managed metadata as a required field on list items, this imbalance will always occur. Don't think that your job stops at putting a red star next to the managed metadata field, either: even if you enforce required field entries, one user may add a single term while another adds ten, preserving the same imbalance.
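To make the imbalance concrete, here is a minimal sketch of the kind of audit a governance team could run over list items. The item dictionaries only mimic the shape of data you might pull back from SharePoint; the `Keywords` field name and the five-term threshold are assumptions for illustration, not SharePoint defaults.

```python
# Hypothetical audit: flag list items with no managed-metadata terms
# (they disappear from filtered searches) and items with excessive
# terms (they dominate them). Field name and threshold are assumed.

def audit_metadata(items, field="Keywords", max_terms=5):
    """Return (ids_with_no_terms, ids_with_too_many_terms)."""
    missing, overloaded = [], []
    for item in items:
        terms = item.get(field) or []
        if not terms:
            missing.append(item["Id"])
        elif len(terms) > max_terms:
            overloaded.append(item["Id"])
    return missing, overloaded

items = [
    {"Id": 1, "Keywords": ["finance"]},
    {"Id": 2, "Keywords": []},                              # untagged
    {"Id": 3, "Keywords": ["a", "b", "c", "d", "e", "f"]},  # over-tagged
]
print(audit_metadata(items))  # → ([2], [3])
```

A report like this gives content owners a concrete worklist rather than a vague instruction to "tag things better."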

Another common issue comes into play when the enterprise search within SharePoint is enabled and crawls all content without any parameters. It is essential that you "tune" the configuration with the use of scopes, keywords, and exclusions in order to make results more relevant.
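As a sketch of what "tuning" can look like at the query level, the snippet below assembles a SharePoint 2013-style search REST query that restricts results to one path and excludes a file type with the minus operator. The site URL, path, and file type are placeholders, and this is one illustrative approach rather than the full scope/keyword configuration story.

```python
from urllib.parse import quote

# Hedged sketch: build a scoped SharePoint search REST query.
# Site URL and path are placeholders; the Path and FileType
# managed properties are used to narrow and exclude results.

def build_search_url(site, text, path=None, exclude_type=None):
    query = text
    if path:
        query += f' Path:"{path}"'             # restrict to a site or library
    if exclude_type:
        query += f" -FileType:{exclude_type}"  # exclusion via minus operator
    return f"{site}/_api/search/query?querytext='{quote(query)}'"

url = build_search_url(
    "https://contoso.sharepoint.com",
    "budget report",
    path="https://contoso.sharepoint.com/sites/finance",
    exclude_type="aspx",
)
print(url)
```

The same narrowing can be baked into dedicated results pages for key use cases, so end users never have to type the operators themselves.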

As with any document repository, there are often duplicate documents in various locations, where the people accountable for content have placed it in the incorrect location in the information architecture. Although the search results page is smart enough to remove EXACT duplicates, if there are a few versions littered around the farm, this will produce duplicate results as well.
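The exact-duplicate case can be illustrated with a simple content-hash check, which is the same kind of byte-for-byte match the results page collapses. The paths and file contents below are made up; note that an older version with even one changed byte hashes differently, which is exactly why near-duplicates slip through.

```python
import hashlib

# Illustrative sketch: group documents by a SHA-256 hash of their
# bytes to find exact copies. Near-duplicates (older versions with
# slightly different content) hash differently and are NOT caught.

def find_duplicates(docs):
    """docs: mapping of path -> file bytes.
    Returns groups of paths whose contents are identical."""
    by_hash = {}
    for path, data in docs.items():
        digest = hashlib.sha256(data).hexdigest()
        by_hash.setdefault(digest, []).append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]

docs = {
    "/sites/hr/policy.docx": b"v2 content",
    "/sites/finance/policy.docx": b"v2 content",  # exact copy, caught
    "/sites/it/policy-old.docx": b"v1 content",   # older version, missed
}
print(find_duplicates(docs))
# → [['/sites/hr/policy.docx', '/sites/finance/policy.docx']]
```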

When I go out and work with customers on this issue, they are always after the silver bullet. Unfortunately, this just isn't an easy problem to solve. Here are some of the things that I've seen customers doing using native SharePoint functionality in order to improve discoverability:

  • Information architecture planning - As I've already stressed in this post - as well as others - planning the appropriate structure for site collections, sub sites, lists, and libraries is very important.
  • Managed metadata taxonomies - The best way to get up and running with managed metadata is to provide one taxonomy for users, and then educate them on its power for discoverability. The easiest way to demonstrate this is with the search refiners on the results page, which allow users to narrow results by term.
  • Tuning search - As discussed, tuning the search can involve various configurations. The best approach to this is identifying key search use cases and customizing results pages specifically for them with requisite scopes and targeted advanced search fields. People search is a good out-of-the-box example of this.

Michael Pisarek also discusses some great points on search governance on his blog, and I recommend checking out his post. If you are not already subscribed to SharePointPodShow.com, I'd highly recommend doing so, as they did a great podcast on search with Josh Noble that is well worth a listen.

More Stories By Jeremy Thake

Jeremy Thake is AvePoint's Chief Architect. Jeremy’s 10-plus years of experience in the software development industry, along with his expertise in Microsoft technologies, earned him the label of “expert” in the global SharePoint community. He was named a Microsoft SharePoint MVP in 2009, and continues to work directly with enterprise customers and AvePoint’s research & development team to develop solutions that will set the standard for the next generation of collaboration platforms, including Microsoft SharePoint 2013.

Jeremy was one of only eight Microsoft MVPs from Australia, where he lived for seven years, who was recognized by the SharePoint Product Team in 2010 for his extensive contributions to the global SharePoint community. He also played an instrumental role in organizing the Perth SharePoint User Group during his time living there.
