Monday, March 20, 2017

The case for digital reinvention 4



As executives assess the scope of their investments, they should ask themselves if they have taken only a few steps forward in a given dimension—by digitizing their existing customer touchpoints, say. Others might find that they have acted more significantly by digitizing nearly all of their business processes and introducing new ones, where needed, to connect suppliers and users.
To that end, it may be useful to take a closer look at Exhibit 6, which comprises six smaller charts. The last of them totals up actions companies take in each dimension of digitization. Here we can see that the most assertive players will be able to restore more than 11 percent of the 12 percent loss in projected revenue growth, as well as 7.3 percent of the 10.4 percent reduction in profit growth. Such results will require action across all dimensions, not just one or two—a tall order for any management team, even those at today’s digital leaders.

Looking at the digital winners

To understand what today’s leaders are doing, we identified the companies in our survey that achieved top-quartile rankings in each of three measures: revenue growth, EBIT growth, and return on digital investment.

We found that more than twice as many leading companies closely tie their digital and corporate strategies as don’t. What’s more, winners tend to respond to digitization by changing their corporate strategies significantly. This makes intuitive sense: many digital disruptions require fundamental changes to business models. Further, 49 percent of leading companies invest in digital more than their counterparts do, compared with only 5 percent of the laggards, 90 percent of which invest less than their counterparts. It’s unclear which way the causation runs, but heavy digital investment does appear to be a differentiator.

Leading companies not only invested more but also did so across all of the dimensions we studied. In other words, winners exceed laggards in both the magnitude and the scope of their digital investments (Exhibit 7). This is a critical element of success, given the different rates at which these dimensions are digitizing and their varying effect on economic performance. 

Strengths in organizational culture underpin these bolder actions. Winners were less likely to be hindered by siloed mind-sets and behavior or by a fragmented view of their customers. A strong organizational culture is important for several reasons: it enhances the ability to perceive digital threats and opportunities, bolsters the scope of actions companies can take in response to digitization, and supports the coordinated execution of those actions across functions, departments, and business units.

Bold strategies win

So we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more, and more broadly and boldly, than other companies do. Then we tested two paths to growth as industries reach full digitization.

The first path emphasizes strategies that change a business’s scope, including the kind of pure-play disruptions generated by the hyperscale businesses discussed earlier. As Exhibit 8 shows, a great strategy can by itself retrieve all of the revenue growth lost, on average, to full digitization—at least in the aggregate industry view. Combining this kind of superior strategy with median performance in the nonstrategy dimensions of McKinsey’s digital-quotient framework—including agile operations, organization, culture, and talent—yields total projected growth of 4.3 percent in annual revenues. (For more about how we arrived at these conclusions, see the sidebar “About the research.”)

Most executives would fancy the kind of ecosystem play that Alibaba, Amazon, Google, and Tencent have made on their respective platforms. Yet many recognize that few companies can mount disruptive strategies, at least at the ecosystem level. With that in mind, we tested a second path to revenue growth (Exhibit 9).

In the quest for coherent responses to a digitizing world, companies must assess how far digitization has progressed along multiple dimensions in their industries and the impact that this evolution is having—and will have—on economic performance. And they must act on each of these dimensions with bold, tightly integrated strategies. Only then will their investments match the context in which they compete.

The case for digital reinvention 3

Instead, the survey indicates that distribution channels and marketing are the primary focus of digital strategies (and thus investments) at 49 percent of companies. That focus is sensible, given the extraordinary impact digitization has already had on customer interactions and the power of digital tools to target marketing investments precisely. By now, in fact, this critical dimension has become “table stakes” for staying in the game. Standing pat is not an option.

Looking at Exhibits 4 and 5 in combination, the question is whether companies are overlooking emerging opportunities, such as those in supply chains, that are likely to have a major influence on future revenues and profits. That may call for resource reallocation. In general, companies that strategically shift resources create more value and deliver higher returns to shareholders. This finding could hold even more strongly as digitization progresses.

Our survey results also suggest companies are not sufficiently bold in the magnitude and scope of their investments (see sidebar “Structuring your digital reinvention”). Our research (Exhibit 6) suggests that the more aggressively they respond to the digitization of their industries—up to and including initiating digital disruption—the better the effect on their projected revenue and profit growth. The one exception is the ecosystem dimension: an overactive response to new hyperscale competitors actually lowers projected growth, perhaps because many incumbents lack the assets and capabilities necessary for platform strategies.

The case for digital reinvention 2



This finding confirms what many executives may already suspect: by reducing economic friction, digitization enables competition that pressures revenue and profit growth. Current levels of digitization have already taken out, on average, up to six points of annual revenue growth and 4.5 points of growth in earnings before interest and taxes (EBIT). And there’s more pressure ahead, our research suggests, as digital penetration deepens (Exhibit 2).


While the prospect of declining growth rates is hardly encouraging, executives should bear in mind that these are average declines across all industries. Beyond the averages, we find that performance is distributed unequally, as digital further separates the high performers from the also-rans. This finding is consistent with a separate McKinsey research stream, which also shows that economic performance is extremely unequal. Strongly performing industries, according to that research, are three times more likely than others to generate market-beating economic profit. Poorly performing companies probably won’t thrive no matter which industry they compete in.

At the current level of digitization, median companies, which secure three additional points of revenue and EBIT growth, do better than average ones, presumably because the long tail of companies hit hard by digitization pulls down the mean. But our survey results suggest that as digital increases economic pressure, all companies, no matter what their position on the performance curve may be, will be affected.

Uneven returns on investment

That economic pressure will make it increasingly critical for executives to pay careful heed to where—and not just how—they compete and to monitor closely the return on their digital investments. So far, the results are uneven. Exhibit 3 shows returns distributed unequally: some players in every industry are earning outsized returns, while many others in the same industries are experiencing returns below the cost of capital. 


These findings suggest that some companies are investing in the wrong places or investing too much (or too little) in the right ones—or simply that their returns on digital investments are being competed away or transferred to consumers. On the other hand, the fact that high performers exist in every industry (as we’ll discuss further in a moment) indicates that some companies are getting it right—benefiting, for example, from cross-industry transfers, as when technology companies capture value in the media sector.

Where to make your digital investments

Improving the ROI of digital investments requires precise targeting along the dimensions where digitization is proceeding. Digital has widely expanded the number of available investment options, and simply spreading the same amount of resources across them is a losing proposition. In our research, we measured five separate dimensions of digitization’s advance into industries: products and services, marketing and distribution channels, business processes, supply chains, and new entrants acting in ecosystems.

How fully each of these dimensions has advanced, and the actions companies are taking in response, differ from dimension to dimension. And there appear to be mismatches between opportunities and investments. Those mismatches reflect advancing digitization’s uneven effect on revenue and profit growth, because of differences among dimensions as well as among industries. Exhibit 4 describes the rate of change in revenue and EBIT growth that appears to be occurring as industries progress toward full digitization. This picture, combining the data for all of the industries we studied, reveals that today’s average level of digitization, shown by the dotted vertical line, differs for each dimension. Products and services are more digitized, supply chains less so.




To model the potential effects of full digitization on economic performance, we linked the revenue and EBIT growth of companies to a given dimension’s digitization rate, holding everything else equal. The results confirm that digitization’s effects depend on where you look. Some dimensions take a bigger bite out of revenue and profit growth, while others are digitizing faster. This makes intuitive sense. As platforms transform industry ecosystems, for example, revenues grow—even as platform-based competitors put pressure on profits. As companies digitize business processes, profits increase, even though little momentum in top-line growth accompanies them.

The biggest future impact on revenue and EBIT growth, as Exhibit 4 shows, is set to occur through the digitization of supply chains. In this dimension, full digitization contributes two-thirds (6.8 percentage points of 10.2 percent) of the total projected hit to annual revenue growth and more than 75 percent (9.4 out of 12 percentage points) of the hit to annual EBIT growth.

Despite the supply chain’s potential impact on the growth of revenues and profits, survey respondents say that their companies aren’t yet investing heavily in this dimension. Only 2 percent, in fact, report that supply chains are the focus of their forward-looking digital strategies (Exhibit 5), though headlining examples such as Airbnb and Uber demonstrate the power of tapping previously inaccessible sources of supply (sharing rides or rooms, respectively) and bringing them to market. Similarly, there is little investment in the ecosystems dimension, where hyperscale businesses such as Alibaba, Amazon, Google, and Tencent are pushing digitization most radically, often entering one industry and leveraging platforms to create collateral damage in others. 

The case for digital reinvention 03-21


Digital technology, despite its seeming ubiquity, has only begun to penetrate industries. As it continues its advance, the implications for revenues, profits, and opportunities will be dramatic.

As new markets emerge, profit pools shift, and digital technologies pervade more of everyday life, it’s easy to assume that the economy’s digitization is already far advanced. According to our latest research, however, the forces of digital have yet to become fully mainstream. On average, industries are less than 40 percent digitized, despite the relatively deep penetration of these technologies in media, retail, and high tech.

As digitization penetrates more fully, it will dampen revenue and profit growth for some, particularly the bottom quartile of companies, according to our research, while the top quartile captures disproportionate gains. Bold, tightly integrated digital strategies will be the biggest differentiator between companies that win and companies that don’t, and the biggest payouts will go to those that initiate digital disruptions. Fast-followers with operational excellence and superior organizational health won’t be far behind.

The case for digital reinvention 


These findings emerged from a research effort to understand the nature, extent, and top-management implications of the progress of digitization. We tailored our efforts to examine its effects along multiple dimensions: products and services, marketing and distribution channels, business processes, supply chains, and new entrants at the ecosystem level (for details, see sidebar “About the research”). We sought to understand how economic performance will change as digitization continues its advance along these different dimensions. What are the best-performing companies doing in the face of rising pressure? Which approach is more important as digitization progresses: a great strategy with average execution or an average strategy with great execution?

The survey findings, taken together, amount to a clear mandate to act decisively, whether through the creation of new digital businesses or by reinventing the core of today’s strategic, operational, and organizational approaches.

More digitization—and performance pressure—ahead

According to our research, digitization has only begun to transform many industries (Exhibit 1). Its impact on the economic performance of companies, while already significant, is far from complete.

What’s Your Data Worth? 03-20


Many businesses don’t yet know the answer to that question. But going forward, companies will need to develop greater expertise at valuing their data assets.

In 2016, Microsoft Corp. acquired the online professional network LinkedIn Corp. for $26.2 billion. Why did Microsoft consider LinkedIn to be so valuable? And how much of the price paid was for LinkedIn’s user data — as opposed to its other assets? Globally, LinkedIn had 433 million registered users and approximately 100 million active users per month prior to the acquisition. Simple arithmetic tells us that Microsoft paid about $260 per monthly active user.
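
That per-user figure is simple to verify; here is the arithmetic as a quick sketch, using only the numbers cited above.

```python
# Back-of-the-envelope check of the price paid per monthly active user (MAU),
# using the figures cited above.
deal_price_usd = 26.2e9          # Microsoft's purchase price for LinkedIn
monthly_active_users = 100e6     # LinkedIn's approximate MAU pre-acquisition

price_per_mau = deal_price_usd / monthly_active_users
print(f"${price_per_mau:.0f} per monthly active user")  # -> $262, i.e., ~$260
```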

Did Microsoft pay a reasonable price for the LinkedIn user data? Microsoft must have thought so — and LinkedIn agreed. But the deal generated scrutiny from the rating agency Moody’s Investors Service Inc., which conducted a review of Microsoft’s credit rating after the deal was announced. What can be learned from the Microsoft–LinkedIn transaction about the valuation of user data? How can we determine if Microsoft — or any acquirer — paid a reasonable price?

The answers to these questions are not clear. But the subject is growing increasingly relevant as companies collect and analyze ever more data. Indeed, the multibillion-dollar deal between Microsoft and LinkedIn is just one recent example of data valuation coming to the fore. Another example occurred during the Chapter 11 bankruptcy proceedings of Caesars Entertainment Operating Co. Inc., a subsidiary of the casino gaming company Caesars Entertainment Corp. One area of conflict was the data in Caesars’ Total Rewards customer loyalty program; some creditors argued that the Total Rewards program data was worth $1 billion, making it, according to a Wall Street Journal article, “the most valuable asset in the bitter bankruptcy feud at Caesars Entertainment Corp.” A 2016 report by a bankruptcy court examiner on the case noted instances where sold-off Caesars properties — having lost access to the customer analytics in the Total Rewards database — suffered a decline in earnings. But the report also observed that it might be difficult to sell the Total Rewards system to incorporate it into another company’s loyalty program. Although the Total Rewards system was Caesars’ most valuable asset, its value to an outside party was an open question.

As these examples illustrate, there is no formula for placing a precise price tag on data. But in both of these cases, there were parties who believed the data to be worth hundreds of millions of dollars.

Exploring Data Valuation

To research data valuation, we conducted interviews and collected secondary data on information activities in 36 companies and nonprofit organizations in North America and Europe. Most had annual revenues greater than $1 billion. They represented a wide range of industry sectors, including retail, health care, entertainment, manufacturing, transportation, and government.

Although our focus was on data value, we found that most of the organizations in our study were focused instead on the challenges of storing, protecting, accessing, and analyzing massive amounts of data — efforts for which the information technology (IT) function is primarily responsible.

While the IT functions were highly effective in storing and protecting data, they alone cannot make the key decisions that transform data into business value. Our study lens, therefore, quickly expanded to include chief financial and marketing officers and, in the case of regulatory compliance, legal officers. Because the majority of the companies in our study did not have formal data valuation practices, we adjusted our methodology to focus on significant business events triggering the need for data valuation, such as mergers and acquisitions, bankruptcy filings, or acquisitions and sales of data assets. Rather than studying data value in the abstract, we looked at events that triggered the need for such valuation and that could be compared across organizations.
All the companies we studied were awash in data, and the volume of their stored data was growing on average by 40% per year. We expected this explosion of data to place pressure on management to know which data was most valuable. However, the majority of companies reported they had no formal data valuation policies in place. A few identified classification efforts that included value assessments. These efforts were time-consuming and complex. For example, one large financial group had a team working on a significant data classification effort that included the categories “critical,” “important,” and “other.” Data was categorized as “other” when its value was judged to be context-specific. The team’s goal was to classify hundreds of terabytes of data; after nine months, it had worked through fewer than 20 terabytes. At that pace, the full effort would take years.

The difficulty that this particular financial group encountered is typical. Valuing data can be complex and highly context-dependent. Value may be based on multiple attributes, including usage type and frequency, content, age, author, history, reputation, creation cost, revenue potential, security requirements, and legal importance. Data value may change over time in response to new priorities, litigation, or regulations. These factors are all relevant and difficult to quantify.

A Framework for Valuing Data

How, then, should companies formalize data valuation practices? Based on our research, we define data value as the composite of three sources of value: (1) the asset, or stock, value; (2) the activity value; and (3) the expected, or future, value. Here’s a breakdown of each value source:

1. Data as Strategic Asset

For most companies, monetizing data assets means looking at the value of customer data. This is not a new concept; the idea of monetizing customer data is as old as grocery store loyalty cards. Customer data can generate monetary value directly (when the data is sold, traded, or acquired) or indirectly (when a new product or service leveraging customer data is created, but the data itself is not sold). Companies can also combine publicly available and proprietary data to create unique data sets for sale or use.

How big is the market opportunity for data monetization? In a word: big. The Strategy& unit of PwC has estimated that, in the financial sector alone, the revenue from commercializing data will grow to $300 billion per year by 2018.

2. The Value of Data in Use

Data use is typically defined by the application — such as a customer relationship management system or general ledger — and by frequency of use, which in turn reflects the application workload, the transaction rate, and how often the data is accessed.

The frequency of data usage brings up an interesting aspect of data value. Conventional, tangible assets generally exhibit decreasing returns to use. That is, they decrease in value the more they are used. But data has the potential — not always, but often — to increase in value the more it is used. That is, data viewed as an asset can exhibit increasing returns to use. For example, Google Inc.’s Waze navigation and traffic application integrates real-time crowdsourced data from drivers, so the Waze mapping data becomes more valuable as more people use it.

The major costs of data are in its capture, storage, and maintenance. The marginal costs of using it can be almost negligible. An additional factor is time of use: The right data at the right time — for example, transaction data collected during the Christmas retail sales season — may be of very high value.

Of course, usage-based definitions of value are two-sided; the value attached to each side of the activity is unlikely to be the same. For example, for a traveler lost in an unfamiliar city, mapping data sent to the traveler’s cellphone may be of very high value for one use, but the traveler may never need that exact data again. On the other hand, the data provider may keep the data for other purposes — and use it over and over again — for a very long time.

3. The Expected Future Value of Data

Although the phrases “digital assets” and “data assets” are commonly used, there is no generally accepted definition of how these assets should be counted on balance sheets. In fact, if data assets are tracked and accounted for at all — a big “if” — they are typically commingled with other intangible assets, such as trademarks, patents, copyrights, and goodwill. There are a number of approaches to valuing intangible assets. For example, intangible assets can be valued on the basis of observable market-based transactions involving similar assets; on the income they produce or the cash flow they generate through savings; or on the cost incurred to develop or replace them.
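
To make the income approach concrete, here is a minimal sketch of a discounted-cash-flow calculation for a data asset. The cash flows and discount rate are hypothetical placeholders; the formula is just the standard DCF, not a method prescribed by the article.

```python
# Minimal discounted-cash-flow sketch for the income approach to valuing
# a data asset. All figures below are hypothetical placeholders.

def dcf_value(cash_flows, discount_rate):
    """Present value of annual cash flows arriving at the end of years 1, 2, ..."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

# Incremental revenue (or cost savings) attributed to the data asset, per year.
expected_cash_flows = [1.2e6, 1.5e6, 1.8e6, 1.8e6, 1.6e6]
print(f"Income-approach value: ${dcf_value(expected_cash_flows, 0.10):,.0f}")
```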

What Can Companies Do?

No matter which path a company chooses to embed data valuation into company-wide strategies, our research uncovered three practical steps that all companies can take.

1. Make valuation policies explicit and sharable across the company. It is critical to develop company-wide policies in this area. For example, is your company creating a data catalog so that all data assets are known? Are you tracking the usage of data assets, much like a company tracks the mileage on the cars or trucks it owns? Making implicit data policies explicit, codified, and sharable across the company is a first step in prioritizing data value.

A few companies in our sample were beginning to manually classify selected data sets by value. In one case, the triggering event was an internal security audit to assess data risk. In another, the triggering event was a desire to assess where in the organization the volume of data was growing rapidly and to examine closely the costs and value of that growth.

The strongest business case we found for data valuation was in the acquisition, sale, or divestiture of business units with significant data assets. We anticipate that in the future, some of the evolving responsibilities of chief data officers may include valuing company data for these purposes. But that role is too new for us to discern any aggregate trends at this time.
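
Returning to the data-catalog idea above, here is a minimal sketch of what a catalog entry with usage tracking might look like. The field names and example assets are our assumptions, not a standard schema.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative data-catalog entry. Field names are assumptions for this
# sketch, not a standard schema. Classification labels follow the
# "critical" / "important" / "other" scheme described earlier.
@dataclass
class DataAsset:
    name: str
    owner: str
    classification: str     # "critical", "important", or "other"
    created: date
    monthly_accesses: int = 0

    def record_access(self, n: int = 1) -> None:
        """Track usage, much like logging mileage on a company vehicle."""
        self.monthly_accesses += n

catalog = [
    DataAsset("customer_loyalty", "marketing", "critical", date(2015, 6, 1)),
    DataAsset("sensor_archive", "operations", "other", date(2012, 1, 15)),
]
catalog[0].record_access(42)
```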

2. Build in-house data valuation expertise. Our study found that several companies were exploring ways to monetize data assets for sale or licensing to third parties. However, having data to sell is not the same thing as knowing how to sell it. Several of the companies relied on outside experts, rather than in-house expertise, to value their data. We anticipate this will change. Companies seeking to monetize their data assets will first need to address how to acquire and develop valuation expertise in their own organizations.

3. Decide whether top-down or bottom-up valuation processes are the most effective within the company. In the top-down approach to valuing data, companies identify their critical applications and assign a value to the data used in those applications, whether they are a mainframe transaction system, a customer relationship management system, or a product development system. Key steps include defining the main system linkages — that is, the systems that feed other systems — associating the data accessed by all linked systems, and measuring the data activity within the linked systems. This approach has the benefit of prioritizing where internal partnerships between IT and business units need to be built, if they are not already in place.
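
To illustrate, here is a hypothetical sketch of that top-down traversal: start from a critical application, follow the feed linkages, and total the data activity observed along the way. The system names and activity figures are invented.

```python
from collections import deque

# Top-down sketch: follow system linkages outward from a critical
# application and total the data activity of everything it touches.
feeds = {                 # system -> the systems it feeds (assumed map)
    "crm": ["reporting", "marketing"],
    "ledger": ["reporting"],
    "reporting": [],
    "marketing": [],
}
activity = {"crm": 500, "ledger": 300, "reporting": 120, "marketing": 80}

def linked_activity(start: str) -> int:
    """Sum data activity over all systems reachable from a critical app."""
    seen, queue, total = {start}, deque([start]), 0
    while queue:
        system = queue.popleft()
        total += activity[system]
        for downstream in feeds[system]:
            if downstream not in seen:
                seen.add(downstream)
                queue.append(downstream)
    return total

print(linked_activity("crm"))  # 500 + 120 + 80 = 700
```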

A second approach is to define data value heuristically — in effect, working up from a map of data usage across the core data sets in the company. Key steps in this approach include assessing data flows and linkages across data and applications, and producing a detailed analysis of data usage patterns. Companies may already have much of the required information in data storage devices and distributed systems.
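
The bottom-up counterpart can be sketched as simple aggregation over access logs; the log entries below are likewise invented for illustration.

```python
from collections import Counter

# Bottom-up sketch: build the usage map from (dataset, application)
# access-log entries and rank core data sets by observed usage.
access_log = [
    ("customers", "crm"), ("customers", "marketing"),
    ("orders", "crm"), ("customers", "reporting"),
]
usage = Counter(dataset for dataset, _ in access_log)
for dataset, count in usage.most_common():
    print(dataset, count)   # customers 3, then orders 1
```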

Whichever approach is taken, the first step is to identify the business and technology events that trigger the business’s need for valuation. A needs-based approach will help senior management prioritize and drive valuation strategies, moving the company forward in monetizing the current and future value of its digital assets.

Reproduced from MIT Sloan Management Review

Saturday, March 18, 2017

When To NOT Use Isotype Controls 03-19

Antibodies can bind to cells in a specific manner – where the Fab portion of the antibody binds to a high-affinity specific target – or the Fc portion of the antibody binds to the FcR on the surface of some cells.

They can also bind to cells in a nonspecific manner, where the Fab portion binds to a low-affinity, nonspecific target. Further, as cells die and membrane integrity is compromised, antibodies can bind nonspecifically to intracellular targets.

So, the question is, how can you identify and control for this observed nonspecific antibody binding? 

To answer this question, many research groups started using a control known as the isotype control.
The concept behind this control is to use an antibody that targets a protein absent from the surface of the cells of interest but that shares the isotype (both heavy and light chain) of the antibody of interest. When cells are labeled, events showing binding to the isotype control are excluded, as they represent nonspecific binding.

Why Isotype Controls Often Fall Short 

Isotype controls were once the most popular negative control for flow cytometry experiments.


They are still very often included by some labs, almost abandoned by others, and a subject of confusion for many beginners. What are they, why and when do I need them? Are they of any use at all, or just a waste of money?

Most importantly, why do reviewers keep asking for them when they review papers containing flow data?

Isotype controls were classically meant to show what level of nonspecific binding you might have in your experiment. The idea is that there are several ways that an antibody might react in undesirable ways with the surface of the cell.

Not all of these can be directly addressed by this control (such as cross-reactivity to a similar epitope on a different antigen, or even to a different epitope on the same antigen). What it does do is give you an estimate of non-specific (non-epitope-driven) binding. This can be Fc mediated binding, or completely nonspecific “sticky” cell adhesion.

To be useful, an isotype control should ideally match the antibody of interest in species, heavy-chain class (IgA, IgD, IgE, IgG, or IgM), and light-chain class (kappa or lambda); carry the same fluorochrome (PE, APC, etc.); and have the same F:P ratio. F:P is a measure of how many fluorescent molecules are present on each antibody.

This, unfortunately, makes the manufacture of ideal isotype controls highly impractical. 

There is even a case to be made that differences in the amino acid sequence of the variable regions of both the light and heavy chains might result in variable levels of undesirable adherence in isotypes versus your antibody of interest. 
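
For reference, the F:P ratio mentioned above is commonly estimated from absorbance measurements via the Beer-Lambert law. The sketch below uses typical textbook constants for FITC-conjugated IgG; treat every number as an assumption and check the data sheet for your own conjugate.

```python
# Hedged sketch: estimating F:P (fluorochrome-to-protein ratio) from
# absorbance, via Beer-Lambert (concentration = A / epsilon for a 1 cm path).
# Constants are typical textbook values for FITC on IgG -- verify against
# the data sheet for your own dye and antibody before relying on them.
A_DYE = 0.30        # absorbance at the dye's peak (495 nm for FITC)
A_280 = 0.80        # measured absorbance at 280 nm (protein + dye)
EPS_DYE = 68_000    # FITC molar extinction coefficient, M^-1 cm^-1
EPS_IGG = 203_000   # IgG molar extinction coefficient, M^-1 cm^-1
CF = 0.35           # fraction of the dye's peak absorbance that bleeds into A280

dye_molar = A_DYE / EPS_DYE
protein_molar = (A_280 - CF * A_DYE) / EPS_IGG
print(f"F:P ratio ~ {dye_molar / protein_molar:.1f}")  # ~1.3 fluorochromes/antibody
```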
Moving Beyond Isotype Controls

Many flow cytometry researchers are no longer using isotype controls, with some suggesting they be left out of almost all experiments.


If you spend any time browsing the Purdue Cytometry list, you’ll see these same arguments presented in threads about isotype controls. 

A report in Cytometry Part A presents options for controls in several categories, along with the pros and cons of each option; a second report in Cytometry Part B covers similar ground. Both summarize the problems with the use of isotype controls very clearly.


The Cytometry Part B report also illustrates differences in undesirable binding, using the same clone sourced from different manufacturers: as the figure below shows, even the same isotype control clone can produce highly variable levels of undesirable staining.

[Figure: undesirable staining levels for the same isotype control clone obtained from different manufacturers.]

If you do use isotype controls in your experiment, they must match as many of the following characteristics as possible for your specific antibody — species, isotype, fluorochrome, F:P ratio, and concentration.


Here are five cases against using isotype controls alone...

1. Isotype controls are not needed for bimodal experiments.

You don’t need isotype controls for experiments that are clearly bimodal. For example, if you are looking for T cells and B cells in peripheral blood, the negative cells also present in the circulation provide gating confidence.

As seen in the figure below, it is extremely easy to pick out the CD4- and CD8-positive cells in a sample of lysed mouse blood.

[Figure: CD4 and CD8 staining of lysed mouse blood, showing clearly separated positive and negative populations.]

2. Isotype controls are not sufficient for post-cultured cells.

If you are using post-cultured cells, the isotype control might give you some information about the inherent “stickiness” of your cells.

However, this measurement is not a value you can subtract from your specific antibody sample to determine fluorescence intensity or percent positive.

Instead, the measurement is simply a qualitative measure of “stickiness” and the effectiveness of Fc-blocking in your protocol.

3. Isotype controls should not be used as gating controls.

If you are using multiple dyes in your panel, and your concern is apparent positivity caused by spectral overlap, you will be better served by a fluorescence-minus-one (FMO) control, in which all antibodies are included except the one you suspect is most prone to error from spectral overlap.
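
Building an FMO set is mechanical: one control tube per fluorochrome, each omitting exactly one stain. A minimal sketch, with hypothetical marker and fluorochrome pairings:

```python
# Sketch: generating fluorescence-minus-one (FMO) control panels.
# The marker/fluorochrome pairings are hypothetical examples.
panel = {"CD3": "FITC", "CD4": "PE", "CD8": "APC", "CD19": "BV421"}

fmo_controls = {
    left_out: {marker: dye for marker, dye in panel.items() if marker != left_out}
    for left_out in panel
}
# Each FMO tube carries every stain except one; the empty channel then shows
# how much signal from the remaining fluorochromes spills into it.
print(fmo_controls["CD4"])  # the full panel minus CD4-PE
```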

4. Isotype controls should not be used to determine positivity.

You should absolutely not be using isotype controls to determine positive versus negative cells — or, as mentioned in #3 above, as a gating control.

5. Isotype controls are not always sufficient for determining non-specific antibody adherence.

Isotype controls cannot always distinguish nonspecific antibody adherence from, for example, free-fluorochrome adherence. For this, you need isoclonic controls: add a massive amount of non-fluorochrome-conjugated monoclonal antibody of the same clone to your staining reaction, and your fluorescence should drop. If it does not, your issue is not nonspecific antibody binding but free-fluorochrome binding.
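
The isoclonic-control logic reduces to a simple comparison, sketched below with hypothetical fluorescence values and an arbitrary threshold.

```python
# Sketch of the isoclonic-control logic described above. The MFI values
# and the 10% threshold are hypothetical illustrations, not lab guidance.
mfi_stained = 12_000           # median fluorescence, conjugated antibody alone
mfi_with_excess_cold = 11_800  # after adding excess unconjugated (same clone)

fractional_drop = 1 - mfi_with_excess_cold / mfi_stained
if fractional_drop < 0.10:
    print("Signal barely changed: suspect free-fluorochrome binding.")
else:
    print("Signal competed away: binding was antibody-mediated.")
```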