Beyond enterprise data platforms: Drive success in data-to-decisions journey
What can a space mission teach us on digital transformations? See you at Google Cloud Next'23
Beyond Product Vitality Index - Measuring Innovation Outcomes

Innovation has always been a key element of sustainable growth, especially in the Consumer industry. With insurgent brands chipping away at market share, it has become more important than ever, and established companies are spending serious sums designing and launching new products. The question for them is: how can they measure the effectiveness of their innovation ecosystem in delivering sustainable organic growth?
Product vitality index (PVI) is often too abstract to provide actionable insights:
Many companies track the share of sales coming from innovation using a simple metric such as the product vitality index (PVI), calculated by dividing the revenue from products and services launched in the last five years by the company's total overall revenue. Unfortunately, many companies with a strong vitality index do not end up growing their top line.
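For concreteness, here is a minimal sketch of the PVI calculation in Python; the portfolio records, field names and the fixed five-year window are illustrative assumptions, not data from any particular company.

```python
# A minimal sketch of a PVI calculation; fields and figures are illustrative.
from datetime import date

def product_vitality_index(products, as_of, window_years=5):
    """Share of total revenue coming from products launched in the last `window_years`."""
    total_revenue = sum(p["revenue"] for p in products)
    new_revenue = sum(p["revenue"] for p in products
                      if (as_of.year - p["launch_year"]) < window_years)
    return new_revenue / total_revenue if total_revenue else 0.0

portfolio = [
    {"name": "Legacy brand",       "launch_year": 2005, "revenue": 700},
    {"name": "Line extension",     "launch_year": 2021, "revenue": 200},
    {"name": "New category entry", "launch_year": 2022, "revenue": 100},
]
print(f"PVI: {product_vitality_index(portfolio, as_of=date(2023, 1, 1)):.0%}")  # -> PVI: 30%
```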
Reducing innovation performance measurement to PVI alone does not generate sufficient learning or continuous improvement.
An innovation ecosystem has many components, including multiple innovation types that require different innovation processes, each with many phases, and views needed at the portfolio, pipeline and project levels. A more comprehensive approach to innovation measurement is therefore key to ascertaining the product innovation vitality of a company.
Measuring the innovation mix is as important as measuring the overall innovation focus:
Different innovation types serve different purposes in a company’s portfolio.
“Sustaining” innovations, such as line extensions and repositioning: By launching close variants of an existing brand, the company can provide marginal incremental benefits to the consumer, thereby maintaining brand vitality and competitive superiority.
“Efficiency” innovations help the company sell mature products at lower prices and improve productivity and profitability. These innovations maintain the existing value proposition but deliver it at a lower cost.
“Disruptive” innovations tap into a latent need that has not been met so far, or fundamentally create a new need in consumers’ minds, thereby creating a brand-new market.
Each of these innovation types has a place in the overall innovation strategy. Companies that are too focused on line extensions may end up on a treadmill of portfolio proliferation and may get blindsided by sudden turns in the market. Similarly, as exciting as big disruptive innovation programs may be, they have longer gestation periods and more unpredictable results. Balancing the innovation portfolio is therefore important: companies need to constantly track the ratio of different innovation types in their portfolio and make sure they strike the right balance.
Measuring different innovation types differently:
Since these investment types serve different purposes and need different lead times to produce outcomes, it is critical to measure each type of innovation against different standards of performance and over varying investment horizons.
For example, while line extensions may generate significant revenue over the years, much of that revenue may come at the expense of the business that was cannibalized. It is therefore important to measure incrementality, i.e. the portion of innovation volume that comes from new buyers and additional usage occasions, estimated by netting out cannibalization of the brand’s existing buyers. Similarly, efficiency innovations may not produce any net incremental revenue whatsoever, so their contribution will be underrepresented in PVI calculations; measuring overall margin improvement may be the right metric for them.
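As a simple illustration of netting out cannibalization, the sketch below computes incrementality for a hypothetical line extension; the volumes, and the assumption that switch-over from the parent brand can be attributed, are purely illustrative.

```python
# A minimal sketch of an incrementality estimate; numbers are hypothetical.
def incremental_volume(new_product_volume, cannibalized_volume):
    """Innovation volume net of what was pulled away from the brand's existing buyers."""
    return new_product_volume - cannibalized_volume

launch_volume = 120_000         # units sold by the line extension
from_existing_buyers = 80_000   # estimated switch-over from the parent brand
incremental = incremental_volume(launch_volume, from_existing_buyers)
print(f"Incremental share of launch volume: {incremental / launch_volume:.0%}")  # -> 33%
```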
The time horizon for measurement may be very important too. Disruptive innovations take much longer to nurture and may not produce outcomes for long periods. They may therefore need to be bucketed and compared according to the gestation stage they are at.
Supplementing with Leading Indicators:
Since PVI is a lagging indicator, it is important to keep an eye on performance indicators for the innovation process itself. While these leading indicators may not in themselves assure outcomes, they create an opportunity for course correction rather than constantly looking in the rear-view mirror. Metrics such as the number of ideas generated, learning velocity, time spent at each stage and the percentage of ideas progressing from one stage to the next are some of the important leading indicators that can predict the outcomes of innovation efforts.
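As an illustration of one such leading indicator, the sketch below computes stage-to-stage conversion through an innovation funnel; the stage names and counts are invented for the example.

```python
# A minimal sketch of stage-conversion metrics for an innovation funnel.
funnel = {"idea": 200, "concept": 60, "prototype": 25, "pilot": 10, "launch": 4}

stages = list(funnel)
for current, nxt in zip(stages, stages[1:]):
    rate = funnel[nxt] / funnel[current]
    print(f"{current} -> {nxt}: {rate:.0%} of items progressed")
```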
In summary, while it may be tempting to use a simple metric to measure the vitality of the overall product portfolio, doing so over-simplifies the challenge of measuring innovation. A more comprehensive innovation dashboard not only provides relevant insights that can be actioned, but can also be tuned to align with the company's overall innovation strategy.
CPG Companies: Why improving innovation capabilities organically is better than buying startups

Over the last few decades, established large Consumer Goods companies have been losing out on growth to both private labels and smaller insurgents.
Before Covid, large companies were investing significant money in the inorganic transformation of their portfolios, much of it to acquire more innovative brands. Nestle, under CEO Mark Schneider, undertook a series of acquisitions and divestitures to respond to changing consumer patterns. Mars bought Kind Snacks, and Campbell’s acquired the faster-growing Snyder's-Lance. Unilever made expensive acquisitions of Dollar Shave Club and Hourglass.
The 2020 lull in industry innovation was only temporary:
In 2020, as COVID hit, consumers moved to the safety of known brands from large CPG companies. Many insurgent brands also lost out because they could not fill the shelves fast enough. Larger companies defocused from innovation and portfolio improvement to ensure their factories churned out as many of their flagship brands as possible. For some time, many of them enjoyed a false sense of security and a feeling of vindication.
While supply chain volatility continues, the last year has also seen a massive acceleration in more enduring changes in consumer behavior. For example, there was a surge in the use of online channels, and consumers embraced wellness, cooked more at home, and spent more time in nature. Nimbler insurgent brands have once again started to out-innovate and out-perform their larger competitors and are chipping away at their market share.
Acquisitions for innovation-led growth are back on the agenda for large CPG companies:
This is leading, amongst large Consumer firms, to a re-emergence of discussions on pivoting portfolios to cater to the new consumer patterns. Once again the smaller insurgents have become active subjects of acquisition interest. After falling off a cliff in 2020, M&A activity in the Consumer Goods industry picked up in 2021. As per KPMG’s estimate, deal value in Q3 2021 went up by 10% compared to the previous quarter, and deal volume was also much higher than in both 2019 and 2020.
Buying startups for Innovation is always expensive, and often ineffective:
Unfortunately for the large incumbent CPG companies, buying innovation is turning out to be very expensive. Bain’s recent research indicates that scope and capability deals command multiples that are about 30% higher, and cost synergies that are roughly 45% lower, than scale deals. And despite paying significant sums, these acquisitions often do not deliver the expected growth outcomes: Bain found that post-acquisition, the growth rates of acquired insurgents drop by an average of 58%.
Improving capabilities in organic innovation:
Given the high cost of acquisitions, the need to invest in organic capabilities for accelerating innovation is greater than ever. Doing so would not just improve returns on the existing portfolio of products, but would also improve companies' ability to enter and succeed in brand-new categories. Organic innovation requires a spectrum of capabilities, from better spotting of opportunities and scientific experimentation to bringing agility to the end-to-end product development lifecycle. Building these capabilities would not just strengthen the company's internal growth engines but would also make any acquisitions more viable: acquirers would have a better appreciation of the companies they buy and could create an amplifying effect across their portfolio.
How supply chain operations need to evolve to support Innovation agenda in Food & Beverages industry
These are disruptive times for the Food & Beverage (F&B) industry. The new consumer is looking for organic, ethical, sustainable and diet foods, and expects more convenience, more personalization, and more innovation. While there is new competition, there are also new opportunities to rewire the value chain. On one hand, retailers are launching their own private labels; on the other, demand for direct-to-home meal kits is skyrocketing. Innovative suppliers are also emerging, such as vertical farms popping up in places like New York City.
Supply Chain - The new frontier of innovation in the Food & Beverage industry: These emerging disruptions will create the next generation of successful companies. These companies will innovate not just in what they sell, but in every aspect of their operating model. And most of all, they will repurpose their supply chains.
Supply Chains are at the heart of F&B operations and can be very complex. Companies need to manage cold chains for perishables, provide end-to-end traceability, follow GMP, and comply with a myriad of regulations, from the FDA and USDA in the US to Regulation 178/2002 in Europe and similar rules in other countries. They also need to build their Supply Chains to support various categories of materials, from packaging materials to raw materials for small-batch R&D and commercialization operations. All this makes Supply Chain operations a very complex, but vital, function in the F&B industry.
How companies pivot their supply chain capabilities in the coming decades will become an important source of competitive advantage. To design supply chains that meet their aspirations in the emerging competitive landscape, companies will need to look at opportunities holistically, from kitting, warehousing, fulfillment & distribution, packaging and delivery to reverse logistics.
Fortifying traditional measures of supply chain performance:
The health measures of the supply chain in the food and beverage industry will become more important than ever before. To fortify them, companies will need to be more comprehensive in their approach and build a much wider set of capabilities:
Efficiency: Improvements in the cost profile of supply chain operations will fund new innovation, experimentation and expansion. Companies will need to find opportunities to more aggressively weed out wastage across the entire cost bar, e.g. through shrinkage elimination, packaging innovation, improved sourcing costs and reduced transportation costs. This requires a very granular view of costs and the ability to comprehensively model the impact of cost elimination initiatives
Velocity: The ability to move material and information quickly as demand signals are received. This will require companies to revisit strategies such as inventory placement and to improve signaling speed across the supply chain
Predictability: The ability to know with reasonable certainty when and where activities and material movements will happen. This will need improved forecasting, better end-to-end visibility and better ecosystem coordination
Agility: The ability to quickly ramp up and down. This will be driven by creating the right capacity models across the supply chain and by improved planning for base and volatile loads. Companies will also need to better capture demand and supply signals and do better end-to-end integrated planning
Resilience: How the supply chain reacts to unforeseen breakdowns, for example when the company can’t source ingredients from a country experiencing a disease outbreak, or when a key supply line breaks down. This will need improved “what-if” planning and the design of a network of options.
Beyond: The three new frontiers of differentiation:
While the traditional measures of the supply chain will continue to be important, they will not be sufficient in the new F&B landscape. There are three new frontiers on which companies will differentiate: the malleability of supply chain operations, the ability to support personalization at scale, and the ability to support solutions for consumers.
Malleability: As companies launch new product categories, through new channels and into new markets, their supply chains will need to adapt easily. Companies will need to source new materials, create new trucking plans and new warehousing approaches. The ability of the supply chain to allow experimentation and the eventual launch of new products will become increasingly important. Malleability needs to be designed into the supply chain strategy. It requires a set of capabilities that have traditionally not been looked at, including the ability to design, model and visualize the end-to-end chain for a new product extension, category or business model
Support ‘solutions’ for the consumers: Supply chains will need to pivot from moving food to supporting solutions for consumers. We know that consumers consume food to address different unstated needs, for example feeling healthy, enjoying a friend’s company, busting stress or saving time in their busy lives. While products and branding messages are often designed to target these needs, it is rare for supply chains to be designed with these needs in mind. Typical approaches apply ‘User Centric Design’ to the systems that support supply chains. A more robust approach would also use Design Thinking to incorporate ‘Consumer Centric Design’. This would, for example, determine how food is delivered to the consumer, what the acceptable level of stock-out is, and how the food is sourced from farms. There will be a clear linkage in the supply chain to the ‘intent’, i.e. the consumer solution it is addressing
Support personalization at scale: As consumers demand more personalized food options, the ability of the supply chain to support a wide spectrum of personalized needs will become very important. This will need supply chains to be configured to support very granular and fast-moving demand signals, without losing scale economies. It will affect how various algorithms aggregate and disaggregate demand, inventory, production plans, supply and transportation routes at various stages of the supply chain
Embracing new digital tools to drive the Supply Chain transformation: The traditional set of capabilities and the three new frontiers will become extremely important as F&B companies pivot their business models. So how do companies weave these capabilities into their supply chains? The current digital platforms may not suffice in supporting these new capabilities. Companies will need much more evolved digital tools that intrinsically support them.
First, this will require platforms that improve access to, and the quality of, supply chain data. The technology platforms will need the ability to discover, ingest and mine these datasets to continuously improve supply chain capabilities. This data could also come from external partners and data syndicators such as IRI and Nielsen. The platforms should also allow consumption of ever-evolving machine learning capabilities to improve the predictive power of the supply chain.
Second, such platforms would granularly and fluidly join the dots to improve end-to-end visibility across the supply chain. Traditionally, the flow of information from one node to another can take time, and is often incomplete or unreliable. With digital technologies, from IoT to blockchain, it is now much easier to connect the dots in a reliable way, improving end-to-end visibility from farm to plate. Technologies such as graph databases allow a more fluid way of connecting the various components, enabling insights that were not possible earlier.
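As a toy illustration of the graph idea, the sketch below models a farm-to-plate chain as a simple adjacency map and traces a finished retail lot back to its originating farm lots; the node names and structure are assumptions made for the example, not a reference data model.

```python
# A minimal sketch of graph-style traceability: each lot records the upstream
# lots that fed into it (names are illustrative).
upstream = {
    "retail_lot_991": ["dc_pallet_77"],
    "dc_pallet_77":   ["plant_batch_12"],
    "plant_batch_12": ["farm_lot_A3", "farm_lot_B9"],
    "farm_lot_A3":    [],
    "farm_lot_B9":    [],
}

def trace_to_source(lot, graph):
    """Walk the supply graph upstream and return the originating lots."""
    sources, stack = set(), [lot]
    while stack:
        node = stack.pop()
        parents = graph.get(node, [])
        if not parents:
            sources.add(node)
        stack.extend(parents)
    return sources

print(trace_to_source("retail_lot_991", upstream))  # -> {'farm_lot_A3', 'farm_lot_B9'}
```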
Third, unlike traditional platforms, the digital platform for supply chains would not be hardwired but would be based on a modular, service-oriented architecture. This would allow easy reconfiguration of the supply chains as business models evolve. Furthermore, it would allow experimentation, e.g. by quickly deploying algorithms and scaling them subsequently.
And finally, such a platform would connect with the other operational capabilities to create an end-to-end picture of the organization’s operations. This would need the ability to connect on one hand with systems for CRM, campaign management etc., and on the other hand with systems for design & development and manufacturing.
Getting started on the transformation: The Supply Chain capabilities that Food & Beverage firms will need to compete in tomorrow's world will be much more evolved and, in many ways, very different from today's. Transforming supply chains can be overwhelming, especially since companies have to keep the lights on for the current business as they evolve. The best way to start is by defining the blueprint of the future supply chain, and then creating a roadmap that begins with quick and easy wins and builds momentum for the bigger next steps.
Food & Beverage companies have reason to be excited. As the industry evolves, new opportunities will emerge. Those who are able to morph their supply chains to take advantage of these opportunities will drive the next wave of success in the coming decades.
Supporting the renaissance of the 200mm fabs
Earlier this year, Intel revived construction of its 450mm factory in Chandler, Arizona. While that grabbed a lot of media attention, equally significant is a quiet renaissance in demand for 200mm fabrication happening across the world. Interest in IoT and the automotive sector has particularly contributed to a surge in demand for MEMS, power, analog and discrete semiconductors, much of which can be manufactured in 200mm wafer fabs.
According to SEMI, by 2019 the installed capacity of 200mm fabs will reach close to 5.4 million wafer starts per month, almost as high as the capacity in 2006, adding net capacity of more than 600k wafer starts per month in five years. Of the 19 new fabs that started construction in 2016-17, 4 are 200mm.
Despite this healthy growth in installed capacity, the industry is experiencing an acute shortage of 200mm fab capacity. And should IoT, EVs and various other changes in automotive markets take off as expected, the surge in demand could make the capacity shortage even more acute over the next few years.
A few years ago, many of the existing 200mm semiconductor fabs operating in the US and Europe were expected to close soon. Now, however, given the expected demand growth, there is a much stronger rationale to extend their lives and continue operating them for longer.
To continue running these fabs profitably, there needs to be a renewed focus on improving fab cost efficiency and increasing yields, and there are a number of levers that can be deployed. To deploy them well, it is important to understand the economics and operating models of these fabs. They are very different from the highly automated 300mm fabs: 200mm fabs have much higher labor intensity per square unit of silicon, are typically used for smaller production runs, and run on old equipment, much of which is difficult to maintain and for which the cost of replacement parts has surged in the last few years.
There are a number of emerging levers that can be used to improve efficiency and yields, and that particularly address these operating specifics of 200mm fabs. Four are particularly important:
Improving downtime prediction and station utilization: Old fab equipment understandably tends to have more downtime. Unfortunately, 200mm equipment parts are in scarce supply and getting increasingly expensive. The latest machine learning techniques can reduce downtime and improve maintenance planning significantly by better predicting out-of-control parameters of the equipment and process. These go well beyond control charts that apply rules such as the Nelson rules: they can predict using a large number of parameters and the interplay between those parameters, and alarms for timely intervention can be triggered dynamically based on cost and impact functions.
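A minimal sketch of the multivariate idea follows, using an off-the-shelf isolation forest to flag unusual combinations of equipment and process parameters; the data shapes, parameter count and contamination setting are illustrative assumptions rather than a production recipe.

```python
# A minimal sketch of multivariate anomaly detection on equipment/process
# parameters, going beyond single-parameter control-chart rules.
import numpy as np
from sklearn.ensemble import IsolationForest

# Rows = time samples, columns = sensor/process parameters (temperature, pressure, RF power, ...)
rng = np.random.default_rng(0)
history = rng.normal(size=(5000, 12))   # stand-in for historical in-control readings
latest = rng.normal(size=(100, 12))     # most recent readings from a station

model = IsolationForest(contamination=0.01, random_state=0).fit(history)
scores = model.decision_function(latest)   # lower = more anomalous
flagged = np.where(scores < 0)[0]
print(f"{len(flagged)} recent samples flagged for maintenance review")
```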
Improving factory traceability and real-time information visibility: Production costs can be reduced by improving feedback loops with upstream activities. Manufacturing production orders can be prioritized based on demand criticality, for example by putting some orders on hold to let the important orders progress. The most important capability to build here is granular lot traceability, which also provides more manufacturing flexibility by allowing lot splits and merges. Furthermore, building a central data bus allows real-time reaction to events across the factory, enabling faster response to potential downtimes and wastage.
Capturing low-hanging operations automation: 200mm fabs have a heavier load of direct and indirect labor costs. These fabs were built in the 1990s, when semiconductor manufacturers hadn't yet developed the automated tools to handle the hundreds of steps required in die manufacturing, so manual activities were needed, especially since processes varied significantly. Although automation in these fabs has improved with time, in many fabs a surprisingly large amount of low-hanging automation opportunity still exists. Techniques such as extensive use of RFID to track all movable parts can improve automation significantly, and updating old IT systems, for example for maintenance management, can further reduce operating costs.
Shifting quality predictions left: Yield improvement is the most critical goal of all semiconductor operations, and yield has become even more important in 200mm factories given the capacity constraints. Unfortunately, probe testing, post-assembly testing and customer defect tracking provide feedback too late; better quality interventions earlier in the process can make a significant difference in yield. There are two parts to this shift-left of quality interventions. First, supervised or unsupervised modelling to help detect failures throughout the manufacturing process, as the wafer goes through hundreds of steps. Second, linking failures found during probe testing to their exact cause, so that the root cause can be eliminated. Given the complexity of the semiconductor manufacturing process and the volumes of data generated, these techniques are not easy to apply, but data science and information technology have evolved significantly. The first part is much easier to apply, and small investments in that direction can make a big difference.
Semiconductor companies with 200mm fabs have reason to be excited. Their capacity is certainly in demand, and opportunities have emerged to improve efficiency and yield further, so that these fabs can continue to contribute meaningfully to their companies’ bottom line, possibly for even a decade more than was earlier thought!
To win the ground game in emerging markets, CPG companies must organize data for unorganized channel
In theory, large global consumer goods companies, with their advantages of scale and resources, should be able to outperform regional players. Yet regional companies often do much better in their respective geographies. In research a few years back, McKinsey estimated that regional players are about 1.3 times more likely to outperform global players in Europe, 1.6 times in the US and 2.7 times in Latin America. We believe that even today, many regional players continue to outsmart the global ones through a better organized ground game.
To compete effectively, CPG companies need a much better understanding of the last mile, i.e. the outlets that sell their product to the retail customer. This would allow them to address important questions: Are these outlets aware of the benefits their product offers? Are they stocking the products adequately? Are they actively offering the products? Are the products priced correctly?
However, the penetration of modern outlets is still low in many emerging countries. For example, modern format retail in Belarus is only 50% (compared to 90% in neighboring Latvia), and organized retail in India is still less than 20%. This, along with multiple layers in the distribution chain, fragmented technology, the incentive structures of the players and other such supply chain dynamics, results in CPG companies struggling to build robust insights into channel behavior at the last mile.
As a first step to better understanding the last mile, companies need to look at the completeness and correctness of their outlet master data. For this, companies can leverage data from a variety of data syndicators. This raw data comes in different formats and needs some wrangling and the application of ML to normalize it, score confidence in its correctness, identify discrepancies and impute missing information. Often findings need to be validated through on-ground sampling, which also allows the data to be further enriched and emerging hypotheses to be tested.
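A minimal sketch of two such checks, completeness scoring and near-duplicate flagging on syndicated outlet records, is shown below; the field names, example records and similarity threshold are assumptions made for illustration.

```python
# A minimal sketch of scoring syndicated outlet records before on-ground validation.
from difflib import SequenceMatcher

REQUIRED = ["name", "address", "channel", "geo_lat", "geo_lon"]

def completeness(record):
    """Fraction of required master-data fields that are populated."""
    return sum(1 for f in REQUIRED if record.get(f)) / len(REQUIRED)

def likely_duplicate(a, b, threshold=0.85):
    """Flag two records whose name + address strings are highly similar."""
    sim = SequenceMatcher(None, f"{a['name']} {a['address']}".lower(),
                          f"{b['name']} {b['address']}".lower()).ratio()
    return sim >= threshold

outlets = [
    {"name": "Sharma General Store",  "address": "12 MG Road", "channel": "kirana"},
    {"name": "Sharma General Stores", "address": "12 MG Rd",   "channel": "kirana"},
]
print([round(completeness(o), 2) for o in outlets])   # -> [0.6, 0.6]
print(likely_duplicate(outlets[0], outlets[1]))       # candidates for de-duplication
```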
In a recent engagement, LTI fixed 70% of the store records that were missing or incorrect in the outlet data the client was working with. It also helped identify a number of white spaces for the client to address: for example, while the client had good penetration in educational institutions, it was very poorly represented in local pharmacies and eateries in affluent neighborhoods.
Such interventions can, at very low cost, help CPG companies understand how outlets, consumers and products interact in dynamic and complex emerging markets. These insights can be very valuable for winning the ground game.
Compute to Data or Data to Compute in Retail Stores? Balancing Edge, Fog and Cloud Computing
The Retail industry is in the midst of a digital revolution. IoT and Cloud technologies are paving the way for the digitization of retail stores, and retailers are racing to reinvent their stores to drive productivity, improve customer experience or light up new business models.
As they do so, one of the fundamental challenges emerging is where and how to handle the vast volumes of data that edge devices spew out. For example, if a retailer installs cameras at various places in the store to monitor and analyze customer footfall, these cameras across all retail stores would generate voluminous data. It would be impractical to send all of it to the cloud, have it analyzed and then sent back for relevant actions. But then again, there is also merit in bringing some of this information across retail stores together on a common platform to draw inferences.
To enable different use cases, retailers may therefore need to use a combination of edge, fog and cloud computing. In edge computing, the intelligence and processing power is placed in the edge devices themselves (e.g. cameras). A fog environment collects data through gateways and places the intelligence in the store-wide local area network. The cloud takes the data further into the retailer’s cloud environment for analytics.
With a spectrum of options for IoT devices, gateways, controllers, cloud providers and analytics platforms, there can be a mind-boggling number of ways to build out a use case. How do retailers decide which configuration is best suited for theirs?
As a first step, we recommend shortlisting options using a few filters: Do we need to aggregate data across sources? How much data could the pipes practically transmit? What response latency is acceptable? What historical data would be needed for the analysis? How would security and privacy challenges be addressed? The shortlisted options can then be compared using their total cost of ownership (TCO), which needs to include, for example, the long-term cost of maintaining different hardware and the compute and storage costs across options.
The final solution would likely leverage all three: edge, fog and cloud. For example, retail stores could install smart cameras with on-camera video analytics to calculate footfall in the aisle. Each camera would then stream footfall data to gateway devices at 15-second intervals to help store associates predict and prevent stockouts, or help smooth traffic flow by changing the aisle arrangement. The data could then be aggregated and sent, along with the actual end-of-day sales data, to the cloud, where it could help discover SKUs with high dissonance between footfall and sales and thereby improve pricing decisions for those SKUs.
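A minimal sketch of the cloud-side step, finding SKUs where aisle footfall and sales diverge, could look like the following; the records, thresholds and field names are purely illustrative.

```python
# A minimal sketch of footfall-to-sales dissonance analysis per SKU.
daily = [
    {"sku": "A101", "footfall": 1800, "units_sold": 40},
    {"sku": "B202", "footfall": 1750, "units_sold": 310},
    {"sku": "C303", "footfall": 300,  "units_sold": 280},
]

for row in daily:
    row["conversion"] = row["units_sold"] / row["footfall"]

# High footfall but low conversion suggests interest that is not converting
# into purchases: a candidate for a pricing or placement review.
flagged = [r["sku"] for r in daily if r["footfall"] > 1000 and r["conversion"] < 0.05]
print(flagged)  # -> ['A101']
```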
Digitization of retail stores can produce many benefits. However, digitization is also expensive. The right technological choices could generate a lot more bang for the buck.
Improving R&D Productivity in Semiconductor Industry
R&D constitutes 15-20% of sales at semiconductor companies, and the top 10 semiconductor companies spent in excess of $30B on R&D in 2015. Not only is that a significant cost, it is a critical source of competitive advantage.
Complexity of R&D in semiconductor industry continues to increase:
Technological changes are causing a huge escalation in the complexity of R&D projects. The complexity is further amplified by the more complex use cases being sought by customers, e.g. system-on-a-chip (SoC) devices that integrate processors, analog components, memory etc. on the same chip. R&D costs too are rising fast: according to Gartner, average IC design costs have risen from $30 million at 28nm to $80 million at 14nm, and will go up to $120 million at 10nm. R&D complexity escalation is both causing, and being caused by, transforming business models in the semiconductor industry. It is one of the major drivers of the recent consolidation in the industry and of the emergence of horizontal business models involving reuse of IP blocks. And as traditional vertically integrated models are broken up by fabless and fab-lite companies, the importance of competing on R&D has never been felt more.
With the ever-escalating costs of R&D, there is even more pressure on semiconductor companies to attain higher market share in the markets they focus on in order to justify the investments. The speed of product launch has therefore become even more important, which in turn doubles the pressure on R&D organizations already reeling under technology-driven complexity.
Need for a robust approach to estimating R&D productivity:
All this has put R&D productivity under the spotlight. Semiconductor companies need to make sure that they have a robust mechanism for tracking their R&D productivity. R&D productivity is, however, not simple to estimate and track. Time and effort in R&D are not simple functions of transistor count: extra time and effort can be justified by the type of circuit, the density of the circuit, the extent of reuse and so on. These factors need to be normalized to find the actual causes of varying productivity. Most often, such normalization requires a certain degree of subjectivity; it is nevertheless very useful to be able to compare the normalized productivity of projects.
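As a rough illustration, the sketch below normalizes design output by circuit type and reuse before dividing by effort; the weights and the formula itself are assumptions made for the example, not an industry-standard model.

```python
# A minimal sketch of normalizing R&D productivity across dissimilar projects.
CIRCUIT_WEIGHT = {"digital_logic": 1.0, "memory": 0.3, "analog_mixed_signal": 3.0}

def normalized_productivity(gate_count, circuit_type, reuse_fraction, engineer_months):
    """Weighted newly designed gates per engineer-month."""
    new_design_gates = gate_count * (1 - reuse_fraction)
    effective_complexity = new_design_gates * CIRCUIT_WEIGHT[circuit_type]
    return effective_complexity / engineer_months

p1 = normalized_productivity(20_000_000, "digital_logic", reuse_fraction=0.6, engineer_months=400)
p2 = normalized_productivity(5_000_000, "analog_mixed_signal", reuse_fraction=0.1, engineer_months=600)
print(f"Project 1: {p1:,.0f} weighted gates per engineer-month")  # -> 20,000
print(f"Project 2: {p2:,.0f} weighted gates per engineer-month")  # -> 22,500
```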
Identifying the right lever for productivity improvement:
Once the data has been normalized, it opens up opportunities to identify the right lever(s) that can help improve the productivity of individual projects, and of the R&D organization overall. There are two broad categories of levers that can then be explored and applied to improve R&D productivity:
Comparing variance between projects: By comparing variance between their own projects, semiconductor companies can identify best practices adopted in one project that can be applied to others. There are a number of differences that could be studied, for example:
· Time and effort variance between the various phases of a project, e.g. functional specifications, RTL design, physical design, testing, validation etc. Furthermore, sub-steps such as design reviews can also be compared to identify improvement opportunities across projects
· The number of sites used by design teams and the configuration of the design teams can have a significant impact on project productivity. For example, increasing the number of sites makes it easier to scale teams and source talent, but it also adds coordination complexity. Comparing the productivity of projects with different configurations can help identify the ones that work best for the company
· Differences in non-standard/mandated tools, e.g. the project management tools used by different projects. Project teams may use different tools to collaborate with colleagues, track progress, manage code repositories and plan validations. These tools can have a significant impact on productivity, and differences in tool usage across projects can unearth significant improvement opportunities
· Tradeoffs applied by managers in their projects: There are a number of tradeoffs project teams make, e.g. between reuse vs design from scratch, or higher defect density vs the extent of validation in the first release. These tradeoffs can vary widely by project, and comparing projects can help identify a number of opportunities
Identifying systemic opportunities for productivity improvement: These are opportunities that may exist across the organization, irrespective of the variances across projects. Companies could start by focusing on the activities that, on average, consume the most time or effort, and identify improvement opportunities there. They also need to look at the activities with the most downstream impact to zoom in on focus areas for systemic productivity improvements. There are a number of organizational and process decisions that can help companies improve the productivity of the R&D organization. For example:
· Engineering skill level and competence of the organization
· Choice of EDA tools used in the organization
· Main design flows the company adopts in their R&D organization
· The extent of management support and attention given to the R&D teams
· The type and depth of relationship with the customers etc
The need for improving R&D productivity in semiconductor companies has never been higher. A robust, data-driven approach to R&D productivity improvement is not just about saving costs: it can make a huge difference to a company’s ability to compete effectively in a landscape where competition is intense, technological advancement is unrelenting and customers are ever more demanding.
A side effect of IoT we must not ignore
We are rightfully excited about what the Internet of Things holds for our future. It will usher in huge economic benefits, often estimated in the range of 10 trillion dollars. It will change the lives of people in emerging countries, give us access to improved healthcare, make us more secure, and make our cities much smarter.
There is, however, one thing that we may be ignoring and that needs to be addressed as we embark on the journey of connecting the world. Estimates suggest that as many as 2 trillion sensors may be installed every year by 2020 (22 billion in the automotive industry alone). These sensors will come in different shapes, sizes and weights, depending for example on whether they are active or passive. Let us say each sensor weighs around 10g on average. That means we could be generating an extra 20 million metric tons of eWaste every year, doubling the amount we generate today. eWaste constitutes only 2% of the American trash in landfills, but 70% of the toxic waste. Only 12.5% of eWaste is currently recycled in the US every year, and due to the size and dispersal of sensors, our ability to recycle them may be even lower.
Ironically, one of the touted benefits of IoT is better waste management! Let us hope our collective ingenuity will also find a solution to this side effect of IoT!
Are we getting better at looking forward, but compromising our ability to look backwards?
Last year the GDELT project published a paper on how crunching through the big data of history can help us spot patterns and work out where the world is heading next. The GDELT database monitors news (broadcast, print, web) from across the world in over 100 languages and uses complex computer algorithms to codify what’s happening throughout the globe. Certainly, our ability to project the future will improve as we apply more sophisticated Artificial Intelligence to draw such predictions.
Earlier this week, however, Vint Cerf, one of the pioneers of the Internet, expressed a different concern in a conversation at the American Association for the Advancement of Science. He said that humanity could be headed towards a digital black hole, as digital objects become unreadable when the technology that created them evolves, and that we may be putting misplaced confidence in the longevity of digital information. The information GDELT and similar efforts are trying to preserve is the tip of the iceberg. The volume of data being generated continues to grow exponentially, much of it (currently) of no obvious value, and much of it is being stored with little thought to how it will be retrieved by the historians of future generations. We are at risk of losing a lot of this information, compromising the ability of future historians to look backwards.
A few years back, I was chatting with a friend and client in the Aerospace industry who is responsible for archiving engineering data for the airplanes his company designs. I was intrigued to learn that the company kept all their designs on printed paper, and not in digital format. My friend told me that the ever-evolving digital world meant there was no safe way to store this information in digitized form.
Until we are able to create what Dr Cerf calls a ”digital vellum”, it may be a smart idea for my friend to maintain the printed documents for their airplane designs, and for you and me to print the digital photographs taken with our latest digital cameras.
Making modular product design key as Technology and Luxury converge
Smart devices and wearable technology products are increasingly turning into luxury goods. Apple made its perspective on the industry very clear by hiring Angela Ahrendts from Burberry. And luxury goods companies are keenly working on making their products smarter: Tag Heuer, for example, is rumored to be launching a range of smart watches, and Ralph Lauren unveiled the Polo Tech shirt, which includes sensors knitted into the fabric to track movement and gauge performance. Jonathan Newhouse, global boss of Vogue, has coined a new word for the sector: ‘Techno-Luxury’.
Although there are many learnings from the luxury goods industry that could be applied to the new techno-luxury world, there are some unique characteristics too. In particular, there is a divergence between the economics of Technology and that of Luxury that companies need to worry about.
The technology lifespan of hi-tech wearables is constrained by Moore’s law. Products launched some time ago command much lower value than the ones launched recently, and the perceived value drops very fast with time.
On the other hand, one of the most important perceived values for consumers of luxury products comes from longevity. Remember the famous slogans: “You never actually own a Patek Philippe, you merely look after it for the next generation,” or the one from De Beers, “Diamonds are forever.”
To manage this divergence of economics, Techno-Luxury makers, unlike traditional luxury product companies, will need to focus a lot more on the modularity of their products. This would allow products to be technically upgraded easily and selectively, whilst retaining their physical form and appearance.
Indeed, much of the innovation is happening in this direction. Misfit has unveiled an activity tracker module that hides behind a separate Swarovski crystal. Google has come out with a modular phone that has a strong aluminum and steel frame but can be fitted with a new camera, speakers, batteries etc., allowing the consumer to upgrade as the technology evolves. Likewise, Montblanc in its recent launch, rather than creating a smartwatch, retained a mechanical watch with the smart device built into a leather “e-Strap”, an annexure to the watch that can be bought as a separate module.
Keeping modularity at the heart of product design will be key for techno-luxury products. Designers will need to make this one of the key design objectives very early in the design lifecycle, giving themselves the ability to maximize the value they capture from customers in the long term.
Could it be smarter to make products a bit less smart?
Many product managers compete hard on cool features that have the potential to generate great buzz. These features are intended to create customer delight and make the product stand out. Of course, adding these features in the product is most often a good idea.
Sometimes, however, the decision to add 'customer delighters' is not as simple and linear. Noriaki Kano, an eminent professor at the Tokyo University of Science, created a model that classifies customer preferences. Besides the delighters, customers also need other features whose presence does not necessarily generate customer delight, but whose absence can lead to a lot of customer dissatisfaction.
Unfortunately, many product managers do not fully comprehend these basic needs and end up competing in the wrong game. They keep adding delighters in an attempt to differentiate, at the expense of these basic needs. As product lifecycles have shrunk and the pressure on profitability is immense, the risk of getting this wrong has never been higher.
Consider, for instance, the battery life of mobile devices such as fitness gadgets, smart watches and phones. Analyst studies indicate that batteries account for only 2% of the overall cost of building most such gadgets, yet their importance in keeping customers from being dissatisfied cannot be underestimated. Even so, we have often seen companies launching products with energy-sapping cool features, or compact shapes, at the expense of time between charges.
As technologies evolve, the compromises between basic features and delighters can be better managed. Perhaps it may be a better idea to hold off on launching some of the cool features in the meantime?
Are we ignoring the value from the third V of Big Data?
Amongst the three Vs of Big Data, Volume, Velocity and Variety, many companies focus on deriving value by addressing the first two, and relatively less from the third. However, one could argue that the most transformative solutions could come from tapping into the third V: the Variety of data.
Often companies solve Big Data problems within the confines of a specific organizational function, with the variety of data available to that function. For example, online sales functions tend to address how they can give real-time and granular recommendations to customers by improving the characterization of products and customers through online browsing and purchase patterns. This taps the volume and velocity of data well, but leverages only the few varieties of data that are accessible to the function. Solving such a problem is certainly critical, but it is also worth noting that most competitors are likely working on the very same problem, and customers have already started to expect personalized recommendations, i.e. this is already becoming table stakes!
Today, not only has there been significant proliferation in the variety of data, but the technological ability to use this variety is also unprecedented. There are machine logs from manufacturing shop floors, data coming in a variety of formats from suppliers, social data from customers, demand forecasts from resellers, weather projections, competitive information… the list goes on.
Effectively tapping this variety is where the big opportunities could be. Could, for example, the online sales team improve its recommendations further by looking at data coming from the shop floor that predicts a likely shortfall due to supplier quality issues? The company could steer the customer towards a different product or variant by giving a price discount. Similarly, could the manufacturing operations function improve yield on the shop floor by more closely monitoring data coming from field service engineers and customer complaints showing up on social media?
These problems are certainly more complex to solve, mainly because they require companies not just to find the right answers to traditional questions, but to frame the questions themselves. They need to look at all the data sources they could potentially tap into, and imagine how they could use them and what value that would generate. Furthermore, it requires them to resolve the very tricky issues that can hinder access to data variety, especially data ownership rights, cross-department data access, and the technology infrastructure needed to allow that access. Certainly more complex, but it is worthwhile thinking about all this before competitors start doing so!