Tuesday

Internet and quality of work/life

HBS is featuring Patricia Wallace's new book 'The Internet in the Workplace: How New Technology Is Transforming Work.'

While having Internet access at work arguably has created efficiencies for businesses and organizations, the author explores complications that the Web brings for employees and employers. For example, how have round-the-clock e-mail and wireless devices affected employees’ ability to separate work and family life? What are the productivity and psychological effects of workplace surveillance tools? A psychologist, Patricia Wallace is the senior director of information technology and distance programs at the Center for Talented Youth at Johns Hopkins University. Wallace’s thorough examination of the issues triggered by this pervasive technology, combined with her insightful analysis, makes this an important contribution to the literature on the effects of technology on society. Readers in nearly every industry or profession will find this book useful as they reflect on their jobs and careers.

Monday

A look back at the Human Genome Project

Breaking the Code, PBS, Bruce Roe and Francis Collins talk to Ray Suarez, 1999

This interview provides a glimpse of the Human Genome Project, as it was happening.

It captures some of the excitement...

The determination of the DNA sequence of a whole human chromosome is a tour de force. It provides the first view of a complete chromosome from a completely new vantage point. It's like seeing the surface or the landscape of a new planet for the first time.

... and shows how much things have changed since 1999.

Computers have been enormously helpful to this project. It's rather fortunate that the computer revolution and the genetics revolution are occurring in a nice dovetailed fashion, or we'd be having some trouble. But I will say for the real computer experts, they don't see our problems so far as all that demanding or challenging; even though it's a lot of information, it is fairly straightforward [so far].

Factors of cluster success

'Old Economy' Inputs for 'New Economy' Outcomes: Cluster Formation in the New Silicon Valleys, Timothy Bresnahan, Alfonso Gambardella, Annalee Saxenian, 2001

While looking for links between old and new hi-tech clusters, I came across this paper. It ties in with a series of studies conducted a while back in Ireland, India, Israel, Taiwan etc.

The authors argue that the factors that start a cluster are very different from those that keep it going/growing. (E.g. "success breeds success" isn't useful in founding a cluster). Starting a cluster involves much higher risks for firm founders, especially since they must bet on future technology trajectories.

They focus on the following factors of cluster success in their analysis:
- unemployed skilled technical labor (or skilled labor with low opportunity cost)
- managerial labor
- new firm foundation and firm growth (large firms attract more specialized supply, invest in larger projects, connect to world markets)
- connection to markets
- complementarity to leading/existing clusters rather than head-to-head competition. Strong links as people and ideas flow back and forth
- physical/supply-side restrictions on existing clusters support growth in new clusters
- policy of "benign neglect" and/or investment in education, encouraging multinationals, tolerating/encouraging brain drain, and - if possible - fostering sizable demand (e.g. national policies of adopting the GSM standard uniformly increased market size for telecoms suppliers)

Interestingly, (telecoms) infrastructure is not mentioned, even though at least some of the clusters in question (think outsourcing of services to Ireland and India) were greatly aided by the availability of excess bandwidth. Communications links also supported the 'strong links as people and ideas flow back and forth.'

Here's the paper abstract:

This paper discusses the results of a two-year research project on the sources of success in regional clusters of entrepreneurship and innovation like Silicon Valley. Our project has studied a number of locations, most of which have shown spectacular rates of growth of information and communications technology-related activities during the 1990s. Our case studies comprise some emerging regions, notably in Ireland, India, Israel and Taiwan, along with more advanced areas like Northern Virginia in the US, Cambridge, UK, the Scandinavian countries and the Silicon Valley 40 years ago by way of the memory of one of its 'father founders', Gordon Moore. Through visits, interviews and other materials, we uncovered some regularities about the determinants of success of these entrepreneurial-led models of economic growth. We find that the economic factors that give rise to the start of a cluster can be very different from those that keep it going. Agglomeration economies, external effects and 'social increasing returns' of any sort arise almost naturally after a cluster has taken off. But the most difficult and risky part is to get the new clusters started. At that stage, 'old economy' factors like firm-building capabilities, managerial skills, a substantial supply of skilled labor and connection to markets were crucial for the take off of these 'new economy' clusters (including Silicon Valley 40 years ago).

Thursday

Get on the bandwagon

For a while now, stories have been popping up, not just about outsourcing jobs to India, but also about a fledgling migration from the United States to India. At first it seemed that this was a phenomenon of Indians (or people of Indian origin) heading back home as the business climate improved. Now others are jumping on the bandwagon as well - to work in a booming economy, or to study it.

According to Saritha Rai's article in today's NYTimes, academics are lining up to study the process of service outsourcing in India. Researchers interview managers at Infosys and Wipro, while students request internship placements in Bangalore or Hyderabad.

Tuesday

Diseconomies of scale

Another interesting piece by James Surowiecki, 'The Pipeline Problem,' explains the new industry structure in life sciences. Large pharmaceutical firms may have branding and sales power, but they encounter diseconomies of scale when it comes to R&D, or innovation in general. As a consequence, their 'pipelines' have been drying up, and they have come to rely on smaller, more flexible and more entrepreneurial biotech firms to deliver efficient research.

The traditional pharmaceutical research model harks back to processes developed by German and Swiss chemical firms in the late nineteenth century, when chemists synthesized and screened thousands of compounds in search of a few potential new drug candidates. Although the methodology is more sophisticated now, success is still in many ways thought to be a matter of brute force: throw hundreds of scientists at a problem and hope for the best. It’s crapshoot economics; a few great successes can pay for myriad failures. So bigger has always been seen as better.

Today, though, the advantages of size are trumped by what are called “diseconomies” of scale: inertia, bureaucracy, risk aversion, clock-watching, office politics. Joseph Kim saw a lot of this firsthand, as a scientist at Merck for nine years, and now he likes to compare Merck to the Titanic. “Companies like Merck have fantastic scientists working for them, but they also have these middle and upper layers of managers who are just taking up space,” he said last week. “I like to call them ‘anti-bodies,’ because they just sit there being anti-everything. No one ever gets fired for saying no to a new idea.” Now, as the founder and the C.E.O. of a little biotech called VGX Pharmaceuticals, Kim has a novel type of AIDS drug in clinical trials and a promising drug for cancer in development.

It turns out that research and development doesn’t scale—that bigger may be worse. That’s why the engines of pharmaceutical innovation have for some time now been smaller biotech firms like VGX, which can concentrate on a few promising avenues of research and can offer enterprising scientists the freedom—and the potentially enormous rewards—of working as entrepreneurs. Just as, in the seventies, the locus of innovation in the tech business shifted from Goliaths like Digital and I.B.M. toward the smaller, more narrowly focussed start-ups of Silicon Valley, so it is shifting now from big to little pharma.

Monday

The cost of losing R&D

The Economist carried a comparison of the US and European pharmaceuticals industries (based on a Bain study) a while back. In their analysis, they debunk the idea that Europe is benefiting from a free ride. Government pricing regimes may be keeping drug prices low, leaving American patients and insurers to pick up the tab for ever more expensive drug development. But Europe is losing out overall - in Germany, for one, the policy is currently incurring a net loss.

As Europe becomes a less attractive market and red tape further hinders innovation, R&D (and the jobs that go with it) and cutting edge health care are moving to the United States.

IN THE drug industry, they call it “Europe's free ride”. Government pricing regimes mean that prescription drugs cost far less in Europe than in America, where a growing proportion of new drugs are developed—presumably because Americans are willing to bear the lion's share of development costs. On the face of it, Europe reaps big rewards. It spends 60% per head less on drugs than America. In 1992, the gap was 30%. Had spending kept pace with America, last year alone Europe would have shelled out an extra $160 billion. The cumulative “saving” since 1992 is approaching $1 trillion—quite some free ride.

But is the saving from cheap drugs more apparent than real? And are Europe's drug firms in fact struggling to keep up with more dynamic American competitors? A new study by Bain, a consultant, argues that the existing pricing regimes are bad for everyone, including patients. “The free ride is not free,” argues Paul Rosenburg, a co-author. “If governments and drug companies begin to accept this, then future policy on health-care innovation and spending can be far more rational.”

On the other hand, America gains from its growing dominance of drug research and development (R&D). A decade ago, Europe and America each spent roughly $10 billion a year on drug R&D. Now, America spends almost $30 billion annually, and Europe a little more than $20 billion. A growing number of firms now base their R&D efforts in America. Drugs R&D in Germany fell by 3% in 1992-2002.

One result is a striking decline in European drug innovation. Bain examined how many so-called new molecular entities (NMEs) have been produced in recent years. In 1993-97, Europe launched 81 NMEs and America 48. But in 1998-2002, the respective figures were 44 and 85, almost an exact reversal.

Exactly how drug-pricing regimes influence innovation is complex—and much debated. According to Bain, research shows that the main economic factor driving where firms locate their R&D is how big, and quick, are the potential profits. That gives America an advantage over Europe, where price controls slow down profit-taking. True, early-stage research can take place anywhere in the world—and big drug firms are increasingly looking to shift this to lower-cost places. (GSK recently linked up to do research with Ranbaxy, a leading Indian drug firm.) But the bulk of costs are incurred in the development phase between early-stage and market. And the process of drug approval remains very much a national, as opposed to global, activity. So it makes sense for firms to put promising drugs on trial in the market where there is most to gain—namely, for now, America.

According to Bain, a proper accounting for Germany's spending on drugs produces an alarming result. In 2002, Germany saved $19 billion because it spent much less per head than America on drugs. On the other hand, says Bain, in the same year, Germany lost out on $4 billion from R&D, patents and related benefits that went elsewhere. It lost $8 billion because high-value jobs went somewhere else—plus the benefits of those jobs from the “multiplier effect”. German drug firms would have made $3 billion more profit if they had kept pace with rivals elsewhere. A further $2 billion was lost as the country shed corporate headquarters and the benefits they bring. The cost of poorer-than-necessary health was $5 billion.

Of course, these calculations rely on some rough and ready assumptions. Even so, Bain arguably errs on the side of caution. It plays down, rather than up, the multiplier benefits of jobs in the drug industry, for instance. In sum, it reckons that Germany's $19 billion saving is in fact a $3 billion net loss. “When you add up all of the costs, the free rider model is actually quite expensive,” argues Mr Rosenburg.

Friday

How I learned to stop worrying and love the bubble

James Surowiecki has his own take on the United States' innovative power. In his New Yorker column, he writes about the next investment bubble and why it is good for the US. Positive externalities again (this time of much-heralded American risk-seeking behavior). Though the losers of the dotcom bust may not quite see it that way...

In the early sixties, investors stumbled on a neat trick: if a company had “tron” or “tronics” in its name, its stock was a hit. This was the dawn of the computer age, and a host of businesses straight out of “The Jetsons”—Astron, Transitron, Videotronics—became the darlings of Wall Street. The boom ended badly, as booms so often do. In 1962, the stock market plunged, and the trons and tronics were knocked flat, most of them for good. But investors have short memories. At the end of the decade, they fell for tech stocks again (the magic word this time around was “Silicon”). Later, it was P.C.s, then biotech, and, more recently, dot-coms. Infatuation and disillusionment: it’s the American way. Now investors have found a new crush: nanotechnology.

Even if nanotech does live up to its promise, though, almost all the nanotech companies that are now so hot on Wall Street, not to mention those still dreaming of blockbuster I.P.O.s, will be gone in a decade. That’s how new industries get built in America: a horde of companies rise up, the weak or misguided fall away, and a few good ones thrive. With a general-purpose technology like nanotech or the Internet, the process is even bloodier; because you can do so many things with the technology, you pursue a lot more fruitless notions and reckless schemes before you figure out what really works.

The price of innovation is that you spend money on bad ideas as well as on good ones. Some of the money comes from venture capitalists, who factor the inevitable mistakes into their investments. And some of it comes from government (Washington is spending $3.7 billion on nanotech research over the next four years). But a good chunk comes from all those money managers and retail investors who believe that they will be getting in on the next Xerox, the next Amgen, the next Microsoft. Most of them won’t, of course; in fact, most of them will end up losing money (which is why there’s already been a great deal of finger-wagging over investor interest in nanotech). The paradox is that their losses are often society’s gain. Thanks to investors’ willingness to take a flyer on things like nanotech, companies are able to do more research and development than is economically rational; they experiment with ideas and approaches that, under leaner conditions, would never be tried. It’s a messy process, but it’s the best one we’ve found for inspiring real innovation.

That doesn’t mean that all investor manias are healthy: no one benefitted when Wall Street thought bowling was going to take over America, for instance. But when it comes to transformative technologies, overoptimistic investors are actually working for the common good—even if they don’t know it. We can be glad that investors financed the construction of thousands of miles of track in the middle of the nineteenth century, despite the fact that most of them dropped a bundle doing it. The same goes for overoptimistic investors who poured money into semiconductors thirty years ago, financed undersea fibre-optic cables in the late nineties, and now are poised to lose their shirts in the coming nanobubble. In the dreams of avarice lie the hopes of progress.

Externalities and the problem with the outsourcing debate

As Hal Varian points out in his NYT column today, the problem with the outsourcing debate is that it doesn't treat costs and benefits in the same way.

The political problem with trade is simply this: when the dollars flow offshore, it is easy to identify those who are hurt. But when the dollars flow back, it is much more difficult to discern the beneficiaries.

The debates about trade are not about whether we should accept those good deals offered to us by cheap foreign labor - of course we should. The debate is all about who will capture the benefits from those deals and who will bear the costs.

Ideally, those who benefit the most from trade would compensate those who lose. In practice, virtually everyone benefits to some degree from cheaper goods and services, so compensation for those who lose from trade should come from general revenues.


As so often in the discussion of innovation (outsourcing being one such issue), problems arise in dealing with externalities. The same goes for the problem of investments by 'knowledge seekers.' If it were possible to put a price on the benefits to a region of being an R&D hub, and on the risk of having other regions catch up faster, there would be no need to suddenly fear FDI. A region could make a calculated decision on how much to pay (or charge) foreign firms. Unfortunately, there are often too many uncertainties involved to arrive at hard figures - especially in R&D-intensive industries. But as it is, many of these regions probably do calculate and find that the positive externalities are worth quite a lot - hence the multi-year tax breaks, real-estate subsidies, etc.

Knowledge seekers

Knowledge Seeking and Location Choice of Foreign Direct Investment in the United States, Wilbur Chung and Juan Alcacer, 2002

In the latest Knowledge@Wharton newsletter, Wilbur Chung and Juan Alcacer present the hypothesis that foreign direct investment (FDI) is not purely cost or market driven. Companies acquire firms, engage in joint ventures and set up greenfield ventures for access to unique knowledge - not just to cut costs or gain access to markets.

(The academic paper can be found here.)

While such seekers have historically been characterized as technology laggards trying to catch up with market movers, more recently scholars have embraced the idea that leaders, too, invest abroad as they seek to broaden or deepen their knowledge.

The study covers FDI in the United States (by state and by economic region).

Not surprisingly, they found that knowledge seeking is most prevalent among foreign companies in R&D-heavy industries such as pharmaceuticals, semiconductors and electronics. In fact, they found that drug makers are twice as likely to seek knowledge abroad as companies in any other industry.

Where were knowledge seekers most likely to invest? R&D-intensive areas. 'Many investments, 32% of the sample, fall into four major metropolitan areas: New York City, San Francisco, Los Angeles and Chicago,' the researchers write. 'In contrast, a region of the United States known mostly for agriculture - the Dakotas and Idaho - had no investments during our investigation period.'


It seems obvious that a European biotech company would conduct R&D in the US, that a knowledge seeker in pharmaceuticals would set up shop near Boston, or that a company seeking state-of-the-art IT knowledge might invest in operations in Silicon Valley. But what about the opposite direction? The column mentions GE's new research and development lab for medical systems in China. The lab focuses on product development tailored to emerging economies. Is this a unique example? (GE does seem to be a pioneer as far as spreading R&D globally goes...)

K@W concludes that leading regions in the knowledge industry should be wary about inviting foreign firms and giving them tax-breaks or other incentives.

Traditionally, investments from foreign firms have been celebrated by holding press conferences and ribbon-cutting ceremonies, as South Carolina and Alabama did when they landed BMW and Mercedes. But if Chung is right, these investments may not always be unalloyed victories. 'If many foreign firms enter seeking new knowledge, [productivity] gains may not accrue, and a nation's technological uniqueness might be more quickly replicated,' he and Alcacer point out in their paper. Of course investments from foreign firms may still bring benefits such as more jobs and spin-off economic activity as, for example, suppliers spring up near the foreign firm's new plant.

This sounds much like an absurd reversal of the current outsourcing debate - we don't want our firms to invest abroad because that means we'll lose our jobs (even though our firms will be more competitive), but we don't want foreign firms to invest here because that means we'll lose our competitive knowledge edge (even though we'll get more jobs).

Besides, the argument doesn't hold. Knowledge doesn't diminish by being shared - and if foreign firms invest in US high-tech clusters, this strengthens the competitive advantage of those clusters by increasing their innovative churn. Many of a cluster's advantages (labour pool, social networks, proximity to leading research labs/universities etc.) don't travel well.

To be fair, Chung and Alcacer acknowledge the importance of place in other parts of the discussion, and the 'threat' to American competitiveness is only vaguely alluded to in a generalized statement in their academic paper.

An objection to Chung and Alcacer's research - and to the notion of knowledge seeking via foreign expansion, in general - might be that investing abroad is a costly way to learn. After all, patents and technical manuals are widely published, and newly graduated scientists and engineers are eager for jobs.

But Chung argues that this objection misconstrues the nature of knowledge. 'Knowledge can be broken into a codifiable piece - the stuff you can write down - and a tacit piece,' he explains. ... Consider eating at a restaurant, he says. 'You don't really experience it unless you go there yourself. You can have someone tell you about it. You can order takeout from the restaurant. You can buy the cookbook. But to get the full benefit of the experience, you have to go there.'


Think about it: Which advantage is eroded more easily - an emerging economy's lower labor cost, or the United States' R&D and innovation prowess? (Doubters may want to read Thomas Friedman's recent op-ed.)

By all means, negotiate IPR protections when entering alliances and joint-ventures, but don't get paranoid about foreigners transferring their money and their researchers here.

Thursday

What is innovation?

What is innovation? Approaches to distinguishing new products and processes from existing products and processes, Bruce S. Tether, 2003

In this paper, Bruce Tether from CRIC discusses various definitions of innovation and some tools for identifying innovations in research.

Tether argues that innovation is sometimes defined as an achievement, sometimes as the results or consequences of that achievement, and sometimes as a business approach - and that there is considerable confusion between the three.

Innovation as achievement

It can be difficult to know how easy it will be to develop a new technology, and the extent to which people will want to use it when it is available. For economists, these represent two types of uncertainties in the development of new technologies - technological uncertainties (will it work?) and market uncertainties (will it sell, how quickly, and will competitors quickly introduce their own versions of the product if it proves successful?). The presence of these uncertainties is often an indicator of innovation and will be important in various definitions of innovation.

Innovation as achievement (in the face of risk and uncertainties) can be two-fold:
1. Achieving a significant leap forward in the technological frontier
2. Re-conceptualising existing problems and thereby restructuring technological systems


Innovation as the consequence of achievement

'Great innovations' are primarily thought great because of the consequences of technologies, and not necessarily because of the novelty of the achievement itself, which in any case has usually transformed substantially from the original achievement through the accretion of little details.

What is important here is that innovation has unintended consequences that benefit everyone. Economists call these unintended consequences spillovers or positive externalities. (There can, of course, also be negative externalities.)

Innovation as dynamic capabilities

The conceptualisation of innovation - as a process - is becoming more widespread. Here, innovation is less associated with particular acts or achievements (and their consequences), and is more associated with an attitude of mind, and a whole ensemble of behaviours and practices associated with that attitude.

A truly innovative firm is not one that introduces a new product 'once in a blue moon', but is instead one that is continuously engaged in practices intended to enhance the probability that it will 'discover' new or better products or processes of making them. ... Central to this concept of innovation is being alive to change. Being flexible - being able to adapt what is done in different circumstances, such as to particular customers' needs - is usually insufficient to constitute being truly innovative.

Firms that are innovative tend to have 'dynamic capabilities': 'A dynamic capability is a learned and stable pattern of collective activity through which the organisation systematically generates and modifies its operating routines in pursuit of improved effectiveness' (M. Zollo and S. Winter, 2002). This requires a combination of strategic and organisational skills.

Innovation in this sense does not necessarily coincide with innovation in the form of introducing new products and services.

Product innovation

One conceptualization of product innovation (Saviotti and Metcalfe, 1984) distinguishes between technical characteristics and service characteristics of a product, and maps the two onto each other. (E.g. the number of cylinders in a car's engine vs. its acceleration.) Innovation can then happen in 5 conceptually different ways (in practice they are often interdependent; see the sketch below):
1. A change in the absolute values of one or more of the technical characteristics
2. A change in the mixture or balance of the technical characteristics
3. A change in the pattern of mapping between the technical characteristics and the service characteristics
4. A change in the mixture or balance of the service characteristics
5. A change in the absolute values of one or more of the service characteristics.
The problem with this approach is that it is practically impossible to measure service characteristics, such as brand value, objectively.

Process innovation

The pattern of innovation in processes is likely to differ from that of products, and particularly from standardised 'mass produced' products. With standardised products, a new product is typically introduced (following processes of experimentation and prototype development) after which it will remain unchanged for some time. A few minor upgrades will be introduced over time, after which a bigger, generational change will be undertaken. ... The cycle is then repeated with the third and fourth generations of the product. Over time, the scale of improvement between the generations is likely to decline. ... Innovation is fairly easily measured, at least in principle, by the scale of the jumps, or steps between the products available.

By contrast, innovation in processes - and indeed in services as well as customised products - can follow a different path. Sometimes there is rapid learning immediately after the introduction of the process - this is the learning curve: initially the process is slow and inefficient, but gradually, and on a continuous basis, minor improvements are introduced. The same pattern is repeated if a substantially new process is later introduced. Alternatively, after the initial innovation, there may be a slow process of continuous improvement, but after a while the scope for improvement diminishes.

The point here is that improvements, and innovations, may be much harder to identify in processes ... than with the archetypal mass-produced standardised products because improvements are less likely to occur in definite steps. In particular, it can be difficult to distinguish between variations and innovations.


Radical innovation and the hierarchical decomposability of technologies

The literature on innovation is replete with references to radical and incremental innovations, yet there is considerable confusion about what distinguishes an incremental from a radical innovation.

Three definitions:
- Freeman (1982): Radical innovations are those that transcend the technical limitations (of the existing technologies)
- Saviotti, Stubs, Coombs, and Gibbons (1982): Incremental innovation can be defined as a series of quantitative changes in known parameters or in the introduction into a given product of technical characteristics already used in some similar product. A radical innovation would be, instead, the appearance of a new technical characteristic.
- Tushman and Anderson (1986) focus on the impact on the industry: Competence-destroying discontinuities are so fundamentally different from previous dominant technologies that the skills and knowledge base required to operate the core technology shift. ... Competence-enhancing discontinuities are order of magnitude improvements in price/performance that build on existing know-how within a product class.
Unfortunately, each case leaves significant room for disagreement in trying to categorize specific innovations.

Constant (1987) clarifies some of the issues by conceptualising technological systems. Ontologically, systems are composed of sub-systems which are composed of an immense variety of components. ... This hierarchical decomposability suggests the absolute relativity of all change: Whether a given change is perceived as radical or incremental depends solely on the hierarchical level chosen. A new valve, a new turbine material or fabrication technique may represent a revolutionary solution to a specific sub-problem at that level; yet the same change, viewed from the level of the total aircraft system may appear only as a typical incremental innovation.

Practical approaches to identifying innovation

Tether distinguishes between 'object based' approaches (the researcher identifies innovations) and 'subject based' approaches (the researcher asks firms about their innovative behaviour).

Several approaches are presented with a special emphasis on OECD's Oslo Manual, which is considered the international standard. Although it helps in specifying innovations, in Tether's opinion it doesn't distinguish enough between adoption of innovations and innovative activity. According to the Oslo Manual, a firm is innovative if it adopts a new technology - even if this adoption occurs without any learning, adaptation or risk-taking on the part of the firm.

Recommendations

Tether recommends a 3-dimensional approach to the evaluation of innovations.

For products: Conceptual novelty, technological uncertainty, and market uncertainty.
For processes: Conceptual novelty, technological uncertainty, learning & adaptation.

Comments

This could be a useful approach in justifying my choice of industry for case studies. A problem remains, however, in that the case studies will most likely be in the service industry. How do Tether's recommendations apply to services? The analytical dimensions of product innovations work for services. However, many will develop more along the lines of processes, with many incremental improvements rather than large 'jumps'.

In the end, it might be more useful to work with the concept of innovation as a process/attitude/set of practices.

A footnote: Tether emphasizes that he is talking about innovation, not invention - the difference being that an innovation has been commercialised, whereas an invention has not.

Stroke of luck or genius?

I have often heard the saying, 'when luck knocks on the door, you have to get up and answer.' Apparently, Louis Pasteur put it far more elegantly:

'Chance favours the prepared mind.'

(discovered in 'What is innovation?')

Wednesday

Human interface design

The Center for Advanced Media (CAM) at Pace University researches:

* Novel and economical collaborative immersive reality systems.
* Computer-supported community building through shared virtual environments.
* Applications of artistic methods to autostereographic display.
* Digital signal processing analysis tools for financial modeling and visualization.
* Evolutionary collaborative hypermedia approaches to global database systems.
* Handwriting recognition of authorship.
* VoiceXML applied to sales force automation and customer management.
* Military applications of wearable computers and augmented reality.


The work on immersive reality systems and databases sounds particularly relevant to long-distance collaboration and innovation...

ICTs versus face-to-face interaction for problem solving

Sources of ideas for innovation in engineering design, Ammon Salter, David Gann, 2003

This paper, published in Research Policy, vol. 32, no. 8, discusses where engineers find new ideas to solve design problems.

Not surprisingly, the paper suggests that personal, face-to-face interactions remain essential for designers working in project-based environments. The findings reveal that although designers are keen users of information and communication technologies (ICT), they rely heavily on close, personal interaction to solve problems, to develop ideas and to assess the quality of their work.

The authors provide an overview of generally recognized sources of ideas for innovation: primarily internal sources, but also customers, suppliers and competitors. Industrial fairs and exhibitions play an important role as well as (in some industries) professional conferences.

The engineering design process itself is principally concerned with how things ought to be. It involves thinking ahead creatively in order to make a technical object fitting the requirements of users or clients. This process of creation often involves developing new combinations of existing technologies. Hacker argues that the engineering design problem-solving process evolves through a series of iterative and overlapping phases: from problem identification, through development of different conceptual solutions, to designing a favoured solution and working out details of the physical artefact.

The role of ICT tools in this process is not clear. Some argue that new packages of ICT tools have the potential to fundamentally alter the design process (see Steinmueller, 2004). Some suggest that these new tools are leading to the codification of the knowledge-set underlying design activities (David and Foray, 1995). The new ICT tools allow for virtual exchanges across space and time between engineering design teams. New visualisation software and simulation packages can also be seen to lessen the need for face-to-face contact and tacit experience.

By contrast, Nightingale suggests that there is little or no evidence that the importance of tacit knowledge for design is declining. In fact, new ICT tools may increase the need for personal, tacit skills and face-to-face communication.

In their survey, Salter and Gann found that designers rely on close, personal relationships for developing ideas in their work. The two highest rated sources of ideas for engineering design were related to face-to-face contact (84%) and working with others on projects (81%). Previous experience was cited by over 70% of the sample.

This was supported by results from personal interviews. In interviews, designers indicated the importance of close contact with others within their team and more widely within the firm. There was considerable interchange between young inexperienced designers and more experienced staff.

Clients and end-users were considered relatively less important sources of ideas. On the one hand, they were considered too far removed from the highly technical problems that the engineers faced. On the other hand, project managers acted as gatekeepers between engineers and clients.

Despite the fact that Arup is among the highest spenders on ICT tools in the UK design engineering sector, only 25% of its designers found on-line databases and working with new equipment and software to be an important source of ideas for design. ... Our survey shows that few designers used electronic scouting or CAD programmes for solving problems. ... Interviewees suggest that these media lack the immediacy or usefulness of other forms of communication.

The survey showed that few Arup designers thought access to information was a barrier to their design activities. ... As Court et al. have suggested, few designers lack information, instead what they lack is time.

In their conclusion, the authors discuss the importance of ICT tools vs. face-to-face communications. The research shows that although designers may be keen users of new ICT tools, they still rely on personal exchanges and visual communication for the difficult parts of their work. This finding is supported by historical and ethnographic studies of engineering design that have shown that face-to-face communication among designers is necessary when there is a high level of uncertainty in the engineering design process. ... The immediacy of sketching and face-to-face exchanges is a key part of how engineering designers solve problems. New ICT tools have not yet altered the interactive nature of the design process.

Court et al. found that even when designers work on-line, a number of face-to-face meetings are necessary to build up trust to enable successful collaboration. ... Our study confirms the Court et al. (1997) view that designers suffer from information overload. New ICT tools have tended to increase the amount of documentation in the design process. Personal contact is essential to sift through this mountain of information.

The authors find that 'mixed use' is the best way to describe the application of ICT tools in the design process.

Unfortunately, they don't analyze the difference between face-to-face interaction and personal communication using ICTs (e.g. e-mail). It would be interesting to know how much of the personal interaction needs to happen face-to-face, and how much can be mediated by ICTs.

Some comments on the authors' methodology: many studies in this field are carried out using large-scale surveys in which one representative (often a high-level R&D manager) responds on behalf of an entire firm. Salter and Gann used a different approach to better understand the nuances of engineers' approaches to innovation. Through a series of interviews, they built a case study of the firm in question. Based on this, they then carried out a survey of the firm's design engineers. This allowed them to understand day-to-day work better than the traditional approach would have.

They offer a complementary approach to large-scale innovation surveys by focusing on a detailed study of the ideas for innovation in engineering design in a single company. ... There have been few studies of sources of ideas for innovation in engineering design. This paper attempts to fill this gap in the literature by combining interview and survey data.

Monday

Biotech/bioinformatics India overview

www.biospectrumindia.com

I just finished scanning a year's worth of BioSpectrum issues. Here's a quick overview (with a severe bias towards bioinformatics). For facts and figures, the magazine's BioData is a good resource.

India isn't among the big players yet in terms of biotech, though it does seem to be positioning itself for a leading role in bioinformatics thanks to its strong IT sector. Nevertheless, there is much excitement about biotech taking off, founded not just on euphoria but on: the availability of natural resources (a diverse gene pool and ecosystems) and scientific talent; a pharma sector that is moving from manufacturing generic drugs to drug discovery and services (e.g. contract research); a large domestic market for bioagri; and traditional medicinal knowledge that provides a unique starting point for drug discovery. Of course, there is also the cost advantage of conducting biotech in India rather than in the United States.

So far, vaccines seem to be the single largest biotech business, including exports (mainly WHO-driven) and development of new vaccines. Bioinformatics is a small business by comparison (one tenth the revenues of vaccines in 2002/2003), but the fastest-growing one. However, given that the global market is smaller to begin with and that India is entering it with proven IT competencies, chances are good for status as a global player. Analysts expect India to capture 5% of the global market by 2005. Another sector that relies on India's established reputation as a global player in ITES (IT-enabled services) is the market for contract research opportunities (CRO).

In its early stages, bioinformatics was largely a software services business based on supplying custom-made code for foreign pharma and biotech firms. Increasingly, however, Indian bioinformatics firms are launching proprietary products. A growing niche could be internet applications: 'In contrast to the accelerated growth that the internet has experienced, companies in the biotech sector have just begun to utilize the variety of internet applications available. Although maintaining a web presence and accessibility to research exists for these companies, the industry has relatively overlooked the possibilities of B2B commerce, ASP applications or the ubiquitous wireless domain,' says Aditya M Reddy, CEO of Hyderabad-based DeUS Infotech Private Ltd.

So far the main customer base (80% of data-driven drug discovery) is in the US, and to a small extent in Europe, Australia, Singapore and Japan. The domestic pharma and biotech industries are considered by many to be too young to represent a significant market yet.

Despite this, an Ernst & Young survey showed only 3 Indian cross-border alliances in biotech for 2002.

Industry insiders warn that the service business in biotech is limited and that, for India, the real money is in discovering new drugs for ourselves and not in supplying information and data to foreign companies, who would then use this information to discover new molecules.

Various clusters are aggressively marketing their locations, e.g. Hyderabad (Genome Valley) and Bangalore (the biotech city).

The main bioinformatics players are Strand Genomics, CDC Linex, Bigtec, Institute of Bioinformatics, Jubilant Biosys, Ocimum Biosolutions, Mascon Life Sciences, Bilcare, Scinova, SysArris, Molecular Connections, and SERC.

The major software outsourcing companies, such as Infosys, Wipro Health Sciences (or Wipro Healthcare), TCS, Kshema Technologies, and Satyam Computer Services are also exploring opportunities. However, their strength seems to lie in providing management software for biotech and pharma companies (bio-IT), rather than specialized bioinformatics.

International players (in bioinformatics and bio-IT) are IBM/IBM India, Intel, Sun Microsystems, Oracle, and Cognizant Technologies.

What is bioinformatics?

Workshop report: Impact of emerging technologies on the biological sciences, National Science Foundation, 1995

Essentially, bioinformatics involves the management of enormous databases of biological (especially microbiological) information. In addition, 3-D visualization is becoming increasingly important.

Narayan Kulkarni of BioSpectrum names 3 subdisciplines:

- the development of new algorithms and statistics with which to assess relationships among members of large data sets,

- the analysis and interpretation of various types of data including nucleotide and amino acid sequences, protein domains and protein structures, and

- the development and implementation of tools that enable efficient access and management of different types of information.



Recently, the field has expanded from the traditional areas of genomics (the study, via molecular sequencing, of the complete set of genes of an organism) and proteomics (the study of amino acid sequences and the three-dimensional structure related to the function of proteins). Newer developments are cheminformatics, glycomics (the study of carbohydrates), metabolomics and drug design through bioinformatics.

The NSF's 1995 report (dated, but still useful) states:

Bioinformatics is the facilitation of biological research by improving our ability to accumulate, manipulate and visualize data.

Bioinformatics involves all aspects of advanced computer science and engineering. It includes the high-speed acquisition of biological data, followed by the high throughput processing, analysis, archiving, data search and retrieval, networking, and display of complex biological data sets. This may be the single most pervasive emerging technology in terms of applications for biological research.

Large databases that can be accessed and analyzed with sophisticated tools will become central to biological research and education. The information content in the genomics of organisms, in the molecular dynamics of proteins, and in population dynamics, to name but a few areas, is enormous. Biologists are increasingly finding that the management of complex data sets is becoming a bottleneck for scientific advances. Therefore, bioinformatics will rapidly become a key technology in all fields of biology.

The present bottlenecks in bioinformatics include the education of biologists in the use of advanced computing tools, the recruitment of computer scientists into this evolving field, the limited availability of developed databases of biological information, and the need for more efficient and intelligent search engines for complex databases. Common data structures and user interfaces will be necessary to leverage investments in software development.


Aside from bioinformatics, narrowly defined, there are related fields where biology and IT meet:

- knowledge management tools and knowledge systems to aid drug discovery
- management support, e.g. software to track clinical trials or regular CRM, ERP tools for the biotech industry.

Important fields where biology and other emerging technologies meet are:

- computational biology applied to complex systems to yield progress in structural biology (e.g., molecular dynamics; chemical events in cells, tissues, organs, and organisms; and population and ecosystem dynamics);

- functional imaging tools using biosensors and biomarkers for defining the function of cells, tissues, organs, and organisms;

- transformation and transient expression technologies to allow animals, plants, and cell culture systems to be used as expression systems for production of compounds for research and commerce; and

- nanotechnologies to build small machines for microanalysis and micromanipulation.


Some of these may, of course, require IT support.

Sunday

Business ecosystems - perfect metaphor for the biotech industry

Strategy as Ecology, Marco Iansiti and Roy Levien, 2004

Another article from the March edition of HBR. Iansiti and Levien compare networks of suppliers, distributors, outsourcing firms, makers of related products or services, technology providers, etc. to ecosystems in nature. Certainly not a new concept for anyone who has read up on networks recently, but they carry the analogy further to derive business strategies and insights. An interesting read.

Like an individual species in a biological ecosystem, each member of a business ecosystem ultimately shares the fate of the network as a whole, regardless of that member’s apparent strength. From their earliest days, Wal-Mart and Microsoft—unlike companies that focus primarily on their internal capabilities—have realized this and pursued strategies that not only aggressively further their own interests but also promote their ecosystems’ overall health.

They have done this by creating “platforms”—services, tools, or technologies—that other members of the ecosystem can use to enhance their own performance. Wal-Mart’s procurement system offers its suppliers invaluable real-time information on customer demand and preferences, while providing the retailer with a significant cost advantage over its competitors. (For a breakdown of how Wal-Mart’s network strategy contributes to this advantage, see the exhibit “The Ecosystem Edge.”) Microsoft’s tools and technologies allow software companies to easily create programs for the widespread Windows operating system—programs that, in turn, provide Microsoft with a steady stream of new Windows applications. In both cases, these symbiotic relationships ultimately have benefited consumers—Wal-Mart’s got quality goods at lower prices, and Microsoft’s got a wide array of new computing features—and gave the firms’ ecosystems a collective advantage over competing networks.

Assessing Your Ecosystem’s Health

So what is a healthy business ecosystem? What are the indications that it will continue to create opportunities for each of its domains and for those who depend on it? There are three critical measures of health—for business as well as biological ecosystems.

Productivity. The most important measure of a biological ecosystem’s health is its ability to effectively convert nonbiological inputs, such as sunlight and mineral nutrients, into living outputs—populations of organisms, or biomass. The business equivalent is a network’s ability to consistently transform technology and other raw materials of innovation into lower costs and new products. There are a number of ways to measure this. A relatively simple one is return on invested capital.

Robustness. To provide durable benefits to the species that depend on it, a biological ecosystem must persist in the face of environmental changes. Similarly, a business ecosystem should be capable of surviving disruptions such as unforeseen technological change. The benefits are obvious: A company that is part of a robust ecosystem enjoys relative predictability, and the relationships among members of the ecosystem are buffered against external shocks. Perhaps the simplest, if crude, measure of robustness is the survival rates of ecosystem members, either over time or relative to comparable ecosystems.

Niche Creation. Robustness and productivity do not completely capture the character of a healthy biological ecosystem. The ecological literature indicates that it is also important these systems exhibit variety, the ability to support a diversity of species. There is something about the idea of diversity, in business as well as in biology, that suggests an ability to absorb external shocks and the potential for productive innovation. The best measure of this in a business context is the ecosystem’s capacity to increase meaningful diversity through the creation of valuable new functions, or niches. One way to assess niche creation is to look at the extent to which emerging technologies are actually being applied in the form of a variety of new businesses and products.


Considering the proposition that the biotech industry has entered a phase characterized by many small firms forming alliances/networks with each other, with research institutes and with pharma multinationals, the analogy seems particularly relevant.

Search

Courtesy of Micah Alpern, I now have a search engine on the blog! Unfortunately, I think it only links to the blog homepage, although the link to Google's cached snapshot should be a bit more useful. Also, it only works if Google has actually crawled the link in question. Since those occasions are few and far between, I'm not sure how useful it will turn out to be... Cool, nevertheless.

Saturday

When at first offshoring doesn't succeed

Tough Shift - Lesson in India: not every job translates overseas, Scott Thurm, 2004

In Wednesday's Wall Street Journal, Scott Thurm writes about ValiCert's travails as it tried to offshore programming work, first by hiring Infosys and later by opening its own subsidiary in Bangalore.

In a nutshell, offshoring didn't work at first because:
- ValiCert kept changing the definitions or goals of offshored projects. Many projects were cancelled or delayed after months of work. This led to (a) Infosys changing the team members assigned to ValiCert and (b) software engineers at ValiCert's subsidiary becoming utterly frustrated.
- Offshored projects were usually small parts of larger efforts and required intensive coordination. This approach failed because of difficulties in managing teams spread across 14 time zones, and because a lot of information that was considered intuitive or taken for granted in Silicon Valley was not available in Bangalore.
- Coordination and communication problems led to severe delays, inefficiencies and a breakdown of trust between the Indian and US teams.

Eventually, ValiCert learned how to make profitable use of offshoring and now believes that the company would not have survived without it. Whereas at first ValiCert expected that colleagues would swap work across the globe every 12 hours, helping it 'put more people on it and get it done sooner,' it now offshores entire projects, such as adapting an entire program to a different operating system. It has also improved communications flows between its operations in India and the US, to ease the burden of communicating across time zones, or at least spread it fairly.

Nevertheless, Brent Haines, in charge of coordinating the US and Indian teams, commented that such collaboration requires extensive planning, ... 'something very unnatural to people in software.'

ValiCert merged with Tumbleweed in Feb 2002. The combined Redwood City, Calif., company's 150 engineers today are almost evenly divided among California, the Tumbleweed operation in Bulgaria, and the India office started by ValiCert. In Bulgaria, engineers write and test software, and scan millions of e-mails daily for traces of spam. In India, engineers test software, fix bugs and create new versions of one product. Last September, Tumbleweed released its first product developed entirely in India, a program that lets two computers communicate automatically and securely. Mr. Marur's team had worked on it for over 18 months. Core development for new products remains in California, where engineers are closer to marketing teams and Tumbleweed's customers.

On the European biotech sector

Why does the European biotech sector underperform? Mark Greener, 2004

In its December / January 2004 issue, Eurobusiness (now discontinued) carried a story on the worries of the European biotech sector.

While biotech in Europe shows impressive growth and success, it is definitely underperforming compared to the US industry. The main reason for this, according to Greener, is a lack of venture capital.

The European biotech sector is younger and less mature than its US counterpart. Companies are smaller - with market capitalizations that often fall short of investment funds' thresholds; their products are less developed and require more patience from investors; and there are few high-profile success stories yet to encourage VCs.

These drawbacks are exacerbated by risk-averse investors (much funding of European biotech ventures actually comes from US, not European, sources) and a lack of successful, co-ordinated stock exchanges. Most financing comes from partnerships with or acquisitions by large pharma firms.

Also, an aversion to GM food, a difficult regulatory environment, and a lower rate of entrepreneurship don't help.

The problem is, of course, that a lack of funds and success stories can stifle growth and reduce spending on new R&D, thereby endangering the future of the entire biotech - and by extension also pharma - industry.

An aside: there is a VC fund that has adapted its venturing model to the European market. Instead of investing in 10 firms and hoping that one will be an overwhelming success, its revenue model is based on, say, 5 out of those 10 companies delivering reasonably solid returns. Now if only I could remember the name of the fund and where I read about it...

Bringing innovations to market in networked industries

The New Rules for Bringing Innovations to Market, Bhaskar Chakravorti, 2004

Bhaskar Chakravorti, author of 'The Slow Pace of Fast Change,' discusses the pitfalls of innovation in networked industries in this HBR article. (See also this earlier post.)

Chakravorti bases his argument on game theory and network economics.

When a market is in equilibrium (i.e. Nash equilibrium) every player in a market believes that he or she is making the best possible choices and that every other player is doing the same. Equilibrium in a market lends stability to the players' expectations, validates their choices, and reinforces their behaviors. When an innovation enters the market, it upsets the players' expectations and choices and introduces uncertainty in decision making.

So, once a market reaches equilibrium, it resists new ideas and new products and significantly favors incumbents who maintain the status quo.
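This status-quo bias can be made concrete with a textbook two-player coordination game (my construction - the article itself stays qualitative):

```latex
% A stylized 2x2 coordination game. Each player either sticks with the
% Old product or switches to the New one; entries are (row, column) payoffs.
\[
\begin{array}{c|cc}
           & \text{Old} & \text{New} \\ \hline
\text{Old} & (2,\,2)    & (2,\,0)    \\
\text{New} & (0,\,2)    & (3,\,3)
\end{array}
\]
% Both (Old, Old) and (New, New) are Nash equilibria. (New, New) is better
% for everyone, yet starting from (Old, Old) a lone switcher earns 0 instead
% of 2, so neither player deviates unilaterally - the market stays locked in
% to the inferior product until expectations shift together.
```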

A market's hostility to innovations becomes stronger when players are interconnected. In a networked market, each participant will switch to a new product only when it believes others will do so, too. ... When America's first transcontinental railroads were built in the 1860s, for example, factories and businesses that were close to waterways did not immediately relocate near railways. They did so only when they felt their customers and suppliers were making the switch, too.

Communications technology provides virtual connections between market participants and can affect the adoption of new products. Using these technologies, market participants send signals about their behavior and allow others to form expectations.

For instance, E. Remington and Sons introduced the first typewriter in 1874, a time when penmanship was still a highly respected skill. Most writers (with the exception of Mark Twain) initially shunned the typewriter. But as the growth of railroads, telephones, and telegraph lines dispersed companies and depersonalized communications, the typewritten document became the standard for written business communications, and use of the typewriter spread. Thus, the railroads, the telephone, and the telegraph implicitly increased the speed with which consumers accepted the typewriter.

That influence is a two-edged sword.

Networked markets allow for the rapid diffusion of news, ideas, and, in theory, innovations. But they also erect formidable barriers to the adoption of innovations - primarily because of the interdependencies between players.

This reminds me of Barabasi's work on power laws in networks.

Once enough players in a networked market decide to switch to a new product, other players' motivation to do so becomes stronger. Beyond that threshold, the network becomes the innovation's ally rather than its foe.
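That tipping point is easy to reproduce in a toy simulation. Here is a minimal sketch (my own, in the spirit of Granovetter-style threshold models - not code or parameters from the article): agents on a random network adopt only once a critical share of their neighbors has, so small seed groups stall while larger ones flip almost the whole market.

```python
# Toy threshold-adoption model (illustrative; all parameters arbitrary).
import random

def random_graph(n, k, rng):
    """Undirected graph in which each node links to ~k random others."""
    nbrs = {i: set() for i in range(n)}
    for i in range(n):
        for j in rng.sample(range(n), k):
            if j != i:
                nbrs[i].add(j)
                nbrs[j].add(i)
    return nbrs

def cascade(nbrs, seeds, threshold):
    """A node adopts once >= `threshold` of its neighbours have adopted;
    repeat until nothing changes, then return the set of adopters."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nb in nbrs.items():
            if node not in adopted and nb:
                if sum(1 for x in nb if x in adopted) / len(nb) >= threshold:
                    adopted.add(node)
                    changed = True
    return adopted

rng = random.Random(42)
n = 1000
g = random_graph(n, 6, rng)
for seed_frac in (0.05, 0.10, 0.20, 0.30):
    seeds = rng.sample(range(n), int(seed_frac * n))
    final = cascade(g, seeds, threshold=0.4)
    print(f"seed {seed_frac:.0%} -> final adoption {len(final) / n:.0%}")
```

In runs like this, the small seed groups barely grow while the larger ones tip most of the network; where exactly the cut-off sits depends on the topology and the threshold, which is precisely the point.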

This sounds much like a game-theoretic explanation of Schumpeter's 'creative destruction.'

Chakravorti also emphasizes the importance of hubs for innovators. Aligning interests with the most connected industry players gives an innovator access to a large network with very little effort.
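Staying with the toy-model theme (again my own illustration, not the article's): on a hub-heavy, preferential-attachment network, the same word-of-mouth process started at the best-connected node reaches, on average, far more of the market than one started at a random node.

```python
# Hub vs. random seeding on a scale-free network (illustrative sketch).
import random

def ba_graph(n, m, rng):
    """Barabasi-Albert-style graph: each new node links to m existing
    nodes chosen with probability proportional to their degree."""
    nbrs = {i: set() for i in range(n)}
    weighted = list(range(m))   # node ids, repeated once per edge end
    for new in range(m, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(weighted))
        for t in targets:
            nbrs[new].add(t)
            nbrs[t].add(new)
            weighted += [new, t]
    return nbrs

def reach(nbrs, seed, p, rng):
    """Word-of-mouth spread: each adopter gets one chance to convert
    each neighbour, succeeding with probability p; return total reach."""
    adopted, frontier = {seed}, [seed]
    while frontier:
        node = frontier.pop()
        for nb in nbrs[node]:
            if nb not in adopted and rng.random() < p:
                adopted.add(nb)
                frontier.append(nb)
    return len(adopted)

rng = random.Random(7)
g = ba_graph(2000, 2, rng)
hub = max(g, key=lambda v: len(g[v]))   # best-connected player
other = rng.randrange(2000)             # arbitrary player
for label, seed in (("hub   ", hub), ("random", other)):
    avg = sum(reach(g, seed, 0.1, rng) for _ in range(500)) / 500
    print(f"average reach from {label} seed: {avg:.1f}")
```

The specific numbers are meaningless; the ordering - the hub reaches a multiple of what a random player reaches, for the same effort - is the network-economics version of Chakravorti's advice.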

While he goes on to recommend a strategy for innovators who are trying to move a market from one equilibrium to another in order to promote their new products, there is little mention of markets that are not yet in equilibrium. I could imagine that the bioinformatics industry is still too young to have reached a Nash equilibrium and that network ties are not yet stable enough to deter innovation. Or, to put it another way: the biomedical industry has experienced two 'waves' - has it reached equilibrium in the second wave yet?

And what will happen to an industry so dependent on, even defined by, innovation once it does reach equilibrium?

Thursday

The Bioeconomy and what it means for regional economies

Prospects for a Bioeconomy: The Biomedical Industry and Economic Development, Cinda Herndon-King and Richard S. Seline, 2000

Without too many facts to fall back on, I have proposed that hi-tech industries are moving away from a pure cluster model towards a 'network of competing and cooperating clusters.' This report backs me up as far as the biomedical industry is concerned. There's more on networks of innovation and on regions collaborating to compete at the website of New Economy Strategies.

Cinda Herndon-King and Richard Seline analyzed 28 regions in the United States, with a special emphasis on the 4 most important clusters: Boston, San Diego, the Bay Area and Seattle. At the time the report was written, biotech was poised to pick up investment and momentum from the deflating internet bubble economy.

Herndon-King and Seline provide a comprehensive overview of the biomedical industry. They point out the enormous market potential of the health care industry in the U.S., mainly due to an aging population with rising life expectancy. Much of the potential, however, also arises from the fact that genomic pharmaceuticals allow far more personalized health care and a far broader scope of treatments - beginning with highly targeted preventive care.

They cite Mark Dibner and list 7 factors that distinguish the biomedical industry from other high-tech sectors:
1. Financing: The start-up costs of the business are high and generally not financed by the entrepreneur.
2. Reliance on research base: Most biotechnology companies (55%) are engaged in research and development activities only.
3. Time to market: Typically, five to twelve years are required. Return on investment for early investors is based not on product sales but on the increasing valuation of the company, realized upon exit (see the short discounting example after this list).
4. Regulatory environment: The cost of the drug development and approval process is estimated at an average of $300 to $500 million per drug. The time required for approvals is highly variable and often depends on factors outside the control of the submitting company.
5. Dependence on patent issues: Attracting investment requires a strong global intellectual property position.
6. Alliances and outsourcing: Due to the high costs of doing business, biotechnology firms extensively leverage outside skills, technology and capital through alliances. Reliance on academic innovation has emerged as the primary factor affecting biotechnology industry cluster development.
7. Influence of public perception and environment.
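A quick discounting calculation (my numbers, not Dibner's) shows why the long time to market in point 3 pushes early investors toward valuation step-ups rather than product revenue:

```latex
% Illustrative only: at a 20% annual discount rate (venture-stage risk),
% a dollar of revenue earned ten years out is worth little today:
\[
\mathrm{PV} = \frac{\$1}{(1 + 0.20)^{10}} \approx \$0.16
\]
% With product sales a decade away contributing so little present value,
% early investors realize returns through rising valuations at each
% financing round, and ultimately upon exit.
```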


Two major trends that form a recurring theme throughout the report are:
1. the interrelationship of tools and enabling technology with basic scientific discovery. The distinction between providing equipment or software and conducting basic research is blurred since so much discovery depends on the development of specialized or custom-made new tools.
2. the requirement for interdisciplinary approaches to biomedical research, bioinformatics being a case in point for both trends.

The authors go on to describe 2 phases of the industry:

The first wave business model centered on the 'fully integrated pharmaceutical company' that licensed, financed, managed, and fought the federal regulatory labyrinth around (typically) a university patent or paper. This fully-integrated model housed the research, the testing, the manufacturing, and the distribution and sales for all aspects of bringing a drug or product to the market.

The second wave of the biotech industry is best defined by the reliance upon outsourcing and business networks rather than the integration model. Simply, the biotech and life science industry has found alliances, networks among researchers-vendors-suppliers, and a more concentrated and accelerated focus of both the science and the economics to be not just valuable but competitive propositions.

This has implications for regional economies that focus on biotech/biomed:

The Second Wave therefore is permeating regional strategies: proximity is no longer a value proposition in all elements of the lifecycle. Proximity to new ideas, to faculty, to research facilities promises greater innovation (defined as a social process among inputs of the science and outputs of entrepreneurial formation), but as firms mature the proximity demand within a region is challenged. Seattle for instance found in the late 1980s that no strategic marketing firms existed in their region and thus turned to Los Angeles and New York for assistance. Over a three year period, enough demand was created in Seattle that approximately 30 firms were established to serve the growing strategic marketing and sales requirements – many were outposts from Los Angeles and New York, others were home-grown. Currently San Diego has exceeded its manufacturing capacity – land is in short supply and costly; an initiative is underway to partner with border cities in Mexico and communities outside of California for non-essential manufacturing services.

There is a shift from self-contained regional clusters to specialized networked regions (see graph on page 44 of the report).

This is a reflection of changes in the industry itself as it moved from full vertical integration within one firm to a greater reliance on networks and alliances.

Proximity matters but not as it once did - like a fully-integrated company, regions believed that they must manage or control all aspects of the product cycle. With the determination that not every region has all the critical ingredients, more and more expectations arise for networking with other institutions, knowledge, talent and entrepreneurs beyond the local community. Proximity matters because innovation is a social process but not all aspects of the product testing and development must rely on the capacity to 'rub shoulders' with the testing, trials, and manufacturing aspects of the industry.

But the question remains: Which aspects require shoulder rubbing, and which don't?

Wednesday

Biotech resource

I found a newish industry magazine that covers biotech in India: Biospectrum. They provide a great industry overview, lots of stats and profiles - and best of all, all their back issues are online.

Thanks Reuben, for linking to my blog and motivating me to start posting again!