
Article 1: How big data can revolutionize pharmaceutical R&D
By Jamie Cattell, Sastry Chilukuri, and Michael Levy

April 2013

After transforming customer-facing functions such as sales and marketing, big data is extending its reach to other parts of the enterprise. In research and development, for example, big data and analytics are being adopted across industries, including pharmaceuticals. The McKinsey Global Institute estimates that applying big-data strategies to better inform decision making could generate up to $100 billion in value annually across the US health-care system by optimizing innovation, improving the efficiency of research and clinical trials, and building new tools for physicians, consumers, insurers, and regulators to meet the promise of more individualized approaches. (In a related video, McKinsey director Sam Marwaha outlines examples of how analytics is already changing the health-care industry.)

The big-data opportunity is especially compelling in complex business environments experiencing an explosion in the types and volumes of available data. In the health-care and pharmaceutical industries, data growth is generated from several sources, including the R&D process itself, retailers, patients, and caregivers. Effectively utilizing these data will help pharmaceutical companies better identify new potential drug candidates and develop them into effective, approved, and reimbursed medicines more quickly. Imagine a future where the following is possible:

Predictive modeling of biological processes and drugs becomes significantly more sophisticated and widespread. By leveraging the diversity of available molecular and clinical data, predictive modeling could help identify new potential-candidate molecules with a high probability of being successfully developed into drugs that act on biological targets safely and effectively.

Patients are identified to enroll in clinical trials based on more sources (for example, social media) than doctors' visits.
Furthermore, the criteria for including patients in a trial could take significantly more factors (for instance, genetic information) into account to target specific populations, thereby enabling trials that are smaller, shorter, less expensive, and more powerful.

Trials are monitored in real time to rapidly identify safety or operational signals requiring action, avoiding significant and potentially costly issues such as adverse events and unnecessary delays.

Instead of rigid data silos that are difficult to exploit, data are captured electronically and flow easily between functions (for example, discovery and clinical development) as well as to external partners (for instance, physicians and contract research organizations, or CROs). This easy flow is essential for powering the real-time and predictive analytics that generate business value.

That's the vision. However, many pharmaceutical companies are wary about investing significantly in improving big-data analytical capabilities, partly because there are few examples of peers creating a lot of value from it. Nevertheless, we believe investment and value creation will grow. The road ahead is indeed challenging, but the big-data opportunity in pharmaceutical R&D is real, and the rewards will be great for companies that succeed.

The big-data prescription for pharmaceutical R&D

Our research suggests that by implementing eight technology-enabled measures, pharmaceutical companies can expand the data they collect and improve their approach to managing and analyzing these data.

Integrate all data

Having data that are consistent, reliable, and well linked is one of the biggest challenges facing pharmaceutical R&D. The ability to manage and integrate data generated at all stages of the value chain, from discovery to real-world use after regulatory approval, is a fundamental requirement to allow companies to derive maximum benefit from the technology trends.
Data are the foundation upon which the value-adding analytics are built. Effective end-to-end data integration establishes an authoritative source for all pieces of information and accurately links disparate data regardless of the source, be it internal or external, proprietary or publicly available. Data integration also enables comprehensive searches for subsets of data based on the linkages established rather than on the information itself. Smart algorithms linking laboratory and clinical data, for example, could create automatic reports that identify related applications or compounds and raise red flags concerning safety or efficacy.

Implementing end-to-end data integration requires a number of capabilities, including trusted sources of data and documents, the ability to establish cross-linkages between elements, robust quality assurance, workflow management, and role-based access to ensure that specific data elements are visible only to those who are authorized to see them.

Pharmaceutical companies generally avoid overhauling their entire data-integration system at once because of the logistical challenges and costs involved, although at least one global pharmaceutical enterprise has employed a "big bang" approach to remaking its clinical IT systems. Companies typically employ a two-step approach: first, they prioritize the specific data types to address (usually clinical data) and create additional data-warehousing capabilities as needed. The goal is to tackle the most important data first to obtain benefits as soon as possible. This step alone can take over a year and requires significant infrastructure and procedural changes. Second, the company develops an approach for the next levels of priority data, including scenario analysis, ownership, and expected costs and timelines.

Collaborate internally and externally

Pharmaceutical R&D has traditionally been a secretive activity conducted within the confines of the R&D department, with little internal and external collaboration.
By breaking the silos that separate internal functions and enhancing collaboration with external partners, pharmaceutical companies can extend their knowledge and data networks. Whereas end-to-end integration aims to improve the linking of data elements, the goal of collaboration is to enhance the linkages among all stakeholders in drug research, development, commercialization, and delivery.

Maximizing internal collaboration requires improved linkages among different functions, such as discovery, clinical development, and medical affairs. This can lead to insights across the portfolio, including clinical identification and research follow-up on potential opportunities in translational medicine, or identification of personalized-medicine opportunities through the combination of biomarker research and clinical outcomes; predictive sciences could also recommend options at the research stage based on clinical data or simulations.

External collaborations are those between the company and stakeholders outside its four walls, including academic researchers, CROs, providers, and payors. Several examples show how effective external collaboration can broaden capabilities and insights:

External partners, such as CROs, can quickly add or scale up internal capabilities and provide access to expertise in, for example, best-in-class management of clinical studies.

Academic collaborators can share insights from the latest scientific breakthroughs and make a wealth of external innovation available. Examples include Eli Lilly's Phenotypic Drug Discovery Initiative, which enables external researchers to submit their compounds for screening using Lilly's proprietary tools and data to identify whether the compound is a potential drug candidate. Participation in the screening does not require the researcher to give up intellectual property, but it does offer Lilly a first look at new compounds, as well as an avenue to reach researchers who are not typical drug-discovery scientists.
Collaborative open-space initiatives can enable experts to address specific questions or share insights. Examples include the X PRIZE, which provides financial incentives for teams that successfully meet a big challenge (such as enabling low-cost manned space flight), and InnoCentive, which offers financial incentives for individuals or teams that address a specific problem (such as determining a compound's synthesis pathway).

Customer insights can be used to shape strategy throughout the pipeline progression.

Some pharmaceutical companies have made inroads in improving internal and external collaboration, which involves addressing a number of challenges. These include putting in place communications systems and governance to enable appropriate and effective information exchange. Another challenge is to promote a shift in mind-set, moving away from withholding all data and toward identifying which data can be shared and with whom. In addition, pharmaceutical enterprises must understand and mitigate the legal, regulatory, and intellectual-property risks associated with a more collaborative approach.

Some pharmaceutical companies start to improve collaboration by identifying data elements to share with specific sets of trusted partners, such as CROs, and establishing privileged and near-real-time access to data produced by external partners. Such steps are only the beginning, however, as they are essentially just a way to expand the circle of trust to select partners.

Employ IT-enabled portfolio-decision support

To ensure the appropriate allocation of scarce R&D funds, it is critical to enable expedited decision making for portfolio and pipeline progression. Pharmaceutical companies often find it challenging to make appropriate decisions about which assets to pursue or, sometimes more important, which assets to kill. The personnel or financial investments they have already made may influence decisions at the expense of merit, and they often lack appropriate decision-support tools to facilitate making tough calls.

IT-enabled portfolio management allows data-driven decisions to be made quickly and seamlessly. Smart visual dashboards should be used whenever possible to allow rapid and effective decision making, including for the analysis of current projects, business-development opportunities, forecasting, and competitive information. These visual systems should provide high-level dashboards that permit users to examine the data in depth, including information to bolster managerial decision making as well as detailed tactical information, and that make asset performance and opportunities more transparent.
In addition to the technical requirements, portfolio decision making should follow a defined process with known timing, deliverables, service levels, and stakeholders. The people involved in the process should be given clear roles and authority (for example, their ability to make decisions should be defined). Resource allocation should be based on a systematic approach that accommodates top-down budgetary requirements and bottom-up requests. And innovation boards at the corporate level and at the business-unit or therapeutic-area level should review the portfolio regularly. The boards should assess, manage, and prioritize the portfolio based on the corporate strategy and changes in the business landscape or industry context.

Leverage new discovery technologies

Pharmaceutical R&D must continue to use cutting-edge tools. These include sophisticated modeling techniques such as systems biology, and high-throughput data-production technologies, that is, technologies that produce a lot of data quickly, for example, next-generation sequencing, which, within 18 to 24 months, will make it possible to sequence an entire human genome at a cost of roughly $100. The wealth of new data and improved analytical techniques will enhance future innovation and feed the drug-development pipeline.

Integrating vast amounts of new data will test a pharmaceutical company's analytical capabilities. For example, a company will need to connect patient genotypes to clinical-trial results to identify opportunities for improving the identification of responsive patients. Such developments would make personalized medicine and diagnostics an integral part of the drug-development process rather than an afterthought and would lead to new discovery technologies and analytical techniques.

Deploy sensors and devices

Advances in instrumentation through miniaturized biosensors and the evolution in smartphones and their apps are resulting in increasingly sophisticated health-measurement devices.
Pharmaceutical companies can deploy smart devices to gather large quantities of real-world data not previously available to scientists. Remote monitoring of patients through sensors and devices represents an immense opportunity. This kind of data could be used to facilitate R&D, analyze drug efficacy, enhance future drug sales, and create new economic models that combine the provision of drugs and services.

Remote-monitoring devices can also add value by increasing patients' adherence to their prescriptions. Examples of devices that are under development include smart pills that can release drugs and relay patient data, as well as smart bottles that help track usage. Technology and mobile providers are offering services such as data feeds, tracking, and analysis to complement medical devices. These devices and services, combined with in-home visits, have the potential to decrease health-care costs through shortened hospital stays and earlier identification of health issues.

Raise clinical-trial efficiency

A combination of new, smarter devices and fluid data exchange will enable improvements in clinical-trial design and outcomes as well as greater efficiency. Clinical trials will become increasingly adaptable, reacting to drug-safety signals seen only in small but identifiable subpopulations of patients. Examples of potential clinical-trial efficiency gains include the following:

Dynamic sample-size estimation (or reestimation) and other protocol changes could enable rapid responses to emerging insights from the clinical data. Efficiency gains are achieved by enabling smaller trials for equivalent power or shortening the time necessary to expand a trial.

Adapting to differences in site patient-recruitment rates would allow a pharmaceutical company to address lagging sites, bring new sites online if necessary, and increase recruiting from more successful sites.

Increased use of electronic data capture could help in recording patient information in the provider's electronic medical records. Using electronic medical records as the primary source for clinical-trial data, rather than having a separate system, could accelerate trials and reduce the likelihood of data errors caused by manual or duplicate entry.

Next-generation remote monitoring of sites, enabled by fluid, real-time data access, could improve management and responses to issues that arise in trials.
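The sample-size reestimation idea above can be made concrete with a small calculation. The sketch below is a hypothetical illustration, not taken from the article: it uses the standard normal-approximation formula for a two-arm comparison of means, and every parameter value (effect sizes, error rates) is invented for the example.

```python
# Hypothetical illustration of sample-size (re)estimation; the effect sizes
# and error rates below are invented for the example, not from the article.
import math
from statistics import NormalDist

def per_arm_sample_size(effect_size: float, alpha: float = 0.05,
                        power: float = 0.80) -> int:
    """Approximate per-arm n for a two-sided, two-arm comparison of means.

    effect_size is Cohen's d (difference in means / common SD); uses the
    normal-approximation formula n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# Design stage: a moderate assumed effect (d = 0.5) keeps the trial small.
print(per_arm_sample_size(0.5))   # 63 patients per arm

# Interim look: accumulating data suggest a smaller effect (d = 0.4), so the
# protocol would expand enrollment rather than finish underpowered.
print(per_arm_sample_size(0.4))   # 99 patients per arm
```

In a real adaptive design the reestimation rules are prespecified in the protocol and the analysis is adjusted for the interim look; this sketch only shows why a shrinking interim effect estimate forces a larger trial.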
Improve safety and risk management

Pharmaceutical companies can use safety as a competitive advantage in regulatory submissions and after regulatory approval, once the drug is on the market. Safety monitoring is moving beyond traditional approaches to sophisticated methods that identify possible safety signals arising from rare adverse events. Furthermore, signals could be detected from a range of sources, for example, patient inquiries on Web sites and search engines. Online physician communities, electronic medical records, and consumer-generated media are also potential sources of early signals regarding safety issues and can provide data on the reach and reputation of different medicines. Bayesian analytical methods, which can identify adverse events from incoming data, could highlight rare or ambiguous safety signals with greater accuracy and speed. An early response to physician and patient sentiment could prevent regulatory and public-relations backlashes.

The FDA is investing in the evaluation of electronic health records through the Sentinel Initiative, a legally mandated electronic-surveillance system that links and analyzes health-care data from multiple sources. As part of this system, the FDA has now secured access to data concerning more than 120 million patients nationwide.

Sharpen focus on real-world evidence

Real-world outcomes are becoming more important to pharmaceutical companies as payors increasingly impose value-based pricing. These companies should respond to this cost-benefit pressure by pursuing drugs for which they can show differentiation through real-world outcomes, such as therapies targeted at specific patient populations. In addition, the FDA and other government organizations have created incentives for research on health economics and outcomes.
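Returning briefly to the safety discussion above: the Bayesian signal-detection idea can be sketched with a toy Beta-Binomial model. This is a hypothetical illustration, not the method of any company or regulator named in the article, and all counts and rates below are invented.

```python
# Hypothetical sketch of Bayesian adverse-event signal detection using a
# Beta-Binomial model; all counts and rates are invented examples.
import random

def prob_rate_exceeds(events: int, patients: int, background_rate: float,
                      draws: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of P(true adverse-event rate > background | data).

    Prior: Beta(1, 1), i.e. uniform. Posterior after observing `events`
    among `patients` is Beta(1 + events, 1 + patients - events).
    """
    rng = random.Random(seed)
    a = 1 + events
    b = 1 + patients - events
    hits = sum(rng.betavariate(a, b) > background_rate for _ in range(draws))
    return hits / draws

# 12 adverse events among 200 treated patients, against an assumed 2%
# background rate: the posterior probability of an elevated rate is close
# to 1, so the signal would be flagged for review.
print(round(prob_rate_exceeds(12, 200, 0.02), 3))
```

Production pharmacovigilance systems use more elaborate disproportionality methods over large spontaneous-report databases; the toy model only illustrates how a posterior probability can turn a raw event count into a graded signal rather than a binary alarm.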
To expand their data beyond clinical trials, some leading pharmaceutical companies are creating proprietary data networks to gather, analyze, share, and respond to real-world outcomes and claims data. Partnerships with payors, providers, and other institutions are critical to these efforts.

The challenges of a big-data transformation

For a big-data transformation in pharmaceutical R&D to succeed, executives must overcome several challenges.

Organization

Organizational silos result in data silos. Functions typically have responsibility for their systems and the data they contain. Adopting a data-centric view, with a clear owner for each data type across functional silos and through the data life cycle, will greatly facilitate the ability to use and share data. The expertise gained by the data owner will be invaluable when developing ways to use existing information or to integrate internal and external data. Furthermore, having a single owner will enhance accountability for data quality. These organizational changes will be possible only if a company's leadership understands the potential long-term value that can be unlocked through better use of internal and external data.

Technology and analytics

Pharmaceutical companies are now saddled with legacy systems containing heterogeneous and disparate data. Increasing the ability to share data requires rationalizing and connecting these systems. There is also a shortage of people equipped to develop the technology and analytics needed to extract maximum value from the existing data.

Mind-sets

Many pharmaceutical companies believe that unless they identify an ideal future state, there is little value in investing in improved big-data analytical capabilities. Indeed, they seem to fear being the first mover, since there are few examples of pharmaceutical companies creating a lot of value from the improved use of big data. Compounding their hesitation is concern about increasing interactions with regulators if they pursue a big-data change program. Pharmaceutical companies should learn from smaller, more entrepreneurial enterprises that see value in the incremental improvements that might emerge from small-scale pilots. The experience so obtained might yield long-term benefits and accelerate the path to the future state.
Pharmaceutical companies desperately need to bolster R&D innovation and efficiency. By implementing these eight technology-enabled ways to benefit from big data, they could gradually turn the tide of declining success rates and stagnant pipelines.

Article 2: Healthcare: Big pharma, big data
By Andrew Jack
4 April 2013, Financial Times

Patient groups are hailing a new era of transparency, but drug companies fear its effects.

When Guido Rasi took charge of Europe's medicines regulator in 2011 he inherited an explosive dossier that is now poised to transform drug development. The furore started in 2007 when two Danish academics pushed the European Medicines Agency for greater transparency on 15 clinical trials. Their pursuit of greater openness set in motion a chain of events that stands to revolutionise the pharmaceutical industry in the months ahead. From the beginning of next year, the EMA will release all information about clinical studies submitted to it by organisations seeking authorisation for new treatments. As this deadline approaches, heated parliamentary and legal debates are raging to determine how far this new culture of open access should go.

For advocates of change, including many patient groups and academic researchers, greater transparency in drug trials is essential to understand the full risks and benefits of medicines and reduce development costs. Critics argue that a cascade of publicly available data will threaten the confidentiality of patients taking part in trials and undermine the role of regulators. Many pharmaceutical companies are watching the disputes with trepidation, fearing that the publication of so much trial data could damage their business models by allowing competitors to access the expensive research they undertook to develop their drugs.

Chris Viehbacher, chief executive of Sanofi, the French pharmaceutical company, says: "I'm against a data dump on the sidewalk. This is second-guessing the regulators, and it's not right if we have spent millions of dollars on this data for it to be released to everyone."

In Britain, parliament's science and technology committee has launched an inquiry into whether greater transparency is necessary to stop drug companies cherry-picking data that show their products in a favourable light. The EMA's decision to release so much data now puts Europe ahead of the level of openness expected in the US. Still, a court case initiated last month has illustrated how the action in Europe has implications for companies from the US and further afield. Two US-based companies, AbbVie and InterMune, are suing the EMA in the EU's general court in an attempt to prevent it releasing their drug trial data.

This stand-off over data began six years ago, when Peter Gøtzsche and Anders Jørgensen from the University of Copenhagen wrote to the EMA requesting details of clinical trials on which the agency based its decision to approve the weight-loss drugs Rimonabant and Orlistat. They worried that the drug companies could be concealing the full results of their tests.
By potentially exaggerating benefits and playing down side effects, they feared, the companies risked both harming patients and imposing unnecessary costs on the healthcare system. They sought full details of the protocols describing the trials, the results, and the supporting underlying raw data of tests on each patient trying out the medicines. As they wrote in the British Medical Journal: "The effect on weight loss in the published trials is small, and the harms are substantial." One of the drugs, Sanofi's Rimonabant, was never approved by the US Food and Drug Administration. It has since been withdrawn in Europe over concerns that it triggered suicidal feelings.

After being rebuffed by the EMA on the grounds that such information was commercially sensitive, and therefore exempted from the usual EU rights of freedom of information, the Danes appealed to the European Ombudsman, who ultimately ruled in their favour in 2010. Rather than contesting that opinion, the EMA released the documents in early 2011. Its decision has since triggered a flurry of requests, and officials have responded by making public a number of previously confidential drug authorisation submissions. Mr Rasi set the EMA's new tone in November, saying: "We are not here to decide if we will publish clinical trial data, only how." He plans to release all data as soon as the agency decides to approve or reject a drug.

For medical researchers such as Mr Gøtzsche, this move is essential. He has won support from others including the doctor and writer Ben Goldacre, author of Bad Pharma, and the AllTrials campaign supported by a number of patient organisations and others including the BMJ. They point to the publication in medical journals of only a biased selection of industry-funded clinical trials showing favourable results, and to cases where the scientific questions being studied are altered retrospectively to fit the data better.
Then there are accusations that Merck played down cardiac side effects of its painkiller Vioxx, which was withdrawn from sale globally in 2004, and that GlaxoSmithKline did the same for its diabetes drug Avandia, also subsequently pulled from the market in Europe and North America. The companies denied the allegations.

...

More recent ammunition has come from a group of academics led by Peter Doshi, who has been seeking fuller disclosure of the trials on Roche's antiviral drug Tamiflu. After excluding studies they suspect are not independent, they question the drug's benefit and whether the billions of dollars spent by governments stockpiling it for the 2007 flu pandemic were justified. Roche retorts that the regulators have had access to full clinical trial results, even for studies that were not published in medical journals, and that the efficacy of Tamiflu has been supported by reviews conducted by independent academics to whom it did release more details.

On the subject of full clinical data transparency, the pharmaceutical industry itself is divided. AbbVie and InterMune, the two companies that have sued the EMA, claim greater release will expose commercial secrets and reduce incentives for investment in developing drugs. That includes competitive details of clinical trial design and plans for their medicines. Industry experts fear generic drug companies could use the data to accelerate the approval of cheap copies in jurisdictions where patent rules are weaker, or even steal the data and claim it as their own in regulatory submissions.

Paul Stoffels, worldwide chairman of Janssen, part of Johnson & Johnson, cites a different problem. His company's Zytiga, a new treatment for prostate cancer, is based on an old compound on which the patent was close to expiry. He says he was only able to justify investing in costly clinical trials to support the treatment because of US and EU rules granting an additional period of exclusivity based on the efficacy data. If that information were made public, he argues, such rights would be lost and no company would invest. Others argue rival companies could analyse the results to modify, accelerate, or abandon their trials on similar drugs, saving costs and eroding the competitive advantage of the original developer.
Supporting this view is the fact that four-fifths of all the freedom of information requests made to the EMA have come from other companies, consultants, or lawyers, with just a handful from independent medical researchers or non-profit groups. AbbVie's request came from UCB and InterMune's from Boehringer Ingelheim. Both of those companies are working on competitor drugs.

Not everyone agrees on the dangers. GSK has come out in favour of transparency, after being sharply criticised in 2008 by the UK's drug regulator for failing to flag an unpublished study that identified suicidal feelings in children taking its antidepressant Seroxat. Late last year, Sir Andrew Witty, the chief executive, voiced support for a full release of data. "There was a history in the industry of fear around collaboration, especially openness of information, and a mistaken judgment that actually by being more open and more transparent around data somehow that would destroy the fundamental business model," he said.

But he and others are concerned about two further factors. The first is the danger of breaching patient confidentiality. All patients must give consent to take part in drug trials, while trusting companies to safeguard their personal data. The fear is that, especially for rare diseases, it could prove extremely difficult to ensure results are anonymous. That could make patients vulnerable to discrimination by insurance companies, employers, or direct marketers. Prof Sir Michael Rawlins, the outgoing head of the National Institute for Health and Clinical Excellence, the UK government's medicines watchdog, told politicians last month: "Getting patients' agreement can be very complicated, but it is not fair on patients to opt out by saying that we have patient consent." He argues that access to patient-level data is also unnecessary and that the debate should focus on ensuring broader access to comprehensive clinical trial results.
While all such studies are increasingly submitted to regulators to examine safety and efficacy, they are not all passed on to bodies such as his that assess cost effectiveness.

The second concern raised is the danger of misuse and misinterpretation of patient data, without the safety valve of a regulator. Full public release allows access to anyone, from medical academics with flawed analyses to vaccine denialists. Both could scare patients away from valuable medicines. "Patients could get very confused with the information coming out," says Elmar Schnee, chief executive of Cardiorentis, a biotech company. "I hope this transparency gets stopped. People could do a lot of stupid things with it."

The UK Academy of Medical Sciences, the US Institute of Medicine, and the Wellcome Trust are in discussions with industry and academia about an agreement to release patient-level data only selectively to legitimate researchers, to minimise abuse.

...

While the risks of full disclosure of patient data are unclear, the benefits also remain untested. The US FDA, which does not release patient-level data, employs its own scientists to trawl through it and raise any concerns with companies. The EMA typically does not. So far, Mr Doshi's studies of Tamiflu have identified potential concerns but no smoking guns. Mr Gøtzsche says he has recruited a new student to analyse the data on the two drugs that he received from the EMA in 2011 but has not yet been able to study it in detail.

In future, it seems likely that greater access for legitimate researchers equipped with the right skills will provide greater insight into drugs and help lower the costs of developing medicines for everyone. But, particularly in the shadow of the litigation now under way, the shorter-term emphasis may focus on consistent and comprehensive release of clinical trial results, not the underlying data. In the long term, Mr Gøtzsche argues, the more we share the data, the more efficiently the drug companies will be able to develop new drugs, which will benefit everybody. "We must leave the dark ages of secrecy that we have suffered for too long."

------------------------------------------
Research: Drug companies look to crowdsourcing for discoveries

The vast amount of scientific information available online, and the growing sophistication of computer tools to mine it, have the potential to yield significant medical advances far beyond the limits of any researcher or company.
With patients showing an increasing willingness to share personal medical data and discuss their conditions, specialist social media networks and groups such as PatientsLikeMe are being used to find recruits for clinical trials. Others, such as 23andMe, which sells consumer genetic tests, have been able to use the data collected from their clients to identify patterns that may have been overlooked elsewhere. Last year it identified and filed a patent on a genetic variant linked to Parkinson's disease. Philippe Wolgen, head of Clinuvel, an Australian biotech company, says Facebook allowed his company to better understand patients' feedback as it tested a new treatment for the rare condition of erythropoietic protoporphyria, or light intolerance.

InnoCentive, originally developed by Eli Lilly and now a standalone company with clients in and beyond the pharmaceutical industry, uses crowdsourcing to solve problems. Like the X Prize, it posts a challenge on the internet with a cash reward for anyone able to resolve it. InfoCodex has developed a computer program that automatically analyses medical literature for overlooked information. In a recent test, it sought new biomarkers (biological measures such as indicators in the blood) for diabetes and obesity by scanning 120,000 published academic papers and internal documents at Merck. The result, according to Carlo Trugenberg, the company's founder, was a number of potentially interesting new approaches that could accelerate drug discovery. They were so promising that Merck, the client, decided it would not make them public. In the fiercely competitive world of drug development, transparency is still sometimes one-way.

Article 3: Is Brain Mapping Ready for Big Science?

By Patricia Fitzpatrick Dimond 29 March 2013 Genetic Engineering & Biotechnology News

President Barack Obama's public-private initiative to create an activity map of the human brain will cost more than $3 billion, projections say, or $300 million annually for 10 years. The project has multiple private and public institutions lined up to participate, including the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation. All parties hope that the initiative will move brain science forward with the same kind of money and focused effort that drove the Genome Project.

"Every dollar we invested to map the human genome returned $140 to our economy, every dollar," the president commented. "Today our scientists are mapping the human brain to unlock the answers to Alzheimer's. They're developing drugs to regenerate damaged organs, devising new materials to make batteries 10 times more powerful. Now is not the time to gut these job-creating investments in science and innovation."

George M. Church, Ph.D., professor of genetics at Harvard Medical School and director of PersonalGenomes.org, said he was helping to plan the Brain Activity Map project. "If you look at the total spending in neuroscience and nanoscience that might be relative to this today, we are already spending more than that. We probably won't spend less money, but we will probably get a lot more bang for the buck," he commented in the New York Times.

BAM

The proposal for the project came from six scientists, among them Dr. Church, who said in the journal Neuron, "We propose launching a large-scale, international public effort, the Brain Activity Map project (BAM), aimed at reconstructing the full record of neural activity across complete neural circuits. This technological challenge could prove to be an invaluable step toward understanding fundamental and pathological brain processes."
The collective idea for the initiative was generated at a meeting of neuroscientists and nanoscientists convened in September 2011 at the Kavli Royal Society International Centre, U.K., organized by Tom Kalil, deputy director for policy at the White House's Office of Science and Technology Policy (OSTP), and Miyoung Chun, Ph.D., vice president of science programs at the Kavli Foundation in Oxnard, California. The Kavli Foundation has founded institutes for brain science at UC San Diego, Yale, and the Norwegian University of Science and Technology.

Meeting attendees articulated the issues the BAM will address in its report, mentioning "our persistent ignorance of the brain's micro-circuitry," the minute and multitudinous connections contained within, and citing the great brain scientist Ramón y Cajal's 1923 quote that refers to the interconnected, intermixed, and dynamical network of different cell types as "impenetrable jungles where many investigators have lost themselves." Another equally fundamental shortcoming, they noted, is our inability to monitor network interactions and coordinated brain activities densely, to do so simultaneously across extended regions of the brain, and with sufficient temporal and spatial resolution. And most scientists, whether proponents or opponents of the big-science approach to brain mapping, agree that its biggest challenge is the need to develop novel tools to study the brain.

Revolutionary New Tools Needed

Partha Mitra, Ph.D., a theoretical physicist and currently Crick-Clay professor of biomathematics at Cold Spring Harbor Laboratory, says that current methods to visualize living or dead brains provide only glimpses of small portions of the full spatial extent of neurons in the human brain, or pictures of thin sections of brain, with pieces of the neurons in them. No one has yet seen, under the microscope or in digital reconstruction, a complete human brain neuron that sends projections to distant parts of the brain.
To do that at the whole-brain scale would be like seeing a new continent or planet. Dr. Mitra's research currently combines experimental, theoretical, and informatics approaches to gain an understanding of how brains work.

Dr. Chun has been developing the project since the beginning and has described herself as the glue holding the diverse stakeholders together. She told Nature that "there's clearly an issue with tool development, and not just amending current, existing tools, although that will be important in the initial stages. In the long run, one of the very important points would be to come up with revolutionary new tools that will measure brain activity in a completely different way than what we know now."

And project proponents say the only way to tackle some thus-far intractable human diseases, like Alzheimer's and Parkinson's disease, is with a huge program. "We are right on the edge of finding out really vital information about the brain," says Brown University neuroscientist John Donoghue, Ph.D., who was part of the project team. "There are questions we can now answer that can only be tackled as a collaborative project, not by individual labs." In Dr. Donoghue's view, the problem is that the people developing novel technologies and the neuroscience community don't communicate effectively. Biologists don't know enough about the tools already out there, and the materials scientists aren't getting feedback from them on ways to make their tools more useful.

Economic Incentives

And there's no denying the economic incentives the project provides. "What motivates people to pursue these big projects is not the belief that they will solve problems," says Michael Eisen, Ph.D., a biologist at the University of California, Berkeley. "It's the belief that this is the way to get money." John Mazziotta, M.D., Ph.D., UCLA's department of neurology chair and director of its Brain Mapping Center, says, "This initiative is more comprehensive than anything I've ever seen in medicine and neuroscience. This effort will be both the stimulus and the challenge to work and collaborate in ways we haven't done before, but always have wanted to."
UCLA will likely benefit handsomely from the initiative, as it says it is well-positioned to play a significant role in the effort and to capture funding that will support such an initiative, owing to the existence of the Ahmanson-Lovelace Brain Mapping Center and its excellence in nanoscience and nanotechnology. Dr. Church is also in favor of spreading the funding for the project around. In an interview with Harvard Medical School News last month, he said, "The Genome Project didn't adequately embrace small science. I think enabling small labs to do amazing things might be more powerful than having a juggernaut of a large lab, or worse yet, a race among a few large labs."

A report from the Battelle Technology Partnership says that, between 1988 and 2010, federal investment in genomic research generated an economic impact of $796 billion, impressive considering that Human Genome Project (HGP) spending between 1990 and 2003 amounted to $3.8 billion, an ROI of 141:1. Apart from job creation and ROI, if this massive initiative provides new treatment targets for intractable human neurological and psychiatric disorders, it will have been worth the investment.

Article 1: Talk of Medicare Changes Could Open Way to Budget Pact

By Jackie Calmes 28 March 2013 The New York Times

WASHINGTON - As they explore possible fiscal deals, President Obama and Congressional Republicans have quietly raised the idea of broad systemic changes to Medicare that could produce significant savings and end the polarizing debate over Republican plans to privatize the insurance program for older Americans. While the two sides remain far apart on the central issue of new tax revenue, recent statements from both show possible common ground on curbing the costs of Medicare, suggesting some lingering chance, however small, for a budget bargain.

Mr. Obama assured House and Senate Republicans during recent separate visits that he could support specific cost-saving changes to Medicare and deliver Democratic votes, though only as part of a balanced package that had additional revenues. Several changes are likely to once again be in his annual budget, which will be released on April 10, after Congress returns from its break. Mr. Obama also plans a dinner with Senate Republicans that night.

In particular, participants say, the president told House Republicans that he was open to combining Medicare's coverage for hospitals and doctor services. That would create a single deductible that could increase out-of-pocket costs for many future beneficiaries, but also could pay for a cap on their total expenses and reduce the need to buy Medigap supplementary insurance.

Representative Eric Cantor of Virginia, the No. 2 House Republican, proposed much the same in a speech in February. "We should begin by ending the arbitrary division between Part A, the hospital program, and Part B, the doctor services," he said. "We can create reasonable and predictable levels of out-of-pocket expenses without forcing seniors to rely on Medigap plans."

While Mr. Cantor's proposal got little attention at the time, its echo by Mr. Obama hints at a new route toward compromise, in contrast with the budget that House Republicans passed this month, which has no chance of Senate approval. At a time when retiring baby boomers and mounting medical prices have made federal health-care spending the biggest single driver of the nation's rising debt, the House budget from Representative Paul D. Ryan, Republican of Wisconsin, would transform Medicare into a voucher-like system known as "premium support," which Mr. Obama and Democrats adamantly oppose. But Mr. Cantor, like Mr. Obama, is suggesting cost-saving changes within the existing Medicare program. To Senator Mark R.
Warner of Virginia, a Democrat who has long led a bipartisan group of senators seeking a fiscal deal, such a proposal is the sort of newer idea needed for the parties to stop the stale arguments that, after three years, have turned their budget battling into World War I trench warfare. "You've got a whole lot of folks on the Republican side saying, 'Well, we don't really like what Ryan has done, premium support, but we want systemic reform,'" Mr. Warner said at a round table hosted by Bloomberg News.

Mr. Obama's openness to Medicare changes seemed to be news to many Republicans, even though he first proposed detailed ideas in 2011. Republicans often accuse the president of opposing changes in entitlement spending while focusing on raising taxes, an attack that ignores his proposals but also reflects how little Mr. Obama has talked about them. Still, the same hurdle to compromise stands: the president and his party will not support even his Medicare proposals unless Republicans agree to raise taxes on the wealthy and some corporations. Without that trade-off, common ground on Medicare will remain unplowed.

"The president has said this to the Republicans: You want to do entitlement reform? I do, too. I can produce entitlement reform and bring Democrats to the table, because I am a Democratic president. And so I'm ready to sit down with you and work out an approach," Senator Richard J. Durbin of Illinois, a Democratic leader, said at a recent forum hosted by The Wall Street Journal.

Many Republicans remain distrustful of Mr. Obama. Yet when they speak of altering Medicare, not replacing it, it is clear that they share some concerns about the existing program. Representative Kevin Brady, Republican of Texas and chairman of a health subcommittee, said the structure of traditional fee-for-service Medicare is "outdated and confusing."

"Can you imagine a world in which someone has to buy hospital and nursing home coverage from one insurance company, physician office coverage from another insurance company, prescription drug coverage from yet another company, and likely supplemental coverage from a fourth insurance company?" Mr. Brady asked. "This is exactly how the current Medicare benefit is designed."

Both the administration's and Mr. Cantor's interest in restructuring Medicare's Parts A and B dates to 2011, when various proposals were considered by a deficit-reduction group headed by Vice President Joseph R. Biden Jr. that included Mr. Cantor. The goal is to discourage people from seeking unneeded treatments, shrink health spending and offset the costs of a cap on beneficiaries' total out-of-pocket costs. Such a cap would reduce beneficiaries' need for extra insurance. About 90 percent of beneficiaries in the traditional Medicare program have supplemental coverage through Medigap policies, employers' retiree plans or Medicaid for low-income people.

Many health-policy economists have called for creating a single, unified deductible. The current two deductibles reflect separate legislative tracks that came together in the creation of Medicare in 1965. The deductible for Part A hospital care is relatively high ($1,184 this year), while that for Part B doctor care is relatively low ($147). Patients also have co-payments for many services.

Despite the bipartisan interest, the politics of merging Part A and Part B are complicated. Glenn M. Hackbarth, chairman of the Medicare Payment Advisory Commission, a group of nonpartisan experts that advises Congress, said a combined deductible could increase costs for those who use only doctor and outpatient services, a majority of beneficiaries in any year. It could reduce costs, he said, for the roughly 20 percent who require hospitalization.

Proponents, including some in the administration, acknowledge the political risks of increasing most beneficiaries' costs, even in exchange for capping their total costs, as in cases of catastrophic illness. A 1988 law protecting against catastrophic costs caused such an outcry among older Americans, who faced an extra tax, that Congress quickly repealed it. But administration officials say the 1988 law affected current beneficiaries, while Mr.
Obama would apply any changes only to people becoming eligible for Medicare after 2016. So far, the changes the president has proposed do not go as far as a single deductible and a cap on catastrophic costs. Instead, Mr. Obama has called for increasing the Part B deductible, which has risen much less than medical costs. He also proposed that beneficiaries pay something for home health care, which is among Medicare's fastest-growing and most fraud-prone expenses; people just released from the hospital would be exempted. Third, Mr. Obama proposed a 15 percent surcharge on Medigap plans that cover all or nearly all of a beneficiary's initial annual expenses. Economists say that such coverage leaves beneficiaries insensitive to costs, increasing Medicare's spending and the premiums beneficiaries pay.

Article 1: Pharma needs to redefine emerging markets: Booz study

By Kevin Grogan 28 March 2013 PharmaTimes Online

Pharmaceutical companies that thought of the emerging markets as a solution to their woes in Europe and the USA need to tailor their operations to each individual country.

That is one of the conclusions from a new analysis by management consultants Booz & Co, which interviewed 12 of the top 15 global drugmakers, accounting for some 50% of total global pharmaceutical revenues. It states that 52% of the top managers who took part in the survey claim that, by 2018, more than 30% of their respective companies' revenue will come from emerging markets. Today, the threshold is only 23%, Booz notes, which equates to $191 billion of revenue in emerging markets.

Nevertheless, the report warns that "companies should be aware that whilst there is huge potential here, variations within the emerging markets themselves require different and tailored approaches". Some 27% of those surveyed see strategies insufficiently tailored to local market conditions as the biggest failing. One of the executives interviewed said, "one of our biggest mistakes was to treat emerging markets like mature markets. We were wrong. Pharmaceutical strategies have to fit a country's individual needs and development."

Taking a local approach

77% of survey participants deemed the deployment of local operational teams in BRICMT (Brazil, Russia, India, China, Mexico and Turkey) markets to be a sensible move, while 67% voiced their support for a local production unit; 65% support the creation of a local R&D division. The Booz report notes that 78% are planning further expansion of their offices in BRICMT, "a sign of the degree of trust companies have in the stability of these countries".

The analysis goes on to say that "whilst emerging markets offer huge untapped potential, they also display a wide diversity in their stages of development". The BRICMT countries are much more developed than some in South-East Asia and Africa, and there is a huge difference in size, yet "they are all regarded as 'emerging markets'". Booz claims that "the vast majority of decision-makers remain cautious about these second-tier markets, despite the projected growth rates in Africa of 28%".
Collaborations with governments or local sales representatives are currently preferred in the latter. Stephan Danner, a partner in Booz's health practice, concludes that "in the non-established markets, pharma companies have a unique opportunity to enter early and shape the market". He added that the emerging markets "have long been regarded as the 'promised land' of the pharmaceutical industry, yet, so far, ambitious targets remain unachieved". In terms of the drugmakers best placed to succeed, participants in the Booz survey plump for Novartis and Sanofi, followed by GlaxoSmithKline, Roche, Pfizer and Merck & Co.
