852 results for Post-genomic science
Abstract:
Genomics and genetic findings have been hailed with promises of unlocked codes and new frontiers of personalized medicine. Despite cautions about gene hype, the strong cultural pull of genes and genomics has allowed consideration of genomic personhood. Populated by the complicated records of the mass spectrometer, proteomics, which studies human proteins, has not achieved either the funding or the popular cultural appeal that proteomics scientists had hoped it would. While proteomics, being focused on the proteins that actually indicate and create disease states, has a more direct potential for clinical applications than genomic risk predictions, culturally it has not provided the material for identity creation. In our ethnographic research, we explore how proteomic scientists are attempting to shape an appeal to personhood through which legitimacy may be defined.
Abstract:
Richard Lewontin proposed that the ability of a scientific field to create a narrative for public understanding garners it social relevance. This article applies Lewontin's conceptual framework of the functions of science (manipulatory and explanatory) to compare and explain the current differences in perceived societal relevance of genetics/genomics and proteomics. We provide three examples to illustrate the social relevance and strong cultural narrative of genetics/genomics for which no counterpart exists for proteomics. We argue that the major difference between genetics/genomics and proteomics is that genomics has a strong explanatory function, due to the strong cultural narrative of heredity. Based on qualitative interviews and observations of proteomics conferences, we suggest that the nature of proteins, lack of public understanding, and theoretical complexity exacerbate this difference for proteomics. Lewontin's framework suggests that social scientists may find that omics sciences affect social relations in ways that differ from those identified in past analyses of genetics.
Abstract:
Background and aims: GP-TCM is the first EU-funded Coordination Action consortium dedicated to traditional Chinese medicine (TCM) research. This paper aims to summarise the objectives, structure and activities of the consortium and introduces the position of the consortium regarding good practice, priorities, challenges and opportunities in TCM research. Serving as the introductory paper for the GP-TCM Journal of Ethnopharmacology special issue, this paper describes the roadmap of the special issue and reports how the main outputs of the ten GP-TCM work packages are integrated and have led to consortium-wide conclusions. Materials and methods: Literature studies, opinion polls and discussions among consortium members and stakeholders. Results: By January 2012, through 3 years of team building, the GP-TCM consortium had grown into a large collaborative network involving ∼200 scientists from 24 countries and 107 institutions. Consortium members had worked closely to address good practice issues related to various aspects of Chinese herbal medicine (CHM) and acupuncture research, the focus of this Journal of Ethnopharmacology special issue, leading to state-of-the-art reports, guidelines and consensus on the application of omics technologies in TCM research. In addition, through an online survey open to GP-TCM members and non-members, we polled opinions on grand priorities, challenges and opportunities in TCM research. Based on the poll, although consortium members and non-members had diverse opinions on the major challenges in the field, both groups agreed that high-quality efficacy/effectiveness and mechanistic studies are grand priorities and that the TCM legacy in general, and its management of chronic diseases in particular, represents grand opportunities. Consortium members cast their votes of confidence in omics and systems biology approaches to TCM research and believed that quality and pharmacovigilance of TCM products are not only grand priorities but also grand challenges. Non-members, however, gave priority to integrative medicine, expressed concern about the impact of the regulation of TCM practitioners, and emphasised intersectoral collaborations in funding TCM research, especially clinical trials. Conclusions: The GP-TCM consortium made great efforts to address some fundamental issues in TCM research, including developing guidelines, as well as identifying priorities, challenges and opportunities. These consortium guidelines and consensus will need dissemination, validation and further development through continued interregional, interdisciplinary and intersectoral collaborations. To promote this, a new consortium, known as the GP-TCM Research Association, is being established to succeed the 3-year fixed-term FP7 GP-TCM consortium and will be officially launched at the Final GP-TCM Congress in Leiden, the Netherlands, in April 2012.
Abstract:
Nature conservation may be considered a post-normal science in that the loss of biodiversity and increasing environmental degradation require urgent action but are characterised by uncertainty at every level. An ‘extended peer community’ with varying skills, perceptions and values is involved in decision-making and the implementation of conservation, and the uncertainty involved limits the effectiveness of practice. In this paper we briefly review the key ecological, philosophical and methodological uncertainties associated with conservation, and then highlight the uncertainties and gaps present within the structure and interactions of the conservation community, which exist mainly between researchers and practitioners, in the context of nature conservation in the UK. We conclude that an openly post-normal science framework for conservation, which acknowledges this uncertainty but strives to minimise it, would be a useful progression for nature conservation, and recommend ways in which knowledge transfer between researchers and practitioners can be improved to support robust decision-making and the enactment of conservation.
Abstract:
Background: New challenges are arising in the animal protein market, and one of the main global challenges is to produce more in a shorter time, with better quality and in a sustainable way. Brazil is the largest beef exporter by volume, hence the factors affecting the beef meat chain are of major concern to the country's economy. An emerging class of biotechnological approaches, molecular markers, is bringing new perspectives to face these challenges, particularly after the publication of the first complete livestock genome (bovine), which triggered a massive initiative to put into practice the benefits of the so-called Post-Genomic Era. Review: This article aims to show the directions and insights in the application of molecular markers to livestock genetic improvement and reproduction, as well as to organize the progress so far, pointing out some perspectives of these emerging technologies in the Brazilian ruminant production context. An overview of the nature of the main molecular markers explored in ruminant production is provided, describing the molecular bases and detection approaches available for microsatellites (STR) and single nucleotide polymorphisms (SNP). A topic is dedicated to reviewing the history of association studies between markers and important trait variation in livestock, showing the timeline starting with quantitative trait loci (QTL) identification using STR markers and ending with high-resolution SNP panels used in whole-genome scans for phenotype/genotype association. The article also organizes this information to show how QTL prospection using STR opened the way to marker-assisted selection, and why this approach is quickly being replaced by genome-wide association studies using SNPs in a new concept called genomic selection. Conclusion: The world's scientific community is dedicating effort and resources to applying SNP information in livestock selection through the development of high-density panels for genomic association studies, connecting molecular genetic data with phenotypes of economic interest. Once generated, this information can be used to make decisions in genetic improvement programs by selecting animals with the assistance of molecular markers.
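To make the genomic selection concept mentioned in this abstract concrete, the following is a minimal, illustrative sketch (not from the article) of the core computation: estimating SNP marker effects by ridge regression (SNP-BLUP style) on simulated genotype and phenotype data, then ranking selection candidates by genomic estimated breeding values. All data, dimensions and the shrinkage parameter are assumptions chosen only for illustration.

```python
# Minimal sketch of SNP-based genomic selection via ridge regression (SNP-BLUP).
# All data are simulated; dimensions and the shrinkage parameter are arbitrary.
import numpy as np

rng = np.random.default_rng(42)
n_train, n_candidates, n_snps = 500, 100, 2000

# SNP genotypes coded 0/1/2 (copies of the reference allele).
X_train = rng.integers(0, 3, size=(n_train, n_snps)).astype(float)
X_cand = rng.integers(0, 3, size=(n_candidates, n_snps)).astype(float)

# Phenotypes: many small additive SNP effects plus environmental noise.
true_effects = rng.normal(0.0, 0.05, size=n_snps)
y_train = X_train @ true_effects + rng.normal(0.0, 1.0, size=n_train)

# Centre genotypes and phenotypes so no intercept is needed.
x_mean = X_train.mean(axis=0)
Xc = X_train - x_mean
yc = y_train - y_train.mean()

# Ridge estimate of marker effects: beta = (Xc'Xc + lambda*I)^-1 Xc'yc.
lam = 100.0
beta_hat = np.linalg.solve(Xc.T @ Xc + lam * np.eye(n_snps), Xc.T @ yc)

# Genomic estimated breeding values (GEBVs) for the selection candidates.
gebv = (X_cand - x_mean) @ beta_hat

# Select the top 10% of candidates for the breeding programme.
top = np.argsort(gebv)[::-1][: n_candidates // 10]
print("Selected candidate indices:", top)
```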
Abstract:
Four years after the completion of the Human Genome Project, the US National Institutes of Health launched the Human Microbiome Project on 19 December 2007. Using metaphor analysis, this article investigates reporting in English-language newspapers on advances in microbiomics from 2003 onwards, when the word “microbiome” was first used. This research was said to open up a “new frontier” and was conceived as a “second human genome project”, this time focusing on the genomes of the microbes that inhabit and populate humans rather than on the human genome itself. The language used by scientists and by the journalists who reported on their research employed a type of metaphorical framing that was very different from the hyperbole surrounding the decipherment of the “book of life”. Whereas during the HGP genomic successes had mainly been framed as resting on a unidirectional process of reading off information from a passive genetic or genomic entity, the language employed to discuss advances in microbiomics frames genes, genomes and life in much more active and dynamic ways.
Abstract:
The question of whether artificial nutrition and hydration (ANH) is a therapy or not is one of the key points of end-of-life issues in Italy, since it was (and still is) a strategic and crucial point of the Italian bioethics discussion about the last phases of human life: determining whether ANH is a therapy implies the possibility of including it in the list of treatments that could be mentioned for refusal within a living will document. But who is entitled to decide and judge whether ANH is a therapy or not? Scientists? The legislator? Judges? Patients? At first sight this issue seems just a matter of science, but more than a scientific definition is at stake. According to several scholars, we are in the era of post-academic science, in which science broadens discussion, production, negotiation and decision-making to social groups beyond the scientific communities. In this process, called co-production, on the one hand scientific knowledge derives from the interaction between scientists and society at large; on the other hand, science is functional to the co-production of social order. The continuous negotiation over which science is to be used in social decisions is evidence of a mirroring negotiation over different ways to structure and interpret society. Thus, in the interaction between science and law, deciding what kind of science is suitable for a specific kind of law envisages a well-defined idea of society behind this choice. I have analysed both the legislative path (still in progress) of the living will act in Italy and Eluana Englaro's judicial case (which somehow collapsed into the living will act negotiation), using official documents (hearings, texts of official conferences, committee comments and ruling texts) and interviewing key actors in the two processes from a science communication point of view (who speaks in the name of science? Who defines what a therapy is? And how do they do so?), drawing support from the theoretical framework of Science & Technology Studies (S&TS).
Abstract:
Ankylosing spondylitis is a highly heritable, common rheumatic condition, primarily affecting the axial skeleton. The association with HLA-B27 has been demonstrated worldwide, and evidence for a role of HLA-B27 in disease comes from linkage and association studies in humans, and from transgenic animal models. However, twin studies indicate that HLA-B27 contributes only 16% of the total genetic risk for disease. Furthermore, there is compelling evidence that non-B27 genes, both within and outwith the major histocompatibility complex, are involved in disease aetiology. In this post-genomic era we have the tools to help elucidate the genetic basis of disease. This review describes methods for the genetic investigation of ankylosing spondylitis, and summarises the status of current research in this exciting area.
Abstract:
The use of human tissue sample collections has become an important tool in biomedical research. The collection, use and distribution of human tissue samples, which include blood and diagnostic tissue samples, from which DNA can be extracted and analyzed, has also become a major bio-political preoccupation, not only in national contexts but also at the transnational level. The foundation of medical research rests on the relationship between the doctor and the research subject. This relationship is a social one, in that it is based on informed consent, privacy and autonomy, where research subjects are made aware of what they are getting involved in and are then able to make an informed decision as to whether or not to participate. Within the post-genomic era, however, our understanding of what constitutes informed consent, privacy and autonomy is changing in relation to the needs of researchers, but also as a reflection of policy aspirations. This reflects a change in the power relations between the rights of the individual and the interests of science and society. Using the notions of tissue economies and biovalue (Waldby, 2002), this research explores the changing relationship between sources and users of samples in biomedical research by examining the contexts under which human tissue samples, and the information that is extracted from them, are acquired, circulated and exchanged in Finland. The research examines how individual rights, particularly informed consent, are being configured in relation to the production of scientific knowledge in tissue economies in Finland from the 1990s to the present. The research examines the production of biovalue through the organization of scientific knowledge production by examining the policy context of knowledge production as well as three case studies (the Tampere Research Tissue Bank, Hereditary Non-polyposis Colorectal Cancer and the Finnish Genome Information Center) in which tissues are acquired, circulated and exchanged in Finland. The research shows how interpretations of informed consent have become divergent and identifies the elements and processes that have contributed to these differences. This inquiry shows how the relationship between the interests of individuals and the interests of science and society is being re-configured. It indicates how the boundary between interpretations of informed consent, on the one hand, and social and scientific interests, on the other, is being re-drawn, and that this process is underscored, in part, by the economic, commercial and preventive potential that research using tissue samples is believed to produce. This can be said to fundamentally challenge the western notion that the rights of the individual are absolute and inalienable within biomedical legislation.
Abstract:
Janet Taylor, Ross D King, Thomas Altmann and Oliver Fiehn (2002). Application of metabolomics to plant genotype discrimination using statistics and machine learning. 1st European Conference on Computational Biology (ECCB). (published as a journal supplement in Bioinformatics 18: S241-S248).
Abstract:
The slogan ‘capitalism is crisis’ is one that has recently circulated swiftly around the global Occupy movement. From Schumpeter to Marx himself, the notion that the economic cycles instituted by capitalism require periodic crises as a condition of renewed capital accumulation is a commonplace. However, in a number of recent texts, this conception of crisis as constituting the very form of urban capitalist development itself has taken on a more explicitly apocalyptic tone, exemplified by the Invisible Committee's influential 2007 book The Coming Insurrection, and its account of what it calls simply ‘the metropolis’. ‘It is useless to wait’, write the text's anonymous authors, ‘for a breakthrough, for the revolution, the nuclear apocalypse or a social movement.… The catastrophe is not coming, it is here.’ In considering such an apocalyptic tone, this paper thus situates and interrogates the text in terms both of its vision of the metropolis as a terrain of total urbanization and its effective spatialization of the present as itself a kind of ‘unnoticed’ apocalypse: the catastrophe which is already here. It does so by approaching this not only apropos its place within contemporary debates surrounding leftist politics and crisis theory but also via its imaginative intersection with certain post-1960s science fiction apocalyptic motifs. What, the paper asks, does it mean to think apocalypse as the ongoing condition of the urban present itself, as well as the opening up of political and cultural opportunity for some speculative exit from its supposedly endless terrain?
Abstract:
Trypanosoma brucei rhodesiense and T. b. gambiense are the causative agents of sleeping sickness, a fatal disease that affects 36 countries in sub-Saharan Africa. Nevertheless, only a handful of clinically useful drugs are available, and these drugs suffer from severe side-effects. The situation is further aggravated by the alarming incidence of treatment failures in several sleeping sickness foci, apparently indicating the occurrence of drug-resistant trypanosomes. For these reasons, and since vaccination does not appear to be feasible due to the trypanosomes' ever-changing coat of variable surface glycoproteins (VSGs), new drugs are needed urgently. The entry of Trypanosoma brucei into the post-genomic age raises hopes for the identification of novel kinds of drug targets and, in turn, new treatments for sleeping sickness. The pragmatic definition of a drug target is a protein that is essential for the parasite and has no homologues in the host. Such proteins are identified by comparing the predicted proteomes of T. brucei and Homo sapiens, and then validated by large-scale gene disruption or gene silencing experiments in trypanosomes. Once all proteins that are essential and unique to the parasite are identified, inhibitors may be found by high-throughput screening. However powerful, this functional genomics approach is going to miss a number of attractive targets. Several current, successful parasiticides attack proteins that have close homologues in the human proteome. Drugs like DFMO or pyrimethamine inhibit parasite and host enzymes alike; a therapeutic window is opened only by subtle differences in the regulation of the targets, which cannot be recognized in silico. Working against the post-genomic approach is also the fact that essential proteins tend to be more highly conserved between species than non-essential ones. Here we advocate drug targeting, i.e. uptake or activation of a drug via parasite-specific pathways, as a chemotherapeutic strategy to selectively inhibit enzymes that have equally sensitive counterparts in the host. The T. brucei purine salvage machinery offers opportunities for both metabolic and transport-based targeting: unusual nucleoside and nucleobase permeases may be exploited for selective import, and salvage enzymes for selective activation of purine antimetabolites.
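As a rough illustration of the comparative triage this abstract outlines (proteins essential for the parasite and without close human homologues), the sketch below filters a hypothetical tabular BLASTP run of the predicted T. brucei proteome against the human proteome and intersects the remainder with a list of essential genes from RNAi or gene-disruption screens. The file names, formats and E-value cut-off are assumptions made only for illustration, not details from the original study.

```python
# Illustrative triage of candidate drug targets: keep T. brucei proteins that are
# essential to the parasite and have no significant human homologue. File names,
# formats and the E-value cut-off below are hypothetical.
import csv

EVALUE_CUTOFF = 1e-5  # hits weaker than this are treated as "no homologue"

def parasite_proteins(fasta_path):
    """Collect protein IDs from a FASTA file of predicted T. brucei proteins."""
    ids = set()
    with open(fasta_path) as fh:
        for line in fh:
            if line.startswith(">"):
                ids.add(line[1:].split()[0])
    return ids

def proteins_with_human_hits(blast_tab_path):
    """Protein IDs with a significant hit in tabular BLASTP output (-outfmt 6)."""
    hits = set()
    with open(blast_tab_path) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            query_id, evalue = row[0], float(row[10])  # column 11 is the E-value
            if evalue <= EVALUE_CUTOFF:
                hits.add(query_id)
    return hits

def essential_proteins(list_path):
    """Protein IDs flagged essential in a one-ID-per-line screen result file."""
    with open(list_path) as fh:
        return {line.strip() for line in fh if line.strip()}

if __name__ == "__main__":
    all_proteins = parasite_proteins("tbrucei_predicted_proteome.fasta")
    human_like = proteins_with_human_hits("tbrucei_vs_human_blastp.tsv")
    essential = essential_proteins("rnai_essential_ids.txt")

    # Candidate targets: essential to the parasite, no close human homologue.
    candidates = (all_proteins - human_like) & essential
    print(f"{len(candidates)} candidate drug targets")
```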
Abstract:
The challenges regarding seamless integration of the distributed, heterogeneous and multilevel data arising in the context of contemporary, post-genomic clinical trials cannot be effectively addressed with current methodologies. An urgent need exists to access data in a uniform manner, to share information among different clinical and research centers, and to store data in secure repositories assuring the privacy of patients. Advancing Clinico-Genomic Trials (ACGT) was a European Commission funded Integrated Project that aimed to provide tools and methods to enhance the efficiency of clinical trials in the -omics era. The project, now completed after four years of work, involved the development of a set of methodological approaches as well as tools and services, and their testing in the context of real-world clinico-genomic scenarios. This paper describes the main experiences of using the ACGT platform and its tools within one such scenario and highlights the very promising results obtained.