32 results for Advent.
Abstract:
Metaphor has featured frequently in attempts to define the proverb (see Taylor 1931, Whiting 1932, Mieder 1985, 1996), and since the advent of modern paremiological scholarship, it has been identified as one of the most salient markers of ‘proverbiality’ (Arora 1984) across a broad spectrum of world languages. Significant language-specific analyses, such as Klimenko (1946), Silverman-Weinreich (1981), and Arora (1984) have provided valuable qualitative information on the form and function of metaphor in Russian, Yiddish, and Spanish proverbs respectively. Unfortunately, no academic scholarship has engaged with the subject of metaphor in Irish proverbs. This study builds on international paremiological research on metaphor and provides for the first time a comprehensive quantitative and qualitative analysis of the form, frequency, and nature of linguistic metaphors in Irish proverbs (1856-1952). Moreover, from the perspective of paremiology, it presents a methodological template and result-set that can be applied cross-linguistically to compare metaphor in the proverbs of other languages.
Abstract:
The efficient development of multi-threaded software has, for many years, been an unsolved problem in computer science. Finding a solution to this problem has become urgent with the advent of multi-core processors. Furthermore, the problem has become more complicated because multi-cores are everywhere (desktops, laptops, embedded systems). As such, they execute generic programs, which exhibit very different characteristics from the scientific applications that have been the focus of parallel computing in the past.
Implicitly parallel programming is an approach to parallel programming that promises high productivity and efficiency and rules out synchronization errors and race conditions by design. There are two main ingredients to implicitly parallel programming: (i) a conventional sequential programming language that is extended with annotations that describe the semantics of the program and (ii) an automatic parallelizing compiler that uses the annotations to increase the degree of parallelization.
It is extremely important that the annotations and the automatic parallelizing compiler are designed with the target application domain in mind. In this paper, we discuss the Paralax approach to implicitly parallel programming and we review how the annotations and the compiler design help to successfully parallelize generic programs. We evaluate Paralax on SPECint benchmarks, which are a model for such programs, and demonstrate scalable speedups, up to a factor of 6 on 8 cores.
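As a rough illustration of the annotation idea (not Paralax's actual syntax, which extends C and relies on compiler support), the following Python sketch uses a hypothetical decorator as the programmer-supplied semantic annotation and a process pool standing in for the parallelizing compiler/runtime; all names here are invented for illustration.

```python
from concurrent.futures import ProcessPoolExecutor

def independent_iterations(func):
    """Hypothetical annotation: the programmer asserts that `func` has no
    side effects and no cross-iteration dependences, so a loop over it may
    be executed in parallel without synchronization."""
    func._independent = True
    return func

@independent_iterations
def process_item(item):
    # Pure per-item work: the result depends only on `item`
    return item * item

def parallel_map(func, items, workers=8):
    """Stand-in for the parallelizing compiler/runtime: parallelize only
    when the annotation guarantees independence, otherwise stay sequential."""
    if getattr(func, "_independent", False):
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(func, items))
    return [func(x) for x in items]

if __name__ == "__main__":
    print(parallel_map(process_item, range(10)))
```

The point of the sketch is the division of labour: the annotation only states a property of the program, and the runtime decides whether that property licenses parallel execution.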
Abstract:
Crohn's disease (CD) and ulcerative colitis (UC) are the two major forms of inflammatory bowel disease (IBD) and both diseases lead to high morbidity and health care costs. Complex interactions between the immune system, enteric commensal bacteria and host genotype are thought to underlie the development of IBD, although the precise aetiology of this group of diseases is still unknown. The understanding of the composition and complexity of the normal gut microbiota has been greatly aided by the use of molecular methods and is likely to be further increased with the advent of metagenomics and metatranscriptomics approaches, which will allow an increasingly holistic assessment of the microbiome with respect to both the diversity and function of the commensal gut microbiota. Studies thus far have shown that the intestinal microbiota drives the development of the gut immune system and can induce immune homeostasis as well as contribute to the development of IBD. Probiotics, which deliver some of the beneficial immunomodulatory effects of the commensal gut microbiota and induce immune homeostasis, have been proposed as a suitable treatment for mild to moderate IBD. This review provides an overview of the current understanding of the commensal gut microbiota, its interactions with the mucosal immune system and its capacity to induce both gut homeostasis and dysregulation of the immune system. Bacterial-host events, including interactions with pattern recognition receptors (PRRs) expressed on epithelial cells and dendritic cells (DCs), and the resultant impact on immune responses at mucosal surfaces will be discussed. (C) 2009 Elsevier GmbH. All rights reserved.
Abstract:
The standard model for the origin of galactic magnetic fields is through the amplification of seed fields via dynamo or turbulent processes to the level consistent with present observations. Although other mechanisms may also operate, currents from misaligned pressure and temperature gradients (the Biermann battery process) inevitably accompany the formation of galaxies in the absence of a primordial field. Driven by geometrical asymmetries in shocks associated with the collapse of protogalactic structures, the Biermann battery is believed to generate tiny seed fields to a level of about 10⁻²¹ gauss (refs 7, 8). With the advent of high-power laser systems in the past two decades, a new area of research has opened in which, using simple scaling relations, astrophysical environments can effectively be reproduced in the laboratory. Here we report the results of an experiment that produced seed magnetic fields by the Biermann battery effect. We show that these results can be scaled to the intergalactic medium, where turbulence, acting on timescales of around 700 million years, can amplify the seed fields sufficiently to affect galaxy evolution.
Abstract:
The advent of next generation sequencing (NGS) technologies has expanded the area of genomic research, offering high coverage and increased sensitivity over older microarray platforms. Although the current cost of next generation sequencing still exceeds that of microarray approaches, the rapid advances in NGS will likely make it the platform of choice for future research in differential gene expression. Connectivity mapping is a procedure for examining the connections among diseases, genes and drugs through differential gene expression; it was initially based on microarray technology, with which a large collection of compound-induced reference gene expression profiles has been accumulated. In this work, we aim to test the feasibility of incorporating NGS RNA-Seq data into the current connectivity mapping framework by utilizing the microarray-based reference profiles and constructing a differentially expressed gene signature from an NGS dataset. This would allow connections to be established between the NGS gene signature and those microarray reference profiles, avoiding the cost of re-creating drug profiles with NGS technology. We examined the connectivity mapping approach on a publicly available NGS dataset of androgen-stimulated LNCaP cells in order to extract candidate compounds that could inhibit the proliferative phenotype of LNCaP cells and to elucidate their potential in a laboratory setting. In addition, we analyzed an independent microarray dataset with similar experimental settings. We found a high level of concordance between the top compounds identified using the gene signatures from the two datasets. The nicotine derivative cotinine was returned as the top candidate among the overlapping compounds, with potential to suppress this proliferative phenotype. Subsequent lab experiments validated this connectivity mapping hit, showing that cotinine inhibits cell proliferation in an androgen-dependent manner. Thus the results in this study suggest a promising prospect of integrating NGS data with connectivity mapping. © 2013 McArt et al.
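To make the idea concrete, here is a minimal Python sketch of a rank-based connection score between a query gene signature (up/down-regulated genes from an NGS differential expression analysis) and one microarray reference profile. The scoring scheme, gene names and numbers are simplified illustrations, not the exact pipeline or data used in the study.

```python
import numpy as np

def connection_score(signature, reference_ranks):
    """Simplified rank-based connection score between a query gene
    signature and one reference expression profile.

    signature       : dict gene -> +1 (up-regulated) or -1 (down-regulated)
    reference_ranks : dict gene -> signed rank in the reference profile
                      (most up-regulated gene = +N, most down-regulated = -N)
    Returns a value in [-1, 1]; a negative score suggests the compound
    tends to reverse the query signature.
    """
    genes = [g for g in signature if g in reference_ranks]
    if not genes:
        return 0.0
    raw = sum(signature[g] * reference_ranks[g] for g in genes)
    # Maximum attainable |score|: signature genes at the most extreme ranks
    n_ref = len(reference_ranks)
    max_raw = sum(range(n_ref, n_ref - len(genes), -1))
    return raw / max_raw

# Toy usage: a 3-gene signature scored against a 6-gene reference profile
signature = {"KLK3": +1, "TMPRSS2": +1, "CDKN1A": -1}
reference = {"KLK3": -5, "TMPRSS2": -6, "CDKN1A": +4,
             "ACTB": +1, "GAPDH": -2, "AR": +3}
print(connection_score(signature, reference))  # -1.0: candidate "inhibitor"
```

Compounds whose reference profiles give strongly negative scores against the query signature would be the candidates carried forward for laboratory validation.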
Abstract:
Resistance to chemotherapy and molecularly targeted therapies is a major problem facing current cancer research. The mechanisms of resistance to 'classical' cytotoxic chemotherapeutics and to therapies that are designed to be selective for specific molecular targets share many features, such as alterations in the drug target, activation of prosurvival pathways and ineffective induction of cell death. With the increasing arsenal of anticancer agents, improving preclinical models and the advent of powerful high-throughput screening techniques, there are now unprecedented opportunities to understand and overcome drug resistance through the clinical assessment of rational therapeutic drug combinations and the use of predictive biomarkers to enable patient stratification.
Abstract:
The question of whether there is or was life on Mars has been one of the most pivotal since Schiaparelli's telescopic observations of the red planet. With the advent of the space age, this question can be addressed directly by exploring the surface of Mars and by bringing samples to Earth for analysis. The latter, however, is not free of problems. Life can be found virtually everywhere on Earth. Hence the potential for contaminating the Mars samples and compromising their scientific integrity is not negligible. Conversely, if life is present in samples from Mars, this may represent a potential source of extraterrestrial biological contamination for Earth. A range of measures and policies, collectively termed ‘planetary protection’, are employed to minimise risks and thereby prevent undesirable consequences for the terrestrial biosphere. This report documents discussions and conclusions from a workshop held in 2012, which followed a public conference focused on current capabilities for performing life-detection studies on Mars samples. The workshop focused on the evaluation of Mars samples that would maximise scientific productivity and inform decision making in the context of planetary protection. Workshop participants developed a strong consensus that the same measurements could be employed to effectively inform both science and planetary protection, when applied in the context of two competing hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. Participants then outlined a sequence for sample processing and defined analytical methods that would test these hypotheses. They also identified critical developments to enable the analysis of samples from Mars.
Abstract:
Many parts of the UK's rail network were constructed in the mid-19th century, long before the advent of modern construction standards. Historic levels of low investment, poor maintenance strategies and the deleterious effects of climate change have resulted in critical elements of the rail network being at significant risk of failure. The majority of failures that have occurred over recent years have been triggered by extreme weather events. Advance assessment and remediation of earthworks is, however, significantly less costly than dealing with failures reactively. It is therefore crucial that appropriate approaches for assessing the stability of earthworks are developed, so that repair work can be better targeted and failures avoided wherever possible. This extended abstract briefly discusses some preliminary results from an ongoing geophysical research project studying the impact of climate and seasonal weather variations on the stability of a century-old railway embankment on the Gloucestershire Warwickshire Steam Railway line in southern England.
Abstract:
The principal feature in the evolution of the internet has been its ever-growing reach to include old and young, rich and poor. The internet's ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found that the main factors affecting change in the Internet governance landscape were more users online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating; its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or ‘logged on’, mean we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.
The shift, which has seen citizens become creators of content rather than consumers of it, has undermined the centralist view of democracy and created an environment of wiki democracy or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, and is widely considered to be a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. Also, a number of key internet companies have emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse about openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship where the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata that they produce.
In this context the state is just one stakeholder in the mix of influencers and opinion formers impacting on our behaviours, and indeed our ideas of what is public. The question of what it means to create or own something, and of how all these new relationships are to be ordered and governed, is subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. In order for the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood and acknowledged, and perhaps, to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, with their roles seen as some sort of benign and open co-production. Each user of the internet has a stake in its efficacy, and each by their presence and participation contributes to the experience, positive or negative, of other users as well as to the commercial success or otherwise of various online service providers. However, stakeholders have neither an equal role nor an equal share. The unequal relationship between the providers of content and those who simply package up and transmit that content - while harvesting the valuable data thus produced - needs to be addressed. Arguably this suggests a role for government that involves moving beyond simply celebrating and facilitating the ongoing technological revolution. This paper reviews the shifting landscape of stakeholders and their contribution to the efficacy of the internet. It critically evaluates the primacy of the individual as the key stakeholder and their supposed developing empowerment within the ever-growing sea of data. It also looks at the role of individuals in wider governance roles. Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal. Citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell's view of the future has not come to pass; in fact, the internet has ensured the opposite. There is no Big Brother, but we are all looking over each other's shoulders all the time, while a number of big corporations capture and sell all this collective endeavour back to us.
Abstract:
Energy consumption has become an important area of research of late. With the advent of new manycore processors, situations have arisen where not all the processors need to be active to reach an optimal balance between performance and energy usage. In this paper a study of the power and energy usage of a series of benchmarks, the PARSEC and SPLASH-2X benchmark suites, on the Intel Xeon Phi for different thread configurations is presented. To carry out this study, a tool was designed to monitor and record the power usage in real time during execution and afterwards to compare the results.
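A minimal sketch of this kind of sampling-based monitor, assuming a platform-specific power sensor: it polls a placeholder power reading at a fixed interval while the benchmark runs and integrates the samples into an energy estimate. The sensor read, sampling interval and example command are assumptions for illustration, not details of the authors' tool.

```python
import subprocess
import threading
import time

def read_power_watts():
    """Placeholder for a platform-specific power read (e.g. RAPL energy
    counters or a coprocessor management interface). Returns a dummy
    constant here so the sketch runs end-to-end."""
    return 100.0  # assumed value; replace with a real sensor query

def monitor_energy(cmd, interval=0.1):
    """Run `cmd`, sample power every `interval` seconds in a background
    thread, and integrate the samples into an energy estimate (joules)
    using the trapezoidal rule."""
    samples = []
    stop = threading.Event()

    def sampler():
        while not stop.is_set():
            samples.append((time.time(), read_power_watts()))
            time.sleep(interval)

    thread = threading.Thread(target=sampler)
    thread.start()
    start = time.time()
    subprocess.run(cmd, check=True)
    runtime = time.time() - start
    stop.set()
    thread.join()

    energy = sum(0.5 * (p0 + p1) * (t1 - t0)
                 for (t0, p0), (t1, p1) in zip(samples, samples[1:]))
    return runtime, energy

# Example (hypothetical benchmark invocation):
# runtime, joules = monitor_energy(["./blackscholes", "8", "in_16K.txt", "prices.txt"])
```

Repeating the measurement across thread counts then gives the runtime-versus-energy trade-off curve that such studies report.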
Abstract:
The advent of microneedle (MN) technology has provided a revolutionary platform for the delivery of therapeutic agents, particularly in the field of gene therapy. For over 20 years, the area of gene therapy has undergone intense innovation and progression which has seen advancement of the technology from an experimental concept to a widely acknowledged strategy for the treatment and prevention of numerous disease states. However, the true potential of gene therapy has yet to be achieved due to limitations in formulation and delivery technologies beyond parenteral injection of the DNA. Microneedle-mediated delivery provides a unique platform for the delivery of DNA therapeutics clinically. It provides a means to overcome the skin barriers to gene delivery and deposit the DNA directly into the dermal layers, a key site for delivery of therapeutics to treat a wide range of skin and cutaneous diseases. Additionally, the skin is a tissue rich in immune sentinels, an ideal target for the delivery of a DNA vaccine directly to the desired target cell populations. This review details the advancement of MN-mediated DNA delivery from proof-of-concept to the delivery of DNA encoding clinically relevant proteins and antigens and examines the key considerations for the improvement of the technology and progress into a clinically applicable delivery system.
Abstract:
Time domain astronomy has come of age with astronomers now able to monitor the sky at high cadence both across the electromagnetic spectrum and using neutrinos and gravitational waves. The advent of new observing facilities permits new science, but the ever increasing throughput of facilities demands efficient communication of coincident detections and better subsequent coordination among the scientific community so as to turn detections into scientific discoveries. To discuss the revolution occurring in our ability to monitor the Universe and the challenges it brings, on 2012 April 25-26 a group of scientists from observational and theoretical teams studying transients met with representatives of the major international transient observing facilities at the Kavli Royal Society International Centre, UK. This immediately followed the Royal Society Discussion meeting "New windows on transients across the Universe" held in London. Here we present a summary of the Kavli meeting at which the participants discussed the science goals common to the transient astronomy community and analysed how to better meet the challenges ahead as ever more powerful observational facilities come on stream.
Abstract:
Molecular techniques have a key role to play in laboratory and clinical haematology. Restriction enzymes allow nucleic acids to be reduced in size for subsequent analysis. In addition they allow selection of specific DNA or RNA sequences for cloning into bacterial plasmids. These plasmids are naturally occurring DNA molecules which reside in bacterial cells. They can be manipulated to act as vehicles or carriers for biologically and medically important genes, allowing the production of large amounts of cloned material for research purposes or to aid in the production of medically important recombinant molecules such as insulin. As acquired or inherited genetic changes are implicated in a wide range of haematological diseases, it is necessary to have highly specific and sensitive assays to detect these mutations. Most of these techniques rely on nucleic acid hybridisation, benefitting from the ability of DNA or RNA to bind tightly to complementary bases in the nucleic acid structure. Production of artificial DNA molecules called probes permits nucleic acid hybridisation assays to be performed, using the techniques of Southern blotting or dot blot analysis. In addition, the base composition of any gene or region of DNA can be determined using DNA sequencing technology. The advent of the polymerase chain reaction (PCR) has revolutionised all aspects of medicine, but has particular relevance in haematology, where easy access to biopsy material provides a wealth of material for analysis. PCR permits quick and reliable manipulation of sample material, and its ability to be automated makes it an ideal tool for use in the haematology laboratory.
Abstract:
Context. Thanks to the advent of Herschel and ALMA, new high-quality observations of molecules present in the circumstellar envelopes of asymptotic giant branch (AGB) stars are being reported that reveal large differences from the existing chemical models. New molecular data and more comprehensive models of the chemistry in circumstellar envelopes are now available.
Aims: The aims are to determine and study the important formation and destruction pathways in the envelopes of O-rich AGB stars and to provide more reliable predictions of abundances, column densities, and radial distributions for potentially detectable species with physical conditions applicable to the envelope surrounding IK Tau.
Methods: We use a large gas-phase chemical model of an AGB envelope including the effects of CO and N2 self-shielding in a spherical geometry and a newly compiled list of inner-circumstellar envelope parent species derived from detailed modeling and observations. We trace the dominant chemistry in the expanding envelope and investigate the chemistry as a probe for the physics of the AGB phase by studying variations of abundances with mass-loss rates and expansion velocities.
Results: We find a pattern of daughter molecules forming from the photodissociation products of parent species, with contributions from ion-neutral abstraction and dissociative recombination. The chemistry in the outer zones differs from that in traditional PDRs in that photoionization of daughter species plays a significant role. With the proper treatment of self-shielding, the N → N2 and C+ → CO transitions are shifted outward by factors of 7 and 2, respectively, compared with earlier models. An upper limit on the abundance of CH4 as a parent species (≲2.5 × 10⁻⁶ with respect to H2) is found for IK Tau, and several potentially observable molecules with relatively simple chemical links to other parent species are determined. The assumed stellar mass-loss rate, in particular, has an impact on the calculated abundances of cations and the peak-abundance radius of both cations and neutrals: as the mass-loss rate increases, the peak abundance of cations generally decreases and the peak-abundance radius of all species moves outwards. The effects of varying the envelope expansion velocity and cosmic-ray ionization rate are not as significant.
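As a generic illustration of this kind of circumstellar kinetics calculation, the sketch below integrates a toy parent → daughter photochemistry along a constant-velocity outflow with simple 1/r dust attenuation. The rates, radii and initial abundance are placeholders, and the real model uses a full reaction network with CO and N2 self-shielding rather than this two-species caricature.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-step photochemistry in a constant-velocity AGB outflow:
# parent P -> daughter D (photodissociation), then D -> D+ (photoionization).
# All values are illustrative placeholders, not numbers from the paper.
V_EXP = 17.7e5     # expansion velocity [cm/s], IK Tau-like (assumed)
K_P0  = 1.0e-10    # unshielded photodissociation rate of P [1/s] (assumed)
K_D0  = 5.0e-11    # unshielded photoionization rate of D [1/s] (assumed)
R_TAU = 1.0e16     # radius where the radial UV dust optical depth ~ 1 [cm]

def tau_uv(r):
    # In a smooth 1/r^2 wind the radial dust optical depth falls off as 1/r
    return R_TAU / r

def rhs(r, x):
    x_p, x_d = x
    att = np.exp(-tau_uv(r))            # dust attenuation of interstellar UV
    dxp_dr = -K_P0 * att * x_p / V_EXP  # steady outflow: d/dr = (1/v) d/dt
    dxd_dr = (K_P0 * att * x_p - K_D0 * att * x_d) / V_EXP
    return [dxp_dr, dxd_dr]

# Start at 1e15 cm with a parent abundance of 1e-6 relative to H2 (assumed)
sol = solve_ivp(rhs, (1e15, 1e18), [1e-6, 0.0],
                rtol=1e-8, atol=1e-30, dense_output=True)
for r in np.logspace(15, 18, 7):
    x_p, x_d = sol.sol(r)
    print(f"r = {r:.1e} cm  x(P) = {x_p:.2e}  x(D) = {x_d:.2e}")
```

Even this toy version reproduces the qualitative behaviour discussed above: the daughter abundance peaks at an intermediate radius, and lowering the effective destruction rates (or raising the shielding) pushes that peak outwards.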
Abstract:
The advent of novel genomic technologies that enable the evaluation of genomic alterations on a genome-wide scale has significantly altered the field of genomic marker research in solid tumors. Researchers have moved away from the traditional model of identifying a particular genomic alteration and evaluating the association between this finding and a clinical outcome measure, towards a new approach involving the identification and measurement of multiple genomic markers simultaneously within clinical studies. This in turn has presented additional challenges for the use of genomic markers in oncology, such as clinical study design, reproducibility, and the interpretation and reporting of results. This Review explores these challenges, focusing on microarray-based gene-expression profiling, and highlights some common failings in study design that have affected the use of putative genomic markers in the clinic. Despite these rapid technological advances, there is still a paucity of genomic markers in routine clinical use. A rational and focused approach to the evaluation and validation of genomic markers is needed, whereby analytically validated markers are investigated in clinical studies that are adequately powered and have pre-defined patient populations and study endpoints. Furthermore, novel adaptive clinical trial designs, incorporating putative genomic markers into prospective clinical trials, will enable the evaluation of these markers in a rigorous and timely fashion. Such approaches have the potential to facilitate the implementation of such markers into routine clinical practice and consequently enable the rational and tailored use of cancer therapies for individual patients. © 2010 Macmillan Publishers Limited. All rights reserved.