857 results for "complexity of agents"


Relevance: 90.00%

Abstract:

Snakebites are a major neglected tropical disease responsible for as many as 95,000 deaths every year worldwide. Viper venom serine proteases disrupt the haemostasis of prey and victims by affecting various stages of the blood coagulation system. A better understanding of their sequence, structure, function and phylogenetic relationships will improve knowledge of the associated pathological conditions and aid the development of novel therapeutics for treating snakebites. A large dataset of all available viper venom serine proteases was compiled and analysed to study various features of these enzymes. Despite the large number of venom serine protease sequences available, only a small proportion have been functionally characterised. Although they share common features such as a C-terminal extension, a GWG motif and disulphide linkages, they vary widely in features such as isoelectric points, potential N-glycosylation sites and functional characteristics. Some of the serine proteases contain substitutions of one or more of the critical residues in the catalytic triad or primary specificity pockets. Phylogenetic analysis clustered all the sequences into three major groups, with the sequences carrying substitutions in the catalytic triad or specificity pocket clustering together in separate groups. Our study provides the most complete information on viper venom serine proteases to date and improves current knowledge of the sequence, structure, function and phylogenetic relationships of these enzymes. This collective analysis of venom serine proteases will help in understanding the complexity of envenomation and in identifying potential therapeutic avenues.

Relevance: 90.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and about 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to address the question of what resolution, both horizontal and vertical, is necessary in atmospheric and ocean models for more confident predictions at the regional and local level. Current limitations in computing power have placed severe restrictions on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.
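The capability targets quoted above (about 20 petaflops in the near term, 200 petaflops within five years, 1 exaflop by the end of the next decade) imply sustained exponential growth. A quick illustrative calculation of the implied compound annual growth, assuming a ten-year horizon for the final step:

```python
# Implied growth of the proposed facility capability (illustrative arithmetic only).
# Targets from the text: ~20 petaflops near term, ~200 petaflops within five years,
# ~1 exaflop (1000 petaflops) by the end of the next decade (assumed ~10 more years).

def annual_growth(start_pf: float, end_pf: float, years: float) -> float:
    """Compound annual growth factor needed to go from start_pf to end_pf."""
    return (end_pf / start_pf) ** (1.0 / years)

step1 = annual_growth(20, 200, 5)     # 10x over the first five years
step2 = annual_growth(200, 1000, 10)  # 5x over the following decade

print(f"Implied annual growth, years 0-5:  {step1:.2f}x")
print(f"Implied annual growth, years 5-15: {step2:.2f}x")
```

The first step demands roughly 1.6x per year, steeper than the second step's roughly 1.2x per year.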

Relevance: 90.00%

Abstract:

This article provides time series data on the medieval market in freehold land, including the changing social composition of freeholders, level of market activity, size and complexity of landholdings, and shifts in the market value of land. These are subjects hitherto largely ignored due, in part, to the disparate nature of the evidence. It argues that feet of fines, despite archival limitations, if employed with care and an understanding of the underlying changes in the common law of real property, are capable of providing quantifiable evidence spanning hundreds of years and comparable across large areas of England.

Relevance: 90.00%

Abstract:

This article argues that a native-speaker baseline is a neglected dimension of studies into second language (L2) performance. If we investigate how learners perform language tasks, we should distinguish which performance features are due to their processing an L2 and which are due to their performing a particular task. Having defined what we mean by “native speaker,” we present the background to a research study into the effects of task features on nonnative task performance, designed to include native-speaker data as a baseline for interpreting nonnative-speaker performance. The nonnative results, published in this journal (Tavakoli & Foster, 2008), are recapitulated, and then the native-speaker results are presented and discussed in the light of them. The study is guided by the assumption that limited attentional resources impact on L2 performance and explores how narrative design features—namely complexity of storyline and tightness of narrative structure—affect complexity, fluency, accuracy, and lexical diversity in language. The results show that both native and nonnative speakers are prompted by storyline complexity to use more subordinated language, but narrative structure had different effects on native and nonnative fluency. The learners, who were based in either London or Tehran, did not differ in their performance when compared to each other, except in lexical diversity, where the learners in London were close to native-speaker levels. The implications of the results for the applicability of Levelt’s model of speaking to an L2 are discussed, as is the potential for further L2 research using native speakers as a baseline.

Relevance: 90.00%

Abstract:

The overarching aim of the research reported here was to investigate the effects of task structure and storyline complexity of oral narrative tasks on second language task performance. Participants were 60 Iranian learners of English who performed six narrative tasks of varying degrees of structure and storyline complexity in an assessment setting. A number of detailed analytic measures were employed to examine whether there were any differences in the participants’ performances elicited by the different tasks in terms of accuracy, fluency, syntactic complexity and lexical diversity. Results of the data analysis showed that performance in the more structured tasks was more accurate and, to a great extent, more fluent than that in the less structured tasks. The results further revealed that the syntactic complexity of L2 performance was related to storyline complexity, i.e. greater syntactic complexity was associated with narratives that had both foreground and background storylines. These findings strongly suggest that there is systematic variance in the participants’ performance triggered by the different aspects of task design.
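Lexical diversity, one of the performance measures named above, can be illustrated with a minimal sketch. The raw type-token ratio below is a deliberately simple stand-in (task-performance studies generally prefer length-corrected indices, since raw TTR falls as narratives get longer), and the sample sentence is hypothetical:

```python
# Minimal sketch of lexical-diversity measures: raw type-token ratio (TTR)
# and a simple length-corrected variant (Guiraud's index). Illustrative only;
# published studies typically use more robust indices such as D or MTLD.
import math

def type_token_ratio(text: str) -> float:
    """Distinct word forms (types) divided by total words (tokens)."""
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens)

def guiraud_index(text: str) -> float:
    """Types divided by the square root of tokens, a simple length correction."""
    tokens = text.lower().split()
    return len(set(tokens)) / math.sqrt(len(tokens))

sample = "the boy saw the dog and the dog saw the boy"  # hypothetical narrative fragment
print(f"TTR:     {type_token_ratio(sample):.3f}")   # 5 types over 11 tokens
print(f"Guiraud: {guiraud_index(sample):.3f}")
```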

Relevance: 90.00%

Abstract:

Control and optimization of flavor is the ultimate challenge for the food and flavor industry. The major route to flavor formation during thermal processing is the Maillard reaction, which is a complex cascade of interdependent reactions initiated by the reaction between a reducing sugar and an amino compound. The complexity of the reaction means that researchers turn to kinetic modeling in order to understand the control points of the reaction and to manipulate the flavor profile. Studies of the kinetics of flavor formation have developed over the past 30 years from single-response empirical models of binary aqueous systems to sophisticated multi-response models in food matrices, based on the underlying chemistry, with the power to predict the formation of some key aroma compounds. This paper discusses in detail the development of kinetic models of thermal generation of flavor and looks at the challenges involved in predicting flavor.
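A single-response empirical model of the kind the review describes as a starting point can be sketched as a first-order rate law with an Arrhenius temperature dependence. The parameter values below are illustrative assumptions, not fitted Maillard data:

```python
# Sketch of a single-response kinetic model: first-order loss of a reducing
# sugar with an Arrhenius rate constant, integrated by a simple Euler step.
# Parameter values (k_ref, Ea) are illustrative, not fitted Maillard data.
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def rate_constant(T_kelvin: float, k_ref: float = 1e-3,
                  T_ref: float = 373.15, Ea: float = 100e3) -> float:
    """Arrhenius rate constant expressed relative to a reference temperature."""
    return k_ref * math.exp(-(Ea / R) * (1.0 / T_kelvin - 1.0 / T_ref))

def simulate(conc0: float, T_kelvin: float, t_end: float, dt: float = 1.0) -> float:
    """Euler integration of dC/dt = -k C; returns the final concentration."""
    k = rate_constant(T_kelvin)
    c, t = conc0, 0.0
    while t < t_end:
        c += -k * c * dt
        t += dt
    return c

# Higher temperature -> faster loss of the sugar (more flavor precursors formed).
print(f"remaining at 100 C: {simulate(1.0, 373.15, 600):.3f}")
print(f"remaining at 120 C: {simulate(1.0, 393.15, 600):.3f}")
```

Multi-response models of the kind the paper reviews couple many such rate laws for intermediates and products, fitted simultaneously to several measured species.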

Relevance: 90.00%

Abstract:

The last decade has seen successful clinical application of polymer–protein conjugates (e.g. Oncaspar, Neulasta) and promising results in clinical trials with polymer–anticancer drug conjugates. This, together with the realisation that nanomedicines may play an important future role in cancer diagnosis and treatment, has increased interest in this emerging field. More than 10 anticancer conjugates have now entered clinical development. Phase I/II clinical trials involving N-(2-hydroxypropyl)methacrylamide (HPMA) copolymer-doxorubicin (PK1; FCE28068) showed a four- to fivefold reduction in anthracycline-related toxicity, and, despite cumulative doses up to 1680 mg/m2 (doxorubicin equivalent), no cardiotoxicity was observed. Antitumour activity in chemotherapy-resistant/refractory patients (including breast cancer) was also seen at doxorubicin doses of 80–320 mg/m2, consistent with tumour targeting by the enhanced permeability and retention (EPR) effect. Preclinical and clinical hints that polymer–anthracycline conjugation can bypass multidrug resistance (MDR) reinforce our hope that polymer drugs will prove useful in improving treatment of endocrine-related cancers. These promising early clinical results open the possibility of using water-soluble polymers as platforms for delivery of a cocktail of pendant drugs. In particular, we have recently described the first conjugates to combine endocrine therapy and chemotherapy. Their markedly enhanced in vitro activity encourages further development of such novel, polymer-based combination therapies. This review briefly describes the current status of polymer therapeutics as anticancer agents, and discusses the opportunities for design of second-generation, polymer-based combination therapy, including the cocktail of agents that will be needed to treat resistant metastatic cancer.

Relevance: 90.00%

Abstract:

Mean field models (MFMs) of cortical tissue incorporate salient, average features of neural masses in order to model activity at the population level, thereby linking microscopic physiology to macroscopic observations, e.g., with the electroencephalogram (EEG). One of the common aspects of MFM descriptions is the presence of a high-dimensional parameter space capturing neurobiological attributes deemed relevant to the brain dynamics of interest. We study the physiological parameter space of a MFM of electrocortical activity and discover robust correlations between physiological attributes of the model cortex and its dynamical features. These correlations are revealed by the study of bifurcation plots, which show that the model responses to changes in inhibition belong to two archetypal categories or “families”. After investigating and characterizing them in depth, we discuss their essential differences in terms of four important aspects: power responses with respect to the modeled action of anesthetics, reaction to exogenous stimuli such as thalamic input, distributions of model parameters, and oscillatory repertoires when inhibition is enhanced. Furthermore, while the complexity of sustained periodic orbits differs significantly between families, we show how metamorphoses between the families can be brought about by exogenous stimuli. Here we unveil links between measurable physiological attributes of the brain and dynamical patterns that are not accessible by linear methods; they emerge instead when the nonlinear structure of parameter space is partitioned according to bifurcation responses. We call this general method “metabifurcation analysis”. The partitioning cannot be achieved by investigating only a small number of parameter sets; it is instead the result of an automated bifurcation analysis of a representative sample of 73,454 physiologically admissible parameter sets. Our approach generalizes straightforwardly and is well suited to probing the dynamics of other models with large and complex parameter spaces.
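The family classification above rests on automated bifurcation analysis over a large sample of parameter sets. As a toy stand-in (the logistic map, not the electrocortical MFM itself), a parameter range can be classified by whether a sweep of the control parameter stays on a stable fixed point or not:

```python
# Toy sketch of classifying parameter ranges by bifurcation response.
# The logistic map x -> r x (1 - x) stands in for a real model: sweeping the
# control parameter r, a trajectory either settles onto a fixed point or
# exhibits sustained oscillation (period doubling and beyond).

def settles_to_fixed_point(r: float, x0: float = 0.5,
                           burn_in: int = 500, tol: float = 1e-6) -> bool:
    x = x0
    for _ in range(burn_in):        # discard the transient
        x = r * x * (1 - x)
    x_next = r * x * (1 - x)
    return abs(x_next - x) < tol    # fixed point iff consecutive iterates agree

def classify(r_low: float, r_high: float, steps: int = 50) -> str:
    """Family 'A' if every r in the sweep yields a fixed point, else 'B'."""
    rs = [r_low + (r_high - r_low) * i / (steps - 1) for i in range(steps)]
    return "A" if all(settles_to_fixed_point(r) for r in rs) else "B"

print(classify(2.5, 2.9))  # stable fixed points throughout -> family "A"
print(classify(3.2, 3.6))  # period doubling has begun      -> family "B"
```

Scaled up to tens of thousands of parameter sets, such automated per-set classification is the spirit of the partitioning described above, though the real analysis uses numerical continuation rather than brute-force iteration.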

Relevance: 90.00%

Abstract:

The bewildering complexity of cortical microcircuits at the single cell level gives rise to surprisingly robust emergent activity patterns at the level of laminar and columnar local field potentials (LFPs) in response to targeted local stimuli. Here we report the results of our multivariate data-analytic approach based on simultaneous multi-site recordings using micro-electrode-array chips for investigation of the microcircuitry of rat somatosensory (barrel) cortex. We find high repeatability of stimulus-induced responses, and typical spatial distributions of LFP responses to stimuli in supragranular, granular, and infragranular layers, where the last form a particularly distinct class. Population spikes appear to travel at about 33 cm/s from granular to infragranular layers. Responses within barrel-related columns have different profiles than those in neighbouring columns to the left or interchangeably to the right. Variations between slices occur, but can be minimized by strictly obeying controlled experimental protocols. Cluster analysis on normalized recordings indicates specific spatial distributions of time series reflecting the location of sources and sinks independent of the stimulus layer. Although the precise correspondences between single cell activity and LFPs are still far from clear, a sophisticated neuroinformatics approach in combination with multi-site LFP recordings in the standardized slice preparation is suitable for comparing normal conditions to genetically or pharmacologically altered situations based on real cortical microcircuitry.
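The quoted propagation speed follows from latency differences between recording depths; a minimal calculation with hypothetical electrode spacing and latency (not the paper's raw data):

```python
# Sketch: estimating the propagation speed of a population spike from the
# latency shift between two recording depths. The numbers are hypothetical,
# chosen only to show the unit conversion behind a figure like ~33 cm/s.

def propagation_speed_cm_per_s(distance_um: float, latency_ms: float) -> float:
    """Speed = distance / latency, converted to cm/s."""
    distance_cm = distance_um * 1e-4   # micrometres -> centimetres
    latency_s = latency_ms * 1e-3      # milliseconds -> seconds
    return distance_cm / latency_s

# e.g. 500 um between granular and infragranular sites, 1.5 ms latency shift
print(f"{propagation_speed_cm_per_s(500, 1.5):.1f} cm/s")  # ~33.3 cm/s
```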

Relevance: 90.00%

Abstract:

The great majority of plant species in the tropics require animals to achieve pollination, but the exact role of floral signals in attraction of animal pollinators is often debated. Many plants provide a floral reward to attract a guild of pollinators, and it has been proposed that floral signals of non-rewarding species may converge on those of rewarding species to exploit the relationship of the latter with their pollinators. In the orchid family (Orchidaceae), pollination is almost universally animal-mediated, but a third of species provide no floral reward, which suggests that deceptive pollination mechanisms are prevalent. Here, we examine floral colour and shape convergence in Neotropical plant communities, focusing on certain food-deceptive Oncidiinae orchids (e.g. Trichocentrum ascendens and Oncidium nebulosum) and rewarding species of Malpighiaceae. We show that the species from these two distantly related families are often more similar in floral colour and shape than expected by chance and propose that a system of multifarious floral mimicry—a form of Batesian mimicry that involves multiple models and is more complex than a simple one model–one mimic system—operates in these orchids. The same mimetic pollination system has evolved at least 14 times within the species-rich Oncidiinae throughout the Neotropics. These results help explain the extraordinary diversification of Neotropical orchids and highlight the complexity of plant–animal interactions.
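A claim that two groups are "more similar than expected by chance" is typically backed by a null-model comparison. A minimal permutation-test sketch on hypothetical one-dimensional colour values (not the study's actual trait measurements, which are multivariate):

```python
# Sketch of a permutation test for floral-trait similarity between two groups.
# The observed mean pairwise distance between orchid and model traits is
# compared to the distribution obtained after shuffling group labels.
# All data here are hypothetical single-number "colour values".
import random

def mean_cross_distance(a: list, b: list) -> float:
    return sum(abs(x - y) for x in a for y in b) / (len(a) * len(b))

def permutation_p_value(a: list, b: list, n_perm: int = 2000, seed: int = 1) -> float:
    rng = random.Random(seed)
    observed = mean_cross_distance(a, b)
    pooled = a + b
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if mean_cross_distance(pooled[:len(a)], pooled[len(a):]) <= observed:
            count += 1
    return count / n_perm  # small p -> groups more similar than chance expects

orchids = [0.10, 0.12, 0.11, 0.13]  # hypothetical deceptive-orchid colour values
models = [0.11, 0.12, 0.10, 0.14]   # hypothetical rewarding Malpighiaceae values
print(f"p = {permutation_p_value(orchids, models):.3f}")
```

Real analyses of floral colour would use receiver-specific colour spaces and shape descriptors, but the permutation logic against a chance baseline is the same.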

Relevance: 90.00%

Abstract:

Approaches to natural resource management emphasise the importance of involving local people and institutions in order to build capacity, limit costs, and achieve environmental sustainability. Governments worldwide, often encouraged by international donors, have formulated devolution policies and legal instruments that provide an enabling environment for devolved natural resource management. However, implementation of these policies reveals serious challenges. This article explores the effects of limited involvement of local people and institutions in policy development and implementation. An in-depth study of the Forest Policy of Malawi and Village Forest Areas in the Lilongwe district provides an example of externally driven policy development that seeks to promote local management of natural resources. The article argues that policy which has weak ownership by national government and does not adequately consider the complexity of local institutions, together with the effects of previous initiatives on them, can create a cumulative legacy through which destructive resource use practices and social conflict may be reinforced. In short, poorly developed and implemented community-based natural resource management policies can do considerably more harm than good. Approaches are needed that enable the policy development process to embed an in-depth understanding of local institutions whilst incorporating flexibility to account for their location-specific nature. This demands further research on policy design to enable rigorous identification of positive and negative institutions and ex-ante exploration of the likely effects of different policy interventions.

Relevance: 90.00%

Abstract:

Aim: To examine the causes of prescribing and monitoring errors in English general practices and provide recommendations for how they may be overcome. Design: Qualitative interview and focus group study with purposive sampling and thematic analysis informed by Reason’s accident causation model. Participants: General practice staff participated in a combination of semi-structured interviews (n=34) and six focus groups (n=46). Setting: Fifteen general practices across three primary care trusts in England. Results: We identified seven categories of high-level error-producing conditions: the prescriber, the patient, the team, the task, the working environment, the computer system, and the primary-secondary care interface. Each of these was further broken down to reveal various error-producing conditions. The prescriber’s therapeutic training, drug knowledge and experience, knowledge of the patient, perception of risk, and physical and emotional health were all identified as possible causes. The patient’s characteristics and the complexity of the individual clinical case were also found to have contributed to prescribing errors. The importance of feeling comfortable within the practice team was highlighted, as was the safety of general practitioners (GPs) in signing prescriptions generated by nurses when they had not seen the patient for themselves. The working environment, with its high workload, time pressures, and interruptions, and computer-related issues associated with mis-selecting drugs from electronic pick-lists and overriding alerts, were all highlighted as possible causes of prescribing errors and were often interconnected. Conclusion: This study has highlighted the complex underlying causes of prescribing and monitoring errors in general practices, several of which are amenable to intervention.

Relevance: 90.00%

Abstract:

For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.
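The reported sensitivity pattern (strong response to roof parameters, weak response to road-related ones) can be illustrated with the simplest possible probe, a one-at-a-time perturbation scan. MOSCEM itself is a far more thorough multi-objective sampler, and the toy model below is a hypothetical stand-in, not the WRF urban canopy scheme:

```python
# Schematic one-at-a-time sensitivity scan over a model's input parameters.
# toy_flux_model is a hypothetical stand-in mapping urban-surface parameters
# to a scalar output; MOSCEM performs a far more thorough multi-objective
# sampling of parameter space than this simple perturbation probe.

def toy_flux_model(params: dict) -> float:
    # Hypothetical: output responds strongly to roof albedo, weakly to road albedo.
    return 100 * params["roof_albedo"] + 2 * params["road_albedo"]

def sensitivity(model, base: dict, rel_step: float = 0.1) -> dict:
    """Relative output change per parameter for a +10% perturbation of each."""
    base_out = model(base)
    result = {}
    for name, value in base.items():
        perturbed = dict(base)
        perturbed[name] = value * (1 + rel_step)
        result[name] = abs(model(perturbed) - base_out) / abs(base_out)
    return result

base = {"roof_albedo": 0.2, "road_albedo": 0.1}
for name, s in sensitivity(toy_flux_model, base).items():
    print(f"{name}: {s:.4f}")
```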

Relevance: 90.00%

Abstract:

With rising public awareness of climate change, celebrities have become an increasingly important community of non-nation-state ‘actors’ influencing discourse and action, thereby comprising an emergent climate science–policy–celebrity complex. Some feel that these amplified and prominent voices contribute to greater public understanding of climate change science, as well as potentially catalyze climate policy cooperation. However, critics posit that increased involvement from the entertainment industry has not served to influence substantive long-term advancements in these arenas; rather, it has instead reduced the politics of climate change to the domain of fashion and fad, devoid of political and public saliency. Through tracking media coverage in Australia, Canada, the United States, and the United Kingdom, we map out the terrain of a ‘Politicized Celebrity System’ in attempts to cut through dualistic characterizations of celebrity involvement in politics. We develop a classification system of the various types of climate change celebrity activities, and situate these movements in contemporary consumer- and spectacle-driven carbon-based society. Through these analyses, we place dynamic and contested interactions in a spatially and temporally sensitive ‘Cultural Circuits of Climate Change Celebrities’ model. In so doing, first we explore how these newly ‘authorized’ speakers and ‘experts’ might open up spaces in the public sphere and the science/policy nexus through ‘celebritization’ effects. Second, we examine how the celebrity as the ‘heroic individual’ seeking ‘conspicuous redemption’ may focus climate change actions through individualist frames. Overall, this paper explores potential promises, pitfalls and contradictions of this increasingly entrenched set of ‘agents’ in the cultural politics of climate change. Thus, as a form of climate change action, we consider whether it is more effective to ‘plant’ celebrities instead of trees.

Relevance: 90.00%

Abstract:

Background: Expression microarrays are increasingly used to obtain large scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables defining the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
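The variability estimates quoted above reduce to standard deviations of log2-transformed replicate signals. A minimal sketch with hypothetical signal values (the published pipeline is an R function; Python is used here purely for illustration):

```python
# Sketch of replicate-variability estimation on log2-transformed signals.
# Signal values are hypothetical; the study's actual pipeline is an R function.
import math
from statistics import stdev

def log2_signals(raw):
    """Log2-transform raw intensity values, as is standard for array signals."""
    return [math.log2(x) for x in raw]

# Replicate measurements of one gene on four separate arrays (hypothetical).
replicate_raw = [980.0, 1250.0, 1100.0, 870.0]
logs = log2_signals(replicate_raw)
print(f"inter-array SD: {stdev(logs):.2f} log2 units")
```

An SD computed this way across many replicated genes is what underlies a summary figure such as "around 0.5 log2 units" of inter-array variability.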