997 results for collective models
Abstract:
Recent advances in remote sensing technologies have facilitated the generation of very high resolution (VHR) environmental data. Exploratory studies suggested that, if used in species distribution models (SDMs), these data should enable modelling species' micro-habitats and allow improving predictions for fine-scale biodiversity management. In the present study, we tested the influence, in SDMs, of predictors derived from a VHR digital elevation model (DEM) by comparing the predictive power of models for 239 plant species and their assemblages fitted at six different resolutions in the Swiss Alps. We also tested whether changes in model quality for a species are related to its functional and ecological characteristics. Refining the resolution yielded only slight improvements of the models for more than half of the examined species, with the best results obtained at 5 m, but no significant improvement was observed, on average, across all species. Contrary to our expectations, we could not consistently correlate the changes in model performance with species characteristics such as vegetation height. Temperature, the most important variable in the SDMs across the different resolutions, did not contribute any substantial improvement. Our results suggest that improving the resolution of topographic data alone is not sufficient to improve SDM predictions, and therefore local management, compared to previously used resolutions (here 25 and 100 m). More effort should now be dedicated to finer-scale in-situ environmental measurements (e.g. temperature, moisture, snow) to support fine-scale species mapping and management.
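To make the resolution comparison concrete, the sketch below fits a simple presence/absence SDM on predictors extracted at several resolutions and compares cross-validated AUC. The logistic-regression model, the scikit-learn dependency, and the random placeholder data are illustrative assumptions, not the models or field data used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

def evaluate_resolution(presence, predictors):
    """Fit a simple logistic-regression SDM on predictors extracted at one DEM
    resolution and return its 5-fold cross-validated AUC."""
    model = LogisticRegression(max_iter=1000)
    probs = cross_val_predict(model, predictors, presence, cv=5,
                              method="predict_proba")[:, 1]
    return roc_auc_score(presence, probs)

# Hypothetical inputs: one presence/absence vector per species and one predictor
# matrix per resolution (e.g. topographic variables at 5 m, 25 m, 100 m).
rng = np.random.default_rng(0)
presence = rng.integers(0, 2, size=200)
for res in (5, 25, 100):
    X = rng.normal(size=(200, 4))   # placeholder predictor set for this resolution
    print(res, "m:", round(evaluate_resolution(presence, X), 3))
```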
Abstract:
Several methods and algorithms have recently been proposed that allow for the systematic evaluation of simple neuron models from intracellular or extracellular recordings. Models built in this way generate good quantitative predictions of the future activity of neurons under temporally structured current injection. It is, however, difficult to compare the advantages of various models and algorithms, since each model is designed for a different set of data. Here, we report on one of the first attempts to establish a benchmark test that permits a systematic comparison of methods and performances in predicting the activity of rat cortical pyramidal neurons. We present early submissions to the benchmark test and discuss implications for the design of future tests and simple neuron models.
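As a rough illustration of how such a benchmark can score a submission, the sketch below counts coincidences between predicted and recorded spike times within a small tolerance window; the 4 ms window and the simple two-pointer matching rule are illustrative assumptions, not the exact metric used in the benchmark.

```python
def coincidence_count(spikes_model, spikes_data, window=0.004):
    """Count recorded spikes that have a predicted spike within +/- window seconds,
    the raw ingredient of coincidence-based prediction scores."""
    matched, i = 0, 0
    for t in spikes_data:
        # advance past model spikes that are too early to match this data spike
        while i < len(spikes_model) and spikes_model[i] < t - window:
            i += 1
        if i < len(spikes_model) and abs(spikes_model[i] - t) <= window:
            matched += 1
    return matched

# Hypothetical spike times in seconds
data = [0.010, 0.052, 0.110, 0.200]
model = [0.012, 0.054, 0.108, 0.150]
print(coincidence_count(model, data))  # -> 3
```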
Abstract:
Significant evaporitic sedimentation occurred during the Paleogene (Eocene to lower Oligocene) in the Barberà sector of the southeastern margin of the Tertiary Ebro Basin. This sedimentation took place in shallow lacustrine environments and was controlled by a number of factors: 1) the tectonic structuring of the margin; 2) the high calcium sulphate content of the meteoric waters coming from the marginal reliefs; 3) the semiarid climate; and 4) the development of large alluvial fans along the basin margin, which also conditioned the location of the saline lakes. The evaporites are currently composed of secondary gypsum at the surface and anhydrite at depth. There are, however, vestiges of the local presence of sodium sulphates. The evaporite units, with individual thicknesses ranging between 50 and 100 m, are intercalated within various lithostratigraphic formations and exhibit a paleogeographical pattern. The units located closer to the basin margin are characterized by a massive gypsum lithofacies (originally, bioturbated gypsum) bearing chert, and also locally by meganodular gypsum (originally, meganodules of anhydrite) in association with red lutites and clastic intercalations (gypsarenites, sandstones and conglomerates). Chert, which is only linked to the thickest gypsum layers, seems to be an early diagenetic, lacustrine product. Cyclicity in these proximal units indicates the progressive development of low-salinity lacustrine bodies on red mud flats. At the top of some cycles, exposure episodes commonly resulted in dissolution, erosion, and the formation of edaphic features. In contrast, the units located in a more distal position with regard to the basin margin are formed by an alternation of banded-nodular gypsum and laminated gypsum layers in association with grey lutites and sparse clastic intercalations. These distal units formed in saline lakes with a higher ionic concentration. Exposure episodes in these lakes resulted in the formation of synsedimentary anhydrite and sabkha cycles. In some of these units, however, outer rims characterized by a lithofacies association similar to that of the proximal units occur (nodular gypsum, massive gypsum and chert nodules).
Abstract:
Hsp70s are conserved molecular chaperones that can prevent protein aggregation, actively unfold and solubilize aggregates, pull translocating proteins across membranes, and remodel native protein complexes. Disparate mechanisms have been proposed for the various modes of Hsp70 action: passive prevention of aggregation by kinetic partitioning, peptide-bond isomerase activity, Brownian ratcheting, or active power-stroke pulling. Recently, we put forward a unifying mechanism named 'entropic pulling', which proposed that Hsp70 uses the energy of ATP hydrolysis to recruit a force of entropic origin to locally unfold aggregates or pull proteins across membranes. The entropic pulling mechanism reproduces the expected phenomenology that inspired the other, disparate mechanisms and is, moreover, simple.
Abstract:
Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, are considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low quality impressions will result in a significant increase in the workload of fingerprint units and, ultimately, in the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination.
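For readers unfamiliar with the likelihood ratio framework mentioned above, the sketch below computes a score-based LR as the ratio of the density of an observed comparison score under a same-source score model to its density under a different-source model; the Gaussian score distributions and their parameters are purely hypothetical, not the statistical models developed in the cited research.

```python
from statistics import NormalDist

def likelihood_ratio(score, same_source, diff_source):
    """Score-based likelihood ratio: density of the observed comparison score
    under the same-source distribution divided by its density under the
    different-source distribution (both modelled here as Gaussians)."""
    return NormalDist(*same_source).pdf(score) / NormalDist(*diff_source).pdf(score)

# Hypothetical score model: same-source scores ~ N(0.8, 0.1),
# different-source scores ~ N(0.3, 0.1); observed score 0.65
print(likelihood_ratio(0.65, (0.8, 0.1), (0.3, 0.1)))
```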
Abstract:
PURPOSE OF REVIEW: HIV targets primary CD4(+) T cells. The virus depends on the physiological state of its target cells for efficient replication, and, in turn, viral infection perturbs the cellular state significantly. Identifying the virus-host interactions that drive these dynamic changes is important for a better understanding of viral pathogenesis and persistence. The present review focuses on experimental and computational approaches to study the dynamics of viral replication and latency. RECENT FINDINGS: It was recently shown that only a fraction of the inducible latently infected reservoir is successfully induced upon stimulation in ex-vivo models, while additional rounds of stimulation allow reactivation of more latently infected cells. This highlights the potential role of treatment duration and timing as important factors for successful reactivation of latently infected cells. The dynamics of HIV productive infection and latency have been investigated using transcriptome and proteome data. The cellular activation state has been shown to be a major determinant of viral reactivation success. Mathematical models of latency have been used to explore the dynamics of latent viral reservoir decay. SUMMARY: Timing is an important component of biological interactions. Temporal analyses covering aspects of the viral life cycle are essential for gathering a comprehensive picture of HIV interaction with the host cell and untangling the complexity of latency. Understanding the dynamic changes tipping the balance between success and failure of HIV particle production might be key to eradicating the viral reservoir.
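As a minimal illustration of the kind of reservoir-decay model referred to in the review, the sketch below evaluates a biphasic exponential decay of the latent reservoir over time; the initial size, compartment fractions, and half-lives are illustrative placeholders, not estimates from the reviewed studies.

```python
import math

def reservoir_size(t_days, r0=1e6, fast_frac=0.9,
                   t_half_fast=14.0, t_half_slow=44 * 30.0):
    """Biphasic exponential decay of the latent reservoir (illustrative
    parameters only): a fast-decaying compartment plus a slow compartment."""
    k_fast = math.log(2) / t_half_fast
    k_slow = math.log(2) / t_half_slow
    return r0 * (fast_frac * math.exp(-k_fast * t_days) +
                 (1 - fast_frac) * math.exp(-k_slow * t_days))

for years in (1, 5, 10):
    print(years, "years:", round(reservoir_size(365.0 * years)))
```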
Abstract:
In the past three decades, feminists and critical theorists have discussed and argued for the importance of deconstructing and problematizing social science research methodology in order to question normalized hierarchies concerning the production of knowledge and the status of truth claims. Nevertheless, these ideas have often remained theoretical propositions not embodied in research practices. In fact, there is very little published discussion of the difficulties and limits of their practical application. In this paper we introduce some interconnected reflections starting from two different but related experiences of embodying 'feminist activist research'. Our aim is to emphasise the importance of attending to process, making mistakes and learning during fieldwork, as well as experimenting with personalized forms of analysis, such as the construction of narratives and the story-telling process.
Abstract:
Regulatory gene networks contain generic modules, like those involving feedback loops, which are essential for the regulation of many biological functions (Guido et al. in Nature 439:856-860, 2006). We consider a class of self-regulated genes which are the building blocks of many regulatory gene networks, and study the steady-state distribution of the associated Gillespie algorithm by providing efficient numerical algorithms. We also study a regulatory gene network of interest in gene therapy, using mean-field models with time delays. Convergence of the related time-nonhomogeneous Markov chain is established for a class of linear catalytic networks with feedback loops.
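A minimal sketch of the Gillespie algorithm for a single self-regulated gene is given below, assuming a hypothetical self-repressing production rate and first-order degradation; the rate constants and the Hill-type regulation are illustrative and not taken from the paper.

```python
import random

def gillespie_self_regulated(x0=0, k_max=10.0, K=20.0, gamma=0.1,
                             t_end=500.0, seed=0):
    """Stochastic simulation of a self-repressing gene: the production rate
    k_max / (1 + x/K) decreases with copy number x; degradation is gamma * x."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a_prod = k_max / (1.0 + x / K)   # propensity of production
        a_deg = gamma * x                # propensity of degradation
        a_tot = a_prod + a_deg
        t += rng.expovariate(a_tot)      # exponential waiting time to next event
        if rng.random() < a_prod / a_tot:
            x += 1                       # production event
        else:
            x -= 1                       # degradation event
        trajectory.append((t, x))
    return trajectory

# Example: estimate the steady-state mean copy number from the tail of one long run
traj = gillespie_self_regulated()
tail = [x for t, x in traj if t > 250.0]
print(sum(tail) / len(tail))
```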
Abstract:
The article gathers the latest reviews of professional development models from a current, comprehensive perspective on the topic. It presents five models of professional development: a) individually guided, b) based on observation/evaluation, c) based on training, d) based on involvement in development/improvement processes, and e) based on inquiry. It closes by summarizing the MEC's proposal on teacher training within our context.
Abstract:
This article studies alterations in the values, attitudes, and behaviors that emerged among U.S. citizens as a consequence of, and as a response to, the attacks of September 11, 2001. The study briefly examines the immediate reaction to the attack, before focusing on the collective reactions that characterized the behavior of the majority of the population between the events of 9/11 and the response to it in the form of intervention in Afghanistan. In studying this period, an eight-phase sequential model (Botcharova, 2001) is used, where the initial phases center on the nation as the ingroup and the latter focus on the enemy who carried out the attack as the outgroup. The study is conducted from a psychosocial perspective and uses "social identity theory" (Tajfel & Turner, 1979, 1986) as the basic framework for interpreting and accounting for the collective reactions recorded. The main purpose of this paper is to show that the interpretation of these collective reactions is consistent with the postulates of social identity theory. The application of this theory provides a different and specific analysis of events. The study is based on data obtained from a variety of rigorous academic studies and opinion polls conducted in relation to the events of 9/11. In line with social identity theory, 9/11 had a marked impact on the importance attached by the majority of U.S. citizens to their identity as members of a nation. This in turn accentuated group differentiation and activated ingroup favoritism and outgroup discrimination (Tajfel & Turner, 1979, 1986). Ingroup favoritism strengthened group cohesion, feelings of solidarity, and identification with the most emblematic values of the U.S. nation, while outgroup discrimination induced U.S. citizens to conceive of the enemy (al-Qaeda and its protectors) as the incarnation of evil, depersonalizing the group and venting their anger on it, and to give their backing to a military response, the eventual intervention in Afghanistan. Finally, and also in line with the postulates of social identity theory, as an alternative to the virtual bipolarization of the conflict (U.S. vs al-Qaeda), the activation of a higher level of identity in the ingroup is proposed, a group that includes the United States and the largest possible number of countries, including Islamic states, in the search for a common, more legitimate and effective solution.
Abstract:
It has been repeatedly debated which strategies people rely on in inference. These debates have been difficult to resolve, partially because hypotheses about the decision processes assumed by these strategies have typically been formulated qualitatively, making it hard to test precise quantitative predictions about response times and other behavioral data. One way to increase the precision of strategies is to implement them in cognitive architectures such as ACT-R. Often, however, a given strategy can be implemented in several ways, with each implementation yielding different behavioral predictions. We report a study with an experimental paradigm that can help to identify the correct implementations of classic compensatory and non-compensatory strategies such as the take-the-best and tallying heuristics, and the weighted-linear model.
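To fix ideas, the sketch below gives one plausible reading of the take-the-best and tallying heuristics for a two-alternative task with binary cues; the cue sets and the validity ordering are hypothetical, and the abstract's point is precisely that several such implementations (for example within ACT-R) can coexist and predict different behavior.

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Decide between A and B using the first cue, in validity order, that
    discriminates; guess if no cue does (non-compensatory strategy)."""
    for cue in validity_order:
        if cues_a[cue] != cues_b[cue]:
            return "A" if cues_a[cue] > cues_b[cue] else "B"
    return "guess"

def tallying(cues_a, cues_b):
    """Decide by counting positive cue values for each alternative; ties lead
    to guessing (compensatory strategy with unit weights)."""
    score_a, score_b = sum(cues_a.values()), sum(cues_b.values())
    if score_a == score_b:
        return "guess"
    return "A" if score_a > score_b else "B"

# Hypothetical comparison task with three binary cues
a = {"capital": 1, "airport": 1, "university": 0}
b = {"capital": 0, "airport": 1, "university": 1}
print(take_the_best(a, b, ["capital", "airport", "university"]))  # -> "A"
print(tallying(a, b))                                             # -> "guess"
```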
Abstract:
Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales.
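One way to read the "fixed-lid planar water surface" idea is sketched below: depths are taken from a planar water surface and the cross-sectional discharge is partitioned among cells in proportion to a conveyance-style depth weighting. The exponent and the weighting rule are assumptions for illustration only, not the actual RC model used in the study.

```python
def partition_discharge(bed_elevations, water_surface, total_q, exponent=5.0 / 3.0):
    """Distribute a cross-section's total discharge across cells in proportion to
    depth**exponent, with depth measured from a fixed planar water surface as in
    reduced-complexity schemes for low-gradient sand-bed rivers."""
    depths = [max(water_surface - z, 0.0) for z in bed_elevations]
    weights = [d ** exponent for d in depths]
    total_w = sum(weights)
    if total_w == 0.0:
        return [0.0] * len(depths)
    return [total_q * w / total_w for w in weights]

# Hypothetical cross-section: bed elevations (m) under a planar surface at 10.0 m,
# carrying a total discharge of 12,000 m^3/s
print(partition_discharge([4.0, 2.0, 3.0, 9.0], 10.0, total_q=12000.0))
```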