1000 results for Integrable Supersymmetric Fermion Models
Abstract:
Here I develop a model of a radiative-convective atmosphere in which both the radiative and convective schemes are highly simplified. Atmospheric absorption of radiation at selective wavelengths is treated with constant mass absorption coefficients in finite-width spectral bands. The convective regime is introduced through a prescribed lapse rate in the troposphere. The main novelty of the radiative-convective model developed here is that it is solved without any angular approximation for the radiation field. The solution obtained in the purely radiative mode (i.e., with convection ignored) leads to multiple stable equilibrium states, very similar to results recently found in simple models of planetary atmospheres. However, the introduction of convective processes removes these multiple equilibria. This shows the importance of taking convective processes into account even for qualitative analyses of planetary atmospheres.
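The prescribed-lapse-rate convective step can be pictured with a minimal Python sketch. Everything below is an illustrative assumption, not the paper's scheme; in particular, a realistic convective adjustment also conserves the column's energy, which this one-pass version does not.

```python
import numpy as np

def convective_adjustment(T, z, lapse_rate=6.5e-3):
    """Cap the temperature decrease between adjacent levels at a
    prescribed tropospheric lapse rate (K/m): wherever the radiative
    profile cools faster with height, relax it to the prescribed slope
    (a one-pass, non-energy-conserving toy)."""
    T = T.copy()
    for i in range(1, len(T)):
        max_drop = lapse_rate * (z[i] - z[i - 1])
        if T[i - 1] - T[i] > max_drop:   # layer steeper than prescribed
            T[i] = T[i - 1] - max_drop
    return T

# toy radiative-equilibrium profile that cools 9 K/km with height
z = np.linspace(0.0, 15e3, 16)           # altitude (m)
T_rad = 300.0 - 9.0e-3 * z               # steeper than the 6.5 K/km cap
T_rc = convective_adjustment(T_rad, z)   # tropospheric slope now capped
```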
Abstract:
The second differential of the entropy is used for analysing the stability of a thermodynamic climate model. A delay time for the heat flux is introduced, whereby the flux becomes an independent variable. Two different expressions for the second differential of the entropy are used: one follows classical irreversible thermodynamics; the second, which incorporates the response time, is due to extended irreversible thermodynamics. The second differential of the classical entropy leads to unstable solutions at high values of the delay time. The extended expression always implies stable states for an ice-free Earth. When the ice-albedo feedback is included, a discontinuous distribution of stable states is found at high response times. Following the thermodynamic analysis of the model, the maximum rates of entropy production at the steady state are obtained. A latitudinally isothermal Earth produces the extremum in global entropy production. The material contribution to entropy production (by which we mean the production of entropy by material transport of heat) is at a maximum when the latitudinal distribution of temperatures becomes less homogeneous than present values.
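For orientation, a hedged LaTeX sketch of the two criteria being contrasted, in standard notation; the extended entropy shown is the usual extended-irreversible-thermodynamics form associated with a Maxwell-Cattaneo heat flux, and the paper's exact coefficients may differ.

```latex
% Classical (Glansdorff--Prigogine) criterion: a steady state is stable if
%   \delta^{2}S < 0 \quad\text{and}\quad \frac{d}{dt}\,\delta^{2}S \ge 0.
% Extended irreversible thermodynamics makes the entropy depend on the
% heat flux \vec{q} through the relaxation (delay) time \tau:
\[
  s_{\mathrm{EIT}} \;=\; s_{\mathrm{eq}}(u)
    \;-\; \frac{\tau}{2\lambda T^{2}}\,\vec{q}\cdot\vec{q},
\]
% so \delta^{2}s acquires an extra flux-dependent term, which is why the
% two expressions can disagree on stability at large \tau.
```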
Abstract:
Aim: The imperfect detection of species may lead to erroneous conclusions about species-environment relationships. Accuracy in species detection usually requires temporal replication at sampling sites, a time-consuming and costly monitoring scheme. Here, we applied a lower-cost alternative based on a double-sampling approach to incorporate the reliability of species detection into regression-based species distribution modelling.
Location: Doñana National Park (south-western Spain).
Methods: Using species-specific monthly detection probabilities, we estimated the detection reliability as the probability of having detected the species given the species-specific survey time. These reliability estimates were used to account explicitly for data uncertainty by weighting each absence. We illustrate how this framework can be used to evaluate four competing hypotheses as to what constitutes the primary environmental control of amphibian distribution: breeding habitat, aestivating habitat, spatial distribution of surrounding habitats, and/or major ecosystem zonation. The study was conducted on six pond-breeding amphibian species over a 4-year period.
Results: Non-detections should not be considered equivalent to real absences, as their reliability varied considerably. The occurrence of Hyla meridionalis and Triturus pygmaeus was related to a particular major ecosystem of the study area, where suitable habitat for these species seemed to be widely available. Characteristics of the breeding habitat (area and hydroperiod) were highly important for the occurrence of Pelobates cultripes and Pleurodeles waltl. Terrestrial characteristics were the most important predictors of the occurrence of Discoglossus galganoi and Lissotriton boscai, along with the spatial distribution of breeding habitats for the latter species.
Main conclusions: We did not find a single best-supported hypothesis valid for all species, which stresses the importance of multiscale and multifactor approaches. More importantly, this study shows that estimating the reliability of non-detection records, previously seen as a naïve goal in species distribution modelling, is feasible and could be promoted in future studies, at least in comparable systems.
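The weighting idea can be sketched as follows; this is a minimal illustration, and the function, the constant-per-month assumption, and the numbers are ours, not the paper's estimator.

```python
import numpy as np

def nondetection_reliability(p_monthly, months_surveyed):
    """Probability that a species present at a site would have been
    detected at least once, given a per-month detection probability
    and the survey effort; used to down-weight unreliable 'absences'."""
    return 1.0 - (1.0 - p_monthly) ** months_surveyed

p = 0.3                        # hypothetical monthly detection probability
effort = np.array([1, 3, 6])   # months surveyed at three sites
weights = nondetection_reliability(p, effort)
# -> [0.30, 0.657, 0.882]: short surveys give weak evidence of absence,
# so those zeros enter the distribution model with small weights
```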
Abstract:
The integrity of central and peripheral nervous system myelin is affected in numerous lipid metabolism disorders. This vulnerability has so far been attributed mostly to the extraordinarily high level of lipid synthesis required for the formation of myelin, and to the relative autonomy in lipid synthesis of myelinating glial cells, owing to blood barriers shielding the nervous system from circulating lipids. Recent insights from the analysis of inherited lipid disorders, especially those with prevailing lipid depletion, and from mouse models with glia-specific disruption of lipid metabolism, shed new light on this issue. The particular lipid composition of myelin, the transport of lipid-associated myelin proteins, and the necessity for timely assembly of the myelin sheath all contribute to the observed vulnerability of myelin to perturbed lipid metabolism. Furthermore, the uptake of external lipids may also play a role in the formation of myelin membranes. In addition to an improved understanding of basic myelin biology, these data provide a foundation for future therapeutic interventions aimed at preserving glial cell integrity in metabolic disorders.
Abstract:
Uncertainty quantification of petroleum reservoir models is a present challenge, usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. This paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and to describe the stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and to learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
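A minimal sketch of the workflow shape (stochastic SVR ensemble plus misfit-based screening), using scikit-learn's standard SVR; the data, the perturbation scheme, and the misfit proxy are all illustrative assumptions, not the paper's semi-supervised formulation.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# hypothetical conditioning data: 20 well locations -> porosity values
X_wells = rng.uniform(0.0, 1.0, size=(20, 2))
y_wells = (0.20 + 0.05 * np.sin(6.0 * X_wells[:, 0])
           + 0.01 * rng.normal(size=20))

# prediction grid over the (normalised) field
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])

# stochastic ensemble: perturbed refits express the non-uniqueness
realizations = []
for _ in range(25):
    y_noisy = y_wells + 0.005 * rng.normal(size=y_wells.size)
    svr = SVR(kernel="rbf", C=10.0, epsilon=0.005).fit(X_wells, y_noisy)
    realizations.append(svr.predict(grid))

# Bayesian-style screening: keep realizations whose summary response
# matches an 'observed' dynamic datum within a tolerance (a stand-in
# for a flow-simulation misfit); survivors define the uncertainty envelope
means = np.array([r.mean() for r in realizations])
obs = np.median(means)                   # stand-in observed response
accepted = [r for r, m in zip(realizations, means) if abs(m - obs) < 0.01]
envelope = (np.min(accepted, axis=0), np.max(accepted, axis=0))
```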
Abstract:
Functionally relevant large-scale brain dynamics operates within the framework imposed by anatomical connectivity and by time delays due to finite transmission speeds. To gain insight into the reliability and comparability of large-scale brain network simulations, we investigate the effects of variations in the anatomical connectivity. Two different sets of detailed global connectivity structures are explored: the first extracted from the CoCoMac database and rescaled to the spatial extent of the human brain, the second derived from white-matter tractography applied to diffusion spectrum imaging (DSI) of a human subject. We use a combination of graph-theoretical measures of the connection matrices and numerical simulations to explicate the importance of both connectivity strength and delays in shaping dynamic behaviour. Our results demonstrate that the brain dynamics derived from the CoCoMac database are more complex and biologically more realistic than those based on the DSI database. We propose that the reason for this difference is the absence of directed weights in the DSI connectivity matrix.
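A minimal sketch of the two ingredients the abstract weighs: a weighted, directed connection matrix summarised by graph measures, and delays set by distance over a finite conduction speed. Sizes, sparsity, and speed below are illustrative assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)

# hypothetical directed, weighted connectivity and fibre distances (mm)
n = 38                                   # illustrative number of regions
W = rng.random((n, n)) * (rng.random((n, n)) < 0.2)
np.fill_diagonal(W, 0.0)
D = rng.uniform(5.0, 150.0, size=(n, n))

# time delays follow from distance and a finite conduction speed
v = 5.0                                  # m/s, i.e. mm/ms
delays_ms = D / v                        # would enter the simulation

# graph-theoretical summaries of the connection matrix
G = nx.from_numpy_array(W, create_using=nx.DiGraph)
in_strength = W.sum(axis=0)              # directed weights matter here
clustering = nx.average_clustering(G.to_undirected())
density = nx.density(G)
```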
Abstract:
In recent years, both homing endonucleases (HEases) and zinc-finger nucleases (ZFNs) have been engineered and selected for the targeting of desired human loci for gene therapy. However, enzyme engineering is lengthy and expensive, and the off-target effects of the manufactured endonucleases are difficult to predict. Moreover, enzymes selected to cleave a human DNA locus may not cleave the homologous locus in the genome of animal models because of sequence divergence, thus hampering attempts to assess the in vivo efficacy and safety of any engineered enzyme prior to its application in human trials. Here, we show that naturally occurring HEases can be found that cleave desirable human targets. Some of these enzymes are also shown to cleave the homologous sequence in the genomes of animal models. In addition, the distribution of off-target effects may be more predictable for native HEases. Based on our experimental observations, we present the HomeBase algorithm, database and web server, which allow a high-throughput computational search and assignment of HEases for the targeting of specific loci in the human and other genomes. We experimentally validate the predicted target specificity of candidate fungal, bacterial and archaeal HEases using cell-free, yeast and archaeal assays.
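The core of such a computational search can be pictured as a mismatch-tolerant scan for a recognition site; HomeBase itself is far more elaborate, and the function below is a hypothetical toy, with the site sequence used purely as illustration.

```python
def find_target_sites(genome, site, max_mismatches=2):
    """Scan a sequence for near-matches to a homing-endonuclease
    recognition site; HEases tolerate some target degeneracy, so a
    small Hamming distance is allowed rather than exact matching."""
    hits, k = [], len(site)
    for i in range(len(genome) - k + 1):
        window = genome[i:i + k]
        mismatches = sum(a != b for a, b in zip(window, site))
        if mismatches <= max_mismatches:
            hits.append((i, window, mismatches))
    return hits

# I-SceI-like 18-bp recognition site (illustrative only)
site = "TAGGGATAACAGGGTAAT"
hits = find_target_sites("CCTAGGGATAACAGGGTAATGGACGT" * 2, site)
```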
Abstract:
To study the different temporal components of cancer mortality (age, period and cohort), methods of graphical representation were applied to Swiss mortality data from 1950 to 1984. Maps using continuous slopes ("contour maps"), based on eight tones of grey according to the absolute distribution of rates, were used to represent the surfaces defined by the matrix of age-specific rates. Further, progressively more complex regression surface equations were defined on the basis of two independent variables (age/cohort) and a dependent one (each age-specific mortality rate). General patterns of trends in cancer mortality were thus identified, permitting the definition of important cohort effects (e.g., upwards for lung and other tobacco-related neoplasms, downwards for stomach) or period effects (e.g., downwards for intestinal or thyroid cancers), besides the major underlying age component. For most cancer sites, even the lower-order (1st- to 3rd-degree) models provided excellent fits, allowing immediate identification of the residuals (e.g., high or low mortality points) as well as estimates of first-order interactions between the three factors, although the parameters of the main effects remained undetermined. The method should therefore be used essentially as a summary guide to illustrate and understand the general patterns of age, period and cohort effects in (cancer) mortality, although it cannot conceptually solve the inherent problem of identifiability of the three components.
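The regression-surface idea, with cohort defined as period minus age, can be sketched in a few lines; the grid, the toy rates, and the 2nd-degree design matrix below are illustrative assumptions, not the paper's fitted models.

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical log mortality rates on an age x period grid
ages = np.arange(35, 80, 5)                  # age-class midpoints
periods = np.arange(1950, 1985, 5)
A, P = np.meshgrid(ages, periods, indexing="ij")
C = P - A                                    # cohort = period - age
log_rate = 0.08 * A + 0.002 * (C - 1900) + rng.normal(0, 0.1, A.shape)

# low-order (2nd-degree) regression surface in age and cohort
a, c, y = A.ravel(), C.ravel(), log_rate.ravel()
X = np.column_stack([np.ones_like(a), a, c, a**2, c**2, a * c])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ coef                     # flags high/low mortality cells
```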
Abstract:
A study of publication models and the means of accessing scientific literature in the current environment of digital communication and the web. The text introduces the concept of the journal article as a well-defined and stable unit within the publishing world, and as the nucleus on which professional and scholarly communication has been based since its beginnings in the 17th century. The transformation of scientific communication that the digital world has enabled is analysed. Descriptions are provided of some of the practices undertaken by authors, research organisations, publishers and library-related institutions in response to the new possibilities opening up for articles, both as products and in their creation and distribution processes. These transformations affect the very nature of the article as a minimal unit, both unique and stable, of scientific communication. The article concludes by noting that, under varying documentary forms of publisher aggregation and bibliographic control (sometimes simultaneous and even apparently contradictory), a more pluralistic type of scientific communication is flourishing. This pluralism offers more possibilities for communication among authors; fewer levels of intermediaries, such as agents that intervene and add value to products; greater availability for users, both economically and from the point of view of access; and greater interaction and richness of content, thanks to new hypertext and multimedia possibilities.
Abstract:
BACKGROUND: The criteria for choosing relevant cell lines from the vast panel of available intestinal-derived lines, which exhibit a wide range of functional properties, are still ill-defined. The objective of this study was, therefore, to establish objective criteria for choosing relevant cell lines, to assess their appropriateness as tumor models as well as for drug absorption studies. RESULTS: We made use of publicly available expression signatures and cell-based functional assays to delineate differences between various intestinal colon carcinoma cell lines and normal intestinal epithelium. We compared a panel of intestinal cell lines with patient-derived normal and tumor epithelium and classified them according to traits relating to oncogenic pathway activity, epithelial-mesenchymal transition (EMT) and stemness, migratory properties, proliferative activity, transporter expression profiles and chemosensitivity. For example, SW480 represents an EMT-high, migratory phenotype and scored highest on signatures associated with worse overall survival and higher risk of recurrence in patient-derived databases. On the other hand, differentiated HT29 and T84 cells showed gene expression patterns closest to tumor-bulk-derived cells. Regarding drug absorption, we confirmed that differentiated Caco-2 cells are the model of choice for active uptake studies in the small intestine. Regarding chemosensitivity, we were unable to confirm a recently proposed association of chemoresistance with EMT traits. However, a novel signature was identified by mining NCI60 GI50 values that allowed us to rank the panel of intestinal cell lines according to their responsiveness to commonly used chemotherapeutics. CONCLUSIONS: This study presents a straightforward strategy for exploiting publicly available gene expression data to guide the choice of cell-based models. While this approach does not overcome the major limitations of such models, introducing a rank order of selected features may allow the selection of model cell lines that are more adapted and pertinent to the biological question addressed.
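The signature-scoring-and-ranking step can be sketched as follows; the expression matrix, the gene set, and the mean-of-z-scores scoring rule are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# hypothetical log2 expression matrix: genes x cell lines
lines = ["SW480", "HT29", "T84", "Caco-2"]
genes = [f"g{i}" for i in range(50)]
expr = pd.DataFrame(rng.normal(7.0, 1.0, (50, 4)),
                    index=genes, columns=lines)

# score each line on a gene signature: mean of row-wise z-scores
signature = genes[:10]                        # stand-in EMT-like gene set
z = expr.sub(expr.mean(axis=1), axis=0).div(expr.std(axis=1), axis=0)
scores = z.loc[signature].mean()              # one score per cell line
ranking = scores.sort_values(ascending=False) # rank order of the panel
```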
Abstract:
First, we examine the context of creation of special collection units in libraries, and the reasons why libraries compile archive materials and collections. Second, we focus on the techniques used in library environments to describe archive materials and collections and to guarantee their accessibility. We examine the models used in the United States and the United Kingdom to describe and access these materials, and the cooperative projects launched in these two countries in the past few years. Finally, we offer a preliminary analysis of how these types of materials are currently dealt with in Catalan libraries, and issue some recommendations to improve their archiving and access.
Abstract:
Summary: Long-term field experiments on fertilisation: a comparison of three mathematical models
Abstract:
The method of stochastic dynamic programming is widely used in behavioural ecology, but it has shortcomings arising from its reliance on finite time horizons. The authors present an alternative approach based on the methods of renewal theory. The suggested method uses the cumulative energy reserve gained per unit time as its criterion, which leads to stationary cycles in the state space. This approach allows optimal foraging to be studied by analytic methods.
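A hedged LaTeX sketch of the standard renewal-reward (long-run average) criterion that such a renewal-theory approach appears to rest on; the notation is ours, not the authors'.

```latex
% Over repeated behavioural cycles, a policy \pi yields energy gain G
% and cycle duration T. The renewal-reward theorem gives the long-run
% rate of energy gain, which the forager maximises:
\[
  \gamma^{*} \;=\; \max_{\pi}\,
  \lim_{t \to \infty} \frac{\text{energy gained by time } t}{t}
  \;=\; \max_{\pi}\, \frac{\mathbb{E}_{\pi}[G]}{\mathbb{E}_{\pi}[T]} .
\]
% Unlike finite-horizon dynamic programming, this criterion has no
% terminal-time effects, so optimal behaviour settles into stationary
% cycles in the state space.
```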