807 results for Information Model


Relevance: 30.00%

Abstract:

The 1-D spin-1/2 XXZ model with a staggered external magnetic field, when restricted to low field, can be mapped onto the quantum sine-Gordon model through bosonization: this ensures the presence of soliton, antisoliton and breather excitations in it. In particular, the action of the staggered field opens a gap, so that these physical objects are stable against energetic fluctuations. In the present work, this model is studied both analytically and numerically. On the one hand, analytical calculations are made to solve the model exactly through the Bethe ansatz: the solution for the XX + h staggered model is found first by means of the Jordan-Wigner transformation and then through the Bethe ansatz; after this stage, efforts are made to extend the latter approach to the XXZ + h staggered model (without finding its exact solution). On the other hand, the energies of the elementary soliton excitations are pinpointed through static DMRG (Density Matrix Renormalization Group) for different values of the parameters in the Hamiltonian. Breathers are found only in the antiferromagnetic region, while solitons and antisolitons are present in both the ferromagnetic and antiferromagnetic regions. Their single-site z-magnetization expectation values are also computed to see how they appear in real space, and time-dependent DMRG is employed to perform quenches of the Hamiltonian parameters and monitor their time evolution. The results obtained reveal the quantum nature of these objects and provide some information about their features. Further studies and a better understanding of their properties could lead to the realization of a two-level system through a soliton-antisoliton pair, in order to implement a qubit.
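
As an aside on the Jordan-Wigner step mentioned above: for an open XX chain with a staggered field, the transformation maps the spin Hamiltonian onto free fermions, so the ground-state energy is simply the sum of the negative single-particle energies. A minimal numerical sketch of this equivalence (our own illustration, not the thesis's Bethe-ansatz solution; all function names are ours):

```python
import numpy as np

def xx_staggered_hopping(N, h):
    """Single-particle hopping matrix of the open XX chain with a staggered
    field, as obtained from the Jordan-Wigner transformation."""
    T = np.zeros((N, N))
    for i in range(N - 1):
        T[i, i + 1] = T[i + 1, i] = 0.5   # (S+_i S-_{i+1} + h.c.)/2 hopping
    for i in range(N):
        T[i, i] = h * (-1) ** i           # staggered field (diagonal term)
    return T

def free_fermion_ground_energy(N, h):
    """Fill every negative single-particle mode (N even, so the constant
    shift from S^z = n - 1/2 vanishes)."""
    eps = np.linalg.eigvalsh(xx_staggered_hopping(N, h))
    return eps[eps < 0].sum()

def spin_ground_energy(N, h):
    """Brute-force ground energy of the spin Hamiltonian (small N only)."""
    sp = np.array([[0.0, 1.0], [0.0, 0.0]])   # S+
    sz = np.diag([0.5, -0.5])
    def op(o, site):
        mats = [np.eye(2)] * N
        mats[site] = o
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out
    H = sum(0.5 * (op(sp, i) @ op(sp.T, i + 1) + op(sp.T, i) @ op(sp, i + 1))
            for i in range(N - 1))
    H = H + sum(h * (-1) ** i * op(sz, i) for i in range(N))
    return np.linalg.eigvalsh(H)[0]
```

For N = 6 and h = 0.3 the two routes agree to machine precision; the free-fermion route scales as O(N³), whereas the brute-force diagonalization grows as 2^N.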

A laser scanning microscope collects information from a thin focal plane and ignores out-of-focus information. During the past few years it has become the standard imaging method for characterising cellular morphology and structures in static as well as living samples. Laser scanning microscopy combined with digital image restoration is an excellent tool for analysing the cellular cytoarchitecture, the expression of specific proteins and the interactions of various cell types, thus defining valid criteria for the optimisation of cell culture models. We have used this tool to establish and evaluate a three-dimensional model of the human epithelial airway wall.

This paper aims at the development and evaluation of a personalized insulin infusion advisory system (IIAS), able to provide real-time estimates of the appropriate insulin infusion rate for type 1 diabetes mellitus (T1DM) patients using continuous glucose monitors and insulin pumps. The system is based on a nonlinear model-predictive controller (NMPC) that uses a personalized glucose-insulin metabolism model, consisting of two compartmental models and a recurrent neural network. The model takes as input the patient's information regarding meal intake, glucose measurements, and insulin infusion rates, and provides glucose predictions. The predictions are fed to the NMPC, which uses them to estimate the optimal insulin infusion rates. An algorithm based on fuzzy logic has been developed for the on-line adaptation of the NMPC control parameters. The IIAS has been evaluated in silico using an appropriate simulation environment (the UVa T1DM simulator). The IIAS was able to handle various meal profiles, fasting conditions, interpatient variability, intraday variation in physiological parameters, and errors in meal amount estimations.
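
The receding-horizon idea behind an NMPC controller can be conveyed with a deliberately crude sketch: a toy linear glucose model (not the paper's compartmental/recurrent-neural-network model) and a grid search standing in for the optimizer. Every parameter value and function name here is invented for illustration:

```python
import numpy as np

def simulate_glucose(g0, rates, meals, k_ins=2.0, k_meal=0.5, dt=5.0):
    """Toy linear glucose model (illustrative only): insulin lowers glucose,
    meals raise it, and glucose slowly drifts toward a basal 100 mg/dL."""
    g, traj = g0, []
    for u, m in zip(rates, meals):
        g = g + dt * (-k_ins * u + k_meal * m - 0.01 * (g - 100.0))
        traj.append(g)
    return traj

def choose_rate(g0, meals, target=100.0, candidates=np.linspace(0, 2, 41)):
    """Grid-search stand-in for the NMPC optimizer: pick the constant
    infusion rate minimising squared deviation from target over the horizon."""
    def cost(u):
        traj = simulate_glucose(g0, [u] * len(meals), meals)
        return sum((g - target) ** 2 for g in traj)
    return min(candidates, key=cost)
```

As expected of any such controller, an announced meal over the horizon pushes the recommended infusion rate up relative to fasting conditions.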

Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from diffusion models based on partial differential equations to rule-based cellular-level simulators, aiming both at improving our quantitative understanding of the underlying biological processes and, in the mid and long term, at constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling that takes into account both the cellular and the macroscopic mechanical level. To this end, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free-growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections, achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of the smallest to the largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we found a tumor shape correction of 20% when coupling biomechanics to the cellular simulator, compared with a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. It might therefore be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning.
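
The shape measure used above, the ratio of the smallest to the largest moment of inertia, is straightforward to compute from a simulated tumor's mass distribution. A sketch under our own conventions (principal moments from the inertia tensor about the centre of mass; the function name is ours):

```python
import numpy as np

def inertia_shape_ratio(points, masses=None):
    """Ratio of smallest to largest principal moment of inertia of a
    point-mass distribution: 1.0 for a spherically symmetric shape,
    approaching 0 for a strongly elongated one."""
    pts = np.asarray(points, float)
    m = np.ones(len(pts)) if masses is None else np.asarray(masses, float)
    r = pts - np.average(pts, axis=0, weights=m)       # about centre of mass
    I = np.zeros((3, 3))
    for mi, ri in zip(m, r):
        I += mi * (ri @ ri * np.eye(3) - np.outer(ri, ri))
    moments = np.linalg.eigvalsh(I)                    # sorted ascending
    return moments[0] / moments[-1]
```

A cubic grid of equal masses gives a ratio of 1, while a straight rod gives a ratio of 0, which is the sense in which the ratio quantifies departure from isotropic growth.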

A growing world population, a changing climate and dwindling fossil fuels will place new pressures on human production of food, medicine, fuels and feedstock in the twenty-first century. Enhanced crop production promises to ameliorate these pressures. Crops can be bred for increased yields of calories, starch, nutrients, natural medicinal compounds, and other important products. Enhanced resistance to biotic and abiotic stresses can be introduced, toxins removed, and industrial qualities such as fibre strength and biofuel yield per unit mass increased. Induced and natural mutations provide a powerful method for the generation of heritable enhanced traits. While mainly exploited in forward, phenotype-driven approaches, the rapid accumulation of plant genomic sequence information and hypotheses regarding gene function allows the use of mutations in reverse genetic approaches to identify lesions in specific target genes. Such gene-driven approaches promise to speed up the process of creating novel phenotypes, and can enable the generation of phenotypes unobtainable by traditional forward methods. TILLING (Targeting Induced Local Lesions IN Genomes) is a high-throughput and low-cost reverse genetic method for the discovery of induced mutations. The method has been modified for the identification of natural nucleotide polymorphisms, a process called Ecotilling. The methods are general and have been applied to many species, including a variety of different crops. In this chapter we review the current status of the TILLING and Ecotilling methods and provide an overview of progress in applying these methods to different plant species, with a focus on work related to food production for developing nations.

Discusses the cooperative effort between librarians and science faculty at Bucknell University to develop an effective library-use education course for incoming undergraduate science and engineering students. Describes the course structure and activities, and includes a library instruction bibliography.

During the past 20 years or so, more has become known about the properties of khat, its pharmacology, and its physiological and psychological effects on humans. At the same time, however, its reputation for social and recreational use in traditional contexts has hindered the dissemination of knowledge about its detrimental effects in terms of mortality. This paper focuses on this particular deficit and adds to the knowledge base by reviewing the scant literature that does exist on mortality associated with the trade and use of khat. We sought all peer-reviewed papers relating to deaths associated with khat. From an initial list of 111, we identified 15 items meeting our selection criteria; examination of these revealed 61 further relevant items. These were supplemented with published reports, newspaper articles and other media reports. A conceptual framework was then developed for classifying mortality associated with each stage of the plant's journey from cultivation and transportation to consumption and its effects on the human body. The model is demonstrated with concrete examples drawn from the above sources. These highlight a number of issues for which more substantive statistical data are needed, including population-based studies of the physiological and psychological determinants of khat-related fatalities. Khat-consuming communities, and the health professionals charged with their care, should be more aware of the physiological and psychological effects of khat, together with the risks of morbidity and mortality associated with its use. There is also a need for information to be collected at international and national levels on other causes of death associated with khat cultivation, transportation and trade. Both these dimensions need to be understood.

Indoor radon is regularly measured in Switzerland, but a nationwide model to predict residential radon levels had not previously been developed. The aim of this study was to develop a prediction model for assessing indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, 80% of the measurements, selected at random, were used for model development and the remaining 20% for independent model validation. A multivariable log-linear regression model was fitted, with relevant predictors selected according to evidence from the literature, the adjusted R², Akaike's information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating the Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three exposure categories (below the 50th, 50th to 90th, and above the 90th percentile) and compared with the measured categories using a weighted Kappa statistic. The most relevant predictors of indoor radon levels were the tectonic unit and the year of construction of the building, followed by soil texture, degree of urbanisation, the floor of the building on which the measurement was taken, and housing type (P-values <0.001 for all). Mean predicted radon values (geometric means) were 66 Bq/m³ (interquartile range 40-111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69-215 Bq/m³) in the medium category, and 219 Bq/m³ (108-427 Bq/m³) in the highest category. The Spearman correlation between predictions and measurements was 0.45 (95% CI: 0.44-0.46) for the development dataset and 0.44 (95% CI: 0.42-0.46) for the validation dataset. Kappa coefficients were 0.31 for the development dataset and 0.30 for the validation dataset. The model explained 20% of the overall variability (adjusted R²).
In conclusion, this residential radon prediction model, based on a large number of measurements, proved robust under validation with an independent dataset. The model is appropriate for predicting the radon exposure of the Swiss population in epidemiological research. Nevertheless, some exposure misclassification and regression to the mean are unavoidable and should be taken into account in future applications of the model.
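
The weighted Kappa statistic used to compare predicted and measured exposure categories can be computed directly from the two category vectors. A sketch with linear (default) or quadratic weights; the function name is ours:

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, n_cat, weights="linear"):
    """Weighted Kappa agreement between two categorical ratings
    (e.g. predicted vs. measured radon exposure category)."""
    O = np.zeros((n_cat, n_cat))
    for i, j in zip(rater_a, rater_b):
        O[i, j] += 1                                  # observed proportions
    O /= O.sum()
    E = np.outer(O.sum(axis=1), O.sum(axis=0))        # chance agreement
    idx = np.arange(n_cat)
    W = np.abs(idx[:, None] - idx[None, :]) / (n_cat - 1)
    if weights == "quadratic":
        W = W ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields 1, chance agreement about 0, and systematic disagreement a negative value.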

A new physics-based technique for correcting inhomogeneities in sub-daily temperature records is proposed. The approach accounts for changes in the sensor-shield characteristics that affect the energy balance, depending on the ambient weather conditions (radiation, wind). An empirical model is formulated that reflects the main atmospheric processes and can be used in the correction step of a homogenization procedure. The model accounts for the short- and long-wave radiation fluxes (including a snow-cover component for the albedo calculation) of a measurement system such as a radiation shield; part of the flux is further modulated by ventilation. The model requires only cloud cover and wind speed for each day, but detailed site-specific information is necessary. The final model has three free parameters, one of which is a constant offset. The three parameters can be determined, e.g., from the mean offsets at three observation times. The model is developed using the example of the change from the Wild screen to the Stevenson screen in the temperature record of Basel, Switzerland, in 1966. It is evaluated against parallel measurements of both systems during a sub-period at this location, which were discovered during the writing of this paper. The model can be used in the correction step of homogenization to distribute a known mean step-size to every single measurement, thus providing a reasonable alternative correction procedure for high-resolution historical climate series. It also constitutes an error model, which may be applied, e.g., in data assimilation approaches.
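
Determining three free parameters from mean offsets at three observation times amounts to solving a 3×3 linear system. A sketch under the assumption of a linearised form offset = c + a·R + b·V, with c the constant offset and R, V standing for mean radiation and ventilation terms (the actual model terms differ; names and numbers here are purely illustrative):

```python
import numpy as np

def fit_three_params(radiation, ventilation, mean_offsets):
    """Solve offset_t = c + a*R_t + b*V_t for (c, a, b), given the mean
    radiation/ventilation terms and mean offsets at three observation times.
    Hypothetical linearised form of the correction model, for illustration."""
    R = np.asarray(radiation, float)
    V = np.asarray(ventilation, float)
    A = np.column_stack([np.ones(3), R, V])   # one row per observation time
    c, a, b = np.linalg.solve(A, np.asarray(mean_offsets, float))
    return c, a, b
```

With more than three observation times, or noisy offsets, the same structure would be fitted by least squares instead of an exact solve.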

The construction of a reliable, practically useful prediction rule for a future response depends heavily on the "adequacy" of the fitted regression model. In this article, we consider the absolute prediction error, the expected value of the absolute difference between the future and predicted responses, as the model evaluation criterion. This prediction error is easier to interpret than the average squared error and is equivalent to the misclassification error for binary outcomes. We show that the distributions of the apparent error and its cross-validation counterparts are approximately normal even under a misspecified fitted model. When the prediction rule is "unsmooth", the variance of this normal distribution can be estimated well via a perturbation-resampling method. We also show how to approximate the distribution of the difference of the estimated prediction errors from two competing models. With two real examples, we demonstrate that the resulting interval estimates for prediction errors provide much more information about model adequacy than point estimates alone.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Numerous time series studies have provided strong evidence of an association between increased levels of ambient air pollution and increased levels of hospital admissions, typically at 0, 1, or 2 days after an air pollution episode. An important research aim is to extend existing statistical models so that a more detailed understanding of the time course of hospitalization after exposure to air pollution can be obtained. Information about this time course, combined with prior knowledge about biological mechanisms, could provide the basis for hypotheses concerning the mechanism by which air pollution causes disease. Previous studies have identified two important methodological questions: (1) How can we estimate the shape of the distributed lag between increased air pollution exposure and increased mortality or morbidity? and (2) How should we estimate the cumulative population health risk from short-term exposure to air pollution? Distributed lag models are appropriate tools for estimating air pollution health effects that may be spread over several days. However, estimation for distributed lag models in air pollution and health applications is hampered by the substantial noise in the data and the inherently weak signal that is the target of investigation. We introduce an hierarchical Bayesian distributed lag model that incorporates prior information about the time course of pollution effects and combines information across multiple locations. The model has a connection to penalized spline smoothing using a special type of penalty matrix. We apply the model to estimating the distributed lag between exposure to particulate matter air pollution and hospitalization for cardiovascular and respiratory disease using data from a large United States air pollution and hospitalization database of Medicare enrollees in 94 counties covering the years 1999-2002.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Amplifications and deletions of chromosomal DNA, as well as copy-neutral loss of heterozygosity have been associated with diseases processes. High-throughput single nucleotide polymorphism (SNP) arrays are useful for making genome-wide estimates of copy number and genotype calls. Because neighboring SNPs in high throughput SNP arrays are likely to have dependent copy number and genotype due to the underlying haplotype structure and linkage disequilibrium, hidden Markov models (HMM) may be useful for improving genotype calls and copy number estimates that do not incorporate information from nearby SNPs. We improve previous approaches that utilize a HMM framework for inference in high throughput SNP arrays by integrating copy number, genotype calls, and the corresponding confidence scores when available. Using simulated data, we demonstrate how confidence scores control smoothing in a probabilistic framework. Software for fitting HMMs to SNP array data is available in the R package ICE.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The purpose of this study is to develop statistical methodology to facilitate indirect estimation of the concentration of antiretroviral drugs and viral loads in the prostate gland and the seminal vesicle. The differences in antiretroviral drug concentrations in these organs may lead to suboptimal concentrations in one gland compared to the other. Suboptimal levels of the antiretroviral drugs will not be able to fully suppress the virus in that gland, lead to a source of sexually transmissible virus and increase the chance of selecting for drug resistant virus. This information may be useful selecting antiretroviral drug regimen that will achieve optimal concentrations in most of male genital tract glands. Using fractionally collected semen ejaculates, Lundquist (1949) measured levels of surrogate markers in each fraction that are uniquely produced by specific male accessory glands. To determine the original glandular concentrations of the surrogate markers, Lundquist solved a simultaneous series of linear equations. This method has several limitations. In particular, it does not yield a unique solution, it does not address measurement error, and it disregards inter-subject variability in the parameters. To cope with these limitations, we developed a mechanistic latent variable model based on the physiology of the male genital tract and surrogate markers. We employ a Bayesian approach and perform a sensitivity analysis with regard to the distributional assumptions on the random effects and priors. The model and Bayesian approach is validated on experimental data where the concentration of a drug should be (biologically) differentially distributed between the two glands. In this example, the Bayesian model-based conclusions are found to be robust to model specification and this hierarchical approach leads to more scientifically valid conclusions than the original methodology. 
In particular, unlike existing methods, the proposed model based approach was not affected by a common form of outliers.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

BACKGROUND: Many HIV-infected patients on highly active antiretroviral therapy (HAART) experience metabolic complications including dyslipidaemia and insulin resistance, which may increase their coronary heart disease (CHD) risk. We developed a prognostic model for CHD tailored to the changes in risk factors observed in patients starting HAART. METHODS: Data from five cohort studies (British Regional Heart Study, Caerphilly and Speedwell Studies, Framingham Offspring Study, Whitehall II) on 13,100 men aged 40-70 and 114,443 years of follow up were used. CHD was defined as myocardial infarction or death from CHD. Model fit was assessed using the Akaike Information Criterion; generalizability across cohorts was examined using internal-external cross-validation. RESULTS: A parametric model based on the Gompertz distribution generalized best. Variables included in the model were systolic blood pressure, total cholesterol, high-density lipoprotein cholesterol, triglyceride, glucose, diabetes mellitus, body mass index and smoking status. Compared with patients not on HAART, the estimated CHD hazard ratio (HR) for patients on HAART was 1.46 (95% CI 1.15-1.86) for moderate and 2.48 (95% CI 1.76-3.51) for severe metabolic complications. CONCLUSIONS: The change in the risk of CHD in HIV-infected men starting HAART can be estimated based on typical changes in risk factors, assuming that HRs estimated using data from non-infected men are applicable to HIV-infected men. Based on this model the risk of CHD is likely to increase, but increases may often be modest, and could be offset by lifestyle changes.