910 results for Process models


Relevance: 30.00%

Abstract:

OBJECTIVE Crohn's disease is a chronic inflammatory process that has recently been associated with a higher risk of early implant failure. Herein we provide information on the impact of colitis on peri-implant bone formation using preclinical models of chemically induced colitis. METHODS Colitis was induced by intrarectal instillation of 2,4,6-trinitrobenzene sulfonic acid (TNBS) or by feeding rats dextran sodium sulfate (DSS) in the drinking water. One week after disease induction, titanium miniscrews were inserted into the tibia. Four weeks after implantation, peri-implant bone volume per tissue volume (BV/TV) and bone-to-implant contact (BIC) were determined by histomorphometric analysis. RESULTS Cortical histomorphometric parameters were similar in the control (n = 10), DSS (n = 10) and TNBS (n = 8) groups: cortical BV/TV was 92.2 ± 3.7%, 92.0 ± 3.0% and 92.6 ± 2.7%, and cortical BIC was 81.3 ± 8.8%, 83.2 ± 8.4% and 84.0 ± 7.0%, respectively. No significant differences were observed for medullary BV/TV (19.5 ± 6.4%, 16.2 ± 5.6% and 15.4 ± 9.0%) or medullary BIC (48.8 ± 12.9%, 49.2 ± 6.2% and 41.9 ± 11.7%), respectively. Successful induction of colitis was confirmed by loss of body weight and by colon morphology. CONCLUSIONS The results suggest that bone regeneration around implants is not impaired in chemically induced colitis models. Considering that Crohn's disease can affect any part of the gastrointestinal tract, including the mouth, our model only partially reflects the clinical situation.
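
The three-group comparison reported above is a standard one-way design. As a minimal sketch of how such a histomorphometric comparison could be run, assuming per-animal BV/TV measurements are available (the values below are invented placeholders, not the study's data):

```python
# One-way ANOVA across control, DSS, and TNBS groups (illustrative sketch).
# The values below are invented placeholders, NOT the study's raw data.
from scipy import stats

control = [91.0, 93.5, 92.8, 90.9, 94.1]  # cortical BV/TV (%), hypothetical
dss     = [92.3, 91.1, 93.0, 90.5, 92.9]
tnbs    = [93.1, 92.0, 91.7, 94.0, 92.4]

f_stat, p_value = stats.f_oneway(control, dss, tnbs)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 -> no group difference
```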

Relevance: 30.00%

Abstract:

Context: According to the sequential accretion model (or core-nucleated accretion model), giant planet formation begins with the formation of a solid core which, once massive enough, can gravitationally bind gas from the nebula to form the envelope. The most critical part of the model is the formation time of the core: to trigger the accretion of gas, the core has to grow to several Earth masses before the gas component of the protoplanetary disc dissipates. Aims: We calculate planetary formation models including a detailed description of the dynamics of the planetesimal disc, taking into account both gas drag and the excitation driven by forming planets. Methods: We computed the formation of planets, considering the oligarchic regime for the growth of the solid core. Embryos growing in the disc stir neighbouring planetesimals, exciting their relative velocities and thereby making accretion more difficult. Here we introduce a more realistic treatment of the evolution of the planetesimals' relative velocities, which directly impacts the formation timescale. For this, we computed the excitation state of planetesimals resulting from stirring by forming planets and from gas-solid interactions. Results: We find that the formation of giant planets is favoured by the accretion of small planetesimals, as their random velocities are more easily damped by the gas drag of the nebula. Moreover, the capture radius of a protoplanet with a (tiny) envelope is also larger for small planetesimals. However, planets migrate as a result of disc-planet angular momentum exchange, with important consequences for their survival: owing to the slow growth of a protoplanet in the oligarchic regime, rapid inward type I migration has important implications for intermediate-mass planets that have not yet started their runaway gas accretion phase. Most of these planets are lost into the central star. Surviving planets have masses either below 10 M⊕ or above several Jupiter masses. Conclusions: To form giant planets before the dissipation of the disc, small planetesimals (~0.1 km) have to be the major contributors to the solid accretion process. However, the combination of oligarchic growth and fast inward migration leads to an absence of intermediate-mass planets. Other processes must therefore be at work to explain the population of extrasolar planets presently known.
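
The size argument in the conclusions can be illustrated with a simple scaling: for quadratic (high-Reynolds-number) gas drag, the damping timescale of planetesimal random velocities scales linearly with planetesimal radius, so kilometre-sized and smaller bodies are damped far more efficiently. A back-of-the-envelope sketch, with all nebula parameters chosen as illustrative placeholders rather than the paper's values:

```python
# Quadratic-drag damping timescale: t_damp ~ m * v / F_drag with
# F_drag = 0.5 * C_D * pi * r^2 * rho_gas * v^2 and m = (4/3) * pi * rho_s * r^3,
# giving t_damp = (8/3) * rho_s * r / (C_D * rho_gas * v).
# All parameter values are illustrative placeholders, not from the paper.
rho_s   = 2000.0   # planetesimal bulk density [kg/m^3]
rho_gas = 1e-8     # nebular gas density [kg/m^3]
v_rel   = 50.0     # planetesimal-gas relative velocity [m/s]
c_drag  = 1.0      # drag coefficient, order unity

for r_km in (0.1, 1.0, 10.0, 100.0):
    r = r_km * 1e3
    t_damp = (8.0 / 3.0) * rho_s * r / (c_drag * rho_gas * v_rel)
    print(f"r = {r_km:6.1f} km  ->  t_damp ~ {t_damp / 3.15e7:.2e} yr")
# t_damp grows linearly with r: 0.1 km planetesimals are damped ~1000x
# faster than 100 km ones, which is why small bodies favour core growth.
```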

Relevance: 30.00%

Abstract:

Tropical wetlands are estimated to represent about 50% of natural wetland methane (CH4) emissions and to explain a large fraction of the observed CH4 variability on timescales ranging from glacial-interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models that aim to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and the associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS net primary productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories, and simulated CH4 emissions at the Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin, but it does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emissions and their associated uncertainties. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimate, using PCR-GLOBWB in combination with GLC2000, leads to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr−1. Additionally, the LPX emissions are highly sensitive to vegetation distribution: two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, owing to an intrinsic limitation of LPX in representing seasonality in floodplain extent, the model failed to reproduce the full dynamics of CH4 emissions, but we propose solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.
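
The factor-of-two sensitivity to floodplain extent follows directly from the bottom-up structure of such estimates, in which basin emissions are the sum of flooded area times flux density over grid cells. A toy illustration (all numbers invented, not LPX or PCR-GLOBWB output):

```python
# Toy bottom-up aggregation: basin emission = sum(area_i * flux_density_i).
# Shows why uncertainty in floodplain extent maps almost linearly onto the
# emission total. All numbers are invented placeholders.
flux_density = [120.0, 85.0, 200.0]       # mg CH4 m^-2 d^-1 per cell (hypothetical)
area_map_a   = [1.0e10, 2.0e10, 0.5e10]   # flooded area per cell [m^2], map A
area_map_b   = [2.2e10, 3.8e10, 1.1e10]   # flooded area per cell [m^2], map B

def basin_total_tg_per_yr(areas, fluxes):
    return sum(a * f for a, f in zip(areas, fluxes)) * 365 / 1e18  # mg/d -> Tg/yr

print(basin_total_tg_per_yr(area_map_a, flux_density))  # smaller extent, lower total
print(basin_total_tg_per_yr(area_map_b, flux_density))  # ~2x extent -> ~2x emissions
```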

Relevance: 30.00%

Abstract:

The central event in protein misfolding disorders (PMDs) is the accumulation of a misfolded form of a naturally expressed protein. Despite the diversity of clinical symptoms associated with different PMDs, the many similarities in their mechanisms suggest that distinct pathologies may cross-talk at the molecular level. The main goal of this study was to analyze the interaction of the protein misfolding processes implicated in Alzheimer's and prion diseases. For this purpose, we inoculated prions into an Alzheimer's transgenic mouse model that develops typical amyloid plaques and followed the progression of pathological changes over time. Our findings show a dramatic acceleration and exacerbation of both pathologies. The onset of prion disease symptoms in transgenic mice appeared significantly faster, with a concomitant increase in the level of misfolded prion protein in the brain, and a striking increase in amyloid plaque deposition was observed in prion-infected mice compared with their noninoculated counterparts. Histological and biochemical studies showed the association of the two misfolded proteins in the brain, and in vitro experiments showed that protein misfolding can be enhanced by a cross-seeding mechanism. These results suggest a profound interaction between Alzheimer's and prion pathologies, indicating that one protein misfolding process may be an important risk factor for the development of a second one. Our findings may have important implications for understanding the origin and progression of PMDs.
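
The cross-seeding result lends itself to a minimal kinetic picture in which aggregates of one protein also template misfolding of the other. A conceptual sketch (all rate constants invented; not the paper's experimental model):

```python
# Minimal two-protein seeded-aggregation model, conceptual only.
# dA/dt = k_a * (A + c_ba * B); dB/dt = k_b * (B + c_ab * A)
# The cross-seeding terms c_ba, c_ab couple the two pathologies.
# All rate constants are invented placeholders.
k_a, k_b   = 0.02, 0.02   # homotypic seeding rates [1/day]
c_ba, c_ab = 0.5, 0.5     # cross-seeding efficiencies (0 = independent)

def simulate(cross, days=200.0, dt=0.1):
    a, b = 1e-3, 1e-3                       # initial aggregate loads
    xba, xab = (c_ba, c_ab) if cross else (0.0, 0.0)
    for _ in range(int(days / dt)):         # forward-Euler integration
        da = k_a * (a + xba * b)
        db = k_b * (b + xab * a)
        a, b = a + da * dt, b + db * dt
    return a, b

print(simulate(cross=False))  # independent pathologies
print(simulate(cross=True))   # cross-seeding -> markedly faster accumulation
```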

Relevance: 30.00%

Abstract:

The paper deals with batch scheduling problems in process industries, where final products arise from several successive chemical or physical transformations of raw materials using multi-purpose equipment. In batch production mode, the total requirements of intermediate and final products are partitioned into batches. The production start of a batch at a given level requires the availability of all input products. We consider the problem of scheduling the production of given batches such that the makespan is minimized. Constraints like minimum and maximum time lags between successive production levels, sequence-dependent facility setup times, finite intermediate storages, production breaks, and time-varying manpower contribute to the complexity of this problem. We propose a new solution approach using models and methods of resource-constrained project scheduling, which (approximately) solves problems of industrial size within a reasonable amount of time.
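
As a minimal sketch of the project-scheduling view, the core of a serial schedule-generation scheme for batches with precedence constraints and minimum time lags is shown below; maximum time lags, sequence-dependent setups, storage limits, and calendars, which the paper's full model handles, are omitted here:

```python
# Serial schedule generation for batches with finish-to-start precedence and
# minimum time lags; a simplified stand-in for the paper's resource-constrained
# project-scheduling model (setups, storages, and calendars omitted).
durations = {"A": 3, "B": 2, "C": 4}          # batch processing times (hypothetical)
min_lag   = {("A", "B"): 1, ("B", "C"): 0}    # minimum lag after predecessor ends

def schedule(order):                           # order must be topological
    start = {}
    for job in order:
        preds = [(p, lag) for (p, s), lag in min_lag.items() if s == job]
        start[job] = max((start[p] + durations[p] + lag for p, lag in preds),
                         default=0)
    return start

s = schedule(["A", "B", "C"])
makespan = max(s[j] + durations[j] for j in s)
print(s, "makespan =", makespan)   # {'A': 0, 'B': 4, 'C': 6} makespan = 10
```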

Relevance: 30.00%

Abstract:

Current models of embryological development focus on intracellular processes such as gene expression and protein networks, rather than on the complex relationship between subcellular processes and the collective cellular organization that these processes support. We have explored this collective behavior in the context of neocortical development by modeling the expansion of a small number of progenitor cells into a laminated cortex with layer- and cell-type-specific projections. The developmental process is steered by a formal language analogous to genomic instructions and takes place in a physically realistic three-dimensional environment. A common genome inserted into individual cells controls their individual behaviors and thereby gives rise to collective developmental sequences in a biologically plausible manner. The simulation begins with a single progenitor cell containing the artificial genome. This progenitor then gives rise, through a lineage of offspring, to distinct populations of neuronal precursors that migrate to form the cortical laminae. The precursors differentiate by extending dendrites and axons, which reproduce the experimentally determined branching patterns of a number of different neuronal cell types observed in the cat visual cortex. This result is the first comprehensive demonstration of the principles of self-construction whereby cortical architecture develops. In addition, our model makes several testable predictions concerning cell migration and branching mechanisms.
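
The "formal language analogous to genomic instructions" can be pictured as rewrite rules that every cell executes on its own state. A toy lineage grammar in that spirit (rules invented for illustration, not the paper's actual genome language):

```python
# Toy lineage grammar: each symbol is a cell type, each rule a division or
# differentiation step. Rules are invented and far simpler than the paper's
# genome language.
rules = {
    "P": ["P", "N"],   # progenitor divides: one progenitor + one precursor
    "N": ["L5"],       # precursor differentiates into a layer-5 neuron
}

def develop(cells, steps):
    for _ in range(steps):
        cells = [child for c in cells for child in rules.get(c, [c])]
    return cells

lineage = develop(["P"], 4)
print(lineage)                                   # ['P', 'N', 'L5', 'L5', 'L5']
print(lineage.count("L5"), "layer-5 neurons from one progenitor")
```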

Relevance: 30.00%

Abstract:

In order to bridge interdisciplinary differences in Presence research and to establish connections between Presence and "older" concepts of psychology and communication, a theoretical model of the formation of Spatial Presence is proposed. It is applicable to exposure to different media and is intended to unify existing efforts to develop a theory of Presence. The model includes assumptions about attention allocation, mental models, and involvement, and also considers the role of media factors and user characteristics, thus incorporating much previous work. It is argued that a commonly accepted model of Spatial Presence is the only way to secure further progress within the international, interdisciplinary and multiple-paradigm community of Presence research.

Relevance: 30.00%

Abstract:

Objective: Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary greatly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective through its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. Method: TSPA is based on vector autoregression (VAR), an extension of univariate autoregression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of the psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. Results: TSPA allowed a prototypical process pattern to be identified in which patients' alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, therapists' stability over time in both mastery and clarification interventions was positively associated with better outcomes. Conclusions: TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy.
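
The VAR core of TSPA is compact: each session's scores are regressed on the previous session's scores, variable by variable. A minimal lag-1 illustration with simulated data (the coefficient matrix and variables are invented, not the study's questionnaire scales):

```python
import numpy as np

# Lag-1 vector autoregression, the building block of TSPA: regress each
# session's scores on the previous session's scores. Data are simulated;
# the two variables stand in for, e.g., alliance and self-efficacy.
rng = np.random.default_rng(0)
T = 40                                # number of sessions
A_true = np.array([[0.6, 0.2],        # invented cross-lagged coefficients
                   [0.3, 0.5]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.5, size=2)

X, Y = y[:-1], y[1:]                  # lagged predictors and responses
A_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(A_hat.T)                        # recovered coefficients, close to A_true
```

In TSPA proper, such individual models are then aggregated across patients to expose prototypical cross-lagged patterns such as the alliance/self-efficacy feedback loop.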

Relevance: 30.00%

Abstract:

BACKGROUND The success of an intervention to prevent the complications of an infection is influenced by the natural history of the infection. Assumptions about the temporal relationship between infection and the development of sequelae can affect the predicted effect size of an intervention and the sample size calculation. This study investigates how a mathematical model can be used to inform sample size calculations for a randomised controlled trial (RCT), using the example of Chlamydia trachomatis infection and pelvic inflammatory disease (PID). METHODS We used a compartmental model to mimic the structure of a published RCT. We considered three different processes for the timing of PID development relative to the initial C. trachomatis infection: immediate, constant throughout, or at the end of the infectious period. For each process we assumed that, of all women infected, the same fraction would develop PID in the absence of an intervention. We examined two sets of assumptions used to calculate the sample size in a published RCT that investigated the effect of chlamydia screening on PID incidence. We also investigated the influence of the natural history parameters of chlamydia on the required sample size. RESULTS The assumed event rates and effect sizes used for the sample size calculation implicitly determined the temporal relationship between chlamydia infection and PID in the model. Even small changes in the assumed PID incidence and relative risk (RR) led to considerable differences in the hypothesised mechanism of PID development. The RR and the sample size needed per group also depend on the natural history parameters of chlamydia. CONCLUSIONS Mathematical modelling helps to understand the temporal relationship between an infection and its sequelae and can show how uncertainties about natural history parameters affect sample size calculations when planning an RCT.
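
The sensitivity described in the results follows from the standard sample-size formula for comparing two proportions, where the assumed control-group PID incidence and the relative risk enter directly. A sketch (inputs are illustrative, not the published trial's assumptions):

```python
from math import ceil
from statistics import NormalDist

# Sample size per group for comparing two proportions (two-sided alpha, given
# power). Inputs are illustrative, not the published RCT's assumptions.
def n_per_group(p_control, rr, alpha=0.05, power=0.80):
    p_interv = p_control * rr                    # incidence under intervention
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    var = p_control * (1 - p_control) + p_interv * (1 - p_interv)
    return ceil((z_a + z_b) ** 2 * var / (p_control - p_interv) ** 2)

print(n_per_group(p_control=0.04, rr=0.5))   # baseline assumptions
print(n_per_group(p_control=0.02, rr=0.5))   # halving incidence ~doubles n
print(n_per_group(p_control=0.04, rr=0.7))   # weaker effect -> much larger n
```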

Relevance: 30.00%

Abstract:

Libraries of learning objects may serve as a basis for deriving course offerings that are customized to the needs of different learning communities or even individuals. Several ways of organizing this course composition process are discussed. Course composition needs a clear understanding of the dependencies between the learning objects. We therefore discuss the metadata for object relationships proposed in different standardization projects, especially those suggested in the Dublin Core Metadata Initiative. Based on these metadata we construct adjacency matrices and graphs, and we show how Gozinto-type computations can be used to determine the direct and indirect prerequisites of certain learning objects. The metadata may also be used to define integer programming models that can support the instructor in formulating specifications for selecting objects, or that allow a computer agent to select learning objects automatically. Such decision models could also be helpful for a learner navigating through a library of learning objects. We also sketch a graph-based procedure for manual or automatic sequencing of the learning objects.
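
Direct prerequisites sit in the adjacency matrix itself; indirect prerequisites fall out of its transitive closure, the Boolean analogue of a Gozinto computation. A small sketch with an invented four-object library:

```python
import numpy as np

# A[i, j] = True if object i is a direct prerequisite of object j.
# The transitive closure (Warshall's algorithm) adds all indirect
# prerequisites. The four-object library is invented for illustration.
objects = ["intro", "syntax", "semantics", "project"]
A = np.array([[0, 1, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=bool)

closure = A.copy()
for k in range(len(objects)):
    closure |= np.outer(closure[:, k], closure[k, :])

for j, obj in enumerate(objects):
    prereqs = [objects[i] for i in range(len(objects)) if closure[i, j]]
    print(f"{obj}: all prerequisites = {prereqs}")
# 'project' requires syntax and semantics directly and intro only indirectly.
```

The same closure information can then feed an integer program whose binary variables select objects subject to these prerequisite constraints.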

Relevance: 30.00%

Abstract:

Analysis of recurrent events has been widely discussed in the medical, health services, insurance, and engineering areas in recent years. This research proposes to use a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function of recurrent-events data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One of the advantages of using a nonhomogeneous Yule process for recurrent events is the assumption that the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to provide estimates of the parameters in the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. One example, concerning recurrent myocardial infarction events compared between two distinct populations (Mexican-Americans and non-Hispanic whites) in the Corpus Christi Heart Project, is examined.
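
A sketch of how events could be simulated under such an intensity, with the Yule feature that the rate grows with the number of prior events, using Lewis-Shedler thinning (baseline, covariates, and β are all invented):

```python
import numpy as np

# Simulate recurrent events with intensity
#   lambda_x(t) = (1 + N(t-)) * lambda_0(t) * exp(x' beta),
# where N(t-) is the number of prior events (the Yule feature), via
# Lewis-Shedler thinning. All parameter values are invented.
rng = np.random.default_rng(1)

def lam0(t):
    return 0.1 + 0.02 * t                   # increasing baseline (hypothetical)

def simulate_events(x, beta, t_end=10.0):
    scale = np.exp(x @ beta)
    t, events = 0.0, []
    while True:
        lam_max = (1 + len(events)) * lam0(t_end) * scale   # valid upper bound
        t += rng.exponential(1.0 / lam_max)
        if t > t_end:
            return events
        lam = (1 + len(events)) * lam0(t) * scale
        if rng.uniform() < lam / lam_max:                   # thinning step
            events.append(t)

beta = np.array([0.8, -0.3])
print(simulate_events(np.array([1.0, 0.0]), beta))  # higher-risk covariates
print(simulate_events(np.array([0.0, 1.0]), beta))  # lower-risk covariates
```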

Relevance: 30.00%

Abstract:

The policy development process leading to the Labour government's white paper of December 1997, The new NHS: Modern, Dependable, is the focus of this project, and the public policy development literature is used to aid in understanding this process. Policy makers who had been involved in the development of the white paper were interviewed in order to acquire a thorough understanding of who was involved in the process and how they produced the white paper. A theoretical framework is used that sorts policy development models into those that focus on knowledge and experience and those that focus on politics and influence. This framework is central to understanding the evidence gathered from the individuals and associations that participated in the policy development process. The main research question asked in this project is to what extent each of these sets of policy development models aids in understanding and explicating the process by which the Labour government's policies were developed. The interview evidence, along with published evidence, shows that a clear pattern of policy change emerged from this policy development process, and the Knowledge-Experience and Politics-Influence policy making models both assist in understanding it. The early stages of the policy development process were hierarchical and iterative, yet also very collaborative among the participants, with knowledge and experience being quite prevalent. At every point in the process, however, informal networks of political influence were used and were noted as quite prevalent by all of the individuals interviewed. The later stages of the process then became increasingly noninclusive, with decisions made by a select group of internal and external policy makers. These policy making models became an important tool with which to understand the policy development process. The Knowledge-Experience and Politics-Influence dichotomy of policy development models could therefore be useful in analyzing other types of policy development.

Relevance: 30.00%

Abstract:

Using properties of moment stationarity, we develop exact expressions for the mean and covariance of allele frequencies at a single locus for a set of populations subject to drift, mutation, and migration. Some general results can be obtained even for arbitrary mutation and migration matrices, for example: (1) under quite general conditions, the mean vector depends only on mutation rates, not on migration rates or the number of populations; (2) allele frequencies covary among all pairs of populations connected by migration, and as a result the drift, mutation, migration process is not ergodic when any finite number of populations is exchanging genes. In addition, we provide closed-form expressions for the mean and covariance of allele frequencies in Wright's finite-island model of migration under several simple models of mutation, and we show that the correlation in allele frequencies among populations can be very large for realistic rates of mutation unless an enormous number of populations are exchanging genes. As a result, the traditional diffusion approximation provides a poor approximation of the stationary distribution of allele frequencies among populations. Finally, we discuss some implications of our results for measures of population structure based on Wright's F-statistics.
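
The covariance result can be given a feel with a toy finite-island simulation: demes linked by migration share a migrant pool, so their allele-frequency fluctuations move together instead of averaging out. A sketch (parameters invented; the paper derives the exact moments analytically):

```python
import numpy as np

# Toy Wright-Fisher island model: d demes of size N exchange migrants at
# rate m, with symmetric mutation at rate u. Parameters are invented.
rng = np.random.default_rng(2)
d, N, m, u, gens = 5, 100, 0.01, 1e-4, 2000

p = np.full(d, 0.5)
history = []
for _ in range(gens):
    p_mig = (1 - m) * p + m * p.mean()         # mixing through migrant pool
    p_mut = (1 - u) * p_mig + u * (1 - p_mig)  # symmetric mutation
    p = rng.binomial(2 * N, p_mut) / (2 * N)   # drift: binomial resampling
    history.append(p.copy())

H = np.array(history[500:])    # drop burn-in generations
print(np.cov(H.T))             # positive off-diagonals: demes covary
```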

Relevance: 30.00%

Abstract:

Dua and Miller (1996) created leading and coincident employment indexes for the state of Connecticut, following Moore's (1981) work at the national level. The performance of the Dua-Miller indexes following the recession of the early 1990s fell short of expectations. This paper performs two tasks. First, it describes the process of revising the Connecticut Coincident and Leading Employment Indexes. Second, it analyzes the statistical properties and performance of the new indexes by comparing the lead profiles of the new and old indexes as well as their out-of-sample forecasting performance, using the Bayesian Vector Autoregressive (BVAR) method. The new indexes show improved performance in dating employment-cycle chronologies, and the lead profile test demonstrates that superiority in a rigorous, nonparametric statistical fashion. The mixed evidence from the BVAR forecasting experiments illustrates the truth of the Granger and Newbold (1986) caution that leading indexes properly predict cycle turning points but do not necessarily provide accurate forecasts except at turning points, a view that our results support.
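
A lead profile is simply the collection of leads of the leading index over the reference-cycle turning points; the nonparametric test then asks whether those leads are systematically longer for one index. A minimal sketch with invented turning-point dates (not the Connecticut chronologies):

```python
# Lead-profile sketch: leads, in months, of a leading index at each
# reference-cycle turning point. All dates are invented.
leading_turns   = ["1990-02", "1992-01", "2000-06"]   # index peaks/troughs
reference_turns = ["1990-08", "1992-11", "2000-12"]   # employment peaks/troughs

def months(ym):
    y, m = map(int, ym.split("-"))
    return 12 * y + m

leads = [months(r) - months(l) for l, r in zip(leading_turns, reference_turns)]
print("leads (months):", leads)                 # [6, 10, 6]
print("median lead:", sorted(leads)[len(leads) // 2])
```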

Relevance: 30.00%

Abstract:

This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques, respectively. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the central bank should adopt a loss function that differs from the social loss function. Carefully designing the central bank's loss function with consistent targets can harmonize optimal and consistent policy. This desirable result emerges from two observations. First, the social loss function reflects a normative process that is not necessarily consistent with the structure of the microeconomy. Thus, the social loss function cannot serve directly as the loss function for the central bank. Second, an optimal loss function for the central bank must depend on the structure of that microeconomy. In addition, the paper shows that control theory provides a benchmark for institution design in a game-theoretic framework.
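
The target-inconsistency argument can be made concrete in a one-period Barro-Gordon setup, a textbook illustration and not necessarily the paper's exact model:

```latex
% One-period Barro-Gordon illustration (textbook setup, not necessarily the
% paper's exact model). Social loss penalizes inflation and deviations of
% output y from a target y^* that exceeds natural output \bar{y}:
\[
  L^{S} = \tfrac{1}{2}\pi^{2} + \tfrac{\lambda}{2}\,(y - y^{*})^{2},
  \qquad y = \bar{y} + (\pi - \pi^{e}), \qquad y^{*} > \bar{y} .
\]
% Discretionary (consistent) policy optimizes taking expectations \pi^{e} as
% given; imposing rational expectations (\pi = \pi^{e}) yields an inflation
% bias with no output gain:
\[
  \pi^{d} = \lambda\,(y^{*} - \bar{y}) \;>\; 0 \;=\; \pi^{opt}.
\]
% Assigning the central bank a loss function with the consistent target
% y^{*} = \bar{y} removes the bias, so consistent policy reproduces the
% optimal outcome: the bank's loss function should differ from the social
% loss function.
```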