25 results for Process models


Relevance: 30.00%

Abstract:

OBJECTIVE Crohn's disease is a chronic inflammatory process that has recently been associated with a higher risk of early implant failure. Herein we provide information on the impact of colitis on peri-implant bone formation using preclinical models of chemically induced colitis. METHODS Colitis was induced either by intrarectal instillation of 2,4,6-trinitrobenzene sulfonic acid (TNBS) or by feeding rats dextran sodium sulfate (DSS) in drinking water. One week after disease induction, titanium miniscrews were inserted into the tibia. Four weeks after implantation, peri-implant bone volume per tissue volume (BV/TV) and bone-to-implant contact (BIC) were determined by histomorphometric analysis. RESULTS Cortical histomorphometric parameters were similar in the control (n = 10), DSS (n = 10) and TNBS (n = 8) groups: cortical BV/TV was 92.2 ± 3.7%, 92.0 ± 3.0% and 92.6 ± 2.7%, and cortical BIC was 81.3 ± 8.8%, 83.2 ± 8.4% and 84.0 ± 7.0%, respectively. No significant differences were observed for medullary BV/TV (19.5 ± 6.4%, 16.2 ± 5.6% and 15.4 ± 9.0%) or medullary BIC (48.8 ± 12.9%, 49.2 ± 6.2% and 41.9 ± 11.7%), respectively. Successful induction of colitis was confirmed by loss of body weight and by colon morphology. CONCLUSIONS The results suggest that bone regeneration around implants is not impaired in chemically induced colitis models. Considering that Crohn's disease can affect any part of the gastrointestinal tract, including the mouth, our model only partially reflects the clinical situation.

Relevance: 30.00%

Abstract:

Context. According to the sequential accretion model (or core-nucleated accretion model), giant planet formation begins with the formation of a solid core which, when massive enough, can gravitationally bind gas from the nebula to form the envelope. The most critical part of the model is the formation time of the core: to trigger the accretion of gas, the core has to grow to several Earth masses before the gas component of the protoplanetary disc dissipates. Aims: We calculate planetary formation models including a detailed description of the dynamics of the planetesimal disc, taking into account both gas drag and excitation by forming planets. Methods: We computed the formation of planets, considering the oligarchic regime for the growth of the solid core. Embryos growing in the disc stir their neighbouring planetesimals, exciting their relative velocities, which makes accretion more difficult. Here we introduce a more realistic treatment of the evolution of planetesimals' relative velocities, which directly impacts the formation timescale. For this, we computed the excitation state of planetesimals resulting from stirring by forming planets and from gas-solid interactions. Results: We find that the formation of giant planets is favoured by the accretion of small planetesimals, as their random velocities are more easily damped by the gas drag of the nebula. Moreover, the capture radius of a protoplanet with a (tiny) envelope is also larger for small planetesimals. However, planets migrate as a result of disc-planet angular momentum exchange, with important consequences for their survival: owing to the slow growth of a protoplanet in the oligarchic regime, rapid inward type I migration has important implications for intermediate-mass planets that have not yet started their runaway gas accretion phase. Most of these planets are lost into the central star. Surviving planets have masses either below 10 M⊕ or above several Jupiter masses.
Conclusions: To form giant planets before the dissipation of the disc, small planetesimals (~0.1 km) have to be the major contributors to the solid accretion process. However, the combination of oligarchic growth and fast inward migration leads to the absence of intermediate-mass planets. Other processes must therefore be at work to explain the population of extrasolar planets presently known.

Relevance: 30.00%

Abstract:

Tropical wetlands are estimated to represent about 50% of natural wetland methane (CH4) emissions and to explain a large fraction of the observed CH4 variability on timescales ranging from glacial-interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and the associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories, and simulated CH4 emissions at the Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin; however, the model does not reproduce the variability between sites or between years within a site, and the available site information is too limited to confirm or refute these model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, yield simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr−1. The LPX emissions are also highly sensitive to vegetation distribution: two simulations with the same mean PFT cover but different spatial distributions of grasslands within the basin modulated emissions by about 20%, and correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, owing to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics of CH4 emissions, but we propose solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% when the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.

Relevance: 30.00%

Abstract:

The paper deals with batch scheduling problems in process industries, where final products arise from several successive chemical or physical transformations of raw materials using multi-purpose equipment. In batch production mode, the total requirements of intermediate and final products are partitioned into batches. The production start of a batch at a given level requires the availability of all input products. We consider the problem of scheduling the production of given batches such that the makespan is minimized. Constraints such as minimum and maximum time lags between successive production levels, sequence-dependent facility setup times, finite intermediate storages, production breaks, and time-varying manpower contribute to the complexity of this problem. We propose a new solution approach using models and methods of resource-constrained project scheduling, which (approximately) solves problems of industrial size within a reasonable amount of time.
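As a minimal illustration of the serial schedule-generation idea underlying such resource-constrained approaches, the following Python sketch schedules a few invented batches with minimum time lags and single-capacity facilities; the batch names, durations, facilities and lags are hypothetical and not taken from the paper.

```python
from collections import defaultdict

# Toy instance: batch -> (duration, facility, predecessors with minimum
# time lag measured after the predecessor's end). All data are invented.
batches = {
    "A1": (3, "reactor", {}),
    "A2": (2, "filter",  {"A1": 1}),   # A2 may start >= end(A1) + 1
    "B1": (4, "reactor", {}),
    "B2": (3, "filter",  {"B1": 0}),
}

facility_free = defaultdict(int)   # earliest time each facility is idle
finish = {}

# Serial schedule generation: scan batches in a precedence-feasible
# order and start each as early as time lags and facility availability allow.
for name in ["A1", "B1", "A2", "B2"]:
    dur, fac, preds = batches[name]
    earliest = max([finish[p] + lag for p, lag in preds.items()] or [0])
    start = max(earliest, facility_free[fac])
    finish[name] = start + dur
    facility_free[fac] = finish[name]

print(finish, "makespan =", max(finish.values()))
```

A full solver would additionally handle maximum time lags, setup times and storage limits, typically via priority rules or branch-and-bound on the activity order.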

Relevance: 30.00%

Abstract:

Current models of embryological development focus on intracellular processes such as gene expression and protein networks, rather than on the complex relationship between subcellular processes and the collective cellular organization these processes support. We have explored this collective behavior in the context of neocortical development, by modeling the expansion of a small number of progenitor cells into a laminated cortex with layer- and cell-type-specific projections. The developmental process is steered by a formal language analogous to genomic instructions, and takes place in a physically realistic three-dimensional environment. A common genome inserted into individual cells controls their individual behaviors, and thereby gives rise to collective developmental sequences in a biologically plausible manner. The simulation begins with a single progenitor cell containing the artificial genome. This progenitor then gives rise through a lineage of offspring to distinct populations of neuronal precursors that migrate to form the cortical laminae. The precursors differentiate by extending dendrites and axons, which reproduce the experimentally determined branching patterns of a number of different neuronal cell types observed in the cat visual cortex. This result is the first comprehensive demonstration of the principles of self-construction whereby the cortical architecture develops. In addition, our model makes several testable predictions concerning cell migration and branching mechanisms.
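The lineage logic (a self-renewing progenitor emitting precursors that differentiate) can be caricatured as a toy symbol-rewriting system; the rules below are invented and only loosely analogous to the formal genome described above, which also encodes migration and branching in 3D.

```python
# Toy rewriting rules standing in for a "genome": each cell type maps to
# the cell types it produces in one generation. Rules are invented.
rules = {
    "P":  ["P", "N"],   # progenitor self-renews and emits a precursor
    "N":  ["L5"],       # precursor differentiates into a layer-5 neuron
    "L5": ["L5"],       # differentiated neurons persist
}

cells = ["P"]                      # start from a single progenitor
for generation in range(4):        # expand the lineage for 4 generations
    cells = [child for c in cells for child in rules[c]]

print(cells)  # ['P', 'N', 'L5', 'L5', 'L5']
```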

Relevance: 30.00%

Abstract:

In order to bridge interdisciplinary differences in Presence research and to establish connections between Presence and "older" concepts of psychology and communication, a theoretical model of the formation of Spatial Presence is proposed. It is applicable to exposure to different media and is intended to unify existing efforts to develop a theory of Presence. The model includes assumptions about attention allocation, mental models, and involvement, and also considers the role of media factors and user characteristics, thus incorporating much previous work. It is argued that a commonly accepted model of Spatial Presence is the only way to secure further progress within the international, interdisciplinary and multiple-paradigm community of Presence research.

Relevance: 30.00%

Abstract:

Objective: Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective through its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. Method: TSPA is based on vector autoregression (VAR), an extension of univariate autoregression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. Results: TSPA allowed a prototypical process pattern to be identified, in which patients' alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, therapists' stability over time in both mastery and clarification interventions was positively associated with better outcomes. Conclusions: TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy.
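A minimal sketch of the VAR(1) building block behind TSPA, fitted by ordinary least squares on synthetic data; the two series stand in for hypothetical session-to-session ratings of alliance and self-efficacy for one patient, and the coefficient matrix and noise level are assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sessions = 200
A_true = np.array([[0.5, 0.3],    # alliance_t depends on both lagged series
                   [0.2, 0.4]])   # self-efficacy_t likewise (feedback loop)

# Simulate a stationary VAR(1): y_t = A_true @ y_{t-1} + noise.
y = np.zeros((n_sessions, 2))
for t in range(1, n_sessions):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate the lag-1 coefficient matrix: regress y_t on y_{t-1}
# (intercept omitted for brevity).
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))  # close to A_true
```

In TSPA such a model would be fitted per patient, and the individual coefficient matrices then aggregated to expose prototypical cross-lagged patterns.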

Relevance: 30.00%

Abstract:

BACKGROUND The success of an intervention to prevent the complications of an infection is influenced by the natural history of the infection. Assumptions about the temporal relationship between infection and the development of sequelae can affect the predicted effect size of an intervention and the sample size calculation. This study investigates how a mathematical model can be used to inform sample size calculations for a randomised controlled trial (RCT), using the example of Chlamydia trachomatis infection and pelvic inflammatory disease (PID). METHODS We used a compartmental model to imitate the structure of a published RCT. We considered three different processes for the timing of PID development, in relation to the initial C. trachomatis infection: immediate, constant throughout, or at the end of the infectious period. For each process we assumed that, of all women infected, the same fraction would develop PID in the absence of an intervention. We examined two sets of assumptions used to calculate the sample size in a published RCT that investigated the effect of chlamydia screening on PID incidence. We also investigated the influence of the natural history parameters of chlamydia on the required sample size. RESULTS The assumed event rates and effect sizes used for the sample size calculation implicitly determined the temporal relationship between chlamydia infection and PID in the model. Even small changes in the assumed PID incidence and relative risk (RR) led to considerable differences in the hypothesised mechanism of PID development. The RR and the sample size needed per group also depend on the natural history parameters of chlamydia. CONCLUSIONS Mathematical modelling helps to understand the temporal relationship between an infection and its sequelae and can show how uncertainties about natural history parameters affect sample size calculations when planning an RCT.
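To illustrate how the assumed event rate and effect size drive the required sample size, here is a minimal sketch using the standard normal-approximation formula for comparing two proportions; the incidence and relative-risk figures are illustrative, not those of the published RCT or the compartmental model.

```python
import math

def n_per_group(p_control, relative_risk, alpha=0.05, power=0.8):
    """Normal-approximation sample size per arm for a two-sided
    two-proportion test (control incidence vs. screened incidence)."""
    p1, p2 = p_control, p_control * relative_risk
    z_a, z_b = 1.959964, 0.841621        # z_{1-alpha/2}, z_{power}
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# If the timing assumption weakens the achievable effect (RR 0.75
# instead of 0.5 at 3% control incidence), the trial must be far larger:
print(n_per_group(0.03, 0.5))
print(n_per_group(0.03, 0.75))
```

This is exactly where the modelled timing of PID development matters: the same fraction of infected women developing PID can imply very different preventable fractions, and hence very different RRs and sample sizes.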

Relevance: 30.00%

Abstract:

Libraries of learning objects may serve as a basis for deriving course offerings that are customized to the needs of different learning communities or even individuals. Several ways of organizing this course composition process are discussed. Course composition needs a clear understanding of the dependencies between the learning objects. We therefore discuss the metadata for object relationships proposed in different standardization projects, especially those suggested in the Dublin Core Metadata Initiative. Based on these metadata we construct adjacency matrices and graphs. We show how Gozinto-type computations can be used to determine the direct and indirect prerequisites of certain learning objects. The metadata may also be used to define integer programming models that can support the instructor in formulating specifications for selecting objects, or that allow a computer agent to select learning objects automatically. Such decision models could also be helpful for a learner navigating through a library of learning objects. We also sketch a graph-based procedure for the manual or automatic sequencing of learning objects.
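The Gozinto-type computation on the adjacency matrix can be sketched as follows; the learning objects and their prerequisite structure are invented for illustration.

```python
import numpy as np

# Hypothetical library of four learning objects.
objects = ["intro", "vectors", "matrices", "eigenvalues"]

# Adjacency matrix D: D[i, j] = 1 if object i is a direct prerequisite
# of object j.
D = np.array([
    [0, 1, 1, 0],  # "intro" directly precedes "vectors" and "matrices"
    [0, 0, 0, 1],  # "vectors" directly precedes "eigenvalues"
    [0, 0, 0, 1],  # "matrices" directly precedes "eigenvalues"
    [0, 0, 0, 0],
], dtype=float)

# Gozinto-type total-requirements matrix: T = D + D^2 + D^3 + ...
# = (I - D)^{-1} - I; the series terminates because the prerequisite
# graph is acyclic. T[i, j] counts the prerequisite paths from i to j.
n = len(objects)
T = np.linalg.inv(np.eye(n) - D) - np.eye(n)

# All direct and indirect prerequisites of "eigenvalues"
# (threshold 0.5 guards against floating-point noise):
j = objects.index("eigenvalues")
prereqs = [objects[i] for i in range(n) if T[i, j] > 0.5]
print(prereqs)  # ['intro', 'vectors', 'matrices']
```

The same matrix T could then feed an integer programming model as coefficient data, ensuring that every selected object is accompanied by all of its (transitive) prerequisites.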

Relevance: 30.00%

Abstract:

I will start by discussing some aspects of Kagitcibasi's Theory of Family Change: its current empirical status and, more importantly, its focus on universal human needs and the consequences of this focus. Because Family Change Theory treats autonomy and relatedness as universal basic human needs, and because its culture-level emphasis on cultural norms and family values reflects a culture's capacity to fulfil its members' respective needs, the theory advocates balanced cultural norms of independence and interdependence. As a normative theory, it therefore postulates the necessity of a synthetic family model of emotional interdependence as an alternative to extreme models of total independence and total interdependence. Generalizing from this, I will sketch a theoretical model centred on a dynamic and dialectical process of fit between individual and culture, and between culture and universal human needs and related social practices. I will discuss this model using a recent cross-cultural project on implicit theories of self/world and primary/secondary control orientations as an example. Implications for migrating families and acculturating individuals are also discussed.