825 results for multi-mediational path model


Relevance: 30.00%

Abstract:

BACKGROUND AND AIM: There is a lack of suitable in vitro models for evaluating treatment modalities intended to remove subgingival bacterial biofilm. Consequently, the aims of this in vitro study were: a) to establish a pocket model enabling mechanical removal of biofilm, and b) to evaluate repeated non-surgical periodontal treatment with respect to biofilm removal and reformation, surface alterations, loss of tooth hard substance, and attachment of periodontal ligament (PDL) fibroblasts. MATERIAL AND METHODS: Standardized human dentin specimens were colonized by multi-species biofilms for 3.5 days and subsequently placed into artificially created pockets. Non-surgical periodontal treatment was performed as follows: a) hand instrumentation with curettes (CUR), b) ultrasonication (US), c) subgingival air-polishing using erythritol (EAP), and d) subgingival air-polishing using erythritol combined with chlorhexidine digluconate (EAP-CHX). The reduction and recolonization of bacterial counts, surface roughness (Ra and Rz), the resulting loss of tooth substance (thickness), and the attachment of PDL fibroblasts were evaluated and statistically analyzed by means of ANOVA with post-hoc LSD. RESULTS: After 5 treatments, bacterial reduction in biofilms was highest with EAP-CHX (4 log10) and lowest with CUR (2 log10). Additionally, substance loss was highest with CUR (128±40 µm) compared with US (14±12 µm), EAP (6±7 µm) and EAP-CHX (11±10 µm). The surface was roughened by CUR and US. Surfaces exposed to US and to EAP attracted the highest numbers of PDL fibroblasts. CONCLUSION: The established biofilm model, simulating a periodontal pocket with interchangeable placement of test specimens carrying multi-species biofilms, enables the evaluation of different non-surgical treatment modalities with respect to biofilm removal and surface alterations. Compared with hand instrumentation, ultrasonication and air-polishing with erythritol prevent substance loss and result in a smooth surface with nearly no residual biofilm, which promotes the reattachment of PDL fibroblasts.
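
As an illustration of the statistical analysis named above, a minimal sketch of a one-way ANOVA followed by Fisher's-LSD-style pairwise comparisons is given below. The group values are hypothetical placeholders, and the pairwise t-tests use the two-sample pooled variance rather than the ANOVA's pooled mean square error, a common simplification.

```python
# Minimal sketch: one-way ANOVA across treatment groups, then LSD-style
# unadjusted pairwise t-tests carried out only if the omnibus F is significant.
# The substance-loss values below are hypothetical placeholders, not the data.
from itertools import combinations
from scipy import stats

groups = {
    "CUR":     [128, 110, 150, 95, 160],   # hypothetical substance loss (µm)
    "US":      [14, 10, 20, 8, 18],
    "EAP":     [6, 2, 10, 5, 9],
    "EAP-CHX": [11, 7, 15, 9, 13],
}

# Omnibus test across all treatment groups
f_stat, p_omnibus = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_omnibus:.4f}")

# Fisher's LSD idea: unadjusted pairwise comparisons after a significant ANOVA
if p_omnibus < 0.05:
    for a, b in combinations(groups, 2):
        t, p = stats.ttest_ind(groups[a], groups[b])
        print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}")
```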

Relevance: 30.00%

Abstract:

Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables using a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ~1000-member ensemble of the Bern3D-LPJ carbon–climate model is applied, and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68 % confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in the Atlantic meridional overturning, in Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and in soil carbon stocks. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68 % c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results from CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.
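
The TRE concept is simply the change in a variable normalized by cumulative emissions; as a worked illustration using the central estimates quoted above (the carbon-mass basis of the normalization follows the underlying study and is not re-derived here):

```latex
% Worked illustration of the TRE definition with the quoted central estimates.
\[
  \mathrm{TRE}_X \;=\; \frac{\Delta X}{E_{\mathrm{cum}}},
  \qquad
  \mathrm{TRE}_{T} \;\approx\; \frac{1.9\,^{\circ}\mathrm{C}}{1000\ \mathrm{Gt\,C}}
  \;=\; 1.9\,^{\circ}\mathrm{C}\ \text{per}\ 1000\ \mathrm{Gt\,C}.
\]
```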

Relevance: 30.00%

Abstract:

Information-centric networking (ICN) addresses drawbacks of the Internet protocol, namely scalability and security. ICN is a promising approach for wireless communication because it enables seamless mobile communication, where intermediate or source nodes may change, as well as quick recovery from collisions. In this work, we study wireless multi-hop communication in Content-Centric Networking (CCN), a popular ICN architecture. We propose two broadcast faces that are used in alternating order along the path to support multi-hop communication between any nodes in the network. By slightly modifying CCN, we can reduce the number of duplicate Interests by 93.4 % and the number of collisions by 61.4 %. Furthermore, we describe and evaluate different strategies for prefix registration based on overhearing. Strategies that configure prefixes on only one of the two faces can result in at least 27.3 % faster data transmissions.
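
A minimal sketch of the alternating-face idea, under the assumption that each hop simply forwards on the broadcast face it did not receive the Interest on; the face names and hop loop are illustrative, not the paper's implementation.

```python
# Illustrative sketch of alternating two broadcast faces along a multi-hop path
# so that two consecutive hops never transmit on the same face.
from typing import Optional

FACE_A, FACE_B = "bcast-0", "bcast-1"   # assumed face identifiers

def outgoing_face(incoming_face: Optional[str]) -> str:
    """Forward on the broadcast face the Interest did NOT arrive on."""
    if incoming_face is None:           # we are the original requester
        return FACE_A
    return FACE_B if incoming_face == FACE_A else FACE_A

# Faces used along a 4-hop path from requester towards the content source
face = None
for hop in range(4):
    face = outgoing_face(face)
    print(f"hop {hop}: forward Interest on {face}")
```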

Relevance: 30.00%

Abstract:

Point Distribution Models (PDMs) are among the most popular shape description techniques, and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population it is essential to have a representative number of training samples, which is not always possible. This problem becomes especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) in the context of multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller, anatomically significant regions within organs. The significant advantage of the GEM-PDM over two previous approaches (PDM and hierarchical PDM) in terms of shape modeling accuracy and robustness to noise has been successfully verified on two different databases of multi-organ sets: six subcortical brain structures, and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
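
For readers unfamiliar with the underlying machinery, a minimal sketch of a classical single-resolution PDM is given below: PCA over pre-aligned landmark vectors, with new shapes generated as x = mean + P b. This is the standard formulation that GEM-PDM builds on, not the multi-resolution extension itself; the toy data are random placeholders.

```python
# Minimal classical Point Distribution Model: PCA over aligned landmark vectors.
# Training shapes are assumed to be pre-aligned (e.g., by Procrustes analysis).
import numpy as np

def fit_pdm(shapes: np.ndarray, var_kept: float = 0.95):
    """shapes: (n_samples, n_landmarks * dim) array of aligned landmark vectors."""
    mean = shapes.mean(axis=0)
    X = shapes - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)   # PCA via SVD
    var = s**2 / (len(shapes) - 1)                      # mode variances
    t = np.searchsorted(np.cumsum(var) / var.sum(), var_kept) + 1
    return mean, Vt[:t].T, var[:t]                      # mean shape, modes P, variances

def synthesize(mean, P, b):
    """Generate a new shape from mode weights b: x = mean + P @ b."""
    return mean + P @ b

# Toy example: 20 training shapes, 30 landmarks in 3D (random placeholder data)
rng = np.random.default_rng(0)
train = rng.normal(size=(20, 30 * 3))
mean, P, var = fit_pdm(train)
new_shape = synthesize(mean, P, 2.0 * np.sqrt(var))     # +2 SD along each mode
print(P.shape, new_shape.shape)
```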

Relevance: 30.00%

Abstract:

The adaptation potential of forests to rapid climatic changes can be assessed from vegetation dynamics during past climatic changes as preserved in fossil pollen data. However, pollen data reflect the integrated effects of climate and biotic processes such as establishment, survival, competition, and migration. To disentangle these processes, we compared an annually laminated late Würm and Holocene pollen record from the Central Swiss Plateau with simulations of a dynamic forest patch model. All input data used in the simulations were largely independent of pollen data, i.e. the presented analysis is non-circular. Temperature and precipitation scenarios were based on reconstructions from pollen-independent sources. The earliest arrival times of the species at the study site after the last glacial were inferred from pollen maps. We ran a series of simulations under different combinations of climate and immigration scenarios. In addition, we examined the sensitivity of the simulated presence/absence of four major species to changes in the climate scenario. The pattern of the pollen record could partly be explained by the climate scenario used, mostly by temperature. However, some features, in particular the absence of most species during the late Würm, could only be simulated if the winter temperature anomalies of the scenario were decreased considerably. Consequently, we had to assume in the simulations that most species immigrated during or after the Younger Dryas (12,000 years BP), and Abies and Fagus even later. Given the timing of tree species immigration, the vegetation was in equilibrium with climate during long periods, but responded with lags on the time scale of centuries to millennia, caused by secondary succession after rapid climatic changes such as at the end of the Younger Dryas, or by the immigration of dominant taxa. Climate influenced the tree taxa both directly and indirectly by changing inter-specific competition. We conclude that, during the present rapid climatic change as well, species migration might be an important process, particularly where geographic barriers such as the Alps lie in the migration path.

Relevance: 30.00%

Abstract:

We investigate the transition from unitary to dissipative dynamics in the relativistic O(N) vector model with the λ(φ²)² interaction using the nonperturbative functional renormalization group in the real-time formalism. In thermal equilibrium, the theory is characterized by two scales, the interaction range for coherent scattering of particles and the mean free path determined by the rate of incoherent collisions with excitations in the thermal medium. Their competition determines the renormalization group flow and the effective dynamics of the model. Here we quantify the dynamic properties of the model in terms of the scale-dependent dynamic critical exponent z in the limit of large temperatures and in 2 ≤ d ≤ 4 spatial dimensions. We contrast our results to the behavior expected at vanishing temperature and address the question of the appropriate dynamic universality class for the given microscopic theory.
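
For readers who want the microscopic theory spelled out, one common convention for the Euclidean action of the relativistic O(N) vector model with the λ(φ²)² interaction is given below; the 1/4! normalization of the coupling is a convention choice and may differ from the paper's.

```latex
% One common convention for the Euclidean action of the relativistic O(N) model
% with the \lambda(\varphi^2)^2 interaction; the coupling normalization is a
% convention and may differ from the paper's.
\[
  S[\varphi] \;=\; \int \mathrm{d}^{d}x\,\mathrm{d}\tau
  \left[
    \tfrac{1}{2}\,\partial_\mu \varphi_a\,\partial_\mu \varphi_a
    \;+\; \tfrac{1}{2}\,m^2\,\varphi_a\varphi_a
    \;+\; \frac{\lambda}{4!}\,\bigl(\varphi_a\varphi_a\bigr)^{2}
  \right],
  \qquad a = 1,\dots,N .
\]
```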

Relevance: 30.00%

Abstract:

PURPOSE: Leakage is the most common complication of percutaneous cement augmentation of the spine. The viscosity of the polymethylmethacrylate (PMMA) cement is strongly correlated with the likelihood of cement leakage. We hypothesized that cement leakage can be reduced by sequential cement injection in a vertebroplasty model. METHODS: A standardized vertebral body substitute model, consisting of aluminum oxide foams coated with acrylic cement and containing a preformed leakage path simulating a ventral vein, was developed. Three techniques for injecting 6 ml of PMMA were assessed: injection in one single step (all-in-one), injection of 1 ml in a first step and 5 ml in a second step with 1 min latency in between (two-step), and sequential injection of 0.5 ml aliquots with 1 min latency between sequences (sequential). Standard PMMA vertebroplasty cement was used; each injection technique was tested on ten vertebral body substitute models with two possible leakage paths per model. Leakage was assessed on radiographs using a zonal graduation: intraspongious = no leakage, extracortical = leakage. RESULTS: The leakage rate was significantly lower with the "sequential" technique (2/20 leakages) than with the "two-step" (15/20) and "all-in-one" (20/20) techniques (p < 0.001). The risk of cement leakage was 10.0 times higher in the "all-in-one" than in the "sequential" group (RR = 10.0; 95 % confidence interval 2.7-37.2; p < 0.001). CONCLUSIONS: Sequential cement injection is a simple approach to minimize the risk of leakage. Taking advantage of the temperature gradient between body and room temperature, it is possible to increase the cement viscosity inside the vertebra while keeping it low in the syringe. With sequential injection of small cement volumes, leakage paths are blocked before additional low-viscosity cement is injected.
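
The reported relative risk and its confidence interval can be reproduced from the leakage counts using the standard log-RR method; the sketch below assumes that method (the abstract does not state the exact procedure) and recovers the reported values.

```python
# Reproducing the reported relative risk from the leakage counts
# (20/20 "all-in-one" vs 2/20 "sequential") with the standard log-RR method.
import math

a, n1 = 20, 20   # leakages / leakage paths, all-in-one
c, n2 = 2, 20    # leakages / leakage paths, sequential

rr = (a / n1) / (c / n2)                        # 10.0
se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.1f}, 95% CI {lo:.1f}-{hi:.1f}")   # approx 2.7 to 37.2
```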

Relevance: 30.00%

Abstract:

Information-centric networking (ICN) offers new perspectives on mobile ad-hoc communication because routing is based on names rather than on endpoint identifiers. Since every content object has a unique name and is signed, authentic content can be stored and cached by any node. If connectivity to a content source breaks, it is not necessarily required to build a new path to the same source; content can also be retrieved from a closer node that provides a copy of the same content. For example, in case of collisions, retransmissions do not need to be performed over the entire path but, thanks to caching, only over the link where the collision occurred. Furthermore, multiple requests can be aggregated to improve the scalability of wireless multi-hop communication. In this work, we base our investigations on Content-Centric Networking (CCN), a popular ICN architecture. While related works on wireless CCN communication are based exclusively on broadcast communication, we show that this is not needed for efficient mobile ad-hoc communication. With Dynamic Unicast, requesters can build unicast paths to content sources after they have been identified via broadcast. We have implemented Dynamic Unicast in CCNx, which provides a reference implementation of the CCN concepts, and performed extensive evaluations in diverse mobile scenarios using NS3-DCE, the direct code execution framework for the NS3 network simulator. Our evaluations show that Dynamic Unicast can result in more efficient communication than broadcast communication, while still supporting all CCN advantages such as caching, scalability and implicit content discovery.
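
A minimal sketch of the Dynamic Unicast forwarding state as described above: Interests for a prefix are broadcast until a content source is identified, then pinned to a unicast face towards that source, and the node falls back to broadcast when the path breaks. Face names and the callback structure are illustrative assumptions, not the CCNx implementation.

```python
# Illustrative sketch of Dynamic Unicast forwarding state: broadcast until a
# content source is identified, then send Interests for that prefix over a
# unicast face towards the source; fall back to broadcast when the path breaks.
BROADCAST = "bcast"
unicast_face_for_prefix = {}   # name prefix -> unicast face (assumed bookkeeping)

def face_for_interest(prefix):
    """Choose the outgoing face for an Interest under the given name prefix."""
    return unicast_face_for_prefix.get(prefix, BROADCAST)

def on_data_received(prefix, source_id):
    """A Data packet identified the content source: pin the prefix to a unicast face."""
    unicast_face_for_prefix[prefix] = f"unicast-to-{source_id}"

def on_interest_timeout(prefix):
    """Path broken or source moved: forget the unicast face, rediscover via broadcast."""
    unicast_face_for_prefix.pop(prefix, None)

print(face_for_interest("/demo/clip"))     # bcast (not yet discovered)
on_data_received("/demo/clip", "node-7")   # source identified via broadcast
print(face_for_interest("/demo/clip"))     # unicast-to-node-7
on_interest_timeout("/demo/clip")          # path broke
print(face_for_interest("/demo/clip"))     # bcast again
```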

Relevance: 30.00%

Abstract:

Background: Feedback is considered to be one of the most important drivers of learning. One form of structured feedback used in medical settings is multisource feedback (MSF). This feedback technique provides the opportunity to gain a differentiated view of a doctor's performance from several perspectives, using a questionnaire and a facilitating conversation in which learning goals are formulated. While many studies have been conducted on the validity, reliability and feasibility of the instrument, little is known about the factors that might influence the effects of MSF on clinical performance. Summary of Work: To study under which circumstances MSF is most effective, we performed a literature review on Google Scholar with a focus on MSF and feedback in general. The main keywords were: MSF, multi-source-feedback, multi source feedback, and feedback, each combined with influencing/hindering/facilitating factors, effective, effectiveness, doctors-in-training, and surgery. Summary of Results: Based on the literature, we developed a preliminary model of facilitating factors. This model includes five main factors influencing MSF: the questionnaire, the doctor-in-training, the group of raters, the facilitating supervisor, and the facilitating conversation. Discussion and Conclusions: In particular, the following points that might influence MSF have not yet been sufficiently studied: the facilitating conversation with the supervisor, individual characteristics of doctors-in-training, and the causal relations between influencing factors. Overall, there are only very few studies focusing on the impact of MSF on actual and long-term performance. We developed a preliminary model of hindering and facilitating factors for MSF. Further studies are needed to better understand under which circumstances MSF is most effective. Take-home messages: The preliminary model might help to guide further studies on how to implement MSF so that it can be used to its full potential.

Relevance: 30.00%

Abstract:

Aims: Species diversity and genetic diversity may be affected in parallel by similar environmental drivers. However, genetic diversity may also be affected independently by habitat characteristics. We aim to disentangle the relationships between genetic diversity, species diversity and habitat characteristics of woody species in subtropical forest. Methods: We studied 11 dominant tree and shrub species in 27 plots in Gutianshan, China, and assessed their genetic diversity (Ar) and population differentiation (F'ST) with microsatellite markers. We tested whether Ar and population-specific F'ST were correlated with local species diversity and plot characteristics. Multi-model inference and model averaging were used to determine the relative importance of each predictor. Additionally, we tested for isolation-by-distance and isolation-by-elevation by regressing pairwise F'ST against pairwise spatial and elevational distances. Important findings: Genetic diversity was not related to species diversity for any of the study species. Thus, our results do not support joint effects of habitat characteristics on these two levels of biodiversity. Instead, genetic diversity in two understory shrubs, Rhododendron simsii and Vaccinium carlesii, was affected by plot age, with genetic diversity decreasing in successionally older plots. Population differentiation increased with plot age in Rhododendron simsii and Lithocarpus glaber. This shows that succession can reduce genetic diversity within populations and increase differentiation between them. Furthermore, we found four cases of isolation-by-distance and two cases of isolation-by-elevation. The former indicates inefficient pollen and seed dispersal by animals, whereas the latter might be due to phenological asynchronies. These patterns indicate that succession can affect genetic diversity without parallel effects on species diversity, and that gene flow in a continuous subtropical forest can be restricted even at a local scale.
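
A minimal sketch of the isolation-by-distance test described above (regressing pairwise F'ST on pairwise spatial distance). The values are hypothetical placeholders, and in practice significance would usually be assessed with a Mantel permutation test because pairwise comparisons are not independent.

```python
# Sketch of an isolation-by-distance check: regress pairwise F'ST on pairwise
# geographic distance. Values below are hypothetical placeholders; significance
# would normally come from a Mantel permutation test rather than the parametric
# p-value, because pairwise comparisons share plots and are not independent.
import numpy as np
from scipy import stats

pairwise_fst  = np.array([0.02, 0.05, 0.04, 0.08, 0.06, 0.09])   # F'ST per plot pair
pairwise_dist = np.array([0.3,  1.2,  0.9,  2.5,  1.8,  3.1])    # distance (km) per pair

slope, intercept, r, p, se = stats.linregress(pairwise_dist, pairwise_fst)
print(f"slope = {slope:.4f} per km, r^2 = {r**2:.2f} (parametric p = {p:.3f})")
```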

Relevance: 30.00%

Abstract:

The impact of health promotion programs depends on both program effectiveness and the extent to which the program is implemented among the target population. The purpose of this dissertation was to describe the development and evaluation of a school-based program diffusion intervention designed to increase the rate of dissemination and adoption of the Child and Adolescent Trial for Cardiovascular Health, or CATCH program (recently renamed the Coordinated Approach to Child Health). The first study described the process by which schools across the state of Texas spontaneously began to adopt the CATCH program after it was tested and proven effective in a multi-site randomized efficacy trial. A survey of teachers and administrator representatives of all schools on record that had purchased the CATCH program, but were not involved in the efficacy trial, was used to find out who brought CATCH into the schools, how they garnered support for its adoption, why they decided to adopt the program, and what was involved in deciding to adopt. The second study described how the Intervention Mapping framework guided the planning, development and implementation of a program for the diffusion of CATCH. An iterative process was used to integrate theory, literature, the experience of project staff and data from the target population into a meaningful set of program determinants and performance objectives. Proximal program objectives were specified and translated into both media and interpersonal communication strategies for program diffusion. The third study assessed the effectiveness of the diffusion program in a case-comparison design. Three of the twenty Education Service Center regions in Texas, selected based on similar demographic criteria, were followed for adoption of the CATCH curriculum. One of these regions received the full media and interpersonal channel intervention, a second received a reduced media-only intervention, and a third received no intervention. Results suggested that the use of interpersonal channels with media follow-up is an effective means to facilitate program dissemination and adoption. The media-alone condition was not effective in facilitating program adoption.

Relevance: 30.00%

Abstract:

With substance abuse treatment expanding in prisons and jails, understanding how behavior change interacts with a restricted setting becomes more essential. The Transtheoretical Model (TTM) has been used to understand intentional behavior change in unrestricted settings; however, evidence indicates that restricted settings can affect the measurement and structure of the TTM constructs. The present study examined data from problem drinkers at baseline and end-of-treatment from three studies: (1) Project CARE (n = 187) recruited inmates from a large county jail; (2) Project Check-In (n = 116) recruited inmates from a state prison; (3) Project MATCH, a large multi-site alcohol study, had two recruitment arms, aftercare (n = 724 pre-treatment and 650 post-treatment) and outpatient (n = 912 pre-treatment and 844 post-treatment). The analyses used cross-sectional data to test for non-invariance of measures of the TTM constructs (readiness, confidence, temptation, and processes of change) across restricted and unrestricted settings using structural equation modeling (SEM). Two restricted groups (jail and aftercare) and one unrestricted group (outpatient) entering treatment, and one restricted (prison) and two unrestricted groups (aftercare and outpatient) at end-of-treatment, were contrasted. In addition, TTM end-of-treatment profiles were tested as predictors of 12-month drinking outcomes (profile analysis). Although SEM did not indicate structural differences in the overall TTM construct model across setting types, there were differences in the factor structure of the confidence and temptation constructs at pre-treatment and in the factor structure of the behavioral processes at end-of-treatment. For pre-treatment temptation and confidence, differences were found in the social-situations factor loadings and in the variance of the confidence and temptation latent factors. For the end-of-treatment behavioral processes, differences across the restricted and unrestricted settings were identified in the counter-conditioning and stimulus control factor loadings. The TTM end-of-treatment profiles were not predictive of drinking outcomes in the prison sample. Both the pre- and the post-treatment differences in structure across setting types involved constructs operationalized with behaviors that are limited for those in restricted settings. These studies suggest that the TTM is a viable model for explicating addictive behavior change in restricted settings, but they call for modification of subscale items that refer to specific behaviors and for caution in interpreting mean differences across setting types for problem drinkers.

Relevance: 30.00%

Abstract:

A census of 925 U.S. colleges and universities offering master's and doctoral degrees was conducted in order to study the number of elements of an environmental management system, as defined by ISO 14001, possessed by small, medium and large institutions. A 30% response rate was achieved, with 273 responses included in the final data analysis. Overall, the number of ISO 14001 elements implemented among the 273 institutions ranged from 0 to 16, with a median of 12. There was no significant association between the number of elements implemented and the size of the institution (p = 0.18; Kruskal-Wallis test) or the USEPA region (p = 0.12; Kruskal-Wallis test). The proportion of U.S. colleges and universities that had implemented a structured, comprehensive environmental management system, defined as answering yes to all 16 elements, was 10% (95% C.I. 6.6%–14.1%); however, 38% (95% C.I. 32.0%–43.8%) reported that they had implemented a structured, comprehensive environmental management system, and 30.0% (95% C.I. 24.7%–35.9%) were planning to implement one within the next five years. Stratified analyses were performed by institution size, Carnegie Classification and job title. The Osnabruck model, and another under development by the South Carolina Sustainable Universities Initiative, are the only two environmental management system models that have been proposed specifically for colleges and universities, although several guides are now available. The Environmental Management System Implementation Model for U.S. Colleges and Universities developed here is an adaptation of the ISO 14001 standard and USEPA recommendations, tailored to U.S. colleges and universities to streamline the implementation process. By using this implementation model created for the U.S. research and academic setting, it is hoped that these highly specialized institutions will have a clearer and more cost-effective path towards the implementation of an EMS and greater compliance with local, state and federal environmental legislation.
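
As a check on the reported proportions, a minimal sketch of an exact binomial confidence interval is given below; the count of 27 out of 273 institutions is inferred from the 10% figure and is an assumption, since the abstract does not state the raw count.

```python
# Checking a reported proportion with an exact (Clopper-Pearson) binomial CI.
# The count of 27 out of 273 is inferred from the 10% figure and is an assumption.
from statsmodels.stats.proportion import proportion_confint

count, nobs = 27, 273
lo, hi = proportion_confint(count, nobs, alpha=0.05, method="beta")  # Clopper-Pearson
print(f"{count/nobs:.1%} (95% CI {lo:.1%} to {hi:.1%})")   # roughly 6.6% to 14.1%
```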

Relevance: 30.00%

Abstract:

Anticancer drugs are typically administered in the clinic in the form of mixtures, sometimes called combinations. Only in rare cases, however, are mixtures approved as drugs. Rather, research on mixtures tends to occur after single drugs have been approved. The goal of this research project was to develop modeling approaches that encourage rational preclinical mixture design. To this end, a series of models was developed. First, several QSAR classification models were constructed to predict the cytotoxicity, oral clearance, and acute systemic toxicity of drugs. The QSAR models were applied to a set of over 115,000 natural compounds in order to identify promising ones for testing in mixtures. Second, an improved method was developed to assess synergistic, antagonistic, and additive effects between drugs in a mixture. This method, dubbed the MixLow method, is similar to the Median-Effect method, the de facto standard for assessing drug interactions. The primary difference between the two is that the MixLow method uses a nonlinear mixed-effects model to estimate the parameters of concentration-effect curves, rather than an ordinary least squares procedure. Parameter estimators produced by the MixLow method were more precise than those produced by the Median-Effect method, and coverage of Loewe index confidence intervals was superior. Third, a model was developed to predict drug interactions based on scores obtained from virtual docking experiments. This represents a novel approach for modeling drug mixtures and was more useful for the data modeled here than competing approaches. The model was applied to cytotoxicity data for 45 mixtures, each composed of up to 10 selected drugs. One drug, doxorubicin, was a standard chemotherapy agent; the others were well-known natural compounds including curcumin, EGCG, quercetin, and rhein. Predictions of synergism/antagonism were made for all possible fixed-ratio mixtures, the cytotoxicities of the 10 best-scoring mixtures were tested, and drug interactions were assessed. Predicted and observed responses were highly correlated (r2 = 0.83). The results suggested that some mixtures allowed up to an 11-fold reduction of doxorubicin concentration without sacrificing efficacy. Taken together, the models developed in this project present a general approach to the rational design of mixtures during preclinical drug development.
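
For context, the Median-Effect method mentioned above fits each agent's concentration-effect data to the median-effect equation and quantifies interactions with a Loewe-additivity combination index (CI); one standard way of writing both is shown below, with notation that may differ from the dissertation's.

```latex
% Median-effect equation for a single agent and the two-drug Loewe combination
% index; notation is the standard one and may differ from the dissertation's.
% f_a = fraction affected, f_u = 1 - f_a, D_m = median-effect dose, m = slope;
% (D)_i is the dose of drug i in the mixture and D_{x,i} the dose of drug i
% alone producing the same effect level x.
\[
  \frac{f_a}{f_u} \;=\; \left(\frac{D}{D_m}\right)^{m},
  \qquad
  \mathrm{CI} \;=\; \frac{(D)_1}{D_{x,1}} + \frac{(D)_2}{D_{x,2}},
  \qquad
  \mathrm{CI}<1\ \text{(synergy)},\quad \mathrm{CI}=1\ \text{(additivity)},\quad \mathrm{CI}>1\ \text{(antagonism)}.
\]
```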

Relevance: 30.00%

Abstract:

Institutional Review Boards (IRBs) are the primary gatekeepers for the protection of ethical standards in federally regulated research on human subjects in this country. This paper focuses on the general, broad measures that may be instituted or enhanced to exemplify a "model IRB". This is done by examining the current regulatory standards of federally regulated IRBs, not private or commercial boards, and how many of those standards have been found either inadequate or not generally understood or followed. The analysis includes suggestions on how to bring about changes that make the IRB process more efficient and less subject to litigation, and how to create standardized educational protocols for members. The paper also considers better oversight of multi-center research, increased centralization of IRBs, utilization of Data Safety Monitoring Boards when necessary, payment for research protocol review, voluntary accreditation, and the institution of evaluation/quality assurance programs. This is a policy study utilizing secondary analysis of publicly available data. The research for this paper therefore draws on scholarly medical/legal journals, web information from the Department of Health and Human Services, the Food and Drug Administration, and the Office of the Inspector General, accreditation programs, law review articles, and current regulations applicable to the relevant portions of the paper. Two issues are consistently cited in the literature as major concerns. The first is the need for basic, standardized educational requirements across all IRBs and their members; the second is much stricter and better-informed management of continuing research. There is no federally mandated formal education system currently in place for IRB members, except for certain NIH-based trials. Also, IRBs are not keeping up with research once a study has begun; although they are required by regulation to do so, this does not appear to be a high priority. This is the area most in danger of increased litigation. Other issues, such as voluntary accreditation and outcomes evaluation, are slowly gaining steam as these processes become more available and more sought after, as with JCAHO accreditation of hospitals. Adopting the principles discussed in this paper should promote better use of a local IRB's time, money, and expertise for protecting the vulnerable populations in their care. Without further improvements to the system, there is concern that private and commercial IRBs will attempt to create a monopoly on much of the clinical research in the future, as they are not as heavily regulated and can therefore offer companies quicker and more convenient reviews. IRBs need to consider the advantages of charging for their unique and important services as a cost of doing business. More importantly, there must be a minimum standard of education for all IRB members in the ethical standards of human research, and a greater emphasis placed on the follow-up of ongoing research, as this is the most critical time for study participants and may soon become the largest area for litigation. Additionally, there should be a centralized IRB for multi-site trials, or a study website with important information affecting the trial in real time. Standards and metrics need to be developed to assess the performance of IRBs for quality assurance and outcome evaluation. The boards should not be content to run the business of human subjects' research without determining how well that function is actually being carried out. It is important that federally regulated IRBs provide excellence in human research and promote those values most important to the public at large.