958 results for Implementation models


Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Despite long-standing calls to disseminate evidence-based treatments for generalized anxiety disorder (GAD), modest progress has been made in the study of how such treatments should be implemented. The primary objective of this study was to test three competing strategies for implementing a cognitive behavioral treatment (CBT) for out-patients with GAD (i.e., a comparison of one compensation vs. two capitalization models). METHODS: For our three-arm, single-blinded, randomized controlled trial (implementation of CBT for GAD [IMPLEMENT]), we recruited adults with GAD using advertisements in high-circulation newspapers to participate in a 14-session cognitive behavioral treatment (Mastery of your Anxiety and Worry, MAW-packet). We randomly assigned eligible patients using a full randomization procedure (1:1:1) to three different implementation conditions: adherence priming (compensation model), which had a systematized focus on patients' individual GAD symptoms and how to compensate for these symptoms within the MAW-packet, and resource priming and supportive resource priming (capitalization model), which had systematized focuses on patients' strengths and abilities and how these strengths can be capitalized on within the same packet. In the intention-to-treat population, an outcome composite of primary and secondary symptom-related self-report questionnaires was analyzed based on a hierarchical linear growth model from intake to the 6-month follow-up assessment. This trial is registered at ClinicalTrials.gov (identifier: NCT02039193) and is closed to new participants. FINDINGS: From June 2012 to November 2014, of 411 participants screened, 57 eligible participants were recruited and randomly assigned to the three conditions. Forty-nine patients (86%) provided outcome data at post-assessment (14% dropout rate). All three conditions showed a highly significant reduction of symptoms over time.
However, compared with the adherence priming condition, both resource priming conditions showed faster symptom reduction. Observer ratings of a sub-sample of recorded videos (n = 100) showed that therapists in the resource priming conditions conducted more strength-oriented interventions than in the adherence priming condition. No patients died or attempted suicide. INTERPRETATION: To our knowledge, this is the first trial to focus on capitalization and compensation models during the implementation of a single prescriptive treatment packet for GAD. We have shown that GAD-related symptoms were reduced significantly faster in the resource priming conditions, although the limitations of our study include a well-educated population. If replicated, our results suggest that therapists who implement a mental health treatment for GAD might profit from a systematized focus on capitalization models. FUNDING: Swiss National Science Foundation (SNSF no. PZ00P1_136937/1) awarded to CF. KEYWORDS: Cognitive behavioral therapy; Evidence-based treatment; Implementation strategies; Randomized controlled trial
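The kind of slope comparison behind "faster symptom reduction" can be illustrated with a minimal least-squares sketch. The time points and mean scores below are hypothetical stand-ins, not data from the trial:

```python
import numpy as np

def fit_slope(times, scores):
    """Ordinary least-squares slope of symptom score over time (per month)."""
    A = np.vstack([times, np.ones_like(times)]).T
    slope, _intercept = np.linalg.lstsq(A, scores, rcond=None)[0]
    return slope

# Hypothetical mean scores at intake, post-treatment and 6-month follow-up
times = np.array([0.0, 3.5, 9.5])               # months since intake
adherence = np.array([60.0, 48.0, 42.0])        # compensation model arm
resource = np.array([60.0, 42.0, 34.0])         # capitalization model arm

slope_adherence = fit_slope(times, adherence)   # ≈ -1.80 per month
slope_resource = fit_slope(times, resource)     # ≈ -2.58 per month
```

A steeper (more negative) slope in the resource arm is what "faster reduction" means in a linear growth model; the trial itself used a hierarchical model over individual patients rather than group means.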

Abstract:

A census of 925 U.S. colleges and universities offering master's and doctoral degrees was conducted in order to study the number of elements of an environmental management system, as defined by ISO 14001, possessed by small, medium and large institutions. A 30% response rate was achieved, with 273 responses included in the final data analysis. Overall, the number of ISO 14001 elements implemented among the 273 institutions ranged from 0 to 16, with a median of 12. There was no significant association between the number of elements implemented and the size of the institution (p = 0.18; Kruskal-Wallis test) or among USEPA regions (p = 0.12; Kruskal-Wallis test). The proportion of U.S. colleges and universities that had implemented a structured, comprehensive environmental management system, defined as answering yes to all 16 elements, was 10% (95% C.I. 6.6%–14.1%); however, 38% (95% C.I. 32.0%–43.8%) reported that they had implemented a structured, comprehensive environmental management system, while 30.0% (95% C.I. 24.7%–35.9%) were planning to implement one within the next five years. Stratified analyses were performed by institution size, Carnegie Classification and job title. The Osnabrück model, and another under development by the South Carolina Sustainable Universities Initiative, are the only two environmental management system models that have been proposed specifically for colleges and universities, although several guides are now available. The Environmental Management System Implementation Model for U.S. Colleges and Universities developed here is an adaptation of the ISO 14001 standard and USEPA recommendations, tailored to U.S. colleges and universities to streamline the implementation process. In using this implementation model created for the U.S. research and academic setting, it is hoped that these highly specialized institutions will be provided with a clearer and more cost-effective path toward the implementation of an EMS and greater compliance with local, state and federal environmental legislation.
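The interval estimates quoted above are standard binomial confidence intervals. A Wilson score interval, for example, gives figures close to the reported 10% (6.6%–14.1%) if one assumes 27 "yes to all 16" responses out of 273; the count of 27 is our assumption, back-calculated from the rounded 10%:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

low, high = wilson_ci(27, 273)   # ≈ (0.069, 0.140)
```

The study may have used a different interval (e.g. exact/Clopper-Pearson), which would explain small differences from the published bounds.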

Abstract:

With the recognition of the importance of evidence-based medicine, there is an emerging need for methods to systematically synthesize available data. Specifically, methods to provide accurate estimates of test characteristics for diagnostic tests are needed to help physicians make better clinical decisions. To provide more flexible approaches for meta-analysis of diagnostic tests, we developed three Bayesian generalized linear models. Two of these models, a bivariate normal and a binomial model, analyzed pairs of sensitivity and specificity values while incorporating the correlation between these two outcome variables. Noninformative independent uniform priors were used for the variance of sensitivity, specificity and correlation. We also applied an inverse Wishart prior to check the sensitivity of the results. The third model was a multinomial model in which the test results were modeled as multinomial random variables. All three models can include specific imaging techniques as covariates in order to compare performance. Vague normal priors were assigned to the coefficients of the covariates. The computations were carried out using the 'Bayesian inference Using Gibbs Sampling' (BUGS) implementation of Markov chain Monte Carlo techniques. We investigated the properties of the three proposed models through extensive simulation studies. We also applied these models to a previously published meta-analysis dataset on cervical cancer as well as to an unpublished melanoma dataset. In general, our findings show that the point estimates of sensitivity and specificity were consistent between the Bayesian and frequentist bivariate normal and binomial models. However, in the simulation studies, the estimates of the correlation coefficient from the Bayesian bivariate models were not as good as those obtained from frequentist estimation, regardless of which prior distribution was used for the covariance matrix.
The Bayesian multinomial model consistently underestimated the sensitivity and specificity regardless of the sample size and correlation coefficient. In conclusion, the Bayesian bivariate binomial model provides the most flexible framework for future applications because of the following strengths: (1) it facilitates direct comparison between different tests; (2) it captures the variability in both sensitivity and specificity simultaneously, as well as the intercorrelation between the two; and (3) it can be applied directly to sparse data without ad hoc corrections.
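The correlated sensitivity/specificity structure that the bivariate models capture can be simulated in a few lines of NumPy. The parameter values below are arbitrary illustrations, not taken from the cervical cancer or melanoma datasets, and this is data generation only, not the Bayesian estimation itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Logit-scale means and SDs for (sensitivity, specificity), with the
# usual negative correlation from threshold variation across studies
mu = np.array([1.5, 2.0])
sd = np.array([0.3, 0.3])
rho = -0.5
cov = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2]])

n_studies, n_per = 200, 500
logits = rng.multivariate_normal(mu, cov, size=n_studies)
probs = 1 / (1 + np.exp(-logits))          # per-study sens/spec
tp = rng.binomial(n_per, probs[:, 0])      # true positives among diseased
tn = rng.binomial(n_per, probs[:, 1])      # true negatives among healthy

pooled_sens = tp.sum() / (n_studies * n_per)
pooled_spec = tn.sum() / (n_studies * n_per)
```

A bivariate model would be fitted to the `(tp, tn)` counts; naive pooling as above ignores the between-study variability that the hierarchical models are designed to capture.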

Abstract:

This work aimed to create a mailable, OSLD-based phantom with accuracy suitable for RPC audits of HDR brachytherapy sources at institutions participating in NCI-funded cooperative clinical trials. An 8 × 8 × 10 cm3 prototype with two slots capable of holding nanoDot Al2O3:C OSL dosimeters (Landauer, Glenwood, IL) was designed and built. The phantom has a single channel capable of accepting all 192Ir HDR brachytherapy sources in current clinical use in the United States. Irradiations were performed with an 192Ir HDR source to determine correction factors for linearity with dose, dose rate, and the combined effect of irradiation energy and phantom construction. The uncertainties introduced by source positioning in the phantom and by timer resolution limitations were also investigated. The linearity correction factor was found to be a function of dose (in cGy) that differed from the factor determined by the RPC for the same batch of dosimeters under 60Co irradiation. There was no significant dose rate effect. Separate energy+block correction factors were determined for both models of 192Ir sources currently in clinical use, and these vendor-specific correction factors differed by almost 2.6%. For Nucletron sources, the correction factor was 1.026 ± 0.004 (99% confidence interval), and for Varian sources it was 1.000 ± 0.007 (99% CI). Reasonable deviations in source positioning within the phantom and the limited resolution of the source timer had insignificant effects on the ability to measure dose. The overall measurement uncertainty of the system was estimated to be ±2.5% for both Nucletron and Varian source audits (95% CI). This uncertainty was small enough to support a ±5% acceptance criterion for source strength audits under a formal RPC audit program. Trial audits of eight participating institutions resulted in an average RPC-to-institution dose ratio of 1.000 with a standard deviation of 0.011.
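Applying the reported vendor-specific factors is a simple multiplicative correction. The sketch below uses the two energy+block factors quoted in the text; the 100 cGy raw reading and the function name are made-up illustrations:

```python
def audited_dose(raw_cGy, k_energy_block, k_linearity=1.0):
    """Apply multiplicative correction factors to an OSLD reading (sketch)."""
    return raw_cGy * k_linearity * k_energy_block

# Vendor-specific energy+block factors reported in the text (99% CI widths
# of ±0.004 and ±0.007 respectively, not propagated here)
K_NUCLETRON = 1.026
K_VARIAN = 1.000

dose_n = audited_dose(100.0, K_NUCLETRON)   # 102.6 cGy
dose_v = audited_dose(100.0, K_VARIAN)      # 100.0 cGy
diff_pct = 100 * (K_NUCLETRON / K_VARIAN - 1)   # the ~2.6% vendor difference
```

The real audit chain also includes the dose-dependent linearity factor, which is omitted here because its functional form is not given in this abstract.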

Abstract:

Membrane systems are computationally equivalent to Turing machines. However, their distributed and massively parallel nature yields polynomial-time solutions to problems whose known conventional solutions are non-polynomial. It is therefore important to develop dedicated hardware and software implementations exploiting these two features of membrane systems. In distributed implementations of P systems, a communication bottleneck problem arises: as the number of membranes grows, the network becomes congested. The purpose of distributed architectures is to reach a compromise between the massively parallel character of the system and the time needed for an evolution step, that is, the transition from one configuration of the system to the next, thereby solving the communication bottleneck problem. The goal of this paper is twofold. First, to survey in a systematic and uniform way the main results regarding how membranes can be placed on processors in order to obtain a software/hardware simulation of P systems in a distributed environment. Second, we improve some results on the membrane dissolution problem, prove that it is connected, and discuss the possibility of simulating this property in the distributed model. All of this improves the implementation of system parallelism, since it increases the parallelism of the external communication among processors. The proposed ideas improve on previous architectures for tackling the communication bottleneck problem through a reduction of the total time of an evolution step, an increase in the number of membranes that can run on a processor, and a reduction in the number of processors.
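The compromise can be caricatured with a toy cost model, entirely our own construction and not one of the surveyed architectures: distributing membranes over more processors shortens the parallel internal phase of a step but inflates the serialized external communication between processors:

```python
def step_time(n_membranes, n_processors, t_internal, t_comm):
    """Toy model of one evolution step of a distributed P system:
    membranes on each processor evolve in parallel, then pairwise
    external communication between processors is serialized."""
    per_proc = -(-n_membranes // n_processors)         # ceil division
    external_msgs = n_processors * (n_processors - 1)  # all-to-all, worst case
    return per_proc * t_internal + external_msgs * t_comm

# More processors cut internal work but inflate external communication
t4 = step_time(256, 4, t_internal=1.0, t_comm=0.1)    # 64*1.0 + 12*0.1 = 65.2
t32 = step_time(256, 32, t_internal=1.0, t_comm=0.1)  # 8*1.0 + 992*0.1 = 107.2
```

Under these invented parameters, adding processors past a point makes a step slower, which is exactly the bottleneck the surveyed architectures try to balance.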

Abstract:

Public Private Partnerships (PPPs) are mostly implemented to circumvent budgetary constraints and to encourage efficiency and quality in the provision of public infrastructure, with the ultimate aim of increasing social welfare. One way of reaching the latter objective is to introduce performance-based standards tied to bonuses and penalties that reward or punish the performance of the contractor. This paper focuses on the implementation of safety-based incentives in PPPs, such that the better the safety outcome, the larger the economic reward to the contractor. The main aim of this paper is to identify whether the incentives to improve road safety in PPPs are ultimately effective in improving safety ratios in Spain. To that end, Poisson and negative binomial regression models were applied using 2006 data on motorways of the Spanish network. The findings indicate that even though road safety is highly influenced by variables largely outside the contractor's control, such as the Average Annual Daily Traffic and the percentage of heavy vehicles, the implementation of safety incentives in PPPs has a positive influence on the reduction of fatalities, injuries and accidents.
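A Poisson regression of the kind used in the paper can be fitted by iteratively reweighted least squares. Everything below (covariates, coefficient values, data) is synthetic, purely to show the mechanics, not the paper's dataset or results:

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Poisson regression with log link, fitted by iteratively
    reweighted least squares; X must include an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu          # working response
        W = mu                                # working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Synthetic accident counts: a standardized traffic covariate plus an
# incentive dummy with a (made-up) accident-reducing effect of -0.3
rng = np.random.default_rng(1)
n = 400
traffic = rng.normal(0.0, 1.0, n)             # stand-in for scaled AADT
incentive = rng.integers(0, 2, n)
eta = 0.8 + 0.5 * traffic - 0.3 * incentive   # true linear predictor
y = rng.poisson(np.exp(eta))
X = np.column_stack([np.ones(n), traffic, incentive])
beta = poisson_irls(X, y)                     # ≈ [0.8, 0.5, -0.3]
```

A negative fitted coefficient on the incentive dummy is how "positive influence on the reduction of accidents" shows up in such a model; the negative binomial variant additionally models overdispersion.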

Abstract:

The modeling phase is fundamental both in the analysis of a dynamic system and in the design of a control system. This phase is even more critical when it must be performed on-line and the only information about the system comes from input/output data. This paper presents adaptation algorithms for fuzzy systems based on the extended Kalman filter, which make it possible to obtain accurate models without renouncing the computational efficiency that characterizes the Kalman filter, and which allow on-line implementation alongside the process.
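For a fuzzy system whose consequent parameters enter linearly (e.g. a zero-order Takagi-Sugeno model), the extended Kalman filter reduces to the ordinary Kalman/RLS update. The sketch below adapts three rule consequents on-line from input/output data alone; all names, the rule layout and the noise variance are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def firing(x, centers, sigma):
    """Normalized Gaussian rule activations (the regressor vector)."""
    w = np.exp(-0.5 * ((x - centers) / sigma) ** 2)
    return w / w.sum()

centers = np.array([-1.0, 0.0, 1.0])   # rule centers over the input range
sigma = 0.6
theta = np.zeros(3)                    # rule consequents, adapted on-line
P = np.eye(3) * 100.0                  # parameter covariance (vague prior)
r = 0.01                               # assumed measurement-noise variance

process = lambda x: 0.5 * x            # "unknown" process seen only via I/O
for x in np.linspace(-1.0, 1.0, 200):
    phi = firing(x, centers, sigma)    # observation row for this sample
    y = process(x)                     # measured output
    S = phi @ P @ phi + r              # innovation variance
    K = P @ phi / S                    # Kalman gain
    theta = theta + K * (y - phi @ theta)
    P = P - np.outer(K, phi @ P)

model_out = firing(0.5, centers, sigma) @ theta   # close to process(0.5) = 0.25
```

Because each sample costs only a rank-one update, the adaptation can run in-line with the process; for consequents that enter nonlinearly, the EKF linearizes `phi` around the current estimate instead.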

Abstract:

The province of Salta is located in the northwest of Argentina, on the border with Bolivia, Chile and Paraguay. Its capital, the city of Salta, concentrates half of the inhabitants of the province and has grown to 600,000 inhabitants from the small Spanish town founded in 1583. The city is crossed by the Arenales River, which descends from nearby mountains to the north and serves both as a source of water and as an outlet for sewage. With the city's growth, however, the river has become a focus of infection and remarkable unhealthiness. It is necessary to undertake a plan for the recovery of the river, directed at the attainment of well-being and the improvement of the community's quality of life. The fundamental idea of the plan is to achieve an ordering of the river basin and an integral management of the channel and its surroundings, including cleanup. The improvement of water quality, the healthiness of the surroundings and the improvement of the environment must go hand in hand with the development of sport and relaxation activities, tourism, the establishment of breeding grounds, kitchen gardens, micro-enterprises with clean production, and other actions that make the river beneficial to society, this being a basic factor for its care and sustainable use. The present pollution is organic, chemical, industrial and domestic, due to the disposal of refuse and sewer effluents, and it affects not only the flora and small fauna, destroying the biodiversity, but also the health of the people living on the river's margins. The plan must consider, besides hydric and environmental cleanup and the prevention of floods, the planning of aggregate extraction, infrastructure works and consolidation of the margins, and the arrangement of the whole river basin. It must also consider public intervention at the state, provincial and local levels, as well as private intervention.
In the model it has been necessary to include a sub-model for the selection of the entity that will be the optimal instrument to reach the proposed objectives, answering the social, environmental and economic requirements. For this, the authors have used multi-criteria decision methods to rate and select alternatives and to program their implementation. In the model the authors have considered short-, medium- and long-term actions. Together these form a Pareto-optimal alternative that secures the ordering and the integral, suitable management of the basin of the Arenales River, focusing on its passage through the city of Salta.
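A weighted-sum scoring of candidate entities is a minimal example of such multi-criteria selection. The weights, entity names and scores below are invented for illustration; the study's actual criteria and decision method may well differ:

```python
# Hypothetical criteria weights and 0-10 scores for candidate entities
weights = {"social": 0.4, "environmental": 0.35, "economic": 0.25}
scores = {
    "river-basin authority": {"social": 8, "environmental": 9, "economic": 6},
    "municipal agency":      {"social": 7, "environmental": 6, "economic": 8},
    "private concession":    {"social": 5, "environmental": 6, "economic": 9},
}

def weighted_score(entity_scores):
    """Aggregate an entity's criterion scores with the global weights."""
    return sum(weights[c] * entity_scores[c] for c in weights)

ranking = sorted(scores, key=lambda e: weighted_score(scores[e]), reverse=True)
best = ranking[0]   # "river-basin authority" under these made-up numbers
```

Real multi-criteria methods (e.g. outranking or analytic-hierarchy approaches) differ in how they aggregate, but all reduce to ranking alternatives against weighted criteria as sketched here.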

Abstract:

Incorporating the possibility of attaching attributes to variables in a logic programming system has been shown to allow the addition of general constraint-solving capabilities to it. This approach is very attractive in that, by adding a few primitives, any logic programming system can be turned into a generic constraint logic programming system in which constraint solving can be user-defined at the source level: an extreme example of the "glass box" approach. In this paper we propose a different and novel use for the concept of attributed variables: developing a generic parallel/concurrent (constraint) logic programming system with the same "glass box" flavor. We argue that a system which implements attributed variables and a few additional primitives can be easily customized at the source level to implement many of the languages and execution models of parallelism and concurrency currently proposed, in both shared-memory and distributed systems. We illustrate this through examples and report on an implementation of our ideas.
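The "glass box" idea can be caricatured in a few lines: an attributed variable carries per-module attributes, and a user-defined hook is consulted whenever unification tries to bind it. This Python sketch is only an analogy of the Prolog mechanism, not an implementation of it:

```python
class AttrVar:
    """A logic variable that may carry per-module attributes; binding it
    consults each module's hook, which may veto the unification."""
    def __init__(self):
        self.value = None
        self.attrs = {}                  # module name -> attribute value

    def put_attr(self, module, attr):
        self.attrs[module] = attr

    def bind(self, value, hooks):
        for module, attr in self.attrs.items():
            if not hooks[module](attr, value):
                return False             # hook vetoed: unification fails
        self.value = value
        return True

# A toy finite-domain "solver" defined entirely at the user level:
hooks = {"fd": lambda domain, value: value in domain}

x = AttrVar()
x.put_attr("fd", {1, 2, 3})
rejected = x.bind(5, hooks)   # False: 5 is outside the attached domain
accepted = x.bind(2, hooks)   # True: the binding succeeds
```

In a real system the hook would also handle unification of two attributed variables and merge their attributes; that case is omitted here for brevity.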

Abstract:

We present a method for the static resource usage analysis of MiniZinc models. The analysis can infer upper bounds on the usage that a MiniZinc model will make of some resources, such as the number of constraints of a given type (equality, disequality, global constraints, etc.), the number of variables (search variables or temporary variables), or the size of the expressions before calling the solver. These bounds are obtained from the models independently of the concrete input data (the instance data) and are in general functions of the sizes of such data. In our approach, MiniZinc models are translated into Ciao programs, which are then analyzed by the CiaoPP system. CiaoPP includes a parametric analysis framework for resource usage in which the user can define resources and express the resource usage of library procedures (and certain program constructs) by means of a language of assertions. We present the approach and report on a preliminary implementation, which shows the feasibility of the approach and provides encouraging results.
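As a concrete, hypothetical instance of such a bound: decomposing an `alldifferent` constraint over n search variables into binary disequalities yields n(n-1)/2 constraints, a closed-form function of the instance size in exactly the spirit of the inferred bounds (this example is ours, not taken from the paper):

```python
def alldifferent_disequalities(n):
    """Upper bound on the binary disequality constraints produced by
    decomposing alldifferent over n variables: n*(n-1)//2."""
    return n * (n - 1) // 2

# The bound is a function of the data size, evaluable before solving:
bounds = {n: alldifferent_disequalities(n) for n in (4, 8, 16)}
```

Knowing such bounds before calling the solver lets one predict model size, and thus memory and setup cost, from the instance data alone.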

Abstract:

The term "Logic Programming" refers to a variety of computer languages and execution models based on the traditional concept of Symbolic Logic. The expressive power of these languages promises to be of great assistance in facing the programming challenges of present and future symbolic processing applications in Artificial Intelligence, Knowledge-based systems, and many other areas of computing. The sequential execution speed of logic programs has been greatly improved since the advent of the first interpreters. However, higher inference speeds are still required in order to meet the demands of applications such as those contemplated for next-generation computer systems. The execution of logic programs in parallel is currently considered a promising strategy for attaining such inference speeds. Logic Programming in turn appears to be a suitable programming paradigm for parallel architectures because of the many opportunities for parallel execution present in the implementation of logic programs. This dissertation presents an efficient parallel execution model for logic programs. The model is described from the source language level down to an "Abstract Machine" level suitable for direct implementation on existing parallel systems or for the design of special-purpose parallel architectures. Few assumptions are made at the source language level, and therefore the techniques developed and the general Abstract Machine design are applicable to a variety of logic (and also functional) languages. These techniques offer efficient solutions to several areas of parallel Logic Programming implementation previously considered problematic or a source of considerable overhead, such as the detection and handling of variable binding conflicts in AND-Parallelism, the specification of control and management of the execution tree, the treatment of distributed backtracking, and goal scheduling and memory management issues.
A parallel Abstract Machine design is offered, specifying data areas, operation, and a suitable instruction set. This design is based on extending to a parallel environment the techniques introduced by the Warren Abstract Machine, which have already made very fast and space-efficient sequential systems a reality. Therefore, the model herein presented is capable of retaining sequential execution speed similar to that of high-performance sequential systems, while extracting additional gains in speed by efficiently implementing parallel execution. These claims are supported by simulations of the Abstract Machine on sample programs.
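One of the problems mentioned, detecting variable binding conflicts for AND-parallelism, reduces under strict independence to checking that two goals share no unbound variables at run time. The sketch below abstracts each goal as the set of variables it mentions, a simplification of the real compile-time/run-time analysis:

```python
def free_vars(goal_vars, bindings):
    """Variables of a goal that are still unbound under the bindings."""
    return {v for v in goal_vars if v not in bindings}

def independent(goal_a, goal_b, bindings):
    """Strict independence: two goals may run in AND-parallel if they
    share no unbound variables, so neither can bind the other's terms."""
    return not (free_vars(goal_a, bindings) & free_vars(goal_b, bindings))

# p(X, Y) and q(Y, Z) share Y, so they conflict while Y is unbound,
# but become independent once Y has been bound to a ground term:
conflict_free = independent({"X", "Y"}, {"Y", "Z"}, bindings={})
after_binding = independent({"X", "Y"}, {"Y", "Z"}, bindings={"Y": 1})
```

Practical systems compile such checks into cheap run-time tests guarding the parallel conjunctions, which is one of the overhead sources the dissertation addresses.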

Abstract:

There is evidence that the climate is changing and that the change is now influenced and accelerated by the increase of CO2 in the atmosphere due to human combustion. Such "climate change" is on the policy agenda at the global level, with the aim of understanding and reducing its causes and mitigating its consequences. In most countries and international organisms (UNO, e.g. Rio de Janeiro 1992; OECD; EC; etc.), efforts and debates have been directed at understanding the possible causes, predicting the future evolution of some conditioning variables, and carrying out studies to fight the effects or to delay their negative evolution. The Kyoto Protocol of 1997 set international commitments on CO2 emissions, but it was partial and not followed by, for example, the USA and China, and in Durban in 2011 the ineffectiveness of humanity in the face of such global challenges became evident. Amid all this, the elaboration of a global model that can help to choose the best among the feasible alternatives, to elaborate strategies and to evaluate costs has not been tackled, and the authors propose to enter that frame of study. As in all natural, technological and social changes, the best-prepared countries will bear the change best and recover most rapidly. The best alternative will not be the same in every geographic area, but the model must help us make the appropriate decision in each. It is essential to know which areas are most sensitive to the negative effects of climate change, the parameters to take into account for its evaluation, and the comprehensive plans to deal with it. The objective of this paper is to elaborate a mathematical decision-support model that will make it possible to develop and evaluate alternatives for adaptation to climate change in different communities in Europe and Latin America, mainly in areas especially vulnerable to climate change, considering all the intervening factors.
The models will consider criteria of a physical type (meteorological, edaphic, water resources), of land use (agricultural, forest, mining, industrial, urban, tourist, livestock), economic (income, costs, benefits, infrastructures), social (population), political (implementation, legislation), educational (educational programs, diffusion) and environmental, both at the present moment and in the future. The intention is to obtain tools that help to reach a realistic position on these challenges, which are an important part of the problems humanity will face in the coming decades.

Abstract:

Climate change is on the policy agenda at the global level, with the aim of understanding and reducing its causes and mitigating its consequences. In most countries and international organisms (UNO, OECD, EC, etc.), efforts and debates have been directed at understanding the possible causes, predicting the future evolution of some conditioning variables, and carrying out studies to fight the effects or to delay their negative evolution. Nevertheless, the elaboration of a global model that can help to choose the best among the feasible alternatives, to elaborate strategies and to evaluate costs has not been tackled. As in all natural, technological and social changes, the best-prepared countries will bear the change best and recover most rapidly. The best alternative will not be the same in every geographic area, but the model should help us make the appropriate decision in each. It is essential to know which areas are most sensitive to the negative effects of climate change, the parameters to take into account for its evaluation, and the comprehensive plans to deal with it. The objective of this paper is to elaborate a mathematical decision-support model that will make it possible to develop and evaluate alternatives for adaptation to climate change in different communities in Europe and Latin America, mainly in areas vulnerable to climate change, considering all the intervening factors. The models will take into consideration criteria of a physical type (meteorological, edaphic, water resources), of land use (agricultural, forest, mining, industrial, urban, tourist, livestock), economic (income, costs, benefits, infrastructures), social (population), political (implementation, legislation), educational (educational programs, diffusion), health-related and environmental, both at the present moment and in the future.

Abstract:

This work sets out an innovative methodology that aims to facilitate the implementation and continuous improvement of Social Responsibility. It is a methodology that takes account of strategic-economic, social and environmental questions and allows the impact of each of these aspects on the stakeholders and on each of the value areas to be measured. It can be extrapolated to all kinds of organisations regardless of their size and sector, and it admits scalable models. A marked feature that sets it apart from other methodologies is that it eliminates subjectivity from the qualitative aspects and introduces an algorithm to quantify them.

Abstract:

Stochastic model updating must be considered in order to quantify the uncertainties inherently existing in real-world engineering structures. By this means the statistical properties of structural parameters, instead of deterministic values, can be sought, indicating the parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly in terms of theoretical complexity and computational cost. This study proposes a simple and cost-efficient method that decomposes a stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplification, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted to generate samples from the assumed or measured probability distributions of the responses. Each sample corresponds to an individual deterministic inverse process that predicts deterministic values of the parameters. The parameter means and variances can then be statistically estimated from the parameter predictions over all the samples. Meanwhile, the analysis-of-variance approach is employed to evaluate the significance of parameter variability. The proposed method has been demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. It is found that, compared with existing stochastic model updating methods, the proposed method presents similar accuracy, while its primary merits are its simple implementation and its cost efficiency in response computation and inverse optimization.
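The decomposition can be sketched end to end in a few lines: a polynomial response surface stands in for the forward model, Monte Carlo samples of the measured response each trigger one deterministic inverse search, and the parameter statistics come from the resulting predictions. The "FE model" here is a toy one-parameter frequency function, purely illustrative of the workflow:

```python
import numpy as np

# Stand-in for a forward FE model: natural frequency as a function of
# a stiffness-like parameter k (toy relation, not a real structure)
forward = lambda k: np.sqrt(k) / (2 * np.pi)

# 1) Fit a cubic response surface as a surrogate of the forward model
k_design = np.linspace(0.5, 2.0, 9)
surrogate = np.poly1d(np.polyfit(k_design, forward(k_design), 3))

# 2) Monte Carlo samples of the measured response (assumed distribution)
rng = np.random.default_rng(0)
f_samples = rng.normal(loc=forward(1.2), scale=0.002, size=2000)

# 3) Each sample -> one deterministic inverse problem on the cheap surrogate
k_grid = np.linspace(0.5, 2.0, 2001)
f_grid = surrogate(k_grid)
k_est = np.array([k_grid[np.argmin(abs(f_grid - f))] for f in f_samples])

k_mean, k_std = k_est.mean(), k_est.std()   # statistics of the parameter
```

Each inverse step touches only the surrogate, never the expensive FE model, which is exactly where the claimed cost efficiency comes from; a real application would replace the grid search with a proper optimizer and handle several parameters jointly.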