956 results for Computer models
Abstract:
PHWAT is a new model that couples a geochemical reaction model (PHREEQC-2) with a density-dependent groundwater flow and solute transport model (SEAWAT) using the split-operator approach. PHWAT was developed to simulate multi-component reactive transport in variable-density groundwater flow. Fluid density in PHWAT depends not only on the concentration of a single species, as in SEAWAT, but also on the concentrations of other dissolved chemicals that can be subject to reactive processes. Simulation results of PHWAT and PHREEQC-2 were compared in their predictions of effluent concentration from a column experiment. Both models produced identical results, showing that PHWAT has correctly coupled the sub-packages. PHWAT was then applied to the simulation of a tank experiment in which seawater intrusion was accompanied by cation exchange. The density dependence of the intrusion and the snow-plough effect in the breakthrough curves were reflected in the model simulations, which were in good agreement with the measured breakthrough data. Comparison simulations that excluded, in turn, density effects and reactions allowed us to quantify the marked effect of ignoring these processes. Next, we explored numerical issues involved in the practical application of PHWAT using the example of a dense plume flowing into a tank containing fresh water. It was shown that PHWAT could model physically unstable flow and that numerical instabilities were suppressed. Physical instability developed in the model in accordance with the increase of the modified Rayleigh number for density-dependent flow, in agreement with previous research. (c) 2004 Elsevier Ltd. All rights reserved.
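The split-operator coupling described in this abstract can be illustrated in miniature: each time step applies a transport operator and then hands the updated concentrations to a reaction operator. The Python sketch below substitutes a simple explicit upwind advection step for SEAWAT's flow/transport solver and exact first-order decay for PHREEQC-2's geochemistry; all function names, grid sizes, and rate constants are illustrative assumptions, not taken from the paper.

```python
import math

def transport_step(c, c_in, velocity, dx, dt):
    """Explicit upwind advection with a fixed inlet concentration c_in."""
    cfl = velocity * dt / dx
    assert cfl <= 1.0, "CFL condition violated"
    new_c = c[:]
    new_c[0] = c[0] + cfl * (c_in - c[0])
    for i in range(1, len(c)):
        new_c[i] = c[i] + cfl * (c[i - 1] - c[i])
    return new_c

def reaction_step(c, rate, dt):
    """First-order decay, solved exactly over the step (a stand-in for
    the geochemical sub-model)."""
    decay = math.exp(-rate * dt)
    return [ci * decay for ci in c]

def simulate(n_cells=50, n_steps=200, c_in=1.0, velocity=1.0,
             dx=1.0, dt=0.5, rate=0.05):
    """Split-operator loop: transport first, then reaction, each time step."""
    c = [0.0] * n_cells
    for _ in range(n_steps):
        c = transport_step(c, c_in, velocity, dx, dt)
        c = reaction_step(c, rate, dt)
    return c

profile = simulate()  # quasi-steady concentration profile along the column
```

Because the operators are applied sequentially within each step, the splitting introduces an error that shrinks with the step size, which is the usual trade-off of the approach.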
Abstract:
Little consensus exists in the literature regarding methods for determination of the onset of electromyographic (EMG) activity. The aim of this study was to compare the relative accuracy of a range of computer-based techniques with respect to EMG onset determined visually by an experienced examiner. Twenty-seven methods were compared, which varied in terms of EMG processing (low-pass filtering at 10, 50 and 500 Hz), threshold value (1, 2 and 3 SD beyond the mean of baseline activity) and the number of samples for which the mean must exceed the defined threshold (20, 50 and 100 ms). Three hundred randomly selected trials of a postural task were evaluated using each technique. The visual determination of EMG onset was found to be highly repeatable between days. Linear regression equations were calculated for the values selected by each computer method, which indicated that the onset values selected by the majority of the parameter combinations deviated significantly from the visually derived onset values. Several methods accurately selected the time of onset of EMG activity and are recommended for future use. Copyright (C) 1996 Elsevier Science Ireland Ltd.
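One member of the method family compared in this abstract (threshold at baseline mean plus k standard deviations, with the mean over a window of samples required to exceed it) can be sketched as follows. The example rectifies the signal rather than applying the paper's low-pass filtering, and the synthetic trial, window length, and threshold multiplier are illustrative assumptions only.

```python
import random

def detect_onset(signal, baseline_len=200, k_sd=2.0, window=25):
    """Return the first index i at which the mean of the rectified signal
    over samples [i, i + window) exceeds baseline mean + k_sd * SD."""
    rect = [abs(s) for s in signal]
    base = rect[:baseline_len]
    mu = sum(base) / len(base)
    sd = (sum((b - mu) ** 2 for b in base) / len(base)) ** 0.5
    threshold = mu + k_sd * sd
    for i in range(baseline_len, len(rect) - window):
        if sum(rect[i:i + window]) / window > threshold:
            return i
    return None

# Synthetic trial: 500 samples of quiet baseline, then a burst of activity.
random.seed(1)
trial = ([random.gauss(0, 1) for _ in range(500)] +
         [random.gauss(0, 5) for _ in range(500)])
onset = detect_onset(trial)  # close to the true onset at sample 500
```

Because the detection window spans the transition, the reported index can slightly precede the true onset, which is one reason the paper finds systematic deviations between parameter combinations.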
Abstract:
In the present study, the authors sought to determine whether the efficiency and cost-effectiveness of cognitive-behavioral treatment (CBT) for panic disorder could be improved by adjunctive computer-assisted therapy. Eighteen participants who met Diagnostic and Statistical Manual of Mental Disorders (3rd ed., revised; American Psychiatric Association, 1987) criteria for panic disorder were randomly assigned to a 12-session CBT (CBT12) condition (D. H. Barlow & M. G. Craske, 1989) or to a 4-session computer-assisted CBT (CBT4-CA) condition. Palmtop computers, with a program developed to incorporate basic principles of CBT, were used by CBT4-CA clients whenever they felt anxious or wanted to practice the therapy techniques and were used by all participants as a momentary assessment tool. CBT4-CA clients carried the computer at all times and continued to use it for 8 weeks after termination of therapy. Analyses of clinically significant change showed superiority of CBT12 at posttest on some measures; however, there were no differences at follow-up.
Abstract:
The dynamic response of dry masonry columns can be approximated with finite-difference equations. Continuum models follow by replacing the difference quotients of the discrete model by corresponding differential expressions. The mathematically simplest of these models is a one-dimensional Cosserat theory. Within the presented homogenization context, the Cosserat theory is obtained by making ad hoc assumptions regarding the relative importance of certain terms in the differential expansions. The quality of approximation of the various theories is tested by comparison of the dispersion relations for bending waves with the dispersion relation of the discrete theory. All theories coincide, with differences of less than 1%, for wavelength-to-block-height (L/h) ratios greater than 2 pi. The theory based on systematic differential approximation remains accurate down to L/h = 3 and then diverges rapidly. The Cosserat model becomes increasingly inaccurate for L/h < 2 pi. However, in contrast to the systematic approximation, the wave speed remains finite. In conclusion, considering its relative simplicity, the Cosserat model appears to be the natural starting point for the development of continuum models for blocky structures.
Abstract:
When linear equality constraints are invariant through time they can be incorporated into estimation by restricted least squares. If, however, the constraints are time-varying, this standard methodology cannot be applied. In this paper we show how to incorporate linear time-varying constraints into the estimation of econometric models. The method involves the augmentation of the observation equation of a state-space model prior to estimation by the Kalman filter. Numerical optimisation routines are used for the estimation. A simple example drawn from demand analysis is used to illustrate the method and its application.
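The augmentation idea can be sketched by treating the time-varying constraint as an extra, near-noiseless row appended to the observation equation of a random-walk-coefficient state-space model, which the Kalman filter then enforces at each time step. Everything below (two coefficients, the constraint b1 + b2 = d_t, the noise variances) is an illustrative assumption, not the paper's specification.

```python
import random

def mm(A, B):  # 2x2 matrix product
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def tp(A):  # 2x2 transpose
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

def inv(A):  # 2x2 inverse
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    return [[A[1][1]/det, -A[0][1]/det], [-A[1][0]/det, A[0][0]/det]]

def kalman_constrained(data, q=1e-4, sigma2=0.04, eps=1e-8):
    """Filter y_t = x_t' beta_t + e_t with random-walk beta_t, where the
    time-varying constraint beta1 + beta2 = d_t is appended to the
    observation equation as a near-noiseless second row."""
    beta = [0.0, 0.0]
    P = [[1.0, 0.0], [0.0, 1.0]]
    for x1, x2, y, d in data:
        # Predict: random-walk state, so only the covariance grows.
        P = [[P[0][0] + q, P[0][1]], [P[1][0], P[1][1] + q]]
        H = [[x1, x2], [1.0, 1.0]]  # second row is the constraint
        S = mm(mm(H, P), tp(H))
        S[0][0] += sigma2           # measurement noise
        S[1][1] += eps              # (almost) exact constraint
        K = mm(mm(P, tp(H)), inv(S))
        r0 = y - (x1 * beta[0] + x2 * beta[1])
        r1 = d - (beta[0] + beta[1])
        beta = [beta[0] + K[0][0]*r0 + K[0][1]*r1,
                beta[1] + K[1][0]*r0 + K[1][1]*r1]
        KH = mm(K, H)
        P = mm([[1 - KH[0][0], -KH[0][1]], [-KH[1][0], 1 - KH[1][1]]], P)
    return beta

# Simulated data whose true coefficients drift while obeying b1 + b2 = d_t.
random.seed(0)
data = []
for t in range(200):
    d = 1.0 + 0.01 * t
    b1 = 0.5 + 0.004 * t
    b2 = d - b1
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    data.append((x1, x2, b1 * x1 + b2 * x2 + random.gauss(0, 0.2), d))

beta = kalman_constrained(data)  # final-period estimates
```

Restricted least squares cannot handle this case because the restriction changes every period; encoding it as a pseudo-observation lets the filter impose a different constraint at each t.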
Abstract:
The absence of considerations of technology in policy studies reinforces the popular notion that technology is a neutral tool. Through an analysis of the role played by computers in the policy processes of Australia's Department of Social Security, this paper argues that computers are political players in policy processes. Findings indicate that computers make aspects of the social domain knowable and therefore governable. The use of computers makes previously infeasible policies possible. Computers also operate as bureaucrats and as agents of client surveillance. Increased policy change, reduced discretion and increasingly targeted and complex policies can be attributed to the use of computer technology. If policy processes are to be adequately understood and analysed, then the role of technology in those processes must be considered.
Abstract:
Objective: The aim of this study was to test the effectiveness of various attitude-behavior theories in explaining alcohol use among young adults. The theory of reasoned action (TRA), the theory of planned behavior and an extension of the TRA that incorporates past behavior were compared by the method of maximum-likelihood estimation, as implemented in LISREL for Windows 8.12. Method: Respondents consisted of 122 university students (82 female) who were questioned about their attitudes, subjective norms, perceived behavioral control, past behavior and intentions relating to drinking behavior. Students received course credit for their participation in the research. Results: Overall, the results suggest that the extension of the theory of reasoned action which incorporates past behavior provides the best fit to the data. For these young adults, their intentions to drink alcohol were predicted by their past behavior as well as their perceptions of what important others think they should do (subjective norm). Conclusions: The main conclusions drawn from the research concern the importance of focusing on normative influences and past behavior in explaining young adult alcohol use. Issues regarding the relative merit of various alternative models and the need for greater clarity in the measure of attitudes are also discussed.
Abstract:
Sepsis remains a major cause of morbidity and mortality mainly because of sepsis-induced multiple organ dysfunction. In contrast to preclinical studies, most clinical trials of promising new treatment strategies for sepsis have failed to demonstrate efficacy. Although many reasons could account for this discrepancy, the misinterpretation of preclinical data obtained from experimental studies and especially the use of animal models that do not adequately mimic human sepsis may have been contributing factors. In this review, the potentials and limitations of various animal models of sepsis are discussed to clarify to what extent these findings are relevant to human sepsis. Such models include intravascular infusion of endotoxin or live bacteria, bacterial peritonitis, cecal ligation and perforation, soft tissue infection, pneumonia or meningitis models using different animal species including rats, mice, rabbits, dogs, pigs, sheep, and nonhuman primates. Despite several limitations, animal models remain essential in the development of all new therapies for sepsis and septic shock because they provide fundamental information about the pharmacokinetics, toxicity, and mechanism of drug action that cannot be replaced by other methods. New therapeutic agents should be studied in infection models, even after the initiation of the septic process. Furthermore, debility conditions need to be reproduced to avoid the exclusive use of healthy animals, which often do not represent the human septic patient.
Abstract:
This paper offers a defense of backwards in time causation models in quantum mechanics. Particular attention is given to Cramer's transactional account, which is shown to have the threefold virtue of solving the Bell problem, explaining the complex conjugate aspect of the quantum mechanical formalism, and explaining various quantum mysteries such as Schrodinger's cat. The question is therefore asked, why has this model not received more attention from physicists and philosophers? One objection given by physicists in assessing Cramer's theory was that it is not testable. This paper seeks to answer this concern by utilizing an argument that backwards causation models entail a fork theory of causal direction. From the backwards causation model together with the fork theory one can deduce empirical predictions. Finally, the objection that this strategy is questionable because of its appeal to philosophy is deflected.
Abstract:
Dengue has emerged as a frequent problem in international travelers. The risk depends on destination, duration, and season of travel. However, data to quantify the true risk for travelers to acquire dengue are lacking. We used mathematical models to estimate the risk that nonimmune persons will acquire dengue when traveling to Singapore. From the force of infection, we calculated the risk of dengue as a function of duration of stay and season of arrival. Our data highlight that the risk for nonimmune travelers to acquire dengue in Singapore is substantial but varies greatly with seasons and epidemic cycles. For instance, for a traveler who stays in Singapore for 1 week during the high dengue season in 2005, the risk of acquiring dengue was 0.17%, but it was only 0.00423% during the low season in a nonepidemic year such as 2002. Risk estimates based on mathematical modeling will help the travel medicine provider give better evidence-based advice for travelers to dengue endemic countries.
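The abstract does not give its exact equations, but the standard constant-hazard relation between a force of infection lambda and the per-trip risk, risk = 1 - exp(-lambda * days), reproduces the reported figures. The sketch below back-calculates an illustrative daily hazard from the stated 0.17% one-week risk for the 2005 high season; in the paper the force of infection varies with season and epidemic cycle, which this toy version ignores.

```python
import math

def travel_risk(foi_per_day, days):
    """Probability that a nonimmune traveler is infected during a stay,
    assuming a constant daily force of infection (hazard)."""
    return 1.0 - math.exp(-foi_per_day * days)

# Back-calculate an illustrative daily force of infection from the
# reported 0.17% risk for a 1-week stay in the 2005 high season.
foi_high_2005 = -math.log(1.0 - 0.0017) / 7.0

one_week = travel_risk(foi_high_2005, 7)     # recovers 0.17%
four_weeks = travel_risk(foi_high_2005, 28)  # longer stay, higher risk
```

Note that risk grows slightly sub-linearly with duration (1 - (1 - p)^n rather than n * p), which matters for long stays in high-transmission seasons.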
Abstract:
Introduction: Current advances in frame modeling and computer software allow stereotactic procedures to be performed with great accuracy and minimal risk of neural tissue or vascular injury. Case Report: In this report we associate a previously described minimally invasive stereotactic technique with state-of-the-art 3D computer guidance technology to successfully treat a 55-year-old patient with an arachnoidal cyst obstructing the aqueduct of Sylvius. We provide detailed technical information and discuss how this technique deals with previous limitations for stereotactic manipulation of the aqueductal region. We further discuss current advances in neuroendoscopy for treating obstructive hydrocephalus and make comparisons with our proposed technique. Conclusion: We advocate that this technique is not only capable of treating this pathology but also has the advantage of enabling reestablishment of physiological CSF flow, thus preventing future brainstem compression by cyst enlargement.
Abstract:
Purpose: The objective of this study is to evaluate blood glucose (BG) control efficacy and safety of 3 insulin protocols in medical intensive care unit (MICU) patients. Methods: This was a multicenter randomized controlled trial involving 167 MICU patients with at least one BG measurement >= 150 mg/dL and one or more of the following: mechanical ventilation, systemic inflammatory response syndrome, trauma, or burns. The interventions were computer-assisted insulin protocol (CAIP), with insulin infusion maintaining BG between 100 and 130 mg/dL; Leuven protocol, with insulin maintaining BG between 80 and 110 mg/dL; or conventional treatment-subcutaneous insulin if glucose > 150 mg/dL. The main efficacy outcome was the mean of patients' median BG, and the safety outcome was the incidence of hypoglycemia (<= 40 mg/dL). Results: The mean of patients' median BG was 125.0, 127.1, and 158.5 mg/dL for CAIP, Leuven, and conventional treatment, respectively (P = .34, CAIP vs Leuven; P < .001, CAIP vs conventional). In CAIP, 12 patients (21.4%) had at least one episode of hypoglycemia vs 24 (41.4%) in Leuven and 2 (3.8%) in conventional treatment (P = .02, CAIP vs Leuven; P = .006, CAIP vs conventional). Conclusions: The CAIP is safer than and as effective as the standard strict protocol for controlling glucose in MICU patients. Hypoglycemia was rare under conventional treatment. However, BG levels were higher than with IV insulin protocols. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Objective: The aim of this article is to propose an integrated framework for extracting and describing patterns of disorders from medical images using a combination of linear discriminant analysis and active contour models. Methods: A multivariate statistical methodology was first used to identify the most discriminating hyperplane separating two groups of images (from healthy controls and patients with schizophrenia) contained in the input data. After this, the present work makes explicit the differences found by the multivariate statistical method by subtracting the discriminant models of controls and patients, weighted by the pooled variance between the two groups. A variational level-set technique was used to segment clusters of these differences. We obtained a label for each anatomical change using the Talairach atlas. Results: In this work all the data was analysed simultaneously rather than assuming a priori regions of interest. As a consequence of this, by using active contour models, we were able to obtain regions of interest that were emergent from the data. The results were evaluated using, as gold standard, well-known facts about the neuroanatomical changes related to schizophrenia. Most of the items in the gold standard were covered in our result set. Conclusions: We argue that such investigation provides a suitable framework for characterising the high complexity of magnetic resonance images in schizophrenia as the results obtained indicate a high sensitivity rate with respect to the gold standard. (C) 2010 Elsevier B.V. All rights reserved.
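The first stage of such a framework, finding the most discriminating hyperplane between controls and patients, can be sketched with classical Fisher linear discriminant analysis on toy two-feature data; the real pipeline operates on high-dimensional image voxels and follows this with the pooled-variance-weighted difference map and level-set segmentation, which are omitted here. All data values below are invented for illustration.

```python
def mean(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def fisher_direction(group_a, group_b):
    """Fisher LDA direction w = Sw^-1 (mean_a - mean_b) for 2-D features."""
    ma, mb = mean(group_a), mean(group_b)
    s = [[0.0, 0.0], [0.0, 0.0]]  # pooled within-class scatter matrix
    for group, m in ((group_a, ma), (group_b, mb)):
        for v in group:
            d0, d1 = v[0] - m[0], v[1] - m[1]
            s[0][0] += d0 * d0
            s[0][1] += d0 * d1
            s[1][0] += d1 * d0
            s[1][1] += d1 * d1
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    diff = [ma[0] - mb[0], ma[1] - mb[1]]
    # Apply the 2x2 inverse of s to the mean difference.
    return [(s[1][1] * diff[0] - s[0][1] * diff[1]) / det,
            (-s[1][0] * diff[0] + s[0][0] * diff[1]) / det]

# Invented two-feature summaries of the two groups of images.
controls = [[1.0, 2.0], [1.2, 1.9], [0.9, 2.2], [1.1, 2.1]]
patients = [[2.0, 1.0], [2.2, 0.8], [1.9, 1.1], [2.1, 0.9]]
w = fisher_direction(controls, patients)
proj_c = [v[0] * w[0] + v[1] * w[1] for v in controls]
proj_p = [v[0] * w[0] + v[1] * w[1] for v in patients]
```

Projecting each subject onto w separates the two groups along a single axis; in the paper, the analogous discriminant information is what the subsequent difference map and active contours localize anatomically.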