883 results for Simulation Based Method
Abstract:
In this work, we present a 3D web-based interactive tool for the numerical modeling and simulation of breast reduction surgery, intended to assist surgeons in planning all aspects of the surgery before the actual procedure takes place, thereby avoiding unnecessary risks. In particular, it allows the modeling of the initial breast geometry, the definition of all aspects related to the surgery, and the visualization of the post-surgery breast shape in a realistic environment.
Abstract:
Doctoral thesis in Molecular and Environmental Biology (specialization in Cell Biology and Health).
Abstract:
Modular modelling, dynamics simulation, multibodies, O(N) method, closed loops, post-stabilization
Abstract:
Recently there has been a renewed research interest in the properties of non-survey updates of input-output tables and social accounting matrices (SAM). Along with the venerable and well-known RAS scaling method, several alternative new procedures related to entropy minimization and other metrics have been suggested, tested and used in the literature. Whether these procedures will eventually substitute for or merely complement the RAS approach is still an open question. The performance of many of the updating procedures has been tested using some kind of proximity or closeness measure to a reference input-output table or SAM. The first goal of this paper, in contrast, is to propose checking the operational performance of updating mechanisms by comparing the simulation results that ensue from adopting alternative databases for calibration of a reference applied general equilibrium model. The second goal is to introduce a new updating procedure based on information retrieval principles. This new procedure is then compared, as far as performance is concerned, to two well-known updating approaches: RAS and cross-entropy. The rationale for the suggested cross-validation is that the driving force for having more up-to-date databases is to be able to conduct more current, and hopefully more credible, policy analyses.
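For readers unfamiliar with the RAS approach mentioned in this abstract, the following is a minimal sketch (not the authors' implementation) of classical biproportional RAS scaling: a prior table `A0` is iteratively rescaled to match assumed new row totals `u` and column totals `v`.

```python
import numpy as np

def ras_update(A0, u, v, tol=1e-9, max_iter=1000):
    """Biproportional (RAS) update of a prior matrix A0 to new row totals u
    and column totals v (assumed consistent: u.sum() == v.sum())."""
    A = A0.astype(float).copy()
    for _ in range(max_iter):
        r = u / A.sum(axis=1)          # row scaling factors
        A = A * r[:, None]
        s = v / A.sum(axis=0)          # column scaling factors
        A = A * s[None, :]
        if np.allclose(A.sum(axis=1), u, atol=tol):
            break
    return A

# Hypothetical 3x3 prior table and new margins (illustrative numbers only)
A0 = np.array([[10., 5., 2.], [4., 8., 3.], [6., 1., 9.]])
u = np.array([20., 18., 17.])   # target row sums
v = np.array([22., 15., 18.])   # target column sums
print(ras_update(A0, u, v))
```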
Abstract:
Computational modeling has become a widely used tool for unraveling the mechanisms of higher-level cooperative cell behavior during vascular morphogenesis. However, experimenting with published simulation models or adding new assumptions to those models can be daunting for novice and even experienced computational scientists. Here, we present a step-by-step, practical tutorial for building cell-based simulations of vascular morphogenesis using the Tissue Simulation Toolkit (TST). The TST is a freely available, open-source C++ library for developing simulations with the two-dimensional cellular Potts model, a stochastic, agent-based framework to simulate collective cell behavior. We will show the basic use of the TST to simulate and experiment with published simulations of vascular network formation. Then, we will present step-by-step instructions and explanations for building a recent simulation model of tumor angiogenesis. Demonstrated mechanisms include cell-cell adhesion, chemotaxis, cell elongation, haptotaxis, and haptokinesis.
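The TST itself is a C++ library; purely to illustrate the cellular Potts dynamics it implements, here is a minimal, self-contained Python sketch of Metropolis copy attempts with a single adhesion penalty and an area constraint. Parameter names such as `J` and `lam` and all values are illustrative, not the TST API.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 50                 # lattice size
T = 10.0               # simulation "temperature"
J = 2.0                # adhesion penalty between unlike spins (simplified: one J for all pairs)
lam = 1.0              # area-constraint strength
target_area = 25       # preferred cell area (illustrative)

# Lattice of cell IDs: 0 = medium, 1..4 = cells seeded as 5x5 squares
grid = np.zeros((L, L), dtype=int)
for cid, (r, c) in enumerate([(10, 10), (10, 30), (30, 10), (30, 30)], start=1):
    grid[r:r + 5, c:c + 5] = cid

NEIGHBORS = ((1, 0), (-1, 0), (0, 1), (0, -1))

def adhesion_energy(grid, r, c, spin):
    """Adhesion energy of site (r, c) if it held the given spin."""
    e = 0.0
    for dr, dc in NEIGHBORS:
        if grid[(r + dr) % L, (c + dc) % L] != spin:
            e += J
    return e

def copy_attempt(grid):
    """One Metropolis copy attempt of the cellular Potts model."""
    r, c = rng.integers(L, size=2)
    dr, dc = NEIGHBORS[rng.integers(4)]
    src = grid[(r + dr) % L, (c + dc) % L]   # spin to copy in
    dst = grid[r, c]                          # current spin at the target site
    if src == dst:
        return
    # Energy change: adhesion term plus area-constraint term for both cells
    dH = adhesion_energy(grid, r, c, src) - adhesion_energy(grid, r, c, dst)
    for cid, delta in ((src, +1), (dst, -1)):
        if cid != 0:                          # medium has no area constraint
            a = np.sum(grid == cid)
            dH += lam * ((a + delta - target_area) ** 2 - (a - target_area) ** 2)
    if dH <= 0 or rng.random() < np.exp(-dH / T):
        grid[r, c] = src                      # accept the copy

for _ in range(20000):                        # a short relaxation run
    copy_attempt(grid)
print("cell areas:", [int(np.sum(grid == cid)) for cid in range(1, 5)])
```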
Abstract:
BACKGROUND AND STUDY AIMS: Various screening methods for colorectal cancer (CRC) are promoted by professional societies; however, few data are available about the factors that determine patient participation in screening, which is crucial to the success of population-based programs. This study aimed (i) to identify factors that determine acceptance of screening and preference of screening method, and (ii) to evaluate procedure success, detection of colorectal neoplasia, and patient satisfaction with screening colonoscopy. PATIENTS AND METHODS: Following a public awareness campaign, the population aged 50 - 80 years was offered CRC screening in the form of annual fecal occult blood tests, flexible sigmoidoscopy, a combination of both, or colonoscopy. RESULTS: 2731 asymptomatic persons (12.0 % of the target population) registered with and were eligible to take part in the screening program. Access to information and a positive attitude to screening were major determinants of participation. Colonoscopy was the method preferred by 74.8 % of participants. Advanced colorectal neoplasia was present in 8.5 %; its prevalence was higher in males and increased with age. Significant complications occurred in 0.5 % of those undergoing colonoscopy and were associated with polypectomy or sedation. Most patients were satisfied with colonoscopy and over 90 % would choose it again for CRC screening. CONCLUSIONS: In this population-based study, only a small proportion of the target population underwent CRC screening despite an extensive information campaign. Colonoscopy was the preferred method and was safe. The determinants of participation in screening and preference of screening method, together with the distribution of colorectal neoplasia in different demographic categories, provide a rationale for improving screening procedures.
Abstract:
Nowadays, many health care systems are large, complex and highly dynamic environments, in particular Emergency Departments (EDs). An ED operates 24 hours a day throughout the year with limited resources and is frequently overcrowded. Simulating EDs is therefore necessary to improve their performance both qualitatively and quantitatively. This improvement can be achieved by modelling and simulating EDs with an Agent-Based Model (ABM) and optimising many different staff scenarios. This work optimises the staff configuration of an ED. To perform the optimisation, objective functions to minimise or maximise have to be set; one of them is to find the optimum staff configuration that minimises patient waiting time. The staff configuration comprises doctors, triage nurses and admission staff, specifying both their number and type. Finding the staff configuration is a combinatorial problem that can take a long time to solve. HPC is used to run the experiments, and encouraging results were obtained. However, even with the basic ED used in this work the search space is very large; as the problem size increases, more processing resources will be needed to obtain results in an acceptable time.
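To give a flavour of the combinatorial search over staff configurations described above, here is a minimal sketch. It substitutes a crude M/M/c (Erlang C) waiting-time formula for the agent-based ED simulator, and all arrival and service rates are illustrative assumptions, not data from the study.

```python
import itertools
import math

def erlang_c_wait(arrival_rate, service_rate, servers):
    """Mean waiting time in an M/M/c queue (Erlang C); a crude stand-in
    for evaluating one staff configuration with the ED simulator."""
    rho = arrival_rate / (servers * service_rate)
    if rho >= 1.0:
        return float("inf")                 # unstable: queue grows without bound
    a = arrival_rate / service_rate
    s = sum(a**k / math.factorial(k) for k in range(servers))
    p_wait = (a**servers / math.factorial(servers)) / \
             (s * (1 - rho) + a**servers / math.factorial(servers))
    return p_wait / (servers * service_rate - arrival_rate)

# Illustrative patient flow (patients/hour) and per-server service rates
stations = {
    "admissions":   (12.0, 6.0),
    "triage_nurse": (12.0, 5.0),
    "doctor":       (10.0, 3.0),
}

best = None
for staff in itertools.product(range(1, 6), repeat=3):   # 1..5 of each staff type
    config = dict(zip(stations, staff))
    wait = sum(erlang_c_wait(lmbda, mu, config[name])
               for name, (lmbda, mu) in stations.items())
    if best is None or wait < best[1]:
        best = (config, wait)

print("best configuration:", best[0], "total mean wait (h): %.3f" % best[1])
```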
Abstract:
In this paper the iterative MSFV method is extended to include the sequential implicit simulation of time-dependent problems involving the solution of a system of pressure-saturation equations. To control numerical errors in simulation results, an error estimate based on the residual of the MSFV approximate pressure field is introduced. In the initial time steps of the simulation, iterations are employed until a specified accuracy in pressure is achieved. This initial solution is then used to improve the localization assumption at later time steps. Additional iterations in the pressure solution are employed only when the pressure residual becomes larger than a specified threshold value. The efficiency of the strategy and the error control criteria are numerically investigated. This paper also shows that it is possible to derive an a priori estimate and control, based on the allowed pressure-equation residual, to guarantee the desired accuracy in the saturation calculation.
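The residual-based control described above can be illustrated, in heavily simplified form, by the following sketch: pressure is re-iterated only when the relative residual of a (here generic, not MSFV) linear pressure system exceeds a threshold. The operator, tolerances and right-hand sides are illustrative.

```python
import numpy as np

def iterate_pressure(A, b, p, tol, max_iter=200):
    """Conjugate-gradient iterations on A p = b, stopped by the relative
    residual (a stand-in for the MSFV pressure iterations)."""
    r = b - A @ p
    d = r.copy()
    bnorm = np.linalg.norm(b)
    for _ in range(max_iter):
        if np.linalg.norm(r) / bnorm <= tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        p = p + alpha * d
        r_new = r - alpha * Ad
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return p

# Toy 1D "pressure" problem: tridiagonal diffusion operator
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
p = np.zeros(n)

threshold = 1e-3                       # allowed pressure-equation residual
for step in range(10):                 # sequential implicit time steps
    b = np.random.rand(n)              # right-hand side changes as saturation evolves
    if np.linalg.norm(b - A @ p) / np.linalg.norm(b) > threshold:
        p = iterate_pressure(A, b, p, threshold)   # re-iterate pressure only when needed
    # ... saturation would be updated here using the current pressure field ...
```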
Abstract:
The trabecular bone score (TBS) is a gray-level textural metric that can be extracted from the two-dimensional lumbar spine dual-energy X-ray absorptiometry (DXA) image. TBS is related to bone microarchitecture and provides skeletal information that is not captured from the standard bone mineral density (BMD) measurement. Based on experimental variograms of the projected DXA image, TBS has the potential to discern differences between DXA scans that show similar BMD measurements. An elevated TBS value correlates with better skeletal microstructure; a low TBS value correlates with weaker skeletal microstructure. Lumbar spine TBS has been evaluated in cross-sectional and longitudinal studies. The following conclusions are based upon publications reviewed in this article: 1) TBS gives lower values in postmenopausal women and in men with previous fragility fractures than in their nonfractured counterparts; 2) TBS is complementary to data available by lumbar spine DXA measurements; 3) TBS results are lower in women who have sustained a fragility fracture but in whom DXA does not indicate osteoporosis or even osteopenia; 4) TBS predicts fracture risk as well as lumbar spine BMD measurements in postmenopausal women; 5) efficacious therapies for osteoporosis differ in the extent to which they influence the TBS; 6) TBS is associated with fracture risk in individuals with conditions related to reduced bone mass or bone quality. Based on these data, lumbar spine TBS holds promise as an emerging technology that could well become a valuable clinical tool in the diagnosis of osteoporosis and in fracture risk assessment. © 2014 American Society for Bone and Mineral Research.
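Purely to illustrate the experimental-variogram idea on which TBS is built (and not the proprietary TBS algorithm itself), a sketch of an isotropic experimental variogram of a 2D gray-level image, with a slope-near-origin summary, might look like this; the synthetic texture and lag range are assumptions.

```python
import numpy as np

def experimental_variogram(img, max_lag=10):
    """Isotropic experimental variogram of a 2D gray-level image:
    gamma(h) = mean of 0.5 * (I(x) - I(x+h))^2 over horizontal and
    vertical pixel pairs at lag h (a simplified, illustrative variant)."""
    gamma = []
    for h in range(1, max_lag + 1):
        dx = img[:, h:] - img[:, :-h]          # horizontal pairs at lag h
        dy = img[h:, :] - img[:-h, :]          # vertical pairs at lag h
        gamma.append(0.5 * np.mean(np.concatenate([dx.ravel(), dy.ravel()]) ** 2))
    return np.arange(1, max_lag + 1), np.array(gamma)

# Synthetic "DXA-like" texture: noise with a little added spatial correlation
rng = np.random.default_rng(1)
img = rng.normal(size=(128, 128))
img = img + 0.5 * np.roll(img, 1, axis=0)

lags, gamma = experimental_variogram(img)
# TBS-like summary: slope of the log-log variogram at small lags (illustrative)
slope = np.polyfit(np.log(lags[:5]), np.log(gamma[:5]), 1)[0]
print("log-log variogram slope near the origin: %.3f" % slope)
```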
Abstract:
In networks with small buffers, such as optical packet switching (OPS) based networks, the convolution approach is presented as one of the most accurate methods used for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic and bufferless networks, the enhanced convolution approach is a good solution. However, both methods (CA and ECA) present a high computational cost for a high number of connections. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost compared to the enhanced convolution approach, with a small stochastic error in the probability estimation.
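To illustrate the Monte Carlo flavour of such mechanisms (this is a generic sketch, not UMCA or ISCA themselves), the probability that the aggregate rate of heterogeneous on-off connections exceeds the link capacity can be estimated by sampling instead of convolving per-connection distributions. All rates, on-probabilities and the QoS target below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Heterogeneous on-off connections: (peak rate in Mb/s, probability of being "on")
connections = [(10.0, 0.3)] * 40 + [(2.0, 0.6)] * 100 + [(50.0, 0.05)] * 5
link_capacity = 300.0                     # Mb/s (illustrative)

def overflow_probability_mc(connections, capacity, n_samples=200_000):
    """Monte Carlo estimate of P(aggregate instantaneous rate > capacity)
    for independent on-off sources, as a bufferless admission criterion."""
    peaks = np.array([p for p, _ in connections])
    probs = np.array([q for _, q in connections])
    on = rng.random((n_samples, len(connections))) < probs   # which sources are on
    load = on.astype(float) @ peaks                           # aggregate rate per sample
    return np.mean(load > capacity)

p_overflow = overflow_probability_mc(connections, link_capacity)
target = 1e-3                             # illustrative QoS target
print("estimated overflow probability: %.4g" % p_overflow)
print("admit new connection" if p_overflow < target else "reject new connection")
```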
Abstract:
Background: Hirschsprung's disease (HSCR) is a congenital malformation of the enteric nervous system due to the arrest of migration of the neural crest cells that form the myenteric and submucosal plexuses. It leads to an aganglionic intestinal segment, which is permanently contracted and causes intestinal obstruction. Its incidence is approximately 1/5000 births, and males are more frequently affected, with a male/female ratio of 4/1. The diagnosis is in most cases made within the first year of life. Rectal biopsy of the mucosa and submucosa is the diagnostic gold standard. Purpose: The aim of this study was to compare two surgical approaches for HSCR, the Duhamel technique and the transanal endorectal pull-through (TEPT), in terms of indications, duration of surgery, duration of hospital stay, postoperative treatment, complications, frequency of enterocolitis and functional outcomes. Methods: Fifty-nine patients were treated for HSCR by one of the two methods in our department of pediatric surgery between 1994 and 2010. These patients were separated into two groups (I: Duhamel, II: TEPT), which were compared on the basis of medical records; the two groups were compared statistically with the ANOVA test. The first group includes 43 patients and the second 16 patients. It is noteworthy that twenty-four patients (about 41% of all patients) were referred from abroad (Western Africa). Continence was evaluated with the Krickenbeck score. Results: Statistically, this study showed that operation duration, hospital stay, postoperative fasting and duration of postoperative antibiotics were significantly shorter (p value < 0.05) in group II (TEPT), whereas age at operation and length of the aganglionic segment showed no significant difference between the two groups. The continence follow-up showed generally good results (Krickenbeck scores 1; 2.1; 3.1) in both groups, with a slight tendency to constipation in group I and soiling in group II. Conclusion: We found two indications for the Duhamel method: referral from a country without careful postoperative surveillance and/or a previous colostomy. Even if the Duhamel technique tends to be replaced by the TEPT, it remains the best operative approach for some selected patients. TEPT has also proved to have some advantages but must be followed carefully, among other reasons because of the postoperative dilatations. Our postoperative standards, such as digital rectal examination and anal dilatations, seem to reduce the occurrence of complications such as rectal spur and anal/anastomotic stenosis in the Duhamel method and the TEPT technique, respectively.
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single and fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, uses Bayes' theorem, and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is, the number of contributors, and the actual value taken by N. Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
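As a toy illustration of the probabilistic strategy (a sketch under strong simplifying assumptions: a single locus, known allele frequencies, Hardy-Weinberg sampling, no drop-out or drop-in, and a flat prior), a posterior over the number of contributors N given the observed allele set can be approximated by simulation. The allele labels and frequencies below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative single-locus allele frequencies and the alleles seen in the stain
allele_freqs = {"a1": 0.4, "a2": 0.3, "a3": 0.2, "a4": 0.1}
observed = {"a1", "a3", "a4"}
alleles = list(allele_freqs)
probs = np.array([allele_freqs[a] for a in alleles])

def likelihood_by_simulation(n_contributors, n_sims=50_000):
    """Estimate P(observed allele set | N) by sampling 2N alleles per mixture
    (independent contributors, no drop-out or drop-in)."""
    draws = rng.choice(len(alleles), size=(n_sims, 2 * n_contributors), p=probs)
    hits = sum({alleles[i] for i in row} == observed for row in draws)
    return hits / n_sims

candidate_n = [1, 2, 3, 4]
prior = np.full(len(candidate_n), 1.0 / len(candidate_n))    # flat prior over N
lik = np.array([likelihood_by_simulation(n) for n in candidate_n])
posterior = prior * lik
posterior /= posterior.sum()
for n, p in zip(candidate_n, posterior):
    print(f"P(N = {n} | observed alleles) = {p:.3f}")
```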
Abstract:
In CoDaWork'05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model based solely on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set or onto missing samples in between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This new proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating the age of the unknown tephra. Key words: Tephrochronology; Segmented regression
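For readers unfamiliar with segmented regression, a minimal generic sketch (not the authors' calibration model) that fits a two-segment linear model by grid search over the breakpoint is:

```python
import numpy as np

def fit_segmented(x, y, n_grid=100):
    """Two-segment linear regression: grid-search the breakpoint and fit an
    ordinary least-squares line on each side, keeping the split with the
    lowest total residual sum of squares."""
    best = None
    for bp in np.linspace(np.quantile(x, 0.1), np.quantile(x, 0.9), n_grid):
        left, right = x <= bp, x > bp
        if left.sum() < 2 or right.sum() < 2:
            continue
        rss, fits = 0.0, []
        for mask in (left, right):
            coef = np.polyfit(x[mask], y[mask], 1)
            rss += np.sum((y[mask] - np.polyval(coef, x[mask])) ** 2)
            fits.append(coef)
        if best is None or rss < best[0]:
            best = (rss, bp, fits)
    return best

# Synthetic data with a change of slope at x = 5 (illustrative)
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 10, 200))
y = np.where(x <= 5, 1.0 + 0.5 * x, 3.5 + 2.0 * (x - 5)) + rng.normal(0, 0.2, 200)

rss, bp_hat, (left_fit, right_fit) = fit_segmented(x, y)
print(f"estimated breakpoint: {bp_hat:.2f}")
print("left slope %.2f, right slope %.2f" % (left_fit[0], right_fit[0]))
```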
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and discrimination of them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity reduced to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
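A generic sketch of the kind of pipeline described above (ICA-derived features feeding a classifier) is shown below; the synthetic signals, window length and classifier choice are illustrative assumptions and not the authors' algorithm.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)

# Synthetic 8-channel "EEG": mixed background sources plus occasional spike-like transients
n_channels, n_samples = 8, 20_000
mixing = rng.normal(size=(n_channels, 3))
sources = rng.normal(size=(3, n_samples))
spike_times = rng.choice(n_samples - 50, size=40, replace=False)
for t in spike_times:
    sources[0, t:t + 20] += 8.0 * np.hanning(20)        # sharp transient on source 0
eeg = mixing @ sources

# Unmix with ICA, then build per-window features from the component activations
ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(eeg.T).T                  # shape: (components, samples)

win = 100                                                # window length in samples
n_win = n_samples // win
feats = np.stack([
    np.abs(components[:, i * win:(i + 1) * win]).max(axis=1)   # per-component peak amplitude
    for i in range(n_win)
])
labels = np.array([np.any((spike_times >= i * win) & (spike_times < (i + 1) * win))
                   for i in range(n_win)]).astype(int)

clf = LogisticRegression(max_iter=1000).fit(feats, labels)
print("training accuracy: %.2f" % clf.score(feats, labels))
```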