985 results for Variability Model
Abstract:
Dissertation submitted in fulfilment of the requirements for the degree of Master in Environmental Engineering, specialization in Environmental Management and Systems
Abstract:
OBJECTIVE: To assess the effect of oscillatory breathing on the variability of RR intervals (V-RR) and its prognostic significance after one year of follow-up in subjects with left ventricular global systolic dysfunction. METHODS: We studied 76 subjects, aged 40 to 80 years and paired for age and gender, divided into two groups: group I, 34 healthy subjects; group II, 42 subjects with left ventricular global systolic dysfunction (ejection fraction < 0.40). ECG signals were acquired during 600 s in the supine position, and the variation of thoracic amplitude and the V-RR were analyzed. Clinical and V-RR variables were entered into a multivariate logistic model to predict survival after one year of follow-up. RESULTS: Oscillatory breathing was detected during wakefulness in 35.7% of subjects in group II, with a concentration of spectral power in the very low frequency band, and was independent of the presence of diabetes, functional class, ejection fraction, cause of ventricular dysfunction and survival after one year of follow-up. In the logistic regression model, ejection fraction was the only independent variable predicting survival. CONCLUSION: 1) The oscillatory breathing pattern is frequent during wakefulness in left ventricular global systolic dysfunction and concentrates spectral power in the very low frequency band of V-RR; 2) it is not related to the severity or cause of left ventricular dysfunction; 3) ejection fraction is the only independent predictive variable for survival in this group of subjects.
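For readers who want to see the shape of such an analysis, the following minimal Python sketch computes very-low-frequency spectral power from an evenly resampled R-R series and fits a multivariate logistic model for one-year survival. All data are synthetic, and the band limits, sampling rate and predictor set are illustrative assumptions, not values from the study.

```python
# Sketch: VLF power of an R-R series and a multivariate logistic survival model.
# Synthetic data only; band limits and predictors are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic, evenly resampled R-R series (s) over ~600 s at 4 Hz.
fs = 4.0
t = np.arange(0, 600, 1 / fs)
rr = 0.85 + 0.05 * np.sin(2 * np.pi * 0.02 * t) + 0.01 * rng.standard_normal(t.size)

# Spectral power in the very-low-frequency band (here taken as 0.003-0.04 Hz).
f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=1024)
vlf_mask = (f >= 0.003) & (f < 0.04)
vlf_power = np.sum(pxx[vlf_mask]) * (f[1] - f[0])
print(f"VLF power: {vlf_power:.3e} s^2")

# Multivariate logistic model: one-year survival vs. ejection fraction,
# oscillatory-breathing flag and VLF power (all synthetic here).
n = 76
X = np.column_stack([
    rng.uniform(0.20, 0.70, n),      # ejection fraction
    rng.integers(0, 2, n),           # oscillatory breathing (0/1)
    rng.lognormal(-7, 0.5, n),       # VLF power
])
y = (X[:, 0] + 0.05 * rng.standard_normal(n) > 0.40).astype(int)  # survival proxy
model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)
```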
Abstract:
The value given by commuters to the variability of travel times is empirically analysed using stated preference data from Barcelona (Spain). Respondents are asked to choose between alternatives that differ in terms of cost, average travel time, variability of travel times and departure time. Different specifications of a scheduling choice model are used to measure the influence of various socioeconomic characteristics. Our results show that travel time variability […]
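As an illustration of the modelling framework referred to above, the sketch below estimates a simplified conditional logit on synthetic scheduling-choice data with cost, mean travel time, travel-time variability and a schedule-delay attribute. The attribute set, coefficients and the willingness-to-pay style ratio are assumptions for illustration, not the paper's specification or results.

```python
# Sketch: a simplified conditional (multinomial) logit for a scheduling choice
# experiment. All data and coefficient values are synthetic and illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n_obs, n_alt, n_attr = 500, 3, 4  # choices, alternatives, attributes

# Attributes per alternative: cost, mean time, variability of time, schedule delay.
X = rng.uniform(0, 1, size=(n_obs, n_alt, n_attr))
true_beta = np.array([-2.0, -1.5, -1.0, -0.8])
util = X @ true_beta + rng.gumbel(size=(n_obs, n_alt))
choice = util.argmax(axis=1)

def neg_loglik(beta):
    v = X @ beta
    v -= v.max(axis=1, keepdims=True)  # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_obs), choice].sum()

res = minimize(neg_loglik, np.zeros(n_attr), method="BFGS")
beta_hat = res.x
print("estimated coefficients:", np.round(beta_hat, 2))
# Ratio of the variability coefficient to the cost coefficient, in the same
# spirit as willingness-to-pay measures reported in such studies:
print("value of variability (cost units):", beta_hat[2] / beta_hat[0])
```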
Abstract:
In order to investigate the value of the rabbit as an experimental model for Chagas' disease, 72 animals were inoculated by intraperitoneal and conjunctival routes with bloodstream forms, vector-derived metacyclic trypomastigotes and tissue-culture trypomastigotes of Trypanosoma cruzi strains Y, CL and Ernane. In 95.6% of the animals, trypomastigotes were detected at the early stages of infection by fresh blood examination. The course of parasitemia in the acute phase was strongly influenced by the parasite strain and the route of inoculation. In the chronic phase, parasites were recovered by xenodiagnosis and/or hemoculture in 40% of the examined animals. The xenodiagnosis studies showed selective interactions between the T. cruzi strains and the four species of vectors used, inducing significant variability in the results. The data presented herein are consistent with the parasitological requirements established for a suitable model of chronic Chagas' disease.
Abstract:
National inflation rates reflect domestic and international (regional and global) influences. The relative importance of these components remains a controversial empirical issue. We extend the literature on inflation co-movement by utilising a dynamic factor model with stochastic volatility to account for shifts in the variance of inflation and for endogenously determined regional groupings. We find that most of the variability in inflation is explained by the country-specific disturbance term. Nevertheless, the contribution of the global component in explaining industrialised countries' inflation rates has increased over time.
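A simplified static one-factor decomposition, sketched below on synthetic data, illustrates the variance-decomposition idea (common versus country-specific components). It deliberately replaces the paper's dynamic factor model with stochastic volatility by a plain principal-component factor, so the method details and numbers are assumptions rather than the authors' specification.

```python
# Sketch: share of inflation variability attributable to a common factor vs.
# country-specific disturbances, using a static principal-component factor as
# a simplified stand-in for a dynamic factor model. All data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
T, n_countries = 200, 6

# Synthetic inflation panel: persistent common factor + idiosyncratic noise.
common = np.zeros(T)
for t in range(1, T):
    common[t] = 0.8 * common[t - 1] + rng.standard_normal()
loadings = rng.uniform(0.2, 1.0, n_countries)
inflation = common[:, None] * loadings + rng.standard_normal((T, n_countries))

# First principal component as the estimated common ("global") factor.
centered = inflation - inflation.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
factor = centered @ vt[0]

# Variance decomposition per country: common-factor share vs. idiosyncratic.
for i in range(n_countries):
    slope = np.polyfit(factor, centered[:, i], 1)[0]
    explained = np.var(slope * factor) / np.var(centered[:, i])
    print(f"country_{i}: common share = {explained:.2f}, "
          f"country-specific share = {1 - explained:.2f}")
```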
Abstract:
INTRODUCTION: Therapeutic hypothermia (TH) is often used to treat out-of-hospital cardiac arrest (OHCA) patients who also often simultaneously receive insulin for stress-induced hyperglycaemia. However, the impact of TH on systemic metabolism and insulin resistance in critical illness is unknown. This study analyses the impact of TH on metabolism, including the evolution of insulin sensitivity (SI) and its variability, in patients with coma after OHCA. METHODS: This study uses a clinically validated, model-based measure of SI. Insulin sensitivity was identified hourly using retrospective data from 200 post-cardiac arrest patients (8,522 hours) treated with TH, shortly after admission to the intensive care unit (ICU). Blood glucose and body temperature readings were taken every one to two hours. Data were divided into three periods: 1) cool (T <35°C); 2) an idle period of two hours as normothermia was re-established; and 3) warm (T >37°C). A maximum of 24 hours each for the cool and warm periods was considered. The impact of each condition on SI is analysed per cohort and per patient for both level and hour-to-hour variability, between periods and in six-hour blocks. RESULTS: Cohort and per-patient median SI levels increase consistently by 35% to 70% and 26% to 59% (P <0.001) respectively from cool to warm. Conversely, cohort and per-patient SI variability decreased by 11.1% to 33.6% (P <0.001) for the first 12 hours of treatment. However, SI variability increases between the 18th and 30th hours over the cool to warm transition, before continuing to decrease afterward. CONCLUSIONS: OHCA patients treated with TH have significantly lower and more variable SI during the cool period, compared to the later warm period. As treatment continues, SI level rises, and variability decreases consistently except for a large, significant increase during the cool to warm transition. These results demonstrate increased resistance to insulin during mild induced hypothermia. Our study might have important implications for glycaemic control during targeted temperature management.
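The comparison of SI level and hour-to-hour variability between the cool and warm periods can be illustrated with the short Python sketch below. The hourly SI traces are synthetic, and the summary statistics (per-patient medians, absolute hour-to-hour percentage change, Mann-Whitney comparison) are plausible stand-ins rather than the exact procedure of the study.

```python
# Sketch: compare insulin-sensitivity (SI) level and hour-to-hour variability
# between the cool (<35 degC) and warm (>37 degC) periods. Synthetic hourly SI
# traces stand in for the model-based SI values identified in the study.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)
n_patients, hours = 50, 24

def summarise(si):
    """Median level and median hour-to-hour percentage change per patient."""
    level = np.median(si, axis=1)
    hour_to_hour = np.abs(np.diff(si, axis=1)) / si[:, :-1]
    variability = np.median(hour_to_hour, axis=1)
    return level, variability

# Cool period: lower, more variable SI; warm period: higher, steadier SI.
si_cool = rng.lognormal(mean=np.log(2e-4), sigma=0.35, size=(n_patients, hours))
si_warm = rng.lognormal(mean=np.log(3e-4), sigma=0.20, size=(n_patients, hours))

lvl_cool, var_cool = summarise(si_cool)
lvl_warm, var_warm = summarise(si_warm)

print(f"median SI cool->warm: {np.median(lvl_cool):.2e} -> {np.median(lvl_warm):.2e}")
print(f"median hour-to-hour variability cool->warm: "
      f"{np.median(var_cool):.2%} -> {np.median(var_warm):.2%}")
print("level difference:", mannwhitneyu(lvl_cool, lvl_warm))
print("variability difference:", mannwhitneyu(var_cool, var_warm))
```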
Abstract:
Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (≫ 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship. In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator was thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in the reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries. For large open fields (> 10x10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used for conventional treatments using large open fields only. The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs. Finally, many different approaches to risk estimation extracted from the literature were applied to the calculated MC dose distribution. Absolute risks varied substantially, as did the ratio of risk between two treatment techniques, reflecting the large uncertainties involved with current risk models.
Despite all these uncertainties, the hybrid IMRT investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
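To illustrate how dose-response models of the kind mentioned above can be applied to an out-of-field organ dose distribution, the sketch below compares two hypothetical techniques with generic linear and linear-exponential organ-equivalent-dose (OED) models from the secondary-cancer literature. The dose values, the alpha parameter and the techniques are illustrative assumptions, not the thesis's data or risk models.

```python
# Sketch: comparing two treatment techniques with simple dose-response models
# for secondary-cancer risk, applied to an out-of-field organ dose distribution.
# Generic linear and linear-exponential OED forms; all values are illustrative.
import numpy as np

rng = np.random.default_rng(4)

def oed_linear(doses_gy):
    """Linear (no-threshold) OED: mean organ dose."""
    return doses_gy.mean()

def oed_linear_exponential(doses_gy, alpha=0.05):
    """Linear-exponential OED with cell-kill parameter alpha (1/Gy)."""
    return np.mean(doses_gy * np.exp(-alpha * doses_gy))

# Synthetic voxel doses (Gy) to an out-of-field organ for two techniques.
dose_technique_a = rng.gamma(shape=2.0, scale=0.15, size=10_000)  # e.g. wedged 3D-CRT
dose_technique_b = rng.gamma(shape=2.0, scale=0.10, size=10_000)  # e.g. hybrid IMRT

for name, model in [("linear", oed_linear),
                    ("linear-exponential", oed_linear_exponential)]:
    oed_a = model(dose_technique_a)
    oed_b = model(dose_technique_b)
    print(f"{name}: OED A = {oed_a:.3f} Gy, OED B = {oed_b:.3f} Gy, "
          f"risk ratio B/A = {oed_b / oed_a:.2f}")
```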
Abstract:
Chagas disease, caused by the protozoan Trypanosoma cruzi, has a variable clinical course, ranging from symptomless infection to severe chronic disease with cardiovascular or gastrointestinal involvement or, occasionally, overwhelming acute episodes. The factors influencing this clinical variability have not been elucidated, but it is likely that the genetic variability of both the host and the parasite is of importance. In this work we review the genetic structure of T. cruzi populations and analyze the importance of genetic variation of the parasite in the pathogenesis of the disease in the light of the histotropic-clonal model.
Abstract:
Every year, debris flows cause huge damage in mountainous areas. Due to population pressure in hazardous zones, the socio-economic impact is much higher than in the past. Therefore, the development of indicative susceptibility hazard maps is of primary importance, particularly in developing countries. However, the complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. A debris flow model has been developed for regional susceptibility assessments using a digital elevation model (DEM) with a GIS-based approach. The automatic identification of source areas and the estimation of debris flow spreading, based on GIS tools, provide a substantial basis for a preliminary susceptibility assessment at a regional scale. One of the main advantages of this model is its workability: everything is open to the user, from the choice of data to the selection of the algorithms and their parameters. The Flow-R model was tested in three different contexts, two in Switzerland and one in Pakistan, for indicative susceptibility hazard mapping. It was shown that the quality of the DEM is the most important factor in obtaining reliable propagation results, but also in identifying potential debris flow sources.
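A toy version of the two GIS steps mentioned above (source-cell identification and downslope spreading) is sketched below. The slope criterion, the spreading rule and the synthetic DEM are illustrative assumptions and do not reproduce the actual Flow-R algorithms or parameters.

```python
# Sketch: (1) flag potential source cells from a slope criterion on a toy DEM
# and (2) spread a susceptibility value downslope with a crude multiple-flow-
# direction rule. Toy illustration only, not the Flow-R implementation.
import numpy as np

rng = np.random.default_rng(5)
n = 40
dem = np.add.outer(np.linspace(60, 0, n), np.linspace(30, 0, n)) + rng.normal(0, 0.5, (n, n))

# (1) Source cells: slope above a threshold (toy criterion).
gy, gx = np.gradient(dem)
slope = np.hypot(gx, gy)
sources = slope > np.percentile(slope, 95)

# (2) Spreading: iteratively pass susceptibility to lower neighbours,
# weighted by elevation drop.
susceptibility = sources.astype(float)
offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
for _ in range(50):
    new = susceptibility.copy()
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            if susceptibility[i, j] <= 0:
                continue
            drops = np.array([max(dem[i, j] - dem[i + di, j + dj], 0.0)
                              for di, dj in offsets])
            if drops.sum() == 0:
                continue
            weights = drops / drops.sum()
            for (di, dj), w in zip(offsets, weights):
                new[i + di, j + dj] = max(new[i + di, j + dj], w * susceptibility[i, j])
    susceptibility = new

print("source cells:", int(sources.sum()),
      "| cells reached:", int((susceptibility > 0.01).sum()))
```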
Abstract:
General introduction. The Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) epidemic, despite recent encouraging announcements by the World Health Organization (WHO), is still today one of the world's major health care challenges. The present work lies in the field of health care management; in particular, we aim to evaluate behavioural and non-behavioural interventions against HIV/AIDS in developing countries through a deterministic simulation model, in both human and economic terms. We will focus on assessing the effectiveness of antiretroviral therapies (ART) in heterosexual populations living in less developed countries where the epidemic has generalized (formerly defined by the WHO as type II countries). The model is calibrated using Botswana as a case study; however, our model can be adapted to other countries with similar transmission dynamics. The first part of this thesis consists of reviewing the main mathematical concepts describing the transmission of infectious agents in general, with a focus on human immunodeficiency virus (HIV) transmission. We also review deterministic models assessing HIV interventions, with a focus on models aimed at African countries. This review helps us to recognize the need for a generic model and allows us to define a typical structure of such a generic deterministic model. The second part describes the main feedback loops underlying the dynamics of HIV transmission. These loops represent the foundation of our model. This part also provides a detailed description of the model, including the various infected and non-infected population groups, the types of sexual relationships, the infection matrices, and important factors impacting HIV transmission such as condom use, other sexually transmitted diseases (STD) and male circumcision. We also included in the model a dynamic life expectancy calculator which, to our knowledge, is a unique feature allowing more realistic cost-efficiency calculations. Various intervention scenarios are evaluated using the model, each of them including ART in combination with other interventions, namely circumcision, campaigns aimed at behavioural change (Abstain, Be faithful or use Condoms, also named ABC campaigns), and treatment of other STDs. A cost-efficiency analysis (CEA) is performed for each scenario. The CEA consists of measuring the cost per disability-adjusted life year (DALY) averted. This part also describes the model calibration and validation, including a sensitivity analysis. The third part reports the results and discusses the model limitations. In particular, we argue that the combination of ART and ABC campaigns and of ART and treatment of other STDs are the most cost-efficient interventions through 2020. The main model limitations include modeling the complexity of sexual relationships, the omission of international migration and ignoring variability in infectiousness according to the AIDS stage. The fourth part reviews the major contributions of the thesis and discusses model generalizability and flexibility. Finally, we conclude that by selecting an adequate mix of interventions, policy makers can significantly reduce adult prevalence in Botswana in the coming twenty years, provided the country and its donors can bear the costs involved. Part I: Context and literature review. In this section, after a brief introduction to the general literature, we focus in section two on the key mathematical concepts describing the transmission of infectious agents in general, with a focus on HIV transmission.
Section three provides a description of HIV policy models, with a focus on deterministic models. This leads us in section four to envision the need for a generic deterministic HIV policy model and to briefly describe the structure of such a generic model applicable to countries with a generalized HIV/AIDS epidemic, also defined as pattern II countries by the WHO.
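A minimal deterministic sketch in the spirit of the generic model described above is given below: a three-compartment ODE system with an ART coverage parameter, integrated with SciPy. The compartments, parameter values and the assumed effects of ART on infectiousness and mortality are illustrative assumptions, not the thesis's calibrated model for Botswana.

```python
# Sketch: a toy deterministic compartmental model of heterosexual HIV
# transmission with an ART coverage parameter. All parameters are illustrative.
import numpy as np
from scipy.integrate import odeint

def hiv_model(y, t, beta, art_coverage, art_efficacy, mu, mu_hiv, recruitment):
    s, i_untreated, i_art = y
    n = s + i_untreated + i_art
    # ART reduces the infectiousness of treated individuals (assumption).
    force = beta * (i_untreated + (1 - art_efficacy) * i_art) / n
    new_inf = force * s
    ds = recruitment - new_inf - mu * s
    di = new_inf - (mu + mu_hiv) * i_untreated - art_coverage * i_untreated
    da = art_coverage * i_untreated - (mu + 0.3 * mu_hiv) * i_art
    return [ds, di, da]

t = np.linspace(0, 20, 201)            # years
y0 = [0.75, 0.25, 0.0]                 # proportions of the adult population
params_no_art = (0.35, 0.00, 0.9, 0.02, 0.10, 0.02)
params_art = (0.35, 0.15, 0.9, 0.02, 0.10, 0.02)

for label, params in [("no ART", params_no_art), ("ART scale-up", params_art)]:
    s, i, a = odeint(hiv_model, y0, t, args=params).T
    prevalence = (i + a) / (s + i + a)
    print(f"{label}: prevalence after 20 years = {prevalence[-1]:.1%}")
```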
Abstract:
Background Demand for home care services has increased considerably, along with the growing complexity of cases and variability among resources and providers. Designing services that guarantee co-ordination and integration for providers and levels of care is of paramount importance. The aim of this study is to determine the effectiveness of a new case-management based home care delivery model which has been implemented in Andalusia (Spain). Methods Quasi-experimental, controlled, non-randomised, multi-centre study on the population receiving home care services comparing the outcomes of the new model, which included nurse-led case management, versus the conventional one. Primary endpoints: functional status, satisfaction and use of healthcare resources. Secondary endpoints: recruitment and caregiver burden, mortality, institutionalisation, quality of life and family function. Analyses were performed at base-line, and at two, six and twelve months. A bivariate analysis was conducted with the Student's t-test, Mann-Whitney's U, and the chi-squared test. Kaplan-Meier and log-rank tests were performed to compare survival and institutionalisation. A multivariate analysis was performed to pinpoint factors that impact on improvement of functional ability. Results Base-line differences in functional capacity – significantly lower in the intervention group (RR: 1.52 95%CI: 1.05–2.21; p = 0.0016) – disappeared at six months (RR: 1.31 95%CI: 0.87–1.98; p = 0.178). At six months, caregiver burden showed a slight reduction in the intervention group, whereas it increased notably in the control group (base-line Zarit Test: 57.06 95%CI: 54.77–59.34 vs. 60.50 95%CI: 53.63–67.37; p = 0.264), (Zarit Test at six months: 53.79 95%CI: 49.67–57.92 vs. 66.26 95%CI: 60.66–71.86; p = 0.002). Patients in the intervention group received more physiotherapy (7.92 95%CI: 5.22–10.62 vs. 3.24 95%CI: 1.37–5.310; p = 0.0001) and, on average, required fewer home care visits (9.40 95%CI: 7.89–10.92 vs. 11.30 95%CI: 9.10–14.54). No differences were found in terms of frequency of visits to A&E or hospital re-admissions. Furthermore, patients in the control group perceived higher levels of satisfaction (16.88; 95%CI: 16.32–17.43; range: 0–21, vs. 14.65 95%CI: 13.61–15.68; p = 0.001). Conclusion A home care service model that includes nurse-led case management streamlines access to healthcare services and resources, while impacting positively on patients' functional ability and caregiver burden, with increased levels of satisfaction.
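For the bivariate portion of the analysis plan described above, a minimal sketch on synthetic intervention and control data is given below. The endpoints, sample sizes and numbers are invented for illustration, and the survival (Kaplan-Meier/log-rank) and multivariate steps are not reproduced.

```python
# Sketch: Student's t-test, Mann-Whitney U and chi-squared on synthetic
# intervention vs. control data. Values are illustrative, not study data.
import numpy as np
from scipy.stats import ttest_ind, mannwhitneyu, chi2_contingency

rng = np.random.default_rng(6)
n = 120

# Synthetic endpoints for intervention (case-management) and control groups.
zarit_intervention = rng.normal(54, 10, n)   # caregiver burden at six months
zarit_control = rng.normal(66, 12, n)
visits_intervention = rng.poisson(9, n)      # home care visits
visits_control = rng.poisson(11, n)
readmitted = np.array([[18, n - 18],         # hospital re-admissions (yes/no)
                       [21, n - 21]])

print("Zarit (t-test):", ttest_ind(zarit_intervention, zarit_control))
print("visits (Mann-Whitney U):", mannwhitneyu(visits_intervention, visits_control))
chi2, p, dof, expected = chi2_contingency(readmitted)
print(f"re-admissions (chi-squared): chi2 = {chi2:.2f}, p = {p:.3f}")
```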
Abstract:
BACKGROUND: Using a bench test model, we investigated the hypothesis that neonatal and/or adult ventilators equipped with neonatal/pediatric modes do not currently administer pressure support (PS) reliably in neonatal or pediatric patient groups, in either the absence or the presence of air leaks. METHODS: PS was evaluated in 4 neonatal and 6 adult ventilators using a bench model to evaluate triggering, pressurization, and cycling in both the absence and the presence of leaks. Delivered tidal volumes were also assessed. Three patients were simulated: a preterm infant (resistance 100 cm H2O/L/s, compliance 2 mL/cm H2O, inspiratory time of the patient [TI] 400 ms, inspiratory effort 1 and 2 cm H2O), a full-term infant (resistance 50 cm H2O/L/s, compliance 5 mL/cm H2O, TI 500 ms, inspiratory effort 2 and 4 cm H2O), and a child (resistance 30 cm H2O/L/s, compliance 10 mL/cm H2O, TI 600 ms, inspiratory effort 5 and 10 cm H2O). Two PS levels were tested (10 and 15 cm H2O), with and without leaks and with and without the leak compensation algorithm activated. RESULTS: Without leaks, only 2 neonatal ventilators and one adult ventilator had trigger delays below a predefined acceptable limit (1/8 TI). Pressurization showed high variability between ventilators. Most ventilators showed an excess of inspiratory time high enough to seriously impair patient-ventilator synchronization (> 50% of the TI of the subject). In some ventilators, leaks led to autotriggering and impairment of ventilation performance, but the influence of leaks was generally lower in neonatal ventilators. When a noninvasive ventilation algorithm was available, this was partially corrected. In general, tidal volume was underestimated by the ventilators in the presence of leaks; the noninvasive ventilation algorithm was able to correct this difference in only 2 adult ventilators. CONCLUSIONS: No ventilator performed equally well under all tested conditions for all explored parameters. However, neonatal ventilators tended to perform better in the presence of leaks. These findings emphasize the need to improve algorithms for assisted ventilation modes to better deal with situations of high airway resistance, low pulmonary compliance, and the presence of leaks.
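The simulated patients can be illustrated with a passive single-compartment (resistance-compliance) lung model, sketched below. The mechanics come from the abstract, while the purely passive patient, the square pressure waveform and the PEEP-free baseline are simplifying assumptions rather than the bench setup actually used.

```python
# Sketch: a passive single-compartment (R-C) lung model driven by a pressure-
# support step, for the three simulated patients described in the abstract.
import numpy as np

def simulate_ps_breath(resistance, compliance_ml, ti_ms, ps_cmh2o, dt=0.001):
    """Integrate dV/dt = (Paw - V/C) / R over the inspiratory time."""
    r = resistance / 1000.0             # cmH2O/(mL/s) from cmH2O/(L/s)
    steps = int(ti_ms / 1000.0 / dt)
    volume = 0.0                        # mL above FRC
    for _ in range(steps):
        flow = (ps_cmh2o - volume / compliance_ml) / r   # mL/s
        volume += flow * dt
    return volume

patients = {
    # name: (resistance cmH2O/L/s, compliance mL/cmH2O, TI ms)
    "preterm infant": (100, 2, 400),
    "full-term infant": (50, 5, 500),
    "child": (30, 10, 600),
}

for name, (r, c, ti) in patients.items():
    vt = simulate_ps_breath(r, c, ti, ps_cmh2o=10)
    print(f"{name}: delivered VT ~ {vt:.1f} mL at PS 10 cmH2O "
          f"(equilibrium PS x C = {10 * c} mL)")
```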
Abstract:
In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios (LRs), that allows the evaluation of mark-to-print comparisons. In particular, this model, through its use of AFIS technology, benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores issued from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term of between-finger variability. The issues addressed in this paper in relation to between-finger variability are the required sample size and the influence of the finger number and general pattern, as well as that of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements cannot be conclusively determined from the mark (and from its position with respect to other marks, in the case of the finger number) are presented. These results immediately allow case-by-case estimation of between-finger variability in an operational setting.
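A minimal sketch of how the LR denominator can be assigned from non-match scores is given below. The synthetic score distribution and the kernel density estimate are illustrative assumptions and do not reproduce the study's AFIS scores or modelling choices.

```python
# Sketch: estimating the denominator of the likelihood ratio (between-finger
# variability) from non-match AFIS scores, evaluated at an observed
# mark-to-print score. Synthetic surrogate scores and a KDE are used here.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# 10,000 scores from comparisons of the mark against non-matching sources
# with the same finger number / general pattern (synthetic surrogate).
non_match_scores = rng.gamma(shape=4.0, scale=250.0, size=10_000)

# Density model of between-finger variability.
between_finger_density = gaussian_kde(non_match_scores)

observed_score = 4200.0  # score of the questioned mark against the suspect's print
denominator = between_finger_density(observed_score)[0]
print(f"P(score | different sources) ~ {denominator:.3e}")

# With a numerator model (same-source scores for the same minutia configuration),
# the LR would follow as numerator_density(observed_score) / denominator.
```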
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, and is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. Recent advances in machine learning offer a novel approach to modelling the spatial distribution of petrophysical properties in complex reservoirs, as an alternative to geostatistics. The approach is based on semi-supervised learning, which handles both 'labelled' observed data and 'unlabelled' data, which have no measured value but describe prior knowledge and other relevant data in the form of manifolds in the input space where the modelled property is continuous. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic geological features and to describe the stochastic variability and non-uniqueness of spatial properties. On the other hand, it is able to capture and preserve key spatial dependencies such as the connectivity of high-permeability geo-bodies, which is often difficult in contemporary petroleum reservoir studies. Semi-supervised SVR, as a data-driven algorithm, is designed to integrate various kinds of conditioning information and learn dependencies from them. The semi-supervised SVR model is able to balance signal/noise levels and control the prior belief in the available data. In this work, the stochastic semi-supervised SVR geomodel is integrated into a Bayesian framework to quantify the uncertainty of reservoir production with multiple models fitted to past dynamic observations (production history). Multiple history-matched models are obtained using stochastic sampling and/or MCMC-based inference algorithms, which evaluate the posterior probability distribution. The uncertainty of the model is described by the posterior probability of the model parameters that represent key geological properties: spatial correlation size, continuity strength, and smoothness/variability of the spatial property distribution. The developed approach is illustrated with a fluvial reservoir case. The resulting probabilistic production forecasts are described by uncertainty envelopes. The paper compares the performance of models with different combinations of unknown parameters and discusses sensitivity issues.
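As a simplified stand-in for the geomodel described above, the sketch below fits a plain supervised SVR to scattered "well" observations of a property on 2-D coordinates. The semi-supervised (unlabelled-data) component and the Bayesian history-matching loop are not reproduced, and all data and hyperparameters are illustrative.

```python
# Sketch: plain supervised SVR fitted to scattered well observations of a
# petrophysical property, predicted on a regular grid (a toy "geomodel").
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)

# Synthetic "true" property field: a smooth trend plus a high-value channel.
def true_field(xy):
    x, y = xy[:, 0], xy[:, 1]
    channel = np.exp(-((y - 0.5 - 0.2 * np.sin(4 * x)) ** 2) / 0.01)
    return 0.3 * x + 2.0 * channel

wells = rng.uniform(0, 1, size=(40, 2))                    # labelled data: well locations
obs = true_field(wells) + rng.normal(0, 0.05, len(wells))  # noisy well measurements

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(wells, obs)

# Predict the property on a regular grid.
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
prediction = model.predict(grid).reshape(gx.shape)
print("predicted range:", prediction.min().round(2), "to", prediction.max().round(2))
```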
Abstract:
Objective: to assess the between- and within-device reproducibility, as well as the within-day variability, of body fat measurements. Methods: body fat percentage (%BF) was measured twice on seventeen female students aged between 18 and 20 years, with a body mass index of 21.9 ± 2.6 kg/m² (mean ± SD), using seven bipolar bioelectrical impedance devices (BF-306) according to the manufacturer's recommendations. Each student was also measured each hour between 7:00 and 22:00. Statistical analysis was conducted using a general linear model for repeated measurements. Results: the correlation between first and second measurements was very high (Pearson r between 0.985 and 1.000, p<0.001), as was the correlation between devices (Pearson r between 0.986 and 0.999, all p<0.001). Repeated-measurements analysis showed no differences between devices (F test=0.83, p=0.59) or readings (first vs. second: F test=0.12, p=0.74). Conversely, significant differences were found between assessment periods throughout the day, with measurements made in the morning being lower than those made in the afternoon. Assuming an overall daily average of 100 (based on all measurements), the values were 95.8 ± 3.2 (mean ± SD) at 8:00 versus 101.3 ± 3.0 at 20:00, corresponding to a mean change of 2.2 ± 1.1 in %BF (F test for repeated values=6.58, p<0.001). Conclusions: the between- and within-device reproducibility for measuring body fat is high, enabling the use of multiple devices in a single study. Conversely, small but significant changes in body fat measurements occur during the day, suggesting that body fat measurements should be performed at fixed times.
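The two analyses can be illustrated on synthetic data as below. A Friedman test is used here as a simple nonparametric stand-in for the repeated-measures general linear model, and all values are invented for illustration.

```python
# Sketch: (1) test-retest agreement via Pearson correlation and (2) a within-day
# comparison across measurement hours on synthetic %BF data.
import numpy as np
from scipy.stats import pearsonr, friedmanchisquare

rng = np.random.default_rng(9)
n_subjects = 17

# (1) Reproducibility: two readings on the same device.
first = rng.normal(28, 4, n_subjects)                 # %BF, first reading
second = first + rng.normal(0, 0.15, n_subjects)      # second reading
r, p = pearsonr(first, second)
print(f"test-retest Pearson r = {r:.3f} (p = {p:.2g})")

# (2) Within-day variability: %BF measured at three times of day, with a small
# morning-to-evening drift added to the synthetic data.
morning = first + rng.normal(-1.0, 0.5, n_subjects)
midday = first + rng.normal(0.0, 0.5, n_subjects)
evening = first + rng.normal(1.0, 0.5, n_subjects)
stat, p_day = friedmanchisquare(morning, midday, evening)
print(f"within-day effect: Friedman chi2 = {stat:.2f}, p = {p_day:.3g}")
```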