950 results for AFT Models for Crash Duration Survival Analysis
Abstract:
AIMS: Device-based remote monitoring (RM) has been linked to improved clinical outcomes at short- to medium-term follow-up. Whether this benefit extends to long-term follow-up is unknown. We sought to assess the effect of device-based RM on long-term clinical outcomes in recipients of implantable cardioverter-defibrillators (ICD). METHODS: We performed a retrospective cohort study of consecutive patients who underwent ICD implantation for primary prevention. RM was initiated with patient consent according to availability of RM hardware at implantation. Patients with concomitant cardiac resynchronization therapy were excluded. Data on hospitalizations, mortality and cause of death were systematically assessed using a nationwide healthcare platform. A Cox proportional hazards model was employed to estimate the effect of RM on mortality and on a composite endpoint of cardiovascular mortality and hospital admission due to heart failure (HF). RESULTS: 312 patients were included, with a median follow-up of 37.7 months (range 1 to 146). 121 patients (38.2%) were under RM since the first outpatient visit post-ICD and 191 were in conventional follow-up. No differences were found regarding age, left ventricular ejection fraction, heart failure etiology or NYHA class at implantation. Patients under RM had higher long-term survival (hazard ratio [HR] 0.50, CI 0.27-0.93, p=0.029) and a lower incidence of the composite outcome (HR 0.47, CI 0.27-0.82, p=0.008). In multivariate survival analysis, overall survival was independently associated with younger age, higher LVEF, NYHA class lower than 3 and RM. CONCLUSION: RM was independently associated with increased long-term survival and a lower incidence of a composite endpoint of hospitalization for HF or cardiovascular mortality.
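The core analysis here is a Cox proportional hazards model of time-to-event data with a binary exposure (remote monitoring) and a handful of covariates. A minimal sketch of that kind of fit, using the Python lifelines package and hypothetical column names (`time_months`, `event`, `rm`, `age`, `lvef`, `nyha3plus`), might look like the following; it illustrates the technique, not the authors' code.

```python
# Sketch: Cox proportional hazards fit for a remote-monitoring (RM) effect,
# assuming a hypothetical DataFrame with follow-up time, event indicator and covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("icd_cohort.csv")  # hypothetical file: one row per patient
# Assumed columns: time_months, event (1 = death), rm (1 = remote monitoring),
# age, lvef, nyha3plus (1 if NYHA class >= 3)

cph = CoxPHFitter()
cph.fit(
    df[["time_months", "event", "rm", "age", "lvef", "nyha3plus"]],
    duration_col="time_months",
    event_col="event",
)
cph.print_summary()  # hazard ratios with confidence intervals, e.g. the HR for "rm"
```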
Abstract:
The barred spiral galaxy NGC 5430 is peculiar in that it hosts a very luminous Wolf-Rayet knot and asymmetric arms. Long-slit spectra taken along the bar and in the distorted arm, as well as SpIOMM data covering the whole galaxy, were analyzed. The underlying stellar absorption was subtracted from the long-slit spectra by fitting theoretical stellar population models with the GANDALF program. This absorption has a very strong impact on the computed extinction as well as on the various diagnostics specific to HII regions and young stellar populations. The study also shows that NGC 5430 contains a diffuse ionized gas component across its whole extent, which must be taken into account in order to apply the diagnostics correctly. One of the evolutionary scenarios proposed at the end of this study is that the Wolf-Rayet knot is the remnant of a small galaxy or an intergalactic cloud that collided with NGC 5430. A structure enclosing the Wolf-Rayet knot moves at a velocity considerably lower (50-70 km s-1) than expected at such a distance from the centre of the galaxy (200-220 km s-1). Moreover, the Wolf-Rayet knot appears to be very massive, since the maximum intensity of the stellar continuum in this region is comparable to that of the nucleus and far exceeds that of the other side of the bar. The number of Wolf-Rayet stars (2150) is also considerable. It cannot be ruled out, however, that the observed velocity difference reflects a gas flow along the bar, which would feed star formation in the Wolf-Rayet knot or in the nucleus.
Abstract:
Classical survival analysis methods, notably the non-parametric method of Kaplan and Meier (1958), assume independence between the variable of interest and the censoring variable. Since this independence assumption is not always tenable, several authors have developed methods to account for the dependence, most of which impose assumptions on that dependence. In this thesis, we propose a method for estimating the dependence in the presence of dependent censoring that uses the copula-graphic estimator for Archimedean copulas (Rivest and Wells, 2001) and assumes that the distribution of the censoring variable is known. We then study the consistency of this estimator through simulations before applying it to a real data set.
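Since the argument turns on the copula-graphic estimator, a compact numerical sketch may help. The code below implements the closed-form Archimedean copula-graphic survival estimator in the form popularized by Rivest and Wells (2001), here with a Clayton generator, assuming continuous data with no tied observation times; the thesis's specific modification (exploiting a known censoring distribution) is not reproduced. With the independence generator, phi(u) = -log(u), the same formula reduces to the Kaplan-Meier estimator, which is a useful sanity check.

```python
# Sketch: Archimedean copula-graphic survival estimator (Clayton generator),
# assuming no ties among the observed times.
import numpy as np

def copula_graphic_clayton(x, delta, theta):
    """x: observed times min(T, C); delta: 1 if event, 0 if censored; theta > 0."""
    phi = lambda u: (u ** (-theta) - 1.0) / theta           # Clayton generator
    phi_inv = lambda s: (1.0 + theta * s) ** (-1.0 / theta)

    order = np.argsort(x)
    x, delta = np.asarray(x, float)[order], np.asarray(delta)[order]
    n = len(x)
    i = np.arange(1, n + 1)                                  # rank of each ordered time
    inc = np.zeros(n)
    uncens = delta == 1
    # Increment contributed by each *uncensored* observation
    inc[uncens] = phi((n - i[uncens]) / n) - phi((n - i[uncens] + 1) / n)

    def S(t):
        return phi_inv(np.sum(inc[x <= t]))
    return S

# Tiny usage example with made-up data
S_hat = copula_graphic_clayton(
    x=[2.0, 3.5, 4.1, 6.0, 7.2], delta=[1, 0, 1, 1, 0], theta=2.0
)
print(S_hat(4.0))
```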
Abstract:
Integrated water resource management requires distinguishing the pathways of water that are accessible to societies from those that are not. Water pathways are numerous and highly variable from one place to another. The question can be simplified by focusing instead on the two destinations of water. Blue water forms the stores and fluxes in the hydrosystem: rivers, aquifers and subsurface flows. Green water is the invisible flux of water vapour returning to the atmosphere; it includes the water consumed by plants and the water held in soils. Yet a large number of studies address only one type of blue water, generally considering only the fate of streamflow or, more rarely, groundwater recharge, so the overall picture is missing. At the same time, climate change affects these water pathways by altering the different components of the hydrological cycle in distinct ways. The study carried out here uses the SWAT modelling tool to track all the components of the hydrological cycle and to quantify the impact of climate change on the hydrosystem of the Garonne river basin. A first part of the work refined the model set-up to best address the research question. Particular care was taken in the use of gridded meteorological data (SAFRAN) and in accounting for snow over the mountainous areas. The calibration of the model parameters was tested in a differential split-sample context, calibrating and then validating on climatically contrasted years in order to assess the robustness of the simulation under climate change. This step led to a substantial improvement of performance over the calibration period (2000-2010) and demonstrated the stability of the model in the face of climate change. Simulations over a one-century period (1960-2050) were then produced and analysed in two phases: i) The past period (1960-2000), based on climate observations, served as a long-term validation period for the simulated discharge, with very good performance. The analysis of the different hydrological components reveals a strong impact on green water fluxes and stocks, with a decrease in soil water content and a large increase in evapotranspiration. The blue water components are mainly affected through the snowpack and the discharge, which both show a substantial decrease. ii) Hydrological projections were produced (2010-2050) using a range of scenarios and climate models derived from dynamical downscaling. The analysis of the simulations largely confirms the conclusions drawn from the past period: a significant impact on green water, again with a decrease in soil water content and an increase in potential evapotranspiration. The simulations show that the soil water content during the summer period becomes low enough to reduce actual evapotranspiration fluxes, highlighting a possible future deficit in green water stocks. In addition, while the analysis of the blue water components still shows a significant decrease in the snowpack, discharge this time appears to increase during autumn and winter. These results point to an "acceleration" of the surface blue water components, probably related to the increase in extreme precipitation events. This work provides an analysis of the variations of most components of the hydrological cycle at the catchment scale, confirming the importance of taking all these components into account when assessing the impact of climate change, and more broadly of environmental change, on water resources.
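The evaluation side of a differential split-sample test amounts to splitting the record into climatically contrasted groups of years and comparing a discharge skill score on each group. A minimal sketch, assuming hypothetical observed and simulated daily discharge series and using the Nash-Sutcliffe efficiency (a common choice with SWAT, assumed here rather than taken from the abstract):

```python
# Sketch: differential split-sample evaluation of simulated discharge.
# Years are split into "dry" and "wet" groups (crude proxy classification), and the
# Nash-Sutcliffe efficiency (NSE) is computed separately on each group.
import numpy as np
import pandas as pd

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical file with columns: date, q_obs, q_sim
df = pd.read_csv("garonne_discharge.csv", parse_dates=["date"])
annual_mean = df.groupby(df["date"].dt.year)["q_obs"].mean()   # crude wetness proxy
dry_years = annual_mean.nsmallest(len(annual_mean) // 2).index

df["group"] = np.where(df["date"].dt.year.isin(dry_years), "dry", "wet")
for group, sub in df.groupby("group"):
    print(group, round(nse(sub["q_obs"], sub["q_sim"]), 3))
```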
Abstract:
One of the most complex and necessary topics in Operations Management courses is forecasting with time series models (TSM). To ease understanding and help students readily grasp demand forecasting, this project presents FOR TSM, a tool developed in MS Excel VBA®. The tool was designed with a Graphical User Interface (GUI) to explain fundamental concepts such as parameter selection, initialization values, calculation and analysis of performance measures, and finally model selection.
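To make those concepts concrete (initialization, a smoothing parameter, and the accuracy measures used to compare models), here is a minimal sketch in Python rather than the Excel VBA used by FOR TSM; the demand values and the smoothing constant are made up for illustration.

```python
# Sketch: simple exponential smoothing with explicit initialization and the usual
# forecast-accuracy measures (MAD, MSE, MAPE) used to compare candidate models.
import numpy as np

demand = np.array([120, 132, 101, 134, 90, 150, 142, 128, 135, 147], float)  # made-up data
alpha = 0.3                      # smoothing constant, chosen by the analyst

forecast = np.empty_like(demand)
forecast[0] = demand[0]          # initialization: first forecast equals first observation
for t in range(1, len(demand)):
    forecast[t] = alpha * demand[t - 1] + (1 - alpha) * forecast[t - 1]

errors = demand[1:] - forecast[1:]
mad = np.mean(np.abs(errors))                      # mean absolute deviation
mse = np.mean(errors ** 2)                         # mean squared error
mape = np.mean(np.abs(errors / demand[1:])) * 100  # mean absolute percentage error
print(f"MAD={mad:.2f}  MSE={mse:.2f}  MAPE={mape:.1f}%")
```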
Abstract:
Routine analyses for the quantification of organic acids and sugars are generally slow, involve the preparation and use of several reagents, require trained personnel and special equipment, and are expensive. In this context, there has been growing investment in research aimed at developing alternatives to the reference methods that are faster, cheaper and simpler, and infrared spectroscopy has stood out in this regard. The present study developed multivariate calibration models for the simultaneous quantitative determination of ascorbic, citric, malic and tartaric acids, the sugars sucrose, glucose and fructose, and soluble solids in fruit juices and nectars, as well as classification models based on Principal Component Analysis (PCA). Near-infrared (NIR) spectroscopy was used in association with partial least squares (PLS) regression. Forty-two samples of juices and fruit nectars commercially available in local shops were used. For the construction of the models, reference analyses were performed by high-performance liquid chromatography (HPLC), and soluble solids were determined by refractometry. Spectra were then acquired in triplicate over the range 12500 to 4000 cm-1. The best models were applied to quantify the analytes in natural juices and in juice samples produced in the Southwest Region of Paraná; these juices also underwent physicochemical analysis. Validation of the chromatographic methodology gave satisfactory results, with external calibration curves showing coefficients of determination (R2) above 0.98 and coefficients of variation (%CV) for intermediate precision and repeatability below 8.83%. PCA made it possible to separate the juice samples into two major groups (grape/apple and tangerine/orange), while the nectars separated into guava/grape and pineapple/apple groups. Using different validation methods and pre-processing techniques, applied separately and in combination, multivariate calibration models were obtained with root mean square errors of prediction (RMSEP) and of cross-validation (RMSECV) below 1.33 and 1.53 g.100 mL-1, respectively, and R2 above 0.771, except for malic acid. The physicochemical analyses allowed the characterization of the beverages, including the working pH range (2.83 to 5.79) and acidity within the regulatory limits for each flavor. The regression models demonstrated that ascorbic, citric, malic and tartaric acids, as well as sucrose, glucose and fructose, can be successfully determined from a single spectrum, suggesting that the models are economically viable for quality control and product standardization in the fruit juice and nectar processing industry.
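The modelling chain described (NIR spectra regressed on HPLC reference values by PLS, evaluated through RMSECV and RMSEP) can be sketched with scikit-learn; the file names, column layout and number of latent variables below are assumptions for illustration only.

```python
# Sketch: PLS regression of NIR spectra against a reference analyte concentration,
# with a cross-validated RMSECV and an external-test RMSEP.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split, cross_val_predict

# Assumed data: X = absorbance spectra (samples x wavenumbers), y = HPLC reference values
X = np.loadtxt("nir_spectra.csv", delimiter=",")
y = np.loadtxt("hplc_reference.csv", delimiter=",")

X_cal, X_test, y_cal, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=8)                # number of latent variables: assumed choice
y_cv = cross_val_predict(pls, X_cal, y_cal, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y_cal - y_cv) ** 2))

pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_test).ravel()
rmsep = np.sqrt(np.mean((y_test - y_pred) ** 2))
print(f"RMSECV={rmsecv:.3f}  RMSEP={rmsep:.3f} g/100 mL")
```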
Abstract:
An experimental and numerical study of turbulent fire suppression is presented. For this work, a novel and canonical facility has been developed, featuring a buoyant, turbulent, methane or propane-fueled diffusion flame suppressed via either nitrogen dilution of the oxidizer or application of a fine water mist. Flames are stabilized on a slot burner surrounded by a co-flowing oxidizer, which allows controlled delivery of either suppressant to achieve a range of conditions from complete combustion through partial and total flame quenching. A minimal supply of pure oxygen is optionally applied along the burner to provide a strengthened flame base that resists liftoff extinction and permits the study of substantially weakened turbulent flames. The carefully designed facility features well-characterized inlet and boundary conditions that are especially amenable to numerical simulation. Non-intrusive diagnostics provide detailed measurements of suppression behavior, yielding insight into the governing suppression processes, and aiding the development and validation of advanced suppression models. Diagnostics include oxidizer composition analysis to determine suppression potential, flame imaging to quantify visible flame structure, luminous and radiative emissions measurements to assess sooting propensity and heat losses, and species-based calorimetry to evaluate global heat release and combustion efficiency. The studied flames experience notable suppression effects, including transition in color from bright yellow to dim blue, expansion in flame height and structural intermittency, and reduction in radiative heat emissions. Still, measurements indicate that the combustion efficiency remains close to unity, and only near the extinction limit do the flames experience an abrupt transition from nearly complete combustion to total extinguishment. Measurements are compared with large eddy simulation results obtained using the Fire Dynamics Simulator, an open-source computational fluid dynamics software package. Comparisons of experimental and simulated results are used to evaluate the performance of available models in predicting fire suppression. Simulations in the present configuration highlight the issue of spurious reignition that is permitted by the classical eddy-dissipation concept for modeling turbulent combustion. To address this issue, simple treatments to prevent spurious reignition are developed and implemented. Simulations incorporating these treatments are shown to produce excellent agreement with the experimentally measured data, including the global combustion efficiency.
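The "spurious reignition" issue can be illustrated with a toy mixing-limited (eddy-dissipation-type) reaction rate: in its classical form, any computational cell containing both fuel and oxidizer reacts, even if the mixture is far too cold to burn. One simple treatment of the kind alluded to above is to gate the reaction on a critical ignition temperature. The sketch below is schematic and hypothetical, not the Fire Dynamics Simulator implementation; all values are made up.

```python
# Sketch: mixing-limited reaction rate with and without a simple ignition gate.
# Without the gate, fuel and oxidizer react wherever they coexist ("spurious reignition").
import numpy as np

def eddy_dissipation_rate(Y_F, Y_O2, tau_mix, s=4.0):
    """Fuel consumption rate limited by the lean reactant and a mixing time scale."""
    return np.minimum(Y_F, Y_O2 / s) / tau_mix

def gated_rate(Y_F, Y_O2, T, tau_mix, s=4.0, T_crit=600.0):
    """Same rate, but suppressed in cells colder than a critical ignition temperature."""
    rate = eddy_dissipation_rate(Y_F, Y_O2, tau_mix, s)
    return np.where(T >= T_crit, rate, 0.0)

# Two hypothetical cells: a hot flame-zone cell and a cold, diluted downstream cell
Y_F  = np.array([0.05, 0.02])      # fuel mass fractions
Y_O2 = np.array([0.10, 0.20])      # oxygen mass fractions
T    = np.array([1400.0, 350.0])   # temperatures, K
tau  = 0.01                        # assumed mixing time scale, s

print("ungated:", eddy_dissipation_rate(Y_F, Y_O2, tau))   # both cells react
print("gated:  ", gated_rate(Y_F, Y_O2, T, tau))           # cold cell no longer reacts
```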
Abstract:
Roads represent a new source of mortality: the risk of animal-vehicle collision threatens the long-term viability of populations. The risk of road-kill depends on each species' sensitivity to roads, on its life-history traits, and on the characteristics of the roads themselves. In this study we aimed to assess the importance of climatic parameters (temperature and precipitation) together with traffic and life-history traits, and to understand the role of drought in the viability of a barn owl population also affected by road mortality, under three scenarios: high mobility, high population density, and the combination of the two (mixed) (Manuscript). For the first objective we correlated the parameters (climate, traffic and life-history traits) and used the most correlated variables to build a predictive generalized linear mixed model (GLMM) of their influence on road-kills. Using a population model, we then evaluated barn owl population viability under the three scenarios. The model revealed that precipitation, traffic and dispersal have a negative, although non-significant, relationship with road-kills. The scenarios yielded different results: the high-mobility scenario showed the greatest population depletion, more fluctuations over time and the highest risk of extinction; the high-density scenario showed a more stable population with a lower risk of extinction; and the mixed scenario gave results similar to the first. Climate seems to play an indirect role in barn owl road-kills: it may influence prey availability, which in turn influences barn owl reproductive success and activity. The high-mobility scenario also showed a greater negative impact on population viability, which may reduce the populations' resilience to other stochastic events. Future research should take climate into account, and how it may influence species' life cycles and activity periods, for a more complete understanding of road-kills. It is also important to make the best mitigation decisions, which might include improving the quality of prey habitat.
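The scenario comparison rests on stochastic population projections with added road mortality. A toy sketch of that kind of simulation, with entirely hypothetical demographic rates, shows how an extinction risk is typically extracted from repeated runs.

```python
# Sketch: stochastic projection of a small owl population under added road mortality.
# All demographic rates are hypothetical; extinction risk = fraction of runs reaching zero.
import numpy as np

rng = np.random.default_rng(42)

def extinction_risk(n0=50, years=50, runs=1000,
                    survival=0.7, fecundity=1.2, road_mortality=0.05):
    extinct = 0
    for _ in range(runs):
        n = n0
        for _ in range(years):
            survivors = rng.binomial(n, survival * (1.0 - road_mortality))
            recruits = rng.poisson(fecundity * survivors / 2.0)  # assume half are breeding females
            n = survivors + recruits
            if n == 0:
                extinct += 1
                break
    return extinct / runs

for road_mortality in (0.05, 0.15):   # e.g. low vs high exposure to roads
    print(road_mortality, extinction_risk(road_mortality=road_mortality))
```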
Abstract:
The first version of this text was presented in the “Philosophy of Communication” section at the ECREA’s 5th European Communication Conference, “Communication for Empowerment,” in Lisbon in November 2014. I would like to thank the audience for the lively post-presentation discussion.
Abstract:
Objective: Evaluate the validity, reliability, and factorial invariance of the complete Portuguese version of the Oral Health Impact Profile (OHIP) and its short version (OHIP-14). Methods: A total of 1,162 adults enrolled in the Faculty of Dentistry of Araraquara/UNESP participated in the study; 73.1% were women; and the mean age was 40.7 ± 16.3 yr. We conducted a confirmatory factor analysis, in which χ2/df, comparative fit index, goodness of fit index, and root mean square error of approximation were used as indices of goodness of fit. Convergent validity was judged from the average variance extracted and the composite reliability, and internal consistency was estimated by Cronbach's standardized alpha. The stability of the models was evaluated by multigroup analysis in independent samples (test and validation) and between users and nonusers of dental prostheses. Results: The best-fitting models were found for the OHIP-14 and among dental prosthesis users. Convergent validity was below adequate values for the factors "functional limitation" and "physical pain" in the complete version and for the factors "functional limitation" and "psychological discomfort" in the OHIP-14. Values of composite reliability and internal consistency were below adequate in the OHIP-14 for the factors "functional limitation" and "psychological discomfort." We detected strong invariance between the test and validation samples for the full version and weak invariance for the OHIP-14. The models for users and nonusers of dental prostheses were not invariant for either version. Conclusion: The reduced version of the OHIP was parsimonious, reliable, and valid to capture the construct "impact of oral health on quality of life," which was more pronounced in prosthesis users.
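The convergent-validity and reliability criteria used here (average variance extracted and composite reliability) are simple functions of the standardized factor loadings, so a small sketch may be useful; the loadings below are made up, not the OHIP values.

```python
# Sketch: average variance extracted (AVE) and composite reliability (CR)
# computed from the standardized loadings of a single factor.
import numpy as np

loadings = np.array([0.62, 0.71, 0.55, 0.68])   # hypothetical standardized loadings
errors = 1.0 - loadings ** 2                    # item error variances

ave = np.mean(loadings ** 2)                                      # usual benchmark: AVE >= 0.50
cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + errors.sum())   # usual benchmark: CR >= 0.70
print(f"AVE={ave:.2f}  CR={cr:.2f}")
```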
Abstract:
Human activities are altering greenhouse gas concentrations in the atmosphere and causing global climate change. The impacts of human-induced climate change have become an increasingly important issue in recent years. The objective of this work was to develop a database of climate information for future scenarios using Geographic Information System (GIS) tools. Future scenarios focused on the 2020s, 2050s, and 2080s (scenarios A2 and B2) were obtained from the General Circulation Models (GCM) available at the Data Distribution Centre of the Third Assessment Report (TAR) of the Intergovernmental Panel on Climate Change (IPCC). The TAR comprises six GCMs with different spatial resolutions (ECHAM4: 2.8125×2.8125º, HadCM3: 3.75×2.5º, CGCM2: 3.75×3.75º, CSIROMk2b: 5.625×3.214º, and CCSR/NIES: 5.625×5.625º). Monthly means of the climate variables were obtained by averaging the available models using GIS spatial analysis tools (arithmetic operations). Maps of monthly means of mean temperature, minimum temperature, maximum temperature, rainfall, relative humidity, and solar radiation were produced at a spatial resolution of 0.5° × 0.5° latitude and longitude. This approach to map production with GIS tools made it possible to evaluate the spatial distribution of the future climate scenarios. This database is currently being used in Embrapa projects on the impacts of climate change on plant diseases.
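The core step, bringing model fields that sit on different grids onto a common 0.5° grid and taking their arithmetic mean, can also be sketched outside a GIS; the domain window, variable names, and the bilinear-interpolation choice below are assumptions.

```python
# Sketch: regrid several model fields to a common 0.5-degree grid and take the
# arithmetic mean, mirroring the map-algebra step described in the abstract.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

target_lon = np.arange(-75.0, -30.0, 0.5)     # hypothetical longitude window
target_lat = np.arange(-35.0, 10.0, 0.5)      # hypothetical latitude window
lon2d, lat2d = np.meshgrid(target_lon, target_lat)
points = np.column_stack([lat2d.ravel(), lon2d.ravel()])

def regrid(field, lat, lon):
    """Bilinearly interpolate a (lat, lon) field onto the common 0.5-degree grid."""
    interp = RegularGridInterpolator((lat, lon), field, bounds_error=False, fill_value=np.nan)
    return interp(points).reshape(lat2d.shape)

def ensemble_mean(models):
    """models: list of (field, lat, lon) tuples, one per GCM, loaded elsewhere."""
    stack = np.stack([regrid(f, la, lo) for f, la, lo in models])
    return np.nanmean(stack, axis=0)           # grid-cell-wise arithmetic mean
```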
Abstract:
Poster presented at the Workshop on Flexible Models for Longitudinal and Survival Data with Applications in Biostatistics. University of Warwick, Coventry, UK, 27-29 July 2015
Abstract:
The addition of active silica potentially improves the quality of concrete owing to its high reactivity and pore-refinement effect. The reactivity of silica is likely related to its charge density; variations in surface charge alter the reactivity of the material, consequently affecting the properties of concrete. The present study investigated variations in the charge density of silica as a function of acid treatment, using nitric or phosphoric acid at different pH values (2.0, 4.0 and 6.0). Effects on concrete properties, including slump, mechanical strength, permeability and chloride-induced corrosion, were evaluated. To that end, a statistical analysis was carried out and empirical models correlating the studied parameters (pH, acid and cement) with the concrete properties were established. The quality of the models was tested by analysis of variance. The results revealed that the addition of silica was effective in improving the properties of concrete, especially the electrochemical parameters. Silica treated with nitric acid at pH 4.0 gave the best performance, including the highest strength, reduced permeability and the lowest corrosion current.
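The statistical workflow described (empirical models linking pH, acid type and cement content to a concrete property, checked by analysis of variance) can be sketched with statsmodels; the data file and column names are assumptions for illustration.

```python
# Sketch: fit an empirical factorial model for a concrete property and test it by ANOVA.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Assumed data: one row per mix, with treatment pH, acid type, cement content and strength
df = pd.read_csv("silica_concrete_mixes.csv")   # hypothetical columns: pH, acid, cement, strength

model = smf.ols("strength ~ C(acid) * pH + cement", data=df).fit()
print(model.summary())                           # coefficients of the empirical model
print(sm.stats.anova_lm(model, typ=2))           # variance analysis of each factor
```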