861 results for multiple data sources
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Context Lung-protective mechanical ventilation with the use of lower tidal volumes has been found to improve outcomes of patients with acute respiratory distress syndrome (ARDS). It has been suggested that use of lower tidal volumes also benefits patients who do not have ARDS. Objective To determine whether use of lower tidal volumes is associated with improved outcomes of patients receiving ventilation who do not have ARDS. Data Sources MEDLINE, CINAHL, Web of Science, and Cochrane Central Register of Controlled Trials up to August 2012. Study Selection Eligible studies evaluated use of lower vs higher tidal volumes in patients without ARDS at onset of mechanical ventilation and reported lung injury development, overall mortality, pulmonary infection, atelectasis, and biochemical alterations. Data Extraction Three reviewers extracted data on study characteristics, methods, and outcomes. Disagreement was resolved by consensus. Data Synthesis Twenty articles (2822 participants) were included. Meta-analysis using a fixed-effects model showed a decrease in lung injury development (risk ratio [RR], 0.33; 95% CI, 0.23 to 0.47; I², 0%; number needed to treat [NNT], 11) and mortality (RR, 0.64; 95% CI, 0.46 to 0.89; I², 0%; NNT, 23) in patients receiving ventilation with lower tidal volumes. The results for lung injury development were similar when stratified by type of study (randomized vs nonrandomized) and were significant only in randomized trials for pulmonary infection and only in nonrandomized trials for mortality. Meta-analysis using a random-effects model showed, in protective ventilation groups, a lower incidence of pulmonary infection (RR, 0.45; 95% CI, 0.22 to 0.92; I², 32%; NNT, 26), lower mean (SD) hospital length of stay (6.91 [2.36] vs 8.87 [2.93] days, respectively; standardized mean difference [SMD], 0.51; 95% CI, 0.20 to 0.82; I², 75%), higher mean (SD) PaCO2 levels (41.05 [3.79] vs 37.90 [4.19] mm Hg, respectively; SMD, -0.51; 95% CI, -0.70 to -0.32; I², 54%), and lower mean (SD) pH values (7.37 [0.03] vs 7.40 [0.04], respectively; SMD, 1.16; 95% CI, 0.31 to 2.02; I², 96%) but similar mean (SD) ratios of PaO2 to fraction of inspired oxygen (304.40 [65.7] vs 312.97 [68.13], respectively; SMD, 0.11; 95% CI, -0.06 to 0.27; I², 60%). Tidal volume gradients between the 2 groups did not significantly influence the final results. Conclusions Among patients without ARDS, protective ventilation with lower tidal volumes was associated with better clinical outcomes. Some of the limitations of the meta-analysis were the mixed setting of mechanical ventilation (intensive care unit or operating room) and the duration of mechanical ventilation. JAMA. 2012;308(16):1651-1659 www.jama.com
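As a reading aid, the following is a minimal sketch of the inverse-variance fixed-effects pooling of risk ratios and the NNT calculation reported in this abstract; the per-study counts and the assumed control-group event rate are hypothetical placeholders, not the trial data from the review.

```python
# Minimal sketch of inverse-variance fixed-effects pooling of risk ratios.
# The study counts below are hypothetical placeholders, not data from the review.
import math

# (events_low, n_low, events_high, n_high) for each hypothetical study
studies = [(5, 100, 15, 100), (8, 150, 20, 150), (3, 80, 9, 80)]

num, den = 0.0, 0.0
for e1, n1, e2, n2 in studies:
    log_rr = math.log((e1 / n1) / (e2 / n2))
    var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2   # variance of log RR for a 2x2 table
    w = 1 / var                               # inverse-variance weight
    num += w * log_rr
    den += w

pooled_log_rr = num / den
se = math.sqrt(1 / den)
pooled_rr = math.exp(pooled_log_rr)
ci = (math.exp(pooled_log_rr - 1.96 * se), math.exp(pooled_log_rr + 1.96 * se))
print(f"pooled RR = {pooled_rr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")

# NNT derived from the pooled RR and an assumed control-group event rate
control_rate = 0.10
arr = control_rate * (1 - pooled_rr)          # absolute risk reduction
print(f"NNT ~ {1 / arr:.0f}")
```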
Abstract:
Abstract Background In areas with limited infrastructure for microscopy diagnosis, rapid diagnostic tests (RDT) have been demonstrated to be effective. Method The cost-effectiveness of the OptiMAL® test and thick smear microscopy was estimated and compared. Data were collected in remote areas of 12 municipalities in the Brazilian Amazon. Data sources included the National Malaria Control Programme of the Ministry of Health, the National Healthcare System reimbursement table, hospitalization records, primary data collected from the municipalities, and the scientific literature. The perspective was that of the Brazilian public health system, the analytical horizon ran from the onset of fever until the diagnostic result was provided to the patient, and the temporal reference was the year 2006. The results were expressed as costs per adequately diagnosed case in 2006 U.S. dollars. Sensitivity analysis was performed on key model parameters. Results In the base-case scenario, assuming 92% and 95% sensitivity of thick smear microscopy for Plasmodium falciparum and Plasmodium vivax, respectively, and 100% specificity for both species, thick smear microscopy is more costly and more effective, with an incremental cost estimated at US$549.9 per adequately diagnosed case. In the sensitivity analysis, when the sensitivity and specificity of microscopy for P. vivax were 0.90 and 0.98, respectively, and when its sensitivity for P. falciparum was 0.83, the RDT was more cost-effective than microscopy. Conclusion Microscopy is more cost-effective than OptiMAL® in these remote areas if the high accuracy of microscopy is maintained in the field. The decision on whether to use rapid tests for the diagnosis of malaria in these areas therefore depends on the accuracy microscopy currently achieves in the field.
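A minimal sketch of the incremental cost-effectiveness ratio (ICER) underlying the comparison above, expressed as incremental cost per additional adequately diagnosed case; the cost and effectiveness figures are hypothetical placeholders, not the study's 2006 estimates.

```python
# Minimal sketch of an incremental cost-effectiveness comparison between two
# diagnostic strategies. All numbers are hypothetical placeholders.
def icer(cost_a, eff_a, cost_b, eff_b):
    """Incremental cost per additional adequately diagnosed case of strategy A over B."""
    return (cost_a - cost_b) / (eff_a - eff_b)

# effectiveness = proportion of adequately diagnosed cases per patient tested
microscopy = {"cost": 12.0, "eff": 0.95}   # hypothetical
rdt        = {"cost": 9.0,  "eff": 0.90}   # hypothetical

result = icer(microscopy["cost"], microscopy["eff"], rdt["cost"], rdt["eff"])
print(f"ICER (microscopy vs RDT): US${result:.1f} per additional adequately diagnosed case")
```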
Abstract:
This PhD thesis aims at providing an evaluation of the impact of EU Cohesion Policy on regional growth. It employs methodologies and data sources never before applied for this purpose. The main contributions to the literature on EU regional policy effectiveness are analysed in depth, and an overview of the current literature on Cohesion Policy shows that this work introduces innovative features to the field. The work enriches the current literature in two respects. The first is the use of Regression Discontinuity Design to examine whether growth outcomes differ between Objective 1 and non-Objective 1 regions at the cut-off point (75 percent of EU-15 GDP per capita in PPS) during the two programming periods 1994-1999 and 2000-2006. The results confirm a significant difference of more than 0.5 percent per year between the two groups. The second empirical evaluation is a cross-section regression model, based on convergence theory, that analyses the relation between regional per capita growth and EU Cohesion Policy expenditure in several fields of intervention. We have built a highly detailed dataset of spending variables (certified expenditure), using data provided directly by the Regional Policy Directorate of the European Commission.
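A minimal sketch of a sharp regression discontinuity design at a 75 percent of EU-15 GDP per capita eligibility threshold, estimated with a local linear regression on synthetic data; variable names, the bandwidth and the simulated effect are illustrative assumptions, not the thesis's specification.

```python
# Minimal sketch of a sharp RDD: local linear regression around the cutoff,
# with a treatment dummy and an interaction term. Synthetic data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
gdp_pc = rng.uniform(50, 100, 500)            # % of EU-15 GDP per capita (PPS)
treated = (gdp_pc < 75).astype(float)         # Objective 1 eligibility below the cutoff
growth = 2.0 + 0.5 * treated - 0.02 * (gdp_pc - 75) + rng.normal(0, 0.5, 500)

bw = 10.0                                      # illustrative bandwidth around the cutoff
keep = np.abs(gdp_pc - 75) <= bw
x = gdp_pc[keep] - 75
X = sm.add_constant(np.column_stack([treated[keep], x, treated[keep] * x]))
fit = sm.OLS(growth[keep], X).fit()
print(f"estimated jump at the cutoff: {fit.params[1]:.2f} percentage points of annual growth")
```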
Abstract:
This thesis sets out to investigate the complex landscape of harassment within the family and at work under the Italian legal system, and to compare it with a system belonging to the same legal tradition, the French one. The discussion reconstructs the socio-criminological and legal aspects of harassment within the family directed at vulnerable persons (women, minors, the elderly and people with disabilities), harassment in the workplace such as sexual harassment and mobbing, and harassment at a distance, or stalking, which in many respects remain a submerged and little-known phenomenon. The thesis focuses above all on the psychological and less well-known forms of harassment. The theoretical and normative reconstruction of the topics covered is complemented by the results of quantitative and qualitative research drawn from the case law of the two countries. The work is therefore organized in two parts: the first is centred on the theoretical, socio-criminological and legal aspects, while the second is devoted to the empirical research, which was conducted using as data sources the judgments of the Italian and French Supreme Courts of Cassation.
Abstract:
A newly developed global atmospheric chemistry and circulation model (ECHAM5/MESSy1) was used to study the chemistry and transport of ozone precursors, with a focus on non-methane hydrocarbons. To this end, the model was extensively evaluated by comparing its results with measurements from various sources. The analysis shows that the model predicts the distribution of ozone realistically, both in magnitude and in its seasonal cycle. At the tropopause, the model reproduces the exchange between stratosphere and troposphere correctly without prescribed fluxes or concentrations. The ozone precursors are simulated with varying fidelity relative to the measurements: while the alkanes are reproduced well, some deviations occur for the alkenes. Among the oxygenated species, formaldehyde (HCHO) is reproduced correctly, whereas the correlations between observations and model results for methanol (CH3OH) and acetone (CH3COCH3) are considerably worse. To improve the model's performance for oxygenated species, several sensitivity studies were carried out. These species are influenced by emissions from and deposition to the ocean, and knowledge of the gas exchange with the ocean is subject to large uncertainties. To improve the results of ECHAM5/MESSy1, the new submodel AIRSEA was developed and integrated into the MESSy structure. This submodel accounts for the gas exchange between ocean and atmosphere, including the oxygenated species. AIRSEA, which requires information on the liquid-phase concentration of the gas in the ocean's surface water, was tested extensively. Applying the new submodel slightly improves the model results for acetone and methanol, although the use of a prescribed liquid-phase concentration strongly limits the success of the method, since measurements are not available in sufficient quantity. This work provides new insights into organic species and highlights the importance of the coupling between ocean and atmosphere for the budgets of many gases.
Abstract:
This doctoral thesis aims at contributing to the literature on transition economies, focusing on the Russian Federation and in particular on regional income convergence and fertility patterns. The first two chapters deal with the issue of income convergence across regions. Chapter 1 provides a historical-institutional analysis of the period between the late years of the Soviet Union and the last decade of economic growth, together with a presentation of the sample describing gross regional product composition, agrarian or industrial vocation, and labor. Chapter 2 contributes to the literature on exploratory spatial data analysis with an application to a panel of 77 regions over the period 1994-2008. It provides an analysis of spatial patterns and extends the theoretical framework of growth regressions by controlling for spatial correlation and heterogeneity. Chapter 3 analyses the national demographic patterns since 1960 and reviews the policies on maternity leave and family benefits. Data sources are the Statistical Yearbooks of the USSR, the Statistical Yearbooks of the Russian Soviet Federative Socialist Republic and the Demographic Yearbooks of Russia. Chapter 4 analyses the demographic patterns in light of the theoretical framework of the Becker model, the Second Demographic Transition and an economic-crisis argument. With national data from 1960 onward, the theoretical issue of the pro- or countercyclical relation between income and fertility is graphically analysed and discussed, together with female employment and education. With regional data after 1994, different panel data models are tested. Individual-level data from the Russian Longitudinal Monitoring Survey are analysed with a logit model. Chapter 5 employs data from the Generations and Gender Survey by UNECE to focus on postponement and second-birth intentions. Postponement is studied through cohort analysis of mean maternal age at first birth, while second-birth intentions are modeled with an ordered logit.
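A minimal sketch of the kind of individual-level logit model mentioned for Chapter 4; the covariates, data and coefficients are synthetic illustrations, not the thesis's estimates from the Russian Longitudinal Monitoring Survey.

```python
# Minimal sketch of a binary logit for a birth outcome on synthetic data.
# Variable names and effects are illustrative assumptions only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
income = rng.normal(0, 1, n)          # standardized household income
employed = rng.binomial(1, 0.7, n)    # female employment indicator
latent = -0.5 + 0.3 * income - 0.4 * employed
birth = rng.binomial(1, 1 / (1 + np.exp(-latent)))   # birth within the observation period

X = sm.add_constant(np.column_stack([income, employed]))
fit = sm.Logit(birth, X).fit(disp=False)
print(fit.params)                     # intercept, income and employment effects on the log-odds
```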
Abstract:
The land-atmosphere exchange of atmospheric trace gases is sensitive to meteorological conditions and climate change, and it in turn contributes to atmospheric radiative forcing through its effects on tropospheric chemistry. The interactions between the hydrological cycle and atmospheric processes are intricate and often involve several levels of feedback. The Earth system model EMAC is used in this thesis to assess the direct role of the land surface components of the terrestrial hydrological cycle in the emissions, deposition and transport of key trace gases that control tropospheric chemistry, and to examine its indirect role in changing the tropospheric chemical composition through the feedbacks between the atmospheric and terrestrial branches of the hydrological cycle. Selected features of the hydrological cycle in EMAC are evaluated using observations from different data sources. The interactions between precipitation and the water vapor column, from the atmospheric branch of the hydrological cycle, and evapotranspiration, from its terrestrial branch, are assessed especially for tropical regions. The impacts of changes in land surface hydrology on surface exchanges and on the oxidizing chemistry of the atmosphere are assessed through two sensitivity simulations. In the first, a new parametrization for rainfall interception in the densely vegetated areas of the tropics is implemented and its effects are assessed. The second involves applying a soil moisture forcing that replaces the model-calculated soil moisture. Both experiments have a large impact on the local hydrological cycle, on the dry deposition of soluble and insoluble gases, and on isoprene emissions, through changes in surface temperature and planetary boundary layer height. Additionally, the soil moisture forcing causes changes in local vertical transport and in the large-scale circulation. The changes in trace gas exchanges affect the oxidation capacity of the atmosphere through changes in OH, O3 and NOx concentrations.
Abstract:
The research aims at developing a framework for semantic-based digital survey of architectural heritage. Rooted in knowledge-based modeling, which extracts mathematical constraints on geometry from architectural treatises, as-built information obtained from image-based modeling is integrated with the ideal model in a BIM platform. The knowledge-based modeling transforms the geometry and parametric relations of architectural components from 2D drawings into 3D digital models, and creates a large number of variations based on shape grammars in real time thanks to parametric modeling. It also provides prior knowledge for semantically segmenting unorganized survey data. The emergence of SfM (Structure from Motion) makes it possible to reconstruct large, complex architectural scenes with high flexibility, low cost and full automation, but with limited metric accuracy. We address this problem by combining photogrammetric approaches, which consist of camera configuration, image enhancement, bundle adjustment and related steps. Experiments show that the accuracy of image-based modeling following our workflow is comparable to that of range-based modeling. We also demonstrate positive results of our optimized approach in the digital reconstruction of a portico, where a low-texture vault and dramatic changes in illumination cause great difficulty for the unoptimized workflow. Once the as-built model is obtained, it is integrated with the ideal model in the BIM platform, which allows multiple forms of data enrichment. In spite of its promising prospects in the AEC industry, BIM has been developed with limited consideration of reverse engineering from survey data. Besides representing the architectural heritage in parallel ways (ideal model and as-built model) and comparing their differences, we address how to create the as-built model in BIM software, which remains an open problem. The research is intended to serve as a foundation for research on architectural history, for the documentation and conservation of architectural heritage, and for the renovation of existing buildings.
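A minimal sketch of the pinhole reprojection error that bundle adjustment minimizes, one step of the photogrammetric workflow mentioned above; the camera parameters and point coordinates are illustrative assumptions, not the project's calibration.

```python
# Minimal sketch of a pinhole reprojection residual: bundle adjustment minimizes
# the sum of squared residuals over all cameras and 3D points. Values are illustrative.
import numpy as np

def project(point_3d, focal, principal_point):
    """Project a 3D point given in camera coordinates with a simple pinhole model."""
    x, y, z = point_3d
    return focal * np.array([x / z, y / z]) + principal_point

point = np.array([0.2, -0.1, 3.0])              # 3D point in camera coordinates (m)
observed = np.array([1510.0, 980.0])            # measured image coordinates (px)
predicted = project(point, focal=2800.0, principal_point=np.array([1500.0, 1000.0]))
residual = observed - predicted                 # one term of the bundle adjustment cost
print(residual, np.linalg.norm(residual))
```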
Abstract:
Spatial prediction of hourly rainfall via radar calibration is addressed. The change of support problem (COSP), which arises when the spatial supports of different data sources do not coincide, is faced in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region of Italy is characterized by an abundance of zero values and right-skewness of the distribution of positive amounts. Rain gauge direct measurements at sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar on the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for positive rainfall amounts; the two steps are joined via a two-part semicontinuous model. Three model specifications that address COSP differently are presented; in particular, a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid, is employed. Estimation is performed via MCMC procedures implemented in C and linked to the R software. The communication and evaluation of probabilistic, point and interval predictions is investigated. A non-randomized PIT histogram is proposed for correctly assessing the calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (Reliability Plot, Sharpness Histogram, PIT Histogram, Brier Score Plot and Quantile Decomposition Plot), proper scoring rules (Brier Score, Continuous Ranked Probability Score) and consistent scoring functions (Root Mean Square Error and Mean Absolute Error, addressing the predictive mean and median, respectively). Calibration is reached, and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
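A compact, illustrative statement of the two-part semicontinuous structure described above, with a probit link for rain occurrence and a Gamma model for positive amounts; the notation is an assumption of this sketch, and the full model additionally includes the three-stage hierarchy and the change-of-support weighting of radar pixels.

```latex
% Illustrative notation only: Y_{s,t} is gauge rainfall at location s and hour t,
% R_{s,t} the collocated radar value, and w_s, v_s spatially correlated Gaussian effects.
\begin{align*}
  \Pr(Y_{s,t} > 0) &= \Phi\!\left(\alpha_0 + \alpha_1 \log R_{s,t} + w_s\right), \\
  Y_{s,t} \mid Y_{s,t} > 0 &\sim \mathrm{Gamma}\!\left(\kappa,\ \kappa/\mu_{s,t}\right), \\
  \log \mu_{s,t} &= \beta_0 + \beta_1 \log R_{s,t} + v_s .
\end{align*}
```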
Abstract:
Modern ESI-LC-MS/MS techniques, combined with bottom-up approaches, allow the qualitative and quantitative characterization of several thousand proteins in a single experiment. Data-independent acquisition methods such as MSE and the ion-mobility variants HDMSE and UDMSE are particularly well suited to label-free protein quantification. Because of their high complexity, the acquired data place special demands on the analysis software, and quantitative analysis of MSE/HDMSE/UDMSE data has so far been limited to a few commercial solutions. In this work, a strategy and a set of new methods for the cross-run quantitative analysis of label-free MSE/HDMSE/UDMSE data were developed and implemented as the software ISOQuant. The first steps of the data analysis (feature detection, peptide and protein identification) use the commercial software PLGS. The independent PLGS results of all runs of an experiment are then merged into a relational database and reprocessed with dedicated algorithms (retention-time alignment, feature clustering, multidimensional intensity normalization, multi-stage data filtering, protein inference, redistribution of the intensities of shared peptides, protein quantification). This post-processing significantly increases the reproducibility of the qualitative and quantitative results. To evaluate the performance of the quantitative data analysis and compare it with other solutions, a set of exactly defined hybrid-proteome samples was developed. The samples were acquired with MSE and UDMSE, analyzed with Progenesis QIP, synapter and ISOQuant, and compared. In contrast to synapter and Progenesis QIP, ISOQuant achieved both high reproducibility of protein identification and high precision and accuracy of protein quantification. In conclusion, the presented algorithms and analysis workflow enable reliable and reproducible quantitative data analyses. The software ISOQuant provides a simple and efficient tool for routine high-throughput analyses of label-free MSE/HDMSE/UDMSE data, and the hybrid-proteome samples and evaluation metrics constitute a comprehensive system for evaluating quantitative acquisition and data-analysis pipelines.
Abstract:
Satellite image classification involves designing and developing efficient image classifiers. With satellite image data and image analysis methods multiplying rapidly, selecting the right mix of data sources and data analysis approaches has become critical to the generation of quality land-use maps. In this study, a new postprocessing information fusion algorithm for the extraction and representation of land-use information based on high-resolution satellite imagery is presented. This approach can produce land-use maps with sharp interregional boundaries and homogeneous regions. The proposed approach consists of five steps. First, a GIS layer (ATKIS data) was used to generate two coarse homogeneous regions, i.e. urban and rural areas. Second, a thematic (class) map was generated using a hybrid spectral classifier combining the Gaussian Maximum Likelihood (GML) algorithm and the ISODATA classifier. Third, a probabilistic relaxation algorithm was applied to the thematic map, resulting in a smoothed thematic map. Fourth, edge detection and edge thinning techniques were used to generate a contour map with pixel-width interclass boundaries. Fifth, the contour map was superimposed on the thematic map by means of a region-growing algorithm, with the contour map and the smoothed thematic map as two constraints. For the operation of the proposed method, a software package was developed in the C programming language. This software package comprises the GML algorithm, a probabilistic relaxation algorithm, the TBL edge detector, an edge thresholding algorithm, a fast parallel thinning algorithm, and a region-growing information fusion algorithm. The county of Landau in the state of Rheinland-Pfalz, Germany, was selected as the test site. High-resolution IRS-1C imagery was used as the principal input data.
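A minimal sketch of a per-pixel Gaussian Maximum Likelihood (GML) decision rule, the spectral classifier combined with ISODATA in the second step above; the band values and class statistics are synthetic, not derived from the IRS-1C data or the ATKIS layer.

```python
# Minimal sketch of per-pixel Gaussian Maximum Likelihood classification.
# Class means and covariances are synthetic training statistics for illustration.
import numpy as np

rng = np.random.default_rng(2)

# Training statistics per class: mean vector and covariance across spectral bands
classes = {
    "urban":  (np.array([80.0, 60.0, 50.0]), np.diag([25.0, 20.0, 15.0])),
    "forest": (np.array([40.0, 70.0, 30.0]), np.diag([15.0, 25.0, 10.0])),
}

def gml_label(pixel):
    """Assign the class that maximizes the Gaussian log-likelihood of the pixel vector."""
    best, best_ll = None, -np.inf
    for name, (mu, cov) in classes.items():
        diff = pixel - mu
        ll = -0.5 * (np.log(np.linalg.det(cov)) + diff @ np.linalg.inv(cov) @ diff)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

pixels = rng.normal([75, 62, 48], 5, size=(4, 3))   # four synthetic three-band pixels
print([gml_label(p) for p in pixels])
```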
Abstract:
OBJECTIVE: Neurologically normal term infants sometimes present with repetitive, rhythmic myoclonic jerks that occur during sleep. The condition, which traditionally resolves by 3 months of age with no sequelae, is termed benign neonatal sleep myoclonus. The goal of this review was to synthesize the published literature on benign neonatal sleep myoclonus. METHODS: The US National Library of Medicine database and the Web-based search engine Google, through June 2009, were used as data sources. All articles published after the seminal description in 1982 as full-length articles or letters were collected. Reports published in languages other than English, French, German, Italian, Portuguese, or Spanish were not considered. RESULTS: We included 24 reports describing 164 term-born (96%) or near-term-born (4%) infants. Neonatal sleep myoclonus occurred in all sleep stages, disappeared after arousal, and was induced by rocking the infant or by repetitive sound stimuli. Furthermore, in affected infants, the jerks did not stop when the limbs were held and even worsened on medication with antiepileptic drugs. Finally, benign neonatal sleep myoclonus did not resolve by 3 months of age in one-third of the infants. CONCLUSIONS: This review provides new insights into the clinical features and natural course of benign neonatal sleep myoclonus. The most significant limitation of the review is the small number of reported cases.
Abstract:
Objective To examine the presence and extent of small study effects in clinical osteoarthritis research. Design Meta-epidemiological study. Data sources 13 meta-analyses including 153 randomised trials (41 605 patients) that compared therapeutic interventions with placebo or non-intervention control in patients with osteoarthritis of the hip or knee and used patients’ reported pain as an outcome. Methods We compared estimated benefits of treatment between large trials (at least 100 patients per arm) and small trials, explored funnel plots supplemented with lines of predicted effects and contours of significance, and used three approaches to estimate treatment effects: meta-analyses including all trials irrespective of sample size, meta-analyses restricted to large trials, and treatment effects predicted for large trials. Results On average, treatment effects were more beneficial in small than in large trials (difference in effect sizes −0.21, 95% confidence interval −0.34 to −0.08, P=0.001). Depending on criteria used, six to eight funnel plots indicated small study effects. In six of 13 meta-analyses, the overall pooled estimate suggested a clinically relevant, significant benefit of treatment, whereas analyses restricted to large trials and predicted effects in large trials yielded smaller non-significant estimates. Conclusions Small study effects can often distort results of meta-analyses. The influence of small trials on estimated treatment effects should be routinely assessed.
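A minimal sketch of the kind of large-versus-small trial comparison that underlies a small-study-effects check: pool effect sizes separately by trial size and compare the estimates; the effect sizes, standard errors and trial sizes are hypothetical placeholders, not data from the 13 meta-analyses.

```python
# Minimal sketch: fixed-effect pooling of effect sizes, stratified by trial size.
# All trial-level numbers below are hypothetical placeholders.
import numpy as np

def fixed_effect_pool(effects, ses):
    """Inverse-variance weighted pooled estimate and its standard error."""
    w = 1 / np.asarray(ses) ** 2
    est = np.sum(w * effects) / np.sum(w)
    return est, np.sqrt(1 / np.sum(w))

# (effect size, standard error, patients per arm) for hypothetical trials
trials = [(-0.60, 0.25, 40), (-0.45, 0.20, 60), (-0.15, 0.08, 150), (-0.10, 0.07, 200)]

small = [(e, se) for e, se, n in trials if n < 100]
large = [(e, se) for e, se, n in trials if n >= 100]

for label, group in (("small", small), ("large", large)):
    est, se = fixed_effect_pool([e for e, _ in group], [s for _, s in group])
    print(f"{label} trials: pooled effect {est:.2f} (SE {se:.2f})")
```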
Abstract:
Outside of relatively limited crash testing with large trucks, very little is known about the performance of traffic barriers subjected to real-world large truck impacts. The purpose of this study was to investigate real-world large truck impacts into traffic barriers to determine barrier crash involvement rates, the impact performance of barriers not specifically designed to redirect large trucks, and the real-world performance of large-truck-specific barriers. Data sources included the Fatality Analysis Reporting System (2000-2009), the General Estimates System (2000-2009) and 155 in-depth large truck-to-barrier crashes from the Large Truck Crash Causation Study. Large truck impacts with a longitudinal barrier were found to comprise 3 percent of all police-reported longitudinal barrier impacts and roughly the same proportion of barrier fatalities. Based on a logistic regression model predicting barrier penetration, the risk of penetration by a large truck was found to increase by a factor of 6 for impacts with barriers designed primarily for passenger vehicles. Although large-truck-specific barriers were found to perform better than barriers not designed for heavy vehicles, their penetration rate was found to be 17 percent. This penetration rate is of particular concern because the higher test level barriers are designed to protect other road users, not the occupants of the large truck. Surprisingly, barriers not specifically designed for large truck impacts were found to prevent large truck penetration approximately half of the time. This suggests that adding costlier higher test level barriers may not always be warranted, especially on roadways with lower truck volumes.
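A minimal sketch of a logistic regression predicting barrier penetration from barrier type, of the kind described above; the crash records, covariate and simulated effect are synthetic placeholders, not Large Truck Crash Causation Study data.

```python
# Minimal sketch of a binary logit for barrier penetration on synthetic crash records.
# The covariate and its effect are illustrative assumptions only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 155
passenger_vehicle_barrier = rng.binomial(1, 0.6, n)   # 1 = barrier not designed for large trucks
latent = -2.0 + 1.8 * passenger_vehicle_barrier
penetrated = rng.binomial(1, 1 / (1 + np.exp(-latent)))

X = sm.add_constant(passenger_vehicle_barrier)
fit = sm.Logit(penetrated, X).fit(disp=False)
print(f"odds ratio for passenger-vehicle barriers: {np.exp(fit.params[1]):.1f}")
```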