915 results for Secondary Data Analysis


Relevance: 90.00%

Abstract:

Visualization and exploratory analysis are important parts of any data analysis and are made more challenging when the data are voluminous and high-dimensional. One such example is environmental monitoring data, which are often collected over time and at multiple locations, resulting in a geographically indexed multivariate time series. Financial data, although not necessarily containing a geographic component, present another source of high-volume multivariate time series data. We present the mvtsplot function, which provides a method for visualizing multivariate time series data. We outline the basic design concepts and provide some examples of its usage by applying it to a database of ambient air pollution measurements in the United States and to a hypothetical portfolio of stocks.
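mvtsplot itself is an R function; purely as an illustration of the display concept it describes (one standardized, color-coded row per series with time on the horizontal axis), here is a minimal Python sketch. The site names and simulated data are hypothetical stand-ins for a monitoring network.

```python
# Illustrative sketch of the mvtsplot display concept (the real function is in R).
# Site names and simulated data are hypothetical.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
dates = pd.date_range("2006-01-01", periods=365, freq="D")
data = pd.DataFrame(rng.normal(size=(365, 8)).cumsum(axis=0),
                    index=dates, columns=[f"site_{i}" for i in range(8)])

# Standardize each series so color encodes within-series variation.
z = (data - data.mean()) / data.std()

fig, ax = plt.subplots(figsize=(10, 4))
im = ax.imshow(z.T.values, aspect="auto", cmap="RdBu_r")  # one heatmap row per series
ax.set_yticks(range(z.shape[1]))
ax.set_yticklabels(z.columns)
ax.set_xlabel("Day index")
fig.colorbar(im, ax=ax, label="z-score")
plt.show()
```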

Relevance: 90.00%

Abstract:

Many seemingly disparate approaches to marginal modeling have been developed in recent years. We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models proposed herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed-effects estimation and interpretation in the analysis of correlated binary data. Moreover, we propose a nomenclature and a set of model relationships that substantially elucidate the complex area of marginalized models for binary data. A diverse collection of didactic mathematical and numerical examples is given to illustrate the concepts.
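As a pointer to the latent threshold construction the abstract refers to, a standard representation (the notation below is ours and only illustrative of this class of models, not necessarily the paper's) writes each binary outcome as a thresholded latent variable whose within-cluster joint distribution is specified by a copula:

```latex
% Binary outcome as a thresholded latent variable (illustrative notation).
Y_{ij} = \mathbf{1}\{ Z_{ij} \le \Delta_{ij} \}, \qquad
\Pr(Y_{ij} = 1 \mid x_{ij}) = F(\Delta_{ij}), \qquad
\Delta_{ij} = x_{ij}^{\top}\beta .
% Within-cluster dependence is induced by a copula C on the latent variables:
\Pr(Z_{i1} \le z_1, \ldots, Z_{i n_i} \le z_{n_i})
  = C\!\big( F(z_1), \ldots, F(z_{n_i}) \big).
```

Each margin keeps the population-averaged interpretation $\Pr(Y_{ij}=1 \mid x_{ij}) = F(x_{ij}^{\top}\beta)$, while the copula $C$ alone governs the within-cluster correlation.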

Relevance: 90.00%

Abstract:

This paper proposes Poisson log-linear multilevel models to investigate population variability in sleep state transition rates. We specifically propose a Bayesian Poisson regression model that is more flexible, more scalable to larger studies, and more easily fit than other approaches in the literature. We further use hierarchical random effects to account for pairings of individuals and for repeated measures within those individuals, since comparing diseased to non-diseased subjects while minimizing bias is of epidemiologic importance. We estimate essentially non-parametric piecewise constant hazards and smooth them, and allow for time-varying covariates and segment-of-the-night comparisons. The Bayesian Poisson regression is justified through a re-derivation of a classical algebraic likelihood equivalence between Poisson regression with a log(time) offset and survival regression assuming piecewise constant hazards. This relationship allows us to synthesize two methods currently used to analyze sleep transition phenomena: stratified multi-state proportional hazards models and log-linear models with GEE for transition counts. An example data set from the Sleep Heart Health Study is analyzed.
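The classical equivalence invoked here can be sketched in one line (notation ours, not the paper's): with piecewise constant hazard $\lambda_j$ on interval $j$, event indicator $d_{ij}$, and time at risk $t_{ij}$, the survival likelihood contribution is

```latex
% Piecewise-exponential likelihood contribution vs. Poisson kernel.
L_{ij} = \lambda_j^{\,d_{ij}} \, e^{-\lambda_j t_{ij}}
  \;\propto\; \frac{\mu_{ij}^{\,d_{ij}} \, e^{-\mu_{ij}}}{d_{ij}!},
\qquad \mu_{ij} = \lambda_j t_{ij},
\qquad \log \mu_{ij} = \log \lambda_j + \log t_{ij},
```

where the omitted factor $t_{ij}^{\,d_{ij}}/d_{ij}!$ does not involve $\lambda_j$. The survival contribution thus has the same kernel as a Poisson model for the event indicator with a $\log(\text{time})$ offset, which is why piecewise-constant-hazard survival models can be fit with log-linear Poisson software.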

Relevance: 90.00%

Abstract:

BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature offers few standardized procedures for assessing ICR in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme: low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good coding practice and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
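As a concrete illustration of the kappa computation underlying such an assessment, here is a minimal Python sketch of Cohen's kappa, $\kappa = (p_o - p_e)/(1 - p_e)$. The code labels below are invented for the example and are not from the study.

```python
# Minimal sketch of Cohen's kappa for two coders on the same segments.
# The label sequences are invented for illustration.
from collections import Counter

def cohens_kappa(coder1, coder2):
    """kappa = (p_o - p_e) / (1 - p_e): observed vs. chance-expected agreement."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: fraction of segments coded identically.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected agreement if the coders coded independently with their own marginals.
    c1, c2 = Counter(coder1), Counter(coder2)
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / n ** 2
    return (p_o - p_e) / (1 - p_e)

coder1 = ["pain", "coping", "pain", "work", "coping", "pain"]
coder2 = ["pain", "coping", "work", "work", "coping", "pain"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.75 for this toy example
```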

Relevance: 90.00%

Abstract:

In-service hardened concrete pavement suffers from environmental loadings caused by curling and warping of the slab. Traditionally, these loadings are computed by treating the slab as an elastic material and by evaluating the curling and warping components separately. This dissertation simulates the temperature and moisture distributions through the slab using a numerical model, developed here, that couples heat transfer and moisture transport. The computation of environmental loadings treats the slab as a viscoelastic material, accounting for the relaxation behavior and the Pickett effect of the concrete. The heat transfer model considers the effects of solar radiation, wind speed, air temperature, and slab albedo, among other factors, on the pavement temperature distribution. This dissertation assesses the differences between documented models that aim to predict pavement temperature, highlighting their pros and cons. The moisture transport model is unique among documented models in that it mimics the wetting and drying events occurring at the slab surface. These events are estimated by a proposed statistical algorithm, which is verified against field rainfall data. Analysis of the predicted results examines the roles of local air RH (relative humidity), wind speed, and rainfall pattern in the moisture distribution through the slab. The findings reveal that seasonal air RH plays a decisive role in the slab's moisture distribution, whereas wind speed and its daily variation, daily RH variation, and seasonal rainfall pattern play only secondary roles. This dissertation sheds light on the computation of the environmental loadings that in-service pavement slabs suffer. Analysis of the computed stresses centers on the stress relaxation near the surface, the stress evolution after curing ends, and the impact of construction season on the stress magnitude. An unexpected finding is that the total environmental loadings at the cyclically stable state diverge from the thermal stresses: at that state, the total stress during the daytime is roughly equal to the thermal stress, whereas the total stress during the nighttime is far greater than the thermal stress. An explanation for this phenomenon is that during the night hours, the decline of the slab's near-surface temperature leads to a drop in the near-surface RH; this RH drop causes contraction there and develops additional tensile stresses. The dissertation thus argues that estimating environmental loadings by computing only the thermally induced stresses may yield misleading results. It recommends that the total environmental loadings of in-service slabs be estimated by a model coupling both the moisture and temperature components.
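The dissertation's coupled model is not reproduced here. Purely as an illustration of the kind of numerical kernel such a model rests on, the following Python sketch solves 1-D transient heat conduction through a slab with an explicit finite-difference scheme; the material properties, boundary cycle, and grid are placeholders, not the dissertation's values.

```python
# Illustrative 1-D explicit finite-difference heat conduction through a slab.
# All material and boundary values are placeholders, not the dissertation's model.
import numpy as np

alpha = 1.0e-6             # thermal diffusivity of concrete, m^2/s (placeholder)
L, nx = 0.25, 51           # slab thickness (m) and number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx ** 2 / alpha # explicit stability requires dt <= dx^2 / (2*alpha)

T = np.full(nx, 20.0)      # initial slab temperature, deg C
steps = int(24 * 3600 / dt)
for n in range(steps):
    t = n * dt
    # Sinusoidal surface temperature mimicking a day/night cycle.
    T[0] = 20.0 + 10.0 * np.sin(2 * np.pi * t / 86400.0)
    T[-1] = 20.0           # bottom of the slab held at constant temperature
    # Explicit update of the interior nodes.
    T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print("near-surface vs. mid-depth temperature:", T[1], T[nx // 2])
```

A production model of the kind described would add solar radiation and wind-dependent convection at the surface and couple this solver to a moisture transport equation; the sketch shows only the conduction core.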

Relevance: 90.00%

Abstract:

Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
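A minimal sketch of the ROS idea for left-censored measurements is given below, assuming lognormal data and a single detection limit; this is an illustration of the general technique, not the study's exact procedure (full ROS uses Helsel-Cohn plotting positions to handle multiple detection limits).

```python
# Minimal sketch of regression on order statistics (ROS) for values below a
# detection limit, assuming lognormal data. Illustration only, not the study's code.
import numpy as np
from scipy import stats

def ros_summary(values, detected):
    """values: measurements, with nondetects recorded at the detection limit;
    detected: boolean mask, True where the value was above the limit."""
    n = len(values)
    order = np.argsort(values)
    # Blom plotting positions over all observations (simplified; full ROS uses
    # Helsel-Cohn positions when there are multiple detection limits).
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)
    q = stats.norm.ppf(pp)
    logs = np.log(values[order])
    det = detected[order]
    # Fit the lognormal line to the detected observations only...
    slope, intercept, *_ = stats.linregress(q[det], logs[det])
    # ...then impute the censored observations from the fitted line.
    imputed = logs.copy()
    imputed[~det] = intercept + slope * q[~det]
    x = np.exp(imputed)
    return x.mean(), np.percentile(x, 90)

vals = np.array([0.05, 0.05, 0.05, 0.08, 0.12, 0.20, 0.35, 0.60, 1.10])
det = np.array([False, False, False, True, True, True, True, True, True])
mean, p90 = ros_summary(vals, det)
print(f"arithmetic mean {mean:.3f}, 90th percentile {p90:.3f}")
```

Compared with the naïve approach of substituting the detection limit itself, the imputed nondetects fall below the limit, which is what preserves the spread of the low end of the distribution.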

Relevance: 90.00%

Abstract:

Turrialba is one of the largest and most active stratovolcanoes in the Central Cordillera of Costa Rica and an excellent target for validating satellite data with ground-based measurements, owing to its high elevation, relative ease of access, and persistent elevated SO2 degassing. The Ozone Monitoring Instrument (OMI) aboard the Aura satellite makes daily global observations of atmospheric trace gases and is used in this investigation to obtain volcanic SO2 retrievals in the Turrialba volcanic plume. We present and evaluate the relative accuracy of two OMI SO2 data analysis procedures: the automatic Band Residual Index (BRI) technique and the manual Normalized Cloud-mass (NCM) method. We find a linear correlation and good quantitative agreement between SO2 burdens derived from the BRI and NCM techniques, with an improved correlation when wet-season data are excluded. We also present the first comparisons between volcanic SO2 emission rates obtained from ground-based mini-DOAS measurements at Turrialba and three new OMI SO2 data analysis techniques: the MODIS smoke estimation, OMI SO2 lifetime, and OMI SO2 transect techniques. A robust validation of OMI SO2 retrievals was made, with both qualitative and quantitative agreement under specific atmospheric conditions, demonstrating the utility of satellite measurements for estimating accurate SO2 emission rates and for monitoring passively degassing volcanoes.

Relevance: 90.00%

Abstract:

After teaching regular education secondary mathematics for seven years, I accepted a position in an alternative education high school. Over the next four years, the State of Michigan adopted new graduation requirements, phasing in a mandate for all students to complete Geometry and Algebra 2 courses. Since many of my students were already struggling in Algebra 1, getting them through Geometry and Algebra 2 seemed like a daunting task. To better instruct my students, I wanted to know how other teachers in similar situations were addressing the new High School Content Expectations (HSCEs) in upper-level mathematics. This study examines how thoroughly alternative education teachers in Michigan are addressing the HSCEs in their courses, what approaches they have found most effective, and what issues are preventing teachers and schools from successfully implementing the HSCEs. Twenty-six alternative high school educators completed an online survey that included a variety of questions regarding school characteristics, curriculum alignment, and implementation approaches and issues. Follow-up phone interviews were conducted with four of these participants. The survey responses were used to categorize schools as successful, unsuccessful, or neutral in terms of meeting the HSCEs. Responses from schools in each category were compared to identify common approaches and issues and to identify significant differences between the groups. Data analysis showed that successful schools taught more of the HSCEs through a variety of instructional approaches, with an emphasis on varying the ways students learned the material. Individualized instruction was frequently mentioned by successful schools and was strikingly absent from unsuccessful schools' responses. The main obstacle to successful implementation of the HSCEs identified in the study was gaps in student knowledge, which in turn made the pace of instruction a significant issue. School representatives were fairly united in the view that the Algebra 2 graduation requirement was not appropriate for all alternative education students. Possible implications of these findings are discussed.

Relevance: 90.00%

Abstract:

Principal Component Analysis (PCA) is a popular method for dimension reduction used in many fields, including data compression, image processing, and exploratory data analysis. However, traditional PCA has two drawbacks in this setting: it is inefficient for high-dimensional data, and it cannot compute sufficiently accurate principal components when a relatively large portion of the data is missing. In this report, we propose the EM-PCA method for dimension reduction of power system measurement data with missing values, and we provide a comparative study of the traditional PCA and EM-PCA methods. Our extensive experimental results show that EM-PCA is more effective and more accurate than traditional PCA for dimension reduction of power system measurement data when a large portion of the data set is missing.
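As an illustration of the general EM-style approach evaluated here (not necessarily the report's exact implementation), a minimal Python sketch alternates a low-rank SVD fit with re-imputation of the missing entries until the imputed values stabilize:

```python
# Minimal EM-style PCA for a data matrix with missing entries: alternate a
# rank-k reconstruction with re-imputation of the missing cells. Illustrative only.
import numpy as np

def em_pca(X, k, n_iter=100, tol=1e-6):
    X = X.astype(float)                       # work on a copy
    missing = np.isnan(X)
    # Initialization (E-step): fill missing cells with column means.
    col_means = np.nanmean(X, axis=0)
    X[missing] = np.take(col_means, np.nonzero(missing)[1])
    for _ in range(n_iter):
        # M-step: rank-k reconstruction around the current column means.
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        X_hat = mu + (U[:, :k] * s[:k]) @ Vt[:k]
        # E-step: replace only the missing cells with the reconstruction.
        delta = np.max(np.abs(X[missing] - X_hat[missing])) if missing.any() else 0.0
        X[missing] = X_hat[missing]
        if delta < tol:                       # imputations have converged
            break
    return X, Vt[:k]                          # completed matrix, principal directions

# Example: a rank-3 matrix with roughly 20% of entries removed.
rng = np.random.default_rng(1)
true = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 12))
X = true.copy()
X[rng.random(true.shape) < 0.2] = np.nan
completed, components = em_pca(X, k=3)
print("max reconstruction error:", np.abs(completed - true).max())
```

The key design point is that observed entries are never overwritten: the low-rank model is refit to the full matrix each iteration, but only the missing cells are updated, which is what lets the method exploit the correlation structure that plain mean-imputation PCA ignores.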