12 results for "Data quality problems"
Abstract:
Background: SPARCLE is a cross-sectional survey in nine European regions, examining the relationship of the environment of children with cerebral palsy to their participation and quality of life. The objective of this report is to assess data quality, in particular heterogeneity between regions, family and item non-response and potential for bias. Methods: 1,174 children aged 8–12 years were selected from eight population-based registers of children with cerebral palsy; one further centre recruited 75 children from multiple sources. Families were visited by trained researchers who administered psychometric questionnaires. Logistic regression was used to assess factors related to family non-response and self-completion of questionnaires by children. Results: 431/1,174 (37%) families identified from registers did not respond: 146 (12%) were not traced; of the 1,028 traced families, 250 (24%) declined to participate and 35 (3%) were not approached. Families whose disabled children could walk unaided were more likely to decline to participate. 818 children entered the study, of whom 500 (61%) self-reported their quality of life; children with low IQ, seizures or inability to walk were less likely to self-report. There was substantial heterogeneity between regions in response rates and socio-demographic characteristics of families, but not in age or gender of children. Item non-response was 2% for children and ranged from 0.4% to 5% for questionnaires completed by parents. Conclusion: While the proportion of untraced families was higher than in similar surveys, the refusal rate was comparable. To reduce bias, all analyses should allow for region, walking ability, age and socio-demographic characteristics. The 75 children in the region without a population-based register are unlikely to introduce bias.
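As a toy illustration of the non-response modelling described above, the sketch below fits a logistic regression of "declined to participate" on one binary covariate. All data are simulated here (the covariate coding, effect size and sample are invented, not the SPARCLE data), and the fit uses plain Newton-Raphson rather than any particular statistics package:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated families: walks=1 means the child walks unaided.
# The positive true coefficient mimics the reported finding that
# such families were more likely to decline (values are invented).
n = 2000
walks = rng.integers(0, 2, n)
true_logit = -1.5 + 0.8 * walks          # log-odds of declining
declined = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# Fit logistic regression by Newton-Raphson.
X = np.column_stack([np.ones(n), walks])
beta = np.zeros(2)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))     # fitted probabilities
    grad = X.T @ (declined - mu)         # score vector
    hess = X.T @ (X * (mu * (1 - mu))[:, None])  # observed information
    beta += np.linalg.solve(hess, grad)

odds_ratio = np.exp(beta[1])
print(f"odds ratio of declining (walks unaided vs not): {odds_ratio:.2f}")
```

The estimated odds ratio should recover something near exp(0.8) ≈ 2.2 on this synthetic sample.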
Abstract:
A new universal power quality manager is proposed that treats a number of power quality problems simultaneously. The universal manager comprises combined series and shunt three-phase PWM-controlled converters sharing a common DC link. A control scheme based on fuzzy logic is introduced, and the general features of the design and operation processes are outlined. The performance of two configurations of the proposed power quality manager is compared in terms of a recently formulated unified power quality index. The validity and integrity of the proposed system are proved through computer-simulated experiments.
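The abstract does not give the rule base, so as a minimal sketch of the fuzzy-logic control idea, the following toy controller maps a per-unit voltage error onto a compensating injection level using three triangular membership functions and weighted-average defuzzification. The breakpoints and output singletons are invented for illustration, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_injection(verr):
    """Map a per-unit voltage error to an injection command using three
    rules (negative/zero/positive error) and centroid defuzzification."""
    w_neg = tri(verr, -0.3, -0.15, 0.0)   # rule: error negative -> boost
    w_zero = tri(verr, -0.15, 0.0, 0.15)  # rule: error near zero -> hold
    w_pos = tri(verr, 0.0, 0.15, 0.3)     # rule: error positive -> buck
    outs = (0.2, 0.0, -0.2)               # singleton outputs per rule
    ws = (w_neg, w_zero, w_pos)
    s = sum(ws)
    return sum(w * o for w, o in zip(ws, outs)) / s if s else 0.0

print(fuzzy_injection(-0.1))  # a sag yields a positive (boost) command
```

A real implementation would feed this command to the series converter's PWM reference; here it simply illustrates the rule-firing and defuzzification steps.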
Abstract:
In complex hydrogeological environments, the effective management of groundwater quality problems by pump-and-treat operations can be most confidently achieved if the mixing dynamics induced within the aquifer by pumping are well understood. The utility of isotopic environmental tracers (C-, H-, O- and S-stable isotopic analyses and the age indicators 14C and 3H) for this purpose is illustrated by the analysis of a pumping test in an abstraction borehole drilled into flooded, abandoned coal mineworkings at Deerplay (Lancashire, UK). Interpretation of the isotope data was undertaken conjunctively with that of major-ion hydrochemistry, in the context of the particular hydraulic setting of flooded mineworkings, to identify the sources and mixing of water qualities in the groundwater system. Initial pumping showed breakdown of the initial water quality stratification in the borehole, and gave evidence for distinctive isotopic signatures (δ34S(SO4) ≈ −1.6‰, δ18O(SO4) ≈ +15‰) associated with primary oxidation of pyrite in the zone of water table fluctuation: the first time this phenomenon has been successfully characterized by these isotopes in a flooded mine system. The overall aim of the test pumping, to replace an uncontrolled outflow from a mine entrance in an inconvenient location with a pumped discharge on a site where treatment could be provided, was swiftly achieved. Environmental tracing data illustrated the benefits of pumping as little as possible to attain this aim, as higher rates of pumping induced in-mixing of poorer quality waters from more distant old workings, and/or renewed pyrite oxidation in the shallow subsurface.
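The arithmetic behind resolving water sources with a conservative tracer is simple two-endmember mixing. The sketch below uses the δ34S(SO4) pyrite-oxidation signature quoted in the abstract (−1.6‰) as one endmember; the second endmember and the sample value are invented for illustration:

```python
def mixing_fraction(c_sample, c_end1, c_end2):
    """Fraction f of end-member 1 in a two-component mix of a
    conservative tracer, from c_sample = f*c_end1 + (1-f)*c_end2."""
    return (c_sample - c_end2) / (c_end1 - c_end2)

# d34S(SO4) in per mil: end1 = shallow pyrite-oxidation water (-1.6,
# from the abstract); end2 = hypothetical deeper mine water (+2.0);
# the sample value (-0.5) is likewise hypothetical.
f = mixing_fraction(c_sample=-0.5, c_end1=-1.6, c_end2=2.0)
print(f"fraction of shallow pyrite-oxidation water: {f:.2f}")
```

With several independent tracers one gets an over-determined system, which is what makes the conjunctive isotope-plus-hydrochemistry interpretation robust.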
Abstract:
In many environmental valuation applications, standard sample sizes for choice modelling surveys are impractical to achieve. One can improve data quality by using more in-depth surveys administered to fewer respondents. We report on a study using high-quality rank-ordered data elicited with the best-worst approach. The resulting "exploded logit" choice model, estimated on 64 responses per person, was used to study visitors' willingness to pay (WTP) for the external benefits of policies which maintain the cultural heritage of alpine grazing commons. We find evidence supporting this approach and reasonable estimates of mean WTP, which appear theoretically valid and policy informative. © The Author (2011).
Abstract:
Perfect information is seldom available to humans or machines, owing to the uncertainties inherent in real-world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information, and it is necessary for GIS to develop tools and techniques to manage these uncertainties. There is widespread agreement in the GIS community that although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by a lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.
Abstract:
There is a significant lack of indoor air quality research in low-energy homes. This study compared the indoor air quality of eight newly built case study homes constructed to similar levels of air-tightness and insulation, with two different ventilation strategies (four homes with Mechanical Ventilation with Heat Recovery (MVHR) systems/Code level 4 and four homes naturally ventilated/Code level 3). Indoor air quality measurements were conducted over a 24 h period in the living room and main bedroom of each home during the summer and winter seasons. Simultaneous outside measurements and an occupant diary were also employed during the measurement period. Occupant interviews were conducted to gain information on perceived indoor air quality, occupant behaviour and building-related illnesses. Knowledge of the MVHR system, including ventilation-related behaviour, was also studied. Results suggest indoor air quality problems in both the mechanically ventilated and naturally ventilated homes, with significant issues identified regarding occupant use in the social homes.
Abstract:
Although data quality and weighting decisions impact the outputs of reserve selection algorithms, these factors have not been closely studied. We examine these methodological issues in the use of reserve selection algorithms by comparing: (1) quality of input data and (2) use of different weighting methods for prioritizing among species. In 2003, the government of Madagascar, a global biodiversity hotspot, committed to tripling the size of its protected area network to protect 10% of the country’s total land area. We apply the Zonation reserve selection algorithm to distribution data for 52 lemur species to identify priority areas for the expansion of Madagascar’s reserve network. We assess the similarity of the areas selected, as well as the proportions of lemur ranges protected in the resulting areas when different forms of input data were used: extent of occurrence versus refined extent of occurrence. Low overlap between the areas selected suggests that refined extent of occurrence data are highly desirable, and to best protect lemur species, we recommend refining extent of occurrence ranges using habitat and altitude limitations. Reserve areas were also selected for protection based on three different species weighting schemes, resulting in marked variation in proportional representation of species among the IUCN Red List of Threatened Species extinction risk categories. This result demonstrates that assignment of species weights influences whether a reserve network prioritizes maximizing overall species protection or maximizing protection of the most threatened species.
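To make the weighting issue concrete, the toy sketch below selects cells by greatest marginal weighted species gain. This greedy additive scoring is a simplified stand-in, not the Zonation algorithm itself (Zonation works by iteratively removing the least valuable cells), and the presence data and weights are invented:

```python
# Hypothetical cell -> species presence data (refined extent of
# occurrence would shrink these sets relative to raw range maps).
presence = {
    "A": {"lemur1", "lemur2"},
    "B": {"lemur2", "lemur3"},
    "C": {"lemur3"},
    "D": {"lemur1", "lemur4"},
}
# Invented weights: higher = more threatened (e.g. IUCN category).
weights = {"lemur1": 1, "lemur2": 1, "lemur3": 2, "lemur4": 3}

def greedy_select(presence, weights, n_cells):
    """Pick n_cells cells, each time taking the cell whose not-yet-
    covered species carry the greatest total weight."""
    covered, chosen = set(), []
    for _ in range(n_cells):
        cell = max(
            (c for c in presence if c not in chosen),
            key=lambda c: sum(weights[s] for s in presence[c] - covered),
        )
        chosen.append(cell)
        covered |= presence[cell]
    return chosen, covered

chosen, covered = greedy_select(presence, weights, 2)
print(chosen, sorted(covered))
```

Changing the weight vector changes which cells win ties and which species end up represented, which is the sensitivity the paper demonstrates at realistic scale.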
Abstract:
The ecological footprint is now a widely accepted indicator of sustainable development. Footprinting translates resource consumption into the land area required to sustain it, and allows an average per capita footprint for a region or nation to be compared with the global average. This paper reports on a project in which footprints were calculated for two Irish cities, namely Belfast in Northern Ireland and Limerick in the Republic of Ireland, for the year 2001. As is frequently the case at sub-national scale, data quality and availability were often problematic, and in general data gaps were filled by means of population proxies or national averages. A range of methods was applied to convert resource flows to land areas. Both footprints suggest that the lifestyles of citizens of the cities use several times more land than their global share, as has been found for other cities.
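The core footprint calculation is a weighted sum of resource flows times land-conversion factors, divided by population. The sketch below uses entirely invented round numbers; the 1.8 gha global per-capita share is an assumed, commonly quoted circa-2000 biocapacity figure, not a value from this paper:

```python
# Hypothetical city inputs (all values invented for illustration).
population = 500_000
consumption_t = {"food": 400_000, "energy": 900_000}  # tonnes per year
land_per_t = {"food": 1.2, "energy": 0.9}             # gha per tonne

# Footprint = sum over resource flows of (flow x conversion factor).
footprint_gha = sum(consumption_t[k] * land_per_t[k] for k in consumption_t)
per_capita = footprint_gha / population

GLOBAL_SHARE_GHA = 1.8  # assumed global per-capita biocapacity
print(f"per-capita footprint: {per_capita:.2f} gha "
      f"({per_capita / GLOBAL_SHARE_GHA:.1f}x the global share)")
```

In practice the hard part is not this arithmetic but, as the abstract notes, obtaining the sub-national flow data, hence the proxies and national averages.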
Abstract:
We present an analysis of hard X-ray features in the spectrum of the bright Sy 1 galaxy Mrk 335 observed by the XMM-Newton satellite. Our analysis confirms the presence of a broad, ionized Fe Kα emission line in the spectrum, first found by Gondoin et al. The broad line can be modelled successfully by relativistic accretion disc reflection models. This interpretation is unusually robust in the case of Mrk 335 because of the lack of any ionized ('warm') absorber and the absence of a clear narrow core to the line. Partial covering by neutral gas cannot, however, be ruled out statistically as the origin of the broad residuals. Regardless of the underlying continuum we report, for the first time in this source, the detection of a narrow absorption feature at the rest frame energy of ~5.9 keV. If the feature is identified with a resonance absorption line of iron in a highly ionized medium, the redshift of the line corresponds to an inflow velocity of ~0.11-0.15c. We present a simple model for the inflow, accounting approximately for relativistic and radiation pressure effects, and use Monte Carlo methods to compute synthetic spectra for qualitative comparison with the data. This modelling shows that the absorption feature can plausibly be reproduced by infalling gas provided that the feature is identified with Fe xxvi. We require the inflowing gas to extend over a limited range of radii at a few tens of r to match the observed feature. The mass accretion rate in the flow corresponds to 60 per cent of the Eddington limit, in remarkable agreement with the observed rate. The narrowness of the absorption line tends to argue against a purely gravitational origin for the redshift of the line, but given the current data quality we stress that such an interpretation cannot be ruled out. © 2006 The Authors.
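The back-of-envelope step linking the ~5.9 keV feature to an inflow velocity is the longitudinal relativistic Doppler formula. The sketch below is a purely kinematic estimate: it ignores the gravitational redshift and radiation-pressure effects included in the paper's modelling, so it lands somewhat above the quoted ~0.11-0.15c range (the 6.97 keV Fe XXVI rest energy is a standard value, assumed here):

```python
def infall_speed(e_obs_kev, e_rest_kev):
    """Line-of-sight speed (in units of c) implied by a redshifted
    line, from the special-relativistic longitudinal Doppler formula
    E_obs = E_rest * sqrt((1 - beta) / (1 + beta))."""
    r2 = (e_obs_kev / e_rest_kev) ** 2
    return (1 - r2) / (1 + r2)

# Rest-frame ~5.9 keV feature identified with Fe XXVI (6.97 keV):
beta = infall_speed(5.9, 6.97)
print(f"purely kinematic infall speed ~ {beta:.2f} c")
```

The gap between this estimate and the published range is exactly why the paper cannot rule out a partly gravitational origin for the redshift.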
Abstract:
The techniques of principal component analysis (PCA) and partial least squares (PLS) are introduced from the point of view of providing a multivariate statistical method for modelling process plants. The advantages and limitations of PCA and PLS are discussed from the perspective of the type of data and problems that might be encountered in this application area. These concepts are exemplified by two case studies, dealing first with data from a continuous stirred tank reactor (CSTR) simulation and second with a literature source describing a low-density polyethylene (LDPE) reactor simulation.
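A minimal sketch of PCA-based process monitoring of the kind this abstract describes, on simulated correlated "sensor" data (PLS is omitted; the Hotelling's T2 and squared-prediction-error statistics used here are the standard monitoring pair, not necessarily the paper's exact case-study setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated plant data: 5 measured variables driven by 2 latent
# factors plus noise, a stand-in for correlated CSTR sensor readings.
n, k = 200, 2
latent = rng.normal(size=(n, k))
loadings = rng.normal(size=(k, 5))
X = latent @ loadings + 0.1 * rng.normal(size=(n, 5))

# PCA on mean-centred data via SVD; keep k components.
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:k].T                   # retained loadings (5 x 2)
var = (s[:k] ** 2) / (n - 1)   # variance of each retained score

def t2_spe(x):
    """Hotelling's T^2 (distance within the model plane) and SPE
    (squared residual off the plane) for one new sample."""
    t = (x - mu) @ P
    t2 = float(np.sum(t ** 2 / var))
    resid = (x - mu) - t @ P.T
    return t2, float(resid @ resid)

t2_ok, spe_ok = t2_spe(X[0])                                  # in-control
t2_bad, spe_bad = t2_spe(X[0] + np.array([0, 0, 5.0, 0, 0]))  # sensor fault
print(f"normal: T2={t2_ok:.2f} SPE={spe_ok:.3f}; "
      f"faulty: T2={t2_bad:.2f} SPE={spe_bad:.3f}")
```

A fault on one sensor breaks the correlation structure the PCA model learned, so it shows up mainly as a large SPE, which is the basic mechanism behind multivariate statistical process monitoring.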