936 results for Data quality problems
Abstract:
This study investigates the relationships between work stressors and organizational performance in terms of the quality of care provided by long-term care facilities. Work stressors are first examined in relation to the units' structural factors, resident characteristics, and unit specialization. The study then investigates the associations of work stressors such as job demands or time pressure, role ambiguity, resident-related stress, and procedural injustice with organizational performance. The moderating effect of job control on the relationship between job demands and organizational performance is also examined. The study was carried out at the National Research and Development Centre for Welfare and Health (STAKES). Survey data were drawn from 1194 nursing employees in 107 residential-home and health-center inpatient units in 1999 and from 977 employees in 91 units in 2002. Information on unit resident characteristics and the quality of care was provided by the Resident Assessment Instrument (RAI). The results showed that large unit size and low staffing levels were not consistently related to work stressors, whereas impairments in residents' physical functioning in particular created stressful working conditions for employees. However, unit specialization in the care of dementia and psychiatric residents was found to buffer the effects of resident characteristics on employee appraisals of work stressors, in that a high proportion of behavioral problems was related to less time pressure and fewer role conflicts for employees in specialized units. Unit specialization was also related to improved team climates and greater organizational commitment among employees. Work stressors were associated with problems in care quality. Time pressure explained most of the between-unit differences in how employees perceived the quality of the physical and psychosocial care they provide for residents. A high level of job demands in the unit was also found to be related to increases in all of the clinical quality problems examined. High job control buffered the effects of job demands on the quality of care in terms of the use of restraints on elderly residents. Physical restraint and especially antipsychotic drug use were less prevalent in units that combined high job demands with high job control for employees. In contrast, in high-strain units, where heavy job demands coincided with a lack of control for employees, quality was poor in terms of the frequent use of physical restraints. In addition, procedural injustice was related to the frequent use of antianxiety or hypnotic drugs for elderly residents. The results suggest that both job control and procedural justice may have improved employees' abilities to cope when caring for elderly residents, resulting in better organizational performance.
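The buffering (moderation) effect described above is conventionally tested by adding a demands-by-control interaction term to a regression model. The following is a minimal sketch only, with hypothetical variable names and toy data rather than the study's actual analysis:

```python
# Sketch of a moderated regression: does job control buffer the effect of
# job demands on a quality outcome? All variable names and values are
# hypothetical, for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

units = pd.DataFrame({
    "restraint_use": [0.12, 0.30, 0.08, 0.25, 0.18, 0.05],
    "job_demands":   [3.1, 4.2, 2.8, 4.5, 3.9, 2.5],
    "job_control":   [3.8, 2.1, 4.0, 2.3, 3.0, 4.2],
})

# The demands x control interaction term carries the buffering hypothesis:
# a negative coefficient means high control weakens the demands effect.
model = smf.ols("restraint_use ~ job_demands * job_control", data=units).fit()
print(model.summary())
```

A negative interaction coefficient would indicate that the demands effect weakens as control rises, which is the pattern the abstract describes for restraint use.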
Abstract:
Pyridoxal kinase (PdxK; EC 2.7.1.35) belongs to the phosphotransferase family of enzymes and catalyzes the conversion of the three active forms of vitamin B-6 (pyridoxine, pyridoxal and pyridoxamine) to their phosphorylated forms, thereby playing a key role in pyridoxal 5'-phosphate salvage. In the present study, pyridoxal kinase from Salmonella typhimurium was cloned and overexpressed in Escherichia coli, purified using Ni-NTA affinity chromatography and crystallized. X-ray diffraction data were collected to 2.6 angstrom resolution at 100 K. The crystal belonged to the primitive orthorhombic space group P2(1)2(1)2(1), with unit-cell parameters a = 65.11, b = 72.89, c = 107.52 angstrom. The data quality obtained by routine processing was poor owing to the presence of strong diffraction rings caused by polycrystalline material from an unknown small molecule in all oscillation images. Excluding the reflections close to the powder/polycrystalline rings provided data of sufficient quality for structure determination. A preliminary structure solution has been obtained by molecular replacement with the Phaser program in the CCP4 suite, using E. coli pyridoxal kinase (PDB entry 2ddm) as the phasing model. Further refinement and analysis of the structure are likely to provide valuable insights into catalysis by pyridoxal kinases.
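The ring-exclusion step lends itself to a simple illustration: reject any reflection whose resolution falls within a tolerance of a contaminating powder ring. A sketch of the idea, with made-up ring positions and tolerance rather than the values used in the study:

```python
# Sketch of the reflection-filtering idea: drop observations whose resolution
# (d-spacing) falls within a tolerance of a contaminating powder ring.
# Ring positions and tolerance are illustrative, not from the study.
ring_d_spacings = [3.04, 2.28, 1.91]   # hypothetical ring positions (angstrom)
tolerance = 0.02                        # hypothetical exclusion half-width

def keep_reflection(d_spacing: float) -> bool:
    """Return True unless the reflection lies on a powder ring."""
    return all(abs(d_spacing - ring) > tolerance for ring in ring_d_spacings)

reflections = [(1, 0, 0, 9.5), (2, 3, 1, 3.05), (4, 1, 2, 1.75)]  # (h, k, l, d)
clean = [r for r in reflections if keep_reflection(r[3])]
```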
Abstract:
Displacement estimation is a key step in the evaluation of tissue elasticity by quasistatic strain imaging. An efficient approach may incorporate a tracking strategy whereby each estimate is initially obtained from its neighbours' displacements and then refined through a localized search. This increases the accuracy and reduces the computational expense compared with exhaustive search. However, simple tracking strategies fail when the target displacement map exhibits complex structure. For example, there may be discontinuities and regions of indeterminate displacement caused by decorrelation between the pre- and post-deformation radio frequency (RF) echo signals. This paper introduces a novel displacement tracking algorithm, with a search strategy guided by a data quality indicator. Comparisons with existing methods show that the proposed algorithm is more robust when the displacement distribution is challenging.
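Although the paper's specific algorithm is not reproduced here, the general quality-guided tracking strategy can be sketched as a priority-queue region-growing loop in which the highest-quality estimates seed their neighbours first, so errors in decorrelated regions do not propagate. A minimal sketch under that assumption:

```python
# Sketch of quality-guided displacement tracking: estimates with the highest
# data-quality score (e.g. a correlation coefficient) seed their neighbours
# first. This illustrates the general strategy only, not the paper's method.
import heapq

def track(seeds, estimate, neighbours):
    """seeds: list of (quality, location, displacement) starting points.
    estimate(loc, guess) -> (quality, displacement): refine a guess locally.
    neighbours(loc) -> iterable of adjacent locations."""
    done = {}
    heap = [(-q, loc, d) for q, loc, d in seeds]  # max-heap via negation
    heapq.heapify(heap)
    while heap:
        neg_q, loc, disp = heapq.heappop(heap)
        if loc in done:
            continue
        done[loc] = disp
        for nbr in neighbours(loc):
            if nbr not in done:
                q, d = estimate(nbr, disp)   # guided, localized search
                heapq.heappush(heap, (-q, nbr, d))
    return done
```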
Abstract:
The mapping and geospatial analysis of benthic environments are multidisciplinary tasks that have become more accessible in recent years because of advances in technology and cost reductions in survey systems. The complex relationships that exist among physical, biological, and chemical seafloor components require advanced, integrated analysis techniques to enable scientists and others to visualize patterns and, in so doing, allow inferences to be made about benthic processes. Effective mapping, analysis, and visualization of marine habitats are particularly important because the subtidal seafloor environment is not readily viewed directly by eye. Research in benthic environments relies heavily, therefore, on remote sensing techniques to collect effective data. Because many benthic scientists are not mapping professionals, they may not adequately consider the links between data collection, data analysis, and data visualization. Projects often start with clear goals, but may be hampered by the technical details and skills required for maintaining data quality through the entire process from collection through analysis and presentation. The lack of technical understanding of the entire data handling process can represent a significant impediment to success. While many benthic mapping efforts have detailed their methodology as it relates to the overall scientific goals of a project, only a few published papers and reports focus on the analysis and visualization components (Paton et al. 1997, Weihe et al. 1999, Basu and Saxena 1999, Bruce et al. 1997). In particular, the benthic mapping literature often briefly describes data collection and analysis methods, but fails to provide sufficiently detailed explanation of particular analysis techniques or display methodologies so that others can employ them. In general, such techniques are in large part guided by the data acquisition methods, which can include both aerial and water-based remote sensing methods to map the seafloor without physical disturbance, as well as physical sampling methodologies (e.g., grab or core sampling). The terms benthic mapping and benthic habitat mapping are often used synonymously to describe seafloor mapping conducted for the purpose of benthic habitat identification. There is a subtle yet important difference, however, between general benthic mapping and benthic habitat mapping. The distinction is important because it dictates the sequential analysis and visualization techniques that are employed following data collection. In this paper general seafloor mapping for identification of regional geologic features and morphology is defined as benthic mapping. Benthic habitat mapping incorporates the regional scale geologic information but also includes higher resolution surveys and analysis of biological communities to identify the biological habitats. In addition, this paper adopts the definition of habitats established by Kostylev et al. (2001) as a spatially defined area where the physical, chemical, and biological environment is distinctly different from the surrounding environment.
Abstract:
Daily sea surface temperatures have been acquired at the Hopkins Marine Station in Pacific Grove, California since January 20, 1919. This time series is one of the longest oceanographic records along the U.S. west coast. Because of its length it is well suited for studying climate-related and oceanic variability on interannual, decadal, and interdecadal time scales. The record, however, is not homogeneous, has numerous gaps, contains possible outliers, and the observations were not always collected at the same time each day. Because of these problems we have undertaken the task of reconstructing this long and unique series. We describe the steps that were taken and the methods that were used in this reconstruction. Although the methods employed are basic, we believe that they are consistent with the quality of the data. The reconstructed record has values at every time point, original or estimated, and has been adjusted for time-of-day variations where this information was available. Possible outliers have also been examined and replaced where their credibility could not be established. Many of the studies that have employed the Hopkins time series have not discussed the issue of data quality and how these problems were addressed. Because of growing interest in this record, it is important that a single, well-documented version be adopted, so that the results of future analyses can be directly compared. Although additional work may be done to further improve the quality of this record, it is now available via the internet.
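The reconstruction steps described, regularizing the sampling, filling gaps, and screening outliers, can be illustrated with a short sketch. The file name, column names, gap limit, and outlier threshold below are all illustrative assumptions, not the authors' actual choices:

```python
# Sketch of basic reconstruction steps of the kind described above:
# regularize to a daily grid, fill short gaps, and screen outliers against
# rolling statistics. All names and thresholds are hypothetical.
import pandas as pd

sst = pd.read_csv("hopkins_sst.csv", parse_dates=["date"],
                  index_col="date")["sst"]          # hypothetical file/columns

daily = sst.asfreq("D")                             # expose gaps explicitly
filled = daily.interpolate(limit=3)                 # fill only short gaps

roll_mean = filled.rolling(31, center=True, min_periods=10).mean()
roll_std = filled.rolling(31, center=True, min_periods=10).std()
outliers = (filled - roll_mean).abs() > 4 * roll_std  # flag, then inspect
cleaned = filled.mask(outliers)
```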
Abstract:
Smartphones and other powerful sensor-equipped consumer devices make it possible to sense the physical world at an unprecedented scale. Nearly 2 million Android and iOS devices are activated every day, each carrying numerous sensors and a high-speed internet connection. Whereas traditional sensor networks have typically deployed a fixed number of devices to sense a particular phenomenon, community networks can grow as additional participants choose to install apps and join the network. In principle, this allows networks of thousands or millions of sensors to be created quickly and at low cost. However, making reliable inferences about the world using so many community sensors involves several challenges, including scalability, data quality, mobility, and user privacy.

This thesis focuses on how learning at both the sensor and network levels can provide scalable techniques for data collection and event detection. First, this thesis considers the abstract problem of distributed algorithms for data collection, and proposes a distributed, online approach to selecting which set of sensors should be queried. In addition to providing theoretical guarantees for submodular objective functions, the approach is also compatible with local rules or heuristics for detecting and transmitting potentially valuable observations. Next, the thesis presents a decentralized algorithm for spatial event detection, and describes its use in detecting strong earthquakes within the Caltech Community Seismic Network. Despite the fact that strong earthquakes are rare and complex events, and that community sensors can be very noisy, our decentralized anomaly detection approach obtains theoretical guarantees for event detection performance while simultaneously limiting the rate of false alarms.
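The centralized greedy baseline behind the submodular guarantees mentioned above is easy to sketch: for a monotone submodular objective, greedily adding the sensor with the largest marginal gain achieves a (1 - 1/e) approximation. This is a toy illustration of that baseline, not the thesis's distributed algorithm:

```python
# Greedy selection for a monotone submodular objective f; the coverage
# function below is a toy example, not data from the thesis.
def greedy_select(sensors, f, k):
    """Pick k sensors, maximizing the marginal gain of f at each step."""
    chosen = set()
    for _ in range(k):
        best = max((s for s in sensors if s not in chosen),
                   key=lambda s: f(chosen | {s}) - f(chosen))
        chosen.add(best)
    return chosen

# Toy example: each sensor covers a set of regions; f is total coverage.
coverage = {"a": {1, 2}, "b": {2, 3, 4}, "c": {4, 5}}
f = lambda S: len(set().union(*[coverage[s] for s in S])) if S else 0
print(greedy_select(coverage.keys(), f, 2))   # e.g. {'b', 'a'}
```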
Abstract:
Water quality problems are reported to be the factor limiting prawn production in the local prawn farm. This investigation was carried out to monitor water quality and its relationship to physical, chemical and biological conditions in the ponds, in order to establish which factors should be monitored to predict problems. Pond collapse was found to be associated with high concentrations of ammonium, high pH and phytoplankton populations dominated by blue-green algae. There was no easy means of predicting the imminent collapse of ponds, as the phenomenon was never associated with the extreme of any of the conditions monitored. Rather, it seemed to be related to the stability of the pond's algal population, which was largely unaccounted for. Recommendations for improving water quality are proposed.
Abstract:
Technology-supported citizen science has created huge volumes of data with increasing potential to facilitate scientific progress; however, verifying data quality is still a substantial hurdle due to the limitations of existing data quality mechanisms. In this study, we adopted a mixed methods approach to investigate community-based data validation practices and the characteristics of records of wildlife species observations that affected the outcomes of collaborative data quality management in an online community where people record what they see in nature. The findings describe processes that both relied upon and added to information provenance through information stewardship behaviors, which led to improved reliability and informativity. The likelihood of community-based validation interactions was predicted by several factors, including the types of organisms observed and whether the data were submitted from a mobile device. We conclude with implications for technology design, citizen science practices, and research.
Abstract:
The GEOTRACES Intermediate Data Product 2014 (IDP2014) is the first publicly available data product of the international GEOTRACES programme, and contains data measured and quality controlled before the end of 2013. It consists of two parts: (1) a compilation of digital data for more than 200 trace elements and isotopes (TEIs) as well as classical hydrographic parameters, and (2) the eGEOTRACES Electronic Atlas, a strongly inter-linked on-line atlas including more than 300 section plots and 90 animated 3D scenes. The IDP2014 covers the Atlantic, Arctic, and Indian oceans, with the highest data density in the Atlantic. The TEI data in the IDP2014 are quality controlled by careful assessment of intercalibration results and multi-laboratory data comparisons at cross-over stations. The digital data are provided in several formats, including ASCII spreadsheet, Excel spreadsheet, netCDF, and Ocean Data View collection. In addition to the actual data values, the IDP2014 also contains data quality flags and 1-sigma data error values where available. Quality flags and error values are useful for data filtering. Metadata about data originators, analytical methods and original publications related to the data are linked to the data in an easily accessible way. The eGEOTRACES Electronic Atlas is the visual representation of the IDP2014 data, providing section plots and a new kind of animated 3D scene. The basin-wide 3D scenes allow data from many cruises to be viewed at the same time, thereby providing quick overviews of large-scale tracer distributions. In addition, the 3D scenes provide the geographical and bathymetric context that is crucial for the interpretation and assessment of observed tracer plumes, as well as for making inferences about controlling processes.
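Flag-based filtering of the digital data can be sketched in a few lines. The column names and the convention that flag value 1 means "good" are assumptions for illustration; the IDP documentation defines the actual variable names and flag scheme:

```python
# Sketch of using quality flags and 1-sigma errors for filtering, as
# described above. All column names and the flag convention are assumed.
import pandas as pd

idp = pd.read_csv("idp2014_subset.csv")            # hypothetical export
good = idp[idp["Fe_D_FLAG"] == 1]                  # keep only "good" values
precise = good[good["Fe_D_ERROR"] / good["Fe_D"] < 0.10]  # <10% 1-sigma error
```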
Abstract:
A new universal power quality manager is proposed that treats a number of power quality problems simultaneously. The universal manager comprises combined series and shunt three-phase PWM-controlled converters sharing a common DC link. A control scheme based on fuzzy logic is introduced, and the general features of the design and operation processes are outlined. The performance of two configurations of the proposed power quality manager is compared in terms of a recently formulated unified power quality index. The validity and integrity of the proposed system are demonstrated through computer-simulated experiments.
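As a rough illustration of the fuzzy-logic control idea (not the paper's controller), an error signal can be mapped to a compensation command through membership functions and a small rule base with weighted-average defuzzification. The membership shapes and rule outputs below are invented for the sketch:

```python
# Sketch of a one-input fuzzy controller: fuzzify an error signal with
# triangular memberships, apply a trivial rule base, and defuzzify by
# weighted average. Shapes and outputs are illustrative only.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_command(error):
    neg = tri(error, -2.0, -1.0, 0.0)    # "error is negative"
    zero = tri(error, -1.0, 0.0, 1.0)    # "error is near zero"
    pos = tri(error, 0.0, 1.0, 2.0)      # "error is positive"
    # Rule outputs (command levels) weighted by membership degrees.
    weights = {-1.0: neg, 0.0: zero, 1.0: pos}
    total = sum(weights.values())
    return sum(u * w for u, w in weights.items()) / total if total else 0.0

print(fuzzy_command(0.4))   # small positive compensation command
```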
Abstract:
In complex hydrogeological environments, the effective management of groundwater quality problems by pump-and-treat operations can be most confidently achieved if the mixing dynamics induced within the aquifer by pumping are well understood. The utility of isotopic environmental tracers (C-, H-, O- and S-stable isotopic analyses and the age indicators 14C and 3H) for this purpose is illustrated by the analysis of a pumping test in an abstraction borehole drilled into flooded, abandoned coal mineworkings at Deerplay (Lancashire, UK). Interpretation of the isotope data was undertaken conjunctively with that of major ion hydrochemistry, and interpreted in the context of the particular hydraulic setting of flooded mineworkings to identify the sources and mixing of water qualities in the groundwater system. Initial pumping showed breakdown of the initial water quality stratification in the borehole, and gave evidence for distinctive isotopic signatures (δ34S(SO4) ≈ -1.6, δ18O(SO4) ≈ +15) associated with primary oxidation of pyrite in the zone of water table fluctuation, the first time this phenomenon has been successfully characterized by these isotopes in a flooded mine system. The overall aim of the test pumping, to replace an uncontrolled outflow from a mine entrance in an inconvenient location with a pumped discharge on a site where treatment could be provided, was swiftly achieved. The environmental tracing data illustrated the benefits of pumping as little as possible to attain this aim, as higher rates of pumping induced in-mixing of poorer quality waters from more distant old workings, and/or renewed pyrite oxidation in the shallow subsurface.
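Quantifying the in-mixing of waters with a conservative tracer reduces to a two-end-member mass balance. A minimal sketch with illustrative numbers (not values from the Deerplay test):

```python
# Sketch of a two-end-member mixing calculation of the kind used to
# apportion water sources from a conservative tracer; values are invented.
def mixing_fraction(sample, end_a, end_b):
    """Fraction of end-member A in the sample, by linear tracer balance."""
    return (sample - end_b) / (end_a - end_b)

# e.g. d34S of a pumped sample vs. a pyrite-oxidation end-member (-1.6)
# and a hypothetical background end-member (+5.0)
print(mixing_fraction(2.0, -1.6, 5.0))   # about 0.45 from end-member A
```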
Abstract:
In many environmental valuation applications, standard sample sizes for choice modelling surveys are impractical to achieve. One can improve data quality by using more in-depth surveys administered to fewer respondents. We report on a study using high-quality rank-ordered data elicited with the best-worst approach. The resulting "exploded logit" choice model, estimated on 64 responses per person, was used to study visitors' willingness to pay for the external benefits of policies that maintain the cultural heritage of alpine grazing commons. We find evidence supporting this approach and reasonable estimates of mean WTP, which appear theoretically valid and policy informative.
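The "exploded logit" treats a full ranking as a sequence of multinomial-logit choices over successively reduced choice sets. A minimal sketch of the likelihood with toy utilities (a real model would parameterize utility with attributes and estimate coefficients):

```python
# Sketch of the exploded (rank-ordered) logit likelihood: the probability of
# a ranking is the product of MNL choices over shrinking sets. Toy inputs.
import math

def exploded_logit_loglik(ranking, utility):
    """ranking: alternatives ordered best to worst.
    utility: dict mapping alternative -> deterministic utility."""
    remaining = list(ranking)
    loglik = 0.0
    for best in ranking[:-1]:                 # the last choice is forced
        denom = sum(math.exp(utility[a]) for a in remaining)
        loglik += utility[best] - math.log(denom)
        remaining.remove(best)
    return loglik

print(exploded_logit_loglik(["a", "b", "c"], {"a": 1.2, "b": 0.4, "c": -0.3}))
```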
Abstract:
Perfect information is seldom available to man or machines due to uncertainties inherent in real-world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information, and it is necessary for GIS to develop tools and techniques to manage these uncertainties. There is widespread agreement in the GIS community that although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by a lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.
Abstract:
There is a significant lack of indoor air quality research in low energy homes. This study compared the indoor air quality of eight newly built case study homes constructed to similar levels of air-tightness and insulation, with two different ventilation strategies (four homes with Mechanical Ventilation with Heat Recovery (MVHR) systems/Code level 4 and four homes naturally ventilated/Code level 3). Indoor air quality measurements were conducted over a 24 h period in the living room and main bedroom of each home during the summer and winter seasons. Simultaneous outside measurements and an occupant diary were also employed during the measurement period. Occupant interviews were conducted to gain information on perceived indoor air quality, occupant behaviour and building-related illnesses. Knowledge of the MVHR system, including ventilation-related behaviour, was also studied. The results suggest indoor air quality problems in both the mechanically ventilated and naturally ventilated homes, with significant issues identified regarding occupant use in the social homes.