962 results for Data quality problems
Abstract:
BACKGROUND: Maternal-infant transmission of hepatitis B virus (HBV) during birth carries a high risk for chronic HBV infection in infants with frequent subsequent development of chronic disease. This can be efficiently prevented by early immunization of exposed newborns. The purpose of this study was to determine the compliance with official recommendations for prevention of perinatal HBV transmission in hepatitis B surface antigen (HBsAg) exposed infants. METHODS: Records of pregnant women at 4 sites in Switzerland, admitted for delivery in 2005 and 2006, were screened for maternal HBsAg testing. In HBsAg-exposed infants, recommended procedures (postnatal active and passive immunization, completion of immunization series, and serological success control) were checked. RESULTS: Of 27,131 women tested for HBsAg, 194 (0.73%) were positive with 196 exposed neonates. Of these neonates, 143 (73%) were enrolled and 141 (99%) received simultaneous active and passive HBV immunization within 24 hours of birth. After discharge, the HBV immunization series was completed in 83%. Only 38% of children were tested for anti-HBs afterwards and protective antibody values (>100 U/L) were documented in 27% of the study cohort. No chronically infected child was identified. Analysis of hospital discharge letters revealed significant quality problems. CONCLUSIONS: Intensified efforts are needed to improve the currently suboptimal medical care in HBsAg-exposed infants. We propose standardized discharge letters, as well as reminders to primary care physicians with precise instructions on the need to complete the immunization series in HBsAg-exposed infants and to evaluate success by determination of anti-HBs antibodies after the last dose.
Abstract:
1. Few examples of habitat-modelling studies of rare and endangered species exist in the literature, although from a conservation perspective predicting their distribution would prove particularly useful. Paucity of data and lack of valid absences are the probable reasons for this shortcoming. Analytic solutions to accommodate the lack of absences include the ecological niche factor analysis (ENFA) and the use of generalized linear models (GLM) with simulated pseudo-absences. 2. In this study we tested a new approach to generating pseudo-absences, based on a preliminary ENFA habitat suitability (HS) map, for the endangered species Eryngium alpinum. This method of generating pseudo-absences was compared with two others: (i) use of a GLM with pseudo-absences generated totally at random, and (ii) use of an ENFA only. 3. The influence of two different spatial resolutions (i.e. grain) was also assessed, to tackle the dilemma of quality (grain) vs. quantity (number of occurrences). Each combination of the three above-mentioned methods with the two grains generated a distinct HS map. 4. Four evaluation measures were used for comparing these HS maps: total deviance explained, best kappa, Gini coefficient and minimal predicted area (MPA). The last is a new evaluation criterion proposed in this study. 5. Results showed that (i) GLM models using ENFA-weighted pseudo-absences provide better results, except for the MPA value, and that (ii) quality (spatial resolution and locational accuracy) of the data appears to be more important than quantity (number of occurrences). Furthermore, the proposed MPA value is suggested as a useful measure of model evaluation when used to complement classical statistical measures. 6. Synthesis and applications. We suggest that the use of ENFA-weighted pseudo-absences is a possible way to enhance the quality of GLM-based potential distribution maps and that data quality (i.e. spatial resolution) prevails over quantity (i.e. number of data points). Increased accuracy of potential distribution maps could help to better define suitable areas for species protection and reintroduction.
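As a hedged illustration of the ENFA-weighted strategy (our sketch, not the authors' code): given a preliminary habitat suitability raster scaled to [0, 1], pseudo-absences can be drawn from the background with probability proportional to 1 - HS, so that they concentrate in cells the ENFA rates as unsuitable. A minimal numpy version, with all names hypothetical:

    import numpy as np

    def sample_pseudo_absences(hs, n, seed=None):
        """Sample n pseudo-absence cell indices from a habitat-suitability
        raster hs (values in [0, 1]; NaN marks cells outside the study area).
        Low-suitability cells are drawn more often, mimicking the
        ENFA-weighted strategy described above."""
        rng = np.random.default_rng(seed)
        flat = hs.ravel()
        valid = np.flatnonzero(~np.isnan(flat))
        weights = 1.0 - flat[valid]          # low suitability -> high weight
        weights /= weights.sum()             # normalize to probabilities
        chosen = rng.choice(valid, size=n, replace=False, p=weights)
        return np.column_stack(np.unravel_index(chosen, hs.shape))

    # Toy example: a 100x100 suitability surface
    hs = np.random.default_rng(0).random((100, 100))
    pa = sample_pseudo_absences(hs, n=200, seed=1)
    print(pa[:5])  # row/column indices of pseudo-absence cells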
Abstract:
BACKGROUND: The Advisa MRI system is designed to safely undergo magnetic resonance imaging (MRI). Its influence on image quality is not well known. OBJECTIVE: To evaluate cardiac magnetic resonance (CMR) image quality and to characterize myocardial contraction patterns by using the Advisa MRI system. METHODS: In this international trial with 35 participating centers, an Advisa MRI system was implanted in 263 patients. Of those, 177 were randomized to the MRI group and 150 underwent MRI scans at the 9-12-week visit. Left ventricular (LV) and right ventricular (RV) cine long-axis steady-state free precession MR images were graded for quality. Signal loss along the implantable pulse generator and leads was measured. The tagging CMR data quality was assessed as the percentage of trackable tagging points on complementary spatial modulation of magnetization acquisitions (n=16) and segmental circumferential fiber shortening was quantified. RESULTS: Of all cine long-axis steady-state free precession acquisitions, 95% of LV and 98% of RV acquisitions were of diagnostic quality, with 84% and 93%, respectively, being of good or excellent quality. Tagging points were trackable from systole into early diastole (360-648 ms after the R-wave) in all segments. During RV pacing, tagging demonstrated a dyssynchronous contraction pattern, which was not observed in nonpaced (n = 4) and right atrial-paced (n = 8) patients. CONCLUSIONS: In the Advisa MRI study, high-quality CMR images for the assessment of cardiac anatomy and function were obtained in most patients with an implantable pacing system. In addition, this study demonstrated the feasibility of acquiring tagging data to study the LV function during pacing.
Abstract:
Especially in panel surveys, respondent attrition, respondent learning, and interviewer experience effects play a crucial role with respect to data quality. We examine three interview survey quality indicators in the same survey both cross-sectionally and longitudinally. In the cross-sectional analysis we compare data quality in the mature original sample with that in a refreshment sample surveyed in the same wave. Because an interviewer survey was conducted in the same wave, collecting the interviewers' socio-demographic characteristics, survey attitudes, and burden measures, we are able to consider interviewer fixed effects as well. The longitudinal analysis gives more insight into respondent learning effects on the quality indicators considered, by following the very same respondents across waves. The Swiss Household Panel, a CATI survey representative of the Swiss residential population, forms an ideal modelling database: the interviewer-respondent assignment is random, both within and across waves. This design avoids confounding with effects stemming from a non-random assignment of interviewers, e.g. area effects or effects from assigning the best interviewers to the hard cases. In order to separate interviewer, respondent, and wave effects, we build cross-classified multilevel models.
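As a hedged illustration (our notation, not taken from the paper), a cross-classified multilevel model for a quality indicator y observed for respondent j, interviewer k, and wave t might be written as

    y_{tjk} = \beta_0 + \mathbf{x}_{tjk}^{\top}\boldsymbol{\beta} + w_t + u_j + v_k + \varepsilon_{tjk},
    \qquad u_j \sim \mathcal{N}(0,\sigma_u^2), \quad v_k \sim \mathcal{N}(0,\sigma_v^2), \quad \varepsilon_{tjk} \sim \mathcal{N}(0,\sigma_\varepsilon^2),

where the respondent effects u_j and interviewer effects v_k are crossed rather than nested, and w_t captures the wave effect. The random interviewer-respondent assignment described above is what makes u_j and v_k separately identifiable.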
Abstract:
This Master's thesis was written for Partek Oyj Abp to give the managers responsible for, and dependent on, IT systems an overview of IT application integration, and to create guidelines for integration projects. The first part of the thesis presents, at a general level and based on the literature, problem areas in business processes and the benefits that application integration brings to business. These general-level benefits stem from faster processes, better availability of information, and new ways of working. The next part presents what application integration means in practice, what the alternative approaches to integration are, and the advantages and disadvantages of each. Of these approaches, message-based integration has become the most popular because of its simplicity, reliability, and ease of connection. Integration tools can transfer, transform, process, and store messages, and with these capabilities it is possible to build real-time collaboration networks. This part is based on the literature, articles, and interviews. The third part focuses on the characteristics of an integration project, creating a roadmap for how such a project proceeds; it presents the technical issues to consider, the costs and benefits, and templates for documenting integrations, and is based on the author's own experience, interviews, and the literature. The fourth part presents an integration project carried out at Partek, in which a supplier register for purchasers (PPM) was integrated with the ERP system (Baan) using one of the most popular integration tools, IBM WebSphere MQ; it is based on the project documentation, the author's own experience, and the literature. The thesis closes with a summary. Three main benefits can be achieved with the integrations and the roadmap: data reliability improves, the roadmap provides a model for integrations, and careful documentation together with standardized practices reduces dependence on particular key individuals.
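As a generic illustration of the message-based style discussed above (a minimal Python stand-in using the standard library, not the WebSphere MQ implementation used in the project; all names are hypothetical), a queue decouples the producing and consuming systems while messages are transformed in transit:

    import json
    import queue
    import threading

    mq = queue.Queue()  # stand-in for a message broker such as WebSphere MQ

    def supplier_register_publisher():
        # Source system (e.g. the PPM supplier register) emits messages.
        for supplier in [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Bolt Oy"}]:
            mq.put(json.dumps(supplier))
        mq.put(None)  # sentinel: no more messages

    def erp_consumer():
        # Target system (e.g. the Baan ERP) receives, transforms and stores.
        while (raw := mq.get()) is not None:
            record = json.loads(raw)
            record["source"] = "PPM"        # enrichment during transit
            print("ERP stored:", record)

    t = threading.Thread(target=erp_consumer)
    t.start()
    supplier_register_publisher()
    t.join()

Neither system calls the other directly: each only talks to the queue, which is the decoupling the thesis credits for the simplicity and reliability of the message-based approach.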
Abstract:
The aim of this thesis was to examine how data warehousing can support decision-making in a company. After a description of data warehouse components and processes, the phases of a data warehousing project are discussed. The theory was then applied in practice in a global metal industry company, where the data warehousing concept was tested. Based on these tests, the state of the existing data and the suitability of the two software products used for data warehousing were assessed. The data quality of the company's operational systems was found, in the respects studied, to be inconsistent and incomplete, so using the data directly company-wide to produce reliable, high-quality reports is difficult. In addition, inconsistencies were observed between units in the business concepts used and in the ways the systems are used. The software products used in the tests handled basic data warehousing well, although some limitations and peculiarities emerged. The thesis can be regarded as a preliminary study preceding an actual data warehousing project. As further steps, it is proposed that testing continue with the current tools, with the goals directed at concrete results. The importance of data quality should be emphasized throughout the organization, and the quality of the existing data must be improved in the future.
Abstract:
The enhanced functional sensitivity offered by ultra-high field imaging may significantly benefit simultaneous EEG-fMRI studies, but the concurrent increases in artifact contamination can strongly compromise EEG data quality. In the present study, we focus on EEG artifacts created by head motion in the static B0 field. A novel approach for motion artifact detection is proposed, based on a simple modification of a commercial EEG cap, in which four electrodes are non-permanently adapted to record only magnetic induction effects. Simultaneous EEG-fMRI data were acquired with this setup, at 7T, from healthy volunteers undergoing a reversing-checkerboard visual stimulation paradigm. Data analysis assisted by the motion sensors revealed that, after gradient artifact correction, EEG signal variance was largely dominated by pulse artifacts (81-93%), but contributions from spontaneous motion (4-13%) were still comparable to or even larger than those of actual neuronal activity (3-9%). Multiple approaches were tested to determine the most effective procedure for denoising EEG data incorporating motion sensor information. Optimal results were obtained by applying an initial pulse artifact correction step (AAS-based), followed by motion artifact correction (based on the motion sensors) and ICA denoising. On average, motion artifact correction (after AAS) yielded a 61% reduction in signal power and a 62% increase in VEP trial-by-trial consistency. Combined with ICA, these improvements rose to a 74% power reduction and an 86% increase in trial consistency. Overall, the improvements achieved were well appreciable at single-subject and single-trial levels, and set an encouraging quality mark for simultaneous EEG-fMRI at ultra-high field.
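A hedged sketch of the motion-sensor-based correction step (our illustration of the general regression idea, not the authors' pipeline; all names are hypothetical): treating the dedicated motion-sensor channels as reference signals, their best least-squares fit is subtracted from each EEG channel.

    import numpy as np

    def regress_out_motion(eeg, motion):
        """Remove motion-induced components from EEG by ordinary least squares.

        eeg    : (n_channels, n_samples) EEG after gradient/pulse correction
        motion : (n_sensors, n_samples) signals from the motion sensors
        Returns EEG with the best linear fit of the motion signals subtracted."""
        X = motion.T                                  # (n_samples, n_sensors)
        beta, *_ = np.linalg.lstsq(X, eeg.T, rcond=None)
        return eeg - (X @ beta).T                     # subtract fitted artifact

    # Toy example: 32 EEG channels, 4 motion sensors, 10 s at 1 kHz
    rng = np.random.default_rng(0)
    motion = rng.standard_normal((4, 10_000))
    eeg = rng.standard_normal((32, 10_000)) + 0.5 * motion[:1]  # contaminated
    clean = regress_out_motion(eeg, motion)
    print(clean.shape)

In the study this step sits between AAS-based pulse artifact correction and ICA denoising; the sketch shows only the middle, regression-based stage.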
Abstract:
For any international company wishing to enter the Chinese market, quality is fundamental. Companies are gradually coming to realize the importance of quality and have put quality problems on the agenda. Competitiveness comes from quality: quality is the key to success, and it decides whether a company is accepted or eliminated by the market. Because the benefits are obvious, demand for methods of achieving high product quality keeps growing. In the process of achieving high quality, the main difficulties arise from the clash between Eastern and Western culture. Chinese culture, which differs from Western culture, has lasted as long as five thousand years; it is deeply rooted in the hearts of the Chinese people and has shaped the working style and ways of thinking of generation after generation. This thesis examines how to find a good fit between Eastern and Western culture: doing the right thing in the right way. Improving quality is, in essence, improving the level of management. Addressing the questions "how to manage, and who should be managed", the thesis explains the basic and best options for achieving this. It describes a three-dimensional management style for monitoring the working process, which inspects the production process in both horizontal and vertical directions. Within this management approach, an effective evaluation system is defined for every subcontractor, helping companies achieve the ultimate goal: satisfactory quality. Because of the importance of the human factor, the thesis also defines the scope of training for inspectors and welders in light of the current situation in China. The results show that reliable and effective training evaluation requires attention not only to the quality of the people involved but also to the ultimate goal of product quality.
Abstract:
This study deals with the development of product cost accounting at VAASAN Oy, one of the leading bakery companies in Finland and the Baltic countries. The aim of the work is to develop product cost accounting so that it better supports decision-making, while preserving the flexibility of the costing model. The development needs were identified through an analysis of the current state of product cost accounting and a survey of the information needs of the user groups, and the development was guided by theories of cost information quality and a risk management tool. The starting point was a rather decentralized product costing practice whose reliability problems were recognized. At the conclusion of the work, product cost accounting will be centralized, and the opportunity this creates is to be exploited as effectively as possible in developing cost accounting. Before centralization, product cost accounting must be made consistent across the different bakeries. This work mapped out the development targets, which were identified by structuring and prioritizing them, and presents an operating model or proposal for resolving the most important ones. As a result, product cost accounting was made more consistent, more reliable, and more supportive of decision-making, and its usability was improved.
Abstract:
Statistical analyses of measurements that can be described by statistical models are essential in astronomy and in scientific inquiry in general. The sensitivity of such analyses, of the modelling approaches, and of the consequent predictions can depend strongly on the exact techniques applied, and improvements therein can yield significantly better understanding of the observed system of interest. In particular, optimising the sensitivity of statistical techniques in detecting the faint signatures of low-mass planets orbiting nearby stars is, together with improvements in instrumentation, essential for estimating the properties of the population of such planets, and for the race to detect Earth analogs, i.e. planets that could support liquid water and, perhaps, life on their surfaces. We review the developments in Bayesian statistical techniques applicable to the detection of planets orbiting nearby stars and to astronomical data analysis problems in general. We discuss these techniques and demonstrate their usefulness through various examples and detailed descriptions of the mathematics involved. We demonstrate the practical aspects of Bayesian statistical techniques by describing several algorithms and numerical techniques, as well as theoretical constructions, for the estimation of model parameters and for hypothesis testing. We also apply these algorithms to Doppler measurements of nearby stars to show how they can be used in practice to extract as much information as possible from noisy data. Bayesian statistical techniques are powerful tools for analysing and interpreting noisy data and should be preferred in practice whenever computational limitations are not too restrictive.
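As a toy illustration of the Bayesian machinery described (our sketch with a deliberately simplified one-planet, circular-orbit model and assumed names, not the authors' algorithms), a Metropolis sampler can estimate the posterior of a sinusoidal radial-velocity signal's amplitude K and period P from noisy Doppler data:

    import numpy as np

    rng = np.random.default_rng(42)

    # Simulated radial-velocity data: one circular orbit, K = 3 m/s, P = 20 d
    t = np.sort(rng.uniform(0, 200, 80))
    v = 3.0 * np.sin(2 * np.pi * t / 20.0) + rng.normal(0, 1.0, t.size)
    sigma = 1.0  # assumed known measurement noise (m/s)

    def log_posterior(theta):
        K, P = theta
        if K <= 0 or not (1.0 < P < 100.0):      # flat priors with hard bounds
            return -np.inf
        model = K * np.sin(2 * np.pi * t / P)
        return -0.5 * np.sum(((v - model) / sigma) ** 2)

    # Metropolis random-walk sampling of (K, P)
    theta = np.array([1.0, 19.0])
    logp = log_posterior(theta)
    chain = []
    for _ in range(20_000):
        prop = theta + rng.normal(0, [0.1, 0.05])
        logp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        chain.append(theta)
    chain = np.array(chain)[5_000:]              # discard burn-in
    print("K = %.2f +/- %.2f m/s" % (chain[:, 0].mean(), chain[:, 0].std()))
    print("P = %.2f +/- %.2f d" % (chain[:, 1].mean(), chain[:, 1].std()))

Real radial-velocity analyses add eccentric orbits, correlated noise models, and model comparison for the number of planets; the sketch only shows the core posterior-sampling step.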
Abstract:
The aim of this Master's thesis is to study how product data quality affects the operations of stores selling automotive spare parts. For the study, store personnel were interviewed in two spare parts store chains, using a storytelling approach. Based on the stories that emerged in the interviews, the problems experienced by the personnel and the related shortcomings in product data were identified. The problems found are classified according to the data quality taxonomy presented in the theory part, and the key concepts of the topic are explained. The thesis describes the structure of the product data in the information systems of the stores studied and proposes improvements to the information content of those systems and to the data management process in order to improve data quality.
Abstract:
The work team is nowadays an indispensable form of work organization for increasing company performance. It is now widely recognized that the composition of a work team can affect its effectiveness and, more specifically, the quality of interpersonal relationships within the team and the performance of its members. Studies have therefore sought to isolate individual characteristics that influence the context of work teams: the effect of team composition has been examined through members' personality traits, attitudes, and values, or through the cognitive abilities demonstrated by each teammate. Despite repeated calls to study the motivation of work teams and their members further, little work has addressed the motivational composition of this collective context. In the wake of studies on work teams, there is also a tendency to treat the performance of teams and teammates as the sole criterion for evaluating their effectiveness. Given the alarming data on the psychological health problems experienced by workers, it appears essential to examine the conditions that must be put in place in this interpersonal work context to foster both the well-being and the performance of teammates. Using the analytical framework of self-determination theory, this dissertation addresses these issues. The first article proposes a theoretical model of how the composition of a work team, in terms of the individual regulation styles of its members, can affect interpersonal relationships within the team and have an impact on members' performance and well-being. Drawing on the emergence mechanisms proposed by multilevel theories, this framework also suggests that, under certain conditions, the motivational composition of a work team can give rise to a distinct phenomenon of team motivation; the mechanisms favoring this emergence are presented in the article. The second article provides a first empirical test of some of the theoretical article's propositions. Using a sample of 138 teams comprising 680 workers, multilevel analyses tested the impact of a team's autonomous or controlled motivational composition on participants' job satisfaction. The results show that a more autonomous team composition is positively related to worker satisfaction; moreover, an interaction is observed between individual autonomous regulation and team autonomous regulation, such that job satisfaction is highest for participants whose own regulation style is more autonomous and who work in a team with a more autonomous motivational composition. In parallel, a more controlled motivational composition is negatively related to job satisfaction. Overall, this dissertation underscores the relevance of considering the socio-motivational context that emerges from a team's composition in terms of the individual regulation styles of its members, and offers a fresh perspective on work team motivation and on the motivational variables to assess when forming work teams in our organizations.
Abstract:
The Gauss-Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an inner direct or iterative process. In comparison with Newton's method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss-Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss-Newton method of two types of approximation used commonly in data assimilation. First, we examine truncated Gauss-Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine perturbed Gauss-Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss-Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
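A minimal Gauss-Newton iteration (our sketch of the textbook method the paper analyses, not the operational data assimilation code; the example problem is invented): given a residual function r(x) and its Jacobian J(x), each step solves the linear least squares subproblem min_s ||J s + r|| exactly; solving it only approximately gives the truncated variant, and replacing J by a simplified operator gives the perturbed variant studied in the paper.

    import numpy as np

    def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
        """Solve min_x 0.5*||residual(x)||^2 by Gauss-Newton iterations.
        Each step solves the linearized least-squares subproblem exactly;
        truncated/perturbed variants would solve it approximately."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            r, J = residual(x), jacobian(x)
            step, *_ = np.linalg.lstsq(J, -r, rcond=None)  # inner LLS solve
            x = x + step
            if np.linalg.norm(step) < tol:
                break
        return x

    # Toy example: fit y = a * exp(b * t) to data by nonlinear least squares
    t = np.linspace(0, 1, 20)
    y = 2.0 * np.exp(-1.5 * t)

    def residual(p):
        a, b = p
        return a * np.exp(b * t) - y

    def jacobian(p):
        a, b = p
        e = np.exp(b * t)
        return np.column_stack([e, a * t * e])  # d r/d a, d r/d b

    print(gauss_newton(residual, jacobian, x0=[1.0, 0.0]))  # ~ [2.0, -1.5]

Note that only first derivatives (the Jacobian) are needed, which is the attraction over Newton's method mentioned above.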
Abstract:
Mediterranean ecosystems rival tropical ecosystems in terms of plant biodiversity. The Mediterranean Basin (MB) itself hosts 25 000 plant species, half of which are endemic. This rich biodiversity and the complex biogeographical and political issues make conservation a difficult task in the region. Species, habitat, ecosystem and landscape approaches have been used to identify conservation targets at various scales: i.e. European, national, regional and local. Conservation decisions require adequate information at the species, community and habitat level. Nevertheless, and despite recent improvements and efforts, this information is still incomplete, fragmented and varies from one country to another. This paper reviews the biogeographic data, the problems arising from current conservation efforts, and methods for conservation assessment and prioritization using GIS. GIS has an important role to play in managing spatial and attribute information on the ecosystems of the MB and in facilitating interactions with existing databases. Where limited information is available, it can be used for prediction when directly or indirectly linked to externally built models. As well as being a predictive tool, today's GIS incorporates spatial techniques that can improve the level of information, such as fuzzy logic and geostatistics, or provide insight into landscape changes, such as 3D visualization. Where resources are limited, it can assist in identifying sites of conservation priority or in resolving environmental conflicts (scenario building). Although not a panacea, GIS is an invaluable tool for improving the understanding of Mediterranean ecosystems and their dynamics and for practical management in a region that is under increasing pressure from human impact.