993 results for Statistical Error
Abstract:
The Belle experiment, located at the KEK research centre in Japan, is primarily devoted to the study of CP violation in the B meson sector. Belle sits on the KEKB collider, one of the two "B factories" currently in operation, which produce B–anti-B pairs. KEKB has produced more than 150 million pairs in total, a world record for this kind of collider. This large sample allows very precise measurements in B meson physics, and the present analysis falls within that framework. One of the most remarkable phenomena in high-energy physics is the ability of the weak interaction to couple a neutral meson to its anti-meson. In this work, we study the coupling of the neutral B meson to the neutral anti-B meson, which induces an oscillation whose frequency Δm_d can be measured accurately. Besides the intrinsic interest of this phenomenon, the measurement plays an important role in the quest for the origin of CP violation, which the Standard Model of electroweak interactions does not include in a fully satisfactory way. The search for as yet unexplained physical phenomena is therefore the main motivation of the Belle collaboration. Many measurements of Δm_d have been performed before; the present work, however, reaches a precision on Δm_d never attained before. This is the result of the excellent performance of KEKB, and of an original approach that considerably reduces the contamination of the measurement by unwanted events. This approach had already been used successfully by other experiments, under conditions somewhat different from those at Belle. The method consists of partially reconstructing one of the B mesons in the decay channel B⁰ → D*(D⁰π)ℓν_ℓ, using only the information on the lepton ℓ and the pion π. The information on the other B meson of the initial B–anti-B pair is extracted from a single high-energy lepton. The available sample therefore does not suffer the large losses that a complete reconstruction entails, while the contamination from charged B mesons, which KEKB produces in the same quantity as neutral ones, is strongly reduced compared with an inclusive analysis. We finally obtain Δm_d = 0.513 ± 0.006 ± 0.008 ps⁻¹, where the first error is statistical and the second systematic.

What is matter made of? How does it hold together? These are the questions that research in high-energy physics attempts to answer. This research is conducted on two constantly interacting levels. On the one hand, theoretical models are developed to try to understand and describe the observations. On the other hand, those observations are made by means of high-energy collisions of elementary particles. In this way the existence of four fundamental forces and of 24 elementary constituents, classified as "quarks" and "leptons", has been established. This is one of the finest successes of the model in use today, called the "Standard Model". There is, however, one fundamental observation that the Standard Model struggles to explain: the almost complete disappearance of antimatter (the "negative" of matter). At the fundamental level, this must correspond to an asymmetry between particles (the constituents of matter) and antiparticles (the constituents of antimatter), known as the CP asymmetry (or CP violation). Although included in the Standard Model, this asymmetry appears to be only partially accounted for, and its origin is unknown. Intense research is therefore under way to shed light on it, and the Belle experiment, in Japan, is one of the pioneers. Belle studies the physics of a family of particles called "B mesons", which are known to be closely linked to the CP asymmetry. This thesis is part of that research programme. We studied a remarkable property of the neutral B meson: its oscillation with its anti-meson, i.e. the ability of this particle to turn into its associated antiparticle. This oscillation is closely tied to the CP asymmetry. We have determined the frequency of this oscillation with an as yet unequalled precision. The method consists of characterising a pair of B mesons through decays that each include a lepton; greater precision is obtained by also searching for a particle called the pion, which comes from the decay of one of the mesons. Besides the interest of this oscillation phenomenon in itself, this measurement allows the Standard Model to be refined, directly or indirectly. In time, it may also help elucidate the mystery of the asymmetry between matter and antimatter.
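For reference, the frequency Δm_d enters the standard time-dependent mixing probabilities (a textbook formula, not specific to this thesis; Δt is the proper-time difference between the two B decays and τ_B0 the neutral B lifetime):

```latex
\mathcal{P}_{\mathrm{unmixed/mixed}}(\Delta t)
  = \frac{e^{-|\Delta t|/\tau_{B^0}}}{4\,\tau_{B^0}}
    \left[\, 1 \pm \cos(\Delta m_d\, \Delta t) \,\right],
\qquad
A_{\mathrm{mix}}(\Delta t)
  = \frac{N_{\mathrm{unmixed}} - N_{\mathrm{mixed}}}
         {N_{\mathrm{unmixed}} + N_{\mathrm{mixed}}}
  = \cos(\Delta m_d\, \Delta t).
```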
Abstract:
Quality inspection and assurance is a very important step before today's products reach the market. As products are manufactured in vast quantities, interest in automating quality inspection tasks has increased correspondingly. Quality inspection usually requires the detection of deficiencies, defined in this thesis as irregularities. Objects containing regular patterns appear quite frequently in certain industries and sciences, e.g. half-tone raster patterns in the printing industry, crystal lattice structures in solid-state physics, and solder joints and components in the electronics industry. In this thesis, the problem of regular patterns and irregularities is described in analytical form and three different detection methods are proposed. All the methods exploit the ability of the Fourier transform to represent regular information compactly: the Fourier transform enables the separation of the regular and irregular parts of an image, but the three methods are shown to differ in generality and computational complexity. The need to detect fine and sparse details is also common in quality inspection tasks, e.g., locating small fractures in components in the electronics industry or detecting tearing in paper samples in the printing industry. In this thesis, a general definition of such details is given in terms of sufficient statistical properties in the histogram domain. The analytical definition allows a quantitative comparison of methods designed for detail detection, and on its basis the use of existing thresholding methods is shown to be well motivated. A comparison of thresholding methods shows that minimum error thresholding outperforms the other standard methods. The results are successfully applied to a paper printability and runnability inspection setup: missing dots in a repeating raster pattern are detected on Heliotest strips, and small surface defects on IGT picking papers.
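Minimum error thresholding here refers to the classical Kittler–Illingworth criterion; a minimal numpy sketch (my implementation, not the thesis code), assuming the grey-level histogram is a mixture of two Gaussians:

```python
import numpy as np

def minimum_error_threshold(hist):
    """Kittler-Illingworth minimum error thresholding.

    hist: 1-D array of grey-level counts (e.g. 256 bins).
    Returns the bin index minimising the classification-error
    criterion J(T) for a two-Gaussian mixture model.
    """
    p = hist.astype(float) / hist.sum()        # normalised histogram
    levels = np.arange(len(p))
    best_t, best_j = None, np.inf
    for t in range(1, len(p) - 1):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 < 1e-12 or p2 < 1e-12:
            continue
        mu1 = (levels[:t] * p[:t]).sum() / p1
        mu2 = (levels[t:] * p[t:]).sum() / p2
        var1 = (((levels[:t] - mu1) ** 2) * p[:t]).sum() / p1
        var2 = (((levels[t:] - mu2) ** 2) * p[t:]).sum() / p2
        if var1 < 1e-12 or var2 < 1e-12:
            continue
        # J(T) = 1 + 2[P1 ln s1 + P2 ln s2] - 2[P1 ln P1 + P2 ln P2]
        j = 1 + p1 * np.log(var1) + p2 * np.log(var2) \
              - 2 * (p1 * np.log(p1) + p2 * np.log(p2))
        if j < best_j:
            best_t, best_j = t, j
    return best_t
```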
Abstract:
The marketplace of the twenty-first century will demand that manufacturing assume a crucial role in a new competitive field. Two potential resources in the area of manufacturing are advanced manufacturing technology (AMT) and empowered employees. Surveys in Finland have shown the need to invest in new AMT in the Finnish sheet metal industry in the 1990s. In this drive, the focus has been on hard technology, and less attention has been paid to the utilization of human resources. In many manufacturing companies an appreciable portion of the attainable profit is wasted due to poor quality of planning and workmanship. This thesis inspects the production flow and the distribution of production errors for sheet-metal-part-based constructions. The objective is to analyze the origins of production errors in the production flow of such constructions. Employee empowerment is also investigated in theory, and its significance in reducing the overall number of production errors is discussed. This study is most relevant to the sheet metal fabrication industry, which produces sheet-metal-part-based constructions for the electronics and telecommunications industries. The study concentrates on the manufacturing function of a company and is based on a field study carried out in five Finnish case factories. In each case factory, the work phases most prone to production errors were identified. It can be assumed that most production errors arise in manually operated work phases and in mass-production work phases. However, no common pattern for the distribution of production errors along the production flow could be found in the collected data. The most important finding was that most production errors in every case factory studied belong to the category of human-activity-based errors. This result indicates that most of the problems in the production flow are related to employees or to work organization. Development activities must therefore focus on developing employee skills or the work organization. Employee empowerment provides the right tools and methods to achieve this.
Abstract:
OBJECTIVE: The cause precipitating intracranial aneurysm rupture remains unknown in many cases. It has been observed that aneurysm ruptures cluster in time, but the trigger mechanism remains obscure. Because solar activity has been associated with cardiovascular mortality and morbidity, we decided to study its association with aneurysm rupture in the Swiss population. METHODS: Patient data were extracted from the Swiss SOS database, at the time of analysis covering 918 consecutive patients with angiography-proven aneurysmal subarachnoid hemorrhage treated at 7 Swiss neurovascular centers between January 1, 2009, and December 31, 2011. The daily rupture frequency (RF) was correlated with the absolute amount of, and the change in, various parameters of interest representing continuous measurements of solar activity (radioflux [F10.7 index], solar proton flux, solar flare occurrence, planetary K-index/planetary A-index, Space Environment Services Center [SESC] sunspot number, and sunspot area) using Poisson regression analysis. RESULTS: During the period of interest, there were 517 days without a recorded aneurysm rupture, and 398, 139, 27, 12, 1, and 1 days with 1, 2, 3, 4, 5, and 6 ruptures per day, respectively. Poisson regression analysis demonstrated a significant correlation between the F10.7 index and RF (incidence rate ratio [IRR] = 1.006303; standard error [SE] 0.0013201; 95% confidence interval [CI] 1.003719-1.008894; P < 0.001), according to which every 1-unit increase of the F10.7 index increased the expected daily rupture count by 0.63%. A likewise statistically significant relationship emerged for both the SESC sunspot number (IRR 1.003413; SE 0.0007913; 95% CI 1.001864-1.004965; P < 0.001) and the sunspot area (IRR 1.000419; SE 0.0000866; 95% CI 1.000249-1.000589; P < 0.001). All other variables analyzed showed no significant correlation with RF. CONCLUSIONS: We found greater radioflux, SESC sunspot number, and sunspot area to be associated with an increased count of aneurysm ruptures. The clinical meaningfulness of this statistical association must be interpreted carefully, and future studies are warranted to rule out a type-1 error.
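A minimal sketch of the Poisson regression step with statsmodels (synthetic data only; the Swiss SOS records and the exact model specification are not reproduced here, so variable names and magnitudes are illustrative assumptions):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-ins: daily F10.7 radioflux and daily rupture counts
# over 2009-2011 (1096 days); coefficients chosen only for illustration.
n_days = 1096
f107 = rng.normal(100.0, 30.0, n_days).clip(min=60.0)
lam = np.exp(-0.17 + 0.0063 * (f107 - 100.0))   # mean ~0.84 ruptures/day
ruptures = rng.poisson(lam)

# Poisson regression of daily rupture frequency on radioflux.
X = sm.add_constant(f107)
fit = sm.GLM(ruptures, X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params))   # exp(coefficient) = incidence rate ratio (IRR)
```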
Abstract:
The goal of this work is to create a statistical model, based only on easily computable parameters of a CSP instance, that predicts the runtime behaviour of the solving algorithms and lets us choose the best algorithm for the problem at hand. Although it seems that the obvious choice should be MAC, the experimental results obtained so far show that, for large numbers of variables, other algorithms perform much better, especially on hard problems in the transition phase.
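A minimal sketch of such a runtime-prediction selector (feature names, candidate solvers, and all numbers are illustrative assumptions, not the paper's actual model):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical, easily computable CSP features per instance:
# [n_variables, domain_size, constraint_density, constraint_tightness]
X_train = np.array([
    [20, 10, 0.30, 0.40],
    [50, 10, 0.15, 0.55],
    [80,  8, 0.10, 0.60],
    [120, 8, 0.08, 0.62],
])
# Measured runtimes (s) of each candidate solver on the same instances.
runtimes = {
    "MAC": np.array([0.4, 2.1, 9.8, 31.0]),
    "FC":  np.array([0.3, 3.5, 4.2, 12.5]),
}

# One regression model per solver: per-algorithm runtime prediction.
models = {name: RandomForestRegressor(random_state=0).fit(X_train, y)
          for name, y in runtimes.items()}

def pick_solver(features):
    """Return the solver with the lowest predicted runtime."""
    preds = {name: m.predict(np.array([features]))[0]
             for name, m in models.items()}
    return min(preds, key=preds.get)

print(pick_solver([100, 9, 0.09, 0.61]))
```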
Abstract:
Location information is becoming increasingly necessary, as every new smartphone incorporates a GPS (Global Positioning System) receiver, which allows the development of various applications based on it. However, the GPS signal cannot be properly received in indoor environments. For this reason, new indoor positioning systems are being developed. As the indoor environment is a very challenging scenario, it is necessary to study the precision of the obtained location information in order to determine whether these new positioning techniques are suitable for indoor positioning.
Abstract:
The effect of the heat flux on the rate of chemical reaction in dilute gases is shown to be important for reactions characterized by high activation energies and in the presence of very large temperature gradients. This effect, obtained from the second-order terms in the distribution function (similar to those obtained in the Burnett approximation to the solution of the Boltzmann equation), is derived on the basis of information theory. The analytical results describing the effect are shown to be simpler if the kinetic definition of the nonequilibrium temperature is adopted rather than the thermodynamic one. The numerical results are nearly the same for both definitions.
Abstract:
Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
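A minimal sketch of the error-model idea (ordinary PCA on discretised curves stands in for FPCA here; the shapes and the linear score map are my assumptions, not the authors' implementation):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def fit_error_model(proxy_train, exact_train, n_pc=5):
    """Learn a map from proxy responses to exact responses.

    proxy_train, exact_train: (n_learn, n_times) curves from the
    learning set, for which both solvers were run.
    """
    pca_proxy = PCA(n_components=n_pc).fit(proxy_train)
    pca_exact = PCA(n_components=n_pc).fit(exact_train)
    # Regress exact-model scores on proxy-model scores.
    reg = LinearRegression().fit(pca_proxy.transform(proxy_train),
                                 pca_exact.transform(exact_train))

    def predict_exact(proxy_new):
        # Predict the exact response from the sole proxy response.
        scores = reg.predict(pca_proxy.transform(proxy_new))
        return pca_exact.inverse_transform(scores)

    return predict_exact
```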
Abstract:
Background. Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of adding a statistical peer reviewer, of suggesting the use of checklists such as CONSORT or STARD to clinical reviewers, or both. Methodology and Principal Findings. Interventions were defined as 1) the addition of a statistical reviewer to the clinical peer review process, and 2) suggesting reporting guidelines to reviewers, with "no statistical expert" and "no checklist" as controls. The two interventions were crossed in a 2×2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide an 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers, as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and in the final post-peer-review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review, and 129 were randomized. Of those, 14 were lost to follow-up; they showed no differences in initial quality from the followed-up papers. Hence, 115 were included in the main analysis, of which 16 were rejected for publication after peer review. Of the 115 included papers, 21 (18.3%) were interventions, 46 (40.0%) longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6-24.4). Suggesting a guideline to the reviewers had no effect on the change in overall quality as measured by the Goodman scale (0.9, 95% CI: −0.3 to +2.1). The estimated effect of adding a statistical reviewer was 5.5 (95% CI: 4.3-6.7), a significant improvement in quality. Conclusions and Significance. This prospective randomized study shows the positive effect of adding a statistical reviewer to the field-expert peers in improving manuscript quality. We did not find a statistically significant positive effect of suggesting that reviewers use reporting guidelines.
Abstract:
This study deals with the statistical properties of a randomization test applied to an ABAB design in cases where the desirable random assignment of the points of change in phase is not possible. In order to obtain information about each possible data division we carried out a conditional Monte Carlo simulation with 100,000 samples for each systematically chosen triplet. Robustness and power are studied under several experimental conditions: different autocorrelation levels and different effect sizes, as well as different phase lengths determined by the points of change. Type I error rates were distorted by the presence of autocorrelation for the majority of data divisions. Satisfactory Type II error rates were obtained only for large treatment effects. The relationship between the lengths of the four phases appeared to be an important factor for the robustness and the power of the randomization test.
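A minimal sketch of the randomization-test mechanics for an ABAB design (the statistic, the admissible triplets, and the minimum phase length are illustrative assumptions, not the study's exact settings):

```python
import numpy as np
from itertools import combinations

def abab_randomization_test(y, change_points, min_len=3):
    """Randomization test for an ABAB design.

    y: observed series; change_points: the triplet (i, j, k) actually
    used, splitting y into A1=y[:i], B1=y[i:j], A2=y[j:k], B2=y[k:].
    Statistic: difference between B-phase and A-phase means; the
    reference distribution enumerates all admissible triplets.
    """
    y = np.asarray(y, dtype=float)

    def stat(i, j, k):
        a = np.concatenate([y[:i], y[j:k]])
        b = np.concatenate([y[i:j], y[k:]])
        return b.mean() - a.mean()

    n = len(y)
    admissible = [(i, j, k)
                  for i, j, k in combinations(range(min_len,
                                                    n - min_len + 1), 3)
                  if j - i >= min_len and k - j >= min_len]
    observed = stat(*change_points)
    ref = np.array([stat(i, j, k) for i, j, k in admissible])
    return np.mean(ref >= observed)     # one-sided p-value
```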
Abstract:
BACKGROUND/RATIONALE: Patient safety is a major concern in healthcare systems worldwide. Although most safety research has been conducted in the inpatient setting, evidence indicates that medical errors and adverse events are a threat to patients in the primary care setting as well. Since information about the frequency and outcomes of safety incidents in primary care is required, the goals of this study are to describe the type, frequency, seasonal and regional distribution of medication incidents in primary care in Switzerland and to elucidate possible risk factors for medication incidents. METHODS AND ANALYSIS: STUDY DESIGN AND SETTING: We will conduct a prospective surveillance study to identify cases of medication incidents among primary care patients in Switzerland over the course of the year 2015. PARTICIPANTS: Patients undergoing drug treatment by 167 general practitioners or paediatricians reporting to the Swiss Federal Sentinel Reporting System. INCLUSION CRITERIA: Any erroneous event, as defined by the physician, related to the medication process and interfering with the normal course of treatment. EXCLUSION CRITERIA: Lack of treatment effect, adverse drug reactions, or drug-drug or drug-disease interactions without a detectable treatment error. PRIMARY OUTCOME: Medication incidents. RISK FACTORS: Age, gender, polymedication, morbidity, care dependency, hospitalisation. STATISTICAL ANALYSIS: Descriptive statistics to assess the type, frequency, seasonal and regional distribution of medication incidents, and logistic regression to assess their association with potential risk factors. Estimated sample size: 500 medication incidents. LIMITATIONS: We will take into account under-reporting and selective reporting, among others, as potential sources of bias or imprecision when interpreting the results. ETHICS AND DISSEMINATION: No formal request was necessary because the data are fully anonymised. The results will be published in a peer-reviewed journal. TRIAL REGISTRATION NUMBER: NCT0229537.
Abstract:
Background: The repertoire of statistical methods for the descriptive analysis of the burden of a disease has been expanded and implemented in statistical software packages in recent years. The purpose of this paper is to present a web-based tool, REGSTATTOOLS http://regstattools.net, intended to provide analysis of the burden of cancer, or of other disease-registry data. Three software applications are included in REGSTATTOOLS: SART (analysis of disease rates and their time trends), RiskDiff (analysis of percent changes in rates due to demographic factors and in the risk of developing or dying from a disease) and WAERS (relative survival analysis). Results: We show a real-data application through the assessment of the burden of tobacco-related cancer incidence in two Spanish regions in the period 1995-2004. Using SART, we show that lung cancer is the most common of these cancers, with rising incidence trends among women. We compared 2000-2004 data with those of 1995-1999 to assess percent changes in the number of cases as well as relative survival, using RiskDiff and WAERS, respectively. We show that the net increase in lung cancer cases among women was mainly attributable to an increased risk of developing lung cancer, whereas in men it is attributable to the increase in population size. Among men, lung cancer relative survival was higher in 2000-2004 than in 1995-1999, whereas it was similar among women between these time periods. Conclusions: Unlike other similar applications, REGSTATTOOLS does not require local software installation, and it is simple to use, fast, and easy to interpret. It is a set of web-based statistical tools intended for the automated calculation of population indicators that any professional in the health or social sciences may require.
Abstract:
The quality control system and the scanning frame of paper machine PK3 at the UPM-Kymmene Oyj Kajaani mill, the subject of this study, were renewed, and the aim was to determine the effects of the renewal on the performance of the quality controls and on paper quality. The literature part of the thesis reviews the parts of the papermaking process in the case of this newsprint machine, as well as the principal measurement and control equipment related to paper quality properties and its operation. Basis weight, dry weight, moisture, and thickness were chosen as the monitored paper quality variables, as they are among the most important online-measured properties of newsprint. Various indicators and tools used to monitor paper quality variables are presented in this work. The commonly used 2σ mean deviation was chosen as the indicator for machine-direction and cross-direction monitoring of the quality variables. Control performance was monitored using the control travel indices (CTI) of the performance triangle and the integrals of absolute error (IAE). In the experimental part, measurement data were collected under both the old and the new quality control system. The monitored operating situations on the paper machine were divided into stable operation and transition situations, the latter comprising web breaks and grade changes. For stable operation, changes in the normal levels of the quality-variable deviations and of the control performance indices were determined. For the transition situations, the aim was to find out whether the system renewal speeds up recovery from web breaks and shortens grade-change time. Based on the results from the stable-operation monitoring, the machine-direction deviations of basis weight and dry weight increased with the system renewal, while the machine-direction deviations of moisture decreased. Of the cross-direction deviations, those of basis weight and dry weight increased, while those of moisture and thickness decreased for some grades. The recovery of the cross-direction quality variables, especially thickness, after a web break became faster, and the time required for a grade change was also shortened for the cross-direction quality variables. The settling times of the machine-direction deviations in transition situations hardly improved.
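The two monitoring quantities named above are simple to compute; a minimal sketch (my code, assuming a uniform sampling interval, not the mill's actual tooling):

```python
import numpy as np

def two_sigma(profile):
    """2-sigma spread used to track the variability of a quality variable."""
    return 2.0 * np.std(np.asarray(profile, dtype=float), ddof=1)

def iae(setpoint, measurement, dt):
    """Integral of absolute error (IAE), rectangle-rule approximation."""
    e = np.abs(np.asarray(setpoint, dtype=float) -
               np.asarray(measurement, dtype=float))
    return float(np.sum(e) * dt)
```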
Abstract:
We study the theoretical properties that a matching function must satisfy in order to represent a labour market with frictions within a general equilibrium model with random matching. We analyse the Cobb-Douglas, CES, and other functional forms for the matching function. Our results establish restrictions on the parameters of these functional forms that ensure the equilibrium is interior. These restrictions provide theoretical reasons for choosing among the various functional forms, and they make it possible to design model misspecification tests for empirical work.
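For context, the functional forms in question are standard (my notation; the paper's exact parameter restrictions are not reproduced here):

```latex
m_{\mathrm{CD}}(u,v) = A\,u^{\alpha}v^{\beta},
\qquad
m_{\mathrm{CES}}(u,v) = A\bigl(a\,u^{\rho} + (1-a)\,v^{\rho}\bigr)^{1/\rho},
```

where u and v denote unemployment and vacancies; an interior equilibrium requires, in particular, that matches never exceed the short side of the market, m(u,v) ≤ min(u,v).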
Abstract:
This Master's thesis deals with the study of spectral images from the viewpoint of a statistical image model. The first part of the thesis examines the effect of the distributions of the statistical parameters on colours and highlights under different illumination conditions. It was observed that the relationships between the statistical parameters do not depend on the illumination conditions, but do depend on the noisiness of the image. It also emerged that high kurtosis may be caused by colour saturation. In addition, a texture-merging algorithm based on the statistical spectral model was developed; it achieved good results when the dependences between the statistical parameters held. In the second part of the work, various spectral images were studied using independent component analysis (ICA). The following ICA algorithms were examined: JADE, fixed-point ICA, and moment-based ICA, with the emphasis on the quality of the separation. The best separation was achieved with the JADE algorithm, although the differences between the algorithms were not significant. The algorithm divided the image into two independent components, either a highlighted and a non-highlighted one, or a chromatic and an achromatic one. Finally, the relationship of kurtosis to image properties such as highlights and colour saturation is discussed, and the last part of the work proposes possible topics for further research.
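Of the three algorithms, fixed-point ICA is available off the shelf as FastICA; a minimal sketch of the two-component separation (the cube layout is an assumption; JADE and the moment-based variant are not in scikit-learn):

```python
import numpy as np
from sklearn.decomposition import FastICA

def separate_components(cube, n_components=2):
    """Split a spectral image into independent component images.

    cube: (height, width, n_bands) array, one spectrum per pixel.
    Returns a (height, width, n_components) array, e.g. a highlighted
    and a non-highlighted component, or a chromatic and an achromatic one.
    """
    h, w, bands = cube.shape
    X = cube.reshape(-1, bands)            # one spectrum per row
    ica = FastICA(n_components=n_components, random_state=0)
    S = ica.fit_transform(X)               # independent components per pixel
    return S.reshape(h, w, n_components)
```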