986 results for R-Statistical computing


Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Measurement System Analysis (MSA) is a statistical methodology developed to study and analyze the behavior of measurement systems and thus increase confidence in the readings taken by measuring instruments. It has been widely used in the automotive industry since the 1990s and is a mandatory requirement for part approval under the automotive sector's ISO standards. The aerospace industry, however, does not require this type of study, since the vast majority of aeronautical parts have characteristics (dimensions) with very tight tolerances, on the order of microns. This work aims to create lists of recommendations for selecting measuring instruments when developing control plans, correlating the tolerance ranges of characteristics with different instrument configurations and acceptance classes (optimal, recommended, and not recommended) through R&R (Repeatability and Reproducibility) studies on aeronautical parts. The methodology of the experimental part was based on the modern continuous-improvement strategy DMAIC (Define, Measure, Analyze, Improve, Control), seeking the best measurement method for the control of milled aeronautical parts by identifying and reducing the variation in the measurement process. The R&R study results for most of the manual measuring instruments were considered acceptable and/or recommended, i.e., with %P/T and %R&R values below 30%, providing statistical data that enabled the elaboration of recommendation tables. These tables have become important reference documents for Process Engineering, which now has a technical study able to identify the most appropriate instrument for a more robust dimensional... (complete abstract available via the electronic access link below)
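
As an illustration of the kind of computation such a study rests on, here is a minimal crossed Gauge R&R sketch in R; the data, design sizes, and tolerance width are invented for the example and are not taken from the work above.

```r
## Minimal Gauge R&R sketch (crossed design: parts x operators, r replicates).
## All numbers here are illustrative assumptions, not the study's data.
set.seed(1)
p <- 10; o <- 3; r <- 3
d <- expand.grid(part = factor(1:p), oper = factor(1:o), rep = 1:r)
d$y <- 25 + as.numeric(d$part) * 0.02 + rnorm(nrow(d), sd = 0.01)

fit <- aov(y ~ part * oper, data = d)
ms  <- summary(fit)[[1]][["Mean Sq"]]          # MS: part, oper, part:oper, error
var_rep  <- ms[4]                              # repeatability (equipment)
var_oper <- max((ms[2] - ms[3]) / (p * r), 0)  # reproducibility (operators)
var_int  <- max((ms[3] - ms[4]) / r, 0)        # operator-by-part interaction
var_part <- max((ms[1] - ms[3]) / (o * r), 0)  # part-to-part variation

sd_rr <- sqrt(var_rep + var_oper + var_int)    # combined R&R standard deviation
tol   <- 0.5                                   # hypothetical tolerance width
100 * sd_rr / sqrt(sd_rr^2 + var_part)         # %R&R (acceptable below 30%)
100 * 6 * sd_rr / tol                          # %P/T (acceptable below 30%)
```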

Relevance:

30.00%

Publisher:

Abstract:

Control charts are very important tools in the statistical quality control of industrial processes, and their use dates back to the last century. Since their development, the charts have typically been applied to independent processes, i.e., with no correlation between samples; nowadays, however, with the high level of automation in industrial environments, autocorrelation between samples is a noticeable factor. The main charts used to monitor quality characteristics represented by continuous variables are those for the mean (X̄), the range (R), and the variance (S²). This work therefore analyzes the performance of X̄ and R charts and of X̄ and S² charts with different sample sizes (4 and 5) for monitoring autocorrelated processes. Through computer simulations written in Fortran, together with mathematical expressions, data were obtained to analyze the detection power of the charts for independent observations and for observations autocorrelated according to an AR(1) model. The results show that autocorrelation reduces the monitoring ability of the control charts and that the stronger this effect, the slower the charts become in signaling process shifts.
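
The simulation idea can be sketched as follows, in R rather than the authors' Fortran; the AR(1) coefficient, shift size, and subgroup size are illustrative assumptions, and the control limits are simply widened to the true standard deviation of the subgroup mean.

```r
## Sketch: X-bar chart detection speed for independent vs AR(1) observations,
## with limits matched to each process so false-alarm rates are comparable.
set.seed(2)
sigma_xbar <- function(phi, n) {          # true sd of a subgroup mean, AR(1)
  k <- seq_len(n - 1)
  sqrt((1 + 2 * sum((1 - k / n) * phi^k)) / n)
}
arl <- function(phi, shift, n = 4, sims = 1000) {
  lim <- 3 * sigma_xbar(phi, n)           # 3-sigma limit for this process
  mean(replicate(sims, {
    x <- rnorm(1); t <- 0                 # start at the stationary distribution
    repeat {
      t <- t + 1
      sub <- numeric(n)
      for (j in 1:n) {                    # unit-variance AR(1) observations
        x <- phi * x + rnorm(1, sd = sqrt(1 - phi^2))
        sub[j] <- x
      }
      if (abs(mean(sub) + shift) > lim) break
    }
    t                                     # run length until the chart signals
  }))
}
arl(phi = 0,   shift = 1)   # independent observations: ARL roughly 6
arl(phi = 0.8, shift = 1)   # autocorrelated: much longer ARL (slower signaling)
```

At matched false-alarm rates, the autocorrelated process yields a much longer average run length, which is the slower signaling the abstract reports.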

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

A model for the joint economic design of X̄ and R control charts is developed. The model assumes that the process is subject to two assignable causes: one shifts the process mean, the other shifts the process variance. The occurrence of an assignable cause of one kind does not block the occurrence of an assignable cause of the other kind; consequently, a second process parameter can go out of control after the first has done so. A numerical study of the cost surface of the model considered has revealed that it is convex, at least in the region of interest.

Relevance:

30.00%

Publisher:

Abstract:

Most authors struggle to pick a title that adequately conveys all of the material covered in a book. When I first saw Applied Spatial Data Analysis with R, I expected a review of spatial statistical models and their applications in packages (libraries) from R's CRAN site. The authors' title is not misleading, but I was very pleasantly surprised by how deep the word "applied" is here. The first half of the book essentially covers how R handles spatial data. To some statisticians this may be boring. Do you want, or need, to know the difference between S3 and S4 classes, how spatial objects in R are organized, and how various methods work on the spatial objects? A few years ago I would have said "no," especially to the "want" part. Just let me slap my Excel spreadsheet into R and run some spatial functions on it. Unfortunately, the world is not so simple, and ultimately we want to minimize the effort needed to get all of our spatial analyses accomplished. The first half of this book certainly convinced me that some extra effort in organizing my data into certain spatial class structures makes the analysis easier and less subject to mistakes. I also admit that I found it very interesting and I learned a lot.
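
For readers curious what that S3/S4 distinction looks like in practice, here is a toy sketch; the point classes are invented for illustration (the book's sp package defines its Spatial* classes in the formal S4 system).

```r
## Toy illustration of S3 vs S4; these classes are made up for the example.
pt <- structure(list(x = 1, y = 2), class = "point")   # S3: just an attribute
print.point <- function(p, ...) cat("S3 point:", p$x, p$y, "\n")
print(pt)                          # informal dispatch on the class attribute

setClass("Point", slots = c(x = "numeric", y = "numeric"))  # S4: formal class
setMethod("show", "Point", function(object)
  cat("S4 point:", object@x, object@y, "\n"))
new("Point", x = 1, y = 2)         # slots are type-checked; access via @, not $
```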

Relevance:

30.00%

Publisher:

Abstract:

While the use of statistical physics methods to analyze large corpora has helped unveil many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., one written in an unknown alphabet) is compatible with a natural language and to which language it could belong. The approach is based on three types of statistical measurements: those obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts in which the text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependence of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include the assortativity, degree, and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript, which could be helpful in the effort to decipher it. Because we were able to identify statistical measurements that depend more on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications.
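
A minimal sketch of the network-based measurements (the paper does not prescribe an implementation; igraph and the toy sentence below are assumptions made here): link adjacent words, then compare degree assortativity between the real and a shuffled word order.

```r
## Sketch: adjacent-word network and degree assortativity, one of the
## network metrics named above. Toy text; igraph is an assumed tool choice.
library(igraph)
words <- strsplit(tolower(
  "the quick brown fox jumps over the lazy dog while the dog sleeps"), " ")[[1]]
g_of <- function(w)                        # link each word to its successor
  simplify(graph_from_edgelist(cbind(head(w, -1), tail(w, -1)),
                               directed = FALSE))
assortativity_degree(g_of(words))          # real word order
assortativity_degree(g_of(sample(words)))  # shuffled version for comparison
degree(g_of(words))                        # word degree, another informative metric
```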

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Recent knowledge of tissue biology highlights a complex regulation of growth factors in reaction to tissue damage. Platelet-rich plasma (PRP), containing a natural pool of growth factors, can be obtained in a simple and minimally invasive way and applied to the lesion site. The aim of this study is to explore this novel approach to treating degenerative cartilage lesions of the knee and chronic tendon lesions (patellar tendon and Achilles tendon). We evaluated whether treatment with PRP injections can reduce pain and increase function in cases of patellar tendinosis (jumper's knee), chronic Achilles tendinopathy, and cartilage injuries of the knee. Materials and Methods: 40 patients with cartilage lesions of the knee (28 male, 12 female; mean age 47 years, range 18-52) were treated and prospectively evaluated at a minimum 6-month follow-up; likewise, 12 patients with Achilles tendon lesions (8 male, 4 female; mean age 44.5 years, range 32-58) and 10 patients with jumper's knee (8 male, 2 female; mean age 23.2 years, range 18-37) were evaluated at 6-month follow-up. The procedure involved 3 injections, performed every two weeks. All patients were clinically evaluated at the end of the treatment and at 6-month follow-up. The IKDC, SF-36, and EQ-VAS scores were used for clinical evaluation, and patient satisfaction and functional status were also recorded. Results: Statistical analysis showed a significant improvement in all SF-36 parameters at the end of therapy and at 6-month follow-up in both groups (tendinopathies and chondral lesions), as well as in the EQ-VAS and IKDC scores (paired t-test, p<0.0005) from the baseline evaluation to the end of therapy, with further improvement at 6-month follow-up. A greater improvement in sport activity level was achieved in the jumper's knee group. No complications related to the injections, nor severe adverse events, were observed during the treatment and follow-up period. Conclusion: PRP inhibits excess inflammation, apoptosis, and metalloproteinase activity. These interactive pathways may result in the restoration of tendon or cartilage that can withstand loading during work or sports activity, thereby diminishing pain. PRP may also modulate the microvascular environment or alter efferent or afferent neural receptors. The clinical results are encouraging, indicating that PRP injections may have the potential to increase the healing capacity of tendon and cartilage in cases of chronic tendinosis and chondropathy of the knee.

Relevance:

30.00%

Publisher:

Abstract:

Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than timing variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show under several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.

Relevance:

30.00%

Publisher:

Abstract:

With improvements in acquisition speed and quality, the amount of medical image data to be screened by clinicians is becoming challenging in daily clinical practice. To quickly visualize and find abnormalities in medical images, we propose a new method combining segmentation algorithms with statistical shape models. A statistical shape model built from a healthy population will fit closely in healthy regions; it will not, however, fit the morphological abnormalities often present in areas of pathology. Using the residual fitting error of the statistical shape model, pathologies can be visualized very quickly. This idea is applied to finding drusen in the retinal pigment epithelium (RPE) of optical coherence tomography (OCT) volumes. A segmentation technique able to accurately segment drusen in patients with age-related macular degeneration (AMD) is applied, and the segmentation is then analyzed with a statistical shape model to visualize potentially pathological areas. An extensive evaluation is performed to validate the segmentation algorithm as well as the quality and sensitivity of the hinting system. Most drusen with a height of 85.5 µm were detected, and all drusen at least 93.6 µm high were detected.
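
The residual-fitting idea can be sketched in a few lines of R; the shape vectors below are synthetic toys, not OCT segmentations, and prcomp stands in for the statistical shape model.

```r
## Minimal sketch of the residual-fitting idea: project a new shape onto a
## PCA model built from healthy shapes; large residuals hint at pathology.
set.seed(3)
healthy <- matrix(rnorm(50 * 20), 50, 20)         # 50 training shapes, 20 points
ssm <- prcomp(healthy, rank. = 5)                 # keep 5 modes of variation

residual_map <- function(shape, ssm) {
  b <- predict(ssm, matrix(shape, 1))             # model-space coefficients
  recon <- ssm$center + drop(b %*% t(ssm$rotation))
  abs(shape - recon)                              # per-point fitting error
}
case <- healthy[1, ]; case[8] <- case[8] + 5      # simulated druse-like bump
which.max(residual_map(case, ssm))                # flags point 8 as suspicious
```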

Relevance:

30.00%

Publisher:

Abstract:

For various reasons, it is important, if not essential, to integrate the computations and code used in data analyses, methodological descriptions, simulations, etc. with the documents that describe and rely on them. This integration allows readers to both verify and adapt the statements in the documents. Authors can easily reproduce them in the future, and they can present the document's contents in a different medium, e.g. with interactive controls. This paper describes a software framework for authoring and distributing these integrated, dynamic documents that contain text, code, data, and any auxiliary content needed to recreate the computations. The documents are dynamic in that the contents, including figures, tables, etc., can be recalculated each time a view of the document is generated. Our model treats a dynamic document as a master or "source" document from which one can generate different views in the form of traditional, derived documents for different audiences. We introduce the concept of a compendium as both a container for the different elements that make up the document and its computations (i.e. text, code, data, ...), and as a means for distributing, managing and updating the collection. The step from disseminating analyses via a compendium to reproducible research is a small one. By reproducible research, we mean research papers with accompanying software tools that allow the reader to directly reproduce the results and employ the methods that are presented in the research paper. Some of the issues involved in paradigms for the production, distribution and use of such reproducible research are discussed.
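
As a concrete, hypothetical illustration of such a master document, here is a minimal Sweave-style source file with an embedded R chunk; the paper describes the general compendium framework rather than this particular tool, and the file and data names are invented.

```
\documentclass{article}
\begin{document}
<<compute, echo=FALSE>>=
x <- read.csv("measurements.csv")$value  # data bundled with the compendium
@
The mean measurement was \Sexpr{round(mean(x), 2)}.
\end{document}
```

Running Sweave("doc.Rnw") in R would regenerate one view of this document, re-running the chunk so that the reported mean always reflects the shipped data.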

Relevance:

30.00%

Publisher:

Abstract:

Objective. The information derived from central venous catheters is underused. We developed an EKG-R synchronization and averaging system to obtain distinct CVP waveforms and analyzed their components. Methods. Twenty-five paralyzed surgical patients undergoing CVP monitoring under mechanical ventilation were studied. CVP and EKG signals were analyzed with our system, the mean CVP and the CVP at end-diastole during expiration were compared, and the CVP waveform components were measured. Results. CVP waveforms were clearly visualized in all patients. The a peak, at 1.8 +/- 0.7 mmHg, was the highest of the three peaks, and the x trough was lower than the y trough (-1.6 +/- 0.7 mmHg and -0.9 +/- 0.5 mmHg, respectively), with a mean pulse pressure of 3.4 mmHg. The difference between the mean CVP and the CVP at end-diastole during expiration was 0.58 +/- 0.81 mmHg. Conclusions. The mean CVP can be used as an index of right ventricular preload in patients under mechanical ventilation with regular sinus rhythm. Our newly developed system is useful for clinical monitoring and for education in circulatory physiology.
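
The synchronization-and-averaging step can be sketched in R with synthetic signals; the sampling rate, R-peak spacing, and window length below are assumptions, not the authors' settings.

```r
## Sketch of EKG-R-synchronized averaging: align CVP segments on detected
## R peaks and average them to suppress noise. Signals are synthetic toys.
set.seed(4)
fs <- 250                                   # sampling rate (Hz), assumed
r_peaks <- seq(200, 4800, by = 200)         # sample indices of EKG R waves
beat <- sin(seq(0, 2 * pi, length.out = 200))          # toy CVP beat shape
cvp  <- rep(beat, 25)[1:5000] + rnorm(5000, sd = 0.3)  # noisy CVP signal

win <- 0:199                                # one-beat window after each R peak
segments <- sapply(r_peaks, function(p) cvp[p + win])
avg_wave <- rowMeans(segments)              # averaged waveform: a, x, y features
plot(win / fs, avg_wave, type = "l",
     xlab = "time after R wave (s)", ylab = "CVP (arbitrary units)")
```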

Relevance:

30.00%

Publisher:

Abstract:

We present a framework for statistical finite element analysis that combines shape and material properties and allows statistical statements about biomechanical performance to be made across a given population. In this paper, we focus on the design of orthopaedic implants that fit a maximum percentage of the target population, both in terms of geometry and of biomechanical stability. CT scans of the bone under consideration are registered non-rigidly to obtain correspondences in position and intensity between them. A statistical model of shape and intensity (bone density) is computed by means of principal component analysis. Afterwards, finite element analysis (FEA) is performed to analyse the biomechanical performance of the bones: realistic forces are applied to the bones, and the resulting displacement and bone stress distribution are calculated. The mechanical behaviour of different PCA bone instances is compared.
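
A hedged sketch of the shape-plus-intensity model construction follows; the data are synthetic stand-ins for registered CT scans, and standardizing the intensity block is just one plausible way to balance the two kinds of measurement.

```r
## Sketch: joint PCA of shape and intensity, then a new plausible instance.
## Synthetic data; the landmark and density-sample counts are arbitrary.
set.seed(5)
n <- 30
shape     <- matrix(rnorm(n * 60), n, 60)   # 20 3-D landmarks per bone
intensity <- matrix(rnorm(n * 40), n, 40)   # 40 bone-density samples per bone
X   <- cbind(shape, scale(intensity))       # balance units before joint PCA
pca <- prcomp(X, rank. = 3)

## Mean bone plus two standard deviations along the first mode: an instance
## that could then be meshed and passed to an FEA solver as described above.
inst <- pca$center + 2 * pca$sdev[1] * pca$rotation[, 1]
length(inst)                                # 60 shape + 40 intensity values
```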