867 results for Nonparametric discriminant analysis
Abstract:
Improvements in the analysis of microarray images are critical for accurately quantifying gene expression levels. The acquisition of accurate spot intensities directly influences the results and interpretation of statistical analyses. This dissertation discusses the implementation of a novel approach to the analysis of cDNA microarray images. We use a stellar photometric model, the Moffat function, to quantify microarray spots from nylon microarray images. The inherent flexibility of the Moffat shape model makes it ideal for quantifying microarray spots. We apply our novel approach to a Wilms' tumor microarray study and compare our results with a fixed-circle segmentation approach for spot quantification. Our results suggest that different spot feature extraction methods can have an impact on the ability of statistical methods to identify differentially expressed genes. We also used the Moffat function to simulate a series of microarray images under various experimental conditions. These simulations were used to validate the performance of various statistical methods for identifying differentially expressed genes. Our simulation results indicate that tests taking into account the dependency between mean spot intensity and variance estimation, such as the smoothed t-test, can better identify differentially expressed genes, especially when the number of replicates and mean fold change are low. The analysis of the simulations also showed that, overall, a rank sum test (Mann-Whitney) performed well at identifying differentially expressed genes. Previous work has suggested the strengths of nonparametric approaches for identifying differentially expressed genes. We also show that multivariate approaches, such as hierarchical and k-means cluster analysis along with principal components analysis, are only effective at classifying samples when replicate numbers and mean fold change are high. Finally, we show how our stellar shape model approach can be extended to the analysis of 2D-gel images by adapting the Moffat function to take into account the elliptical nature of spots in such images. Our results indicate that stellar shape models offer a previously unexplored approach for the quantification of 2D-gel spots.
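For reference, the circular Moffat profile used in stellar photometry has the form I(r) = A[1 + (r/alpha)^2]^(-beta), where alpha sets the core width and beta the heaviness of the wings. The sketch below evaluates a circular profile and an elliptical variant of the kind described for 2D-gel spots; all parameter values and names are illustrative, and this is not the dissertation's fitting code.

```python
import numpy as np

def moffat_circular(x, y, x0, y0, amp, alpha, beta):
    """Circular Moffat profile: amp * (1 + (r/alpha)^2)^(-beta)."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return amp * (1.0 + r2 / alpha ** 2) ** (-beta)

def moffat_elliptical(x, y, x0, y0, amp, ax, ay, theta, beta):
    """Elliptical Moffat: separate widths along rotated axes (as for 2D-gel spots)."""
    ct, st = np.cos(theta), np.sin(theta)
    u = (x - x0) * ct + (y - y0) * st    # coordinates rotated into the spot frame
    v = -(x - x0) * st + (y - y0) * ct
    return amp * (1.0 + (u / ax) ** 2 + (v / ay) ** 2) ** (-beta)

# Evaluate on a small pixel grid, e.g. as the model term in a least-squares spot fit.
yy, xx = np.mgrid[0:32, 0:32]
spot = moffat_circular(xx, yy, 15.5, 15.5, amp=1000.0, alpha=3.0, beta=2.5)
```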
Abstract:
Background. Screening for colorectal cancer (CRC) is considered cost-effective, but screening compliance in the US remains low. There have been very few economic analyses of screening promotion strategies for colorectal cancer. The main aim of the current study is to conduct a cost-effectiveness analysis (CEA), and to examine the uncertainty involved in its results, of a tailored intervention to promote screening for CRC among patients of a multispecialty clinic in Houston, TX.

Methods. The two intervention arms received a PC-based tailored program and web-based educational information to promote CRC screening. The incremental cost of implementing the tailored PC-based program was compared with the web-based education and with the status quo of no intervention, for each unit of effect, after 12 months of delivering the intervention. Uncertainty in the point estimates of cost and effect was analyzed using nonparametric bootstrapping.

Results. The cost of implementing the web-based educational intervention was $36.00 per person, and the cost of the tailored PC-based interactive intervention was $43.00 per person. The additional cost per person screened for the web-based strategy was $2374, and the effect of the tailored intervention was negative.
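The nonparametric bootstrap mentioned above resamples per-person (cost, effect) pairs within each arm to characterize uncertainty in the incremental cost-effectiveness ratio. A minimal sketch under assumed data structures (the arrays, sample sizes, and screening rates are hypothetical, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_icer(cost_a, eff_a, cost_b, eff_b, n_boot=5000):
    """Resample (cost, effect) pairs within each arm; return ICER replicates."""
    icers = np.empty(n_boot)
    for b in range(n_boot):
        ia = rng.integers(0, len(cost_a), len(cost_a))  # arm A resample, with replacement
        ib = rng.integers(0, len(cost_b), len(cost_b))  # arm B resample, with replacement
        d_cost = cost_a[ia].mean() - cost_b[ib].mean()
        d_eff = eff_a[ia].mean() - eff_b[ib].mean()
        icers[b] = d_cost / d_eff                       # incremental cost per unit effect
    return icers

# Hypothetical per-person costs and screening outcomes (1 = screened, 0 = not).
cost_web = np.full(200, 36.0)
eff_web = rng.binomial(1, 0.5, 200).astype(float)
cost_none = np.zeros(200)
eff_none = rng.binomial(1, 0.3, 200).astype(float)

reps = bootstrap_icer(cost_web, eff_web, cost_none, eff_none)
print(np.percentile(reps, [2.5, 50, 97.5]))             # bootstrap percentile interval
```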
Abstract:
Introduction. Despite the ban on lead-containing gasoline and paint, childhood lead poisoning remains a public health issue. Furthermore, a Medicaid-eligible child is 8 times more likely to have an elevated blood lead level (EBLL) than a non-Medicaid child, which is the primary reason for the early-detection lead screening mandate at ages 12 and 24 months in the Medicaid population. Based on field observations, there was evidence of a screening compliance issue. Objective. The purpose of this study was to analyze blood lead screening compliance in previously lead-poisoned Medicaid children and to test for an association between timely lead screening and timely childhood immunizations. The mean months between follow-up tests were also examined for a significant difference between the non-compliant and compliant lead-screened children. Methods. Access to the surveillance data on all childhood lead poisoning cases in Bexar County was granted by the San Antonio Metropolitan Health District. A database was constructed and analyzed using descriptive statistics, logistic regression methods, and nonparametric tests. Lead screening at 12 months of age was analyzed separately from lead screening at 24 months. The small portion of the population who were related to one another were included in one analysis and removed from a second analysis to check whether the results remained significant. Gender, ethnicity, age of home, and having a sibling with an EBLL were ruled out as confounders for the association tests, but ethnicity and age of home were adjusted for in the nonparametric tests. Results. There was a strong, significant association between lead screening compliance at 12 months and childhood immunization compliance, with or without related children included (p<0.00). However, there was no significant association between the two variables at the age of 24 months. Furthermore, there was no significant difference in the median of the mean months between follow-up blood tests for the non-compliant versus compliant children in the 12-month screening group, but there was a significant difference in the 24-month screening group (p<0.01). Discussion. Descriptive statistics showed that 61% and 56% of the previously lead-poisoned Medicaid population did not receive their mandated 12- and 24-month lead screening on time, respectively, which suggests that their elevated blood lead levels could have been diagnosed earlier in childhood. Furthermore, a child who was compliant with lead screening at 12 months of age was 2.36 times more likely to also receive childhood immunizations on time than a child who was not compliant with the 12-month screening. Even though no statistically significant association was found for the 24-month group, the public health significance of a screening compliance issue is no less important. The Texas Medicaid program needs to enforce lead screening compliance, because it is evident that there has been no monitoring system in place. Further recommendations include an increased focus on parental education and on the importance of taking children to wellness exams on time.
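The association and follow-up-interval comparisons described above map onto standard tests; a hedged sketch with made-up counts and intervals (the 2x2 table and both samples are hypothetical, not the study's data):

```python
from scipy import stats

# Hypothetical 2x2 table: rows = lead screening compliant at 12 months (yes/no),
# columns = childhood immunizations received on time (yes/no).
table = [[40, 15],
         [30, 35]]
odds_ratio, p_assoc = stats.fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_assoc:.4f}")

# Hypothetical months between follow-up blood tests for the two groups,
# compared with a rank-based (nonparametric) test.
compliant = [3.1, 4.0, 2.8, 3.5, 4.2, 3.9]
noncompliant = [5.0, 6.2, 4.8, 7.1, 5.5, 6.0]
u_stat, p_diff = stats.mannwhitneyu(compliant, noncompliant)
print(f"U = {u_stat:.1f}, p = {p_diff:.4f}")
```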
Abstract:
We have performed quantitative X-ray diffraction (qXRD) analysis of 157 grab or core-top samples from the western Nordic Seas (WNS), between ~57° and 75°N and 5° and 45°W. The RockJock Vs6 analysis includes 20 non-clay and 10 clay mineral species in the <2 mm size fraction that sum to 100 weight %. The data matrix was reduced to 9 and 6 variables, respectively, by excluding minerals with low weight % and by grouping species into larger categories, such as the alkali and plagioclase feldspars. Because of its potential dual origins, calcite was placed outside of the sum. We initially hypothesized that a combination of regional bedrock outcrops and transport associated with drift ice, meltwater plumes, and bottom currents would result in 6 clusters defined by "similar" mineral compositions. The hypothesis was tested using a fuzzy k-means clustering algorithm, and key minerals were identified by stepwise discriminant function analysis. Key minerals in defining the clusters include quartz, pyroxene, muscovite, and amphibole. With 5 clusters, 87.5% of the observations are correctly classified. The geographic distributions of the five k-means clusters compare reasonably well with the original hypothesis. The close spatial relationship between bedrock geology and discrete cluster membership stresses the importance of this variable both at the WNS scale and at a more local scale in NE Greenland.
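Fuzzy k-means (often called fuzzy c-means) assigns each sample a membership weight in every cluster rather than a hard label, which suits compositional data with mixed provenance. A minimal numpy sketch of the standard algorithm; variable names, the fuzzifier m, and the random data are illustrative, not the study's workflow:

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means: returns cluster centers and membership matrix U."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)           # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m                               # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                    # guard against division by zero
        inv = d ** (-2.0 / (m - 1.0))            # standard membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# e.g. X = (samples x mineral weight-%) matrix; c = 5 clusters as in the study.
X = np.random.default_rng(1).random((157, 9))
centers, U = fuzzy_cmeans(X, c=5)
hard_labels = U.argmax(axis=1)                   # nearest-cluster assignment
```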
Abstract:
The objective of this thesis is the development of cooperative localization and tracking algorithms using nonparametric message passing techniques. In contrast to the most well-known techniques, the goal is to estimate the posterior probability density function (PDF) of the position of each sensor. This problem can be solved using a Bayesian approach, but it is intractable in the general case. Nevertheless, particle-based approximation (via a nonparametric representation) and an appropriate factorization of the joint PDFs (using message passing methods) make the Bayesian approach tractable for inference in sensor networks. The best-known method for this problem, nonparametric belief propagation (NBP), can lead to inaccurate beliefs and possible non-convergence in loopy networks. Therefore, we propose four novel algorithms that alleviate these problems: nonparametric generalized belief propagation (NGBP) based on junction trees (NGBP-JT), NGBP based on pseudo-junction trees (NGBP-PJT), NBP based on spanning trees (NBP-ST), and uniformly-reweighted NBP (URW-NBP). We also extend NBP for cooperative localization in mobile networks. In contrast to previous methods, we use optional smoothing, provide a novel communication protocol, and increase the efficiency of the sampling techniques. Moreover, we propose novel algorithms for distributed tracking, in which the goal is to track a passive object that cannot localize itself. In particular, we develop distributed particle filtering (DPF) based on three asynchronous belief consensus (BC) algorithms: standard belief consensus (SBC), broadcast gossip (BG), and belief propagation (BP). Finally, the last part of this thesis includes an experimental analysis of some of the proposed algorithms, in which we found that results based on real measurements are very similar to results based on theoretical models.
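Belief consensus lets sensors agree on a network-wide quantity using only neighbor exchanges; broadcast gossip is one asynchronous variant. A toy sketch of averaging by broadcast gossip over a random graph, where the mixing weight and topology are illustrative assumptions rather than the thesis's algorithms verbatim:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random network: directed adjacency lists for 10 nodes.
n = 10
adj = {i: [j for j in range(n) if j != i and rng.random() < 0.4] for i in range(n)}

x = rng.normal(size=n)             # each node's local value (e.g. a log-belief term)
target = x.mean()                  # consensus target: the network-wide average

gamma = 0.5                        # mixing weight (assumed)
for _ in range(2000):
    i = rng.integers(n)            # a random node wakes up and broadcasts x[i]
    for j in adj[i]:               # its neighbors move toward the broadcast value
        x[j] = (1 - gamma) * x[j] + gamma * x[i]

# Broadcast gossip preserves the average only in expectation, so nodes end up
# near (not exactly at) the true mean.
print(abs(x - target).max())
```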
Abstract:
In this paper, the authors provide a methodology for designing nonparametric permutation tests and, in particular, nonparametric rank tests for applications in detection. In the first part of the paper, the authors develop the optimization theory of both permutation and rank tests in the Neyman–Pearson sense; in the second part, they carry out a comparative performance analysis of the permutation and rank tests (detectors) against parametric ones in radar applications. First, a brief review of some contributions on nonparametric tests is given. Then, the optimum permutation and rank tests are derived. Finally, a performance analysis is carried out by Monte Carlo simulations for the corresponding detectors, and the results are shown as curves of detection probability versus signal-to-noise ratio.
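A permutation test compares the observed statistic against its distribution under random relabeling of the data, giving distribution-free control of the false-alarm rate. A minimal sketch of a two-sample permutation detector; the exponential noise model, signal offset, and threshold are illustrative assumptions, not the paper's radar model:

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_pvalue(reference, test, n_perm=10_000):
    """P-value for 'test has larger mean than reference' under label permutation."""
    observed = test.mean() - reference.mean()
    pooled = np.concatenate([reference, test])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)           # random relabeling of all samples
        stat = perm[len(reference):].mean() - perm[:len(reference)].mean()
        count += stat >= observed
    return (count + 1) / (n_perm + 1)            # add-one correction

# Noise-only reference window vs. a cell under test (hypothetical radar-like data).
noise = rng.exponential(1.0, size=32)
cell = rng.exponential(1.0, size=8) + 0.8        # weak signal added to the test cell
p = permutation_pvalue(noise, cell)
print("detect" if p < 0.05 else "no detection", p)
```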
Abstract:
This work proposes an optimization of a semi-supervised Change Detection methodology based on a combination of Change Indices (CI) derived from a multitemporal image data set. For this purpose, SPOT 5 Panchromatic images with 2.5 m spatial resolution have been used, from which three Change Indices have been calculated. Two of them are commonly used indices; the third one has been derived from the Kullback-Leibler divergence. These three indices have then been combined to form a multiband image that has been used as input for a Support Vector Machine (SVM) classifier, in which four different discriminant functions have been tested in order to differentiate between the change and no-change categories. The performance of the suggested procedure has been assessed by applying different quality measures, reaching in each case highly satisfactory values. These results demonstrate that the simultaneous combination of basic change indices with more sophisticated ones like the Kullback-Leibler divergence, together with the application of nonparametric discriminant functions like those employed in the SVM method, allows a change detection problem to be solved efficiently.
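As a sketch of the classification step, the scikit-learn SVC below is trained on a per-pixel feature matrix and evaluated with several candidate discriminant (kernel) functions. The synthetic three-band features stand in for the change indices described above; they are not the paper's data:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 3-band change-index image: one row per pixel,
# columns = the three change indices; labels 1 = change, 0 = no-change.
n = 1000
X = rng.normal(size=(n, 3))
y = (X @ np.array([1.0, 0.5, 1.5]) + 0.3 * rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for kernel in ["linear", "poly", "rbf", "sigmoid"]:   # candidate discriminant functions
    clf = SVC(kernel=kernel).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, clf.predict(X_te))
    print(f"{kernel:8s} accuracy = {acc:.3f}")
```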
Abstract:
A novel GPU-based nonparametric moving object detection strategy is proposed for computer vision tools requiring real-time processing. An alternative, efficient Bayesian classifier that combines nonparametric background and foreground models increases correct detections while avoiding false detections. Additionally, an efficient region-of-interest analysis significantly reduces the computational cost of the detections.
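One common nonparametric formulation models each pixel's background with a kernel density estimate over recent samples and classifies via the ratio of foreground to background likelihoods. A CPU-only toy sketch; the Gaussian kernel bandwidth, uniform foreground model, and prior are assumptions for illustration, not the paper's GPU classifier:

```python
import numpy as np

def background_likelihood(history, value, bandwidth=10.0):
    """KDE over a pixel's recent intensity samples (Gaussian kernel)."""
    z = (value - history) / bandwidth
    return np.exp(-0.5 * z * z).mean() / (bandwidth * np.sqrt(2 * np.pi))

def is_foreground(history, value, p_fg=0.1):
    """Bayesian decision with a uniform foreground model over 256 gray levels."""
    lik_bg = background_likelihood(history, value)
    lik_fg = 1.0 / 256.0
    return lik_fg * p_fg > lik_bg * (1.0 - p_fg)

# Hypothetical pixel: 50 recent background samples around gray level 100.
rng = np.random.default_rng(0)
hist = 100 + 3 * rng.normal(size=50)
print(is_foreground(hist, 101))   # near the background mode -> False
print(is_foreground(hist, 200))   # far from the mode -> True (moving object)
```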
Abstract:
A sample of 95 sib pairs affected with insulin-dependent diabetes and typed, together with their normal parents, for 28 markers on chromosome 6 has been analyzed by several methods. When appropriate parameters are efficiently estimated, a parametric model is equivalent to the β model, which is superior to nonparametric alternatives both in single-point tests (as found previously) and in multipoint tests. Theory is given for meta-analysis combined with allelic association, and problems that may be associated with errors of map location and/or marker typing are identified. Reducing by multipoint analysis the number of association tests in a dense map can give a 3-fold reduction in the critical lod, and therefore in the cost of positional cloning.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The judicial interest in ‘scientific’ evidence has driven recent work to quantify results for forensic linguistic authorship analysis. Through a methodological discussion and a worked example, this paper examines the issues which complicate attempts to quantify results in such work. The solution suggested for some of the difficulties is a sampling and testing strategy which helps to identify potentially useful, valid and reliable markers of authorship. An important feature of the sampling strategy is that markers identified as being generally valid and reliable are retested for use in specific authorship analysis cases. The suggested approach for drawing quantified conclusions combines discriminant function analysis and Bayesian likelihood measures. The worked example starts with twenty comparison texts for each of three potential authors and then uses a progressively smaller comparison corpus, reducing to fifteen, ten, five and finally three texts per author. It demonstrates how reducing the amount of data affects the way conclusions can be drawn. With greater numbers of reference texts, quantified and safe attributions are shown to be possible, but as the number of reference texts is reduced the analysis shows that the conclusion which should be reached is that no attribution can be made. At no point does the testing process result in a misattribution.
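A sketch of how discriminant function analysis can be combined with Bayesian likelihood measures: fit a discriminant model on known-author comparison texts, then express the questioned text's evidence as a likelihood ratio between candidate authors. The stylometric features and data below are hypothetical, not the paper's markers:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical stylometric features (e.g. function-word rates) for 3 authors,
# 20 comparison texts each, 4 features per text.
means = np.array([[0.0, 0.0, 0.0, 0.0],
                  [1.0, 0.5, 0.0, 0.2],
                  [0.2, 1.2, 0.8, 0.0]])
X = np.vstack([m + 0.5 * rng.normal(size=(20, 4)) for m in means])
y = np.repeat([0, 1, 2], 20)

lda = LinearDiscriminantAnalysis().fit(X, y)

questioned = means[1] + 0.5 * rng.normal(size=4)   # a questioned document
post = lda.predict_proba([questioned])[0]          # posterior probability per author

# Likelihood ratio of the best candidate against the best alternative
# (with equal priors, posterior odds equal the likelihood ratio).
best = post.argmax()
lr = post[best] / np.delete(post, best).max()
print(f"candidate author {best}, LR vs best alternative = {lr:.1f}")
```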
Abstract:
Bove, Pervan, Beatty, and Shiu [Bove, LL, Pervan, SJ, Beatty, SE, Shiu, E. Service worker role in encouraging customer organizational citizenship behaviors. J Bus Res 2009;62(7):698–705.] develop and test a latent variable model of the role of service workers in encouraging customers' organizational citizenship behaviors. However, Bove et al. claim support for hypothesized relationships between constructs that, due to insufficient discriminant validity regarding certain constructs, may be inaccurate. This research comment discusses what discriminant validity represents and procedures for establishing it, and presents an example of inaccurate discriminant validity assessment based on the work of Bove et al. Solutions to discriminant validity problems and a five-step procedure for assessing discriminant validity conclude the paper. This comment hopes to motivate a review of discriminant validity issues and offers assistance to future researchers conducting latent variable analysis.
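One widely used check, the Fornell-Larcker criterion, compares each construct's average variance extracted (AVE) with its squared correlation with other constructs; it is shown here only as a common example, not as the five-step procedure the comment proposes. The loadings and correlation below are assumed:

```python
import numpy as np

# Hypothetical standardized loadings for two reflective constructs.
loadings_a = np.array([0.82, 0.78, 0.75])
loadings_b = np.array([0.80, 0.71, 0.69])

def ave(loadings):
    """Average variance extracted: mean squared standardized loading."""
    return np.mean(loadings ** 2)

phi = 0.85                         # hypothetical estimated inter-construct correlation

ave_a, ave_b = ave(loadings_a), ave(loadings_b)
shared = phi ** 2                  # variance the two constructs share

# Fornell-Larcker: each AVE should exceed the squared inter-construct correlation.
print(f"AVE(A) = {ave_a:.2f}, AVE(B) = {ave_b:.2f}, r^2 = {shared:.2f}")
print("discriminant validity" if min(ave_a, ave_b) > shared
      else "problem: constructs overlap")
```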
Abstract:
The use of Diagnosis Related Groups (DRG) as a mechanism for hospital financing is a currently debated topic in Portugal. The DRG system was scheduled to be initiated by the Health Ministry of Portugal on January 1, 1990 as an instrument for the allocation of public hospital budgets funded by the National Health Service (NHS), and as a method of payment for other third-party payers (e.g., Public Employees (ADSE), private insurers, etc.). Based on experience from other countries such as the United States, it was expected that implementation of this system would result in more efficient hospital resource utilisation and a more equitable distribution of hospital budgets. However, in order to minimise the potentially adverse financial impact on hospitals, the Portuguese Health Ministry decided to gradually phase in the use of the DRG system for budget allocation by using blended hospital-specific and national DRG casemix rates. Since implementation in 1990, the percentage of each hospital's budget based on hospital-specific costs was to decrease, while the percentage based on DRG casemix was to increase. This was scheduled to continue until 1995, when the plan called for allocating yearly budgets on a 50% national and 50% hospital-specific cost basis. While all other non-NHS third-party payers are currently paying based on DRGs, the adoption of DRG casemix as a National Health Service budget-setting tool has been slower than anticipated. There is now some argument in both the political and academic communities as to the appropriateness of DRGs as a budget-setting criterion as well as to their impact on hospital efficiency in Portugal. This paper uses a two-stage procedure to assess the impact of actual DRG payment on the productivity (through its components, i.e., technological change and technical efficiency change) of diagnostic technology in Portuguese hospitals during the years 1992–1994, using both parametric and nonparametric frontier models. We find evidence that the DRG payment system does appear to have had a positive impact on the productivity and technical efficiency of some commonly employed diagnostic technologies in Portugal during this time span.
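Nonparametric frontier analysis is typically operationalized as data envelopment analysis (DEA). The sketch below solves the input-oriented CCR efficiency linear program for one unit with scipy; the hospital input/output matrix is made up, and this is a generic textbook formulation rather than the paper's exact two-stage model:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of unit o.
    X: (n_units, n_inputs), Y: (n_units, n_outputs). Returns theta in (0, 1]."""
    n = X.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                    # variables: [theta, lambda_1..lambda_n]
    A_ub, b_ub = [], []
    for i in range(X.shape[1]):                   # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(Y.shape[1]):                   # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Hypothetical hospitals: inputs (staff, scanner-hours), output (exams performed).
X = np.array([[5.0, 14.0], [8.0, 15.0], [7.0, 12.0], [4.0, 10.0]])
Y = np.array([[9.0], [5.0], [4.0], [6.0]])
for o in range(len(X)):
    print(f"hospital {o}: efficiency = {dea_ccr_input(X, Y, o):.3f}")
```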
Abstract:
The kinematic mapping of a rigid open-link manipulator is a homomorphism between Lie groups. The homomorphism has solution groups that act on an inverse kinematic solution element. A canonical representation of solution group operators that act on a solution element of three and seven degree-of-freedom (dof) dextrous manipulators is determined by geometric analysis. Seven canonical solution groups are determined for the seven-dof Robotics Research K-1207 and Hollerbach arms. The solution element of a dextrous manipulator is a collection of trivial fibre bundles with solution fibres homotopic to the torus. If fibre solutions are parameterised by a scalar, a direct inverse function that maps the scalar and Cartesian base space coordinates to solution element fibre coordinates may be defined. A direct inverse parameterisation of a solution element may be approximated by a local linear map generated by an inverse augmented Jacobian correction of a linear interpolation. The action of canonical solution group operators on a local linear approximation of the solution element of inverse kinematics of dextrous manipulators generates cyclical solutions. The solution representation is proposed as a model of inverse kinematic transformations in primate nervous systems. Simultaneous calibration of a composition of stereo-camera and manipulator kinematic models is under-determined by equi-output parameter groups in the composition of stereo-camera and Denavit-Hartenberg (DH) models. An error measure for simultaneous calibration of a composition of models is derived, and parameter subsets with no equi-output groups are determined by numerical experiments to simultaneously calibrate the composition of homogeneous or pan-tilt stereo-camera with DH models. For acceleration of exact Newton second-order re-calibration of DH parameters after a sequential calibration of stereo-camera and DH parameters, an optimal numerical evaluation of DH matrix first-order and second-order error derivatives with respect to a re-calibration error function is derived, implemented and tested. A distributed object environment for point-and-click image-based tele-command of manipulators and stereo-cameras is specified and implemented that supports rapid prototyping of numerical experiments in distributed system control. The environment is validated by a hierarchical k-fold cross-validated calibration to Cartesian space of a radial basis function regression correction of an affine stereo model. Basic design and performance requirements are defined for scalable virtual micro-kernels that broker inter-Java-virtual-machine remote method invocations between components of secure, manageable, fault-tolerant, open, distributed, agile, Total Quality Managed, ISO 9000+ conformant, Just-in-Time manufacturing systems.
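As a sketch of the Jacobian-corrected interpolation idea, the snippet below refines an initial joint-angle guess with generic damped least-squares inverse-kinematics steps for a planar 2-link arm. The arm geometry, damping, and step count are assumed; this is not the thesis's augmented-Jacobian formulation:

```python
import numpy as np

L1, L2 = 1.0, 0.8                      # assumed link lengths of a planar 2-link arm

def fk(q):
    """Forward kinematics: joint angles -> end-effector position."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_correct(q0, target, damping=0.05, n_iter=50):
    """Refine an initial guess (e.g. from linear interpolation) by damped
    least-squares Jacobian iterations toward the Cartesian target."""
    q = q0.copy()
    for _ in range(n_iter):
        e = target - fk(q)                      # Cartesian error
        J = jacobian(q)
        dq = J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), e)
        q += dq
    return q

q = ik_correct(np.array([0.3, 0.6]), target=np.array([1.2, 0.9]))
print(fk(q))                                    # close to the target
```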
Abstract:
This thesis seeks to describe the development of an inexpensive and efficient clustering technique for multivariate data analysis. The technique starts from a multivariate data matrix and ends with a graphical representation of the data and a pattern recognition discriminant function. The technique also produces a distances frequency distribution that might be useful in detecting clustering in the data or for the estimation of parameters useful in the discrimination between the different populations in the data. The technique can also be used in feature selection. The technique is essentially for the discovery of data structure by revealing the component parts of the data. The thesis offers three distinct contributions to cluster analysis and pattern recognition techniques. The first contribution is the introduction of a transformation function into the technique of nonlinear mapping. The second contribution is the use of the distances frequency distribution instead of the distances time-sequence in nonlinear mapping. The third contribution is the formulation of a new generalised and normalised error function together with its optimal step size formula for gradient method minimisation. The thesis consists of five chapters. The first chapter is the introduction. The second chapter describes multidimensional scaling as the origin of the nonlinear mapping technique. The third chapter describes the first development step in the technique of nonlinear mapping: the introduction of the "transformation function". The fourth chapter describes the second development step of the nonlinear mapping technique: the use of the distances frequency distribution instead of the distances time-sequence. The chapter also includes the formulation of the new generalised and normalised error function. Finally, the fifth chapter, the conclusion, evaluates all developments and proposes a new program for cluster analysis and pattern recognition by integrating all the new features.
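Nonlinear mapping in this tradition (e.g. Sammon mapping) projects multivariate data to 2-D by gradient descent on a stress between original and mapped inter-point distances. A compact sketch using the plain Sammon stress with a fixed step size; the thesis's generalised error function and optimal step-size formula are not reproduced here, and the data are synthetic:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def sammon_map(X, n_iter=300, lr=0.3, seed=0):
    """Gradient descent on Sammon stress: preserve pairwise distances in 2-D."""
    rng = np.random.default_rng(seed)
    D = squareform(pdist(X))                  # original inter-point distances
    np.fill_diagonal(D, 1.0)                  # avoid division by zero on the diagonal
    Y = rng.normal(scale=1e-2, size=(X.shape[0], 2))
    c = D.sum()
    for _ in range(n_iter):
        d = squareform(pdist(Y))
        np.fill_diagonal(d, 1.0)
        # gradient of E = (1/c) * sum (D - d)^2 / D with respect to Y;
        # the fixed step lr is illustrative (practical codes use a second-order step).
        coeff = (d - D) / (D * d)
        grad = 2.0 / c * (coeff[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
        Y -= lr * grad
    return Y

# e.g. map a 4-D data matrix to the plane for visual cluster detection.
X = np.vstack([np.random.default_rng(i).normal(i, 0.3, size=(20, 4)) for i in range(3)])
Y2 = sammon_map(X)
```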