38 results for geometric quantization


Relevance: 10.00%

Abstract:

Partial moments are extensively used in actuarial science for the analysis of risks. Since the first-order partial moment gives the expected loss in a stop-loss treaty with infinite cover as a function of the priority, it is referred to as the stop-loss transform. In the present work, we discuss distributional and geometric properties of the first- and second-order partial moments defined in terms of the quantile function. Relationships of the scaled stop-loss transform curve with the Lorenz, Gini, Bonferroni and Leimkuhler curves are developed.
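
As a point of reference (a standard actuarial definition stated here for context, not a result of the work itself), the stop-loss transform of a non-negative risk $X$ with survival function $\bar{F}$ and quantile function $Q$ can be written as

\[
\pi_X(t) = E\big[(X-t)_+\big] = \int_t^{\infty} \bar{F}(x)\,dx,
\qquad
\pi_X\big(Q(u)\big) = \int_u^{1} \big(Q(p) - Q(u)\big)\,dp, \quad 0 < u < 1,
\]

the second form being the quantile-based expression of the same quantity, which is the setting the abstract refers to.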

Relevance: 10.00%

Abstract:

Knowledge discovery in databases is the non-trivial process of identifying valid, novel, potentially useful and ultimately understandable patterns from data. The term data mining refers to the process of performing exploratory analysis on the data and building models from it. To infer patterns from data, data mining uses different approaches such as association rule mining, classification techniques or clustering techniques. Among the many data mining techniques, clustering plays a major role, since it helps to group related data for assessing properties and drawing conclusions. Most clustering algorithms act on a dataset in a uniform format, since the similarity or dissimilarity between data points is a significant factor in finding the clusters. If a dataset consists of mixed attributes, i.e. a combination of numerical and categorical variables, a preferred approach is to convert the different formats into a uniform one. The research study explores various techniques for converting mixed data sets to a numerical equivalent, so that statistical and similar algorithms can be applied. The results of clustering mixed-category data after conversion to a numeric data type are demonstrated using a crime data set. The thesis also proposes an extension to the well-known algorithm for handling mixed data types, to deal with data sets having only categorical data. The proposed conversion has been validated on a breast cancer data set. Another issue with the clustering process is the visualization of the output. Different geometric techniques such as scatter plots or projection plots are available, but none of them displays the result as a projection of the whole database; instead they present attribute-pairwise analyses.
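
As a rough illustration of the conversion-plus-clustering idea described above, the sketch below one-hot encodes categorical columns and scales numerical ones before applying k-means. This is a generic approach, not necessarily the conversion proposed in the thesis; the column names and values are hypothetical.

```python
# Generic mixed-to-numeric conversion (one-hot encoding of categorical
# columns, scaling of numerical columns) followed by k-means clustering.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.cluster import KMeans

df = pd.DataFrame({
    "offence":  ["burglary", "assault", "burglary", "fraud"],  # categorical
    "district": ["north", "south", "north", "east"],           # categorical
    "victims":  [1, 2, 1, 3],                                   # numerical
    "loss":     [2500.0, 0.0, 1800.0, 9200.0],                  # numerical
})

pre = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["offence", "district"]),
    ("num", StandardScaler(), ["victims", "loss"]),
])
X = pre.fit_transform(df)  # uniform numeric matrix

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```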

Relevance: 10.00%

Abstract:

The pedicle screw insertion technique has revolutionized the surgical treatment of spinal fractures and spinal disorders. Although X-ray fluoroscopy-based navigation is popular, it carries the risk of prolonged exposure to X-ray radiation, and systems with lower radiation risk are generally quite expensive. The position and orientation of the drill are clinically very important in pedicle screw fixation. In this paper, the position and orientation of the marker on the drill are determined using pattern recognition methods based on geometric features obtained from the input video sequence captured by a CCD camera. A search is then performed on the preprocessed video frames to obtain the exact position and orientation of the drill. An animated graphic showing the instantaneous position and orientation of the drill is then overlaid on the processed video for real-time drill control and navigation.
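
A minimal sketch of how a marker's in-plane position and orientation could be recovered from a video frame using simple geometric features (a thresholded contour and its minimum-area rectangle). This is illustrative only and is not the authors' actual pipeline; the video file name, the thresholding choices and the "largest blob" assumption are placeholders.

```python
# Locate a marker in a frame and report its pixel position and in-plane
# orientation using contour geometry.
import cv2
import numpy as np

def marker_pose(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    _, binary = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Assume the marker is the largest bright blob in the frame.
    marker = max(contours, key=cv2.contourArea)
    (cx, cy), (w, h), angle = cv2.minAreaRect(marker)
    return (cx, cy), angle  # pixel position and orientation in degrees

cap = cv2.VideoCapture("drill_video.avi")  # hypothetical input file
ok, frame = cap.read()
if ok:
    print(marker_pose(frame))
cap.release()
```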

Relevance: 10.00%

Abstract:

In this paper, the effectiveness of a novel method of computer-assisted pedicle screw insertion was studied using a hypothesis testing procedure with a sample size of 48. Pattern recognition based on geometric features of markers on the drill has been performed on real-time optical video obtained from orthogonally placed CCD cameras. The study demonstrates the accuracy of the calculated drill position using navigation based on a CT image of the vertebra and real-time optical video of the drill. The significance value is 0.424 at the 95% confidence level, which indicates good precision, with a standard mean error of only 0.00724. The virtual vision method is less hazardous to both the patient and the surgeon.
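
For orientation only, a sketch of the kind of one-sample test that could yield the reported quantities (a p-value, a sample of size 48 and a standard error). The error data below are synthetic placeholders, and the thesis may have used a different test.

```python
# Test whether the mean positional error between navigated and reference
# drill positions differs from zero.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
errors = rng.normal(loc=0.0, scale=0.05, size=48)  # hypothetical errors (mm)

t_stat, p_value = stats.ttest_1samp(errors, popmean=0.0)
sem = stats.sem(errors)  # standard error of the mean

# At the 95% confidence level we reject only if p < 0.05; a p-value such as
# 0.424 would indicate no significant deviation from the reference position.
print(f"t = {t_stat:.3f}, p = {p_value:.3f}, SEM = {sem:.5f}")
```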

Relevance: 10.00%

Abstract:

The standard separable two-dimensional wavelet transform has achieved great success in image denoising applications due to its sparse representation of images. However, it fails to capture efficiently the anisotropic geometric structures in images, such as edges and contours, because they intersect too many wavelet basis functions and lead to a non-sparse representation. In this paper, a novel denoising scheme based on a multi-directional and anisotropic wavelet transform called the directionlet transform is presented. Image denoising in the wavelet domain is extended to the directionlet domain so that image features concentrate on fewer coefficients and more effective thresholding is possible. The image is first segmented and the dominant direction of each segment is identified to build a directional map. According to this map, the directionlet transform is taken along the dominant direction of each segment. The decomposed images with directional energy are used for scale-dependent, subband-adaptive optimal threshold computation based on the SURE risk. This threshold is applied to all subbands except the LLL subband. The threshold-corrected subbands, together with the unprocessed LLL subband, are given as input to the inverse directionlet algorithm to obtain the denoised image. Experimental results show that the proposed method outperforms standard wavelet-based denoising methods in terms of numerical and visual quality.
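
A minimal sketch of the standard separable-wavelet denoising baseline that the paper extends; the directionlet transform itself is not available in common libraries, so this uses PyWavelets with universal soft thresholding of the detail subbands. The wavelet, decomposition level and threshold rule are assumptions, not the paper's SURE-based scheme.

```python
# Separable 2-D wavelet denoising with the universal threshold and
# soft thresholding of all detail subbands (approximation band kept).
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Noise estimate from the finest diagonal subband (robust MAD estimator).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(image.size))  # universal threshold
    new_coeffs = [coeffs[0]]                         # keep approximation band
    for (cH, cV, cD) in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(c, thr, mode="soft")
                                for c in (cH, cV, cD)))
    return pywt.waverec2(new_coeffs, wavelet)

noisy = np.random.default_rng(0).normal(128, 20, size=(256, 256))
denoised = wavelet_denoise(noisy)
print(denoised.shape)
```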

Relevance: 10.00%

Abstract:

The thesis explores the area of still image compression. Image compression techniques can be broadly classified into lossless and lossy compression. The most common lossy compression techniques are based on transform coding, vector quantization and fractals. Transform coding is the simplest of these and generally employs reversible transforms such as the DCT, DWT, etc. The Mapped Real Transform (MRT) is an evolving integer transform based on real additions alone. The present research work aims at developing new image compression techniques based on MRT. Most transform coding techniques employ fixed block size image segmentation, usually 8×8. Hence, fixed block size transform coding is implemented using MRT, and its merits and demerits are analysed for both 8×8 and 4×4 blocks. The N² unique MRT coefficients for each block are computed using templates. Considering the merits and demerits of fixed block size transform coding, a hybrid form of these techniques is implemented to improve compression performance. The performance of the hybrid coder is found to be better than that of the fixed block size coders; thus, if the block size is made adaptive, the performance can be further improved. In adaptive block size coding, the block size may vary from the size of the image down to 2×2, so the computation of MRT using templates is impractical due to memory requirements. Hence, an adaptive transform coder based on the Unique MRT (UMRT), a compact form of MRT, is implemented to obtain better performance in terms of PSNR and HVS. The suitability of MRT for vector quantization of images is then examined, and a UMRT-based Classified Vector Quantization (CVQ) is implemented subsequently. The edges in the images are identified and classified by employing a UMRT-based criterion. Based on the above experiments, a new technique named "MRT based Adaptive Transform Coder with Classified Vector Quantization (MATC-CVQ)" is developed. Its performance is evaluated and compared against existing techniques. A comparison with standard JPEG and the well-known Shapiro's Embedded Zerotree Wavelet (EZW) coder shows that the proposed technique gives better performance for the majority of images.
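
As a generic illustration of fixed 8×8 block transform coding, the sketch below uses the DCT as a stand-in, since the MRT/UMRT is not available in standard libraries; the uniform quantization step is an assumption made purely for illustration.

```python
# Fixed 8x8 block transform coding: forward transform, uniform
# quantization, inverse transform, and a PSNR check on the result.
import numpy as np
from scipy.fft import dctn, idctn

def block_code(image, block=8, q_step=16.0):
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            tile = image[i:i + block, j:j + block].astype(float)
            coeffs = dctn(tile, norm="ortho")               # forward transform
            quantized = np.round(coeffs / q_step) * q_step  # uniform quantization
            out[i:i + block, j:j + block] = idctn(quantized, norm="ortho")
    return out

img = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(float)
rec = block_code(img)
mse = np.mean((img - rec) ** 2)
print(f"PSNR: {10 * np.log10(255.0 ** 2 / mse):.2f} dB")
```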

Relevance: 10.00%

Abstract:

The present work discusses various properties and reliability aspects of higher-order equilibrium distributions in the continuous, discrete and multivariate cases, and thereby contributes to the study of equilibrium distributions. First, we study and consolidate the existing literature on equilibrium distributions; the basic reliability concepts needed for this are discussed in Chapter 2. In Chapter 3, some identities connecting the failure rate functions and moments of residual life of univariate, non-negative continuous equilibrium distributions of higher order with those of the baseline distribution are derived. These identities are then used to characterize the generalized Pareto model, mixtures of exponentials and the gamma distribution. An approach using characteristic functions is also discussed, with illustrations. Moreover, characterizations of ageing classes using stochastic orders are discussed. Part of the results of this chapter has been reported in Nair and Preeth (2009). Various properties of equilibrium distributions of non-negative discrete univariate random variables are discussed in Chapter 4. Some characterizations of the geometric, Waring and negative hypergeometric distributions are then presented. Moreover, the ageing properties of the original distribution and the nth-order equilibrium distributions are compared. Part of the results of this chapter has been reported in Nair, Sankaran and Preeth (2012). Chapter 5 is a continuation of Chapter 4. Here, several conditions in terms of stochastic orders connecting the baseline and its equilibrium distributions are derived. These conditions can be used to redefine certain ageing notions. Then equilibrium distributions of two random variables are compared in terms of various stochastic orders that have implications in reliability applications. In Chapter 6, we take two approaches to defining multivariate equilibrium distributions of order n. Various properties, including characterizations of higher-order equilibrium distributions, are then presented. Part of the results of this chapter has been reported in Nair and Preeth (2008). The thesis is concluded in Chapter 7, where a discussion of further studies on equilibrium distributions is also given.
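
For orientation, the usual textbook recursion defining the nth-order equilibrium distribution of a non-negative random variable with survival function $\bar{F}$ (stated here for context, not taken from the thesis) is

\[
f_n(x) = \frac{\bar{F}_{n-1}(x)}{\mu_{n-1}},
\qquad
\bar{F}_n(x) = \frac{\int_x^{\infty} \bar{F}_{n-1}(t)\,dt}{\int_0^{\infty} \bar{F}_{n-1}(t)\,dt},
\qquad n = 1, 2, \ldots,
\]

with $\bar{F}_0 = \bar{F}$ and $\mu_{n-1} = \int_0^{\infty} \bar{F}_{n-1}(t)\,dt$ the mean of the (n-1)th-order distribution; the case n = 1 is the classical equilibrium (stationary renewal) distribution.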

Relevance: 10.00%

Abstract:

Hat Stiffened Plates are used in composite ships and are gaining popularity in metallic ship construction due to their high strength-to-weight ratio. Lightweight structures result in greater payload, higher speeds, reduced fuel consumption and lower environmental emissions. Numerical investigations have been carried out using the commercial finite element software ANSYS 12 to substantiate the high strength-to-weight ratio of Hat Stiffened Plates over other open-section stiffeners commonly used in shipbuilding. The analysis of stiffened plates has always been a matter of concern for structural engineers, since it is rather difficult to quantify the actual load sharing between stiffeners and plating. The finite element method has been accepted as an efficient tool for the analysis of stiffened plated structures. The best results for thin plated structures are obtained when both the stiffeners and the plate are modelled using thin plate elements with six degrees of freedom per node. However, one serious problem with this design and analysis process is that generating finite element models for a complex configuration is time consuming and laborious. To overcome these difficulties, two different methods, viz. an Orthotropic Plate Model and a Superelement for the Hat Stiffened Plate, are suggested in the present work. In the Orthotropic Plate Model, geometric orthotropy is converted to material orthotropy, i.e. the stiffeners are smeared and vanish from the field of analysis, so the structure can be analysed using any commercial finite element software that has orthotropic elements in its element library. The Orthotropic Plate Model developed predicts deflection, stress and linear buckling load with sufficiently good accuracy for the all-edges-simply-supported boundary condition. In the case of two edges fixed and the other two simply supported, the stress is predicted with good accuracy, but there is a large variation in the predicted deflection. This variation arises because the rigidity of the Orthotropic Plate Model is uniform throughout the plate, whereas in the actual Hat Stiffened Plate the rigidity along the line of attachment of the stiffeners to the plate is large compared with that of the unsupported portion of the plate. The Superelement technique treats a portion of the structure as a single element even though it is made up of many individual elements. The Superelement predicts the deflection and in-plane stress of the Hat Stiffened Plate with sufficiently good accuracy for different boundary conditions. A formulation of the Superelement for a composite Hat Stiffened Plate is also presented in the thesis. The capability of the Orthotropic Plate Model and the Superelement to handle typical boundary conditions and characteristic loads in a ship structure is demonstrated through numerical investigations.
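
As a rough orientation only, one common textbook way of smearing stiffeners into an equivalent orthotropic plate (not necessarily the formulation adopted in the thesis) replaces the plate-stiffener combination by direction-dependent bending rigidities, for example

\[
D_x \approx \frac{E t^3}{12\,(1-\nu^2)} + \frac{E I_s}{s},
\qquad
D_y \approx \frac{E t^3}{12\,(1-\nu^2)},
\]

where $t$ is the plate thickness, $E$ and $\nu$ the material constants, $I_s$ the moment of inertia of one stiffener about the combined neutral axis, and $s$ the stiffener spacing; the stiffened direction thus carries the extra rigidity of the smeared stiffeners, while the geometry of the individual hats no longer appears in the model.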