915 results for Mellin Transform
Abstract:
Background: Heatwaves can cause excess deaths ranging from tens to thousands within a few weeks in a local area. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the expected mortality under 'normal' conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in mean level (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed in signal processing for analysing non-linear and non-stationary time series, but it has not previously been applied in public health research. This paper aims to demonstrate the applicability and strengths of the HHT algorithm in analysing health data. Methods: Special R functions were developed to implement the HHT algorithm, decomposing the daily mortality time series into trend and non-trend components according to the underlying physical mechanism. The excess mortality is then calculated directly from the resulting non-trend component series. Results: Daily mortality time series from Brisbane (Queensland, Australia) and Chicago (United States) were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the algorithm had to handle the mode-mixing issue; it estimated 510 excess deaths for that event.
To exemplify potential applications, the HHT decomposition results for the Brisbane data were used as input for a subsequent regression analysis investigating the association between excess mortality and different risk factors. Conclusions: The HHT algorithm is a novel and powerful tool for time series analysis. It has real potential for a wide range of applications in public health research because it can decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
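As a toy illustration of the excess-mortality arithmetic, the sketch below uses a centred moving average as a stand-in for the trend component that the paper extracts with HHT. The function name, window sizes, and the synthetic mortality series are all invented for the example; only the "observed minus baseline, summed over the event window" idea comes from the abstract.

```python
import numpy as np

def excess_mortality(deaths, window_days, baseline_halfwidth=45):
    """Estimate excess deaths over an event window.

    The baseline ('trend') here is a centred moving average -- a simple
    stand-in for the trend series an HHT decomposition would provide.
    """
    deaths = np.asarray(deaths, dtype=float)
    k = baseline_halfwidth
    # Centred moving average with edge padding as the baseline series.
    padded = np.pad(deaths, k, mode="edge")
    kernel = np.ones(2 * k + 1) / (2 * k + 1)
    baseline = np.convolve(padded, kernel, mode="valid")
    start, stop = window_days
    return float(np.sum(deaths[start:stop] - baseline[start:stop]))

# Deterministic synthetic example: 20 deaths/day plus a 10-day spike.
series = np.full(365, 20.0)
series[180:190] += 15.0            # simulated heatwave, +15 deaths/day
print(round(excess_mortality(series, (180, 190))))  # → 134
```

Note that the estimate (134) falls short of the 150 injected excess deaths because the spike pulls up its own moving-average baseline; an adaptive trend extraction such as HHT's is attractive precisely because it avoids this kind of leakage.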
Abstract:
Biolistic delivery of transforming DNA into fungal genomes, especially when performed on uninucleate haploid conidia, has proven successful in bypassing the time-consuming, repetitive purification of protoplasts required by the widely applied polyethylene glycol-mediated method. Biolistic transformation is also relatively quick compared to other available methods and provides a high percentage of stable transformants.
Abstract:
Age-related macular degeneration (AMD) affects central vision and may subsequently lead to visual loss in people over 60 years of age. There is no permanent cure for AMD, but early detection and timely treatment may improve visual acuity. AMD is mainly classified into dry and wet types; dry AMD is the more common in the aging population. AMD is characterized by drusen, yellow pigmentation, and neovascularization. These lesions are examined through visual inspection of retinal fundus images by ophthalmologists, which is laborious, time-consuming, and resource-intensive. Hence, in this study, we propose an automated AMD detection system using the discrete wavelet transform (DWT) and feature ranking strategies. The first four statistical moments (mean, variance, skewness, and kurtosis), energy, entropy, and Gini index-based features are extracted from the DWT coefficients. Five feature ranking strategies (t-test, Kullback–Leibler divergence (KLD), Chernoff bound and Bhattacharyya distance, receiver operating characteristic curve-based, and Wilcoxon) were used to identify the optimal feature set. A set of supervised classifiers, namely support vector machine (SVM), decision tree, k-nearest neighbor (k-NN), Naive Bayes, and probabilistic neural network, was used to obtain the highest performance with the minimum number of features in classifying normal and dry AMD classes. The proposed framework obtained an average accuracy of 93.70%, sensitivity of 91.11%, and specificity of 96.30% using KLD ranking and the SVM classifier. We also formulated an AMD Risk Index from the selected features to separate the normal and dry AMD classes using a single number. The proposed system can be used to assist clinicians and in mass AMD screening programs.
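To make the feature-extraction step concrete, here is a minimal numpy sketch of a one-level Haar DWT followed by the statistical features named above (the Gini index feature is omitted for brevity). The Haar wavelet, patch size, and helper names are assumptions for illustration; the abstract does not fix these details.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT: returns the (LL, LH, HL, HH) subbands."""
    img = np.asarray(img, dtype=float)
    # Rows: orthonormal average / difference of adjacent column pairs.
    lo = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    hi = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    # Columns: the same on adjacent row pairs.
    ll = (lo[0::2] + lo[1::2]) / np.sqrt(2)
    lh = (lo[0::2] - lo[1::2]) / np.sqrt(2)
    hl = (hi[0::2] + hi[1::2]) / np.sqrt(2)
    hh = (hi[0::2] - hi[1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def subband_features(band):
    """Mean, variance, skewness, kurtosis, energy, entropy of one subband."""
    x = band.ravel()
    m, v = x.mean(), x.var()
    s = ((x - m) ** 3).mean() / (v ** 1.5 + 1e-12)   # skewness
    k = ((x - m) ** 4).mean() / (v ** 2 + 1e-12)     # kurtosis
    energy = (x ** 2).sum()
    p = np.abs(x) / (np.abs(x).sum() + 1e-12)        # normalised magnitudes
    entropy = -(p * np.log2(p + 1e-12)).sum()
    return np.array([m, v, s, k, energy, entropy])

rng = np.random.default_rng(1)
fundus_patch = rng.random((64, 64))       # stand-in for a fundus image patch
feats = np.concatenate([subband_features(b) for b in haar_dwt2(fundus_patch)])
print(feats.shape)  # → (24,)  six features from each of four subbands
```

In a full pipeline, vectors like `feats` would be ranked (e.g., by KLD) and fed to a classifier such as an SVM, as the study describes.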
Abstract:
Information and communication technologies are dramatically transforming allopathic medicine. Technological developments including telemedicine, electronic health records, standards to ensure computer systems interoperate, data mining, simulation, decision support, and easy access to medical information each contribute to empowering patients in new ways and are changing the practice of medicine. To date, informatics has had little impact on Ayurvedic medicine. This tutorial provides an introduction to key informatics initiatives in allopathic medicine using real examples and suggests how these applications can be applied to Ayurvedic medicine.
Abstract:
The detection of line-like features in images has many applications in microanalysis. Actin fibers, microtubules, neurites, pili, DNA, and other biological structures all appear as tenuous curved lines in microscopy images. A reliable tracing method that preserves the integrity and details of these structures is particularly important for quantitative analyses. We have developed a new image transform, the "Coalescing Shortest Path Image Transform", with very encouraging properties. Our scheme efficiently combines information from an extensive collection of shortest paths in the image to delineate even very weak linear features. © Copyright Microscopy Society of America 2011.
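The transform's basic ingredient, a shortest path whose cost favours bright pixels, can be sketched with a plain Dijkstra search. The cost function, 4-connectivity, and names below are illustrative assumptions; the paper's contribution is in coalescing many such paths, which this sketch does not attempt.

```python
import heapq
import numpy as np

def brightest_path(img, start, goal):
    """Dijkstra shortest path where entering a pixel costs (1 - intensity).

    Bright pixels are cheap, so the minimal-cost path hugs bright,
    line-like structures in the image.
    """
    h, w = img.shape
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), np.inf):
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + (1.0 - img[nr, nc])
                if nd < dist.get((nr, nc), np.inf):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal                 # walk predecessors back to start
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

img = np.full((5, 5), 0.1)
img[2, :] = 0.9                           # a faint horizontal bright line
path = brightest_path(img, (2, 0), (2, 4))
print(path)  # → [(2, 0), (2, 1), (2, 2), (2, 3), (2, 4)]
```

Even though the "line" is only moderately brighter than the background, the minimal-cost path follows it exactly, which is why shortest paths are a robust primitive for tracing weak linear features.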
Abstract:
The structural characteristics of raw coal and hydrogen peroxide (H2O2)-oxidized coals were investigated using scanning electron microscopy, X-ray diffraction (XRD), Raman spectroscopy, and Fourier transform infrared (FT-IR) spectroscopy. The results indicate that the aromaticity of the H2O2-oxidized coals improves noticeably, first increasing and then decreasing, with the highest aromaticity at 24 h. The stacking layer number of crystalline carbon decreases and the aspect ratio (width versus stacking height) increases with oxidation time. The crystalline carbon content shows the same trend as the aromaticity measured by XRD. The hydroxyl bands of the oxidized coals become much stronger owing to an increase in soluble fatty acids and alcohols produced by oxidation of the aromatic and aliphatic C-H bonds. In addition, the intensity of the aliphatic C-H bands first decreases and then increases with oxidation time, while the aromatic C-H bands show the opposite tendency. The changes in aromaticity and crystalline carbon content measured by XRD and Raman spectra agree well. The particle size of the oxidized coals (<200 nm in width) is significantly smaller than that of the raw coal (1 μm). This study reveals that the optimal oxidation time for improving the aromaticity and crystalline carbon content of H2O2-oxidized coals is about 24 h. The process can help obtain superfine crystalline carbon materials similar in structure to graphite.
Abstract:
In this paper, we generalize existing rate-one space-frequency (SF) and space-time-frequency (STF) code constructions. The objective is to provide a systematic design of full-diversity STF codes with high coding gain. Under this generalization, STF codes are formulated as linear transformations of the data. Conditions on these linear transforms are then derived so that the resulting STF codes achieve full diversity and high coding gain with moderate decoding complexity. Many of these conditions involve channel parameters such as the delay profile (DP) and temporal correlation. When these quantities are not available at the transmitter, we consider the design of codes that exploit full diversity on channels with arbitrary DP and temporal correlation. A complete characterization of a class of such robust codes is provided and their bit error rate (BER) performance is evaluated. On the other hand, when the channel DP and temporal correlation are available at the transmitter, the linear transforms are optimized to maximize the coding gain of full-diversity STF codes. The BER performance of such optimized codes is shown to be better than that of existing codes.
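A minimal sketch of the "linear transformation of data" idea, using a scaled DFT matrix as a generic unitary spreading transform. Everything here is an illustrative stand-in: the paper's transforms are optimised for the channel's delay profile and temporal correlation, which this toy example ignores.

```python
import numpy as np

def spreading_matrix(n):
    """A unitary n x n transform (DFT scaled by 1/sqrt(n)).

    Used purely to illustrate spreading: after the transform, every
    output coordinate carries a mix of all n data symbols.
    """
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n) / np.sqrt(n)

n = 4
theta = spreading_matrix(n)
symbols = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # QPSK
coded = theta @ symbols            # each coordinate now mixes all symbols
decoded = theta.conj().T @ coded   # unitary: inverse = conjugate transpose
print(np.allclose(decoded, symbols))  # → True
```

Because the transform is unitary, it spreads symbols across tones (for diversity against fading of any single tone) without amplifying noise, which is the basic design constraint such linear-transform codes work under.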
Abstract:
The power of projects has been demonstrated by the growth in their use across an increasing range of industries and workplaces in recent years. Not only has the number of people involved in project management increased, but the qualifications and backgrounds of those people have also broadened, with engineering no longer being the only path to project management (PM). Predicting career trajectories in project management has become more important both for organisations employing project managers and for project managers building career portfolios. Our research involved interviewing more than 75 project officers and project managers across a range of industries to explore their career journeys. We used Wittgenstein's family resemblance theory to analyse the interview transcripts and identify the extent to which participants' roles fit the commonly accepted definition of project management. Findings demonstrate the diversity of project manager backgrounds and experiences, and the relational competencies across these backgrounds that form and shape PM careers.
Abstract:
This paper presents the architecture and VHDL design of an integer 2-D DCT used in H.264/AVC. The 2-D DCT computation is performed by exploiting its orthogonality and separability properties. The symmetry of the forward and inverse transforms is used in this implementation. To reduce the computational overhead of the addition, subtraction, and multiplication operations, we analyze the suitability of the carry-free, position-independent residue number system (RNS) for implementing the 2-D DCT. The implementation has been carried out in VHDL for an Altera FPGA. We use a negative-number representation in RNS, bit-width analysis of the transforms, and the dedicated registers present in the FPGA's logic elements to optimize area. The complexity and efficiency analysis shows that the proposed architecture can provide higher throughput.
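The two properties the design exploits, separability/orthogonality of the DCT and carry-free RNS arithmetic, can be sketched in a few lines. This is a floating-point illustration, not the paper's integer H.264 transform or its VHDL architecture, and the moduli are arbitrary coprime choices.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    c = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    c[0, :] /= np.sqrt(2)
    return c

def dct2(block):
    """Separable 2-D DCT: 1-D DCT on the rows, then on the columns."""
    c = dct_matrix(block.shape[0])
    return c @ block @ c.T

rng = np.random.default_rng(2)
x = rng.integers(0, 256, size=(4, 4)).astype(float)   # a 4x4 pixel block
y = dct2(x)
c = dct_matrix(4)
x_back = c.T @ y @ c            # orthogonality: the inverse is the transpose
print(np.allclose(x_back, x))   # → True

# RNS flavour: with pairwise-coprime moduli, + acts on each small residue
# channel independently -- no carries propagate between channels.
moduli = (7, 15, 16)                          # illustrative coprime moduli
a, b = 123, 456
res = [(a % m + b % m) % m for m in moduli]
print(res == [(a + b) % m for m in moduli])   # → True
```

The separability shown above is exactly what lets a hardware design reuse one 1-D transform unit for rows and columns, and the per-channel RNS arithmetic is what removes long carry chains from the adders and multipliers.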
Abstract:
The usual method of Fourier transforms is shown to break down for the problem of an external line crack in a thin infinite elastic plate. The correct solution of this problem is derived using a generalised Fourier transform of the type first discussed by Golecki [1] in connection with Flamant's problem.
Abstract:
Quantization formats of four digital holographic codes (Lohmann, Lee, Burckhardt, and Hsueh-Sawchuk) are evaluated. A quantitative assessment is made from errors in both the Fourier transform and image domains. In general, small errors in the Fourier amplitude or phase alone do not guarantee high image fidelity. From quantization considerations, the Lee hologram is shown to be the best choice for randomly phase-coded objects. When phase coding is not feasible, the Lohmann hologram is preferable, as it is easier to plot.
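A simplified numerical sketch of this kind of assessment: quantize only the phase of the spectrum of a randomly phase-coded object and measure the resulting image-domain error. The phase-only model and all parameters are assumptions for illustration; the actual hologram codes quantize their aperture parameters differently.

```python
import numpy as np

def quantize_phase(spectrum, levels):
    """Quantize the spectrum's phase to 'levels' uniform steps,
    keeping the amplitude exact (a simplified quantization model)."""
    step = 2 * np.pi / levels
    phase_q = np.round(np.angle(spectrum) / step) * step
    return np.abs(spectrum) * np.exp(1j * phase_q)

rng = np.random.default_rng(3)
obj = rng.random((32, 32))                            # object amplitude
coding = np.exp(2j * np.pi * rng.random(obj.shape))   # random phase coding
field = obj * coding
spec = np.fft.fft2(field)

# Image-domain reconstruction error versus number of phase levels:
# the error roughly halves each time the level count doubles.
for levels in (4, 8, 16):
    rec = np.fft.ifft2(quantize_phase(spec, levels))
    err = np.linalg.norm(rec - field) / np.linalg.norm(field)
    print(levels, round(err, 3))
```

By Parseval's relation the spectrum-domain and image-domain L2 errors track each other here, but as the abstract notes, small Fourier-domain errors alone do not guarantee image fidelity once amplitude and phase quantization interact in a real hologram code.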
Abstract:
Using an analysis-by-synthesis (AbS) approach, we develop a soft-decision-based switched vector quantization (VQ) method for high-quality, low-complexity coding of wideband speech line spectral frequency (LSF) parameters. For each switching region, a low-complexity transform domain split VQ (TrSVQ) is designed. The overall rate-distortion (R/D) optimality of the new switched quantizer is addressed in a Gaussian mixture model (GMM)-based parametric framework. In the AbS approach, quantization complexity is reduced by using nearest-neighbor (NN) TrSVQs and splitting the transform domain vector into a larger number of subvectors. Compared with current LSF quantization methods, the new method is shown to provide a competitive or better trade-off between R/D performance and complexity.
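The split-VQ step can be sketched as follows, with random codebooks standing in for trained ones. The dimensions, the 3+3+4 split, and the names are illustrative; the paper's TrSVQ operates on transform-domain vectors with codebooks trained under its GMM framework.

```python
import numpy as np

def split_vq_quantize(vec, codebooks):
    """Split VQ: partition the vector into subvectors and replace each
    with the nearest-neighbour entry of its own codebook."""
    out, idx, start = [], [], 0
    for cb in codebooks:
        dim = cb.shape[1]
        sub = vec[start:start + dim]
        d = np.sum((cb - sub) ** 2, axis=1)   # squared error per code vector
        j = int(np.argmin(d))
        out.append(cb[j])
        idx.append(j)
        start += dim
    return np.concatenate(out), idx

rng = np.random.default_rng(4)
# A 10-dim stand-in for an LSF vector, split 3+3+4. Real systems train the
# codebooks on speech data; random ones here just show the mechanics.
lsf = np.sort(rng.random(10))
codebooks = [rng.random((16, 3)), rng.random((16, 3)), rng.random((16, 4))]
q, indices = split_vq_quantize(lsf, codebooks)
print(len(indices), q.shape)  # → 3 (10,)
```

Splitting trades a single huge codebook search for several small ones: three 16-entry codebooks cost 48 distance evaluations, versus 4096 for one joint 12-bit codebook over the full vector, which is the complexity reduction the abstract refers to.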