29 results for 340402 Econometric and Statistical Methods
Abstract:
In Part I of this article, cleavage initiation in the intercritically reheated coarse-grained heat affected zone (IC CG HAZ) of high-strength low-alloy (HSLA) steels was determined to occur between two closely spaced blocky MA particles. Blunt notch, crack tip opening displacement (CTOD), and precracked Charpy testing were used in this investigation to determine the failure criteria required for cleavage initiation to occur by this mechanism in the IC CG HAZ. It was found that the attainment of a critical level of strain was required in addition to a critical level of stress. This does not occur in the case of high strain rate testing, for example during precracked Charpy testing, where a different cleavage initiation mechanism is found to operate. The precise fracture criteria and microstructural requirements (described in Part I of this article) result in competition between potential cleavage initiation mechanisms in the IC CG HAZ.
Abstract:
Silicon carbide ceramics are candidate materials for use in aggressive environments, including those where aqueous acids are present. Standard corrosion testing methods such as immersion testing are not always sufficiently sensitive for these ceramics owing to the very low, almost unobservable, corrosion rates encountered. Using electrochemical methods, the corrosion processes can be accelerated, leading to higher rates and thus allowing reaction mechanisms to be elucidated. The behaviour of a sintered and a reaction bonded silicon carbide has been investigated in aqueous HCl, HF, HNO3, and H2SO4, using standard immersion and new electrochemical methods. Both materials were passive in HCl, HNO3, and H2SO4 because of the formation of a surface silica film, and were active in HF. In HF, corrosion of sintered silicon carbide was slight and the residual silicon was removed from reaction bonded specimens.
Abstract:
Recently, the temporal and statistical properties of quasi-CW fiber lasers have attracted great attention. In particular, the properties of Raman fiber lasers (RFLs) have been studied both numerically and experimentally [1,2]. Experimental investigation is more challenging, as the full optical bandwidth of the generation (typically hundreds of GHz for RFLs) is much larger than the real-time bandwidth of oscilloscopes (up to 60 GHz for the newest models). Experimentally measured time dynamics are therefore highly bandwidth-averaged and do not provide precise information about the overall statistical properties. To overcome this, one can use a spectral filtering technique to study temporal and statistical properties within an optical bandwidth comparable to the measurement bandwidth [3], or resort to indirect measurements [4]. Ytterbium-doped fiber lasers (YDFLs) are more suitable for experimental investigation, as their generation spectrum is usually about ten times narrower. Moreover, ultra-narrow-band generation has recently been demonstrated in a YDFL [5], which in principle makes it possible to measure time dynamics and statistics in real time using conventional oscilloscopes. © 2013 IEEE.
Abstract:
The reaction of localised C=C bonds on the surface of activated carbons has been shown to be an effective method of chemical modification, especially using microwave-assisted reactions.
Abstract:
We report the results of an experimental study, complemented by detailed statistical analysis of the experimental data, on the development of a more effective control method for drug delivery using a pH-sensitive acrylic polymer. New copolymers based on acrylic acid and a fatty acid derived from dodecyl castor oil, and a tercopolymer based on methyl methacrylate, acrylic acid and acrylamide, were prepared using this new approach. The water-swelling characteristics of the fatty acid–acrylic acid copolymer and the tercopolymer in acid and alkali solutions were studied by a step-change method. The antibiotic drug cephalosporin and paracetamol were also incorporated into the polymer blend through dissolution, with the release of the antibiotic drug being evaluated in bacterial strain media and buffer solution. Our results show that the rate of release of paracetamol is affected by the pH and by the nature of the polymer blend. Our experimental data were then statistically analyzed to quantify the precise dependence of the polymer decay rates on the pH of the relevant polymer solvents. The time evolution of the polymer decay rates indicates a marked transition from a linear to a strictly non-linear regime, depending on whether the chosen sample is a general copolymer (linear) or a tercopolymer (non-linear). Non-linear data extrapolation techniques have been used to make probabilistic predictions about the variation in weight percentages of retained polymers at all future times, thereby quantifying the efficacy of the new method of drug delivery.
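The decay-rate extrapolation described above can be illustrated with a minimal sketch: assuming a first-order (exponential) decay of retained polymer weight, taking the logarithm linearises the model so the decay rate can be fitted by least squares and extrapolated. The series below is synthetic, not data from the study.

```python
import numpy as np

# hypothetical retained-weight series (percent) sampled at unit time steps
t = np.arange(10.0)
w = 90.0 * np.exp(-0.15 * t)          # assumed first-order decay, rate k = 0.15

# linearise the assumed model: ln w = ln w0 - k t, then fit by least squares
slope, intercept = np.polyfit(t, np.log(w), 1)
k_fit = -slope                         # recovered decay rate

# extrapolate the retained weight to a future time beyond the observations
w_future = np.exp(intercept + slope * 20.0)
```

A strictly non-linear regime, as reported for the tercopolymer, would show systematic curvature in the log-linear fit residuals, signalling that this simple model no longer applies.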
Abstract:
Purpose: To assess the inter- and intra-observer variability of subjective grading of the retinal arterio-venous ratio (AVR) using visual grading, and to compare the subjectively derived grades to an objective method using a semi-automated computer program. Methods: Following intraocular pressure and blood pressure measurements, all subjects underwent dilated fundus photography. 86 monochromatic retinal images centred on the optic nerve head (52 healthy volunteers) were obtained using a Zeiss FF450+ fundus camera. Arterio-venous ratio (AVR), central retinal artery equivalent (CRAE) and central retinal vein equivalent (CRVE) were calculated on three separate occasions by a single observer semi-automatically using the software VesselMap (Imedos Systems, Jena, Germany). Following the automated grading, three examiners graded the AVR visually on three separate occasions in order to assess their agreement. Results: Reproducibility of the semi-automatic parameters was excellent (ICCs: 0.97 (CRAE), 0.985 (CRVE) and 0.952 (AVR)). However, visual grading of AVR showed inter-grader differences as well as discrepancies between subjectively derived and objectively calculated AVR (all p < 0.000001). Conclusion: Grader education and experience lead to inter-grader differences, but more importantly, subjective grading is not capable of picking up subtle differences across healthy individuals and does not represent the true AVR when compared with an objective assessment method. Advances in technology mean we no longer need to rely on ophthalmoscopic evaluation but can capture and store fundus images with retinal cameras, enabling vessel calibre to be measured more accurately than by visual estimation; hence objective assessment should be integrated into optometric practice for improved accuracy and reliability of clinical assessments of retinal vessel calibres. © 2014 Spanish General Council of Optometry.
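The intraclass correlation used to quantify reproducibility can be sketched as follows. This is a generic two-way random-effects ICC for absolute agreement, single rater (ICC(2,1)), run on synthetic ratings; it is not the VesselMap software or the study's data.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, absolute agreement, single rater.
    ratings: (n subjects x k raters) array."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Because ICC(2,1) measures absolute agreement, a systematic bias in one rater lowers the coefficient even when the rank ordering of subjects is preserved, which is the behaviour relevant to comparing graders against an objective reference.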
Abstract:
Efficient numerical modelling of the power, spectral and statistical properties of partially coherent quasi-CW Raman fiber laser radiation is presented. Cross-phase modulation (XPM) between the pump wave and the generated Stokes wave is not important in the generation spectrum broadening, so the XPM term can be omitted from the propagation equation, which substantially speeds up simulations. The time dynamics of the Raman fiber laser (RFL) are stochastic, exhibiting events several times more intense than the mean value on the picosecond timescale. However, the RFL has different statistical properties on different time scales. The probability density function of the spectral power density is exponential for generation modes located either in the spectrum centre or in the spectral wings, while the phases are distributed uniformly. The pump wave preserves its initial Gaussian statistics during propagation in the laser cavity. Intense pulses in the pump wave evolve under the influence of self-phase modulation (SPM) and are not disturbed by dispersion. By contrast, in the generated wave dispersion plays a significant role, which results in stochastic behavior. © 2012 Elsevier B.V. All rights reserved.
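The exponential statistics of spectral power density quoted above follow directly from a mode whose complex amplitude is circular Gaussian; a quick Monte Carlo check with synthetic amplitudes (not a simulation of the laser itself) is:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# one spectral mode with circular complex Gaussian amplitude, unit mean power
a = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

power = np.abs(a) ** 2   # exponentially distributed: mean = std = 1
phase = np.angle(a)      # uniformly distributed on (-pi, pi]
```

The equality of the mean and standard deviation of `power` is the signature of the exponential distribution, and the flat phase histogram reflects the uniform phase statistics reported for the generation modes.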
Abstract:
We present the first experimental investigation of the fast intensity dynamics of random distributed feedback (DFB) fiber lasers. We found that the laser dynamics are stochastic on a short time scale and exhibit pronounced fluctuations, including the generation of extreme events. We also experimentally characterize the statistical properties of the radiation of random DFB fiber lasers. We found that the statistical properties deviate from Gaussian and depend on the pump power.
Abstract:
In this work, we introduce the periodic nonlinear Fourier transform (PNFT) method as an alternative and efficacious tool for compensation of nonlinear transmission effects in optical fiber links. In Part I, we introduce the algorithmic platform of the technique, describing in detail the direct and inverse PNFT operations, also known as the inverse scattering transform for the periodic (in the time variable) nonlinear Schrödinger equation (NLSE). We pay special attention to explaining the potential advantages of PNFT-based processing over the previously studied nonlinear Fourier transform (NFT) based methods. Further, we elucidate the issue of numerical PNFT computation: we compare the performance of four known numerical methods applicable to the calculation of the nonlinear spectral data (the direct PNFT), taking the main spectrum (utilized further in Part II for modulation and transmission) associated with some simple example waveforms as the quality indicator for each method. We show that the Ablowitz-Ladik discretization approach for the direct PNFT provides the best performance in terms of accuracy and computational time.
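As an illustration of the transfer-matrix flavour of the Ablowitz-Ladik approach, a simplified, unit-normalised sketch of the Floquet discriminant Δ(λ) = tr M(λ)/2 for the periodic Zakharov-Shabat problem might look as follows; this toy scheme and grid are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def floquet_discriminant(q, dx, lam):
    """Estimate the Floquet discriminant Delta(lam) = trace(M)/2 of the
    periodic Zakharov-Shabat problem via an Ablowitz-Ladik-type transfer matrix."""
    z = np.exp(-1j * lam * dx)
    M = np.eye(2, dtype=complex)
    for qn in q:                       # one matrix factor per potential sample
        Qn = qn * dx
        T = np.array([[z, Qn], [-np.conj(Qn), 1.0 / z]], dtype=complex)
        M = (T / np.sqrt(1.0 + abs(Qn) ** 2)) @ M   # unimodular normalisation
    return 0.5 * np.trace(M)
```

For zero potential the monodromy reduces to a rotation and Δ(λ) = cos(λL) exactly, which serves as a basic sanity check; points of the main spectrum are those λ where Δ(λ) = ±1.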
Abstract:
Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights into the domain which can be helpful in developing powerful models, but they need a modelling framework that helps them to use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of input space, often work better, since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective and does not involve the domain experts in guiding a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The resulting segmentation is then used to develop effective local regression models.
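The idea of expert-guided local models can be sketched as fitting an independent least-squares line within each user-chosen segment of the input space; the breakpoints below stand in for segment boundaries an expert might draw on a visualisation, and the data and function names are hypothetical.

```python
import numpy as np

def fit_local_models(x, y, breakpoints):
    """Fit an independent least-squares line in each expert-chosen segment."""
    edges = [-np.inf] + sorted(breakpoints) + [np.inf]
    models = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x < hi)                       # points in this segment
        A = np.column_stack([x[mask], np.ones(mask.sum())])
        (slope, intercept), *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        models.append(((lo, hi), slope, intercept))
    return models
```

When the data behave differently on each side of a boundary, each local line captures its region far better than any single global fit could, which is the motivation for local models stated above.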
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
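The simplest version of the mean field method described here, applied to an Ising-type model, iterates the self-consistency equations m_i = tanh(β(Σ_j J_ij m_j + h_i)) to a fixed point; the sketch below uses toy couplings and is not drawn from the book.

```python
import numpy as np

def mean_field_ising(J, h, beta, n_iter=200, damping=0.5):
    """Naive mean-field magnetisations for an Ising model:
    iterate m_i = tanh(beta * (sum_j J_ij m_j + h_i))."""
    m = np.zeros(len(h))
    for _ in range(n_iter):
        target = np.tanh(beta * (J @ m + h))
        m = damping * m + (1 - damping) * target   # damped update for stability
    return m
```

This factorized approximation ignores correlations between variables; the TAP approach mentioned above corrects it by adding effective reaction (Onsager) terms to the same self-consistency equations.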
Abstract:
Two contrasting multivariate statistical methods, viz., principal components analysis (PCA) and cluster analysis were applied to the study of neuropathological variations between cases of Alzheimer's disease (AD). To compare the two methods, 78 cases of AD were analyzed, each characterised by measurements of 47 neuropathological variables. Both methods of analysis revealed significant variations between AD cases. These variations were related primarily to differences in the distribution and abundance of senile plaques (SP) and neurofibrillary tangles (NFT) in the brain. Cluster analysis classified the majority of AD cases into five groups which could represent subtypes of AD. However, PCA suggested that variation between cases was more continuous with no distinct subtypes. Hence, PCA may be a more appropriate method than cluster analysis in the study of neuropathological variations between AD cases.
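The PCA side of such an analysis can be sketched with a standard SVD of the mean-centred data matrix; the synthetic "cases x variables" data below, with one dominant axis of variation, are illustrative and not the neuropathological measurements.

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component scores via SVD of the mean-centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # case coordinates on the PCs
    explained = S ** 2 / (len(X) - 1)                 # variance along each component
    return scores, Vt[:n_components], explained

# illustrative 'cases x variables' matrix dominated by one axis of variation
rng = np.random.default_rng(0)
t = rng.normal(size=(78, 1))
X = t @ np.array([[1.0, 1.0, 0.5]]) + 0.01 * rng.normal(size=(78, 3))
scores, components, explained = pca(X)
```

A continuous spread of cases along the leading components, rather than well-separated clumps, is the kind of pattern that would favour PCA's "continuous variation" interpretation over discrete subtypes.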
Abstract:
Biological experiments often produce enormous amounts of data, which are usually analyzed by data clustering. Cluster analysis refers to statistical methods that are used to assign data with similar properties to several smaller, more meaningful groups. Two commonly used clustering techniques are introduced in the following section: principal component analysis (PCA) and hierarchical clustering. PCA calculates the variance between variables and groups them into a few uncorrelated groups or principal components (PCs) that are orthogonal to each other. Hierarchical clustering is carried out by separating data into many clusters and merging similar clusters together. Here, we use an example of human leukocyte antigen (HLA) supertype classification to demonstrate the usage of the two methods. Two programs, Generating Optimal Linear Partial Least Square Estimations (GOLPE) and Sybyl, are used for PCA and hierarchical clustering, respectively. However, the reader should bear in mind that the methods have been incorporated into other software as well, such as SIMCA, statistiXL, and R.
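Hierarchical clustering as described, merging similar groups bottom-up, can be sketched with SciPy rather than GOLPE or Sybyl; the synthetic feature vectors below merely stand in for something like HLA binding-specificity data.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(1)
# two well-separated synthetic groups of 4-dimensional feature vectors
group_a = rng.normal(0.0, 0.3, size=(10, 4))
group_b = rng.normal(5.0, 0.3, size=(10, 4))
X = np.vstack([group_a, group_b])

Z = linkage(X, method="average")                  # bottom-up merging of similar clusters
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the dendrogram into 2 clusters
```

Cutting the dendrogram at different heights (or cluster counts) gives coarser or finer supertype-style groupings, which is how the merge hierarchy is typically read in practice.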
Abstract:
We survey articles covering how hedge fund returns are explained, using largely non-linear multifactor models that examine the non-linear pay-offs and exposures of hedge funds. We provide an integrated view of the implicit factor and statistical factor models that are largely able to explain the hedge fund return-generating process. We present their evolution through time by discussing pioneering studies that made a significant contribution to knowledge, and also recent innovative studies that examine hedge fund exposures using advanced econometric methods. This is the first review that analyzes very recent studies that explain a large part of hedge fund variation. We conclude by presenting some gaps for future research.
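A toy version of the non-linear multifactor approach regresses fund returns on a market factor plus an option-like payoff max(f, 0), capturing the asymmetric exposures the surveyed models target; the return series and coefficients below are entirely synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
market = rng.normal(0.0, 0.05, n)                  # market factor returns
# synthetic fund with a linear beta plus an option-like (non-linear) exposure
fund = (0.002 + 0.3 * market + 0.5 * np.maximum(market, 0.0)
        + rng.normal(0.0, 0.002, n))

# multifactor regression: intercept, linear factor, non-linear payoff max(f, 0)
X = np.column_stack([np.ones(n), market, np.maximum(market, 0.0)])
alpha, beta_lin, beta_opt = np.linalg.lstsq(X, fund, rcond=None)[0]
```

A significant loading on the max(f, 0) column indicates an exposure a purely linear factor model would misattribute to alpha or to the linear beta, which is the core argument for the non-linear specifications reviewed.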