128 results for non-Gaussian process
Abstract:
Practitioners assess the performance of entities in increasingly large and complicated datasets. If non-parametric models such as Data Envelopment Analysis were ever considered simple push-button technologies, this is no longer tenable when many variables are available or when data have to be compiled from several sources. This paper introduces the 'COOPER-framework', a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the steps a researcher should work through for a well-defined and repeatable analysis. The COOPER-framework offers the novice analyst guidance, structure and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, the use of a standardized framework makes non-parametric assessments more reliable, more repeatable, more manageable, faster and less costly. © 2010 Elsevier B.V. All rights reserved.
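As a minimal illustration of the kind of non-parametric model the framework is built around, the sketch below solves the input-oriented CCR Data Envelopment Analysis programme (one linear programme per unit) with `scipy.optimize.linprog`. The data and function names are illustrative, not from the paper.

```python
# Input-oriented CCR DEA: for each unit j0, minimise theta subject to
#   sum_j lambda_j * x_j <= theta * x_j0   (inputs)
#   sum_j lambda_j * y_j >= y_j0           (outputs)
# Decision variables are [theta, lambda_1, ..., lambda_n].
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Efficiency of unit j0. X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimise theta
    A_in = np.hstack([-X[j0].reshape(m, 1), X.T])   # lam.x - theta*x_j0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])     # -lam.y <= -y_j0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[j0]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

X = np.array([[2.0], [4.0], [8.0]])   # one input per unit
Y = np.array([[2.0], [4.0], [4.0]])   # one output per unit
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
```

Units 0 and 1 sit on the efficient frontier (output/input ratio 1), while unit 2 uses twice the input needed for its output, so its efficiency score is 0.5.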
Abstract:
Rotation invariance is important for an iris recognition system, since changes of head orientation and binocular vergence may cause eye rotation. Conventional methods of iris recognition cannot achieve true rotation invariance: they achieve only approximate invariance by rotating the feature vector before matching or by unwrapping the iris ring at different initial angles. These strategies increase the complexity of the method, and when the rotation exceeds a certain range their error rates may increase substantially. To solve this problem, a new rotation-invariant approach for iris feature extraction based on non-separable wavelets is proposed in this paper. Firstly, a bank of non-separable orthogonal wavelet filters is used to capture characteristics of the iris. Secondly, a Markov random field method is used to capture rotation-invariant iris features. Finally, two-class kernel Fisher classifiers are adopted for classification. Experimental results on public iris databases show that the proposed approach has a low error rate and achieves true rotation invariance. © 2010.
Abstract:
Atomisation of an aqueous solution for tablet film coating is a complex process, with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film-coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality-by-design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for producing droplets at a small (20 μm) and large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction and the volume median diameter taken as a response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised-bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence that smaller droplets formed thinner, more uniform and less porous film coats.
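The DOE model described above (linear and quadratic effects plus a pump-rate/pressure interaction) can be sketched as an ordinary least-squares response-surface fit. The data and coefficients below are synthetic stand-ins, not values from the paper.

```python
# Quadratic response-surface model with an interaction term, fitted by
# least squares; factors are in coded units (-1 to +1), response is a
# synthetic "droplet size" with added noise.
import numpy as np

rng = np.random.default_rng(0)
pump, pressure = rng.uniform(-1, 1, (2, 40))   # coded process parameters
conc = rng.uniform(-1, 1, 40)                  # coating-polymer concentration
droplet = (45 + 6 * pump - 10 * pressure + 2 * conc
           + 4 * pump * pressure + 3 * pressure ** 2
           + rng.normal(0, 0.5, 40))

# design matrix: intercept, main effects, interaction, quadratic terms
X = np.column_stack([np.ones(40), pump, pressure, conc,
                     pump * pressure, pump ** 2, pressure ** 2])
beta, *_ = np.linalg.lstsq(X, droplet, rcond=None)
pred = X @ beta
r2 = 1 - ((droplet - pred) ** 2).sum() / ((droplet - droplet.mean()) ** 2).sum()
print(round(r2, 3))
```

The R2 statistic computed here corresponds to the "model fit" score quoted in the abstract; predictability (Q2) would additionally require cross-validation.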
Abstract:
Since 1996, direct femtosecond inscription in transparent dielectrics has been the subject of intensive research. This enabling technology significantly expands the technological boundaries for direct fabrication of 3D structures in a wide variety of materials. It allows modification of non-photosensitive materials, which opens the door to numerous practical applications. In this work we explored the direct femtosecond inscription of waveguides and demonstrated at least one order of magnitude enhancement in the most critical parameter: the induced contrast of the refractive index in a standard borosilicate optical glass. A record-high induced refractive index contrast of 2.5×10⁻² is demonstrated. The waveguides fabricated possess some of the lowest losses reported, approaching the level of Fresnel reflection losses at the glass-air interface. The high refractive index contrast allows the fabrication of curvilinear waveguides with low bend losses. We also demonstrated the optimisation of inscription regimes in BK7 glass over a broad range of experimental parameters and observed a counter-intuitive increase of the induced refractive index contrast with increasing translation speed of the sample. Examples of inscription in a number of transparent dielectric hosts (both glasses and crystals) using a high-repetition-rate femtosecond laser system are also presented. Sub-wavelength-scale periodic inscription inside a material often demands supercritical propagation regimes, where the pulse peak power exceeds the critical power for self-focusing, sometimes several times over. For the sub-critical regime, where the pulse peak power is less than the critical power for self-focusing, we derive analytic expressions for Gaussian beam focusing in the presence of Kerr non-linearity, as well as for a number of other beam shapes commonly used in experiments, including astigmatic and ring-shaped ones.
In the part devoted to the fabrication of periodic structures, we report on recent development of our point-by-point method, demonstrating the shortest periodic perturbation created in the bulk of a pure fused silica sample, using the third harmonic (λ = 267 nm) of the fundamental laser frequency (λ = 800 nm) and a 1 kHz femtosecond laser system. To overcome the fundamental limitations of the point-by-point method we suggested and experimentally demonstrated a micro-holographic inscription method, based on the combination of a diffractive optical element and standard micro-objectives. Sub-500 nm periodic structures with a much higher aspect ratio were demonstrated. From the applications point of view, we demonstrate examples of photonic devices made by the direct femtosecond fabrication method, including various vectorial bend sensors fabricated in standard optical fibres, as well as highly birefringent long-period gratings made by the direct modulation method. To address the intrinsic limitations of femtosecond inscription at very shallow depths we suggested a hybrid mask-less lithography method. The method is based on precision ablation of a thin metal layer deposited on the surface of the sample to create a mask. An ion-exchange process in a melt of Ag-containing salts then allows quick and low-cost fabrication of shallow waveguides and other components of integrated optics. This approach covers the gap in direct femtosecond inscription of shallow waveguides. Perspectives and future developments of direct femtosecond micro-fabrication are also discussed.
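The sub-critical/supercritical distinction above hinges on the critical power for self-focusing. For reference, the standard textbook expression for a Gaussian beam (not a result of this thesis) is:

```latex
% Critical power for self-focusing of a Gaussian beam (Marburger form);
% n_0 is the linear refractive index, n_2 the Kerr coefficient.
P_{\mathrm{cr}} \approx \frac{3.77\,\lambda^{2}}{8\pi\, n_{0} n_{2}}
```

Pulses with peak power above \(P_{\mathrm{cr}}\) fall into the supercritical regime referred to in the abstract; below it, the analytic Kerr-lens focusing expressions apply.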
Abstract:
In recent years there has been increased interest in applying non-parametric methods to real-world problems. Significant research has been devoted to Gaussian processes (GPs) due to their increased flexibility compared with parametric models. These methods use Bayesian learning, which generally leads to analytically intractable posteriors. This thesis proposes a two-step solution to construct a probabilistic approximation to the posterior. In the first step we adapt Bayesian online learning to GPs: the final approximation to the posterior is the result of propagating the first and second moments of intermediate posteriors obtained by combining a new example with the previous approximation. The propagation of functional forms is solved by showing the existence of a parametrisation of the posterior moments that uses combinations of the kernel function at the training points, transforming the Bayesian online learning of functions into a parametric formulation. The drawback is the prohibitive quadratic scaling of the number of parameters with the size of the data, making the method inapplicable to large datasets. The second step solves the problem of the exploding parameter size and makes GPs applicable to arbitrarily large datasets. The approximation is based on a measure of distance between two GPs, the KL-divergence between GPs. This second approximation replaces the posterior with a constrained GP in which only a small subset of the whole training dataset is used to represent the GP. This subset is called the Basis Vector (BV) set, and the resulting GP is a sparse approximation to the true posterior. As this sparsity is based on KL-minimisation, it is probabilistic and independent of the way the posterior approximation from the first step is obtained. We combine the sparse approximation with an extension to the Bayesian online algorithm that allows multiple iterations for each input, thus approximating a batch solution.
The resulting sparse learning algorithm is generic: for different problems we only change the likelihood. The algorithm is applied to a variety of problems, and we examine its performance on classical regression and classification tasks as well as on data assimilation and a simple density estimation problem.
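Two of the ideas above can be sketched very simply: the posterior mean written in the parametric form m(x) = Σᵢ αᵢ k(x, xᵢ), and a "basis vector" model that keeps only a subset of the training points. This is a plain subset-of-data illustration under simplifying assumptions, not the thesis's KL-optimal sparse online update; all data and names are illustrative.

```python
# GP regression mean in the kernel-parametric form, full vs. BV subset.
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_fit(x, y, noise=0.1):
    """Coefficients alpha such that the posterior mean is k(x*, x) @ alpha."""
    return np.linalg.solve(rbf(x, x) + noise * np.eye(len(x)), y)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3, 3, 60))
y = np.sin(x) + 0.1 * rng.standard_normal(60)

alpha_full = gp_fit(x, y)          # 60 parameters: scales with dataset size
bv = slice(0, 60, 6)               # crude BV set: every 6th training point
alpha_bv = gp_fit(x[bv], y[bv])    # sparse model with only 10 coefficients

xs = np.linspace(-3, 3, 200)
full_mean = rbf(xs, x) @ alpha_full
bv_mean = rbf(xs, x[bv]) @ alpha_bv
print(np.max(np.abs(full_mean - bv_mean)))
```

The point of the sparse step in the thesis is that the BV set is chosen by KL-minimisation rather than by the naive striding used here, so the approximation error is controlled rather than arbitrary.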
Abstract:
Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under low-density parity-check (LDPC) network coding and joint decoding. The saddle point equations for the replica symmetric solution are found in particular realizations of this channel, including a small and large number of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver and symmetric and asymmetric interference. Both dynamical and thermodynamical transitions from the ferromagnetic solution of perfect decoding to a non-ferromagnetic solution are identified for the cases considered, marking the practical and theoretical limits of the system under the current coding scheme. Numerical results are provided, showing the typical level of improvement/deterioration achieved with respect to the single transmitter/receiver result, for the various cases. © 2007 IOP Publishing Ltd.
Abstract:
The delicately orchestrated process of bone fracture healing is not always successful, and long-term non-union of fractured bone occurs in 5-20% of all cases. Atrophic fracture non-unions have been described as the most difficult to treat, and this is thought to arise through a cellular and local failure of osteogenesis. However, little is known about the presence and osteogenic proficiency of cells in the local area of non-union tissue. We have examined the growth and differentiation potential of cells isolated from human non-union tissues compared with normal human bone marrow mesenchymal stromal cells (BMSC). We report the isolation and culture expansion of a population of non-union stromal cells (NUSC) which have a CD profile similar to that of BMSC, i.e. CD34-ve, CD45-ve and CD105+ve. The NUSC demonstrated multipotentiality and differentiated to some extent along chondrogenic, adipogenic and osteogenic lineages. Importantly, however, the NUSC showed significantly reduced osteogenic differentiation and mineralisation in vitro compared to BMSC. We also found increased levels of cell senescence in NUSC compared to BMSC, based on culture growth kinetics and cell positivity for senescence-associated beta-galactosidase (SA-beta-Gal) activity. The reduced capacity of NUSC to form osteoblasts was associated with significantly elevated secretion of Dickkopf-1 (Dkk-1), an important inhibitor of Wnt signalling during osteogenesis, compared to BMSC. Conversely, treating BMSC with levels of rhDkk-1 equivalent to those secreted by NUSC inhibited the capacity of BMSC to undergo osteogenesis. Treating BMSC with NUSC-conditioned medium also inhibited their capacity to undergo osteogenic differentiation when compared to treatment with BMSC-conditioned medium.
Our results suggest that the development of fracture non-union is linked with a localised reduced capacity of cells to undergo osteogenesis, which in turn is associated with increased cell senescence and Dkk-1 secretion.
Abstract:
Time, cost and quality achievements on large-scale construction projects are uncertain because of technological constraints, the involvement of many stakeholders, long durations, large capital requirements and improper scope definitions. Projects exposed to such an uncertain environment can be managed effectively by applying risk management throughout the project life cycle. Risk is by nature subjective. However, managing risk subjectively poses the danger of non-achievement of project goals. Moreover, risk analysis of the overall project alone poses the danger of developing inappropriate responses. This article demonstrates a quantitative approach to construction risk management using the analytic hierarchy process (AHP) and decision tree analysis (DTA). The entire project is broken down into a few work packages. With the involvement of project stakeholders, risky work packages are identified. Once all the risk factors are identified, their effects are quantified by determining probability (using AHP) and severity (guess estimate). Various alternative responses are generated, listing the cost implications of mitigating the quantified risks. The expected monetary values are derived for each alternative in a decision tree framework, and subsequent probability analysis helps to make the right decision in managing risks. The entire methodology is explained using a case application of a cross-country petroleum pipeline project in India. The case study demonstrates the project management effectiveness of using AHP and DTA.
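The decision-tree step described above reduces to comparing expected monetary values (EMV) across alternative risk responses, with the probabilities assumed to come from the AHP stage. The sketch below uses illustrative figures, not numbers from the pipeline case study.

```python
# EMV of each risk-response alternative:
#   EMV = cost of the response + sum of probability-weighted losses.
# Probabilities stand in for AHP-derived values; all figures are made up.

def emv(response_cost, outcomes):
    """Expected monetary value for one branch of the decision tree."""
    return response_cost + sum(p * loss for p, loss in outcomes)

alternatives = {
    # name: (cost of response, [(probability, monetary loss), ...])
    "mitigate": (120, [(0.2, 500), (0.8, 0)]),
    "transfer": (180, [(0.1, 500), (0.9, 0)]),
    "accept":   (0,   [(0.6, 500), (0.4, 0)]),
}
emvs = {name: emv(cost, out) for name, (cost, out) in alternatives.items()}
best = min(emvs, key=emvs.get)
print(emvs, best)
```

Here mitigation costs 120 up front but cuts the probability of a 500-unit loss from 0.6 to 0.2, giving the lowest EMV of the three branches.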
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, it is necessary for companies to equip themselves with intelligent tools, thereby enabling managerial levels to make better tactical decisions. However, the implementation of an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, for example managers and engineers, to make the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.
Abstract:
In this article we describe and evaluate the process of conducting online survey research about the legal recognition of same-sex relationships (key findings from which we have reported elsewhere; see Harding and Peel, 2006). Our aim in so doing is to contribute to the growing generic literature on internet-based research methods (Nosek et al., 2002; Rhodes et al., 2003; Stern, 2003; Strickland et al., 2003; Thomas et al., 2000), to the research methods literature within lesbian, gay, bisexual, trans and queer (LGBTQ) psychologies (Fish, 2000; Morris and Rothblum, 1999; Meezan and Martin, 2003; Mustanski, 2001), and also to extend the germinal literature focusing on internet research with non-heterosexual groups (Elford et al., 2004; Ellis et al., 2003; Ross et al., 2000). We begin by discussing the process of developing the online survey tool, before outlining the experience of the survey ‘going live’ and providing details of who completed the survey. We conclude by exploring some of the positives and pitfalls of this type of research methodology.
Abstract:
We consider return-to-zero (RZ) pulses with random phase modulation propagating in a nonlinear channel (modelled by the integrable nonlinear Schrödinger equation, NLSE). We suggest two different models for the phase fluctuations of the optical field: (i) Gaussian short-correlated fluctuations and (ii) generalized telegraph process. Using the rectangular-shaped pulse form we demonstrate that the presence of phase fluctuations of both types strongly influences the number of solitons generated in the channel. It is also shown that increasing the correlation time for the random phase fluctuations affects the coherent content of a pulse in a non-trivial way. The result obtained has potential consequences for all-optical processing and design of optical decision elements.
Abstract:
The objective of this work was to develop further the idea introduced by Muaddi et al. (1981), which enables some of the disadvantages of earlier destructive adhesion test methods to be overcome. The test is non-destructive in nature, but it does need to be calibrated against a destructive method. Adhesion is determined by measuring the effect of plating on internal friction. This is achieved by determining the damping of vibrations of a resonating specimen before and after plating; the level of adhesion was considered by the above authors to influence the degree of damping. In the major portion of the research the electrodeposited metal was Watts nickel, which is ductile in nature and is therefore suitable for peel adhesion testing. The base metals chosen were the aluminium alloys S1C and HE9, as it is relatively easy to produce varying levels of adhesion between the substrate and the electrodeposited coating by choosing the appropriate process sequence. S1C alloy is commercially pure aluminium and was used to produce good adhesion; HE9 is a more difficult alloy to plate and was chosen to produce poorer adhesion. The "Modal Testing" method used for studying vibrations was investigated as a possible means of evaluating adhesion but was not successful, so research was concentrated on the "Q" meter. The "Q" meter method involves exciting vibrations in a sample, interrupting the driving signal and counting the number of oscillations of the freely decaying vibrations between two known, preselected amplitudes. It was not possible to reconstruct a working instrument from Muaddi's thesis (1982), as it contained either a serious error or incomplete information. Hence a modified "Q" meter had to be designed and constructed, but it proved difficult to resonate non-magnetic materials such as aluminium, so a comparison before and after plating could not be made.
A new "Q" meter was then developed based on an impulse technique. A regulated miniature hammer was used to excite the test piece at the fundamental mode instead of an electronic hammer, and test pieces were supported at the two predetermined nodal points using nylon threads. The instrument developed was not very successful at detecting changes due to good and poor pretreatments given before plating; however, it was more sensitive to changes at the surface, such as room-temperature oxidation. Statistical analysis of test results from untreated aluminium alloys showed that the instrument is not always consistent; the variation was even bigger when readings were taken on different days. Although aluminium is said to form protective oxides at room temperature, there was evidence that the aluminium surface changes continuously due to film formation, growth and breakdown. Nickel-plated and zinc-alloy immersion-coated samples also showed variation in Q with time. To prove that the variations in Q were mainly due to surface oxidation, aluminium samples were lacquered and anodised. Such treatments enveloped the active surfaces reacting with the environment, and the variation of Q with time was almost eliminated, especially after hard anodising. The instrument detected major differences between different untreated aluminium substrates, and Q values decreased progressively as coating thicknesses were increased. It was also able to detect changes in Q due to heat treatment of aluminium alloys.
Abstract:
Molecular transport in phase space is crucial for chemical reactions because it defines how pre-reactive molecular configurations are found during the time evolution of the system. Using atomistic trajectories from Molecular Dynamics (MD) simulations, we test the assumption of normal diffusion in the phase space of bulk water at ambient conditions by checking the equivalence of the transport to the random walk model. Contrary to common expectations, we have found that some statistical features of the transport in phase space differ from those of normal diffusion models. This implies a non-random character of the path-search process by the reacting complexes in water solutions. Our further numerical experiments show that a significantly long period of non-stationarity in the transition probabilities of the segments of molecular trajectories can account for the observed non-uniform filling of the phase space. Surprisingly, the characteristic periods of this non-stationarity amount to hundreds of nanoseconds, a much longer time scale than the typical lifetime of known liquid water molecular structures (several picoseconds).
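The baseline the abstract tests against can be sketched directly: for normal diffusion the ensemble mean-squared displacement (MSD) grows linearly in time, so log(MSD) vs log(t) has slope ≈ 1. Synthetic random walkers stand in here for the MD trajectories; any deviation of the fitted exponent from 1 would signal the kind of anomalous transport the paper reports.

```python
# Ensemble MSD of independent Gaussian random walkers and the fitted
# diffusion exponent (slope of log MSD vs log t); ~1 for normal diffusion.
import numpy as np

rng = np.random.default_rng(1)
steps = rng.standard_normal((2000, 500))    # 2000 walkers, 500 time steps
paths = np.cumsum(steps, axis=1)            # walker positions over time
msd = (paths ** 2).mean(axis=0)             # ensemble MSD at each time
t = np.arange(1, 501)
slope = np.polyfit(np.log(t), np.log(msd), 1)[0]
print(round(slope, 3))
```

Sub- or super-diffusive transport would give slopes below or above 1 respectively; the non-stationarity described in the abstract shows up as systematic departures from this straight line.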
Abstract:
Different types of numerical data can be collected in a scientific investigation and the choice of statistical analysis will often depend on the distribution of the data. A basic distinction between variables is whether they are ‘parametric’ or ‘non-parametric’. When a variable is parametric, the data come from a symmetrically shaped distribution known as the ‘Gaussian’ or ‘normal distribution’ whereas non-parametric variables may have a distribution which deviates markedly in shape from normal. This article describes several aspects of the problem of non-normality including: (1) how to test for two common types of deviation from a normal distribution, viz., ‘skew’ and ‘kurtosis’, (2) how to fit the normal distribution to a sample of data, (3) the transformation of non-normally distributed data and scores, and (4) commonly used ‘non-parametric’ statistics which can be used in a variety of circumstances.
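Points (1), (3) and (4) above can be sketched with standard SciPy routines: a skew test on the raw data, a log transform to normalise positively skewed scores, and a non-parametric test as a fallback. The samples are simulated, not data from the article.

```python
# Testing for skew, transforming the data, and a non-parametric comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
skewed = rng.lognormal(mean=1.0, sigma=0.8, size=500)   # markedly skewed

print(stats.skewtest(skewed).pvalue)        # tiny p-value: significant skew
transformed = np.log(skewed)                # log transform of the scores
print(stats.skewtest(transformed).pvalue)   # symmetry restored

# (4) a non-parametric alternative when data remain non-normal:
other = rng.normal(loc=3.0, scale=1.0, size=500)
print(stats.mannwhitneyu(skewed, other).pvalue)
```

`scipy.stats.kurtosistest` covers the kurtosis check in point (1) with the same calling pattern; `stats.normaltest` combines both into a single omnibus test.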
Abstract:
From a manufacturing perspective, the efficiency of manufacturing operations (such as process planning and production scheduling) is a key element in enhancing manufacturing competence. Process planning and production scheduling have traditionally been treated as two separate activities, which has resulted in a range of inefficiencies, including infeasible process plans, non-available or overloaded resources, high production costs, long production lead times, and so on. Above all, dynamic changes are unlikely to be dealt with efficiently. Although much research has been conducted on integrating process planning and production scheduling to generate optimised solutions and improve manufacturing efficiency, there is still a gap in achieving the competence required for the current global competitive market. In this research, the concept of a multi-agent system (MAS) is adopted as a means to address this gap. A MAS consists of a collection of intelligent autonomous agents able to solve complex problems. These agents possess individual objectives and interact with each other to fulfil the global goal. This paper describes a novel use of an autonomous agent system to facilitate the integration of process planning and production scheduling functions to cope with unpredictable demands, in terms of uncertainties in product mix and demand pattern. The novelty lies in the currency-based iterative agent bidding mechanism, which allows process planning and production scheduling options to be evaluated simultaneously so as to search for an optimised, cost-effective solution. This agent-based system aims to achieve manufacturing competence by enhancing the flexibility and agility of manufacturing enterprises.