33 results for estimation and filtering


Relevance:

80.00%

Publisher:

Abstract:

Conventional project management techniques are not always sufficient for ensuring time, cost and quality achievement of large-scale construction projects due to complexity in planning and implementation processes. The main reasons for project non-achievement are changes in scope and design, changes in Government policies and regulations, unforeseen inflation, under-estimation and improper estimation. Projects that are exposed to such an uncertain environment can be effectively managed with the application of risk management throughout the project life cycle. However, the effectiveness of risk management depends on the technique with which the effects of risk factors are analysed and/or quantified. This study proposes the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, as a tool for risk analysis because it can handle subjective as well as objective factors in a decision model that are conflicting in nature. This provides a decision support system (DSS) to project management for making the right decision at the right time, ensuring project success in line with organisation policy, project objectives and the competitive business environment. The whole methodology is explained through a case study of a cross-country petroleum pipeline project in India and its effectiveness in project management is demonstrated.
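As a hedged illustration of the AHP step described above (not the authors' implementation), the sketch below derives priority weights for three hypothetical risk factors from a Saaty-scale pairwise-comparison matrix and checks consistency; the judgement values are invented for the example.

```python
# Illustrative AHP sketch: priority weights from a pairwise-comparison matrix.
# The three risk factors and all judgements (1-9 scale) are hypothetical.
import numpy as np

A = np.array([[1.0,  3.0, 5.0],    # scope/design change
              [1/3., 1.0, 2.0],    # policy/regulation change
              [1/5., 1/2., 1.0]])  # inflation and estimation error

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalised priority weights

# Consistency check (CR < 0.1 is the usual threshold); random index RI = 0.58 for n = 3.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print(weights, CR)
```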

Relevance:

80.00%

Publisher:

Abstract:

We report the case of a neologistic jargonaphasic and ask whether her target-related and abstruse neologisms are the result of a single deficit, which affects some items more severely than others, or two deficits: one to lexical access and the other to phonological encoding. We analyse both correct/incorrect performance and errors and apply both traditional and formal methods (maximum-likelihood estimation and model selection). All evidence points to a single deficit at the level of phonological encoding. Further characteristics are used to constrain the locus still further. V.S. does not show the type of length effect expected of a memory component, nor the pattern of errors associated with an articulatory deficit. We conclude that her neologistic errors can result from a single deficit at a level of phonological encoding that immediately follows lexical access where segments are represented in terms of their features. We do not conclude, however, that this is the only possible locus that will produce phonological errors in aphasia, or, indeed, jargonaphasia.
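For readers unfamiliar with the formal methods mentioned, the sketch below shows the general shape of a likelihood-based comparison between a one-deficit and a two-deficit account. The counts, the binomial framing and the use of AIC are assumptions made here for illustration, not the study's actual analysis.

```python
# Hypothetical model-selection sketch: does one error rate (single deficit)
# or two error rates (two deficits) better explain correct/incorrect counts?
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

correct = np.array([12, 30])   # invented correct counts for two item sets
total = np.array([40, 40])

def nll_one_deficit(p):        # one shared success probability
    return -binom.logpmf(correct, total, p[0]).sum()

def nll_two_deficits(p):       # separate success probability per item set
    return -binom.logpmf(correct, total, p).sum()

f1 = minimize(nll_one_deficit, [0.5], bounds=[(1e-6, 1 - 1e-6)])
f2 = minimize(nll_two_deficits, [0.5, 0.5], bounds=[(1e-6, 1 - 1e-6)] * 2)

aic1 = 2 * 1 + 2 * f1.fun      # AIC = 2k + 2 * negative log-likelihood
aic2 = 2 * 2 + 2 * f2.fun
print(aic1, aic2)              # lower AIC -> preferred model
```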

Relevance:

80.00%

Publisher:

Abstract:

The subject of this thesis is the n-tuple network (RAMnet). The major advantage of RAMnets is their speed and the simplicity with which they can be implemented in parallel hardware. On the other hand, this method is not a universal approximator and the training procedure does not involve the minimisation of a cost function. Hence RAMnets are potentially sub-optimal. It is important to understand the source of this sub-optimality and to develop the analytical tools that allow us to quantify the generalisation cost of using this model for any given data. We view RAMnets as classifiers and function approximators and try to determine how critical their lack of universality and optimality is. In order to understand better the inherent restrictions of the model, we review RAMnets, showing their relationship to a number of well-established general models such as Associative Memories, Kanerva's Sparse Distributed Memory, Radial Basis Functions, General Regression Networks and Bayesian Classifiers. We then benchmark the binary RAMnet model against 23 other algorithms using real-world data from the StatLog Project. This large-scale experimental study indicates that RAMnets are often capable of delivering results which are competitive with those obtained by more sophisticated, computationally expensive models. The Frequency Weighted version is also benchmarked and shown to perform worse than the binary RAMnet for large values of the tuple size n. We demonstrate that the main issue in Frequency Weighted RAMnets is adequate probability estimation and propose Good-Turing estimates in place of the more commonly used Maximum Likelihood estimates. Having established the viability of the method numerically, we focus on providing an analytical framework that allows us to quantify the generalisation cost of RAMnets for a given dataset. For the classification network we provide a semi-quantitative argument which is based on the notion of tuple distance. It gives a good indication of whether the network will fail for the given data. A rigorous Bayesian framework with Gaussian process prior assumptions is given for the regression n-tuple net. We show how to calculate the generalisation cost of this net and verify the results numerically for one-dimensional noisy interpolation problems. We conclude that the n-tuple method of classification based on memorisation of random features can be a powerful alternative to slower, cost-driven models. The speed of the method is at the expense of its optimality. RAMnets will fail for certain datasets, but the cases when they do so are relatively easy to determine with the analytical tools we provide.
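The proposal to replace Maximum Likelihood estimates with Good-Turing estimates can be illustrated with a minimal sketch. The counts below are hypothetical tuple-address frequencies, and the estimator is the basic unsmoothed Turing formula rather than the specific variant used in the thesis.

```python
# Basic (unsmoothed) Good-Turing estimate versus the maximum-likelihood estimate,
# for invented tuple-address counts.
from collections import Counter

counts = Counter({'a': 5, 'b': 3, 'c': 3, 'd': 1, 'e': 1, 'f': 1})
N = sum(counts.values())
freq_of_freq = Counter(counts.values())        # N_r: number of addresses seen r times

def ml_prob(r):
    return r / N

def good_turing_prob(r):
    # r* = (r + 1) * N_{r+1} / N_r; falls back to ML when N_{r+1} is empty,
    # which a smoothed estimator (e.g. simple Good-Turing) would handle better.
    n_r, n_r1 = freq_of_freq.get(r, 0), freq_of_freq.get(r + 1, 0)
    if n_r == 0 or n_r1 == 0:
        return ml_prob(r)
    return (r + 1) * n_r1 / (n_r * N)

p_unseen = freq_of_freq.get(1, 0) / N          # probability mass reserved for unseen addresses
print(good_turing_prob(1), ml_prob(1), p_unseen)
```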

Relevance:

80.00%

Publisher:

Abstract:

The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snap-shots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish if the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method to model multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, that incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
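As a rough sketch of the mixture density network idea (not the thesis model, and with a random linear layer standing in for the trained network), the code below maps an observation to the parameters of a two-component Gaussian mixture and evaluates the resulting conditional density, mirroring the predominantly bi-modal structure mentioned above.

```python
# Toy mixture density network output layer: observation -> parameters of a
# two-component Gaussian mixture over wind direction. Shapes and the "network"
# weights are illustrative only.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def mdn_params(x, W, b):
    z = W @ x + b                       # 6 outputs for 2 components
    pi = softmax(z[:2])                 # mixing coefficients (sum to 1)
    mu = z[2:4]                         # component means
    sigma = np.exp(z[4:6])              # positive standard deviations
    return pi, mu, sigma

def conditional_pdf(t, pi, mu, sigma):
    # p(t | x) = sum_k pi_k * N(t; mu_k, sigma_k^2)
    norm = np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    return np.sum(pi * norm)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                  # stand-in for a scatterometer measurement
W, b = rng.normal(size=(6, 3)), rng.normal(size=6)
pi, mu, sigma = mdn_params(x, W, b)
print(conditional_pdf(0.5, pi, mu, sigma))
```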

Relevance:

80.00%

Publisher:

Abstract:

Computer integrated manufacture has brought about great advances in manufacturing technology and its recognition is worldwide. Cold roll forming of thin-walled sections, and in particular the design and manufacture of form-rolls, the special tooling used in the cold roll forming process, is but one such area where computer integrated manufacture can make a positive contribution. The work reported in this thesis, concerned with the development of an integrated manufacturing system for assisting the design and manufacture of form-rolls, was undertaken in collaboration with a leading manufacturer of thin-walled sections. A suite of computer programs, written in FORTRAN 77, has been developed to provide computer aids for every aspect of work in form-roll design and manufacture, including cost estimation and stock control aids. The first phase of the development programme dealt with the establishment of CAD facilities for form-roll design, comprising the design of the finished section, the flower pattern, the roll design and the interactive roll editor program. Concerning the CAM facilities, dealt with in the second phase, an expert system roll machining processor and a general post-processor have been developed for considering the roll geometry and automatically generating NC tape programs for any required CNC lathe system. These programs have been successfully implemented, as an integrated manufacturing software system, on the VAX 11/750 super-minicomputer with graphics facilities for displaying drawings interactively on the terminal screen. The development of the integrated system has been found beneficial in all aspects of form-roll design and manufacture. Design and manufacturing lead times have been reduced by several weeks, quality has improved considerably and productivity has increased. The work has also demonstrated the promising nature of the expert systems approach to computer integrated manufacture.

Relevance:

80.00%

Publisher:

Abstract:

Current analytical assay methods for ampicillin sodium and cloxacillin sodium are discussed and compared, with High Performance Liquid Chromatography (H.P.L.C.) being chosen as the most accurate, specific and precise. New H.P.L.C. methods are described for the analysis of benzathine cloxacillin; benzathine penicillin V; procaine penicillin injection B.P.; benethamine penicillin injection, fortified B.P.C.; benzathine penicillin injection; benzathine penicillin injection, fortified B.P.C.; benzathine penicillin suspension; ampicillin syrups and penicillin syrups. Mechanical or chemical damage to column packings is often associated with H.P.L.C. analysis. One type, that of channel formation, is investigated. The high linear velocity of solvent and solvent pulsing during the pumping cycle were found to be the cause of this damage. The applicability of nonisothermal kinetic experiments to penicillin V preparations, including formulated paediatric syrups, is evaluated. A new type of nonisothermal analysis, based on slope estimation and using a 64K Random Access Memory (R.A.M.) microcomputer, is described. The name of the program written for this analysis is NONISO. The distribution of active penicillin in granules for reconstitution into ampicillin and penicillin V syrups, and its effect on the stability of the reconstituted products, are investigated. Changing the diluent used to reconstitute the syrups was found to affect the stability of the product. Dissolution and stability of benzathine cloxacillin at pH 2, pH 6 and pH 9 are described, with proposed dissolution mechanisms and kinetic analysis to support these mechanisms. Benzathine and cloxacillin were found to react in solution at pH 9, producing an insoluble amide.
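A minimal sketch of slope-based nonisothermal analysis is given below. It is not the NONISO program, and the temperature ramp, rate parameters and first-order assumption are invented for illustration: local slopes of ln(concentration) against time give instantaneous rate constants, which an Arrhenius regression converts to an activation energy.

```python
# Hedged sketch of slope-based nonisothermal kinetics on synthetic data.
import numpy as np

R = 8.314                                    # gas constant, J mol^-1 K^-1
t = np.linspace(0, 10, 50)                   # time (arbitrary units), synthetic run
T = 298.0 + 2.0 * t                          # linear temperature ramp, K
Ea_true, A = 8.0e4, 2.0e9                    # assumed activation energy and pre-exponential
k = A * np.exp(-Ea_true / (R * T))           # first-order rate constant along the ramp

# Integrate dC/dt = -k(T(t)) C by the trapezoidal rule to generate "data".
C = 100 * np.exp(-np.concatenate(([0.0], np.cumsum(0.5 * (k[1:] + k[:-1]) * np.diff(t)))))

lnC = np.log(C)
local_k = -np.gradient(lnC, t)               # slope estimate: -d lnC/dt = k(T)
mask = local_k > 0
slope, intercept = np.polyfit(1.0 / T[mask], np.log(local_k[mask]), 1)
Ea_est = -slope * R                          # activation energy from the Arrhenius slope
print(Ea_est)                                # should recover roughly Ea_true
```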

Relevance:

80.00%

Publisher:

Abstract:

Methods of solving the neuro-electromagnetic inverse problem are examined and developed, with specific reference to the human visual cortex. The anatomy, physiology and function of the human visual system are first reviewed. Mechanisms by which the visual cortex gives rise to external electric and magnetic fields are then discussed, and the forward problem is described mathematically for the case of an isotropic, piecewise homogeneous volume conductor, and then for an anisotropic, concentric, spherical volume conductor. Methods of solving the inverse problem are reviewed, before a new technique is presented. This technique combines prior anatomical information gained from stereotaxic studies with a probabilistic distributed-source algorithm to yield accurate, realistic inverse solutions. The solution accuracy is enhanced by using both visual evoked electric and magnetic responses simultaneously. The numerical algorithm is then modified to perform equivalent current dipole fitting and minimum norm estimation, and these three techniques are implemented on a transputer array for fast computation. Due to the linear nature of the techniques, they can be executed on up to 22 transputers with close to linear speedup. The latter part of the thesis describes the application of the inverse methods to the analysis of visual evoked electric and magnetic responses. The CIIm peak of the pattern onset evoked magnetic response is deduced to be a product of current flowing away from the surface areas 17, 18 and 19, while the pattern reversal P100m response originates in the same areas, but from oppositely directed current. Cortical retinotopy is examined using sectorial stimuli; the CI and CIm peaks of the pattern onset electric and magnetic responses are found to originate from areas V1 and V2 simultaneously, and they therefore do not conform to a simple cruciform model of primary visual cortex.
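A hedged sketch of the minimum norm estimation step is shown below; the lead-field matrix is random rather than anatomically constrained, purely to illustrate the regularised linear algebra involved.

```python
# Illustrative minimum-norm estimate for a linear forward problem b = L j + noise.
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sources = 30, 200
L = rng.normal(size=(n_sensors, n_sources))         # lead-field (forward) matrix, random here
j_true = np.zeros(n_sources); j_true[50] = 1.0      # single active source
b = L @ j_true + 0.01 * rng.normal(size=n_sensors)  # measured field

lam = 1e-2                                          # Tikhonov regularisation parameter
# Minimum-norm solution: j_hat = L^T (L L^T + lam I)^{-1} b
j_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), b)
print(int(np.argmax(np.abs(j_hat))))                # index of the strongest estimated source
```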

Relevance:

80.00%

Publisher:

Abstract:

With the rebirth of coherent detection, various algorithms have come forth to alleviate phase noise, one of the main impairments for coherent receivers. These algorithms provide stable compensation; however, they limit the DSP. With this key issue in mind, Fabry-Perot filter based self-coherent optical OFDM, which does not require phase noise compensation, was analyzed, reducing the complexity of the DSP at low OSNR. However, the performance of such a receiver is limited by ASE noise at the carrier wavelength, especially since an optical amplifier is typically employed with the filter to ensure sufficient carrier power. Subsequently, the use of an injection-locked laser (ILL) to retrieve the frequency and phase information from the extracted carrier without the use of an amplifier was recently proposed. In the ILL-based system, an optical carrier is sent along with the OFDM signal at the transmitter. At the receiver, the carrier is extracted from the OFDM signal using a Fabry-Perot tunable filter, and an ILL is used to significantly amplify the carrier and reduce intensity and phase noise. In contrast to CO-OFDM, such a system supports low-cost, broad-linewidth lasers and benefits from lower complexity in the DSP, as neither carrier frequency estimation and correction nor phase noise compensation is required.
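The reason no phase-noise compensation is needed can be seen in a toy numerical sketch (assumptions mine: a QPSK-like payload, random-walk phase noise, no ASE or filtering modelled): because the transmitted carrier and the data share the same laser phase, mixing the received signal with the extracted carrier cancels the phase noise.

```python
# Toy illustration of self-coherent detection with a transmitted carrier.
import numpy as np

rng = np.random.default_rng(2)
n = 1024
symbols = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], size=n)   # QPSK-like payload
phase_noise = np.cumsum(0.05 * rng.normal(size=n))         # laser phase random walk

received_signal = symbols * np.exp(1j * phase_noise)       # data band with phase noise
received_carrier = np.exp(1j * phase_noise)                # extracted, ILL-cleaned carrier

recovered = received_signal * np.conj(received_carrier)    # common phase noise cancels
print(np.max(np.abs(recovered - symbols)))                 # ~0 in this noise-free toy case
```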

Relevance:

80.00%

Publisher:

Abstract:

We examine the statistics of three interacting optical solitons under the effects of amplifier noise and filtering. We derive rigorously the Fokker-Planck equation that governs the probability distribution of soliton parameters.
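The generic structure of such an equation (with notation assumed here, not taken from the paper) is a drift-diffusion equation for the joint density of the soliton parameters:

```latex
% Sketch only: P(q, t) is the joint density of the soliton parameters q
% (e.g. amplitude, frequency, timing, phase); the drift D_i collects the
% deterministic effects of filtering and soliton interaction, and the
% diffusion Q_ij comes from the amplifier noise.
\frac{\partial P(\mathbf{q},t)}{\partial t}
  = -\sum_{i}\frac{\partial}{\partial q_i}\bigl[D_i(\mathbf{q})\,P(\mathbf{q},t)\bigr]
  + \frac{1}{2}\sum_{i,j}\frac{\partial^{2}}{\partial q_i\,\partial q_j}
    \bigl[Q_{ij}(\mathbf{q})\,P(\mathbf{q},t)\bigr]
```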

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a semiparametric smooth-coefficient stochastic production frontier model where all the coefficients are expressed as unknown functions of environmental factors. The inefficiency term is multiplicatively decomposed into a scaling function of the environmental factors and a standard truncated normal random variable. A testing procedure is suggested for the relevance of the environmental factors. A Monte Carlo study shows plausible finite-sample behavior of our proposed estimation and inference procedure. An empirical example is given, where both the semiparametric and standard parametric models are estimated and the results are compared.
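In notation assumed here for illustration (output y_i, inputs x_i, environmental factors z_i), the model described above takes the form:

```latex
% Sketch of the smooth-coefficient frontier with multiplicatively scaled
% inefficiency; \alpha(\cdot), \beta(\cdot) and the scaling function g(\cdot)
% are unknown functions of z_i, and u_i^{*} is the truncation of a standard
% normal at zero, following the abstract.
y_i = \alpha(z_i) + x_i'\,\beta(z_i) + v_i - u_i, \qquad
u_i = g(z_i)\,u_i^{*}, \quad
u_i^{*} \sim N^{+}(0,1), \quad
v_i \sim N(0,\sigma_v^{2})
```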

Relevance:

80.00%

Publisher:

Abstract:

We examine the statistics of three interacting optical solitons under the effects of amplifier noise and filtering. We derive rigorously the Fokker-Planck equation that governs the probability distribution of soliton parameters.

Relevance:

80.00%

Publisher:

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the Water and other Industries for a variety of reasons. For example, risks and uncertainties associated with climate and other changes require knowledge to prepare for a range of future scenarios and potential extreme events. Formal ways in which knowledge can be established and managed can help deliver efficiencies in acquisition, structuring and filtering, to provide only the essential aspects of the knowledge really needed. Ontologies are a key technology for this knowledge management. The construction of ontologies is a considerable overhead on any knowledge management programme; hence current computer science research is investigating generating ontologies automatically from documents using text mining and natural language techniques. As an example of this, results from the application of the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of highly knowledgeable systems containing automated services to ensure that sustainability considerations are included.
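As a deliberately oversimplified sketch of the kind of text-mining step involved (this is not Text2Onto, and the sample sentence and stopword list are invented), candidate concept terms can be extracted from a stakeholder document by term frequency:

```python
# Crude candidate-concept extraction by word frequency (illustration only).
import re
from collections import Counter

text = """Sustainable water cycle management in new developments requires
knowledge of rainwater harvesting, greywater reuse and surface water drainage."""

stopwords = {"in", "of", "and", "the", "requires", "new"}
tokens = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stopwords]
candidates = Counter(tokens).most_common(5)     # most frequent terms as concept candidates
print(candidates)
```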

Relevance:

80.00%

Publisher:

Abstract:

Time, cost and quality are the prime objectives of any project. Unfortunately, today’s project management does not always ensure the realisation of these objectives. The main reasons for project non-achievement are changes in scope and design, changes in Government policies and regulations, unforeseen inflation, under-estimation and mis-estimation. Only an overall organisational approach, with the application of appropriate management philosophies, tools and techniques, can solve the problem. The present study establishes a methodology for achieving success in implementing projects using a business process re-engineering (BPR) framework. Internal performance characteristics are introspected through condition diagnosis, which identifies and prioritises areas of concern requiring attention. Process re-engineering emerges as the most critical area for immediate attention. Project process re-engineering is carried out by eliminating non-value-added activities, taking up activities concurrently by applying information systems rigorously, and applying risk management techniques throughout the project life cycle. The overall methodology is demonstrated through application to a cross-country petroleum pipeline project organisation in an Indian scenario.

Relevance:

80.00%

Publisher:

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the Water and other Industries for a variety of reasons. For example, risks and uncertainties associated with climate and other changes require knowledge to prepare for a range of future scenarios and potential extreme events. Formal ways in which knowledge can be established and managed can help deliver efficiencies in acquisition, structuring and filtering, to provide only the essential aspects of the knowledge really needed. Ontologies are a key technology for this knowledge management. The construction of ontologies is a considerable overhead on any knowledge management programme; hence current computer science research is investigating generating ontologies automatically from documents using text mining and natural language techniques. As an example of this, results from the application of the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of highly knowledgeable systems containing automated services to ensure that sustainability considerations are included. © 2010 The authors.

Relevance:

80.00%

Publisher:

Abstract:

With the rebirth of coherent detection, various algorithms have come forth to alleviate phase noise, one of the main impairments for coherent receivers. These algorithms provide stable compensation; however, they limit the DSP. With this key issue in mind, Fabry-Perot filter based self-coherent optical OFDM, which does not require phase noise compensation, was analyzed, reducing the complexity of the DSP at low OSNR. However, the performance of such a receiver is limited by ASE noise at the carrier wavelength, especially since an optical amplifier is typically employed with the filter to ensure sufficient carrier power. Subsequently, the use of an injection-locked laser (ILL) to retrieve the frequency and phase information from the extracted carrier without the use of an amplifier was recently proposed. In the ILL-based system, an optical carrier is sent along with the OFDM signal at the transmitter. At the receiver, the carrier is extracted from the OFDM signal using a Fabry-Perot tunable filter, and an ILL is used to significantly amplify the carrier and reduce intensity and phase noise. In contrast to CO-OFDM, such a system supports low-cost, broad-linewidth lasers and benefits from lower complexity in the DSP, as neither carrier frequency estimation and correction nor phase noise compensation is required.