957 results for Statistical analysis techniques


Relevance: 90.00%

Abstract:

Research on production systems design has in recent years tended to concentrate on ‘software’ factors such as organisational aspects, work design, and the planning of production operations. In contrast, relatively little attention has been paid to maximising the contributions made by fixed assets, particularly machines and equipment. However, as the cost of unproductive machine time has increased, reliability, particularly of machine tools, has become ever more important. Reliability theory and research have traditionally been based mainly on electrical and electronic equipment, whereas mechanical devices, especially machine tools, have not received sufficiently objective treatment. A recently completed research project has considered the reliability of machine tools by taking sample surveys of purchasers, maintainers and manufacturers. Breakdown data were also collected from a number of engineering companies and analysed using both manual and computer techniques. The results indicate the factors most likely to influence reliability, which in turn could lead to improved design and selection of machine tool systems. Statistical analysis of long-term field data has revealed patterns and trends of failure that could help in the design of more meaningful maintenance schemes.
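
The abstract does not name its analysis methods, but a standard way to expose such failure trends from breakdown data is Weibull analysis of times between failures. A minimal sketch, with made-up data and an assumed scipy fitting approach (nothing here is taken from the thesis):

```python
import numpy as np
from scipy import stats

# Hypothetical times between machine-tool breakdowns, in operating hours
tbf = np.array([120.0, 340.0, 95.0, 410.0, 220.0, 180.0, 560.0, 75.0, 300.0, 150.0])

# Fit a two-parameter Weibull distribution (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(tbf, floc=0)

# shape < 1 suggests early-life failures, shape ~ 1 random failures,
# shape > 1 wear-out: each points to a different maintenance scheme.
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.1f} h")
```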

Relevance: 90.00%

Abstract:

Digital image processing is exploited in many diverse applications, but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level, so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves which may be used to scan images and that selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to speed up substantially the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers, and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
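
The thesis's exact fractal-dimension estimator is not given in the abstract; box counting is one common way to compute such a compressibility predictor. A hedged sketch on an illustrative binary image:

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    h, w = img.shape
    for s in sizes:
        # Count boxes of side s containing at least one foreground pixel
        n = sum(img[i:i + s, j:j + s].any()
                for i in range(0, h, s) for j in range(0, w, s))
        counts.append(n)
    # Slope of log(count) against log(1/size) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Illustrative use on a random binary image
rng = np.random.default_rng(0)
img = rng.random((64, 64)) > 0.5
print(box_counting_dimension(img))
```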

Relevance: 90.00%

Abstract:

The advent of personal communication systems within the last decade has depended upon the utilization of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis is concerned with an investigation of the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems, and an analysis of the security provisions of Second-Generation personal communication systems is presented. Particular attention has been paid to the properties of the cryptographic protocols which have been employed in current Second-Generation systems. It has been found that certain security-related protocols implemented in the Second-Generation systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated and their critical parameters are shown to be capable of being optimized to suit a particular application. The implementation of such a system using current smart card technology has been evaluated.
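
For concreteness, a textbook Fiat-Shamir identification round works as follows: the prover commits x = r^2 mod n, receives a challenge bit e, and replies y = r * s^e mod n; the verifier checks y^2 = x * v^e (mod n). A toy sketch with an unrealistically small modulus (real deployments, including the smart-card setting discussed, need a large RSA-size n; this is not the thesis's modified scheme):

```python
import secrets

# Toy modulus: product of two small primes, for illustration only
p, q = 1009, 1013
n = p * q

s = 123456 % n          # prover's secret
v = pow(s, 2, n)        # public key: v = s^2 mod n

def fiat_shamir_round():
    r = secrets.randbelow(n - 1) + 1
    x = pow(r, 2, n)                 # prover's commitment
    e = secrets.randbelow(2)         # verifier's challenge bit
    y = (r * pow(s, e, n)) % n       # prover's response
    # Verifier's check: y^2 == x * v^e (mod n)
    return pow(y, 2, n) == (x * pow(v, e, n)) % n

assert all(fiat_shamir_round() for _ in range(20))
print("20 rounds passed; a cheating prover survives each round with probability 1/2")
```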

Relevance: 90.00%

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the two most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the new approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we show, using nonlinearity management considerations, that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerical modelling of the Fokker-Planck equation using the MMC simulation technique. Moreover, applying the MMC method we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems an analytical linear approximation that fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. Finally, we present a statistical analysis of the RZ-DPSK optical signal at a direct detection receiver with Mach-Zehnder interferometer demodulation.
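
MMC reaches tail probabilities far beyond plain Monte Carlo by iteratively reweighting the sampler so that rare bins of the statistic are visited about as often as common ones, then undoing the bias. A much-simplified sketch of the idea on a toy statistic (Berg-style weight recursion; all parameters assumed, not the thesis's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy decision statistic: q = mean of N squared Gaussians, standing in for a
# receiver statistic whose far tail sets the BER.
N = 16
edges = np.linspace(0.0, 8.0, 81)
log_w = np.zeros(len(edges) - 1)              # multicanonical log-weights

def bin_of(x):
    q = np.mean(x * x)
    return int(np.clip(np.searchsorted(edges, q) - 1, 0, len(log_w) - 1))

x = rng.normal(size=N)
for iteration in range(10):
    hist = np.zeros(len(log_w))
    for _ in range(10000):                    # Metropolis walk in x-space
        x_new = x + 0.5 * rng.normal(size=N)
        # Biased target: exp(-|x|^2/2 - log_w[bin]); proposal is symmetric
        logp_new = -0.5 * (x_new @ x_new) - log_w[bin_of(x_new)]
        logp_old = -0.5 * (x @ x) - log_w[bin_of(x)]
        if np.log(rng.random()) < logp_new - logp_old:
            x = x_new
        hist[bin_of(x)] += 1
    log_w += np.log(np.maximum(hist, 1.0))    # flatten: penalise popular bins

# Undo the bias: P(bin) is proportional to hist * exp(log_w)
p = hist * np.exp(log_w - log_w.max())
print(p / p.sum())
```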

Relevance: 90.00%

Abstract:

We have studied the spatial distribution of plaques in coronal and tangential sections of the parahippocampal gyrus (PHG), the hippocampus, the frontal lobe and the temporal lobe of five SDAT patients. Sections were stained with cresyl violet and examined at two magnifications (x100 and x400). In all cases, and at both magnifications, statistical analysis using the Poisson distribution showed that the plaques were arranged in clumps (x100: V/M = 1.48-4.49; x400: V/M = 1.17-1.95). This indicates that both large-scale and small-scale clumping occurs. Application of the statistical techniques of pattern analysis to coronal sections of frontal and temporal cortex and PHG showed, furthermore, that both large (3200-6400 micron) and small-scale (100-400 micron) clumps were arranged with a high degree of regularity in the tissue. This suggests that the clumps of plaques reflect underlying neural structure.
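
The V/M values quoted above are variance-to-mean ratios (the index of dispersion): near 1 for a random (Poisson) pattern, above 1 for clumping, below 1 for regularity. A minimal sketch with hypothetical quadrat counts; the chi-square dispersion test shown is the standard one, assumed here rather than taken from the paper:

```python
import numpy as np
from scipy import stats

def variance_mean_ratio(counts):
    """V/M ratio for quadrat counts; ~1 under a Poisson (random) pattern."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

# Hypothetical plaque counts in successive 100-micron fields along the cortex
counts = [0, 3, 7, 1, 0, 6, 8, 2, 0, 5]
vm = variance_mean_ratio(counts)

# Dispersion test: (n - 1) * V/M follows chi2(n - 1) under randomness
n = len(counts)
p = stats.chi2.sf((n - 1) * vm, df=n - 1)
print(f"V/M = {vm:.2f}, p = {p:.3f}")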

Relevance: 90.00%

Abstract:

This thesis describes work carried out to improve the fundamental modelling of liquid flows on distillation trays. A mathematical model is presented based on the principles of computational fluid dynamics. It models the liquid flow in the horizontal directions, allowing for the effects of the vapour through the use of an increased liquid turbulence, modelled by an eddy viscosity, and a resistance to liquid flow caused by the vapour being accelerated horizontally by the liquid. The resultant equations are similar to the Navier-Stokes equations with the addition of a resistance term. A mass-transfer model is used to calculate liquid concentration profiles and tray efficiencies. A heat and mass transfer analogy is used to compare theoretical concentration profiles to experimental water-cooling data obtained from a 2.44 metre diameter air-water distillation simulation rig. The ratios of air to water flow rates are varied in order to simulate three pressures: vacuum, atmospheric pressure and moderate pressure. For simulated atmospheric and moderate pressure distillation, the fluid mechanical model consistently over-predicts tray efficiencies, with errors of between +1.7% and +11.3%. This compares to -1.8% to -10.9% for the stagnant regions model (Porter et al. 1972) and +12.8% to +34.7% for the plug flow plus back-mixing model (Gerster et al. 1958). The model fails to predict the flow patterns and tray efficiencies for vacuum simulation due to a change in the mechanism of liquid transport, from a continuous liquid layer to a spray, as the liquid flow-rate is reduced; this spray is not taken into account in the development of the fluid mechanical model. A sensitivity analysis has shown that the fluid mechanical model is relatively insensitive to the prediction of the average height of clear liquid, and that a reduction in the resistance term results in only a slight loss of tray efficiency. The model is, however, quite sensitive to the prediction of the eddy viscosity term: variations can produce up to a 15% decrease in tray efficiency. The fluid mechanical model has been incorporated into a column model so that statistical optimisation techniques can be employed to fit a theoretical column concentration profile to experimental data, from which mass-transfer data can be obtained.
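
The abstract does not write the equations out. A plausible form consistent with the description, for the horizontal velocity field u with an eddy viscosity and a linear vapour-resistance term (the notation, and in particular the resistance coefficient R, are assumptions here, not the thesis's statement), is:

```latex
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu_e \nabla^{2}\mathbf{u} - R\,\mathbf{u}
```

where the extra term -Ru, absent from the standard Navier-Stokes equations, models the momentum the liquid loses in accelerating the vapour horizontally.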

Relevance: 90.00%

Abstract:

The purpose of this study is to increase our knowledge of the surface properties of polymeric materials and improve our understanding of how these factors influence the deposition of proteins to form a reactive biological/synthetic interface. A number of surface analytical techniques were identified as being of potential benefit to this investigation and included in a multidisciplinary research program. Cell adhesion in culture was the primary biological sensor of surface properties, and it showed that the cell response to different materials can be modified by adhesion-promoting protein layers: cell adhesion is a protein-mediated event. A range of surface rugosity can be produced on polystyrene, and the results presented here show that surface rugosity does not play a major role in determining a material's cell adhesiveness. Contact angle measurements showed that surface energy (specifically the polar fraction) is important in promoting cell spreading on surfaces. The immunogold labelling technique indicated small but noticeable differences between the distribution of proteins on a range of surfaces. This study has shown that surface analysis techniques differ in their detection limits and in the depth probed, and both are important in determining the usefulness of the information obtained. The techniques provide information on differing aspects of the biological/synthetic interface, and consequently a range of techniques is needed in any full study of so complex a field as biomaterials.
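
The polar fraction of surface energy mentioned above is conventionally obtained from contact angles of two probe liquids via the Owens-Wendt relation, gamma_L(1 + cos theta) = 2(sqrt(gd_S gd_L) + sqrt(gp_S gp_L)). A sketch under that assumption; the study's actual liquids and angles are not given, so the numbers below are purely illustrative:

```python
import numpy as np

# Probe-liquid surface tension components (mN/m): literature-style values
liquids = {
    "water":         dict(total=72.8, disp=21.8, polar=51.0),
    "diiodomethane": dict(total=50.8, disp=50.8, polar=0.0),
}
# Hypothetical measured contact angles on a polymer surface (degrees)
angles = {"water": 85.0, "diiodomethane": 40.0}

# Owens-Wendt is linear in sqrt(gd_S) and sqrt(gp_S): solve a 2x2 system
A, b = [], []
for name, lq in liquids.items():
    theta = np.radians(angles[name])
    A.append([2 * np.sqrt(lq["disp"]), 2 * np.sqrt(lq["polar"])])
    b.append(lq["total"] * (1 + np.cos(theta)))
root_d, root_p = np.linalg.solve(np.array(A), np.array(b))
gd, gp = root_d**2, root_p**2
print(f"dispersive = {gd:.1f}, polar = {gp:.1f}, polar fraction = {gp / (gd + gp):.2f}")
```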

Relevance: 90.00%

Abstract:

In this thesis we use statistical physics techniques to study the typical performance of four families of error-correcting codes based on very sparse linear transformations: Sourlas codes, Gallager codes, MacKay-Neal codes and Kanter-Saad codes. We map the decoding problem onto an Ising spin system with many-spin interactions. We then employ the replica method to calculate averages over the quenched disorder represented by the code constructions, the arbitrary messages and the random noise vectors. We find, as the noise level increases, a phase transition between successful decoding and failure phases. This phase transition coincides with upper bounds derived in the information theory literature in most cases. We connect the practical decoding algorithm known as probability propagation with the task of finding local minima of the related Bethe free energy. We show that the practical decoding thresholds correspond to noise levels where suboptimal minima of the free energy emerge. Simulations of practical decoding scenarios using probability propagation agree with theoretical predictions of the replica symmetric theory. The typical performance predicted by the thermodynamic phase transitions is shown to be attainable only in computation times that grow exponentially with the system size. We use the insights obtained to design a method to calculate the performance and optimise parameters of the high performance codes proposed by Kanter and Saad.
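
Probability propagation (also called belief propagation, or sum-product decoding) passes log-likelihood messages between the variable and check nodes of the sparse code graph. A toy sketch on a hand-made parity-check matrix over a binary symmetric channel; the thesis works with large random constructions, so everything below is illustrative:

```python
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],      # toy sparse parity-check matrix
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
p_flip = 0.1                            # assumed channel flip probability

def decode(received, iters=20):
    m, n = H.shape
    # Channel log-likelihood ratios (positive favours bit 0)
    llr_ch = np.where(received == 0, 1, -1) * np.log((1 - p_flip) / p_flip)
    msg_v2c = np.tile(llr_ch, (m, 1)) * H          # variable-to-check messages
    msg_c2v = np.zeros((m, n))
    for _ in range(iters):
        # Check-to-variable: tanh rule over the other neighbours
        for i in range(m):
            idx = np.flatnonzero(H[i])
            t = np.tanh(msg_v2c[i, idx] / 2)
            for k, j in enumerate(idx):
                prod = np.prod(np.delete(t, k))
                msg_c2v[i, j] = 2 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # Variable-to-check: channel LLR plus the other checks' messages
        for j in range(n):
            idx = np.flatnonzero(H[:, j])
            total = llr_ch[j] + msg_c2v[idx, j].sum()
            for i in idx:
                msg_v2c[i, j] = total - msg_c2v[i, j]
    posterior = llr_ch + (msg_c2v * H).sum(axis=0)
    return (posterior < 0).astype(int)

# One flipped bit of the all-zero codeword is corrected
print(decode(np.array([0, 0, 1, 0, 0, 0])))
```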

Relevance: 90.00%

Abstract:

The central argument of this thesis is that the nature and purpose of corporate reporting has changed over time, to become a more outward-looking and forward-looking document designed to promote the company and its performance to a wide range of shareholders, rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver of this change, but that this discourse has been set up as conflicting with the discourse of traditional accounting and performance measurement. The effect of this opposition between the discourses is that the two have been interpreted as different and incompatible dimensions of performance, with good performance along one dimension achievable only through a sacrifice of performance along the other. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration and analysis is based upon an investigation of the inherent inconsistencies in such corporate reports, and makes use of both a statistical analysis and a semiotic analysis of corporate reports and the reported performance of companies along these dimensions. Thus the development of a semiology of corporate reporting is one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic, and the implications for corporate reporting.

Relevance: 90.00%

Abstract:

Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the historicised case study and the statistical analysis of large populations, both of which examine the relationship between environment and organisation strategy and/or structure while ignoring the product-process relationship. This study combines the historicised case study with the multi-variable, ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure, and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period, to provide a sector perspective on organisational adaptation. The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation, and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.

Relevance: 90.00%

Abstract:

The principal theme of this thesis is the in vivo examination of ocular morphological changes during phakic accommodation, with particular attention paid to the ciliary muscle and crystalline lens. The investigations detailed involved the application of high-resolution imaging techniques to facilitate the acquisition of new data to assist in the clarification of aspects of the accommodative system that were poorly understood. A clinical evaluation of the newly available Grand Seiko Auto Ref/Keratometer WAM-5500 optometer was undertaken to assess its value in the field of accommodation research. The device was found to be accurate and repeatable compared to subjective refraction, and has the added advantage of allowing dynamic data collection at a frequency of around 5 Hz. All of the subsequent investigations applied the WAM-5500 for determination of refractive error and objective accommodative responses. Anterior segment optical coherence tomography (AS-OCT) based studies examined the morphology and contractile response of youthful and ageing ciliary muscle. Nasal versus temporal asymmetry was identified, with the temporal aspect being both thicker and demonstrating a greater contractile response. In myopes, the ciliary muscle was longer in terms of both its anterior (r = 0.49, P < 0.001) and overall (r = 0.45, P = 0.02) length characteristics. The myopic ciliary muscle does not appear to be merely stretched during axial elongation, as no significant relationship between thickness and refractive error was identified. The main contractile responses observed were a thickening of the anterior region and a shortening of the muscle, particularly anteriorly. Similar patterns of response were observed in subjects aged up to 70 years, supporting a lensocentric theory of presbyopia development. Following the discovery of nasal/temporal asymmetry in ciliary muscle morphology and response, an investigation was conducted to explore whether the regional variations in muscle contractility impacted on lens stability during accommodation. A bespoke programme was developed to analyse AS-OCT images and determine whether lens tilt and decentration varied between the relaxed and accommodated states. No significant accommodative difference in these parameters was identified, implying that any changes in lens stability with accommodation are very slight, possibly as a consequence of vitreous support. Novel three-dimensional magnetic resonance imaging (MRI) and analysis techniques were used to investigate changes in lens morphology and ocular conformation during accommodation. An accommodative reduction in lens equatorial diameter provides further evidence to support the Helmholtzian mechanism of accommodation, whilst the observed increase in lens volume challenges the widespread assertion that this structure is incompressible due to its high water content. Whole-eye MRI indicated that the volume of the vitreous chamber remains constant during accommodation. No significant changes in ocular conformation were detected using MRI. The investigations detailed provide further insight into the mechanisms of accommodation and presbyopia, and represent a platform for future work in this field.

Relevance: 90.00%

Abstract:

Decomposition of domestic wastes in an anaerobic environment results in the production of landfill gas. Public concern about landfill disposal, and particularly the production of landfill gas, has heightened over the past decade. This has been due in large part to the increased quantities of gas being generated as a result of modern disposal techniques, and also to their increasing effect on modern urban developments. In order to avert disasters, effective means of preventing gas migration are required. This, in turn, requires accurate detection and monitoring of gas in the subsurface. Point sampling techniques have many drawbacks, and accurate measurement of gas is difficult. Some of these disadvantages could be overcome by assessing the impact of gas on biological systems. This research explores the effects of landfill gas on plants, and hence on the spectral response of vegetation canopies. The landfill gas/vegetation relationship is examined both by review of the literature and by statistical analysis of field data. The work showed that, although vegetation health was related to landfill gas, it was not possible to define a simple correlation. In the landfill environment, contributions from other variables, such as soil characteristics, frequently confused the relationship. Two sites are investigated in detail, contrasting in terms of the data available, site conditions, and the degree of damage to vegetation. Gas migration at the Panshanger site was dominantly upwards, affecting crops grown on the landfill cap; the injury was expressed as an overall decline in plant health. Discriminant analysis was used to account for the variations in plant health, and hence the differences in spectral response of the crop canopy, using a combination of soil and gas variables. Damage to both woodland and crops at the Ware site was severe, and could easily be related to the presence of gas. Air photographs, aerial video, and airborne thematic mapper data were used to identify damage to vegetation and relate it to soil type. The utility of different sensors for this type of application is assessed, and possible improvements that could lead to more widespread use are identified. The situations in which remote sensing data could be combined with ground survey are identified, and a possible methodology for integrating the two approaches is suggested.
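
Discriminant analysis of the kind described separates classes (here, healthy versus gas-damaged vegetation) by a linear combination of predictor variables. A minimal modern equivalent using scikit-learn, with hypothetical soil/gas measurements standing in for the study's field data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical per-plot measurements: soil O2 (%), soil CH4 (%), moisture (%)
X = np.array([[19.5, 0.1, 22.0], [20.1, 0.0, 25.0], [12.3, 8.5, 21.0],
              [8.9, 15.2, 24.0], [18.7, 0.4, 30.0], [10.5, 11.0, 23.0]])
y = np.array([0, 0, 1, 1, 0, 1])    # 0 = healthy vegetation, 1 = damaged

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.coef_)                     # which variables separate the classes
print(lda.predict([[15.0, 5.0, 24.0]]))   # classify a new plot
```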

Relevance: 90.00%

Abstract:

Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree as to the correct approach, leading to conflict between authors, editors, and reviewers. The objective of this article is to provide some statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that could be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures that they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments relating to data analysis, concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tail testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
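
Two of the procedures the guidelines discuss are easy to show concretely: the Bonferroni correction tests each of k comparisons at alpha/k, and the percentile bootstrap resamples the data to obtain a confidence interval without distributional assumptions. A short sketch with illustrative numbers (not drawn from the article):

```python
import numpy as np

rng = np.random.default_rng(42)

# Bonferroni correction: with k comparisons, test each at alpha / k
alpha, k = 0.05, 6
p_values = [0.004, 0.03, 0.20, 0.008, 0.01, 0.049]     # illustrative
print([p < alpha / k for p in p_values])               # only p < 0.0083 survive

# Percentile bootstrap: 95% confidence interval for a sample mean
data = rng.normal(loc=1.2, scale=0.8, size=30)         # illustrative sample
boot_means = [rng.choice(data, size=len(data), replace=True).mean()
              for _ in range(10000)]
print(np.percentile(boot_means, [2.5, 97.5]))
```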

Relevance: 90.00%

Abstract:

An apparatus was developed to project spinning golf balls directly onto golf greens. This employed a modified baseball practice machine with two counter-rotating pneumatic wheels; the speed of the wheels could be varied independently, allowing backspin to be given to the ball. The ball was projected into a darkened enclosure where the motion of the ball before and after impacting the turf was recorded using a still camera and a stroboscope, the resulting photographs containing successive images of the ball on a single frame of film. The apparatus was tested on eighteen golf courses, resulting in 721 photographs of impacts. Statistical analysis of the photographs distinguished two types of green. On the first, the ball tended to rebound with topspin, while on the second, the ball retained backspin after impact if the initial backspin was greater than about 350 rad/s. Eleven tests were devised to determine the characteristics of greens, and statistical techniques were used to analyse the relationships between these tests. These showed the effects of the green characteristics on ball/turf impacts. It was found that the ball retained backspin on greens that were freely drained and had less than 60% of Poa annua (annual meadow grass) in their swards. Visco-elastic models were used to simulate the impact of the ball with the turf: the ball was considered rigid and the turf a two-layered system of springs and dampers. The model showed good agreement with experiment and was used to simulate impacts from two different shots onto two contrasting types of green.
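
To show the spring-damper modelling approach concretely, here is a one-dimensional sketch of a rigid ball striking a single visco-elastic (Kelvin-Voigt) turf layer; the thesis's model is two-layered and includes spin, and all parameter values below are assumptions:

```python
# Rigid ball on a spring-damper turf layer, integrated with explicit Euler
m = 0.0459          # golf ball mass (kg)
k = 1.0e5           # turf stiffness (N/m), assumed
c = 60.0            # turf damping (N s/m), assumed
g = 9.81
dt = 1e-6

z, v = 0.0, -20.0   # ball at the surface, arriving at 20 m/s downward
while True:
    # Contact force from the compressed, damped turf (only while z < 0)
    f = -k * z - c * v if z < 0 else 0.0
    v += (f / m - g) * dt
    z += v * dt
    if z >= 0 and v > 0:      # ball has left the turf moving upward
        break
print(f"rebound speed = {v:.1f} m/s, coefficient of restitution = {v / 20.0:.2f}")
```

Adding a second spring-damper layer in series, and tangential forces for spin, extends this scheme toward the two-layer model described.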

Relevance: 90.00%

Abstract:

The aim of this paper is to illustrate the measurement of productive efficiency using the Nerlovian indicator and a metafrontier with data envelopment analysis techniques. Further, we illustrate how the profit efficiency of firms operating in different regions can be aggregated into one overarching frontier. Sugarcane production in three regions of Kenya is used to illustrate these concepts. Results show that the sources of inefficiency in all regions are both technical and allocative, but allocative efficiency contributes more to the overall Nerlovian (in)efficiency indicator. © 2011 Springer-Verlag.
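
The technical component of such an analysis is typically computed with a data envelopment analysis linear programme. A minimal input-oriented CCR sketch (envelopment form) with illustrative farm data, not the paper's dataset or its Nerlovian profit decomposition:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative DMUs (farms): inputs X (e.g. land ha, fertiliser kg),
# outputs Y (e.g. cane tonnage)
X = np.array([[4.0, 140.0], [6.0, 120.0], [5.0, 200.0], [8.0, 160.0]])
Y = np.array([[300.0], [280.0], [410.0], [330.0]])

def ccr_efficiency(o):
    """Minimise theta s.t. sum_j lam_j X_j <= theta X_o, sum_j lam_j Y_j >= Y_o."""
    n, n_in = X.shape
    n_out = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]        # input constraints
    A_out = np.c_[np.zeros((n_out, 1)), -Y.T]      # output constraints (as <=)
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(n_in), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(len(X)):
    print(f"DMU {o}: technical efficiency = {ccr_efficiency(o):.3f}")
```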