25 results for Meyer–König and Zeller Operators
in Aston University Research Archive
Abstract:
A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas including representation, genetic operators, their parameter rates and real-world multi-dimensional applications. A series of experiments was conducted, comparing the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real-world problems, demonstrating its ability to handle mixed-type chromosomes within an individual, the limited use of a chromosome-level fitness function, the introduction of new genetic operators for structural self-adaptation and its viability as a serious real-world analysis tool. The first problem involved optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different type representations and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis to locate the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
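The two structural operators named above lend themselves to a compact illustration. Below is a minimal Python sketch of addition and deletion acting on a multi-chromosome individual; the names, probabilities and retail-site encoding are illustrative assumptions, not details taken from the thesis.

```python
import random

def addition(individual, chromosome_factory, p_add=0.05):
    """Structural operator: with probability p_add, append a freshly
    generated chromosome, growing the individual's genetic material."""
    if random.random() < p_add:
        individual.append(chromosome_factory())
    return individual

def deletion(individual, p_del=0.05, min_chromosomes=1):
    """Structural operator: with probability p_del, remove a randomly
    chosen chromosome, provided a minimum number remain."""
    if random.random() < p_del and len(individual) > min_chromosomes:
        individual.pop(random.randrange(len(individual)))
    return individual

# An individual is a list of chromosomes, each free to use its own
# representation; here, real-valued chromosomes encoding candidate
# retail-site coordinates on a hypothetical 100 x 100 population grid.
new_site = lambda: [random.uniform(0, 100), random.uniform(0, 100)]
individual = [new_site(), new_site()]
individual = addition(individual, new_site)
individual = deletion(individual)
```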
Abstract:
This study investigates how stakeholder power and an organization's pursuit of legitimacy influence its reaction to conflict with a supplier. We conducted an empirical study among travel agents and tour operators to test the relationship between conflict and stakeholder power and legitimacy derived from three different stakeholders. Our findings imply that power has a dual role. Whereas supplier power reduces buyer–supplier conflict, stakeholder power increases it. Moreover, this study shows that the quest to achieve greater legitimacy from the firm's competitive arena increases conflict. This study is one of the few that test stakeholder theory empirically. We demonstrate that stakeholder theory provides additional explanations beyond the dyadic approach hitherto taken toward understanding conflict. This study also shows that power can simultaneously reduce and increase conflict depending on which party possesses it: greater supplier power decreases conflict, while greater stakeholder power and stakeholder-derived legitimacy increase it. Therefore, organizations have to balance their stakeholder and supplier interests.
Abstract:
Ernst Mach observed that light or dark bands could be seen at abrupt changes of luminance gradient in the absence of peaks or troughs in luminance. Many models of feature detection share the idea that bars, lines, and Mach bands are found at peaks and troughs in the output of even-symmetric spatial filters. Our experiments assessed the appearance of Mach bands (position and width) and the probability of seeing them on a novel set of generalized Gaussian edges. Mach band probability was mainly determined by the shape of the luminance profile and increased with the sharpness of its corners, controlled by a single parameter (n). Doubling or halving the size of the images had no significant effect. Variations in contrast (20%-80%) and duration (50-300 ms) had relatively minor effects. These results rule out the idea that Mach bands depend simply on the amplitude of the second derivative, but a multiscale model, based on Gaussian-smoothed first- and second-derivative filtering, can account accurately for the probability and perceived spatial layout of the bands. A key idea is that Mach band visibility depends on the ratio of second- to first-derivative responses at peaks in the second-derivative scale-space map. This ratio is approximately scale-invariant and increases with the sharpness of the corners of the luminance ramp, as observed. The edges of Mach bands pose a surprisingly difficult challenge for models of edge detection, but a nonlinear third-derivative operation is shown to predict the locations of Mach band edges strikingly well. Mach bands thus shed new light on the role of multiscale filtering systems in feature coding. © 2012 ARVO.
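The ratio idea at the heart of the model can be illustrated with a single-scale toy computation (the model itself is multiscale, so this is a simplification with parameter values of our own choosing, not the authors' code). The sketch filters a generalized Gaussian edge with Gaussian first- and second-derivative operators and reports the second- to first-derivative response ratio at the second-derivative peak:

```python
import numpy as np

def gauss_deriv_response(signal, sigma_px, order):
    """Convolve a 1-D signal with a Gaussian derivative of given order
    (sigma expressed in pixels)."""
    u = np.arange(-int(4 * sigma_px), int(4 * sigma_px) + 1, dtype=float)
    g = np.exp(-u**2 / (2 * sigma_px**2))
    if order == 1:
        kernel = -u / sigma_px**2 * g
    else:
        kernel = (u**2 - sigma_px**2) / sigma_px**4 * g
    return np.convolve(signal, kernel, mode='same')

# Generalized Gaussian edge: the luminance gradient is exp(-|x|^n), so the
# corners of the ramp sharpen as n grows (n = 2 gives a Gaussian edge).
x = np.linspace(-5, 5, 1001)
n = 6.0
edge = np.cumsum(np.exp(-np.abs(x) ** n))
edge /= edge[-1]

sigma_px = 30.0
d1 = gauss_deriv_response(edge, sigma_px, 1)
d2 = gauss_deriv_response(edge, sigma_px, 2)

# The proposed visibility cue: the second- to first-derivative response
# ratio at the peak of the second-derivative map; it grows with n.
# Search the interior only, to avoid convolution boundary artifacts.
interior = slice(len(x) // 4, 3 * len(x) // 4)
peak = interior.start + int(np.argmax(np.abs(d2[interior])))
print(abs(d2[peak]) / abs(d1[peak]))
```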
Abstract:
Offshore oil and gas pipelines are vulnerable to the environment, as any leak or burst causes an oil/gas spill with huge negative impacts on marine life. Breakdown maintenance of these pipelines is also cost-intensive and time-consuming, resulting in huge tangible and intangible losses to the pipeline operators. Pipeline health monitoring and integrity analysis have been researched extensively in support of successful pipeline operations, and the risk-based maintenance model is one outcome of that research. This study develops a risk-based maintenance model using a combined multiple-criteria decision-making and weighting method for offshore oil and gas pipelines in Thailand, with the active participation of experienced executives. The model's effectiveness has been demonstrated through real-life application to oil and gas pipelines in the Gulf of Thailand. Practical implications: a risk-based inspection and maintenance methodology is particularly important for oil pipeline systems, as any failure will not only affect productivity negatively but also have a tremendous negative environmental impact. The proposed model helps pipeline operators to analyze the health of pipelines dynamically and to select a specific inspection and maintenance method for each section in line with its probability and severity of failure.
Abstract:
Motion discontinuities can signal object boundaries where few or no other cues, such as luminance, colour, or texture, are available. Hence, motion-defined contours are an ecologically important counterpart to luminance contours. We developed a novel motion-defined Gabor stimulus to investigate the nature of neural operators analysing visual motion fields, in order to draw parallels with known luminance operators. Luminance-defined Gabors have been successfully used to discern the spatial extent and spatial-frequency specificity of possible visual contour detectors. We now extend these studies into the motion domain. We define a stimulus using limited-lifetime moving dots whose velocity is described over 2-D space by a Gabor pattern surrounded by randomly moving dots. Participants were asked to determine whether the orientation of the Gabor pattern (and hence of the motion contours) was vertical or horizontal in a 2AFC task, and the proportion of correct responses was recorded. We found that with practice participants became highly proficient at this task, able in certain cases to reach 90% accuracy with only 12 limited-lifetime dots. However, for both practised and novice participants we found that the ability to detect a single boundary saturates with the size of the Gaussian envelope of the Gabor at approximately 5 deg full-width at half-height. At this optimal size we then varied spatial frequency and found that performance was best at the lowest measured spatial frequency (0.1 cycles deg⁻¹) and steadily declined at higher spatial frequencies, suggesting that motion contour detectors may be specifically tuned to a single, isolated edge.
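A rough sketch of how such a stimulus could be constructed (an illustration only; parameter values, such as an envelope SD giving roughly 5 deg full-width at half-height, are assumptions rather than the authors' specification):

```python
import numpy as np

rng = np.random.default_rng(0)

def gabor_velocity(x, y, sf=0.1, sigma=2.1, theta=0.0):
    """Signed speed for a dot at (x, y) deg: a Gabor (Gaussian-windowed
    grating) modulates velocity over 2-D space. sf is in cycles/deg;
    sigma = 2.1 deg gives a full-width at half-height of about 5 deg;
    theta orients the motion-defined contours."""
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * sf * xr)
    return envelope * carrier

# Limited-lifetime dots scattered over the stimulus aperture; dots far
# outside the envelope receive near-zero modulation (random motion there
# would be assigned separately).
n_dots = 12
x, y = rng.uniform(-5, 5, n_dots), rng.uniform(-5, 5, n_dots)
speeds = gabor_velocity(x, y)
```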
Abstract:
Marr's work offered guidelines on how to investigate vision (the theory–algorithm–implementation distinction), as well as specific proposals on how vision is done. Many of the latter have inevitably been superseded, but the approach was inspirational and remains so. Marr saw the computational study of vision as tightly linked to psychophysics and neurophysiology, but the last twenty years have seen some weakening of that integration. Because feature detection is a key stage in early human vision, we have returned to basic questions about representation of edges at coarse and fine scales. We describe an explicit model in the spirit of the primal sketch, but tightly constrained by psychophysical data. Results from two tasks (location-marking and blur-matching) point strongly to the central role played by second-derivative operators, as proposed by Marr and Hildreth. Edge location and blur are evaluated by finding the location and scale of the Gaussian-derivative 'template' that best matches the second-derivative profile ('signature') of the edge. The system is scale-invariant, and accurately predicts blur-matching data for a wide variety of 1-D and 2-D images. By finding the best-fitting scale, it implements a form of local scale selection and circumvents the knotty problem of integrating filter outputs across scales. [Supported by BBSRC and the Wellcome Trust]
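The template-matching step can be sketched compactly. The toy version below (a simplification under our own assumptions, not the authors' full model) matches Gaussian-derivative templates to the second-derivative signature of a blurred step edge and selects the best-fitting scale, which recovers the edge blur:

```python
import numpy as np

def edge_signature(x, blur):
    """Second-derivative profile ('signature') of a Gaussian-blurred step
    edge: proportional to the first derivative of a Gaussian of SD = blur."""
    return -x * np.exp(-x**2 / (2 * blur**2))

def template(x, scale):
    """Gaussian-derivative 'template' at a candidate scale."""
    return -x * np.exp(-x**2 / (2 * scale**2))

def best_scale(signature, x, scales):
    """Scale selection: pick the template whose shape best matches the
    signature after optimal amplitude scaling (least squares)."""
    errs = []
    for s in scales:
        t = template(x, s)
        a = (signature @ t) / (t @ t)   # best-fitting amplitude
        errs.append(np.sum((signature - a * t) ** 2))
    return scales[int(np.argmin(errs))]

x = np.linspace(-10, 10, 2001)
scales = np.linspace(0.2, 5.0, 100)
# The winning scale estimates the perceived blur of the edge (~1.5 here).
print(best_scale(edge_signature(x, blur=1.5), x, scales))
```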
Abstract:
The kinematic mapping of a rigid open-link manipulator is a homomorphism between Lie groups. The homomorphism has solution groups that act on an inverse kinematic solution element. A canonical representation of solution group operators that act on a solution element of three and seven degree-of-freedom (dof) dextrous manipulators is determined by geometric analysis. Seven canonical solution groups are determined for the seven-dof Robotics Research K-1207 and Hollerbach arms. The solution element of a dextrous manipulator is a collection of trivial fibre bundles with solution fibres homotopic to the torus. If fibre solutions are parameterised by a scalar, a direct inverse function that maps the scalar and Cartesian base space coordinates to solution element fibre coordinates may be defined. A direct inverse parameterisation of a solution element may be approximated by a local linear map generated by an inverse augmented Jacobian correction of a linear interpolation. The action of canonical solution group operators on a local linear approximation of the solution element of inverse kinematics of dextrous manipulators generates cyclical solutions. The solution representation is proposed as a model of inverse kinematic transformations in primate nervous systems. Simultaneous calibration of a composition of stereo-camera and manipulator kinematic models is under-determined by equi-output parameter groups in the composition of stereo-camera and Denavit–Hartenberg (DH) models. An error measure for simultaneous calibration of a composition of models is derived, and parameter subsets with no equi-output groups are determined by numerical experiments to simultaneously calibrate the composition of homogeneous or pan-tilt stereo-camera with DH models. For acceleration of exact Newton second-order re-calibration of DH parameters after a sequential calibration of stereo-camera and DH parameters, an optimal numerical evaluation of DH matrix first-order and second-order error derivatives with respect to a re-calibration error function is derived, implemented and tested. A distributed object environment for point-and-click image-based tele-command of manipulators and stereo-cameras is specified and implemented that supports rapid prototyping of numerical experiments in distributed system control. The environment is validated by a hierarchical k-fold cross-validated calibration to Cartesian space of a radial basis function regression correction of an affine stereo model. Basic design and performance requirements are defined for scalable virtual micro-kernels that broker inter-Java-virtual-machine remote method invocations between components of secure, manageable, fault-tolerant, open, distributed, agile, Total Quality Managed, ISO 9000+ conformant, Just-in-Time manufacturing systems.
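For readers unfamiliar with the DH convention mentioned above, a minimal sketch of the standard Denavit–Hartenberg transform and its composition into a forward kinematic map (parameter values are placeholders, not those of the K-1207 or Hollerbach arms):

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg (DH) homogeneous transform linking two
    successive joint frames of a serial manipulator."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """Compose the per-joint transforms into the base-to-end-effector map."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_params:
        T = T @ dh_matrix(theta, d, a, alpha)
    return T

# Illustrative seven-dof arm pose; the (theta, d, a, alpha) values here are
# arbitrary placeholders.
pose = forward_kinematics([(0.1 * i, 0.0, 0.3, np.pi / 2) for i in range(7)])
```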
Abstract:
BACKGROUND: Some studies report that increased tear osmolarity is a reliable indicator of dry eye syndrome (DES). The OcuSense TearLab™ osmometer requires less than a 100-nl sample of tears and provides an instant quantitative result. Our aim was to clinically evaluate this instrument in terms of its reproducibility and repeatability. METHODS: Twenty-nine participants who ranged in age from 19 to 49 years (mean ± SD: 23.3 ± 5.5 years) were recruited. Osmolarity readings were collected by two operators, in two sessions separated by 1 or 2 weeks, in order to assess test reproducibility and repeatability. RESULTS: The coefficient of reproducibility was 39 mOsm/l; the coefficient of repeatability was 33 mOsm/l. CONCLUSIONS: Our mean coefficient of variation over four readings for 29 subjects is 2.9%, which compares well with that reported by the manufacturer. Our results inform practitioners about the level of change over time that can be considered clinically relevant for healthy subjects. This value is 33 mOsm/l; any change smaller than this could be attributed to measurement noise.
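For context, such coefficients are commonly obtained with a Bland–Altman style calculation, sketched below for illustration; we are not asserting this is the exact procedure used in the study, and the readings shown are invented.

```python
import numpy as np

def coefficient_of_repeatability(first, second):
    """Bland-Altman coefficient of repeatability: 1.96 x SD of the paired
    differences between two readings on the same subjects. Changes smaller
    than this are indistinguishable from measurement noise."""
    diffs = np.asarray(first, float) - np.asarray(second, float)
    return 1.96 * np.std(diffs, ddof=1)

# Illustrative tear osmolarity readings (mOsm/l) for a handful of subjects.
session1 = [302, 298, 310, 295, 305]
session2 = [300, 305, 298, 301, 296]
print(f"{coefficient_of_repeatability(session1, session2):.0f} mOsm/l")
```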
Abstract:
The scaling problems which afflict attempts to optimise neural networks (NNs) with genetic algorithms (GAs) are disclosed. A novel GA-NN hybrid is introduced, based on the bumptree, a little-used connectionist model. As well as being computationally efficient, the bumptree is shown to be more amenable to genetic coding than other NN models. A hierarchical genetic coding scheme is developed for the bumptree and shown to have low redundancy, as well as being complete and closed with respect to the search space. When applied to optimising bumptree architectures for classification problems the GA discovers bumptrees which significantly out-perform those constructed using a standard algorithm. The fields of artificial life, control and robotics are identified as likely application areas for the evolutionary optimisation of NNs. An artificial life case-study is presented and discussed. Experiments are reported which show that the GA-bumptree is able to learn simulated pole balancing and car parking tasks using only limited environmental feedback. A simple modification of the fitness function allows the GA-bumptree to learn mappings which are multi-modal, such as robot arm inverse kinematics. The dynamics of the 'geographic speciation' selection model used by the GA-bumptree are investigated empirically and the convergence profile is introduced as an analytical tool. The relationships between the rate of genetic convergence and the phenomena of speciation, genetic drift and punctuated equilibrium are discussed. The importance of genetic linkage to GA design is discussed and two new recombination operators are introduced. The first, linkage mapped crossover (LMX), is shown to be a generalisation of existing crossover operators. LMX provides a new framework for incorporating prior knowledge into GAs. Its adaptive form, ALMX, is shown to be able to infer linkage relationships automatically during genetic search.
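LMX is defined in the thesis itself; as one rough reading of what linkage-guided recombination can look like, the sketch below visits genes in linkage-map order so that a single cut keeps tightly linked genes together regardless of their physical position. All details here are illustrative assumptions, and the thesis's actual LMX definition may differ.

```python
import random

def linkage_mapped_crossover(parent_a, parent_b, linkage_map):
    """Hypothetical linkage-guided recombination: genes are visited in the
    order given by the linkage map (a permutation of gene indices), and a
    single cut point splits that ordering between the two parents."""
    cut = random.randrange(1, len(linkage_map))
    child = list(parent_a)
    for idx in linkage_map[cut:]:   # genes after the cut come from parent B
        child[idx] = parent_b[idx]
    return child

# With the identity map this reduces to ordinary one-point crossover,
# consistent with LMX being described as a generalisation of existing
# crossover operators.
child = linkage_mapped_crossover([0] * 8, [1] * 8, list(range(8)))
```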
Abstract:
This thesis deals with the integration of the manpower criterion with the strategic decision making processes of technological projects in developing countries. This integration is to be achieved by ensuring the involvement of the actors, who have relevant roles and responsibilities along the whole life cycle of the project, in the strategic decision making phases of the project. The relevance of the actors is ascertained by the use of a responsibility index which relates their responsibility to the project's constituent stages. In the context of a technological project in a typical centrally-planned developing environment, the actors are identified as Arbiters, Planners, Implementors and Operators, and their roles, concerns and objectives are derived. In this context, the actors are usually government and non-government organisations. Hence, decision making will involve multiple agencies as well as multiple criteria. A methodology covering the whole decision-making process, from options generation to options selection, and adopting Saaty's Analytical Hierarchy Process as an operational tool, is proposed to deal with such multiple-criteria, multiple-agency decision situations. The methodology is intended to integrate the consideration of the relevant criteria, the prevailing environmental and policy factors, and the concerns and objectives of the relevant actors into a unifying decision-making process which strives to facilitate enlightened decision making and to enhance learning and interaction. An extensive assessment of the methodology's feasibility, based on a specific technological project within the Iraqi oil industry, is included, and indicates that the methodology should be both useful and implementable.
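Saaty's AHP, the operational tool adopted here, reduces each set of pairwise judgements to a priority vector via the principal eigenvector, with a consistency ratio to flag incoherent judgements. A minimal sketch (the comparison matrix below is purely illustrative, not data from the thesis):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Saaty's AHP: the priority vector is the principal eigenvector of a
    reciprocal pairwise-comparison matrix; CR > 0.1 is conventionally
    taken to indicate inconsistent judgements."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
    return w, ci / ri

# Three actors' pairwise importance judgements for one criterion.
weights, cr = ahp_priorities([[1,   3,   5],
                              [1/3, 1,   2],
                              [1/5, 1/2, 1]])
```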
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would thus fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially-developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
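The 'classic' metrics named above follow simple closed forms, which is what makes redefining their counts for a language like Prolog feasible. A minimal sketch of the formulas (how the operator/operand and graph counts are extracted from Prolog source is the substance of the thesis and is not reproduced here):

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead metrics from distinct operators/operands (n1, n2) and their
    total occurrences (N1, N2) - the counts that must be redefined for a
    declarative language such as Prolog (e.g. treating functors and
    built-ins as operators, constants and variables as operands)."""
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    effort = difficulty * volume
    return volume, difficulty, effort

def cyclomatic(edges, nodes, components=1):
    """McCabe's cyclomatic complexity V(G) = E - N + 2P for a control-flow
    (or, for Prolog, a suitably defined clause/goal) graph."""
    return edges - nodes + 2 * components

print(halstead(n1=12, n2=20, N1=80, N2=60), cyclomatic(edges=14, nodes=11))
```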
Abstract:
This thesis presents the design, fabrication and testing of novel grating-based Optical Fibre Sensor (OFS) systems interrogated using “off the shelf” interrogation systems, with the eventual development of marketable commercial systems at the forefront of the research. In both the industrial weighing and aerospace industries, there has been a drive to investigate the feasibility of deploying optical fibre sensors where their electrical or mechanical counterparts would traditionally have been used. Already, in the industrial weighing industry, commercial operators are deploying OFS-based Weigh-In-Motion (WIM) systems. Likewise, in the aerospace industry, OFS have been deployed to monitor such parameters as load history, impact detection, structural damage, overload detection, centre of gravity and the determination of blade shape. Based on the intrinsic properties of fibre Bragg gratings (FBGs) and Long Period Fibre Gratings (LPFGs), a number of novel OFS-based systems have been realised. Experimental work has shown that in the case of static industrial weighing, FBGs can be integrated with current commercial products and used to detect applied loads. The work has also shown that embedding FBGs in e-glass, to form a sensing patch, allows such patches to be bonded to rail track, forming the basis of an FBG-based WIM system. The results obtained have been sufficiently encouraging to the industrial partner that this work will be progressed beyond the scope of the work presented in this thesis. Likewise, and to the best of the author’s knowledge, a novel Bragg grating based system for aircraft fuel parameter sensing has been presented. FBG-based pressure sensors have been shown to demonstrate good sensitivity, linearity and repeatability, whilst LPFG-based systems have demonstrated far greater sensitivity than FBGs, as well as the advantage of being potentially able to detect causes of fuel adulteration through their sensitivity to refractive index (RI). In the case of the LPFG-based system, considerable work remains to be done on mechanical strengthening to improve its survivability in a live aircraft fuel tank environment. The FBG system has already been developed to an aerospace-compliant prototype and is due to be tested at the fuel testing facility based at Airbus, Filton, UK. It is envisaged by the author that in both application areas, continued research will lead to the eventual development of marketable commercial products.
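The FBG sensitivities described above follow from the Bragg condition. A minimal sketch with typical silica-fibre constants (quoted for illustration only; the thesis's devices may use different values):

```python
def bragg_wavelength(n_eff, period_nm):
    """Bragg condition: an FBG reflects at lambda_B = 2 * n_eff * Lambda,
    where Lambda is the grating period."""
    return 2.0 * n_eff * period_nm

def wavelength_shift(lam_b_nm, strain=0.0, delta_t=0.0,
                     k_eps=0.78, k_t=6.7e-6):
    """First-order FBG response: strain and temperature shift the Bragg
    wavelength linearly. k_eps (photo-elastic factor, ~0.78) and k_t
    (thermal sensitivity, ~6.7e-6 per degC) are typical silica-fibre
    values, used here purely for illustration."""
    return lam_b_nm * (k_eps * strain + k_t * delta_t)

lam = bragg_wavelength(n_eff=1.447, period_nm=535.6)   # ~1550 nm
print(wavelength_shift(lam, strain=1e-3))              # ~1.2 nm per 1000 microstrain
```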
Abstract:
A wide range of essential reasoning tasks rely on contradiction identification, a cornerstone of human rationality, communication and debate, founded on the inversion of the logical operators "Every" and "Some." A high-density electroencephalographic (EEG) study was performed in 11 normal young adults. The cerebral network involved in the identification of contradiction included the orbito-frontal and anterior-cingulate cortices and the temporo-polar cortices. The event-related dynamic of this network showed an early negative deflection lasting 500 ms after sentence presentation. This was followed by a positive deflection lasting 1.5 s, which differed between the two logical operators. A lesser degree of network activation (either in neuron number or level of phase locking, or both) occurred while processing statements with "Some," suggesting that this is a relatively simpler scenario, with one example to be figured out instead of the many examples, or the absence of a counterexample, searched for while processing statements with "Every." A self-generated reward system seemed to resonate through the recruited circuitry when the contradiction task was successfully completed.
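The quantifier inversion underlying the task can be stated compactly: "Every x is P" is contradicted exactly by "Some x is not P". A toy finite-domain check of this duality (our own illustration):

```python
def every(domain, pred):
    """'Every x is P': true when no counterexample exists."""
    return all(pred(x) for x in domain)

def some_not(domain, pred):
    """'Some x is not P': true when at least one counterexample exists."""
    return any(not pred(x) for x in domain)

# Over any finite domain the two statements have opposite truth values.
domain = range(10)
is_even = lambda x: x % 2 == 0
assert every(domain, is_even) == (not some_not(domain, is_even))
```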
Abstract:
Energy consumption and energy efficiency have become very important issues in optimizing current telecommunications networks as well as in designing future ones. Energy and power metrics are being introduced in order to enable assessment and comparison of the energy consumption and power efficiency of telecommunications networks and other transmission equipment. The standardization of these metrics is a significant ongoing activity aiming to define baseline energy and power metrics for telecommunications systems. This article provides an up-to-date overview of the energy and power metrics proposed by the various standardization bodies and subsequently adopted worldwide by equipment manufacturers and network operators. © Institut Télécom and Springer-Verlag 2012.
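As one concrete example of the kind of metric surveyed, the sketch below computes an Energy Consumption Rating style figure (watts per Gbit/s of effective throughput, lower being better). The numbers are invented for illustration, and the surveyed standards define several variants, including weighted multi-load ratings.

```python
def energy_consumption_rating(power_w, throughput_gbps):
    """Illustrative ECR-style metric: power drawn per unit of effective
    throughput, in W per Gbit/s. Shown as a generic example of the energy
    and power metrics discussed in the article."""
    return power_w / throughput_gbps

# E.g. a router drawing 1200 W while forwarding 400 Gbit/s: 3.0 W per Gbit/s.
print(energy_consumption_rating(power_w=1200.0, throughput_gbps=400.0))
```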
Abstract:
This paper introduces a theoretical framework to guide research into the psychological effects of advanced manufacturing technology (AMT) on shopfloor operators. The framework has two main aspects. First, based on the emerging literature on the job content implications of AMT, it identifies four key constructs, namely: control, cognitive demand, production responsibility and social interaction. Second, by drawing on the more established job design, stress and related literatures, it predicts how these independent variables differentially affect system performance, job-related strain and job satisfaction. The wider implications and limitations of the theoretical framework are discussed.