47 results for Algorithms, Properties, the KCube Graphs

in the Aston University Research Archive


Relevance: 100.00%

Abstract:

A theoretical model is presented which describes selection in a genetic algorithm (GA) under a stochastic fitness measure and correctly accounts for finite population effects. Although this model describes a number of selection schemes, we only consider Boltzmann selection in detail here as results for this form of selection are particularly transparent when fitness is corrupted by additive Gaussian noise. Finite population effects are shown to be of fundamental importance in this case, as the noise has no effect in the infinite population limit. In the limit of weak selection we show how the effects of any Gaussian noise can be removed by increasing the population size appropriately. The theory is tested on two closely related problems: the one-max problem corrupted by Gaussian noise and generalization in a perceptron with binary weights. The averaged dynamics can be accurately modelled for both problems using a formalism which describes the dynamics of the GA using methods from statistical mechanics. The second problem is a simple example of a learning problem and by considering this problem we show how the accurate characterization of noise in the fitness evaluation may be relevant in machine learning. The training error (negative fitness) is the number of misclassified training examples in a batch and can be considered as a noisy version of the generalization error if an independent batch is used for each evaluation. The noise is due to the finite batch size and in the limit of large problem size and weak selection we show how the effect of this noise can be removed by increasing the population size. This allows the optimal batch size to be determined, which minimizes computation time as well as the total number of training examples required.
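The abstract above refers to Boltzmann selection on a noise-corrupted one-max problem. The following is a minimal Python sketch of that setting only; all parameter values (population size, string length, noise level sigma, selection strength beta, mutation rate) are illustrative assumptions, and the code does not implement the paper's finite-population theory.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_onemax(pop, sigma):
    # One-max fitness (number of ones) corrupted by additive Gaussian noise.
    return pop.sum(axis=1) + rng.normal(0.0, sigma, size=len(pop))

def boltzmann_select(pop, fitness, beta):
    # Sample the next generation with probability proportional to exp(beta * fitness).
    w = np.exp(beta * (fitness - fitness.max()))  # shift for numerical stability
    idx = rng.choice(len(pop), size=len(pop), p=w / w.sum())
    return pop[idx]

def mutate(pop, rate):
    flips = rng.random(pop.shape) < rate
    return np.where(flips, 1 - pop, pop)

# Illustrative parameters (assumptions, not the paper's): population size P,
# string length N, noise level sigma, selection strength beta, mutation rate.
P, N, sigma, beta, rate = 200, 100, 2.0, 0.1, 1.0 / 100
pop = rng.integers(0, 2, size=(P, N))
for _ in range(100):
    pop = mutate(boltzmann_select(pop, noisy_onemax(pop, sigma), beta), rate)
print("mean true fitness after selection:", pop.sum(axis=1).mean())
```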

Relevance: 100.00%

Abstract:

Linear Programming (LP) is a powerful decision making tool extensively used in various economic and engineering activities. In the early stages the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented taking sparsity into account. The relative performance of these algorithms on randomly generated separable and non-separable problems is also reported.
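For readers unfamiliar with the linear-algebra kernel mentioned above, here is a minimal sketch of a conjugate gradient solver with a diagonal (Jacobi) preconditioner, the kind of preconditioned iteration applied to the symmetric positive definite system solved at each interior point iteration. The preconditioner choice, matrix names and test problem are assumptions for illustration; the thesis evaluates a range of preconditioners.

```python
import numpy as np

def jacobi_pcg(M, b, tol=1e-8, max_iter=500):
    """Conjugate gradients on M x = b with a diagonal (Jacobi) preconditioner.
    M is assumed symmetric positive definite, e.g. the normal-equations matrix
    arising at each interior point iteration (names are illustrative)."""
    x = np.zeros_like(b)
    r = b - M @ x
    d_inv = 1.0 / np.diag(M)      # Jacobi preconditioner: inverse of diag(M)
    z = d_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Mp = M @ p
        alpha = rz / (p @ Mp)
        x += alpha * p
        r -= alpha * Mp
        if np.linalg.norm(r) < tol:
            break
        z = d_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small random SPD test system (illustrative only).
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 20))
M = A.T @ A + np.eye(20)
b = rng.normal(size=20)
print(np.allclose(M @ jacobi_pcg(M, b), b, atol=1e-6))
```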

Relevance: 100.00%

Abstract:

We propose two algorithms involving the relaxation of either the given Dirichlet data or the prescribed Neumann data on the over-specified boundary, in the case of the alternating iterative algorithm of Kozlov et al. (1991) applied to Cauchy problems for the modified Helmholtz equation. A convergence proof of these relaxation methods is given, along with a stopping criterion. The numerical results obtained using these procedures, in conjunction with the boundary element method (BEM), show the numerical stability, convergence, consistency and computational efficiency of the proposed methods.

Relevance: 100.00%

Abstract:

Multidimensional compound optimization is a new paradigm in the drug discovery process, yielding efficiencies during early stages and reducing attrition in the later stages of drug development. The success of this strategy relies heavily on understanding this multidimensional data and extracting useful information from it. This paper demonstrates how principled visualization algorithms can be used to understand and explore a large data set created in the early stages of drug discovery. The experiments presented are performed on a real-world data set comprising biological activity data and some whole-molecule physicochemical properties. Data visualization is a popular way of presenting complex data in a simpler form. We have applied powerful principled visualization methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), to help the domain experts (screening scientists, chemists, biologists, etc.) understand the data and draw meaningful conclusions. We also benchmark these principled methods against better-known visualization approaches, namely principal component analysis (PCA), Sammon's mapping, and self-organizing maps (SOMs), to demonstrate their enhanced power to help the user visualize the large multidimensional data sets one has to deal with during the early stages of the drug discovery process. The results reported clearly show that the GTM and HGTM algorithms allow the user to cluster active compounds for different targets and understand them better than the benchmarks. An interactive software tool supporting these visualization algorithms was provided to the domain experts. The tool lets the domain experts explore the projections obtained from the visualization algorithms, providing facilities such as parallel coordinate plots, magnification factors, directional curvatures, and integration with industry-standard software. © 2006 American Chemical Society.
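GTM and HGTM are not part of standard libraries, so the sketch below only illustrates the projection step using the simplest benchmark named in the abstract (PCA via scikit-learn) on a stand-in data matrix; the data shape, names and preprocessing are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in data: rows are compounds, columns are activity values plus a few
# physicochemical descriptors (shape and values are illustrative assumptions).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 15))

# Standardise, then project to two dimensions with PCA, the simplest of the
# benchmark methods named above; GTM/HGTM would replace this projection step.
X = (X - X.mean(axis=0)) / X.std(axis=0)
latent = PCA(n_components=2).fit_transform(X)
print(latent.shape)  # (1000, 2) coordinates ready for a scatter-plot view
```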

Relevance: 100.00%

Abstract:

The computer systems of today are characterised by data and program control that are distributed functionally and geographically across a network. A major issue of concern in this environment is the operating system activity of resource management for different processors in the network. To ensure equity in load distribution and improved system performance, load balancing is often undertaken. The research conducted in this field so far has been primarily concerned with a small set of algorithms operating on tightly-coupled distributed systems. More recent studies have investigated the performance of such algorithms in loosely-coupled architectures, but using a small set of processors. This thesis describes a simulation model developed to study the behaviour and general performance characteristics of a range of dynamic load balancing algorithms. Further, the scalability of these algorithms is discussed and a range of regionalised load balancing algorithms is developed. In particular, we examine the impact of network diameter and delay on the performance of such algorithms across a range of system workloads. The results produced suggest that the performance of simple dynamic policies is scalable, but that these policies lack the load stability of more complex global average algorithms.
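As a hedged illustration of what a simple dynamic load balancing policy looks like, the following toy simulation implements a sender-initiated threshold policy with random probing. It is not the thesis's simulation model; all parameters and the service rule are assumptions.

```python
import random

random.seed(0)

def simulate(n_procs=16, n_tasks=5000, threshold=5, probe_limit=3):
    """Toy sender-initiated dynamic load balancing: a task arriving at an
    overloaded processor probes a few random processors and migrates to the
    least loaded one found. All parameters and the service rule are
    illustrative assumptions, not the thesis's model."""
    queues = [0] * n_procs
    migrations = 0
    for _ in range(n_tasks):
        src = random.randrange(n_procs)
        if queues[src] >= threshold:
            probes = random.sample(range(n_procs), probe_limit)
            dst = min(probes, key=lambda p: queues[p])
            if queues[dst] < queues[src]:
                migrations += 1
                src = dst
        queues[src] += 1
        # Service step: each processor may complete one queued task per tick.
        for p in range(n_procs):
            if queues[p] > 0 and random.random() < 0.3:
                queues[p] -= 1
    return queues, migrations

loads, moved = simulate()
print("final queue lengths:", loads)
print("tasks migrated:", moved)
```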

Relevance: 100.00%

Abstract:

Paper-based phenolic laminates are used extensively in the electrical industry. Many small components are fabricated from these materials by the process known as punching. Recently an investigation was carried out to study the effect of processing variables on the punching properties. It was concluded that further work would be justified and that this should include a critical examination of the resin properties in a more controlled and systematic manner. In this investigation an attempt has been made to assess certain features of the resin structure in terms of thermomechanical properties. The number of crosslinks in the system was controlled using resins based on phenol and para-cresol formulations. Intramolecular hydrogen bonding effects were examined using substituted resins and a synthetically derived phenol based on 1,3-di-(o-hydroxyphenyl)propane. A resin system was developed using the Friedel-Crafts reaction to examine intermolecular hydrogen bonding at the resin-paper interface. The punching properties of certain selected resins were assessed on a qualitative basis. In addition, flexural and dynamic mechanical properties were determined in a general study of the structure-property relationships of these materials. It has been shown that certain features of the resin structure significantly influenced mechanical properties. Further, it was noted that there is a close relationship between punching properties, mechanical damping and flexural strain. This work includes a critical examination of the curing mechanism and views are postulated in an attempt to extend knowledge in this area. Finally, it is argued that future work should be based on a synthetic approach and that dynamic mechanical testing would provide a powerful tool in developing a deeper understanding of the resin fine structure.

Relevance: 100.00%

Abstract:

Inference and optimization in sparse graphs with real variables are studied using methods of statistical mechanics. Efficient distributed algorithms for the resource allocation problem are devised. Numerical simulations show excellent performance and full agreement with the theoretical results. © Springer-Verlag Berlin Heidelberg 2006.
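As a loose illustration only, the sketch below balances real-valued loads on a random sparse graph by repeated local exchanges between neighbours. It is a naive diffusion update, not the message-passing algorithm derived in the paper; the graph construction, step size and loads are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Random sparse graph: each of N nodes picks c neighbours and carries a
# real-valued load (positive = surplus, negative = demand). Illustrative only.
N, c = 100, 3
load = rng.normal(size=N)
neighbours = [rng.choice([j for j in range(N) if j != i], size=c, replace=False)
              for i in range(N)]

# Naive iterative balancing: each node repeatedly ships a fraction of the
# difference between its load and each neighbour's load along the edge.
# This is a plain diffusion update, not the paper's message-passing scheme.
step = 0.1
for _ in range(200):
    flow = np.zeros(N)
    for i in range(N):
        for j in neighbours[i]:
            diff = load[i] - load[j]
            flow[i] -= step * diff / c
            flow[j] += step * diff / c
    load = load + flow
print("standard deviation of loads after balancing:", load.std())
```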

Relevance: 100.00%

Abstract:

The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems, and introduces an object-oriented methodology to enable modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of exclusive use of resources is investigated, an efficient Petri net based scheduling algorithm is designed, and a re-configurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software toolkit was developed and used to implement a Petri net analysis tool and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
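For readers unfamiliar with place/transition nets, here is a minimal sketch of Petri net firing semantics on a toy two-machines-one-resource net. The net, places and markings are illustrative assumptions, not one of the industrial case studies mentioned above.

```python
# Minimal place/transition Petri net: a transition is enabled when every input
# place holds enough tokens; firing removes tokens from inputs and adds them
# to outputs. Two machines compete for one shared resource (illustrative net).
marking = {"idle1": 1, "idle2": 1, "resource": 1, "busy1": 0, "busy2": 0}
transitions = {
    "start1": ({"idle1": 1, "resource": 1}, {"busy1": 1}),
    "end1":   ({"busy1": 1},                {"idle1": 1, "resource": 1}),
    "start2": ({"idle2": 1, "resource": 1}, {"busy2": 1}),
    "end2":   ({"busy2": 1},                {"idle2": 1, "resource": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] = marking.get(p, 0) + n

# Fire a simple sequence: machine 1 uses the resource, then machine 2 does.
for t in ["start1", "end1", "start2", "end2"]:
    if enabled(t):
        fire(t)
print(marking)
```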

Relevance: 100.00%

Abstract:

The Fibre Distributed Data Interface (FDDI) represents the new generation of local area networks (LANs). These high-speed LANs are capable of supporting up to 500 users over a 100 km distance. User traffic is expected to be as diverse as file transfers, packet voice and video. As the proliferation of FDDI LANs continues, the need to interconnect these LANs arises. FDDI LAN interconnection can be achieved in a variety of different ways. Some of the most commonly used today are public data networks, dial-up lines and private circuits. For applications that can potentially generate large quantities of traffic, such as an FDDI LAN, it is cost-effective to use a private circuit leased from the public carrier. In order to send traffic from one LAN to another across the leased line, a routing algorithm is required. Much research has been done on the Bellman-Ford algorithm and many implementations of it exist in computer networks. However, due to its instability and problems with routing table loops it is an unsatisfactory algorithm for interconnected FDDI LANs. A new algorithm, termed ISIS, which is being standardized by the ISO, provides a far better solution. ISIS will be implemented in many manufacturers' routing devices. In order to make the work as practical as possible, this algorithm will be used as the basis for all the new algorithms presented. The ISIS algorithm can be improved by exploiting information that is dropped by that algorithm during the calculation process. A new algorithm, called Down Stream Path Splits (DSPS), uses this information and requires only minor modification to some of the ISIS routing procedures. DSPS provides a higher network performance, with very little additional processing and storage requirements. A second algorithm, also based on the ISIS algorithm, generates a massive increase in network performance. This is achieved by selecting alternative paths through the network in times of heavy congestion. This algorithm may select the alternative path at either the originating node or any node along the path. It requires more processing and memory storage than DSPS, but generates a higher network power. The final algorithm combines the DSPS algorithm with the alternative path algorithm. This is the most flexible and powerful of the algorithms developed. However, it is somewhat complex and requires a fairly large storage area at each node. The performance of the new routing algorithms is tested in a comprehensive model of interconnected LANs. This model incorporates the transport through to the physical layers and generates random topologies for routing algorithm performance comparisons. Using this model it is possible to determine which algorithm provides the best performance without introducing significant complexity and storage requirements.
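ISIS is a link-state protocol in which each router computes shortest paths over the flooded topology database. The sketch below shows that shortest-path computation (Dijkstra's algorithm) on an illustrative topology; it does not include the DSPS or alternative-path extensions, and the routers and link costs are assumptions.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path tree over a link-state database, as each ISIS-style
    router computes it. `graph` maps node -> {neighbour: link cost}."""
    dist = {source: 0}
    prev = {}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, cost in graph[u].items():
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return dist, prev

# Illustrative interconnected-LAN topology (routers A-D, leased-line costs).
topology = {
    "A": {"B": 1, "C": 4},
    "B": {"A": 1, "C": 2, "D": 5},
    "C": {"A": 4, "B": 2, "D": 1},
    "D": {"B": 5, "C": 1},
}
print(dijkstra(topology, "A"))
```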

Relevance: 100.00%

Abstract:

Cassava rhizome was catalytically pyrolysed at 500 °C using analytical pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) in order to investigate the effect of catalysts on bio-oil properties. The catalysts studied were zeolite ZSM-5, two aluminosilicate mesoporous materials, Al-MCM-41 and Al-MSU-F, and a proprietary commercial catalyst, alumina-stabilised ceria MI-575. The influence of the catalysts on pyrolysis products was observed through the yields of aromatic hydrocarbons, phenols, lignin-derived compounds, carbonyls, methanol and acetic acid. Results showed that all the catalysts produced aromatic hydrocarbons and reduced oxygenated lignin derivatives, thus indicating an improvement in bio-oil heating value and viscosity. Among the catalysts, ZSM-5 was the most active in producing all of these changes in the pyrolysis products. In addition, all the catalysts with the exception of MI-575 enhanced the formation of acetic acid. This is clearly a disadvantage with respect to the pH level of the liquid bio-fuel.

Relevance: 100.00%

Abstract:

Two sets of experiments, categorized as TG–FTIR and Py–GC–FTIR, are employed to investigate the mechanism of hemicellulose pyrolysis and the formation of the main gaseous and bio-oil products. The “sharp mass loss stage” and the corresponding evolution of the volatile products are examined in the TG–FTIR graphs at heating rates of 3–80 K/min. A pyrolysis unit, composed of a fluidized bed reactor, carbon filter, vapour condensing system and gas storage, is employed to investigate the products of hemicellulose pyrolysis at different temperatures (400–690 °C) at a feeding flow rate of 600 l/h. The effects of temperature on the condensable products are examined thoroughly. The possible routes for the formation of the products are systematically proposed from the primary decomposition of the three types of unit (xylan, O-acetylxylan and 4-O-methylglucuronic acid) and the secondary reactions of the fragments. It is found that the formation of CO is enhanced with elevated temperature, while only a slight change is observed for the yield of CO2, which is the predominant product in the gaseous mixture.

Relevance: 100.00%

Abstract:

This thesis introduces a flexible visual data exploration framework which combines advanced projection algorithms from the machine learning domain with visual representation techniques developed in the information visualisation domain to help a user explore and understand large multi-dimensional datasets effectively. The advantage of such a framework over other techniques currently available to the domain experts is that the user is directly involved in the data mining process and advanced machine learning algorithms are employed for better projection. A hierarchical visualisation model guided by a domain expert allows them to obtain an informed segmentation of the input space. Two other components of this thesis exploit properties of these principled probabilistic projection algorithms to develop a guided mixture of local experts algorithm which provides robust prediction, and a model to estimate feature saliency simultaneously with the training of a projection algorithm. Local models are useful since a single global model cannot capture the full variability of a heterogeneous data space such as the chemical space. Probabilistic hierarchical visualisation techniques provide an effective soft segmentation of an input space by a visualisation hierarchy whose leaf nodes represent different regions of the input space. We use this soft segmentation to develop a guided mixture of local experts (GME) algorithm which is appropriate for the heterogeneous datasets found in chemoinformatics problems. Moreover, in this approach the domain experts are more involved in the model development process, which is suitable for an intuition and domain knowledge driven task such as drug discovery. We also derive a generative topographic mapping (GTM) based data visualisation approach which estimates feature saliency simultaneously with the training of a visualisation model.
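As a hedged sketch of the mixture-of-local-experts idea only, the code below soft-assigns points to a few local linear experts and blends their predictions by the assignment weights. The gating here is a simple Gaussian kernel around fixed centres, whereas the thesis's GME derives its gating from a hierarchical GTM segmentation; all data, centres and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic heterogeneous data: two regions of input space follow different
# linear relationships (data, centres and widths are illustrative assumptions).
X = rng.normal(size=(500, 5))
y = np.where(X[:, 0] > 0,
             X @ np.array([1.0, 2, 0, 0, 0]),
             X @ np.array([0.0, 0, -1, 3, 0]))
y = y + 0.1 * rng.normal(size=500)

centres = X[rng.choice(len(X), size=4, replace=False)]  # one centre per local expert
width = 1.0

def responsibilities(X):
    # Soft assignment of each point to each expert via a Gaussian kernel;
    # the thesis's GME instead derives this gating from a hierarchical GTM.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * width ** 2))
    return w / w.sum(axis=1, keepdims=True)

R = responsibilities(X)
experts = []
for k in range(len(centres)):
    # Fit each local linear expert by responsibility-weighted least squares.
    sw = np.sqrt(R[:, k])
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
    experts.append(beta)

# Blend the expert predictions with the same responsibilities.
pred = sum(R[:, k] * (X @ experts[k]) for k in range(len(centres)))
print("RMSE of the blended local experts:", np.sqrt(((pred - y) ** 2).mean()))
```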

Relevance: 100.00%

Abstract:

Market oriented behaviours have been found to be important predictors of business success across a wide array of studies. Despite their potential importance, research into market oriented behaviours in the joint venture (JV) context is very scarce. This study represents a novel attempt to address this gap by examining a set of antecedent factors which arise from sources outside a traditional firm's boundary. An extensive review and synthesis of the market orientation and JV literature yielded a set of context-specific antecedent factors relevant to the JV's relational context. In accordance with the perspective offered by transaction cost theory, a system of hypotheses about the effects of these antecedent factors on JVs' market oriented behaviours was developed. In order to test these hypotheses, empirical evidence was collected by means of a mail survey of international joint ventures operating in the coastal regions of mainland China. A sample of 191 JV firms was collected as a result. Following well-established procedures for scale development and purification as recommended in the methodology literature, the scales were critically trimmed and reviewed for their psychometric properties. The conceptual model was tested with a structural equation model. Results suggested that a number of context-specific antecedents are in fact important determinants of JVs' level of market oriented behaviours. In addition, the linkage between market oriented behaviours and market performance was also successfully established.

Relevance: 100.00%

Abstract:

Post-operative infections resulting from total hip arthroplasty are caused by bacteria such as Staphylococcus aureus and Pseudomonas aeruginosa entering the wound perioperatively or by haematogenous spread from distant loci of infection. They can endanger patient health and require expensive surgical revision procedures. Gentamicin-impregnated poly(methyl methacrylate) bone cement is traditionally used for treatment but is often removed due to harbouring bacterial growth, while bacterial resistance to gentamicin is increasing. The aim of this work was to encapsulate the antibiotics vancomycin, ciprofloxacin and rifampicin within sustained release microspheres composed of the biodegradable polymer poly(dl-lactide-co-glycolide) [PLCG] 75:25. Topical administration to the wound in hydroxypropylmethylcellulose gel should achieve high local antibiotic concentrations, while the two-week in vivo half-life of PLCG 75:25 removes the need for expensive surgical retrieval operations. Unloaded and 20% w/w antibiotic loaded PLCG 75:25 microspheres were fabricated using a water-in-oil emulsification with solvent evaporation technique. Microspheres were spherical in shape with a honeycomb-like internal matrix and showed reproducible physical properties. The kinetics of in vitro antibiotic release into newborn calf serum (NCS) and Hank's balanced salt solution (HBSS) at 37°C were measured using a radial diffusion assay. Generally, the day to day concentration of each antibiotic released into NCS over a 30 day period was in excess of that required to kill S. aureus and Ps. aeruginosa. Only limited microsphere biodegradation had occurred after 30 days of in vitro incubation in NCS and HBSS at 37°C. The moderate in vitro cytotoxicity of 20% w/w antibiotic loaded microspheres to cultured 3T3-L1 cells was antibiotic induced. In conclusion, the data generated indicate the potential for 20% w/w antibiotic loaded microspheres to improve the present treatment regimens for infections occurring after total hip arthroplasty, such that future work should focus on gaining industrial collaboration for commercial exploitation.

Relevance: 100.00%

Abstract:

The human NT2.D1 cell line was differentiated to form both a 1:2 co-culture of post-mitotic NT2 neuronal and NT2 astrocytic (NT2.N/A) cells and a pure NT2.N culture. The respective sensitivities to several test chemicals of the NT2.N/A, the NT2.N, and the NT2.D1 cells were evaluated and compared with the CCF-STTG1 astrocytoma cell line, using a combination of basal cytotoxicity and biochemical endpoints. Using the MTT assay, the basal cytotoxicity data estimated the comparative toxicities of the test chemicals (the chronic neurotoxin 2,5-hexanedione, the cytotoxins 2,3- and 3,4-hexanedione, and the acute neurotoxins tributyltin- and trimethyltin-chloride) and also provided the non-cytotoxic concentration range for each compound. Biochemical endpoints examined over the non-cytotoxic range included assays for ATP levels, oxidative status (H2O2 and GSH levels) and caspase-3 levels as an indicator of apoptosis. Although the endpoints did not demonstrate the known neurotoxicants to be consistently more toxic to the cell systems with the greatest number of neuronal properties, the NT2 astrocytes appeared to contribute positively to NT2 neuronal health following exposure to all the test chemicals. The NT2.N/A co-culture generally maintained superior ATP and GSH levels and reduced H2O2 levels in comparison with the NT2.N mono-culture. In addition, the pure NT2.N culture showed a significantly lower level of caspase-3 activation compared with the co-culture, suggesting NT2 astrocytes may be important in modulating the mode of cell death following toxic insult. Overall, these studies provide evidence that an in vitro integrated population of post-mitotic human neurons and astrocytes may offer significant relevance to the human in vivo heterogeneous nervous system when initially screening compounds for acute neurotoxic potential.