34 results for Branch and bound algorithms
Abstract:
In this paper we propose algorithms for combining and ranking answers from distributed heterogeneous data sources in the context of a multi-ontology Question Answering task. Our proposal includes a merging algorithm that aggregates, combines and filters ontology-based search results and three different ranking algorithms that sort the final answers according to different criteria such as popularity, confidence and semantic interpretation of results. An experimental evaluation on a large-scale corpus indicates improvements in the quality of the search results with respect to a scenario where the merging and ranking algorithms were not applied. These collective methods for merging and ranking make it possible to answer questions that are distributed across ontologies while, at the same time, filtering irrelevant answers, fusing similar answers together, and eliciting the most accurate answer(s) to a question.
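Purely as an illustration (not the paper's actual algorithms), a popularity-based merge-and-rank step over multiple sources can be sketched as follows; the tuple format, case-folding fusion, and confidence tie-breaking are assumptions:

```python
from collections import defaultdict

def merge_and_rank(result_sets):
    """Fuse answers from several sources, then rank by popularity.

    result_sets: one list of (answer_label, confidence) tuples per source.
    """
    merged = defaultdict(lambda: {"popularity": 0, "confidence": 0.0})
    for answers in result_sets:
        seen = set()
        for label, conf in answers:
            key = label.strip().lower()      # fuse near-duplicate labels
            if key not in seen:              # count each source at most once
                merged[key]["popularity"] += 1
                seen.add(key)
            merged[key]["confidence"] = max(merged[key]["confidence"], conf)
    # more widely attested answers first; confidence breaks ties
    return sorted(merged.items(),
                  key=lambda kv: (kv[1]["popularity"], kv[1]["confidence"]),
                  reverse=True)

ranked = merge_and_rank([
    [("Paris", 0.9), ("Lyon", 0.4)],   # source 1
    [("paris", 0.8)],                  # source 2
])
```

Here "Paris" and "paris" are fused by simple case-folding, a stand-in for the semantic similarity matching a real ontology-based merger would perform.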
Abstract:
Multidimensional compound optimization is a new paradigm in the drug discovery process, yielding efficiencies during early stages and reducing attrition in the later stages of drug development. The success of this strategy relies heavily on understanding this multidimensional data and extracting useful information from it. This paper demonstrates how principled visualization algorithms can be used to understand and explore a large data set created in the early stages of drug discovery. The experiments presented are performed on a real-world data set comprising biological activity data and some whole-molecular physicochemical properties. Data visualization is a popular way of presenting complex data in a simpler form. We have applied powerful principled visualization methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), to help the domain experts (screening scientists, chemists, biologists, etc.) understand the data and make informed decisions. We also benchmark these principled methods against relatively better known visualization approaches, principal component analysis (PCA), Sammon's mapping, and self-organizing maps (SOMs), to demonstrate their enhanced power to help the user visualize the large multidimensional data sets one has to deal with during the early stages of the drug discovery process. The results reported clearly show that the GTM and HGTM algorithms allow the user to cluster active compounds for different targets and understand them better than the benchmarks. An interactive software tool supporting these visualization algorithms was provided to the domain experts. The tool helps domain experts explore the projections obtained from the visualization algorithms, providing facilities such as parallel coordinate plots, magnification factors, directional curvatures, and integration with industry-standard software. © 2006 American Chemical Society.
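Of the benchmarked methods, PCA is the simplest to sketch. The following minimal NumPy implementation (not the paper's tool) projects a compound-descriptor matrix onto its top two principal components for plotting; the data here are random placeholders:

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project compound descriptors onto the top principal components."""
    Xc = X - X.mean(axis=0)                  # centre each descriptor column
    # principal axes via SVD of the centred data matrix
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T          # low-dimensional coordinates

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                # e.g. 100 compounds, 8 descriptors
coords = pca_project(X)                      # 2-D map for visualization
```

GTM and HGTM replace this linear projection with a trained generative mapping, which is why they separate activity clusters that PCA blurs together.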
Abstract:
Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established.
The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. Failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
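The character-substitution idea behind the morphological rules can be illustrated with a small sketch; the rule tuples and mini-lexicon below are invented examples, not the thesis's rule set:

```python
def apply_rules(word, rules, lexicon):
    """Apply character-substitution suffix rules to derive related words.

    Only outputs passing the lexical validity check are kept, which
    guards against overgeneration.
    """
    derivations = []
    for src_suffix, dst_suffix in rules:
        if word.endswith(src_suffix):
            candidate = word[:len(word) - len(src_suffix)] + dst_suffix
            if candidate in lexicon:         # lexical validity requirement
                derivations.append(candidate)
    return derivations

# substitution, not bare segmentation: 'ability' -> 'able'
rules = [("ility", "le"), ("ation", "ate"), ("iness", "y")]
lexicon = {"able", "happy", "relate"}
result = apply_rules("ability", rules, lexicon)
```

Deriving "able" from "ability" requires replacing "-ility" with "-le"; bare segmentation would leave the unattested stem "ab", which is exactly the segmentation fallacy the rules circumvent.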
Abstract:
PURPOSE. A methodology for noninvasively characterizing the three-dimensional (3-D) shape of the complete human eye is not currently available for research into ocular diseases that have a structural substrate, such as myopia. A novel application of a magnetic resonance imaging (MRI) acquisition and analysis technique is presented that, for the first time, allows the 3-D shape of the eye to be investigated fully. METHODS. The technique involves the acquisition of a T2-weighted MRI, which is optimized to reveal the fluid-filled chambers of the eye. Automatic segmentation and meshing algorithms generate a 3-D surface model, which can be shaded with morphologic parameters such as distance from the posterior corneal pole and deviation from sphericity. Full details of the method are illustrated with data from 14 eyes of seven individuals. The spatial accuracy of the calculated models is demonstrated by comparing the MRI-derived axial lengths with values measured in the same eyes using interferometry. RESULTS. The color-coded eye models showed substantial variation in the absolute size of the 14 eyes. Variations in the sphericity of the eyes were also evident, with some appearing approximately spherical whereas others were clearly oblate and one was slightly prolate. Nasal-temporal asymmetries were noted in some subjects. CONCLUSIONS. The MRI acquisition and analysis technique allows a novel way of examining 3-D ocular shape. The ability to stratify and analyze eye shape, ocular volume, and sphericity will further extend the understanding of which specific biometric parameters predispose emmetropic children subsequently to develop myopia. Copyright © Association for Research in Vision and Ophthalmology.
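As a rough illustration of the "deviation from sphericity" shading described above (not the authors' segmentation or meshing pipeline), one can compute each surface point's radial deviation from a crudely fitted sphere; the point cloud here is synthetic:

```python
import numpy as np

def sphericity_deviation(points):
    """Signed radial deviation of surface points from a fitted sphere.

    points: (N, 3) coordinates on the segmented ocular surface.
    Positive values lie outside the mean-radius sphere.
    """
    centre = points.mean(axis=0)             # crude sphere-centre estimate
    radii = np.linalg.norm(points - centre, axis=1)
    return radii - radii.mean()

# sanity check: points sampled on a unit sphere deviate by roughly zero
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
dev = sphericity_deviation(pts)
```

An oblate eye would show systematically negative deviation at the posterior pole; a prolate eye the opposite.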
Abstract:
The identification and quantification of spin adducts and their reduction products (>NOH, >NOR) formed from nitroso compounds and nitrones in EPR and PP during spin trapping techniques have been examined. The nitroxyl yield and polymer-bound nitroxyl percentage formed from these spin traps were found to be strongly dependent on the nature of the spin trap and radical generator, processing temperature, and irradiation time. The nitroxyl yield and % bound nitroxyl of the spin traps improved significantly in the presence of Trigonox 101 and 2-OH benzophenone. The effect of these spin traps used as normal additives, and of their spin adducts in the form of EPR-masterbatch, on the photo- and thermal oxidation of PP has been studied. Aliphatic nitroso compounds were found to have much better photo-antioxidant activity than nitrones and aromatic nitroso compounds, and their antioxidant activity improved appreciably in the presence of a free radical generator, Trigonox 101, before and after extraction. The effect of heat, light and an oxidising agent (meta-dichloroperbenzoic acid) on the nitroxyl yield of nitroso tertiary butane in solution as a model study has been investigated, and a cyclic regenerative process involving both chain-breaking acceptor and chain-breaking donor processes has been proposed.
Abstract:
INTAMAP is a web processing service for the automatic interpolation of measured point data. Requirements were (i) using open standards for spatial data such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an open source solution. The system couples the 52-North web processing service, accepting data in the form of an observations and measurements (O&M) document, with a computing back-end realized in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a new markup language for encoding uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropies and extreme values. In the light of the INTAMAP experience, we discuss the lessons learnt.
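INTAMAP's back-end performs geostatistical interpolation in R; purely to illustrate the input/output shape of such an interpolation service, here is a much simpler inverse-distance-weighted interpolator in Python (a stand-in, not INTAMAP's statistics):

```python
import numpy as np

def idw_interpolate(obs_xy, obs_z, grid_xy, power=2.0):
    """Inverse-distance-weighted prediction at grid points.

    obs_xy: (n, 2) observation coordinates; obs_z: (n,) measured values;
    grid_xy: (m, 2) prediction locations.
    """
    d = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)       # avoid division by zero at data points
    w = 1.0 / d ** power
    return (w @ obs_z) / w.sum(axis=1)

obs_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
obs_z = np.array([10.0, 20.0, 30.0])
grid = np.array([[0.0, 0.0], [0.5, 0.5]])
z = idw_interpolate(obs_xy, obs_z, grid)   # exact at a data point
```

Unlike this sketch, a geostatistical back-end also returns the probability distribution of the interpolation error, which is what UncertML encodes.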
Abstract:
As systems for computer-aided design and production of mechanical parts have developed, there has arisen a need for techniques for the comprehensive description of the desired part, including its 3-D shape. The creation and manipulation of shapes is generally known as geometric modelling. It is desirable that links be established between geometric modellers and machining programs. Currently, unbounded APT and some bounded geometry systems are widely used in manufacturing industry for machining operations such as milling, drilling, boring and turning, applied mainly to engineering parts. APT systems, however, are presently only linked to wire-frame drafting systems. The combination of a geometric modeller and APT will provide a powerful manufacturing system for industry, from initial design right through to part manufacture using NC machines. This thesis describes a recently developed interface (ROMAPT) between a bounded geometry modeller (ROMULUS) and an unbounded NC processor (APT). A new set of theoretical functions and practical algorithms for the computer-aided manufacturing of 3D solid geometric models has been investigated. This work has led to the development of a sophisticated computer program, ROMAPT, which provides a new link between CAD (in the form of the geometric modeller ROMULUS) and CAM (in the form of the APT NC system). ROMAPT has been used to machine some engineering prototypes successfully, both in soft foam material and in aluminium. It has been demonstrated that the theory and algorithms developed by the author for the computer-aided manufacturing of 3D solid models are both valid and applicable. ROMAPT allows the full potential of a solid geometric modeller (ROMULUS) to be further exploited for NC applications without requiring major investment in a new NC processor. ROMAPT supports output in APT-AC, APT4 and the CAM-I SSRI NC languages.
Abstract:
The techno-economic implications of recycling the components of mixed plastics waste have been studied in a two-part investigation: (a) In an economic survey of the prospects for plastics recycling, the plastics waste arisings from the retailing, building, automotive, light engineering and chemical industries were surveyed by means of questionnaires and interviews. This was partially successful and indicated that very considerable quantities of relatively clean plastics packaging were available in major department chains and household stores. Collection systems for such sources, which would not incur any extra cost, have been suggested. However, household collection of plastics waste was found to be uneconomic due to the high cost of collection and transportation and the lack of markets for the end products. (b) In a technical study of blends of PE/PP and PE/PS, which are found in admixture in waste plastics, it has been shown that they exhibit poor mechanical properties due to incompatibility. Consequently, reprocessing of such unsegregated blends results in products of little technological value. The inclusion of some commercial block and graft copolymers which behave as solid phase dispersants (SPDs) increases the toughness of the blends (e.g. EPDM in PE/PP blends and SBS in PE/PS blends). Also, EPDM is found to be very effective for improving the toughness of single-component polypropylene. However, the improved technical properties of such blends have been accompanied by a fast rate of photo-oxidation and loss of toughness due to the presence of unsaturation in the SPDs. The change in mechanical properties occurring during oven ageing and ultraviolet-light accelerated weathering of these binary and ternary blends was followed by a viscoelastometric technique (Rheovibron) over a wide range of temperatures, by impact resistance at room temperature (20 ± 1°C) and by changes in functional groups (i.e. carbonyl and trans-1,4-polybutadiene).
Also, the heat and light stability of single and mixed plastics in which thiol antioxidants were bound to the SPD segment have been studied and compared with conventional antioxidants. The long-term performance of the mixed plastics containing SPDs has been improved significantly by the use of conventional and bound antioxidants. It is concluded that an estimated 30,000 tonnes/year of plastics waste is available from department chains and household stores which can be converted to useful end products. This justifies pilot experiments in collaboration with supermarkets, recyclers and converters, using low-cost SPDs and additives designed to make the materials more compatible.
Abstract:
This thesis presents results of experiments designed to study the effect of applying electrochemical chloride extraction (ECE) to a range of different hardened cement pastes. Rectangular prism specimens of hydrated cement paste containing sodium chloride at different concentrations were subjected to electrolysis between the embedded steel cathodes and external anodes of activated titanium mesh. The cathodic current density used was in the range of 1 to 5 A/m2, with treatment periods of 4 to 12 weeks. After treatment, the specimens were cut into sections which were subjected to pore-solution expression and analysis in order to determine changes in the distribution of free and total ionic species. The effect of the ECE treatment on the physical and microstructural properties of the cements was studied by using microhardness and MIP techniques. XRD was employed to look at the possibility of ettringite redistribution as a result of the accumulation of soluble sulphate ions in the cement matrix near the cathode during ECE. Remigration of chloride which remains after the ECE treatment and the distribution of other ions were studied by analysing specimens which had been stored for several months after undergoing ECE treatment. The potentials of the steel cathodes were also monitored over the period to detect any changes in their corrosion state. The main findings of this research were as follows: 1. ECE, as applied in this investigation, was capable of removing both free and bound chloride. The removal process occurred relatively quickly and an equilibrium between free and bound chlorides in the specimens was maintained throughout. At the same time, alkali concentrations in the pore solution near the steel cathode increased. The soluble sulphate ionic concentration near the cathode also increased due to the local increase in the pH of the pore solution. 2. ECE caused some changes in the physical and microstructural properties of the cement matrix.
However, these changes were minimal and, in the case of microhardness, the results were highly scattered. Ettringite in the bulk material well away from the cathode was found not to increase significantly with the increase in charge passed. 3. Remigration of chloride and other ionic species occurred slowly after cessation of ECE, with a resultant gradual increase in the Cl-/OH- ratio around the steel. 4. The removal of chloride from blended cements was slower than that from OPC.
Abstract:
This thesis considers the computer simulation of moist agglomerate collisions using the discrete element method (DEM). The study is confined to pendular-state moist agglomerates, in which liquid is present as either adsorbed immobile films or pendular liquid bridges, and the interparticle force is modelled as the adhesive contact force and the interstitial liquid bridge force. Algorithms used to model the contact force due to surface adhesion, tangential friction and particle deformation have been derived by other researchers and are briefly described in the thesis. A theoretical study of the pendular liquid bridge force between spherical particles has been made, and algorithms for modelling the pendular liquid bridge force between spherical particles have been developed and incorporated into the Aston version of the DEM program TRUBAL. It has been found that, for static liquid bridges, the more explicit criterion for specifying the stable solution and critical separation is provided by the total free energy. The critical separation is given by the cube root of the liquid bridge volume to a good approximation, and the 'gorge method' of evaluation based on the toroidal approximation leads to errors in the calculated force of less than 10%. Three-dimensional computer simulations of an agglomerate impacting orthogonally with a wall are reported. The results demonstrate the effectiveness of adding viscous binder to prevent attrition, a common practice in process engineering. Results of simulated agglomerate-agglomerate collisions show that, for colinear agglomerate impacts, there is an optimum velocity which results in a near-spherical shape of the coalesced agglomerate and, hence, minimises attrition due to subsequent collisions. The relationship between the optimum impact velocity and the liquid viscosity and surface tension is illustrated. The effect of varying the angle of impact on the coalescence/attrition behaviour is also reported. (DX 187, 340).
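The reported rupture criterion can be stated compactly: in dimensionless form the critical separation is approximately the cube root of the bridge volume. The contact-angle correction factor shown below is a refinement commonly used in the DEM literature, assumed here rather than taken from the thesis:

```python
def critical_separation(bridge_volume, contact_angle=0.0):
    """Rupture distance of a pendular liquid bridge (dimensionless units).

    The cube-root dependence on bridge volume follows the approximation
    reported above; the (1 + 0.5 * contact_angle) correction is an
    assumed refinement from the literature, with the angle in radians.
    """
    return (1.0 + 0.5 * contact_angle) * bridge_volume ** (1.0 / 3.0)

# a bridge of dimensionless volume 0.001 ruptures near separation 0.1
s_crit = critical_separation(0.001)
```

In a DEM step, the bridge force would be dropped once the particle surface separation exceeds this distance.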
Abstract:
The study developed statistical techniques to evaluate visual field progression for use with the Humphrey Field Analyzer (HFA). The long-term fluctuation (LF) was evaluated in stable glaucoma. The magnitude of both LF components showed little relationship with MD, CPSD and SF. An algorithm was proposed for determining the clinical necessity for a confirmatory follow-up examination. The between-examination variability was determined for the HFA Standard and FASTPAC algorithms in glaucoma. FASTPAC exhibited greater between-examination variability than the Standard algorithm across the range of sensitivities and with increasing eccentricity. The difference in variability between the algorithms had minimal clinical significance. The effect of repositioning the baseline in the Glaucoma Change Probability Analysis (GCPA) was evaluated. The global baseline of the GCPA limited the detection of progressive change at a single stimulus location. A new technique, pointwise univariate linear regression (ULR), of absolute sensitivity, and of pattern deviation, against time to follow-up was developed. In each case, pointwise ULR was more sensitive to localised progressive changes in sensitivity than ULR of MD alone. Small changes in sensitivity were more readily determined by the pointwise ULR than by the GCPA. A comparison between the outcomes of pointwise ULR for all fields and for the last six fields manifested linear and curvilinear declines in absolute sensitivity and pattern deviation. A method for delineating progressive loss in glaucoma, based upon the error in the forecasted sensitivity of a multivariate model, was developed. Multivariate forecasting exhibited little agreement with GCPA in glaucoma but showed promise for monitoring visual field progression in OHT patients. The recovery of sensitivity in optic neuritis over time was modelled with a cumulative Gaussian function. The rate and level of recovery were greater in the peripheral than the central field.
Probability models to forecast the field of recovery were proposed.
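The pointwise univariate linear regression (ULR) technique amounts to fitting a least-squares slope of sensitivity against follow-up time at every stimulus location; a minimal sketch with invented data (not the study's):

```python
import numpy as np

def pointwise_ulr(times, sensitivities):
    """Least-squares slope of sensitivity against time at each location.

    times: (n_exams,) follow-up times in years.
    sensitivities: (n_exams, n_locations) sensitivities in dB.
    Returns per-location slopes (dB/year); negative slopes flag
    localized progression.
    """
    t = times - times.mean()
    y = sensitivities - sensitivities.mean(axis=0)
    return (t @ y) / (t @ t)

times = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
field = np.column_stack([
    30.0 - 2.0 * times,     # progressing location: -2 dB/year
    28.0 + 0.0 * times,     # stable location
])
slopes = pointwise_ulr(times, field)
```

A single regression of MD against time would average the two columns and halve the apparent rate of loss, which is why the pointwise form is more sensitive to localised change.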
Abstract:
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever-increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining a better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches, as they have been developed specifically to deal with nonlinear large-scale systems. This review aims to present an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. © 2013 IOP Publishing Ltd.
Abstract:
To be competitive in contemporary turbulent environments, firms must be capable of processing huge amounts of information and effectively converting it into actionable knowledge. This is particularly the case in the marketing context, where problems are also usually highly complex, unstructured and ill-defined. In recent years, the development of marketing management support systems has paralleled this evolution in the informational problems faced by managers, leading to a growth in the study (and use) of artificial intelligence and soft computing methodologies. Here, we present and implement a novel intelligent system that incorporates fuzzy logic and genetic algorithms to operate in an unsupervised manner. This approach allows the discovery of interesting association rules, which can be linguistically interpreted, in large-scale databases (KDD, or Knowledge Discovery in Databases). We then demonstrate its application to a distribution channel problem. It is shown how the proposed system is able to return a number of novel and potentially interesting associations among variables. Thus, it is argued that our method has significant potential to improve the analysis of marketing and business databases in practice, especially in non-programmed decisional scenarios, as well as to assist scholarly researchers in their exploratory analysis. © 2013 Elsevier Inc.
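The fuzzy-rule idea can be sketched as follows. A genetic algorithm would search over candidate rules, scoring each by a fitness such as the fuzzy confidence below; the triangular membership functions, field names and data are illustrative assumptions, and the GA loop itself is omitted:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_rule_confidence(records, antecedent, consequent):
    """Degree to which 'IF antecedent THEN consequent' holds in the data.

    antecedent/consequent: (field, (a, b, c)) fuzzy terms standing in
    for linguistic labels such as 'medium order size'.
    """
    support_ant = support_both = 0.0
    for rec in records:
        f_a, params_a = antecedent
        f_c, params_c = consequent
        mu_a = triangular(rec[f_a], *params_a)
        mu_c = triangular(rec[f_c], *params_c)
        support_ant += mu_a
        support_both += min(mu_a, mu_c)      # min t-norm
    return support_both / support_ant if support_ant else 0.0

records = [{"order_size": 100, "margin": 20},
           {"order_size": 100, "margin": 5}]
confidence = fuzzy_rule_confidence(
    records, ("order_size", (50, 100, 150)), ("margin", (10, 20, 30)))
```

Because the terms are fuzzy sets rather than crisp intervals, a discovered rule can be read back linguistically, e.g. "medium orders tend to have medium margins".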
Abstract:
We overview our recent results on polarisation dynamics of vector solitons in an erbium-doped fibre laser mode locked with carbon nanotubes. Our experimental and theoretical study revealed new families of vector solitons for fundamental and bound-state soliton operation. The observed scenario of the evolution of the states of polarisation (SOPs) on the Poincaré sphere includes fast polarisation switching between two and three SOPs along with slow SOP evolution on a double-scroll chaotic attractor. The underlying physics presents an interplay between effects of birefringence of the laser cavity and light-induced anisotropy caused by polarisation hole burning. © 2014 IEEE.
Abstract:
Heterogeneous multi-core FPGAs contain different types of cores, which can improve efficiency when used with an effective online task scheduler. However, it is not easy to find the right cores for tasks when there are multiple objectives or dozens of cores. Inappropriate scheduling may cause hot spots, which decrease the reliability of the chip. Given that, our research builds a simulation platform to evaluate all kinds of scheduling algorithms on a variety of architectures. On this platform, we provide an online scheduler which uses a multi-objective evolutionary algorithm (EA). Comparing the EA with current algorithms such as Predictive Dynamic Thermal Management (PDTM) and Adaptive Temperature Threshold Dynamic Thermal Management (ATDTM), we find some drawbacks in previous work. First, current algorithms are overly dependent on manually set constant parameters. Second, those algorithms neglect optimization for heterogeneous architectures. Third, they use single-objective methods, or use a linear weighting method to convert a multi-objective optimization into a single-objective optimization. Unlike other algorithms, the EA is adaptive and does not require resetting parameters when workloads switch from one to another. EAs also improve performance when used on heterogeneous architectures. An efficient Pareto front can be obtained with EAs when multiple objectives must be balanced.
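The Pareto front mentioned above is the set of non-dominated candidate schedules. A minimal dominance filter illustrates the idea; the objective values (hypothetical peak temperature and makespan per schedule) are invented for the example:

```python
def pareto_front(solutions):
    """Return the non-dominated subset of candidate schedules.

    solutions: objective tuples to MINIMIZE, e.g. (peak_temp, makespan).
    A solution is dominated if some other solution is no worse in every
    objective and differs in at least one.
    """
    front = []
    for s in solutions:
        dominated = any(
            all(o <= v for o, v in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front

# hypothetical (peak temperature degC, makespan ms) per candidate schedule
schedules = [(70.0, 12.0), (65.0, 14.0), (80.0, 13.0), (72.0, 12.0)]
front = pareto_front(schedules)
```

A linear-weighting scheduler would collapse each tuple to one score and keep a single winner; the EA instead maintains the whole front, leaving the temperature/performance trade-off explicit.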