934 results for Branch and bound algorithms


Relevance: 100.00%

Abstract:

Information on decomposition of harvest residues may assist in the maintenance of soil fertility in second rotation (2R) hoop pine plantations (Araucaria cunninghamii Aiton ex A. Cunn.) of subtropical Australia. The experiment was undertaken to determine the dynamics of residue decomposition and the fate of residue-derived N. We used N-15-labeled hoop pine foliage, branch, and stem material in microplots over a 30-mo period following harvesting. We examined the decomposition of each component both singly and combined, and used C-13 cross-polarization and magic-angle spinning nuclear magnetic resonance (C-13 CPMAS NMR) to chart C transformations in decomposing foliage. Residue-derived N-15 was immobilized in the 0- to 5-cm soil layer, with approximately 40% N-15 recovery in the soil from the combined residues by the end of the 30-mo period. Total recovery of N-15 in residues and soil varied between 60 and 80% for the combined-residue microplots, with 20 to 40% of the residue N-15 apparently lost. When residues were combined within microplots, the rate of foliage decomposition decreased by 30%, while the rates of branch and stem decomposition increased by 50 and 40%, respectively, compared with the rates for these components when decomposed separately. Residue decomposition studies should therefore include a combined-residue treatment. Based on C-13 CPMAS NMR spectra for decomposing foliage, we obtained good correlations for methoxyl C, aryl C, carbohydrate C, and phenolic C with residue mass, N-15 enrichment, and total N. The ratio of carbohydrate C to methoxyl C may be useful as an indicator of harvest residue decomposition in hoop pine plantations.

Relevance: 100.00%

Abstract:

Obstructive sleep apnea (OSA) is a highly prevalent disease in which upper airways are collapsed during sleep, leading to serious consequences. The gold standard of diagnosis, called polysomnography (PSG), requires a full-night hospital stay connected to over ten channels of measurements requiring physical contact with sensors. PSG is inconvenient, expensive and unsuited for community screening. Snoring is the earliest symptom of OSA, but its potential in clinical diagnosis is not fully recognized yet. Diagnostic systems intent on using snore-related sounds (SRS) face the tough problem of how to define a snore. In this paper, we present a working definition of a snore, and propose algorithms to segment SRS into classes of pure breathing, silence and voiced/unvoiced snores. We propose a novel feature termed the 'intra-snore-pitch-jump' (ISPJ) to diagnose OSA. Working on clinical data, we show that ISPJ delivers OSA detection sensitivities of 86-100% while holding specificity at 50-80%. These numbers indicate that snore sounds and the ISPJ have the potential to be good candidates for a take-home device for OSA screening. Snore sounds have the significant advantage in that they can be conveniently acquired with low-cost non-contact equipment. The segmentation results presented in this paper have been derived using data from eight patients as the training set and another eight patients as the testing set. ISPJ-based OSA detection results have been derived using training data from 16 subjects and testing data from 29 subjects.
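
The paper's exact ISPJ formula is not given in this abstract, so the following is only a hypothetical sketch of an "intra-snore pitch jump" style feature: estimate pitch frame by frame inside one snore episode and take the largest relative jump between consecutive frames. The frame length, pitch range and toy signal are all assumptions made for illustration.

```python
import numpy as np

def frame_pitch(frame, fs, fmin=30.0, fmax=300.0):
    """Crude autocorrelation pitch estimate (Hz) for a single frame."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + np.argmax(ac[lo:hi])          # strongest periodicity in range
    return fs / lag

def pitch_jump(episode, fs, frame_len=0.05):
    """Largest relative pitch change between consecutive frames of an episode."""
    n = int(frame_len * fs)
    frames = [episode[i:i + n] for i in range(0, len(episode) - n, n)]
    pitches = np.array([frame_pitch(f, fs) for f in frames])
    jumps = np.abs(np.diff(pitches)) / pitches[:-1]
    return jumps.max() if len(jumps) else 0.0

# Toy "snore" whose fundamental jumps from 60 Hz to 120 Hz mid-episode.
fs = 8000
t = np.arange(0, 1.0, 1 / fs)
sig = np.where(t < 0.5, np.sin(2 * np.pi * 60 * t), np.sin(2 * np.pi * 120 * t))
print(f"max relative pitch jump: {pitch_jump(sig, fs):.2f}")
```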

Relevance: 100.00%

Abstract:

Human perception is finely tuned to extract structure about the 4D world of time and space as well as properties such as color and texture. Developing intuitions about spatial structure beyond 4D requires exploiting other perceptual and cognitive abilities. One of the most natural ways to explore complex spaces is for a user to actively navigate through them, using local explorations and global summaries to develop intuitions about structure, and then testing the developing ideas by further exploration. This article provides a brief overview of a technique for visualizing surfaces defined over moderate-dimensional binary spaces, by recursively unfolding them onto a 2D hypergraph. We briefly summarize the uses of a freely available Web-based visualization tool, Hyperspace Graph Paper (HSGP), for exploring fitness landscapes and search algorithms in evolutionary computation. HSGP provides a way for a user to actively explore a landscape, from simple tasks such as mapping the neighborhood structure of different points, to seeing global properties such as the size and distribution of basins of attraction or how different search algorithms interact with landscape structure. It has been most useful for exploring recursive and repetitive landscapes, and its strength is that it allows intuitions to be developed through active navigation by the user, and exploits the visual system's ability to detect pattern and texture. The technique is most effective when applied to continuous functions over Boolean variables using 4 to 16 dimensions.
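
As a rough illustration of the unfolding idea (not necessarily the exact layout HSGP uses), the sketch below assigns alternate Boolean dimensions to the x- and y-axes and nests them recursively, so an n-dimensional landscape becomes a grid of 2^n cells; the "one-max" fitness function is just a stand-in.

```python
import numpy as np

def unfold(bits):
    """Map a tuple of bits to (x, y): even-indexed bits nest along x,
    odd-indexed bits along y, with the coarsest dimension outermost."""
    x = y = 0
    for i, b in enumerate(bits):
        if i % 2 == 0:
            x = 2 * x + b
        else:
            y = 2 * y + b
    return x, y

def landscape_to_grid(n_dims, fitness):
    """Evaluate `fitness` on every corner of the n-cube and place the
    values on the unfolded 2-D grid."""
    nx, ny = 2 ** ((n_dims + 1) // 2), 2 ** (n_dims // 2)
    grid = np.zeros((ny, nx))
    for idx in range(2 ** n_dims):
        bits = [(idx >> k) & 1 for k in range(n_dims)]
        x, y = unfold(bits)
        grid[y, x] = fitness(bits)
    return grid

# Example: the "one-max" landscape over 6 Boolean dimensions on an 8 x 8 grid.
print(landscape_to_grid(6, sum))
```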

Relevance: 100.00%

Abstract:

The elastic net and related algorithms, such as generative topographic mapping, are key methods for discretized dimension-reduction problems. At their heart are priors that specify the expected topological and geometric properties of the maps. However, up to now, only a very small subset of possible priors has been considered. Here we study a much more general family originating from discrete, high-order derivative operators. We show theoretically that the form of the discrete approximation to the derivative used has a crucial influence on the resulting map. Using a new and more powerful iterative elastic net algorithm, we confirm these results empirically, and illustrate how different priors affect the form of simulated ocular dominance columns.
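
The general construction behind such priors can be sketched as follows: build a discrete derivative operator of a chosen order by repeated finite differencing and use its squared norm as the quadratic tension term of the elastic-net energy. This is only a minimal illustration of the idea; the specific discretisations compared in the paper are not reproduced here.

```python
import numpy as np

def derivative_operator(m, order):
    """(m - order) x m forward finite-difference matrix of the given order,
    e.g. order 1 -> rows [-1, 1], order 2 -> rows [1, -2, 1]."""
    D = np.eye(m)
    for _ in range(order):
        D = np.diff(D, axis=0)
    return D

def prior_energy(Y, order, beta):
    """Quadratic smoothness prior (beta/2) * ||D Y||^2 on the map Y,
    where the rows of Y are the centroids of consecutive net nodes."""
    D = derivative_operator(len(Y), order)
    return 0.5 * beta * np.sum((D @ Y) ** 2)

# A jagged 1-D map is penalised far more by a second-order prior
# than a smoothly varying one, which changes the form of the fitted map.
jagged = np.array([[0.0], [1.0], [0.0], [1.0], [0.0]])
smooth = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
print(prior_energy(jagged, 2, 1.0), prior_energy(smooth, 2, 1.0))
```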

Relevance: 100.00%

Abstract:

The determination of blood types is essential to perform safe blood transfusions. In emergency situations the "universal donor" blood type is administered. However, this blood type can sometimes cause incompatibilities in the transfusion recipient. A mechatronic prototype was developed to solve this problem. The prototype was built to meet specific goals, incorporating all the necessary components. The obtained solution is close to the final system that will later be produced, at industrial scale, as a medical device. The prototype is a portable and low-cost device and can be used in remote locations. A previously developed computer application is used to operate the mechatronic prototype and obtain test results automatically. It allows image acquisition, processing and analysis based on Computer Vision algorithms, Machine Learning algorithms and deterministic algorithms. The Machine Learning algorithms enable the classification of the occurrence, or lack, of agglutination in the mixture (blood/reagents), and a more reliable and safer methodology, as test data are stored in a database. The work developed allows the administration of a compatible blood type in emergency situations, avoiding the depletion of "universal donor" blood type stocks and reducing the occurrence of human errors in transfusion practice.
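
Purely as a hypothetical sketch of the machine-learning step (the authors' actual features, classifier and image pipeline are not described in this abstract), agglutination could be classified from simple texture statistics of a segmented well image; the synthetic images, features and SVM below are all stand-ins.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def texture_features(img):
    """Crude texture descriptors: agglutinated mixtures tend to show
    clumps, i.e. higher intensity variance and stronger local gradients."""
    gy, gx = np.gradient(img.astype(float))
    return [img.std(), np.abs(gx).mean() + np.abs(gy).mean()]

def fake_well(agglutinated):
    """Synthetic stand-in for a segmented 64x64 grey-scale well image."""
    base = rng.normal(0.5, 0.02, (64, 64))
    if agglutinated:                       # add bright clumps
        for _ in range(30):
            x, y = rng.integers(4, 60, 2)
            base[x - 2:x + 2, y - 2:y + 2] += 0.4
    return np.clip(base, 0, 1)

X = np.array([texture_features(fake_well(a)) for a in ([True] * 50 + [False] * 50)])
y = np.array([1] * 50 + [0] * 50)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```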

Relevance: 100.00%

Abstract:

Multidimensional compound optimization is a new paradigm in the drug discovery process, yielding efficiencies during early stages and reducing attrition in the later stages of drug development. The success of this strategy relies heavily on understanding this multidimensional data and extracting useful information from it. This paper demonstrates how principled visualization algorithms can be used to understand and explore a large data set created in the early stages of drug discovery. The experiments presented are performed on a real-world data set comprising biological activity data and some whole-molecular physicochemical properties. Data visualization is a popular way of presenting complex data in a simpler form. We have applied powerful principled visualization methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), to help the domain experts (screening scientists, chemists, biologists, etc.) understand the data and draw meaningful conclusions. We also benchmark these principled methods against relatively better known visualization approaches, principal component analysis (PCA), Sammon's mapping, and self-organizing maps (SOMs), to demonstrate their enhanced power to help the user visualize the large multidimensional data sets one has to deal with during the early stages of the drug discovery process. The results reported clearly show that the GTM and HGTM algorithms allow the user to cluster active compounds for different targets and understand them better than the benchmarks. An interactive software tool supporting these visualization algorithms was provided to the domain experts. The tool helps the domain experts explore the projections obtained from the visualization algorithms, providing facilities such as parallel coordinate plots, magnification factors, directional curvatures, and integration with industry-standard software. © 2006 American Chemical Society.
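
GTM and HGTM are not part of mainstream Python libraries, so the sketch below only illustrates the benchmark side of the comparison: a PCA projection of a stand-in compound table (rows = compounds, columns = assay activities and physicochemical descriptors; the data and dimensions are invented).

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Stand-in for a compound table: 500 compounds x 12 descriptors.
X = rng.normal(size=(500, 12))

Z = StandardScaler().fit_transform(X)           # put descriptors on a common scale
coords = PCA(n_components=2).fit_transform(Z)   # 2-D projection for plotting
print(coords.shape)                             # (500, 2), one point per compound
```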

Relevance: 100.00%

Abstract:

Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither of them provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established. The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both the links and the analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. Failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
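
The idea of character-substitution rules filtered by a lexical validity check can be sketched as follows; the rule format, lexicon and examples are hypothetical and far simpler than the thesis's actual rules, which also encode relation semantics and rule precedence.

```python
LEXICON = {"generation", "classification", "derivation", "decision",
           "generate", "classify", "derive", "decide"}

RULES = [
    # (verb suffix to remove, noun suffix to attach)
    ("ate", "ation"),      # generate  -> generation
    ("ify", "ification"),  # classify  -> classification
    ("e",   "ation"),      # derive    -> derivation
    ("de",  "sion"),       # decide    -> decision (irregular substitution)
]

def derive_nouns(verb):
    """Apply each substitution rule and keep only outputs found in the
    lexicon (a crude stand-in for the thesis's lexical validity check)."""
    out = []
    for src, dst in RULES:
        if verb.endswith(src):
            candidate = verb[:-len(src)] + dst
            if candidate in LEXICON:
                out.append(candidate)
    return out

for v in ["generate", "classify", "derive", "decide"]:
    print(v, "->", derive_nouns(v))
```

Note how the spurious candidate "generatation" produced by applying the third rule to "generate" is rejected by the lexical validity check, while the irregular decide -> decision pair is still captured by a substitution rule rather than by naive segmentation.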

Relevance: 100.00%

Abstract:

PURPOSE. A methodology for noninvasively characterizing the three-dimensional (3-D) shape of the complete human eye is not currently available for research into ocular diseases that have a structural substrate, such as myopia. A novel application of a magnetic resonance imaging (MRI) acquisition and analysis technique is presented that, for the first time, allows the 3-D shape of the eye to be investigated fully. METHODS. The technique involves the acquisition of a T2-weighted MRI, which is optimized to reveal the fluid-filled chambers of the eye. Automatic segmentation and meshing algorithms generate a 3-D surface model, which can be shaded with morphologic parameters such as distance from the posterior corneal pole and deviation from sphericity. Full details of the method are illustrated with data from 14 eyes of seven individuals. The spatial accuracy of the calculated models is demonstrated by comparing the MRI-derived axial lengths with values measured in the same eyes using interferometry. RESULTS. The color-coded eye models showed substantial variation in the absolute size of the 14 eyes. Variations in the sphericity of the eyes were also evident, with some appearing approximately spherical whereas others were clearly oblate and one was slightly prolate. Nasal-temporal asymmetries were noted in some subjects. CONCLUSIONS. The MRI acquisition and analysis technique allows a novel way of examining 3-D ocular shape. The ability to stratify and analyze eye shape, ocular volume, and sphericity will further extend the understanding of which specific biometric parameters predispose emmetropic children subsequently to develop myopia. Copyright © Association for Research in Vision and Ophthalmology.
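
One way to quantify "deviation from sphericity" for such a surface model is to fit a sphere to the segmented points by least squares and report the signed radial residuals; the sketch below uses a synthetic, slightly oblate globe, and the authors' exact morphologic parameters may be defined differently.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit: solve |p|^2 = 2 p.c + (r^2 - |c|^2)."""
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, d = sol[:3], sol[3]
    radius = np.sqrt(d + centre @ centre)
    return centre, radius

# Synthetic, slightly oblate "globe": 12 mm equatorial, 11 mm axial radius.
rng = np.random.default_rng(2)
u, v = rng.uniform(0, np.pi, 5000), rng.uniform(0, 2 * np.pi, 5000)
pts = np.c_[12 * np.sin(u) * np.cos(v), 12 * np.sin(u) * np.sin(v), 11 * np.cos(u)]

centre, radius = fit_sphere(pts)
deviation = np.linalg.norm(pts - centre, axis=1) - radius   # signed residual, mm
print(f"fitted radius {radius:.2f} mm, max |deviation| {np.abs(deviation).max():.2f} mm")
```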

Relevance: 100.00%

Abstract:

The identification and quantification of spin adducts and their reduction products (>NOH, >NOR) formed from nitroso compounds and nitrones in EPR and PP during spin trapping techniques have been examined. The nitroxyl yield and the percentage of polymer-bound nitroxyl formed from these spin traps were found to depend strongly on the nature of the spin trap and radical generator, the processing temperature, and the irradiation time. The nitroxyl yield and % bound nitroxyl of the spin traps improved significantly in the presence of Trigonox 101 and 2-OH benzophenone. The effects of these spin traps used as normal additives, and of their spin adducts in the form of an EPR masterbatch, on the photo- and thermal oxidation of PP have been studied. Aliphatic nitroso compounds were found to have much better photo-antioxidant activity than nitrones and aromatic nitroso compounds, and their antioxidant activity improved appreciably in the presence of a free radical generator, Trigonox 101, before and after extraction. The effect of heat, light and an oxidising agent (meta-dichloroperbenzoic acid) on the nitroxyl yield of nitroso-tertiary-butane in solution, as a model study, has been investigated, and a cyclic regenerative process involving both chain-breaking acceptor and chain-breaking donor processes has been proposed.

Relevance: 100.00%

Abstract:

INTAMAP is a web processing service for the automatic interpolation of measured point data. Requirements were (i) using open standards for spatial data such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an open-source solution. The system couples the 52-North web processing service, accepting data in the form of an observations and measurements (O&M) document, with a computing back-end realized in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a new markup language for encoding uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropies and extreme values. In the light of the INTAMAP experience, we discuss the lessons learnt.
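
INTAMAP's computing back-end is R, so the following is only a loose Python analogue of the core idea, interpolation that returns a predictive error distribution at unmeasured locations, using a Gaussian process; the kernel, data and locations are invented for the sketch.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X_obs = rng.uniform(0, 100, size=(40, 2))              # measured point locations
y_obs = np.sin(X_obs[:, 0] / 15) + 0.1 * rng.normal(size=40)

gp = GaussianProcessRegressor(RBF(10.0) + WhiteKernel(0.01), normalize_y=True)
gp.fit(X_obs, y_obs)

X_new = np.array([[50.0, 50.0], [90.0, 10.0]])          # prediction locations
mean, sd = gp.predict(X_new, return_std=True)           # interpolation + error
print(mean, sd)                                         # sd plays the role of the encoded uncertainty
```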

Relevance: 100.00%

Abstract:

As systems for computer-aided design and production of mechanical parts have developed, there has arisen a need for techniques for the comprehensive description of the desired part, including its 3-D shape. The creation and manipulation of shapes is generally known as geometric modelling. It is desirable that links be established between geometric modellers and machining programs. Currently, unbounded APT and some bounded geometry systems are widely used in manufacturing industry for machining operations such as milling, drilling, boring and turning, applied mainly to engineering parts. APT systems, however, are presently only linked to wire-frame drafting systems. The combination of a geometric modeller and APT will provide a powerful manufacturing system for industry from the initial design right through to part manufacture using NC machines. This thesis describes a recently developed interface (ROMAPT) between a bounded geometry modeller (ROMULUS) and an unbounded NC processor (APT). A new set of theoretical functions and practical algorithms for the computer-aided manufacturing of 3D solid geometric models has been investigated. This work has led to the development of a sophisticated computer program, ROMAPT, which provides a new link between CAD (in the form of the geometric modeller ROMULUS) and CAM (in the form of the APT NC system). ROMAPT has been used to machine some engineering prototypes successfully, both in soft foam material and in aluminium. It has thus been demonstrated that the theory and algorithms developed by the author for the computer-aided manufacturing of 3D solid models are both valid and applicable. ROMAPT allows the full potential of a solid geometric modeller (ROMULUS) to be further exploited for NC applications without requiring major investment in a new NC processor. ROMAPT supports output in APT-AC, APT4 and the CAM-I SSRI NC languages.
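
As a hypothetical illustration of the kind of link ROMAPT provides (not its actual output routines or dialect handling), cutter locations derived from a solid model can be emitted as minimal APT-style source; the statement selection and formatting below are assumptions.

```python
def apt_program(points, feed=200.0):
    """Emit a minimal APT-style program from a list of (x, y, z) cutter locations."""
    lines = ["PARTNO/EXAMPLE", f"FEDRAT/{feed:.1f}", "SPINDL/ON"]
    x0, y0, z0 = points[0]
    lines.append(f"FROM/{x0:.3f},{y0:.3f},{z0:.3f}")
    for x, y, z in points[1:]:
        lines.append(f"GOTO/{x:.3f},{y:.3f},{z:.3f}")   # point-to-point motion
    lines += ["SPINDL/OFF", "FINI"]
    return "\n".join(lines)

print(apt_program([(0, 0, 5), (10, 0, 5), (10, 10, 2)]))
```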

Relevance: 100.00%

Abstract:

The techno-economic implications of recycling the components of mixed plastics waste have been studied in a two-part investigation. (a) In an economic survey of the prospects for plastics recycling, the plastics waste arisings from the retailing, building, automotive, light engineering and chemical industries were surveyed by means of questionnaires and interviews. This was partially successful and indicated that very considerable quantities of relatively clean plastics packaging were available in major department chains and household stores. Collection systems for such sources, which do not lead to any extra cost, have been suggested. However, the household collection of plastics waste was found to be uneconomic due to the high cost of collection and transportation and the lack of markets for the end products. (b) In a technical study of blends of PE/PP and PE/PS, which are found in admixture in waste plastics, it has been shown that they exhibit poor mechanical properties due to incompatibility. Consequently, reprocessing of such unsegregated blends results in products of little technological value. The inclusion of some commercial block and graft copolymers which behave as solid phase dispersants (SPDs) increases the toughness of the blends (e.g. EPDM in the PE/PP blend and SBS in the PE/PS blend). EPDM was also found to be very effective for improving the toughness of single-component polypropylene. However, the improved technical properties of such blends were accompanied by a fast rate of photo-oxidation and loss of toughness due to the presence of unsaturation in the SPDs. The change in mechanical properties occurring during oven ageing and ultra-violet light accelerated weathering of these binary and ternary blends was followed by a viscoelastometric technique (Rheovibron) over a wide range of temperatures, by impact resistance at room temperature (20 ± 1°C) and by changes in functional groups (i.e. carbonyl and trans-1,4-polybutadiene). The heat and light stability of single and mixed plastics, in which thiol antioxidants were bound to the SPD segment, have also been studied and compared with conventional antioxidants. The long-term performance of the mixed plastics containing SPDs was improved significantly by the use of conventional and bound antioxidants. It is concluded that an estimated 30,000 tonnes/year of plastics waste is available from department chains and household stores which can be converted to useful end products. This justifies pilot experiments, in collaboration with supermarkets, recyclers and converters, by use of low-cost SPDs and additives designed to make the materials more compatible.

Relevance: 100.00%

Abstract:

This thesis presents results of experiments designed to study the effect of applying electrochemical chloride extraction (ECE) to a range of different hardened cement pastes. Rectangular prism specimens of hydrated cement paste containing sodium chloride at different concentrations were subjected to electrolysis between the embedded steel cathodes and external anodes of activated titanium mesh. The cathodic current density used was in the range of 1 to 5 A/m2, with treatment periods of 4 to 12 weeks. After treatment, the specimens were cut into sections which were subjected to pore-solution expression and analysis in order to determine changes in the distribution of free and total ionic species. The effect of the ECE treatment on the physical and microstructural properties of the cements was studied by using microhardness and MIP techniques. XRD was employed to look at the possibility of ettringite redistribution as a result of the accumulation of soluble sulphate ions in the cement matrix near the cathode during ECE. Remigration of the chloride remaining after ECE treatment, and the distribution of other ions, were studied by analysing specimens which had been stored for several months after undergoing ECE treatment. The potentials of the steel cathodes were also monitored over this period to detect any changes in their corrosion state. The main findings of this research were as follows. (1) ECE, as applied in this investigation, was capable of removing both free and bound chloride. The removal process occurred relatively quickly and an equilibrium between free and bound chlorides in the specimens was maintained throughout. At the same time, alkali concentrations in the pore solution near the steel cathode increased. The soluble sulphate ionic concentration near the cathode also increased due to the local increase in the pH of the pore solution. (2) ECE caused some changes in the physical and microstructural properties of the cement matrix. However, these changes were minimal and, in the case of microhardness, the results were highly scattered. Ettringite in the bulk material well away from the cathode was found not to increase significantly with the increase in charge passed. (3) Remigration of chloride and other ionic species occurred slowly after cessation of ECE, with a resultant gradual increase in the Cl-/OH- ratio around the steel. (4) The removal of chloride from blended cements was slower than that from OPC.

Relevance: 100.00%

Abstract:

This thesis considers the computer simulation of moist agglomerate collisions using the discrete element method (DEM). The study is confined to pendular-state moist agglomerates, in which liquid is present either as adsorbed immobile films or as pendular liquid bridges, and the interparticle force is modelled as the adhesive contact force plus the interstitial liquid bridge force. Algorithms used to model the contact force due to surface adhesion, tangential friction and particle deformation have been derived by other researchers and are briefly described in the thesis. A theoretical study of the pendular liquid bridge force between spherical particles has been made, and algorithms for modelling the pendular liquid bridge force between spherical particles have been developed and incorporated into the Aston version of the DEM program TRUBAL. It has been found that, for static liquid bridges, the more explicit criterion for specifying the stable solution and critical separation is provided by the total free energy. The critical separation is given by the cube root of the liquid bridge volume to a good approximation, and the 'gorge method' of evaluation based on the toroidal approximation leads to errors in the calculated force of less than 10%. Three-dimensional computer simulations of an agglomerate impacting orthogonally with a wall are reported. The results demonstrate the effectiveness of adding viscous binder to prevent attrition, a common practice in process engineering. Results of simulated agglomerate-agglomerate collisions show that, for colinear agglomerate impacts, there is an optimum velocity which results in a near-spherical shape of the coalesced agglomerate and, hence, minimises attrition due to subsequent collisions. The relationship between the optimum impact velocity and the liquid viscosity and surface tension is illustrated. The effect of varying the angle of impact on the coalescence/attrition behaviour is also reported. (DX 187, 340).
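
The 'gorge method' estimate mentioned above, together with the cube-root rupture criterion, can be sketched as follows under the standard toroidal-approximation conventions (meridional radius r1, neck radius r2); the numbers are illustrative only and this is not the thesis's TRUBAL code.

```python
import numpy as np

def gorge_force(r1, r2, gamma):
    """Toroidal-approximation liquid-bridge force evaluated at the neck:
    surface-tension term plus Laplace-pressure term.
    r1: meridional radius, r2: neck (gorge) radius, gamma: surface tension."""
    delta_p = gamma * (1.0 / r1 - 1.0 / r2)       # pressure deficiency in the bridge
    return 2.0 * np.pi * r2 * gamma + np.pi * r2 ** 2 * delta_p

def critical_separation(bridge_volume):
    """Rupture distance, approximated (as the abstract notes) by the cube
    root of the liquid bridge volume."""
    return bridge_volume ** (1.0 / 3.0)

# Illustrative numbers: water bridge, gamma = 0.072 N/m, neck radius 50 um,
# meridional radius 20 um, bridge volume 1e-12 m^3.
print(gorge_force(20e-6, 50e-6, 0.072))           # force in newtons
print(critical_separation(1e-12))                 # rupture distance in metres
```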

Relevance: 100.00%

Abstract:

The study developed statistical techniques to evaluate visual field progression for use with the Humphrey Field Analyzer (HFA). The long-term fluctuation (LF) was evaluated in stable glaucoma. The magnitude of both LF components showed little relationship with MD, CPSD and SF. An algorithm was proposed for determining the clinical necessity for a confirmatory follow-up examination. The between-examination variability was determined for the HFA Standard and FASTPAC algorithms in glaucoma. FASTPAC exhibited greater between-examination variability than the Standard algorithm across the range of sensitivities and with increasing eccentricity. The difference in variability between the algorithms had minimal clinical significance. The effect of repositioning the baseline in the Glaucoma Change Probability Analysis (GCPA) was evaluated. The global baseline of the GCPA limited the detection of progressive change at a single stimulus location. A new technique, pointwise univariate linear regression (ULR) of absolute sensitivity, and of pattern deviation, against time to follow-up, was developed. In each case, pointwise ULR was more sensitive to localised progressive changes in sensitivity than ULR of MD alone. Small changes in sensitivity were more readily detected by pointwise ULR than by the GCPA. A comparison between the outcomes of pointwise ULR for all fields and for the last six fields revealed linear and curvilinear declines in absolute sensitivity and pattern deviation. A method for delineating progressive loss in glaucoma, based upon the error in the forecasted sensitivity of a multivariate model, was developed. Multivariate forecasting exhibited little agreement with GCPA in glaucoma but showed promise for monitoring visual field progression in OHT patients. The recovery of sensitivity in optic neuritis over time was modelled with a cumulative Gaussian function. The rate and level of recovery were greater in the peripheral than the central field. Probability models to forecast the field of recovery were proposed.
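
A minimal sketch of pointwise ULR, fitting an ordinary least-squares slope of sensitivity against follow-up time at each stimulus location separately, is shown below; the visit schedule, noise level and the -1 dB/year flagging threshold are hypothetical, not values from the thesis.

```python
import numpy as np

def pointwise_ulr(times, fields):
    """times: (n_visits,) years since baseline.
    fields: (n_visits, n_locations) sensitivities in dB.
    Returns the per-location slope (dB/year) and intercept."""
    design = np.c_[times, np.ones_like(times)]
    coef, *_ = np.linalg.lstsq(design, fields, rcond=None)
    return coef[0], coef[1]                      # slopes, intercepts

# Synthetic example: 8 visits over 4 years, 52 non-blind-spot 24-2 locations.
rng = np.random.default_rng(4)
t = np.linspace(0, 4, 8)
base = rng.uniform(25, 33, 52)
true_slope = np.where(rng.random(52) < 0.1, -1.5, 0.0)   # 10% of points progressing
fields = base + np.outer(t, true_slope) + rng.normal(0, 1.0, (8, 52))

slopes, _ = pointwise_ulr(t, fields)
print("locations flagged as progressing:", np.sum(slopes < -1.0))
```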