824 results for "Pruning techniques"
Abstract:
In this article we describe a feature extraction algorithm for pattern classification based on Bayesian decision boundaries and pruning techniques. The proposed method optimizes MLP neural classifiers by retaining only those neurons in the hidden layer that really contribute to correct classification. We also propose a method that defines a plausible number of hidden-layer neurons based on stem-and-leaf plots of the training samples. Experimental investigation demonstrates the efficiency of the proposed method. © 2002 IEEE.
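To make the idea concrete (this is an illustrative sketch, not the authors' exact algorithm), a hidden neuron's contribution to correct classification can be estimated by silencing it and measuring the drop in validation accuracy; neurons whose removal costs nothing are pruned. A minimal sketch with scikit-learn, where the data, network size, and threshold are all assumptions:

```python
# Illustrative ablation-based pruning of MLP hidden neurons.
# Data, network size, and the pruning threshold are assumptions, not the paper's.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
base_acc = clf.score(X_val, y_val)

n_hidden = clf.coefs_[1].shape[0]
kept = []
for h in range(n_hidden):
    saved = clf.coefs_[1][h].copy()
    clf.coefs_[1][h] = 0.0                      # silence neuron h's output weights
    drop = base_acc - clf.score(X_val, y_val)   # accuracy lost without neuron h
    clf.coefs_[1][h] = saved
    if drop > 1e-3:                             # neuron contributes to correct classification
        kept.append(h)
print(f"kept {len(kept)} of {n_hidden} hidden neurons")
```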
Abstract:
Planning in realistic domains typically involves reasoning under uncertainty, operating under time and resource constraints, and finding the optimal subset of goals to work on. Creating optimal plans that consider all of these features is a computationally complex, challenging problem. This dissertation develops an AO* search-based planner named CPOAO* (Concurrent, Probabilistic, Over-subscription AO*) which incorporates durative actions, time and resource constraints, concurrent execution, over-subscribed goals, and probabilistic actions. To handle concurrent actions, action combinations rather than individual actions are taken as plan steps. Plan optimization is explored by adding two novel aspects to plans. First, parallel steps that serve the same goal are used to increase the plan’s probability of success; traditionally, only parallel steps that serve different goals are used, to reduce plan execution time. Second, actions that are executing but are no longer useful can be terminated to save resources and time, whereas conventional planners assume that all actions that were started will be carried out to completion. To reduce the size of the search space, several domain-independent heuristic functions and pruning techniques were developed. The key ideas are to exploit dominance relations among candidate action sets and to develop relaxed planning graphs to estimate the expected rewards of states. This thesis contributes (1) an AO*-based planner that generates parallel plans, (2) domain-independent heuristics to increase planner efficiency, and (3) the ability to execute redundant actions and to terminate useless actions to increase plan efficiency.
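The dominance idea for candidate action sets can be sketched independently of the planner: a combination is discarded when another feasible combination costs no more and promises at least as much reward, and is strictly better in one of the two. A toy Pareto-pruning sketch follows; the cost/reward numbers are invented for illustration and much simpler than CPOAO*'s durative, probabilistic model:

```python
# Toy dominance pruning over candidate action combinations.
# Costs/rewards are invented; CPOAO*'s model (durative, probabilistic actions
# under time and resource constraints) is far richer than this.
from itertools import combinations

ACTIONS = {"a": (2, 5.0), "b": (3, 4.0), "c": (4, 6.0), "d": (1, 1.0)}  # name: (cost, reward)
BUDGET = 5

def stats(combo):
    return (sum(ACTIONS[a][0] for a in combo),
            sum(ACTIONS[a][1] for a in combo))

feasible = [c for r in range(1, len(ACTIONS) + 1)
            for c in combinations(ACTIONS, r)
            if stats(c)[0] <= BUDGET]

def dominates(o, c):
    oc, orw = stats(o)
    cc, crw = stats(c)
    return oc <= cc and orw >= crw and (oc < cc or orw > crw)

frontier = [c for c in feasible
            if not any(dominates(o, c) for o in feasible if o != c)]
print(f"{len(feasible)} feasible combinations -> {len(frontier)} after dominance pruning")
```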
Abstract:
This paper describes a novel approach to phonotactic LID where, instead of using soft counts based on phoneme lattices, we use posteriograms to obtain n-gram counts. The high-dimensional vectors of counts are reduced to low-dimensional units, for which we adapt the commonly used term i-vectors. The reduction is based on multinomial subspace modeling and is designed to work in the total-variability space. The proposed technique was tested on the NIST 2009 LRE set with better results than a system based on soft counts (Cavg on 30s: 3.15% vs 3.43%), and with very good results when fused with an acoustic i-vector LID system (Cavg on 30s: 2.4% for the acoustic system alone vs 1.25% for the fusion). The proposed technique is also compared with another low-dimensional projection system based on PCA. In comparison with the original soft counts, the proposed technique provides better results, reduces the problems caused by sparse counts, and avoids the need for pruning techniques when creating the lattices.
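The counting step can be pictured as follows: given a posteriogram (per-frame posterior probabilities over phoneme units), the expected count of each bigram is accumulated from products of consecutive frames' posteriors, so no lattices, and hence no lattice pruning, are needed. A schematic sketch; the frame-pair accumulation and the dimensions are illustrative simplifications of the actual segment-level counting:

```python
# Schematic sketch: expected bigram counts straight from a posteriogram.
# Dimensions are arbitrary, and counting over raw frame pairs is a deliberate
# simplification of segment-level soft counting.
import numpy as np

rng = np.random.default_rng(0)
T, P = 200, 40                                  # frames, phoneme inventory size
post = rng.dirichlet(np.ones(P), size=T)        # row t: posterior over phonemes at frame t

bigram = np.zeros((P, P))
for t in range(T - 1):
    bigram += np.outer(post[t], post[t + 1])    # soft co-occurrence of consecutive units

counts = bigram.ravel()                         # high-dimensional count vector (P*P,)
counts /= counts.sum()                          # normalized before subspace (i-vector) modeling
print(counts.shape)                             # (1600,)
```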
Abstract:
A RET network consists of a network of photo-active molecules, called chromophores, that can participate in inter-molecular energy transfer known as resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light-harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called a closed-diffusive exciton valve (C-DEV), in which the input-to-output transfer function is controlled by an external energy source, similar to a semiconductor transistor such as the MOSFET. Due to their biocompatibility, molecular devices like the C-DEV can be used to introduce computing power in biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics of RET devices is stochastic in nature, making them suitable for stochastic computing, in which generating true random distributions is critical.
To determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design-space pruning techniques and built-in simulation tools. We show that our C-DEV performs 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further, and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications.
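The shape of such a user-guided pruning pass can be sketched generically: cheap, user-supplied constraints eliminate candidate configurations before any expensive simulation runs. Everything below (parameter names, the transfer-range cutoff, dye labels) is hypothetical and only illustrates the pattern:

```python
# Generic sketch of user-guided design-space pruning ahead of costly simulation.
# Parameter names, dye labels, and the cutoff are hypothetical illustrations.
from itertools import product

SPACINGS_NM = [2, 4, 6, 8, 10]
DYES = ["Cy3", "Cy3.5", "Cy5"]
MAX_TRANSFER_NM = 6          # assumed rule of thumb: RET efficiency collapses beyond this

candidates = [{"spacing_nm": s, "input_dye": a, "output_dye": b}
              for s, a, b in product(SPACINGS_NM, DYES, DYES)]

def passes_user_constraints(c):
    # User-guided rules: keep geometries within transfer range, with distinct dyes.
    return c["spacing_nm"] <= MAX_TRANSFER_NM and c["input_dye"] != c["output_dye"]

pruned = [c for c in candidates if passes_user_constraints(c)]
print(f"{len(candidates)} raw configurations -> {len(pruned)} passed to simulation")
```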
We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.
Abstract:
Edge-labeled graphs have proliferated rapidly over the last decade due to the increased popularity of social networks and the Semantic Web. In social networks, relationships between people are represented by edges, and each edge is labeled with a semantic annotation. Hence, a huge single graph can express many different relationships between entities. The Semantic Web represents each single fragment of knowledge as a triple (subject, predicate, object), which is conceptually identical to an edge from subject to object labeled with a predicate. A set of triples constitutes an edge-labeled graph on which knowledge inference is performed. Subgraph matching has been extensively used as a query language for patterns in the context of edge-labeled graphs. For example, in social networks, users can specify a subgraph matching query to find all people that have certain neighborhood relationships. Heavily used fragments of the SPARQL query language for the Semantic Web, and the graph queries of other graph DBMSs, can also be viewed as subgraph matching over large graphs. Though subgraph matching has been extensively studied as a query paradigm in the Semantic Web and in social networks, a user can get a large number of answers in response to a query. These answers can be shown to the user in accordance with an importance ranking. In this thesis proposal, we present four different scoring models along with scalable algorithms to find the top-k answers via a suite of intelligent pruning techniques. The suggested models consist of a practically important subset of the SPARQL query language augmented with some additional useful features. The first model, called Substitution Importance Query (SIQ), identifies the top-k answers whose scores are calculated from matched vertices' properties in each answer in accordance with a user-specified notion of importance. The second model, called Vertex Importance Query (VIQ), identifies important vertices in accordance with a user-defined scoring method that builds on top of various subgraphs articulated by the user. Approximate Importance Query (AIQ), our third model, allows partial and inexact matchings and returns the top-k of them according to user-specified approximation terms and scoring functions. In the fourth model, called Probabilistic Importance Query (PIQ), a query consists of several sub-blocks: one mandatory block that must be mapped and other blocks that can be opportunistically mapped. The probability is calculated from various aspects of the answers, such as the number of mapped blocks and the vertices' properties in each block, and the top-k most probable answers are returned. An important distinguishing feature of our work is that we allow the user a great amount of freedom in specifying: (i) what pattern and approximation he considers important, (ii) how to score answers, irrespective of whether they are vertices or substitutions, and (iii) how to combine and aggregate scores generated by multiple patterns and/or multiple substitutions. Because so much power is given to the user, indexing is more challenging than in situations where additional restrictions are imposed on the queries the user can ask. The proposed algorithms for the first model can also be used to answer SPARQL queries with ORDER BY and LIMIT, and the method for the second model also works for SPARQL queries with GROUP BY, ORDER BY and LIMIT. We test our algorithms on multiple real-world graph databases, showing that our algorithms are far more efficient than popular triple stores.
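One pruning ingredient common to such top-k algorithms can be sketched in isolation: keep a min-heap of the k best complete answers found so far, and cut any partial match whose optimistic score bound cannot beat the current k-th best. The generator, scorer, and bound below are placeholders, not the thesis's actual algorithms:

```python
# Sketch of threshold-style top-k pruning over partial answers.
# `score`, `upper_bound`, `expand`, and `is_complete` are placeholder callables,
# not the actual SIQ/VIQ/AIQ/PIQ machinery.
import heapq
from itertools import count

def topk_with_pruning(start, k, score, upper_bound, expand, is_complete):
    heap, tie = [], count()        # min-heap; heap[0][0] is the current k-th best score
    stack = list(start)
    while stack:
        p = stack.pop()
        if len(heap) == k and upper_bound(p) <= heap[0][0]:
            continue               # prune: no completion of p can enter the top-k
        if is_complete(p):
            entry = (score(p), next(tie), p)
            if len(heap) < k:
                heapq.heappush(heap, entry)
            else:
                heapq.heappushpop(heap, entry)
        else:
            stack.extend(expand(p))
    return [(s, p) for s, _, p in sorted(heap, reverse=True)]
```

The tighter the `upper_bound`, the earlier partial matches are cut; with a trivial bound the sketch degenerates to exhaustive search.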
Abstract:
Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 November 2015, Lille, France.
Abstract:
OBJECTIVE. The main goal of this paper is to obtain a classification model based on feed-forward multilayer perceptrons in order to improve postpartum depression prediction during the 32 weeks after childbirth, with high sensitivity and specificity, and to develop a tool to be integrated into a decision support system for clinicians. MATERIALS AND METHODS. Multilayer perceptrons were trained on data from 1397 women who had just given birth at seven Spanish general hospitals, including clinical, environmental and genetic variables. A prospective cohort study was conducted just after delivery, at 8 weeks and at 32 weeks after delivery. The models were evaluated with the geometric mean of accuracies using a hold-out strategy. RESULTS. Multilayer perceptrons showed good performance (high sensitivity and specificity) as predictive models for postpartum depression. CONCLUSIONS. The use of these models in a decision support system can be clinically evaluated in future work. The analysis of the models by pruning leads to a qualitative interpretation of the influence of each variable, which is of interest for clinical protocols.
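For reference, the geometric mean of accuracies used in the evaluation is the square root of sensitivity times specificity, which penalizes models that score well on only one class. A minimal computation with made-up predictions:

```python
# Minimal sketch: geometric mean of per-class accuracies on made-up predictions.
import numpy as np

y_true = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1])   # invented labels
y_pred = np.array([0, 0, 1, 0, 1, 1, 1, 0, 1, 1])   # invented predictions

sensitivity = np.mean(y_pred[y_true == 1] == 1)     # accuracy on the positive class
specificity = np.mean(y_pred[y_true == 0] == 0)     # accuracy on the negative class
g_mean = np.sqrt(sensitivity * specificity)
print(f"sens={sensitivity:.2f} spec={specificity:.2f} g-mean={g_mean:.2f}")
```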
Abstract:
The grapevine moth Lobesia botrana (Denis & Schiffermüller) (Lepidoptera: Tortricidae) is a key pest of grapevines in Greece. As part of a broader study on integrated pest management, the effects of different cultural methods on the establishment and survival of L. botrana were investigated, specifically: application of different nitrogen levels (30 and 100 units of ammonium sulfate, or 70 units of Agrobiosol); summer leaf and shoot pruning; and application of growth regulators (Regalis, prohexadione-calcium; or Falgro, gibberellic acid). There were significant differences among the three levels of N application. The lowest L. botrana infestation rates were found in plots treated with 30 units of (NH4)2SO4 and in plots that received some summer pruning. Following the application of plant growth regulators, the lowest L. botrana infestation levels occurred in the plots treated with Regalis or Falgro at the manufacturers' recommended concentrations. On vines where growth regulators had been applied, the clusters had fewer berries than those not treated with growth regulators.
Abstract:
The introduction of dwarfing rootstocks in apple crops has led to a new concept of intensive planting systems, with the aim of producing early high yields and recovering the high initial investment. Although yield is an important aspect for the grower, the consumer has become demanding with regard to fruit quality and is generally attracted by appearance. To fulfil the consumer’s expectations, the grower may need to choose a proper training system along with an ideal pruning technique, which ensure good light distribution in different parts of the canopy and marketable fruit quality in terms of size and skin colour. Even when these aspects are addressed, fruits may not reach the proper ripening stage within the canopy, because ripening is often heterogeneous. To describe the variability present in a tree, software (PlantToon®) was used to recreate the tree architecture in 3D in the two training systems. The ripening stage of each fruit was determined using a non-destructive device (DA-Meter), thus allowing estimation of fruit ripening variability. This study deals with some of the main parameters that can influence fruit quality and ripening stage within the canopy, and with orchard management techniques that can improve fruit ripening homogeneity. Significant differences in fruit quality were found within the canopies due to fruit position, flowering time and bud wood age. The Bi-axis appeared to be suitable for high-density planting, even though its fruit quality traits were often similar to those obtained with a Slender Spindle, suggesting similar fruit light availability within the canopies. Crop load was confirmed to be an important factor influencing fruit quality, as was the innovative "Click" pruning method in intensive planting systems.
Abstract:
Small-scale village woodlots of less than 0.5 ha are the preferred use of land for local farmers with extra land in the village of Isangati, a small community located in the southern highlands of Tanzania. Farmers view woodlots as lucrative investments that do not involve intensive labor or time. The climate is ideal for the types of trees grown, and the risks are minimal, with no serious threats from insects, fires, thieves, or grazing livestock. It was hypothesized that small-scale village woodlot owners were not maximizing timber outputs with their current timber stand management and harvesting techniques. Personal interviews were conducted over a five-month period, and field data were collected at each farmer’s woodlots over a seven-month period. Woodlot field data included woodlot size, number of trees, tree species, tree height, dbh, age, and spacing. The results indicated that the lack of proper woodlot management techniques results in failure to fully capitalize on the investment in woodlots. While farmers should continue with their current harvesting rotations, the reasons for not maximizing tree growth include close spacing (2 m x 2 m), no tree thinning, extreme pruning (60% of the tree), and little to no weeding. Through education and hands-on woodlot management workshops, the farmers could increase their timber output and the value of their woodlots.
Abstract:
This paper presents the application of a variety of techniques to study jet substructure. The performance of various modified jet algorithms, or jet grooming techniques, for several jet types and event topologies is investigated for jets with transverse momentum larger than 300 GeV. Jets subjected to the mass-drop filtering, trimming, and pruning algorithms are found to have reduced sensitivity to multiple proton-proton interactions, to be more stable at high luminosity, and to improve the physics potential of searches for heavy boosted objects. Studies of the expected discrimination power of jet mass and jet substructure observables in searches for new physics are also presented. Event samples enriched in boosted W and Z bosons and top-quark pairs are used to study both the individual jet invariant mass scales and the efficacy of algorithms to tag boosted hadronic objects. The analyses presented use the full 2011 ATLAS dataset, corresponding to an integrated luminosity of 4.7 ± 0.1 fb^-1 from proton-proton collisions produced by the Large Hadron Collider at a center-of-mass energy of √s = 7 TeV.
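Schematically, jet pruning re-clusters a jet and vetoes any recombination that is both soft and wide-angle: a merging of proto-jets i and j is discarded when z = min(pT_i, pT_j)/(pT_i + pT_j) < z_cut and ΔR_ij > D_cut, keeping only the harder proto-jet. A toy sketch of that veto test alone; the cut values are typical choices rather than the exact ATLAS configuration, and the scalar pT sum approximates the merged proto-jet pT (real analyses use FastJet):

```python
# Toy sketch of the pruning veto applied during jet re-clustering.
# Cut values are typical, not the exact ATLAS settings; the scalar pT sum
# approximates the merged proto-jet pT. Real analyses use FastJet.
import math

Z_CUT, D_CUT = 0.10, 0.50

def delta_r(eta1, phi1, eta2, phi2):
    dphi = math.remainder(phi1 - phi2, 2 * math.pi)  # wrap angle into (-pi, pi]
    return math.hypot(eta1 - eta2, dphi)

def veto_merging(pt1, eta1, phi1, pt2, eta2, phi2):
    z = min(pt1, pt2) / (pt1 + pt2)                  # softness of the merging
    dr = delta_r(eta1, phi1, eta2, phi2)             # angular separation
    return z < Z_CUT and dr > D_CUT                  # True: drop the softer proto-jet

# A soft (z ~ 0.05), wide-angle (dR ~ 1.4) pairing is vetoed:
print(veto_merging(400.0, 0.0, 0.0, 20.0, 0.8, 1.2))  # True
```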
Abstract:
Graph automorphism (GA) is a classical problem in which the objective is to compute the automorphism group of an input graph. In this work we propose four novel techniques to speed up algorithms that solve the GA problem by exploring a search tree. They increase the performance of the algorithm by reducing the depth of the search tree and by pruning it effectively. We formally prove that a GA algorithm that uses these techniques correctly computes the automorphism group of the input graph. We also describe how the techniques have been incorporated into the GA algorithm conauto, as conauto-2.03, with at most an additive polynomial increase in its asymptotic time complexity. We have experimentally evaluated the impact of each of the techniques with several graph families. Each technique by itself significantly reduces the number of processed nodes of the search tree in some subset of graphs, which justifies the use of each of them. When they are applied together, their effects combine, leading to reductions in the number of processed nodes in most graphs. This is also reflected in a reduction of the running time, which is substantial in some graph families.
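The flavor of search-tree pruning in this setting can be shown with a deliberately naive automorphism search: vertices are mapped one at a time, and any branch whose partial mapping already violates an adjacency relation is cut immediately instead of being extended to a full permutation. The sketch below is far weaker than conauto's techniques and only illustrates the principle:

```python
# Naive backtracking automorphism search with adjacency-based pruning.
# Branches are cut as soon as the partial mapping breaks (non-)adjacency;
# conauto-2.03's techniques prune far more aggressively than this.
def automorphisms(adj):
    n = len(adj)
    found = []

    def extend(mapping):
        v = len(mapping)                    # next vertex to map
        if v == n:
            found.append(tuple(mapping))
            return
        for w in range(n):
            if w in mapping:
                continue
            # Prune: w must preserve (non-)adjacency with every mapped vertex.
            if all(adj[v][u] == adj[w][mapping[u]] for u in range(v)):
                extend(mapping + [w])

    extend([])
    return found

# The 4-cycle 0-1-2-3 has the dihedral automorphism group of order 8.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
print(len(automorphisms(C4)))  # 8
```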
Abstract:
The aim of this investigation was to compare the skeletal stability of three different rigid fixation methods after mandibular advancement. Fifty-five class II malocclusion patients treated with bilateral sagittal split ramus osteotomy and mandibular advancement were selected for this retrospective study. Group 1 (n = 17) had miniplates with monocortical screws, Group 2 (n = 16) had bicortical screws, and Group 3 (n = 22) had the osteotomy fixed by means of the hybrid technique. Cephalograms were taken preoperatively, within 1 week postoperatively, and 6 months after the orthognathic surgery. Linear and angular changes of the cephalometric landmarks of the chin region were measured at each time point, and the changes at each landmark were determined for the intervals between them. Postoperative changes in mandibular shape were analyzed to determine the stability of the fixation methods. There was minimal difference in relapse of the mandibular advancement among the three groups. Statistical analysis showed no significant difference in postoperative stability. However, a positive correlation between the amount of advancement and the amount of postoperative relapse was demonstrated by the linear multiple regression test (p < 0.05). It can be concluded that all three techniques can be used to obtain stable postoperative results in mandibular advancement after 6 months.
Abstract:
Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole-body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles underlying sample collection. A critical review was thus performed of the main techniques for quantifying dermal exposure, calling attention to this issue and to the need to establish a single methodology for quantifying dermal exposure in rural workers. Such harmonization of the different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.
Abstract:
The Centers for High Cost Medication (Centros de Medicação de Alto Custo, CEDMAC) of the São Paulo State Health Department were established through a project in partnership with the Clinical Hospital of the Faculty of Medicine, USP, sponsored by the São Paulo Research Foundation (Fundação de Amparo à Pesquisa do Estado de São Paulo, FAPESP), and aimed at forming a statewide network for the comprehensive care of patients referred for treatment with immunobiological agents for rheumatological diseases. The CEDMAC of Hospital de Clínicas, Universidade Estadual de Campinas (HC-Unicamp), implemented by the Division of Rheumatology, Faculty of Medical Sciences, identified the need to standardize the practices of its multidisciplinary team, given the specificity of its care procedures, and the importance of describing its operational and technical processes in manual format. The aim of this study is to present the methodology used to prepare the CEDMAC/HC-Unicamp Manual as an institutional tool for offering the best care and administrative quality. In the methodology for preparing manuals at HC-Unicamp, in place since 2008, the premise has been to produce a document that is participatory and multidisciplinary, focused on work processes integrated with institutional rules, with objective and didactic descriptions, in a standardized format, and disseminated electronically. The CEDMAC/HC-Unicamp Manual was completed in 10 months, with the involvement of the entire multidisciplinary team, and comprises 19 chapters on work processes and techniques, in addition to those on the organizational structure and its annexes. Published in the electronic portal of HC Manuals in July 2012 as an e-Book (ISBN 978-85-63274-17-5), the manual has been a valuable instrument for guiding professionals in healthcare, teaching and research activities.