42 results for Rule-based techniques


Relevance: 30.00%

Abstract:

Background: High-throughput molecular approaches for gene expression profiling, such as Serial Analysis of Gene Expression (SAGE), Massively Parallel Signature Sequencing (MPSS) and Sequencing-by-Synthesis (SBS), are powerful techniques that provide global transcription profiles of different cell types through the sequencing of short transcript fragments, termed sequence tags. These techniques have improved our understanding of the relationships between expression profiles and cellular phenotypes. Even so, more reliable datasets are still necessary. In this work, we present a web-based tool named S3T: Score System for Sequence Tags, which indexes sequenced tags according to their reliability. This is done through a series of evaluations based on a defined rule set. S3T allows the identification and selection of tags considered more reliable for further gene expression analysis. Results: The methodology was applied to a public SAGE dataset. To compare the data before and after filtering, a hierarchical clustering analysis was performed on samples from the same tissue type, under distinct biological conditions, using both datasets. Our results provide evidence that more congruous clusters are obtained after applying the S3T scoring system. Conclusion: These results substantiate the proposed application as a means of generating more reliable data, a significant contribution to the determination of global gene expression profiles. The library analysis with S3T is freely available at http://gdm.fmrp.usp.br/s3t/. S3T source code and datasets can also be downloaded from the same website.
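
The abstract does not spell out the rule set behind the S3T score, so the following is only a minimal sketch of how a rule-based reliability score for sequence tags might be applied and used for filtering; the rules, fields and thresholds below are hypothetical illustrations, not the actual S3T criteria.

```python
# Minimal sketch of a rule-based reliability score for sequence tags.
# All rules, weights and thresholds are invented for illustration.

def score_tag(tag):
    """Apply a simple rule set to a tag record and return a reliability score."""
    score = 0
    # Rule 1: tags observed in several libraries are less likely to be artifacts.
    if tag["n_libraries"] >= 2:
        score += 2
    # Rule 2: good sequencing quality at the tag position.
    if tag["min_phred"] >= 20:
        score += 2
    # Rule 3: unambiguous mapping to a single genomic locus.
    if tag["n_genome_hits"] == 1:
        score += 1
    return score

tags = [
    {"seq": "CATGTTTTTTTTTT", "n_libraries": 1, "min_phred": 12, "n_genome_hits": 7},
    {"seq": "CATGGCTAACTGAC", "n_libraries": 4, "min_phred": 31, "n_genome_hits": 1},
]

# Keep only tags above a (hypothetical) reliability threshold for downstream analysis.
reliable = [t for t in tags if score_tag(t) >= 4]
print(reliable)
```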

Relevance: 30.00%

Abstract:

Context. HD 181231 is a B5IVe star that has been observed with the CoRoT satellite for ~5 consecutive months and simultaneously from the ground in spectroscopy and spectropolarimetry. Aims. By analysing these data, we aim to detect and characterize as many pulsation frequencies as possible, and to search for beating effects possibly at the origin of the Be phenomenon. Our results will also provide a basis for seismic modelling. Methods. The fundamental parameters of the star are determined from spectral fitting and from the study of the circumstellar emission. The CoRoT photometric data and ground-based spectroscopy are analysed using several Fourier techniques: CLEAN-NG, PASPER, and TISAFT, as well as a time-frequency technique. A search for a magnetic field is performed by applying the LSD technique to the spectropolarimetric data. Results. We find that HD 181231 is a B5IVe star seen at an inclination of ~45 degrees. No magnetic field is detected in its photosphere. We detect at least 10 independent significant frequencies among the 54 detected frequencies of variation, interpreted in terms of non-radial pulsation modes and rotation. Two longer-term variations are also detected: one at ~14 days resulting from a beating effect between the two main frequencies of short-term variation, the other at ~116 days due either to a beating of frequencies or to a zonal pulsation mode. Conclusions. Our analysis of the CoRoT light curve and ground-based spectroscopic data of HD 181231 has led to the determination of the fundamental and pulsational parameters of the star, including beating effects. This will allow precise seismic modelling of the star.
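
As a generic illustration of the kind of frequency extraction described (a Lomb-Scargle periodogram standing in for the dedicated CLEAN-NG/PASPER/TISAFT pipelines, which are not reproduced here), the sketch below recovers two close frequencies from an unevenly sampled synthetic light curve; the pair's beat period is chosen to echo the ~14 d beating, but all numbers are invented.

```python
# Frequency search on an irregularly sampled synthetic light curve.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 150.0, 4000))     # ~5 months, irregular sampling (days)
f1, f2 = 0.62, 0.69                             # two close frequencies (cycles/day)
flux = (np.sin(2 * np.pi * f1 * t) + 0.8 * np.sin(2 * np.pi * f2 * t)
        + 0.1 * rng.standard_normal(t.size))    # noisy two-mode signal

freqs = np.linspace(0.05, 2.0, 5000)            # trial frequencies (cycles/day)
power = lombscargle(t, flux - flux.mean(), 2 * np.pi * freqs)  # expects rad/day

best = freqs[power.argmax()]
print(f"strongest frequency ~ {best:.3f} c/d; "
      f"beat period of the pair ~ {1 / abs(f2 - f1):.1f} d")
```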

Relevance: 30.00%

Abstract:

Pollen counts from samples taken from storage pots throughout one year (October to September) were adjusted with Tasei's volumetric correction coefficient to determine the pollen sources exploited by two colonies of Nannotrigona testaceicornis in São Paulo, Brazil. The results obtained with this sampling technique over seven months (December to June) were compared with those from corbicula-load samples taken in the same period. The species visited a large variety of plants, but few of them were frequently used. As a rule, pollen sources appearing at frequencies greater than 1% were found with both sampling methods, and significant positive correlations (Spearman correlation coefficient) were found between their values. The pollen-load data showed that N. testaceicornis gathered pollen throughout the external activity period.
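
For concreteness, the comparison between the two sampling methods reduces to a rank correlation between pollen-type frequencies; the sketch below shows the Spearman test the abstract refers to, with frequency values invented for illustration.

```python
# Spearman rank correlation between pollen-type frequencies obtained from
# storage-pot samples and from corbicula (pollen-load) samples.
from scipy.stats import spearmanr

# Frequencies (%) of the same pollen types under the two sampling methods
# (hypothetical values).
pot_freq = [42.0, 23.5, 12.1, 8.3, 5.0, 3.2, 1.9]
load_freq = [38.7, 27.0, 10.4, 9.1, 6.2, 2.5, 2.1]

rho, p_value = spearmanr(pot_freq, load_freq)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```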

Relevance: 30.00%

Abstract:

Using a combination of density functional theory and recursive Green's function techniques, we present a full description of a large-scale sensor, accounting for disorder and different coverages. As an example, we use this method to demonstrate the functionality of nitrogen-rich carbon nanotubes as ammonia sensors. We show how the molecules one wishes to detect bind to the most relevant defects on the nanotube, describe how these interactions change the electronic transport properties of each isolated defect, and demonstrate that significant resistance changes occur even in the presence of disorder, elucidating how a realistic nanosensor works.
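
As a point of reference, transport quantities in recursive Green's function calculations are conventionally expressed in the Landauer-Büttiker/NEGF form (this is the standard formalism, not a detail specific to this paper):

$$ G = \frac{2e^2}{h}\,T(E_F), \qquad T(E) = \operatorname{Tr}\!\left[\Gamma_L(E)\,G^r(E)\,\Gamma_R(E)\,G^a(E)\right], $$

where $G^r(E) = \left[E - H - \Sigma_L(E) - \Sigma_R(E)\right]^{-1}$ is the retarded Green's function of the scattering region, $G^a = (G^r)^\dagger$, $\Sigma_{L,R}$ are the lead self-energies, and $\Gamma_{L,R} = i\,(\Sigma_{L,R} - \Sigma_{L,R}^\dagger)$. The recursive scheme builds $G^r$ block by block along the tube, which is what makes long, disordered systems tractable; the resistance change upon adsorption then follows from the change in $T(E)$ near the Fermi level.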

Relevance: 30.00%

Abstract:

Southeastern Brazil has seen dramatic landscape modification in recent decades, due to the expansion of agriculture and urban areas; these changes have influenced the distribution and abundance of vertebrates. We developed predictive models of the ecological and spatial distributions of capybaras (Hydrochoerus hydrochaeris) using ecological niche modeling. Most occurrences of capybaras were in flat areas with water bodies surrounded by sugarcane and pasture. More than 75% of the Piracicaba River basin was estimated as potentially habitable by capybaras. The models had low omission error (2.3-3.4%) but higher commission error (91.0-98.5%); these "model failures" seem to be related more to local habitat characteristics than to spatial ones. The potential distribution of capybaras in the basin is associated with anthropogenic habitats, particularly with intensive agricultural land use.

Relevance: 30.00%

Abstract:

The first problem of the Seleucid mathematical cuneiform tablet BM 34568 calculates the diagonal of a rectangle from its sides without resorting to the Pythagorean rule. For this reason, it has been a source of discussion among specialists ever since its first publication, but so far no consensus on its mathematical meaning has been reached. This paper presents two new interpretations of the scribe's procedure, based on the assumption that he was able to reduce the problem to a standard Mesopotamian question about reciprocal numbers. These new interpretations are then linked to interpretations of the Old Babylonian tablet Plimpton 322 and to the presence of Pythagorean triples in the contexts of Old Babylonian and Hellenistic mathematics.
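
To illustrate the algebraic kernel of such a reduction (our illustration of the general idea, not necessarily either of the paper's two interpretations): for a rectangle with breadth $b$, length $l$ and diagonal $d$, the numbers $(d+l)/b$ and $(d-l)/b$ are reciprocals of one another, since their product is $(d^2 - l^2)/b^2 = 1$. Finding a reciprocal pair $(x, 1/x)$ with a prescribed difference is a standard Mesopotamian problem, and it yields the diagonal directly:

$$ x - \frac{1}{x} = \frac{2l}{b} \quad\Longrightarrow\quad d = \frac{b}{2}\left(x + \frac{1}{x}\right). $$

For the classic sides $b = 3$, $l = 4$, the pair with difference $8/3$ is $(3, 1/3)$, giving $d = \tfrac{3}{2}\left(3 + \tfrac{1}{3}\right) = 5$ without invoking the Pythagorean rule itself.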

Relevance: 30.00%

Abstract:

This paper describes the modeling of a weed-infestation risk inference system that implements a collaborative inference scheme based on rules extracted from two Bayesian network classifiers. The first classifier infers a categorical value for weed-crop competitiveness, using as input categorical variables for the total weed density and the corresponding proportions of narrow- and broad-leaved weeds. The inferred competitiveness values, along with three other categorical variables extracted from estimated maps of weed seed production and weed coverage, are then used as input to a second Bayesian network classifier that infers categorical values for the risk of infestation. Weed biomass and yield-loss data samples are used to learn, in a supervised fashion, the probability relationships among the nodes of the first and second classifiers, respectively. For comparison purposes, two types of Bayesian network structure are considered: an expert-based Bayesian classifier and a naive Bayes classifier. The inference system focuses on knowledge interpretation by translating a Bayesian classifier into a set of classification rules. Results obtained for risk inference in a corn-crop field are presented and discussed.
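
A minimal sketch of the second-stage inference is given below: a naive Bayes classifier over categorical inputs predicting an infestation-risk class. The feature names, categories and training rows are hypothetical, and the smoothing scheme is a simple illustration rather than the paper's learning procedure.

```python
# Naive Bayes over categorical variables, with simple Laplace smoothing.
import math
from collections import Counter, defaultdict

def fit(rows, target="risk"):
    prior = Counter(r[target] for r in rows)
    counts = defaultdict(Counter)            # (feature, class) -> value counts
    for r in rows:
        for feat, val in r.items():
            if feat != target:
                counts[(feat, r[target])][val] += 1
    return prior, counts

def predict(prior, counts, x, alpha=1.0):
    total = sum(prior.values())
    best, best_lp = None, -math.inf
    for c, n_c in prior.items():
        lp = math.log(n_c / total)           # log prior P(class)
        for feat, val in x.items():
            vc = counts[(feat, c)]
            # Laplace-smoothed conditional probability P(val | class).
            lp += math.log((vc[val] + alpha) / (n_c + alpha * (len(vc) + 1)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

rows = [  # hypothetical training samples
    {"competitiveness": "high", "seed_prod": "high", "coverage": "high", "risk": "high"},
    {"competitiveness": "low",  "seed_prod": "low",  "coverage": "low",  "risk": "low"},
    {"competitiveness": "high", "seed_prod": "low",  "coverage": "high", "risk": "high"},
    {"competitiveness": "low",  "seed_prod": "high", "coverage": "low",  "risk": "medium"},
]
prior, counts = fit(rows)
print(predict(prior, counts,
              {"competitiveness": "high", "seed_prod": "high", "coverage": "low"}))
```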

Relevance: 30.00%

Abstract:

The productivity of commonly available disassembly methods seldom makes disassembly the preferred end-of-life solution for massive take-back product streams. Systematic reuse of parts or components, or recycling of pure material fractions, is often not achievable in an economically sustainable way. In this paper, a case-based review of current disassembly practices is used to analyse the factors influencing disassembly feasibility. Data mining techniques were used to identify the major factors influencing the profitability of disassembly operations; case characteristics such as the involvement of the product manufacturer in end-of-life treatment and continuous ownership are among the important dimensions. Economic models demonstrate that the efficiency of disassembly operations must increase by an order of magnitude to ensure the competitiveness of ecologically preferred, disassembly-oriented end-of-life scenarios for large waste electrical and electronic equipment (WEEE) streams. Technological means available to increase the productivity of disassembly operations are summarized. Automated disassembly techniques can contribute to the robustness of the process, but cannot close the efficiency gap unless combined with appropriate product design measures. Innovative reversible joints, collectively activated by external trigger signals, form a promising approach to low-cost mass disassembly in this context. A short overview of the state of the art in the development of such self-disassembling joints is included.
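
As a sketch of the kind of data-mining step described (which specific technique the paper used is not stated in the abstract), one can fit a decision tree on disassembly-case attributes and rank the factors driving profitability; the features and data below are invented for illustration.

```python
# Ranking factors that drive disassembly profitability with a decision tree.
from sklearn.tree import DecisionTreeClassifier

# Columns: OEM involved (0/1), continuous ownership (0/1),
# product mass (kg), labour time per unit (min). Hypothetical cases.
X = [
    [1, 1, 12.0, 8.0],
    [1, 0, 3.5, 15.0],
    [0, 0, 9.0, 25.0],
    [0, 1, 1.2, 5.0],
    [1, 1, 20.0, 10.0],
    [0, 0, 2.0, 30.0],
]
y = [1, 0, 0, 1, 1, 0]  # 1 = profitable disassembly case

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
for name, imp in zip(["oem_involved", "continuous_ownership", "mass_kg", "labour_min"],
                     tree.feature_importances_):
    print(f"{name:22s} {imp:.2f}")
```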

Relevance: 30.00%

Abstract:

Recently, semi-empirical models to estimate the flow boiling heat transfer coefficient, saturated CHF and pressure drop in micro-scale channels have been proposed. Most of these models were developed for elongated-bubble and annular flows, given that these flow patterns predominate in smaller channels. In these models the liquid film thickness plays an important role, which makes its accurate measurement a key point in validating them. Several techniques have been successfully applied to measure liquid film thickness during condensation and evaporation under macro-scale conditions. However, although this subject has been targeted by several leading laboratories around the world, no conclusive result yet describes a technique capable of measuring the dynamic liquid film thickness during evaporation inside micro-scale round channels. This work presents a comprehensive literature review of the methods used to measure liquid film thickness in macro- and micro-scale systems. The methods are described and the main difficulties related to their use in micro-scale systems are identified. Based on this discussion, the most promising methods for measuring dynamic liquid film thickness in micro-scale channels are identified.

Relevance: 30.00%

Abstract:

In this article a novel algorithm based on the chemotaxis process of Escherichia coli is developed to solve multiobjective optimization problems. The algorithm uses a fast nondominated sorting procedure, communication between colony members, and a simple chemotactic strategy to change the bacterial positions in order to explore the search space and find several optimal solutions. The proposed algorithm is validated on 11 benchmark problems, using three different performance measures to compare its performance with the NSGA-II genetic algorithm and the particle swarm-based algorithm NSPSO.
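
The abstract names fast nondominated sorting (the building block popularized by NSGA-II) explicitly; below is a minimal sketch of that procedure for minimization problems. The chemotactic moves and colony communication of the proposed algorithm are not reproduced here.

```python
# Fast nondominated sorting: partition objective vectors into Pareto fronts.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(objs):
    """Return a list of fronts, each a list of indices into objs."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    dom_count = [0] * n                      # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)              # i belongs to the Pareto front
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

objs = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
print(fast_nondominated_sort(objs))          # [[0, 1, 2], [3], [4]]
```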

Relevance: 30.00%

Abstract:

Modal filters may be obtained by a properly designed weighted sum of the output signals of an array of sensors distributed on the host structure. Although several research groups have been interested in techniques for designing and implementing modal filters based on a given array of sensors, the effect of the array topology on the effectiveness of the modal filter has received much less attention. In particular, it is known that parameters such as the size, shape and location of a sensor are very important in determining the observability of a vibration mode. Hence, this paper presents a methodology for the topological optimization of an array of sensors in order to maximize the effectiveness of a set of selected modal filters. This is done using a genetic algorithm optimization technique to select 12 piezoceramic sensors from an array of 36 piezoceramic sensors regularly distributed on an aluminum plate, so as to maximize the filtering performance, over a given frequency range, of a set of modal filters, each aiming to isolate one of the first vibration modes. The vectors of weighting coefficients for each modal filter are evaluated using QR decomposition of the complex frequency response function matrix. Results show that the array topology is not very important at lower frequencies but greatly affects filter effectiveness at higher frequencies. Therefore, it is possible to improve the effectiveness and frequency range of a set of modal filters by optimizing the topology of an array of sensors. Indeed, using 12 properly located piezoceramic sensors bonded to an aluminum plate, it is shown that the frequency range of a set of modal filters may be enlarged by 25-50%.
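
The sketch below illustrates the QR-based step on synthetic data: given a complex FRF matrix (frequency lines by sensors), the weighting vector that isolates one mode is the least-squares solution of the filter equation, computed through a QR factorization. The 6-mode, 6-sensor FRF model is invented for illustration; in the paper the FRFs come from the instrumented plate.

```python
# Modal-filter weights from a complex FRF matrix via QR decomposition.
import numpy as np

rng = np.random.default_rng(1)
omega = np.linspace(1.0, 80.0, 400)                  # rad/s
wn = np.array([10.0, 18.0, 25.0, 33.0, 45.0, 55.0])  # natural frequencies
zeta = 0.02                                          # modal damping ratio
phi = rng.standard_normal((6, 6))                    # mode shapes at 6 sensors

# SDOF modal FRFs g_k(w); sensor FRFs H(w, s) = sum_k phi[s, k] * g_k(w).
g = 1.0 / (wn**2 - omega[:, None]**2 + 2j * zeta * wn * omega[:, None])
H = g @ phi.T                                        # (n_freq, n_sensors), complex

# Weights c isolating mode 0: least-squares solution of H c = g_0,
# obtained through the QR factorization H = Q R.
Q, R = np.linalg.qr(H)
c = np.linalg.solve(R, Q.conj().T @ g[:, 0])

# The weighted sensor sum should reproduce the single-mode FRF g_0.
err = np.linalg.norm(H @ c - g[:, 0]) / np.linalg.norm(g[:, 0])
print(f"relative residual of the mode-0 filter: {err:.2e}")
```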

Relevance: 30.00%

Abstract:

The central issue in pillar design for underground coal mining is the in situ uniaxial compressive strength (σ_cm). The paper proposes a new method for estimating the in situ uniaxial compressive strength of coal seams based on laboratory strength and P-wave propagation velocity. It describes the collection of samples from the Bonito coal seam, Fontanella Mine, southern Brazil; the techniques used for structural mapping of the coal seam and determination of the seismic wave propagation velocity; and the laboratory procedures used to determine strength and ultrasonic wave velocity. The results obtained with the new methodology are compared with those from seven other techniques for estimating the in situ rock mass uniaxial compressive strength.
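
The abstract does not reproduce the proposed relation. Purely as an illustration of the family of velocity-based scalings such a method belongs to (an assumption on our part, not the paper's equation), the laboratory strength $\sigma_c$ is often reduced by a power of the field-to-laboratory P-wave velocity ratio:

$$ \sigma_{cm} = \sigma_c \left(\frac{V_{p,\mathrm{field}}}{V_{p,\mathrm{lab}}}\right)^{n}, \qquad n \approx 2, $$

so that, for example, $\sigma_c = 20\,\mathrm{MPa}$ and a velocity ratio of $0.8$ would give $\sigma_{cm} \approx 12.8\,\mathrm{MPa}$. The exponent and form used in the paper may differ.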

Relevance: 30.00%

Abstract:

Most post-processors for boundary element (BE) analysis use an auxiliary domain mesh to display domain results, undermining the main advantage of a pure boundary discretization. This paper introduces a novel visualization technique that preserves the basic properties of boundary element methods. The proposed algorithm requires no domain discretization and is based on the direct and automatic identification of isolines. Another critical aspect of visualizing domain results in BE analysis is the effort required to evaluate results at interior points. To address this issue, the article also compares the performance of two different BE formulations (conventional and hybrid). In addition, the paper presents an overview of the most common post-processing and visualization techniques in BE analysis, such as the classical scan-line algorithm and interpolation over a domain discretization. The results presented herein show that the proposed algorithm offers very high performance compared with other visualization procedures.
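
The abstract does not detail the isoline identification, but its elementary operation can be sketched as root-finding for the field minus the iso value along a search segment, with no domain mesh involved. In the sketch below, an analytical harmonic field `u(x, y)` stands in for the boundary-element interior evaluation; everything else is an assumption for illustration.

```python
# Direct isoline point identification: locate where u(x, y) crosses an
# iso value along a segment, by bracketing plus root refinement.
import numpy as np
from scipy.optimize import brentq

def u(x, y):
    # Stand-in harmonic field (a potential a BE solver would return).
    return np.exp(x) * np.cos(y)

def isoline_point(p0, p1, iso, n=50):
    """Scan segment p0->p1 for a sign change of u - iso, then refine it."""
    ts = np.linspace(0.0, 1.0, n)
    f = lambda t: u(p0[0] + t * (p1[0] - p0[0]),
                    p0[1] + t * (p1[1] - p0[1])) - iso
    vals = [f(t) for t in ts]
    for a, b, fa, fb in zip(ts, ts[1:], vals, vals[1:]):
        if fa * fb < 0:                      # crossing bracketed in [a, b]
            t = brentq(f, a, b)
            return (p0[0] + t * (p1[0] - p0[0]),
                    p0[1] + t * (p1[1] - p0[1]))
    return None                              # no crossing on this segment

print(isoline_point((0.0, 0.0), (1.0, 1.0), iso=1.2))
```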

Relevance: 30.00%

Abstract:

Coatings of the intermetallic NiCrAlC alloy were applied to AISI 316L stainless steel substrates using a high-velocity oxygen fuel (HVOF) torch. The influence of the spray parameters on friction and abrasive wear resistance was investigated using an instrumented rubber wheel abrasion test able to measure the friction forces. The corrosion behaviour of the coatings was studied with electrochemical techniques and compared with the corrosion resistance of the substrate material. Specimens prepared using lower O2/C3H8 ratios showed lower porosity. The abrasion wear rate of the NiCrAlC coatings was much smaller than that reported in the literature for bulk as-cast materials of similar composition, and one order of magnitude higher than that of the bulk cast and heat-treated (aged) NiCrAlC alloy. All coatings showed higher corrosion resistance than the AISI 316L substrate in 5% HCl aqueous solution at 40 °C.

Relevance: 30.00%

Abstract:

Modern integrated circuit (IC) design is characterized by a strong trend towards the integration of intellectual property (IP) cores into complex system-on-chip (SoC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input-space optimization methodologies have shown better efficiency and results than, for instance, coverage-directed verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test-case scenarios are removed from the input space. To this end, a tool to automatically generate PD-based stimulus sources was developed, together with a second tool that generates functional coverage models fitting the PD-based input space exactly. Both the stimulus and coverage-model enhancements resulted in a notable increase in testbench efficiency compared with testbenches using traditional stimulation and coverage scenarios: a 22% simulation-time reduction when generating stimuli with the PD-based stimulus sources (still with a conventional coverage model), and a 56% reduction when combining the stimulus sources with their corresponding, automatically generated, coverage models.
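
The sketch below conveys the spirit of the approach, not the PD formalism itself: invalid regions of the parameter space are excluded before randomization, and the coverage model is built over exactly the same valid domain. The parameter names, constraint and domains are hypothetical.

```python
# Constrained-random stimulus generation with a matching coverage model.
import random

# Parameter domains for a hypothetical bus transaction.
DOMAINS = {
    "mode": ["read", "write"],
    "burst_len": [1, 4, 8, 16],
}

def valid(stim):
    # Hypothetical constraint: long bursts are only meaningful for writes.
    return not (stim["mode"] == "read" and stim["burst_len"] > 8)

def gen_stimulus(rng):
    """Draw from the domains, rejecting invalid scenarios up front."""
    while True:
        stim = {k: rng.choice(v) for k, v in DOMAINS.items()}
        if valid(stim):
            return stim

# Coverage bins over exactly the valid cross-product of the domains,
# so no bin corresponds to an unreachable (invalid) scenario.
goals = {(m, b) for m in DOMAINS["mode"] for b in DOMAINS["burst_len"]
         if valid({"mode": m, "burst_len": b})}

rng = random.Random(0)
hit, n = set(), 0
while hit < goals:                            # run until full coverage
    s = gen_stimulus(rng)
    hit.add((s["mode"], s["burst_len"]))
    n += 1
print(f"covered {len(hit)}/{len(goals)} bins in {n} stimuli")
```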