994 results for rough analysis


Relevance:

30.00%

Abstract:

Q. Shen, "Rough feature selection for intelligent classifiers," LNCS Transactions on Rough Sets, vol. 7, pp. 244-255, 2007.

Relevance:

30.00%

Abstract:

Q. Shen and R. Jensen, 'Rough sets, their extensions and applications,' International Journal of Automation and Computing (IJAC), vol. 4, no. 3, pp. 217-218, 2007.

Relevance:

30.00%

Abstract:

X. Wang, J. Yang, R. Jensen and X. Liu, 'Rough Set Feature Selection and Rule Induction for Prediction of Malignancy Degree in Brain Glioma,' Computer Methods and Programs in Biomedicine, vol. 83, no. 2, pp. 147-156, 2006.

Relevance:

30.00%

Abstract:

The paper focuses on the development of an aircraft design optimization methodology that models uncertainty and sensitivity analysis in the tradeoff between manufacturing cost, structural requirements, and aircraft direct operating cost. Specifically, rather than only looking at manufacturing cost, direct operating cost is also considered in terms of the impact of weight on fuel burn, in addition to the acquisition cost to be borne by the operator. Ultimately, there is a tradeoff between driving design according to minimal weight and driving it according to reduced manufacturing cost. The analysis of cost is facilitated with a genetic-causal cost-modeling methodology, and the structural analysis is driven by numerical expressions of appropriate failure modes that use ESDU International reference data. However, a key contribution of the paper is to investigate the modeling of uncertainty and to perform a sensitivity analysis to investigate the robustness of the optimization methodology. Stochastic distributions are used to characterize manufacturing cost distributions, and Monte Carlo analysis is performed in modeling the impact of uncertainty on the cost modeling. The results are then used in a sensitivity analysis that incorporates the optimization methodology. In addition to investigating manufacturing cost variance, the sensitivity of the optimization to fuel burn cost and structural loading is also investigated. It is found that the consideration of manufacturing cost does make an impact and results in a different optimal design configuration from that delivered by the minimal-weight method. However, it is shown that at lower applied loads there is a threshold fuel burn cost at which the optimization process needs to reduce weight, and this threshold decreases with increasing load. The new optimal solution results in lower direct operating cost, with a predicted saving of $640/m2 of fuselage skin over the life, relating to a rough order-of-magnitude direct operating cost saving of $500,000 for the fuselage alone of a small regional jet. Moreover, it is found through the uncertainty analysis that this principle is not sensitive to cost variance, although the margins do change.
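
As an illustration of the Monte Carlo step described above, the sketch below propagates a manufacturing cost distribution into direct operating cost for two candidate designs. All distributions, prices, and weights are hypothetical placeholders, not values from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Hypothetical manufacturing cost per m2 of fuselage skin for two designs:
    # the minimal-weight design is lighter but costlier to manufacture.
    mfg_cost = {"min_weight": rng.normal(900.0, 90.0, N),   # $/m2
                "min_cost":   rng.normal(700.0, 70.0, N)}   # $/m2
    weight   = {"min_weight": 10.0, "min_cost": 11.5}       # kg/m2 of skin

    fuel_burn_cost = 55.0  # $ per kg of structure over the life (assumed)

    # Direct operating cost per m2 = acquisition cost + lifetime fuel burn.
    for design, cost in mfg_cost.items():
        doc = cost + fuel_burn_cost * weight[design]
        print(f"{design}: mean DOC = {doc.mean():7.0f} $/m2, "
              f"std = {doc.std():5.0f} $/m2")

Comparing the resulting DOC distributions, rather than single point values, is what allows the sensitivity of the optimal configuration to cost variance to be assessed.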

Relevance:

30.00%

Abstract:

Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. The relationship between pairs of factors has also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for greater insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
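
The article reports results rather than implementation detail, but a minimal sketch of one plausible ingredient, a binary particle swarm searching for consistency-preserving attribute subsets (candidate reducts), is given below. The fuzzy-logic component is omitted and all parameters are illustrative; this is not the authors' algorithm:

    import numpy as np

    def consistency(data, decisions, mask):
        # Fraction of objects whose values on the selected attributes
        # determine the decision unambiguously (rough-set positive region).
        if not mask.any():
            return 0.0
        keys = list(map(tuple, data[:, mask]))
        groups = {}
        for key, d in zip(keys, decisions):
            groups.setdefault(key, set()).add(d)
        return sum(1 for key in keys if len(groups[key]) == 1) / len(decisions)

    def pso_reducts(data, decisions, n_particles=30, iters=100, seed=1):
        rng = np.random.default_rng(seed)
        n_attr = data.shape[1]
        pos = rng.random((n_particles, n_attr)) < 0.5   # boolean attribute masks
        vel = rng.normal(0.0, 1.0, (n_particles, n_attr))

        def fitness(mask):  # reward consistency, lightly penalize subset size
            return consistency(data, decisions, mask) - 0.01 * mask.sum() / n_attr

        pbest = pos.copy()
        pbest_f = np.array([fitness(m) for m in pos])
        for _ in range(iters):
            gbest = pbest[pbest_f.argmax()].astype(float)
            vel = (0.7 * vel
                   + rng.random((n_particles, n_attr)) * (pbest.astype(float) - pos)
                   + rng.random((n_particles, n_attr)) * (gbest - pos))
            pos = rng.random((n_particles, n_attr)) < 1.0 / (1.0 + np.exp(-vel))
            f = np.array([fitness(m) for m in pos])
            better = f > pbest_f
            pbest[better], pbest_f[better] = pos[better], f[better]
        # distinct high-quality attribute subsets found by the swarm
        return {tuple(np.flatnonzero(m)) for m in pbest}

    data = np.array([[0, 1, 0], [0, 1, 1], [1, 0, 0], [1, 1, 0]])
    decisions = np.array([0, 0, 1, 1])
    print(pso_reducts(data, decisions))

Keeping every particle's personal best, rather than only the global best, is what yields multiple reducts, and hence multi-knowledge, from a single run.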

Relevance:

30.00%

Abstract:

The Groove Gap Waveguide (GGW) shows behavior similar to that of the classical rectangular waveguide (RWG), but it is formed by two pieces that do not require metal contact. This feature suggests the GGW as a suitable alternative to the RWG at mm-wave frequencies, where ensuring proper metal contact relative to the wavelength is challenging. Nevertheless, there is a lack of effective analysis tools for the complex GGW topology, and assuming a direct equivalence between the RWG and the GGW is too rough an approximation, so time-consuming full-wave simulations are required. This work presents a fast analysis method based on transmission line theory, which establishes the proper correspondence between the GGW and the RWG. In addition, the below-cutoff behavior of the GGW is studied for the first time. Several numerical tests and two manufactured prototypes validate the proposed method, which is well suited to optimizing future GGW structures.
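
For context, the dominant TE10 mode of an RWG of width a cuts off at fc = c/(2a), so a correspondence between the GGW and the RWG amounts to finding an effective RWG width. The sketch below uses a purely hypothetical gap correction for illustration; it is not the correspondence derived in the paper:

    # TE10 cutoff of a rectangular waveguide: fc = c / (2 * a).
    C0 = 299_792_458.0  # speed of light, m/s

    def te10_cutoff(a_metres):
        return C0 / (2.0 * a_metres)

    groove_width = 7.2e-3     # m, illustrative GGW groove width
    gap_correction = 0.4e-3   # m, hypothetical widening due to gap fields
    a_eff = groove_width + gap_correction
    print(f"approx. TE10 cutoff: {te10_cutoff(a_eff) / 1e9:.2f} GHz")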

Relevance:

30.00%

Abstract:

Cell cycle and differentiation are two highly coordinated processes during organ development. Recent studies have demonstrated that core cell cycle regulators also play cell cycle-independent functions in post-mitotic neurons and are essential for the maintenance of neuronal homeostasis. CDC25 phosphatases are well-established CDK activators and their activity is mainly associated with proliferating tissues. The expression and activity of mammalian CDC25s have been reported in adult brains; however, their physiological relevance and their potential substrates in a non-proliferative context have never been addressed. string (stg) encodes the Drosophila CDC25 homolog. Previous studies from our group showed that stg is expressed in photoreceptors (PRs) and in lamina neurons, two differentiated cell types that compose the fly visual system. The aims of this work are to uncover the function of stg and to identify its potential neuronal substrates, using the Drosophila visual system as a model. To gain insight into the function of stg in a non-dividing context, we used the GAL4/UAS system to downregulate stg in PR-neurons through an RNAi transgene. The defects caused by stg loss-of-function were evaluated in the developing eye imaginal disc by immunofluorescence, and during adult stages by scanning electron microscopy. This genetic approach was combined with a specific proteomic method, two-dimensional difference gel electrophoresis (2D-DIGE), to identify the potential substrates in PR-cells. Our results showed that stg downregulation in PRs affects the well-patterned retina organization, inducing loss of apical maintenance of PR-nuclei in the eye disc and ommatidia disorganization. We also detected an abnormal accumulation of cytoskeletal proteins and a disruption of axon structure. As a consequence, the projection of PR-axons into the lamina and medulla neuropils of the optic lobe was impaired. Upon stg downregulation, we also detected that PR-cells accumulate Cyclin B. Although the rough eye phenotype observed upon stg downregulation suggests neurodegeneration, we did not detect neuronal death during larval stages, suggesting that it likely occurs during pupal stages or adulthood. By 2D-DIGE, we identified seven proteins that were differentially expressed upon stg downregulation and are potential neuronal substrates of Stg. Altogether, our observations suggest that the Stg phosphatase plays an essential role in Drosophila visual system neurons, regulating several cell components and processes to ensure their homeostasis.

Relevance:

30.00%

Abstract:

Rough Set Data Analysis (RSDA) is a non-invasive data analysis approach that relies solely on the data to find patterns and decision rules. Despite its non-invasive approach and its ability to generate human-readable rules, classical RSDA has not been successfully used in commercial data mining and rule generating engines. The reason is its scalability: classical RSDA slows down considerably on larger data sets and takes much longer to generate the rules. This research aims to address the issue of scalability in rough sets by improving the performance of the attribute reduction step of classical RSDA, which is the root cause of its slow performance. We propose to move the entire attribute reduction process into the database. We defined a new schema to store the initial data set. We then defined SQL queries on this new schema to find the attribute reducts correctly and faster than the traditional RSDA approach. We tested our technique on two typical data sets and compared our results with the traditional RSDA approach for attribute reduction. Finally, we highlight some issues with our proposed approach that could lead to future research.
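
A minimal sketch of the core idea, pushing the consistency test behind attribute reduction into the database, is shown below; the schema and queries are illustrative, not the ones defined in this research. An attribute subset preserves consistency if no group of identical values on those attributes contains more than one distinct decision:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (a1 INT, a2 INT, a3 INT, d INT)")
    conn.executemany("INSERT INTO t VALUES (?,?,?,?)",
                     [(0, 1, 0, 0), (0, 1, 1, 0), (1, 0, 0, 1), (1, 1, 0, 1)])

    def is_consistent(attrs):
        # True if the attribute subset determines the decision d everywhere.
        cols = ", ".join(attrs)
        q = (f"SELECT COUNT(*) FROM (SELECT {cols} FROM t "
             f"GROUP BY {cols} HAVING COUNT(DISTINCT d) > 1)")
        return conn.execute(q).fetchone()[0] == 0

    print(is_consistent(["a1"]))        # True: a1 alone determines d here
    print(is_consistent(["a2", "a3"]))  # False: conflicting decisions remain

Because GROUP BY and COUNT run inside the database engine, the test scales with the engine's indexing and aggregation machinery rather than with in-memory discernibility matrices.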

Relevance:

30.00%

Abstract:

Feature selection plays an important role in knowledge discovery and data mining. In traditional rough set theory, feature selection using a reduct - the minimal discerning set of attributes - is an important area. Nevertheless, the original definition of a reduct is restrictive, so previous research proposed taking into account not only the horizontal reduction of information by feature selection, but also a vertical reduction considering suitable subsets of the original set of objects. Following that work, a new approach to generating bireducts using a multi-objective genetic algorithm is proposed. Although genetic algorithms have been used to calculate reducts in previous works, we did not find any work in which genetic algorithms were adopted to calculate bireducts. Compared to earlier work in this area, the proposed method has less randomness in generating bireducts. The genetic algorithm estimated the quality of each bireduct by the values of two objective functions as evolution progressed, so a set of bireducts with optimized values of these objectives was obtained. Different fitness evaluation methods and genetic operators, such as crossover and mutation, were applied, and the prediction accuracies were compared. Five datasets were used to test the proposed method and two datasets were used for a comparison study. Statistical analysis using the one-way ANOVA test was performed to determine the significance of differences between the results. The experiments showed that the proposed method was able to reduce the number of bireducts necessary to obtain good prediction accuracy. The influence of different genetic operators and fitness evaluation strategies on the prediction accuracy was also analyzed. The prediction accuracies of the proposed method are comparable with the best results in the machine learning literature, and some of them outperform those results.
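
For background, a bireduct pairs an attribute subset B with an object subset X such that B discerns every pair of objects in X that differ in decision; the two objectives traded off by the genetic algorithm are then |B| (to be minimized) and |X| (to be maximized). A minimal sketch of evaluating one candidate, on toy data rather than the thesis code, might look like this:

    import numpy as np

    def bireduct_objectives(data, decisions, attr_mask, obj_mask):
        # Returns (|B|, |X|) if the selected attributes discern all covered
        # objects that differ in decision; None if the candidate is infeasible.
        rows = data[obj_mask][:, attr_mask]
        decs = decisions[obj_mask]
        seen = {}
        for key, d in zip(map(tuple, rows), decs):
            if seen.setdefault(key, d) != d:
                return None                     # undiscerned decision conflict
        return attr_mask.sum(), obj_mask.sum()  # minimize |B|, maximize |X|

    data = np.array([[0, 1, 0], [0, 1, 1], [1, 0, 0], [1, 1, 0]])
    decisions = np.array([0, 0, 1, 1])
    print(bireduct_objectives(data, decisions,
                              np.array([True, False, False]),
                              np.array([True, True, True, True])))  # (1, 4)

A multi-objective GA such as NSGA-II would then keep the non-dominated (|B|, |X|) candidates, yielding a set of bireducts rather than a single reduct.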

Relevance:

30.00%

Abstract:

Corteo is a program that implements the Monte Carlo (MC) method to simulate ion beam analysis (IBA) spectra for several techniques by following the ions' trajectories until a sufficiently large fraction of them reach the detector to generate a spectrum. Hence, it fully accounts for effects such as multiple scattering (MS). Here, a version of Corteo is presented in which the target can be a 2D or 3D image. This image can be derived from micrographs in which the different compounds are identified, thereby bringing extra information into the solution of an IBA spectrum and potentially constraining the solution significantly. The image intrinsically includes many details, such as the actual surface or interfacial roughness, or the actual shape and distribution of nanostructures. This can, for example, lead to the unambiguous identification of the stoichiometry of structures in a layer, or at least to better constraints on their composition. Because MC computes the ions' trajectories in detail, it accurately simulates many aspects, such as ions coming back into the target after leaving it (re-entry), as well as ions passing through a variety of nanostructure shapes and orientations. We show, for example, how, as the ions' angle of incidence becomes shallower than the inclination distribution of a rough surface, this process tends to make the effective roughness smaller than in a comparable 1D simulation (i.e., a narrower thickness distribution than in a comparable slab simulation). Also, in ordered nanostructures, target re-entry can lead to replications of a peak in a spectrum. In addition, the bitmap description of the target can be used to simulate depth profiles such as those resulting from ion implantation, diffusion, and intermixing. Other improvements to Corteo include the possibility of interpolating the cross-section in angle-energy tables, and the generation of energy-depth maps.
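
A toy sketch of the bitmap-target idea follows: ions are stepped through a 2D image and the material found at each pixel decides what happens. It illustrates only the geometry (rough surface, re-entry); none of Corteo's actual physics (stopping powers, scattering cross-sections) is reproduced:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 2D target image: 0 = vacuum, 1 = substrate, with a rough surface.
    nx, nz = 200, 100
    target = np.zeros((nz, nx), dtype=np.uint8)
    surface = 20 + (3 * np.sin(np.arange(nx) / 7.0)).astype(int)
    for ix in range(nx):
        target[surface[ix]:, ix] = 1

    def trace(x, z, theta, step=0.5, straggle=0.02):
        # Follow one ion; count path length spent inside material 1,
        # including re-entry after leaving the target through a rough facet.
        path = 0.0
        while 0 <= x < nx and 0 <= z < nz:
            if target[int(z), int(x)] == 1:
                path += step
                theta += rng.normal(0.0, straggle)  # crude angular scattering
            x += step * np.cos(theta)
            z += step * np.sin(theta)
        return path

    paths = [trace(x0, 0.0, np.radians(80)) for x0 in rng.uniform(0, nx, 1000)]
    print(f"mean path in material: {np.mean(paths):.1f} px")

Making the angle of incidence shallower in this sketch reproduces, qualitatively, the narrowing of the effective thickness distribution described above.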

Relevance:

30.00%

Abstract:

This paper highlights the prediction of learning disabilities (LD) in school-age children using rough set theory (RST), with an emphasis on the application of data mining. In rough sets, data analysis starts from a data table called an information system, which contains data about objects of interest characterized in terms of attributes. Here, these attributes consist of the properties of learning disabilities. By finding the relationships between these attributes, the redundant attributes can be eliminated and the core attributes determined. Rule mining is then performed using the LEM1 algorithm. The prediction of LD is done accurately using Rosetta, a rough set toolkit for data analysis. The result obtained from this study is compared with the output of a similar study we conducted using a Support Vector Machine (SVM) with the Sequential Minimal Optimisation (SMO) algorithm. It is found that, using the concepts of reduct and global covering, we can easily predict learning disabilities in children.
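
A minimal sketch of the global covering idea on toy data (the attribute names and cases are invented for illustration and do not come from the study):

    # A global covering B is an attribute subset whose value combinations
    # determine the decision for every case; rules then read directly off
    # the table (a simplified view of what LEM1-style mining produces).
    cases = [
        {"attention": "low",  "reading": "poor", "ld": "yes"},
        {"attention": "low",  "reading": "good", "ld": "yes"},
        {"attention": "high", "reading": "poor", "ld": "yes"},
        {"attention": "high", "reading": "good", "ld": "no"},
    ]

    def is_global_covering(attrs):
        seen = {}
        for c in cases:
            key = tuple(c[a] for a in attrs)
            if seen.setdefault(key, c["ld"]) != c["ld"]:
                return False
        return True

    B = ["attention", "reading"]
    if is_global_covering(B):
        rules = {tuple(c[a] for a in B): c["ld"] for c in cases}
        for cond, dec in rules.items():
            print(" AND ".join(f"{a}={v}" for a, v in zip(B, cond)),
                  "=> ld =", dec)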

Relevance:

30.00%

Abstract:

In this paper we investigate the use of the perfectly matched layer (PML) to truncate a time-harmonic rough surface scattering problem in the direction away from the scatterer. We prove existence and uniqueness of the solution of the truncated problem, as well as an error estimate depending on the thickness and composition of the layer. This global error estimate predicts a linear rate of convergence (under some conditions on the relative size of the real and imaginary parts of the PML function) rather than the usual exponential rate. We then consider scattering by a half-space and show that the solution of the PML truncated problem converges globally at most quadratically (up to logarithmic factors), providing support for our general theory. However, we also prove exponential convergence on compact subsets. We continue by proposing an iterative correction method for the PML truncated problem and, using our estimate for the PML approximation, prove convergence of this method. Finally, we provide some numerical results in 2D.
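
For orientation, the PML construction underlying such truncations stretches the coordinate normal to the surface into the complex plane; in its textbook form (the precise PML function used in the paper may differ), $\tilde{z} = z + \frac{\mathrm{i}}{k}\int_0^{z}\sigma(s)\,\mathrm{d}s$ with $\sigma \ge 0$, so that an outgoing wave $e^{\mathrm{i}kz}$ becomes $e^{\mathrm{i}kz}\,e^{-\int_0^{z}\sigma(s)\,\mathrm{d}s}$ and decays inside the layer. The truncation error is then governed by the layer thickness and by $\int\sigma$, which is why the relative size of the real and imaginary parts of the PML function enters the convergence rate.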

Relevance:

30.00%

Abstract:

We consider a class of boundary integral equations that arise in the study of strongly elliptic boundary value problems in unbounded domains of the form $D = \{(x, z)\in \mathbb{R}^{n+1} : x\in \mathbb{R}^n, z > f(x)\}$, where $f : \mathbb{R}^n \to\mathbb{R}$ is a sufficiently smooth, bounded and continuous function. A number of specific problems of this type, for example acoustic scattering problems, problems involving elastic waves, and problems in potential theory, have been reformulated as second kind integral equations $u+Ku = v$ in the space $BC$ of bounded, continuous functions. Having recourse to the so-called limit operator method, we address two questions for the operator $A = I + K$ under consideration, with an emphasis on the function space setting $BC$: firstly, under which conditions is $A$ a Fredholm operator, and, secondly, when is the finite section method applicable to $A$?
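
As a reminder for the second question (a standard formulation, not specific to this paper): the finite section method replaces $Au = v$ by the truncated equations $P_n A P_n u_n = P_n v$, where $P_n$ denotes multiplication by the characteristic function of $\{x : |x| \le n\}$; the method is applicable to $A$ if these truncated equations are uniquely solvable for all sufficiently large $n$ and the solutions $u_n$ converge to the solution $u$ of the original equation.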

Relevance:

30.00%

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling design with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence for every lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
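
The accumulation step is simple enough to show directly: the rough variogram value at the lag of stage j is the running sum of the estimated components of variance of all stages up to and including j. A sketch with invented components:

    # Rough variogram from a hierarchical ANOVA: the semivariance at the lag
    # of stage j is the sum of the variance components of all stages with
    # separating distance <= that lag (lags and components are illustrative).
    lags       = [1.0, 10.0, 100.0, 1000.0]   # separating distances, m
    components = [0.8, 1.5, 2.1, 0.6]         # estimated variance components

    running = 0.0
    for lag, comp in zip(lags, components):
        running += comp
        print(f"lag {lag:7.1f} m : semivariance ~ {running:.2f}")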