879 results for Reserve Selection
Abstract:
This paper proposes a filter-based algorithm for feature selection. The filter is based on partitioning the set of features into clusters. The number of clusters, and consequently the cardinality of the subset of selected features, is automatically estimated from data. The computational complexity of the proposed algorithm is also investigated. A variant of this filter that considers feature-class correlations is also proposed for classification problems. Empirical results involving ten datasets illustrate the performance of the developed algorithm, which in general has obtained competitive results in terms of classification accuracy when compared to state-of-the-art algorithms that find clusters of features. We show that, if computational efficiency is an important issue, the proposed filter may be preferred over its counterparts, thus becoming eligible to join a pool of feature selection algorithms to be used in practice. As an additional contribution of this work, a theoretical framework is used to formally analyze some properties of feature selection methods that rely on finding clusters of features. (C) 2011 Elsevier Inc. All rights reserved.
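The general idea of selecting features by clustering them can be sketched as follows. This is a generic correlation-based illustration, not the paper's algorithm: the greedy single-pass scheme and the `threshold` parameter are assumptions introduced for the example.

```python
import statistics

# Sketch: group features into clusters by pairwise correlation and keep one
# representative per cluster, so the number of clusters determines the
# cardinality of the selected subset (hypothetical scheme, not the paper's).

def pearson(a, b):
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def cluster_features(columns, threshold=0.9):
    """Greedy single pass: a feature joins the first cluster whose
    representative it is strongly correlated with, else starts a new cluster."""
    representatives = []
    for j, col in enumerate(columns):
        if not any(abs(pearson(col, columns[r])) >= threshold for r in representatives):
            representatives.append(j)
    return representatives  # indices of the selected (representative) features

# three features: f1 and f2 are near-duplicates, f3 is independent
f1 = [1.0, 2.0, 3.0, 4.0, 5.0]
f2 = [2.1, 4.0, 6.2, 8.1, 9.9]   # roughly 2 * f1
f3 = [5.0, 1.0, 4.0, 2.0, 3.0]
print(cluster_features([f1, f2, f3]))  # -> [0, 2]
```

Here the redundant feature `f2` is absorbed into the cluster represented by `f1`, so two features are selected without the user fixing that number in advance.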
Abstract:
We use an inequality due to Bochnak and Lojasiewicz, which follows from the Curve Selection Lemma of real algebraic geometry, in order to prove that, given a $C^r$ function $f : U \subset \mathbb{R}^m \to \mathbb{R}$, we have $\lim_{y \to x,\, y \in \mathrm{crit}(f)} |f(y) - f(x)| / |y - x|^r = 0$ for all $x \in \mathrm{crit}(f)' \cap U$, where $\mathrm{crit}(f) = \{x \in U \mid df(x) = 0\}$. This shows that the so-called Morse decomposition of the critical set, used in the classical proof of the Morse-Sard theorem, is not necessary: the conclusion of the Morse decomposition lemma holds for the whole critical set. We use this result to give a simple proof of the classical Morse-Sard theorem (with sharp differentiability assumptions).
Abstract:
In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model. (C) 2011 Elsevier B.V. All rights reserved.
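The role of the probability generating function in this construction can be illustrated numerically. The distributional choices below (Poisson number of competing causes, exponential latent event times) are assumptions for the example, not the paper's model; the standard facts used are that with $N$ causes having pgf $G$ and per-cause survival $S(t)$, the population survival is $G(S(t))$ and the cure fraction is $G(0) = P(N = 0)$.

```python
import math

# pgf of N ~ Poisson(theta): G(s) = E[s^N] = exp(-theta * (1 - s))
def pgf_poisson(theta):
    return lambda s: math.exp(-theta * (1.0 - s))

# population survival under the competing-causes construction: G(S(t))
def population_survival(G, S, t):
    return G(S(t))

G = pgf_poisson(theta=1.5)                # hypothetical mean number of causes
S = lambda t: math.exp(-0.2 * t)          # hypothetical latent event-time survival

cure_fraction = G(0.0)                    # P(N = 0) = exp(-1.5)
print(round(cure_fraction, 3))            # -> 0.223
print(round(population_survival(G, S, 5.0), 3))  # -> 0.387
```

Swapping in another pgf (geometric, negative binomial, ...) changes only `pgf_poisson`, which is why characterizing $N$ through its generating function yields so many published models as special cases.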
Abstract:
In hypertension, left ventricular (LV) hypertrophy develops as an adaptive mechanism to compensate for increased afterload and thus preserve systolic function. Associated structural changes such as microvascular disease might interfere with this mechanism, producing pathological hypertrophy. A poorer outcome is expected when LV function is put in jeopardy by impaired coronary reserve. The aim of this study was to evaluate the role of coronary reserve in the long-term outcome of patients with hypertensive dilated cardiomyopathy. Between 1996 and 2000, 45 patients (30 male; aged 52 +/- 11 years) with LV fractional shortening <30% were enrolled and followed until 2006. Coronary flow velocity reserve was assessed by transesophageal Doppler of the left anterior descending coronary artery. Sixteen patients showed >= 10% improvement in LV fractional shortening after 17 +/- 6 months. Coronary reserve was the only variable independently related to this improvement. Total mortality was 38% at 10 years. The Cox model identified coronary reserve (hazard ratio = 0.814; 95% CI = 0.72-0.92), LV mass, low diastolic blood pressure, and male gender as independent predictors of mortality. In hypertensive dilated cardiomyopathy, coronary reserve impairment adversely affects survival, possibly by interfering with the improvement of LV dysfunction. J Am Soc Hypertens 2010;4(1):14-21. (C) 2010 American Society of Hypertension. All rights reserved.
Abstract:
Object selection refers to the mechanism of extracting objects of interest while ignoring other objects and the background in a given visual scene. It is a fundamental issue for many computer vision and image analysis techniques, and it is still a challenging task for artificial visual systems. Chaotic phase synchronization takes place between almost identical dynamical systems: the phase difference between the systems stays bounded over time, while their amplitudes remain chaotic and may be uncorrelated. Phase synchronization, rather than complete synchronization, is believed to be a mechanism for neural integration in the brain. In this paper, an object selection model is proposed. Oscillators in the network representing the salient object in a given scene are phase synchronized, while no phase synchronization occurs for background objects. In this way, the salient object can be extracted. In this model, a shift mechanism is also introduced to change attention from one object to another. Computer simulations show that the model produces results similar to those observed in natural vision systems.
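The selection-by-synchronization principle can be sketched with a toy simulation. This is an illustrative assumption, not the paper's model: the paper uses chaotic oscillators, while the sketch below uses simple Kuramoto-type phase oscillators, which exhibit the same qualitative behavior of one coupled group phase-locking while uncoupled units stay incoherent.

```python
import cmath
import math
import random

# Oscillators representing the salient object are mutually coupled and
# phase-lock; background oscillators are uncoupled, so their phases stay
# scattered. The synchronized group marks the selected object.

def simulate(n_object=5, n_background=5, coupling=2.0, dt=0.01, steps=5000):
    random.seed(0)
    n = n_object + n_background
    omega = [1.0 + 0.05 * random.uniform(-1, 1) for _ in range(n)]  # detuned frequencies
    theta = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        new = list(theta)
        for i in range(n_object):                 # coupled "object" group
            drive = sum(math.sin(theta[j] - theta[i]) for j in range(n_object))
            new[i] = theta[i] + dt * (omega[i] + coupling / n_object * drive)
        for i in range(n_object, n):              # uncoupled background
            new[i] = theta[i] + dt * omega[i]
        theta = new
    return theta

def coherence(phases):
    """Kuramoto order parameter: 1.0 means perfect phase locking."""
    return abs(sum(cmath.exp(1j * p) for p in phases)) / len(phases)

theta = simulate()
print(round(coherence(theta[:5]), 2))   # object group locks: close to 1.0
print(round(coherence(theta[5:]), 2))   # background stays far less coherent
```

Reading out which oscillators belong to the high-coherence group then amounts to extracting the salient object.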
Abstract:
Biological systems readily capture the salient object(s) in a given scene, but this is still a difficult task for artificial vision systems. In this paper, a visual selection mechanism based on an integrate-and-fire neural network is proposed. The model not only can discriminate objects in a given visual scene, but also can deliver focus of attention to the salient object. Moreover, it processes a combination of relevant features of the input scene, such as intensity, color, orientation, and their contrast. In comparison to other visual selection approaches, this model presents several interesting features: it is able to capture attention on objects of complex form, including linearly non-separable ones. Moreover, computer simulations show that the model produces results similar to those observed in natural vision systems.
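The core integrate-and-fire competition can be reduced to a few lines. This sketch is an illustration, not the paper's network: each object is collapsed into a single hypothetical saliency score instead of the intensity, color, orientation, and contrast maps the model actually combines.

```python
# Leaky integrate-and-fire race: every unit integrates its (saliency-weighted)
# input, and the first unit to reach threshold captures the focus of attention.

def first_to_fire(saliency, tau=10.0, threshold=1.0, dt=0.1, max_steps=10000):
    v = [0.0] * len(saliency)           # membrane potentials
    for _ in range(max_steps):
        for i, s in enumerate(saliency):
            v[i] += dt * (-v[i] / tau + s)   # leaky integration of input s
            if v[i] >= threshold:
                return i                      # most salient object wins attention
    return None

# object 2 has the highest (hypothetical) saliency, so it fires first
print(first_to_fire([0.3, 0.5, 0.9, 0.4]))  # -> 2
```

A shift of attention can then be modeled by inhibiting the winner's input and rerunning the race, which is the usual "inhibition of return" device in attention models.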
Abstract:
This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set, structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches to this problem are branch-and-bound algorithms and heuristics that explore the search space only partially. Branch-and-bound algorithms are equivalent to a full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the known ones by exploiting the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space via new lattice properties proven here. Several experiments with well-known public data indicate the superiority of the proposed method over sequential floating forward selection (SFFS), a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method obtained better or equal results in similar or even smaller computational time. (C) 2009 Elsevier Ltd. All rights reserved.
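The U-shaped-chain property can be illustrated with a toy search. This is not the paper's branch-and-bound (which explores the full lattice with pruning); it is a greedy walk down a single chain that stops as soon as the cost rises, which is exactly the step the U-shape licenses. The cost function is a hypothetical example with ideal subset {1, 3}.

```python
# Walk one maximal chain of the Boolean lattice, adding the cheapest feature
# at each step; because the cost is U-shaped along any chain, the first
# increase means the bottom of the U on this chain has been passed.

def u_curve_chain_search(n_features, cost):
    current = frozenset()
    current_cost = cost(current)
    best_set, best_cost = current, current_cost
    while len(current) < n_features:
        candidates = [(cost(current | {f}), f)
                      for f in range(n_features) if f not in current]
        c, f = min(candidates)
        if c > current_cost:            # cost started rising: stop on this chain
            break
        current, current_cost = current | {f}, c
        if c < best_cost:
            best_set, best_cost = current, c
    return best_set, best_cost

# hypothetical U-shaped cost: penalize missing and (more mildly) extra features
def cost(s):
    return len({1, 3} - s) + 0.5 * len(s - {1, 3})

print(u_curve_chain_search(5, cost))  # -> (frozenset({1, 3}), 0.0)
```

A full branch-and-bound, as in the paper, would apply this stopping rule to every chain while sharing bounds between them, rather than to a single greedy chain.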
Abstract:
We consider the problem of dichotomizing a continuous covariate when performing a regression analysis based on a generalized estimation approach. The problem involves estimation of the cutpoint for the covariate and testing the hypothesis that the binary covariate constructed from the continuous covariate has a significant impact on the outcome. Due to the multiple testing used to find the optimal cutpoint, we need to adjust the usual significance test to preserve the type-I error rate. We illustrate the techniques on a data set of patients given unrelated hematopoietic stem cell transplantation. Here the question is whether the CD34 cell dose given to the patient affects the outcome of the transplant and what the smallest cell dose needed for good outcomes is. (C) 2010 Elsevier B.V. All rights reserved.
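The cutpoint-search-plus-adjustment idea can be sketched as follows. This is a simplified illustration, not the paper's generalized-estimating-equation procedure: it uses a plain two-sample t statistic and a crude Bonferroni factor, and the data are hypothetical.

```python
import statistics

# Scan candidate cutpoints, dichotomize the covariate at each, keep the cut
# maximizing |t|; because many cuts were tested, the nominal p-value must be
# adjusted (here: multiplied by the number of tests performed).

def t_stat(a, b):
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * statistics.variance(a)
           + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.fmean(a) - statistics.fmean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

def best_cutpoint(x, y):
    best, n_tests = None, 0
    for c in sorted(set(x))[:-1]:          # every observed value but the largest
        low = [yi for xi, yi in zip(x, y) if xi <= c]
        high = [yi for xi, yi in zip(x, y) if xi > c]
        if len(low) < 2 or len(high) < 2:  # need variance estimates on both sides
            continue
        n_tests += 1
        t = abs(t_stat(low, high))
        if best is None or t > best[1]:
            best = (c, t)
    return best[0], best[1], n_tests       # Bonferroni: p-value * n_tests

# hypothetical data with a clear threshold effect at x = 5
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [0.1, 0.2, 0.1, 0.2, 0.1, 10.1, 10.2, 10.1, 10.2, 10.1]
cut, t, n_tests = best_cutpoint(x, y)
print(cut, n_tests)  # -> 5 7
```

In practice the adjustment is usually sharper than Bonferroni (e.g. permutation-based), since the tests at neighboring cutpoints are highly correlated.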
Abstract:
We construct a two-point selection $f : [P]^2 \to P$, where $P$ is the set of irrational numbers, such that the space $(P, \tau_f)$ is neither normal nor collectionwise Hausdorff. Here, $\tau_f$ denotes the topology generated by the two-point selection $f$. This example answers a question posed by V. Gutev and T. Nogura. We also show that if $f : [X]^2 \to X$ is a two-point selection such that the topology $\tau_f$ has countable pseudocharacter, then $\tau_f$ is a Tychonoff topology. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The purpose of this thesis is to identify the destination site selection criteria for international conferences from the perspectives of the three main players of the conference industry: conference buyers (organizers and delegates) and suppliers. Additionally, the research identifies the strengths and weaknesses of the congress cities of Stockholm and Vienna. Through a comparison with Vienna, the top city for hosting international conferences, a road map for Stockholm has been designed to strengthen its congress tourism opportunities and thus obtain a higher status as an international congress city. This qualitative research has combined both primary and secondary data methods, through semi-standardized expert interviews and secondary studies respectively, to fulfil the study's aim. The data have been analysed by applying the techniques of qualitative content analysis: the secondary data adopting an inductive approach according to Mayring (2003), and the expert interviews a deductive approach according to Meuser & Nagel (2009). The conclusions of the secondary data have been further compared and contrasted with the outcomes of the primary data to propose fresh discoveries, clarifications, and concepts related to the site selection criteria for international conferences and to the congress tourism industry of Stockholm. The research discusses in detail the discoveries on the site selection criteria, the implications of the strengths and weaknesses of Stockholm in comparison to Vienna, recommendations for Stockholm via a road map, and future research areas. The findings and recommendations not only provide specific steps and initiatives that Stockholm as an international conference city can apply, but also propose findings which can aid conference buyers and suppliers to cooperate, to strengthen their marketing strategies, and to develop successful international conferences and destinations in order to achieve a greater competitive advantage.
Abstract:
We consider methods for estimating causal effects of treatment in the situation where the individuals in the treatment and the control group are self-selected, i.e., the selection mechanism is not randomized. In this case, a simple comparison of treated and control outcomes will not generally yield valid estimates of causal effects. The propensity score method is frequently used for the evaluation of treatment effects. However, this method is based on some strong assumptions which are not directly testable. In this paper, we present an alternative modeling approach to drawing causal inference, based on a shared random-effect model, together with a computational algorithm for likelihood-based inference under this model. With small numerical studies and a real data analysis, we show that our approach not only gives more efficient estimates but is also less sensitive to the model misspecifications we consider than the existing methods.
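Why the naive treated-versus-control comparison fails under self-selection can be shown with a toy example. This sketch uses inverse-probability weighting with a single discrete confounder; it is an illustration of the selection problem, not the shared random-effect model of the paper, and the data and effect size are hypothetical.

```python
# hypothetical units (x, treated, outcome), generated as outcome = 1 + 2*treated + x,
# with treatment more likely when x = 1 (self-selection on the confounder x)
data = [
    (0, 1, 3), (0, 0, 1), (0, 0, 1), (0, 0, 1),   # x = 0: 1 of 4 treated
    (1, 1, 4), (1, 1, 4), (1, 1, 4), (1, 0, 2),   # x = 1: 3 of 4 treated
]

def naive_difference(data):
    treated = [y for _, t, y in data if t == 1]
    control = [y for _, t, y in data if t == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

def ipw_ate(data):
    # propensity e(x) = P(T = 1 | x), estimated by within-stratum frequencies
    e = {}
    for x in {d[0] for d in data}:
        stratum = [t for xx, t, _ in data if xx == x]
        e[x] = sum(stratum) / len(stratum)
    n = len(data)
    # inverse-probability-weighted estimate of the average treatment effect
    return sum(t * y / e[x] - (1 - t) * y / (1 - e[x]) for x, t, y in data) / n

print(round(naive_difference(data), 3))  # -> 2.5, biased by self-selection
print(round(ipw_ate(data), 3))           # -> 2.0, the effect built into the data
```

The naive contrast overstates the effect because treated units disproportionately have the favorable covariate value; any valid method, propensity-based or the paper's shared random-effect approach, must correct for this selection.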
Abstract:
Agent-oriented software engineering (AOSE) is a promising approach to developing applications for dynamic open systems. If well developed, these applications can be opportunistic, taking advantage of services implemented by other developers at appropriate times. However, methodologies are needed to aid the development of systems that are both flexible enough to be opportunistic and tightly defined by the application requirements. In this paper, we investigate how developers can choose the coordination mechanisms of agents so that the agents will best fulfil application requirements in an open system.