936 results for Satellite selection algorithm


Relevance:

30.00%

Publisher:

Abstract:

Recent studies showed that features extracted from brain MRIs can discriminate Alzheimer’s disease from Mild Cognitive Impairment well. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem with the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machines-based wrapper improves the accuracy of binary classification.
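
As a rough illustration of the filter-plus-wrapper idea described above (not the authors' exact pipeline or data), the following Python sketch chains a Random Forest-based filter with an SVM-driven forward-selection wrapper on synthetic data; all parameter values are placeholders:

    # Filter (Random Forest importances) followed by a wrapper (SVM-driven forward selection).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SelectFromModel, SequentialFeatureSelector
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, n_features=50, n_informative=8, random_state=0)

    # Step 1: filter -- keep the 20 features ranked most important by a Random Forest.
    rf_filter = SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0),
                                max_features=20, threshold=-float("inf"))
    X_filt = rf_filter.fit_transform(X, y)

    # Step 2: wrapper -- forward selection driven by cross-validated SVM accuracy
    # (one-versus-one decision shape, as in the multi-class stage described above).
    svm = make_pipeline(StandardScaler(), SVC(kernel="linear", decision_function_shape="ovo"))
    wrapper = SequentialFeatureSelector(svm, n_features_to_select=8, direction="forward", cv=5)
    X_sel = wrapper.fit_transform(X_filt, y)

    print("CV accuracy:", cross_val_score(svm, X_sel, y, cv=5).mean())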

Relevance:

30.00%

Publisher:

Abstract:

This article proposes a systematic approach to determine the most suitable analogue redesign method to be used for forward-type converters under digital voltage mode control. The focus of the method is to achieve the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at high crossover frequencies relative to the switching frequency, controllers designed using backward integration have the largest phase margin, whereas at low crossover frequencies relative to the switching frequency, controllers designed using bilinear integration with pre-warping have the largest phase margins. An algorithm has been developed to determine the crossover frequency at which the recommended discretisation method changes. An accurate model of the power stage is used for simulation, and experimental results from a buck converter are collected. The performance of the digital controllers is compared with that of the equivalent analogue controller in both simulation and experiment. Excellent agreement between the simulation and experimental results is shown. This work provides a concrete example that allows academics and engineers to systematically choose a discretisation method.
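
The two discretisation candidates mentioned above can be sketched with SciPy; the compensator, switching frequency, and crossover frequency below are illustrative assumptions, and pre-warping is applied through an effective sample time rather than a dedicated option:

    # Backward integration vs. bilinear (Tustin) with pre-warping, on a generic compensator C(s).
    import numpy as np
    from scipy.signal import cont2discrete

    fs = 200e3                       # switching/sampling frequency, Hz (assumed)
    T = 1.0 / fs
    fc = 20e3                        # desired crossover frequency, Hz (assumed)
    wc = 2 * np.pi * fc

    # Example analogue compensator: C(s) = (s + z0) / (s + p0) -- a simple lead/lag.
    num, den = [1.0, 2 * np.pi * 5e3], [1.0, 2 * np.pi * 50e3]

    # Backward-Euler (backward integration) discretisation.
    numd_be, dend_be, _ = cont2discrete((num, den), T, method="backward_diff")

    # Bilinear discretisation with pre-warping at the crossover frequency: an effective
    # sample time T' = 2*tan(wc*T/2)/wc makes the frequency mapping exact at wc.
    T_prewarp = 2.0 * np.tan(wc * T / 2.0) / wc
    numd_bl, dend_bl, _ = cont2discrete((num, den), T_prewarp, method="bilinear")

    print("backward difference :", numd_be, dend_be)
    print("bilinear, pre-warped:", numd_bl, dend_bl)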

Relevance:

30.00%

Publisher:

Abstract:

Imagery registration is a fundamental step that greatly affects later processes in image mosaicking, multi-spectral image fusion, digital surface modelling, etc., where the final solution requires blending pixel information from more than one image. It is highly desirable to identify registration regions among input stereo image pairs with high accuracy, particularly in remote sensing applications in which ground control points (GCPs) are not always available, such as when selecting a landing zone on another planetary body. In this paper, a framework for localization in image registration is developed. It strengthens local registration accuracy in two respects: lower reprojection error and better feature point distribution. Affine scale-invariant feature transform (ASIFT) is used to acquire feature points and correspondences on the input images. A homography matrix is then estimated as the transformation model by an improved random sample consensus (IM-RANSAC) algorithm. In order to identify a registration region with a better spatial distribution of feature points, the Euclidean distance between feature points is applied (named the S criterion). Finally, the parameters of the homography matrix are optimized by the Levenberg–Marquardt (LM) algorithm using selected feature points from the chosen registration region. In the experiment section, Chang’E-2 satellite remote sensing imagery is used to evaluate the performance of the proposed method. The experimental results demonstrate that the proposed method can automatically locate a specific region with high registration accuracy between input images, achieving a lower root mean square error (RMSE) and a better distribution of feature points.
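
A simplified version of this pipeline can be sketched with OpenCV, using plain SIFT and OpenCV's built-in RANSAC as stand-ins for the ASIFT and IM-RANSAC steps; file names and thresholds are hypothetical:

    # Feature matching, RANSAC homography, and reprojection RMSE on the inliers.
    import cv2
    import numpy as np

    img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical input images
    img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Ratio-test matching of descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des1, des2, k=2)
    matches = [p[0] for p in pairs if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Homography estimated with RANSAC (OpenCV refines the inlier fit internally).
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # Reprojection RMSE over the inliers -- the accuracy measure referred to above.
    inl = inlier_mask.ravel().astype(bool)
    proj = cv2.perspectiveTransform(src[inl], H)
    rmse = np.sqrt(np.mean(np.sum((proj - dst[inl]) ** 2, axis=2)))
    print("inliers:", int(inl.sum()), "RMSE:", rmse)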

Relevance:

30.00%

Publisher:

Abstract:

We utilized an ecosystem process model (SIPNET, the simplified photosynthesis and evapotranspiration model) to estimate carbon fluxes of gross primary productivity and total ecosystem respiration of a high-elevation coniferous forest. The data assimilation routine incorporated aggregated twice-daily measurements of the net ecosystem exchange of CO2 (NEE) and satellite-based reflectance measurements of the fraction of absorbed photosynthetically active radiation (fAPAR) on an eight-day timescale. From these data we conducted a data assimilation experiment with fifteen different combinations of available data, using twice-daily NEE, aggregated annual NEE, eight-day fAPAR, and average annual fAPAR. Model parameters were conditioned on three years of NEE and fAPAR data, and the results were evaluated to determine the information content of the different combinations of data streams. Across the data assimilation experiments, model selection metrics such as the Bayesian Information Criterion and the Deviance Information Criterion reached minimum values when assimilating average annual fAPAR and twice-daily NEE data. Wavelet coherence analyses showed higher correlations between measured and modeled fAPAR on longer timescales, ranging from 9 to 12 months. There were strong correlations between measured and modeled NEE (coefficient of determination R² = 0.86), but correlations between measured and modeled eight-day fAPAR were quite poor (R² = −0.94). We conclude that the ability to reproduce fAPAR on an eight-day timescale would improve if radiative transfer through the plant canopy were taken into account. Modeled fluxes when assimilating average annual fAPAR and annual NEE were comparable to the corresponding results when assimilating twice-daily NEE, albeit with greater uncertainty. Our results support the conclusion that, for this coniferous forest, twice-daily NEE data are a critical measurement stream for the data assimilation. The results from this modeling exercise indicate that, for this coniferous forest, annual averages of satellite-based fAPAR measurements paired with annual NEE estimates may provide spatial detail for components of ecosystem carbon fluxes in the proximity of eddy covariance towers. Inclusion of other independent data streams in the assimilation would also reduce the uncertainty of modeled values.
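
For readers unfamiliar with the selection metric, the following toy sketch shows how assimilation scenarios could be ranked by BIC under a Gaussian-residual assumption; the residual arrays, stream names, and parameter counts are placeholders, not the study's data:

    # Ranking data-assimilation scenarios with BIC (Gaussian residual assumption).
    import numpy as np

    def bic(residuals, n_params):
        """BIC = n*ln(RSS/n) + k*ln(n) for a least-squares fit with Gaussian errors."""
        n = residuals.size
        rss = np.sum(residuals ** 2)
        return n * np.log(rss / n) + n_params * np.log(n)

    rng = np.random.default_rng(0)
    scenarios = {
        "twice-daily NEE only":           (rng.normal(0, 1.00, 2190), 30),
        "twice-daily NEE + annual fAPAR": (rng.normal(0, 0.90, 2193), 32),
        "twice-daily NEE + 8-day fAPAR":  (rng.normal(0, 0.95, 2325), 32),
    }
    for name, (res, k) in scenarios.items():
        print(f"{name:32s} BIC = {bic(res, k):10.1f}")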

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a filter-based algorithm for feature selection. The filter is based on partitioning the set of features into clusters. The number of clusters, and consequently the cardinality of the subset of selected features, is automatically estimated from the data. The computational complexity of the proposed algorithm is also investigated. A variant of this filter that considers feature-class correlations is also proposed for classification problems. Empirical results involving ten datasets illustrate the performance of the developed algorithm, which in general obtains competitive results in terms of classification accuracy when compared with state-of-the-art algorithms that find clusters of features. We show that, if computational efficiency is an important issue, the proposed filter may be preferred over its counterparts, making it eligible to join a pool of feature selection algorithms to be used in practice. As an additional contribution of this work, a theoretical framework is used to formally analyze some properties of feature selection methods that rely on finding clusters of features.
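
A minimal stand-in for a clustering-based filter (not the algorithm proposed in the paper) can be written as follows: group features by correlation and keep one representative per cluster:

    # Cluster features by correlation distance; keep one representative per cluster.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def cluster_filter(X, threshold=0.5):
        """Return indices of selected features, one per correlation cluster."""
        corr = np.abs(np.corrcoef(X, rowvar=False))
        dist = 1.0 - corr
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        labels = fcluster(Z, t=threshold, criterion="distance")
        selected = []
        for c in np.unique(labels):
            members = np.flatnonzero(labels == c)
            # Keep the member most correlated with the rest of its cluster.
            selected.append(members[np.argmax(corr[np.ix_(members, members)].sum(axis=1))])
        return np.array(sorted(selected))

    X = np.random.default_rng(1).normal(size=(100, 12))
    X[:, 6] = X[:, 0] + 0.05 * X[:, 6]          # make feature 6 redundant with feature 0
    print("kept features:", cluster_filter(X))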

Relevance:

30.00%

Publisher:

Abstract:

We consider methods for estimating causal effects of treatment in the situation where the individuals in the treatment and control groups are self-selected, i.e., the selection mechanism is not randomized. In this case, a simple comparison of treated and control outcomes will not generally yield valid estimates of causal effects. The propensity score method is frequently used for the evaluation of treatment effects. However, this method is based on some strong assumptions, which are not directly testable. In this paper, we present an alternative modeling approach to drawing causal inference, using a shared random-effects model, together with a computational algorithm for drawing likelihood-based inference with such a model. With small numerical studies and a real data analysis, we show that our approach not only gives more efficient estimates but is also less sensitive than the existing methods to the model misspecifications we consider.
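
The propensity-score baseline that the abstract contrasts against can be sketched as inverse-probability weighting on synthetic self-selected data; the proposed shared random-effects model is not reproduced here:

    # Propensity-score (inverse-probability weighting) estimate of a treatment effect.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000
    x = rng.normal(size=(n, 2))                                   # confounders
    p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
    t = rng.binomial(1, p_treat)                                  # self-selected treatment
    y = 2.0 * t + x[:, 0] + 0.5 * x[:, 1] + rng.normal(size=n)    # true effect = 2.0

    # Naive comparison is biased because of self-selection.
    print("naive difference:", y[t == 1].mean() - y[t == 0].mean())

    # Estimate propensity scores, then the IPW estimate of the average treatment effect.
    ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]
    ate_ipw = np.mean(t * y / ps) - np.mean((1 - t) * y / (1 - ps))
    print("IPW estimate:", ate_ipw)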

Relevance:

30.00%

Publisher:

Abstract:

Millions of unconscious calculations are made daily by pedestrians walking through the Colby College campus. I used ArcGIS to build a predictive spatial model that chooses paths similar to those actually used by people on a regular basis. To make a viable model of how most travelers choose their way, I considered both the distance required and the type of traveling surface. I used an iterative process to develop a scheme for weighting travel costs, which allowed ArcMap to predict accurate least-cost paths. The accuracy was confirmed when the calculated routes were compared to satellite photography and were found to overlap the well-worn “shortcuts” taken between the paved paths throughout campus.
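
The weighted least-cost-path idea can be illustrated outside ArcGIS with a small raster example; the surface classes and cost weights below are made up:

    # Least-cost path across a raster cost surface built from weighted surface classes.
    import numpy as np
    from skimage.graph import route_through_array

    # 0 = paved path, 1 = lawn, 2 = steep/obstructed ground (hypothetical classes).
    surface = np.random.default_rng(0).integers(0, 3, size=(60, 80))
    cost_per_class = np.array([1.0, 2.5, 8.0])      # relative travel-cost weights (assumed)
    cost = cost_per_class[surface]

    start, end = (5, 5), (55, 75)
    path, total_cost = route_through_array(cost, start, end,
                                           fully_connected=True, geometric=True)
    print("cells on path:", len(path), "total cost:", round(total_cost, 1))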

Relevance:

30.00%

Publisher:

Abstract:

Jakarta is vulnerable to flooding, caused mainly by prolonged and heavy rainfall, and thus robust hydrological modeling is called for. Good-quality spatial precipitation data are therefore desired so that a good hydrological model can be achieved. Two types of rainfall source are available: satellite and gauge station observations. At-site rainfall is considered a reliable and accurate source of rainfall; however, the limited number of stations makes spatial interpolation unappealing. On the other hand, gridded satellite rainfall now has high spatial resolution and improved accuracy, but is still relatively less accurate than its gauge counterpart. To achieve a better precipitation data set, this study proposes a cokriging-based blending algorithm to yield a blended satellite-gauge gridded rainfall product at approximately 10 km resolution. The Global Satellite Mapping of Precipitation product (GSMaP, 0.1°×0.1°) and daily rainfall observations from gauge stations are used. The blended product is compared with the satellite data by cross-validation. The newly produced blended product is then used to re-calibrate the hydrological model. Several scenarios are simulated with hydrological models calibrated against gauge observations alone and against the blended product. The performance of the two calibrated hydrological models is then assessed and compared based on simulated and observed runoff.
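
Cokriging itself is not reproduced here, but the following heavily simplified sketch conveys the blending idea: correct the gridded satellite field with interpolated gauge-minus-satellite residuals (synthetic data, inverse-distance weights as an assumption):

    # Simplified satellite-gauge blending: satellite field plus interpolated gauge residuals.
    import numpy as np

    rng = np.random.default_rng(0)
    lats = np.linspace(-6.4, -6.0, 40)               # illustrative grid over Jakarta
    lons = np.linspace(106.6, 107.0, 40)
    glon, glat = np.meshgrid(lons, lats)
    satellite = 10 + 5 * np.sin(glon * 20) + rng.normal(0, 1, glon.shape)   # fake gridded field

    gauge_lat = rng.uniform(-6.4, -6.0, 15)          # 15 fake gauge stations
    gauge_lon = rng.uniform(106.6, 107.0, 15)
    gauge_val = 12 + 5 * np.sin(gauge_lon * 20) + rng.normal(0, 0.3, 15)

    # Residual at each gauge = gauge observation minus collocated satellite estimate.
    ii = np.clip(np.searchsorted(lats, gauge_lat), 0, lats.size - 1)
    jj = np.clip(np.searchsorted(lons, gauge_lon), 0, lons.size - 1)
    resid = gauge_val - satellite[ii, jj]

    # Spread residuals over the grid by inverse-distance weighting and add them back.
    d2 = (glat[..., None] - gauge_lat) ** 2 + (glon[..., None] - gauge_lon) ** 2
    w = 1.0 / np.maximum(d2, 1e-6)
    blended = satellite + (w * resid).sum(axis=-1) / w.sum(axis=-1)
    print("grid mean before/after blending:", satellite.mean().round(2), blended.mean().round(2))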

Relevance:

30.00%

Publisher:

Abstract:

Applying optimization algorithms to PDE-based models of groundwater remediation can greatly reduce remediation cost. However, groundwater remediation analysis requires computationally expensive simulations; therefore, effective parallel optimization could greatly reduce the computational expense. The optimization algorithm used in this research is the Parallel Stochastic Radial Basis Function (RBF) method. It is designed for global optimization of computationally expensive functions with multiple local optima and does not require derivatives. In each iteration of the algorithm, an RBF surrogate is updated based on all the evaluated points in order to approximate the expensive function. The new RBF surface is then used to generate the next set of points, which are distributed to multiple processors for evaluation. The criteria for selecting the next function evaluation points are the estimated function value and the distance from all previously evaluated points. Algorithms created for serial computing are not necessarily efficient in parallel, so Parallel Stochastic RBF is a different algorithm from its serial ancestor. The method is applied to two groundwater Superfund remediation sites: the Umatilla Chemical Depot and the former Blaine Naval Ammunition Depot. In this study, the formulation adopted treats pumping rates as decision variables in order to remove the plume of contaminated groundwater. Groundwater flow and contaminant transport are simulated with MODFLOW-MT3DMS. For both problems, computation takes a large amount of CPU time, especially for the Blaine problem, which requires nearly fifty minutes per simulation for a single set of decision variables. Thus, an efficient algorithm and powerful computing resources are essential in both cases. The results are discussed in terms of parallel computing metrics, i.e., speedup and efficiency. We find that, with the use of up to 24 parallel processors, the results of the Parallel Stochastic RBF algorithm are excellent, with speedup efficiencies close to or exceeding 100%.
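
One iteration of a parallel stochastic RBF surrogate loop might look like the sketch below, with a cheap test function standing in for the MODFLOW-MT3DMS simulation and a generic candidate-scoring rule (estimated value vs. distance to evaluated points):

    # Parallel surrogate-based optimization sketch: RBF model, batch selection, parallel evaluation.
    import numpy as np
    from multiprocessing import Pool
    from scipy.interpolate import RBFInterpolator

    def expensive_simulation(x):
        # Placeholder for the remediation-cost simulation (minutes of CPU time in reality).
        return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sin(10 * x).sum())

    def select_candidates(surrogate, X_known, n_cand=500, n_pick=8, w=0.5):
        """Score random candidates by predicted value and distance to known points."""
        rng = np.random.default_rng()
        cand = rng.uniform(0, 1, size=(n_cand, X_known.shape[1]))
        pred = surrogate(cand)
        dist = np.min(np.linalg.norm(cand[:, None, :] - X_known[None, :, :], axis=2), axis=1)
        score = (w * (pred - pred.min()) / (np.ptp(pred) + 1e-12)
                 - (1 - w) * dist / (dist.max() + 1e-12))
        return cand[np.argsort(score)[:n_pick]]      # low predicted value, far from known points

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.uniform(0, 1, size=(12, 4))          # initial design (e.g. a Latin hypercube)
        y = np.array([expensive_simulation(x) for x in X])
        with Pool(processes=8) as pool:
            for _ in range(5):                       # a few surrogate iterations
                surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
                batch = select_candidates(surrogate, X)
                y_new = pool.map(expensive_simulation, batch)   # parallel expensive evaluations
                X = np.vstack([X, batch])
                y = np.concatenate([y, y_new])
        print("best cost found:", y.min())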

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces an improved tabu-based vector optimal algorithm for multiobjective optimal design of electromagnetic devices. The improvements include a division of the entire search process, a new method for fitness assignment, a novel scheme for the generation and selection of neighborhood solutions, and so forth. Numerical results on a mathematical function and an engineering multiobjective design problem demonstrate that the proposed method can produce virtually the exact Pareto front, in both parameter and objective spaces, even though it requires only about 70% of the iterations used by its ancestor.
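
A generic (not the paper's) tabu-based multiobjective loop, with dominance-count fitness and a fixed-length tabu list, can be sketched on a simple bi-objective test function:

    # Toy tabu-based multiobjective search on a Schaffer-like bi-objective problem.
    import numpy as np

    def objectives(x):
        return np.array([x[0] ** 2, (x[0] - 2.0) ** 2])

    def dominates(a, b):
        return np.all(a <= b) and np.any(a < b)

    rng = np.random.default_rng(0)
    current = rng.uniform(-2, 4, size=1)
    tabu, archive = [], []
    for _ in range(200):
        candidates = current + rng.normal(0, 0.2, size=(20, 1))          # neighbourhood moves
        candidates = [n for n in candidates
                      if not any(np.allclose(n, t, atol=0.05) for t in tabu)]
        if not candidates:
            continue
        objs = [objectives(n) for n in candidates]
        # Fitness = number of other neighbours dominating the candidate (lower is better).
        fitness = [sum(dominates(o2, o1) for o2 in objs) for o1 in objs]
        current = candidates[int(np.argmin(fitness))]
        tabu.append(current.copy())
        tabu = tabu[-15:]                                                 # fixed-length tabu list
        f = objectives(current)
        if not any(dominates(objectives(a), f) for a in archive):
            archive = [a for a in archive if not dominates(f, objectives(a))] + [current.copy()]
    print("non-dominated solutions found:", len(archive))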

Relevance:

30.00%

Publisher:

Abstract:

An algorithm is presented that finds the optimal long-term transmission expansion plan for all cases studied, including relatively large and complex networks. Knowledge of optimal plans is becoming more important in the emerging competitive environment, in which correct economic signals have to be sent to all participants. The paper presents a new specialised branch-and-bound algorithm for transmission network expansion planning. Optimality is obtained at a cost, however: a transportation model is used to represent the transmission network, in which only Kirchhoff's current law is taken into account (the voltage law being relaxed). The expansion problem then becomes an integer linear program (ILP), which is solved by the proposed branch-and-bound method without any further approximations. To control combinatorial explosion, the branch-and-bound algorithm is specialised using specific knowledge about the problem for both the selection of candidate problems and the selection of the next variable to be used for branching. Special constraints are also used to reduce the gap between the optimal integer solution (ILP) and the solution obtained by relaxing the integrality constraints (LP). Tests have been performed with small, medium, and large networks available in the literature.
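
The branch-and-bound mechanics (LP relaxation, branching on a fractional variable, pruning by the incumbent) can be illustrated on a tiny generic ILP; the transmission-planning transportation model itself is not reproduced:

    # Generic LP-relaxation branch-and-bound for a small integer linear program.
    import numpy as np
    from scipy.optimize import linprog

    c = np.array([5.0, 4.0, 3.0])                        # illustrative line-addition costs
    A_ub = np.array([[-2.0, -3.0, -1.0], [-4.0, -1.0, -2.0]])
    b_ub = np.array([-5.0, -11.0])                       # >= requirements written as <= constraints

    def branch_and_bound(bounds, best=(np.inf, None)):
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        if not res.success or res.fun >= best[0]:
            return best                                  # infeasible, or pruned by the incumbent
        frac = [i for i, v in enumerate(res.x) if abs(v - round(v)) > 1e-6]
        if not frac:
            return (res.fun, res.x)                      # integer solution improves the incumbent
        i, v = frac[0], res.x[frac[0]]                   # branch on the first fractional variable
        lo = [list(b) for b in bounds]
        hi = [list(b) for b in bounds]
        lo[i][1] = np.floor(v)                           # subproblem with x_i <= floor(v)
        hi[i][0] = np.ceil(v)                            # subproblem with x_i >= ceil(v)
        best = branch_and_bound(lo, best)
        return branch_and_bound(hi, best)

    cost, plan = branch_and_bound([(0, 10)] * 3)
    print("optimal integer plan:", plan, "cost:", cost)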

Relevance:

30.00%

Publisher:

Abstract:

A low-cost computational procedure to determine the orbit of an artificial satellite using short-arc data from an onboard GPS receiver is proposed. Pseudoranges are used as measurements to estimate the orbit via a recursive least-squares method. The algorithm applies orthogonal Givens rotations for solving recursive and sequential orbit determination problems. To assess the procedure, it was applied to the TOPEX/POSEIDON satellite for data batches of one orbital period (approximately two hours), and force modelling based on the full JGM-2 gravity field model was considered. When compared with the reference Precision Orbit Ephemeris (POE) of JPL/NASA, the results indicate that precision better than 9 m is easily obtained, even when short batches of data are used.
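
The Givens-rotation sequential update can be sketched as follows: each new linearised observation row is rotated into an upper-triangular factor, and the state is recovered by back-substitution (a generic illustration, not the paper's orbit-determination code):

    # Sequential least squares via Givens rotations (square-root information style).
    import numpy as np

    def givens_update(R, z, a, b):
        """Fold one observation row a (with measurement b) into R and right-hand side z."""
        a, b = a.copy(), float(b)
        n = R.shape[0]
        for i in range(n):
            if a[i] == 0.0:
                continue
            r = np.hypot(R[i, i], a[i])
            c, s = R[i, i] / r, a[i] / r
            R[i, i:], a[i:] = c * R[i, i:] + s * a[i:], -s * R[i, i:] + c * a[i:]
            z[i], b = c * z[i] + s * b, -s * z[i] + c * b
        return R, z

    rng = np.random.default_rng(0)
    n_par = 4
    x_true = rng.normal(size=n_par)
    R = np.zeros((n_par, n_par))
    z = np.zeros(n_par)
    for _ in range(200):                                # one measurement processed at a time
        a = rng.normal(size=n_par)                      # linearised observation row
        b = a @ x_true + rng.normal(scale=0.01)         # noisy measurement
        R, z = givens_update(R, z, a, b)
    x_hat = np.linalg.solve(R, z)                       # back-substitution (R is upper triangular)
    print("estimation error:", np.linalg.norm(x_hat - x_true))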

Relevance:

30.00%

Publisher:

Abstract:

The risk of venous thromboembolism (VTE) in medical patients is high, but risk assessment is rarely performed because there is not yet a good method to identify candidates for prophylaxis. Purpose: To perform a systematic review of VTE risk factors (RFs) in hospitalized medical patients and generate recommendations (RECs) for prophylaxis that can be implemented in practice. Data sources: A multidisciplinary group of experts from 12 Brazilian medical societies searched MEDLINE, Cochrane, and LILACS. Study selection: Two experts independently classified the evidence for each RF by its scientific quality in a standardized manner. A risk-assessment algorithm was created based on the results of the review. Data synthesis: Several VTE RFs have enough evidence to support RECs for prophylaxis in hospitalized medical patients (e.g., increasing age, heart failure, and stroke). Other factors are considered adjuncts of risk (e.g., varices, obesity, and infections). According to the algorithm, hospitalized medical patients ≥40 years old with decreased mobility and ≥1 RF should receive chemoprophylaxis with heparin, provided they have no contraindications. High prophylactic doses of unfractionated heparin or low-molecular-weight heparin must be administered and maintained for 6-14 days. Conclusions: A multidisciplinary group generated evidence-based RECs and an easy-to-use algorithm to facilitate VTE prophylaxis in medical patients.
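
The decision rule summarised above reduces to a few conditions; the following fragment is purely illustrative and not clinical software:

    # Illustrative encoding of the stated rule: age >= 40, decreased mobility,
    # at least one risk factor, and no contraindication.
    def recommend_heparin_prophylaxis(age, decreased_mobility, n_risk_factors, contraindication):
        return (age >= 40 and decreased_mobility
                and n_risk_factors >= 1 and not contraindication)

    print(recommend_heparin_prophylaxis(72, True, 2, False))   # -> True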

Relevance:

30.00%

Publisher:

Abstract:

We are now returning to a period of lunar exploration. China, Japan, and India are investing heavily in missions to the Moon, with the aim of eventually establishing manned bases on this satellite. These bases would have to be installed in the polar regions due to the apparent existence of water there. Therefore, studying the feasibility of satellite constellations for navigation, control, and communication regains importance. The Moon's gravitational potential and resonant effects due to the proximity of the Earth, such as the Kozai-Lidov resonance, must be considered in addition to other perturbations of lesser magnitude. The usual satellite constellations provide, as a basic feature, continuous and global coverage of the Earth. With this goal, they are designed with the smallest possible number of objects able to perform a specific task, and this number is directly related to the altitude of the orbits and the visibility capabilities of the members of the constellation. However, the problem is different when the area to be covered is reduced to a given zone: the required number of space objects can be reduced, and, depending on the mission requirements, it may not be necessary to provide continuous coverage. Taking into account the possibility of setting up a constellation that covers a specific region of the Moon on a non-continuous basis, in this study we seek an optimization criterion related to the time between visits. The propagation of the orbits of the objects in the constellation, in conjunction with the coverage constraints, provides information on the periods of time during which points on the surface are covered by a satellite and the time intervals during which they are not. We therefore minimize the time between visits by considering several sets of candidate constellations and using genetic algorithms.
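
A toy genetic-algorithm loop minimising the worst revisit gap might look like the sketch below; the coverage model is a crude placeholder, not a lunar orbit propagator:

    # Toy GA minimising the maximum gap between visits for a small constellation.
    import numpy as np

    rng = np.random.default_rng(0)
    N_SATS, HORIZON = 4, 240                               # satellites, minutes simulated

    def max_revisit_gap(phases):
        """Phases (fraction of a 120-min period) -> worst gap between passes over the target."""
        passes = np.sort(np.concatenate(
            [(p * 120 + np.arange(0, HORIZON, 120)) % HORIZON for p in phases]))
        gaps = np.diff(np.concatenate([passes, passes[:1] + HORIZON]))
        return gaps.max()

    pop = rng.uniform(0, 1, size=(30, N_SATS))             # initial population of phase vectors
    for _ in range(100):
        fitness = np.array([max_revisit_gap(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[:10]]            # truncation selection
        children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 0.05, (20, N_SATS))  # mutation
        pop = np.vstack([parents, np.mod(children, 1.0)])

    fitness = np.array([max_revisit_gap(ind) for ind in pop])
    print("best worst-case revisit gap (min):", round(fitness.min(), 1))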