866 results for the Fuzzy Colour Segmentation Algorithm


Relevance:

100.00%

Publisher:

Abstract:

The new generation of artificial satellites is providing a huge amount of Earth observation images whose exploitation can yield invaluable benefits, both economic and environmental. However, only a small fraction of this data volume has been analyzed, mainly due to the large human resources needed for that task. In this sense, the development of unsupervised methodologies for the analysis of these images is a priority. In this work, a new unsupervised segmentation algorithm for satellite images is proposed. This algorithm is based on rough-set theory and is inspired by a previous segmentation algorithm defined in the RGB color domain. The main contributions of the new algorithm are: (i) the original algorithm is extended to four spectral bands; (ii) the superpixel concept is used to define the neighborhood similarity of a pixel, adapted to the local characteristics of each image; and (iii) two new region-merging strategies are proposed and evaluated in order to establish the final number of regions in the segmented image. The experimental results show that the proposed approach improves the results provided by the original method when both are applied to satellite images with different spectral and spatial resolutions.
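A minimal sketch of the superpixel-based neighborhood idea in contribution (ii), using scikit-image's SLIC on a synthetic four-band array; this is illustrative code, not the authors' rough-set algorithm, and the band values, segment count and similarity measure are assumptions.

import numpy as np
from skimage.segmentation import slic

# Synthetic 4-band image (rows, cols, bands); real data would be e.g. R, G, B, NIR.
image = np.random.rand(128, 128, 4)

# Superpixels adapt the neighborhood to the local structure of the image.
labels = slic(image, n_segments=400, compactness=10.0, channel_axis=-1)

def neighborhood_similarity(image, labels, row, col):
    """Similarity of a pixel to the mean spectrum of its superpixel."""
    mask = labels == labels[row, col]
    mean_spectrum = image[mask].mean(axis=0)
    # Euclidean distance in the 4-band spectral space, mapped to (0, 1].
    return 1.0 / (1.0 + np.linalg.norm(image[row, col] - mean_spectrum))

print(neighborhood_similarity(image, labels, 64, 64))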

Relevance:

100.00%

Publisher:

Abstract:

The Hybrid Monte Carlo algorithm is adapted to the simulation of a system of classical degrees of freedom coupled to non-self-interacting lattice fermions. The diagonalization of the Hamiltonian matrix is avoided by introducing a path-integral formulation of the problem in d + 1 Euclidean space-time. A perfect-action formulation allows one to work in continuum Euclidean time, without the need for a Trotter–Suzuki extrapolation. To demonstrate the feasibility of the method we study the Double Exchange Model in three dimensions. The complexity of the algorithm grows only linearly with the system volume, allowing lattices as large as 16³ to be simulated on a personal computer. We conclude that the second-order paramagnetic–ferromagnetic phase transition of double-exchange materials close to half-filling belongs to the universality class of the three-dimensional classical Heisenberg model.
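A minimal Hybrid (Hamiltonian) Monte Carlo sketch, shown only to illustrate the leapfrog-plus-Metropolis structure of the method; it samples a toy Gaussian action rather than the paper's perfect-action path integral, and the step size and trajectory length are arbitrary.

import numpy as np

rng = np.random.default_rng(0)

def grad_S(x):
    # Gradient of the toy Gaussian action S(x) = sum(x**2) / 2.
    return x

def hmc_step(x, eps=0.1, n_leapfrog=20):
    p = rng.normal(size=x.shape)              # refresh momenta
    x_new, p_new = x.copy(), p.copy()
    p_new -= 0.5 * eps * grad_S(x_new)        # leapfrog: half kick
    for _ in range(n_leapfrog - 1):
        x_new += eps * p_new                  # drift
        p_new -= eps * grad_S(x_new)          # full kick
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad_S(x_new)        # final half kick
    dH = (0.5 * p_new**2 + 0.5 * x_new**2).sum() - (0.5 * p**2 + 0.5 * x**2).sum()
    return x_new if rng.random() < np.exp(-dH) else x   # Metropolis accept/reject

x = np.zeros(10)
samples = []
for _ in range(2000):
    x = hmc_step(x)
    samples.append(x.copy())
print("sample variance ≈", np.var(samples))   # should approach 1 for this action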

Relevance:

100.00%

Publisher:

Abstract:

Power flow calculations are one of the most important tools for power system planning and operation. The need to account for uncertainties when performing power flow studies led, among other methods, to the development of the fuzzy power flow (FPF). These models are especially interesting when information is scarce, which is a common situation in liberalized power systems (where generation and commercialization of electricity are market activities). In this framework, the symmetric/constrained fuzzy power flow (SFPF/CFPF) was proposed in order to avoid some of the problems of the original FPF model. The SFPF/CFPF models are suitable for quantifying the adequacy of the transmission network to satisfy “reasonable demands for the transmission of electricity” as defined, for instance, in the European Directive 2009/72/EC. This work illustrates how the SFPF/CFPF may be used to evaluate the impact of specific investments in new network elements on the adequacy of a transmission system.
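A minimal sketch of the alpha-cut interval arithmetic that fuzzy power flow builds on, not the SFPF/CFPF formulation itself: two triangular fuzzy injections are propagated through assumed DC power-flow sensitivities (PTDFs) to a monitored line flow.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (min, mode, max) at level alpha."""
    lo = tri[0] + alpha * (tri[1] - tri[0])
    hi = tri[2] - alpha * (tri[2] - tri[1])
    return lo, hi

gen_bus2 = (40.0, 50.0, 60.0)        # fuzzy injection at bus 2, MW (assumed)
load_bus3 = (-120.0, -100.0, -90.0)  # fuzzy (negative) injection at bus 3, MW (assumed)
ptdf = {"bus2": 0.4, "bus3": -0.6}   # assumed sensitivities of the monitored line

for alpha in (0.0, 0.5, 1.0):
    g_lo, g_hi = alpha_cut(gen_bus2, alpha)
    l_lo, l_hi = alpha_cut(load_bus3, alpha)
    # With one positive and one negative sensitivity, the extreme flows combine
    # the corresponding interval endpoints.
    f_lo = ptdf["bus2"] * g_lo + ptdf["bus3"] * l_hi
    f_hi = ptdf["bus2"] * g_hi + ptdf["bus3"] * l_lo
    print(f"alpha={alpha:.1f}: line flow in [{f_lo:.1f}, {f_hi:.1f}] MW")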

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis project is to automatically localize HCC tumors in the human liver and subsequently predict whether the tumor will undergo microvascular infiltration (MVI), the initial stage of metastasis development. The input data for the work have been partially supplied by Sant'Orsola Hospital and partially downloaded from online medical databases. Two U-Net models have been implemented for the automatic segmentation of the liver and of the HCC malignancies within it. The segmentation models have been evaluated with the Intersection-over-Union and Dice Coefficient metrics. The outcomes obtained for the automatic liver segmentation are quite good (IoU = 0.82; DC = 0.35); the outcomes obtained for the automatic tumor segmentation (IoU = 0.35; DC = 0.46) are, instead, affected by some limitations: it can be stated that the algorithm is almost always able to detect the location of the tumor, but it tends to underestimate its dimensions. The purpose is to obtain the CT images of the HCC tumors, which are necessary for feature extraction. The 14 Haralick features calculated from the 3D-GLCM, the 120 radiomic features and the patients' clinical information are collected to build a dataset of 153 features. The goal is then to build a model able to discriminate, based on the features given, the tumors that will undergo MVI from those that will not. This task can be seen as a classification problem: each tumor needs to be classified either as “MVI positive” or “MVI negative”. Techniques for feature selection are implemented to identify the most descriptive features for the problem at hand, and then a set of classification models are trained and compared. Among all, the models with the best performance (around 80-84% ± 8-15%) turn out to be the XGBoost classifier, the SGD classifier and the logistic regression models (without penalization and with Lasso, Ridge or Elastic Net penalization).
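A minimal sketch of the two segmentation metrics named above (Intersection-over-Union and Dice coefficient) on binary masks; this is illustrative NumPy code, not the thesis pipeline, and the toy masks are invented.

import numpy as np

def iou(pred, target):
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0

def dice(pred, target):
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    total = pred.sum() + target.sum()
    return 2.0 * inter / total if total else 1.0

# Toy masks: a ground-truth tumor and a smaller predicted region (the abstract
# notes the model tends to underestimate tumor size).
gt = np.zeros((64, 64), dtype=bool); gt[20:40, 20:40] = True
pr = np.zeros((64, 64), dtype=bool); pr[24:36, 24:36] = True
print(f"IoU = {iou(pr, gt):.2f}, Dice = {dice(pr, gt):.2f}")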

Relevance:

100.00%

Publisher:

Abstract:

Driving simulators emulate a real vehicle drive in a virtual environment. One of the most challenging problems in this field is to make the simulated drive as realistic as possible, deceiving the driver's senses into believing they are in a real vehicle. This thesis first provides an overview of the Stuttgart driving simulator with a description of the overall system, followed by a theoretical presentation of the commonly used motion cueing algorithms. The second and predominant part of the work presents the implementation of the classical and optimal washout algorithms in a Simulink environment. The project aims to create a new optimal washout algorithm and compare the obtained results with the results of the classical washout. The classical washout algorithm, already implemented in the Stuttgart driving simulator, is the most widely used in the motion control of the simulator. This classical algorithm is based on a sequence of filters in which each parameter has a clear physical meaning and a unique assignment to a single degree of freedom. However, the effects on human perception are not exploited, and each parameter must be tuned online by an engineer in the control room, depending on the driver's feeling. To overcome this problem and also take the driver's sensations into account, the optimal washout motion cueing algorithm was implemented. This optimal-control-based algorithm treats motion cueing as a tracking problem, forcing the accelerations perceived in the simulator to track the accelerations that would have been perceived in a real vehicle, by minimizing the perception error within the constraints of the motion platform. The last chapter presents a comparison between the two algorithms, based on the driver's feelings after the test drive. First, an off-line test with a step acceleration input was implemented to verify the behaviour of the simulator. Second, the algorithms were executed in the simulator during a test drive on several tracks.
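A minimal sketch of the high-pass ("washout") filtering at the core of a classical washout algorithm, restricted to one translational channel; it is illustrative SciPy code, not the Stuttgart simulator implementation, and the sample rate, cutoff and step amplitude are assumed values.

import numpy as np
from scipy import signal

fs = 100.0                           # sample rate, Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
accel = np.where(t > 1.0, 2.0, 0.0)  # step in longitudinal acceleration, m/s^2

# Third-order high-pass filter: sustained accelerations are washed out, so the
# platform reproduces only the onset cue and then drifts back towards neutral.
f_washout_hz = 0.2                   # washout cutoff (tuning parameter)
b, a = signal.butter(3, f_washout_hz / (fs / 2), btype="highpass")
accel_hp = signal.lfilter(b, a, accel)

# Double integration of the filtered acceleration gives the platform displacement.
vel = np.cumsum(accel_hp) / fs
pos = np.cumsum(vel) / fs
print(f"peak platform excursion ≈ {pos.max():.2f} m")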

Relevance:

100.00%

Publisher:

Abstract:

Monte Carlo track structure (MCTS) simulations have been recognized as useful tools for radiobiological modeling. However, the authors noticed several issues regarding the consistency of reported data. Therefore, in this work, they analyze the impact of various user-defined parameters on simulated direct DNA damage yields. In addition, they draw attention to discrepancies in the published literature in DNA strand break (SB) yields and selected methodologies. The MCTS code Geant4-DNA was used to compare radial dose profiles in a nanometer-scale region of interest (ROI) for photon sources of varying sizes and energies. Then, electron tracks of 0.28 keV-220 keV were superimposed on a geometric DNA model composed of 2.7 × 10^6 nucleosomes, and SBs were simulated according to four definitions based on energy deposits or energy transfers in DNA strand targets compared to a threshold energy E_TH. The SB frequencies and complexities in nucleosomes as a function of incident electron energy were obtained. SBs were classified into higher-order clusters such as single and double strand breaks (SSBs and DSBs) based on inter-SB distances and on the number of affected strands. Comparisons of different nonuniform dose distributions lacking charged-particle equilibrium may lead to erroneous conclusions regarding the effect of energy on relative biological effectiveness. The energy transfer-based SB definitions give SB yields similar to the definition based on energy deposit when E_TH ≈ 10.79 eV, but deviate significantly for higher E_TH values. Between 30 and 40 nucleosomes/Gy show at least one SB in the ROI. The number of nucleosomes that present a complex damage pattern of more than 2 SBs, and the degree of complexity of the damage in these nucleosomes, diminish as the incident electron energy increases. DNA damage classification into SSBs and DSBs is highly dependent on the definitions of these higher-order structures and their implementations. The authors show that, for the four studied models, the expected yields differ by up to 54% for SSBs and by up to 32% for DSBs, as a function of the incident electron energy and of the models being compared. MCTS simulations allow comparison of direct DNA damage types and complexities induced by ionizing radiation. However, simulation results depend to a large degree on user-defined parameters, definitions, and algorithms such as the DNA model, dose distribution, SB definition, and DNA damage clustering algorithm. These interdependencies should be well controlled during the simulations and explicitly reported when comparing results to experiments or calculations.
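A minimal sketch of classifying strand breaks into SSBs and DSBs from inter-SB distance and the strands affected, as described above; the data structure, the 10 bp separation threshold and the example breaks are illustrative assumptions, not the Geant4-DNA study's implementation.

from dataclasses import dataclass

@dataclass
class StrandBreak:
    position_bp: int   # position along the DNA, in base pairs
    strand: int        # 1 or 2

def classify(breaks, max_separation_bp=10):
    """Count DSBs (two SBs on opposite strands within max_separation_bp) and the remaining SSBs."""
    breaks = sorted(breaks, key=lambda sb: sb.position_bp)
    used = set()
    dsb = 0
    for i, sb in enumerate(breaks):
        if i in used:
            continue
        for j in range(i + 1, len(breaks)):
            if breaks[j].position_bp - sb.position_bp > max_separation_bp:
                break
            if j not in used and breaks[j].strand != sb.strand:
                dsb += 1
                used.update((i, j))
                break
    ssb = len(breaks) - 2 * dsb
    return ssb, dsb

breaks = [StrandBreak(100, 1), StrandBreak(104, 2), StrandBreak(300, 1)]
print(classify(breaks))   # -> (1, 1): one DSB plus one isolated SSB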

Relevance:

100.00%

Publisher:

Abstract:

The efficacy of the human papillomavirus type 16 (HPV-16)/HPV-18 AS04-adjuvanted vaccine against cervical infections with HPV in the Papilloma Trial against Cancer in Young Adults (PATRICIA) was evaluated using a combination of the broad-spectrum L1-based SPF10 PCR-DNA enzyme immunoassay (DEIA)/line probe assay (LiPA25) system with type-specific PCRs for HPV-16 and -18. Broad-spectrum PCR assays may underestimate the presence of HPV genotypes present at relatively low concentrations in multiple infections, due to competition between genotypes. Therefore, samples were retrospectively reanalyzed using a testing algorithm incorporating the SPF10 PCR-DEIA/LiPA25 plus a novel E6-based multiplex type-specific PCR and reverse hybridization assay (MPTS12 RHA), which permits detection of a panel of nine oncogenic HPV genotypes (types 16, 18, 31, 33, 35, 45, 52, 58, and 59). For the vaccine against HPV types 16 and 18, there was no major impact on estimates of vaccine efficacy (VE) for incident or 6-month or 12-month persistent infections when the MPTS12 RHA was included in the testing algorithm versus estimates with the protocol-specified algorithm. However, the alternative testing algorithm showed greater sensitivity than the protocol-specified algorithm for detection of some nonvaccine oncogenic HPV types. More cases were gained in the control group than in the vaccine group, leading to higher point estimates of VE for 6-month and 12-month persistent infections for the nonvaccine oncogenic types included in the MPTS12 RHA assay (types 31, 33, 35, 45, 52, 58, and 59). This post hoc analysis indicates that the per-protocol testing algorithm used in PATRICIA underestimated the VE against some nonvaccine oncogenic HPV types and that the choice of the HPV DNA testing methodology is important for the evaluation of VE in clinical trials. (This study has been registered at ClinicalTrials.gov under registration no. NCT00122681.).

Relevance:

100.00%

Publisher:

Abstract:

Objective: To evaluate the occurrence of severe obstetric complications associated with antepartum and intrapartum hemorrhage among women from the Brazilian Network for Surveillance of Severe Maternal Morbidity. Design: Multicenter cross-sectional study. Setting: Twenty-seven obstetric referral units in Brazil between July 2009 and June 2010. Population: A total of 9555 women categorized as having obstetric complications. Methods: The occurrence of potentially life-threatening conditions, maternal near miss and maternal deaths associated with antepartum and intrapartum hemorrhage was evaluated. Sociodemographic and obstetric characteristics and the use of criteria for management of severe bleeding were also assessed in these women. Main outcome measures: Prevalence ratios with their respective 95% confidence intervals, adjusted for the cluster effect of the design; multiple logistic regression analysis was performed to identify factors independently associated with the occurrence of severe maternal outcome. Results: Antepartum and intrapartum hemorrhage occurred in only 8% (767) of women experiencing any type of obstetric complication. However, it was responsible for 18.2% (140) of maternal near miss and 10% (14) of maternal death cases. On multivariate analysis, maternal age and previous cesarean section were shown to be independently associated with an increased risk of severe maternal outcome (near miss or death). Conclusion: Severe maternal outcome due to antepartum and intrapartum hemorrhage was highly prevalent among Brazilian women. Certain risk factors, maternal age and previous cesarean delivery in particular, were associated with the occurrence of bleeding.

Relevance:

100.00%

Publisher:

Abstract:

Aims. We report the discovery of very shallow (ΔF/F ≈ 3.4 × 10^-4), periodic dips in the light curve of an active V = 11.7 G9V star observed by the CoRoT satellite, which we interpret as caused by a transiting companion. We describe the 3-colour CoRoT data and complementary ground-based observations that support the planetary nature of the companion. Methods. We used CoRoT colour information, good angular resolution ground-based photometric observations in and out of transit, adaptive optics imaging, near-infrared spectroscopy, and preliminary results from radial velocity measurements to test the diluted eclipsing binary scenarios. The parameters of the host star were derived from optical spectra, which were then combined with the CoRoT light curve to derive parameters of the companion. Results. We examined all conceivable cases of false positives carefully, and all the tests support the planetary hypothesis. Blends with separation >0.40 arcsec or triple systems are almost excluded, with a residual risk of 8 × 10^-4. We conclude that, inasmuch as we have been exhaustive, we have discovered a planetary companion, named CoRoT-7b, for which we derive a period of 0.85359 ± 3 × 10^-5 day and a radius of R_p = 1.68 ± 0.09 R_Earth. Analysis of preliminary radial velocity data yields an upper limit of 21 M_Earth for the companion mass, supporting the finding. Conclusions. CoRoT-7b is very likely the first Super-Earth with a measured radius. This object illustrates what will probably become a common situation with missions such as Kepler, namely the need to establish the planetary origin of transits in the absence of a firm radial velocity detection and mass measurement. The composition of CoRoT-7b remains loosely constrained without a precise mass. A very high surface temperature on its irradiated face, approximately 1800-2600 K at the substellar point, and a very low one, approximately 50 K, on its dark face assuming no atmosphere, have been derived.
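A small back-of-the-envelope check (not from the paper) of how the quoted transit depth relates to the planetary radius via ΔF/F ≈ (R_p/R_s)²; the stellar radius of about 0.87 solar radii is an assumed value typical of a G9V dwarf.

import math

depth = 3.4e-4                 # Delta F/F from the CoRoT light curve
r_star_in_r_sun = 0.87         # assumed G9V stellar radius
r_sun_in_r_earth = 109.2       # solar radius expressed in Earth radii

r_planet = math.sqrt(depth) * r_star_in_r_sun * r_sun_in_r_earth
print(f"Rp ≈ {r_planet:.2f} R_Earth")   # of the order of the quoted 1.68 ± 0.09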

Relevance:

100.00%

Publisher:

Abstract:

A fuzzy control strategy for voltage regulation in electric power distribution systems is introduced in this article. This real-time controller would act on power transformers equipped with under-load tap changers. The fuzzy system was employed to turn the voltage-control relays into adaptive devices. The scope of the present study has been limited to the power distribution substation, and both the voltage measurements and control actions are carried out on the secondary bus. The capacity of fuzzy systems to handle approximate data, together with their unique ability to interpret qualitative information, makes it possible to design voltage control strategies that satisfy both the requirements of the Brazilian regulatory bodies and the real concerns of the electric power distribution companies. A prototype based on the fuzzy control strategy proposed in this paper has also been implemented for validation purposes, and its experimental results were highly satisfactory.
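A minimal rule-based sketch of fuzzy tap-changer voltage control; the membership functions, thresholds and defuzzification below are assumptions for illustration and do not reproduce the authors' controller.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def tap_action(v_pu):
    """Map the measured secondary-bus voltage (per unit) to a tap command."""
    low  = tri(v_pu, 0.90, 0.95, 1.00)   # voltage LOW  -> raise tap
    ok   = tri(v_pu, 0.97, 1.00, 1.03)   # voltage OK   -> hold
    high = tri(v_pu, 1.00, 1.05, 1.10)   # voltage HIGH -> lower tap
    # Weighted-average defuzzification over the crisp actions {+1, 0, -1}.
    num = low * (+1) + ok * 0 + high * (-1)
    den = low + ok + high
    return num / den if den else 0.0

for v in (0.94, 1.00, 1.06):
    print(v, "->", round(tap_action(v), 2))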

Relevance:

100.00%

Publisher:

Abstract:

This research presents a method for frequency estimation in power systems using an adaptive filter based on the least mean square (LMS) algorithm. In order to analyze a power system, the three-phase voltages were converted into a complex signal by applying the αβ-transform, and the result was used in an adaptive filtering algorithm. Although the use of the complex LMS algorithm is described in the literature, this paper deals with some practical aspects of its implementation. In order to reduce computing time, a coefficient generator was implemented. For the validation of the algorithm, a computer simulation of a power system was carried out using the ATP software. Many different situations were simulated for the performance analysis of the proposed methodology. The results were compared to a commercial relay for validation, showing the advantages of the new method. (C) 2009 Elsevier Ltd. All rights reserved.
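A minimal sketch of frequency estimation with a complex LMS adaptive filter applied to the αβ signal: a one-tap predictor whose converged weight encodes the system frequency. This is illustrative code under assumed sampling and test conditions, not the paper's implementation or its coefficient generator.

import numpy as np

fs = 2000.0                      # sampling rate, Hz (assumed)
f_true = 60.2                    # actual system frequency, Hz (assumed test case)
t = np.arange(0.0, 0.5, 1.0 / fs)

# Balanced three-phase voltages -> complex signal v_alpha + j*v_beta.
va = np.cos(2 * np.pi * f_true * t)
vb = np.cos(2 * np.pi * f_true * t - 2 * np.pi / 3)
vc = np.cos(2 * np.pi * f_true * t + 2 * np.pi / 3)
valpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)
vbeta = (2.0 / 3.0) * (np.sqrt(3) / 2.0) * (vb - vc)
s = valpha + 1j * vbeta

# One-tap complex LMS predictor: s[n] ≈ w * s[n-1], ideally w = exp(j*2*pi*f/fs).
mu = 0.5
w = np.exp(1j * 2 * np.pi * 60.0 / fs)    # start at the nominal 60 Hz
for n in range(1, len(s)):
    e = s[n] - w * s[n - 1]
    w += mu * e * np.conj(s[n - 1])

f_est = np.angle(w) * fs / (2 * np.pi)
print(f"estimated frequency ≈ {f_est:.3f} Hz")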

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this paper is to relieve power system engineers of the burden of the complex and time-consuming process of power system stabilizer (PSS) tuning. To achieve this goal, the paper proposes an automatic process for computerized tuning of PSSs, based on an iterative process that uses a linear matrix inequality (LMI) solver to find the PSS parameters. It is shown in the paper that PSS tuning can be written as a search problem over a non-convex feasible set. The proposed algorithm solves this feasibility problem using an iterative LMI approach and a suitable initial condition, corresponding to a PSS designed for nominal operating conditions only (which is quite a simple task, since the required phase compensation is uniquely defined). Some knowledge about PSS tuning is also incorporated in the algorithm through the specification of bounds defining the allowable PSS parameters. The application of the proposed algorithm to a benchmark test system and the nonlinear simulation of the resulting closed-loop models demonstrate the efficiency of this algorithm. (C) 2009 Elsevier Ltd. All rights reserved.
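A minimal sketch of the kind of LMI feasibility problem an iterative procedure like this solves at each step: find P > 0 with AᵀP + PA < 0 for a closed-loop matrix A, here posed with cvxpy on a toy 2x2 system rather than a power-system model or the paper's actual parameterization.

import cvxpy as cp
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])           # assumed stable closed-loop matrix
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(2),
               A.T @ P + P @ A << -eps * np.eye(2)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print(problem.status, np.round(P.value, 3))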

Relevance:

100.00%

Publisher:

Abstract:

In this article a novel algorithm based on the chemotaxis process of Escherichia coli is developed to solve multiobjective optimization problems. The algorithm uses a fast nondominated sorting procedure, communication between the colony members, and a simple chemotactic strategy to change the bacterial positions in order to explore the search space and find several optimal solutions. The proposed algorithm is validated using 11 benchmark problems and implementing three different performance measures to compare its performance with the NSGA-II genetic algorithm and the particle swarm-based algorithm NSPSO. (C) 2009 Elsevier Ltd. All rights reserved.
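A minimal sketch of the fast nondominated sorting step mentioned above, for a minimization problem; this is illustrative code with invented objective vectors, not the bacterial-chemotaxis algorithm itself.

def dominates(a, b):
    """True if objective vector a dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(objectives):
    """Return a list of fronts, each a list of indices into objectives."""
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    counts = [0] * n                        # number of solutions dominating i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objectives[i], objectives[j]):
                dominated_by[i].append(j)
            elif dominates(objectives[j], objectives[i]):
                counts[i] += 1
        if counts[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

points = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 4)]
print(fast_nondominated_sort(points))   # first front: indices 0, 1, 2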

Relevance:

100.00%

Publisher:

Abstract:

Many authors point out that the front-end of new product development (NPD) is a critical success factor in the NPD process and that numerous companies face difficulties in carrying it out appropriately. Therefore, it is important to develop new theories and proposals that support the effective implementation of this earliest phase of NPD. This paper presents a new method to support the development of front-end activities based on integrating technology roadmapping (TRM) and project portfolio management (PPM). This new method, called the ITP Method, was implemented at a small Brazilian high-tech company in the nanotechnology industry to explore the integration proposal. The case study demonstrated that the ITP Method provides a systematic procedure for the fuzzy front-end and integrates innovation perspectives into a single roadmap, which allows for a better alignment of business efforts and communication of product innovation goals. Furthermore, the results indicated that the method may also improve quality, functional integration and strategy alignment. (C) 2010 Elsevier Inc. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a domain boundary element formulation for inelastic saturated porous media with rate-independent behavior for the solid skeleton. The formulation is then applied to elastic-plastic behavior of the solid. Biot's consolidation theory, extended to include irreversible phenomena, is considered, and the direct boundary element technique is used for the numerical solution after time discretization by the implicit backward Euler algorithm. The associated nonlinear algebraic problem is solved by the Newton-Raphson procedure, whereby the loading/unloading conditions are fully taken into account and the consistent tangent operator is defined. Only domain nodes (nodes defined inside the domain) are used to represent all domain values, and the corresponding integrals are computed using an accurate sub-elementation scheme. The developments are illustrated through the Drucker-Prager elastic-plastic model for the solid skeleton, and various examples are analyzed with the proposed algorithms. (c) 2008 Elsevier B.V. All rights reserved.
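A minimal Newton-Raphson sketch for a nonlinear algebraic system R(u) = 0 with an exact (consistent) tangent, the kind of solve performed at each time step; the residual here is a toy two-equation system, not the poroelastic boundary element equations.

import numpy as np

def residual(u):
    return np.array([u[0]**3 + u[1] - 1.0,
                     u[1]**3 - u[0] + 1.0])

def tangent(u):
    # Consistent tangent: the exact Jacobian of the residual.
    return np.array([[3.0 * u[0]**2, 1.0],
                     [-1.0, 3.0 * u[1]**2]])

u = np.array([0.5, 0.5])
for it in range(20):
    r = residual(u)
    if np.linalg.norm(r) < 1e-10:
        break
    u -= np.linalg.solve(tangent(u), r)
print(f"converged in {it} iterations to u = {u}")   # exact solution is (1, 0)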