866 results for the Fuzzy Colour Segmentation Algorithm
Abstract:
The thermodynamic consistency of almost 90 VLE data series, covering both isothermal and isobaric conditions and systems of both total and partial miscibility in the liquid phase, has been examined by means of the area and point-to-point tests. In addition, the Gibbs energy of mixing function calculated from these experimental data has been inspected, with a rather surprising result: certain data sets exhibiting high dispersion, or leading to Gibbs energy of mixing curves inconsistent with the total or partial miscibility of the liquid phase, nevertheless pass the tests. Several possible inconsistencies in the tests themselves, or in their application, are discussed. Related to this is a very interesting and ambitious initiative that arose within NIST: the development of an algorithm to assess the quality of experimental VLE data. The present paper questions the applicability of two of the five tests combined in that algorithm. It further shows that the deviation of the experimental VLE data from the correlation obtained with a given model, the basis of some point-to-point tests, should not be used to evaluate the quality of those data.
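The area test referred to above can be illustrated numerically. The toy sketch below is not taken from the paper: it generates activity coefficients from the two-suffix Margules model (ln γ1 = A·x2², ln γ2 = A·x1²), which satisfies the Gibbs-Duhem equation exactly, and checks that the area integral of ln(γ1/γ2) over the full composition range vanishes, as the test requires for consistent data.

```python
# Hypothetical sketch of the area (Redlich-Kister) test for VLE consistency.
# Activity coefficients come from the two-suffix Margules model, which obeys
# the Gibbs-Duhem equation exactly, so the area integral of ln(gamma1/gamma2)
# over x1 in [0, 1] should vanish.

A = 1.2  # assumed Margules parameter (dimensionless)

def ln_gamma_ratio(x1, A=A):
    """ln(gamma1/gamma2) for the two-suffix Margules model."""
    x2 = 1.0 - x1
    return A * (x2 ** 2 - x1 ** 2)

def area_test(n=1000):
    """Trapezoidal integral of ln(gamma1/gamma2) over the composition range."""
    h = 1.0 / n
    total = 0.5 * (ln_gamma_ratio(0.0) + ln_gamma_ratio(1.0))
    for i in range(1, n):
        total += ln_gamma_ratio(i * h)
    return total * h

residual = area_test()  # should be ~0 for thermodynamically consistent data
```

For real data the integrand is evaluated from measured compositions and pressures, and a nonzero residual area signals inconsistency.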
Abstract:
We present an extension of the logic outer-approximation algorithm for dealing with disjunctive discrete-continuous optimal control problems whose dynamic behavior is modeled in terms of differential-algebraic equations. Although the proposed algorithm can be applied to a wide variety of discrete-continuous optimal control problems, we are mainly interested in problems where disjunctions are present. Disjunctions are included so that only those parts of the underlying model that become relevant under certain processing conditions are taken into account. This improves the numerical robustness of the optimization algorithm, since the parts of the model that are not active are discarded, reducing the problem size and avoiding potential model singularities. We test the proposed algorithm on three examples with complex dynamic behavior of differing kinds. In all the case studies the number of iterations and the computational effort required to obtain the optimal solutions are modest, and the solutions are relatively easy to find.
Abstract:
This study describes a novel spectral LED-based tunable light source for customized lighting solutions, especially for the reconstruction of CIE (Commission Internationale de l'Éclairage) standard illuminants. The light source comprises 31 spectral bands ranging from 400 to 700 nm, an integrating cube and a control board with 16-bit resolution. A minimization algorithm was applied to calculate the weighting values for each channel so that the illuminants could be reproduced with precision. The differences in spectral fitting and colorimetric parameters showed that the reconstructed spectra were comparable to the standards, especially for the D65, D50, A and E illuminants. Accurate results were also obtained for illuminants with narrow peaks, such as fluorescents (F2 and F11) and a high-pressure sodium lamp (HP1). In conclusion, the developed spectral LED-based light source and the minimization algorithm can reproduce any CIE standard illuminant with high spectral and colorimetric accuracy, advancing the custom lighting systems available to industry and to other fields such as museum lighting.
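A minimal sketch of how per-channel weighting values can be obtained by least-squares minimization, assuming a toy 3-channel system rather than the paper's 31-channel source; the channel spectra, sample wavelengths and target below are invented for illustration, and the unconstrained normal-equations solve stands in for the paper's (unspecified) constrained minimizer.

```python
# Toy sketch of weighting-value calculation for a multi-channel LED source.
# We fit a target spectrum as a weighted sum of hypothetical channel spectra
# by solving the least-squares normal equations (A^T A) w = A^T t with
# Gaussian elimination.

def solve(M, b):
    """Solve M x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical channel spectra sampled at 5 wavelengths (rows = wavelengths,
# columns = channels).
channels = [
    [1.0, 0.2, 0.0],
    [0.5, 1.0, 0.1],
    [0.1, 0.6, 1.0],
    [0.0, 0.2, 0.8],
    [0.0, 0.0, 0.3],
]
true_w = [0.6, 0.3, 0.9]  # assumed "ground truth" channel weights
target = [sum(c * w for c, w in zip(row, true_w)) for row in channels]

# Normal equations: (A^T A) w = A^T target
AtA = [[sum(row[i] * row[j] for row in channels) for j in range(3)] for i in range(3)]
Atb = [sum(row[i] * t for row, t in zip(channels, target)) for i in range(3)]
weights = solve(AtA, Atb)  # recovers the weights that reproduce the target
```

A production fit would additionally constrain the weights to the drive levels the 16-bit control board can realize.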
Abstract:
AIM: To define the financial and management conditions required to introduce a femtosecond laser system for cataract surgery in a clinic, using a fuzzy logic approach. METHODS: The simulation performed in the current study considered the costs associated with the acquisition and use of a commercially available femtosecond laser platform for cataract surgery (VICTUS, TECHNOLAS Perfect Vision GmbH, Bausch & Lomb, Munich, Germany) over a period of 5 years. A sensitivity analysis was performed considering these costs and the accounting amortization of the system over this 5-year period. Furthermore, a fuzzy logic analysis was used to estimate the income associated with each femtosecond laser-assisted cataract surgery (G). RESULTS: According to the sensitivity analysis, the femtosecond laser system under evaluation can be profitable if 1400 cataract surgeries are performed per year and each surgery can be invoiced at more than $500. The fuzzy logic analysis indicated that the patient would in fact have to pay more per surgery, between $661.8 and $667.4, without considering the cost of the intraocular lens (IOL). CONCLUSION: Femtosecond laser systems for cataract surgery can be profitable after a detailed financial analysis, especially in centers with large volumes of patients. The cost of the surgery for patients should be adapted to the real patient flow and to patients' ability to pay within a reasonable cost range.
Abstract:
Tourmaline from a gem-quality deposit in the Grenville province has been studied with X-ray diffraction, visible-near infrared spectroscopy, Fourier transform infrared spectroscopy, scanning electron microscopy, electron microprobe and optical measurements. The tourmaline is found within tremolite-rich calc-silicate pods hosted in marble of the Central Metasedimentary Belt. The crystals are greenish-greyish-brown and have yielded facetable material up to 2.09 carats in size. Using the classification of Henry et al. (2011), the tourmaline is classified as a dravite, with a representative formula of (Na0.73Ca0.238□0.032)(Mg2+2.913Fe2+0.057Ti4+0.030)(Al3+5.787Fe3+0.017Mg2+0.14)(Si6.013O18)(BO3)3(OH)3((OH,O)0.907F0.093). Rietveld analysis of powder diffraction data gives a = 15.9436(8) Å, c = 7.2126(7) Å and a unit cell volume of 1587.8 Å³. A polished thin section was cut perpendicular to the c-axis of one tourmaline crystal, which showed zoning from a dark brown core outward through a lighter zone, a thin darker zone, and finally a lighter rim. The geochemical data reveal three key stages of crystal growth within this thin section. The first, core stage extends from the dark core to the first colourless zone; the second runs from this colourless zone, increasing in brown colour, to an outer limit where the colour abruptly disappears; the third begins at that sharp change and is entirely colourless. These stages are the result of metamorphism and of hydrothermal fluids derived from nearby felsic intrusive plutons. Scanning electron microscope and electron microprobe traverses across this cross-section revealed that the green colour results from iron present throughout the system, while the brown colour is correlated with titanium content. Crystal inclusions of chlorapatite and zircon in the tourmaline were identified by petrographic analysis, confirmed using scanning electron microscope data, and occur within the third stage of formation.
Abstract:
Evolutionary-based algorithms play an important role in finding solutions to many problems that are not solved by classical methods, particularly where solutions lie within extreme non-convex multidimensional spaces. The intrinsic parallel structure of evolutionary algorithms is amenable to the simultaneous testing of multiple solutions; this has proved essential to circumventing local optima, but such robustness comes with high computational overhead, though the use of custom digital processors may reduce this cost. This paper presents a new implementation of an old, and almost forgotten, evolutionary algorithm: the population-based incremental learning method. We show that the structure of this algorithm is better suited to implementation within programmable logic than that of contemporary genetic algorithms. Further, the inherent concurrency of our FPGA implementation facilitates the integration and testing of micro-populations.
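As a software-only sketch (the paper's contribution is an FPGA implementation), the core of population-based incremental learning can be written as below; the OneMax fitness function, population size and learning rate are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch of Population-Based Incremental Learning (PBIL). A single
# probability vector replaces the explicit population of a genetic algorithm:
# each generation samples a micro-population from the vector, then nudges the
# vector toward the best sample.
import random

def pbil(fitness, n_bits, pop_size=20, generations=100, lr=0.1, seed=42):
    rng = random.Random(seed)
    p = [0.5] * n_bits  # probability that each bit is 1
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        # Sample a micro-population from the probability vector.
        pop = [[1 if rng.random() < pi else 0 for pi in p] for _ in range(pop_size)]
        elite = max(pop, key=fitness)
        f = fitness(elite)
        if f > best_fit:
            best, best_fit = elite, f
        # Shift the probability vector toward the elite sample.
        p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, elite)]
    return best, best_fit

# OneMax toy problem: maximise the number of 1-bits.
best, best_fit = pbil(sum, n_bits=20)
```

Because the state is just one probability vector and the update is a per-bit multiply-accumulate, the loop maps naturally onto parallel fixed-point hardware.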
Abstract:
This study evaluates the degree of segmentation of the market for agricultural machinery and equipment in the EU. We focus on agricultural tractors, the most common and largest investment in machinery and equipment in the agricultural sector. Using country price data for individual tractor models, we test the law of one price, i.e. the existence of a common price for tractors across EU member states. We find that significant price differences exist, yet unlike most other studies we find that large price deviations are penalised within a short time. The study also shows that transport costs are an important source of price differences, as domestic production leads to lower prices on the domestic market and as price convergence is negatively correlated with distance. Finally, price differences should not be understood solely from a geographical perspective, as the evidence supports the idea that farmers' buying power is significant in explaining price differences within countries.
Abstract:
The problem of measuring the similarity of biological signals is considered in this article. The dynamic time warping (DTW) algorithm is examined as a possible solution. A short overview of this algorithm and its modifications is given, and a testing procedure for the different modifications of DTW, based on artificial test signals, is presented.
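The classic dynamic-programming recurrence on which the DTW modifications build can be sketched as:

```python
# Minimal sketch of dynamic time warping (DTW) between two 1-D signals,
# using absolute difference as the local cost. D[i][j] is the cheapest
# cumulative cost of aligning the first i samples of a with the first j of b.

def dtw(a, b):
    """Return the DTW distance between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Common modifications constrain the warping path (e.g. a Sakoe-Chiba band) or change the local cost and step weights; all retain this recurrence at their core.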
Abstract:
The continuous plankton recorder (CPR) survey is an upper-layer plankton monitoring program that has regularly collected samples, at monthly intervals, in the North Atlantic and adjacent seas since 1946. Water from approximately 6 m depth enters the CPR through a small aperture at the front of the sampler and travels down a tunnel where it passes through a silk filtering mesh of 270 µm before exiting at the back of the CPR. The plankton filtered on the silk is analyzed in sections corresponding to 10 nautical miles (approx. 3 m³ of seawater filtered) and the plankton microscopically identified (Richardson et al., 2006 and references therein). In the present study we used the CPR data to investigate the current basin-scale distribution of C. finmarchicus (C5-C6), C. helgolandicus (C5-C6), C. hyperboreus (C5-C6), Pseudocalanus spp. (C6), Oithona spp. (C1-C6), total Euphausiida, total Thecosomata and the presence/absence of Cnidaria, together with the Phytoplankton Colour Index (PCI). The PCI, a visual assessment of the greenness of the silk, is used as an indicator of the distribution of total phytoplankton biomass across the Atlantic basin (Batten et al., 2003). Monthly data collected between 2000 and 2009 were gridded using the inverse-distance interpolation method, in which the interpolated values were the nodes of a 2-degree by 2-degree grid. The resulting twelve monthly matrices were then averaged within the year and, in the case of the zooplankton, the data were log-transformed (i.e. log10(x+1)).
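The inverse-distance gridding and log-transform steps can be sketched as follows; the power parameter and the toy sample points are assumptions for illustration, not values from the CPR study.

```python
# Sketch of inverse-distance-weighted (IDW) interpolation onto a grid node,
# as used to grid the monthly CPR observations, plus the log10(x+1)
# transform applied to zooplankton abundances.
import math

def idw(samples, node, power=2.0, eps=1e-12):
    """IDW estimate at a grid node.

    samples: list of ((x, y), value) observation pairs; node: (x, y).
    """
    num = den = 0.0
    for (x, y), z in samples:
        d2 = (x - node[0]) ** 2 + (y - node[1]) ** 2
        if d2 < eps:                    # node coincides with an observation
            return z
        w = d2 ** (-power / 2.0)        # weight = 1 / distance**power
        num += w * z
        den += w
    return num / den

def log_transform(x):
    """log10(x + 1) transform applied to the zooplankton abundances."""
    return math.log10(x + 1.0)
```

In the study this estimate is evaluated at every node of the 2-degree grid for each monthly field before the within-year averaging.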
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The integration of geo-information from multiple sources and of diverse nature in developing mineral favourability indexes (MFIs) is a well-known problem in mineral exploration and mineral resource assessment. Fuzzy set theory provides a convenient framework to combine and analyse qualitative and quantitative data independently of their source or characteristics. A novel, data-driven formulation for calculating MFIs based on fuzzy analysis is developed in this paper. Different geo-variables are considered fuzzy sets, and their appropriate membership functions are defined and modelled. A new weighted average-type aggregation operator is then introduced to generate a new fuzzy set representing mineral favourability. The membership grades of the new fuzzy set are taken as the MFI. The weights for the aggregation operation combine the individual membership functions of the geo-variables and are derived using information from training areas and L1 regression. The technique is demonstrated in a case study of skarn tin deposits and is used to integrate geological, geochemical and magnetic data. The study area covers a total of 22.5 km² and is divided into 349 cells, which include nine control cells. Nine geo-variables are considered in this study. Depending on the nature of the various geo-variables, four different types of membership functions are used to model their fuzzy membership. © 2002 Elsevier Science Ltd. All rights reserved.
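A weighted average-type aggregation of membership grades can be sketched as below; the ramp membership functions, the weights and the geo-variable values are invented for illustration and are not those of the paper, where the weights come from training areas and regression.

```python
# Hypothetical sketch of aggregating fuzzy membership grades of several
# geo-variables into a mineral favourability index (MFI) for one cell.

def ramp(x, lo, hi):
    """Linear ramp membership function: 0 below lo, 1 above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def mfi(memberships, weights):
    """Weighted-average aggregation of per-variable membership grades."""
    num = sum(w * mu for w, mu in zip(weights, memberships))
    den = sum(weights)
    return num / den

# Three assumed geo-variables for one cell, e.g. Sn geochemistry (ppm),
# magnetic anomaly (nT), and proximity to a granite contact (km, inverted).
memberships = [ramp(120, 50, 200), ramp(80, 0, 100), ramp(1.5, 0, 2)]
weights = [0.5, 0.3, 0.2]  # assumed; in the paper, derived from training cells
index = mfi(memberships, weights)  # the cell's favourability index in [0, 1]
```

Because each membership grade lies in [0, 1] and the weights are normalised by their sum, the resulting MFI is itself a membership grade in [0, 1].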
Abstract:
Fault diagnosis has become an important component in intelligent systems, such as intelligent control systems and intelligent eLearning systems. Reiter's diagnosis theory, described by first-order sentences, has attracted much attention in this field. However, descriptions and observations of most real-world situations involve fuzziness because of the incompleteness and uncertainty of knowledge, e.g. the fault diagnosis of student behaviors in eLearning processes. In this paper, an extension of Reiter's consistency-based diagnosis methodology, Fuzzy Diagnosis, is proposed, which is able to deal with incomplete or fuzzy knowledge. A number of important properties of the Fuzzy Diagnosis scheme are also established. The computation of fuzzy diagnoses is mapped to solving a system of inequalities. Some special cases, abstracted from real-world situations, are discussed. In particular, the fuzzy diagnosis problem in which fuzzy observations are represented by clause-style fuzzy theories is presented, and a method for solving it is given. A student fault-diagnosis problem abstracted from a simplified real-world eLearning case is described to demonstrate the application of our diagnostic framework.
Abstract:
Beyond the inherent technical challenges, current research into the three-dimensional surface correspondence problem is hampered by a lack of uniform terminology, an abundance of application-specific algorithms, and the absence of a consistent model for comparing existing approaches and developing new ones. This paper addresses these challenges by presenting a framework for analysing, comparing, developing, and implementing surface correspondence algorithms. The framework uses five distinct stages to establish correspondence between surfaces. It is general, encompassing a wide variety of existing techniques, and flexible, facilitating the synthesis of new correspondence algorithms. This paper presents a review of existing surface correspondence algorithms and shows how they fit into the correspondence framework. It also shows how the framework can be used to analyse and compare existing algorithms and to develop new ones using the framework's modular structure. Six algorithms, four existing and two new, are implemented using the framework. Each implemented algorithm is used to match a number of surface pairs. Results demonstrate that the framework implementations are faithful to the existing algorithms, and that powerful new surface correspondence algorithms can be created. © 2004 Elsevier Inc. All rights reserved.
Abstract:
We present new measurements of the luminosity function (LF) of luminous red galaxies (LRGs) from the Sloan Digital Sky Survey (SDSS) and the 2dF-SDSS LRG and Quasar (2SLAQ) survey. We have carefully quantified, and corrected for, uncertainties in the K and evolutionary corrections, differences in the colour selection methods, and the effects of photometric errors, thus ensuring we are studying the same galaxy population in both surveys. Using a limited subset of 6326 SDSS LRGs (with 0.17 < z < 0.24) and 1725 2SLAQ LRGs (with 0.5 < z < 0.6), for which the matching colour selection is most reliable, we find no evidence for any additional evolution in the LRG LF, over this redshift range, beyond that expected from a simple passive evolution model. This lack of additional evolution is quantified using the comoving luminosity density of SDSS and 2SLAQ LRGs brighter than M_0.2r − 5 log h_0.7 = −22.5, which are (2.51 ± 0.03) × 10⁻⁷ L⊙ Mpc⁻³ and (2.44 ± 0.15) × 10⁻⁷ L⊙ Mpc⁻³, respectively (< 10 per cent uncertainty). We compare our LFs to the COMBO-17 data and find excellent agreement over the same redshift range. Together, these surveys show no evidence for additional evolution (beyond passive) in the LF of LRGs brighter than M_0.2r − 5 log h_0.7 = −21 (or brighter than ~L*). We test our SDSS and 2SLAQ LFs against a simple 'dry merger' model for the evolution of massive red galaxies and find that at least half of the LRGs at z ≃ 0.2 must already have been well assembled (with more than half their stellar mass) by z ≃ 0.6. This limit is barely consistent with recent results from semi-analytical models of galaxy evolution.
Abstract:
Background. We describe the development, reliability and applications of the Diagnostic Interview for Psychoses (DIP), a comprehensive interview schedule for psychotic disorders. Method. The DIP is intended for use by interviewers with a clinical background and was designed to occupy the middle ground between fully structured, lay-administered schedules and semi-structured, psychiatrist-administered interviews. It encompasses four main domains: (a) demographic data; (b) social functioning and disability; (c) a diagnostic module comprising symptoms, signs and past history ratings; and (d) patterns of service utilization and patient-perceived need for services. It generates diagnoses according to several sets of criteria using the OPCRIT computerized diagnostic algorithm and can be administered either on-screen or in hard-copy format. Results. The DIP proved easy to use and was well accepted in the field. For the diagnostic module, inter-rater reliability was assessed on 20 cases rated by 24 clinicians: good reliability was demonstrated for both ICD-10 and DSM-III-R diagnoses. Seven cases were interviewed 2-11 weeks apart to determine test-retest reliability, with pairwise agreement of 0.8-1.0 for most items. Diagnostic validity was assessed in 10 cases interviewed with the DIP and using the SCAN as 'gold standard': in nine cases clinical diagnoses were in agreement. Conclusions. The DIP is suitable for use in large-scale epidemiological studies of psychotic disorders, as well as in smaller studies where time is at a premium. While the diagnostic module stands on its own, the full DIP schedule, covering demography, social functioning and service utilization, makes it a versatile multi-purpose tool.