981 results for Component method
Abstract:
Nurse rostering is a complex scheduling problem that affects hospital personnel on a daily basis all over the world. This paper presents a new component-based approach with adaptive perturbations for a nurse scheduling problem arising at a major UK hospital. The main idea behind this technique is to decompose a schedule into its components (i.e. the allocated shift pattern of each nurse) and then mimic a natural evolutionary process on these components to iteratively deliver better schedules. The worthiness of every component in the schedule must be continuously demonstrated for it to remain there. This demonstration employs a dynamic evaluation function which assesses how well each component contributes towards the final objective. Two perturbation steps are then applied: the first eliminates a number of components deemed not worthy to stay in the current schedule; the second may also throw out, with low probability, some worthy components. The eliminated components are replenished with new ones generated by a set of constructive heuristics based on local optimality criteria. Computational results on 52 data instances demonstrate the applicability of the proposed approach to real-world problems.
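The loop below is a minimal sketch of this evaluate-perturb-replenish cycle, not the authors' implementation: the names, the worst-quartile cutoff and the 5% noise probability are illustrative assumptions, and evaluate stands in for the paper's dynamic evaluation function.

import random

def adaptive_perturbation(nurses, patterns, evaluate, iters=500, p_noise=0.05):
    # Initial schedule: one shift pattern (component) per nurse.
    schedule = {n: random.choice(patterns[n]) for n in nurses}
    best = dict(schedule)
    best_score = sum(evaluate(n, schedule[n], schedule) for n in nurses)
    for _ in range(iters):
        # Dynamic evaluation: score every component in the current schedule.
        scores = {n: evaluate(n, schedule[n], schedule) for n in nurses}
        cutoff = sorted(scores.values())[len(nurses) // 4]  # worst quartile
        for n in nurses:
            # First perturbation: drop components deemed unworthy.
            # Second perturbation: occasionally drop a worthy one too.
            if scores[n] <= cutoff or random.random() < p_noise:
                # Replenish with a locally good pattern (constructive heuristic).
                schedule[n] = max(patterns[n],
                                  key=lambda p: evaluate(n, p, schedule))
        total = sum(evaluate(n, schedule[n], schedule) for n in nurses)
        if total > best_score:
            best, best_score = dict(schedule), total
    return best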
Abstract:
The aim of this study was to develop a methodology using Raman hyperspectral imaging and chemometric methods for the identification of pre- and post-blast explosive residues on banknote surfaces. The explosives studied were of military, commercial and propellant types. After acquisition of the hyperspectral images, independent component analysis (ICA) was applied to extract the pure spectra and the distribution of the corresponding image constituents. The performance of the methodology was evaluated by the explained variance and the lack of fit of the models, by comparing the ICA-recovered spectra with the reference spectra using correlation coefficients, and by the presence of rotational ambiguity in the ICA solutions. The methodology was applied to forensic samples to solve an automated teller machine explosion case. Independent component analysis proved to be a suitable curve resolution method, achieving performance equivalent to the multivariate curve resolution with alternating least squares (MCR-ALS) method. At low concentrations, MCR-ALS showed some limitations, as it did not provide the correct solution. The detection limit of the methodology presented in this study was 50 μg cm⁻².
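A minimal sketch of the unmixing step, using scikit-learn's FastICA as a stand-in for the paper's ICA and random data in place of a real Raman cube; all shapes and names are illustrative assumptions. The recovered spectra are matched to references by correlation coefficient, mirroring the paper's validation.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_pixels, n_bands = 400, 150
true_spectra = rng.random((2, n_bands))           # hypothetical pure spectra
abund = rng.dirichlet([1.0, 1.0], size=n_pixels)  # per-pixel contributions
X = abund @ true_spectra + 0.01 * rng.standard_normal((n_pixels, n_bands))

ica = FastICA(n_components=2, random_state=0)
maps = ica.fit_transform(X)    # per-pixel scores: constituent distribution maps
spectra = ica.mixing_.T        # recovered constituent spectra (components x bands)

# Match each recovered spectrum to a reference via correlation coefficient.
for s in spectra:
    r = max(abs(np.corrcoef(s, t)[0, 1]) for t in true_spectra)
    print(f"best |r| against references: {r:.2f}")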
Abstract:
This study was conducted in the Mata do Jambreiro Private Reserve (912 ha), located in the Iron Quadrangle, Minas Gerais, in the southeastern portion of the Espinhaço Range, which is predominantly covered by semideciduous seasonal montane forest. Three topographically and physiognomically similar areas located within a continuous forest fragment, 1.3 to 1.5 km apart, were sampled by the point-quadrat method. In each area, 30 points were marked. Individuals with a minimum perimeter at breast height (PBH) of 15 cm were sampled, totaling 111 species belonging to 40 families. The most representative family was Fabaceae, with 14.29% of the total number of species. Low floristic similarity (5.3% to 34.4%) was observed between the areas, highlighting the importance of the distribution of sample units in continuous fragments. The Shannon diversity index (H') was 4.22 and the Pielou evenness (J) was 0.894. Soil analysis showed some differences in chemical composition between the three studied areas and was an important component in the interpretation of the floristic variation found. The low floristic similarity observed here between nearby areas justifies the requirement of more detailed inventories by Brazilian environmental agencies in the legal authorization procedures prior to the establishment of new development projects. The professionals who conduct rapid inventories, mainly environmental consultants, should also pay more attention to this kind of floristic variation and to the methods used to inventory complex forests.
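For reference, the two indices reported above are straightforward to compute from species abundance counts; the sketch below uses hypothetical abundances, not the survey's data.

import numpy as np

def shannon_pielou(counts):
    # Shannon diversity H' (natural log) and Pielou evenness J = H'/ln(S),
    # where S is the number of species observed.
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    H = -(p * np.log(p)).sum()
    return H, H / np.log(len(p))

H, J = shannon_pielou([30, 12, 8, 5, 5, 2, 1])   # hypothetical species counts
print(f"H' = {H:.2f}, J = {J:.3f}")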
Abstract:
Optical monitoring systems are necessary to manufacture multilayer thin-film optical filters with tight tolerances on the spectral specification. Furthermore, for better accuracy in measuring film thickness, direct monitoring is a must. Direct monitoring implies acquiring spectral data, in real time, from the optical component undergoing the film deposition itself. The high-vacuum evaporation chamber is the most popular equipment for depositing films on the surfaces of optical components. Inside the evaporator, at the top of the chamber, there is a metallic support with several holes in which the optical components are mounted. This metallic support rotates to promote film homogenization. To measure the spectrum of the film being deposited, it is necessary to pass a light beam through a witness glass undergoing the film deposition process and to collect a sample of the light beam with a spectrometer. As both the light beam and the light collector are stationary, a synchronization system is required to identify the moment at which the optical component passes through the light beam.
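The gating logic can be sketched as follows; this is an illustrative simulation under assumed numbers (rotation speed, beam position, angular window), not the system described above.

import numpy as np

RPM = 60.0           # assumed rotation speed of the metallic support
BEAM_ANGLE = 120.0   # assumed angular position of the stationary beam (deg)
HALF_WINDOW = 2.0    # assumed half-width of the in-beam angular window (deg)

def witness_in_beam(t):
    # Angular position of the rotating witness glass at time t (seconds);
    # the support turns 6 degrees per second for each rpm.
    angle = (6.0 * RPM * t) % 360.0
    return abs(angle - BEAM_ANGLE) < HALF_WINDOW

# Gate spectrum acquisition on the synchronization signal.
for t in np.arange(0.0, 2.0, 0.001):       # 1 kHz sampling over two seconds
    if witness_in_beam(t):
        pass  # a real system would trigger the spectrometer here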
Abstract:
In this paper a bond graph methodology is used to model incompressible fluid flows with viscous and thermal effects. The distinctive characteristic of these flows is the role of pressure, which does not behave as a state variable but as a function that must act in such a way that the resulting velocity field has zero divergence. Velocity and entropy per unit volume are used as independent variables for a single-phase, single-component flow. Time-dependent nodal values and interpolation functions are introduced to represent the flow field, from which nodal vectors of velocity and entropy are defined as state variables. The system of momentum and continuity equations coincides with the one obtained by applying the Galerkin method to the weak formulation of the problem in finite elements. The integral incompressibility constraint is derived from the integral conservation of mechanical energy. The weak formulation of the thermal energy equation is modeled with true bond graph elements in terms of nodal vectors of temperature and entropy rates, resulting in a Petrov-Galerkin method. The resulting bond graph shows the coupling between the mechanical and thermal energy domains through the viscous dissipation term. All kinds of boundary conditions are handled consistently and can be represented as generalized effort or flow sources. A procedure for causality assignment is derived for the resulting graph, satisfying the second principle of thermodynamics.
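In outline, and with symbols of our own choosing (the paper's notation may differ), the nodal representation and the integral incompressibility constraint take the standard Galerkin form

\mathbf{v}(\mathbf{x},t) \approx \sum_i \varphi_i(\mathbf{x})\,\mathbf{v}_i(t),
\qquad
\int_\Omega \psi_j\,(\nabla\cdot\mathbf{v})\,d\Omega
  = \sum_i \left(\int_\Omega \psi_j\,\nabla\varphi_i\,d\Omega\right)\cdot\mathbf{v}_i(t) = 0
  \quad \forall j,

where \varphi_i are the interpolation functions, \mathbf{v}_i(t) the nodal state variables, and \psi_j the weighting functions (equal to \varphi_j in a pure Galerkin scheme).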
Abstract:
Nanocomposite membranes containing polysulfone (PSf) and sodium montmorillonite from Wyoming (MMT) were prepared by a combination of solution dispersion and the immersion step of the wet-phase inversion method. The purpose was to study the addition of 0.5 and 3.0 mass% MMT in the preparation of nanocomposite membranes, by means of the morphology and the thermal, mechanical and hydrophilic properties of the nanocomposite membranes, and to compare these properties with those of the pure PSf membrane. Small-angle X-ray diffraction patterns revealed the formation of intercalated clay mineral layers in the PSf matrix, and TEM images also showed an exfoliated structure. A good dispersion of the clay mineral particles was detected in SEM images. Tensile tests showed that both the elongation at break and the tensile strength of the nanocomposites were improved in comparison with pristine PSf. The thermal stability of the nanocomposite membranes, evaluated by the onset and final temperatures of degradation, was also enhanced. The hydrophilicity of the nanocomposite membranes, determined by water contact angle measurements, was higher; therefore, the MMT addition was useful for producing more hydrophilic membranes.
Abstract:
This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
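A minimal sketch of the visualization idea, assuming we record the network's weight vector at each training step and project the trajectory onto its first two principal components; the toy weight updates below merely stand in for back-propagation.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_steps, n_weights = 200, 500
w = rng.standard_normal(n_weights)
trajectory = []
for step in range(n_steps):
    w = w - 0.05 * w + 0.01 * rng.standard_normal(n_weights)  # toy update rule
    trajectory.append(w.copy())

pca = PCA(n_components=2)
path2d = pca.fit_transform(np.array(trajectory))  # (n_steps, 2) trajectory
print("explained variance ratios:", pca.explained_variance_ratio_)
# path2d can now be plotted to inspect the learning trajectory.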
Abstract:
To reconstruct oceanographic variations in the subtropical South Pacific, 271-year-long subseasonal time series of Sr/Ca and δ¹⁸O were generated from a coral growing at Rarotonga (21.5°S, 159.5°W). At this site, coral Sr/Ca appears to be an excellent proxy for sea surface temperature (SST), while coral δ¹⁸O is a function of both SST and seawater δ¹⁸O composition (δ¹⁸O_sw). Here, we focus on extracting the δ¹⁸O_sw signal from these proxy records. A method is presented assuming that coral Sr/Ca is solely a function of SST and that coral δ¹⁸O is a function of both SST and δ¹⁸O_sw. This method separates the effects of δ¹⁸O_sw from SST by breaking the instantaneous changes of coral δ¹⁸O into separate contributions by instantaneous SST and δ¹⁸O_sw changes, respectively. The results show that on average δ¹⁸O_sw at Rarotonga explains ~39% of the variance in δ¹⁸O and that variations in SST explain the remaining ~61%. Reconstructed δ¹⁸O_sw shows systematic increases in summer months (December-February), consistent with the regional pattern of variations in precipitation and evaporation. The reconstructed δ¹⁸O_sw also shows a positive linear correlation with satellite-derived estimates of salinity for the period 1980 to 1997 (r = 0.72). This linear correlation makes it possible to use the reconstructed δ¹⁸O_sw to estimate past interannual and decadal salinity changes in this region. Comparisons of coral δ¹⁸O and δ¹⁸O_sw at Rarotonga with the Pacific decadal oscillation index suggest that the decadal and interdecadal salinity and SST variability at Rarotonga is related to basin-scale decadal variability in the Pacific.
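Under the stated assumptions the separation reduces to a pair of linear relations; the slopes β and γ below are our own illustrative symbols for the Sr/Ca-SST and δ¹⁸O-SST calibrations, not the paper's notation:

\Delta(\mathrm{Sr/Ca}) = \beta\,\Delta\mathrm{SST}
\quad\Rightarrow\quad
\Delta\delta^{18}\mathrm{O}_{sw}
 = \Delta\delta^{18}\mathrm{O}_{coral} - \gamma\,\Delta\mathrm{SST}
 = \Delta\delta^{18}\mathrm{O}_{coral} - \frac{\gamma}{\beta}\,\Delta(\mathrm{Sr/Ca}),

so that subtracting the Sr/Ca-derived SST contribution from each instantaneous change in coral δ¹⁸O leaves the seawater term.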
Abstract:
In this paper we consider the problem of providing standard errors of the component means in normal mixture models fitted to univariate or multivariate data by maximum likelihood via the EM algorithm. Two methods of estimating the standard errors are considered: the standard information-based method and the computationally intensive bootstrap method. They are compared empirically by their application to three real data sets and by a small-scale Monte Carlo experiment.
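A minimal sketch of the bootstrap side of the comparison, with scikit-learn's GaussianMixture standing in for the authors' EM implementation and simulated data in place of the real data sets; the sorting step is a simple (assumed) guard against label switching across bootstrap fits.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(4, 1, 150)])[:, None]

def fitted_means(sample):
    gm = GaussianMixture(n_components=2, random_state=0).fit(sample)
    return np.sort(gm.means_.ravel())   # sort to align component labels

B = 200   # number of bootstrap resamples
boot = np.array([fitted_means(x[rng.integers(0, len(x), len(x))])
                 for _ in range(B)])
print("bootstrap SEs of the component means:", boot.std(axis=0, ddof=1))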
Abstract:
Functional MRI (fMRI) data often have a low signal-to-noise ratio (SNR) and are contaminated by strong interference from other physiological sources. A promising tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). BSS is based on the assumption that the detected signals are a mixture of a number of independent source signals that are linearly combined via an unknown mixing matrix. BSS seeks to determine the mixing matrix in order to recover the source signals based on principles of statistical independence. In most cases, extraction of all sources is unnecessary; instead, a priori information can be applied to extract only the signal of interest. Herein we propose an algorithm based on a variation of ICA, called Dependent Component Analysis (DCA), in which the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We applied this method to fMRI data with the aim of finding the hemodynamic response that follows neuronal activation from auditory stimulation in human subjects. The method localized a significant signal modulation in cortical regions corresponding to the primary auditory cortex. The results obtained by DCA were also compared with those of the General Linear Model (GLM), the most widely used method for analyzing fMRI datasets.
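The selection idea can be sketched as follows: decompose the voxel time series and keep the component whose autocorrelation peaks at the stimulation-related lag. This is an illustrative stand-in (FastICA plus a lag criterion), not the authors' DCA algorithm, and all numbers are assumed.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_time, n_voxels, period = 240, 50, 20            # assumed stimulation period
t = np.arange(n_time)
source = np.sin(2 * np.pi * t / period)           # stimulus-locked signal
X = (np.outer(source, rng.random(n_voxels))
     + 0.5 * rng.standard_normal((n_time, n_voxels)))

comps = FastICA(n_components=5, random_state=0).fit_transform(X)

def autocorr(x, lag):
    x = (x - x.mean()) / x.std()
    return float(np.mean(x[:-lag] * x[lag:]))

# Keep the component most periodic at the stimulation lag.
best = max(range(comps.shape[1]), key=lambda k: autocorr(comps[:, k], period))
print("selected component:", best)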
Abstract:
The power of ideas is tremendous. In many companies, however, there are employees who have good ideas but do not put them into practice; in many others, employees have good ideas and are encouraged to contribute them to innovation in the company. This study attempts to identify factors that contribute to success in managing ideas and the consequent business innovation. The method used was a case study applied to two companies. During the investigation, factors considered essential to the success of an idea management program were identified, among which we highlight: showing evidence of results; involvement of top management; establishment of goals and objectives; recognition; and dissemination of good results. Companies with such systems in place capture the best ideas from their collaborators and apply them internally. This study intends to contribute to business innovation in enterprises through idea creation and management, mainly by collecting the best ideas of their own employees. The results of this study can be used to help improve deployed suggestion systems, as well as to help all managers who wish to implement suggestion systems or idea management systems.
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, there are always endmembers that are incorrectly unmixed. We arrive at this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort ICA and IFA estimates in terms of the likelihood of being correctly unmixed is proposed.
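The dependence the paper points to is easy to see in simulation: fractions that sum to one over each pixel are necessarily negatively correlated, so no method assuming independent sources can be exactly right. A short illustration (the Dirichlet draw is our assumption, used only to sample the simplex):

import numpy as np

rng = np.random.default_rng(0)
abund = rng.dirichlet(np.ones(3), size=10_000)  # 3 endmembers, 10,000 pixels
print(np.corrcoef(abund.T).round(2))            # off-diagonal terms are negative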
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on the geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM)-type algorithm. The performance of the method is illustrated using simulated and real data.
Abstract:
Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
Given a set of mixed spectral (multispectral or hyperspectral) vectors, linear spectral mixture analysis, or linear unmixing, aims at estimating the number of reference substances, also called endmembers, their spectral signatures, and their abundance fractions. This paper presents a new method for unsupervised endmember extraction from hyperspectral data, termed vertex component analysis (VCA). The algorithm exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. In a series of experiments using simulated and real data, the VCA algorithm competes with state-of-the-art methods, with a computational complexity between one and two orders of magnitude lower than the best available method.
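The simplex geometry that VCA exploits can be illustrated with a simplified vertex hunt: repeatedly project the pixels onto a direction orthogonal to the span of the vertices found so far and take the most extreme pixel. This sketch is inspired by the description above but is not the published VCA algorithm (which also involves projection and careful direction choices); all data are simulated.

import numpy as np

rng = np.random.default_rng(0)
E = rng.random((3, 50))                      # 3 endmember signatures, 50 bands
A = rng.dirichlet(np.ones(3), size=2000)     # simplex abundance fractions
X = A @ E                                    # mixed pixels (pixels x bands)

found = []
for _ in range(3):
    d = rng.standard_normal(X.shape[1])      # random direction
    if found:
        basis, _ = np.linalg.qr(np.array(found).T)
        d -= basis @ (basis.T @ d)           # orthogonal to found vertices
    found.append(X[np.argmax(np.abs(X @ d))])  # extreme pixel ~ simplex vertex

# With near-pure pixels present, each vertex matches some true endmember well.
for v in found:
    print(round(max(np.corrcoef(v, e)[0, 1] for e in E), 2))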