936 results for Component interconnection
Abstract:
A combination of modelling and analysis techniques was used to design a six component force balance. The balance was designed specifically for the measurement of impulsive aerodynamic forces and moments characteristic of hypervelocity shock tunnel testing using the stress wave force measurement technique. Aerodynamic modelling was used to estimate the magnitude and distribution of forces and finite element modelling to determine the mechanical response of proposed balance designs. Simulation of balance performance was based on aerodynamic loads and mechanical responses using convolution techniques. Deconvolution was then used to assess balance performance and to guide further design modifications leading to the final balance design. (C) 2001 Elsevier Science Ltd. All rights reserved.
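The convolution/deconvolution step described above can be sketched numerically. In this illustrative example (the impulse response and load history are synthetic stand-ins, not the paper's FEM or aerodynamic data), the strain output is the convolution of the applied load with the balance's impulse response, and the load is recovered by inverting the lower-triangular Toeplitz convolution matrix:

```python
import numpy as np

# Synthetic stand-ins: in the paper, g would come from finite element
# modelling and u from aerodynamic modelling. Values are illustrative.
n = 200
t = np.arange(n)
g = np.exp(-t / 40.0) * np.cos(2 * np.pi * t / 25.0)  # impulse response
u = np.where((t > 20) & (t < 120), 1.0, 0.0)          # step-like load

# Forward simulation: the measured strain signal is the convolution
# of the load history with the impulse response.
y = np.convolve(u, g)[:n]

# Deconvolution: recover the load by inverting the lower-triangular
# Toeplitz convolution matrix G (y = G u).
G = np.array([[g[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])
u_rec = np.linalg.solve(G, y)

print(np.max(np.abs(u_rec - u)))  # small recovery error on clean data
```

On noisy measured signals the plain inverse amplifies noise, which is why practical deconvolution for balance calibration adds regularisation; the sketch above only shows the noiseless mechanics.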
Abstract:
In this paper a methodology for integrated multivariate monitoring and control of biological wastewater treatment plants during extreme events is presented. To monitor the process, on-line dynamic principal component analysis (PCA) is performed on the process data to extract the principal components that represent the underlying mechanisms of the process. Fuzzy c-means (FCM) clustering is then used to classify the operational state. Clustering the PCA scores rather than the raw data reduces the computational burden and increases robustness through noise attenuation. The class-membership information from FCM is used to derive adequate control set points for the local control loops. The methodology is illustrated by a simulation study of a biological wastewater treatment plant on which disturbances of various types are imposed. The results show that the methodology can be used to determine and co-ordinate control actions in order to shift the control objective and improve the effluent quality.
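The PCA-then-FCM pipeline can be sketched as follows. This is an illustrative toy (the two operating regimes, five variables, and dissolved-oxygen set points are invented, not the paper's plant model): project the data onto leading principal components, run fuzzy c-means on the scores, and blend local set points by membership degree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative process data only: two operating regimes (e.g. dry
# weather vs. a storm event) in 5 measured variables.
normal = rng.normal(0.0, 1.0, size=(100, 5))
storm = rng.normal(4.0, 1.0, size=(100, 5))
X = np.vstack([normal, storm])

# PCA: project mean-centred data onto the two leading components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Fuzzy c-means on the scores (fuzzifier m = 2, c = 2 clusters),
# deterministically seeded with one sample from each regime.
def fuzzy_cmeans(Z, m=2.0, iters=100):
    centers = Z[[0, -1]]                     # simple deterministic init
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        d = np.linalg.norm(Z[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** p * (d ** -p).sum(axis=1, keepdims=True))
        W = U ** m
        centers = (W.T @ Z) / W.sum(axis=0)[:, None]
    return U, centers

U, _ = fuzzy_cmeans(scores)

# Class memberships drive the local controllers: blend two hypothetical
# set points (e.g. dissolved-oxygen targets) by membership degree.
setpoints = np.array([2.0, 4.5])
blended = U @ setpoints                      # one set point per sample
```

The soft memberships are what make the co-ordination graceful: during a transition between regimes the blended set point moves smoothly instead of switching abruptly.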
Abstract:
This quantitative pilot study (n = 178), conducted in a large Brisbane teaching hospital in Australia, found autonomy to be the most important job component for registered nurses' job satisfaction. The actual level of satisfaction with autonomy was 4.6, on a scale from 1 (very dissatisfied) to 7 (very satisfied). The mean for job satisfaction was 4.3, with the job components professional status and interaction contributing most substantially to the result. There was discontent with the other two job components, task requirements and organisational policies. Demographic comparisons showed that nurses who were preceptors had significantly less job satisfaction than the other nurses at the hospital. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
We consider a two-component Bose-Einstein condensate in two spatially localized modes of a double-well potential, with periodic modulation of the tunnel coupling between the two modes. We treat the driven quantum field using a two-mode expansion and define the quantum dynamics in terms of the Floquet Operator for the time periodic Hamiltonian of the system. It has been shown that the corresponding semiclassical mean-field dynamics can exhibit regions of regular and chaotic motion. We show here that the quantum dynamics can exhibit dynamical tunneling between regions of regular motion, centered on fixed points (resonances) of the semiclassical dynamics.
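The Floquet construction can be illustrated numerically. In this minimal sketch (the two-mode Bose-Hubbard form and all parameter values are illustrative assumptions, not the paper's model), the time-periodic Hamiltonian is built in the N-particle Fock basis and the Floquet operator is composed from short-step propagators over one driving period; its eigenphases yield the quasi-energies:

```python
import numpy as np

# Two-mode model with periodically modulated tunnel coupling. All
# parameter values are illustrative, not those of the paper.
N = 10                                    # particle number -> dim N + 1
dim = N + 1
n = np.arange(dim)                        # occupation of mode a
# Matrix elements of the tunnelling operator (a†b + b†a) in the basis
# |n, N - n>: <n+1|a†b|n> = sqrt((n + 1)(N - n)).
off = np.sqrt((n[:-1] + 1.0) * (N - n[:-1]))
J = np.diag(off, 1) + np.diag(off, -1)
Jz = np.diag(n - N / 2.0)                 # population imbalance

chi, omega0, eps, wd = 0.5, 1.0, 0.5, 2.0
T = 2 * np.pi / wd                        # driving period

def H(t):
    omega = omega0 * (1.0 + eps * np.cos(wd * t))  # modulated coupling
    return -0.5 * omega * J + chi * Jz @ Jz

# Floquet operator: time-ordered product of short-step propagators
# over one period, F = prod_k exp(-i H(t_k) dt).
steps = 400
dt = T / steps
F = np.eye(dim, dtype=complex)
for k in range(steps):
    w, V = np.linalg.eigh(H((k + 0.5) * dt))
    F = (V * np.exp(-1j * w * dt)) @ V.conj().T @ F

# Quasi-energies from the eigenphases of the (unitary) Floquet operator.
quasi = -np.angle(np.linalg.eigvals(F)) / T
```

Dynamical tunneling would then be probed by evolving a state localised on one regular island under repeated applications of F and watching its overlap with the symmetry-related island.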
Abstract:
This paper reports on the development of specific slicing techniques for functional programs and on their use for identifying possible coherent components in monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
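The core idea of extracting a coherent fragment by slicing can be shown with a deliberately tiny example. This toy backward slicer works over straight-line Python assignments (a stand-in only; the paper's techniques target functional programs) and keeps exactly the statements that a chosen variable depends on:

```python
import ast

# A toy backward slicer over straight-line assignments, illustrating
# the idea only; the paper's techniques target functional programs.
def backward_slice(src, criterion):
    lines = src.strip().splitlines()
    needed, keep = {criterion}, []
    for line in reversed(lines):
        stmt = ast.parse(line).body[0]        # a single ast.Assign
        target = stmt.targets[0].id
        if target in needed:
            keep.append(line)
            needed |= {node.id for node in ast.walk(stmt.value)
                       if isinstance(node, ast.Name)}
    return list(reversed(keep))

prog = """\
a = 1
b = 2
c = a + 1
d = b * b
e = c * 2
"""
print(backward_slice(prog, "e"))   # -> ['a = 1', 'c = a + 1', 'e = c * 2']
```

Each distinct slice criterion yields a candidate component: slicing on "e" and on "d" above partitions the program into two independent fragments.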
Abstract:
Over the last decade, component-based software development has emerged as a promising paradigm for dealing with the ever-increasing complexity of software design, evolution and reuse. SHACC is a prototyping tool for component-based systems in which components are modelled coinductively as generalized Mealy machines. The prototype is built as a Haskell library endowed with a graphical user interface developed in Swing.
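The underlying notion can be sketched in a few lines: a component as a Mealy machine is a current state plus a step function from (state, input) to (output, next state). This is a rough cross-language illustration in Python (SHACC itself models this coinductively in Haskell; the names here are invented):

```python
from dataclasses import dataclass
from typing import Callable, Generic, Tuple, TypeVar

S, I, O = TypeVar("S"), TypeVar("I"), TypeVar("O")

# A component as a Mealy machine: a current state plus a step function
# (state, input) -> (output, next state). A Python sketch of the notion
# the abstract describes; SHACC itself is a Haskell library.
@dataclass
class Mealy(Generic[S, I, O]):
    state: S
    step: Callable[[S, I], Tuple[O, S]]

    def run(self, inputs):
        outputs = []
        for i in inputs:
            out, self.state = self.step(self.state, i)
            outputs.append(out)
        return outputs

# Example component: emits the running sum of its inputs.
counter = Mealy(state=0, step=lambda s, i: (s + i, s + i))
print(counter.run([1, 2, 3]))   # -> [1, 3, 6]
```

Because behaviour lives entirely in the step function, components compose by feeding one machine's outputs into another, which is the kind of interconnection such tools let one prototype and animate.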
Abstract:
The lack of a commonly accepted definition of a software component and the proliferation of competing `standards' and component frameworks are here to stay, raising a fundamental question for component-based development: how to cope with this heterogeneity in practice. This paper reports on the design of a Component Repository aimed at giving at least a partial answer to that question. The repository was fully specified in VDM, and a working prototype is currently being used in an industrial environment.
Abstract:
The power of ideas is known to be tremendous. Yet in many companies there are employees who have good ideas but do not put them into practice, while in others employees are encouraged to contribute their ideas to innovation in the company. This study attempts to identify the factors that contribute to successful idea management and the consequent business innovation. The method used was a case study applied to two companies. During the investigation, factors considered essential to the success of an idea management programme were identified, among which we highlight: evidence of results; involvement of top management; establishment of goals and objectives; recognition; and dissemination of good results. Companies with such systems in place capture the best ideas of their collaborators and apply them internally. This study intends to contribute to business innovation in enterprises through the creation and management of ideas, mainly by collecting the best ideas of the employees themselves. Its results can be used to improve already-deployed suggestion systems and to help managers who wish to implement suggestion or idea management systems.
Abstract:
Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: 1) the observed spectrum vector is a linear mixture of the constituent spectra (endmember spectra) weighted by the corresponding abundance fractions (sources); 2) the sources are statistically independent. Independent factor analysis (IFA) extends ICA to linear mixtures of independent sources immersed in noise. Concerning hyperspectral data, the first assumption is valid whenever the multiple scattering among the distinct constituent substances (endmembers) is negligible and the surface is partitioned according to the fractional abundances. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process. Thus, the sources cannot be statistically independent, which compromises the performance of ICA/IFA algorithms in hyperspectral unmixing. This paper studies the impact of hyperspectral source statistical dependence on ICA and IFA performance. We conclude that the accuracy of these methods tends to improve with increasing signature variability, number of endmembers, and signal-to-noise ratio. In any case, some endmembers are always incorrectly unmixed. We reach this conclusion by minimizing the mutual information of simulated and real hyperspectral mixtures. The computation of mutual information is based on fitting mixtures of Gaussians to the observed data. A method to sort the ICA and IFA estimates in terms of the likelihood of being correctly unmixed is also proposed.
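The source dependence at issue can be seen directly in a synthetic instance of the linear mixing model (all signatures and abundances below are randomly generated, purely for illustration): abundance vectors live on the simplex and sum to one, so their covariances are negative off-diagonal, and they cannot be statistically independent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear mixing model x = M s (synthetic data; purely illustrative).
bands, p, pixels = 50, 3, 1000
M = rng.random((bands, p))                    # endmember signatures
s = rng.dirichlet(np.ones(p), size=pixels).T  # abundance fractions
X = M @ s                                     # observed spectra

# The sum-to-one constraint forces dependence: every column of s sums
# to 1, so the abundance covariances are negative off-diagonal.
assert np.allclose(s.sum(axis=0), 1.0)
C = np.cov(s)
print(np.round(C, 3))
```

Any ICA/IFA algorithm applied to X is therefore fitting an independence assumption the data violate by construction, which is the degradation the paper quantifies.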
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
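The abundance prior DECA assumes can be sketched generatively. In this illustrative example (mixture weights, Dirichlet parameters, and signatures are invented for demonstration, not fitted by GEM as in the paper), abundances drawn from a two-component Dirichlet mixture satisfy non-negativity and sum-to-one by construction, and observed pixels follow the linear model:

```python
import numpy as np

rng = np.random.default_rng(2)

# A mixture of Dirichlet densities over abundance fractions (weights
# and parameters invented for illustration, not fitted as in DECA).
weights = np.array([0.6, 0.4])
alphas = np.array([[9.0, 1.0, 1.0],    # mode where endmember 1 dominates
                   [2.0, 2.0, 6.0]])   # mode where endmember 3 dominates

pixels = 500
comp = rng.choice(len(weights), size=pixels, p=weights)
s = np.array([rng.dirichlet(alphas[k]) for k in comp])

# The constraints imposed by the acquisition process hold by construction.
assert (s >= 0).all() and np.allclose(s.sum(axis=1), 1.0)

# Observed pixels under the linear model, with small additive noise.
M = rng.random((20, 3))                # 20-band endmember signatures
X = s @ M.T + rng.normal(0.0, 0.01, size=(pixels, 20))
```

Inference then runs in the opposite direction: given only X, a GEM-type algorithm estimates M together with the mixture parameters, which is what distinguishes DECA from ICA-based unmixing.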
Abstract:
Chapter in peer-reviewed book proceedings: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.