889 results for spse model (situation, problem, solution, evaluation)
Abstract:
The decision-making process for machine-tool selection and operation allocation in a flexible manufacturing system (FMS) usually involves multiple conflicting objectives, so a fuzzy goal-programming model can be effectively applied to this decision problem. The paper addresses the application of the fuzzy goal-programming concept to modeling the problem of machine-tool selection and operation allocation, with explicit consideration given to the objectives of minimizing the total cost of machining operations, material handling and set-up. Constraints pertaining to machine capacity, tool-magazine capacity and tool life are included in the model. A genetic algorithm (GA)-based approach is adopted to optimize this fuzzy goal-programming model. An illustrative example is provided and results of computational experiments are reported.
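The combination this abstract describes (fuzzy goals optimized by a GA) can be sketched compactly. The snippet below is a minimal illustration, not the paper's formulation: the cost matrices, aspiration levels, tolerance limits and GA settings are all hypothetical placeholders, and the machine-capacity, tool-magazine and tool-life constraints are omitted for brevity.

```python
# A minimal sketch: fuzzy goals with linear membership functions,
# optimized by a simple genetic algorithm. All numbers are hypothetical.
import random

random.seed(0)
N_OPS, N_MACHINES = 6, 3  # hypothetical problem size

# Hypothetical per-(operation, machine) costs: machining, handling, set-up.
machining = [[random.uniform(5, 15) for _ in range(N_MACHINES)] for _ in range(N_OPS)]
handling = [[random.uniform(1, 4) for _ in range(N_MACHINES)] for _ in range(N_OPS)]
setup = [[random.uniform(2, 6) for _ in range(N_MACHINES)] for _ in range(N_OPS)]

GOALS = [  # (cost matrix, aspiration level, maximum tolerable level)
    (machining, 40.0, 70.0),
    (handling, 10.0, 20.0),
    (setup, 15.0, 30.0),
]

def membership(cost, target, limit):
    """Linear fuzzy membership: 1 at/below the target, 0 at/above the limit."""
    if cost <= target:
        return 1.0
    if cost >= limit:
        return 0.0
    return (limit - cost) / (limit - target)

def fitness(chrom):
    """Max-min achievement: the worst-satisfied fuzzy goal drives selection."""
    mus = []
    for matrix, target, limit in GOALS:
        total = sum(matrix[op][m] for op, m in enumerate(chrom))
        mus.append(membership(total, target, limit))
    return min(mus)

def evolve(pop_size=40, generations=200, p_mut=0.1):
    pop = [[random.randrange(N_MACHINES) for _ in range(N_OPS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_OPS)      # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_OPS):                # per-gene mutation
                if random.random() < p_mut:
                    child[i] = random.randrange(N_MACHINES)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("assignment:", best, "achievement:", round(fitness(best), 3))
```

The max-min fitness pushes the GA to improve the worst-satisfied fuzzy goal, which is the usual scalarization in fuzzy goal programming.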
Abstract:
Energy consumption has become a major constraint in providing increased functionality for devices with small form factors. Dynamic voltage and frequency scaling has been identified as an effective approach for reducing the energy consumption of embedded systems. Earlier works on dynamic voltage scaling focused mainly on performing voltage scaling when the CPU is waiting for the memory subsystem, or concentrated chiefly on loop nests and/or subroutine calls having a sufficient number of dynamic instructions. This paper concentrates on coarser program regions and, for the first time, uses program phase behavior for performing dynamic voltage scaling. Program phases are annotated at compile time with mode-switch instructions. Further, we relate the dynamic voltage scaling problem to the Multiple Choice Knapsack Problem, and use well-known heuristics to solve it efficiently. We also develop a simple integer linear programming (ILP) formulation for this problem. Experimental evaluation on a set of media applications reveals that our heuristic method obtains a 38% reduction in energy consumption on average with a performance degradation of 1%, and up to a 45% reduction in energy with a performance degradation of 5%. Further, the energy consumed by the heuristic solution is within 1% of the optimal solution obtained from the ILP approach.
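The reduction this abstract describes, from per-phase voltage selection to a Multiple Choice Knapsack Problem, admits a simple greedy heuristic: start every phase at its fastest setting and repeatedly take the down-scaling step with the best energy-saved-per-extra-time ratio while a performance budget holds. The sketch below is a toy version under that framing; the phase tables and the 20% slowdown budget are hypothetical, not the paper's data or its exact heuristic.

```python
# A minimal sketch of per-phase frequency selection as a Multiple-Choice
# Knapsack Problem, solved greedily. One option must be chosen per phase;
# total time must stay within a budget. All numbers are hypothetical.

phases = [
    # each phase: list of (energy, time) options, fastest option first
    [(120, 10), (90, 12), (70, 15)],
    [(200, 20), (150, 24), (110, 30)],
    [(80, 8), (60, 10), (45, 13)],
]

def greedy_mckp(phases, slowdown=1.2):
    # Start from the fastest (highest-energy) option in every phase.
    choice = [0] * len(phases)
    base_time = sum(p[0][1] for p in phases)
    budget = base_time * slowdown  # hypothetical 20% degradation budget
    time_used = base_time
    while True:
        best = None  # (energy saved per extra time unit, phase index)
        for i, opts in enumerate(phases):
            j = choice[i]
            if j + 1 < len(opts):
                de = opts[j][0] - opts[j + 1][0]   # energy saved by slowing
                dt = opts[j + 1][1] - opts[j][1]   # extra time incurred
                if dt > 0 and time_used + dt <= budget:
                    ratio = de / dt
                    if best is None or ratio > best[0]:
                        best = (ratio, i)
        if best is None:
            break
        i = best[1]
        time_used += phases[i][choice[i] + 1][1] - phases[i][choice[i]][1]
        choice[i] += 1
    return choice, time_used

choice, t = greedy_mckp(phases)
energy = sum(phases[i][j][0] for i, j in enumerate(choice))
print("options:", choice, "time:", t, "energy:", energy)
```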
Abstract:
Search of design spaces to generate solutions affects design outcomes during conceptual design. This research aims to understand the different types of search that occur during conceptual design and their effect on the design outcomes. Additionally, we study the effect of other factors, such as the creativity, problem-solving style, and experience of designers, on the design outcomes. Two sets of design experiments, with experienced and novice designers, are used in this study. We find that designers employ twelve different types of search during conceptual design, across problem understanding, solution generation, and solution evaluation activities. Results also suggest that creativity is influenced positively by the type and amount of search, the duration of designing, and the experience of designers.
Abstract:
Designing for all requires the adaptation and modification of current design best practices to encompass a broader range of user capabilities. This is particularly the case in the design of the human-product interface. Product interfaces exist everywhere, and when designing them there is a very strong temptation to jump to prescribing a solution with only a cursory attempt to understand the nature of the problem. This is particularly the case when attempting to adapt existing designs, optimised for able-bodied users, for use by disabled users. However, such approaches have led to numerous products that are neither usable nor commercially successful. In order to develop a successful design approach it is necessary to consider the fundamental structure of the design process being applied. A three-stage design process development strategy, comprising problem definition, solution development and solution evaluation, should be adopted. This paper describes the development of a new design approach based on the application of usability heuristics to the design of interfaces. This is illustrated by reference to a case study of the re-design of a computer interface for controlling an assistive device.
Abstract:
Agent-oriented modeling has emerged as a paradigm in software development, given the number of initiatives and studies that rely on software agents as a solution for tackling more complex problems. Despite the popularity of agents, specialists run up against the lack of a universal methodology for building Multiagent Systems (MAS), since existing methodologies err through either an excess or a shortage of solutions for modeling the problem. This dissertation proposes the use of an Ontology of Multiagent Methodologies, following the principles of Situational Method Engineering, which advocates assembling methodologies from method fragments according to the specifics of the project under development. The aim of the study is to consolidate knowledge in the area of multiagent methodologies, helping the software engineer choose the methodology, or the methodology fragment, best able to model a Multiagent System.
Abstract:
Population evolution models have long been a subject of great relevance, especially when the population under study consists of disease vectors. This importance stems from the fact that thousands of diseases are propagated by specific species, and knowing how such populations behave is vital when designing public policies to control their proliferation. This work describes a diffusive population evolution problem with local traps and delayed reproduction time. The direct problem describes the density of a population once the model parameters are known, and its solution is obtained by means of the generalized integral transform technique, a numerical-analytical method. The solution of the direct problem by itself, however, does not allow the computational simulation of a population in a practical application, since the model parameters vary from population to population and must therefore have their values determined. To make this characterization possible, the present work proposes the formulation and solution of the inverse problem, estimating the model parameters from population data using two Bayesian methods.
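As a rough illustration of the Bayesian inverse-problem step, the sketch below estimates the parameters of a forward model from noisy observations with a Metropolis-Hastings sampler. The forward model is a simple logistic-growth stand-in rather than the dissertation's delayed diffusive model, and the priors, noise level and "observed" data are all synthetic.

```python
# A minimal sketch of Bayesian parameter estimation for an inverse
# problem via Metropolis-Hastings, on a placeholder logistic model.
import numpy as np

rng = np.random.default_rng(0)

def forward(r, K, t):
    """Logistic growth N(t) with N(0)=1: a placeholder forward model."""
    return K / (1 + (K - 1) * np.exp(-r * t))

t = np.linspace(0, 10, 25)
true_r, true_K = 0.8, 50.0
data = forward(true_r, true_K, t) + rng.normal(0, 1.0, t.size)  # synthetic

def log_post(theta, sigma=1.0):
    r, K = theta
    if not (0 < r < 5 and 1 < K < 200):        # uniform prior bounds
        return -np.inf
    resid = data - forward(r, K, t)
    return -0.5 * np.sum(resid**2) / sigma**2  # Gaussian likelihood

def metropolis(n=20000, step=(0.05, 2.0)):
    theta = np.array([1.0, 30.0])              # initial guess
    lp = log_post(theta)
    chain = []
    for _ in range(n):
        prop = theta + rng.normal(0, step)     # random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

chain = metropolis()
burn = chain[5000:]                            # discard burn-in
print("posterior mean r, K:", burn.mean(axis=0))
```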
Abstract:
We present a multispectral photometric stereo method for capturing the geometry of deforming surfaces. A novel photometric calibration technique allows calibration of scenes containing multiple piecewise-constant chromaticities. The method estimates per-pixel photometric properties, then uses a RANSAC-based approach to estimate the dominant chromaticities in the scene. A likelihood term is developed linking surface normal, image intensity and photometric properties, which allows the task of estimating the number of chromaticities present in a scene to be framed as a model estimation problem. The Bayesian Information Criterion is applied to automatically estimate the number of chromaticities present during calibration. A two-camera stereo system provides low-resolution geometry, allowing the likelihood term to be used in segmenting new images into regions of constant chromaticity. This segmentation is carried out in a Markov Random Field framework and allows the correct photometric properties to be used at each pixel to estimate a dense normal map. Results are shown on several challenging real-world sequences, demonstrating state-of-the-art results using only two cameras and three light sources. Quantitative evaluation is provided against synthetic ground-truth data. © 2011 IEEE.
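The model-selection step can be illustrated in miniature: fit a clustering model for each candidate number of chromaticities and keep the count with the lowest Bayesian Information Criterion. The sketch below uses spherical-Gaussian k-means on synthetic 2-D chromaticity samples; the paper's likelihood term and RANSAC-based estimation are richer than this.

```python
# A minimal sketch of estimating the number of chromaticities with BIC:
# fit k-means for each candidate k and keep the k with the lowest score.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 2-D chromaticity samples drawn around three cluster centers.
centers = np.array([[0.2, 0.3], [0.6, 0.2], [0.4, 0.6]])
X = np.vstack([c + rng.normal(0, 0.03, (200, 2)) for c in centers])

def kmeans(X, k, iters=50):
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([X[labels == j].mean(0) if np.any(labels == j) else C[j]
                      for j in range(k)])
    return C, labels

def bic(X, k):
    C, labels = kmeans(X, k)
    n, d = X.shape
    sse = ((X - C[labels]) ** 2).sum()
    sigma2 = sse / (n * d)                      # shared spherical variance
    loglik = -0.5 * n * d * (np.log(2 * np.pi * sigma2) + 1)
    n_params = k * d + 1                        # cluster centers + variance
    return -2 * loglik + n_params * np.log(n)   # lower is better

scores = {k: bic(X, k) for k in range(1, 6)}
print("BIC per k:", {k: round(v, 1) for k, v in scores.items()})
print("estimated number of chromaticities:", min(scores, key=scores.get))
```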
Abstract:
BACKGROUND: The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. METHODS: Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method, in conjunction with diagrammatic elicitation, was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. RESULTS: Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; and service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. CONCLUSIONS: This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user-needs analysis for exploration; divergent thinking and an innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers, through partnership working with design experts and researchers, could be beneficial.
Abstract:
Macromolecular conjugates of two kinds of natural polysaccharides, one from Panax quinquefolium Linn. (PQPS) and one from Ganoderma applanatum Pat. (GAPS), with gadolinium-diethylenetriaminepentaacetic acid (Gd-DTPA) have been synthesized and characterized by means of FTIR, elemental analysis and ICP-AES. Their stability was investigated by competition studies with Ca2+, EDTA (ethylenediaminetetraacetic acid) and DTPA. The polysaccharide-bound complexes exhibit T1 relaxivities 1.5-1.7 times that of Gd-DTPA in D2O at 25 °C and 9.4 T. MR imaging of Sprague-Dawley (SD) rats showed remarkable enhancement in rat liver and kidney after i.v. injection of the two complexes: liver parenchyma 60.9 ± 5.6% and 57.8 ± 7.4% at 65-85 min; kidney 144.9 ± 14.5% and 199.9 ± 25.4% at 10-30 min, for PQPS-Gd-DTPA and GAPS-Gd-DTPA at gadolinium doses of 0.083 and 0.082 mmol/kg, respectively. Our preliminary in vivo and in vitro study indicates that the two kinds of polysaccharide-bound complexes are potential tissue-specific contrast agents for MRI.
Abstract:
The work reported here lies in the area of overlap between artificial intelligence and software engineering. As research in artificial intelligence, it is a step towards a model of problem solving in the domain of programming. In particular, this work focuses on the routine aspects of programming which involve the application of previous experience with similar programs. I call this programming by inspection. Programming is viewed here as a kind of engineering activity. Analysis and synthesis by inspection are a prominent part of expert problem solving in many other engineering disciplines, such as electrical and mechanical engineering. The notion of inspection methods in programming developed in this work is motivated by similar notions in other areas of engineering. This work is also motivated by current practical concerns in the area of software engineering. The inadequacy of current programming technology is universally recognized. Part of the solution to this problem will be to increase the level of automation in programming. I believe that the next major step in the evolution of more automated programming will be interactive systems which provide a mixture of partially automated program analysis, synthesis and verification. One such system being developed at MIT, called the programmer's apprentice, is the immediate intended application of this work. This report concentrates on the knowledge base of the programmer's apprentice, which takes the form of a taxonomy of commonly used algorithms and data structures. To the extent that a programmer is able to construct and manipulate programs in terms of the forms in such a taxonomy, he may relieve himself of many details and generally raise the conceptual level of his interaction with the system, as compared with present-day programming environments. Also, since it is practical to expend a great deal of effort pre-analyzing the entries in a library, the difficulty of verifying the correctness of programs constructed this way is correspondingly reduced. The feasibility of this approach is demonstrated by the design of an initial library of common techniques for manipulating symbolic data. This document also reports on the further development of a formalism called the plan calculus for specifying computations in a programming-language-independent manner. This formalism combines both data and control abstraction in a uniform framework that has facilities for representing multiple points of view and side effects.
Abstract:
A review of polymer cure models used in microelectronics packaging applications reveals no clear consensus on the chemical rate constants for the cure reactions, or even on an effective model. The problem lies in the contrast between the actual cure process, which involves a sequence of distinct chemical reactions, and the models, which typically assume only one (or two, with some restrictions on the independence of their characteristic constants). The standard techniques for determining the model parameters are based on differential scanning calorimetry (DSC), which cannot distinguish between the reactions and hence yields results useful only under the same conditions, which defeats the point of modeling. The obvious solution is for manufacturers to provide the modeling parameters; failing that, an alternative experimental technique is required to determine the individual reaction parameters, e.g. Fourier transform infrared spectroscopy (FTIR).
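The core difficulty this abstract points at, that calorimetry observes only the sum of several distinct reactions, is easy to reproduce numerically. The sketch below sums the conversions of two hypothetical first-order Arrhenius reactions; a single-reaction model fitted to this total at one temperature has no reason to extrapolate to another, since the two activation energies shift the mixture with temperature. All rate constants are invented for illustration.

```python
# A minimal sketch of the two-reaction cure signal that DSC sees only
# as a sum. Arrhenius parameters below are hypothetical placeholders.
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def two_reaction_cure(T, t):
    """Total conversion from two independent first-order cure reactions."""
    k1 = 1e5 * np.exp(-60e3 / (R * T))  # reaction 1: A=1e5/s, Ea=60 kJ/mol
    k2 = 1e7 * np.exp(-80e3 / (R * T))  # reaction 2: A=1e7/s, Ea=80 kJ/mol
    a1 = 1 - np.exp(-k1 * t)            # conversion of reaction 1
    a2 = 1 - np.exp(-k2 * t)            # conversion of reaction 2
    return 0.5 * a1 + 0.5 * a2          # calorimetry observes only this sum

t = np.linspace(0, 3600, 7)             # one hour of isothermal cure
for T in (400.0, 440.0):
    print(f"T = {T:.0f} K:", np.round(two_reaction_cure(T, t), 3))
```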
Abstract:
This paper presents the maximum weighted stream posterior (MWSP) model as a robust and efficient stream integration method for audio-visual speech recognition in environments where the audio or video streams may be subjected to unknown and time-varying corruption. A significant advantage of MWSP is that it does not require any specific measurements of the signal in either stream to calculate appropriate stream weights during recognition, and as such it is modality-independent. This also means that MWSP complements, and can be used alongside, many of the other approaches that have been proposed in the literature for this problem. For evaluation we used the large XM2VTS database for speaker-independent audio-visual speech recognition. The extensive tests include both clean and corrupted utterances, with corruption added to the video and/or audio streams using a variety of types (e.g., MPEG-4 video compression) and levels of noise. The experiments show that this approach gives excellent performance in comparison to another well-known dynamic stream weighting approach, and also in comparison to any fixed-weight integration approach, both in clean conditions and when noise is added to either stream. Furthermore, our experiments show that the MWSP approach dynamically selects suitable integration weights on a frame-by-frame basis according to the level of noise in the streams, and also according to the naturally fluctuating relative reliability of the modalities even in clean conditions. The MWSP approach is shown to maintain robust recognition performance in all tested conditions, while requiring no prior knowledge about the type or level of noise.
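In the spirit of a maximum weighted stream posterior rule, a frame-level weight search can be sketched as follows: for each frame, pick the audio-stream exponent (from a grid) whose best combined class posterior is largest, then classify with that weight. The per-class stream log-likelihoods below are synthetic placeholders, and this toy omits the HMM decoding of the actual system.

```python
# A minimal sketch of frame-level dynamic stream weighting: per frame,
# choose the weight that maximizes the best combined class posterior.
import numpy as np

rng = np.random.default_rng(2)
n_frames, n_classes = 5, 4
# Synthetic per-frame, per-class log-likelihoods for each stream.
ll_audio = rng.normal(-10, 2, (n_frames, n_classes))
ll_video = rng.normal(-10, 2, (n_frames, n_classes))

def posteriors(ll_a, ll_v, w):
    """Combined posteriors with exponent weights w (audio), 1-w (video)."""
    log_comb = w * ll_a + (1 - w) * ll_v
    log_comb -= log_comb.max()                 # stabilize the softmax
    p = np.exp(log_comb)
    return p / p.sum()

weights = np.linspace(0.0, 1.0, 11)            # candidate audio weights
for f in range(n_frames):
    # Per frame: the weight whose best class posterior is largest wins.
    best_w = max(weights,
                 key=lambda w: posteriors(ll_audio[f], ll_video[f], w).max())
    label = np.argmax(posteriors(ll_audio[f], ll_video[f], best_w))
    print(f"frame {f}: audio weight {best_w:.1f}, class {label}")
```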
Abstract:
The larval form of the Greater Wax Moth (Galleria mellonella) was evaluated as a model system for the study of the acute in vivo toxicity of 1-alkyl-3-methylimidazolium chloride ionic liquids. 24-h median lethal dose (LD50) values for nine of these ionic liquids bearing alkyl chain substituents ranging from 2 to 18 carbon atoms were determined. The in vivo toxicity of the ionic liquids was found to correlate directly with the length of the alkyl chain substituent, and the pattern of toxicity observed was in accordance with previous studies of ionic liquid toxicity in other living systems, including a characteristic toxicity ‘cut-off’ effect. However, G. mellonella appeared to be more susceptible to the toxic effects of the ionic liquids tested, possibly as a result of their high body fat content. The results obtained in this study indicate that G. mellonella represents a sensitive, reliable and robust in vivo model organism for the evaluation of ionic liquid toxicity.
Abstract:
Real estate appraisal recently contributed to the collapse of financial institutions and to the subprime crisis. This research aims to identify the decisive factors in real estate appraisal. The work addresses the problem of information asymmetry, the different appraisal methods and the importance of externalities. Empirically, several cases are analysed using linear regression, cluster analysis and the principal component analysis of factor analysis. The first case addresses the appraisal of externalities, where the results indicate the following main positive externalities: marina views are valued more highly than sea views, frontal views more highly than lateral views, and floor-level premia differ according to the type of dwelling (permanent residence or holiday home). The second study analyses how the income method helps explain the Portuguese market; three clusters of rents and three clusters of yields were obtained for each of the samples. The results show that (a) the yield clusters and the rent clusters are formed by different elements, and (b) the asking price is explained by the income method, by the yield cluster and by population density. In the third study, 427 individuals searching for an apartment as a residence were surveyed. The principal component analysis of factor analysis yielded seven decisive factors in apartment search: negative externalities, positive externalities, the presence of businesses on the ground floor of the apartment building, rational proximity interests, secondary variables in the use of the building, income variables and personal-interest variables. The main conclusion is that, because this is a transdisciplinary area, it is difficult to arrive at a single model that incorporates both the appraisal methods and the different dynamics of demand. The appraiser should analyse and produce a scoring, balancing the science of appraisal with the art of judgement.
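The first empirical step this abstract describes, valuing view-type and floor-level externalities by linear regression, amounts to a hedonic price model. The sketch below fits one by ordinary least squares on synthetic data; the dummy variables, premia and noise level are invented for illustration.

```python
# A minimal sketch of a hedonic regression: price on view type and floor,
# fitted by ordinary least squares. All data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 300
marina_view = rng.integers(0, 2, n)   # 1 = marina view, 0 = sea view
frontal = rng.integers(0, 2, n)       # 1 = frontal view, 0 = lateral view
floor = rng.integers(0, 10, n)        # floor level
# Synthetic prices: marina and frontal views carry premia, plus noise.
price = 200 + 30 * marina_view + 15 * frontal + 2 * floor + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), marina_view, frontal, floor])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print("intercept, marina premium, frontal premium, per-floor effect:")
print(np.round(beta, 2))
```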