Abstract:
This dissertation focuses on theoretical and experimental studies of the optical properties of materials and multilayer structures composing liquid crystal displays (LCDs) and electrochromic (EC) devices. By applying spectroscopic ellipsometry, we have determined the optical constants of thin films of electrochromic tungsten oxide (WOx) and nickel oxide (NiOy), as well as the films' thickness and roughness. These films, obtained under sputtering conditions, possess the high transmittance that is important for achieving good visibility and high contrast in an EC device. Another application of general spectroscopic ellipsometry relates to the study of a photo-alignment layer made of a mixture of the azo-dyes SD-1 and SDA-2. We have found the optical constants of this mixture before and after illuminating it with polarized UV light. The results obtained confirm the diffusion model explaining the formation of photo-induced order in azo-dye films. We have developed new techniques for fast characterization of twisted nematic LC cells in transmissive and reflective modes. Our techniques are based on the characteristic functions that we have introduced for determining the parameters of non-uniform birefringent media. These characteristic functions are found by simple procedures and can be utilised for the simultaneous determination of retardation, its wavelength dispersion, and twist angle, as well as for solving associated optimization problems. The cholesteric LCD possesses some unique properties, such as bistability and good selective scattering; however, it has a disadvantage: a relatively high driving voltage (tens of volts). The approach we propose to reduce the driving voltage consists of applying a stack of thin (~1 µm) LC layers. We have also studied the ability of a layer of surface-stabilized ferroelectric liquid crystal coupled with several retardation plates to generate birefringent color.
We have demonstrated that one or two retardation plates are sufficient to accomplish good color characteristics and high brightness of the display.
Abstract:
This paper elaborates the routing of a cable cycle through the available routes in a building in order to link a set of devices in the most reasonable way. Despite the similarities to other NP-hard routing problems, the goal is not only to minimize the cost (the length of the cycle) but also to increase the reliability of the path (in case of a cable cut), which is assessed by a risk factor. Since there is often a trade-off between the risk and length factors, a criterion for ranking candidates and deciding on the most reasonable solution is defined. A set of techniques is proposed to perform an efficient and exact search among candidates. A novel graph is introduced to reduce the search space and navigate the search toward feasible and desirable solutions. Moreover, an admissible heuristic length estimate enables early detection of partial cycles that lead to unreasonable solutions. The results show that the method provides solutions that are both technically and financially reasonable. Furthermore, the proposed techniques prove very efficient in reducing the computational time of the search to a reasonable amount.
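The trade-off between cycle length and risk described above can be illustrated with a small ranking sketch. The normalized weighted-sum criterion and the weight `alpha` below are illustrative assumptions, not the paper's exact ranking rule:

```python
# Hypothetical sketch: rank candidate cable cycles, each given as a
# (length, risk) pair, by a blended criterion. Lower score is better.
# The normalization and the weight `alpha` are assumptions for
# illustration only.

def rank_candidates(candidates, alpha=0.5):
    """Sort (length, risk) candidates; lower blended score ranks first."""
    max_len = max(c[0] for c in candidates)
    max_risk = max(c[1] for c in candidates)

    def score(c):
        length, risk = c
        # normalize each factor to [0, 1] so they are comparable
        return alpha * length / max_len + (1 - alpha) * risk / max_risk

    return sorted(candidates, key=score)


cycles = [(120.0, 0.8), (150.0, 0.2), (135.0, 0.4)]
best = rank_candidates(cycles)[0]
```

With equal weights, the longest but least risky cycle wins here, showing how a pure length-minimization objective would pick a different (riskier) solution.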
A New Representation And Crossover Operator For Search-based Optimization Of Software Modularization
Abstract:
Applying optimization algorithms to the PDEs that model groundwater remediation can greatly reduce remediation cost. However, groundwater remediation analysis requires computationally expensive simulations; therefore, effective parallel optimization could greatly reduce the computational expense. The optimization algorithm used in this research is the Parallel Stochastic Radial Basis Function (RBF) method. It is designed for the global optimization of computationally expensive functions with multiple local optima and does not require derivatives. In each iteration of the algorithm, an RBF surface is updated based on all the evaluated points in order to approximate the expensive function. The new RBF surface is then used to generate the next set of points, which are distributed to multiple processors for evaluation. The criteria for selecting the next function evaluation points are the estimated function value and the distance from all known points. Algorithms created for serial computing are not necessarily efficient in parallel, so the Parallel Stochastic RBF is a different algorithm from its serial ancestor. The algorithm is applied to two Groundwater Superfund Remediation sites: the Umatilla Chemical Depot and the Former Blaine Naval Ammunition Depot. In this study, the adopted formulation treats pumping rates as decision variables in order to remove the plume of contaminated groundwater. Groundwater flow and contaminant transport are simulated with MODFLOW-MT3DMS. For both problems, computation takes a large amount of CPU time, especially for the Blaine problem, which requires nearly fifty minutes of simulation for a single set of decision variables. Thus, an efficient algorithm and powerful computing resources are essential in both cases. The results are discussed in terms of parallel computing metrics, i.e., speedup and efficiency. We find that, with the use of up to 24 parallel processors, the results of the Parallel Stochastic RBF algorithm are excellent, with speedup efficiencies close to or exceeding 100%.
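The candidate-selection step described above, which balances the surrogate's estimated function value against the distance to already-evaluated points, can be sketched as follows. The weighting, normalization, and surrogate used here are simplified assumptions for illustration, not the exact rule of the Parallel Stochastic RBF algorithm:

```python
import math

# Sketch of surrogate-based candidate selection: score each candidate
# point by a weighted blend of (a) its predicted function value and
# (b) its distance to the nearest already-evaluated point, then pick
# the best-scoring candidates for the next batch of parallel
# evaluations. Weight `w` and min-max normalization are assumptions.

def select_next_points(surrogate, evaluated, candidates, n_pick, w=0.5):
    """Return n_pick candidates balancing exploitation and exploration."""
    pred = [surrogate(c) for c in candidates]
    dists = [min(math.dist(c, e) for e in evaluated) for c in candidates]
    lo_p, hi_p = min(pred), max(pred)
    lo_d, hi_d = min(dists), max(dists)

    def score(i):
        # low predicted value is good (exploitation)
        v = (pred[i] - lo_p) / ((hi_p - lo_p) or 1.0)
        # large distance to known points is good (exploration)
        d = 1.0 - (dists[i] - lo_d) / ((hi_d - lo_d) or 1.0)
        return w * v + (1 - w) * d

    order = sorted(range(len(candidates)), key=score)
    return [candidates[i] for i in order[:n_pick]]


evaluated = [(0.0,), (1.0,)]
candidates = [(0.5,), (0.1,), (2.0,)]
# toy surrogate with minimum at x = 1.5 (an assumption for the demo)
picked = select_next_points(lambda c: (c[0] - 1.5) ** 2, evaluated, candidates, 1)
```

In the toy run, the point far from all evaluated points and near the surrogate minimum is chosen, which is the behavior the selection criterion is meant to encourage.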
Abstract:
Data available on continuous-time diffusions are always sampled discretely in time. In most cases, the likelihood function of the observations is not directly computable. This survey covers a sample of the statistical methods that have been developed to solve this problem. We concentrate on some recent contributions to the literature based on three different approaches to the problem: an improvement of the Euler-Maruyama discretization scheme, the use of Martingale Estimating Functions, and the application of the Generalized Method of Moments (GMM).
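The Euler-Maruyama scheme mentioned above discretizes a diffusion dX_t = mu(X_t) dt + sigma(X_t) dW_t into small time steps with Gaussian increments. A minimal sketch follows; the Ornstein-Uhlenbeck drift and constant volatility in the demo are illustrative choices, not the survey's specific model:

```python
import math
import random

# Minimal Euler-Maruyama discretization of dX = mu(X) dt + sigma(X) dW:
# at each step, advance by the drift times dt plus the diffusion times
# a Gaussian increment with variance dt.

def euler_maruyama(mu, sigma, x0, t_end, n_steps, rng):
    """Simulate one sample path on [0, t_end] with n_steps Euler steps."""
    dt = t_end / n_steps
    x = x0
    path = [x0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x = x + mu(x) * dt + sigma(x) * dw
        path.append(x)
    return path


rng = random.Random(0)
# demo: Ornstein-Uhlenbeck-type diffusion, mean-reverting toward zero
path = euler_maruyama(lambda x: -0.5 * x, lambda x: 0.2, 5.0, 10.0, 1000, rng)
```

Simulated paths like this one underpin likelihood approximations for discretely sampled diffusions: the finer the step, the closer the Gaussian transition approximation is to the true transition density.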
Abstract:
For a long time, quality and management programs have dominated the landscape of solutions for organizational improvement. In this context, fads have been constant and, at the same time, so has the consumption of many solutions. The market seems to feed on these packages. From package to package, corporate management has been developing and specializing, and along with it, the methods and techniques of good management. Risk management, specifically, arises in this context. It presents itself as a management solution, but it seems to establish itself as a point of convergence and influence in the optimization and assurance of production, management, and support processes, encompassing a range of possibilities across several disciplines of the business world, from finance to fraud and everyday disturbances. Following risk management comes crisis management: the hemisphere of risks transformed into real impacts with visible damage to the organization. Crisis management presupposes the management of risks that have materialized and that clearly affect the organization and its context, both internal and external. At the end of this logic appears trust, as that which seals the pact between the organization and its stakeholders, as a result of good or bad risk and crisis management. This study is, then, about risks, crises, and trust, and about understanding how management overlays these elements and how this applies to organizations, specifically to a large organization in the Brazilian market. After a literature review, a survey is carried out to analyze how the concepts and practices of risk and crisis management are being applied in a large company, interviewing the company's chief executive responsible for risk management in the information security area and another executive responsible for crisis management.
A survey is also conducted to understand the perception of employees and managers of the studied company regarding the concepts and practices of risk and crisis management and their application in the company. Trust is addressed as well, extrapolating this concept to an idea of the reliability of these practices and proposing a way to measure this reliability, identified as a gap in crisis management. Finally, an analysis is made of the extent to which the application of these concepts and practices is systematic, according to the defined hypotheses and assumptions, revealing the operational character and recent adoption of the practices, somewhat detached from the reference model proposed for the study and with little visibility among employees, especially regarding the perceived effectiveness of the adopted practices.
Abstract:
This paper uses an output-oriented Data Envelopment Analysis (DEA) measure of technical efficiency to assess the technical efficiencies of the Brazilian banking system. Four approaches to estimation are compared in order to assess the significance of factors affecting inefficiency: nonparametric Analysis of Covariance, maximum likelihood using a family of exponential distributions, maximum likelihood using a family of truncated normal distributions, and the normal Tobit model. The sole focus of the paper is a combined measure of output, and the data analyzed refer to the year 2001. The factors of interest in the analysis and likely to affect efficiency are bank nature (multiple and commercial), bank type (credit, business, bursary, and retail), bank size (large, medium, small, and micro), bank control (private and public), bank origin (domestic and foreign), and non-performing loans. The latter is a measure of bank risk. All quantitative variables, including non-performing loans, are measured on a per-employee basis. The best fits to the data are provided by the exponential family and the nonparametric Analysis of Covariance. The significance of a factor, however, varies according to the model fit, although there is some agreement among the best models. A highly significant association in all fitted models is observed only for non-performing loans. The nonparametric Analysis of Covariance is more consistent with the inefficiency median responses observed for the qualitative factors. The findings of the analysis reinforce the significant association of the level of bank inefficiency, measured by DEA residuals, with the risk of bank failure.
Abstract:
FERNANDES, Fabiano A. N. et al. Optimization of osmotic dehydration of papaya followed by air-drying. Food Research International, v. 39, p. 492-498, 2006.
Abstract:
The objective of this work was to evaluate the effect of thermal and mechanical shock on the productivity of Lentinula edodes on 140 logs of Eucalyptus saligna fully colonized by the fungus, under different water-immersion times, during the first production flush. The logs were immersed in chilled water (16°C) or water at ambient temperature (22°C); the immersion periods were 6, 10, 14, 18, 22, 26, and 38 hours; the mechanical shock consisted of three consecutive drops of the log, in a vertical position, onto the ground. Water temperature and immersion time affected the production of L. edodes, with significant increases (2- to 4-fold) in the treatments in which the logs were immersed in chilled water and in the shorter immersion times (6 and 10 hours). The mechanical shock did not increase basidiome production.