932 results for stochastic optimization, physics simulation, packing, geometry
Abstract:
SRAM-based FPGAs are sensitive to radiation effects. Soft errors can appear and accumulate, potentially defeating mitigation strategies deployed at the application layer. Configuration memory scrubbing is therefore required to improve the radiation tolerance of such FPGAs in space applications. Virtex FPGAs allow runtime scrubbing by means of dynamic partial reconfiguration. Even with scrubbing, intra-FPGA TMR systems are subject to common-mode errors affecting more than one design domain. Inter-FPGA TMR systems solve this at the expense of higher cost, power, and mass. In this context, a self-reference scrubber for device-level TMR systems based on Xilinx Virtex FPGAs is presented. This scrubber allows fast SEU/MBU detection and correction by peer frame comparison, without needing access to a golden configuration memory.
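The core idea of a self-reference scrubber can be sketched in a few lines: in a device-level TMR system the same configuration frame exists in three FPGAs, so an upset in one copy can be detected and repaired by bitwise 2-of-3 majority voting across peers, with no golden memory. The frame format and function names below are illustrative, not the paper's implementation.

```python
def majority_vote(frame_a: bytes, frame_b: bytes, frame_c: bytes) -> bytes:
    """Return the bitwise 2-of-3 majority of three configuration frames."""
    return bytes(
        (a & b) | (a & c) | (b & c)
        for a, b, c in zip(frame_a, frame_b, frame_c)
    )

def scrub(frames):
    """Detect upsets by peer comparison and return corrected frames."""
    golden = majority_vote(*frames)          # reconstructed reference
    upsets = [f != golden for f in frames]   # which copies disagree
    return [golden] * 3, upsets

# toy 4-byte "frames": two healthy copies and one with a single bit flip
frames = [bytes([0b1010_1010] * 4)] * 2 + [bytes([0b1010_1011] * 4)]
corrected, upsets = scrub(frames)
```

A multi-bit upset (MBU) in one device is handled the same way, since voting is per bit; only simultaneous upsets of the same bit in two devices would defeat it.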
Abstract:
Natural regeneration-based silviculture has been increasingly regarded as a reliable option in sustainable forest management. However, successful natural regeneration is not always easy to achieve. Recently, new concerns have arisen because of changing future climate. To date, regeneration models have proved helpful in decision-making concerning natural regeneration. The implementation of such models into optimization routines is a promising approach in providing forest managers with accurate tools for forest planning. In the present study, we present a stochastic multistage regeneration model for Pinus pinea L. managed woodlands in Central Spain, where regeneration has been historically unsuccessful. The model is able to quantify recruitment under different silviculture alternatives and varying climatic scenarios, with further application to optimize management scheduling. The regeneration process in the species showed high between-year variation, with all subprocesses (seed production, dispersal, germination, predation, and seedling survival) having the potential to become bottlenecks. However, model simulations demonstrate that current intensive management is responsible for regeneration failure in the long term. Specifically, stand densities at rotation age are too low to guarantee adequate dispersal, the optimal density of seed-producing trees being around 150 stems·ha⁻¹. In addition, rotation length needs to be extended up to 120 years to benefit from the higher seed production of older trees. Stochastic optimization confirms these results. Regeneration does not appear to worsen under climate change conditions, with the species exhibiting a resilience that merits broader consideration in Mediterranean silviculture.
Abstract:
The concept of the "intermediate band solar cell" (IBSC) is, apparently, simple to grasp. However, since the idea was proposed, our understanding has improved, and we feel we can now explain some concepts better than when we initially introduced them. Clarifying these concepts is important, even if they are well known to the advanced researcher, so that efforts can be driven in the right direction from the start. The six pieces of this work are: Does a miniband need to be formed when the IBSC is implemented with quantum dots? What are the problems of each of the main practical approaches that exist today? What are the simplest experimental techniques to demonstrate whether an IBSC is working as such or not? What is the issue with the absorption coefficient overlap? What about Mott's transition? And what would the best system be, if any?
Abstract:
Introducing cover crops (CC) interspersed with intensively fertilized crops in rotation has the potential to reduce nitrate leaching. This paper evaluates various strategies involving CC between maize and compares the economic and environmental results with respect to a typical maize–fallow rotation. The comparison is performed through stochastic (Monte-Carlo) simulation models of farms' profits using probability distribution functions (pdfs) of yield and N fertilizer saving fitted with data collected from various field trials and pdfs of crop prices and the cost of fertilizer fitted from statistical sources. Stochastic dominance relationships are obtained to rank the most profitable strategies from a farm financial perspective. A two-criterion comparison scheme is proposed to rank alternative strategies based on farm profit and nitrate leaching levels, taking the baseline scenario as the maize–fallow rotation. The results show that when CC biomass is sold as forage instead of keeping it in the soil, greater profit and less leaching of nitrates are achieved than in the baseline scenario. While the fertilizer saving will be lower if CC is sold than if it is kept in the soil, the revenue obtained from the sale of the CC compensates for the reduced fertilizer savings. The results show that CC would perhaps provide a double dividend of greater profit and reduced nitrate leaching in intensive irrigated cropping systems in Mediterranean regions.
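The stochastic dominance ranking mentioned above can be illustrated with a minimal sketch (not the paper's model): strategy A first-order stochastically dominates strategy B when A's empirical profit CDF lies at or below B's everywhere. The profit figures below are invented for the demo.

```python
import numpy as np

def fsd_dominates(profits_a, profits_b):
    """True if A's empirical CDF lies at or below B's at every profit level
    (first-order stochastic dominance of A over B)."""
    grid = np.union1d(profits_a, profits_b)
    cdf_a = np.searchsorted(np.sort(profits_a), grid, side="right") / len(profits_a)
    cdf_b = np.searchsorted(np.sort(profits_b), grid, side="right") / len(profits_b)
    return bool(np.all(cdf_a <= cdf_b))

# hypothetical Monte-Carlo profit samples for two strategies (EUR/ha)
rng = np.random.default_rng(0)
sell_cc = rng.normal(1200.0, 100.0, 10_000)   # CC biomass sold as forage
fallow  = rng.normal(1000.0, 100.0, 10_000)   # baseline maize-fallow rotation
```

In practice a two-criterion ranking would pair this financial test with the simulated nitrate-leaching level of each strategy.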
Abstract:
Markov Chain Monte Carlo methods are widely used in signal processing and communications for statistical inference and stochastic optimization. In this work, we introduce an efficient adaptive Metropolis-Hastings algorithm to draw samples from generic multimodal and multidimensional target distributions. The proposal density is a mixture of Gaussian densities with all parameters (weights, mean vectors, and covariance matrices) updated using all the previously generated samples through simple recursive rules. Numerical results for one- and two-dimensional cases are provided.
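A stripped-down sketch of this idea (one dimension, a single Gaussian proposal rather than a mixture, all constants invented): an independent Metropolis-Hastings sampler whose proposal mean and variance are updated with simple recursive rules from every generated sample.

```python
import math, random

def target(x):  # unnormalized bimodal target, modes at +/-2 (illustrative)
    return math.exp(-0.5 * (x - 2) ** 2) + math.exp(-0.5 * (x + 2) ** 2)

def q_pdf(x, mu, var):  # Gaussian proposal density
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

random.seed(1)
x, mu, var, n = 0.0, 0.0, 10.0, 1   # wide initial proposal for exploration
samples = []
for _ in range(20_000):
    prop = random.gauss(mu, math.sqrt(var))
    # independent-proposal MH acceptance ratio
    ratio = (target(prop) * q_pdf(x, mu, var)) / (target(x) * q_pdf(prop, mu, var))
    if random.random() < ratio:
        x = prop
    samples.append(x)
    # recursive moment updates using every generated sample (Welford-style)
    n += 1
    delta = x - mu
    mu += delta / n
    var += (delta * (x - mu) - var) / n + 1e-6   # keep variance positive

mean = sum(samples) / len(samples)   # near 0 for this symmetric target
```

The full algorithm adapts weights, means, and covariances of a whole Gaussian mixture, which lets each component latch onto one mode; the recursive-update pattern is the same.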
Abstract:
Monte Carlo (MC) methods are widely used in signal processing, machine learning and communications for statistical inference and stochastic optimization. A well-known class of MC methods is composed of importance sampling and its adaptive extensions (e.g., population Monte Carlo). In this work, we introduce an adaptive importance sampler using a population of proposal densities. The novel algorithm provides a global estimation of the variables of interest iteratively, using all the samples generated. The cloud of proposals is adapted by learning from a subset of previously generated samples, in such a way that local features of the target density can be better taken into account compared to single global adaptation procedures. Numerical results show the advantages of the proposed sampling scheme in terms of mean absolute error and robustness to initialization.
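A hedged sketch in the spirit of population Monte Carlo (not the authors' exact scheme): a cloud of Gaussian proposals draws samples, deterministic-mixture importance weights are computed, and proposal locations are adapted by resampling from the weighted population. A fixed wide "defensive" component is kept in the mixture so coverage is never lost; target and all tuning values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
target = lambda x: 0.5 * np.exp(-0.5 * (x - 3) ** 2) + 0.5 * np.exp(-0.5 * (x + 3) ** 2)

mus = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])   # adaptive proposal locations
sig, wide, per = 1.0, 5.0, 200                  # narrow scale, defensive scale
all_x, all_w = [], []
for _ in range(20):                             # adaptation iterations
    locs = np.append(mus, 0.0)                  # defensive N(0, wide) always kept
    scales = np.append(np.full(len(mus), sig), wide)
    x = rng.normal(locs[:, None], scales[:, None], (len(locs), per)).ravel()
    # deterministic-mixture density of the whole cloud at every sample
    qx = np.mean(
        np.exp(-0.5 * (x - locs[:, None]) ** 2 / scales[:, None] ** 2)
        / (scales[:, None] * np.sqrt(2 * np.pi)), axis=0)
    w = target(x) / qx                          # importance weights
    all_x.append(x); all_w.append(w)
    # adapt: move proposal locations toward regions of high weighted mass
    mus = rng.choice(x, size=len(mus), p=w / w.sum())

x_all, w_all = np.concatenate(all_x), np.concatenate(all_w)
est_mean = float(np.sum(w_all * x_all) / np.sum(w_all))   # global estimate
```

Because each proposal adapts locally, components settle near individual modes, which is the "local features" advantage claimed over single global adaptation.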
Abstract:
Monte Carlo (MC) methods are widely used in signal processing, machine learning and stochastic optimization. A well-known class of MC methods are Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce a novel parallel interacting MCMC scheme, where the parallel chains share information using another MCMC technique working on the entire population of current states. These parallel "vertical" chains are led by random-walk proposals, whereas the "horizontal" MCMC uses an independent proposal, which can be easily adapted by making use of all the generated samples. Numerical results show the advantages of the proposed sampling scheme in terms of mean absolute error, as well as robustness w.r.t. initial values and parameter choice.
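One plausible reading of the vertical/horizontal structure, sketched with invented target and constants: several random-walk Metropolis chains advance in parallel, and a "horizontal" independent-proposal Metropolis step, fitted to all samples generated so far, periodically proposes replacements for each chain's state.

```python
import math, random

def target(x):  # symmetric bimodal target, modes at +/-4 (illustrative)
    return math.exp(-0.5 * (x - 4) ** 2) + math.exp(-0.5 * (x + 4) ** 2)

random.seed(3)
chains = [-8.0, -4.0, 0.0, 4.0, 8.0]
n, s1, s2 = 0, 0.0, 0.0          # running moments of all generated samples
def absorb(states):
    global n, s1, s2
    for v in states:
        n += 1; s1 += v; s2 += v * v

absorb(chains)
samples = list(chains)
for _ in range(2_000):
    # "vertical" step: random-walk Metropolis on each chain independently
    for i, xi in enumerate(chains):
        prop = xi + random.gauss(0.0, 1.0)
        if random.random() < target(prop) / target(xi):
            chains[i] = prop
    absorb(chains); samples.extend(chains)
    # "horizontal" step: independent proposal fitted to all samples so far
    mu = s1 / n
    var = max(s2 / n - mu * mu, 1e-3)
    for i, xi in enumerate(chains):
        prop = random.gauss(mu, math.sqrt(var))
        num = target(prop) * math.exp(-0.5 * (xi - mu) ** 2 / var)
        den = target(xi) * math.exp(-0.5 * (prop - mu) ** 2 / var)
        if random.random() < num / den:
            chains[i] = prop
    absorb(chains); samples.extend(chains)

mean = sum(samples) / len(samples)   # near 0 for the symmetric target
```

The horizontal exchanges let a chain stuck in one mode jump to the other, which is where the robustness to initialization comes from.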
Abstract:
Recently, the target function for crystallographic refinement has been improved through a maximum likelihood analysis, which makes proper allowance for the effects of data quality, model errors, and incompleteness. The maximum likelihood target reduces the significance of false local minima during the refinement process, but it does not completely eliminate them, necessitating the use of stochastic optimization methods such as simulated annealing for poor initial models. It is shown that the combination of maximum likelihood with cross-validation, which reduces overfitting, and simulated annealing by torsion angle molecular dynamics, which simplifies the conformational search problem, results in a major improvement of the radius of convergence of refinement and the accuracy of the refined structure. Torsion angle molecular dynamics and the maximum likelihood target function interact synergistically, the combination of both methods being significantly more powerful than each method individually. This is demonstrated in realistic test cases at two typical minimum Bragg spacings (d_min = 2.0 and 2.8 Å, respectively), illustrating the broad applicability of the combined method. In an application to the refinement of a new crystal structure, the combined method automatically corrected a mistraced loop in a poor initial model, moving the backbone by 4 Å.
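Why simulated annealing escapes false local minima can be seen on a toy problem (the potential and schedule below are invented; real refinement anneals torsion-angle molecular dynamics against the maximum likelihood target, not a 1-D function): at high temperature, uphill moves are accepted often enough to cross barriers, and cooling then locks the search into the deepest basin found.

```python
import math, random

def energy(x):
    # invented potential: shallow (false) minimum near x ~ 2.1,
    # deep (global) minimum near x ~ -2.35
    return 0.1 * x ** 4 - x ** 2 + 0.5 * x

random.seed(4)
x, temp = 2.0, 5.0           # start trapped in the shallow minimum
best = x
for _ in range(20_000):
    prop = x + random.gauss(0.0, 0.5)
    delta = energy(prop) - energy(x)
    # Metropolis criterion: always accept downhill, sometimes uphill
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = prop
        if energy(x) < energy(best):
            best = x
    temp *= 0.9995           # slow geometric cooling
```

A purely greedy minimizer started at x = 2 would stay in the shallow basin; the annealed search reliably crosses into the deep one.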
Abstract:
Hydraulic conductivity (K) is one of the parameters controlling the magnitude of groundwater velocity and, consequently, one of the most important parameters affecting groundwater flow and solute transport, which makes knowledge of the distribution of K of paramount importance. This work aims to estimate hydraulic conductivity values in two distinct areas, one in the Guarani Aquifer System (SAG) and the other in the Bauru Aquifer System (SAB), using three geostatistical techniques: ordinary kriging, cokriging, and conditional simulation by turning bands. To enlarge the database of K values, a statistical treatment of the known data is applied. The mathematical interpolation method (ordinary kriging) and the stochastic method (conditional simulation by turning bands) are applied to estimate K values directly, whereas ordinary kriging combined with linear regression and cokriging allow specific capacity (Q/s) values to be incorporated as a secondary variable. In addition, the cell-declustering technique was applied with each geostatistical method to assess its ability to improve their performance, which can be evaluated through cross-validation. The results of these geostatistical approaches indicate that conditional simulation by turning bands with declustering and ordinary kriging combined with linear regression without declustering are the most suitable methods for the SAG (rho=0.55) and SAB (rho=0.44) areas, respectively. The statistical treatment and the declustering technique used in this work proved to be useful auxiliary tools for the geostatistical methods.
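The ordinary-kriging estimator used above can be sketched compactly (illustrative data and an exponential variogram, not the study's fitted model): the kriging weights solve a linear system built from the semivariogram, with a Lagrange multiplier enforcing that the weights sum to one.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng_=500.0, nugget=0.0):
    """Exponential semivariogram with practical range rng_ (illustrative)."""
    return nugget + sill * (1.0 - np.exp(-3.0 * h / rng_))

def ordinary_kriging(coords, values, point, **vg):
    """Ordinary-kriging estimate at `point` from scattered observations."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    n = len(values)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vg)   # semivariogram between data points
    A[n, n] = 0.0                        # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(coords - point, axis=1), **vg)
    lam = np.linalg.solve(A, b)          # weights + multiplier
    return float(lam[:n] @ values)

# hypothetical log10-K observations (m/s) at the corners of a 100 m square
coords = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
logK = np.array([-4.0, -3.5, -4.5, -3.8])
center = ordinary_kriging(coords, logK, np.array([50.0, 50.0]))
```

Cokriging extends the same system with cross-variograms so a secondary variable such as specific capacity (Q/s) can inform the estimate; K is usually kriged in log space, as here.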
Abstract:
The trading of electricity from renewable sources is ordinarily an activity in which operations are structured under conditions of uncertainty, for example with respect to the spot price in the short-term market and the energy output of the generation assets. Hence agents seek to formulate strategies and employ tools to support their decision-making, aiming not only at financial return but also at mitigating the risks involved. Investment analyses of renewable sources face similar challenges. In the literature, optimal decision-making under uncertainty is studied through stochastic programming techniques, which make it possible to model problems with random variables and to obtain rational solutions of interest to the investor. These models allow the incorporation of risk metrics, such as the Conditional Value-at-Risk (CVaR), in order to obtain optimal solutions that weigh the expected financial result against the associated risk of the operation, where the agent's risk aversion becomes a fundamental determinant. The main objective of this thesis, from the perspective of generating, consuming, and trading agents, is: (i) to develop and implement stochastic linear programming optimization models with the associated CVaR metric, customized for each of these agents; and (ii) to apply them to the strategic analysis of operations, presenting feasible alternatives for managing these agents' activities and contributing a conceptually robust, user-friendly instrument for use by companies. In this context, as emphasized above, the focus is on analyzing the financial risk of these operations through the application of CVaR, based on the agent's risk aversion.
Hydropower and wind power are considered as generation-asset options, in order to study the complementarity effect between different sources and between different sites of the same source, assessing the impacts on operations.
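The scenario-based CVaR metric central to such models is easy to state: for a confidence level alpha, the CVaR of a profit distribution is the expected profit over the worst (1 − alpha) share of scenarios. The sketch below uses invented prices and volumes, not the thesis's models.

```python
import numpy as np

def cvar(profits, alpha=0.95):
    """Expected profit over the worst (1 - alpha) tail of the scenarios."""
    profits = np.sort(np.asarray(profits, dtype=float))
    k = max(1, int(np.ceil((1 - alpha) * len(profits))))
    return float(profits[:k].mean())

# hypothetical scenarios: spot price and renewable generation are uncertain
rng = np.random.default_rng(5)
spot = rng.lognormal(mean=4.0, sigma=0.5, size=10_000)       # $/MWh
energy = rng.normal(100.0, 20.0, size=10_000).clip(min=0.0)  # MWh generated
contract_q, contract_p = 90.0, 55.0                          # contracted MWh, price
# sell contract_q at contract_p, settle deviations at the spot price
profit = contract_q * contract_p + (energy - contract_q) * spot

expected, tail = profit.mean(), cvar(profit, 0.95)
score = 0.5 * expected + 0.5 * tail   # risk-averse objective weighting E and CVaR
```

In the stochastic linear programs themselves, CVaR enters through the standard Rockafellar-Uryasev auxiliary-variable formulation, and the weight between expectation and CVaR encodes the agent's risk aversion.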
Abstract:
The cross-entropy (CE) method is a new generic approach to combinatorial and multi-extremal optimization and rare event simulation. The purpose of this tutorial is to give a gentle introduction to the CE method. We present the CE methodology, the basic algorithm and its modifications, and discuss applications in combinatorial optimization and machine learning.
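The basic CE loop for a continuous problem fits in a few lines (the objective below is invented, not from the tutorial): sample from a parametric density, keep the elite fraction of samples, refit the density to the elites, and repeat until it concentrates on the optimum.

```python
import numpy as np

def ce_minimize(f, mu=0.0, sigma=10.0, n=200, elite_frac=0.1, iters=60):
    """Cross-entropy minimization of f with a 1-D Gaussian sampling density."""
    for _ in range(iters):
        x = np.random.normal(mu, sigma, n)               # sample candidates
        elites = x[np.argsort(f(x))[: int(n * elite_frac)]]  # best samples
        mu, sigma = elites.mean(), elites.std() + 1e-8   # refit the density
    return mu

np.random.seed(6)
f = lambda x: (x - 3.0) ** 2 + 2.0   # toy objective, minimum at x = 3
best = ce_minimize(f)
```

For rare-event simulation the same update is applied to an importance-sampling density instead of an optimizer's sampling density, which is where the method originated.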
Abstract:
In series I and II of this study ([Chua et al., 2010a] and [Chua et al., 2010b]), we discussed the time scale of granule–granule collision, droplet–granule collision and droplet spreading in Fluidized Bed Melt Granulation (FBMG). In this third one, we consider the rate at which binder solidifies. A simple analytical solution, based on the classical formulation for conduction across a semi-infinite slab, was used to obtain a generalized equation for binder solidification time. A multi-physics simulation package (Comsol) was used to predict the binder solidification time for various operating conditions usually considered in FBMG. The simulation results were validated with experimental temperature data obtained with a high speed infrared camera during solidification of 'macroscopic' (mm scale) droplets. For the range of microscopic droplet sizes and operating conditions considered for a FBMG process, the binder solidification time was found to fall approximately between 10⁻³ and 10⁻¹ s. This is the slowest of the four major FBMG microscopic events discussed in this series (granule–granule collision, granule–droplet collision, droplet spreading, and binder solidification).
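A hedged back-of-envelope version of the semi-infinite-slab estimate (all property values below are illustrative, not the paper's): with the surface dropped to the granule temperature T_s, the slab solution T(x,t) = T_s + (T_i − T_s)·erf(x / (2√(αt))) can be inverted for the time at which depth L reaches the solidification temperature T_f, giving t = (L / (2·erfinv(θ)))² / α with θ = (T_f − T_s)/(T_i − T_s).

```python
from math import erf

def erfinv(y, lo=0.0, hi=5.0):
    """Invert erf on [0, 5) by bisection (ample precision for this estimate)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if erf(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

L = 30e-6      # illustrative binder layer thickness, 30 um
alpha = 1e-7   # illustrative thermal diffusivity of the binder, m^2/s
T_i, T_s, T_f = 90.0, 40.0, 70.0   # initial melt, granule surface, solidification (C)

theta = (T_f - T_s) / (T_i - T_s)                   # dimensionless temperature, 0.6
t_solid = (L / (2.0 * erfinv(theta))) ** 2 / alpha  # seconds
```

With these numbers t_solid comes out around 6 × 10⁻³ s, inside the 10⁻³–10⁻¹ s window the paper reports for FBMG conditions.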
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two deterministic, called sensitivity analysis and deterministic appraisal, and a third stochastic one, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy, and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurements and planning models to the British motor vehicle manufacturing companies are presented and discussed.
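The risk-simulation idea can be sketched under the thesis's stated assumptions (statistically independent, normally distributed component variables); all figures below are invented. Productivity is taken here as added value per unit of labour-plus-capital cost, and the Monte Carlo run yields its distribution rather than a single point estimate.

```python
import random, statistics

random.seed(7)
sims = []
for _ in range(20_000):
    # independent normal component variables (illustrative means / std devs)
    sales     = random.gauss(1000.0, 80.0)
    materials = random.gauss(600.0, 50.0)
    labour    = random.gauss(200.0, 15.0)
    capital   = random.gauss(100.0, 10.0)
    added_value = sales - materials          # added-value concept
    sims.append(added_value / (labour + capital))

mean_p = statistics.fmean(sims)              # expected productivity
p5 = sorted(sims)[int(0.05 * len(sims))]     # 5th percentile: downside risk
```

Reading off a low percentile alongside the mean is what turns the deterministic appraisal into a risk statement a planner can act on.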
Abstract:
In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified at the group of pictures (GOP) and H.264/AVC data partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to network conditions such as the available network bandwidth, packet loss rate and average packet burst loss length. A near-optimal algorithm is developed to deal with the FEC assignment for optimization. The simulation results show that our scheme can effectively utilize network resources such as bandwidth, while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
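A back-of-envelope sketch of why unequal FEC assignment helps (this is not the paper's near-optimal algorithm, and it assumes independent losses rather than bursts): with a systematic (n, k) erasure code, a group is recoverable iff at most n − k of its n packets are lost, so the recovery probability under loss rate p is a binomial tail, and spending more parity on important partitions buys them a much higher recovery probability.

```python
from math import comb

def recovery_prob(n, k, p):
    """P(at most n - k of n packets lost), i.i.d. packet loss rate p."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n - k + 1))

p = 0.05                            # illustrative packet loss rate
strong = recovery_prob(12, 8, p)    # more parity for important partitions
weak   = recovery_prob(10, 9, p)    # less parity for less important ones
```

At p = 0.05 the strongly protected group recovers with probability above 0.999 while the lightly protected one sits near 0.914; an adaptive scheme re-splits the parity budget as p and burst length change.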
Abstract:
Conventional concrete (CC) has many problems, such as corrosion of the reinforcing steel and the low strength of concrete constructions. Consequently, most structures made of CC require frequent maintenance. Ultra-high performance fiber-reinforced concrete (UHPFRC) can be designed to eliminate some of the characteristic weaknesses of CC. UHPFRC is defined worldwide as a concrete with superior mechanical, ductility, and durability properties. Classical UHPFRC contains between 800 kg/m³ and 1000 kg/m³ of cement, 25 to 35% by mass (wt%) of silica fume (SF), 0 to 40 wt% of quartz powder (QP), and 110 to 140 wt% of quartz sand (QS) (mass percentages are based on the total cement mass of the mixtures). UHPFRC contains steel fibers to improve its ductility and tensile strength. The large quantities of cement used to produce UHPFRC not only affect production costs and the consumption of natural resources such as limestone, clay, coal, and electrical energy, but also harm the environment through the substantial production of greenhouse gases, including carbon dioxide (CO₂). Moreover, the particle-size distribution of the cement leaves microscopic voids that can be filled with finer materials such as SF. However, a large quantity of SF is needed to fill these voids with SF alone (25 to 30 wt% of cement), which entails high costs since SF is a limited resource. SF also significantly reduces the workability of UHPFRC because of its high Blaine specific surface area. The use of QP and QS is likewise costly and consumes significant natural resources.
Indeed, QP and QS are regarded as obstacles to the large-scale use of UHPFRC in the concrete market, because they fail to meet environmental requirements. An Environment Canada report states that quartz causes immediate and long-term environmental damage because of its biological effect. UHPFRC is generally sold as a prepackaged product, which limits design modifications by the user. It is normally transported over long distances, unlike the components of CC. This also contributes to greenhouse-gas generation and leads to a higher cost of the final product. There is therefore a need to develop other locally available materials with similar functions to partially or fully replace the silica fume, quartz sand, or quartz powder, and thus reduce the cement content of UHPFRC, while achieving comparable or better properties. Large quantities of waste glass cannot be recycled because of their brittleness, their color, or high recycling costs. Most waste glass goes to landfill, which is undesirable since glass is a non-biodegradable and therefore environmentally unfriendly material. In recent years, studies have been carried out on using waste glass as an alternative supplementary cementitious material or as ultrafine aggregate in concrete, depending on its particle-size distribution and chemical composition. This thesis presents a new type of ecological concrete based on waste glass, ultra-high performance glass concrete (UHPGC), developed at the Université de Sherbrooke. The concretes were designed using waste glass of various particle sizes and granular optimization of the granular and cementitious matrices.
UHPGC can be designed with a reduced amount of cement (400 to 800 kg/m³), SF (50 to 220 kg/m³), QP (0 to 400 kg/m³), and QS (0 to 1200 kg/m³), while incorporating various glass waste products: glass sand (GS) (0 to 1200 kg/m³) with a mean diameter (d₅₀) of 275 µm, a large amount of glass powder (GP) (200 to 700 kg/m³) with a d₅₀ of 11 µm, and a moderate content of fine glass powder (FGP) (50 to 200 kg/m³) with a d₅₀ of 3.8 µm. UHPGC also contains steel fibers (to increase tensile strength and improve ductility), superplasticizer (10 to 60 kg/m³), and a water-to-binder ratio (W/B) as low as that of UHPFRC. Replacing cement and SF particles with smooth, non-absorbent glass particles improves the rheology of UHPGC. In addition, using FGP in place of SF reduces the net total specific surface area of the SF-FGP blend. Since the net specific surface area of the particles decreases, less water is needed to lubricate the particle surfaces, which yields a higher slump for the same W/B. The use of waste glass in concrete also lowers the cumulative heat of hydration, which helps minimize potential shrinkage cracking. Depending on the UHPGC composition and the curing temperature, this type of concrete can reach compressive strengths of 130 to 230 MPa, flexural strengths above 20 MPa, tensile strengths above 10 MPa, and a modulus of elasticity above 40 GPa. The mechanical performance of UHPGC is enhanced by the reactivity of the amorphous glass and by the granulometric optimization and densification of the mixtures. The glass waste products in UHPGC behave pozzolanically and react with the portlandite generated by cement hydration.
This is not the case, however, with the quartz sand or quartz powder in classical UHPFRC, which react only at the elevated temperature of 400 °C. The addition of waste glass improves the densification of the interface between particles. Waste glass particles have high stiffness, which increases the modulus of elasticity of the concrete. UHPGC also has very good durability. Its capillary porosity is very low, and the material is extremely resistant to chloride-ion penetration (≈ 8 coulombs). Its abrasion resistance (volume-loss index) is below 1.3. UHPGC shows practically no freeze-thaw deterioration, even after 1000 cycles. After laboratory evaluation of UHPGC, scale-up was carried out with an industrial concrete mixer, along with field validation through the construction of two footbridges. The superior mechanical properties of UHPGC made it possible to design the footbridges with cross-sections reduced by about 60% compared with sections made of CC. UHPGC offers several economic and environmental advantages. It reduces the production cost and the carbon footprint of structures built with classical ultra-high performance fiber-reinforced concrete (UHPFRC) by using locally available materials. It reduces the CO₂ emissions associated with cement-clinker production (50% cement replacement) and makes efficient use of natural resources. Moreover, UHPGC production reduces the quantities of waste glass stockpiled or landfilled, which cause environmental problems, and could save the millions of dollars that might otherwise be spent on treating this waste. Finally, it offers construction companies an alternative for producing UHPFRC-class concrete at lower cost.