971 results for Monte-carlo Simulations


Relevance:

100.00%

Publisher:

Abstract:

According to the axiomatic literature on consensus methods, the best collective choice by one method of preference aggregation can easily be the worst by another. Are award committees, electorates, managers, online retailers, and web-based recommender systems stuck with an impossibility of rational preference aggregation? We investigate this social choice conundrum for seven social choice methods: Condorcet, Borda, Plurality, Antiplurality, the Single Transferable Vote, Coombs, and Plurality Runoff. We rely on Monte Carlo simulations for theoretical results and on twelve ballot datasets from American Psychological Association (APA) presidential elections for empirical results. Each of these elections provides partial rankings of five candidates from about 13,000 to about 20,000 voters. APA preferences are neither domain-restricted nor generated by an Impartial Culture. We find virtually no trace of a Condorcet paradox. In direct contrast with the classical social choice conundrum, competing consensus methods agree remarkably well, especially on the overall best and worst options. The agreement is also robust under perturbations of the preference profile via resampling, even in relatively small pseudosamples. We also explore prescriptive implications of our findings.
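The clash between aggregation methods that motivates the study can be seen in a few lines of code. A minimal sketch (with a hypothetical three-candidate profile, not the APA ballots) in which Plurality and Borda crown different winners:

```python
from collections import Counter

def plurality_winner(rankings):
    # Each ranking is a tuple of candidates, best first.
    first_place = Counter(r[0] for r in rankings)
    return first_place.most_common(1)[0][0]

def borda_winner(rankings):
    # Top position earns n-1 points, last position earns 0.
    scores = Counter()
    for r in rankings:
        n = len(r)
        for pos, cand in enumerate(r):
            scores[cand] += n - 1 - pos
    return scores.most_common(1)[0][0]

# Contrived profile: A has the most first-place votes, but B is
# ranked highly by almost everyone.
profile = (
    [("A", "B", "C")] * 4 +   # 4 voters: A > B > C
    [("B", "C", "A")] * 3 +   # 3 voters: B > C > A
    [("C", "B", "A")] * 2     # 2 voters: C > B > A
)
print(plurality_winner(profile))  # A (most first-place votes)
print(borda_winner(profile))      # B (highest Borda score: 12 vs 8 and 7)
```

The paper's empirical point is that on real APA ballots such disagreements turn out to be rare.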

Relevance:

100.00%

Publisher:

Abstract:

In this paper, the impact of hardware impairments on the secrecy performance of cognitive MIMO schemes is investigated. In addition, the relay, which helps the source forward its signal to the destination, can operate in either half-duplex or full-duplex mode. For performance evaluation, we derive expressions for the average secrecy rate over Rayleigh fading channels. Monte Carlo simulations are presented to compare and optimize the performance of the proposed schemes.
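As a flavor of what such Monte Carlo validation looks like, here is a minimal sketch (not the paper's model: a plain point-to-point Rayleigh link with no relay, eavesdropper, or hardware impairment) that checks a simulated outage probability against its closed form:

```python
import math, random

def rayleigh_outage(avg_snr, threshold, trials=200_000, seed=1):
    """Monte Carlo outage probability of a point-to-point Rayleigh
    fading link: the instantaneous SNR is exponentially distributed
    with mean avg_snr, and an outage occurs below the threshold."""
    rng = random.Random(seed)
    outages = sum(rng.expovariate(1.0 / avg_snr) < threshold
                  for _ in range(trials))
    return outages / trials

avg_snr, threshold = 10.0, 3.0
mc = rayleigh_outage(avg_snr, threshold)
exact = 1.0 - math.exp(-threshold / avg_snr)  # closed form for this link
print(mc, exact)  # both ≈ 0.259
```

Papers of this kind overlay exactly such simulated points on the derived analytical curves to corroborate the analysis.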

Relevance:

100.00%

Publisher:

Abstract:

A novel design for a compact gamma-ray spectrometer is presented. The proposed system allows for spectroscopy of high-flux multi-MeV gamma-ray beams with MeV energy resolution in a compact design. In its basic configuration, the spectrometer exploits conversion of gamma-rays into electrons via Compton scattering in a low-Z material. The scattered electron population is then spectrally resolved using a magnetic spectrometer. The detector is shown to be effective for gamma-ray energies between 3 and 20 MeV. The main properties of the spectrometer are confirmed by Monte Carlo simulations.
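The Compton-scattering step can be illustrated with the standard kinematic formula for the maximum energy transferred to an electron (the Compton edge). The numbers below are textbook kinematics over the spectrometer's stated working range, not the instrument's calibration:

```python
MEC2 = 0.511  # electron rest energy, MeV

def compton_edge(e_gamma_mev):
    """Maximum kinetic energy (MeV) a photon can transfer to an
    electron in a single Compton scatter (180-degree backscatter):
    T_max = 2E^2 / (m_e c^2 + 2E)."""
    return 2.0 * e_gamma_mev ** 2 / (MEC2 + 2.0 * e_gamma_mev)

for e in (3.0, 10.0, 20.0):  # MeV, the stated 3-20 MeV working range
    print(e, round(compton_edge(e), 2))
```

At multi-MeV energies the edge sits close to the photon energy itself, which is why spectrally resolving the scattered electrons recovers the gamma-ray spectrum.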

Relevance:

100.00%

Publisher:

Abstract:

This study investigates the effects of ground heterogeneity, considering permeability as a random variable, on an intruding saltwater (SW) wedge using Monte Carlo simulations. Random permeability fields were generated, using the method of Local Average Subdivision (LAS), based on a lognormal probability density function. The LAS method allows the creation of spatially correlated random fields, generated using coefficients of variation (COV) and horizontal and vertical scales of fluctuation (SOF). The numerical modelling code SUTRA was employed to solve the coupled flow and transport problem. The well-defined 2D dispersive Henry problem was used as the test case for the method. The intruding SW wedge is defined by two key parameters, the toe penetration length (TL) and the width of the mixing zone (WMZ). These parameters were compared to the results of a homogeneous case simulated using effective permeability values. The simulation results revealed: (1) an increase in COV resulted in a seaward movement of TL; (2) the WMZ extended with increasing COV; (3) a general increase in horizontal and vertical SOF produced a seaward movement of TL, with the WMZ increasing slightly; (4) as the anisotropic ratio increased, the TL intruded further inland and the WMZ reduced in size. The results show that for large values of COV, effective permeability parameters are inadequate at reproducing the effects of heterogeneity on SW intrusion.
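The lognormal parameterization behind such fields is compact. A minimal sketch (independent samples only; the LAS step that adds spatial correlation is omitted, and the mean permeability is an arbitrary illustrative value) showing how a target mean and COV map to the parameters of the underlying normal distribution of ln(k):

```python
import math, random

def lognormal_params(mean_k, cov):
    """Map a target mean and coefficient of variation (COV) of
    permeability to the parameters of the underlying normal
    distribution of ln(k)."""
    sigma2 = math.log(1.0 + cov ** 2)
    return math.log(mean_k) - 0.5 * sigma2, math.sqrt(sigma2)

rng = random.Random(7)
mu, sigma = lognormal_params(mean_k=1e-11, cov=0.5)  # k in m^2 (illustrative)
sample = [rng.lognormvariate(mu, sigma) for _ in range(100_000)]
mean_k = sum(sample) / len(sample)
std_k = (sum((x - mean_k) ** 2 for x in sample) / len(sample)) ** 0.5
print(mean_k, std_k / mean_k)  # recovers the target mean and COV ≈ 0.5
```

LAS would then arrange such values on a grid so that nearby cells are correlated according to the chosen SOF.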

Relevance:

100.00%

Publisher:

Abstract:

We use ground-based images of high spatial and temporal resolution to search for evidence of nanoflare activity in the solar chromosphere. Through close examination of more than 1 × 10⁹ pixels in the immediate vicinity of an active region, we show that the distributions of observed intensity fluctuations have subtle asymmetries. A negative excess in the intensity fluctuations indicates that more pixels have fainter-than-average intensities compared with those that appear brighter than average. By employing Monte Carlo simulations, we reveal how the negative excess can be explained by a series of impulsive events, coupled with exponential decays, that are fractionally below the current resolving limits of low-noise equipment on high-resolution ground-based observatories. Importantly, our Monte Carlo simulations provide clear evidence that the intensity asymmetries cannot be explained by photon-counting statistics alone. A comparison to the coronal work of Terzo et al. suggests that nanoflare activity in the chromosphere is more readily occurring, with an impulsive event occurring every ~360 s in a 10,000 km² area of the chromosphere, some 50 times more events than in a comparably sized region of the corona. As a result, nanoflare activity in the chromosphere is likely to play an important role in providing heat energy to this layer of the solar atmosphere.
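The asymmetry argument can be reproduced in miniature. A toy sketch (with illustrative rates, amplitudes, and noise levels, not the observed ones): rare impulsive brightenings with exponential decay pull the mean above the typical level, so most samples sit below the mean, mimicking the negative excess:

```python
import math, random

def toy_lightcurve(n=200_000, event_prob=0.001, amplitude=1.0,
                   decay_steps=200.0, noise_sigma=0.1, seed=3):
    """Gaussian noise plus rare impulsive events that decay
    exponentially: a toy version of the nanoflare picture."""
    rng = random.Random(seed)
    level, out = 0.0, []
    for _ in range(n):
        if rng.random() < event_prob:
            level += amplitude                 # impulsive brightening
        level *= math.exp(-1.0 / decay_steps)  # exponential decay
        out.append(level + rng.gauss(0.0, noise_sigma))
    return out

lc = toy_lightcurve()
mean = sum(lc) / len(lc)
frac_below = sum(x < mean for x in lc) / len(lc)
print(frac_below)  # > 0.5: more samples fainter than average
```

Pure Gaussian noise would give a fraction of 0.5; the excess above 0.5 is the signature the study searches for in real pixel statistics.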

Relevance:

100.00%

Publisher:

Abstract:

This paper tests a simple market fraction asset pricing model with heterogeneous agents. By selecting a set of structural parameters of the model through a systematic procedure, we show that the autocorrelations (of returns, absolute returns and squared returns) of the market fraction model share the same pattern as those of the DAX 30. By conducting econometric analysis via Monte Carlo simulations, we characterize these power-law behaviours and find that estimates of the power-law decay indices, the (FI)GARCH parameters, and the tail index of the selected market fraction model closely match those of the DAX 30. The results strongly support the explanatory power of the heterogeneous agent models.
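The autocorrelation signature in question (near zero for raw returns, persistent for absolute returns) can be generated with a toy GARCH(1,1) process. The parameters below are illustrative, not the paper's estimates, and this is not the market fraction model itself:

```python
import random

def simulate_garch(n=50_000, omega=0.05, alpha=0.1, beta=0.85, seed=9):
    """GARCH(1,1) returns: volatility clustering leaves raw returns
    nearly uncorrelated while absolute returns stay autocorrelated."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    returns = []
    for _ in range(n):
        r = rng.gauss(0.0, var ** 0.5)
        returns.append(r)
        var = omega + alpha * r * r + beta * var
    return returns

def acf1(xs):
    """Lag-1 sample autocorrelation."""
    m = sum(xs) / len(xs)
    num = sum((a - m) * (b - m) for a, b in zip(xs, xs[1:]))
    den = sum((x - m) ** 2 for x in xs)
    return num / den

r = simulate_garch()
print(acf1(r), acf1([abs(x) for x in r]))  # ~0 vs clearly positive
```

Matching the full decay pattern of these autocorrelations over many lags, rather than just lag 1, is what the paper's calibration against the DAX 30 entails.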

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we investigate the end-to-end performance of dual-hop proactive decode-and-forward relaying networks with Nth best relay selection in the presence of two practical deleterious effects: i) hardware impairment and ii) co-channel interference. In particular, we derive new exact and asymptotic closed-form expressions for the outage probability and average channel capacity of Nth best partial and opportunistic relay selection schemes over Rayleigh fading channels. Insightful discussions are provided. It is shown that, when the system cannot select the best relay for cooperation, the partial relay selection scheme outperforms the opportunistic method under the impact of the same co-channel interference (CCI). In addition, without CCI but under the effect of hardware impairment, it is shown that both selection strategies have the same asymptotic channel capacity. Monte Carlo simulations are presented to corroborate our analysis.

Relevance:

100.00%

Publisher:

Abstract:

We consider the non-equilibrium dynamics of a simple system consisting of interacting spin-1/2 particles subjected to a collective damping. The model is close to situations that can be engineered in hybrid electro/opto-mechanical settings. Making use of large-deviation theory, we find a Gallavotti-Cohen symmetry in the dynamics of the system as well as evidence for the coexistence of two dynamical phases with different activity levels. We show that additional damping processes smooth out this behavior. Our analytical results are backed up by Monte Carlo simulations that reveal the nature of the trajectories contributing to the different dynamical phases.

Relevance:

100.00%

Publisher:

Abstract:

The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, in order to introduce FPGAs as a potential platform to efficiently execute simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium-length (e.g., 8,000-bit) to long (e.g., 64,800-bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, the GPU or the FPGA may be the more convenient choice, providing different acceleration factors over conventional multicore CPUs.
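The per-design-point simulation loop at the heart of such studies has a simple shape. A minimal sketch (not an LDPC decoder: an uncoded BPSK-over-AWGN baseline, the kind of Monte Carlo error-rate harness that gets repeated for every code, SNR, and quantization setting):

```python
import math, random

def ber_bpsk_awgn(ebn0_db, n_bits=200_000, seed=5):
    """Monte Carlo bit-error-rate estimate for uncoded BPSK over an
    AWGN channel: transmit +1, add Gaussian noise, count sign flips."""
    rng = random.Random(seed)
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    sigma = math.sqrt(1.0 / (2.0 * ebn0))  # unit-energy antipodal symbols
    errors = sum(1.0 + rng.gauss(0.0, sigma) < 0.0 for _ in range(n_bits))
    return errors / n_bits

mc = ber_bpsk_awgn(4.0)
exact = 0.5 * math.erfc(math.sqrt(10.0 ** 0.4))  # Q(sqrt(2*Eb/N0)) at 4 dB
print(mc, exact)
```

For coded systems the inner expression becomes a full encode-modulate-decode pipeline, and low target error rates multiply the number of bits needed, which is precisely why the paper pursues GPU and FPGA acceleration.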

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To investigate the clinical implications of a variable relative biological effectiveness (RBE) on proton dose fractionation. Experiments with acute exposures have shown that the current clinical adoption of a generic, constant cell-killing RBE underestimates the effect of the sharp increase in linear energy transfer (LET) in the distal regions of the spread-out Bragg peak (SOBP). However, experimental data for the impact of dose fractionation in such scenarios are still limited.

Methods and Materials: Human fibroblasts (AG01522) at 4 key depth positions on a clinical SOBP of maximum energy 219.65 MeV were subjected to various fractionation regimens with an interfraction period of 24 hours at the Proton Therapy Center in Prague, Czech Republic. Cell killing RBE variations were measured using standard clonogenic assays, further validated using Monte Carlo simulations, and parameterized using a linear quadratic formalism.

Results: Significant variations in the cell killing RBE for fractionated exposures along the proton dose profile were observed. RBE increased sharply toward the distal position, corresponding to a reduction in cell sparing effectiveness of fractionated proton exposures at higher LET. The effect was more pronounced at smaller doses per fraction. Experimental survival fractions were adequately predicted using a linear quadratic formalism assuming full repair between fractions. Data were also used to validate a parameterized variable RBE model based on linear α parameter response with LET that showed considerable deviations from clinically predicted isoeffective fractionation regimens.

Conclusions: The RBE-weighted absorbed dose calculated using the clinically adopted generic RBE of 1.1 significantly underestimates the biological effective dose from variable RBE, particularly in fractionation regimens with low doses per fraction. Coupled with an increase in effective range in fractionated exposures, our study provides an RBE dataset that can be used by the modeling community for the optimization of fractionated proton therapy.
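The linear quadratic bookkeeping used to parameterize such data is compact. A sketch with illustrative α and β values (not the AG01522 fits), assuming full repair between fractions as in the study:

```python
import math

def surviving_fraction(dose_per_fraction, n_fractions, alpha, beta):
    """Linear quadratic survival with full repair between fractions:
    S = exp(-n * (alpha*d + beta*d**2))."""
    d = dose_per_fraction
    return math.exp(-n_fractions * (alpha * d + beta * d * d))

# Illustrative parameters only (Gy^-1 and Gy^-2), not fitted values:
alpha, beta = 0.2, 0.05
single = surviving_fraction(6.0, 1, alpha, beta)  # one 6 Gy fraction
split = surviving_fraction(2.0, 3, alpha, beta)   # three 2 Gy fractions
print(single, split)  # splitting the same total dose spares more cells
```

The quadratic term is what makes fractionation cell-sparing, and the study's finding is that rising LET toward the distal SOBP erodes exactly this sparing effect.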

Relevance:

100.00%

Publisher:

Abstract:

By testing a simple asset pricing model of heterogeneous agents to characterize the power-law behavior of the DAX 30 from 1975 to 2007, we provide supporting evidence on empirical findings that investors and fund managers use combinations of fixed and switching strategies based on fundamental and technical analysis when making investment decisions. By conducting econometric analysis via Monte Carlo simulations, we show that the autocorrelation patterns, the estimates of the power-law decay indices, (FI)GARCH parameters, and tail index of the model match closely the corresponding estimates for the DAX 30. A mechanism analysis based on the calibrated model provides further insights into the explanatory power of heterogeneous agent models.

Relevance:

100.00%

Publisher:

Abstract:

Doctoral thesis in Management, specialty in Marketing, Faculdade de Economia, Universidade do Algarve, 2007

Relevance:

100.00%

Publisher:

Abstract:

A PhD dissertation presented as part of the requirements for the degree of Doctor of Philosophy from the NOVA School of Business and Economics

Relevance:

100.00%

Publisher:

Abstract:

This work is divided into two distinct parts. The first part consists of the study of the metal organic framework UiO-66Zr, where the aim was to determine the force field that best describes the adsorption equilibrium properties of two different gases, methane and carbon dioxide. The second part focuses on the study of the single wall carbon nanotube topology for ethane adsorption; the aim was to simplify as much as possible the solid-fluid force field model to increase the computational efficiency of the Monte Carlo simulations. The choice of both adsorbents relies on their potential use in adsorption processes, such as the capture and storage of carbon dioxide, natural gas storage, separation of components of biogas, and olefin/paraffin separations. The adsorption studies on the two porous materials were performed by molecular simulation using the grand canonical Monte Carlo (μ,V,T) method, over the temperature range of 298-343 K and the pressure range of 0.06-70 bar. The calibration curves of pressure and density as a function of chemical potential and temperature for the three adsorbates under study were obtained by Monte Carlo simulation in the canonical ensemble (N,V,T); polynomial fitting and interpolation of the obtained data allowed us to determine the pressure and gas density at any chemical potential. The adsorption equilibria of methane and carbon dioxide in UiO-66Zr were simulated and compared with the experimental data obtained by Jasmina H. Cavka et al. The results show that the best force field for both gases is a chargeless united-atom force field based on the TraPPE model. Using this validated force field it was possible to estimate the isosteric heats of adsorption and the Henry constants.
In the grand canonical Monte Carlo simulations of carbon nanotubes, we conclude that the fastest type of run is obtained with a force field that approximates the nanotube as a smooth cylinder; this approximation gives execution times that are 1.6 times faster than the typical atomistic runs.
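The grand canonical machinery can be sanity-checked on the simplest possible system. A sketch (non-interacting gas, so the energy change is zero and the activity and volume collapse into one parameter; nothing here is specific to UiO-66Zr or the nanotubes) where the sampled average particle number must match the exact grand canonical answer:

```python
import random

def gcmc_ideal_gas(activity_volume, steps=200_000, seed=11):
    """Grand canonical Monte Carlo for a non-interacting gas.
    Insertion is accepted with min(1, zV/(N+1)) and deletion with
    min(1, N/zV); exp(beta*mu)*V/Lambda^3 enters only through the
    single parameter activity_volume (zV).  The stationary
    distribution is Poisson with mean zV."""
    rng = random.Random(seed)
    n_particles, running_sum = 0, 0
    for _ in range(steps):
        if rng.random() < 0.5:  # attempt an insertion
            if rng.random() < min(1.0, activity_volume / (n_particles + 1)):
                n_particles += 1
        elif n_particles > 0:   # attempt a deletion
            if rng.random() < min(1.0, n_particles / activity_volume):
                n_particles -= 1
        running_sum += n_particles
    return running_sum / steps

avg_n = gcmc_ideal_gas(activity_volume=20.0)
print(avg_n)  # ≈ 20, the exact ideal-gas grand canonical average
```

In a real adsorption run the acceptance rules gain exp(-beta*dU) factors from the solid-fluid and fluid-fluid force fields, which is where the choice of force field studied in this work enters.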

Relevance:

100.00%

Publisher:

Abstract:

ABSTRACT: Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is in the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may heal cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during treatment of the other breast, and is at significant risk of developing a secondary tumor. New radiation-related techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformation of the target volumes in breast radiotherapy? What happens to the target volumes and surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)?
The answers to these questions largely rely on the use of Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and material composition of the organs and tissues involved. This work aims at investigating the impact of treating left breast cancer using different radiation therapy techniques, f-IMRT (forwardly planned intensity-modulated), inversely planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc (DCART) RT, and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues.
Two algorithms of the iPlan BrainLAB TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Furthermore, an accurate Monte Carlo model of the linear accelerator used (a Trilogy, VARIAN Medical Systems) was built with the EGSnrc MC code to accurately determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never before been simulated. The model developed was then included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, the OAR, and also in the pattern of dose distribution spreading into normal tissues.
IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the distributions for the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted. The MC algorithms presented similar results (within 2% of each other). The PBC algorithm was considered not accurate in determining the dose in heterogeneous media and in build-up regions; therefore, a major effort is being made at the clinic to acquire data to move from PBC to another calculation algorithm. Despite better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when using non-tangential techniques. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS. (The Portuguese resumo, a duplicate of this abstract, is omitted.)