864 results for Euler discretization


Relevance: 10.00%

Abstract:

The Climate Agreement is a milestone in the international negotiations of this century. It is a fact that global warming threatens human well-being and the world economy, and the challenge of stabilizing the concentration of greenhouse gases (GHG) in the atmosphere, limiting the temperature increase to less than 2 degrees Celsius by 2100, is a common responsibility, although actions must be differentiated according to each nation's historical contribution and capacity. This will require a paradigm shift in the current development model, above all the transition from an energy matrix based on fossil fuels towards an economy dominated by renewable, low-carbon sources. The negotiation process of the climate agreement was long. In 1992, Brazil hosted the first United Nations Conference on Environment and Development, twenty years after the first World Conference on the Human Environment (Stockholm, 1972), which for the first time drew the attention of the international community to the need for a global pact to reverse the threats to the health of the planet and of future generations. Eco-92 produced a series of treaties on environmental issues, among them the Framework Convention on Climate Change, which paved the way for the Kyoto Protocol. For the first time, a timetable was proposed under which member countries were obliged to reduce GHG emissions by at least 5% relative to 1990 levels in the period between 2008 and 2012. The Protocol gave Annex I countries the option of offsetting their emissions through the Clean Development Mechanism (CDM), counting projects implemented in developing countries as reduction measures. It was only ratified in 2005, with the entry of Russia, but still without the participation of the United States and China, the planet's largest sources of emissions. Brazil played a leading role in the negotiations of the Climate Convention, especially from 2009 onward, when it presented to the UNFCCC its National Policy on Climate Change (PNMC, Law No. 12,187/2009) and later the National Plan on Climate Change (Decree 7,390/2010). These regulatory frameworks defined the Brazilian strategy of voluntary GHG emission reductions (36.1% to 38.9% relative to the emissions projected for 2020) and the sectoral action plans to reach those targets. Despite all the social and economic challenges, the results achieved by Brazil during the Kyoto Protocol period represent one of the greatest efforts by a single country to date: in 2012 its emissions were more than 41% below 2005 levels. The Amazon region played a decisive role, with an 85% reduction in deforestation, while all other sectors of the economy increased their emissions. In the Paris Agreement, Brazil signals an even more ambitious commitment to reduce absolute emissions and to end illegal deforestation by 2030 (iNDC, 2015). This article looks back at the construction of the proposal for a mechanism for reducing emissions from deforestation and degradation (REDD) in Brazil and at the United Nations Climate Change Conference, and discusses the role of tropical forests in the fight against global warming, from the standpoint of the relevance of the Amazon region to the achievement of the Brazilian targets and the context of the discussion and implementation of REDD+ in the states.
Finally, it reflects on the future challenges of the recently launched National REDD+ Strategy (ENREDD+), in view of the historically low return received by Amazonian populations when we consider their legacy in the conservation of this immense heritage of humanity, and on the vision outlined by Brazil in its Intended Nationally Determined Contribution (iNDC), as part of the new Climate Agreement, in which the role of forests becomes secondary to agribusiness.

Relevance: 10.00%

Abstract:

This study addresses face-to-face communication in organizations from different theoretical approaches. It considers the perspective of the simultaneity of media, since companies use several channels to dialogue with their stakeholders. It also takes into account the phenomenon of mediatization, which restructures the way people relate to each other in contemporary society. The general objective of the research is to systematize the roles potentially played by face-to-face interaction and to understand some of the circumstances surrounding its practice in organizations. As this is a theoretical thesis, bibliographic research is one of the main methodological procedures; analyses of empirical cases and a case study carried out at Embrapa Pantanal serve as illustrative situations. The study concludes that face-to-face communication in companies occurs simultaneously and in combination with other communication channels, yet it yields practical and philosophical results that are still little explored. The strategic use of in-person contact as a mechanism to build relationships, perceive the reactions of others and adjust communication, align corporate discourse with business practice and assess the context in which interactions unfold remains rare, and it can be decisive for organizational communication.

Relevance: 10.00%

Abstract:

Additive Manufacturing (AM) is nowadays considered an important alternative to traditional manufacturing processes. The literature reports several advantages of AM, such as design flexibility, and its use is growing in automotive, aerospace and biomedical applications. As a systematic literature review suggests, AM is sometimes coupled with voxelization, mainly for representation and simulation purposes. Voxelization can be defined as a volumetric representation technique based on discretizing the model with hexahedral elements, much as pixels discretize a 2D image. Voxels are used to simplify geometric representation, store intricate interior details and speed up geometric and algebraic manipulation. Compared to the boundary representation used in common CAD software, the inherent advantages of voxels are magnified in specific applications such as lattice or topologically optimized structures for visualization or simulation purposes. Such structures can only be manufactured with AM because of their complex topology. After a thorough review of the existing literature, this project aims to exploit the potential of voxelization algorithms to develop optimized Design for Additive Manufacturing (DfAM) tools. The final aim is to manipulate and support mechanical simulations of lightweight, optimized structures ready to be manufactured with AM, with particular attention to automotive applications. A voxel-based methodology is developed for efficient structural simulation of lattice structures. Moreover, thanks to an optimized smoothing algorithm specific to voxel-based geometries, a topologically optimized and voxelized structure can be transformed into a surface triangulated mesh file ready for the AM process. In addition, a modified panel code is developed for simple CFD simulations that use voxels as the discretization unit, providing a preliminary evaluation of the fluid-dynamic performance of industrial components. The developed design tools and methodologies fit the automotive industry's need to accelerate and increase the efficiency of the design workflow from the conceptual idea to the final product.
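To make the voxelization idea concrete, the following minimal Python sketch samples an implicit solid on a regular hexahedral grid and stores the result as a boolean occupancy array. The sphere geometry, grid bounds and resolution are arbitrary choices for illustration only, not the thesis's actual DfAM pipeline.

    # Illustrative sketch (not the thesis pipeline): voxelize an implicit solid on a
    # regular grid by sampling its signed distance function at voxel centres.
    import numpy as np

    def voxelize_sphere(center, radius, bounds, n_voxels):
        """Return a boolean occupancy grid: True where the voxel centre lies inside the sphere."""
        (xmin, xmax), (ymin, ymax), (zmin, zmax) = bounds
        # Voxel centres along each axis
        xs = np.linspace(xmin, xmax, n_voxels, endpoint=False) + (xmax - xmin) / (2 * n_voxels)
        ys = np.linspace(ymin, ymax, n_voxels, endpoint=False) + (ymax - ymin) / (2 * n_voxels)
        zs = np.linspace(zmin, zmax, n_voxels, endpoint=False) + (zmax - zmin) / (2 * n_voxels)
        X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
        # Signed distance of each voxel centre to the sphere surface
        sdf = np.sqrt((X - center[0])**2 + (Y - center[1])**2 + (Z - center[2])**2) - radius
        return sdf <= 0.0

    occupancy = voxelize_sphere(center=(0, 0, 0), radius=0.4,
                                bounds=((-0.5, 0.5), (-0.5, 0.5), (-0.5, 0.5)), n_voxels=64)
    print(f"filled voxels: {occupancy.sum()} of {occupancy.size}")

The same occupancy array can then feed downstream steps such as structural simulation or surface extraction, which is the kind of workflow the abstract describes.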

Relevance: 10.00%

Abstract:

Effective field theories (EFTs) are ubiquitous in theoretical physics and in particular in field theory descriptions of quantum systems probed at energies much lower than one or few characterizing scales. More recently, EFTs have gained a prominent role in the study of fundamental interactions and in particular in the parametriasation of new physics beyond the Standard Model, which would occur at scales Λ, much larger than the electroweak scale. In this thesis, EFTs are employed to study three different physics cases. First, we consider light-by-light scattering as a possible probe of new physics. At low energies it can be described by dimension-8 operators, leading to the well-known Euler-Heisenberg Lagrangian. We consider the explicit dependence of matching coefficients on type of particle running in the loop, confirming the sensitiveness to the spin, mass, and interactions of possibly new particles. Second, we consider EFTs to describe Dark Matter (DM) interactions with SM particles. We consider a phenomenologically motivated case, i.e., a new fermion state that couples to the Hypercharge through a form factor and has no interactions with photons and the Z boson. Results from direct, indirect and collider searches for DM are used to constrain the parameter space of the model. Third, we consider EFTs that describe axion-like particles (ALPs), whose phenomenology is inspired by the Peccei-Quinn solution to strong CP problem. ALPs generically couple to ordinary matter through dimension-5 operators. In our case study, we investigate the rather unique phenomenological implications of ALPs with enhanced couplings to the top quark.
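For reference, the leading terms of the Euler-Heisenberg Lagrangian mentioned above can be written, in the standard QED normalization with electron mass m_e and fine-structure constant α, as

    \mathcal{L}_{\mathrm{EH}} = -\frac{1}{4} F_{\mu\nu} F^{\mu\nu}
      + \frac{\alpha^{2}}{90\, m_{e}^{4}} \left[ \left( F_{\mu\nu} F^{\mu\nu} \right)^{2}
      + \frac{7}{4} \left( F_{\mu\nu} \tilde{F}^{\mu\nu} \right)^{2} \right]
      + \mathcal{O}\!\left( F^{6} / m_{e}^{8} \right),

where the coefficients of the two dimension-8 operators are shown with their QED values generated by the electron loop; it is precisely the dependence of such matching coefficients on the spin, mass and interactions of heavier states that the thesis exploits as a probe of new physics.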

Relevance: 10.00%

Abstract:

In Chapter 2 we investigate linear matrix-valued SDEs and the Itô-stochastic Magnus expansion, which provides an efficient numerical scheme for solving matrix-valued SDEs. We show convergence of the expansion up to a stopping time τ and provide an asymptotic estimate of the cumulative distribution function of τ. Moreover, we show how to apply it, combined with the method of lines, to solve SPDEs with one and two spatial dimensions with high accuracy. We will see that the Magnus expansion lends itself to GPU techniques, leading to major performance improvements compared to a standard Euler-Maruyama scheme. In Chapter 3, we study a short-rate model in a Cox-Ingersoll-Ross (CIR) framework for negative interest rates. We define the short rate as the difference of two independent CIR processes and add a deterministic shift to guarantee a perfect fit to the market term structure. We show how to use the Gram-Charlier expansion to efficiently calibrate the model to the market swaption surface and to price Bermudan swaptions with good accuracy. We then take two different perspectives on rating-transition modelling. In Section 4.4, we study inhomogeneous continuous-time Markov chains (ICTMC) as a candidate for a rating model with deterministic rating transitions. We extend this model by taking a Lie-group perspective in Section 4.5 to allow for stochastic rating transitions. In both cases, we compare the most popular choices of change-of-measure technique and show how to efficiently calibrate both models to the available historical rating data and market default probabilities. Finally, we apply the techniques developed in this thesis to minimize the collateral-inclusive Credit/Debit Valuation Adjustments under the constraint of small collateral postings, using a collateral account dependent on rating triggers.
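As a point of reference for the baseline scheme mentioned above, the following Python sketch simulates a two-factor shifted-CIR short rate, r(t) = x(t) - y(t) + φ(t), with a full-truncation Euler-Maruyama discretization. All parameters and the constant shift are invented for illustration; this is the standard benchmark scheme, not the Magnus-expansion method developed in the thesis.

    # Illustrative sketch only: the Euler-Maruyama baseline for a two-factor shifted CIR
    # short rate r(t) = x(t) - y(t) + phi(t). Parameters and shift are assumptions.
    import numpy as np

    def euler_maruyama_cir(x0, kappa, theta, sigma, dt, n_steps, n_paths, rng):
        """Full-truncation Euler-Maruyama for dx = kappa*(theta - x) dt + sigma*sqrt(x) dW."""
        x = np.full(n_paths, x0, dtype=float)
        for _ in range(n_steps):
            x_pos = np.maximum(x, 0.0)          # truncate to keep the square root real
            dw = rng.standard_normal(n_paths) * np.sqrt(dt)
            x = x + kappa * (theta - x_pos) * dt + sigma * np.sqrt(x_pos) * dw
        return x

    rng = np.random.default_rng(0)
    dt, n_steps, n_paths = 1 / 252, 252, 10_000
    x_T = euler_maruyama_cir(0.02, 1.0, 0.03, 0.10, dt, n_steps, n_paths, rng)  # first CIR factor
    y_T = euler_maruyama_cir(0.01, 0.8, 0.02, 0.08, dt, n_steps, n_paths, rng)  # second, independent factor
    phi_T = 0.005    # deterministic shift fitting the term structure (assumed constant here)
    r_T = x_T - y_T + phi_T
    print(f"mean short rate at T = 1y: {r_T.mean():.4%}")

Because the two factors enter with opposite signs, the simulated short rate can become negative, which is the feature the CIR-minus-CIR construction is designed to capture.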

Relevance: 10.00%

Abstract:

The present manuscript focuses on Lattice Gauge Theories based on finite groups. With a view to Quantum Simulation, the Hamiltonian approach is considered, while the finite group serves as a discretization scheme for the degrees of freedom of the gauge fields. Several aspects of these models are studied. First, we investigate dualities in Abelian models with a restricted geometry, using a systematic approach. This leads to a rich phase diagram that depends on the super-selection sectors. Second, we construct a family of lattice Hamiltonians for gauge theories with a finite group, either Abelian or non-Abelian. We show that it is possible to express the electric term as a natural graph Laplacian, and that the physical Hilbert space can be explicitly built using spin-network states. In both cases we perform numerical simulations in order to establish the correctness of the theoretical results and to investigate the models further.
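As one concrete reading of the "electric term as a graph Laplacian" statement, the Python sketch below builds the Laplacian of the Cayley graph of a cyclic group for a single link and diagonalizes it. The choice of group (Z_6) and generating set is an assumption made purely for illustration, not the general construction of the manuscript.

    # Illustrative sketch: a single-link electric Hamiltonian as the graph Laplacian of
    # the Cayley graph of Z_n (group and generators are assumptions for illustration).
    import numpy as np

    def cayley_graph_laplacian(n, generators):
        """Graph Laplacian L = D - A of the Cayley graph of Z_n with the given generating set."""
        adjacency = np.zeros((n, n))
        for g in range(n):
            for s in generators:
                adjacency[g, (g + s) % n] = 1.0
                adjacency[g, (g - s) % n] = 1.0   # include inverses so the graph is undirected
        degree = np.diag(adjacency.sum(axis=1))
        return degree - adjacency

    # Electric term for one link of a Z_6 gauge theory, generated by {1}
    H_E = cayley_graph_laplacian(6, generators=[1])
    eigenvalues = np.linalg.eigvalsh(H_E)
    print(np.round(eigenvalues, 3))   # spectrum 2 - 2*cos(2*pi*k/6), k = 0..5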

Relevance: 10.00%

Abstract:

Background: There is wide variation in the recurrence risk of non-small-cell lung cancer (NSCLC) within the same Tumor Node Metastasis (TNM) stage, suggesting that other parameters are involved in determining this probability. Radiomics allows the extraction of quantitative information from images that can be used for clinical purposes. The primary objective of this study is to develop a radiomic prognostic model that predicts 3-year disease-free survival (DFS) of resected early-stage (ES) NSCLC patients.

Materials and Methods: 56 pre-surgery non-contrast Computed Tomography (CT) scans were retrieved from the PACS of our institution and anonymized. They were then automatically segmented with an open-access deep-learning pipeline and reviewed by an experienced radiologist to obtain 3D masks of the NSCLC. Images and masks underwent resampling, normalization and discretization. Hundreds of Radiomic Features (RF) were extracted from the masks using PyRadiomics. The RF were then reduced to select the most representative features. The remaining RF were used, in combination with clinical parameters, to build a DFS prediction model using leave-one-out cross-validation (LOOCV) with a Random Forest.

Results and Conclusion: Poor agreement between the radiologist and the automatic segmentation algorithm (Dice score of 0.37) was found. Therefore, another experienced radiologist manually segmented the lesions, and only stable and reproducible RF were kept. 50 RF showed a high correlation with DFS, but only one was confirmed when clinicopathological covariates were added: Busyness, a Neighbouring Gray Tone Difference Matrix feature (HR 9.610). 16 clinical variables (which comprised TNM) were used to build the LOOCV model, which showed a higher Area Under the Curve (AUC) when RF were included in the analysis (0.67 vs 0.60), but the difference was not statistically significant (p = 0.5147).
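A minimal Python sketch of the LOOCV-with-Random-Forest step described above follows. The feature matrix (radiomic plus clinical columns) and the binary 3-year DFS labels are random placeholders, and the hyperparameters are assumptions rather than the study's actual settings.

    # Illustrative sketch of leave-one-out cross-validation with a random forest.
    # X and y are placeholders; they are not the study's data.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneOut
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(42)
    X = rng.normal(size=(56, 17))        # 56 patients, 16 clinical variables + 1 radiomic feature
    y = rng.integers(0, 2, size=56)      # placeholder 3-year DFS status (1 = event-free)

    loo = LeaveOneOut()
    predicted_prob = np.empty(len(y))
    for train_idx, test_idx in loo.split(X):
        model = RandomForestClassifier(n_estimators=500, random_state=0)
        model.fit(X[train_idx], y[train_idx])
        predicted_prob[test_idx] = model.predict_proba(X[test_idx])[:, 1]

    print(f"LOOCV AUC: {roc_auc_score(y, predicted_prob):.2f}")

With only 56 patients, LOOCV is a natural choice because every patient is used for training in all but one fold, which is why the abstract's AUC comparison is reported on out-of-sample predictions rather than on a held-out test set.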

Relevance: 10.00%

Abstract:

In this thesis project, I present stationary models of rotating fluids with toroidal distributions that can be used to represent the central obscurers of active galactic nuclei (AGN), i.e. molecular tori (Combes et al., 2019), as well as geometrically thick accretion discs, such as ADAF discs (Narayan and Yi, 1995) or Polish doughnuts (Abramowicz, 2005). In particular, I study stationary rotating systems with a more general baroclinic distribution (with a vertical gradient of the angular velocity), which are often more realistic, but less studied because of their complexity, than barotropic ones (with cylindrical rotation), which are easier to construct. In the thesis, I compute analytically the main intrinsic and projected properties of the power-law tori based on the potential-density pairs of Ciotti and Bertin (2005). I study the density distribution and the resulting gravitational potential for different values of α in the range 2 < α < 5. For the same models, I compute the surface density of the systems when seen face-on and edge-on. I then apply the stationary Euler equations to obtain the rotational velocity and temperature distributions of the self-gravitating models in the absence of an external gravitational potential. I also consider the power-law tori in the presence of a central black hole in addition to the gas self-gravity and, solving the stationary Euler equations analytically, I compute how the properties of the system are modified by the black hole and how they vary as a function of the black hole mass. Finally, applying the Solberg-Høiland criterion, I show that these baroclinic stationary models are linearly stable in the absence of the black hole. In the presence of the black hole, I derive the analytical condition for stability, which depends on α and on the black hole mass. I also study the stability of the tori under the hypothesis that they are weakly magnetized, finding that in this case they are always unstable.
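For context, the stationary Euler equations applied above take, for an axisymmetric rotating fluid in cylindrical coordinates (R, z), the standard form

    \frac{1}{\rho}\frac{\partial p}{\partial R} = -\frac{\partial \Phi}{\partial R} + \frac{v_{\varphi}^{2}}{R},
    \qquad
    \frac{1}{\rho}\frac{\partial p}{\partial z} = -\frac{\partial \Phi}{\partial z},

where Φ is the total gravitational potential (the gas self-gravity plus, when present, the contribution of the central black hole). The notation here is the standard one and does not necessarily match the thesis; in the baroclinic case the rotational velocity v_φ obtained from these equations depends on both R and z.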