968 results for Non linear processes
Abstract:
The purpose of this thesis is to clarify the role of non-equilibrium stationary currents of Markov processes in the context of the predictability of future states of the system. Once the connection between predictability and conditional entropy is established, we provide a comprehensive approach to the definition of a multi-particle Markov system. In particular, starting from the well-known theory of random walks on networks, we derive the non-linear master equation for an interacting multi-particle system under the one-step process hypothesis, highlighting the limits of its tractability and the properties of its stationary solution. Lastly, in order to study the impact of the NESS on the predictability at short times, we analyze the conditional entropy by modulating the intensity of the stationary currents, both for a single-particle and a multi-particle Markov system. The results obtained analytically are tested numerically on a 5-node cycle network and put in correspondence with the stationary entropy production. Furthermore, because of the low dimensionality of the single-particle system, an analysis of its spectral properties as a function of the modulated stationary currents is performed.
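A minimal numerical sketch of the predictability measure mentioned above, assuming a single particle on a 5-node cycle whose hopping asymmetry (a hypothetical `bias` parameter, not taken from the thesis) controls the stationary current; the conditional entropy H(X_{t+1}|X_t) is computed from the one-step transition matrix and its stationary distribution.

```python
import numpy as np

def cycle_transition_matrix(n=5, bias=0.5):
    """One-step random walk on an n-node cycle.

    `bias` is the probability of hopping clockwise (1 - bias counter-clockwise);
    any bias != 0.5 sustains a non-zero stationary current around the cycle.
    """
    P = np.zeros((n, n))
    for i in range(n):
        P[i, (i + 1) % n] = bias
        P[i, (i - 1) % n] = 1.0 - bias
    return P

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalised to a probability vector."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def conditional_entropy(P):
    """H(X_{t+1} | X_t) = -sum_i pi_i sum_j P_ij log P_ij (in nats)."""
    pi = stationary_distribution(P)
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log(P[mask])
    return -np.sum(pi[:, None] * P * logP)

for bias in (0.5, 0.7, 0.9):
    # in this toy model, stronger bias -> larger current -> lower conditional entropy
    print(bias, conditional_entropy(cycle_transition_matrix(bias=bias)))
```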
Abstract:
Oscillatory and resonant phenomena are explored in several experimental physics courses. In general, the experiments are interpreted in the limit of small oscillations and uniform fields. In this article we describe a low-cost experiment for studying the resonance of a compass needle in a magnetic field beyond those limits. In this case, non-linear terms in the differential equation are responsible for phenomena that are interesting to explore in teaching laboratories.
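A sketch of the large-angle behaviour alluded to above, assuming the needle is modelled as a damped, driven pendulum-like equation θ'' + γθ' + ω0² sin θ = f cos(ωt), where sin θ is the non-linear restoring torque; the parameter values are illustrative, not the article's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters: natural frequency, damping, drive amplitude and frequency.
omega0, gamma_, f, omega = 2.0 * np.pi, 0.3, 8.0, 2.0 * np.pi * 0.9

def needle(t, y):
    """y = (theta, dtheta/dt); sin(theta) keeps the full non-linear restoring torque."""
    theta, dtheta = y
    return [dtheta, -gamma_ * dtheta - omega0**2 * np.sin(theta) + f * np.cos(omega * t)]

sol = solve_ivp(needle, (0.0, 60.0), y0=[0.1, 0.0], max_step=0.01)
print("max |theta| (rad):", np.max(np.abs(sol.y[0])))  # large-angle response near resonance
```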
Abstract:
There is an increasing need to treat effluents contaminated with phenol with advanced oxidation processes (AOPs) to minimize their impact on the environment as well as on bacteriological populations of other wastewater treatment systems. One of the most promising AOPs is the Fenton process, which relies on the Fenton reaction. Nevertheless, there are no systematic studies on Fenton reactor networks. The objective of this paper is to develop a strategy for the optimal synthesis of Fenton reactor networks. The strategy is based on a superstructure optimization approach that is represented as a mixed integer non-linear programming (MINLP) model. Network superstructures with multiple Fenton reactors are optimized with the objective of minimizing the sum of capital, operation and depreciation costs of the effluent treatment system. The optimal solutions obtained provide the reactor volumes and network configuration, as well as the quantities of the reactants used in the Fenton process. Examples based on a case study show that multi-reactor networks yield decreases of up to 45% in overall costs for the treatment plant. (C) 2010 The Institution of Chemical Engineers. Published by Elsevier B.V. All rights reserved.
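A crude stand-in for the superstructure idea described above, not the paper's MINLP model: the discrete decision (how many reactors to install in series) is enumerated and, for each choice, the continuous volumes are optimized with SLSQP. All kinetic and cost figures below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: flow rate, first-order phenol decay constant, target, costs.
Q, k_r = 10.0, 0.8            # m3/h, 1/h
C_in, C_out_max = 100.0, 5.0  # mg/L inlet and required outlet concentration
FIXED, VAR = 50.0, 12.0       # fixed cost per installed reactor, cost ~ VAR * V^0.6
MAX_UNITS = 3

def outlet(volumes):
    """Outlet concentration of a series of CSTRs with first-order kinetics."""
    C = C_in
    for V in volumes:
        C /= 1.0 + k_r * V / Q
    return C

def cost(volumes, n_units):
    return n_units * FIXED + VAR * np.sum(np.asarray(volumes) ** 0.6)

best = None
for n_units in range(1, MAX_UNITS + 1):          # discrete part: number of reactors
    res = minimize(                               # continuous part: their volumes
        lambda V: cost(V, n_units),
        x0=np.full(n_units, 5.0),
        bounds=[(1e-3, 100.0)] * n_units,
        constraints=[{"type": "ineq", "fun": lambda V: C_out_max - outlet(V)}],
        method="SLSQP",
    )
    if res.success and (best is None or res.fun < best[0]):
        best = (res.fun, n_units, res.x)

print("cheapest configuration (cost, units, volumes):", best)
```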
Abstract:
Quantum computers promise to increase greatly the efficiency of solving problems such as factoring large integers, combinatorial optimization and quantum physics simulation. One of the greatest challenges now is to implement the basic quantum-computational elements in a physical system and to demonstrate that they can be reliably and scalably controlled. One of the earliest proposals for quantum computation is based on implementing a quantum bit with two optical modes containing one photon. The proposal is appealing because of the ease with which photon interference can be observed. Until now, it suffered from the requirement for non-linear couplings between optical modes containing few photons. Here we show that efficient quantum computation is possible using only beam splitters, phase shifters, single photon sources and photo-detectors. Our methods exploit feedback from photo-detectors and are robust against errors from photon loss and detector inefficiency. The basic elements are accessible to experimental investigation with current technology.
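A small numerical illustration of the linear-optics toolbox mentioned above, assuming the standard dual-rail encoding (one photon shared between two modes): beam splitters and phase shifters act as 2x2 unitaries on the single-photon amplitudes. The parameterisation is a common textbook convention, not taken from the paper.

```python
import numpy as np

# Dual-rail qubit: |0> = photon in mode a, |1> = photon in mode b.
def beam_splitter(theta):
    """Lossless beam splitter mixing the two modes (theta = pi/4 gives 50/50)."""
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]])

def phase_shifter(phi):
    """Phase shift applied to mode b only."""
    return np.diag([1.0, np.exp(1j * phi)])

psi = np.array([1.0, 0.0])                                  # photon in mode a
psi = phase_shifter(np.pi / 2) @ beam_splitter(np.pi / 4) @ psi
print(np.abs(psi) ** 2)                                     # 50/50 detection probabilities
```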
Abstract:
Previous magnetic resonance imaging (MRI) studies described consistent age-related gray matter (GM) reductions in the fronto-parietal neocortex, insula and cerebellum in elderly subjects, but not as frequently in limbic/paralimbic structures. However, it is unclear whether such features are already present during earlier stages of adulthood, and whether age-related GM changes may follow non-linear patterns in that age range. This voxel-based morphometry study investigated the relationship between GM volumes and age specifically during non-elderly life (18-50 years) in 89 healthy individuals (48 males and 41 females). Voxelwise analyses showed significant (p < 0.05, corrected) negative correlations in the right prefrontal cortex and left cerebellum, and positive correlations (indicating lack of GM loss) in the medial temporal region, cingulate gyrus, insula and temporal neocortex. Analyses using ROI masks showed that age-related dorsolateral prefrontal volume decrements followed non-linear patterns, and were less prominent in females compared to males at this age range. These findings further support the notion of a heterogeneous and asynchronous pattern of age-related brain morphometric changes, with region-specific non-linear features. (C) 2009 Elsevier Inc. All rights reserved.
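A schematic of the kind of non-linear age model referred to above, assuming synthetic GM-volume data and a simple quadratic fit; the study itself used voxel-based morphometry and ROI analyses, so the data and model below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
age = rng.uniform(18, 50, 89)                        # 89 hypothetical subjects
gm = 0.55 - 0.004 * (age - 18) + 0.00005 * (age - 18) ** 2 + rng.normal(0, 0.01, 89)

lin = np.polyfit(age, gm, 1)       # linear age model
quad = np.polyfit(age, gm, 2)      # quadratic (non-linear in age) model

def rss(coefs):
    return np.sum((gm - np.polyval(coefs, age)) ** 2)

print("RSS linear:", rss(lin), " RSS quadratic:", rss(quad))  # quadratic should fit better
```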
Abstract:
Quantum feedback can stabilize a two-level atom against decoherence (spontaneous emission), putting it into an arbitrary (specified) pure state. This requires perfect homodyne detection of the atomic emission, and instantaneous feedback. Inefficient detection was considered previously by two of us. Here we allow for a non-zero delay time τ in the feedback circuit. Because a two-level atom is a non-linear optical system, an analytical solution is not possible. However, quantum trajectories allow a simple numerical simulation of the resulting non-Markovian process. We find the effect of the time delay to be qualitatively similar to that of inefficient detection. The solution of the non-Markovian quantum trajectory will not remain fixed, so that the time-averaged state will be mixed, not pure. In the case where one tries to stabilize the atom in the excited state, an approximate analytical solution to the quantum trajectory is possible. The result, that the purity (P = 2Tr[ρ²] − 1) of the average state is given by P = 1 − 4γτ (where γ is the spontaneous emission rate), is found to agree very well with the numerical results. (C) 2001 Elsevier Science B.V. All rights reserved.
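A small numeric check of the purity formula quoted above, assuming (as an illustrative reading, not a derivation from the paper) that the time-averaged state is diagonal with a ground-state population of order γτ; then P = 2Tr[ρ²] − 1 reproduces 1 − 4γτ to leading order.

```python
import numpy as np

def purity(rho):
    """P = 2 Tr[rho^2] - 1 for a qubit density matrix."""
    return 2.0 * np.trace(rho @ rho).real - 1.0

gamma, tau = 1.0, 0.02               # hypothetical rate and feedback delay (gamma*tau << 1)
eps = gamma * tau                    # illustrative average ground-state population
rho_avg = np.diag([1.0 - eps, eps])  # time-averaged state, diagonal in the energy basis

print(purity(rho_avg))               # 2[(1-eps)^2 + eps^2] - 1 = 1 - 4*eps + 4*eps^2
print(1.0 - 4.0 * gamma * tau)       # leading-order formula quoted in the abstract
```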
Abstract:
Multiphase flow models are widely used in several areas of environmental research, such as fluidized beds, gas dispersion in liquids and various other processes that involve more than one physical-chemical property of the medium. Accordingly, a multiphase model was developed and adapted for the study of bed sediment transport driven by gravity waves. In this work, a multiphase coupling was built between a non-linear Eulerian Boussinesq-type wave model, based on the numerical formulation found in Wei et al. (1995), and a Lagrangian particle model founded on the Newtonian principle of motion with a hard-sphere collision scheme. The wave model was tested with respect to its generating source, represented by a Gaussian function, a piston-type paddle and a flap-type paddle, and with respect to its interaction with depth, through non-linearity and dispersive properties. In the tests of the generating source, it was observed that the Gaussian source, following Wei et al. (1999), showed better consistency and stability in wave generation when compared to linear theory for a kh . The non-linearity of the wave model, of 2nd order for dispersion, gave satisfactory results when compared with the experiment of waves over a trapezoidal obstacle, where the deformation of the wave over the submerged structure is in agreement with the experimental data found in the literature. From there, the granular model was also tested in two experiments. The first simulates a dam break in a tank containing water and, in the second, the dam break is simulated with a rigid obstacle added at the centre of the tank. In these experiments, the collision algorithm was effective in handling particle-particle and particle-wall interactions, revealing physical processes that are difficult to simulate with regular-mesh models. For the coupling of the wave and sediment models, the algorithm was tested against literature data on bed morphology. The results were compared with analytical data and with other numerical models, and proved satisfactory with respect to the erosion and deposition points and to the change in shape of the sand bar.
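A minimal sketch of the hard-sphere collision step used in Lagrangian particle models of the kind described above, assuming equal-mass spheres; the variable names and the restitution parameter are ours, not taken from the thesis.

```python
import numpy as np

def hard_sphere_collision(x1, v1, x2, v2, restitution=1.0):
    """Resolve a binary hard-sphere collision.

    Exchanges the velocity components along the line of centres (equal masses);
    tangential components are unchanged. restitution=1 means perfectly elastic.
    """
    n = (x2 - x1) / np.linalg.norm(x2 - x1)     # unit vector between centres
    v_rel = np.dot(v1 - v2, n)                  # approach speed along n
    if v_rel <= 0.0:                            # particles separating: no collision
        return v1, v2
    j = 0.5 * (1.0 + restitution) * v_rel       # impulse magnitude per unit mass
    return v1 - j * n, v2 + j * n

v1_new, v2_new = hard_sphere_collision(
    np.array([0.0, 0.0]), np.array([1.0, 0.0]),
    np.array([1.0, 0.0]), np.array([0.0, 0.0]))
print(v1_new, v2_new)   # head-on elastic hit: velocities are swapped
```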
Abstract:
"Bruno Aleixo" is a viral animation character, created by the Portuguese collective GANA, that surfaced online in 2008. Their animation works have meanwhile crossed onto the most diverse media, and have been branching out in multiple webs of narratives, constantly referring to each other, as well as constantly quoting disparate references such as film classics, chatrooms and TV ads for detergents. This paper attempts a triple analysis of this object of study: the ways in which technology has been fostering non-linear narratives while widening the available aesthetic spectrum, the ways in which processes of cultural consumerism are being reinvented in light of the web 2.0, and the use of "pseudo-nonsense" as a process of oblique cultural psychoanalysis. We will further attempt to demonstrate how new media and web networks have been contributing to a fragmentation of audiences, as well as a blurring between dominant cultures and sub-cultural phenomena; and we will end by positing that the structural principles behind the "Bruno Aleixo" series can be applied in social and cultural contexts situated at the opposite end of the spectrum of traditional expectations regarding Animation.
Abstract:
The optimal power flow problem has been widely studied in order to improve power systems operation and planning. For real power systems, the problem is formulated as a non-linear, large-scale combinatorial problem. The first approaches used to solve this problem were based on mathematical methods which required huge computational efforts. Lately, artificial intelligence techniques, such as metaheuristics based on biological processes, were adopted. Metaheuristics require lower computational resources, which is a clear advantage for addressing the problem in large power systems. This paper proposes a methodology to solve the optimal power flow problem in an economic dispatch context using a Simulated Annealing algorithm inspired by the cooling process used in metallurgy. The main contribution of the proposed method is the specific neighborhood generation according to the optimal power flow problem characteristics. The proposed methodology has been tested with the IEEE 6-bus and 30-bus networks. The obtained results are compared with other well-known methodologies presented in the literature, showing the effectiveness of the proposed method.
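A compact sketch of the simulated annealing idea for a dispatch-style problem, assuming a toy three-generator system with quadratic cost curves (all figures hypothetical); the neighbourhood move transfers power between two units so the demand balance is preserved, our illustrative analogue of the problem-specific neighbourhood mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-unit system: quadratic cost a*P^2 + b*P + c, limits, and total demand.
a, b, c = np.array([0.01, 0.02, 0.015]), np.array([2.0, 1.5, 1.8]), np.array([10., 12., 9.])
p_min, p_max, demand = np.array([10., 10., 10.]), np.array([100., 80., 90.]), 150.0

def cost(P):
    return np.sum(a * P**2 + b * P + c)

def neighbour(P, step=5.0):
    """Move power from one unit to another: the balance sum(P) = demand is kept."""
    i, j = rng.choice(len(P), size=2, replace=False)
    d = rng.uniform(0.0, step)
    Q = P.copy()
    Q[i] -= d
    Q[j] += d
    return Q if np.all((Q >= p_min) & (Q <= p_max)) else P

P = np.array([60.0, 50.0, 40.0])          # feasible start (sums to demand)
T = 100.0
for _ in range(5000):                      # geometric cooling schedule
    Q = neighbour(P)
    if cost(Q) < cost(P) or rng.random() < np.exp(-(cost(Q) - cost(P)) / T):
        P = Q
    T *= 0.999

print("dispatch:", np.round(P, 1), "cost:", round(cost(P), 2))
```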
Abstract:
Final Master's project submitted for the degree of Master in Civil Engineering
Public sector size and economic growth: a non-linear relationship in the European Union of 15?
Abstract:
The Member States of the European Union have been concerned with reducing the size of public administration in the economy, while making it much more efficient so as to promote economic growth. This article analyses the relationship between public expenditure and economic growth in 14 Member States of the European Union of 15, with the aim of determining the optimal size of general government, taking the Armey Curve as the theoretical basis. The results, for the period 1965-2007, suggest a growth-maximising public sector size of 47.37% and 22.17% of GDP, when measured by total public expenditure and public consumption, respectively.
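A schematic of how a growth-maximising government size can be read off an Armey-type quadratic relation, assuming synthetic data; the coefficients below are illustrative and are not the article's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sample: growth = a + b*g + c*g^2 + noise, with g = public spending (% of GDP).
g = rng.uniform(20, 60, 200)
growth = 1.0 + 0.20 * g - 0.0022 * g**2 + rng.normal(0, 0.3, 200)

c2, c1, c0 = np.polyfit(g, growth, 2)        # fit the inverted-U (Armey) curve
g_star = -c1 / (2.0 * c2)                    # vertex of the parabola = optimal size
print(f"estimated growth-maximising size: {g_star:.1f}% of GDP")
```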
Abstract:
Dissertation submitted for the degree of Master in Electrical and Computer Engineering
Abstract:
For any vacuum initial data set, we define a local, non-negative scalar quantity which vanishes at every point of the data hypersurface if and only if the data are Kerr initial data. Our scalar quantity only depends on the quantities used to construct the vacuum initial data set which are the Riemannian metric defined on the initial data hypersurface and a symmetric tensor which plays the role of the second fundamental form of the embedded initial data hypersurface. The dependency is algorithmic in the sense that given the initial data one can compute the scalar quantity by algebraic and differential manipulations, being thus suitable for an implementation in a numerical code. The scalar could also be useful in studies of the non-linear stability of the Kerr solution because it serves to measure the deviation of a vacuum initial data set from the Kerr initial data in a local and algorithmic way.
Abstract:
Master's dissertation in Advanced Optometry
Abstract:
The relationship between body size and geographic range was analyzed for 70 species of terrestrial Carnivora ("fissipeds") of the New World, after controlling for phylogenetic patterns in the data using phylogenetic eigenvector regression. The analysis with the EcoSim software showed that the variables are related as a triangular envelope. Phylogenetic patterns in the data were detected by means of phylogenetic correlograms, and 200 simulations of phenotypic evolution were also performed over the phylogeny. For body size, the simulations suggested a non-linear relationship for the evolution of this character along the phylogeny. For geographic range size, the correlogram showed no phylogenetic patterns. A phylogenetic eigenvector regression was performed on the original data and on data simulated under an Ornstein-Uhlenbeck process. Since both characters did not evolve under a simple Brownian motion process, the Type I errors should be around 10%, compatible with other methods used to analyze correlated evolution. The significant correlation in the original data (r = 0.38; P < 0.05), as well as the triangular envelope, then indicate ecological and adaptive processes connecting the two variables, such as those proposed in minimum viable population models.
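A bare-bones sketch of the phylogenetic eigenvector regression idea used above, assuming a small hypothetical phylogenetic distance matrix: the distance matrix is double-centred, its leading eigenvectors serve as regressors, and the trait residuals (the component not explained by phylogeny) are what would be correlated afterwards. The number of species, eigenvectors retained, and trait values are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20

# Hypothetical pairwise phylogenetic distances (symmetric, zero diagonal).
D = rng.uniform(1.0, 10.0, (n, n))
D = (D + D.T) / 2.0
np.fill_diagonal(D, 0.0)

# Principal coordinate analysis: double-centre -0.5*D^2 and eigen-decompose.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D**2) @ J
w, V = np.linalg.eigh(B)
vectors = V[:, np.argsort(w)[::-1][:3]]          # keep the 3 leading eigenvectors

trait = rng.normal(size=n)                       # e.g. log body size (synthetic)
X = np.column_stack([np.ones(n), vectors])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
residuals = trait - X @ beta                     # trait variation not explained by phylogeny
print(residuals[:5])
```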