983 results for Complex-order differintegrals
Abstract:
Finding the structure of a confined liquid crystal is a difficult task since both the density and order parameter profiles are nonuniform. Starting from a microscopic model and density-functional theory, one has to either (i) solve a nonlinear, integral Euler-Lagrange equation, or (ii) perform a direct multidimensional free energy minimization. The traditional implementations of both approaches are computationally expensive and plagued with convergence problems. Here, as an alternative, we introduce an unsupervised variant of the multilayer perceptron (MLP) artificial neural network for minimizing the free energy of a fluid of hard nonspherical particles confined between planar substrates of variable penetrability. We then test our algorithm by comparing its results for the structure (density-orientation profiles) and equilibrium free energy with those obtained by standard iterative solution of the Euler-Lagrange equations and with Monte Carlo simulation results. Very good agreement is found and the MLP method proves competitively fast, flexible, and refinable. Furthermore, it can be readily generalized to the richer patterned-substrate geometries that are now experimentally realizable but very problematic for conventional theoretical treatments.
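To make the idea concrete, here is a minimal sketch of using a neural network as an unconstrained parameterization of a density profile: an MLP outputs ρ(z) on a grid between two walls and its weights are trained to minimize a toy grand-potential functional (ideal-gas term, a hypothetical quadratic excess term, and soft repulsive walls). This is an illustration only, not the authors' architecture or the hard-particle functional of the paper.

```python
import torch

n_grid = 128
z = torch.linspace(0.0, 1.0, n_grid).unsqueeze(1)   # positions across the slit
dz = 1.0 / (n_grid - 1)
v_ext = 50.0 * (torch.exp(-z / 0.05) + torch.exp(-(1.0 - z) / 0.05))  # soft walls
mu = 1.0                                             # chemical potential

mlp = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1), torch.nn.Softplus(),     # keeps rho(z) > 0
)
opt = torch.optim.Adam(mlp.parameters(), lr=1e-3)

for step in range(5000):
    rho = mlp(z)
    f_id = rho * (torch.log(rho + 1e-12) - 1.0)      # ideal-gas free energy density
    f_ex = 0.5 * rho ** 2                            # toy quadratic excess term
    omega = torch.sum(f_id + f_ex + (v_ext - mu) * rho) * dz  # grand potential
    opt.zero_grad()
    omega.backward()
    opt.step()

print(f"final grand potential: {omega.item():.4f}")
```

The "unsupervised" aspect is that no target profiles are needed: the free energy itself is the loss, and the minimizer is the equilibrium profile.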
Abstract:
The World Wide Web was envisioned as a network of interlinked hypertext documents forming an information space where humans and machines could communicate. However, the information contained in the traditional Web was, and still is, stored in an unstructured way, which means that only humans can conveniently consume it. Consequently, searching for information on the syntactic Web is a task performed mainly by humans, and as such it is not always easy to accomplish. In this context, it became essential to evolve towards a more structured and more meaningful Web, where information is given well-defined meaning so as to enable cooperation between humans and machines. This Web is usually referred to as the Semantic Web. Moreover, the Semantic Web is only fully achievable if data from different sources are linked, thus creating a repository of Linked Open Data (LOD). With the emergence of a new Web of Linked (Open) Data (i.e. the Semantic Web), new opportunities and challenges have arisen. Question Answering (QA) over semantic information is currently an active research area that tries to take advantage of Semantic Web technologies to improve the task of answering questions. The main goal of the World Search project is to exploit the Semantic Web to create mechanisms that support users in specific application domains in answering complex questions based on data from different repositories. However, an evaluation of the state of the art shows that existing applications do not support users in answering complex questions. Accordingly, the work described in this document focuses on studying and developing methodologies and processes that help users find exact, correct answers to complex questions that cannot be answered using traditional systems. This includes: (i) overcoming users' difficulty in visualizing the schema underlying the knowledge repositories; (ii) bridging the gap between the natural language expressed by users and the (formal) language understood by the repositories; (iii) processing and returning relevant information that appropriately answers users' questions. To that end, a set of functionalities considered necessary to support users in answering complex questions is identified, and a formal description of these functionalities is provided. The proposal is materialized in a prototype that implements the functionalities described above. The experiments carried out with the developed prototype show that users effectively benefit from the presented functionalities: ▪ they allow users to navigate efficiently over the information repositories; ▪ the gap between the conceptualizations of the different stakeholders is minimized; ▪ users can answer complex questions that they could not answer with traditional systems. In short, this document presents a proposal that demonstrably enables complex questions over semi-structured repositories to be answered in a user-driven way.
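For context on what the "(formal) language understood by the repositories" looks like in practice, below is a small illustrative SPARQL query against the public DBpedia endpoint using the SPARQLWrapper library. The question ("What is the capital of Portugal?") is a hypothetical example, not one taken from the World Search project.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical natural-language question: "What is the capital of Portugal?"
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    PREFIX dbr: <http://dbpedia.org/resource/>
    SELECT ?capital WHERE { dbr:Portugal dbo:capital ?capital . }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["capital"]["value"])   # the DBpedia resource answering the question
```

Bridging the gap described in point (ii) amounts to generating queries of this kind automatically from the user's natural-language question.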
Abstract:
The National Cancer Institute (NCI) method allows the distributions of usual intake of nutrients and foods to be estimated. This method can be used in complex surveys. However, the user must perform additional calculations, such as balanced repeated replication (BRR), in order to obtain standard errors and confidence intervals for the percentiles and the mean of the usual intake distribution. The objective is to highlight adaptations of the NCI method using data from the National Dietary Survey. The application of the NCI method is exemplified by analyzing total energy (kcal) and fruit (g) intake, comparing estimates of the mean and standard deviation based on the complex design of the Brazilian survey with those assuming a simple random sample. Although the point estimates of the means were similar, standard error estimates under the complex design were up to 60% larger than under simple random sampling. Thus, to obtain valid estimates of food and energy intake for the population, all of the sampling characteristics of the survey should be taken into account, because when these characteristics are neglected, the statistical analysis may produce underestimated standard errors that compromise the results and conclusions of the survey.
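As a rough illustration of the extra step mentioned above, the sketch below computes a BRR standard error for a mean intake in a toy design with two PSUs per stratum, using a Hadamard matrix to define the half-sample replicates. The data and stratum layout are invented, and Fay's adjustment is omitted.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
n_strata = 8                            # toy design: 8 strata, 2 PSUs per stratum
intake = rng.gamma(shape=4.0, scale=500.0, size=(n_strata, 2, 30))  # kcal per person
weights = np.ones_like(intake)          # sampling weights (all equal here)

H = hadamard(8)                         # +1/-1 pattern selecting one PSU per stratum
full_mean = np.average(intake, weights=weights)

rep_means = []
for r in range(H.shape[0]):
    w = weights.copy()
    for h in range(n_strata):
        keep = 0 if H[r, h] > 0 else 1  # half-sample: keep one PSU of the pair...
        w[h, keep] *= 2.0               # ...with doubled weight
        w[h, 1 - keep] = 0.0            # and drop the other
    rep_means.append(np.average(intake, weights=w))

se = np.sqrt(np.mean((np.array(rep_means) - full_mean) ** 2))
print(f"mean intake = {full_mean:.0f} kcal, BRR standard error = {se:.0f} kcal")
```

The same replicate weights are reused for every statistic of interest (means, percentiles of the usual intake distribution), which is what makes BRR convenient for the NCI workflow.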
Abstract:
The self-similar branching arrangement of the airways makes the respiratory system an ideal candidate for the application of fractional calculus theory. Fractal geometry is typically characterized by a recurrent structure. This study investigates the identification of a model for the respiratory tree by means of its electrical equivalent, based on intrinsic morphology. The respiratory impedance of seven volunteers was measured, in its complex-valued representation, for frequencies below 5 Hz. A parametric model is then fitted to the complex-valued data points. Since the inertance is negligible in the low-frequency range, each airway branch is modeled using a gamma-cell resistance and capacitance, the latter being a fractional-order constant phase element (CPE) identified from the measurements. In addition, the complex impedance is also approximated by a model consisting of a lumped series resistance and a lumped fractional-order capacitance. The results reveal that both models characterize the data well, with averaged CPE orders that are supraunitary for the ladder network and subunitary for the lumped model.
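In the lumped model, the series resistance plus fractional-order capacitance presumably takes the standard constant-phase-element form

\[ Z(j\omega) = R + \frac{1}{C\,(j\omega)^{\alpha}}, \]

where the exponent \( \alpha \) is the fractional order (the CPE value) identified from the measured impedance; the authors' exact parameterization may differ.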
Abstract:
Defective interfering (DI) viruses are thought to cause oscillations in virus levels, known as the ‘Von Magnus effect’. Interference by DI viruses has been proposed to underlie these dynamics, although experimental tests of this idea have not been forthcoming. For the baculoviruses, insect viruses commonly used for the expression of heterologous proteins in insect cells, the molecular mechanisms underlying DI generation have been investigated. However, the dynamics of baculovirus populations harboring DIs have not been studied in detail. In order to address this issue, we used quantitative real-time PCR to determine the levels of helper and DI viruses during 50 serial passages of Autographa californica multiple nucleopolyhedrovirus (AcMNPV) in Sf21 cells. Unexpectedly, the helper and DI viruses changed levels largely in phase, and oscillations were highly irregular, suggesting the presence of chaos. We therefore developed a simple mathematical model of baculovirus-DI dynamics. This theoretical model reproduced patterns qualitatively similar to the experimental data. Although we cannot exclude that experimental variation (noise) plays an important role in generating the observed patterns, the presence of chaos in the model dynamics was confirmed with the computation of the maximal Lyapunov exponent, and a Ruelle-Takens-Newhouse route to chaos was identified at decreasing production of DI viruses, using mutation as a control parameter. Our results contribute to a better understanding of the dynamics of DI baculoviruses, and suggest that changes in virus levels over passages may exhibit chaos.
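The sketch below is a purely hypothetical two-variable passage map of the kind described (helper yield reduced by DI interference, DI produced by mutation and co-replication, dilution between passages), together with the standard two-trajectory estimate of the maximal Lyapunov exponent; the parameters and functional forms are invented and are not the authors' model.

```python
import numpy as np

# invented parameters: burst size, interference strength, DI mutation rate,
# DI co-replication factor, and the dilution applied at each serial passage
beta, delta, mu, gamma, phi = 20.0, 0.05, 0.02, 0.4, 0.1

def passage(h, d):
    """One serial passage: replication with DI interference, then dilution."""
    growth = beta * h / (1.0 + delta * d)              # helper yield, reduced by DI
    h_next = phi * (1.0 - mu) * growth                 # helper carried forward
    d_next = phi * (mu * growth + gamma * growth * d)  # DI from mutation + co-replication
    return h_next, d_next

# maximal Lyapunov exponent via two nearby trajectories with renormalization
x = np.array([1.0, 0.1])
y = x + np.array([1e-8, 0.0])
eps0 = np.linalg.norm(y - x)
log_sum, n_steps = 0.0, 2000
for _ in range(n_steps):
    x = np.array(passage(*x))
    y = np.array(passage(*y))
    eps = np.linalg.norm(y - x)
    log_sum += np.log(eps / eps0)
    y = x + (y - x) * (eps0 / eps)                     # rescale the separation
print(f"estimated maximal Lyapunov exponent: {log_sum / n_steps:.3f}")
```

A positive estimate of the exponent is the signature of chaos referred to in the abstract; a negative value indicates convergence to a fixed point or periodic orbit.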
Abstract:
Starting with the explanation of metanarrative as a sort of self-reflexive storytelling (as defended by Kenneth Weaver Hope in his unpublished PhD thesis), I propose to talk about enunciative practices that stress the telling more than the told. In line with some metafictional practices applied to cinema, such as the ‘mindfuck’ film (Jonathan Eig, 2003), the ‘psychological puzzle film’ (Elliot Panek, 2003) and the ‘mind-game film’ (Thomas Elsaesser, 2009), I will address the manipulations that a narrative film endures in order to produce a more fruitful and complex experience for the viewer. I will particularly concentrate on the misrepresentation of time as a way to produce a labyrinthine work of fiction where the linear description of events is replaced by a game of time disclosure. The viewer is thus called upon to reconstruct the order of the various situations portrayed, in a process that I call ‘temporal mapping’. However, as the viewer attempts to do this, the film, because of the intricate nature of the plot and the uncertain status of the characters, ironically resists the attempt. There is a sort of teasing taking place between the film and its spectator: an invitation to decode that is half-denied until the end, when the puzzle is finally solved. I will use three of Alejandro Iñárritu’s films to better convey my point: Amores perros (2000), 21 Grams (2003) and Babel (2006). I will consider Iñárritu’s methods of producing non-linear storytelling as a way to stress the importance of time and its validity as one of the elements that make up a metanarrative experience in films. I will focus especially on 21 Grams, which I consider to be a paragon of the labyrinth.
Abstract:
This paper formulates a novel expression for entropy inspired by the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested on both standard probability distributions and real-world data series. The results reveal that tuning the fractional order allows a high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several sets of data, confirming the goodness of the generalization.
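For reference, one fractional-order entropy of the kind used in this line of work (assuming the formulation found in the related fractional-information literature; the paper's exact expression may differ) is

\[ S_{\alpha}(P) = \sum_i p_i \left\{ -\frac{p_i^{-\alpha}}{\Gamma(\alpha+1)} \left[ \ln p_i + \psi(1) - \psi(1-\alpha) \right] \right\}, \]

which recovers the Shannon entropy \( -\sum_i p_i \ln p_i \) as \( \alpha \to 0 \), with the order \( \alpha \) acting as the tuning parameter mentioned above.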
Abstract:
A pair-matched case/control study was conducted from January 1991 to 30 June 1992 in order to define clinical and laboratory findings associated with DMAC infection in AIDS patients. Since DMAC infection is usually associated with advanced immunodeficiency, and therefore also with other opportunistic illnesses, in addition to the number of CD4+ lymphocytes, cases and controls were matched using the following criteria: date of AIDS diagnosis and antiretroviral therapy, number and severity of associated opportunistic infections and, whenever possible, type of Pneumocystis carinii prophylaxis, age and gender, in this order of relevance. Cases (defined as patients presenting at least one positive culture for MAC at a normally sterile site) and controls presented CD4+ lymphocyte counts below 50 cells/mm3. A significantly higher prevalence of general, digestive and respiratory signs, increased LDH levels, low hemoglobin levels and low CD4+ cell counts was recorded for cases when compared to controls. The increases in gGT and alkaline phosphatase levels seen in cases were also recorded for controls. In conclusion, the strategy we used for selecting controls allowed us to detect laboratory findings associated with DMAC infection that are not found in other severely immunosuppressed AIDS patients without DMAC.
Abstract:
All over the world, the liberalization of electricity markets, which follows different paradigms, has created new challenges for those involved in this sector. In order to respond to these challenges, electric power systems underwent a significant restructuring of their operation and planning. This restructuring resulted in a considerable increase in the competitiveness of the electricity sector. In particular, the Ancillary Services (AS) market has undergone constant changes in its mode of operation, as it is the market in which the services are traded whose main objective is to ensure the operation of electric power systems with appropriate levels of stability, safety, quality, equity and competitiveness. With the increasing penetration of distributed energy resources, including distributed generation, demand response, storage units and electric vehicles, it is essential to develop new, smarter and hierarchical methods for operating electric power systems. As these resources are mostly connected to the distribution network, it is important to consider the introduction of these resources into AS delivery in order to achieve greater reliability and cost efficiency in power system operation. The main contribution of this work is the design and development of mechanisms and methodologies for the AS market and for the joint energy and AS market, considering different management entities for the transmission and distribution networks. Several models developed in this work consider the most common AS in the liberalized market environment: Regulation Down; Regulation Up; Spinning Reserve and Non-Spinning Reserve. The presented models consider different rules and modes of operation, such as the division of the market into network areas, which allows congestion management of the interconnections between areas, or the ancillary service cascading process, which allows higher-quality AS to substitute for lower-quality AS, ensuring better economic performance of the market. A major contribution of this work is the development of an innovative market clearing methodology for the joint energy and AS market, able to ensure viable and feasible solutions when there are technical constraints in the transmission network involving its division into areas or regions. The proposed method is based on the determination of Bialek topological factors and considers the contribution of the dispatch of all generation-increase services (energy, Regulation Up, Spinning and Non-Spinning Reserves) to network congestion. The use of Bialek factors in each iteration of the proposed methodology allows the bids in the market to be limited while ensuring that the solution is feasible in any context of system operation. Another important contribution of this work is the modeling of the contribution of distributed energy resources to ancillary services. To this end, a Virtual Power Player (VPP) is considered that aggregates, manages and interacts with distributed energy resources. The VPP manages all the aggregated agents and is able to supply AS to the system operator, with the main purpose of participating in the electricity market. To ensure their participation in AS provision, the VPP should have a set of contracts with the agents that include diversified rules adapted to each kind of distributed resource.
All methodologies developed and implemented in this work have been integrated into the MASCEM simulator, a multi-agent-based simulator that allows the complex operation of electricity markets to be studied. The developed methodologies thus allow the simulator to cover more operating contexts of present and future electricity markets. This dissertation therefore makes a substantial contribution to AS market simulation, based on models and mechanisms currently used in several real markets, as well as introducing innovative market clearing methodologies for the joint energy and AS market. This dissertation presents five case studies, each consisting of multiple scenarios. The first case study illustrates the application of AS market simulation considering several bids from market players. The simulation of the joint energy and ancillary services market is presented in the second case study. The third case study develops a comparison between the joint market methodology, in which players' ancillary service bids are considered by network area, and a reference methodology. The fourth case study presents the simulation of the joint market methodology based on Bialek topological distribution factors, applied to a 7-bus transmission network managed by a TSO. The last case study presents a joint market simulation that considers the aggregation of small players into a VPP, as well as complex contracts involving these entities. The case study comprises a 33-bus distribution network managed by the VPP, which includes several kinds of distributed resources, such as photovoltaics, CHP, fuel cells, wind turbines, biomass, small hydro, municipal solid waste, demand response, and storage units.
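As a toy illustration of the basic co-optimization underlying a joint energy and AS clearing (hypothetical bids and a two-unit system; this is not the MASCEM methodology and omits network areas, Bialek factors and cascading):

```python
import numpy as np
from scipy.optimize import linprog

demand, reserve_req = 100.0, 20.0        # MW of energy and Regulation Up required
cap = np.array([80.0, 70.0])             # unit capacities (MW)
c = np.array([30.0, 45.0, 5.0, 3.0])     # bid prices for [e1, e2, r1, r2]

A_eq = np.array([[1, 1, 0, 0],           # energy balance: e1 + e2 = demand
                 [0, 0, 1, 1]], float)   # reserve requirement: r1 + r2 = reserve_req
b_eq = np.array([demand, reserve_req])

A_ub = np.array([[1, 0, 1, 0],           # e1 + r1 <= cap1
                 [0, 1, 0, 1]], float)   # e2 + r2 <= cap2
b_ub = cap

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
e1, e2, r1, r2 = res.x
print(f"energy: {e1:.0f} + {e2:.0f} MW, regulation up: {r1:.0f} + {r2:.0f} MW")
```

The methodologies described above add, on top of this basic co-optimization, the division into network areas, the Bialek topological factors that limit congesting bids, and the cascading between reserve qualities.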
Abstract:
Gottfried Leibniz generalized differentiation and integration, extending the operators from integer up to real, or even complex, orders. It is presently recognized that the resulting models capture long-term memory effects that are difficult to describe with classical tools. Leon Chua generalized the set of lumped electrical elements that provide the building blocks in mathematical models. His proposal of the memristor and of higher-order elements broadened the scope of variables and relationships embedded in the development of models. This paper follows these two directions and proposes a new logical step by generalizing the concept of junction. Classical junctions interconnect system elements using simple algebraic restrictions. Nevertheless, this simplistic approach may be misleading in the presence of unexpected dynamical phenomena and requires the inclusion of additional “parasitic” elements. The novel γ-junction includes, as special cases, the standard series and parallel connections and allows a new degree of freedom when building models. The proposal motivates the search for experimental and real-world manifestations of the abstract conjectures.
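For concreteness, a standard way of writing the differintegral of arbitrary (possibly complex) order \( \alpha \) to which Leibniz's generalization leads is the Grünwald-Letnikov form

\[ {}_{a}D_{t}^{\alpha} f(t) = \lim_{h \to 0^{+}} \frac{1}{h^{\alpha}} \sum_{k=0}^{\lfloor (t-a)/h \rfloor} (-1)^{k} \binom{\alpha}{k} f(t - kh), \]

which reduces to the ordinary derivative or repeated integral when \( \alpha \) is a positive or negative integer. This is a textbook definition, not necessarily the operator form adopted in the paper.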
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing the Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships within the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
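For reference, the Jensen–Shannon divergence in its classical form is

\[ \mathrm{JSD}(P, Q) = \tfrac{1}{2} D_{\mathrm{KL}}(P \,\|\, M) + \tfrac{1}{2} D_{\mathrm{KL}}(Q \,\|\, M), \qquad M = \tfrac{1}{2}(P + Q), \]

with \( D_{\mathrm{KL}} \) the Kullback–Leibler divergence; the generalized (fractional) variant tested in the paper presumably replaces the Shannon-entropy building blocks with their fractional counterparts.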
Abstract:
Proceedings of the 13th International UFZ-Deltares Conference on Sustainable Use and Management of Soil, Sediment and Water Resources - 9–12 June 2015 • Copenhagen, Denmark
Abstract:
Dissertation submitted in fulfilment of the requirements for the degree of Master in Conservation and Restoration, Conservation Sciences profile, specialization in Contemporary Art