804 results for Pareto frontier
Abstract:
Cortical bones, essential for mechanical support and structure in many animals, contain a large number of canals organized in an intricate fashion. By using state-of-the-art image analysis and computer graphics, the 3D reconstruction of a whole bone (phalange) of a young chicken was obtained and represented as a complex network in which each canal was associated with an edge and every confluence of three or more canals yielded a node. Representing the bone canal structure as a complex network allowed several methods to be applied in order to characterize and analyze the organization of the canal system and its robustness. First, the distribution of node degrees (i.e. the number of canals connected to each node) confirmed previous indications that bone canal networks follow a power law, and therefore contain some highly connected nodes (hubs). The bone network was also found to be partitioned into communities or modules, i.e. groups of nodes that are more intensely connected to one another than to the rest of the network. We verified that each community exhibited distinct topological properties that are possibly linked to its specific function. In order to better understand the organization of the bone network, its resilience to two types of failures (random attack and cascaded failures) was also quantified in comparison with randomized and regular counterparts. The results indicate that the modular structure improves the robustness of the bone network when compared to a regular network with the same average degree and number of nodes. The effects of disease processes (e.g., osteoporosis) and of mutations in genes (e.g., BMP4) that occur at the molecular level can now be investigated at the mesoscopic level by using network-based approaches.
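The pipeline summarized above (degree distribution, community detection, robustness under node removal) can be prototyped with standard graph tooling. The following is a minimal Python sketch using networkx; the scale-free random graph is only a stand-in for the reconstructed canal network, and all parameter values are illustrative.

import random
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Stand-in for the reconstructed canal network: a scale-free random graph.
G = nx.barabasi_albert_graph(n=500, m=2, seed=1)

# Degree distribution: a power-law tail shows up as a few high-degree hubs.
degrees = sorted((d for _, d in G.degree()), reverse=True)
print("max degree:", degrees[0], "mean degree:", sum(degrees) / len(degrees))

# Community (module) detection by modularity maximisation.
communities = greedy_modularity_communities(G)
print("number of communities:", len(communities))

# Robustness to random failures: remove a growing fraction of nodes at random
# and track the size of the largest connected component.
random.seed(1)
nodes = list(G.nodes())
random.shuffle(nodes)
for frac in (0.1, 0.3, 0.5):
    H = G.copy()
    H.remove_nodes_from(nodes[: int(frac * len(nodes))])
    giant = max(nx.connected_components(H), key=len)
    print(f"removed {frac:.0%} of nodes -> giant component size {len(giant)}")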
Abstract:
Based on a divide-and-conquer approach, knowledge about nature has been organized into a set of interrelated facts, allowing a natural representation in terms of graphs: each "chunk" of knowledge corresponds to a node, while relationships between such chunks are expressed as edges. This organization becomes particularly clear in the case of mathematical theorems, with their intense cross-implications and relationships. We have derived a web of mathematical theorems from Wikipedia and, thanks to the powerful concept of entropy, identified its more central and frontier elements. Our results also suggest that the central nodes are the oldest theorems, while the frontier nodes are those recently added to the network. The network communities have also been identified, allowing further insights into the organization of this network, such as its highly modular structure.
Abstract:
In this paper, a simple relation between the Leimkuhler curve and the mean residual life is established. The result is illustrated with several models commonly used in informetrics, such as exponential, Pareto and lognormal. Finally, relationships with some other reliability concepts are also presented. (C) 2010 Elsevier Ltd. All rights reserved.
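For reference, the mean residual life of a non-negative random variable X with survival function S can be written as below; the Pareto case, in which it is linear in t, is one of the standard informetric illustrations (this is textbook material, not the specific relation derived in the paper).

m(t) = \mathbb{E}[X - t \mid X > t] = \frac{1}{S(t)} \int_t^{\infty} S(u)\, du, \qquad S(t) = \Pr(X > t).

\text{For a Pareto (Type I) law with scale } \sigma > 0 \text{ and shape } \alpha > 1:\quad
S(t) = \left(\frac{\sigma}{t}\right)^{\alpha}, \; t \ge \sigma
\;\Longrightarrow\;
m(t) = \frac{t}{\alpha - 1}.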
Abstract:
Absorption and fluorescence spectroscopy, electrochemical techniques, and semiempirical calculations were employed to characterize the multiple complexation equilibria between two polymethine cyanine dyes (IR-786 and indocyanine green, ICG; S) and beta-cyclodextrin (beta-CD, L), as well as the chemical reactivity of the complexed and uncomplexed species towards the oxidizing agents hypochlorite (HC) and hydrogen peroxide (HP). IR-786 dimerization is favored with increasing beta-CD concentration, in the form of (SL)2 complexes. In the case of ICG, free dimers (D) and SL complexes are favored. Both IR-786 and ICG react and discolor in the presence of HC and HP. For IR-786, the reaction with HP and HC proceeds with observed rate constants of 10^-3 and 0.28 s^-1 and second-order rate constants (k2) of ~10^-3 and 10^4 M^-1 s^-1, respectively. The intermediate species observed in the bleaching reactions of IR-786 and ICG were shown, by cyclic voltammetry and VIS absorption, to result from one-electron oxidation. IR-786 complexed with beta-CD is protected against bleaching in the presence of HP and HC by factors of 20 and 4, respectively. This protection was not observed in ICG complexes. The superdelocalizability profiles of both dyes and frontier orbital analysis indicate that beta-CD does not protect ICG from oxidation by HP or HC, whereas the 2:2 IR-786/beta-CD complex is able to prevent the oxidation of IR-786. We concluded that the decrease in the chemical reactivity of the dyes towards oxidizing agents in the presence of beta-CD is due to the formation of (SL)2 complexes. Copyright (C) 2010 John Wiley & Sons, Ltd.
Abstract:
Molecular orbital calculations were carried out on a set of 28 non-imidazole H3 antihistamine compounds using the Hartree-Fock method in order to investigate possible relationships between electronic structural properties and binding affinity for H3 receptors (pK_i). It was observed that the frontier effective-for-reaction molecular orbital (FERMO) energies were better correlated with pK_i values than the highest occupied molecular orbital (HOMO) and lowest unoccupied molecular orbital (LUMO) energies. Exploratory data analysis through hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed a separation of the compounds into two sets, one grouping the molecules with high pK_i values, the other gathering the low-pK_i compounds. This separation was obtained with the following descriptors: FERMO energies (epsilon_FERMO), charges derived from the electrostatic potential on the nitrogen atom (N1), electronic density indexes for the FERMO on the N1 atom (Sigma_FERMO c_i^2), and electrophilicity (omega). These electronic descriptors were used to construct a quantitative structure-activity relationship (QSAR) model through the partial least-squares (PLS) method with three principal components. The model yielded Q^2 = 0.88 and R^2 = 0.927, obtained from a training set of 23 molecules and an external validation set of 5 molecules, respectively. From the analysis of the PLS regression equation and the values of the selected electronic descriptors, it is suggested that high values of the FERMO energies and of Sigma_FERMO c_i^2, together with low electrophilicity and pronounced negative charges on N1, appear as desirable properties for the design of new molecules with potentially high binding affinity. (C) 2010 Elsevier Inc. All rights reserved.
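As a rough illustration of the modelling step only (the descriptor matrix and pK_i values below are synthetic placeholders, not the authors' data), a three-component PLS model with a cross-validated Q^2 and a fitted R^2 can be set up in Python with scikit-learn as follows.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
# Placeholder descriptors, e.g. FERMO energy, N1 charge, FERMO density on N1, electrophilicity.
X = rng.normal(size=(28, 4))
# Synthetic "pKi" built from the descriptors plus noise, only to make the sketch runnable.
y = X @ np.array([0.8, -0.5, 0.6, -0.4]) + rng.normal(scale=0.3, size=28)

pls = PLSRegression(n_components=3)

# Q2: predictive ability estimated by cross-validation; R2: goodness of fit on the full set.
y_cv = np.ravel(cross_val_predict(pls, X, y, cv=5))
q2 = r2_score(y, y_cv)
pls.fit(X, y)
r2 = r2_score(y, np.ravel(pls.predict(X)))
print(f"Q2 = {q2:.2f}, R2 = {r2:.2f}")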
Abstract:
This thesis concerns the performance evaluation of peer-to-peer networks, in which different peer distribution techniques, namely the Weibull, lognormal and Pareto distributions, are used. A network simulator was then employed to evaluate the performance of these three distribution techniques. During the last decade the Internet has expanded into a world-wide network connecting millions of hosts and users and providing services for everyone. Many emerging applications are bandwidth-intensive in nature; the size of downloaded files, including music and videos, can be huge, from ten megabits to many gigabits. The efficient use of network resources is thus crucial for the survivability of the Internet. Traffic engineering (TE) covers a range of mechanisms for optimizing operational networks from the traffic perspective. The time scale in traffic engineering varies from short-term network control to network planning over a longer time period. In this thesis the peer distribution technique was considered in order to minimise the peer arrival and service process with the three different techniques, and the congestion parameters were calculated: the blocking time of each peer before entering the service process, the waiting time of a peer while another peer is being served in the service block, and the delay time of each peer. The average of each measure was then calculated, and graphs were plotted using Matlab to analyse the results.
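The thesis' simulator is not described in detail here, but the comparison it performs can be sketched as follows: inter-arrival times are drawn from Weibull, lognormal and Pareto distributions and pushed through a single-server FIFO queue to obtain per-peer waiting and delay (sojourn) times. All distribution parameters below are arbitrary placeholders.

import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of peers

# Inter-arrival time models under comparison; parameters are placeholders.
interarrival = {
    "weibull": rng.weibull(1.5, N),
    "lognormal": rng.lognormal(mean=0.0, sigma=0.5, size=N),
    "pareto": rng.pareto(2.5, N) + 1.0,  # shifted so every gap is at least 1
}
service = rng.exponential(scale=0.8, size=N)  # same service draws for all three cases

for name, gaps in interarrival.items():
    arrivals = np.cumsum(gaps)
    start = np.empty(N)
    finish = np.empty(N)
    busy_until = 0.0
    for i in range(N):
        start[i] = max(arrivals[i], busy_until)  # wait if the server is still busy
        finish[i] = start[i] + service[i]
        busy_until = finish[i]
    waiting = start - arrivals   # time spent queueing
    delay = finish - arrivals    # total time in the system
    print(f"{name:10s} mean wait {waiting.mean():.2f}  mean delay {delay.mean():.2f}")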
Abstract:
Each year, search engines like Google, Bing and Yahoo complete trillions of search queries online. Students are especially dependent on these search tools because of their popularity, convenience and accessibility. However, what students are unaware of, by choice or naiveté, is the amount of personal information that is collected during each search session, how that data is used, and who is interested in their online behavior profile. Privacy policies are frequently updated in favor of the search companies, but they are lengthy and often are skimmed briefly or ignored entirely, with little thought about how personal web habits are being exploited for analytics and marketing. As an Information Literacy instructor, and a member of the Electronic Frontier Foundation, I believe in the importance of educating college students and web users in general that they have a right to privacy online. Class discussions on the topic of web privacy have yielded an interesting perspective on internet search usage. Students are unaware of how their online behavior is recorded and have consistently expressed their hesitancy to use tools that disguise or delete their IP address because of the stigma that it may imply they have something to hide or are engaging in illegal activity. Additionally, students fear they will have to surrender the convenience of uber-connectivity in their applications to maintain their privacy. The purpose of this lightning presentation is to provide educators with a lesson plan highlighting and simplifying the privacy terms of the three major search engines: Google, Bing and Yahoo. The presentation focuses on what data these search engines collect about users, how that data is used, and alternative search solutions, like DuckDuckGo, for increased privacy. Students will directly benefit from this lesson because informed internet users can protect their data, feel safer online and become more effective web searchers.
Abstract:
Most water distribution systems (WDS) need rehabilitation due to aging infrastructure, which leads to decreasing capacity, increasing leakage and consequently low performance of the WDS. An appropriate strategy specifying the location and time of pipeline rehabilitation in a WDS under a limited budget is the main challenge, and it has been addressed frequently by researchers and practitioners. On the other hand, the selection of appropriate rehabilitation techniques and material types is another main issue which has yet to be addressed properly. The latter can affect the environmental impacts of a rehabilitation strategy in meeting the challenges of global warming mitigation and the consequent climate change. This paper presents a multi-objective optimization model for rehabilitation strategies in WDS addressing the abovementioned criteria, mainly focused on greenhouse gas (GHG) emissions either directly from fossil fuel and electricity or indirectly from the embodied energy of materials. The objective functions are thus to minimise: (1) the total cost of rehabilitation, including capital and operational costs; (2) the amount of leakage; (3) GHG emissions. The Pareto optimal front containing the optimal trade-off solutions is determined using the Non-dominated Sorting Genetic Algorithm NSGA-II. Decision variables in this optimisation problem are classified into groups as: (1) the percentage proportion of each rehabilitation technique applied each year; (2) the material types of new pipelines installed for rehabilitation each year. The rehabilitation techniques used here include replacement, rehabilitation and lining, cleaning, and pipe duplication. The developed model is demonstrated through its application to the Mahalat WDS, located in the central part of Iran. The rehabilitation strategy is analysed for a 40-year planning horizon. A number of conventional techniques for selecting pipes for rehabilitation are also analysed in this study. The results show that the optimal rehabilitation strategy that considers GHG emissions successfully reduces total expenses and efficiently decreases leakage from the WDS whilst meeting environmental criteria.
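A minimal sketch of how such a three-objective problem can be handed to NSGA-II is given below, using the pymoo library (version 0.6 or later is assumed). The toy objective functions are crude surrogates for cost, leakage and GHG emissions, not the paper's hydraulic or emission models, and the decision vector simply represents a rehabilitation effort per planning period.

import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class RehabSketch(ElementwiseProblem):
    """Toy WDS rehabilitation problem: 5 periods, rehabilitation effort in [0, 1] per period."""
    def __init__(self):
        super().__init__(n_var=5, n_obj=3, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        effort = float(np.sum(x))
        cost = effort                        # more rehabilitation work -> higher cost
        leakage = 1.0 / (1.0 + effort)       # more rehabilitation work -> less leakage
        ghg = 0.5 * effort + 0.3 * leakage   # emissions from works plus pumping of lost water
        out["F"] = [cost, leakage, ghg]

res = minimize(RehabSketch(), NSGA2(pop_size=100), ("n_gen", 200), seed=1, verbose=False)
print(res.F[:5])  # a few points on the approximated Pareto front (cost, leakage, GHG)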
Abstract:
This dissertation conceptualizes Environmental Economics and highlights the importance of its study and applications. The concepts of externality, public goods, Pareto optimality, Pigouvian taxes, the Coase theorem, the tragedy of the commons and free-rider behaviour are presented. Building on these concepts, traditional public policies and market mechanisms are examined, seeking alternatives that reconcile environmental preservation with society's goals of economic efficiency. Finally, a case study analyses the interest and the conditions of the private sector in adapting technologically to the environmental rules defined by public policy.
Abstract:
The main purpose of this dissertation is to verify the feasibility of implementing the integrated control method for quality attributes. A secondary objective is a review of the statistical process control literature, in particular the traditional control charts and the integrated process control chart. The integrated process control chart is applicable to continuous and intermittent manufacturing processes in which several independent attributes must be monitored. Starting from a workstation defined on a production line, monitoring for quality control and assurance is performed through a single control chart that encompasses all quality characteristics relevant to the process. Using a Pareto chart, the quality characteristics are ranked, so that action can be taken on those that contribute most to the percentage of defectives. In this way, it is possible to diagnose and solve quality problems targeting the continuous improvement of the process. The main advantage of this chart lies in the simplicity of integrated control, since a single chart presents an overview of the quality condition. The applicability of the integrated process control chart is demonstrated through a case study in an alcoholic beverage company. The process studied is the labelling of two alcoholic beverages with 10 quality characteristics, which unfold into 29 defect types. All these defects are monitored by a single integrated process control chart. In one of the processes, the Pareto chart shows that, of the 29 defect types, 2 are responsible for the process instability. Action can then be taken on these defects and the process improved. Whenever a special cause affects the process, the chart will indicate its presence and provide a solid basis for deciding on corrective action.
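The ranking step described above is simply a Pareto analysis: defect counts are sorted in descending order and the cumulative percentage is tracked, so the few defect types driving most of the defectives stand out. A minimal Python sketch with invented defect counts (not the case-study data) follows.

# Pareto analysis of defect counts; the counts below are invented, not the case-study data.
defects = {
    "misaligned label": 120,
    "torn label": 45,
    "wrong label": 18,
    "glue excess": 9,
    "smudged print": 5,
}

total = sum(defects.values())
cumulative = 0
for name, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{name:18s} {count:4d}  {count / total:6.1%}  cumulative {cumulative / total:6.1%}")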
Abstract:
This dissertation presents the implementation of the DMAIC (Define, Measure, Analyse, Improve, Control) steps in the development of a Six Sigma project in a petrochemical plant. The goal of the Six Sigma project was to reduce the variability of a quality characteristic critical to the customer and to decrease the dispersion of reaction times between batches in elastomer production. The main statistical techniques and tools used in the five DMAIC steps are presented, such as brainstorming, process mapping, cause-and-effect diagram, cause-and-effect matrix, Pareto chart, FMEA and multiple linear regression analysis. The research, carried out in a participatory manner through interaction between the researcher and the process specialists, highlighted the importance of technical knowledge of the process and of good planning for data acquisition as key points for a successful improvement project. The study also pointed out deficiencies in the reactor temperature control system, in the measurement system for the Mooney viscosity quality characteristic, and in the raw-material dosing system.
Abstract:
We consider exchange economies with a continuum of agents and differential information about finitely many states of nature. It was proved in Einy, Moreno and Shitovitz (2001) that if we allow for free disposal in the market-clearing (feasibility) constraints, then an irreducible economy has a competitive (or Walrasian expectations) equilibrium, and moreover, the set of competitive equilibrium allocations coincides with the private core. However, when feasibility is defined with free disposal, competitive equilibrium allocations may not be incentive compatible and contracts may not be enforceable (see e.g. Glycopantis, Muir and Yannelis (2002)). This is the main motivation for considering equilibrium solutions with exact feasibility. We first prove that the results in Einy et al. (2001) are still valid without free disposal. Then we define an incentive compatibility property motivated by the issue of contract execution, and we prove that every Pareto optimal exact feasible allocation is incentive compatible, implying that contracts of competitive or core allocations are enforceable.
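In a standard notation for an atomless economy with agent space (I, mu) (illustrative notation, not necessarily the authors'), the distinction at issue can be written as follows: feasibility with free disposal only requires that aggregate consumption not exceed the aggregate endowment in each state, whereas exact feasibility requires equality.

\int_I x_t(\omega)\, d\mu(t) \;\le\; \int_I e_t(\omega)\, d\mu(t) \quad \text{(free disposal)},
\qquad
\int_I x_t(\omega)\, d\mu(t) \;=\; \int_I e_t(\omega)\, d\mu(t) \quad \text{(exact feasibility)},
\qquad \forall\, \omega \in \Omega.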
Abstract:
What makes agents buy and sell assets in models of trading with asymmetric information in capital markets? Is the fact that information is not the same for all agents acting on the stock exchange the reason for speculation? Does it make sense to speak of buying and selling assets when the various agents that make up the market act rationally and know that everyone else does the same? The main motivation for this work is that all articles developed to date on equilibrium with trading in capital markets assume, in some way, the presence of irrational behaviour by some agent or by the market as a whole. See, for example, the models presented in Kyle (1985) and Glosten and Milgrom (1985), where irrationality lies in the behaviour of the so-called noise (random) traders. Such investors demand assets randomly, that is, they have no strategy determining their desire to buy and sell shares. What struck us was that these are rational expectations models, that is, there is a rationality hypothesis for the individuals who trade in the financial sector. The presence of random traders therefore makes these works inconsistent. The goal of this chapter is to remove these random traders from the market and thereby find out whether, without them, trading can lead to a Pareto-superior allocation.