20 results for Entropy diagrams

at Universidade do Minho


Relevance:

20.00%

Publisher:

Abstract:

Thermodynamic stability of black holes, described by the Rényi formula as an equilibrium-compatible entropy function, is investigated. It is shown that within this approach, asymptotically flat Schwarzschild black holes can be in stable equilibrium with thermal radiation at a fixed temperature. This implies that the canonical ensemble exists just as in anti-de Sitter space, and that nonextensive effects can stabilize the black holes in much the same way as the gravitational potential of an anti-de Sitter space does. Furthermore, it is also shown that a Hawking–Page-like black hole phase transition occurs at a critical temperature which depends on the q-parameter of the Rényi formula.
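For context (this formula is standard in the nonextensive black hole thermodynamics literature, not quoted from the record above), the Rényi entropy applied to the Bekenstein–Hawking entropy reads:

```latex
% Rényi entropy as an equilibrium-compatible entropy function,
% with nonextensivity parameter q:
S_{\mathrm{R}} = \frac{1}{1-q}\,\ln\!\bigl[\,1 + (1-q)\,S_{\mathrm{BH}}\bigr],
\qquad
S_{\mathrm{BH}} = \frac{A}{4}
```

In the limit $q \to 1$ this recovers the standard Bekenstein–Hawking entropy, $S_{\mathrm{R}} \to S_{\mathrm{BH}}$.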

Relevance:

20.00%

Publisher:

Abstract:

The information necessary to distinguish a local inhomogeneous mass density field from its spatial average on a compact domain of the universe can be measured by relative information entropy. The Kullback–Leibler (KL) formula arises very naturally in this context; however, it provides a very complicated way to compute the mutual information between spatially separated but causally connected regions of the universe in a realistic, inhomogeneous model. To circumvent this issue, by considering a parametric extension of the KL measure, we develop a simple model to describe the mutual information which is entangled via the gravitational field equations. We show that the Tsallis relative entropy can be a good approximation in the case of small inhomogeneities, and for measuring the independent relative information inside the domain we propose the Rényi relative entropy formula.
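For reference (standard definitions, not taken from the record above), the discrete forms of the three relative-entropy measures mentioned are:

```latex
% Kullback–Leibler, Rényi, and Tsallis relative entropies of p with
% respect to r; both the Rényi and Tsallis forms reduce to KL as q -> 1.
D_{\mathrm{KL}}(p\,\|\,r) = \sum_i p_i \ln\frac{p_i}{r_i},
\qquad
D_q^{\mathrm{R}}(p\,\|\,r) = \frac{1}{q-1}\,\ln\sum_i p_i^{\,q}\, r_i^{\,1-q},
\qquad
D_q^{\mathrm{T}}(p\,\|\,r) = \frac{1}{q-1}\Bigl(\sum_i p_i^{\,q}\, r_i^{\,1-q} - 1\Bigr)
```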

Relevance:

10.00%

Publisher:

Abstract:

The vulnerability of the masonry envelope under blast loading is considered critical due to the risk of loss of life. The behaviour of masonry infill walls subjected to dynamic out-of-plane loading was experimentally investigated in this work. Confined underwater blast wave generators (WBWG), which exploit the extremely high-rate conversion of the explosive detonation energy into the kinetic energy of a thick water confinement, allowed the load to be distributed over the wall surface while avoiding the generation of high-velocity fragments and reducing the atmospheric sound wave. In the present study, plastic water containers, each holding a detonator inside a cylindrical explosive charge at its centre, were used on unreinforced masonry infill panels measuring 1.7 m by 3.5 m. In addition to pressure and displacement transducers, high-speed video cameras recorded the tests to enable processing of the deflections and identification of the failure modes. Additional numerical studies were performed on both unreinforced and reinforced walls. Bed joint reinforcement and grid reinforcement were used to strengthen the infill walls, and the results are presented and compared, allowing pressure-impulse diagrams to be obtained for the design of masonry infill walls.

Relevance:

10.00%

Publisher:

Abstract:

Companies in the motorcycle components sector are dealing with a dynamic environment, resulting from the introduction of new products and increasing market demand. This dynamic environment requires frequent changes in production lines and flexibility in the processes, which can reduce quality and productivity. This paper presents a Lean Six Sigma improvement project performed in a production line of the company's machining sector, aimed at eliminating the losses that cause low productivity and affect the fulfillment of the production plan and customer satisfaction. Applying the Lean methodology through the DMAIC stages made it possible to analyze the factors behind the line's productivity losses. The major problems and causes identified in this study were the lack of standardization in the setup activities and excessive stoppages for process adjustment, which increased the number of defects. Control charts, Pareto analysis and cause-and-effect diagrams were used to analyze the problem. In the improvement stage, the changes were based on reconfiguring the line layout and modernizing the process. Overall, the project justified an investment in new equipment, defective product units were reduced by 84%, and line capacity increased by 29%.
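As an illustration of the Pareto analysis mentioned in the abstract, the sketch below ranks defect causes and extracts the "vital few" that account for 80% of occurrences. The categories and counts are hypothetical, not data from the study.

```python
# Minimal Pareto analysis: rank defect causes by frequency and keep the
# smallest prefix that covers at least 80% of all occurrences.
# Categories and counts are hypothetical, not taken from the study.
defects = {
    "setup not standardized": 120,
    "process adjustment stoppage": 90,
    "tool wear": 30,
    "material defect": 10,
}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0
vital_few = []
for cause, count in ranked:
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.80:
        break

print(vital_few)  # the causes to attack first
```

With these numbers, the first two causes already cover 84% of defects, which is the kind of concentration that justifies focusing improvement actions on setup standardization and process adjustment.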

Relevance:

10.00%

Publisher:

Abstract:

Doctoral Thesis in Information Technologies and Systems

Relevance:

10.00%

Publisher:

Abstract:

The MAP-i Doctoral Programme in Informatics, of the Universities of Minho, Aveiro and Porto

Relevance:

10.00%

Publisher:

Abstract:

Doctoral Thesis in Civil Engineering

Relevance:

10.00%

Publisher:

Abstract:

Master's Dissertation in Industrial Engineering

Relevance:

10.00%

Publisher:

Abstract:

Integrated Master's Dissertation in Industrial Engineering and Management

Relevance:

10.00%

Publisher:

Abstract:

Integrated Master's Dissertation in Industrial Engineering and Management

Relevance:

10.00%

Publisher:

Abstract:

Integrated Master's Dissertation in Biomedical Engineering

Relevance:

10.00%

Publisher:

Abstract:

Doctoral Thesis in Educational Sciences (Specialization in Curriculum Development)

Relevance:

10.00%

Publisher:

Abstract:

Doctoral Thesis (Doctoral Programme in Biomedical Engineering)

Relevance:

10.00%

Publisher:

Abstract:

Large scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle Trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportionally to the node churn rate. Another challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
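To illustrate the causality tracking that logical clocks provide, the sketch below compares two plain version vectors (maps from node id to counter) and classifies a pair of writes as ordered or concurrent. This is a textbook illustration only; the paper's Bitmapped Version Vectors and Dotted Causal Containers are more compact representations of the same causal information.

```python
# Compare two version vectors (node id -> update counter) to decide
# whether one write causally precedes the other or they are concurrent.
# Illustrative sketch only; the paper's structures (Bitmapped Version
# Vectors, Dotted Causal Containers) compress this metadata further.
def compare(a, b):
    nodes = set(a) | set(b)
    a_le_b = all(a.get(n, 0) <= b.get(n, 0) for n in nodes)
    b_le_a = all(b.get(n, 0) <= a.get(n, 0) for n in nodes)
    if a_le_b and b_le_a:
        return "equal"
    if a_le_b:
        return "a happened-before b"
    if b_le_a:
        return "b happened-before a"
    return "concurrent"  # neither dominates: a true write conflict

print(compare({"n1": 2, "n2": 1}, {"n1": 3, "n2": 1}))  # a happened-before b
print(compare({"n1": 2}, {"n2": 1}))                    # concurrent
```

The "concurrent" case is exactly the write conflict the abstract refers to: neither clock dominates the other, so the two updates must be reconciled rather than overwritten.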