Abstract:
A collection of miscellaneous pamphlets on religion.
Abstract:
A variation of low-density parity-check (LDPC) error-correcting codes defined over Galois fields GF(q) is investigated using statistical physics. A code of this type is characterised by a sparse random parity-check matrix composed of C non-zero elements per column. We examine the dependence of the code performance on the value of q, for finite and infinite C values, both in terms of the thermodynamical transition point and of the practical decoding phase characterised by the existence of a unique (ferromagnetic) solution. We find a different q-dependence in the cases C = 2 and C ≥ 3; the analytical solutions are in agreement with simulation results, providing a quantitative measure of the improvement in performance obtained using non-binary alphabets.
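The matrix structure described above can be illustrated with a short sketch (an assumption for illustration, not the paper's actual code ensemble). It builds a sparse random parity-check matrix with exactly C non-zero entries per column, representing the non-zero GF(q) elements by the integer labels 1..q-1; real encoding and decoding would require true finite-field arithmetic, which is not shown here. All names and parameters are illustrative.

```python
import random

# Illustrative sketch: a sparse random parity-check matrix for a GF(q)
# LDPC code, with exactly C non-zero entries per column. Non-zero field
# elements are represented by integer labels 1..q-1 only.
def sparse_parity_check(n, m, C, q, seed=0):
    """Return an m x n matrix over GF(q) with C non-zeros per column."""
    rng = random.Random(seed)
    H = [[0] * n for _ in range(m)]
    for col in range(n):
        for row in rng.sample(range(m), C):    # C distinct row positions
            H[row][col] = rng.randrange(1, q)  # a non-zero field element
    return H

H = sparse_parity_check(n=12, m=6, C=3, q=4)
```

Each column of `H` then carries exactly C non-zero entries, matching the column-weight parameter the abstract refers to.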
Abstract:
Purpose – To investigate the impact of performance measurement on the strategic planning process. Design/methodology/approach – A large-scale online survey was conducted among Warwick Business School alumni. The questionnaire was based on the Strategic Development Process model by Dyson and was designed to map the current practice of strategic planning and to determine the factors most influencing the effectiveness of the process. All questions were closed-ended and used a seven-point Likert scale. The independent variables were grouped into four meaningful factors by factor analysis (varimax, coefficient of rotation 0.4). The resulting factors were used to build stepwise regression models for the five assessments of the strategic planning process. Regression models were developed for the totality of the responses, comparing SMEs and large organizations, and comparing organizations operating in slowly and rapidly changing environments. Findings – The results indicate that performance measurement stands as one of the four main factors characterising the current practice of strategic planning. This research determined that complexity arising from organizational size and from the rate of change in the sector creates variation in the impact of performance measurement on strategic planning: large organizations and organizations operating in rapidly changing environments make greater use of performance measurement. Research limitations/implications – This research is based on subjective data; the conclusions therefore concern not the impact of the elements of the strategic planning process on organizational performance achievements, but the success/effectiveness of the strategic planning process itself. Practical implications – This research raises a series of questions about the use and potential impact of performance measurement, especially in the categories of organizations that are not significantly influenced by its utilisation. It contributes to the field of performance measurement impact. Originality/value – This research fills a gap in the literature concerning the lack of large-scale surveys on strategic development processes and performance measurement. It also contributes to the literature of this field by providing empirical evidence on the impact of performance measurement upon the strategic planning process.
An improved conflicting evidence combination approach based on a new supporting probability distance
Abstract:
To avoid counter-intuitive result of classical Dempster's combination rule when dealing with highly conflict information, many improved combination methods have been developed through modifying the basic probability assignments (BPAs) of bodies of evidence (BOEs) by using a certain measure of the degree of conflict or uncertain information, such as Jousselme's distance, the pignistic probability distance and the ambiguity measure. However, if BOEs contain some non-singleton elements and the differences among their BPAs are larger than 0.5, the current conflict measure methods have limitations in describing the interrelationship among the conflict BOEs and may even lead to wrong combination results. In order to solve this problem, a new distance function, which is called supporting probability distance, is proposed to characterize the differences among BOEs. With the new distance, the information of how much a focal element is supported by the other focal elements in BOEs can be given. Also, a new combination rule based on the supporting probability distance is proposed for the combination of the conflicting evidences. The credibility and the discounting factor of each BOE are generated by the supporting probability distance and the weighted BOEs are combined directly using Dempster's rules. Analytical results of numerical examples show that the new distance has a better capability of describing the interrelationships among BOEs, especially for the highly conflicting BOEs containing non-singleton elements and the proposed new combination method has better applicability and effectiveness compared with the existing methods.
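The counter-intuitive behaviour that motivates this line of work can be reproduced with a minimal sketch of the classical Dempster rule (the proposed supporting probability distance itself is not implemented here). Focal elements are modelled as frozensets, and all names are illustrative assumptions:

```python
# Minimal sketch of Dempster's combination rule for two BPAs whose focal
# elements are frozensets. Returns the combined BPA and the conflict mass.
def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    k = 1.0 - conflict  # normalisation constant; vanishes as conflict -> 1
    return {A: v / k for A, v in combined.items()}, conflict

# Zadeh's classic highly conflicting example: both sources nearly rule
# out hypothesis C, yet the combination assigns C full belief.
m1 = {frozenset('A'): 0.99, frozenset('C'): 0.01}
m2 = {frozenset('B'): 0.99, frozenset('C'): 0.01}
result, conflict = dempster_combine(m1, m2)
```

Here the conflict mass is 0.9999, so the normalisation constant is 0.0001 and the hypothesis both sources nearly ruled out receives the entire belief — exactly the pathology that distance-based discounting methods such as the one above set out to repair.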
Abstract:
2010 Mathematics Subject Classification: 62H10.
Abstract:
The great success of cloud computing, and the speed with which cloud innovations have since found their way into practice, open up new competitive opportunities for industry. Of particular importance is the ability to adapt and execute cloud-based business processes dynamically, as a direct response to a customer order. This applies especially to cooperative, cross-company applications composed of several IT services from different partners. This article presents a concept and an architecture for a central cloud platform for the configuration, execution and monitoring of collaborative logistics processes. On this platform, business processes can be modelled and parametrised with respect to their privacy properties. The individual process elements are linked to IT services that are executed, for example, on external cloud platforms. A focus of the publication is the creation, implementation and monitoring of privacy requirements.
Abstract:
This thesis investigates the recent complex spatial changes in Namibia and Tanzania and local communities' capacity to cope with, adapt to and transform the unpredictability associated with these processes. I scrutinise the concept of resilience and its potential application to explaining the development of local communities in Southern Africa as they face various social, economic and environmental changes. My research is based on three distinct but overlapping research questions: what are the main spatial changes and their impact on the study areas in Namibia and Tanzania? What are the adaptation, transformation and resilience processes of the studied local communities? How are innovation systems developed, and what is their impact on the resilience of these communities? I use four ethnographic case studies concerning environmental change, global tourism and innovation system development in Namibia and Tanzania, together with mixed-methodological approaches, to study these issues. The results of my empirical investigation demonstrate that the spatial changes in the localities within Namibia and Tanzania are unique, loose assemblages, the result of the complex, multisided, relational and evolutionary development of human and non-human elements that do not necessarily have linear causalities. Several changes co-exist and are interconnected, though uncertain and unstructured, and, together with the multiple stressors related to poverty, have made communities more vulnerable to change. The communities' adaptation and transformation measures have been mostly reactive, based on contingency and post hoc learning. Despite the various anticipation techniques, coping measures, adaptive learning and self-organisation processes occurring in the localities, the local communities are constrained by their uneven power relationships within the larger assemblages.
Thus, communities' own opportunities to increase their resilience are limited without changing the relations within these multiform entities. Larger cooperation models are therefore needed, such as an innovation system based on the interactions of different actors, which requires collaboration among and input from a diverse set of stakeholders to combine different sources of knowledge, innovation and learning. Accordingly, both Namibia and Tanzania are developing an innovation system as a key policy to foster transformation towards knowledge-based societies. Finally, the development of an innovation system needs novel bottom-up approaches to increase the resilience of local communities and to become embedded in them. To this end, innovation policies in Namibia have emphasised the role of indigenous knowledge, and Tanzania has established the Living Lab network.
Abstract:
This article interprets Latin American literary texts (1950-1970) from an ecocritical perspective, drawing on the systemic outlook of thinkers of the emergent paradigm, which offers a renewed reading of literary discourse through the lens of deep ecology. This literature explores its own genesis by bringing the sacred elements of culture into dialogue with everything else; in this way, the Indigenous, the African and the European are expressed through the characters' voices with a neo-paradigmatic outlook that transgresses the traditional vision of the paradigm of modernity as regards the reading of Latin American identity.
Abstract:
Barium stars are optimal sites for studying the correlations between the neutron-capture elements and other species that may be depleted or enhanced, because they act as neutron seeds or poisons during the operation of the s-process. These data are necessary to help constrain the modelling of the neutron-capture paths and to explain the s-process abundance curve of the solar system. Chemical abundances for a large number of barium stars with different degrees of s-process excess, masses, metallicities, and evolutionary states are a crucial step towards this goal. We present abundances of Mn, Cu, Zn, and various light and heavy elements for a sample of barium and normal giant stars, and present correlations between abundances contributed to different degrees by the weak-s, main-s, and r-processes of neutron capture, and between Fe-peak elements and heavy elements. Data from the literature are also considered in order to better study the abundance pattern of peculiar stars. The stellar spectra were observed with FEROS/ESO. The stellar atmospheric parameters of the eight barium giant stars and six normal giants that we analysed lie in the range 4300 < T(eff)/K < 5300, -0.7 < [Fe/H] <= 0.12 and 1.5 <= log g < 2.9. Carbon and nitrogen abundances were derived by spectral synthesis of the molecular bands of C(2), CH, and CN. For all other elements we used the atomic lines to perform the spectral synthesis. A very large scatter was found, mainly for the Mn abundances, when data from the literature were considered. We found that [Zn/Fe] correlates well with the heavy-element excesses, its abundance clearly increasing as the heavy-element excesses increase, a trend not shown by the [Cu/Fe] and [Mn/Fe] ratios. Also, the ratios involving Mn, Cu, and Zn and heavy elements usually show an increasing trend towards higher metallicities.
Our results suggest that a larger fraction of the Zn synthesis than of the Cu synthesis is due to massive stars, and that the contribution of the main-s process to the synthesis of both elements is small. We also conclude that Mn is mostly synthesized by SN Ia, and that a non-negligible fraction of the synthesis of Mn, Cu, and Zn is due to the weak s-process.
Abstract:
This work presents a non-linear boundary element formulation applied to the analysis of contact problems. The boundary element method (BEM) is known as a robust and accurate numerical technique for handling this type of problem, because the contact among the solids occurs along their boundaries. The proposed non-linear formulation is based on the use of singular or hyper-singular integral equations by the BEM for multi-region contact. When the contact occurs between crack surfaces, the formulation adopted is the dual version of the BEM, in which singular and hyper-singular integral equations are defined along the opposite sides of the contact boundaries. The non-linear structural behaviour on the contact is considered using Coulomb's friction law. The non-linear formulation is based on the tangent operator, in which the derivative of the set of algebraic equations is used to construct the corrections for the non-linear process. This implicit formulation has proven as accurate as the classical approach, but computes the solution faster. Examples of simple and multi-region contact problems are shown to illustrate the applicability of the proposed scheme. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
This work deals with the analysis of cracked structures using the BEM. Two formulations to analyse the crack growth process in quasi-brittle materials are discussed. They are based on the dual formulation of the BEM, in which two different integral equations are employed along the opposite sides of the crack surface. The first formulation uses the concept of a constant operator, in which the corrections of the non-linear process are made only by applying appropriate tractions along the crack surfaces. The second BEM formulation for analysing crack growth problems is an implicit technique based on the use of a consistent tangent operator. This formulation is accurate, stable, and always requires far fewer iterations to reach equilibrium within a given load increment compared with the classical approach. Comparative examples of classical crack growth problems are shown to illustrate the performance of the two formulations. (C) 2009 Elsevier Ltd. All rights reserved.
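The contrast between the constant-operator and consistent tangent-operator iterations can be sketched on a scalar residual equation; this is a stand-in assumption for the BEM algebraic system, not the paper's formulation, and all names are illustrative:

```python
# Toy contrast between a constant-operator (modified Newton) iteration
# and a consistent tangent-operator (full Newton) iteration on a scalar
# residual r(u) = 0. The residual stands in for the BEM system.
def solve(r, dr, u0, tangent=True, tol=1e-10, max_it=100):
    u = u0
    k0 = dr(u0)                # operator assembled once (constant case)
    for it in range(1, max_it + 1):
        res = r(u)
        if abs(res) < tol:
            return u, it
        k = dr(u) if tangent else k0  # tangent operator updates each step
        u -= res / k
    return u, max_it

r  = lambda u: u ** 3 - 2.0    # residual with root 2**(1/3)
dr = lambda u: 3 * u ** 2
u_t, it_t = solve(r, dr, 1.0, tangent=True)
u_c, it_c = solve(r, dr, 1.0, tangent=False)
```

On this toy residual, the full Newton (tangent) variant reaches the tolerance in markedly fewer iterations than the constant-operator variant, mirroring the iteration-count behaviour reported above.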
Abstract:
Screening of topologies developed by hierarchical heuristic procedures can be carried out by comparing their optimal performance. In this work we explore mono-objective process optimization using two algorithms, simulated annealing and tabu search, and four different objective functions: two of the net present value type, one of them including environmental costs, and two of the global potential impact type. The hydrodealkylation of toluene to produce benzene was used as the case study, considering five topologies of different complexity, obtained mainly by including or excluding liquid recycling and heat integration. The performance of the algorithms together with the objective functions was observed, analysed and discussed from various perspectives: the average deviation of the results for each algorithm, the capacity for producing a high-purity product, the screening of topologies, the robustness of the objective functions in screening topologies, the trade-offs between economic and environmental objective functions, and the variability of the optimum solutions.
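A generic mono-objective simulated annealing loop of the kind used for such screening might look like the following sketch. The objective here is a toy quadratic standing in for a net-present-value-type function, and every name and parameter (T0, cooling, steps) is an illustrative assumption rather than the paper's actual settings:

```python
import math
import random

# Illustrative simulated annealing loop for mono-objective minimisation.
# Accepts downhill moves always and uphill moves with Boltzmann
# probability; the temperature decays geometrically each step.
def anneal(objective, x0, neighbour, T0=1.0, cooling=0.95, steps=2000, seed=1):
    rng = random.Random(seed)
    x, fx, T = x0, objective(x0), T0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = objective(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / max(T, 1e-12)):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling
    return best, fbest

# Toy quadratic objective standing in for a net-present-value-type cost,
# with minimum at (2, -1); the neighbour move perturbs each coordinate.
obj = lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2
step = lambda v, rng: (v[0] + rng.uniform(-0.5, 0.5),
                       v[1] + rng.uniform(-0.5, 0.5))
best, fbest = anneal(obj, (0.0, 0.0), step)
```

Tabu search would replace the probabilistic acceptance with a memory of recently visited moves; the work above compares both algorithms across the four objective functions.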
Abstract:
In this paper we review the different relativistic and QED contributions to energies, ionic radii, transition probabilities and Landé g-factors in super-heavy elements, with the help of the MultiConfiguration Dirac-Fock (MCDF) method. The effects of taking the Breit interaction into account to all orders, by including it in the self-consistent field process, are demonstrated. State-of-the-art radiative corrections are included in the calculation and discussed. We also study the non-relativistic limit of the MCDF calculations and find that the non-relativistic offset can be unexpectedly large.
Abstract:
This paper addresses the on-site assessment, by semi-destructive testing (SDT) methods, of the consolidation efficiency of a conservation process developed by Henriques (2011) for structural and non-structural pine wood elements in service. The study was applied to Scots pine wood (Pinus sylvestris L.) degraded by fungi, after treatment with a biocidal product followed by consolidation with a polymeric product. This solution avoids the substitution of wood moderately degraded by fungi, improving its physical and mechanical characteristics. The consolidation efficiency was assessed on site by drill resistance and penetration resistance methods. The SDT methods used showed good sensitivity to the conservation process and could evaluate its effectiveness. (C) 2015 Elsevier Ltd. All rights reserved.
Abstract:
Dissertation for the degree of Master in Mechanical Engineering