911 results for New Strategic Theory
Abstract:
Natural hazards trigger disasters, the scale of which is largely determined by vulnerability. Developing countries suffer the most from disasters because of the various conditions of vulnerability that exist there, and the aftermath of a disaster presents an opportunity to take mitigative action. NGOs implementing post-disaster rehabilitation projects must be able to address the issues that cause communities to live at risk of disaster, and must therefore build dynamic capacity, capabilities and competencies that enable them to operate in unstable environments. This research is built upon a theoretical framework of dynamic competency established by combining elements of disaster management, strategic management and project management theory. A number of NGOs that have implemented reconstruction and rehabilitation projects in Sri Lanka following the Asian tsunami and in Bangladesh following Cyclone Sidr are being investigated in depth using a causal mapping procedure. ‘Event’-specific maps have been developed for each organization in each disaster. These data will be analysed with a view to discovering the strategies that lead to vulnerability reduction in post-disaster communities and the competencies that NGOs must possess in order to achieve favourable outcomes. It is hypothesized that by building organizational capacity, capabilities and competencies that are dynamic in nature, while focusing on a more emergent strategic approach with an emphasis on adaptive capability and innovation, NGOs will be better equipped to contribute to sustainable community development through reconstruction. We believe that through this study it will be possible to glean a new understanding of the social processes that emerge within community rehabilitation projects.
Abstract:
The paper considers how planning as a political activity is underpinned by concepts of justice and how professional practitioners are consistently faced with making ethical choices in the public interest. The key objective is therefore to identify the centrality of ethics in praxis. In this context, political liberal theory is empirically useful in exploring both the role of participants and the processes employed in strategic planning. A case study analysis generates key issues which are relevant to planning in the wider arena and an extensive series of interviews provides interesting insights into the dynamic between those involved and the effectiveness of procedures followed.
Abstract:
An investigation of the long controversy around the definition of an Italian New Wave cinema of the 1960s, this essay engages (and takes issue) with the reasons behind the critics’ reluctance to recognise its existence. After establishing a theoretical and historical framework for a transnational understanding of the phenomenon of the European and World New Waves, it offers a reasoned analysis of the multiple industrial and artistic attempts at a generational renewal of Italian cinema that were made in Italy during the 1960s. Ultimately, the essay suggests that it would not only be appropriate, but also highly productive to reconsider the vibrant and heterogeneous young Italian cinema of the 1960s under the generational and transnational New Wave label, instead of continuing to approach the decade exclusively in the light of Neorealism.
Abstract:
Belief revision characterizes the process of revising an agent’s beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics; however, so far there is not much literature on this topic in evidence theory. In contrast, the combination rules proposed so far in the theory of evidence, especially Dempster’s rule, are symmetric. They rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it, and a symmetric combination operation is still used. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information; the change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together the probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between the prior and input information. It reduces to Dempster’s rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey’s rule of updating, Dempster’s rule of conditioning and a form of AGM revision.
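To make the contrast with symmetric combination concrete, here is a minimal sketch of Dempster's rule of combination for two mass functions over a small frame of discernment; the frame, the mass assignments and the function name are illustrative and are not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule, normalising away the conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence: combination undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Illustrative frame {x, y, z} with two hypothetical sources of evidence.
m1 = {frozenset("x"): 0.6, frozenset("xy"): 0.4}
m2 = {frozenset("y"): 0.5, frozenset("xyz"): 0.5}
print(dempster_combine(m1, m2))
```

Note how the two inputs play exactly the same role: swapping m1 and m2 gives the same result, which is the symmetry the revision operator discussed in the abstract deliberately gives up.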
Abstract:
The integration of detailed information on feeding interactions with measures of abundance and body mass of individuals provides a powerful platform for understanding ecosystem organisation. Metabolism and, by proxy, body mass constrain the flux, turnover and storage of energy and biomass in food webs. Here, we present the first food web data for Lough Hyne, a species-rich Irish sea lough. Through the application of individual- and size-based analysis of the abundance-body mass relationship, we tested predictions derived from the metabolic theory of ecology. We found that individual body mass constrained the flux of biomass and determined its distribution within the food web. Body mass was also an important determinant of diet width and niche overlap, and predator diets were nested hierarchically, such that diet width increased with body mass. We applied a novel measure of predator-prey biomass flux, which revealed that most interactions in Lough Hyne were weak, whereas only a few were strong. Further, the patterning of interaction strengths between prey sharing a common predator revealed that strong interactions were nearly always coupled with weak ones. Our findings illustrate that important insights into the organisation, structure and stability of ecosystems can be achieved through the theoretical exploration of detailed empirical data.
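As an illustration of the kind of size-based analysis the abstract refers to (not the authors' own code or data), the sketch below fits the slope of the abundance-body mass relationship on log-log axes, which metabolic theory predicts to be close to -3/4 for organisms sharing a common energy source.

```python
import numpy as np

def abundance_mass_slope(body_mass, abundance):
    """Ordinary least-squares slope of log10(abundance) vs log10(body mass)."""
    slope, intercept = np.polyfit(np.log10(body_mass), np.log10(abundance), 1)
    return slope, intercept

# Hypothetical size-binned data (body mass in g, abundance per m^2);
# these numbers are made up and are not from Lough Hyne.
mass = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
abund = np.array([3200.0, 560.0, 110.0, 18.0, 3.0])
slope, _ = abundance_mass_slope(mass, abund)
print(f"fitted slope = {slope:.2f} (metabolic theory predicts roughly -0.75)")
```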
Abstract:
Microkinetics based on density functional theory (DFT) calculations is used to investigate the reaction mechanism of crotonaldehyde hydrogenation on Pt(111) in the free-energy landscape. The dominant reaction channel for each hydrogenation product is identified; each begins with the first surface hydrogenation of the carbonyl oxygen of crotonaldehyde. A new mechanism, a 1,4-addition pathway generating enols (butenol) that readily tautomerize to the saturated aldehyde (butanal), is identified as the primary route to saturated aldehydes, rather than 3,4-addition via direct hydrogenation of the ethylenic bond. The calculations also show that the full hydrogenation product, butyl alcohol (butanol), mainly stems from the deep hydrogenation of surface open-shell dihydrogenation intermediates. The apparent barriers of the dominant pathways to the three final products are found to be similar on Pt(111), which makes it difficult to achieve high selectivity towards the desired crotyl alcohol (COL).
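As background only, transition-state theory relates an apparent free-energy barrier to a rate constant via the Eyring equation, which is why similar barriers imply similar rates and hence poor selectivity. The barrier values and function name below are hypothetical and do not come from the paper.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(delta_g_kjmol, temperature=298.15):
    """Eyring rate constant k = (kB*T/h) * exp(-dG/(R*T)) for a barrier in kJ/mol."""
    return (KB * temperature / H) * math.exp(-delta_g_kjmol * 1e3 / (R * temperature))

# Two hypothetical apparent barriers (kJ/mol) for competing channels:
for dg in (70.0, 75.0):
    print(f"dG = {dg:5.1f} kJ/mol -> k = {eyring_rate(dg):.3e} s^-1")
```

With barriers this close, the computed rates differ by less than an order of magnitude, mirroring the selectivity problem described in the abstract.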
Abstract:
Automated examination timetabling has been addressed by a wide variety of methodologies and techniques over the last ten years or so. Many of the methods in this broad range of approaches have been evaluated on a collection of benchmark instances provided at the University of Toronto in 1996. Whilst the existence of these datasets has provided an invaluable resource for research into examination timetabling, the instances have significant limitations in terms of their relevance to real-world examination timetabling in modern universities. This paper presents a detailed model which draws upon experiences of implementing examination timetabling systems in universities in Europe, Australasia and America. This model represents the problem that was presented in the 2nd International Timetabling Competition (ITC2007). In presenting this detailed new model, this paper describes the examination timetabling track introduced as part of the competition. In addition to the model, the datasets used in the competition are also based on current real-world instances introduced by EventMAP Limited. It is hoped that the interest generated as part of the competition will lead to the development, investigation and application of a host of novel and exciting techniques to address this important real-world search domain. Moreover, the motivating goal of this paper is to close the currently existing gap between theory and practice in examination timetabling by presenting the research community with a rigorous model which represents the complexity of the real-world situation. In this paper we describe the model and its motivations, followed by a full formal definition.
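To give a flavour of the kind of constraint such a model formalizes, the sketch below (an assumption-laden illustration, not the ITC2007 formulation itself) counts violations of one typical hard constraint: no student may sit two exams in the same period.

```python
from collections import defaultdict

def student_clashes(assignment, enrolments):
    """Count pairs of exams that share a student and are placed in the same period.

    assignment: dict exam -> period index
    enrolments: dict student -> set of exams taken by that student
    """
    clashes = 0
    for exams in enrolments.values():
        by_period = defaultdict(list)
        for exam in exams:
            by_period[assignment[exam]].append(exam)
        for placed in by_period.values():
            clashes += len(placed) * (len(placed) - 1) // 2
    return clashes

# Toy instance (illustrative only).
assignment = {"maths": 0, "physics": 0, "history": 1}
enrolments = {"s1": {"maths", "physics"}, "s2": {"maths", "history"}}
print(student_clashes(assignment, enrolments))  # -> 1 (s1 clashes in period 0)
```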
Absorbing new knowledge in small and medium-sized enterprises: A multiple case analysis of Six Sigma
Abstract:
The primary aim of this article is to critically analyse the development of Six Sigma theory and practice within small and medium-sized enterprises (SMEs) using a multiple case study approach. The article also explores the subsequent development of Lean Six Sigma as a means of addressing the perceived limitations of the efficacy of Six Sigma in this context. The overarching theoretical framework is that of absorptive capacity, where Six Sigma is conceptualized as new knowledge to be absorbed by smaller firms. The findings from a multiple case study involving repeat interviews and focus groups informed the development of an analytical model demonstrating the dynamic routines underlying the absorptive capacity process, and of a number of summative propositions relating the characteristics of SMEs to Six Sigma and Lean Six Sigma implementation.
Abstract:
Most learning methods reported to date for Takagi-Sugeno-Kang fuzzy neural models focus on improving their accuracy. However, a key design requirement in building an interpretable fuzzy model is that each obtained rule consequent must match the system's local behaviour well when all the rules are aggregated to produce the overall system output; this is one of the characteristics that distinguishes such models from black-box models such as neural networks. How to find a desirable set of fuzzy partitions and, hence, identify the corresponding consequent models that can be directly explained in terms of system behaviour therefore presents a critical step in fuzzy neural modelling. In this paper, a new learning approach is proposed that considers both the nonlinear parameters in the rule premises and the linear parameters in the rule consequents. Unlike the conventional two-stage optimization procedure widely practised in the field, where the two sets of parameters are optimized separately, the consequent parameters are transformed into a set dependent on the premise parameters, thereby enabling the introduction of a new integrated gradient-descent learning approach. A new Jacobian matrix is proposed and efficiently computed to achieve a more accurate approximation of the cost function using the second-order Levenberg-Marquardt optimization method. Several other interpretability issues concerning the fuzzy neural model are also discussed and integrated into this new learning approach. Numerical examples are presented to illustrate the resultant structure of the fuzzy neural models and the effectiveness of the proposed algorithm, and the results are compared with those of some well-known methods.
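For readers unfamiliar with the model class, the following sketch shows how first-order Takagi-Sugeno-Kang rule consequents are aggregated into the overall output, which is the quantity whose interpretability the paper discusses; Gaussian membership functions and all parameter values are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def tsk_output(x, centres, widths, consequents):
    """First-order TSK model with Gaussian premises.

    x:           input vector, shape (d,)
    centres:     rule premise centres, shape (r, d)
    widths:      rule premise widths, shape (r, d)
    consequents: linear consequent parameters [a_0, a_1..a_d] per rule, shape (r, d+1)
    """
    # Rule firing strengths: product of per-dimension Gaussian memberships.
    w = np.exp(-0.5 * np.sum(((x - centres) / widths) ** 2, axis=1))
    w = w / w.sum()                                         # normalised firing strengths
    y_rules = consequents[:, 0] + consequents[:, 1:] @ x    # local linear models
    return float(w @ y_rules)                               # weighted aggregation

# Two-rule, one-input toy model (all numbers hypothetical).
centres = np.array([[0.0], [1.0]])
widths = np.array([[0.5], [0.5]])
consequents = np.array([[0.0, 1.0], [2.0, -1.0]])  # y = 0 + 1*x and y = 2 - 1*x
print(tsk_output(np.array([0.5]), centres, widths, consequents))
```

Interpretability hinges on each local linear model in `consequents` describing the system's behaviour near its rule centre, rather than merely contributing to an accurate weighted sum.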
Abstract:
This paper introduces the discrete choice modelling paradigm of Random Regret Minimization (RRM) to the field of environmental and resource economics. The RRM approach has recently been developed in the context of travel demand modelling and presents a tractable, regret-based alternative to the dominant choice-modelling paradigm based on Random Utility Maximization (RUM) theory. We highlight how RRM-based models provide closed-form, logit-type formulations for choice probabilities that capture semi-compensatory behaviour and choice-set composition effects while being equally parsimonious as their utilitarian counterparts. Using data from a stated choice experiment aimed at identifying valuations of characteristics of nature parks, we compare RRM-based and RUM-based models in terms of parameter estimates, goodness of fit, elasticities and the consequent policy implications.
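The closed-form, logit-type structure mentioned above can be sketched as follows, using the classical RRM regret function; the attribute values and taste parameters are made up for illustration and are not the paper's estimates.

```python
import numpy as np

def rrm_choice_probabilities(X, beta):
    """Random Regret Minimization choice probabilities.

    X:    attribute matrix, shape (n_alternatives, n_attributes)
    beta: taste parameters, shape (n_attributes,)
    Regret of alternative i: sum over competing j != i and attributes m of
                             ln(1 + exp(beta_m * (x_jm - x_im))).
    """
    n = X.shape[0]
    regret = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if j != i:
                regret[i] += np.sum(np.log1p(np.exp(beta * (X[j] - X[i]))))
    expneg = np.exp(-regret)
    return expneg / expneg.sum()   # logit over negative regrets

# Three hypothetical nature-park alternatives described by two attributes
# (e.g. a biodiversity score and travel cost), with illustrative parameters.
X = np.array([[7.0, 20.0], [5.0, 10.0], [9.0, 35.0]])
beta = np.array([0.4, -0.05])
print(rrm_choice_probabilities(X, beta))
```

Because regret depends on pairwise attribute comparisons across the whole choice set, the resulting probabilities exhibit the semi-compensatory and choice-set composition effects the abstract highlights, while retaining a logit-like closed form.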
Abstract:
A new linear equations method for calculating the R-matrix, which arises in the R-matrix-Floquet theory of multiphoton processes, is introduced. This method replaces the diagonalization of the Floquet Hamiltonian matrix by the solution of a set of linear simultaneous equations which are solved, in the present work, by the conjugate gradient method. This approach uses considerably less computer memory and can be readily ported onto parallel computers. It will thus enable much larger problems of current interest to be treated. This new method is tested by applying it to three-photon ionization of helium at frequencies where double resonances with a bound state and autoionizing states are important. Finally, an alternative linear equations method, which avoids the explicit calculation of the R-matrix by incorporating the boundary conditions directly, is described in an appendix.
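At a generic level, the linear-systems step described above can be illustrated with a textbook conjugate gradient solve of A x = b for a symmetric positive-definite matrix; the small matrix below merely stands in for the actual Floquet Hamiltonian system and is purely illustrative.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A by the conjugate gradient method."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small illustrative SPD system (not an actual Floquet Hamiltonian).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x, A @ x - b)  # solution and (near-zero) residual
```

The appeal noted in the abstract is that the method needs only matrix-vector products, so the full matrix never has to be diagonalized or even stored densely, and the products parallelize naturally.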
Abstract:
A generalized linear theory for electromagnetic waves in a homogeneous dusty magnetoplasma is presented. The waves described are characterized by a frequency which is much smaller (larger) than the electron gyrofrequency (dust plasma and dust gyrofrequencies), and a long wavelength (in comparison with the ion gyroradius and the electron skin depth). The generalized Hall-magnetohydrodynamic (GH-MHD) equations are derived by assuming massive charged dust macroparticles to be immobile, and Fourier transformed to obtain a general dispersion relation. The latter is analyzed to understand the influence of immobile charged dust grains on various electromagnetic wave modes in a magnetized dusty plasma.