926 results for Elementary Methods In Number Theory
Abstract:
From its roots in strategic management theory, stakeholder management has been adopted by the construction management academic community and applied as a valid paradigm around which research work has been generated aiming to improve project efficiencies and effectiveness. However, academics have argued that stakeholder management should move away from purely theoretical discussions and engage more with the realities of construction project work. This paper re-appraises the stakeholder management concept for the construction domain by re-thinking some of the fundamental principles and ideals present within the more general stakeholder theory literature. It engages with issues which researchers have arguably failed to acknowledge and calls for a re-evaluation of construction stakeholder management research by presenting a review around four distinctive themes: the moral obligations of engaging with stakeholders against the business and efficiency driven imperatives of construction organisations; the contrast between theoretical abstractions and empirically grounded research; the tension between theoretical convergence and calls for multiple, divergent perspectives on stakeholder management; and the practicalities of conducting stakeholder management in the construction domain. Such a critical re-appraisal of stakeholder management thinking both generates new lines of enquiry and promises to help inform and shape current and future industry practice.
Abstract:
This article compares the results obtained from using two different methodological approaches to elicit teachers’ views on their professional role, the key challenges and their aspirations for the future. One approach used a postal/online questionnaire, while the other used telephone interviews, posing a selection of the same questions. The research was carried out on two statistically comparable samples of teachers in England in spring 2004. Significant differences in responses were observed which seem to be attributable to the methods employed. In particular, more ‘definite’ responses were obtained in the interviews than in response to the questionnaire. This article reviews the comparative outcomes in the context of existing research and explores why the separate methods may have produced significantly different responses to the same questions.
Abstract:
Data assimilation aims to incorporate measured observations into a dynamical system model in order to produce accurate estimates of all the current (and future) state variables of the system. The optimal estimates minimize a variational principle and can be found using adjoint methods. The model equations are treated as strong constraints on the problem. In reality, the model does not represent the system behaviour exactly and errors arise due to lack of resolution and inaccuracies in physical parameters, boundary conditions and forcing terms. A technique for estimating systematic and time-correlated errors as part of the variational assimilation procedure is described here. The modified method determines a correction term that compensates for model error and leads to improved predictions of the system states. The technique is illustrated in two test cases. Applications to the 1-D nonlinear shallow water equations demonstrate the effectiveness of the new procedure.
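A minimal sketch of the idea, assuming a toy scalar model and synthetic observations rather than the 1-D nonlinear shallow water system used in the paper: the variational cost function is augmented with a constant model-error term that is estimated alongside the initial state, so the analysis compensates for systematic model error. The model, noise levels and observation times below are illustrative assumptions.

```python
# Sketch of variational assimilation with a systematic model-error (bias)
# correction term, on a toy scalar model x_{k+1} = a*x_k + eta.
# Everything here (model, a, noise levels, observation times) is an
# illustrative assumption, not the paper's shallow-water setup.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
a, nsteps = 0.95, 40
true_bias = 0.3                      # systematic model error we try to recover
x_true = np.zeros(nsteps + 1)
x_true[0] = 1.0
for k in range(nsteps):
    x_true[k + 1] = a * x_true[k] + true_bias

obs_idx = np.arange(5, nsteps + 1, 5)
sigma_o = 0.1
y = x_true[obs_idx] + sigma_o * rng.normal(size=obs_idx.size)

def forward(x0, eta):
    """Run the (imperfect) model with a constant error correction eta."""
    x = np.empty(nsteps + 1)
    x[0] = x0
    for k in range(nsteps):
        x[k + 1] = a * x[k] + eta
    return x

def cost(z):
    """Augmented cost: background + observation misfit + penalty on eta."""
    x0, eta = z
    x = forward(x0, eta)
    jb = (x0 - 1.0) ** 2 / 0.5 ** 2                 # background term
    jo = np.sum((x[obs_idx] - y) ** 2) / sigma_o ** 2
    jq = eta ** 2 / 1.0 ** 2                        # weak-constraint penalty on model error
    return jb + jo + jq

res = minimize(cost, x0=np.array([0.0, 0.0]))
print("estimated initial state and model-error term:", res.x)
```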
Abstract:
The Fourier series can be used to describe periodic phenomena such as the one-dimensional crystal wave function. Through the trigonometric treatments in Hückel theory it is shown that Hückel theory is a special case of Fourier series theory; the conjugated π system is in fact a periodic system. This explains why a theory as simple as Hückel theory can be so powerful in organic chemistry: although it only considers the immediate neighboring interactions, it implicitly takes account of the periodicity of the complete picture in which all the interactions are considered. Furthermore, the success of the trigonometric methods in Hückel theory is not accidental, as it is based on the fact that Hückel theory is a specific example of the more general method of Fourier series expansion. It is also important for educational purposes to expand a specific approach such as Hückel theory into a more general method such as Fourier series expansion.
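The trigonometric result can be checked directly: for a cyclic conjugated system the nearest-neighbor Hückel matrix is circulant, so its eigenvectors are discrete Fourier modes and its orbital energies follow E_k = α + 2β cos(2πk/N). A minimal numerical sketch for a six-membered ring (benzene), with α and β set to illustrative values:

```python
# Sketch of the Hückel/Fourier connection for a cyclic conjugated system
# (here N = 6, i.e. benzene). The ring's nearest-neighbor Hückel matrix is
# circulant, so its eigenvalues follow E_k = alpha + 2*beta*cos(2*pi*k/N).
# alpha and beta are set to illustrative numerical values only.
import numpy as np

N, alpha, beta = 6, 0.0, -1.0        # benzene ring, energies in units of |beta|

# Hückel matrix: alpha on the diagonal, beta between neighboring atoms,
# including the bond that closes the ring.
H = alpha * np.eye(N)
for i in range(N):
    H[i, (i + 1) % N] = beta
    H[(i + 1) % N, i] = beta

numeric = np.sort(np.linalg.eigvalsh(H))
fourier = np.sort(alpha + 2 * beta * np.cos(2 * np.pi * np.arange(N) / N))

print(numeric)
print(fourier)
print("match:", np.allclose(numeric, fourier))
```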
Abstract:
This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.
Abstract:
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three classes, which are not mutually exclusive, consisting of best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
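As a rough illustration of projection-based dimension reduction (not the article's own examples), the sketch below uses ridge regression on a pilot set of simulations to map a redundant vector of raw summaries to a single estimated parameter, which then serves as the low-dimensional summary in rejection ABC. The toy model and all tuning constants are assumptions made for illustration.

```python
# Sketch of projection-based dimension reduction for ABC: ridge regression on
# a pilot run learns a mapping from raw summaries to the parameter, and the
# projected value is used as the summary in rejection ABC.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n = 50
theta_true = 2.0
data = rng.normal(theta_true, 1.0, size=n)

def raw_summaries(x):
    """A deliberately redundant, higher-dimensional summary vector."""
    return np.array([x.mean(), np.median(x), *np.quantile(x, [0.1, 0.25, 0.75, 0.9])])

# Pilot run: learn the projection s -> E[theta | s] with ridge regression.
m_pilot = 2000
theta_pilot = rng.uniform(-5, 5, size=m_pilot)
S_pilot = np.array([raw_summaries(rng.normal(t, 1.0, size=n)) for t in theta_pilot])
proj = Ridge(alpha=1.0).fit(S_pilot, theta_pilot)

# Rejection ABC on the projected (one-dimensional) summary.
s_obs = proj.predict(raw_summaries(data).reshape(1, -1))[0]
m = 20000
theta_prop = rng.uniform(-5, 5, size=m)
s_prop = proj.predict(
    np.array([raw_summaries(rng.normal(t, 1.0, size=n)) for t in theta_prop])
)
eps = np.quantile(np.abs(s_prop - s_obs), 0.01)   # keep the closest 1%
posterior = theta_prop[np.abs(s_prop - s_obs) <= eps]
print("ABC posterior mean:", posterior.mean(), "(true value 2.0)")
```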
Abstract:
Recent studies showed that features extracted from brain MRIs can discriminate well between Alzheimer’s disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods for finding the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem with the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) the IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machine-based wrapper improves binary classification accuracy.
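A minimal sketch of such a filter-plus-wrapper pipeline using scikit-learn, on synthetic data rather than the MRI-derived ROI features of the study: a Random Forest importance-based pre-selection (filter) followed by an SVM-driven sequential forward selection (wrapper). Feature counts, thresholds and model settings are illustrative assumptions.

```python
# Sketch of a Random Forest importance filter followed by an SVM-based
# wrapper (sequential forward selection). Synthetic data stands in for the
# study's MRI features; all counts and settings are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel, SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=50, n_informative=8,
                           random_state=0)

# Filter step: rank features by Random Forest importance and keep the top 20.
rf_filter = SelectFromModel(
    RandomForestClassifier(n_estimators=300, random_state=0),
    threshold=-np.inf, max_features=20,
).fit(X, y)
X_filtered = rf_filter.transform(X)

# Wrapper step: forward selection driven by cross-validated SVM accuracy.
svm = SVC(kernel="linear", C=1.0)
wrapper = SequentialFeatureSelector(svm, n_features_to_select=8, cv=5).fit(X_filtered, y)
X_selected = wrapper.transform(X_filtered)

acc = cross_val_score(svm, X_selected, y, cv=5).mean()
print("features kept:", X_selected.shape[1], "cv accuracy:", round(acc, 3))
```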
Abstract:
In the early 1920s, before Virginia Woolf wrote her now well-known essays “The New Biography” and “The Art of Biography,” the Hogarth Press published four biographies of Tolstoy. Each of these English translations of Russian works takes a different approach to biographical composition, and as a group they offer multiple and contradictory perspectives on Tolstoy’s character and on the genre of biography in the early twentieth century. These works show that Leonard and Virginia Woolf’s Hogarth Press took a multi-perspectival, modernist approach to publishing literary lives.
Abstract:
This book advances a fresh philosophical account of the relationship between the legislature and courts, opposing the common conception of law in which it is legislatures that primarily create the law and courts that primarily apply it. This conception has eclectic affinities with legal positivism, and although it may have been a helpful intellectual tool in the past, it now increasingly generates more problems than it solves. For this reason, the author argues, legal philosophers are better off abandoning it. At the same time, they are asked to dismantle the philosophical and doctrinal infrastructure that has been built on it and has hitherto gone largely unquestioned. In its place the book offers an alternative framework for understanding the role of courts and the legislature: a framework which is distinctly anti-positivist and which builds on Ronald Dworkin’s interpretive theory of law. But, contrary to Dworkin, it insists that legal duty is sensitive to the position one occupies in the project of governing; legal interpretation is not the solitary task of one super-judge but a collaborative task structured by principles of institutional morality, such as the separation of powers, which impose a moral duty on participants to respect each other's contributions. Moreover, this collaborative task will often involve citizens taking an active role in their interaction with the law.
Abstract:
The weak-constraint inverse for nonlinear dynamical models is discussed and derived in terms of a probabilistic formulation. The well-known result that, for Gaussian error statistics, the minimum of the weak-constraint inverse is equal to the maximum-likelihood estimate is rederived. Then several methods based on ensemble statistics that can be used to find the smoother (as opposed to the filter) solution are introduced and compared to traditional methods. A strong point of the new methods is that they avoid the integration of adjoint equations, which is a complex task for real oceanographic or atmospheric applications. They also avoid iterative searches in a Hilbert space, and error estimates can be obtained without much additional computational effort. The feasibility of the new methods is illustrated in a two-layer quasigeostrophic model.
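A minimal sketch of an ensemble smoother in this spirit, which updates an entire model trajectory from observations using only ensemble statistics and therefore requires no adjoint integrations. The scalar toy model, noise levels and observation times are illustrative assumptions; the paper itself works with a two-layer quasigeostrophic model.

```python
# Sketch of an ensemble smoother: a single global analysis of the full
# trajectory from sample covariances, with no adjoint equations involved.
# The toy model and all settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
nsteps, nens = 60, 200
sigma_q, sigma_o = 0.05, 0.1          # model-error and observation std devs
obs_times = [20, 40, 60]

def run_model(x0, noise):
    """Integrate the toy nonlinear model x_{k+1} = x_k + 0.1*sin(x_k) + q_k."""
    x = np.empty(nsteps + 1)
    x[0] = x0
    for k in range(nsteps):
        x[k + 1] = x[k] + 0.1 * np.sin(x[k]) + noise[k]
    return x

# Synthetic truth and observations.
x_truth = run_model(1.0, sigma_q * rng.normal(size=nsteps))
y_obs = x_truth[obs_times] + sigma_o * rng.normal(size=len(obs_times))

# Forecast ensemble of full trajectories (augmented state dimension = nsteps + 1).
X = np.array([run_model(1.0 + 0.5 * rng.normal(),
                        sigma_q * rng.normal(size=nsteps)) for _ in range(nens)])

# Ensemble smoother update: one global analysis using sample covariances.
Y = X[:, obs_times]                                              # predicted observations
D = y_obs + sigma_o * rng.normal(size=(nens, len(obs_times)))    # perturbed observations
Xp = X - X.mean(axis=0)
Yp = Y - Y.mean(axis=0)
C_xy = Xp.T @ Yp / (nens - 1)
C_yy = Yp.T @ Yp / (nens - 1) + sigma_o ** 2 * np.eye(len(obs_times))
Xa = X + (D - Y) @ np.linalg.solve(C_yy, C_xy.T)

print("prior RMSE:   ", np.sqrt(((X.mean(0) - x_truth) ** 2).mean()))
print("smoother RMSE:", np.sqrt(((Xa.mean(0) - x_truth) ** 2).mean()))
```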