43 results for GNSS, Ambiguity resolution, Regularization, Ill-posed problem, Success probability
Abstract:
The literature on ambiguity reflects contradictory views on its value as a resource or a problem for organizational action. In this longitudinal empirical study of ambiguity about a strategic goal, we examined how strategic ambiguity is used as a discursive resource by different organizational constituents and how that is associated with collective action around the strategic goal. We found four rhetorical positions, each of which drew upon strategic ambiguity to construct the strategic goal differently according to whether the various constituents were asserting their own interests or accommodating wider organizational interests. However, we also found that the different constituents maintained these four rhetorical positions simultaneously over time, enabling them to shift between their own and others' interests rather than converging upon a common interest. These findings are used to develop a conceptual framework that explains how strategic ambiguity might serve as a resource for different organizational constituents to assert their own interests whilst also enabling collective organizational action, at least of a temporary nature.
Abstract:
Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the unrotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
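As an illustration of the rotation-based null test described above, the following sketch compares the mutual information between an anatomical grey-matter map and a beamformer power map against a null distribution built from rotated copies of the anatomy. The array names, the histogram-based mutual information estimator and the restriction to in-plane rotations via scipy.ndimage.rotate are assumptions for illustration, not the authors' implementation.

import numpy as np
from scipy.ndimage import rotate

def mutual_information(x, y, bins=32):
    # Histogram-based mutual information (in nats) between two flattened images.
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def rotation_null_test(grey_matter, beamformer_power, n_rotations=200, seed=0):
    # Observed MI versus a null distribution of MI values from rotated anatomy;
    # returns the observed value and a one-sided permutation-style p-value.
    rng = np.random.default_rng(seed)
    observed = mutual_information(grey_matter, beamformer_power)
    null = []
    for _ in range(n_rotations):
        angle = rng.uniform(0.0, 360.0)
        rotated = rotate(grey_matter, angle, axes=(0, 1), reshape=False, order=1)
        null.append(mutual_information(rotated, beamformer_power))
    null = np.asarray(null)
    p_value = (1 + np.sum(null >= observed)) / (1 + n_rotations)
    return observed, p_value

# Tiny synthetic demo (real use: co-registered 3D anatomical and beamformer volumes).
gm = np.random.default_rng(1).random((32, 32, 16))
bf = gm + 0.5 * np.random.default_rng(2).random((32, 32, 16))
print(rotation_null_test(gm, bf, n_rotations=50))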
Abstract:
In the quest to secure the much vaunted benefits of North Sea oil, highly non-incremental technologies have been adopted. Nowhere is this more the case than with the early fields of the central and northern North Sea. By focusing on the inflexible nature of North Sea hardware in such fields, this thesis examines the problems that this sort of technology might pose for policy making. More particularly, the following issues are raised. First, the implications of non-incremental technical change for the successful conduct of oil policy are examined. Here, the focus is on the micro-economic performance of the first generation of North Sea oil fields and the manner in which this relates to government policy. Secondly, the question is posed as to whether there were more flexible, perhaps more incremental, policy alternatives open to the decision makers. Conclusions drawn relate to the degree to which non-incremental shifts in policy permit decision makers to achieve their objectives at relatively low cost. To discover cases where non-incremental policy making has led to success in this way would be to falsify the thesis that decision makers are best served by employing incremental politics as an approach to complex problem solving.
Abstract:
We perform numerical simulations on a model describing a Brillouin-based temperature and strain sensor, testing its response when it is probed with relatively short pulses. Recently published experimental results [Opt. Lett. 24, 510 (1999)] showed a broadening of the Brillouin loss curve when the probe pulse duration is reduced, followed by a sudden and rather surprising reduction of the linewidth when the pulse duration gets shorter than the acoustic relaxation time. Our study reveals the processes responsible for this behavior. We give a clear physical insight into the problem, allowing us to define the best experimental conditions required to take advantage of this effect.
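As a hedged illustration of the conventional expectation only, the sketch below models the measured Brillouin loss spectrum as the intrinsic Lorentzian convolved with the power spectrum of a rectangular probe pulse, which broadens as the pulse shortens; this toy picture deliberately does not capture the re-narrowing for pulses shorter than the acoustic relaxation time that the simulations in the paper explain. The intrinsic linewidth, pulse durations and rectangular pulse shape are assumptions.

import numpy as np

def measured_loss_spectrum(detuning_hz, pulse_duration_s, intrinsic_fwhm_hz=35e6):
    # Intrinsic Lorentzian Brillouin loss convolved with a rectangular pulse spectrum.
    gamma = intrinsic_fwhm_hz / 2.0                               # half width at half maximum
    lorentzian = gamma**2 / (detuning_hz**2 + gamma**2)           # intrinsic loss shape
    pulse_spectrum = np.sinc(detuning_hz * pulse_duration_s) ** 2 # sinc^2 spectrum of a square pulse
    pulse_spectrum /= pulse_spectrum.sum()
    measured = np.convolve(lorentzian, pulse_spectrum, mode="same")
    return measured / measured.max()

f = np.linspace(-200e6, 200e6, 2001)          # detuning grid in Hz
for T in (100e-9, 20e-9, 5e-9):               # probe pulse durations
    spectrum = measured_loss_spectrum(f, T)
    fwhm = np.ptp(f[spectrum >= 0.5])         # apparent full width at half maximum
    print(f"pulse {T * 1e9:.0f} ns -> apparent FWHM {fwhm / 1e6:.1f} MHz")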
Abstract:
Since the transfer of a message between two cultures very frequently takes place through the medium of a written text qua communicative event, it would seem useful to attempt to ascertain whether there is any kind of pattern in the use of strategies for the effective interlingual transfer of this message. Awareness of potentially successful strategies, within the constraints of context, text type, intended TL function and TL reader profile, will enhance quality and cost-effectiveness (time, effort, financial costs) in the production of the target text. Through contrastive analysis of pairs of advertising texts, SL and TL, French and English, this study will attempt to identify the nature of some recurring choices made by different translators in the attempt to recreate ST information in the TL in such a manner as to reproduce as closely as possible the informative, persuasive and affective functions of the text as advertising material. Whilst recurrence may be seen to be significant in terms of illustrating tendencies with regard to the solution of problems of translation, this would not necessarily be taken as confirmation of the existence of pre-determined or prescriptive rules. These tendencies could, however, be taken as a guide to potential solutions to certain kinds of context-bound and text-type-specific problems. Analysis of translated text-pairs taken from the field of advertising should produce examples of constraints posed by the need to select the content, tone and form of the Target Text, in order to ensure maximum efficacy of persuasive effect and to ensure the desired outcome, as determined by the Source Text function. When evaluating the success of a translated advertising text, constraints could be defined in terms of the culture-specific references or assumptions on which a Source Text may build in order to achieve its intended communicative function within the target community.
Abstract:
This thesis is concerned with Organisational Problem Solving. The work reflects the complexities of organisational problem situations and the eclectic approach that has been necessary to gain an understanding of the processes involved. The thesis is structured into three main parts. Part I describes the author's understanding of problems and suitable approaches. Chapter 2 identifies the Transcendental Realist (TR) view of science (Harre 1970, Bhaskar 1975) as the best general framework for identifying suitable approaches to complex organisational problems. Chapter 3 discusses the relationship between Checkland's methodology (1972) and TR. The need to generate iconic (explanatory) models of the problem situation is identified and the ability of viable system modelling to supplement the modelling stage of the methodology is explored in Chapter 4. Chapter 5 builds further on the methodology to produce an original iconic model of the methodological process. The model characterises the mechanisms of organisational problem situations as well as desirable procedural steps. The Weltanschauungen (W's) or "world views" of key actors are recognised as central to the mechanisms involved. Part II describes the experience which prompted the theoretical investigation. Chapter 6 describes the first year of the project. The success of this stage is attributed to the predominance of a single W. Chapter 7 describes the changes in the organisation which made the remaining phase of the project difficult. These difficulties are attributed to a failure to recognise the importance of differing W's. Part III revisits the theoretical and organisational issues. Chapter 8 identifies a range of techniques embodying W's which are compatible with the framework of Part I and which might usefully supplement it. Chapter 9 characterises possible W's in the sponsoring organisation. Throughout the work, an attempt is made to reflect the process as well as the product of the author's learning.
Abstract:
The prominent position given to academic writing across contemporary academia is reflected in the substantive literature and debate devoted to the subject over the past 30 years. However, the massification of higher education, manifested by a shift from elite to mass education, has brought the issue into the public arena, with much debate focusing on the need for 'modern-day' students to be taught how to write academically (Bjork et al., 2003; Ganobcsik-Williams, 2006). Indeed, Russell (2003) argued that academic writing has become a global 'problem' in Higher Education because it sits between two contradictory pressures (p. V). At one end of the university 'experience', increasing numbers of students, many from non-traditional backgrounds, enter higher education bringing with them a range of communication abilities. At the other end, many graduates leave university to work in specialised industries where employers expect them to have high-level writing skills (Ashton, 2007; Russell, 2003; Torrence et al., 1999). By drawing attention to the issues around peer mentoring within an academic writing setting in three different higher education institutions, this paper makes an important contribution to current debates. Based upon a critical analysis of the emergent findings of an empirical study into the role of peer writing mentors in promoting student transition to higher education, the paper adopts an academic literacies approach to discuss the role of writing mentoring in promoting transition and retention by developing students' academic writing. Attention is drawn to the manner in which student expectations of writing mentoring actually align with mentoring practices, particularly in terms of the writing process and critical thinking. Other issues such as the approachability of writing mentors, the practicalities of accessing writing mentoring and the wider learning environment are also discussed.
Abstract:
We investigate two numerical procedures for the Cauchy problem in linear elasticity, involving the relaxation of either the given boundary displacements (Dirichlet data) or the prescribed boundary tractions (Neumann data) on the over-specified boundary, in the alternating iterative algorithm of Kozlov et al. (1991). The two mixed direct (well-posed) problems associated with each iteration are solved using the method of fundamental solutions (MFS), in conjunction with the Tikhonov regularization method, while the optimal value of the regularization parameter is chosen via the generalized cross-validation (GCV) criterion. An efficient regularizing stopping criterion, which ceases the iterative procedure at the point where the accumulation of noise becomes dominant and the errors in predicting the exact solutions increase, is also presented. The MFS-based iterative algorithms with relaxation are tested for Cauchy problems for isotropic linear elastic materials in various geometries to confirm the numerical convergence, stability, accuracy and computational efficiency of the proposed method.
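As a minimal sketch (not the authors' code) of one building block named above, the following routine applies Tikhonov regularization to a dense linear system, such as an MFS collocation system, with the regularization parameter selected by minimizing the generalized cross-validation (GCV) function; the matrix A, right-hand side b and the synthetic demo system are placeholders.

import numpy as np

def tikhonov_gcv(A, b, n_lambdas=200):
    # Solve min ||A x - b||^2 + lam^2 ||x||^2, choosing lam by minimizing the
    # GCV function G(lam) = ||A x_lam - b||^2 / (m - trace of the influence matrix)^2.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    resid_perp = np.linalg.norm(b)**2 - np.linalg.norm(beta)**2   # part of b outside range(A)
    s_min = max(s.min(), 1e-14 * s.max())
    lambdas = np.logspace(np.log10(s_min) - 2, np.log10(s.max()), n_lambdas)
    m = A.shape[0]
    best_gcv, best_lam = np.inf, lambdas[0]
    for lam in lambdas:
        filt = s**2 / (s**2 + lam**2)                   # Tikhonov filter factors
        residual = np.linalg.norm((1 - filt) * beta)**2 + resid_perp
        gcv = residual / (m - filt.sum())**2            # effective degrees of freedom in the denominator
        if gcv < best_gcv:
            best_gcv, best_lam = gcv, lam
    x = Vt.T @ ((s / (s**2 + best_lam**2)) * beta)      # regularized solution at the chosen lam
    return x, best_lam

# Tiny demo on a synthetic ill-conditioned system with noisy data.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 40)) @ np.diag(np.logspace(0, -8, 40))
x_true = rng.standard_normal(40)
b = A @ x_true + 1e-6 * rng.standard_normal(60)
x, lam = tikhonov_gcv(A, b)
print("chosen lambda:", lam, " relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))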
Abstract:
We investigate the problem of determining the stationary temperature field on an inclusion from given Cauchy data on an accessible exterior boundary. On this accessible part the temperature (or the heat flux) is known, and, additionally, on a portion of this exterior boundary the heat flux (or temperature) is also given. We propose a direct boundary integral approach in combination with Tikhonov regularization for the stable determination of the temperature and flux on the inclusion. To determine these quantities on the inclusion, boundary integral equations are derived using Green's functions, and properties of these equations are shown in an L2-setting. An effective way of discretizing these boundary integral equations, based on the Nyström method and trigonometric approximations, is outlined. Numerical examples are included, both with exact and noisy data, showing that accurate approximations can be obtained with small computational effort, and that the accuracy increases with the length of the portion of the boundary where the additional data are given.
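A minimal sketch, under simplifying assumptions, of the kind of discretization named above: a two-dimensional single-layer operator mapping a density on the inclusion boundary to values on the accessible exterior boundary, discretized with the periodic trapezoidal (Nyström) rule, which applies directly because the two curves are disjoint and the kernel is smooth, and inverted with Tikhonov regularization. The concentric circular geometry, radii, noise level and regularization parameter are illustrative, not the paper's setup.

import numpy as np

n = 64                                                  # quadrature points per curve
t = 2 * np.pi * np.arange(n) / n                        # equispaced parameterization
outer = 2.0 * np.stack([np.cos(t), np.sin(t)], axis=1)  # accessible exterior boundary
inner = 0.7 * np.stack([np.cos(t), np.sin(t)], axis=1)  # inclusion boundary

# Single-layer kernel K(x, y) = -log|x - y| / (2*pi) with the density on the
# inclusion and observation points on the exterior boundary (smooth kernel).
diff = outer[:, None, :] - inner[None, :, :]
K = -np.log(np.linalg.norm(diff, axis=2)) / (2 * np.pi)
A = K * (0.7 * 2 * np.pi / n)        # arc-length factor |y'(t)| = 0.7 times the quadrature weight

# Synthetic smooth density on the inclusion, noisy "measurements" on the exterior.
density_true = np.cos(2 * t)
data = A @ density_true
data_noisy = data + 1e-3 * np.random.default_rng(0).standard_normal(n)

# Tikhonov-regularized normal equations for the severely ill-posed first-kind system.
lam = 1e-2
density_rec = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ data_noisy)
print("relative error:", np.linalg.norm(density_rec - density_true) / np.linalg.norm(density_true))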
Abstract:
An iterative method for reconstruction of solutions to second order elliptic equations from Cauchy data given on a part of the boundary is presented. At each iteration step, a series of mixed well-posed boundary value problems are solved for the elliptic operator and its adjoint. The convergence proof of this method in a weighted L2 space is included.
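To make the structure of such an alternating iteration concrete, the sketch below reduces it to a single Fourier mode of the Laplace equation on a strip, with Cauchy data prescribed on the bottom edge and the traces on the top edge unknown; each mixed well-posed problem is then solvable in closed form for the mode. The geometry, mode number and exact solution are illustrative assumptions, not the paper's setting.

import numpy as np

k = 2.0                                                   # Fourier mode number
u_exact_top, du_exact_top = np.cosh(k), k * np.sinh(k)    # exact u and u_y at y = 1
f, g = 1.0, 0.0                                           # Cauchy data at y = 0: u = f, u_y = g

def solve_dirichlet_bottom_neumann_top(f, eta):
    # Mixed problem: u(0) = f, u_y(1) = eta; return the Dirichlet trace u(1).
    a = f
    b = (eta / k - a * np.sinh(k)) / np.cosh(k)
    return a * np.cosh(k) + b * np.sinh(k)

def solve_neumann_bottom_dirichlet_top(g, phi):
    # Mixed problem: u_y(0) = g, u(1) = phi; return the Neumann trace u_y(1).
    b = g / k
    a = (phi - b * np.sinh(k)) / np.cosh(k)
    return k * (a * np.sinh(k) + b * np.cosh(k))

eta = 0.0                                                 # initial guess for u_y on the unknown edge
for it in range(30):
    phi = solve_dirichlet_bottom_neumann_top(f, eta)      # update the Dirichlet trace
    eta = solve_neumann_bottom_dirichlet_top(g, phi)      # update the Neumann trace
    if it % 10 == 9:
        print(it + 1, abs(phi - u_exact_top), abs(eta - du_exact_top))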
Abstract:
An iterative method for reconstruction of the solution to a parabolic initial boundary value problem of second order from Cauchy data is presented. The data are given on a part of the boundary. At each iteration step, a series of well-posed mixed boundary value problems are solved for the parabolic operator and its adjoint. The convergence proof of this method in a weighted L2-space is included.
Abstract:
A numerical method based on integral equations is proposed and investigated for the Cauchy problem for the Laplace equation in 3-dimensional smooth bounded doubly connected domains. To numerically reconstruct a harmonic function from knowledge of the function and its normal derivative on the outer of two closed boundary surfaces, the harmonic function is represented as a single-layer potential. Matching this representation against the given data, a system of boundary integral equations is obtained, which is to be solved for two unknown densities. This system is rewritten over the unit sphere under the assumption that each of the two boundary surfaces can be mapped smoothly and one-to-one to the unit sphere. For the discretization of this system, Wienert's method (PhD thesis, Göttingen, 1990) is employed, which generates a Galerkin-type procedure for the numerical solution, and the densities in the system of integral equations are expressed in terms of spherical harmonics. Tikhonov regularization is incorporated, and numerical results are included showing the efficiency of the proposed procedure.
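As a minimal sketch of one ingredient named above, the code below expands a density defined on the unit sphere in spherical harmonics using a Gauss-Legendre (polar) times trapezoidal (azimuthal) quadrature, the kind of expansion used in a Galerkin-type discretization over the sphere; the test density and truncation degree are illustrative, and this is not Wienert's full procedure.

import numpy as np
from scipy.special import sph_harm

L = 8                                              # truncation degree of the expansion
n_theta, n_phi = L + 1, 2 * L + 2                  # enough points to integrate products of harmonics up to degree L exactly
x_gl, w_gl = np.polynomial.legendre.leggauss(n_theta)
theta = np.arccos(x_gl)                            # polar angles from Gauss-Legendre nodes in cos(theta)
phi = 2 * np.pi * np.arange(n_phi) / n_phi         # equispaced azimuthal angles
T, P = np.meshgrid(theta, phi, indexing="ij")
W = np.outer(w_gl, np.full(n_phi, 2 * np.pi / n_phi))   # surface quadrature weights on the unit sphere

density = np.real(sph_harm(1, 3, P, T)) + 0.5      # smooth test density (scipy: sph_harm(m, l, azimuth, polar))

coeffs = {}
for l in range(L + 1):
    for m in range(-l, l + 1):
        Ylm = sph_harm(m, l, P, T)
        coeffs[(l, m)] = np.sum(W * density * np.conj(Ylm))   # projection onto Y_l^m

# Reconstruct the density from the truncated expansion and report the error.
recon = sum(c * sph_harm(m, l, P, T) for (l, m), c in coeffs.items())
print("max reconstruction error:", np.max(np.abs(recon.real - density)))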