911 results for GNSS, Ambiguity resolution, Regularization, Ill-posed problem, Success probability


Relevance: 30.00%

Publisher:

Abstract:

High-impact, localized intense rainfall episodes represent a major socio-economic problem for societies worldwide, and at the same time these events are notoriously difficult to simulate properly in climate models. Here, the authors investigate how horizontal resolution and model formulation influence this issue by applying the HARMONIE regional climate model (HCLIM) with three different setups: two using convection parameterization at 15 and 6.25 km horizontal resolution (the latter within the “grey-zone” scale), with lateral boundary conditions provided by ERA-Interim reanalysis and integrated over a pan-European domain, and one with explicit convection at 2 km resolution (HCLIM2) over the Alpine region driven by the 15 km model. Seven summer seasons were sampled and validated against two high-resolution observational data sets. All HCLIM versions underestimate the number of dry days and hours by 20-40% and overestimate precipitation over the Alpine ridge. Also, only modest added value was found at “grey-zone” resolution. However, the single most important outcome is the substantial added value in HCLIM2 compared to the coarser model versions at sub-daily time scales. It better captures the local-to-regional spatial patterns of precipitation, reflecting a more realistic representation of the local and meso-scale dynamics. Further, the duration and spatial frequency of precipitation events, as well as extremes, are closer to observations. These characteristics are key ingredients in heavy rainfall events and associated flash floods, and the outstanding results using HCLIM in a convection-permitting setting are convincing and encourage further use of the model to study changes in such events in a changing climate.
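The 20-40% underestimate of dry days and hours mentioned above is a simple frequency statistic. As a minimal sketch, assuming a wet/dry threshold of 1 mm per time step (the abstract does not state the threshold actually used), the comparison could be computed from model and observed precipitation series as follows; all data below are placeholders.

```python
import numpy as np

def dry_fraction(precip, threshold=1.0):
    """Fraction of time steps with precipitation below `threshold`.

    `precip` is accumulated precipitation per step (mm/day for daily,
    mm/h for hourly data). The 1 mm threshold is an assumption.
    """
    return np.mean(np.asarray(precip, dtype=float) < threshold)

# Placeholder series standing in for seven summer seasons of daily data.
rng = np.random.default_rng(0)
obs_daily = rng.gamma(shape=0.3, scale=8.0, size=7 * 92)
model_daily = rng.gamma(shape=0.4, scale=6.0, size=7 * 92)

# A negative relative bias means the model simulates too few dry days.
bias = (dry_fraction(model_daily) - dry_fraction(obs_daily)) / dry_fraction(obs_daily)
print(f"relative bias in dry-day frequency: {bias:+.1%}")
```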

Relevance: 30.00%

Publisher:

Abstract:

Use of nonlinear parameter estimation techniques is now commonplace in ground water model calibration. However, there is still ample room for further development of these techniques in order to enable them to extract more information from calibration datasets, to more thoroughly explore the uncertainty associated with model predictions, and to make them easier to implement in various modeling contexts. This paper describes the use of pilot points as a methodology for spatial hydraulic property characterization. When used in conjunction with nonlinear parameter estimation software that incorporates advanced regularization functionality (such as PEST), use of pilot points can add a great deal of flexibility to the calibration process at the same time as it makes this process easier to implement. Pilot points can be used either as a substitute for zones of piecewise parameter uniformity, or in conjunction with such zones. In either case, they allow the disposition of areas of high and low hydraulic property value to be inferred through the calibration process, without the need for the modeler to guess the geometry of such areas prior to estimating the parameters that pertain to them. Pilot points and regularization can also be used as an adjunct to geostatistically based stochastic parameterization methods. Using the techniques described herein, a series of hydraulic property fields can be generated, all of which recognize the stochastic characterization of an area at the same time that they satisfy the constraints imposed on hydraulic property values by the need to ensure that model outputs match field measurements. Model predictions can then be made using all of these fields as a mechanism for exploring predictive uncertainty.
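Pilot points hold parameter values at scattered locations that are then spread onto the model grid by spatial interpolation; PEST-style workflows normally use kriging for that step. The sketch below uses inverse-distance weighting purely as a compact stand-in for the interpolation, so the function name and the IDW choice are illustrative rather than PEST's own implementation.

```python
import numpy as np

def interpolate_from_pilot_points(cell_xy, pilot_xy, pilot_values, power=2.0):
    """Spread pilot-point parameter values onto model cells by inverse-distance
    weighting. PEST-style workflows normally use kriging here; IDW just keeps
    the sketch short."""
    cell_xy = np.asarray(cell_xy, dtype=float)        # (n_cells, 2)
    pilot_xy = np.asarray(pilot_xy, dtype=float)      # (n_points, 2)
    pilot_values = np.asarray(pilot_values, dtype=float)
    d = np.linalg.norm(cell_xy[:, None, :] - pilot_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power           # clamp to avoid division by zero
    return (w @ pilot_values) / w.sum(axis=1)

# Hypothetical use: log-hydraulic-conductivity at four pilot points
# interpolated onto the centres of a small 5 x 5 grid of model cells.
cells = [(x, y) for x in range(5) for y in range(5)]
logK = interpolate_from_pilot_points(cells, [(0, 0), (4, 0), (0, 4), (4, 4)],
                                     [-3.0, -2.0, -2.5, -1.0])
print(logK.reshape(5, 5).round(2))
```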

Relevance: 30.00%

Publisher:

Abstract:

The Bush administration's continuing emphasis on US military deterrence of the PRC on behalf of Taiwan threatens to undermine the posture of 'strategic ambiguity' that the United States has proclaimed since 1979. This article argues for the retention of 'strategic ambiguity' and traces the origins of revisionist sentiment towards this effective conflict avoidance mechanism to reactions within the US foreign policy community to the 1995-96 Taiwan Strait crisis. Case studies of this crisis and its predecessors in 1954-55 and 1958 demonstrate that US military deterrence was not a decisive factor in their resolution. US and PRC initiatives and responses in the 1950s crises introduced the essential elements of 'strategic ambiguity' into the triangular relationship between themselves and Taiwan. In particular, they established a precedent for the United States and the PRC in circumscribing the issue of Taiwan so as to achieve a political accommodation.

Relevance: 30.00%

Publisher:

Abstract:

Objective: To describe a series of patients with clinically significant lead poisoning. Methodology: A case series of nine patients with lead poisoning who required inpatient management, identified through a Clinical Toxicology Service. Results: Nine children presented with clinically significant lead poisoning. The median serum lead was 2.5 μmol/L (range 1.38-4.83). Eight of the children were exposed to lead-based paint, with seven due to dust from sanded lead paint during house renovations. Serial blood determinations suggested re-exposure in four of the patients, and in one of these patients the re-exposure was from a different source of lead. Eight of the patients required chelation therapy. Conclusions: Serious lead poisoning continues to occur and there appears to be complacency regarding the hazard posed by lead paint in old houses.
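For readers more used to mass units, the SI concentrations above can be converted with the molar mass of lead (about 207.2 g/mol), so 1 μmol/L is roughly 20.7 μg/dL. A minimal sketch of the conversion:

```python
PB_G_PER_MOL = 207.2  # molar mass of lead

def umol_per_l_to_ug_per_dl(c_umol_per_l):
    """Convert a blood lead concentration from umol/L to ug/dL."""
    ug_per_l = c_umol_per_l * PB_G_PER_MOL  # 1 umol of Pb weighs ~207.2 ug
    return ug_per_l / 10.0                  # 1 L = 10 dL

print(umol_per_l_to_ug_per_dl(2.5))  # median value above: about 52 ug/dL
```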

Relevance: 30.00%

Publisher:

Abstract:

A calibration methodology based on an efficient and stable mathematical regularization scheme is described. This scheme is a variant of so-called Tikhonov regularization in which the parameter estimation process is formulated as a constrained minimization problem. Use of the methodology eliminates the need for a modeler to formulate a parsimonious inverse problem in which a handful of parameters are designated for estimation prior to initiating the calibration process. Instead, the level of parameter parsimony required to achieve a stable solution to the inverse problem is determined by the inversion algorithm itself. Where parameters, or combinations of parameters, cannot be uniquely estimated, they are provided with values, or assigned relationships with other parameters, that are decreed to be realistic by the modeler. Conversely, where the information content of a calibration dataset is sufficient to allow estimates to be made of the values of many parameters, the making of such estimates is not precluded by preemptive parsimonizing ahead of the calibration process. While Tikhonov schemes are very attractive and hence widely used, problems with numerical stability can sometimes arise because the strength with which regularization constraints are applied throughout the regularized inversion process cannot be guaranteed to exactly complement inadequacies in the information content of a given calibration dataset. A new technique overcomes this problem by allowing relative regularization weights to be estimated as parameters through the calibration process itself. The technique is applied to the simultaneous calibration of five subwatershed models, and it is demonstrated that the new scheme results in a more efficient inversion and better enforcement of regularization constraints than traditional Tikhonov regularization methodologies. Moreover, it is argued that a joint calibration exercise of this type results in a more meaningful set of parameters than can be achieved by individual subwatershed model calibration.
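Written out, the Tikhonov objective combines the measurement misfit with a weighted penalty pulling parameters toward values the modeler regards as realistic, and the scheme described here adjusts that weight during the inversion rather than fixing it beforehand. The following is a minimal, generic sketch of one linearized solve with a fixed weight mu; the notation is illustrative and not taken from the paper.

```python
import numpy as np

def tikhonov_step(J, residual, p_current, p_pref, mu):
    """One Gauss-Newton step of
        min ||residual - J*dp||^2 + mu*||(p_current + dp) - p_pref||^2

    J        : Jacobian of model outputs with respect to parameters
    residual : observed minus simulated values at p_current
    p_pref   : preferred ("realistic") parameter values supplied by the modeler
    mu       : regularization weight; the scheme described above treats this
               as something to be adjusted during the inversion itself
    """
    n_par = J.shape[1]
    lhs = J.T @ J + mu * np.eye(n_par)
    rhs = J.T @ residual + mu * (p_pref - p_current)
    return p_current + np.linalg.solve(lhs, rhs)
```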

Relevance: 30.00%

Publisher:

Abstract:

Radio-frequency (RF) coils are designed such that they induce homogeneous magnetic fields within some region of interest inside a magnetic resonance imaging (MRI) scanner. Loading the scanner with a patient disrupts the homogeneity of these fields and can lead to a considerable degradation of the quality of the acquired image. In this paper, an inverse method is presented for designing RF coils, in which the presence of a load (patient) within the MRI scanner is accounted for in the model. To approximate the finite length of the coil, a Fourier series expansion is considered for the coil current density and for the induced fields. Regularization is used to solve this ill-conditioned inverse problem for the unknown Fourier coefficients. That is, the error between the induced and homogeneous target fields is minimized along with an additional constraint, chosen in this paper to represent the curvature of the coil windings. Smooth winding patterns are obtained for both unloaded and loaded coils. RF fields with a high level of homogeneity are obtained in the unloaded case, and a limit to the level of homogeneity attainable is observed in the loaded case.
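The design problem described above reduces to a regularized least-squares fit: minimize the mismatch between induced and target fields plus a curvature penalty, solved for the Fourier coefficients of the current density. Below is a minimal sketch of that trade-off; the field matrix A and the curvature operator L are placeholders standing in for the model in the paper, not its actual discretization.

```python
import numpy as np

def regularized_coil_coefficients(A, b_target, L, lam):
    """Solve min ||A c - b_target||^2 + lam * ||L c||^2 for the Fourier
    coefficients c of the coil current density.

    A        : maps coefficients to the field sampled over the region of interest
    b_target : homogeneous target field at the sample points
    L        : discrete curvature penalty on the winding pattern
    lam      : regularization parameter trading homogeneity against smoothness
    """
    lhs = A.T @ A + lam * (L.T @ L)
    return np.linalg.solve(lhs, A.T @ b_target)

def second_difference_operator(n):
    """Simple 1-D second-difference matrix used here as a stand-in curvature penalty."""
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = (1.0, -2.0, 1.0)
    return L
```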

Relevance: 30.00%

Publisher:

Abstract:

Visualisation of multiple isoforms of kappa-casein on 2-D gels is restricted by the abundant alpha- and beta-caseins that not only limit gel loading but also migrate to similar regions as the more acidic kappa-casein isoforms. To overcome this problem, we took advantage of the absence of cysteine residues in alpha(S1)- and beta-casein by devising an affinity enrichment procedure based on reversible biotinylation of cysteine residues. Affinity capture of cysteine-containing proteins on avidin allowed the removal of the vast majority of alpha(S1)- and beta-casein, and on subsequent 2-D gel analysis 16 gel spots were identified as kappa-casein by PMF. Further analysis of the C-terminal tryptic peptide along with structural predictions based on mobility on the 2-D gel allowed us to assign identities to each spot in terms of genetic variant (A or B), phosphorylation status (1, 2 or 3) and glycosylation status (from 0 to 6). Eight isoforms of the A and B variants with the same PTMs were observed. When the casein fraction of milk from a single cow, homozygous for the B variant of kappa-casein, was used as the starting material, 17 isoforms from 13 gel spots were characterised. Analysis of isoforms of low abundance proved challenging due to the low amount of material that could be extracted from the gels as well as the lability of the PTMs during MS analysis. However, we were able to identify a previously unrecognised site, T-166, that could be phosphorylated or glycosylated. Despite many decades of analysis of milk proteins, the reasons for this high level of heterogeneity are still not clear.

Relevance: 30.00%

Publisher:

Abstract:

Government agencies responsible for riparian environments are assessing the combined utility of field survey and remote sensing for mapping and monitoring indicators of riparian zone health. The objective of this work was to determine if the structural attributes of savanna riparian zones in northern Australia can be detected from commercially available remotely sensed image data. Two QuickBird images and coincident field data covering sections of the Daly River and the South Alligator River - Barramundie Creek in the Northern Territory were used. Semi-variograms were calculated to determine the characteristic spatial scales of riparian zone features, both vegetative and landform. Interpretation of semi-variograms showed that structural dimensions of riparian environments could be detected and estimated from the QuickBird image data. The results also show that selecting the correct spatial resolution and spectral bands is essential to maximize the accuracy of mapping spatial characteristics of savanna riparian features. The distribution of foliage projective cover of riparian vegetation affected spectral reflectance variations in individual spectral bands differently. Pan-sharpened image data enabled small-scale information extraction (< 6 m) on riparian zone structural parameters. The semi-variogram analysis results provide the basis for an inversion approach using high spatial resolution satellite image data to map indicators of savanna riparian zone health.
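The characteristic spatial scales referred to above come from where the semi-variogram levels off (its range). As a minimal sketch, assuming a regularly spaced one-dimensional transect of pixel values rather than the full 2-D imagery, the empirical semi-variogram can be computed as follows; the transect data are placeholders.

```python
import numpy as np

def empirical_semivariogram(values, max_lag):
    """gamma(h) = 0.5 * mean((z[i + h] - z[i])^2) for lags h = 1..max_lag
    along a regularly spaced transect of pixel values."""
    z = np.asarray(values, dtype=float)
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, gamma

# Placeholder transect: the lag at which gamma flattens out (the "range")
# indicates the characteristic size of features along the transect.
z = np.sin(np.linspace(0.0, 20.0, 400)) + 0.1 * np.random.default_rng(1).normal(size=400)
lags, gamma = empirical_semivariogram(z, max_lag=50)
```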

Relevance: 30.00%

Publisher:

Abstract:

Transnational Environmental Policy analyses a surprising success story in the field of international environmental policy making: the threat to the ozone layer posed by industrial chemicals, and how it has been averted. The book also raises the more general question about the problem-solving capacities of industrialised countries and the world society as a whole. Reiner Grundmann investigates the regulations which have been put in place at an international level, and how the process evolved over twenty years in the US and Germany.

Relevance: 30.00%

Publisher:

Abstract:

The literature on ambiguity reflects contradictory views on its value as a resource or a problem for organizational action. In this longitudinal empirical study of ambiguity about a strategic goal, we examined how strategic ambiguity is used as a discursive resource by different organizational constituents and how that is associated with collective action around the strategic goal. We found four rhetorical positions, each of which drew upon strategic ambiguity to construct the strategic goal differently according to whether the various constituents were asserting their own interests or accommodating wider organizational interests. However, we also found that the different constituents maintained these four rhetorical positions simultaneously over time, enabling them to shift between their own and others’ interests rather than converging upon a common interest. These findings are used to develop a conceptual framework that explains how strategic ambiguity might serve as a resource for different organizational constituents to assert their own interests whilst also enabling collective organizational action, at least of a temporary nature.

Relevance: 30.00%

Publisher:

Abstract:

Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation, or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the un-rotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
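The significance test described above is, in effect, a permutation test: the mutual information between the beamformer image and the true grey-matter map is compared against a null distribution built from rotated copies of that map. The sketch below uses 2-D arrays and a coarse histogram estimate of mutual information as stand-ins for the full 3-D pipeline; the function names and choice of rotation angles are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def mutual_information(x, y, bins=32):
    """Histogram estimate of the mutual information between two images."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def rotation_null_test(beamformer_img, grey_matter_img, angles):
    """Empirical p-value of the un-rotated comparison against rotated ones."""
    observed = mutual_information(beamformer_img, grey_matter_img)
    null = [mutual_information(beamformer_img,
                               rotate(grey_matter_img, angle, reshape=False))
            for angle in angles]
    p_value = (1 + sum(v >= observed for v in null)) / (1 + len(null))
    return observed, np.array(null), p_value
```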

Relevance: 30.00%

Publisher:

Abstract:

In the quest to secure the much-vaunted benefits of North Sea oil, highly non-incremental technologies have been adopted. Nowhere is this more the case than with the early fields of the central and northern North Sea. By focusing on the inflexible nature of North Sea hardware in such fields, this thesis examines the problems that this sort of technology might pose for policy making. More particularly, the following issues are raised. First, the implications of non-incremental technical change for the successful conduct of oil policy are raised. Here, the focus is on the micro-economic performance of the first generation of North Sea oil fields and the manner in which this relates to government policy. Secondly, the question is posed as to whether there were more flexible, perhaps more incremental, policy alternatives open to the decision makers. Conclusions drawn relate to the degree to which non-incremental shifts in policy permit decision makers to achieve their objectives at relatively low cost. To discover cases where non-incremental policy making has led to success in this way would be to falsify the thesis that decision makers are best served by employing incremental politics as an approach to complex problem solving.

Relevance: 30.00%

Publisher:

Abstract:

We perform numerical simulations on a model describing a Brillouin-based temperature and strain sensor, testing its response when it is probed with relatively short pulses. Experimental results were recently published [e.g., Opt. Lett. 24, 510 (1999)] that showed a broadening of the Brillouin loss curve when the probe pulse duration is reduced, followed by a sudden and rather surprising reduction of the linewidth when the pulse duration gets shorter than the acoustic relaxation time. Our study reveals the processes responsible for this behavior. We give a clear physical insight into the problem, allowing us to define the best experimental conditions required for one to take advantage of this effect.
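One simple way to picture the broadening reported above is that the measured loss spectrum is approximately the intrinsic Brillouin Lorentzian smeared by the spectrum of the probe pulse, which widens as the pulse shortens; this naive picture does not reproduce the narrowing below the acoustic relaxation time that the simulations explain. A rough sketch of the convolution picture only, with purely illustrative linewidth values:

```python
import numpy as np

def measured_loss_spectrum(freq_mhz, lorentzian_fwhm_mhz, pulse_ns):
    """Intrinsic Brillouin Lorentzian convolved with a Gaussian stand-in for
    the probe-pulse spectrum (transform-limited width ~ 0.44 / duration).
    Captures only the broadening for shorter pulses, not the
    sub-relaxation-time narrowing discussed in the abstract."""
    f = np.asarray(freq_mhz, dtype=float)
    lorentzian = 1.0 / (1.0 + (2.0 * f / lorentzian_fwhm_mhz) ** 2)
    pulse_fwhm_mhz = 1e3 * 0.44 / pulse_ns   # Gaussian time-bandwidth product
    sigma = pulse_fwhm_mhz / 2.355
    gaussian = np.exp(-0.5 * (f / sigma) ** 2)
    spectrum = np.convolve(lorentzian, gaussian / gaussian.sum(), mode="same")
    return spectrum / spectrum.max()

freq = np.linspace(-500.0, 500.0, 2001)      # MHz, offset from the Brillouin shift
for tau_ns in (100.0, 20.0, 5.0):            # illustrative pulse durations
    s = measured_loss_spectrum(freq, lorentzian_fwhm_mhz=35.0, pulse_ns=tau_ns)
```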

Relevance: 30.00%

Publisher:

Abstract:

Since the transfer of a message between two cultures very frequently takes place through the medium of a written text qua communicative event, it would seem useful to attempt to ascertain whether there is any kind of pattern in the use of strategies for the effective interlingual transfer of this message. Awareness of potentially successful strategies, within the constraints of context, text type, intended TL function and TL reader profile will enhance quality and cost-effectiveness (time, effort, financial costs) in the production of the target text. Through contrastive analysis of pairs of advertising texts, SL and TL, French and English, this study will attempt to identify the nature of some recurring choices made by different translators in the attempt to recreate ST information in the TL in such a manner as to reproduce as closely as possible the informative, persuasive and affective functions of the text as advertising material. Whilst recurrence may be seen to be significant in terms of illustrating tendencies with regard to the solution of problems of translation, this would not necessarily be taken as confirmation of the existence of pre-determined or prescriptive rules. These tendencies could, however, be taken as a guide to potential solutions to certain kinds of context-bound and text-type specific problem. Analysis of translated text-pairs taken from the field of advertising should produce examples of constraints posed by the need to select the content, tone and form of the Target Text, in order to ensure maximum efficacy of persuasive effect and to ensure the desired outcome, as determined by the Source Text function. When evaluating the success of a translated advertising text, constraints could be defined in terms of the culture-specific references or assumptions on which a Source Text may build in order to achieve its intended communicative function within the target community.

Relevance: 30.00%

Publisher:

Abstract:

This thesis is concerned with Organisational Problem Solving. The work reflects the complexities of organisational problem situations and the eclectic approach that has been necessary to gain an understanding of the processes involved. The thesis is structured into three main parts. Part I describes the author's understanding of problems and suitable approaches. Chapter 2 identifies the Transcendental Realist (TR) view of science (Harre 1970, Bhaskar 1975) as the best general framework for identifying suitable approaches to complex organisational problems. Chapter 3 discusses the relationship between Checkland's methodology (1972) and TR. The need to generate iconic (explanatory) models of the problem situation is identified, and the ability of viable system modelling to supplement the modelling stage of the methodology is explored in Chapter 4. Chapter 5 builds further on the methodology to produce an original iconic model of the methodological process. The model characterises the mechanisms of organisational problem situations as well as desirable procedural steps. The Weltanschauungen (W's) or "world views" of key actors are recognised as central to the mechanisms involved. Part II describes the experience which prompted the theoretical investigation. Chapter 6 describes the first year of the project. The success of this stage is attributed to the predominance of a single W. Chapter 7 describes the changes in the organisation which made the remaining phase of the project difficult. These difficulties are attributed to a failure to recognise the importance of differing W's. Part III revisits the theoretical and organisational issues. Chapter 8 identifies a range of techniques embodying W's which are compatible with the framework of Part I and which might usefully supplement it. Chapter 9 characterises possible W's in the sponsoring organisation. Throughout the work, an attempt is made to reflect the process as well as the product of the author's learning.