83 results for data movement problem


Relevance:

30.00%

Publisher:

Abstract:

In this paper we consider the 2D Dirichlet boundary value problem for Laplace's equation in a non-locally perturbed half-plane, with data in the space of bounded and continuous functions. We show uniqueness of the solution, using standard Phragmén-Lindelöf arguments. The main result is to propose a boundary integral equation formulation, to prove equivalence with the boundary value problem, and to show that the integral equation is well posed by applying a recent partial generalisation of the Fredholm alternative in Arens et al. [J. Int. Equ. Appl. 15 (2003), pp. 1-35]. This then leads to an existence proof for the boundary value problem. Keywords: Boundary integral equation method, Water waves, Laplace's equation.
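
For concreteness, the class of problems described can be stated compactly as follows (a sketch in standard notation; the precise function spaces are assumed, not quoted from the paper):

```latex
% Sketch of the boundary value problem class described above, with D
% the non-locally perturbed half-plane (notation assumed, not quoted).
\Delta u = 0 \ \text{ in } D, \qquad
u = f \ \text{ on } \partial D, \qquad
f \in BC(\partial D), \quad u \ \text{bounded and continuous on } \overline{D}.
```

Boundedness of u is what makes a Phragmén-Lindelöf argument for uniqueness available on the unbounded domain.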

Relevance:

30.00%

Publisher:

Abstract:

The s–x model of microwave emission from soil and vegetation layers is widely used to estimate soil moisture content from passive microwave observations. Its application to prospective satellite-based observations aggregating several thousand square kilometres requires understanding of the effects of scene heterogeneity. The effects of heterogeneity in soil surface roughness, soil moisture, water area and vegetation density on the retrieval of soil moisture from simulated single- and multi-angle observing systems were tested. Uncertainty in water area proved the most serious problem for both systems, causing errors of a few percent in soil moisture retrieval. Single-angle retrieval was largely unaffected by the other factors studied here. Multiple-angle retrieval errors of around one percent arose from heterogeneity in either soil roughness or soil moisture. Errors of a few percent were caused by vegetation heterogeneity. A simple extension of the model's vegetation representation was shown to reduce this error substantially for scenes containing a range of vegetation types.
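
The dominance of the water-area effect has a simple physical reading: open water is radiometrically much colder than land, so an unaccounted water fraction lowers the scene brightness temperature and is misread as wetter soil. A minimal mixed-pixel sketch (all numbers and the linear brightness-temperature-to-moisture relation are illustrative assumptions, not values from the study):

```python
# Illustrative mixed-pixel calculation: how an unaccounted water
# fraction biases a soil-moisture retrieval. All numbers are assumed
# for illustration; they are not taken from the study above.

def scene_tb(tb_land, tb_water, water_frac):
    """Linear mixing of land and water brightness temperatures."""
    return (1.0 - water_frac) * tb_land + water_frac * tb_water

def retrieve_moisture(tb, tb_dry=280.0, tb_wet=220.0, sm_dry=0.05, sm_wet=0.45):
    """Crude linear TB -> moisture relation: wetter soil looks colder."""
    frac = (tb_dry - tb) / (tb_dry - tb_wet)
    return sm_dry + frac * (sm_wet - sm_dry)

tb_land, tb_water = 260.0, 130.0   # K; open water is much colder
true_sm = retrieve_moisture(tb_land)

for water_frac in (0.0, 0.01, 0.02, 0.05):
    tb = scene_tb(tb_land, tb_water, water_frac)
    sm = retrieve_moisture(tb)      # retrieval that ignores the water
    print(f"water fraction {water_frac:4.2f}: "
          f"retrieved {sm:.3f} vs true {true_sm:.3f} m3/m3")
```

With these illustrative numbers, a 2% unaccounted water fraction already shifts the retrieval by roughly 0.02 m3/m3, i.e. a few percent volumetric moisture.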

Relevance:

30.00%

Publisher:

Abstract:

This study examines the efficacy of published δ18O data from the calcite of Late Miocene surface-dwelling planktonic foraminifer shells for sea surface temperature (SST) estimates for the pre-Quaternary. The data are from 33 Late Miocene (Messinian) marine sites spanning a modern latitudinal gradient of 64°N to 48°S. They give estimates of SSTs in the tropics/subtropics (to 30°N and S) that are mostly cooler than present. Possible causes of this temperature discrepancy are ecological factors (e.g. calcification of shells at levels below the ocean mixed layer), taphonomic effects (e.g. diagenesis or dissolution), inaccurate estimation of Late Miocene seawater oxygen isotope composition, or a real Late Miocene cool climate. The scale of apparent cooling in the tropics suggests that the SST signal of the foraminifer calcite has been reset, at least in part, by early diagenetic calcite with higher δ18O, formed in the foraminifer shells in cool sea-bottom pore waters, probably coupled with the effects of calcite formed below the mixed layer during the life of the foraminifera. This hypothesis is supported by the markedly cooler SST estimates from low latitudes (in some cases more than 9 °C cooler than present), where the gradients of temperature and of the δ18O composition of seawater between sea surface and sea bottom are most marked, and where ocean surface stratification is high. At higher latitudes, particularly N and S of 30°, the temperature signal is still cooler, though maximum temperature estimates overlap with modern SSTs N and S of 40°. Comparison of Late Miocene SST estimates from alkenone unsaturation analysis in the eastern tropical Atlantic at Ocean Drilling Program (ODP) Site 958, which suggest a sea surface warmer than present by 2–4 °C, with oxygen isotope estimates at Deep Sea Drilling Project (DSDP) Site 366 and ODP Site 959, which indicate cooler than present SSTs, also suggests a significant impact on the δ18O signal. Nevertheless, much of the original SST variation is clearly preserved in the primary calcite formed in the mixed layer, and records secular and temporal oceanographic changes at the sea surface, such as movement of the Antarctic Polar Front in the Southern Ocean. Cooler SSTs in the tropics and subtropics are also consistent with the Late Miocene latitudinal reduction of the coral reef belt and with interrupted reef growth on the Queensland Plateau of eastern Australia, though it is not possible to quantify absolute SSTs with the existing oxygen isotope data. Reconstruction of an accurate global SST dataset for Neogene time-slices from the existing published DSDP/ODP isotope data, for use in general circulation models, may require a detailed re-assessment of taphonomy at many sites.
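
For reference, SST estimates of this kind rest on a calcite palaeotemperature equation; one widely used form is Shackleton's (1974) calibration (whether each compiled study used exactly this calibration is an assumption):

```latex
% Shackleton (1974)-type palaeotemperature equation; \delta_c is the
% delta-18O of foraminiferal calcite, \delta_w that of ambient seawater.
T\,(^{\circ}\mathrm{C}) \;=\; 16.9 \;-\; 4.38\,(\delta_{c}-\delta_{w})
\;+\; 0.10\,(\delta_{c}-\delta_{w})^{2}
```

Both failure modes discussed above enter here directly: diagenetic calcite raises δc and so pushes T cold, while mis-estimating Late Miocene δw shifts T in whichever direction the error takes.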

Relevance:

30.00%

Publisher:

Abstract:

Data such as digitized aerial photographs, electrical conductivity and yield are intensive and relatively inexpensive to obtain compared with collecting soil data by sampling. If such ancillary data are co-regionalized with the soil data, they should be suitable for co-kriging. The latter requires that information for both variables is available at many shared locations; this is rarely so for soil and ancillary data. To solve this problem, we have derived values for the ancillary variable at the soil sampling locations by averaging the values within a radius of 15 m, taking the nearest-neighbour value, kriging over 5 m blocks, and punctual kriging. The cross-variograms computed from these data with clay content, and also the pseudo cross-variogram, were used to co-krige to validation points, and the root mean squared errors (RMSEs) were calculated. In general, the data averaged within 15 m and the punctually kriged values resulted in more accurate predictions.
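
Two of the four co-location schemes described (radius averaging and nearest neighbour) are easy to sketch with a k-d tree; the two kriging variants would need a geostatistics library. A minimal version (array names and the synthetic data are placeholders):

```python
# Minimal sketch of two of the four co-location schemes described:
# averaging ancillary values within a 15 m radius of each soil sample,
# and taking the nearest-neighbour value.
import numpy as np
from scipy.spatial import cKDTree

def collocate(anc_xy, anc_val, soil_xy, radius=15.0):
    tree = cKDTree(anc_xy)

    # (a) mean of all ancillary points within `radius` of each soil site
    radius_mean = np.array([
        anc_val[idx].mean() if idx else np.nan
        for idx in tree.query_ball_point(soil_xy, r=radius)
    ])

    # (b) nearest-neighbour ancillary value at each soil site
    _, nn = tree.query(soil_xy, k=1)
    return radius_mean, anc_val[nn]

gen = np.random.default_rng(0)
anc_xy  = gen.uniform(0, 500, size=(2000, 2))   # dense ancillary survey
anc_val = gen.normal(30, 5, size=2000)          # e.g. conductivity values
soil_xy = gen.uniform(0, 500, size=(50, 2))     # sparse soil sample sites

mean15, nearest = collocate(anc_xy, anc_val, soil_xy)
```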

Relevance:

30.00%

Publisher:

Abstract:

Maps of kriged soil properties for precision agriculture are often based on a variogram estimated from too few data because the costs of sampling and analysis are prohibitive. If the variogram has been computed by the usual method of moments, it is likely to be unstable when there are fewer than 100 data. The scale of variation in soil properties should be investigated prior to sampling by computing a variogram from ancillary data, such as an aerial photograph of the bare soil. If the sampling interval suggested by this is large in relation to the size of the field, there will be too few data to estimate a reliable variogram for kriging. Standardized variograms from aerial photographs can be used with standardized soil data that are sparse, provided the data are spatially structured and the nugget:sill ratio is similar to that of a reliable variogram of the property. The problem remains of how to set this ratio in the absence of an accurate variogram. Several methods of estimating the nugget:sill ratio for selected soil properties are proposed and evaluated. Standardized variograms with nugget:sill ratios set by these methods are more similar to variograms computed from intensive soil data than are variograms computed from sparse soil data. The results of cross-validation and mapping show that the standardized variograms provide more accurate estimates, and preserve the main patterns of variation better, than those computed from sparse data.
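
A minimal sketch of the two ingredients involved: a method-of-moments (Matheron) semivariogram from standardized data, and a standardized model variogram whose nugget:sill ratio is imposed rather than estimated (the spherical model form and every numeric value below are illustrative assumptions):

```python
# Method-of-moments semivariogram plus a unit-sill spherical model
# with an imposed nugget:sill ratio. All values are illustrative.
import numpy as np

def semivariogram(xy, z, lags, tol):
    """Classical estimator: gamma(h) = mean of squared differences / 2."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    dz2 = (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)
    d, dz2 = d[iu], dz2[iu]
    return np.array([dz2[np.abs(d - h) <= tol].mean() / 2.0 for h in lags])

def standardized_spherical(h, rng, nugget_sill):
    """Unit-sill spherical model; the nugget is fixed by the chosen ratio."""
    c0 = nugget_sill                        # nugget as a share of the sill
    s = np.clip(h / rng, 0.0, 1.0)
    return c0 + (1.0 - c0) * (1.5 * s - 0.5 * s**3)

gen = np.random.default_rng(1)
xy = gen.uniform(0, 400, size=(80, 2))      # sparse soil sample locations
z = gen.normal(size=80)
z = (z - z.mean()) / z.std()                # standardize to unit variance

lags = np.arange(20.0, 201.0, 20.0)
gamma_hat = semivariogram(xy, z, lags, tol=10.0)
model = standardized_spherical(lags, rng=120.0, nugget_sill=0.3)
```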

Relevance:

30.00%

Publisher:

Abstract:

Indicators are commonly recommended as tools for assessing the attainment of development, and the current vogue is for aggregating a number of indicators together into a single index. It is claimed that such indices of development help facilitate maximum impact in policy terms by appealing to those who may not necessarily have technical expertise in data collection, analysis and interpretation. In order to help counter criticisms of over-simplification, those advocating such indices also suggest that the raw data be provided so as to allow disaggregation into component parts and hence facilitate a more subtle interpretation if a reader so desires. This paper examines the problems involved in interpreting indices of development by focusing on the United Nations Development Programme's (UNDP) Human Development Index (HDI), published each year in the Human Development Reports (HDRs). The HDI was intended to provide an alternative to the more economically based indices, such as GDP, commonly used within neo-liberal development agendas. The paper explores the use of the HDI as a gauge of human development by making comparisons between two major political and economic communities in Africa (ECOWAS and SADC). While the HDI did help highlight important changes in human development over the 10 years examined, it is concluded that the HDI and its components are difficult to interpret because the methodologies have changed significantly and the 'averaging' nature of the HDI can hide information unless care is taken. The paper discusses the applicability of alternative models to the HDI, such as the more neo-populist centred methods commonly advocated for indicators of sustainable development. (C) 2003 Elsevier Ltd. All rights reserved.
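
The 'averaging' at issue is concrete: in the era the paper covers, the HDI was the arithmetic mean of three component indices, each rescaled to [0, 1] between fixed goalposts. A minimal sketch (the goalpost values follow the UNDP methodology of that period as I recall it, so treat them as illustrative):

```python
# Sketch of the pre-2010 HDI aggregation the paper analyses: the
# arithmetic mean of three component indices, each rescaled to [0, 1]
# against fixed goalposts. Goalposts stated from memory; illustrative.
import math

def index(value, lo, hi):
    """Rescale a raw value to [0, 1] between fixed goalposts."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def hdi(life_exp, adult_literacy, gross_enrolment, gdp_pc_ppp):
    life = index(life_exp, 25.0, 85.0)                 # years
    edu = (2.0 / 3.0) * index(adult_literacy, 0.0, 100.0) \
        + (1.0 / 3.0) * index(gross_enrolment, 0.0, 100.0)
    income = index(math.log(gdp_pc_ppp),
                   math.log(100.0), math.log(40000.0)) # log-scaled income
    return (life + edu + income) / 3.0                 # the 'averaging' step

print(round(hdi(52.3, 64.0, 45.0, 1690.0), 3))
```

Because the three components are averaged, a fall in one (say, life expectancy) can be masked by a rise in another, which is exactly the information-hiding the paper warns about.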

Relevance:

30.00%

Publisher:

Abstract:

Pressing global environmental problems highlight the need to develop tools to measure progress towards "sustainability." However, some argue that any such attempt inevitably reflects the views of those creating such tools and only produces highly contested notions of "reality." To explore this tension, we critically assess the Environmental Sustainability Index (ESI), a well-publicized product of the World Economic Forum that is designed to measure 'sustainability' by ranking nations in league tables based on extensive databases of environmental indicators. By recreating this index, and then using statistical tools (principal components analysis) to test relations between various components of the index, we challenge the ways in which countries are ranked in the ESI. Based on this analysis, we suggest (1) that the approach taken to aggregate, interpret and present the ESI creates a misleading impression that Western countries are more sustainable than the developing world; (2) that unaccounted methodological biases allowed the authors of the ESI to over-generalize the relative 'sustainability' of different countries; and (3) that this has resulted in simplistic conclusions on the relation between economic growth and environmental sustainability. This criticism should not be interpreted as a call for the abandonment of efforts to create standardized comparable data. Instead, this paper proposes that indicator selection and data collection should draw on a range of voices, including local stakeholders as well as international experts. We also propose that aggregating data into final league ranking tables is too prone to error and creates the illusion of absolute and categorical interpretations. (c) 2004 Elsevier Ltd. All rights reserved.
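
A minimal sketch of the kind of re-analysis described: standardize the countries-by-indicators matrix and examine the leading principal components via an SVD (the data matrix below is a random placeholder, not ESI data):

```python
# PCA check of the sort described above: z-score a countries x
# indicators matrix and see how much variance the leading components
# capture. The data matrix is a random placeholder.
import numpy as np

gen = np.random.default_rng(0)
X = gen.normal(size=(140, 20))             # countries x indicators (placeholder)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)  # z-score each indicator
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)

explained = s**2 / np.sum(s**2)            # variance share per component
loadings = Vt                              # rows: indicator loadings per PC
scores = U * s                             # countries in component space

print("variance explained by first 3 PCs:", explained[:3].round(3))
```

If a handful of components dominated and loaded on a subset of indicators, a single aggregate rank would indeed compress quite different dimensions of 'sustainability' into one number.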

Relevance:

30.00%

Publisher:

Abstract:

In this article, we use the no-response test idea, introduced in Luke and Potthast (2003) and Potthast (Preprint) for the inverse obstacle problem, to identify the interface of discontinuity of the coefficient γ in the operator ∇·γ(x)∇ + c(x), with piecewise regular γ and bounded function c(x). We use infinitely many Cauchy data as measurements and give a reconstructive method to localize the interface. We base this multiwave version of the no-response test on two different proofs. The first contains a pointwise estimate as used by the singular sources method. The second is built on an energy (or integral) estimate, which is the basis of the probe method. As a consequence, the probe and singular sources methods are equivalent with regard to their convergence, and the no-response test can be seen as a unified framework for both. As a further contribution, we provide a formula to reconstruct the values of the jump of γ(x), x ∈ ∂D, at the boundary. A second consequence of this formula is that the blow-up rate of the indicator functions of the probe and singular sources methods at the interface is given by the order of the singularity of the fundamental solution.
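
In standard notation, the setting is the divergence-form problem below (this compact statement is assumed from the abstract, not quoted):

```latex
% Divergence-form problem with a piecewise regular, discontinuous
% coefficient; the interface \partial D is the object to reconstruct.
\nabla\!\cdot\!\big(\gamma(x)\,\nabla u\big) + c(x)\,u = 0
\quad \text{in } \Omega,
```

with infinitely many Cauchy pairs (u restricted to ∂Ω together with the conormal derivative γ ∂u/∂ν on ∂Ω) as measured data, and ∂D ⊂ Ω the interface across which γ jumps.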

Relevance:

30.00%

Publisher:

Abstract:

This note presents a robust method for estimating response surfaces that consist of linear response regimes and a linear plateau. The linear response-and-plateau model has fascinated production scientists since von Liebig (1855) and, as Upton and Dalton indicated some years ago in this Journal, the response-and-plateau model seems to fit the data in many empirical studies. The estimation algorithm evolves from a Bayesian implementation of a switching-regression (finite mixtures) model and demonstrates routine application of Gibbs sampling and data augmentation, techniques that are now in widespread application in other disciplines.
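
The von Liebig response-and-plateau form itself is just a kinked line, y = min(b0 + b1*x, ymax) plus noise. As a simple stand-in for the note's Bayesian switching-regression/Gibbs approach, a minimal nonlinear least-squares sketch (all names and numbers are illustrative):

```python
# Minimal sketch of the linear response-and-plateau (von Liebig) form,
# fit by nonlinear least squares as a simple stand-in for the Bayesian
# switching-regression / Gibbs sampling method developed in the note.
import numpy as np
from scipy.optimize import curve_fit

def response_plateau(x, b0, b1, ymax):
    """Linear response capped at a plateau: y = min(b0 + b1*x, ymax)."""
    return np.minimum(b0 + b1 * x, ymax)

gen = np.random.default_rng(0)
x = gen.uniform(0, 200, size=120)                    # e.g. fertiliser rate
y = response_plateau(x, 20.0, 0.5, 80.0) + gen.normal(0, 3.0, size=120)

popt, _ = curve_fit(response_plateau, x, y, p0=[10.0, 1.0, 60.0])
print("b0, b1, plateau:", popt.round(2))
```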

Relevance:

30.00%

Publisher:

Abstract:

This contribution investigates the problem of estimating the size of a population, also known as the missing-cases problem. Suppose a registration system aims to identify all cases having a certain characteristic, such as a specific disease (cancer, heart disease, ...), a disease-related condition (HIV, heroin use, ...) or a specific behavior (driving a car without a license). Every case in such a registration system has a certain notification history, in that it might have been identified several times (at least once), which can be understood as a particular capture-recapture situation. Cases that have never been listed on any occasion are typically left out, and it is this frequency one wants to estimate. In this paper, modelling concentrates on the counting distribution, i.e. the distribution of the variable that counts how often a given case has been identified by the registration system. Besides very simple models such as the binomial or Poisson distribution, finite (nonparametric) mixtures of these are considered, providing rather flexible modelling tools. Estimation is done by maximum likelihood by means of the EM algorithm. A case study on heroin users in Bangkok in the year 2001 completes the contribution.
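
For the simplest counting distribution mentioned, the homogeneous Poisson, the missing zero class can be estimated without the mixture/EM machinery: fit the zero-truncated Poisson by maximum likelihood and inflate the observed count by the estimated inclusion probability. A minimal sketch (the frequency data are made up for illustration, not the Bangkok case study):

```python
# Zero-truncated Poisson estimate of an unobserved zero class: solve
# the MLE by fixed-point iteration, then inflate the observed count n
# by P(identified at least once). Mixtures fit by EM generalize this.
import math

def ztp_lambda(counts, iters=200):
    """MLE of lambda under a zero-truncated Poisson: solves
    mean(counts) = lambda / (1 - exp(-lambda)) by fixed point."""
    m = sum(counts) / len(counts)
    lam = m
    for _ in range(iters):
        lam = m * (1.0 - math.exp(-lam))
    return lam

def population_size(counts):
    lam = ztp_lambda(counts)
    p_seen = 1.0 - math.exp(-lam)      # P(identified at least once)
    return len(counts) / p_seen        # Horvitz-Thompson-type estimate

# counts: times each *observed* case was identified (all >= 1);
# these frequencies are illustrative, not the Bangkok data.
counts = [1] * 1800 + [2] * 700 + [3] * 250 + [4] * 80 + [5] * 20
print(round(population_size(counts)))
```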

Relevance:

30.00%

Publisher:

Abstract:

Estimation of population size with a missing zero-class is an important problem encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by the method of maximum likelihood and estimating the population size based on this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable to count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and then using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In search of a more robust estimator, we focused on three models that use all clusters with exactly one case, those with exactly two cases and those with exactly three cases to estimate the probability of the zero-class, and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. The loss in efficiency associated with the gain in robustness was examined in a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class is preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice that weighs the three estimates, robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
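
For orientation, Zelterman's (1988) estimator in the unclustered case is compact: estimate lambda from the singleton and doubleton frequencies alone, lambda = 2*f2/f1, then apply the same Horvitz-Thompson-type inflation. A sketch of that unclustered version (the clustered extension developed in the paper is not reproduced; the frequencies are illustrative):

```python
# Zelterman's (1988) estimator for unclustered count data: uses only
# the frequencies of cases seen once (f1) and twice (f2), which is
# what makes it robust to departures from the Poisson in the tail.
import math
from collections import Counter

def zelterman(counts):
    freq = Counter(counts)
    f1, f2 = freq[1], freq[2]
    lam = 2.0 * f2 / f1                 # local, robust estimate of lambda
    p_seen = 1.0 - math.exp(-lam)       # P(case observed at least once)
    return len(counts) / p_seen         # Horvitz-Thompson-type estimate

counts = [1] * 1000 + [2] * 400 + [3] * 120 + [4] * 30  # illustrative
print(round(zelterman(counts)))
```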

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon the control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p₀. Next the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored. Copyright (C) 2007 John Wiley & Sons, Ltd.
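
The abstract notes that with 'non-informative' priors the method reproduces standard frequentist answers; for the known-variance normal case, that reference answer is the familiar per-arm sample size n = 2σ²(z_{1−α/2} + z_{1−β})²/δ². A sketch of that frequentist counterpart (the paper's Bayesian criterion itself is not reproduced here):

```python
# Frequentist reference calculation that the Bayesian approach above
# matches under 'non-informative' priors: per-arm sample size for
# comparing two normal means with known common variance.
import math
from statistics import NormalDist

def n_per_arm(delta, sigma, alpha=0.05, power=0.9):
    """Two-sided test of a mean difference delta, known sigma."""
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2.0 * (sigma * (z_a + z_b) / delta) ** 2)

print(n_per_arm(delta=5.0, sigma=10.0))  # clinically relevant difference of 5
```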

Relevance:

30.00%

Publisher:

Abstract:

The well-studied link between psychotic traits and creativity is a subject of much debate. The present study investigated the extent to which schizotypal personality traits, as measured by the O-LIFE (Oxford-Liverpool Inventory of Feelings and Experiences), equip healthy individuals to engage as groups in everyday tasks. From a sample of 69 students, eight groups of four participants, composed of high-, medium-, or low-schizotypy individuals, were assembled to work as teams to complete a creative problem-solving task. Predictably, high scorers on the O-LIFE formulated a greater number of strategies to solve the task, indicative of creative divergent thinking. However, for task success (as measured by time taken to complete the problem) an inverted-U-shaped pattern emerged, whereby high- and low-schizotypy groups were consistently faster than medium-schizotypy groups. Intriguing data emerged concerning leadership within the groups, and other tangential findings relating to anxiety, competition and motivation were explored. These findings challenge the traditional cliché that psychotic personality traits are linearly related to creative performance, and suggest that the nature of the problem determines which thinking styles are optimally equipped to solve it. (C) 2009 Elsevier Ltd. All rights reserved.