Abstract:
This paper discusses how numerical gradient estimation methods can be used to reduce the computational demands of a class of multidimensional clustering algorithms. The study is motivated by the recognition that several current point-density based cluster identification algorithms could benefit from a reduction in computational demand if approximate a priori estimates of the cluster centres present in a given data set could be supplied as starting conditions. In this particular presentation, the algorithm shown to benefit from the technique is the Mean-Tracking (M-T) cluster algorithm, but the results obtained from the gradient estimation approach may also be applied to other clustering algorithms and their related disciplines.
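The initialization idea is concrete enough to sketch in code. The following is a minimal illustration, assuming a Gaussian kernel density estimate and a fixed-step numerical gradient ascent; the bandwidth, step size, seed point and two-cluster toy data are all assumptions for illustration, not the Mean-Tracking implementation itself:

```python
import numpy as np

def density(points, x, bw=0.5):
    """Gaussian kernel density estimate of the point cloud at location x."""
    d2 = np.sum((points - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * bw ** 2)).sum()

def estimate_centre(points, x0, h=1e-3, step=0.05, iters=200):
    """Climb the estimated density surface using central-difference
    numerical gradients; the end point approximates a cluster centre."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = np.array([(density(points, x + h * e) - density(points, x - h * e)) / (2 * h)
                      for e in np.eye(len(x))])
        x = x + step * g / (np.linalg.norm(g) + 1e-12)  # fixed-length ascent step
    return x

# Toy data: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal([0.0, 0.0], 0.3, (100, 2)),
                 rng.normal([3.0, 3.0], 0.3, (100, 2))])
centre = estimate_centre(pts, x0=[2.0, 2.0])  # ascends toward the (3, 3) cluster
```

A handful of such ascents from coarse seed points yields approximate centres that can then be handed to a clustering algorithm as starting conditions, which is the computational saving the paper targets.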
Abstract:
Objective. Disparity cues can be a major drive to accommodation via the CA/C (convergence-accommodation to convergence) linkage but, on decompensation of exotropia, disparity cues are extinguished by suppression, so this drive is lost. This study investigated accommodation and vergence responses to disparity, blur and proximal cues in a group of distance exotropes aged 4–11 years, both during decompensation and when exotropic. Methods. Nineteen participants with distance exotropia were tested using a PlusoptiX SO4 photorefractor set in a remote haploscopic device which assessed simultaneous vergence and accommodation to a range of targets incorporating different combinations of blur, disparity and proximal cues at four fixation distances between 2 m and 33 cm. Responses on decompensation were compared to those from the same children when their deviation was controlled. Results. Manifest exotropia was more common in the more impoverished cue conditions. When decompensated for near, mean accommodation gain for the all-cue (naturalistic) target reduced significantly (p < 0.0001), with resultant mean under-accommodation of 2.33 D at 33 cm. The profile of near-cue usage changed after decompensation, with blur and proximity driving residual responses, but these remaining cues did not compensate for the loss of accommodation caused by the removal of disparity. Conclusions. Accommodation often reduces on decompensation of distance exotropia as the drive from convergence is extinguished, providing a further reason to try to prevent decompensation for near.
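As a sanity check on the reported figures: accommodative demand in dioptres is the reciprocal of the fixation distance in metres, so a 2.33 D deficit at 33 cm implies a large relative loss. A short sketch (the gain formula is the standard response/demand ratio, not taken from the paper):

```python
def accommodative_demand(distance_m):
    """Accommodative demand in dioptres: reciprocal of fixation distance in metres."""
    return 1.0 / distance_m

demand = accommodative_demand(0.33)      # ~3.03 D at 33 cm
under_accommodation = 2.33               # mean deficit reported on decompensation (D)
response = demand - under_accommodation  # ~0.70 D residual accommodation
gain = response / demand                 # ~0.23, versus ~1.0 for accurate accommodation
```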
Abstract:
We study the global atmospheric budgets of mass, moisture, energy and angular momentum in the latest reanalysis from the European Centre for Medium-Range Weather Forecasts (ECMWF), ERA-Interim, for the period 1989–2008 and compare with ERA-40. Most of the measures we use indicate that the ERA-Interim reanalysis is superior in quality to ERA-40. In ERA-Interim the standard deviation of the monthly mean global dry mass of 0.7 kg m−2 (0.007%) is slightly worse than in ERA-40, and long time-scale variations in dry mass originate predominantly in the surface pressure field. The divergent winds are improved in ERA-Interim: the global standard deviation of the time-averaged dry mass budget residual is 10 kg m−2 day−1 and the quality of the cross-equatorial mass fluxes is improved. The temporal variations in the global evaporation minus precipitation (E − P) are too large but the global moisture budget residual is 0.003 kg m−2 day−1 with a spatial standard deviation of 0.3 kg m−2 day−1. Both the E − P over ocean and P − E over land are about 15% larger than the 1.1 Tg s−1 transport of water from ocean to land. The top of atmosphere (TOA) net energy losses are improved, with a value of 1 W m−2, but the meridional gradient of the TOA net energy flux is smaller than that from the Clouds and the Earth's Radiant Energy System (CERES) data. At the surface the global energy losses are worse, with a value of 7 W m−2. Over land, however, the energy loss is only 0.5 W m−2. The downwelling thermal radiation at the surface in ERA-Interim of 341 W m−2 is towards the higher end of previous estimates. The global mass-adjusted energy budget residual is 8 W m−2 with a spatial standard deviation of 11 W m−2, and the mass-adjusted atmospheric energy transport from low to high latitudes (the sum for the two hemispheres) is 9.5 PW.
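Budget statistics like a residual's global mean and spatial standard deviation must be area-weighted on a latitude-longitude grid. A minimal sketch, assuming cos(latitude) weights on a regular 1° grid and a toy random residual field (not ERA-Interim data):

```python
import numpy as np

def global_stats(field, lats):
    """Area-weighted global mean and spatial standard deviation of a field
    on a regular latitude-longitude grid.
    field: array of shape (nlat, nlon); lats: latitudes in degrees."""
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(field)  # cos(lat) area weights
    mean = np.average(field, weights=w)
    var = np.average((field - mean) ** 2, weights=w)
    return mean, np.sqrt(var)

lats = np.linspace(-89.5, 89.5, 180)
# Toy residual field with zero mean and 0.3 spatial standard deviation.
residual = np.random.default_rng(1).normal(0.0, 0.3, (180, 360))
mean, spatial_std = global_stats(residual, lats)
```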
Abstract:
Estimating snow mass at continental scales is difficult, but important for understanding land-atmosphere interactions, biogeochemical cycles and the hydrology of the Northern latitudes. Remote sensing provides the only consistent global observations, but with unknown errors. We test the theoretical performance of the Chang algorithm for estimating snow mass from passive microwave measurements using the Helsinki University of Technology (HUT) snow microwave emission model. The algorithm's dependence upon assumptions of fixed and uniform snow density and grain size is determined, and measurements of these properties made at the Cold Land Processes Experiment (CLPX) Colorado field site in 2002–2003 are used to quantify the retrieval errors caused by differences between the algorithm assumptions and measurements. Deviation from the Chang algorithm's snow density and grain size assumptions gives rise to an error of a factor of between two and three in calculating snow mass. The possibility that the algorithm performs more accurately over large areas than at points is tested by simulating emission from a 25 km diameter area of snow with a distribution of properties derived from the snow pit measurements, using the Chang algorithm to calculate mean snow mass from the simulated emission. The snow mass estimate from a site exhibiting the heterogeneity of the CLPX Colorado site proves only marginally different from that from a similarly simulated homogeneous site. The estimation accuracy predictions are tested using the CLPX field measurements of snow mass, and simultaneous SSM/I and AMSR-E measurements.
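The Chang algorithm under test is a linear spectral-difference retrieval. A minimal sketch of its commonly quoted form; the ~4.8 mm/K coefficient and the 19/37 GHz horizontally polarized channels are the textbook values tied to the fixed snow density and grain size assumptions the study perturbs, and are not stated in this abstract:

```python
def chang_swe(tb19h, tb37h, coeff=4.8):
    """Chang-type retrieval of snow water equivalent (SWE, in mm) from the
    brightness-temperature difference (K) between the 19 GHz and 37 GHz
    horizontally polarized channels. The commonly quoted ~4.8 mm/K
    coefficient assumes a fixed snow density (~0.30 g/cm^3) and grain
    radius (~0.3 mm); deviations from those assumptions are the error
    source quantified in the study. Negative differences clip to zero."""
    return max(coeff * (tb19h - tb37h), 0.0)

swe = chang_swe(tb19h=240.0, tb37h=215.0)  # 25 K difference -> 120 mm SWE
```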
Abstract:
Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid scale fluxes or tendencies in any given model grid box from truth.) We unanimously agreed that the ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods being used at present. Our discussions identified several issues that will arise.
Abstract:
We bridge the properties of the regular triangular, square, and hexagonal honeycomb Voronoi tessellations of the plane to the Poisson-Voronoi case, thus analyzing in a common framework symmetry-breaking processes and the approach to uniform random distributions of tessellation-generating points. We resort to ensemble simulations of tessellations generated by points whose regular positions are perturbed through a Gaussian noise, whose variance is given by the parameter α² times the square of the inverse of the average density of points. We analyze the number of sides, the area, and the perimeter of the Voronoi cells. For all values α > 0, hexagons constitute the most common class of cells, and two-parameter gamma distributions provide an efficient description of the statistical properties of the analyzed geometrical characteristics. The introduction of noise destroys the triangular and square tessellations, which are structurally unstable, as their topological properties are discontinuous at α = 0. By contrast, the honeycomb hexagonal tessellation is topologically stable and, experimentally, all Voronoi cells are hexagonal for small but finite noise with α < 0.12. For all tessellations and for small values of α, we observe a linear dependence on α of the ensemble mean of the standard deviation of the area and perimeter of the cells. Already for a moderate amount of Gaussian noise (α > 0.5), memory of the specific initial unperturbed state is lost, because the statistical properties of the three perturbed regular tessellations are indistinguishable. When α > 2, results converge to those of Poisson-Voronoi tessellations. The geometrical properties of n-sided cells change with α until the Poisson-Voronoi limit is reached for α > 2; in this limit the Desch law for perimeters is shown not to be valid and a square-root dependence on n is established. This law allows for an easy link to the Lewis law for areas and agrees with exact asymptotic results.
Finally, for α > 1, the ensemble means of the cells' area and perimeter restricted to the hexagonal cells agree remarkably well with the full ensemble means; this reinforces the idea that hexagons, beyond their ubiquitous numerical prominence, can be interpreted as typical polygons in 2D Voronoi tessellations.
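The experimental setup lends itself to a short sketch: perturb a regular lattice with Gaussian noise of standard deviation α (in units of the mean point spacing) and examine the side-number statistics of the resulting Voronoi cells. The square lattice, grid size and seed below are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import Voronoi

def perturbed_lattice_sides(n=30, alpha=0.5, seed=0):
    """Unit-spacing square lattice perturbed by Gaussian noise of standard
    deviation alpha (i.e. alpha times the mean point spacing); returns the
    side counts of every bounded Voronoi cell."""
    rng = np.random.default_rng(seed)
    xx, yy = np.meshgrid(np.arange(n), np.arange(n))
    pts = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
    pts += rng.normal(0.0, alpha, pts.shape)
    vor = Voronoi(pts)
    # A region is bounded iff it is non-empty and has no vertex at infinity (-1).
    return np.array([len(r) for r in vor.regions if r and -1 not in r])

sides = perturbed_lattice_sides(alpha=0.5)
hex_fraction = np.mean(sides == 6)  # hexagons: the abstract's most common class
```

Sweeping α from small values toward 2 and tabulating `sides` (and cell areas/perimeters) reproduces the kind of ensemble statistics the paper analyzes.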
Abstract:
Metrics are often used to compare the climate impacts of emissions from various sources, sectors or nations. These are usually based on global-mean input, and so there is the potential that important information on smaller scales is lost. Assuming a non-linear dependence of the climate impact on local surface temperature change, we explore the loss of information about regional variability that results from using global-mean input in the specific case of heterogeneous changes in ozone, methane and aerosol concentrations resulting from emissions from road traffic, aviation and shipping. Results from equilibrium simulations with two general circulation models are used. An alternative metric for capturing the regional climate impacts is investigated. We find that the application of a metric that is first calculated locally and then averaged globally captures a more complete and informative signal of climate impact than one that uses global-mean input. The loss of information when heterogeneity is ignored is largest in the case of aviation. Further investigation of the spatial distribution of temperature change indicates that although the pattern of temperature response does not closely match the pattern of the forcing, the forcing pattern still influences the response pattern on a hemispheric scale. When the short-lived transport forcing is superimposed on present-day anthropogenic CO2 forcing, the heterogeneity in the temperature response to CO2 dominates. This suggests that the importance of including regional climate impacts in global metrics depends on whether small sectors are considered in isolation or as part of the overall climate change.
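The difference between the two metrics is essentially Jensen's inequality: for a non-linear (here convex) impact function, applying it to the global mean discards the regional variability. A toy illustration, with an invented exponent and temperature pattern:

```python
import numpy as np

def impact(delta_t, power=1.5):
    """Toy non-linear impact function of local temperature change
    (the exponent is purely illustrative)."""
    return np.abs(delta_t) ** power

# Heterogeneous warming pattern (K): same global mean, non-zero variance.
delta_t = np.array([0.1, 0.2, 0.5, 1.4])
global_mean_metric = impact(delta_t.mean())  # metric from global-mean input
local_then_global = impact(delta_t).mean()   # computed locally, then averaged

# For a convex impact function the locally computed metric is larger:
# averaging the input first loses the signal from the regional extremes.
```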
Abstract:
The extra-tropical response to El Niño in configurations of a coupled model with increased horizontal resolution in the oceanic component is shown to be more realistic than in configurations with a low resolution oceanic component. This general conclusion is independent of the atmospheric resolution. Resolving small-scale processes in the ocean produces a more realistic oceanic mean state, with a reduced cold tongue bias, which in turn allows the atmospheric model component to be forced more realistically. A realistic atmospheric basic state is critical in order to represent Rossby wave propagation in response to El Niño, and hence the extra-tropical response to El Niño. Through the use of high and low resolution configurations of the forced atmosphere-only model component we show that, in isolation, atmospheric resolution does not significantly affect the simulation of the extra-tropical response to El Niño. It is demonstrated, through perturbations to the SST forcing of the atmospheric model component, that biases in the climatological SST field typical of coupled model configurations with low oceanic resolution can account for the erroneous atmospheric basic state seen in these coupled model configurations. These results highlight the importance of resolving small-scale oceanic processes in producing a realistic large-scale mean climate in coupled models, and suggest that it may be possible to “squeeze out” valuable extra performance from coupled models through increases to oceanic resolution alone.
Abstract:
Matei et al. (Reports, 6 January 2012, p. 76) claim to show skillful multiyear predictions of the Atlantic Meridional Overturning Circulation (AMOC). However, these claims are not justified, primarily because the predictions of AMOC transport do not outperform simple reference forecasts based on climatological annual cycles. Accordingly, there is no justification for the “confident” prediction of a stable AMOC through 2014.
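The reference-forecast comparison at issue can be expressed as a mean-squared skill score. A minimal sketch with invented numbers; the AMOC transport values below are illustrative, not Matei et al.'s data:

```python
import numpy as np

def msss(forecast, reference, observed):
    """Mean-squared skill score: 1 - MSE(forecast) / MSE(reference).
    Positive only if the forecast beats the reference (e.g. a climatological
    annual cycle); a score <= 0 means the forecast adds no skill."""
    mse_f = np.mean((forecast - observed) ** 2)
    mse_r = np.mean((reference - observed) ** 2)
    return 1.0 - mse_f / mse_r

obs = np.array([17.2, 18.1, 16.5, 17.8])   # illustrative AMOC transports (Sv)
clim = np.full_like(obs, obs.mean())       # climatological reference forecast
fcst = np.array([17.0, 18.0, 16.9, 17.6])  # illustrative candidate prediction
score = msss(fcst, clim, obs)
```

The comment's point is that a prediction only demonstrates multiyear skill if this score (or an equivalent measure) is positive against the simple climatological reference.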