941 results for RESOLUTION


Relevance:

20.00%

Publisher:

Abstract:

Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretations. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and an increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada, terminating in the Labrador Sea. This result is coincident with other model results in also showing a significant reduction in Greenland wintertime precipitation – a response supported by ice core evidence. Higher temporal resolution puts in sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain “early” glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness, supporting observational inferences that upper-level winds associated with Asian–Pacific storms transported Asian dust to Greenland during the LGM.

Relevance:

20.00%

Publisher:

Abstract:

Visual telepresence seeks to extend existing teleoperative capability by supplying the operator with a 3D interactive view of the remote environment. This is achieved through the use of a stereo camera platform which, through appropriate 3D display devices, provides a distinct image to each eye of the operator, and which is slaved directly from the operator's head and eye movements. However, the resolution within current head-mounted displays remains poor, thereby reducing the operator's visual acuity. This paper reports on the feasibility of incorporating eye tracking to increase resolution and investigates the stability and control issues for such a system. Continuous-domain and discrete simulations are presented which indicate that eye tracking provides a stable feedback loop for tracking applications, though some empirical testing (currently being initiated) of such a system will be required to overcome the indicated stability problems associated with microsaccades of the human operator.
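
A minimal sketch of the kind of stability question involved (not the authors' control model; the first-order camera update, gain values, and delay below are assumptions for illustration): a gaze-slaved camera can be viewed as a discrete-time feedback loop whose stability depends on the loop gain and on the measurement/actuation delay.

# Illustrative discrete-time gaze-tracking loop with hypothetical parameters.
# The camera angle is driven toward the gaze target, but the tracking error is
# only available after `delay` samples; with delay, too high a gain
# destabilises the loop.
import numpy as np

def simulate(gain, delay, n_steps=200, target=10.0):
    cam = np.zeros(n_steps)
    for k in range(1, n_steps):
        err = target - cam[max(k - 1 - delay, 0)]   # delayed tracking error
        cam[k] = cam[k - 1] + gain * err
    return cam

for gain in (0.2, 0.8, 1.2):
    trace = simulate(gain, delay=3)
    status = "stable" if abs(trace[-1] - 10.0) < 1.0 else "unstable"
    print(f"gain={gain}: {status}")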

Relevance:

20.00%

Publisher:

Abstract:

The accurate prediction of the biochemical function of a protein is becoming increasingly important, given the unprecedented growth of both structural and sequence databanks. Consequently, computational methods are required to analyse such data in an automated manner to ensure genomes are annotated accurately. Protein structure prediction methods, for example, are capable of generating approximate structural models on a genome-wide scale. However, the detection of functionally important regions in such crude models, as well as structural genomics targets, remains an extremely important problem. The method described in the current study, MetSite, represents a fully automatic approach for the detection of metal-binding residue clusters applicable to protein models of moderate quality. The method involves using sequence profile information in combination with approximate structural data. Several neural network classifiers are shown to be able to distinguish metal sites from non-sites with a mean accuracy of 94.5%. The method was demonstrated to identify metal-binding sites correctly in LiveBench targets where no obvious metal-binding sequence motifs were detectable using InterPro. Accurate detection of metal sites was shown to be feasible for low-resolution predicted structures generated using mGenTHREADER where no side-chain information was available. High-scoring predictions were observed for a recently solved hypothetical protein from Haemophilus influenzae, indicating a putative metal-binding site.
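
A rough illustration of this kind of per-residue classifier (not the MetSite implementation; the feature layout, synthetic data, and network size are assumptions): a small feed-forward network over combined profile and geometry features can be prototyped as follows.

# Toy per-residue classifier: metal-site member (1) vs non-site (0) from a
# combined feature vector (e.g. sequence-profile columns plus backbone-geometry
# descriptors). Data and the label rule are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_residues, n_features = 5000, 30
X = rng.normal(size=(n_residues, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)       # toy signal in two features

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", round(clf.score(X_test, y_test), 3))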

Relevance:

20.00%

Publisher:

Abstract:

World-wide structural genomics initiatives are rapidly accumulating structures for which limited functional information is available. Additionally, state-of-the-art structural prediction programs are now capable of generating at least low-resolution structural models of target proteins. Accurate detection and classification of functional sites within both solved and modelled protein structures therefore represents an important challenge. We present a fully automatic site detection method, FuncSite, that uses neural network classifiers to predict the location and type of functionally important sites in protein structures. The method is designed primarily to require only backbone residue positions, without the need for specific side-chain atoms to be present. In order to highlight effective site detection in low-resolution structural models, FuncSite was used to screen model proteins generated using mGenTHREADER on a set of newly released structures. We found effective metal site detection even for moderate-quality protein models, illustrating the robustness of the method.

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces new insights into the hydrochemical functioning of lowland river systems using field-based spectrophotometric and electrode technologies. The streamwater concentrations of nitrogen species and phosphorus fractions were measured at hourly intervals on a continuous basis at two contrasting sites on tributaries of the River Thames – one draining a rural catchment, the River Enborne, and one draining a more urban system, The Cut. The measurements complement those from an existing network of multi-parameter water quality sondes maintained across the Thames catchment and weekly monitoring based on grab samples. The results of the sub-daily monitoring show that streamwater phosphorus concentrations display highly complex dynamics under storm conditions dependent on the antecedent catchment wetness, and that diurnal phosphorus and nitrogen cycles occur under low flow conditions. The diurnal patterns highlight the dominance of sewage inputs in controlling the streamwater phosphorus and nitrogen concentrations at low flows, even at a distance of 7 km from the nearest sewage treatment works in the rural River Enborne. The time of sample collection is important when judging water quality against ecological thresholds or standards. An exhaustion of the supply of phosphorus from diffuse and multiple septic tank sources during storm events was evident and load estimation was not improved by sub-daily monitoring beyond that achieved by daily sampling because of the eventual reduction in the phosphorus mass entering the stream during events. The results highlight the utility of sub-daily water quality measurements and the discussion considers the practicalities and challenges of in situ, sub-daily monitoring.
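
A sketch of the load-estimation point (synthetic series; the hydrograph, dilution relationship, and sampling times are assumptions): the event load is the time integral of concentration times discharge, so the benefit of sub-daily sampling depends on how strongly concentration and flow co-vary within the day.

# Compare a phosphorus load estimate from hourly data with one from a single
# daily grab sample combined with the daily mean flow. Numbers are illustrative.
import numpy as np

t = np.arange(96)                                    # 4 days, hourly [h]
Q = 1.0 + 4.0 * np.exp(-((t - 36) / 10.0) ** 2)      # storm hydrograph [m3/s]
C = 0.20 - 0.02 * (Q - 1.0)                          # TP diluted at high flow [mg/l]

load_hourly = np.sum(C * Q * 3600.0) / 1e3           # mg/l * m3/s = g/s; summed -> kg
Q_daily = Q.reshape(4, 24).mean(axis=1)              # daily mean flow
C_daily = C[9::24]                                   # one 09:00 grab sample per day
load_daily = np.sum(C_daily * Q_daily * 86400.0) / 1e3
print(f"hourly estimate: {load_hourly:.1f} kg, daily grab-sample estimate: {load_daily:.1f} kg")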

Relevance:

20.00%

Publisher:

Abstract:

We present an approach for dealing with coarse-resolution Earth observations (EO) in terrestrial ecosystem data assimilation schemes. The use of coarse-scale observations in ecological data assimilation schemes is complicated by spatial heterogeneity and nonlinear processes in natural ecosystems. If these complications are not appropriately dealt with, then the data assimilation will produce biased results. The “disaggregation” approach that we describe in this paper combines frequent coarse-resolution observations with temporally sparse fine-resolution measurements. We demonstrate the approach using a demonstration data set based on measurements of an Arctic ecosystem. In this example, normalized difference vegetation index observations are assimilated into a “zero-order” model of leaf area index and carbon uptake. The disaggregation approach conserves key ecosystem characteristics regardless of the observation resolution and estimates the carbon uptake to within 1% of the demonstration data set “truth.” Assimilating the same data in the normal manner, but without the disaggregation approach, results in carbon uptake being underestimated by 58% at an observation resolution of 250 m. The disaggregation method allows the combination of multiresolution EO, and gains spatial resolution if observations are located on a grid that shifts from one observation time to the next. Additionally, the approach is not tied to a particular data assimilation scheme, model, or EO product and can cope with complex observation distributions, as it makes no implicit assumptions of normality.
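
A toy illustration of why coarse pixels bias a nonlinear model if treated naively (synthetic numbers; the saturating uptake curve is an assumption): because the model response is nonlinear, applying it to the mean of a heterogeneous pixel does not give the mean of the fine-scale responses.

# A heterogeneous coarse pixel made of fine-scale patches with different LAI.
# For a saturating uptake curve, f(mean(LAI)) differs from mean(f(LAI)), so a
# naive coarse-pixel treatment is biased relative to resolving heterogeneity.
import numpy as np

rng = np.random.default_rng(1)
lai_fine = rng.gamma(shape=1.2, scale=1.0, size=10_000)   # patchy fine-scale LAI

def uptake(lai, v_max=8.0, k=0.5):
    """Hypothetical saturating carbon-uptake response to LAI."""
    return v_max * (1.0 - np.exp(-k * lai))

mean_of_f = uptake(lai_fine).mean()      # heterogeneity resolved
f_of_mean = uptake(lai_fine.mean())      # coarse pixel treated as homogeneous
print(f"mean of f: {mean_of_f:.2f}, f of mean: {f_of_mean:.2f}")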

Relevance:

20.00%

Publisher:

Abstract:

The ability to run General Circulation Models (GCMs) at ever-higher horizontal resolutions has meant that tropical cyclone simulations are increasingly credible. A hierarchy of atmosphere-only GCMs, based on the Hadley Centre Global Environmental Model (HadGEM1), with horizontal resolution increasing from approximately 270 km to 60 km (at 50°N), is used to systematically investigate the impact of spatial resolution on the simulation of global tropical cyclone activity, independent of model formulation. Tropical cyclones are extracted from ensemble simulations and reanalyses of comparable resolutions using a feature-tracking algorithm. Resolution is critical for simulating storm intensity, and convergence to observed storm intensities is not achieved with the model hierarchy. Resolution is less critical for simulating the annual number of tropical cyclones and their geographical distribution, which are well captured at resolutions of 135 km or higher, particularly for Northern Hemisphere basins. Simulating the interannual variability of storm occurrence requires resolutions of 100 km or higher; however, the level of skill is basin dependent. Higher-resolution GCMs are increasingly able to capture the interannual variability of the large-scale environmental conditions that contribute to tropical cyclogenesis. Different environmental factors contribute to the interannual variability of tropical cyclones in the different basins: in the North Atlantic basin the vertical wind shear, potential intensity and low-level absolute vorticity are dominant, while in the North Pacific basins mid-level relative humidity and low-level absolute vorticity are dominant. Model resolution is crucial for a realistic simulation of tropical cyclone behaviour, and high-resolution GCMs are found to be valuable tools for investigating the global location and frequency of tropical cyclones.
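
A highly simplified sketch of linking candidate storm centres into tracks (not the feature-tracking algorithm used in the study; the nearest-neighbour rule, distance threshold, and input format are assumptions):

# Link per-time-step lists of candidate cyclone centres (lat, lon in degrees)
# into tracks by nearest-neighbour matching within a maximum displacement.
import math

def link_tracks(centres_per_step, max_deg=5.0):
    tracks = []                                  # each track: list of (step, lat, lon)
    active = []                                  # indices of tracks still being extended
    for step, centres in enumerate(centres_per_step):
        new_active, unused = [], list(centres)
        for ti in active:
            _, lat0, lon0 = tracks[ti][-1]
            best = min(unused, default=None,
                       key=lambda c: math.hypot(c[0] - lat0, c[1] - lon0))
            if best is not None and math.hypot(best[0] - lat0, best[1] - lon0) <= max_deg:
                tracks[ti].append((step, *best))
                unused.remove(best)
                new_active.append(ti)
        for c in unused:                         # unmatched centres start new tracks
            tracks.append([(step, *c)])
            new_active.append(len(tracks) - 1)
        active = new_active
    return tracks

steps = [[(15.0, 140.0)], [(16.0, 138.5)], [(17.5, 137.0), (10.0, 160.0)]]
for track in link_tracks(steps):
    print(track)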

Relevance:

20.00%

Publisher:

Abstract:

Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
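
An illustrative sketch of thinning by top-down (divisive) clustering (not the implementation described in the paper; the splitting rule, target spacing, and choice of representative are assumptions): recursively bisect the candidate waterline points along their longest spatial extent until each cluster is small, then keep one representative level per cluster.

# Thin spatially clustered waterline points (x, y, level) by recursive
# bisection, keeping the median-level point from each final cluster.
import numpy as np

def thin(points, max_extent=200.0):
    """points: (n, 3) array of x [m], y [m], water level [m]."""
    pts = np.asarray(points, dtype=float)
    spans = pts[:, :2].max(axis=0) - pts[:, :2].min(axis=0)
    if len(pts) <= 1 or spans.max() <= max_extent:
        return [pts[np.argsort(pts[:, 2])[len(pts) // 2]]]
    axis = int(np.argmax(spans))                           # split along the longest extent
    cut = np.median(pts[:, axis])
    left, right = pts[pts[:, axis] <= cut], pts[pts[:, axis] > cut]
    if len(left) == 0 or len(right) == 0:                  # degenerate split: stop here
        return [pts[np.argsort(pts[:, 2])[len(pts) // 2]]]
    return thin(left, max_extent) + thin(right, max_extent)

rng = np.random.default_rng(2)
xy = rng.uniform(0, 1000, size=(400, 2))                   # candidate points along a reach [m]
levels = 20.0 + 0.001 * xy[:, 0] + rng.normal(0, 0.05, 400)
selected = thin(np.column_stack([xy, levels]))
print(f"kept {len(selected)} of 400 candidate points")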

Relevance:

20.00%

Publisher:

Abstract:

The sensitivity to the horizontal resolution of the climate, anthropogenic climate change, and seasonal predictive skill of the ECMWF model has been studied as part of Project Athena—an international collaboration formed to test the hypothesis that substantial progress in simulating and predicting climate can be achieved if mesoscale and subsynoptic atmospheric phenomena are more realistically represented in climate models. In this study the experiments carried out with the ECMWF model (atmosphere only) are described in detail. Here, the focus is on the tropics and the Northern Hemisphere extratropics during boreal winter. The resolutions considered in Project Athena for the ECMWF model are T159 (126 km), T511 (39 km), T1279 (16 km), and T2047 (10 km). It was found that increasing horizontal resolution improves the tropical precipitation, the tropical atmospheric circulation, the frequency of occurrence of Euro-Atlantic blocking, and the representation of extratropical cyclones in large parts of the Northern Hemisphere extratropics. All of these improvements come from the increase in resolution from T159 to T511, with relatively small changes for further resolution increases to T1279 and T2047, although it should be noted that results from this very highest resolution are from a previously untested model version. Problems in simulating the Madden–Julian oscillation remain unchanged for all resolutions tested. There is some evidence that increasing horizontal resolution to T1279 leads to moderate increases in seasonal forecast skill during boreal winter in the tropics and Northern Hemisphere extratropics. Sensitivity experiments are discussed that help to foster a better understanding of some of the resolution dependence found for the ECMWF model in Project Athena.
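
The quoted grid spacings are consistent with the usual rule of thumb for converting a triangular spectral truncation T_N into an equivalent grid length (an approximation only, with Earth radius a ≈ 6371 km):

\Delta x \approx \frac{\pi a}{N}: \qquad
N=159 \Rightarrow \Delta x \approx 126\ \mathrm{km}, \quad
N=511 \Rightarrow 39\ \mathrm{km}, \quad
N=1279 \Rightarrow 16\ \mathrm{km}, \quad
N=2047 \Rightarrow 10\ \mathrm{km}.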

Relevance:

20.00%

Publisher:

Abstract:

Solar-pointing Fourier transform infrared (FTIR) spectroscopy offers the capability to measure both the fine scale and broadband spectral structure of atmospheric transmission simultaneously across wide spectral regions. It is therefore suited to the study of both water vapour monomer and continuum absorption behaviours. However, in order to properly address this issue, it is necessary to radiatively calibrate the FTIR instrument response. A solar-pointing high-resolution FTIR spectrometer was deployed as part of the ‘Continuum Absorption by Visible and Infrared radiation and its Atmospheric Relevance’ (CAVIAR) consortium project. This paper describes the radiative calibration process using an ultra-high-temperature blackbody and the consideration of the related influence factors. The result is a radiatively calibrated measurement of the solar irradiation at the ground across the IR region from 2000 to 10 000 cm⁻¹ with an uncertainty of between 3.3 and 5.9 per cent. This measurement is shown to be in good general agreement with a radiative-transfer model. The results from the CAVIAR field measurements are being used in ongoing studies of atmospheric absorbers, in particular the water vapour continuum.
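
A schematic of the single-blackbody gain calibration idea (not the CAVIAR procedure; instrument self-emission and offset terms are ignored, and the spectra are placeholders): the responsivity at each wavenumber is the ratio of the measured blackbody spectrum to the Planck radiance at the blackbody temperature, and the calibrated solar spectrum is the raw solar measurement divided by that responsivity.

# Single-point radiometric calibration against a high-temperature blackbody.
import numpy as np

H, C_LIGHT, KB = 6.626e-34, 2.998e8, 1.381e-23       # SI constants

def planck_wavenumber(nu_cm, T):
    """Blackbody spectral radiance per wavenumber interval (nu in cm^-1)."""
    nu_m = nu_cm * 100.0                              # cm^-1 -> m^-1
    rad = 2.0 * H * C_LIGHT**2 * nu_m**3 / np.expm1(H * C_LIGHT * nu_m / (KB * T))
    return rad * 100.0                                # per m^-1 -> per cm^-1

nu = np.linspace(2000.0, 10000.0, 2000)               # spectral grid [cm^-1]
t_bb = 3000.0                                         # blackbody temperature [K]

raw_bb = 0.8 * planck_wavenumber(nu, t_bb)            # pretend instrument output (gain 0.8)
raw_sun = 0.8 * 6.8e-5 * planck_wavenumber(nu, 5800.0)   # toy uncalibrated solar signal

responsivity = raw_bb / planck_wavenumber(nu, t_bb)   # instrument counts per radiance unit
calibrated_sun = raw_sun / responsivity               # radiometrically calibrated spectrum
print(calibrated_sun[:3])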

Relevance:

20.00%

Publisher:

Abstract:

Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation value of the score allows for a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides an overly optimistic estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
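
For reference, the standard (Murphy) decomposition of the empirical Brier score over N forecast–observation pairs, with forecasts grouped into K bins of size n_k, bin-mean forecast f_k, conditional observed frequency \bar{o}_k and base rate \bar{o}, is

\mathrm{BS} \;=\; \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k (f_k - \bar{o}_k)^2}_{\text{reliability}}
\;-\; \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k (\bar{o}_k - \bar{o})^2}_{\text{resolution}}
\;+\; \underbrace{\bar{o}\,(1 - \bar{o})}_{\text{uncertainty}}.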

Relevance:

20.00%

Publisher:

Abstract:

Global climate and weather models tend to produce rainfall that is too light and too regular over the tropical ocean. This is likely because of convective parametrizations, but the problem is not well understood. Here, distributions of precipitation rates are analyzed for high-resolution UK Met Office Unified Model simulations of a 10-day case study over a large tropical domain (∼20°S–20°N and 42°E–180°E). Simulations with 12 km grid length and parametrized convection have too many occurrences of light rain and too few of heavier rain when interpolated onto a 1° grid and compared with Tropical Rainfall Measuring Mission (TRMM) data. In fact, this version of the model appears to have a preferred scale of rainfall around 0.4 mm h⁻¹ (10 mm day⁻¹), unlike observations of tropical rainfall. On the other hand, 4 km grid length simulations with explicit convection produce distributions much more similar to TRMM observations. The apparent preferred scale at lighter rain rates seems to be a feature of the convective parametrization rather than the coarse resolution, as demonstrated by results from 12 km simulations with explicit convection and 40 km simulations with parametrized convection. In fact, coarser resolution models with explicit convection tend to have even more heavy rain than observed. Implications for models using convective parametrizations, including interactions of heating and moistening profiles with larger scales, are discussed. One important implication is that the explicit convection 4 km model has temperature and moisture tendencies that favour transitions in the convective regime. Also, the 12 km parametrized convection model produces a more stable temperature profile at its extreme high-precipitation range, which may reduce the chance of very heavy rainfall. Further study is needed to determine whether unrealistic precipitation distributions are due to some fundamental limitation of convective parametrizations or whether parametrizations can be improved, in order to better simulate these distributions.
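
A sketch of the kind of distribution comparison involved (synthetic field; the block-averaging onto a coarser grid and the bin edges are assumptions): regrid the model rain rates to the common analysis grid, then compare the frequency of occurrence in each rain-rate bin with the observed distribution.

# Frequency of occurrence of rain rates after block-averaging a fine grid onto
# a coarser one (e.g. ~0.1 deg -> 1 deg). The rain field here is synthetic.
import numpy as np

rng = np.random.default_rng(3)
fine = rng.gamma(shape=0.3, scale=2.0, size=(240, 240))        # rain rate [mm/h]

block = 10
coarse = fine.reshape(240 // block, block, 240 // block, block).mean(axis=(1, 3))

bins = np.array([0.0, 0.1, 0.4, 1.0, 2.0, 5.0, 10.0, np.inf])  # rain-rate bins [mm/h]
freq, _ = np.histogram(coarse, bins=bins)
for lower, upper, f in zip(bins[:-1], bins[1:], freq / coarse.size):
    print(f"{lower:>5} - {upper:<5} mm/h: {f:.3f}")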

Relevance:

20.00%

Publisher:

Abstract:

We show that the four-dimensional variational data assimilation method (4DVar) can be interpreted as a form of Tikhonov regularization, a very familiar method for solving ill-posed inverse problems. It is known from image restoration problems that L1-norm penalty regularization recovers sharp edges in the image more accurately than Tikhonov, or L2-norm, penalty regularization. We apply this idea from stationary inverse problems to 4DVar, a dynamical inverse problem, and give examples for an L1-norm penalty approach and a mixed total variation (TV) L1–L2-norm penalty approach. For problems with model error where sharp fronts are present and the background and observation error covariances are known, the mixed TV L1–L2-norm penalty performs better than either the L1-norm method or the strong-constraint 4DVar (L2-norm) method. A strength of the mixed TV L1–L2-norm regularization is that, in the case where a simplified form of the background error covariance matrix is used, it produces a much more accurate analysis than 4DVar. The method thus has the potential in numerical weather prediction to overcome operational problems with poorly tuned background error covariance matrices.
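
Schematically, in standard strong-constraint 4DVar notation (the L1 variant shown is a generic form for illustration, not necessarily the exact penalty used in the paper), the background term is the Tikhonov (L2) penalty:

J(x_0) = \tfrac{1}{2}(x_0 - x_b)^{\mathsf T} B^{-1} (x_0 - x_b)
       + \tfrac{1}{2}\sum_{i}\bigl(y_i - \mathcal{H}_i(x_i)\bigr)^{\mathsf T} R_i^{-1} \bigl(y_i - \mathcal{H}_i(x_i)\bigr),
\qquad x_i = \mathcal{M}_{0\to i}(x_0),

and an L1-norm approach replaces the first term by a penalty of the form \lambda\,\lVert B^{-1/2}(x_0 - x_b)\rVert_1.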