191 results for Coreference resolution
Abstract:
We present an analysis of the oceanic heat advection and its variability in the upper 500 m in the southeastern tropical Pacific (100W–75W, 25S–10S) as simulated by the global coupled model HiGEM, which has one of the highest resolutions currently used in long-term integrations. The simulated climatology reveals a temperature advection field arising from transient small-scale (<450 km) features, with structures and transport that appear consistent with estimates based on available observational data for the mooring at 20S, 85W. The transient structures are very persistent (>4 months), and in specific locations they make an important contribution to the local upper-ocean heat budget, characterised by scales of a few hundred kilometres and periods of over a year. The contribution from such structures to the local, long-term oceanic heat budget, however, can be of either sign, or vanishing, depending on the location; and, although there appears to be some organisation into preferential areas of activity, the average over the entire region is small. While several different mechanisms may be responsible for the temperature advection by transients, we find that a significant, and possibly dominant, component is associated with vortices embedded in the large-scale, climatological salinity gradient associated with the fresh intrusion of mid-latitude intermediate water which penetrates north-westward beneath the tropical thermocline.
Abstract:
Tropical cyclones (TCs) are normally not studied at the individual level with Global Climate Models (GCMs), because the coarse grid spacing is often deemed insufficient for a realistic representation of the basic underlying processes. GCMs are indeed routinely deployed at low resolution in order to enable sufficiently long integrations, which means that only large-scale TC proxies are diagnosed. A new class of GCMs is emerging, however, which is capable of simulating TC-type vortices by retaining a horizontal resolution similar to that of operational NWP GCMs; their integration on the latest supercomputers enables the completion of long-term integrations. The UK-Japan Climate Collaboration (UJCC) and UK-HiGEM projects have developed climate GCMs which can be run routinely for decades (with a grid spacing of 60 km) or centuries (with a grid spacing of 90 km); when coupled to the ocean GCM, a mesh of 1/3 degree provides eddy-permitting resolution. The 90 km resolution model has been developed entirely by the UK-HiGEM consortium (together with its 1/3 degree ocean component); the 60 km atmospheric GCM has been developed by UJCC, in collaboration with the Met Office Hadley Centre.
High resolution Northern Hemisphere wintertime mid-latitude dynamics during the Last Glacial Maximum
Abstract:
Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretations. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and an increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada, terminating in the Labrador Sea. This result coincides with other model results in also showing a significant reduction in Greenland wintertime precipitation – a response supported by ice core evidence. Higher temporal resolution puts in sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain “early” glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness, supporting observational inferences that upper-level winds associated with Asian–Pacific storms transported Asian dust to Greenland during the LGM.
Abstract:
Visual telepresence seeks to extend existing teleoperative capability by supplying the operator with a 3D interactive view of the remote environment. This is achieved through the use of a stereo camera platform which, through appropriate 3D display devices, provides a distinct image to each eye of the operator, and which is slaved directly from the operator's head and eye movements. However, the resolution within current head-mounted displays remains poor, thereby reducing the operator's visual acuity. This paper reports on the feasibility of incorporating eye tracking to increase resolution and investigates the stability and control issues for such a system. Continuous-domain and discrete simulations are presented which indicate that eye tracking provides a stable feedback loop for tracking applications, though some empirical testing (currently being initiated) of such a system will be required to overcome indicated stability problems associated with the microsaccades of the human operator.
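To make the stability question concrete, the following is a minimal discrete-time sketch, assuming an illustrative proportional servo with loop delay and microsaccade-like jitter; the gain, delay, and sample rate are assumptions for illustration, not values from the paper:

```python
# Illustrative discrete-time sketch of a gaze-slaved camera servo: the camera
# angle chases the operator's gaze through a delayed proportional feedback
# loop. Gain, delay, and jitter values are assumptions, not paper parameters.
import math

def simulate(gain, delay_steps=2, n_steps=200, dt=0.01):
    camera = 0.0
    delayed = [0.0] * delay_steps          # pipeline of stale error measurements
    final_error = 0.0
    for k in range(n_steps):
        # target gaze: fixed offset plus small microsaccade-like jitter
        gaze = 1.0 + 0.05 * math.sin(2 * math.pi * 8 * k * dt)
        delayed.append(gaze - camera)      # newest error enters the pipeline
        camera += gain * delayed.pop(0)    # correction uses the oldest error
        final_error = abs(gaze - camera)
    return final_error

for g in (0.3, 0.9, 1.5):
    print(f"gain={g}: final |error| = {simulate(g):.3g}")
```

With a modest gain the delayed loop settles to a small tracking error; raising the gain with the same delay drives the error unstable, which is the kind of stability limit such simulations probe.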
Abstract:
The accurate prediction of the biochemical function of a protein is becoming increasingly important, given the unprecedented growth of both structural and sequence databanks. Consequently, computational methods are required to analyse such data in an automated manner to ensure genomes are annotated accurately. Protein structure prediction methods, for example, are capable of generating approximate structural models on a genome-wide scale. However, the detection of functionally important regions in such crude models, as well as in structural genomics targets, remains an extremely important problem. The method described in the current study, MetSite, represents a fully automatic approach for the detection of metal-binding residue clusters applicable to protein models of moderate quality. The method involves using sequence profile information in combination with approximate structural data. Several neural network classifiers are shown to be able to distinguish metal sites from non-sites with a mean accuracy of 94.5%. The method was demonstrated to identify metal-binding sites correctly in LiveBench targets where no obvious metal-binding sequence motifs were detectable using InterPro. Accurate detection of metal sites was shown to be feasible for low-resolution predicted structures generated using mGenTHREADER, where no side-chain information was available. High-scoring predictions were observed for a recently solved hypothetical protein from Haemophilus influenzae, indicating a putative metal-binding site.
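As an illustration of the classification step, the following is a minimal sketch of a neural-network site/non-site classifier trained on synthetic feature vectors; the feature set, architecture, and data are assumptions for illustration, not MetSite's actual inputs or training procedure:

```python
# Minimal sketch of a metal-site vs. non-site classifier in the spirit of the
# approach described: each candidate residue cluster is summarised by sequence
# -profile and coarse backbone-level structural features. Features and data
# are synthetic assumptions for illustration.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Assumed features per cluster: profile scores for Cys/His/Asp/Glu, mean
# pairwise C-alpha distance (A), and cluster solvent-exposure fraction.
X_site = rng.normal(loc=[2.0, 1.5, 1.0, 1.0, 6.5, 0.3], scale=0.5, size=(n // 2, 6))
X_non = rng.normal(loc=[0.0, 0.0, 0.0, 0.0, 9.0, 0.6], scale=0.5, size=(n // 2, 6))
X = np.vstack([X_site, X_non])
y = np.array([1] * (n // 2) + [0] * (n // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")
```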
Abstract:
World-wide structural genomics initiatives are rapidly accumulating structures for which limited functional information is available. Additionally, state-of-the-art structural prediction programs are now capable of generating at least low-resolution structural models of target proteins. Accurate detection and classification of functional sites within both solved and modelled protein structures therefore represents an important challenge. We present a fully automatic site detection method, FuncSite, that uses neural network classifiers to predict the location and type of functionally important sites in protein structures. The method is designed primarily to require only backbone residue positions, without the need for specific side-chain atoms to be present. In order to highlight effective site detection in low-resolution structural models, FuncSite was used to screen model proteins generated using mGenTHREADER on a set of newly released structures. We found effective metal site detection even for moderate-quality protein models, illustrating the robustness of the method.
Abstract:
This paper introduces new insights into the hydrochemical functioning of lowland river systems using field-based spectrophotometric and electrode technologies. The streamwater concentrations of nitrogen species and phosphorus fractions were measured at hourly intervals on a continuous basis at two contrasting sites on tributaries of the River Thames – one draining a rural catchment, the River Enborne, and one draining a more urban system, The Cut. The measurements complement those from an existing network of multi-parameter water quality sondes maintained across the Thames catchment and weekly monitoring based on grab samples. The results of the sub-daily monitoring show that streamwater phosphorus concentrations display highly complex dynamics under storm conditions, dependent on the antecedent catchment wetness, and that diurnal phosphorus and nitrogen cycles occur under low-flow conditions. The diurnal patterns highlight the dominance of sewage inputs in controlling the streamwater phosphorus and nitrogen concentrations at low flows, even at a distance of 7 km from the nearest sewage treatment works in the rural River Enborne. The time of sample collection is important when judging water quality against ecological thresholds or standards. An exhaustion of the supply of phosphorus from diffuse and multiple septic-tank sources during storm events was evident, and load estimation was not improved by sub-daily monitoring beyond that achieved by daily sampling, because of the eventual reduction in the phosphorus mass entering the stream during events. The results highlight the utility of sub-daily water quality measurements, and the discussion considers the practicalities and challenges of in situ, sub-daily monitoring.
Abstract:
We present an approach for dealing with coarse-resolution Earth observations (EO) in terrestrial ecosystem data assimilation schemes. The use of coarse-scale observations in ecological data assimilation schemes is complicated by spatial heterogeneity and nonlinear processes in natural ecosystems. If these complications are not appropriately dealt with, the data assimilation will produce biased results. The “disaggregation” approach that we describe in this paper combines frequent coarse-resolution observations with temporally sparse fine-resolution measurements. We demonstrate the approach using a demonstration data set based on measurements of an Arctic ecosystem. In this example, normalized difference vegetation index observations are assimilated into a “zero-order” model of leaf area index and carbon uptake. The disaggregation approach conserves key ecosystem characteristics regardless of the observation resolution and estimates the carbon uptake to within 1% of the demonstration data set “truth.” Assimilating the same data in the normal manner, but without the disaggregation approach, results in carbon uptake being underestimated by 58% at an observation resolution of 250 m. The disaggregation method allows the combination of multiresolution EO, and its spatial resolution improves if observations are located on a grid that shifts from one observation time to the next. Additionally, the approach is not tied to a particular data assimilation scheme, model, or EO product, and it can cope with complex observation distributions, as it makes no implicit assumptions of normality.
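A small sketch of the underlying pitfall, assuming an illustrative saturating uptake curve and synthetic NDVI values (not the paper's model or data), shows why driving a nonlinear model with a pixel-mean observation biases the aggregate estimate; the sign and size of the bias depend on the model's curvature:

```python
# Illustrative sketch of why coarse-resolution observations bias a nonlinear
# ecosystem model: the model applied to the pixel-mean NDVI is not the mean of
# the model applied to the fine-scale NDVI (Jensen's inequality). The uptake
# curve and NDVI values below are assumptions for illustration only.
import numpy as np

def uptake(ndvi):
    """Assumed saturating carbon-uptake response to NDVI."""
    return 10.0 * (1.0 - np.exp(-3.0 * ndvi))

rng = np.random.default_rng(1)
fine_ndvi = rng.uniform(0.1, 0.9, size=2500)   # heterogeneous fine-scale pixels
coarse_ndvi = fine_ndvi.mean()                 # one coarse observation

truth = uptake(fine_ndvi).mean()               # aggregate of fine-scale fluxes
naive = uptake(coarse_ndvi)                    # model driven by the coarse mean
print(f"truth={truth:.2f}, naive={naive:.2f}, bias={100*(naive-truth)/truth:+.1f}%")
```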
Abstract:
The ability to run General Circulation Models (GCMs) at ever-higher horizontal resolutions has meant that tropical cyclone simulations are increasingly credible. A hierarchy of atmosphere-only GCMs, based on the Hadley Centre Global Environmental Model (HadGEM1), with horizontal resolution increasing from approximately 270 km to 60 km (at 50°N), is used to systematically investigate the impact of spatial resolution on the simulation of global tropical cyclone activity, independent of model formulation. Tropical cyclones are extracted from ensemble simulations and reanalyses of comparable resolutions using a feature-tracking algorithm. Resolution is critical for simulating storm intensity, and convergence to observed storm intensities is not achieved with the model hierarchy. Resolution is less critical for simulating the annual number of tropical cyclones and their geographical distribution, which are well captured at resolutions of 135 km or higher, particularly for Northern Hemisphere basins. Simulating the interannual variability of storm occurrence requires resolutions of 100 km or higher; however, the level of skill is basin dependent. Higher-resolution GCMs are increasingly able to capture the interannual variability of the large-scale environmental conditions that contribute to tropical cyclogenesis. Different environmental factors contribute to the interannual variability of tropical cyclones in the different basins: in the North Atlantic basin the vertical wind shear, potential intensity and low-level absolute vorticity are dominant, while in the North Pacific basins mid-level relative humidity and low-level absolute vorticity are dominant. Model resolution is crucial for a realistic simulation of tropical cyclone behaviour, and high-resolution GCMs are found to be valuable tools for investigating the global location and frequency of tropical cyclones.
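For illustration, the following is a minimal sketch of the nearest-neighbour linking step in a feature tracker of this general kind; the displacement threshold and candidate centres are assumptions, and the tracker actually used is considerably more sophisticated:

```python
# Minimal sketch of feature tracking of the kind used to extract tropical
# cyclones: link candidate vortex centres (e.g. low-level vorticity maxima)
# across timesteps by nearest neighbour within a maximum travel distance.
# Thresholds and candidate centres below are assumptions for illustration.
import math

MAX_KM_PER_STEP = 400.0  # assumed maximum displacement between timesteps

def dist_km(a, b):
    """Rough planar distance between (lat, lon) points given in degrees."""
    lat = math.radians((a[0] + b[0]) / 2)
    dx = (b[1] - a[1]) * 111.0 * math.cos(lat)
    dy = (b[0] - a[0]) * 111.0
    return math.hypot(dx, dy)

def link_tracks(frames):
    """frames: list of lists of (lat, lon) vortex centres, one list per step."""
    tracks = [[c] for c in frames[0]]
    for centres in frames[1:]:
        unclaimed = list(centres)
        for tr in tracks:
            if not unclaimed:
                break
            best = min(unclaimed, key=lambda c: dist_km(tr[-1], c))
            if dist_km(tr[-1], best) <= MAX_KM_PER_STEP:
                tr.append(best)
                unclaimed.remove(best)
        tracks.extend([c] for c in unclaimed)  # new storms start new tracks
    return tracks

frames = [[(12.0, -45.0)], [(12.5, -47.0)], [(13.2, -49.5), (10.0, -30.0)]]
for i, tr in enumerate(link_tracks(frames)):
    print(f"track {i}: {tr}")
```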
Abstract:
Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
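The following is a minimal sketch of the thinning step, assuming a simple top-down splitting rule and an illustrative decorrelation length; the full method also applies the slope, vegetation, and urban shadow/layover screening described above:

```python
# Minimal sketch of thinning waterline points by top-down clustering so that
# the retained levels are approximately uncorrelated in space. The splitting
# rule and the decorrelation length are assumptions for illustration.
import numpy as np

DECORR_LENGTH_M = 500.0  # assumed minimum separation between kept points

def thin(points):
    """points: (N, 2) easting/northing array; returns one point per cluster."""
    pts = np.asarray(points, dtype=float)
    span = pts.max(axis=0) - pts.min(axis=0)
    if len(pts) == 1 or span.max() <= DECORR_LENGTH_M:
        return [pts.mean(axis=0)]             # representative point for cluster
    axis = int(np.argmax(span))               # split along the longest extent
    cut = np.median(pts[:, axis])
    left = pts[pts[:, axis] <= cut]
    right = pts[pts[:, axis] > cut]
    if len(left) == 0 or len(right) == 0:     # degenerate split; stop here
        return [pts.mean(axis=0)]
    return thin(left) + thin(right)

rng = np.random.default_rng(2)
waterline = rng.uniform(0, 3000, size=(200, 2))  # synthetic candidate points
kept = thin(waterline)
print(f"{len(waterline)} candidates -> {len(kept)} thinned points")
```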
Abstract:
The sensitivity to the horizontal resolution of the climate, anthropogenic climate change, and seasonal predictive skill of the ECMWF model has been studied as part of Project Athena, an international collaboration formed to test the hypothesis that substantial progress in simulating and predicting climate can be achieved if mesoscale and subsynoptic atmospheric phenomena are more realistically represented in climate models. In this study the experiments carried out with the ECMWF model (atmosphere only) are described in detail. Here, the focus is on the tropics and the Northern Hemisphere extratropics during boreal winter. The resolutions considered in Project Athena for the ECMWF model are T159 (126 km), T511 (39 km), T1279 (16 km), and T2047 (10 km). It was found that increasing horizontal resolution improves the tropical precipitation, the tropical atmospheric circulation, the frequency of occurrence of Euro-Atlantic blocking, and the representation of extratropical cyclones in large parts of the Northern Hemisphere extratropics. All of these improvements come from the increase in resolution from T159 to T511, with relatively small changes for further resolution increases to T1279 and T2047, although it should be noted that results from this very highest resolution are from a previously untested model version. Problems in simulating the Madden–Julian oscillation remain unchanged for all resolutions tested. There is some evidence that increasing horizontal resolution to T1279 leads to moderate increases in seasonal forecast skill during boreal winter in the tropics and Northern Hemisphere extratropics. Sensitivity experiments are discussed, which help to foster a better understanding of some of the resolution dependence found for the ECMWF model in Project Athena.
Abstract:
Solar-pointing Fourier transform infrared (FTIR) spectroscopy offers the capability to measure both the fine-scale and broadband spectral structure of atmospheric transmission simultaneously across wide spectral regions. It is therefore suited to the study of both water vapour monomer and continuum absorption behaviours. However, in order to properly address this issue, it is necessary to radiatively calibrate the FTIR instrument response. A solar-pointing high-resolution FTIR spectrometer was deployed as part of the ‘Continuum Absorption by Visible and Infrared radiation and its Atmospheric Relevance’ (CAVIAR) consortium project. This paper describes the radiative calibration process using an ultra-high-temperature blackbody and the consideration of the related influence factors. The result is a radiatively calibrated measurement of the solar irradiation at the ground across the IR region from 2000 to 10 000 cm⁻¹, with an uncertainty of between 3.3 and 5.9 per cent. This measurement is shown to be in good general agreement with a radiative-transfer model. The results from the CAVIAR field measurements are being used in ongoing studies of atmospheric absorbers, in particular the water vapour continuum.
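For orientation, a single-reference radiometric calibration takes the following generic form, assuming a linear, offset-free instrument response; the CAVIAR-specific procedure, including its treatment of influence factors, is described in the paper:

```latex
% Generic gain calibration against a blackbody of known temperature T_bb.
% S: measured spectrum; B(nu, T): Planck radiance; G: instrument gain.
% Illustrative form only; real procedures also handle offset/background terms.
\[
  G(\nu) = \frac{S_{\mathrm{bb}}(\nu)}{B(\nu, T_{\mathrm{bb}})},
  \qquad
  L_{\mathrm{sun}}(\nu) = \frac{S_{\mathrm{sun}}(\nu)}{G(\nu)}
\]
```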
Abstract:
Proper scoring rules provide a useful means to evaluate probabilistic forecasts. Independently of scoring rules, it has been argued that reliability and resolution are desirable forecast attributes. The mathematical expectation value of the score allows for a decomposition into reliability- and resolution-related terms, demonstrating a relationship between scoring rules and reliability/resolution. A similar decomposition holds for the empirical (i.e. sample average) score over an archive of forecast–observation pairs. This empirical decomposition, though, provides an overly optimistic estimate of the potential score (i.e. the optimum score which could be obtained through recalibration), showing that a forecast assessment based solely on the empirical resolution and reliability terms will be misleading. The differences between the theoretical and empirical decompositions are investigated, and specific recommendations are given on how to obtain better estimators of reliability and resolution in the case of the Brier and Ignorance scoring rules.
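For concreteness, the classical Murphy decomposition of the empirical Brier score makes these terms explicit. Over N forecast–observation pairs binned by the K distinct forecast probabilities f_k, with n_k cases in bin k, observed frequency \bar{o}_k within bin k, and climatological frequency \bar{o}:

```latex
% Murphy decomposition of the empirical Brier score into reliability,
% resolution, and uncertainty terms.
\[
  \mathrm{BS}
  = \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k \,(f_k - \bar{o}_k)^2}_{\text{reliability}}
  \;-\; \underbrace{\frac{1}{N}\sum_{k=1}^{K} n_k \,(\bar{o}_k - \bar{o})^2}_{\text{resolution}}
  \;+\; \underbrace{\bar{o}\,(1 - \bar{o})}_{\text{uncertainty}}
\]
```

The caveat raised above is that plugging sample averages into the reliability and resolution terms gives an overly optimistic estimate of the score attainable by recalibration, so these terms require more careful estimation.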