915 results for Recent Structural Models


This paper presents recent developments to a vision-based traffic surveillance system which relies extensively on the use of geometrical and scene context. Firstly, a highly parametrised 3-D model is reported, able to adopt the shape of a wide variety of different classes of vehicle (e.g. cars, vans, buses etc.), and its subsequent specialisation to a generic car class which accounts for commonly encountered types of car (including saloon, hatchback and estate cars). Sample data collected from video images, by means of an interactive tool, have been subjected to principal component analysis (PCA) to define a deformable model having 6 degrees of freedom. Secondly, a new pose refinement technique using “active” models is described, able to recover both the pose of a rigid object, and the structure of a deformable model; its performance is assessed in comparison with previously reported “passive” model-based techniques in the context of traffic surveillance. The new method is more stable, and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence. Typical applications for this work include robot surveillance and navigation tasks.
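The PCA step described above can be sketched as follows; the training matrix, its dimensions and the helper names are hypothetical stand-ins, not the paper's actual data or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: each row is a flattened vector of 3-D model
# vertex coordinates measured for one vehicle (synthetic stand-ins here).
n_samples, n_coords = 50, 30
shapes = rng.normal(size=(n_samples, n_coords))

# Centre the data and extract principal deformation modes via SVD,
# which is equivalent to PCA on the sample covariance.
mean_shape = shapes.mean(axis=0)
_, _, vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)

# Keeping the first 6 modes gives a deformable model with 6 degrees of freedom.
n_modes = 6
modes = vt[:n_modes]            # (6, n_coords) deformation basis

def deform(params):
    """Instantiate a shape from a vector of 6 deformation parameters."""
    return mean_shape + params @ modes
```

Setting all six parameters to zero recovers the mean shape; varying a single parameter sweeps the model along one principal mode of vehicle-shape variation.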

Models often underestimate blocking in the Atlantic and Pacific basins and this can lead to errors in both weather and climate predictions. Horizontal resolution is often cited as the main culprit for blocking errors due to poorly resolved small-scale variability, the upscale effects of which help to maintain blocks. Although these processes are important for blocking, the authors show that much of the blocking error diagnosed using common methods of analysis and current climate models is directly attributable to the climatological bias of the model. This explains a large proportion of diagnosed blocking error in models used in the recent Intergovernmental Panel on Climate Change report. Furthermore, greatly improved statistics are obtained by diagnosing blocking using climate model data corrected to account for mean model biases. To the extent that mean biases may be corrected in low-resolution models, this suggests that such models may be able to generate greatly improved levels of atmospheric blocking.
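The bias correction the authors apply can be illustrated with a minimal sketch; the geopotential-height series below are synthetic, and removing a single climatological mean bias stands in for the fuller correction applied before re-diagnosing blocking.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily 500 hPa geopotential height (m) at one grid point:
# a reanalysis "truth" and a model series with a mean climatological bias.
days = 360
obs = 5600 + 50 * rng.standard_normal(days)
model = obs - 80 + 20 * rng.standard_normal(days)   # model biased low ~80 m

# Remove the model's mean bias relative to observations before applying
# a blocking index to the corrected field.
bias = model.mean() - obs.mean()
model_corrected = model - bias
```

A blocking index diagnosed from `model_corrected` then measures the model's variability around an unbiased climatology rather than around its biased mean state.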

Recent laboratory observations and advances in theoretical quantum chemistry allow a reappraisal of the fundamental mechanisms that determine the water vapour self-continuum absorption throughout the infrared and millimetre wave spectral regions. By starting from a framework that partitions bimolecular interactions between water molecules into free-pair states, true bound and quasi-bound dimers, we present a critical review of recent observations, continuum models and theoretical predictions. In the near-infrared bands of the water monomer, we propose that spectral features in recent laboratory-derived self-continuum spectra can be well explained as being due to a combination of true bound and quasi-bound dimers, when the spectrum of quasi-bound dimers is approximated as being double the broadened spectrum of the water monomer. Such a representation can explain both the wavenumber variation and the temperature dependence. Recent observations of the self-continuum absorption in the windows between these near-infrared bands indicate that widely used continuum models can underestimate the true strength by around an order of magnitude. An existing far-wing model does not appear able to explain the discrepancy, and although a dimer explanation is possible, currently available observations do not allow a compelling case to be made. In the 8–12 micron window, recent observations indicate that modern continuum models do not properly represent the temperature dependence, the wavelength variation, or both. The temperature dependence is suggestive of a transition from the dominance of true bound dimers at lower temperatures to quasi-bound dimers at higher temperatures. In the mid- and far-infrared spectral region, recent theoretical calculations indicate that true bound dimers may explain at least between 20% and 40% of the observed self-continuum. The possibility that quasi-bound dimers could cause an additional contribution of the same size is discussed. Most recent theoretical considerations agree that water dimers are likely to be the dominant contributor to the self-continuum in the mm-wave spectral range.
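The quasi-bound-dimer approximation described above (double the broadened monomer spectrum) can be sketched numerically; the band shape, grid and broadening kernel below are illustrative, not fitted spectroscopic values.

```python
import numpy as np

# Toy monomer band on an arbitrary wavenumber grid (cm^-1).
wavenumber = np.linspace(0.0, 100.0, 501)
monomer = np.exp(-((wavenumber - 50.0) / 5.0) ** 2)

def broaden(spectrum, width=5):
    """Crude boxcar broadening over +/- `width` grid points."""
    kernel = np.ones(2 * width + 1) / (2 * width + 1)
    return np.convolve(spectrum, kernel, mode="same")

# The approximation: quasi-bound dimer spectrum = twice the broadened
# monomer spectrum.
quasi_bound = 2.0 * broaden(monomer)
```

Doubling preserves the factor-of-two integrated intensity while broadening smooths and lowers the peak, which is the qualitative behaviour the representation relies on.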

Blumeria graminis is an economically important obligate plant-pathogenic fungus, whose entire genome was recently sequenced and manually annotated using ab initio in silico predictions [7]. Employing large scale proteogenomic analysis we are now able to verify independently the existence of proteins predicted by 24% of the open reading frame models. We compared the haustoria and sporulating hyphae proteomes and identified 71 proteins exclusively in haustoria, the feeding and effector-delivery organs of the pathogen. These proteins are significantly smaller than the rest of the protein pool and are predicted to be secreted. Most do not share any similarities with Swiss-Prot or TrEMBL entries nor possess any identifiable Pfam domains. We used a novel automated prediction pipeline to model the 3D structures of the proteins, identify putative ligand binding sites and predict regions of intrinsic disorder. This revealed that the protein set found exclusively in haustoria is significantly less disordered than the rest of the identified Blumeria proteins or random (and representative) protein sets generated from the yeast proteome. For most of the haustorial proteins with unknown functions no good templates could be found from which to generate high quality models. Thus, these unknown proteins present potentially new protein folds that can be specific to the interaction of the pathogen with its host.

We present a comparative analysis of projected impacts of climate change on river runoff from two types of distributed hydrological model, a global hydrological model (GHM) and catchment-scale hydrological models (CHM). Analyses are conducted for six catchments that are global in coverage and feature strong contrasts in spatial scale as well as climatic and development conditions. These include the Liard (Canada), Mekong (SE Asia), Okavango (SW Africa), Rio Grande (Brazil), Xiangxi (China) and Harper's Brook (UK). A single GHM (Mac-PDM.09) is applied to all catchments whilst different CHMs are applied for each catchment. The CHMs typically simulate water resources impacts based on a more explicit representation of catchment water resources than that available from the GHM, and the CHMs include river routing. Simulations of average annual runoff, mean monthly runoff and high (Q5) and low (Q95) monthly runoff under baseline (1961-1990) and climate change scenarios are presented. We compare the simulated runoff response of each hydrological model to (1) prescribed increases in global mean temperature from the HadCM3 climate model and (2) a prescribed increase in global mean temperature of 2 °C for seven GCMs to explore response to climate model and structural uncertainty. We find that differences in projected changes of mean annual runoff between the two types of hydrological model can be substantial for a given GCM, and they are generally larger for indicators of high and low flow. However, they are relatively small in comparison to the range of projections across the seven GCMs. Hence, for the six catchments and seven GCMs we considered, climate model structural uncertainty is greater than the uncertainty associated with the type of hydrological model applied.
Moreover, shifts in the seasonal cycle of runoff with climate change are presented similarly by both hydrological models, although for some catchments the monthly timing of high and low flows differs. This implies that for studies that seek to quantify and assess the role of climate model uncertainty on catchment-scale runoff, it may be equally feasible to apply a GHM as it is to apply a CHM, especially when climate modelling uncertainty across the range of available GCMs is as large as it currently is. Whilst the GHM is able to represent the broad climate change signal that is represented by the CHMs, we find that for some catchments there are differences between GHMs and CHMs in mean annual runoff due to differences in potential evaporation estimation methods, in the representation of the seasonality of runoff, and in the magnitude of changes in extreme monthly runoff, all of which have implications for future water management issues.
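The high- and low-flow indicators used above can be computed directly from a monthly runoff series; the series below is synthetic. Note that Q5, the flow exceeded 5% of the time, corresponds to the 95th percentile of the distribution, and Q95 to the 5th.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical mean monthly runoff (mm/month) for one catchment, 30 years.
runoff = rng.gamma(shape=2.0, scale=30.0, size=30 * 12)

# Q5: monthly runoff exceeded 5% of the time (high-flow indicator).
q5 = np.percentile(runoff, 95)
# Q95: monthly runoff exceeded 95% of the time (low-flow indicator).
q95 = np.percentile(runoff, 5)
```

Comparing how q5 and q95 shift between baseline and scenario simulations gives the high- and low-flow change indicators reported in the study.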

We consider the finite sample properties of model selection by information criteria in conditionally heteroscedastic models. Recent theoretical results show that certain popular criteria are consistent in that they will select the true model asymptotically with probability 1. To examine the empirical relevance of this property, Monte Carlo simulations are conducted for a set of non-nested data generating processes (DGPs) with the set of candidate models consisting of all types of model used as DGPs. In addition, not only is the best model considered but also those with similar values of the information criterion, called close competitors, thus forming a portfolio of eligible models. To supplement the simulations, the criteria are applied to a set of economic and financial series. In the simulations, the criteria are largely ineffective at identifying the correct model, either as best or a close competitor, the parsimonious GARCH(1, 1) model being preferred for most DGPs. In contrast, asymmetric models are generally selected to represent actual data. This leads to the conjecture that the properties of parameterizations of processes commonly used to model heteroscedastic data are more similar than may be imagined and that more attention needs to be paid to the behaviour of the standardized disturbances of such models, both in simulation exercises and in empirical modelling.
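The "close competitor" idea can be sketched with an information criterion such as BIC; the log-likelihoods, parameter counts and margin below are illustrative, not values from the study.

```python
import math

# Hypothetical log-likelihoods and parameter counts for candidate
# conditionally heteroscedastic models fitted to n observations.
n = 1000
candidates = {
    "ARCH(1)":    (-1450.0, 2),
    "GARCH(1,1)": (-1408.0, 3),
    "EGARCH":     (-1405.0, 4),
    "GJR-GARCH":  (-1407.0, 4),
}

def bic(loglik, k):
    """Schwarz Bayesian information criterion (smaller is better)."""
    return -2.0 * loglik + k * math.log(n)

scores = {name: bic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)

# "Close competitors": models whose criterion value lies within a small
# margin of the best, together forming a portfolio of eligible models.
margin = 2.0
portfolio = sorted(m for m, s in scores.items() if s - scores[best] <= margin)
```

With these illustrative numbers the portfolio contains more than the single best model, which is exactly the situation the paper's portfolio analysis is designed to handle.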

Ice cloud representation in general circulation models remains a challenging task, due to the lack of accurate observations and the complexity of microphysical processes. In this article, we evaluate the ice water content (IWC) and ice cloud fraction statistical distributions from the numerical weather prediction models of the European Centre for Medium-Range Weather Forecasts (ECMWF) and the UK Met Office, exploiting the synergy between the CloudSat radar and CALIPSO lidar. Using the last three weeks of July 2006, we analyse the global ice cloud occurrence as a function of temperature and latitude and show that the models capture the main geographical and temperature-dependent distributions, but overestimate the ice cloud occurrence in the Tropics in the temperature range from −60 °C to −20 °C and in the Antarctic for temperatures higher than −20 °C, while underestimating ice cloud occurrence at very low temperatures. A global statistical comparison of the occurrence of grid-box mean IWC at different temperatures shows that both the mean and range of IWC increase with increasing temperature. Globally, the models capture most of the IWC variability in the temperature range between −60 °C and −5 °C, and also reproduce the observed latitudinal dependencies in the IWC distribution due to different meteorological regimes. Two versions of the ECMWF model are assessed. The recent operational version with a diagnostic representation of precipitating snow and mixed-phase ice cloud fails to represent the IWC distribution in the −20 °C to 0 °C range, but a new version with prognostic variables for liquid water, ice and snow is much closer to the observed distribution. The comparison of models and observations provides a much-needed analysis of the vertical distribution of IWC across the globe, highlighting the ability of the models to reproduce much of the observed variability as well as the deficiencies where further improvements are required.

The paper presents an analysis of WAXS (wide-angle X-ray scattering) data which aids an understanding of the structure of non-crystalline polymers. Experimental results are compared with calculations of scattering from possible models. Evidence is presented which supports the view that the chains in molten PE do not lie parallel but have a conformation in accord with the predictions of energy calculations. However, the evidence indicates that in “molten” PTFE the chains lie parallel over distances well in excess of their diameters. WAXS-based proposals are made for the conformations of a-PMMA and a-PS.

The results of applying a fragment-based protein tertiary structure prediction method to the prediction of 14 CASP5 target domains are described. The method is based on the assembly of supersecondary structural fragments taken from highly resolved protein structures using a simulated annealing algorithm. A number of good predictions for proteins with novel folds were produced, although not always as the first model. For two fold recognition targets, FRAGFOLD produced the most accurate model in both cases, despite the fact that the predictions were not based on a template structure. Although clear progress has been made in improving FRAGFOLD since CASP4, the ranking of final models still seems to be the main problem that needs to be addressed before the next CASP experiment.
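Fragment assembly by simulated annealing follows the standard Metropolis scheme; the toy energy function below (counting mismatches against a known optimum) merely stands in for FRAGFOLD's actual supersecondary-fragment scoring, and all numbers are illustrative.

```python
import math
import random

random.seed(0)

# Toy stand-in for fragment assembly: a "conformation" is a list of
# fragment choices, one per position, scored against a known optimum.
n_positions, n_fragments = 20, 5
target = [i % n_fragments for i in range(n_positions)]  # hypothetical optimum

def energy(conf):
    """Hypothetical energy: number of positions disagreeing with the optimum."""
    return sum(a != b for a, b in zip(conf, target))

conf = [0] * n_positions
e = energy(conf)
temperature = 2.0
for _ in range(20000):
    # Propose replacing the fragment at one randomly chosen position.
    i = random.randrange(n_positions)
    trial = conf.copy()
    trial[i] = random.randrange(n_fragments)
    de = energy(trial) - e
    # Metropolis criterion: always accept improvements; accept worse moves
    # with a probability that shrinks as the temperature cools.
    if de <= 0 or random.random() < math.exp(-de / temperature):
        conf, e = trial, e + de
    temperature *= 0.9995   # geometric cooling schedule
```

The cooling schedule lets early iterations escape local minima while late iterations behave greedily, which is the property that makes annealing suitable for rugged fragment-packing landscapes.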

The water vapour continuum is characterised by absorption that varies smoothly with wavelength, from the visible to the microwave. It is present within the rotational and vibrational–rotational bands of water vapour, which consist of large numbers of narrow spectral lines, and in the many ‘windows’ between these bands. The continuum absorption in the window regions is of particular importance for the Earth’s radiation budget and for remote-sensing techniques that exploit these windows. Historically, most attention has focused on the 8–12 μm (mid-infrared) atmospheric window, where the continuum is relatively well-characterised, but there have been many fewer measurements within bands and in other window regions. In addition, the causes of the continuum remain a subject of controversy. This paper provides a brief historical overview of the development of understanding of the continuum and then reviews recent developments, with a focus on the near-infrared spectral region. Recent laboratory measurements in near-infrared windows, which reveal absorption typically an order of magnitude stronger than in widely used continuum models, are shown to have important consequences for remote-sensing techniques that use these windows for retrieving cloud properties.

Corneal tissue engineering has improved dramatically over recent years. It is now possible to apply these technological advancements to the development of superior in vitro ocular surface models to reduce animal testing. We aim to show the effect different substrates can have on the viability of expanded corneal epithelial cells and that those which more accurately mimic the stromal surface provide the most protection against toxic assault. Compressed collagen gel as a substrate for the expansion of a human epithelial cell line was compared against two well-known substrates for modeling the ocular surface (polycarbonate membrane and conventional collagen gel). Cells were expanded over 10 days at which point cell stratification, cell number and expression of junctional proteins were assessed by electron microscopy, immunohistochemistry and RT-PCR. The effect of increasing concentrations of sodium lauryl sulphate on epithelial cell viability was quantified by MTT assay. Results showed improvement in terms of stratification, cell number and tight junction expression in human epithelial cells expanded upon either the polycarbonate membrane or compressed collagen gel when compared to a conventional collagen gel. However, cell viability was significantly higher in cells expanded upon the compressed collagen gel. We conclude that the more naturalistic composition and mechanical properties of compressed collagen gels produce a more robust corneal model.

Protein structure prediction methods aim to predict the structures of proteins from their amino acid sequences, utilizing various computational algorithms. Structural genome annotation is the process of attaching biological information to every protein encoded within a genome via the production of three-dimensional protein models.

Remotely sensed, multiannual data sets of shortwave radiative surface fluxes are now available for assimilation into land surface schemes (LSSs) of climate and/or numerical weather prediction models. The RAMI4PILPS suite of virtual experiments assesses the accuracy and consistency of the radiative transfer formulations that provide the magnitudes of absorbed, reflected, and transmitted shortwave radiative fluxes in LSSs. RAMI4PILPS evaluates models under perfectly controlled experimental conditions in order to eliminate uncertainties arising from an incomplete or erroneous knowledge of the structural, spectral and illumination-related canopy characteristics typical of model comparison with in situ observations. More specifically, the shortwave radiation is separated into a visible and near-infrared spectral region, and the quality of the simulated radiative fluxes is evaluated by direct comparison with a 3-D Monte Carlo reference model identified during the third phase of the Radiation transfer Model Intercomparison (RAMI) exercise. The RAMI4PILPS setup thus makes it possible to focus in particular on the numerical accuracy of shortwave radiative transfer formulations and to pinpoint areas where future model improvements should concentrate. The impact of increasing degrees of structural and spectral subgrid variability on the simulated fluxes is documented, and the relevance of any thus emerging biases with respect to gross primary production estimates and shortwave radiative forcings due to snow and fire events is investigated.
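A minimal closure check of the kind such intercomparisons formalise: the absorbed, reflected and transmitted shortwave fluxes simulated by an LSS should partition the incident flux exactly, in each spectral region. The flux values below are hypothetical.

```python
# Hypothetical single-canopy case, normalised incident shortwave flux = 1.
# Any radiative transfer formulation should satisfy this energy closure
# separately in the visible and near-infrared regions.
fluxes = {
    "visible":       {"reflected": 0.05, "transmitted": 0.10},
    "near_infrared": {"reflected": 0.25, "transmitted": 0.30},
}

incident = 1.0
absorbed = {}
for band, f in fluxes.items():
    # Absorbed flux is whatever the canopy neither reflects nor transmits.
    absorbed[band] = incident - f["reflected"] - f["transmitted"]
    closure = absorbed[band] + f["reflected"] + f["transmitted"]
    assert abs(closure - incident) < 1e-12
```

Reference models such as the 3-D Monte Carlo benchmark are used to judge not just this closure but the accuracy of each individual flux component.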

We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.

When performing data fusion, one often measures where targets were and then wishes to deduce where targets currently are. There has been recent research on the processing of such out-of-sequence data. This research has culminated in the development of a number of algorithms for solving the associated tracking problem. This paper reviews these different approaches in a common Bayesian framework and proposes an architecture that orthogonalises the data association and out-of-sequence problems such that any combination of solutions to these two problems can be used together. The emphasis is not on advocating one approach over another on the basis of computational expense, but rather on understanding the relationships among the algorithms so that any approximations made are explicit. Results for a multi-sensor scenario involving out-of-sequence data association are used to illustrate the utility of this approach in a specific context.
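The simplest exact baseline against which out-of-sequence algorithms are judged is to buffer the data and rerun a filter in time order; the scalar random-walk Kalman filter below sketches this, and is not one of the paper's specific algorithms.

```python
def kalman_run(measurements, q=0.1, r=1.0):
    """Run a scalar Kalman filter over (time, value) pairs in time order.

    Assumes a random-walk state with process noise q and measurement
    noise r (illustrative values, not from the paper).
    """
    x, p = 0.0, 10.0                     # diffuse prior on the state
    for _, z in sorted(measurements):    # sorting restores time order
        p += q                           # predict: add process noise
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # update with measurement z
        p *= (1.0 - k)                   # posterior variance
    return x, p

# Measurements arrive out of sequence: the t=2 report shows up last.
arrived = [(1, 0.9), (3, 1.2), (2, 1.1)]
x_est, p_est = kalman_run(arrived)
```

Reprocessing in time order gives the same estimate as if the data had arrived in order; the algorithms reviewed in the paper aim to match this result without storing and re-filtering the full history.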