19 results for equivalent web thickness method

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

In principle the global mean geostrophic surface circulation of the ocean can be diagnosed by subtracting a geoid from a mean sea surface (MSS). However, because the resulting mean dynamic topography (MDT) is approximately two orders of magnitude smaller than either of the constituent surfaces, and because the geoid is most naturally expressed as a spectral model while the MSS is a gridded product, in practice complications arise. Two algorithms for combining MSS and satellite-derived geoid data to determine the ocean’s MDT are considered in this paper: a pointwise approach, whereby the gridded geoid height field is subtracted from the gridded MSS; and a spectral approach, whereby the spherical harmonic coefficients of the geoid are subtracted from an equivalent set of coefficients representing the MSS, from which the gridded MDT is then obtained. The essential difference is that with the latter approach the MSS is truncated, a form of filtering, just as with the geoid. This ensures that errors of omission resulting from the truncation of the geoid, which are small in comparison to the geoid but large in comparison to the MDT, are matched, and therefore negated, by similar errors of omission in the MSS. The MDTs produced by both methods require additional filtering. However, the spectral MDT requires less filtering to remove noise, and therefore it retains more oceanographic information than its pointwise equivalent. The spectral method also results in a more realistic MDT at coastlines.

1. Introduction

An important challenge in oceanography is the accurate determination of the ocean’s time-mean dynamic topography (MDT). If this can be achieved with sufficient accuracy for combination with the time-dependent component of the dynamic topography, obtainable from altimetric data, then the resulting sum (i.e., the absolute dynamic topography) will give an accurate picture of surface geostrophic currents and ocean transports.
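To illustrate why truncating both surfaces matters, the sketch below uses a planar 2-D FFT analogue of the spherical-harmonic calculation: a small, smooth "MDT" hidden under a much larger "geoid" known only up to a truncation wavenumber. This is not the authors' implementation; the spectra, amplitudes and truncation level are invented, and the FFT stands in for spherical harmonics purely for illustration.

```python
import numpy as np

# Toy planar analogue of the pointwise vs. spectral MDT computation.
rng = np.random.default_rng(0)
n = 256
k = np.fft.fftfreq(n)[:, None] ** 2 + np.fft.fftfreq(n)[None, :] ** 2

geoid_hat = rng.normal(size=(n, n)) / (1.0 + 1e4 * k)          # large, red spectrum
mdt_hat   = rng.normal(size=(n, n)) / (1.0 + 1e6 * k) * 0.01   # small and smooth
mss_hat   = geoid_hat + mdt_hat                                 # MSS = geoid + MDT

# Crude low-pass mask standing in for the geoid's spherical-harmonic truncation.
lmax = 30
thresh = np.sort(np.sqrt(k).ravel())[lmax * n]
keep = np.sqrt(k) <= thresh

# Pointwise approach: full-resolution MSS minus the truncated geoid.
mss_grid         = np.fft.ifft2(mss_hat).real
geoid_trunc_grid = np.fft.ifft2(np.where(keep, geoid_hat, 0)).real
mdt_pointwise    = mss_grid - geoid_trunc_grid

# Spectral approach: truncate the MSS to the same degree before subtracting,
# so the omission errors of the two surfaces cancel.
mdt_spectral = np.fft.ifft2(np.where(keep, mss_hat - geoid_hat, 0)).real

mdt_true = np.fft.ifft2(mdt_hat).real
print("RMS error, pointwise:", np.sqrt(np.mean((mdt_pointwise - mdt_true) ** 2)))
print("RMS error, spectral :", np.sqrt(np.mean((mdt_spectral - mdt_true) ** 2)))
```

The pointwise estimate carries the geoid's omission error, which dwarfs the small MDT signal; the spectral estimate only loses the MDT's own high-wavenumber content, which is tiny, hence the much lower noise before any additional filtering.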

Relevance:

100.00%

Publisher:

Abstract:

This article introduces a new general method for genealogical inference that samples independent genealogical histories using importance sampling (IS) and then samples other parameters with Markov chain Monte Carlo (MCMC). It is then possible to more easily utilize the advantages of importance sampling in a fully Bayesian framework. The method is applied to the problem of estimating recent changes in effective population size from temporally spaced gene frequency data. The method gives the posterior distribution of effective population size at the time of the oldest sample and at the time of the most recent sample, assuming a model of exponential growth or decline during the interval. The effect of changes in number of alleles, number of loci, and sample size on the accuracy of the method is described using test simulations, and it is concluded that these have an approximately equivalent effect. The method is used on three example data sets and problems in interpreting the posterior densities are highlighted and discussed.
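To make the two-stage structure concrete, here is a minimal toy sketch of the same idea: latent histories are drawn once by importance sampling from a fixed proposal, and a Metropolis sampler for the remaining parameter reuses those weighted samples to estimate the marginal likelihood. The model (a single normal latent variable with unknown mean) is purely illustrative and is not the genealogical model of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: one latent "history" z ~ N(theta, 1), observations y_i ~ N(z, 1).
theta_true = 2.0
z_hidden = rng.normal(theta_true, 1.0)
y = rng.normal(z_hidden, 1.0, size=20)

# Stage 1: draw latent histories once, by importance sampling from a fixed proposal q.
M = 2000
z = rng.normal(y.mean(), 1.0, size=M)                  # proposal q(z)
log_q = -0.5 * (z - y.mean()) ** 2                     # log q(z), up to a constant
log_lik_y = np.array([-0.5 * np.sum((y - zi) ** 2) for zi in z])  # log p(y | z)

def log_marginal(theta):
    """IS estimate of log p(y | theta), reusing the same latent samples."""
    log_w = -0.5 * (z - theta) ** 2 + log_lik_y - log_q   # log p(z|theta)p(y|z)/q(z)
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))

# Stage 2: Metropolis sampler over theta (flat prior), using the IS likelihood estimate.
theta, cur, trace = 0.0, log_marginal(0.0), []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.3)
    new = log_marginal(prop)
    if np.log(rng.uniform()) < new - cur:
        theta, cur = prop, new
    trace.append(theta)

print("posterior mean of theta:", np.mean(trace[1000:]))
```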

Relevance:

100.00%

Publisher:

Abstract:

The ASTER Global Digital Elevation Model (GDEM) has made elevation data at 30 m spatial resolution freely available, enabling reinvestigation of morphometric relationships derived from limited field data using much larger sample sizes. These data are used to analyse a range of morphometric relationships derived for dunes (between dune height, spacing, and equivalent sand thickness) in the Namib Sand Sea, which was chosen because there are a number of extant studies that could be used for comparison with the results. The relative accuracy of GDEM for capturing dune height and shape was tested against multiple individual ASTER DEM scenes and against field surveys, highlighting the smoothing of the dune crest and resultant underestimation of dune height, and the omission of the smallest dunes, because of the 30 m sampling of ASTER DEM products. It is demonstrated that morphometric relationships derived from GDEM data are broadly comparable with relationships derived by previous methods, across a range of different dune types. The data confirm patterns of dune height, spacing and equivalent sand thickness mapped previously in the Namib Sand Sea, but add new detail to these patterns.

Relevance:

100.00%

Publisher:

Abstract:

The increased availability of digital elevation models and satellite image data enables testing of morphometric relationships between sand dune variables (dune height, spacing and equivalent sand thickness), which were originally established using limited field survey data. These long-established geomorphological hypotheses can now be tested against very much larger samples than were possible when available data were limited to what could be collected by field surveys alone. This project uses ASTER Global Digital Elevation Model (GDEM) data to compare morphometric relationships between sand dune variables in the southwest Kalahari dunefield to those of the Namib Sand Sea, to test whether the relationships found in an active sand sea (Namib) also hold for the fixed dune system of the nearby southwest Kalahari. The data show significant morphometric differences between the simple linear dunes of the Namib Sand Sea and the southwest Kalahari; the latter do not show the expected positive relationship between dune height and spacing. The southwest Kalahari dunes show a similar range of dune spacings, but they are less tall, on average, than the Namib Sand Sea dunes. There is a clear spatial pattern to these morphometric data: the tallest and most closely spaced dunes are towards the southeast of the Kalahari dunefield, and this is where the highest values of equivalent sand thickness occur. We consider the possible reasons for the observed differences and highlight the need for more studies comparing sand seas and dunefields from different environmental settings.

Relevance:

100.00%

Publisher:

Abstract:

Protein–ligand binding site prediction methods aim to predict, from amino acid sequence, protein–ligand interactions, putative ligands, and ligand binding site residues using either sequence information, structural information, or a combination of both. In silico characterization of protein–ligand interactions has become extremely important to help determine a protein’s functionality, as in vivo-based functional elucidation is unable to keep pace with the current growth of sequence databases. Additionally, in vitro biochemical functional elucidation is time-consuming, costly, and may not be feasible for large-scale analysis, such as drug discovery. Thus, in silico prediction of protein–ligand interactions must be utilized to aid in functional elucidation. Here, we briefly discuss protein function prediction, prediction of protein–ligand interactions, the Critical Assessment of Techniques for Protein Structure Prediction (CASP) and the Continuous Automated Model EvaluatiOn (CAMEO) competitions, along with their role in shaping the field. We also discuss, in detail, our cutting-edge web-server method, FunFOLD, for the structurally informed prediction of protein–ligand interactions. Furthermore, we provide a step-by-step guide on using the FunFOLD web server and FunFOLD3 downloadable application, along with some real-world examples where the FunFOLD methods have been used to aid functional elucidation.

Relevance:

40.00%

Publisher:

Abstract:

Web service composition can be facilitated by an automatic process consisting of rules, conditions and actions. This research adapts the Elementary Petri Net (EPN) formalism to analyze and model web services and their composition. This paper describes a set of techniques for representing the transition rules, algorithm and workflow by which web service composition can be carried out automatically.
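As a minimal illustration of the kind of model involved, the sketch below encodes an elementary Petri net (places holding at most one token; a transition fires when all its input places are marked and all its output places are empty) and fires transitions until a goal place is marked, yielding a composition sequence. The service names and net structure are invented for illustration and are not taken from the paper.

```python
# Minimal elementary Petri net: places hold 0 or 1 token; a transition is
# enabled when every input place is marked and every output place is empty.
transitions = {
    "invoke_search":  {"in": ["request_received"], "out": ["flights_found"]},
    "invoke_booking": {"in": ["flights_found"],    "out": ["booking_made"]},
    "invoke_payment": {"in": ["booking_made"],     "out": ["order_complete"]},
}

marking = {"request_received": 1, "flights_found": 0,
           "booking_made": 0, "order_complete": 0}

def enabled(t, m):
    return (all(m[p] == 1 for p in transitions[t]["in"]) and
            all(m[p] == 0 for p in transitions[t]["out"]))

def fire(t, m):
    for p in transitions[t]["in"]:
        m[p] = 0
    for p in transitions[t]["out"]:
        m[p] = 1

# Fire enabled transitions until the goal place is marked (the "composition").
plan = []
while marking["order_complete"] == 0:
    t = next((t for t in transitions if enabled(t, marking)), None)
    if t is None:
        break                      # no enabled transition: composition fails
    fire(t, marking)
    plan.append(t)

print("composed sequence:", plan)
```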

Relevance:

40.00%

Publisher:

Abstract:

A sampling oscilloscope is one of the main units in an automatic pulse measurement system (APMS). Time jitter in waveform samplers is an important error source that affects the precision of data acquisition. In this paper, this kind of error is greatly reduced by using a deconvolution method. First, the probability density function (PDF) of the time-jitter distribution is determined by a statistical approach; this PDF is then used as the convolution kernel to deconvolve the acquired (and additionally averaged) waveform data. The result is waveform data from which the effect of time jitter has been removed, so the measurement precision of the APMS is greatly improved. Computer simulations are also presented which demonstrate the success of the method.
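A rough sketch of this kind of correction, assuming the jitter PDF has already been estimated: the averaged waveform is modelled as the true waveform convolved with the jitter PDF, and a regularised frequency-domain division recovers an estimate of the true waveform. The pulse shape, jitter width and regularisation constant below are invented for illustration and are not the paper's values.

```python
import numpy as np

# Time axis and a "true" fast edge (hypothetical pulse shape).
n, dt = 1024, 1e-12
t = np.arange(n) * dt
true = 0.5 * (1 + np.tanh((t - 200e-12) / 5e-12))

# Jitter PDF (assumed Gaussian here; its width would be estimated statistically).
sigma = 15e-12
pdf = np.exp(-0.5 * ((t - t[n // 2]) / sigma) ** 2)
pdf /= pdf.sum()

# The averaged acquisition is (approximately) the true waveform convolved
# with the jitter PDF, plus residual noise.
rng = np.random.default_rng(0)
H = np.fft.fft(np.fft.ifftshift(pdf))
measured = np.real(np.fft.ifft(np.fft.fft(true) * H)) + rng.normal(0, 1e-3, n)

# Wiener-style deconvolution: frequency-domain division with a small
# regularisation term so noise does not blow up where |H| is tiny.
eps = 1e-3
restored = np.real(np.fft.ifft(np.fft.fft(measured) * np.conj(H) / (np.abs(H) ** 2 + eps)))

print("RMS error before:", np.sqrt(np.mean((measured - true) ** 2)))
print("RMS error after :", np.sqrt(np.mean((restored - true) ** 2)))
```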

Relevance:

30.00%

Publisher:

Abstract:

In this paper we consider the problem of time-harmonic acoustic scattering in two dimensions by convex polygons. Standard boundary or finite element methods for acoustic scattering problems have a computational cost that grows at least linearly as a function of the frequency of the incident wave. Here we present a novel Galerkin boundary element method, which uses an approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh, with smaller elements closer to the corners of the polygon. We prove that the best approximation from the approximation space requires a number of degrees of freedom to achieve a prescribed level of accuracy that grows only logarithmically as a function of the frequency. Numerical results demonstrate the same logarithmic dependence on the frequency for the Galerkin method solution. Our boundary element method is a discretization of a well-known second kind combined-layer-potential integral equation. We provide a proof that this equation and its adjoint are well-posed and equivalent to the boundary value problem in a Sobolev space setting for general Lipschitz domains.
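The two ingredients named above can be sketched very simply: a mesh along one side of the polygon that is graded algebraically towards the corners, and basis functions that are local piecewise polynomials multiplied by plane-wave factors e^{±iks}. The grading exponent, polynomial degree and wavenumber below are placeholders, not the values analysed in the paper.

```python
import numpy as np

k = 50.0          # wavenumber (placeholder)
L = 1.0           # length of one polygon side
n_half = 16       # elements per half-side
q = 3             # grading exponent: smaller elements near the corners

# Mesh points on [0, L], graded towards both endpoints (the corners).
left = 0.5 * L * (np.linspace(0.0, 1.0, n_half + 1) ** q)
right = L - left[::-1]
nodes = np.unique(np.concatenate([left, right]))

def basis(j, s, direction=+1, degree=1):
    """Hybrid basis: a local polynomial on element j times a plane wave e^{i*dir*k*s}."""
    a, b = nodes[j], nodes[j + 1]
    inside = (s >= a) & (s <= b)
    xi = np.where(inside, (s - a) / (b - a), 0.0)   # local coordinate in [0, 1]
    poly = xi ** degree                              # one member of the local polynomial basis
    return inside * poly * np.exp(1j * direction * k * s)

s = np.linspace(0.0, L, 2000)
print("smallest element size:", np.min(np.diff(nodes)))
print("largest element size :", np.max(np.diff(nodes)))
phi = basis(len(nodes) // 2, s)                      # a mid-edge basis function
print("max |phi| =", np.abs(phi).max())
```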

Relevance:

30.00%

Publisher:

Abstract:

The storage and processing capacity realised by computing has led to an explosion of data retention. We have now reached the point of information overload and must begin to use computers to process more complex information. In particular, the proposition of the Semantic Web has given structure to this problem, but it has yet to be realised practically. The largest of its problems is that of ontology construction; without a suitable automatic method, most ontologies will have to be encoded by hand. In this paper we discuss the current methods for semi- and fully automatic construction and their current shortcomings. In particular, we pay attention to the application of ontologies to products and to the practical application of those ontologies.

Relevance:

30.00%

Publisher:

Abstract:

Radiation schemes in general circulation models currently make a number of simplifications when accounting for clouds, one of the most important being the removal of horizontal inhomogeneity. A new scheme is presented that attempts to account for the neglected inhomogeneity by using two regions of cloud in each vertical level of the model as opposed to one. One of these regions is used to represent the optically thinner cloud in the level, and the other represents the optically thicker cloud. So, along with the clear-sky region, the scheme has three regions in each model level and is referred to as “Tripleclouds.” In addition, the scheme has the capability to represent arbitrary vertical overlap between the three regions in pairs of adjacent levels. This scheme is implemented in the Edwards–Slingo radiation code and tested on 250 h of data from 12 different days. The data are derived from cloud retrievals using radar, lidar, and a microwave radiometer at Chilbolton, southern United Kingdom. When the data are grouped into periods equivalent in size to general circulation model grid boxes, the shortwave plane-parallel albedo bias is found to be 8%, while the corresponding bias is found to be less than 1% using Tripleclouds. Similar results are found for the longwave biases. Tripleclouds is then compared to a more conventional method of accounting for inhomogeneity that multiplies optical depths by a constant scaling factor, and Tripleclouds is seen to improve on this method both in terms of top-of-atmosphere radiative flux biases and internal heating rates.
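To see why two cloudy regions per level help, the toy calculation below compares the albedo of a horizontally inhomogeneous cloud (many columns with different optical depths) against (a) a single plane-parallel region using the mean optical depth and (b) a two-region, Tripleclouds-style split into the thinner and thicker halves of the distribution. The simple albedo(tau) formula and the lognormal optical-depth distribution are illustrative stand-ins, not the Edwards–Slingo scheme.

```python
import numpy as np

def albedo(tau, g=0.85):
    """Very simple non-absorbing two-stream-like albedo as a function of optical depth."""
    return (1 - g) * tau / (1 + (1 - g) * tau)

rng = np.random.default_rng(0)
tau = rng.lognormal(mean=np.log(8.0), sigma=0.8, size=100_000)   # cloudy columns

true_albedo = albedo(tau).mean()                  # resolve every column (reference)

# (a) Plane-parallel: one region with the mean optical depth (overestimates albedo,
#     because albedo is a concave function of optical depth).
pp_albedo = albedo(tau.mean())

# (b) Tripleclouds-style: split into thinner and thicker halves, use each half's mean.
median = np.median(tau)
thin, thick = tau[tau <= median], tau[tau > median]
tc_albedo = 0.5 * albedo(thin.mean()) + 0.5 * albedo(thick.mean())

print(f"reference albedo     : {true_albedo:.3f}")
print(f"plane-parallel bias  : {pp_albedo - true_albedo:+.3f}")
print(f"two-region (TC) bias : {tc_albedo - true_albedo:+.3f}")
```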

Relevance:

30.00%

Publisher:

Abstract:

Background/aims: Scant consideration has been given to the variation in structure of the human amniotic membrane (AM) at source or to the significance such differences might have on its clinical transparency. Therefore, we applied our experience of quantifying corneal transparency to AM. Methods: Following elective caesarean, AM from areas of the fetal sac distal and proximal (ie, adjacent) to the placenta was compared with freeze-dried AM. The transmission of light through the AM samples was quantified spectrophotometrically; also, tissue thickness was measured by light microscopy and refractive index by refractometry. Results: Freeze-dried and freeze-thawed AM samples distal and proximal to the placenta differed significantly in thickness, percentage transmission of visible light and refractive index. The thinnest tissue (freeze-dried AM) had the highest transmission spectra. The thickest tissue (freeze-thawed AM proximal to the placenta) had the highest refractive index. Using the direct summation of fields method to predict transparency from an equivalent thickness of corneal tissue, AM was found to be up to 85% as transparent as human cornea. Conclusion: When preparing AM for ocular surface reconstruction within the visual field, consideration should be given to its original location from within the fetal sac and its method of preservation, as either can influence corneal transparency.

Relevance:

30.00%

Publisher:

Abstract:

The goal of this work is the numerical realization of the probe method suggested by Ikehata for the detection of an obstacle D in inverse scattering. The main idea of the method is to use probes in the form of point sources Φ(·, z) with source point z to define an indicator function Î(z) which can be reconstructed from Cauchy data or far-field data. The indicator function Î(z) can be shown to blow up when the source point z tends to the boundary ∂D, and this behavior can be used to find D. To study the feasibility of the probe method we use two equivalent formulations of the indicator function. We carry out the numerical realization of the functional and show reconstructions of a sound-soft obstacle.

Relevance:

30.00%

Publisher:

Abstract:

The IntFOLD-TS method was developed according to the guiding principle that model quality assessment would be the most critical stage for our template-based modelling pipeline. Thus, the IntFOLD-TS method firstly generates numerous alternative models, using in-house versions of several different sequence-structure alignment methods, which are then ranked in terms of global quality using our top performing quality assessment method, ModFOLDclust2. In addition to the predicted global quality scores, predictions of local errors are also provided in the resulting coordinate files, using scores that represent the predicted deviation of each residue in the model from the equivalent residue in the native structure. The IntFOLD-TS method was found to generate high quality 3D models for many of the CASP9 targets, whilst also providing highly accurate predictions of their per-residue errors. This important information may help to make the 3D models produced by the IntFOLD-TS method more useful for guiding future experimental work.

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a method to objectively determine the most suitable analogue redesign method for forward type converters under digital voltage mode control. Particular emphasis is placed on determining the method which allows the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at high crossover frequencies with respect to switching frequency, controllers designed using backward integration have the largest phase margin; whereas at low crossover frequencies with respect to switching frequency, controllers designed using bilinear integration have the largest phase margins. An accurate model of the power stage is used for simulation, and experimental results from a Buck converter are collected. The performance of the digital controllers is compared to that of the equivalent analogue controller both in simulation and experiment. Excellent correlation between the simulation and experimental results is presented. This work will allow designers to confidently choose the analogue redesign method which yields the greater phase margin for their application.
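A quick way to see where the phase-margin difference comes from is to compare the phase of backward-Euler and bilinear (Tustin) discrete integrators with the ideal continuous integrator at candidate crossover frequencies. The switching and crossover frequencies below are placeholders, and this sketch ignores the modulator and sampling delays that the full analysis in the paper would include.

```python
import numpy as np

fsw = 200e3                            # switching / sampling frequency (placeholder)
T = 1.0 / fsw
fc = np.array([2e3, 10e3, 40e3])       # candidate crossover frequencies (placeholders)
w = 2 * np.pi * fc
z = np.exp(1j * w * T)

# The continuous integrator 1/s has a phase of -90 degrees at every frequency.
backward = T * z / (z - 1)             # backward-Euler integrator
bilinear = (T / 2) * (z + 1) / (z - 1) # Tustin (bilinear) integrator

for f, pb, pt in zip(fc, np.degrees(np.angle(backward)), np.degrees(np.angle(bilinear))):
    print(f"fc = {f/1e3:5.1f} kHz : backward phase = {pb:7.2f} deg, "
          f"bilinear phase = {pt:7.2f} deg (continuous: -90.00 deg)")
```

The backward-Euler integrator shows a phase lead of wT/2 over the continuous integrator that grows with crossover frequency, while the Tustin integrator matches the continuous phase exactly; this is consistent with backward integration giving the larger phase margin when the crossover frequency is high relative to the switching frequency.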

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a hierarchical clustering method for semantic Web service discovery. The method aims to improve the accuracy and efficiency of traditional service discovery based on the vector space model. Each Web service is converted into a standard vector format from its Web service description document. With the help of WordNet, a semantic analysis is conducted to reduce the dimension of the term vector and to provide semantic expansion that meets the user’s service request. The process and algorithm of hierarchical-clustering-based semantic Web service discovery are discussed, and validation is carried out on the dataset.
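A minimal sketch of the vector-space side of such a pipeline, assuming service descriptions are available as plain text: TF-IDF term vectors, agglomerative (hierarchical) clustering of the services, and matching a request against the most similar cluster. The WordNet-based dimensionality reduction and semantic expansion described in the paper is omitted here, and the example service descriptions are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical service descriptions extracted from their description documents.
services = [
    "book flight ticket airline reservation",
    "search flight schedule airline timetable",
    "currency exchange rate conversion",
    "convert money currency foreign exchange",
    "weather forecast temperature city",
]

vec = TfidfVectorizer()
X = vec.fit_transform(services).toarray()

# Hierarchical (agglomerative) clustering of the service term vectors.
labels = AgglomerativeClustering(n_clusters=3, linkage="average").fit_predict(X)

# Match a user request against services in the most similar cluster only.
request = vec.transform(["cheap airline flight booking"]).toarray()
sims = cosine_similarity(request, X)[0]
best_cluster = labels[int(sims.argmax())]
matches = [(services[i], round(float(sims[i]), 3))
           for i in range(len(services)) if labels[i] == best_cluster]

print("cluster labels:", labels.tolist())
print("matches in best cluster:", matches)
```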