923 results for Compression Metric
Abstract:
This paper discusses a study to validate the metric developed in the Geers and Moog Cochlear Implant Study at CID for measuring the speech production of hearing-impaired children.
Abstract:
Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in southern and central UK by increasing pressures on water resources, which may be further enhanced by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish the relative contributions of multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land-use data, as a function of surface/near-surface runoff generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near-surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second model, which is simpler, simulates P concentrations as a function of surface/near-surface runoff, thus emphasising the influence of non-point-source loads during flow peaks and the mixing of baseflow and point sources during low flows. The temporal consistency of the parameter estimates, and thus the suitability of each approach, is assessed dynamically following a new approach based on Monte Carlo analysis.
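As a concrete illustration of the second model's mixing idea, here is a minimal Python sketch. The function name, parameter names, and values are hypothetical; it only shows how a runoff-driven non-point load can be combined with baseflow and point-source contributions into a daily in-stream concentration, and it does not reproduce the paper's actual model structure or calibration.

def tp_concentration(q_runoff, q_base, c_runoff, c_base, point_load):
    """Daily mean in-stream total phosphorus concentration (g/m3 = mg/L).

    q_runoff   -- surface/near-surface runoff (m3/day), from a rainfall-runoff model
    q_base     -- baseflow (m3/day)
    c_runoff   -- effective TP concentration carried by runoff (g/m3)
    c_base     -- TP concentration in baseflow (g/m3)
    point_load -- point-source TP load (g/day)
    """
    q_total = q_runoff + q_base
    if q_total == 0.0:
        return 0.0
    nonpoint_load = c_runoff * q_runoff   # dominates during flow peaks
    base_load = c_base * q_base           # dominates, with point sources, at low flow
    return (nonpoint_load + base_load + point_load) / q_total

# Example: a storm day with high runoff versus a dry day dominated by baseflow.
print(tp_concentration(q_runoff=50000.0, q_base=10000.0,
                       c_runoff=0.4, c_base=0.05, point_load=2000.0))
print(tp_concentration(q_runoff=500.0, q_base=10000.0,
                       c_runoff=0.4, c_base=0.05, point_load=2000.0))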
Abstract:
The self-consistent field theory (SCFT) prediction for the compression force between two semi-dilute polymer brushes is compared to the benchmark experiments of Taunton et al. [Nature, 1988, 332, 712]. The comparison is done with previously established parameters and without any fitting parameters whatsoever. The SCFT provides a significant improvement over the classical strong-stretching theory (SST), yielding excellent quantitative agreement with experiment. Contrary to earlier suggestions, chain fluctuations cannot be ignored under normal experimental conditions. Although the analytical expressions of SST provide invaluable aids to understanding the qualitative behavior of polymeric brushes, the numerical SCFT is necessary in order to provide quantitatively accurate predictions.
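For reference, the SST baseline that the SCFT improves upon is commonly taken to be the Milner-Witten-Cates expression for the free energy per unit area of a brush compressed from its equilibrium height h0 to h. The short Python sketch below evaluates that standard SST formula and the resulting compression pressure; it is background only, not the SCFT calculation reported in the paper, and the normalisation f0 is an arbitrary illustrative unit.

import numpy as np

def sst_free_energy(u, f0=1.0):
    """Milner-Witten-Cates SST free energy per unit area of a compressed
    brush, with u = h/h0 <= 1 and F normalised so that F(1) = f0."""
    return f0 * (5.0 / 9.0) * (1.0 / u + u**2 - u**5 / 5.0)

def sst_pressure(u, f0=1.0, h0=1.0):
    """Compression pressure P = -dF/dh = -(1/h0) dF/du; it vanishes at
    u = 1 and rises steeply as the brush is compressed."""
    return -(f0 / h0) * (5.0 / 9.0) * (-1.0 / u**2 + 2.0 * u - u**4)

u = np.linspace(0.3, 1.0, 8)
print(np.round(sst_pressure(u), 3))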
Abstract:
We compare the use of plastically compressed collagen gels to conventional collagen gels as scaffolds onto which corneal limbal epithelial cells (LECs) are seeded to construct an artificial corneal epithelium. LECs were isolated from bovine corneas (limbus), seeded onto either conventional uncompressed or novel compressed collagen gels, and grown in culture. Scanning electron microscopy (SEM) showed that fibers within the uncompressed gel were loose and irregularly ordered, whereas fibers within the compressed gel were densely packed and more evenly arranged. Quantitative analysis of LEC expansion across the surface of the two gels showed similar growth rates (p > 0.05). Under SEM, LECs expanded on uncompressed gels showed a rough and heterogeneous morphology, whereas on the compressed gel the cells displayed a smooth and homogeneous morphology. Transmission electron microscopy (TEM) showed the compressed scaffold to contain collagen fibers of regular diameter and similar orientation, resembling collagen fibers within the normal cornea. TEM and light microscopy also showed that cell–cell and cell–matrix attachment, stratification, and cell density were superior in LECs expanded upon compressed collagen gels. This study demonstrates that the compressed collagen gel is an excellent biomaterial scaffold highly suited to the construction of an artificial corneal epithelium and a significant improvement upon conventional collagen gels.
Abstract:
This paper examines the normal force between two opposing polyelectrolyte brushes and the interpenetration of their chains that is responsible for sliding friction. It focuses on the special case of semi-dilute brushes in a salt-free theta solvent, for which Zhulina and Borisov [J. Chem. Phys., 107, 5952 (1997)] have derived analytical predictions using the classical strong-stretching theory (SST) introduced by Semenov and developed by Milner, Witten and Cates. Interestingly, the SST predicts that the brushes contract, maintaining a polymer-free gap, as they are compressed together, which provides an explanation for the ultra-low frictional forces observed in experiment. We examine the degree to which the SST predictions are affected by chain fluctuations by employing self-consistent field theory (SCFT). While the normal force is relatively unaffected, fluctuations are found to have a strong impact on brush interpenetration. Even so, the contraction of the brushes does significantly delay the onset of interpenetration, implying that a sizeable normal force can be achieved before the sliding friction becomes significant.
Abstract:
Classical strong-stretching theory (SST) predicts that, as opposing polyelectrolyte brushes are compressed together in a salt-free theta solvent, they contract so as to maintain a finite polymer-free gap, which offers a potential explanation for the ultra-low frictional forces observed in experiments even with the application of large normal forces. However, the SST ignores chain fluctuations, which would tend to close the gap resulting in physical contact and in turn significant friction. In a preceding study, we examined the effect of fluctuations using self-consistent field theory (SCFT) and illustrated that high normal forces can still be applied before the gap is destroyed. We now look at the effect of adding salt. It is found to reduce the long-range interaction between the brushes but has little effect on the short-range part, provided the concentration does not enter the salted-brush regime. Consequently, the maximum normal force between two planar brushes at the point of contact is remarkably unaffected by salt. For the crossed-cylinder geometry commonly used in experiments, however, there is a gradual reduction because in this case the long-range part of the interaction contributes to the maximum normal force.
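One piece of standard background helps with the last point: in surface force experiments the crossed-cylinder force is related to the interaction energy per unit area W(D) between flat surfaces through the Derjaguin approximation. For two cylinders of equal radius R crossed at right angles,

\[ F_{\mathrm{cc}}(D) \approx 2\pi R\, W(D), \qquad W(D) = \int_{D}^{\infty} P(D')\,\mathrm{d}D', \]

so the force at a given gap D accumulates the pressure P over all larger separations. This is why the long-range part of the brush interaction feeds into the maximum normal force in the crossed-cylinder geometry even though the planar force at contact is unaffected by salt. The relation itself is textbook surface-force material, not a result of this paper.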
Abstract:
Vertebral compression fractures are a common clinical problem, and their incidence will increase as the population ages. Traditionally, management has been conservative; however, there has been a growing trend towards vertebroplasty as an alternative therapy in patients with persisting severe pain. NICE produced guidance in 2003 recommending the procedure after 4 weeks of conservative management. Recent high-quality studies have been contradictory, and there is currently a debate surrounding the role of the procedure, with no agreement in the literature. We examine the evidence in both osteoporotic and malignant vertebral compression fractures; we also describe the benefits and side effects, alternative treatment options, and the cost of the procedure. Finally, we recommend when vertebroplasty is most appropriately used based on the best available evidence.
Abstract:
Empirical Mode Decomposition (EMD) is a data-driven technique for extracting oscillatory components from data. Although it was introduced over 15 years ago, its mathematical foundations are still missing, which also implies a lack of objective metrics for evaluating the decomposed set. The most common technique for assessing the results of EMD is visual inspection, which is highly subjective. This article provides objective measures for assessing EMD results based on the original definition of oscillatory components.
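To make "objective measure" concrete, the sketch below decomposes a test signal and checks the classic defining condition of an intrinsic mode function (the numbers of extrema and zero crossings differ by at most one). The use of the third-party PyEMD package and the test signal are illustrative assumptions, and this particular check is the textbook IMF condition rather than necessarily the measures proposed in the article.

import numpy as np
from PyEMD import EMD  # pip install EMD-signal; an assumed choice of library

t = np.linspace(0.0, 1.0, 1000)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 23 * t)

imfs = EMD()(signal)  # each row is one intrinsic mode function (IMF)

for i, imf in enumerate(imfs):
    sign = np.signbit(imf).astype(int)
    zero_crossings = int(np.abs(np.diff(sign)).sum())
    slope_sign = np.signbit(np.diff(imf)).astype(int)
    extrema = int(np.abs(np.diff(slope_sign)).sum())  # slope sign changes
    ok = abs(extrema - zero_crossings) <= 1
    print(f"IMF {i}: extrema={extrema}, zero crossings={zero_crossings}, IMF condition: {ok}")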
Abstract:
Bloom filters are a data structure for storing data in a compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents the yes-no Bloom filter, which is a data structure consisting of two parts: the yes-filter, a standard Bloom filter, and the no-filter, another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance of rejecting it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if one chooses the objects to include in the no-filter so that it recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Exploiting the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed that uses a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best in comparison with a number of heuristics as well as the CPLEX built-in solver (B&B), and it is what we recommend for use in yes-no Bloom filters. In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
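To make the construction concrete, here is a minimal Python sketch of a yes-no Bloom filter. The sizes, the double-hashing scheme, and the greedy, roll-back-on-conflict fill of the no-filter are all illustrative choices; the paper's point is precisely that this selection can be done much better with the ILP/ADP machinery described above.

import hashlib

class BloomFilter:
    def __init__(self, m, k):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _indexes(self, item):
        digest = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(digest[:8], "big")
        h2 = int.from_bytes(digest[8:16], "big") | 1  # double hashing
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item):
        for i in self._indexes(item):
            self.bits[i] = 1

    def __contains__(self, item):
        return all(self.bits[i] for i in self._indexes(item))

class YesNoBloomFilter:
    def __init__(self, m_yes, m_no, k=4):
        self.yes = BloomFilter(m_yes, k)  # standard Bloom filter for the set
        self.no = BloomFilter(m_no, k)    # stores selected false positives

    def build(self, members, candidate_false_positives):
        members = list(members)
        for x in members:
            self.yes.add(x)
        # Greedily insert known false positives of the yes-filter, rolling
        # back whenever the insertion would make the no-filter reject a
        # true member (the "no true positives" constraint).
        for fp in candidate_false_positives:
            if fp not in self.yes:
                continue  # not a false positive, nothing to correct
            saved = bytes(self.no.bits)
            self.no.add(fp)
            if any(x in self.no for x in members):
                self.no.bits = bytearray(saved)

    def __contains__(self, item):
        return item in self.yes and item not in self.no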
Abstract:
Searching a dataset for elements that are similar to a given query element is a core problem in applications that manage complex data, and it has been aided by metric access methods (MAMs). A growing number of applications require indices that must be built faster and repeatedly, while also providing faster responses to similarity queries. The increase in main memory capacity and its falling cost also motivate the use of memory-based MAMs. In this paper, we propose the Onion-tree, a new and robust dynamic memory-based MAM that slices the metric space into disjoint subspaces to provide quick indexing of complex data. It introduces three major characteristics: (i) a partitioning method that controls the number of disjoint subspaces generated at each node; (ii) a replacement technique that can change the leaf-node pivots in insertion operations; and (iii) extended range and k-NN query algorithms that support the new partitioning method, including a new visit order of the subspaces in k-NN queries. Performance tests with both real-world and synthetic datasets showed that the Onion-tree is very compact. Comparisons of the Onion-tree with the MM-tree and a memory-based version of the Slim-tree showed that the Onion-tree was always faster to build the index. The experiments also showed that the Onion-tree significantly improved range and k-NN query processing performance and was the most efficient MAM, followed by the MM-tree, which in turn outperformed the Slim-tree in almost all the tests.
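The pruning on which all such MAMs rely is the triangle inequality: knowing only the distance from the query to a pivot, whole subspaces can be discarded. The Python sketch below shows this with a generic single-pivot ball partitioning and a range query; it is a simplified illustration of the general technique, not the Onion-tree's actual multi-way partitioning, pivot replacement, or k-NN visit order.

import random

def build(points, dist, leaf_size=8):
    """Ball partitioning: split the points around a pivot at the median
    distance, recursing into 'inner' and 'outer' subspaces."""
    if len(points) <= leaf_size:
        return {"leaf": points}
    pivot, rest = points[0], points[1:]
    dists = [dist(pivot, p) for p in rest]
    radius = sorted(dists)[len(dists) // 2]
    return {
        "pivot": pivot,
        "radius": radius,
        "inner": build([p for p, d in zip(rest, dists) if d <= radius], dist, leaf_size),
        "outer": build([p for p, d in zip(rest, dists) if d > radius], dist, leaf_size),
    }

def range_query(node, q, r, dist, out):
    if "leaf" in node:
        out.extend(p for p in node["leaf"] if dist(q, p) <= r)
        return
    d = dist(q, node["pivot"])
    if d <= r:
        out.append(node["pivot"])
    # Triangle inequality: visit a subspace only if the query ball can
    # intersect it; otherwise the whole subtree is pruned.
    if d - r <= node["radius"]:
        range_query(node["inner"], q, r, dist, out)
    if d + r > node["radius"]:
        range_query(node["outer"], q, r, dist, out)

points = [(random.random(), random.random()) for _ in range(200)]
euclidean = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
tree = build(points, euclidean)
hits = []
range_query(tree, (0.5, 0.5), 0.1, euclidean, hits)
print(len(hits), "points within radius 0.1 of the query")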