149 results for Standard map


Relevance: 20.00%

Abstract:

The southern industrial rivers (Aire, Calder, Don and Trent) feeding the Humber estuary were routinely monitored for a range of chlorinated micro-organic contaminants at least once a week over a 1.5-year period. Environmental Quality Standards (EQSs) for inland waters were set under the European Economic Community for a limited number of problematic contaminants (18). The results of the monitoring program for seven classes of chlorinated pollutants on the EQS list are presented in this study. All compounds were detected frequently with the exception of hexachlorobutadiene (only one detectable measurement out of 280 individual samples). In general, the rivers fell into two classes with respect to their contamination patterns. The Aire and Calder carried higher concentrations of micropollutants than the Don and Trent, with the exception of hexachlorobenzene (HCB). For the sum of hexachlorocyclohexane (HCH) isomers (ΣHCH, α + γ) and for dieldrin, a number of samples (~5%) exceeded their EQS for both the Aire and Calder. Often, ΣHCH concentrations were just below the EQS level. Levels of p,p'-DDT on occasion approached the EQS for these two rivers, but only one sample (out of 140) exceeded it. No compounds exceeded their EQS levels on the Don and Trent. Analysis of the γHCH/αHCH ratio indicated that the source of HCH for the Don and Trent catchments was primarily lindane (γHCH) and, to a lesser extent, technical HCH (a mixture of HCH isomers dominated by αHCH), while the source(s) for the Aire and Calder had a much higher contribution from technical HCH.
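As an illustration of the kind of screening described above, the sketch below (Python) flags EQS exceedances and computes the γ/α HCH ratio used for source apportionment. The column names, EQS values and concentrations are invented placeholders, not data from the monitoring programme.

```python
# Minimal sketch of the screening applied to weekly monitoring data:
# flag samples above an EQS and inspect the gamma/alpha HCH ratio.
# Column names, EQS values and concentrations are illustrative placeholders,
# not values from the study.
import pandas as pd

# Hypothetical weekly results for one river (concentrations in ng/L).
samples = pd.DataFrame({
    "alpha_HCH": [2.0, 3.5, 1.0, 4.2],
    "gamma_HCH": [8.0, 9.5, 6.0, 3.8],
    "dieldrin":  [5.0, 12.0, 7.5, 9.0],
})
samples["sum_HCH"] = samples["alpha_HCH"] + samples["gamma_HCH"]
eqs = {"dieldrin": 10.0, "sum_HCH": 100.0}   # placeholder EQS values, ng/L

for compound, limit in eqs.items():
    exceed = samples[compound] > limit
    print(f"{compound}: {exceed.sum()}/{len(samples)} samples above EQS "
          f"({100 * exceed.mean():.0f}%)")

# A high gamma/alpha ratio points towards lindane (gamma-HCH) as the source;
# a low ratio points towards technical HCH, which is dominated by alpha-HCH.
samples["gamma_to_alpha"] = samples["gamma_HCH"] / samples["alpha_HCH"]
print(samples[["gamma_to_alpha"]])
```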

Relevance: 20.00%

Abstract:

This paper describes an investigation of various shroud bleed slot configurations of a centrifugal compressor using CFD with a manual multi-block structured grid generation method. The compressor under investigation is used in a turbocharger application for a heavy-duty diesel engine of approximately 400 hp. The baseline numerical model has been developed and validated against experimental performance measurements. The bleed slot flow field has been analysed in detail over a range of operating conditions between surge and choke. The impact of the returning bleed flow on the incidence at the impeller blade leading edge, due to its mixing with the main through-flow, has also been studied. From the baseline geometry, a number of modifications to the bleed slot width have been proposed, and a detailed comparison of the flow characteristics has been performed. The impact of slot variations on the inlet incidence angle has been investigated, highlighting the improvement in surge and choked-flow capability. In addition, the influence of the bleed slot on stabilizing the blade passage flow near surge, through suction of the tip and over-tip vortex flow into the slot, has been considered.

Relevance: 20.00%

Abstract:

As data analytics grows in importance, it is quickly becoming one of the dominant application domains requiring parallel processing. This paper investigates the applicability of OpenMP, the dominant shared-memory parallel programming model in high-performance computing, to the domain of data analytics. We contrast the performance and programmability of key data analytics benchmarks against Phoenix++, a state-of-the-art shared-memory map/reduce programming system. Our study shows that OpenMP outperforms the Phoenix++ system by a large margin for several benchmarks. In other cases, however, the programming model lacks support for this application domain.
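To make the programming-model contrast concrete, here is a small Python stand-in (not OpenMP or Phoenix++) that expresses the same word-count computation once in a map/reduce style and once as a direct shared-memory-style loop, which is the form an OpenMP version would parallelise.

```python
# Python stand-in for the two programming styles compared in the paper:
# a map/reduce formulation (as in Phoenix++) versus a direct loop over shared
# state (as one would write with OpenMP in C/C++). Neither system is used
# here; this only illustrates how the computation is expressed.
from collections import Counter
from functools import reduce

documents = ["the quick brown fox", "the lazy dog", "the fox"]

# Map/reduce style: emit per-document counts, then merge them pairwise.
mapped = (Counter(doc.split()) for doc in documents)
wordcount_mr = reduce(lambda a, b: a + b, mapped, Counter())

# Direct style: one loop updating a single histogram, which an OpenMP version
# would parallelise with a reduction over per-thread counts.
wordcount_loop = Counter()
for doc in documents:
    wordcount_loop.update(doc.split())

assert wordcount_mr == wordcount_loop
print(wordcount_mr)
```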

Relevance: 20.00%

Abstract:

Introduction
Mild cognitive impairment (MCI) has clinical value in its ability to predict later dementia. A better understanding of cognitive profiles can further help delineate who is most at risk of conversion to dementia. We aimed to (1) examine to what extent the usual MCI subtyping using core criteria corresponds to empirically defined clusters of patients (latent profile analysis [LPA] of continuous neuropsychological data) and (2) compare the two methods of subtyping memory clinic participants in their prediction of conversion to dementia.

Methods
Memory clinic participants (MCI, n = 139) and age-matched controls (n = 98) were recruited. Participants had a full cognitive assessment, and results were grouped (1) according to traditional MCI subtypes and (2) using LPA. MCI participants were followed over approximately 2 years after their initial assessment to monitor for conversion to dementia.

Results
Groups were well matched for age and education. Controls performed significantly better than MCI participants on all cognitive measures. With the traditional analysis, most MCI participants were in the amnestic multidomain subgroup (46.8%) and this group was most at risk of conversion to dementia (63%). From the LPA, a three-profile solution fit the data best. Profile 3 was the largest group (40.3%), the most cognitively impaired, and most at risk of conversion to dementia (68% of the group).

Discussion
LPA provides a useful adjunct in delineating MCI participants most at risk of conversion to dementia and adds confidence to standard categories of clinical inference.
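For readers unfamiliar with LPA, fitting profiles to continuous neuropsychological scores is closely related to fitting a finite Gaussian mixture and choosing the number of profiles by an information criterion. The sketch below illustrates that idea on synthetic data; the test names, group means and the use of BIC are assumptions made for the example, not details taken from the study.

```python
# Illustrative latent-profile-style analysis on synthetic cognitive scores:
# fit Gaussian mixtures with 1-5 profiles and keep the best model by BIC.
# Variable names, group means and the model-selection criterion are
# assumptions for this sketch, not details from the study.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical z-scores on memory, language and executive-function tests.
scores = np.vstack([
    rng.normal([0.0, 0.0, 0.0], 0.5, size=(60, 3)),     # broadly intact
    rng.normal([-1.5, -0.3, -0.4], 0.5, size=(50, 3)),   # amnestic pattern
    rng.normal([-1.8, -1.5, -1.6], 0.5, size=(30, 3)),   # multidomain impairment
])

models = {k: GaussianMixture(n_components=k, random_state=0).fit(scores)
          for k in range(1, 6)}
best_k = min(models, key=lambda k: models[k].bic(scores))
profiles = models[best_k].predict(scores)

print("best number of profiles by BIC:", best_k)
for k in range(best_k):
    print(f"profile {k}: n={np.sum(profiles == k)}, "
          f"mean scores={models[best_k].means_[k].round(2)}")
```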

Relevance: 20.00%

Abstract:

The area and power consumption of low-density parity-check (LDPC) decoders are typically dominated by embedded memories. To alleviate these high memory costs, this paper exploits the fact that all internal memories of an LDPC decoder are frequently updated with new data. These memory access statistics are exploited by replacing all static standard-cell-based memories (SCMs) of a prior-art LDPC decoder implementation with dynamic SCMs (D-SCMs), which are designed to retain data just long enough to guarantee reliable operation. The use of D-SCMs leads to a 44% reduction in the silicon area of the LDPC decoder compared to the use of static SCMs. The low-power LDPC decoder architecture with refresh-free D-SCMs was implemented in a 90 nm CMOS process, and silicon measurements show full functionality and an information-bit throughput of up to 600 Mbps (as required by the IEEE 802.11n standard).
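The refresh-free argument reduces to simple timing arithmetic: every memory word is rewritten within some worst-case number of clock cycles, so the dynamic cells only need to retain data for at least that long. The sketch below illustrates the check with placeholder numbers; the clock frequency, retention time and access gap are assumptions, not figures from the paper.

```python
# Back-of-the-envelope check for a refresh-free dynamic SCM: data must be
# rewritten before the cells lose it. All numbers are illustrative
# placeholders, not parameters of the decoder described in the paper.
clock_freq_hz = 400e6                 # assumed decoder clock
retention_time_s = 2e-6               # assumed worst-case D-SCM retention
worst_case_update_gap_cycles = 512    # assumed longest gap between writes
                                      # to any memory word during decoding

update_gap_s = worst_case_update_gap_cycles / clock_freq_hz
refresh_free_ok = update_gap_s < retention_time_s

print(f"longest time between rewrites: {update_gap_s * 1e6:.2f} us")
print(f"retention budget:              {retention_time_s * 1e6:.2f} us")
print("refresh-free operation is", "safe" if refresh_free_ok else "NOT safe")
```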

Relevance: 20.00%

Abstract:

Single-component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant-sum constraint (closure) and to the inherently multivariate, relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single-component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles). This is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that, if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant-sum constraint. In summary, the authors recommend a chain of enquiry that starts by identifying the statistical method that can answer the required geological or geochemical question while maintaining the integrity of the compositional nature of the data; the required log-ratio transformations should then be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
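As a concrete illustration, the sketch below computes a knowledge-driven log-ratio (an element normalised to SiO2, removing the quartz-dilution effect mentioned above) and the centred log-ratio (clr) transform commonly used to open compositional data before further statistics. The element names and concentrations are invented for the example.

```python
# Knowledge-driven log-ratio and centred log-ratio (clr) transform on a toy
# composition. Element names and concentrations are invented for illustration.
import numpy as np

# Hypothetical compositions (same units, parts of a whole), one row per sample.
elements = ["SiO2", "Al2O3", "Fe2O3", "K2O", "Pb"]
x = np.array([
    [62.0, 15.0, 6.0, 2.5, 0.003],
    [45.0, 18.0, 9.0, 1.2, 0.006],
    [78.0, 10.0, 3.0, 3.0, 0.002],
])

# Knowledge-driven log-ratio: Pb relative to the diluting SiO2 component.
pb_over_sio2 = np.log(x[:, elements.index("Pb")] / x[:, elements.index("SiO2")])

# Centred log-ratio: each part relative to the geometric mean of its sample,
# a standard way to open closed compositional data before mapping or statistics.
clr = np.log(x) - np.log(x).mean(axis=1, keepdims=True)

print("log(Pb/SiO2):", pb_over_sio2.round(3))
print("clr-transformed data:\n", clr.round(3))
```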

Relevance: 20.00%

Abstract:

The environmental quality of land is often assessed by the calculation of threshold values that aim to differentiate between concentrations of elements according to whether the soils are in residential or industrial sites. In Europe, for example, soil guideline values exist for agricultural and grazing land. A threshold is often set to differentiate between concentrations of an element that occur naturally in the soil and concentrations that result from diffuse anthropogenic sources. Regional geochemistry and, in particular, single-component geochemical maps are increasingly being used to determine these baseline environmental assessments. The key question raised in this paper is whether a geochemical map can provide an accurate interpretation on its own. Implicit is the assumption that single-component geochemical maps represent absolute abundances. However, because of the compositional (closed) nature of the data, univariate geochemical maps cannot be compared directly with one another, and any interpretation based on them is vulnerable to spurious correlation problems. What does this mean for soil geochemistry mapping, baseline quality documentation, soil resource assessment or risk evaluation? Despite being limited to relative abundances, individual raw geochemical maps are deemed fundamental to several applications, including environmental assessments. However, element toxicity is related to the bioavailable concentration, which is lowered if the element's source is mixed with another source. Elements also interact: for example, under reducing conditions iron oxides lose their solid state and any associated arsenic becomes soluble and mobile. Both of these issues may be more adequately dealt with if a single-component map is not interpreted in isolation to determine baseline and threshold assessments. A range of alternative, compositionally compliant representations based on log-ratio and log-contrast approaches is explored to supplement the classical single-component maps for environmental assessment. Case-study examples are based on the Tellus soil geochemical dataset covering Northern Ireland and on the results of in vitro oral bioaccessibility testing carried out on a subset of archived Tellus Survey shallow soils following the Unified BARGE (Bioaccessibility Research Group of Europe) method.
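The spurious-correlation risk from the constant-sum constraint is easy to reproduce: closing independently generated components to a constant total induces correlations between them that carry no geochemical meaning. The short simulation below demonstrates this; the distributions and number of components are arbitrary choices for the demonstration.

```python
# Demonstration of closure-induced (spurious) correlation: independent raw
# components become negatively correlated once the data are closed to a
# constant sum. Distributions and component count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
raw = rng.lognormal(mean=0.0, sigma=0.5, size=(1000, 3))  # independent parts
closed = raw / raw.sum(axis=1, keepdims=True)             # constant-sum data

print("correlation of raw parts 0 and 1:    ",
      round(np.corrcoef(raw[:, 0], raw[:, 1])[0, 1], 3))
print("correlation after closure (spurious):",
      round(np.corrcoef(closed[:, 0], closed[:, 1])[0, 1], 3))
```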

Relevance: 20.00%

Abstract:

We study the computational complexity of finding maximum a posteriori configurations in Bayesian networks whose probabilities are specified by logical formulas. This approach leads to a fine-grained study in which local information such as context-sensitive independence and determinism can be considered. It also allows us to characterize more precisely the jump from tractability to NP-hardness and beyond, and to consider the complexity introduced by evidence alone.
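As a minimal illustration of the problem being analysed (not of the paper's complexity results), the sketch below finds a maximum a posteriori configuration by brute force in a tiny two-variable network whose conditional probability table encodes a deterministic, logical dependence of the kind the paper exploits. The network and numbers are invented for the example.

```python
# Brute-force MAP in a toy Bayesian network A -> B, where B is a deterministic
# (logical) function of A. The network is invented purely to illustrate what a
# MAP configuration is; the paper studies the complexity of this problem.
from itertools import product

p_a = {0: 0.3, 1: 0.7}
# B = A: a deterministic CPT, i.e. the kind of logical structure discussed above.
p_b_given_a = {(0, 0): 1.0, (1, 0): 0.0, (0, 1): 0.0, (1, 1): 1.0}

evidence = {"B": 1}  # observed value of B

best_config, best_prob = None, -1.0
for a, b in product([0, 1], repeat=2):
    if b != evidence["B"]:
        continue                        # inconsistent with the evidence
    prob = p_a[a] * p_b_given_a[(b, a)]
    if prob > best_prob:
        best_config, best_prob = {"A": a, "B": b}, prob

print("MAP configuration:", best_config, "with joint probability", best_prob)
```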

Relevance: 20.00%

Abstract:

Lattice-based cryptography has gained credence recently as a replacement for current public-key cryptosystems, due to its quantum resilience, versatility, and relatively small key sizes. To date, encryption based on the learning with errors (LWE) problem has only been investigated from an ideal-lattice standpoint, due to its computation and size efficiencies. However, a thorough investigation of standard lattices in practice has yet to be considered. Standard lattices may be preferred to ideal lattices because of their stronger security assumptions and less restrictive parameter selection process. In this paper, an area-optimised hardware architecture of a standard lattice-based cryptographic scheme is proposed. The design is implemented on an FPGA, and it is found that both encryption and decryption fit comfortably on a Spartan-6 FPGA. This is the first hardware architecture for standard lattice-based cryptography reported in the literature to date, and it is thus a benchmark for future implementations.
Additionally, a revised discrete Gaussian sampler is proposed which is the fastest of its type to date, and is also the first to investigate the cost savings of implementing with λ/2 bits of precision. Performance results are promising in comparison to the hardware designs of the equivalent ring-LWE scheme: in addition to resting on stronger security assumptions, the proposed design achieves 1272 encryptions per second and 4395 decryptions per second.
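To make the contrast with ring-LWE concrete: standard LWE works with an unstructured public matrix A rather than a polynomial ring, which is the source of both the stronger security assumption and the larger keys. The toy, textbook-style sketch below encrypts a single bit with deliberately tiny and insecure parameters; it is an illustration of standard LWE in general, not a description of the proposed hardware design or its parameters.

```python
# Toy standard-LWE encryption of one bit, with deliberately tiny and insecure
# parameters. This follows the textbook Regev-style scheme and only shows the
# unstructured matrix A that distinguishes standard from ring LWE.
import numpy as np

rng = np.random.default_rng(42)
q, n, m = 97, 8, 16                       # modulus, secret length, #samples (toy)

s = rng.integers(0, q, size=n)            # secret key
A = rng.integers(0, q, size=(m, n))       # public, unstructured matrix
e = rng.integers(-1, 2, size=m)           # small error terms
b = (A @ s + e) % q                       # public key (A, b)

def encrypt(bit):
    r = rng.integers(0, 2, size=m)        # random 0/1 combination of samples
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)    # closer to q/2 means bit 1

u, v = encrypt(1)
print("decrypted bit:", decrypt(u, v))    # expected: 1
```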

Relevance: 20.00%

Abstract:

This article attempts to push Mauss’ work on the sociality of prayer (1909) to its fullest conclusion by arguing that prayer can be viewed anthropologically as providing a map for social and emotional relatedness. Based on fieldwork among deep-sea fisher families living in Gamrie, North-East Scotland (home to 700 people and six Protestant churches), the author takes as his primary ethnographic departure the ritual of the ‘mid-week prayer meeting’. Among the self-proclaimed ‘fundamentalists’ of Gamrie’s Brethren and Presbyterian churches, attending the prayer meeting means praying for salvation. Yet, contrary to the stereotype of Protestant soteriology as highly individualist, in the context of Gamrie, salvation is not principally focused upon the self, but is instead sought on behalf of the ‘unconverted’ other. Locally, this ‘other’ is made sense of with reference to three different categories of relatedness: the family, the village and the nation. The author’s argument is that each category of relatedness carries with it a different affective quality: anguish for one’s family, resentment toward one’s village, and resignation towards one’s nation. As such, prayers for salvation establish and maintain not only vertical – human-divine – relatedness, but also horizontal relatedness between persons, while also giving them their emotional tenor. In ‘fundamentalist’ Gamrie, these human relationships, and crucially their affective asymmetries, may be mapped, therefore, by treating prayers as social phenomena that seek to engage with a world dichotomised into vice and virtue, rebellion and submission, and, ultimately, damnation and salvation.