100 results for Standard Insurance Company.
Abstract:
The southern industrial rivers (Aire, Calder, Don and Trent) feeding the Humber estuary were routinely monitored for a range of chlorinated micro-organic contaminants at least once a week over a 1.5-year period. Environmental Quality Standards (EQSs) for inland waters were set under the European Economic Community for a limited number of problematic contaminants (18). The results of the monitoring program for seven classes of chlorinated pollutants on the EQS list are presented in this study. All compounds were detected frequently with the exception of hexachlorobutadiene (only one detectable measurement out of 280 individual samples). In general, the rivers fell into two classes with respect to their contamination patterns. The Aire and Calder carried higher concentrations of micropollutants than the Don and Trent, with the exception of hexachlorobenzene (HCB). For Σhexachlorocyclohexane (HCH) isomers (α + γ) and for dieldrin, a number of samples (~5%) exceeded their EQS for both the Aire and Calder. Often, ΣHCH concentrations were just below the EQS level. Levels of p,p'-DDT on occasion approached the EQS for these two rivers, but only one sample (out of 140) exceeded the EQS. No compounds exceeded their EQS levels on the Don and Trent. Analysis of the γHCH/αHCH ratio indicated that the source of HCH for the Don and Trent catchments was primarily lindane (γHCH) and, to a lesser extent, technical HCH (a mixture of HCH isomers dominated by αHCH), while the source(s) for the Aire and Calder had a much higher contribution from technical HCH.
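The source-apportionment logic in this abstract, using the γHCH/αHCH isomer ratio to distinguish lindane from technical HCH, can be sketched as follows. The ratio thresholds and the function name below are illustrative assumptions, not values taken from the study; they only reflect the qualitative fact that technical HCH is dominated by the α isomer while lindane is essentially pure γHCH.

```python
# Hedged sketch: classify the likely HCH source from the γ/α isomer ratio.
# The thresholds are assumptions for illustration, not values from the study.
def classify_hch_source(gamma_ng_l, alpha_ng_l, lindane_ratio=3.0):
    """Technical HCH is dominated by the α isomer (γ/α well below 1),
    while lindane is almost pure γHCH (γ/α much greater than 1)."""
    if alpha_ng_l == 0:
        return "lindane"                 # no α isomer detected at all
    ratio = gamma_ng_l / alpha_ng_l
    if ratio >= lindane_ratio:
        return "lindane"
    elif ratio < 1.0:
        return "technical HCH"
    return "mixed"
```

Applied per sample, such a classifier would reproduce the pattern reported above: ratios well above 1 for the Don and Trent, lower ratios for the Aire and Calder.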
Abstract:
We study how ownership structure and management objectives interact in determining company size, without assuming information constraints or any explicit costs of management. In symmetric agent economies, the optimal company size balances the returns to scale of the production function and the returns to collaboration efficiency. For a general class of payoff functions, we characterize the optimal company size and compare it across different managerial objectives. We demonstrate the restrictiveness of common assumptions on effort aggregation (e.g., constant elasticity of effort substitution), and we show that common intuition (e.g., that corporate companies are more efficient and therefore will be larger than equal-share partnerships) might not hold in general.
Abstract:
Introduction
Mild cognitive impairment (MCI) has clinical value in its ability to predict later dementia. A better understanding of cognitive profiles can further help delineate who is most at risk of conversion to dementia. We aimed to (1) examine to what extent the usual MCI subtyping using core criteria corresponds to empirically defined clusters of patients (latent profile analysis [LPA] of continuous neuropsychological data) and (2) compare the two methods of subtyping memory clinic participants in their prediction of conversion to dementia.
Methods
Memory clinic participants (MCI, n = 139) and age-matched controls (n = 98) were recruited. Participants had a full cognitive assessment, and results were grouped (1) according to traditional MCI subtypes and (2) using LPA. MCI participants were followed over approximately 2 years after their initial assessment to monitor for conversion to dementia.
Results
Groups were well matched for age and education. Controls performed significantly better than MCI participants on all cognitive measures. With the traditional analysis, most MCI participants were in the amnestic multidomain subgroup (46.8%) and this group was most at risk of conversion to dementia (63%). From the LPA, a three-profile solution fit the data best. Profile 3 was the largest group (40.3%), the most cognitively impaired, and most at risk of conversion to dementia (68% of the group).
Discussion
LPA provides a useful adjunct in delineating MCI participants most at risk of conversion to dementia and adds confidence to standard categories of clinical inference.
Abstract:
The area and power consumption of low-density parity check (LDPC) decoders are typically dominated by embedded memories. To alleviate these high memory costs, this paper exploits the fact that all internal memories of an LDPC decoder are frequently updated with new data. These unique memory access statistics are exploited by replacing all static standard-cell based memories (SCMs) of a prior-art LDPC decoder implementation with dynamic SCMs (D-SCMs), which are designed to retain data just long enough to guarantee reliable operation. The use of D-SCMs leads to a 44% reduction in silicon area of the LDPC decoder compared to the use of static SCMs. The low-power LDPC decoder architecture with refresh-free D-SCMs was implemented in a 90 nm CMOS process, and silicon measurements show full functionality and an information bit throughput of up to 600 Mbps (as required by the IEEE 802.11n standard).
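The refresh-free D-SCM idea above rests on a simple safety condition: data in a dynamic cell is only valid if it is read within the retention window of its most recent write. A minimal sketch of that condition, with an assumed retention time and hypothetical access schedules, is:

```python
# Hedged sketch of the refresh-free D-SCM safety condition: every read must
# occur within the retention window of the most recent write to the cell.
# The retention value is an assumption for illustration.
RETENTION_CYCLES = 50   # assumed worst-case retention time, in clock cycles

def refresh_free_is_safe(write_cycles, read_cycles):
    """Return True if every read hits data written within the retention window."""
    last_write = None
    for t in sorted(set(write_cycles) | set(read_cycles)):
        if t in write_cycles:
            last_write = t                       # cell rewritten, window restarts
        if t in read_cycles:
            if last_write is None or t - last_write > RETENTION_CYCLES:
                return False                     # data may have decayed
    return True
```

In a decoder whose memories are rewritten every iteration, such a check passes without any refresh logic, which is what removes the refresh overhead of a conventional dynamic memory.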
Abstract:
Book review of Slavery by Any Other Name: African Life under Company Rule in Colonial Mozambique, by Eric Allina, Charlottesville, University of Virginia Press, 2012, 255 pp., £44.50, ISBN 978-0-8139-3272-9.
Abstract:
Lattice-based cryptography has gained credence recently as a replacement for current public-key cryptosystems, due to its quantum resilience, versatility, and relatively low key sizes. To date, encryption based on the learning with errors (LWE) problem has only been investigated from an ideal lattice standpoint, due to its computation and size efficiencies. However, a thorough investigation of standard lattices in practice has yet to be considered. Standard lattices may be preferred to ideal lattices due to their stronger security assumptions and less restrictive parameter selection process. In this paper, an area-optimised hardware architecture of a standard lattice-based cryptographic scheme is proposed. The design is implemented on an FPGA, and both encryption and decryption are found to fit comfortably on a Spartan-6 device. This is the first hardware architecture for standard lattice-based cryptography reported in the literature to date, and thus serves as a benchmark for future implementations.
Additionally, a revised discrete Gaussian sampler is proposed, which is the fastest of its type to date and the first to investigate the cost savings of operating with λ/2 bits of precision. Performance results are promising in comparison to the hardware designs of the equivalent ring-LWE scheme: in addition to providing a stronger security proof, the proposed design generates 1272 encryptions per second and 4395 decryptions per second.
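The kind of scheme the paper implements in hardware, Regev-style encryption over an unstructured (standard) lattice, can be sketched in a few lines. The parameters below are far too small to be secure, and the error distribution is simplified so that decryption always succeeds; this is a toy illustration of the scheme's structure, not the paper's implementation.

```python
import numpy as np

# Toy Regev-style encryption over a *standard* (unstructured) lattice.
# Parameters are illustrative assumptions, far too small to be secure.
N, M, Q = 16, 20, 257          # secret dimension, number of samples, modulus
rng = np.random.default_rng(1)

def keygen():
    A = rng.integers(0, Q, size=(M, N))        # uniform unstructured matrix
    s = rng.integers(0, Q, size=N)             # secret key
    e = rng.integers(-1, 2, size=M)            # tiny error, guarantees decryption
    b = (A @ s + e) % Q                        # LWE samples
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, size=M)             # random binary combination of rows
    u = (r @ A) % Q
    v = (r @ b + bit * (Q // 2)) % Q           # message encoded at Q/2
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = (v - u @ sk) % Q                       # ≈ bit·(Q/2) + small error
    return int(Q // 4 < d < 3 * Q // 4)        # round to nearest multiple of Q/2
```

Because `A` has no ring structure, key and ciphertext operations are full matrix-vector products; this is exactly the extra cost (and the stronger security assumption) that distinguishes standard LWE from ring-LWE.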
Abstract:
BACKGROUND: Personalised nutrition (PN) may promote public health. PN involves dietary advice based on individual characteristics of end users and can, for example, be based on lifestyle, blood and/or DNA profiling. Currently, PN is not refunded by most health insurance or health care plans. Improved public health is therefore contingent on individual consumers being willing to pay for the service.
METHODS: A survey with a representative sample from the general population was conducted in eight European countries (N = 8233). Participants reported their willingness to pay (WTP) for PN based on lifestyle information, lifestyle and blood information, and lifestyle and DNA information. WTP was elicited by contingent valuation, with the price of a standard, non-PN advice used as reference.
RESULTS: About 30% of participants reported being willing to pay more for PN than for non-PN advice. They were on average prepared to pay about 150% of the reference price of a standard, non-personalised advice, with some differences related to socio-demographic factors.
CONCLUSION: There is a potential market for PN compared to non-PN advice, particularly among men on higher incomes. These findings raise questions as to what extent personalised nutrition can be left to the market or should be incorporated into public health programs.
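The headline results (the share of respondents willing to pay a premium, and their mean WTP relative to the reference price) come from straightforward summary statistics over the contingent-valuation responses. A minimal sketch on invented data (none of these figures are from the survey):

```python
# Hedged sketch of the WTP summary statistics; the responses are invented.
reference_price = 100.0                      # price of standard, non-PN advice
wtp = [80.0, 100.0, 150.0, 160.0, 90.0, 140.0, 100.0, 170.0, 60.0, 130.0]

above = [w for w in wtp if w > reference_price]
share_willing = len(above) / len(wtp)                     # fraction paying a premium
mean_premium = sum(above) / len(above) / reference_price  # mean WTP vs reference
```

On the study's real data, these two statistics correspond to the reported ~30% of participants willing to pay more and their mean WTP of about 150% of the reference price.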