Abstract:
Current toxic tort cases have increased national awareness of health concerns and present an important avenue in which public health scientists can perform a vital function, both in litigation and in the public health initiatives and promotions that may result. This review presents a systematic approach, using the paradigm of interactive public health disciplines, to the design of a matrix framework for medical surveillance of workers exposed to toxic substances. The matrix framework design addresses the scientific bases required to support the legal remedy of medical monitoring for workers injured as a result of their exposure to toxic agents. A background of recent legal developments that have a direct impact on the use of scientific expertise in litigation is examined in the context of toxic exposure litigation and the attainment of public health goals. The matrix model is applied to five different workplace exposures: dental mercury, firefighting, vinyl chloride manufacture, radon in mining, and silica. An exposure matrix designed by the Department of Energy for government nuclear workers is included as a reference comparison to the design matrix.
Abstract:
The genetic etiology of stroke likely reflects the influence of multiple loci with small effects, each modulating different pathophysiological processes. This research project utilized three analytical strategies to address the paucity of information related to the identification and characterization of genetic variation associated with stroke in the general population.

First, the general contribution of familial factors to stroke susceptibility was evaluated in a population-based sample of unrelated individuals. Increased risk of subclinical cerebral infarction was observed among individuals with a positive parental history of stroke. This association did not appear to be mediated by established stroke risk factors, specifically blood pressure levels or hypertension status.

The need to identify specific gene variation associated with stroke in the general population was addressed by evaluating seven candidate gene polymorphisms in a population-based sample of unrelated individuals. Three polymorphisms were significantly associated with increased risk of subclinical cerebral infarction or incident clinical ischemic stroke. These relationships include the G-protein β3 subunit 825C/T polymorphism and clinical stroke in Whites, the lipoprotein lipase S/X447 polymorphism and subclinical and clinical stroke in men, and the angiotensin I-converting enzyme Ins/Del polymorphism and subclinical stroke in White men. These associations did not appear to be confounded by the stroke risk factors adjusted for in the analysis models, specifically blood pressure levels or anti-hypertensive medication use.

The final research strategy considered, on a genome-wide scale, the idea that genetic variation may contribute to the occurrence of hypertension or stroke through a common etiologic pathway. Genomic regions were identified for which significant evidence of heterogeneity was observed among hypertensive sibpairs stratified by family history of stroke. Regions identified on chromosome 15 in African Americans, and on chromosome 13 in Whites and African Americans, suggest the presence of genes influencing both hypertension and stroke susceptibility.

Insight into the role of genetics in stroke is useful for the potential early identification of individuals at increased risk for stroke and for an improved understanding of the etiology of the disease. The ultimate goal of these endeavors is to guide the development of therapeutic interventions and informed prevention with a lasting and positive impact on public health.
Abstract:
This study of the wholesale electricity market compares the efficiency performance of the auction mechanism currently in place in U.S. markets with the performance of a proposed mechanism. The analysis highlights the importance of considering strategic behavior when comparing different institutional systems. We find that in concentrated markets, neither auction mechanism can guarantee an efficient allocation. The advantage of the current mechanism increases with increased price competition if market demand is perfectly inelastic. However, if market demand is at all responsive to price, the efficiency superiority of the current auction is less clear-cut. We present a case in which the proposed auction outperforms the current mechanism on efficiency even when all offers reflect true production costs. We also find that a market designer may face a tradeoff between lower electricity costs and production efficiency. Some implications for social welfare are discussed as well.
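For intuition about what such comparisons involve, the following is a minimal sketch (my own illustration; the abstract does not specify either mechanism's rules) of clearing a one-period electricity auction by merit order, assuming a uniform-price payment rule for one mechanism and a pay-as-bid rule for the other. With truthful offers, both rules dispatch the same least-cost units and differ only in payments; the strategic behavior the paper emphasizes arises once bidders anticipate the payment rule.

```python
# Illustrative sketch only: merit-order clearing of a single-period auction,
# comparing total payments under uniform-price and pay-as-bid rules.

def clear_auction(offers, demand):
    """offers: list of (capacity_mw, offer_price) pairs; demand: MW, inelastic."""
    dispatched = []                   # (quantity accepted, offer price)
    remaining = demand
    for capacity, price in sorted(offers, key=lambda o: o[1]):   # merit order
        if remaining <= 0:
            break
        q = min(capacity, remaining)
        dispatched.append((q, price))
        remaining -= q
    clearing_price = dispatched[-1][1]                   # marginal accepted offer
    uniform_cost = demand * clearing_price               # every MW paid the same price
    pay_as_bid_cost = sum(q * p for q, p in dispatched)  # each MW paid its own offer
    return clearing_price, uniform_cost, pay_as_bid_cost

offers = [(100, 20.0), (80, 35.0), (120, 50.0)]   # hypothetical (MW, $/MWh) offers
print(clear_auction(offers, demand=200))          # (50.0, 10000.0, 5800.0)
```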
Abstract:
A problem frequently encountered in Data Envelopment Analysis (DEA) is that the total number of inputs and outputs included tends to be too large relative to the sample size. One way to counter this problem is to combine several inputs (or outputs) into (meaningful) aggregate variables, thereby reducing the dimension of the input (or output) vector. A direct effect of input aggregation is to reduce the number of constraints. This, in turn, alters the optimal value of the objective function. In this paper, we show how a statistical test proposed by Banker (1993) may be applied to test the validity of a specific way of aggregating several inputs. An empirical application using data from Indian manufacturing for the year 2002-03 is included as an example of the proposed test.
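As a sketch of how such a test can be computed, the snippet below implements an F-type comparison in the spirit of Banker (1993): under the assumption that inefficiency deviations are exponentially distributed, the ratio of summed inefficiencies from the aggregated and disaggregated models is referred to an F(2n, 2n) distribution. The scores and the exact decision rule here are my own illustration, not the paper's data or code.

```python
# Hedged sketch of a Banker (1993)-style test for input aggregation in DEA.
from scipy.stats import f

def banker_test(ineff_aggregated, ineff_disaggregated):
    """Inputs: per-firm inefficiency deviations (>= 0) from each DEA model."""
    n = len(ineff_aggregated)
    stat = sum(ineff_aggregated) / sum(ineff_disaggregated)
    p_value = 1.0 - f.cdf(stat, 2 * n, 2 * n)   # exponential-inefficiency variant
    return stat, p_value

# Hypothetical scores: restricting the weight space via aggregation can only
# (weakly) increase each firm's measured inefficiency.
agg = [0.15, 0.30, 0.22, 0.05, 0.40]
dis = [0.10, 0.25, 0.20, 0.05, 0.35]
stat, p = banker_test(agg, dis)
print(stat, p)   # a large statistic with a small p would reject the aggregation
```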
Abstract:
The purpose of this study was to determine the effects of nutrient intake, genetic factors and common household environmental factors on the aggregation of fasting blood glucose among Mexican-Americans in Starr County, Texas. This study was designed to determine: (a) the proportion of variation in fasting blood glucose concentration explained by unmeasured genetic and common household environmental effects; (b) the degree of familial aggregation of measures of nutrient intake; and (c) the extent to which the familial aggregation of fasting blood glucose is explained by nutrient intake and its aggregation. The method of path analysis was employed to estimate these various effects.

Genes play an important role in fasting blood glucose: genetic variation was found to explain about 40% of the total variation in fasting blood glucose. Common household environmental effects, on the other hand, explained less than 3% of the variation in fasting blood glucose levels among individuals. Common household effects did, however, have significant effects on measures of nutrient intake, though they explained only about 10% of the total variance in nutrient intake. Finally, there was significant familial aggregation of nutrient intake measures, but their aggregation did not contribute significantly to the familial aggregation of fasting blood glucose. These results imply that similarities among relatives in fasting blood glucose are not due to similarities in nutrient intake among relatives.
Abstract:
Applying biometrics to daily scenarios involves demanding requirements in terms of software and hardware. At the same time, current biometric techniques are being adapted to everyday devices, such as mobile phones and laptops, which are far from meeting those requirements. In fact, reconciling these two necessities is one of the most difficult problems in biometrics at present. This paper therefore presents a segmentation algorithm able to provide suitably precise solutions for hand biometric recognition over a wide range of backgrounds, including carpet, glass, grass, mud, pavement, plastic, tiles and wood. The results highlight that segmentation is carried out with high precision (F-measure of 88%), with competitive execution times when compared with state-of-the-art segmentation algorithms.
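For reference, the F-measure quoted above combines segmentation precision and recall; below is a small sketch (illustrative, not the paper's evaluation code) of computing it from a predicted binary hand mask and a ground-truth mask.

```python
# Per-pixel F-measure for a binary segmentation mask, as an illustration.

def f_measure(predicted, truth):
    """predicted, truth: equal-length flat sequences of 0/1 pixel labels."""
    tp = sum(1 for p, t in zip(predicted, truth) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(predicted, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(predicted, truth) if p == 0 and t == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

print(f_measure([1, 1, 0, 1, 0], [1, 0, 0, 1, 1]))  # 0.666...; the paper reports 0.88
```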
Abstract:
New trends in biometrics are oriented toward mobile devices in order to increase the overall security of daily actions like bank account access, e-commerce, or even document protection within the mobile device. However, applying biometrics to mobile devices implies challenging aspects of biometric data acquisition, feature extraction and private data storage. Concretely, this paper addresses the problem of hand segmentation given a picture of the hand against an unknown background, requiring an accurate result in terms of hand isolation. For the sake of user acceptability, no restrictions are imposed on the background; hand images can therefore be taken without any constraint, making segmentation a demanding task. Multiscale aggregation strategies are proposed to solve this problem, given their accurate results in unconstrained and complicated scenarios and their favorable time performance. The method is evaluated on a public synthetic database of 480,000 images covering different backgrounds and illumination environments. The results obtained in terms of accuracy and time performance highlight its capability as a suitable solution to the problem of hand segmentation in contact-less environments, outperforming competitive methods in the literature such as Lossy Data Compression image segmentation (LDC).
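The following rough sketch illustrates the general idea behind multiscale aggregation (my own simplified illustration; the paper's algorithm is considerably more elaborate): cluster at the coarsest level of an image pyramid, then propagate and refine the labels back down to full resolution.

```python
# Simplified multiscale-aggregation segmentation: coarse clustering, then
# coarse-to-fine label propagation with per-level refinement.
import numpy as np

def downsample(img):
    """Halve resolution by averaging 2x2 blocks (a crude pyramid step)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def multiscale_segment(img, levels=3):
    pyramid = [img.astype(float)]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    coarse = pyramid[-1]
    labels = (coarse > coarse.mean()).astype(int)   # aggregate: 2 clusters, coarsest level
    for level in reversed(pyramid[:-1]):            # propagate labels coarse -> fine
        labels = np.kron(labels, np.ones((2, 2), dtype=int))           # upsample labels
        pad = ((0, level.shape[0] - labels.shape[0]),
               (0, level.shape[1] - labels.shape[1]))
        labels = np.pad(labels, pad, mode="edge")
        fg = level[labels == 1].mean()              # assumes both clusters non-empty
        bg = level[labels == 0].mean()
        labels = (np.abs(level - fg) < np.abs(level - bg)).astype(int)  # refine
    return labels

rng = np.random.default_rng(0)
demo = np.concatenate([rng.normal(0.2, 0.05, (32, 16)),
                       rng.normal(0.8, 0.05, (32, 16))], axis=1)
print(multiscale_segment(demo).mean())   # ~0.5: the bright half is labelled foreground
```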
Abstract:
This paper presents an image segmentation algorithm based on Gaussian multiscale aggregation oriented to hand biometric applications. The method is able to isolate the hand from a wide variety of background textures such as carpets, fabric, glass, grass, soil or stones. The evaluation was carried out using a publicly available synthetic database with 408,000 hand images on different backgrounds, comparing performance in terms of accuracy and computational cost to two competitive segmentation methods in the literature, namely Lossy Data Compression (LDC) and Normalized Cuts (NCuts). The results highlight that the proposed method outperforms these competing segmentation methods with regard to computational cost, time performance, accuracy and memory usage.
Abstract:
Study of the kinetics of cryopreservation of plant tissues
Abstract:
A mobile ad hoc network (MANET) is a collection of wireless mobile nodes that can dynamically configure a network without a fixed infrastructure or centralized administration. This makes it ideal for emergency and rescue scenarios, where information sharing is essential and should occur as soon as possible. This article discusses which of the routing strategies for mobile ad hoc networks (proactive, reactive or hierarchical) performs better in such scenarios. Using a real urban area as the setting for the emergency and rescue scenario, we calculate the density of nodes and the mobility model needed for validation. The NS2 simulator has been used in our study. We also show that hierarchical routing strategies are better suited to this type of scenario.
Abstract:
A mobile ad hoc network (MANET) is a collection of wireless mobile nodes that can dynamically configure a network without a fixed infrastructure or central administration. This makes it ideal for emergency and rescue scenarios, where sharing information is essential and should occur as soon as possible. This article discusses which of the routing strategies for MANETs (proactive, reactive or hierarchical) performs better in such scenarios. By selecting a real urban area for the emergency and rescue scenario, we calculated the density of nodes and the mobility model needed for the validation study of AODV, DSDV and CBRP in the routing model. The NS2 simulator has been used for our study. We also show that hierarchical routing strategies are better suited to this type of scenario.
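As a toy illustration of the scenario-dimensioning step mentioned above (the numbers are invented; the article's actual area and node counts are not given here), the node density of a rectangular simulation area can be derived as follows:

```python
# Toy sketch: node density for an NS2 emergency-and-rescue scenario.

def node_density(num_nodes, area_width_m, area_height_m):
    """Nodes per square kilometre for a rectangular urban simulation area."""
    area_km2 = (area_width_m / 1000.0) * (area_height_m / 1000.0)
    return num_nodes / area_km2

# e.g. 50 rescue-team nodes on a 1500 m x 800 m urban grid (hypothetical values)
print(node_density(50, 1500, 800))   # ~41.7 nodes per km^2
```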
Abstract:
An accepted fact in software engineering is that software must undergo a verification and validation process during development to ascertain and improve its quality. But there are more techniques than a single developer could master, and even then it is impossible to be certain that software is free of defects. It is therefore crucial that developers be able to choose, from the available evaluation techniques, the one most suitable and most likely to yield optimum quality results for different products. Although some knowledge is available on the strengths and weaknesses of the available software quality assurance techniques, not much is known yet about the relationships between different techniques and their contextual behavior. Objective: This research investigates the effectiveness of two testing techniques (equivalence class partitioning and decision coverage) and one review technique (code review by abstraction) in terms of their fault detection capability. This will be used to strengthen the practical knowledge available on these techniques.
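As a concrete illustration of one of the techniques named in the objective, the snippet below applies equivalence class partitioning to a toy function (an invented example, not drawn from the study): rather than testing every input, one representative is chosen from each class of inputs the specification treats identically, plus the class boundaries.

```python
# Equivalence class partitioning, illustrated on an invented function under test.

def classify_age(age):
    """Hypothetical spec: valid range 0-120; under 18 is a minor."""
    if age < 0 or age > 120:
        return "invalid"
    return "minor" if age < 18 else "adult"

# One representative per equivalence class, plus boundary values.
cases = {
    -5: "invalid",   # class: below the valid range
    10: "minor",     # class: 0 <= age < 18
    17: "minor",     # boundary just below minor/adult
    18: "adult",     # boundary at minor/adult
    60: "adult",     # class: 18 <= age <= 120
    200: "invalid",  # class: above the valid range
}
for age, expected in cases.items():
    assert classify_age(age) == expected, f"failed for age={age}"
print("all equivalence-class tests passed")
```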
Abstract:
In the smart building control industry, creating a platform that integrates different communication protocols and eases the interaction between users and devices is becoming increasingly important. BATMP is a platform designed to achieve this goal. In this paper, the authors describe a novel mechanism for information exchange, which introduces a new concept, the Parameter, and uses it as the common object among all the BATMP components: Gateway Manager, Technology Manager, Application Manager, Model Manager and Data Warehouse. A Parameter is an object that represents a physical magnitude and contains information about its presentation, available actions, access type, etc. Each component of BATMP holds a copy of the parameters. In the Technology Manager, three drivers for different communication protocols, KNX, CoAP and Modbus, are implemented to convert devices into parameters. In the Gateway Manager, users can control the parameters directly or by defining a scenario. In the Application Manager, applications can subscribe to parameters and decide their values by negotiating. Finally, a Negotiator implemented in the Model Manager notifies the other components about changes taking place in any component. By applying this mechanism, BATMP ensures simultaneous and concurrent communication among users, applications and devices.
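To make the Parameter concept concrete, here is a speculative sketch of a shared parameter object with subscription-based notification; the class name, fields and methods are my own guesses from the abstract, not BATMP's actual API.

```python
# Speculative sketch of the BATMP "Parameter" idea: a shared object that
# represents a physical magnitude and notifies subscribers of value changes.
from typing import Callable, List

class Parameter:
    def __init__(self, name: str, unit: str, access: str = "read-write"):
        self.name = name                       # e.g. "living_room_temperature"
        self.unit = unit                       # presentation info, e.g. "Celsius"
        self.access = access                   # access type
        self.value = None
        self._subscribers: List[Callable] = []

    def subscribe(self, callback: Callable) -> None:
        """Applications register to be notified whenever the value changes."""
        self._subscribers.append(callback)

    def set_value(self, value, source: str) -> None:
        """A driver (KNX, CoAP, Modbus) or a user writes a new value."""
        self.value = value
        for notify in self._subscribers:       # e.g. applications, the Negotiator
            notify(self, source)

temp = Parameter("living_room_temperature", "Celsius")
temp.subscribe(lambda p, src: print(f"{p.name} -> {p.value} {p.unit} (from {src})"))
temp.set_value(21.5, source="knx-driver")      # prints the change notification
```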