977 results for Kahler metrics


Relevance: 10.00%

Publisher:

Abstract:

Background

Endocrine disrupting chemicals and carcinogens, some of which may not yet have been classified as such, are present in many occupational environments and could increase breast cancer risk. Prior research has identified associations with breast cancer and work in agricultural and industrial settings. The purpose of this study was to further characterize possible links between breast cancer risk and occupation, particularly in farming and manufacturing, as well as to examine the impacts of early agricultural exposures, and exposure effects that are specific to the endocrine receptor status of tumours.

Methods

1005 breast cancer cases referred by a regional cancer center and 1147 randomly selected community controls provided detailed data including occupational and reproductive histories. All reported jobs were industry- and occupation-coded for the construction of cumulative exposure metrics representing likely exposure to carcinogens and endocrine disruptors. In a frequency-matched case-control design, exposure effects were estimated using conditional logistic regression.

Results

Across all sectors, women in jobs with potentially high exposures to carcinogens and endocrine disruptors had elevated breast cancer risk (OR = 1.42; 95% CI, 1.18-1.73, for 10 years' exposure duration). Specific sectors with elevated risk included: agriculture (OR = 1.36; 95% CI, 1.01-1.82); bars-gambling (OR = 2.28; 95% CI, 0.94-5.53); automotive plastics manufacturing (OR = 2.68; 95% CI, 1.47-4.88); food canning (OR = 2.35; 95% CI, 1.00-5.53); and metalworking (OR = 1.73; 95% CI, 1.02-2.92). Estrogen receptor status of tumours with elevated risk differed by occupational grouping. Premenopausal breast cancer risk was highest for automotive plastics (OR = 4.76; 95% CI, 1.58-14.4) and food canning (OR = 5.70; 95% CI, 1.03-31.5).
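For readers unfamiliar with duration-based odds ratios, the sketch below shows how a per-year log-odds coefficient from a (conditional) logistic regression relates to the 10-year OR quoted above. Only the OR of 1.42 comes from the abstract; log-linearity in exposure duration is an assumption of this illustration, not a claim about the study's model.

```python
import math

# The reported OR of 1.42 is for 10 years of exposure; under a
# log-linear exposure-duration model, OR_t = exp(beta * t).
OR_10 = 1.42
beta_per_year = math.log(OR_10) / 10          # per-year log-odds increment
print(f"beta per year ~ {beta_per_year:.4f}")
# Extrapolating to 20 years (illustrative only; assumes log-linearity):
print(f"OR for 20 years ~ {math.exp(20 * beta_per_year):.2f}")
```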

Conclusions

These observations support hypotheses linking breast cancer risk and exposures likely to include carcinogens and endocrine disruptors, and demonstrate the value of detailed work histories in environmental and occupational epidemiology.

Relevance: 10.00%

Publisher:

Abstract:

The maintenance of biodiversity is a fundamental theme of the Marine Strategy Framework Directive. Appropriate indicators to monitor change in biodiversity, along with associated targets representing "good environmental status" (GES), are required to be in place by July 2012. A method for selecting species-specific metrics to fulfil various specified indicator roles is proposed for demersal fish communities. Available data frequently do not extend far enough back in time to allow GES to be defined empirically. In such situations, trends-based targets offer a pragmatic solution. A method is proposed for setting indicator-level targets for the number of species-specific metrics required to meet their trends-based metric-level targets. This is based on demonstrating significant departures from the binomial distribution. The procedure is trialled using North Sea demersal fish survey data. Although fisheries management in the North Sea has improved in recent decades, management goals to stop further decline in biodiversity, and to initiate recovery, are yet to be met.
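A minimal sketch of the binomial-departure logic described above (the numbers are hypothetical, not North Sea survey values): with N species-specific metrics, each assumed under the null to meet its trends-based target with probability p, the indicator-level target is met when the observed count departs significantly from Binomial(N, p).

```python
from scipy.stats import binomtest

# Hypothetical values: 40 species-specific metrics, 29 of which meet
# their trends-based metric-level target; p = 0.5 is a placeholder null
# probability of any one metric meeting its target by chance.
N, k, p = 40, 29, 0.5
result = binomtest(k, n=N, p=p, alternative="greater")
print(f"P(X >= {k}) under Binomial({N}, {p}) = {result.pvalue:.4f}")
if result.pvalue < 0.05:
    print("Significant departure from the binomial: indicator-level target met.")
```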

Relevance: 10.00%

Publisher:

Abstract:

The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How can the impact of disturbances in (food) SC processes on the robustness of (food) SC performances be systematically assessed?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQs, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
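The VULA method itself is not specified in the abstract beyond these three questions (how much, how often, how long), but a minimal sketch of that bookkeeping on a simulated KPI series might look as follows; the threshold and data are hypothetical, standing in for discrete-event simulation output.

```python
import numpy as np

def vula_summary(kpi: np.ndarray, threshold: float) -> dict:
    """Sketch of the VULA idea: quantify magnitude, frequency and
    duration of KPI underperformance against an acceptance threshold."""
    below = kpi < threshold
    # how much: average shortfall while underperforming
    shortfall = float((threshold - kpi[below]).mean()) if below.any() else 0.0
    # group consecutive below-threshold periods into disturbance episodes
    edges = np.diff(below.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    if below[0]:
        starts = np.r_[0, starts]
    ends = np.flatnonzero(edges == -1) + 1
    if below[-1]:
        ends = np.r_[ends, below.size]
    durations = ends - starts
    return {
        "episodes": int(len(starts)),                                  # how often
        "mean_duration": float(durations.mean()) if len(starts) else 0.0,  # how long
        "mean_shortfall": shortfall,                                   # how much
    }

# Example: a year of daily service-level KPI values from a simulation run
rng = np.random.default_rng(1)
kpi = np.clip(0.97 + 0.03 * rng.standard_normal(365), 0, 1)
print(vula_summary(kpi, threshold=0.95))
```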
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps:
1) Description of the SC scenario and identification of its specific contextual factors;
2) Identification of disturbances that may affect KPIs;
3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method);
4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC;
5) Identification of appropriate preventive or disturbance-impact-reducing redesign strategies;
6) Alteration of SC scenario elements as required by the selected redesign strategies, repeating the VULA method for the KPIs defined in Step 3.
Contributions of this research are as follows. First, we have identified emerging research areas: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessment, which serve as the basis of the VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are as follows. First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies involving other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g. to use other indicators and statistical measures for disturbance detection and SC improvement, and to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.

Relevance: 10.00%

Publisher:

Abstract:

A boronic acid moiety was found to be a critical pharmacophore for enhanced in vitro potency against wild type hepatitis C replicons and known clinical polymorphic and resistant HCV mutant replicons. The synthesis, optimization, and structure-activity relationships associated with inhibition of HCV replication in a sub-genomic replication system for a series of non-nucleoside boron-containing HCV RNA-Dependent RNA Polymerase (NS5B) inhibitors are described. A summary of the discovery of GSK5852 (3), a molecule which entered clinical trials in subjects infected with HCV in 2011, is included.

Relevance: 10.00%

Publisher:

Abstract:

Deauthentication denial-of-service (DoS) attacks in public-access WiFi operate by exploiting the lack of authentication of management frames in the 802.11 protocol. Detection of these attacks relies almost exclusively on the selection of appropriate thresholds. In this work the authors demonstrate that there are additional, previously unconsidered, metrics which also influence DoS detection performance. A method of systematically tuning these metrics to optimal values is proposed which ensures that parameter choices are repeatable and verifiable.
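The paper's tuning procedure is not given here, but a toy version of systematic, repeatable parameter selection for a deauthentication-flood detector could look like the following; the detector, the traffic model and the F1 objective are assumptions of this sketch, not the authors' method.

```python
from itertools import product
import numpy as np

def detect(frame_counts, window, threshold):
    """Flag an attack when the number of deauthentication frames seen in
    a sliding window exceeds a threshold (simplified detector sketch)."""
    windowed = np.convolve(frame_counts, np.ones(window, dtype=int), mode="same")
    return windowed > threshold

def f1(pred, truth):
    tp = np.sum(pred & truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

# Hypothetical labeled capture: per-second deauth frame counts with a
# known attack interval injected as ground truth.
rng = np.random.default_rng(7)
counts = rng.poisson(0.2, 600)
truth = np.zeros(600, dtype=bool)
truth[200:260] = True
counts[truth] += rng.poisson(15, truth.sum())

# Systematic, repeatable tuning: exhaustive search over the parameter grid.
best = max(
    ((w, t, f1(detect(counts, w, t), truth))
     for w, t in product([1, 5, 10], [5, 10, 20, 40])),
    key=lambda x: x[2],
)
print(f"window={best[0]}s threshold={best[1]} F1={best[2]:.3f}")
```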

Relevance: 10.00%

Publisher:

Abstract:

The REsearch on a CRuiser Enabled Air Transport Environment (RECREATE) project considers the introduction and airworthiness of cruiser-feeder operations for civil aircraft. Cruiser-feeder operations are investigated as a promising pioneering idea for the air transport of the future. The soundness of the concept of cruiser-feeder operations for civil aircraft can be understood by taking air-to-air refueling operations as an example. For this example, a comprehensive estimate of the benefits can be made, which shows a fuel-burn and CO2-emission reduction potential of 31% for a typical 6000 nautical mile flight with a payload of 250 passengers. This reduction potential is large by any standard. The top-level objective of the RECREATE project is to demonstrate on a preliminary design level that cruiser-feeder operations (as a concept to reduce fuel burn and CO2 emission levels) can be shown to comply with the airworthiness requirements for civil aircraft. The underlying Scientific and Technological (S&T) objectives are to determine and study airworthy operational concepts for cruiser-feeder operations, and to derive and quantify benefits in terms of CO2 emission reduction as well as other benefits.

Work Package (WP) 3 has the objective of substantiating the assumed benefits of the cruiser/feeder operations through refined analysis and simulation. In this report, an initial benefits evaluation of the initial RECREATE cruiser/feeder concepts is presented. The benefits analysis is conducted in delta mode, i.e. results are compared with a baseline system. Since comparing different aircraft and air transport systems is never a trivial task, appropriate measures and metrics are defined and selected first. Non-dimensional parameters are defined and values for the baseline system derived.
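As a hedged illustration of such a non-dimensional, delta-mode metric (not the RECREATE definitions), one can compare fuel mass per passenger per nautical mile between the baseline and the cruiser-feeder operation. Only the 31% reduction and the 6000 nm / 250-passenger mission come from the text above; the baseline trip fuel is a placeholder.

```python
# Delta-mode comparison with a non-dimensional fuel metric:
# fuel mass per passenger per nautical mile.
RANGE_NM, PAX = 6000, 250
baseline_fuel_kg = 90_000                    # placeholder trip fuel, not project data
cruiser_feeder_fuel_kg = baseline_fuel_kg * (1 - 0.31)   # 31% reduction per abstract

def fuel_per_pax_nm(fuel_kg):
    return fuel_kg / (PAX * RANGE_NM)

for name, fuel in [("baseline", baseline_fuel_kg),
                   ("cruiser-feeder", cruiser_feeder_fuel_kg)]:
    print(f"{name:>14}: {fuel_per_pax_nm(fuel):.4f} kg per pax-nm")
```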

The impact of cruiser/feeder operations such as air-to-air refueling is studied with respect to fuel burn (or carbon dioxide), noise and congestion. For this purpose, traffic simulations have been conducted.
Cruiser/feeder operations will have an impact on dispatch reliability as well. An initial assessment of the effect on dispatch reliability has been made and is reported.

Finally, a considerable effort has been made to create the infrastructure for economic delta analysis of the cruiser/feeder concept of operation. First results of the cost analysis have been obtained.

Relevance: 10.00%

Publisher:

Abstract:

Mitigation of diffuse nutrient and sediment delivery to streams requires successful identification and management of critical source areas within catchments. Approaches to predicting high-risk areas for sediment loss have typically relied on structural drivers of connectivity and risk, with little consideration given to process-driven water quality responses. To assess the applicability of structural metrics to predict critical source areas, geochemical tracing of land use sources was conducted in three headwater agricultural catchments in Co. Down and Co. Louth, Ireland, within a Monte Carlo framework. Outputs were applied to the inverse optimisation of a connectivity model, based on LiDAR DEM data, to assess the efficacy of land use risk weightings to predict sediment source contributions over the 18-month study period in the Louth Upper, Louth Lower and Down catchments. Results of the study indicated that sediment proportions over the study period varied from 6 to 10%, 84 to 87%, 4%, and 2 to 3% in the Down Catchment; 79 to 85%, 9 to 17%, 1 to 3% and 2 to 3% in the Louth Upper; and 2 to 3%, 79 to 85%, 10 to 17% and 2 to 3% in the Louth Lower for arable, channel bank, grassland, and woodland sources, respectively. Optimised land use risk weightings for each sampling period showed that at the larger catchment scale, no variation in median land use weightings was required to predict land use contributions. However, for the two smaller study catchments, variation in median risk weightings was considerable, which may indicate the importance of functional connectivity processes at this spatial scale. In all instances, arable land consistently generated the highest risk of sediment loss across all catchments and sampling times. This study documents some of the first data on sediment provenance in Ireland and indicates the need for cautious consideration of land use as a tool to predict critical source areas at the headwater scale.
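For illustration, a minimal Monte Carlo unmixing sketch in the spirit of the tracing described above: tracer signatures are resampled per land-use source, and non-negative least squares with a sum-to-one constraint yields a distribution of source proportions. All tracer values here are invented, not the study's geochemistry.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
sources = ["arable", "channel bank", "grassland", "woodland"]
means = np.array([[12.0, 3.1, 0.8],    # hypothetical tracer signatures per source
                  [ 7.5, 5.2, 1.9],
                  [10.1, 2.4, 0.5],
                  [ 9.0, 4.0, 1.2]])
sds = 0.1 * means                      # assumed within-source variability
mixture = np.array([8.6, 4.6, 1.5])    # tracer values in stream sediment sample

props = []
for _ in range(5000):
    A = rng.normal(means, sds).T                   # tracers x sources draw
    # Augment with a heavily weighted sum-to-one row; nnls enforces p >= 0.
    A_aug = np.vstack([A, 100 * np.ones(len(sources))])
    b_aug = np.append(mixture, 100.0)
    p, _ = nnls(A_aug, b_aug)
    props.append(p / p.sum())

for name, (lo, hi) in zip(sources,
                          np.percentile(props, [2.5, 97.5], axis=0).T):
    print(f"{name:>12}: {lo:.0%} to {hi:.0%}")
```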

Relevance: 10.00%

Publisher:

Abstract:

Directional modulation (DM) is a recently introduced technique for secure wireless transmission using direct physical layer wave-front manipulation. This paper provides a bit error rate (BER)-based DM array synthesis method. It is shown for the first time that the standard constellation mappings in In-phase and Quadrature (IQ) space to a pre-specified BER can be exactly achieved along a given specified spatial direction. Different receiver capabilities are investigated and different assessment metrics for each case are discussed. The approach is validated for a 1 × 4 element dipole array operating at 1 GHz.
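A hedged sketch of the underlying direction-to-BER relationship (not the paper's DM synthesis method): for a conventionally steered 4-element, half-wavelength-spaced linear array, the directional SNR set by the array factor maps to a BER via the standard coherent-BPSK expression, which is why a pre-specified BER can be associated with a spatial direction. All values below are placeholders.

```python
import numpy as np
from scipy.special import erfc

N, d = 4, 0.5                          # elements, spacing in wavelengths
theta0 = np.deg2rad(45.0)              # intended secure direction
theta = np.deg2rad(np.linspace(1, 179, 179))
n = np.arange(N)
w = np.exp(-1j * 2 * np.pi * d * n * np.cos(theta0))   # steering weights
af = np.abs(np.exp(1j * 2 * np.pi * d * np.outer(np.cos(theta), n)) @ w) / N

snr_db = 10.0                          # assumed per-element receive SNR
snr = 10 ** (snr_db / 10) * af**2 * N  # directional SNR with array gain
ber = 0.5 * erfc(np.sqrt(snr))         # coherent BPSK over AWGN

i = np.argmin(np.abs(theta - theta0))
print(f"BER at 45 deg: {ber[i]:.2e}; worst off-axis BER: {ber.max():.2f}")
```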

Relevance: 10.00%

Publisher:

Abstract:

Purpose: To describe associations between reticular pseudodrusen, individual characteristics, and retinal function.

Design: Cohort study.

Participants: We recruited 105 patients (age range, 52–93 years) who had advanced neovascular age-related macular degeneration (AMD) in only 1 eye from 3 clinical centers in Europe.

Methods: Minimum follow-up was 12 months. The eye selected for study was the fellow eye without advanced disease. Clinical measures of vision were distance visual acuity, near visual acuity, and results of the Smith-Kettlewell low-luminance acuity test (SKILL). Fundus imaging included color photography, red-free imaging, blue autofluorescence imaging, fluorescein angiography, indocyanine green angiography, and optical coherence tomography using standardized protocols. These were used to detect progression to neovascular AMD in the study eye during follow-up. All imaging outputs were graded for the presence or absence of reticular pseudodrusen (RPD) using a multimodal approach. Choroidal thickness was measured at the foveal center and at 2 other equidistant locations from the fovea (1500 μm) nasally and temporally. Metrics on retinal thickness and volume were obtained from the manufacturer-supplied automated segmentation readouts.

Main Outcome Measures: Presence of RPD, distance visual acuity, near visual acuity, SKILL score, choroidal thickness, retinal thickness, and retinal volume.

Results: Reticular pseudodrusen were found in 43 participants (41%) on one or more imaging methods. The SKILL score was significantly worse in those with reticular drusen (mean score ± standard deviation [SD], 38±12) versus those without (mean score ± SD, 33±9) (P = 0.034). Parafoveal retinal thickness, parafoveal retinal volume, and all of the choroidal thickness parameters measured were significantly lower in those with reticular drusen than in those without. The presence of RPD was associated with development of neovascular AMD when corrected for age and sex (odds ratio, 5.5; 95% confidence interval, 1.1–28.8; P = 0.042). All participants in whom geographic atrophy developed during follow-up had visible RPD at baseline.

Conclusions: Significant differences in retinal and choroidal anatomic features, visual function, and risk factor profile exist in unilateral neovascular AMD patients with RPD compared with those without; therefore, such patients should be monitored carefully because of the risk of developing bilateral disease.

Relevance: 10.00%

Publisher:

Abstract:

In this paper, the impact of multiple active eavesdroppers on cooperative single carrier systems with multiple relays and multiple destinations is examined. To achieve the secrecy diversity gains in the form of opportunistic selection, a two-stage scheme is proposed for joint relay and destination selection, in which, after the selection of the relay with the minimum effective maximum signal-to-noise ratio (SNR) to a cluster of eavesdroppers, the destination that has the maximum SNR from the chosen relay is selected. In order to accurately assess the secrecy performance, exact and asymptotic expressions are obtained in closed form for several security metrics including the secrecy outage probability, the probability of non-zero secrecy rate, and the ergodic secrecy rate in frequency selective fading. Based on the asymptotic analysis, key design parameters such as secrecy diversity gain, secrecy array gain, secrecy multiplexing gain, and power cost are characterized, from which new insights are drawn. Moreover, it is concluded that secrecy performance limits occur when the average received power at the eavesdropper is proportional to the counterpart at the destination. Specifically, for the secrecy outage probability, it is confirmed that the secrecy diversity gain collapses to zero with an outage floor, whereas for the ergodic secrecy rate, it is confirmed that its slope collapses to zero with a capacity ceiling.
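For readers unfamiliar with these metrics, a small Monte Carlo sketch under Rayleigh fading (placeholder average SNRs and target rate, and no relay/destination selection) estimates the three quantities named above, where the secrecy capacity is Cs = [log2(1 + SNR_D) - log2(1 + SNR_E)]^+.

```python
import numpy as np

rng = np.random.default_rng(42)
trials, Rs = 1_000_000, 1.0          # target secrecy rate (bits/s/Hz), assumed
avg_snr_D, avg_snr_E = 100.0, 10.0   # average received SNRs (20 dB, 10 dB), assumed

# Rayleigh fading: instantaneous SNR is exponentially distributed.
snr_D = rng.exponential(avg_snr_D, trials)   # destination link
snr_E = rng.exponential(avg_snr_E, trials)   # eavesdropper link
Cs = np.maximum(np.log2(1 + snr_D) - np.log2(1 + snr_E), 0.0)

print(f"Secrecy outage probability P(Cs < Rs): {np.mean(Cs < Rs):.4f}")
print(f"Probability of non-zero secrecy rate:  {np.mean(Cs > 0):.4f}")
print(f"Ergodic secrecy rate: {Cs.mean():.3f} bits/s/Hz")
```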

Relevance: 10.00%

Publisher:

Abstract:

Recent work suggests that the human ear varies significantly between different subjects and can be used for identification. In principle, therefore, using ears in addition to the face within a recognition system could improve accuracy and robustness, particularly for non-frontal views. The paper describes work that investigates this hypothesis using an approach based on the construction of a 3D morphable model of the head and ear. One issue with creating a model that includes the ear is that existing training datasets contain noise and partial occlusion. Rather than exclude these regions manually, a classifier has been developed which automates this process. When combined with a robust registration algorithm the resulting system enables full head morphable models to be constructed efficiently using less constrained datasets. The algorithm has been evaluated using registration consistency, model coverage and minimalism metrics, which together demonstrate the accuracy of the approach. To make it easier to build on this work, the source code has been made available online.

Relevance: 10.00%

Publisher:

Abstract:

Energy consumption and total cost of ownership are daunting challenges for datacenters, because they scale disproportionately with performance. Datacenters running financial analytics may incur extremely high operational costs in order to meet the performance and latency requirements of their hosted applications. Recently, ARM-based microservers have emerged as a viable alternative to high-end servers, promising scalable performance via scale-out approaches and low energy consumption. In this paper, we investigate the viability of ARM-based microservers for option pricing, using the Monte Carlo and Binomial Tree kernels. We compare an ARM-based microserver against a state-of-the-art x86 server. We define application-related but platform-independent energy and performance metrics to compare those platforms fairly in the context of datacenters for financial analytics and give insight on the particular requirements of option pricing. Our experiments show that through scaling out energy-efficient compute nodes within a 2U rack-mounted unit, an ARM-based microserver consumes as little as about 60% of the energy per priced option compared to an x86 server, despite having significantly slower cores. We also find that the ARM microserver scales enough to meet a high fraction of market throughput demand, while consuming up to 30% less energy than an Intel server.
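As a flavor of the workload and the metric (not the paper's implementation or measurements), the sketch below prices a European call with a Monte Carlo kernel and converts a hypothetical throughput/power pair into an energy-per-priced-option figure.

```python
import math
import numpy as np

def mc_european_call(S0, K, r, sigma, T, paths=100_000, seed=0):
    """Monte Carlo price of a European call under geometric Brownian motion."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    return math.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

price = mc_european_call(S0=100, K=105, r=0.02, sigma=0.25, T=1.0)
print(f"MC call price: {price:.3f}")

# Platform-independent efficiency metric in the spirit of the paper:
# energy per priced option = average power draw / options per second.
# Both figures below are hypothetical, not the paper's measurements.
options_per_s, avg_power_w = 1_200.0, 35.0
print(f"Energy per option: {avg_power_w / options_per_s * 1e3:.2f} mJ")
```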

Relevance: 10.00%

Publisher:

Abstract:

The problem of learning from imbalanced data is of critical importance in a large number of application domains and can be a bottleneck in the performance of various conventional learning methods that assume the data distribution to be balanced. The class imbalance problem corresponds to dealing with the situation where one class massively outnumbers the other. The imbalance between majority and minority classes would lead machine learning to be biased and produce unreliable outcomes if the imbalanced data were used directly. There has been increasing interest in this research area and a number of algorithms have been developed. However, independent evaluation of the algorithms is limited. This paper aims at evaluating the performance of five representative data sampling methods, namely SMOTE, ADASYN, BorderlineSMOTE, SMOTETomek and RUSBoost, that deal with class imbalance problems. A comparative study is conducted and the performance of each method is critically analysed in terms of assessment metrics. © 2013 Springer-Verlag.
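A minimal sketch of this kind of comparison using imbalanced-learn, shown for SMOTE only (the other methods are available there as samplers or, in the case of RUSBoost, as an ensemble classifier); the dataset is synthetic, not one of the paper's benchmarks.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split
from imblearn.over_sampling import SMOTE

# Synthetic imbalanced problem: ~95% majority vs ~5% minority class.
X, y = make_classification(n_samples=5000, weights=[0.95, 0.05],
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Oversample the minority class in the training set only.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
clf = RandomForestClassifier(random_state=0).fit(X_res, y_res)

# Assessment metrics on the untouched test set.
print(classification_report(y_te, clf.predict(X_te), digits=3))
print("ROC AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
```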

Relevance: 10.00%

Publisher:

Abstract:

Purpose: There is an urgent need to develop diagnostic tests to improve the detection of pathogens causing life-threatening infection (sepsis). SeptiFast is a CE-marked multi-pathogen real-time PCR system capable of detecting DNA sequences of bacteria and fungi present in blood samples within a few hours. We report here a systematic review and meta-analysis of diagnostic accuracy studies of SeptiFast in the setting of suspected sepsis.

Methods: A comprehensive search strategy was developed to identify studies that compared SeptiFast with blood culture in suspected sepsis. Methodological quality was assessed using QUADAS. Heterogeneity of studies was investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in receiver operating characteristic space. A bivariate model was used to estimate summary sensitivity and specificity.

Results: From 41 phase III diagnostic accuracy studies, summary sensitivity and specificity for SeptiFast compared with blood culture were 0.68 (95 % CI 0.63–0.73) and 0.86 (95 % CI 0.84–0.89) respectively. Study quality was judged to be variable with important deficiencies overall in design and reporting that could impact on derived diagnostic accuracy metrics.

Conclusions: SeptiFast appears to have higher specificity than sensitivity, but deficiencies in study quality are likely to render this body of work unreliable. Based on the evidence presented here, it remains difficult to make firm recommendations about the likely clinical utility of SeptiFast in the setting of suspected sepsis.

Relevance: 10.00%

Publisher:

Abstract:

The motivation for this study was to reduce physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre- and on-treatment trajectory log files and phantom-based ionization chamber array measurements. The correlation in this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce patient-specific QA workload. Thirty VMAT plans from three treatment sites (prostate only, prostate and pelvic node (PPN), and head and neck (H&N)) were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pretreatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pretreatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pretreatment and on-treatment) and ionization chamber array gamma results (pretreatment). Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pretreatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated to on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pretreatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001). Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy. However, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pretreatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates potential to reduce our current patient-specific QA. Additionally, insight into MLC and gantry position accuracy through trajectory log file analysis and the strong correlation between gamma analysis results and the MCS could also provide further methodologies to optimize both the VMAT planning and QA processes.
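For illustration, a naive 1D global gamma-index implementation showing how the 1%/1 mm and 2%/2 mm pass rates quoted above are defined; clinical tools compute this in 2D/3D with geometric interpolation, and the dose profiles here are invented.

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dose_pct, dta_mm, cutoff_pct=10):
    """Brute-force global 1D gamma index: for each reference point, the
    minimum over evaluated points of the combined dose-difference and
    distance-to-agreement term; a point passes when gamma <= 1."""
    dose_tol = dose_pct / 100 * ref.max()        # global dose normalization
    x = np.arange(ref.size) * spacing_mm
    mask = ref >= cutoff_pct / 100 * ref.max()   # ignore low-dose region
    gammas = []
    for i in np.flatnonzero(mask):
        dd = (eval_ - ref[i]) / dose_tol         # dose-difference term
        dx = (x - x[i]) / dta_mm                 # distance-to-agreement term
        gammas.append(np.min(np.hypot(dd, dx)))
    return 100 * np.mean(np.array(gammas) <= 1.0)

# Hypothetical planned vs delivered dose profiles (arbitrary units)
x = np.linspace(-50, 50, 201)                    # 0.5 mm spacing
planned = np.exp(-(x / 20) ** 2)
delivered = 1.005 * np.exp(-((x - 0.2) / 20) ** 2)   # small shift and scaling
print(f"1%/1mm pass rate: {gamma_pass_rate(planned, delivered, 0.5, 1, 1):.1f}%")
print(f"2%/2mm pass rate: {gamma_pass_rate(planned, delivered, 0.5, 2, 2):.1f}%")
```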