149 results for: events, connections, Node JS, event loop, thread, aggregation
Abstract:
High levels of genetic diversity and high propagule pressure are favoured by conservation biologists as the basis for successful reintroductions and ensuring the persistence of populations. However, invasion ecologists recognize the ‘paradox of invasion’, as successful species introductions may often be characterized by limited numbers of individuals and associated genetic bottlenecks. In the present study, we used a combination of high-resolution nuclear and mitochondrial genetic markers to investigate the invasion history of Reeves' muntjac deer in the British Isles. This invasion has caused severe economic and ecological damage, with secondary spread currently a concern throughout Europe and potentially globally. Microsatellite analysis based on eight loci grouped all 176 introduced individuals studied from across the species' range in the UK into one genetic cluster, and seven mitochondrial D-loop haplotypes were recovered, two of which were present at very low frequency and were related to more common haplotypes. Our results indicate that the entire invasion can be traced to a single founding event involving a low number of females. These findings highlight the fact that even small releases of species may, if ignored, result in irreversible and costly invasion, regardless of initial genetic diversity or continual genetic influx.
Abstract:
Donor-type microchimerism, the presence of a minority population of donor-derived haematopoietic cells following solid organ transplantation, has been postulated as a mechanism for induction of donor-specific graft tolerance. The stability, frequency, and relevance of microchimerism with respect to long-term outcome, however, remain uncertain. Using a polymerase chain reaction (PCR)-based method of microsatellite analysis of highly polymorphic short tandem repeat sequences (STRs) to detect donor-type cells, DNA from 11 patients was analyzed prospectively at specific time points for 12 months following liver transplantation, and from a further six patients retrospectively 2 years after liver transplantation. Using a panel of STRs, transient peripheral blood donor microchimerism was detected in 2 of 11 patients at a single time point following transplantation, but persistent evidence of donor-derived cells was not observed during the study period. Analysis of DNA extracted from skin and duodenum in two patients likewise failed to show donor-type cells at these sites. None of the six patients in the retrospective arm showed donor microchimerism, resulting in an overall detection rate of 1.58%. These results suggest that donor microchimerism following liver transplantation is an infrequent event, and that the generation of graft tolerance is independent of microchimerism.
Abstract:
AIMS: To determine whether alanine aminotransferase or gamma-glutamyltransferase levels, as markers of liver health and non-alcoholic fatty liver disease, might predict cardiovascular events in people with Type 2 diabetes.
METHODS: Data from the Fenofibrate Intervention and Event Lowering in Diabetes study were analysed to examine the relationship between liver enzymes and incident cardiovascular events (non-fatal myocardial infarction, stroke, coronary and other cardiovascular death, coronary or carotid revascularization) over 5 years.
RESULTS: Alanine aminotransferase level had a linear inverse relationship with the first cardiovascular event occurring in participants during the study period. After adjustment, for every 1 SD higher baseline alanine aminotransferase value (13.2 U/l), the risk of a cardiovascular event was 7% lower (95% CI 4-13; P=0.02). Participants with alanine aminotransferase levels below and above the reference range (8-41 U/l for women; 9-59 U/l for men) had hazard ratios for a cardiovascular event of 1.86 (95% CI 1.12-3.09) and 0.65 (95% CI 0.49-0.87), respectively (P=0.001). No relationship was found for gamma-glutamyltransferase.
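As a rough back-of-the-envelope reading of the per-SD estimate above (assuming the reported 7% reduction corresponds to a hazard ratio of about 0.93 per SD from a proportional-hazards model; the model form is not stated in this abstract):
\[
\mathrm{HR}_{1\,\mathrm{SD}} \approx 0.93
\;\Rightarrow\;
\beta \approx \frac{\ln 0.93}{13.2\ \mathrm{U/l}} \approx -5.5\times10^{-3}\ \text{per U/l},
\qquad
\mathrm{HR}_{2\,\mathrm{SD}} \approx 0.93^{2} \approx 0.86,
\]
so, under that assumption, a participant 2 SD (26.4 U/l) above another in baseline alanine aminotransferase would have roughly a 14% lower estimated hazard.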
CONCLUSIONS: The data may indicate that in people with Type 2 diabetes, which is associated with higher alanine aminotransferase levels because of prevalent non-alcoholic fatty liver disease, a low alanine aminotransferase level is a marker of hepatic or systemic frailty rather than health.
Abstract:
This paper presents a new framework for multi-subject event inference in surveillance video, where measurements produced by low-level vision analytics are usually noisy, incomplete or incorrect. Our goal is to infer the composite events undertaken by each subject from noisy observations. To achieve this, we consider the temporal characteristics of event relations and propose a method to correctly associate the detected events with individual subjects. The Dempster–Shafer (DS) theory of belief functions is used to infer events of interest from the results of our vision analytics and to measure conflicts occurring during the event association. Our system is evaluated against a number of videos that present passenger behaviours on a public transport platform, namely buses, at different levels of complexity. The experimental results demonstrate that by reasoning with spatio-temporal correlations, the proposed method achieves satisfactory performance when associating atomic events and recognising composite events involving multiple subjects in dynamic environments.
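The core of the Dempster–Shafer reasoning mentioned above can be sketched as follows (a minimal illustration of Dempster's rule of combination, not the authors' implementation; the frame of discernment, detector names and mass values are invented):

```python
# Minimal sketch of Dempster's rule of combination, the basic fusion step in
# Dempster-Shafer belief-function reasoning. All masses below are illustrative.
from itertools import product

def combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                # K: mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}, conflict

# Two noisy detectors reporting whether a subject "sits" or "stands".
SIT, STAND = frozenset({"sit"}), frozenset({"stand"})
EITHER = SIT | STAND                           # ignorance: mass on the whole frame
m_video  = {SIT: 0.6, STAND: 0.1, EITHER: 0.3}
m_sensor = {SIT: 0.5, STAND: 0.3, EITHER: 0.2}

fused, k = combine(m_video, m_sensor)
print(fused, "conflict =", round(k, 3))
```

The conflict value k is the quantity such a scheme can use to judge whether two detections should be associated with the same subject.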
Abstract:
Electric vehicles (EVs) and hybrid electric vehicles (HEVs) can reduce greenhouse gas emissions, and the switched reluctance motor (SRM) is one of the promising motor types for such applications. This paper presents a novel SRM fault-diagnosis and fault-tolerance operation solution. Based on the traditional asymmetric half-bridge topology for SRM driving, central-tapped windings of the SRM in a modular half-bridge configuration are introduced to provide fault-diagnosis and fault-tolerance functions, which remain idle under normal conditions. Fault diagnosis is achieved by detecting the characteristics of the excitation and demagnetization currents. An SRM fault-tolerance operation strategy is also realized by the proposed topology, which compensates for the missing phase torque under an open-circuit fault and reduces the unbalanced phase current caused by the uncontrolled faulty phase under a short-circuit fault. Furthermore, the current-sensor placement strategy is also discussed, giving two placement methods aimed at low cost or a modular structure. Simulation results in MATLAB/Simulink and experiments on a 750-W SRM validate the effectiveness of the proposed strategy, which may have significant implications for improving the reliability of EVs/HEVs.
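A heavily simplified, hypothetical sketch of the kind of current-signature check the abstract alludes to (not the paper's actual algorithm; the thresholds, interval masks and fault criteria are invented for illustration):

```python
# Hypothetical per-phase fault check based on excitation/demagnetisation
# current characteristics; all thresholds and criteria are invented.
import numpy as np

def diagnose_phase(i_phase, excited, demag_done, i_min_on=0.5, i_residual=0.2):
    """
    i_phase    : sampled phase current (A)
    excited    : bool mask marking samples where the phase is being excited
    demag_done : bool mask marking samples after demagnetisation should have finished
    """
    if np.max(i_phase[excited], initial=0.0) < i_min_on:
        return "open-circuit fault"      # current never builds up while excited
    if np.max(i_phase[demag_done], initial=0.0) > i_residual:
        return "short-circuit fault"     # current persists in an uncontrolled phase
    return "healthy"

# Toy usage: a healthy phase conducts while excited and decays to zero afterwards.
i = np.array([0.0, 2.0, 4.0, 3.0, 1.0, 0.1, 0.0, 0.0])
excited = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=bool)
demag_done = np.array([0, 0, 0, 0, 0, 0, 1, 1], dtype=bool)
print(diagnose_phase(i, excited, demag_done))   # -> "healthy"
```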
Abstract:
Aims: We report simultaneous observations of the nearby flare star Proxima Centauri with VLT/UVES and XMM-Newton over three nights in March 2009. Our optical and X-ray observations cover the star's quiescent state as well as its flaring activity and allow us to probe the stellar atmospheric conditions from the photosphere into the chromosphere, and then the corona, during its different activity stages. Methods: Using the X-ray data, we investigate variations in coronal densities and abundances and infer loop properties for an intermediate-sized flare. The optical data are used to investigate the magnetic field and its possible variability, to construct an emission line list for the chromosphere, and to use certain emission lines to construct physical models of Proxima Centauri's chromosphere. Results: We report the discovery of a weak optical forbidden Fe XIII line at 3388 Å during the more active states of Proxima Centauri. For the intermediate flare, we find two secondary flare events that may originate in neighbouring loops, and we discuss the line asymmetries observed during this flare in H I, He I, and Ca II lines. The high time resolution in the Hα line highlights strong temporal variations in the observed line asymmetries, which re-appear during a secondary flare event. We also present theoretical modelling with the stellar atmosphere code PHOENIX to construct flaring chromospheric models. Based on observations collected at the European Southern Observatory, Paranal, Chile (082.D-0953A) and on observations obtained with XMM-Newton, an ESA science mission with instruments and contributions directly funded by ESA Member States and NASA. Full Table 6 is only available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/534/A133
Abstract:
BACKGROUND: Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone.
METHODS: Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m²) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control-arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544).
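The time-to-event analysis described above can be illustrated with a minimal, hypothetical sketch (this is not the STAMPEDE analysis code; the data frame, column names and covariates are invented purely to show how an adjusted Cox model yields hazard ratios and 95% CIs):

```python
# Hypothetical sketch of an adjusted Cox proportional-hazards fit of the kind
# described in the methods; all data and column names are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_months": [71, 81, 43, 60, 35, 76, 52, 44, 66, 58, 90, 30],  # survival / follow-up time
    "died":        [1,  0,  1,  1,  1,  0,  0,  1,  0,  1,  0,  1],   # 1 = death observed
    "docetaxel":   [0,  1,  0,  1,  0,  1,  0,  1,  1,  0,  1,  0],   # 1 = research arm
    "age":         [64, 67, 70, 61, 66, 59, 72, 65, 63, 68, 60, 71],  # adjustment covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died")
cph.print_summary()   # reports exp(coef) (the hazard ratio) with 95% CIs per covariate
```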
FINDINGS: 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60-71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6%) men were previously treated with local therapy, and median prostate-specific antigen was 65 ng/mL (IQR 23-184). Median follow-up was 43 months (IQR 30-60). There were 415 deaths in the control group (347 [84%] prostate cancer). Median overall survival was 71 months (IQR 32 to not reached) for SOC-only, not reached (32 to not reached) for SOC + ZA (HR 0·94, 95% CI 0·79-1·11; p=0·450), 81 months (41 to not reached) for SOC + Doc (0·78, 0·66-0·93; p=0·006), and 76 months (39 to not reached) for SOC + ZA + Doc (0·82, 0·69-0·97; p=0·022). There was no evidence of heterogeneity in treatment effect (for any of the treatments) across prespecified subsets. Grade 3-5 adverse events were reported for 399 (32%) patients receiving SOC, 197 (32%) receiving SOC + ZA, 288 (52%) receiving SOC + Doc, and 269 (52%) receiving SOC + ZA + Doc.
INTERPRETATION: Zoledronic acid showed no evidence of survival improvement and should not be part of standard of care for this population. Docetaxel chemotherapy, given at the time of long-term hormone therapy initiation, showed evidence of improved survival accompanied by an increase in adverse events. Docetaxel treatment should become part of standard of care for adequately fit men commencing long-term hormone therapy.
FUNDING: Cancer Research UK, Medical Research Council, Novartis, Sanofi-Aventis, Pfizer, Janssen, Astellas, NIHR Clinical Research Network, Swiss Group for Clinical Cancer Research.
Abstract:
Cloud data centres are implemented as large-scale clusters with demanding requirements for service performance, availability and cost of operation. As a result of scale and complexity, data centres typically exhibit large numbers of system anomalies resulting from operator error, resource over/under-provisioning, hardware or software failures and security issues. These anomalies are inherently difficult to identify and resolve promptly via human inspection. Therefore, it is vital in a cloud system to have automatic system monitoring that detects potential anomalies and identifies their source. In this paper we present a lightweight anomaly detection tool (LADT) for Cloud data centres which combines extended log analysis and rigorous correlation of system metrics, implemented by an efficient correlation algorithm which does not require training or complex infrastructure setup. The LADT algorithm is based on the premise that there is a strong correlation between node-level and VM-level metrics in a cloud system. This correlation will drop significantly in the event of any performance anomaly at the node level, and a continuous drop in the correlation can indicate the presence of a true anomaly in the node. The log analysis of LADT assists in determining whether the correlation drop could be caused by naturally occurring cloud management activity such as VM migration, creation, suspension, termination or resizing. In this way, any potential anomaly alerts are reasoned about to prevent false positives that could be caused by the cloud operator's activity. We demonstrate LADT with log analysis in a Cloud environment to show how the log analysis is combined with the correlation of system metrics to achieve accurate anomaly detection.
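The metric-correlation premise can be sketched as follows (an illustrative toy, not the LADT implementation; the metric names, window length and threshold are invented):

```python
# Toy sliding-window correlation between a node-level metric and the aggregated
# VM-level metric; windows where the correlation drops are flagged as suspect.
import numpy as np

def correlation_alerts(node_metric, vm_metric, window=30, threshold=0.5):
    """Return start indices of windows whose node/VM correlation falls below threshold."""
    alerts = []
    for start in range(0, len(node_metric) - window + 1):
        n = node_metric[start:start + window]
        v = vm_metric[start:start + window]
        r = np.corrcoef(n, v)[0, 1]          # Pearson correlation for this window
        if np.isnan(r) or r < threshold:
            alerts.append(start)
    return alerts

# Synthetic example: VM CPU usage normally tracks node CPU usage, then an
# anomalous node-level load appears that the VMs do not explain.
rng = np.random.default_rng(0)
vm_cpu = 40 + 10 * np.sin(np.linspace(0, 8, 300)) + rng.normal(0, 1, 300)
node_cpu = vm_cpu + rng.normal(0, 1, 300)
node_cpu[200:260] += rng.normal(35, 10, 60)   # injected node-level anomaly
print(correlation_alerts(node_cpu, vm_cpu)[:5])
```

In LADT the log analysis would then be consulted before raising an alert, to rule out management activity such as VM migration as the cause of the correlation drop.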
Abstract:
In order to protect user privacy on mobile devices, an event-driven implicit authentication scheme is proposed in this paper. Several methods of utilizing the scheme for recognizing legitimate user behavior are investigated. The investigated methods compute an aggregate score and a threshold in real time to determine the trust level of the current user, using real data derived from user interaction with the device. The proposed scheme is designed to operate completely in the background, require a minimal training period, enable a high user recognition rate for implicit authentication, and promptly detect abnormal activity that can be used to trigger explicitly authenticated access control. In this paper, we investigate threshold computation through standard-deviation and EWMA (exponentially weighted moving average) based algorithms. The results of extensive experiments on user data collected over a period of several weeks from an Android phone indicate that our proposed approach is feasible and effective for lightweight real-time implicit authentication on mobile smartphones.
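The EWMA-based thresholding idea can be sketched as follows (an illustrative toy, not the authors' implementation; the score distributions, smoothing factor and threshold rule are invented):

```python
# Toy EWMA trust score: each behavioural event yields a per-event score; the
# running EWMA is compared against a threshold derived from training data.
import numpy as np

class EwmaAuthenticator:
    def __init__(self, alpha=0.2, k=2.0):
        self.alpha, self.k = alpha, k
        self.ewma = None
        self.threshold = None

    def train(self, legit_scores):
        """Set the threshold from the legitimate user's score distribution."""
        mu, sd = np.mean(legit_scores), np.std(legit_scores)
        self.threshold = mu - self.k * sd      # lower scores mean less trust
        self.ewma = mu

    def update(self, score):
        """Fold a new behavioural event score into the EWMA; return trust decision."""
        self.ewma = self.alpha * score + (1 - self.alpha) * self.ewma
        return self.ewma >= self.threshold     # False -> trigger explicit authentication

# Train on the owner's per-event similarity scores, then feed imposter-like scores.
rng = np.random.default_rng(1)
auth = EwmaAuthenticator()
auth.train(rng.normal(0.8, 0.05, 500))
print([auth.update(s) for s in rng.normal(0.4, 0.05, 10)])
```

The EWMA keeps the decision lightweight and event-driven: only the running average, not the full event history, needs to be stored on the device.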
Abstract:
Electronic report
Abstract:
This Integration Insight provides a brief overview of the most popular modelling techniques used to analyse complex real-world problems, as well as some less popular but highly relevant techniques. The modelling methods are divided into three categories, with each encompassing a number of methods, as follows: 1) Qualitative Aggregate Models (Soft Systems Methodology, Concept Maps and Mind Mapping, Scenario Planning, Causal (Loop) Diagrams), 2) Quantitative Aggregate Models (Function fitting and Regression, Bayesian Nets, System of differential equations / Dynamical systems, System Dynamics, Evolutionary Algorithms) and 3) Individual Oriented Models (Cellular Automata, Microsimulation, Agent Based Models, Discrete Event Simulation, Social Network Analysis). Each technique is broadly described with example uses, key attributes and reference material.
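As a flavour of the third category, a toy cellular automaton (one of the individual-oriented techniques listed above) can be written in a few lines; this is purely illustrative and not drawn from the Insight itself:

```python
# Elementary one-dimensional cellular automaton (rule 110), a minimal example
# of an individual-oriented model: global patterns emerge from local rules.
import numpy as np

def step(cells, rule=110):
    """Advance a 1-D binary cellular automaton by one generation."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    neighbourhood = (left << 2) | (cells << 1) | right   # 3-bit neighbourhood index
    return (rule >> neighbourhood) & 1

cells = np.zeros(64, dtype=int)
cells[32] = 1                                 # single seed cell
for _ in range(16):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```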
Abstract:
This paper proposes a method for the real-time detection and classification of multiple events in an electrical power system, namely islanding, high-frequency events (loss of load) and low-frequency events (loss of generation). The method is based on principal component analysis of frequency measurements and employs a moving-window approach to combat the time-varying nature of power systems, thereby increasing overall situational awareness of the power system. Numerical case studies using both real data collected from the UK power system and simulated data constructed in DIgSILENT PowerFactory, covering islanding events as well as loss-of-load and generation-dip events, are used to demonstrate the reliability of the proposed method.
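A rough sketch of the moving-window PCA idea (illustrative only, not the paper's algorithm; the window length, threshold and synthetic frequency traces are invented):

```python
# Toy moving-window PCA over multi-location frequency measurements: flag windows
# whose leading principal-component score shows a large excursion.
import numpy as np

def pc1_scores(freq_window):
    """Leading principal-component scores of a (samples x locations) window."""
    centred = freq_window - freq_window.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[0]                     # projection onto the first PC

def detect_events(freq, window=100, jump=0.05):
    """Return window start indices whose PC1 score range exceeds `jump`."""
    events = []
    for start in range(0, freq.shape[0] - window + 1, window // 2):
        scores = pc1_scores(freq[start:start + window])
        if scores.max() - scores.min() > jump:
            events.append(start)
    return events

# Synthetic example: three measurement locations at 50 Hz; a loss-of-generation
# event causes a common frequency dip part-way through the record.
rng = np.random.default_rng(2)
t = np.arange(3000)
freq = 50 + rng.normal(0, 0.002, (3000, 3))
freq[1500:] -= 0.1 * (1 - np.exp(-(t[1500:] - 1500) / 200))[:, None]   # frequency dip
print(detect_events(freq))
```

A classifier of the kind described above would then inspect the flagged windows, for example distinguishing a common frequency fall (loss of generation) from diverging frequencies at different locations (islanding).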