936 results for Large-scale gradient


Relevance: 100.00%

Abstract:

This paper proposes a new design method of H∞ filtering for nonlinear large-scale systems with interconnected time-varying delays. The interaction terms with interval time-varying delays are bounded by nonlinear bounding functions involving all subsystem states. A stable linear filter is designed to ensure that the filtering error system is exponentially stable with a prescribed convergence rate. By constructing a set of improved Lyapunov functions and using the generalized Jensen inequality, new delay-dependent conditions for designing the H∞ filter are obtained in terms of linear matrix inequalities. Finally, an example is provided to illustrate the effectiveness of the proposed results.
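
For orientation, the generic shape of such a design problem can be written down as follows; the notation is illustrative, not the paper's exact model: each subsystem is perturbed by delayed interconnection terms, a stable linear filter estimates the regulated output, and the filtering error must decay at a prescribed exponential rate.

```latex
% Illustrative setup (generic notation, not the paper's exact model):
% i-th subsystem with interval time-varying interconnection delays
% h_{ij}(t), a stable linear filter, and the exponential decay target.
\begin{aligned}
\dot{x}_i(t) &= A_i x_i(t) + \sum_{j \neq i} f_{ij}\bigl(t,\, x_j(t - h_{ij}(t))\bigr),
  \qquad 0 \le \underline{h} \le h_{ij}(t) \le \overline{h},\\
z_i(t) &= C_i x_i(t), \qquad y_i(t) = D_i x_i(t) + v_i(t),\\
\dot{\hat{x}}_i(t) &= A_{f,i}\,\hat{x}_i(t) + B_{f,i}\,y_i(t), \qquad
  \hat{z}_i(t) = C_{f,i}\,\hat{x}_i(t),\\
\|z_i(t) - \hat{z}_i(t)\| &\le M e^{-\alpha t}, \qquad \alpha > 0
  \ \text{(prescribed convergence rate)}.
\end{aligned}
```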

Relevance: 100.00%

Abstract:

High-dimensional problem domains pose significant challenges for anomaly detection, since irrelevant features can conceal the presence of anomalies. This problem, known as the 'curse of dimensionality', is an obstacle for many anomaly detection techniques. Building a robust anomaly detection model for use in high-dimensional spaces requires combining an unsupervised feature extractor with an anomaly detector. While one-class support vector machines are effective at producing decision surfaces from well-behaved feature vectors, they can be inefficient at modelling the variation in large, high-dimensional datasets. Architectures such as deep belief networks (DBNs) are a promising technique for learning robust features. We present a hybrid model in which an unsupervised DBN is trained to extract generic underlying features and a one-class SVM is trained on the features learned by the DBN. Since a linear kernel can be substituted for nonlinear ones in our hybrid model without loss of accuracy, the model is scalable and computationally efficient. The experimental results show that our proposed model yields anomaly detection performance comparable to a deep autoencoder, while reducing its training and testing time by factors of 3 and 1000, respectively.
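
A minimal sketch of this hybrid pipeline, assuming scikit-learn: since scikit-learn ships no DBN class, a stack of BernoulliRBM layers stands in for the unsupervised feature extractor; the layer sizes, hyperparameters, and random data are purely illustrative.

```python
# Hybrid feature extractor + one-class SVM, sketched with scikit-learn.
# Stacked RBMs stand in for a DBN (scikit-learn has no DBN class).
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X_train = rng.rand(500, 64)   # placeholder data scaled to [0, 1]
X_test = rng.rand(100, 64)

model = Pipeline([
    ("rbm1", BernoulliRBM(n_components=32, learning_rate=0.05,
                          n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=16, learning_rate=0.05,
                          n_iter=20, random_state=0)),
    # A linear kernel on the learned features keeps training/testing cheap.
    ("ocsvm", OneClassSVM(kernel="linear", nu=0.1)),
])
model.fit(X_train)                        # RBMs train unsupervised, then the 1SVM
scores = model.decision_function(X_test)  # low scores => likely anomalies
print(scores[:5])
```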

Relevance: 100.00%

Abstract:

In this paper, the problem of distributed functional state observer design is considered for a class of large-scale interconnected systems with heterogeneous time-varying delays in both the interconnections and the local state vectors. The resulting observer scheme is suitable for strongly coupled subsystems with multiple time-varying delays, and is shown to give good results even for systems with very strong interconnections, while imposing only mild existence conditions. A set of existence conditions is derived along with a computationally simple observer construction procedure. Based on the Lyapunov-Krasovskii functional (LKF) method in the framework of linear matrix inequalities (LMIs), delay-dependent conditions are derived for the observer parameters that ensure exponential convergence of the observer error dynamics. The effectiveness of the obtained results is illustrated and tested on a numerical example of a three-area interconnected system.
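
A generic functional-observer structure of the kind described, in illustrative notation rather than the paper's exact equations: each local observer estimates a linear functional z_i(t) = F_i x_i(t) of the local state from the local output, despite delays in both the local dynamics and the interconnections.

```latex
% Illustrative distributed functional observer (generic notation):
% local delay d_i(t), interconnection delays h_{ij}(t).
\begin{aligned}
\dot{x}_i(t) &= A_i x_i(t) + A_{d,i}\,x_i(t - d_i(t))
  + \sum_{j \neq i} H_{ij}\,x_j(t - h_{ij}(t)), \qquad y_i(t) = C_i x_i(t),\\
\dot{\omega}_i(t) &= N_i\,\omega_i(t) + N_{d,i}\,\omega_i(t - d_i(t))
  + J_i\,y_i(t) + J_{d,i}\,y_i(t - d_i(t)),\\
\hat{z}_i(t) &= \omega_i(t) + E_i\,y_i(t), \qquad
  \|\hat{z}_i(t) - F_i x_i(t)\| \le M e^{-\alpha t}\ \text{for some } \alpha > 0.
\end{aligned}
```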

Relevance: 100.00%

Abstract:

Relying on the absence, presence, or level of symptomatology may not provide an adequate indication of the effects of treatment for depression, nor sufficient information for developing treatment plans that meet patients' needs. Using a prospective, multi-centered, observational design, the present study surveyed a large sample of outpatients with depression in China (n=9855). The 17-item Hamilton Rating Scale for Depression (HRSD-17) and the Remission Evaluation and Mood Inventory Tool (REMIT) were administered at baseline, two weeks, and four weeks to assess patients' self-reported symptoms and general sense of mental health and wellbeing. Of the 9855 outpatients, 91.3% were diagnosed with moderate to severe depression. Patients reported significant improvement over time on both depressive symptoms and general sense after four weeks of treatment. The effect sizes of change in general sense were lower than those in symptoms at both the two-week and four-week follow-ups. Treatment effects on both general sense and depressive symptomatology were associated with demographic and clinical factors. The findings indicate that attending to patients' general sense of mental health and wellbeing, in addition to depressive symptomatology, will provide clinicians, researchers, and patients themselves with a broader perspective on patients' status.

Relevance: 100.00%

Abstract:

The problem of unsupervised anomaly detection arises in a wide variety of practical applications. While one-class support vector machines have demonstrated their effectiveness as an anomaly detection technique, their ability to model large datasets is limited by the memory and time complexity of training. To address this issue for supervised learning of kernel machines, there has been growing interest in random projection methods as an alternative to the computationally expensive kernel matrix construction and support vector optimisation. In this paper we leverage the theory of nonlinear random projections and propose the Randomised One-class SVM (R1SVM), an efficient and scalable anomaly detection technique that can be trained on large-scale datasets. Our empirical analysis on several real-life and synthetic datasets shows that our randomised 1SVM algorithm achieves accuracy comparable to or better than deep autoencoder and traditional kernelised approaches for anomaly detection, while being approximately 100 times faster in training and testing.
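
A sketch of the underlying idea under stated assumptions, using scikit-learn: RBFSampler provides a nonlinear random projection (random Fourier features approximating an RBF kernel), and a linear one-class SVM is trained on the projected data. This mirrors the randomised-1SVM recipe but is not the authors' implementation; all parameters and data are illustrative.

```python
# Nonlinear random projection + linear one-class SVM, sketched
# with scikit-learn. Data and hyperparameters are illustrative.
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.pipeline import Pipeline
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X_train = rng.randn(2000, 20)          # placeholder "large" dataset
X_test = np.vstack([rng.randn(95, 20), 6 + rng.randn(5, 20)])

r1svm = Pipeline([
    # Random Fourier features approximate an RBF kernel explicitly.
    ("proj", RBFSampler(gamma=0.1, n_components=256, random_state=0)),
    ("ocsvm", OneClassSVM(kernel="linear", nu=0.05)),
])
r1svm.fit(X_train)
pred = r1svm.predict(X_test)           # -1 flags anomalies
print((pred == -1).sum(), "points flagged")
```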

Relevance: 100.00%

Abstract:

Many algorithms have been introduced to deterministically authenticate Radio Frequency Identification (RFID) tags, while little work has been done to address the scalability issue in batch authentication. Deterministic approaches verify tags one by one, so the communication overhead and time cost grow linearly with the number of tags. We design a fast and scalable counterfeits estimation scheme, INformative Counting (INC), which achieves sublinear authentication time and communication cost in batch verifications. The key novelty of INC is an authentication synopsis, a variant of the FM sketch, that captures key counting information using only sublinear space. With the help of this well-designed data structure, INC is able to provide authentication results with accurate estimates of the numbers of counterfeit and genuine tags, whereas previous batch authentication methods merely provide 0/1 results indicating the existence of counterfeits. We conduct detailed theoretical analysis and extensive experiments to examine this design, and the results show that INC significantly outperforms previous work in terms of effectiveness and efficiency.
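
For context, the textbook Flajolet-Martin (FM) sketch that such a synopsis builds on is easy to state: hash each item, record the position of the lowest set bit of the hash in a shared bitmap, and estimate cardinality from the lowest unset bit. The toy version below is illustrative only; it is not INC's authentication synopsis.

```python
# Textbook single-bitmap Flajolet-Martin sketch (illustrative only).
import hashlib

PHI = 0.77351  # Flajolet-Martin correction constant

def rho(h: int) -> int:
    """Index of the least-significant 1 bit of h (0-based)."""
    return (h & -h).bit_length() - 1 if h else 0

class FMSketch:
    def __init__(self) -> None:
        self.bitmap = 0  # bit r is set iff some item hashed to rho == r

    def add(self, item: str) -> None:
        h = int(hashlib.sha1(item.encode()).hexdigest(), 16)
        self.bitmap |= 1 << rho(h)

    def estimate(self) -> float:
        r = 0                      # find the lowest zero bit of the bitmap
        while (self.bitmap >> r) & 1:
            r += 1
        return 2 ** r / PHI

sketch = FMSketch()
for i in range(5000):
    sketch.add(f"tag-{i}")
# A single bitmap has high variance; real systems average many sketches.
print(f"estimated distinct tags: {sketch.estimate():.0f}")
```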

Relevance: 100.00%

Abstract:

OBJECTIVE: To investigate whether cost-related non-collection of prescription medication is associated with a decline in health. SETTINGS: New Zealand Survey of Family, Income and Employment (SoFIE)-Health. PARTICIPANTS: Data from 17 363 participants with at least two observations in three waves (2004-2005, 2006-2007, 2008-2009) of a panel study were analysed using fixed effects regression modelling. PRIMARY OUTCOME MEASURES: Self-rated health (SRH), physical health (PCS) and mental health scores (MCS) were the health measures used in this study. RESULTS: After adjusting for time-varying confounders, non-collection of prescription items was associated with a 0.11 (95% CI 0.07 to 0.15) unit worsening in SRH, a 1.00 (95% CI 0.61 to 1.40) unit decline in PCS and a 1.69 (95% CI 1.19 to 2.18) unit decline in MCS. The interaction of the main exposure with gender was significant for SRH and MCS. Non-collection of prescription items was associated with a decline in SRH of 0.18 (95% CI 0.11 to 0.25) units for males and 0.08 (95% CI 0.03 to 0.13) units for females, and a decrease in MCS of 2.55 (95% CI 1.67 to 3.42) and 1.29 (95% CI 0.70 to 1.89) units for males and females, respectively. The interaction of the main exposure with age was significant for SRH. For respondents aged 15-24 and 25-64 years, non-collection of prescription items was associated with a decline in SRH of 0.12 (95% CI 0.03 to 0.21) and 0.12 (95% CI 0.07 to 0.17) units, respectively, but for respondents aged 65 years and over, non-collection of prescription items had no significant effect on SRH. CONCLUSION: Our results show that those who do not collect prescription medications because of cost have an increased risk of a subsequent decline in health.
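
For readers unfamiliar with the method, the fixed-effects (within) estimator used in such panel analyses removes each participant's time-invariant characteristics by demeaning. A minimal sketch with synthetic data follows; the variable names and the simulated effect size are illustrative, not the SoFIE-Health data.

```python
# Fixed-effects (within) estimator on a synthetic panel, numpy only.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_waves = 200, 3
person = np.repeat(np.arange(n_people), n_waves)
x = rng.binomial(1, 0.2, n_people * n_waves).astype(float)  # e.g. non-collection
alpha = rng.normal(0, 1, n_people)[person]                  # person fixed effects
y = alpha - 0.11 * x + rng.normal(0, 0.5, x.size)           # e.g. self-rated health

def demean_within(v, ids):
    """Subtract each person's mean (the within transformation)."""
    means = np.bincount(ids, weights=v) / np.bincount(ids)
    return v - means[ids]

x_w, y_w = demean_within(x, person), demean_within(y, person)
beta = (x_w @ y_w) / (x_w @ x_w)   # OLS on demeaned data = FE estimator
print(f"within estimate of the exposure effect: {beta:.3f}")
```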

Relevance: 100.00%

Abstract:

To tackle challenges in circuit-level and system-level VLSI and embedded system design, this dissertation proposes several novel algorithms for exploring efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, along with the first algorithmic framework for transceiver insertion in optical interconnects. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimizes system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart home area to improve health, wellbeing, and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy consumption scheduling for smart homes: the extended scheme schedules household appliances so as to minimize the customer's monetary expense under a time-varying pricing model.
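
The appliance-scheduling idea in the last sentence can be illustrated with a toy example: given an hourly time-varying price, each shiftable appliance is placed in the contiguous slot within its allowed window that minimizes its cost. The prices, appliances, and windows below are made up; the dissertation's actual scheme is more general.

```python
# Toy appliance scheduling under a time-varying price (illustrative).
price = [0.30, 0.28, 0.12, 0.10, 0.11, 0.25, 0.32, 0.35]  # $/kWh per hour

# (name, power_kW, duration_h, earliest_start, latest_finish)
appliances = [("washer", 0.5, 2, 0, 8), ("dishwasher", 1.2, 1, 2, 8)]

def cheapest_start(power, dur, lo, hi):
    """Exhaustively pick the contiguous start hour with minimum cost."""
    costs = {s: power * sum(price[s:s + dur]) for s in range(lo, hi - dur + 1)}
    return min(costs, key=costs.get)

total = 0.0
for name, kw, dur, lo, hi in appliances:
    s = cheapest_start(kw, dur, lo, hi)
    cost = kw * sum(price[s:s + dur])
    total += cost
    print(f"{name}: start hour {s}, cost ${cost:.2f}")
print(f"total expense: ${total:.2f}")
```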

Relevance: 100.00%

Abstract:

This study presents a computational parametric analysis of DME steam reforming in a large-scale Circulating Fluidized Bed (CFB) reactor. The Computational Fluid Dynamics (CFD) model used, which is based on an Eulerian-Eulerian dispersed-flow formulation, was developed and validated in Part I of this study [1]. The effects of the reactor inlet configuration, gas residence time, inlet temperature, and steam-to-DME ratio on the overall reactor performance and products have all been investigated. The results show that a double-sided solid feeding system yields a remarkable improvement in flow uniformity, but has limited effect on the reactions and products. Temperature was found to play the dominant role in increasing DME conversion and hydrogen yield. According to the parametric analysis, it is recommended to run the CFB reactor at around 300 °C inlet temperature, a steam-to-DME molar ratio of 5.5, a gas residence time of 4 s, and a space velocity of 37,104 ml gcat⁻¹ h⁻¹. At these conditions, the DME conversion and the hydrogen molar concentration in the product gas were both found to be around 80%.
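
For reference, the overall stoichiometry commonly assumed for DME steam reforming (hydrolysis to methanol followed by methanol steam reforming) is sketched below; the specific kinetics used in the CFD model are those given in Part I [1].

```latex
% Commonly assumed DME steam-reforming stoichiometry:
\begin{aligned}
\mathrm{CH_3OCH_3} + \mathrm{H_2O} &\rightarrow 2\,\mathrm{CH_3OH},\\
\mathrm{CH_3OH} + \mathrm{H_2O} &\rightarrow \mathrm{CO_2} + 3\,\mathrm{H_2},\\
\text{overall: } \mathrm{CH_3OCH_3} + 3\,\mathrm{H_2O} &\rightarrow 2\,\mathrm{CO_2} + 6\,\mathrm{H_2}.
\end{aligned}
```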

Relevance: 100.00%

Abstract:

Some color centers in diamond can serve as quantum bits that can be manipulated with microwave pulses and read out with a laser, even at room temperature. However, the photon collection efficiency of bulk diamond is greatly reduced by refraction at the diamond/air interface. To address this issue, we fabricated arrays of diamond nanostructures, differing in both diameter and top-end shape, with HSQ and Cr as the etching mask materials, aiming toward large-scale fabrication of single-photon sources with enhanced collection efficiency based on nitrogen-vacancy (NV) centers embedded in diamond. With a mixture of O2 and CHF3 gas plasma, diamond pillars with diameters down to 45 nm were obtained, and the evolution of the top-end shape was captured by a simple model. Size-dependent single-photon measurements confirmed a collection-efficiency enhancement of more than tenfold, and, as expected, a mild decrease of decoherence time with decreasing pillar diameter was observed. These results provide useful information for future applications of nanostructured diamond as a single-photon source.

Relevance: 100.00%

Abstract:

The spread of antibiotic resistance among bacteria responsible for nosocomial and community-acquired infections creates an urgent need for novel therapeutic or prophylactic targets and for innovative pathogen-specific antibacterial compounds. Major challenges are posed by opportunistic pathogens belonging to the low-GC% gram-positive bacteria. Among these, Enterococcus faecalis is a leading cause of hospital-acquired infections associated with life-threatening complications and increased hospital costs. To better understand the molecular properties of enterococci that may be required for virulence, and that may explain the emergence of these bacteria in nosocomial infections, we performed the first large-scale functional analysis of E. faecalis V583, the first vancomycin-resistant isolate from a human bloodstream infection. E. faecalis V583 belongs to the high-risk clonal complex 2 group, which comprises mostly isolates derived from hospital infections worldwide. We conducted broad-range screenings of candidate genes likely involved in host adaptation (e.g., colonization and/or virulence). For this purpose, a library was constructed of targeted insertion mutations in 177 genes encoding putative surface or stress-response factors. Individual mutants were subsequently tested for their i) resistance to oxidative stress, ii) antibiotic resistance, iii) resistance to opsonophagocytosis, iv) adherence to human colon carcinoma Caco-2 epithelial cells, and v) virulence in a surrogate insect model. Our results identified a number of factors involved in the interaction between enterococci and their host environments. Their predicted functions highlight the importance of cell-envelope glycopolymers in E. faecalis host adaptation. This study provides a valuable genetic database for understanding the steps leading E. faecalis to opportunistic virulence.

Relevance: 100.00%

Abstract:

Strong convective events can produce extreme precipitation, hail, lightning, or gusts, potentially inducing severe socio-economic impacts. These events have a relatively small spatial extent and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events from large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and a high scale parameter value; a radius of 90 km around a station reporting a convective situation turned out to be suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability of extreme convective events is estimated from the KO index, indicating stability, and the relative humidity at 1000 hPa, both computed from ERA-Interim reanalysis. In a first version of the methodology, these two variables are used to estimate the spatial rainfall distribution and the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, in lightning measurements, and in severe weather reports. To take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented; while generally higher instability is found in the vicinity of fronts, the skill of the approach is largely unchanged. Additional improvements were achieved by bias correction and the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events, and two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions in the period 1979-2014, initialized every year. Decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after initialization.
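
A minimal sketch of the Weibull diagnostic described above, assuming SciPy and synthetic rain rates rather than radar data: fit a two-parameter Weibull (location fixed at zero) and inspect the shape and scale values.

```python
# Fit a Weibull distribution to rain rates and read off shape/scale.
# Data are synthetic stand-ins, not radar observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# "Convective-like" sample: a few intense cells among many weak echoes,
# i.e. a heavy-tailed distribution (shape < 1).
rain = rng.weibull(0.6, size=2000) * 8.0   # mm/h within the 90 km radius

shape, loc, scale = stats.weibull_min.fit(rain, floc=0)  # fix location at 0
print(f"shape c = {shape:.2f}, scale = {scale:.2f} mm/h")
# Per the study, a low shape with a high scale flags a convective situation.
```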

Relevance: 100.00%

Abstract:

The Machine-to-Machine (M2M) paradigm enables machines (sensors, actuators, robots, and smart meter readers) to communicate with each other with little or no human intervention, and is a key enabling technology for cyber-physical systems (CPSs). This paper explores CPS beyond the M2M concept and looks at futuristic applications. Our vision is CPS with distributed actuation and in-network processing. We describe a few particular use cases that motivate the development of M2M communication primitives tailored to large-scale CPS. M2M communication has so far been considered in the literature only to a limited extent: existing work is based on small-scale M2M models and centralized solutions, different sources discuss different primitives, and the few existing decentralized solutions do not scale well. There is a need to design M2M communication primitives that scale to thousands and even trillions of M2M devices without sacrificing solution quality. The main paradigm shift is to design localized algorithms, in which CPS nodes make decisions based on local knowledge. Localized coordination and communication in networked robotics, for matching events and robots, is studied to illustrate these new directions.
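
A toy sketch of a localized algorithm in the stated sense: each robot decides using only the events inside its own sensing radius, with no global coordinator. The geometry, radius, and claiming mechanism below are illustrative assumptions, not the paper's primitives.

```python
# Localized event-robot matching: decisions from local knowledge only.
import math

robots = {"r1": (0.0, 0.0), "r2": (5.0, 5.0), "r3": (9.0, 1.0)}
events = {"e1": (1.0, 1.0), "e2": (6.0, 4.0), "e3": (9.5, 0.5)}
RADIUS = 3.0  # local knowledge horizon

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

claimed = set()
for rid, rpos in robots.items():            # each robot runs this locally
    visible = [(dist(rpos, epos), eid)      # only events within RADIUS
               for eid, epos in events.items()
               if dist(rpos, epos) <= RADIUS and eid not in claimed]
    if visible:
        _, eid = min(visible)
        claimed.add(eid)                    # claim announced by local broadcast
        print(f"{rid} -> {eid}")
    else:
        print(f"{rid} -> no event in range")
```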

Relevance: 100.00%

Abstract:

In recent decades we have seen enormous increases in the capabilities of software-intensive systems, resulting in exponential growth in their size and complexity. Software and systems engineers routinely develop systems with advanced functionality that would not even have been conceived of 20 years ago. This observation was highlighted in the Critical Code report commissioned by the US Department of Defense in 2010, which identified as a critical software engineering challenge the ability to deliver “software assurance in the presence of...architectural innovation and complexity, criticality with respect to safety, (and) overall complexity and scale”.

Relevance: 100.00%

Abstract:

Software quality management (SQM) is the collection of all processes that ensure that software products, services, and life-cycle process implementations meet organizational software quality objectives and achieve stakeholder satisfaction. SQM comprises three basic subcategories: software quality planning, software quality assurance (SQA), and software quality control together with software process improvement. This chapter provides a general overview of the SQA domain and discusses the related concepts. A conceptual model for a software quality framework is presented together with current approaches to SQA. The chapter concludes with some of the identified present and future challenges regarding SQA.