929 results for Large-scale Distribution


Relevance:

100.00%

Publisher:

Abstract:

The problem of unsupervised anomaly detection arises in a wide variety of practical applications. While one-class support vector machines have demonstrated their effectiveness as an anomaly detection technique, their ability to model large datasets is limited by the memory and time complexity of training. To address this issue for supervised learning of kernel machines, there has been growing interest in random projection methods as an alternative to the computationally expensive kernel matrix construction and support vector optimisation. In this paper we leverage the theory of nonlinear random projections and propose the Randomised One-class SVM (R1SVM), an efficient and scalable anomaly detection technique that can be trained on large-scale datasets. Our empirical analysis on several real-life and synthetic datasets shows that our randomised 1SVM algorithm achieves accuracy comparable or superior to deep autoencoders and traditional kernelised approaches for anomaly detection, while being approximately 100 times faster in training and testing.
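R1SVM's exact formulation is not reproduced here, but the core idea — mapping data through a nonlinear random projection (random Fourier features) so that a cheap linear model can score anomalies — can be sketched as follows; the centroid-distance score and all parameter values are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, W, b):
    """Nonlinear random projection: random Fourier features
    approximating an RBF kernel."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Hypothetical data: inliers around the origin plus one distant outlier.
X_train = rng.normal(0.0, 1.0, size=(500, 2))
X_test = np.vstack([rng.normal(0.0, 1.0, size=(5, 2)), [[8.0, 8.0]]])

gamma, D = 0.5, 200                          # RBF width and projection dimension
W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(2, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

# Linear "one-class" surrogate: distance to the centroid in feature space.
center = rff_map(X_train, W, b).mean(axis=0)
scores = np.linalg.norm(rff_map(X_test, W, b) - center, axis=1)
# The last test point (the outlier) should receive the largest anomaly score.
```

Because the projection is fixed after sampling W and b, training reduces to linear-time operations, which is the source of the speed-up the abstract describes.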

Many algorithms have been introduced to deterministically authenticate Radio Frequency Identification (RFID) tags, while little work has been done to address the scalability issue in batch authentication. Deterministic approaches verify tags one by one, so the communication overhead and time cost grow linearly with the number of tags. We design a fast and scalable counterfeit estimation scheme, INformative Counting (INC), which achieves sublinear authentication time and communication cost in batch verification. The key novelty of INC is an FM-Sketch-variant authentication synopsis that captures key counting information using only sublinear space. With the help of this well-designed data structure, INC provides authentication results with accurate estimates of the numbers of counterfeit and genuine tags, whereas previous batch authentication methods merely return a 0/1 result indicating the existence of counterfeits. We conduct detailed theoretical analysis and extensive experiments to examine this design, and the results show that INC significantly outperforms previous work in terms of effectiveness and efficiency.
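The paper describes its synopsis only as an FM-Sketch variant; for intuition, a plain Flajolet-Martin sketch — which estimates the number of distinct items (here, tag IDs) in sublinear space — can be sketched as follows (the hash choice and parameters are our assumptions):

```python
import hashlib

def _rho(h):
    """1-based position of the least-significant set bit of h."""
    return (h & -h).bit_length()

class FMSketch:
    """Classic Flajolet-Martin sketch: one bitmap per hash function;
    the lowest unset bit position tracks log2 of the cardinality."""
    PHI = 0.77351  # FM correction constant

    def __init__(self, num_maps=64):
        self.num_maps = num_maps
        self.bitmaps = [0] * num_maps

    def add(self, item):
        for i in range(self.num_maps):
            digest = hashlib.blake2b(f"{i}:{item}".encode(), digest_size=8).digest()
            h = int.from_bytes(digest, "big")
            self.bitmaps[i] |= 1 << (_rho(h) - 1)

    def estimate(self):
        # Average the index of the lowest zero bit across bitmaps.
        total = 0
        for bm in self.bitmaps:
            r = 0
            while bm & (1 << r):
                r += 1
            total += r
        return (2 ** (total / self.num_maps)) / self.PHI

sk = FMSketch()
for tag_id in range(1000):
    sk.add(tag_id)
est = sk.estimate()   # close to the true count of 1000 distinct tags
```

The sketch uses O(num_maps · log n) bits regardless of how many tags are inserted, which is the sublinear-space property the abstract relies on.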

OBJECTIVE: To investigate whether cost-related non-collection of prescription medication is associated with a decline in health. SETTINGS: New Zealand Survey of Family, Income and Employment (SoFIE)-Health. PARTICIPANTS: Data from 17 363 participants with at least two observations in three waves (2004-2005, 2006-2007, 2008-2009) of a panel study were analysed using fixed effects regression modelling. PRIMARY OUTCOME MEASURES: Self-rated health (SRH), physical health (PCS) and mental health scores (MCS) were the health measures used in this study. RESULTS: After adjusting for time-varying confounders, non-collection of prescription items was associated with a 0.11 (95% CI 0.07 to 0.15) unit worsening in SRH, a 1.00 (95% CI 0.61 to 1.40) unit decline in PCS and a 1.69 (95% CI 1.19 to 2.18) unit decline in MCS. The interaction of the main exposure with gender was significant for SRH and MCS. Non-collection of prescription items was associated with a decline in SRH of 0.18 (95% CI 0.11 to 0.25) units for males and 0.08 (95% CI 0.03 to 0.13) units for females, and a decrease in MCS of 2.55 (95% CI 1.67 to 3.42) and 1.29 (95% CI 0.70 to 1.89) units for males and females, respectively. The interaction of the main exposure with age was significant for SRH. For respondents aged 15-24 and 25-64 years, non-collection of prescription items was associated with a decline in SRH of 0.12 (95% CI 0.03 to 0.21) and 0.12 (95% CI 0.07 to 0.17) units, respectively, but for respondents aged 65 years and over, non-collection of prescription items had no significant effect on SRH. CONCLUSION: Our results show that those who do not collect prescription medications because of cost have an increased risk of a subsequent decline in health.
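The fixed-effects (within) estimator used in panel studies such as this one removes time-invariant individual confounders by demeaning each person's observations; a minimal sketch on simulated panel data (all numbers hypothetical, not the SoFIE data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated panel: outcome y = a_i + beta * x + noise, with
# person-specific intercepts a_i (unobserved confounders).
n_people, n_waves, beta = 200, 3, -1.0
person = np.repeat(np.arange(n_people), n_waves)
a = rng.normal(0.0, 5.0, n_people)[person]        # individual effects
x = rng.integers(0, 2, n_people * n_waves).astype(float)
y = a + beta * x + rng.normal(0.0, 0.5, n_people * n_waves)

def demean_by_person(v, groups):
    """Within transformation: subtract each person's mean,
    removing all time-invariant confounders."""
    sums = np.bincount(groups, weights=v)
    counts = np.bincount(groups)
    return v - (sums / counts)[groups]

xd, yd = demean_by_person(x, person), demean_by_person(y, person)
beta_hat = (xd @ yd) / (xd @ xd)                  # OLS on demeaned data
```

The estimate recovers beta despite the large, correlated individual intercepts, illustrating why fixed-effects modelling is suited to repeated observations of the same respondents.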

To tackle circuit-level and system-level challenges in VLSI and embedded system design, this dissertation proposes several novel algorithms to explore efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, together with the first transceiver insertion algorithmic framework for optical interconnect. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimises system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart home area for improving health, wellbeing and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy consumption scheduling for smart homes. The extended scheme schedules household appliances so as to minimise the customer's monetary expense under a time-varying pricing model.
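As an illustration of the smart-home extension's goal — minimising monetary expense under time-varying prices — a naive greedy scheduler (our sketch, not the dissertation's algorithm) might place each shiftable appliance's contiguous run in the cheapest price window:

```python
# Hypothetical hourly prices and appliance specs; not from the dissertation.
prices = [0.30, 0.28, 0.12, 0.10, 0.11, 0.25, 0.32, 0.35]   # $/kWh

def cheapest_window(prices, duration):
    """Start hour of the cheapest contiguous run of `duration` hours."""
    return min(range(len(prices) - duration + 1),
               key=lambda s: sum(prices[s:s + duration]))

appliances = {"dishwasher": (2, 1.2), "washer": (1, 0.5)}    # (hours, kW)
schedule = {}
for name, (hours, kw) in appliances.items():
    start = cheapest_window(prices, hours)
    cost = kw * sum(prices[start:start + hours])
    schedule[name] = (start, cost)   # both runs land in the cheap mid-day hours
```

A real scheme would also handle appliance deadlines, interruptible loads and aggregate power limits; this sketch only shows the cost-minimisation objective.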

This study presents a computational parametric analysis of DME steam reforming in a large-scale Circulating Fluidized Bed (CFB) reactor. The Computational Fluid Dynamics (CFD) model used, which is based on Eulerian-Eulerian dispersed flow, was developed and validated in Part I of this study [1]. The effects of the reactor inlet configuration, gas residence time, inlet temperature and steam-to-DME ratio on the overall reactor performance and products have all been investigated. The results show that the use of a double-sided solid feeding system gives a remarkable improvement in flow uniformity, but with limited effect on the reactions and products. Temperature was found to play the dominant role in increasing DME conversion and hydrogen yield. According to the parametric analysis, it is recommended to run the CFB reactor at around 300 °C inlet temperature, a steam-to-DME molar ratio of 5.5, a gas residence time of 4 s and a space velocity of 37,104 ml gcat⁻¹ h⁻¹. At these conditions, the DME conversion and the hydrogen molar concentration in the product gas were both found to be around 80%.
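For reference, the overall DME steam reforming stoichiometry is CH3OCH3 + 3 H2O → 6 H2 + 2 CO2, so product composition at a given conversion follows from a simple mole balance; the sketch below is a stoichiometric check only, not the CFD model:

```python
# Mole balance for overall DME steam reforming:
#   CH3OCH3 + 3 H2O -> 6 H2 + 2 CO2
def product_composition(conversion, steam_to_dme):
    """H2 mole fractions (dry and wet basis) per mole of DME fed."""
    dme = 1.0 - conversion                    # unreacted DME
    h2o = steam_to_dme - 3.0 * conversion     # unreacted steam
    h2, co2 = 6.0 * conversion, 2.0 * conversion
    return {"H2_dry": h2 / (dme + h2 + co2),
            "H2_wet": h2 / (dme + h2o + h2 + co2)}

# Conditions from the abstract: ~80% conversion, steam/DME ratio 5.5.
comp = product_composition(conversion=0.8, steam_to_dme=5.5)
```

At 80% conversion the dry-basis H2 fraction evaluates to roughly 73%; the abstract's ~80% figure comes from the full CFD model, which this idealised balance does not capture.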

Some color centers in diamond can serve as quantum bits that can be manipulated with microwave pulses and read out with a laser, even at room temperature. However, the photon collection efficiency of bulk diamond is greatly reduced by refraction at the diamond/air interface. To address this issue, we fabricated arrays of diamond nanostructures, differing in both diameter and top-end shape, with HSQ and Cr as the etching mask materials, aiming toward large-scale fabrication of single-photon sources with enhanced collection efficiency made of nitrogen-vacancy (NV) embedded diamond. With a mixture of O2 and CHF3 gas plasma, diamond pillars with diameters down to 45 nm were obtained. The top-end shape evolution has been represented with a simple model. Tests of the size-dependent single-photon properties confirmed a collection efficiency enhancement of more than tenfold, and, as expected, a mild decrease of the decoherence time with decreasing pillar diameter was observed. These results provide useful information for future applications of nanostructured diamond as a single-photon source.
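The refraction problem can be quantified with a back-of-the-envelope calculation: with diamond's refractive index of about 2.4, only photons striking the diamond/air interface within the critical angle escape, which for an isotropic emitter is only a few percent of the emitted light (illustrative numbers, not the paper's measured values):

```python
import math

n_diamond = 2.4                        # refractive index of diamond
# Snell's law: total internal reflection beyond the critical angle.
theta_c = math.asin(1.0 / n_diamond)   # ~24.6 degrees

# Solid-angle fraction of one escape cone for an isotropic emitter.
escape_fraction = (1.0 - math.cos(theta_c)) / 2.0   # roughly 4.5%
```

This is why nanostructuring that guides light toward the collection optics can yield the order-of-magnitude enhancement reported in the abstract.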

The spread of antibiotic resistance among bacteria responsible for nosocomial and community-acquired infections creates an urgent need for novel therapeutic or prophylactic targets and for innovative pathogen-specific antibacterial compounds. Major challenges are posed by opportunistic pathogens belonging to the low-GC% Gram-positive bacteria. Among those, Enterococcus faecalis is a leading cause of hospital-acquired infections associated with life-threatening complications and increased hospital costs. To better understand the molecular properties of enterococci that may be required for virulence, and that may explain the emergence of these bacteria in nosocomial infections, we performed the first large-scale functional analysis of E. faecalis V583, the first vancomycin-resistant isolate from a human bloodstream infection. E. faecalis V583 belongs to the high-risk clonal complex 2 group, which comprises mostly isolates derived from hospital infections worldwide. We conducted broad-range screenings of candidate genes likely involved in host adaptation (e.g., colonization and/or virulence). For this purpose, a library was constructed of targeted insertion mutations in 177 genes encoding putative surface or stress-response factors. Individual mutants were subsequently tested for their i) resistance to oxidative stress, ii) antibiotic resistance, iii) resistance to opsonophagocytosis, iv) adherence to human colon carcinoma Caco-2 epithelial cells and v) virulence in a surrogate insect model. Our results identified a number of factors that are involved in the interaction between enterococci and their host environments. Their predicted functions highlight the importance of cell envelope glycopolymers in E. faecalis host adaptation. This study provides a valuable genetic database for understanding the steps leading E. faecalis to opportunistic virulence.

The Machine-to-Machine (M2M) paradigm enables machines (sensors, actuators, robots, and smart meter readers) to communicate with each other with little or no human intervention. M2M is a key enabling technology for cyber-physical systems (CPSs). This paper explores CPS beyond the M2M concept and looks at futuristic applications. Our vision is CPS with distributed actuation and in-network processing. We describe a few particular use cases that motivate the development of M2M communication primitives tailored to large-scale CPS. M2M communication has so far been considered only to a limited extent in the literature: existing work is based on small-scale M2M models and centralized solutions, different sources discuss different primitives, and the few existing decentralized solutions do not scale well. There is a need to design M2M communication primitives that will scale to thousands and even trillions of M2M devices without sacrificing solution quality. The main paradigm shift is to design localized algorithms, where CPS nodes make decisions based on local knowledge. Localized coordination and communication in networked robotics, for matching events and robots, were studied to illustrate the new directions.
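As a toy illustration of a localized algorithm for matching events and robots (our sketch, not a primitive from the paper), each robot could claim the nearest unclaimed event within its own sensing radius, deciding purely from local knowledge rather than via a centralized assignment:

```python
import math

# Hypothetical positions; RADIUS is an assumed local sensing range.
robots = {"r1": (0.0, 0.0), "r2": (10.0, 0.0), "r3": (5.0, 9.0)}
events = {"e1": (1.0, 1.0), "e2": (9.0, 1.0)}
RADIUS = 4.0

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

assignment = {}
for rid, rpos in robots.items():
    # Each robot sees only events within its radius (local knowledge).
    visible = [(dist(rpos, epos), eid) for eid, epos in events.items()
               if dist(rpos, epos) <= RADIUS and eid not in assignment.values()]
    if visible:                       # claim the nearest unclaimed event
        assignment[rid] = min(visible)[1]
```

In a real deployment the "unclaimed" check would itself be a local message exchange between neighbours; the point is that no node ever needs the global event list, which is what makes such schemes scale.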

In recent decades we have seen enormous increases in the capabilities of software-intensive systems, resulting in exponential growth in their size and complexity. Software and systems engineers routinely develop systems with advanced functionalities that would not even have been conceived of 20 years ago. This observation was highlighted in the Critical Code report commissioned by the US Department of Defense in 2010, which identified a critical software engineering challenge as the ability to deliver “software assurance in the presence of...architectural innovation and complexity, criticality with respect to safety, (and) overall complexity and scale”.

Software quality management (SQM) is the collection of all processes that ensure that software products, services, and life cycle process implementations meet organizational software quality objectives and achieve stakeholder satisfaction. SQM comprises three basic subcategories: software quality planning; software quality assurance (SQA); and software quality control and software process improvement. This chapter provides a general overview of the SQA domain and discusses the related concepts. A conceptual model for a software quality framework is presented together with current approaches to SQA. The chapter concludes with some of the identified current and future challenges regarding SQA.

Land-stewardship programmes are a major focus of investment by governments for conserving biodiversity in agricultural landscapes. These programmes are generally large-scale (e.g. >1000 km), spanning multiple biogeographic regions, but are developed using spatially limited (e.g. landscape-scale; <100 km) ecological data interpolated across broad areas for one, or a few, well-studied taxonomic groups. Information about how less-studied taxa respond to regional differences in management and environmental effects has the potential to further inform land-stewardship conservation programmes, but suitable data sets are rarely available. In this study, we sought to enhance the planning of large-scale conservation programmes by quantifying relationships between reptile assemblages and key environmental attributes at regional scales within a large-scale (>172 000 km2) Australian land-stewardship programme. Using 234 remnant woodland monitoring sites spanning four distinct biogeographic regions, we asked: do reptile assemblages show different environmental associations across biogeographically distinct regions? We found that the environmental features important to reptile diversity differed in each region. Abundance and rare-species richness of reptiles responded at regional scales to elevation, native groundcover and aspect. We identified four implications of our study: (1) large-scale conservation schemes can achieve better outcomes for reptiles using regional-scale knowledge of environmental associations; (2) regional-scale knowledge is particularly valuable for the conservation of rare reptile taxa; (3) consideration of abiotic environmental features that cannot be directly managed (e.g. aspect, elevation) is important; (4) programmes can be tailored to better support reptile groups at higher conservation risk. Our study shows that reptile-environment associations differ among biogeographic regions, which presents an opportunity for tailoring stronger policy and management strategies for conserving large-scale agricultural landscapes globally.

Context: Edge effects due to habitat loss and fragmentation have pervasive impacts on many natural ecosystems worldwide. Objective: We aimed to explore whether, in tandem with the resource-based model of edge effects, species feeding guild and flight capacity can help explain species responses to an edge. Methods: We used a two-sided edge gradient that extended from 1000 m into native Eucalyptus forest to 316 m into an exotic pine plantation. We used generalised additive models to examine the continuous responses of beetle species, feeding-guild species richness and flight-capability group species richness to the edge gradient and environmental covariates. Results: Phytophagous species richness was directly related to variation in vegetation along the edge gradient. There were more flight-capable species in the Eucalyptus forest and more flightless species in the exotic pine plantation. Many individual species exhibited multiple-peaked edge profiles. Conclusions: The resource-based model of edge effects can be used in tandem with traits such as feeding guild and flight capacity to understand drivers of large-scale edge responses. Some trait groups show generalisable responses that can be linked with drivers such as vegetation richness and habitat structure. Many trait-group responses, however, are less generalisable and not explained by easily measured habitat variables. Difficulties in linking traits with resources along the edge could be due to unmeasured variation and indirect effects. Some species' responses reached the limits of the edge gradient, demonstrating the need to examine edge effects at large scales, such as kilometres.

Network traffic analysis has been one of the most crucial techniques for managing a large-scale IP backbone network. Despite its importance, large-scale network traffic monitoring suffers from technical and commercial obstacles to obtaining precise network traffic data. Although network traffic estimation has been the most prevalent technique for acquiring network traffic, it still has a great number of problems that need solving. As the scale of our networks grows, the ill-posed nature of the network traffic estimation problem worsens. Moreover, the statistical features of network traffic have changed greatly with current network architectures and applications. Motivated by this, in this paper we propose a network traffic prediction method and a network traffic estimation method. We first use a deep learning architecture to explore the dynamic properties of network traffic, and then propose a novel network traffic prediction approach based on a deep belief network. We further propose a network traffic estimation method utilizing the deep belief network via link counts and routing information. We validate the effectiveness of our methodologies with real data sets from the Abilene and GÉANT backbone networks.
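The ill-posed nature of traffic estimation is easy to see in the classical linear model, where observed link counts y relate to unknown origin-destination flows x through the routing matrix A (y = Ax), with far fewer links than flows; the sketch below uses a naive pseudo-inverse baseline (our illustration, not the paper's deep-belief-network method):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical network: 12 origin-destination flows, only 6 measured links.
n_flows, n_links = 12, 6
A = (rng.random((n_links, n_flows)) < 0.4).astype(float)  # routing matrix
x_true = rng.exponential(10.0, n_flows)                   # unknown OD traffic
y = A @ x_true                                            # observed link counts

x_hat = np.linalg.pinv(A) @ y          # minimum-norm least-squares solution
residual = np.linalg.norm(A @ x_hat - y)
# x_hat reproduces the link counts exactly, yet differs from x_true:
# many traffic matrices are consistent with the same measurements.
```

This underdetermination is precisely why the estimation problem needs extra structure, whether classical statistical priors or the learned model the paper proposes.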

Aim: Positive regional correlations between biodiversity and human population have been detected for several taxonomic groups and geographical regions. Such correlations could have important conservation implications and have mainly been attributed to ecological factors, with little testing for an artefactual explanation: more populated regions may show higher biodiversity because they are more thoroughly surveyed. We tested the hypothesis that the correlation between people and herptile diversity in Europe is influenced by survey effort.

The increasing integration of renewable energies into the electricity grid contributes considerably to achieving the European Union's goals on energy and Greenhouse Gas (GHG) emissions reduction. However, it also brings problems for grid management. Large-scale energy storage can provide the means for better integration of renewable energy sources, for balancing supply and demand, for increasing energy security, for better management of the grid, and for converging towards a low-carbon economy. Geological formations have the potential to store large volumes of fluids with minimal impact on the environment and society. One way to achieve large-scale energy storage is to use the storage capacity of geological reservoirs. In fact, there are several viable technologies for underground energy storage, as well as several types of underground reservoirs that can be considered. The geological energy storage technologies considered in this research were: Underground Gas Storage (UGS), Hydrogen Storage (HS), Compressed Air Energy Storage (CAES), Underground Pumped Hydro Storage (UPHS) and Thermal Energy Storage (TES). For these different types of underground energy storage technology there are several types of geological reservoir that can be suitable, namely: depleted hydrocarbon reservoirs, aquifers, salt formations and caverns, engineered rock caverns and abandoned mines. Specific site-screening criteria are applicable to each of these reservoir types and technologies, which determine the viability of the reservoir itself, and of the technology, for any particular site. This paper presents a review of the criteria applied in the scope of the Portuguese contribution to the EU-funded project ESTMAP – Energy Storage Mapping and Planning.