890 results for robust tori


Relevance: 20.00%

Abstract:

The data streaming model provides an attractive framework for one-pass summarization of massive data sets at a single observation point. However, in an environment where multiple data streams arrive at a set of distributed observation points, sketches must be computed remotely and then aggregated through a hierarchy before queries may be conducted. As a result, many sketch-based methods for the single-stream case do not apply directly, either because the error introduced becomes large or because the methods assume that the streams are non-overlapping. These limitations hinder the application of these techniques to practical problems in network traffic monitoring and aggregation in sensor networks. To address this, we develop a general framework for evaluating and enabling robust computation of duplicate-sensitive aggregate functions (e.g., SUM and QUANTILE) over data produced by distributed sources. We instantiate our approach by augmenting the Count-Min and Quantile-Digest sketches to apply in this distributed setting, and analyze their performance. We conclude with an experimental evaluation to validate our analysis.
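The baseline Count-Min structure that the paper augments can be sketched in a few lines. The version below is a plain, illustrative implementation (class and parameter names are our own, and the paper's duplicate-insensitive augmentation is not reproduced); the cell-wise `merge` shows why naive hierarchical aggregation double-counts overlapping streams:

```python
import hashlib

class CountMinSketch:
    """Minimal Count-Min sketch: depth hash rows of width counters.
    Point queries return overestimates with bounded error."""

    def __init__(self, width=256, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, item, row):
        # One independent-looking hash per row (illustrative choice).
        h = hashlib.sha256(f"{row}:{item}".encode()).hexdigest()
        return int(h, 16) % self.width

    def update(self, item, count=1):
        for row in range(self.depth):
            self.table[row][self._index(item, row)] += count

    def query(self, item):
        # The minimum across rows limits the overestimate.
        return min(self.table[row][self._index(item, row)]
                   for row in range(self.depth))

    def merge(self, other):
        # Cell-wise addition aggregates sketches from distributed
        # observation points -- but it double-counts items seen at
        # more than one point, which is exactly the duplicate-
        # sensitivity problem the paper addresses.
        for row in range(self.depth):
            for col in range(self.width):
                self.table[row][col] += other.table[row][col]
```

With non-overlapping streams the merged sketch answers SUM-style point queries correctly; with overlap, the merged counts inflate.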

Relevance: 20.00%

Abstract:

Directed self-assembly (DSA) of block copolymers (BCPs) is a prime candidate to further extend dimensional scaling of silicon integrated circuit features for the nanoelectronics industry. Top-down optical techniques employed for photoresist patterning are predicted to reach an endpoint due to diffraction limits. Additionally, the prohibitive costs of “fabs” and high-volume manufacturing tools are issues that have led to the search for alternative, complementary patterning processes. This thesis reports the fabrication of semiconductor features from nanoscale on-chip etch masks using “high χ” BCP materials. The fabrication of silicon and germanium nanofins via metal-oxide-enhanced BCP on-chip etch masks, which may be of importance for future fin field-effect transistor (FinFET) applications, is detailed.

Relevance: 20.00%

Abstract:

BACKGROUND: In a time-course microarray experiment, the expression level of each gene is observed across a number of time-points in order to characterize the temporal trajectories of the gene-expression profiles. For many of these experiments, the scientific aim is the identification of genes for which the trajectories depend on an experimental or phenotypic factor. There is an extensive recent body of literature on statistical methodology for addressing this analytical problem. Most of the existing methods are based on estimating the time-course trajectories using parametric or non-parametric mean regression methods. The sensitivity of these regression methods to outliers, an issue that is well documented in the statistical literature, should be of concern when analyzing microarray data. RESULTS: In this paper, we propose a robust testing method for identifying genes whose expression time profiles depend on a factor. Furthermore, we propose a multiple testing procedure to adjust for multiplicity. CONCLUSIONS: Through an extensive simulation study, we illustrate the performance of our method. Finally, we report the results from applying our method to a case study and discuss potential extensions.
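The outlier sensitivity of mean-based estimation that motivates the paper can be seen with a toy calculation (the replicate values are invented, and this is not the paper's test statistic, just the textbook mean-vs-median contrast):

```python
import statistics

# Expression of one gene at a single time-point across replicates;
# one replicate is a gross outlier (hypothetical numbers).
replicates = [2.1, 2.0, 2.2, 1.9, 14.0]

mean_est = statistics.mean(replicates)     # pulled toward the outlier
median_est = statistics.median(replicates) # robust location estimate

print(mean_est, median_est)  # 4.44 vs 2.1
```

A mean-regression trajectory fitted through such a point would be distorted at that time-point, while a robust estimator leaves the fitted trajectory essentially unchanged.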

Relevance: 20.00%

Abstract:

In this paper, we propose a framework for robust optimization that relaxes the standard notion of robustness by allowing the decision maker to vary the protection level in a smooth way across the uncertainty set. We apply our approach to the problem of maximizing the expected value of a payoff function when the underlying distribution is ambiguous and therefore robustness is relevant. Our primary objective is to develop this framework and relate it to the standard notion of robustness, which deals with only a single guarantee across one uncertainty set. First, we show that our approach connects closely to the theory of convex risk measures. We show that the complexity of this approach is equivalent to that of solving a small number of standard robust problems. We then investigate the conservatism benefits and downside probability guarantees implied by this approach and compare them to those of the standard robust approach. Finally, we illustrate the methodology on an asset allocation example consisting of historical market data over a 25-year investment horizon and find, in every case we explore, that relaxing standard robustness with soft robustness yields a seemingly favorable risk-return trade-off: each case results in a higher out-of-sample expected return for a relatively minor degradation of out-of-sample downside performance. © 2010 INFORMS.
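One concrete instance of the convex-risk-measure connection is the entropic risk measure, whose parameter acts as a smoothly varying protection level between the nominal expectation and the hard worst case. The scenario numbers below are invented for illustration, and this is a stand-in example rather than the paper's exact formulation:

```python
import math

# Payoffs of one decision in three scenarios, with a nominal
# distribution p0 (all numbers hypothetical).
payoffs = [1.0, 0.5, -0.2]
p0 = [0.5, 0.3, 0.2]

nominal = sum(p * x for p, x in zip(p0, payoffs))  # nominal expectation

# Standard (hard) robustness: one guarantee over one uncertainty set;
# with an unrestricted set this collapses to the worst-case payoff.
hard_robust = min(payoffs)

def certainty_equivalent(theta):
    # Entropic risk measure: -(1/theta) * log E_p0[exp(-theta * X)].
    # theta -> 0 recovers the nominal expectation; theta -> infinity
    # approaches the hard worst case, so theta smoothly dials the
    # protection level.
    return -math.log(sum(p * math.exp(-theta * x)
                         for p, x in zip(p0, payoffs))) / theta

soft = certainty_equivalent(2.0)
print(hard_robust, round(soft, 3), round(nominal, 3))
```

The soft value always lies between the hard worst-case guarantee and the nominal expectation, which is the trade-off the abstract describes.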

Relevance: 20.00%

Abstract:

Existing election algorithms suffer from limited scalability. This limit stems from their communication design, which in turn stems from their fundamentally two-state behaviour. This paper presents a new election algorithm specifically designed to be highly scalable in broadcast networks whilst allowing any processing node to become coordinator with initially equal probability. To achieve this, careful attention has been paid to the communication design, and an additional state has been introduced. The design of the tri-state election algorithm has been motivated by the requirements analysis of a major research project to deliver robust, scalable distributed applications, including load sharing, in hostile computing environments in which it is common for processing nodes to be rebooted frequently without notice. The new election algorithm is based in part on a simple 'emergent' design. The science of emergence is of great relevance to developers of distributed applications because it describes how higher-level self-regulatory behaviour can arise from many participants following a small set of simple rules. The tri-state election algorithm is shown to have very low communication complexity, in which the number of messages generated remains loosely bounded regardless of scale for large systems; to be highly scalable, because nodes in the idle state do not transmit any messages; and, because of its self-organising characteristics, to be very stable.
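The scalability argument — silent idle nodes plus a small self-selected candidate set — can be illustrated with a toy simulation. This is a hypothetical sketch of the broadcast-election idea, not the paper's actual protocol; the state names, probability, and tie-break rule are our own:

```python
import random

# Three states: idle nodes never transmit, so message count does not
# grow with system size; only candidates and the winner broadcast.
IDLE, CANDIDATE, COORDINATOR = "idle", "candidate", "coordinator"

def elect(node_ids, rng):
    """One election round over an idealised lossless broadcast medium."""
    node_ids = list(node_ids)
    # Each node independently stands with small probability, giving
    # every node an initially equal chance of becoming coordinator.
    candidates = [n for n in node_ids if rng.random() < 0.01]
    if not candidates:                    # retry degenerate rounds
        candidates = [rng.choice(node_ids)]
    messages = len(candidates)            # one broadcast per candidate
    winner = min(candidates)              # deterministic tie-break
    messages += 1                         # winner announces itself
    return winner, messages

rng = random.Random(42)
winner, msgs = elect(range(1000), rng)
print(winner, msgs)
```

Even with 1000 nodes, the message count is driven by the handful of candidates rather than the population size, which is the loosely bounded behaviour the abstract claims.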

Relevance: 20.00%

Abstract:

Natural distributed systems are adaptive, scalable and fault-tolerant. Emergence science describes how higher-level self-regulatory behaviour arises in natural systems from many participants following simple rulesets. Emergence advocates simple communication models, autonomy and independence, enhancing robustness and self-stabilization. High-quality distributed applications such as autonomic systems must satisfy the appropriate non-functional requirements, which include scalability, efficiency, robustness, low latency and stability. However, the traditional design of distributed applications, especially in terms of the communication strategies employed, can introduce compromises between these characteristics. This paper discusses ways in which emergence science can be applied to distributed computing, avoiding some of the compromises associated with traditionally designed applications. To demonstrate the effectiveness of this paradigm, an emergent election algorithm is described and its performance evaluated. The design incorporates nondeterministic behaviour. The resulting algorithm has very low communication complexity, and is simultaneously very stable, scalable and robust.

Relevance: 20.00%

Abstract:

The anticipated rewards of adaptive approaches will only be fully realised when autonomic algorithms can take configuration and deployment decisions that match and exceed those of human engineers. Such decisions are typically characterised as being based on a foundation of experience and knowledge. In humans, these underpinnings are themselves founded on the ashes of failure, the exuberance of courage and (sometimes) the outrageousness of fortune. In this paper we describe an application framework that will allow the incorporation of similarly risky, error prone and downright dangerous software artefacts into live systems – without undermining the certainty of correctness at application level. We achieve this by introducing the notion of application dreaming.

Relevance: 20.00%

Abstract:

A computational modelling approach, integrated with optimisation and statistical methods, that can aid the development of reliable and robust electronic packages and systems is presented. The design-for-reliability methodology is demonstrated for the design of a SiP structure. In this study the focus is on the procedure for representing the uncertainties in the package design parameters, their impact on the reliability and robustness of the package design, and how these can be included in the design optimisation modelling framework. The analysis of the thermo-mechanical behaviour of the package is conducted using non-linear transient finite element simulations. Key system responses of interest, the fatigue lifetime of the lead-free solder interconnects and the warpage of the package, are predicted and used subsequently for design purposes. The design task is to identify optimal SiP designs by varying several package input parameters so that the reliability and robustness of the package are improved while, at the same time, specified performance criteria are also satisfied.
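The core loop of such a robustness-aware optimisation — propagate parameter uncertainty through a response model, then score designs on both mean performance and variability — can be sketched with a one-parameter surrogate. The quadratic `warpage` model and all numbers below are invented stand-ins for the paper's finite element simulations:

```python
import random
import statistics

def warpage(x):
    # Hypothetical surrogate for package warpage as a function of one
    # design parameter x (e.g. a layer thickness); stands in for the
    # non-linear transient FE simulation.
    return (x - 2.0) ** 2 + 0.5 * x

def robust_score(x, rng, sigma=0.1, samples=200, k=3.0):
    # Monte Carlo propagation of manufacturing uncertainty in x, then
    # score the design by mean + k*std: a design is preferred only if
    # it performs well AND is insensitive to parameter variation.
    ys = [warpage(rng.gauss(x, sigma)) for _ in range(samples)]
    return statistics.mean(ys) + k * statistics.pstdev(ys)

rng = random.Random(0)
candidates = [round(1.0 + 0.1 * i, 1) for i in range(21)]  # grid 1.0..3.0
best = min(candidates, key=lambda x: robust_score(x, rng))
print(best)
```

Because the variability term penalises regions where the response is steep, the robust optimum can differ from the nominal optimum of the response alone.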

Relevance: 20.00%

Abstract:

The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating the technology risks under limited knowledge is a key factor and a major requirement in securing the successful development of new technologies. In order to address this challenge, a risk mitigation methodology that incorporates both qualitative and quantitative analysis is required. This paper outlines the methodology being developed under a major UK Grand Challenge project, 3D-Mintegration. The main focus is on identifying risks through identification of the product's key characteristics using a product breakdown approach. The assessment of the identified risks uses quantification and prioritisation techniques to evaluate and rank them. Traditional statistical process control, based on process capability and six sigma concepts, is applied to measure the process capability affected by the identified risks. This paper also details a numerical approach that can be used to undertake risk analysis, based on a computational framework in which modelling and statistical techniques are integrated. An example of a modelling and simulation technique is also given for focused ion beam processing, which is among the manufacturing processes investigated in the project.
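The process capability measurement mentioned above is conventionally expressed through the Cpk index, which relates the spread of a process to its specification limits. A minimal sketch (the sample widths and spec limits are hypothetical):

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: the distance from the process mean to
    the nearer specification limit, in units of three standard
    deviations. Six sigma quality corresponds to Cpk = 2 on a
    centred process."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical measured feature widths (um) against spec 9.0-11.0 um.
widths = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
print(round(cpk(widths, 9.0, 11.0), 2))
```

A low Cpk for a candidate process step flags it as a quantified risk; a high value indicates the step comfortably meets the product key characteristic.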

Relevance: 20.00%

Abstract:

This paper describes a methodology for deploying flexible dynamic configuration into embedded systems whilst preserving the reliability advantages of static systems. The methodology is based on the concept of decision points (DPs), which are strategically placed to achieve fine-grained distribution of self-management logic to meet application-specific requirements. DP logic can be changed easily, and independently of the host component, enabling self-management behaviour to be deferred beyond the point of system deployment. A transparent Dynamic Wrapper mechanism (DW) automatically detects and handles problems arising from the evaluation of self-management logic within each DP, so that the dynamic aspects of the system collapse down to statically defined default behaviour, preserving safety and correctness despite failures. Dynamic context management contributes to flexibility, and removes the need for design-time binding of context providers and consumers, thus facilitating run-time composition and incremental component upgrade.
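The decision-point-plus-wrapper pattern can be sketched compactly: replaceable self-management logic is evaluated inside a guard that collapses to a statically defined default on any failure. This is a hypothetical minimal reconstruction of the idea, not the paper's implementation:

```python
class DecisionPoint:
    """A decision point whose dynamic logic is guarded by a wrapper
    that falls back to a statically defined default behaviour."""

    def __init__(self, default):
        self._default = default   # static, known-safe behaviour
        self._logic = None        # dynamically deployed policy

    def set_logic(self, fn):
        # Logic can be replaced after deployment, independently of the
        # host component.
        self._logic = fn

    def decide(self, context):
        if self._logic is None:
            return self._default(context)
        try:
            return self._logic(context)
        except Exception:
            # Dynamic wrapper: any fault in the dynamic logic collapses
            # to the static default, preserving safety and correctness.
            return self._default(context)

dp = DecisionPoint(default=lambda ctx: "conservative")
dp.set_logic(lambda ctx: ctx["mode"])  # faults if "mode" is missing
print(dp.decide({"mode": "aggressive"}), dp.decide({}))
```

The host component only ever calls `decide`; whether the answer came from the dynamic policy or the static fallback is transparent to it.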

Relevance: 20.00%

Abstract:

In terms of a general time theory that addresses time-elements as typed point-based intervals, a formal characterization of time-series and state-sequences is introduced. Based on this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance of inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for larger time-series databases and real-life applications such as Content-based Video Retrieval (CBVR).
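The reduction to bipartite matching can be illustrated with a toy scorer: treat the states of two sequences as the two sides of a bipartite graph and take the best total weight of an injective assignment, which tolerates inversions because matching ignores order. The state similarity function and brute-force search below are our own simplifications, not the paper's hybrid model, and are only viable for very short sequences:

```python
import itertools

def state_sim(a, b):
    # Placeholder similarity between two states; the paper's model
    # combines temporal and non-temporal measurements here.
    return 1.0 if a == b else 0.0

def matching_similarity(seq_a, seq_b):
    """Normalised maximum-weight bipartite matching between the states
    of two sequences, found by brute force over assignments."""
    short, long_ = sorted((seq_a, seq_b), key=len)
    best = 0.0
    # Try every injective assignment of the shorter sequence's states
    # to positions in the longer one.
    for perm in itertools.permutations(range(len(long_)), len(short)):
        score = sum(state_sim(short[i], long_[j])
                    for i, j in enumerate(perm))
        best = max(best, score)
    return best / len(long_)

# An inverted order still matches perfectly, unlike a strict
# position-by-position alignment.
print(matching_similarity(["a", "b", "c"], ["c", "b", "a"]))  # 1.0
```

For realistic sequence lengths, the brute-force search would be replaced by a polynomial-time assignment algorithm such as the Hungarian method.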
