913 results for exception handling
Abstract:
This paper presents an algorithm for solid model reconstruction from 2D sectional views based on a volume-based approach. None of the existing work on automatic reconstruction from 2D orthographic views has addressed sectional views in detail. It is believed that the volume-based approach is better suited to handle different types of sectional views. The volume-based approach constructs the 3D solid by a boolean combination of elementary solids. The elementary solids are formed by a sweep operation on loops identified in the input views. The only adjustment to be made for the presence of sectional views is in the identification of the loops that form the elemental solids. In the algorithm, the conventions of engineering drawing for sectional views are used to identify the loops correctly. The algorithm is simple and intuitive in nature. Results have been obtained for full sections, offset sections and half sections. Future work will address other types of sectional views such as removed, revolved and broken-out sections. (C) 2004 Elsevier Ltd. All rights reserved.
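The sweep-then-combine idea in this abstract can be sketched in a few lines. The following is a toy voxel illustration only, not the paper's algorithm: a 2D loop from a view is extruded along an axis to form an elementary solid, and elementary solids are combined with boolean set operations. The loops, grid resolution and depth below are invented for illustration.

```python
def sweep(loop_cells, depth):
    """Extrude a set of 2D cells (x, y) into 3D voxels over z = 0..depth-1."""
    return {(x, y, z) for (x, y) in loop_cells for z in range(depth)}

# Two loops hypothetically identified in a front view: an outer boundary
# and an inner hole.
outer = {(x, y) for x in range(4) for y in range(4)}
hole = {(x, y) for x in range(1, 3) for y in range(1, 3)}

# Boolean combination: sweep each loop into an elementary solid, then
# subtract the hole's solid from the outer solid.
solid = sweep(outer, depth=2) - sweep(hole, depth=2)
print(len(solid))  # 4*4*2 - 2*2*2 = 24 voxels
```

Sectional views, per the abstract, only change which loops are picked for sweeping; the boolean machinery stays the same.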
Abstract:
The purpose of this article is to report the experience of designing and testing orifice plate-based flow measuring systems for evaluating air leakages in components of air conditioning systems. Two of the flow measuring stations were designed with beta values of 0.405 and 0.418. The third was a dual-path unit with orifice plates of beta values 0.613 and 0.525. The flow rates covered by all four paths ranged from 4 to 94 l/s, and the Reynolds numbers from 5600 to 76,000. The coefficients of discharge were evaluated and compared with the Stolz equation. Measured C-d values are generally higher than those obtained from the equation, the deviations being larger in the low Reynolds number region. Further, it is observed that a second-degree polynomial is inadequate to relate the pressure drop and flow rate. The lower Reynolds number limits set by the standards appear to be somewhat conservative.
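For readers unfamiliar with the discharge coefficient C-d being compared above, the standard orifice-plate relation can be evaluated directly. This is a generic back-of-envelope sketch, not the authors' test rig; the flow rate, pressure drop and orifice size below are made-up example numbers, with only the beta value taken from the abstract.

```python
import math

def discharge_coefficient(Q, dp, d_orifice, beta, rho):
    """Cd = Q * sqrt(1 - beta**4) / (A_orifice * sqrt(2*dp/rho)),
    relating measured volume flow Q (m^3/s) to pressure drop dp (Pa)."""
    area = math.pi * d_orifice ** 2 / 4.0
    return Q * math.sqrt(1.0 - beta ** 4) / (area * math.sqrt(2.0 * dp / rho))

# Example: 20 l/s of air through a 40 mm orifice, beta = 0.405, dp = 250 Pa.
Cd = discharge_coefficient(Q=0.020, dp=250.0, d_orifice=0.040,
                           beta=0.405, rho=1.2)
print(round(Cd, 3))
```

Comparing such measured C-d values against a correlation like the Stolz equation at each Reynolds number is exactly the kind of evaluation the abstract describes.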
Abstract:
A model of the information and material activities that comprise the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information handling activities such as creation of new information, information search and retrieval, information distribution and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model, consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT-tools or construction specific applications. The model can thus provide a starting point for a discussion of the application of information and communication technology in construction and for measurements of the impacts of IT on the overall process and its related costs.
Abstract:
This study focuses on personnel managers in crisis situations. The interviewed personnel managers referred to emotions as a central element to be dealt with in a crisis. However, until recently, the exploration of emotions in organisational life has been de-emphasised or ignored. This study aims to bring to the surface aspects of personnel work that have so far been neglected or remained invisible. It specifically examines how personnel managers handle employees’ and their own emotions in a crisis. Based on the interviews, a number of emotional episodes were constructed. They describe the type and context of the crisis and the person(s) whose emotions are handled. The main findings of the study are the five emotion-handling strategies that could be constructed from the data. The negotiation-like manner in which personnel managers handled emotions in crisis situations proved especially interesting. They were actually negotiating emotional value for their organisations. Further, they handled their own emotions within the frame of two logics of appropriateness labelled mothering and guide-following. The episodes described also enabled identification of the values enacted by the personnel managers in handling emotions. The study provides descriptive information on emotion handling, a current and relevant feature in the practice of personnel management. It seeks to offer a frame for developing practical principles that can be helpful in a crisis. It also offers the opportunity to consider a variety of difficult situations that personnel managers may confront in their work.
Abstract:
We propose a novel second order cone programming formulation for designing robust classifiers which can handle uncertainty in observations. Similar formulations are also derived for designing regression functions which are robust to uncertainties in the regression setting. The proposed formulations are independent of the underlying distribution, requiring only the existence of second order moments. These formulations are then specialized to the case of missing values in observations for both classification and regression problems. Experiments show that the proposed formulations outperform imputation.
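The flavour of such a second-order cone constraint can be illustrated numerically. The sketch below is an illustration of the general moment-based robustness condition, not the paper's exact program: for a point with known mean mu and covariance Sigma, one requires y * (w @ mu + b) >= 1 + kappa * ||Sigma^{1/2} w||, where kappa = sqrt(eta / (1 - eta)) enforces, via a Chebyshev-type bound, correct classification with probability at least eta. All numbers are invented.

```python
import numpy as np

def robust_margin_ok(w, b, mu, Sigma, y, eta):
    """Check the second-order-moment robust margin constraint for one point."""
    kappa = np.sqrt(eta / (1.0 - eta))
    spread = np.sqrt(w @ Sigma @ w)  # ||Sigma^{1/2} w||
    return y * (w @ mu + b) >= 1.0 + kappa * spread

w = np.array([2.0, 0.0])
b = 0.0
mu = np.array([2.0, 1.0])     # uncertain observation: known mean...
Sigma = 0.1 * np.eye(2)       # ...and covariance (second-order moments only)
print(robust_margin_ok(w, b, mu, Sigma, y=+1, eta=0.9))
```

Note that only the first and second moments enter the constraint, matching the abstract's claim of distribution independence; demanding a higher eta tightens kappa and can make the same point infeasible.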
Abstract:
We present a framework for performance evaluation of manufacturing systems subject to failure and repair. In particular, we determine the mean and variance of accumulated production over a specified time frame and show the usefulness of these results in system design and in evaluating operational policies for manufacturing systems. We extend this analysis for lead time as well. A detailed performability study is carried out for the generic model of a manufacturing system with centralized material handling. Several numerical results are presented, and the relevance of performability analysis in resolving system design issues is highlighted. Specific problems addressed include computing the distribution of total production over a shift period, determining the shift length necessary to deliver a given production target with a desired probability, and obtaining the distribution of Manufacturing Lead Time, all in the face of potential subsystem failures.
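The simplest ingredient of such a performability model can be written down directly. This is a minimal sketch, not the paper's analysis: a single machine alternates between up (failure rate lam) and down (repair rate mu) states, its steady-state availability is mu / (lam + mu), and the expected accumulated production over a shift of length T at rate r is r * T * availability. The rates below are illustrative.

```python
def mean_production(rate, T, lam, mu):
    """Expected accumulated production over a shift of length T for a
    single machine with failure rate lam and repair rate mu."""
    availability = mu / (lam + mu)
    return rate * T * availability

# 60 parts/hour over an 8-hour shift, MTBF = 10 h (lam = 0.1/h),
# MTTR = 1 h (mu = 1.0/h).
expected = mean_production(rate=60.0, T=8.0, lam=0.1, mu=1.0)
print(expected)  # 60 * 8 * (1.0 / 1.1) ~ 436.4 parts
```

The questions posed in the abstract (distribution of total production, shift length needed to hit a target with given probability) additionally require the variance and distribution of this quantity, which is where the performability machinery comes in.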
Abstract:
We study the problem of uncertainty in the entries of the kernel matrix arising in the SVM formulation. Using chance constraint programming and a novel large deviation inequality, we derive a formulation which is robust to such noise. The resulting formulation applies when the noise is Gaussian or has finite support. The formulation is in general non-convex, but in several cases of interest it reduces to a convex program. The problem of uncertainty in the kernel matrix is motivated by the real-world problem of classifying proteins when the structures are provided with some uncertainty. The formulation derived here naturally incorporates such uncertainty in a principled manner, leading to significant improvements over the state of the art.
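A hedged illustration of why kernel uncertainty is delicate (this is background intuition, not the paper's chance-constrained formulation): additive noise in the entries can destroy the positive semidefiniteness a kernel matrix must have, and a common naive repair is to clip negative eigenvalues. The noise scale and seed below are arbitrary.

```python
import numpy as np

def nearest_psd(K):
    """Project a symmetric matrix onto the PSD cone by eigenvalue clipping."""
    vals, vecs = np.linalg.eigh(K)
    return vecs @ np.diag(np.clip(vals, 0.0, None)) @ vecs.T

rng = np.random.default_rng(0)
K = np.eye(4)                         # a valid (PSD) kernel matrix
noise = rng.normal(scale=0.8, size=(4, 4))
K_noisy = K + (noise + noise.T) / 2   # symmetric additive perturbation
K_fixed = nearest_psd(K_noisy)

print(np.linalg.eigvalsh(K_fixed).min() >= -1e-9)  # True: PSD restored
```

The abstract's approach is more principled than this pointwise repair: it builds the noise model into the optimization itself via chance constraints.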
Abstract:
Many knowledge-based systems (KBS) transform situation information into an appropriate decision using an in-built knowledge base. As knowledge in real-world situations is often uncertain, the degree of truth of a proposition provides a measure of uncertainty in the underlying knowledge. This uncertainty can be evaluated by collecting `evidence' about the truth or falsehood of the proposition from multiple sources. In this paper we propose a simple framework for representing uncertainty using the notion of an evidence space.
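A minimal sketch of the general idea of evidence-derived degree of truth (an illustration only; the paper's evidence-space framework is not specified in this abstract): weighted evidence for and against a proposition is accumulated from several sources, and the degree of truth is the supporting share of the total evidence. The weighting scheme below is invented.

```python
def degree_of_truth(evidence):
    """evidence: list of (weight, supports) pairs from multiple sources.
    Returns the fraction of total evidence weight supporting the proposition."""
    pro = sum(w for w, supports in evidence if supports)
    total = sum(w for w, _ in evidence)
    return pro / total if total else 0.5  # no evidence: maximal uncertainty

# Three sources weigh in on one proposition.
sources = [(0.9, True), (0.6, True), (0.5, False)]
print(round(degree_of_truth(sources), 2))  # 1.5 / 2.0 = 0.75
```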
Abstract:
The correlation clustering problem is a fundamental problem in both theory and practice, and it involves identifying clusters of objects in a data set based on their similarity. A traditional modeling of this question as a graph-theoretic problem associates vertices with data points and indicates similarity by adjacency. Clusters then correspond to cliques in the graph. The resulting optimization problem, Cluster Editing (and several variants), is very well studied algorithmically. In many situations, however, translating clusters to cliques can be somewhat restrictive. A more flexible notion would be that of a structure where the vertices are mutually ``not too far apart'', without necessarily being adjacent. One such generalization is realized by structures called s-clubs, which are graphs of diameter at most s. In this work, we study the question of finding a set of at most k edges whose removal leaves us with a graph whose components are s-clubs. Recently, it has been shown that unless the Exponential Time Hypothesis (ETH) fails, Cluster Editing (whose components are 1-clubs) does not admit a sub-exponential time algorithm [STACS, 2013]. That is, there is no algorithm solving the problem in time 2^{o(k)} n^{O(1)}. Surprisingly, however, it was shown that when the number of cliques in the output graph is restricted to d, the problem can be solved in time O(2^{O(sqrt(dk))} + m + n). We show that this sub-exponential time algorithm for a fixed number of cliques is the exception rather than the rule. Our first result shows that, assuming the ETH, there is no algorithm solving the s-Club Cluster Edge Deletion problem in time 2^{o(k)} n^{O(1)}. We show, further, that even the problem of deleting edges to obtain a graph with d s-clubs cannot be solved in time 2^{o(k)} n^{O(1)} for any fixed s, d >= 2. This is a radical contrast to the situation established for cliques, where sub-exponential algorithms are known.
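The target condition of s-Club Cluster Edge Deletion is easy to check directly (this is only the feasibility check, not the hardness argument): after deleting edges, every connected component must be an s-club, i.e. have diameter at most s. A plain BFS over an adjacency-dict graph suffices; the example graph below is invented.

```python
from collections import deque

def ecc(adj, src):
    """Largest BFS distance from src within its connected component."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values())

def is_s_club_partition(adj, s):
    """True iff every component of the graph is an s-club (diameter <= s)."""
    return all(ecc(adj, v) <= s for v in adj)

# A path on 4 vertices has diameter 3: it is a 3-club but not a 2-club.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(is_s_club_partition(path, 3), is_s_club_partition(path, 2))  # True False
```

The hardness results quoted above concern finding the at most k edge deletions that make this check pass, not the check itself.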
Abstract:
Body Area Network, a new wireless networking paradigm, promises to revolutionize healthcare applications. A number of tiny sensor nodes are strategically placed in and around the human body to obtain physiological information. The sensor nodes are connected to a coordinator or a data collector to form a Body Area Network. The tiny devices may sense physiological parameters that are emergencies in nature (e.g. an abnormal heart beat rate, a glucose level above the threshold, etc.) and need the immediate attention of a physician. Due to the ultra-low-power requirement of a wireless body area network, the coordinator and devices are expected to be in dormant mode most of the time, particularly when the network is not operational. This leads to an open question: how to handle and meet the QoS requirements of emergency data when the network is not operational? Emergency handling becomes more challenging at the MAC layer if the channel access related information is unknown to the device with the emergency message. The aforementioned scenarios are very likely in MICS (Medical Implant Communication Service, 402-405 MHz) based healthcare systems. This paper proposes a mechanism for the timely and reliable transfer of emergency data in a MICS-based Body Area Network. We validate our protocol design with simulation in a C++ framework. Our simulation results show that more than 99 percent of the time, emergency messages reach the coordinator with a delay of 400 ms.
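A generic sketch of the prioritisation principle behind such emergency handling (illustrative only; the paper's MICS MAC mechanism involves channel access and wake-up details this does not model): emergency messages preempt normal ones in the service order at the coordinator. The message strings below are invented.

```python
import heapq

EMERGENCY, NORMAL = 0, 1  # lower priority value is served first

queue = []
heapq.heappush(queue, (NORMAL, "routine glucose reading"))
heapq.heappush(queue, (EMERGENCY, "abnormal heart beat rate"))
heapq.heappush(queue, (NORMAL, "temperature reading"))

# Drain the queue: the emergency message comes out ahead of routine traffic.
order = [heapq.heappop(queue)[1] for _ in range(len(queue))]
print(order[0])  # "abnormal heart beat rate"
```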
Abstract:
There is a need to use probability distributions with power-law decaying tails to describe the large variations exhibited by some physical phenomena. The Weierstrass Random Walk (WRW) shows promise for modeling such phenomena. The theory of anomalous diffusion is now well established. It has found a number of applications in physics, chemistry and biology. However, its applications are limited in structural mechanics in general, and structural engineering in particular. The aim of this paper is to present some mathematical preliminaries related to the WRW that would help in possible applications. In the limiting case, it represents a diffusion process whose evolution is governed by a fractional partial differential equation. Three applications of superdiffusion processes in mechanics, illustrating their effectiveness in handling large variations, are presented.
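The textbook form of the WRW step-length distribution can be written down as a quick sanity check (a standard illustration; the parameter choices a and b below are arbitrary, not from the paper): steps of length b**j occur with geometrically decaying probability p_j = (1 - 1/a) * a**(-j), which produces a power-law (heavy) tail in the step length and, in the limit, the superdiffusive behaviour the abstract refers to.

```python
# Weierstrass Random Walk step-length distribution.
a, b = 4.0, 2.0  # a: probability decay per scale; b: length growth per scale
p = [(1.0 - 1.0 / a) * a ** (-j) for j in range(50)]
lengths = [b ** j for j in range(50)]

print(round(sum(p), 6))  # probabilities sum to 1 (up to tiny truncation)
print(p[0] / p[1])       # each longer length scale is a factor a rarer
```

Because arbitrarily long steps keep non-negligible probability mass, the walk has diverging higher moments for suitable a, b, which is precisely what makes it useful for modeling large variations.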