25 results for Graph operations
at Universidad Politécnica de Madrid
Abstract:
Fastener holes in aeronautical structures are typical sources of fatigue cracks due to the local stress concentration they induce. A very efficient solution to this problem is to establish compressive residual stresses around the fastener holes that retard fatigue crack nucleation and its subsequent local propagation. Previous work on the application of LSP treatment to thin, open-hole specimens [1] has proven that the effect of LSP on the fatigue life of treated specimens can be detrimental if the process is not properly optimized. In fact, it was shown that the capability of LSP to introduce compressive residual stresses around fastener holes in thin-walled structures representative of typical aircraft constructions was not superior to the performance of conventional techniques, such as cold-working.
Abstract:
A number of thrombectomy devices using a variety of methods have now been developed to facilitate clot removal. We present research involving one such experimental device recently developed in the UK, called a ‘GP’ Thrombus Aspiration Device (GPTAD). This device has the potential to bring about the extraction of a thrombus. Although the device is at a relatively early stage of development, the results look encouraging. In this work, we present an analysis and modeling of the GPTAD by means of the bond graph technique; it seems to be a highly effective method of simulating the device under a variety of conditions. Such modeling is useful in optimizing the GPTAD and predicting the result of clot extraction. The aim of this simulation model is to obtain the minimum pressure necessary to extract the clot and to verify that both the pressure and the time required to complete the clot extraction are realistic for use in clinical situations, and are consistent with any experimentally obtained data. We therefore consider aspects of rheology and mechanics in our modeling.
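For illustration only, the sketch below reduces the device model to a single lumped force balance on the clot: an aspiration pressure acting over the catheter cross-section against an adhesion force and viscous damping. This is a plain ODE stand-in rather than a full bond graph, and every parameter value (area, mass, f_adhesion, damping, distance) is a hypothetical placeholder, not an identified value from the paper; the point is only to show how a minimum extraction pressure and extraction time could be estimated from such a model.

```python
import numpy as np

def extraction_time(p_asp, area=2e-6, mass=5e-6, f_adhesion=1.5e-3,
                    damping=0.8, distance=0.02, dt=1e-4, t_max=10.0):
    """Integrate a lumped force balance on the clot, m*dv/dt = P*A - F_adh - c*v.
    Returns the time to travel `distance`, or None if the clot is never extracted.
    All parameter values are hypothetical placeholders."""
    x, v, t = 0.0, 0.0, 0.0
    while t < t_max:
        force = p_asp * area - f_adhesion - damping * v
        v = max(0.0, v + (force / mass) * dt)  # the clot cannot move backwards
        x += v * dt
        t += dt
        if x >= distance:
            return t
    return None

# Sweep aspiration pressures (Pa) to locate the minimum that extracts the clot.
for p in np.linspace(500, 5000, 10):
    t = extraction_time(p)
    print(f"P = {p:6.0f} Pa ->", f"t = {t:.3f} s" if t else "no extraction")
```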
Abstract:
Tree-reweighted belief propagation is a message passing method that has certain advantages compared to traditional belief propagation (BP). However, it fails to outperform BP in a consistent manner, does not lend itself well to distributed implementation, and has not been applied to distributions with higher-order interactions. We propose a method called uniformly reweighted belief propagation that mitigates these drawbacks. Having shown in previous work that this method can substantially outperform BP in distributed inference with pairwise interaction models, in this paper we extend it to higher-order interactions and apply it to LDPC decoding, leading to performance gains over BP.
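As a rough illustration of the idea, the sketch below implements the pairwise message update of tree-reweighted BP with a single uniform edge appearance probability rho (rho = 1 recovers standard BP). It is a minimal NumPy sketch under assumed conventions, not the authors' implementation, and the higher-order/LDPC extension is not shown; all function and variable names are our own.

```python
import numpy as np

def urw_bp(unary, pairwise, edges, rho=0.7, iters=50):
    """Uniformly reweighted BP on a discrete pairwise model (rho=1 -> plain BP).
    unary:    {node: 1-D array of local potentials}
    pairwise: {(i, j): 2-D potential array indexed [x_i, x_j]} for (i, j) in edges
    edges:    list of undirected edges (i, j)"""
    nbrs = {n: set() for n in unary}
    for i, j in edges:
        nbrs[i].add(j); nbrs[j].add(i)
    # One message per directed edge, initialized uniform.
    msgs = {}
    for i, j in edges:
        msgs[(i, j)] = np.ones(len(unary[j])) / len(unary[j])
        msgs[(j, i)] = np.ones(len(unary[i])) / len(unary[i])
    for _ in range(iters):
        new = {}
        for (i, j) in msgs:
            psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
            # Reweighted product of incoming messages at node i; dividing by the
            # full reverse message implements the TRW update with uniform rho.
            pre = unary[i].astype(float)
            for k in nbrs[i]:
                pre = pre * msgs[(k, i)] ** rho
            pre = pre / msgs[(j, i)]
            m = (psi ** (1.0 / rho)).T @ pre  # marginalize over x_i
            new[(i, j)] = m / m.sum()
        msgs = new
    beliefs = {n: unary[n] * np.prod([msgs[(k, n)] ** rho for k in nbrs[n]], axis=0)
               for n in unary}
    return {n: b / b.sum() for n, b in beliefs.items()}
```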
Abstract:
We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. This framework avoids the need to consider a specific encoder architecture for encoding latency analysis by assuming unlimited processing capacity in the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times has to be considered, and the encoding latency is obtained systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of designing low-latency encoders with a low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add the most encoding latency to the encoder. Experimental results for JMVM prediction structures illustrate how low-latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
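Under the unlimited-processing-capacity assumption, this computation reduces to a longest-path evaluation on the prediction-dependency DAG: a frame can start encoding once it has been captured and all its reference frames are encoded. The following sketch illustrates that idea with hypothetical names; it is not the paper's exact formulation.

```python
from functools import lru_cache

def encoding_latency(refs, proc_time, capture_time):
    """Earliest completion time per frame under unlimited processing capacity:
    a frame starts once it is captured and all its references are encoded, so
    completion times follow the longest path in the prediction-dependency DAG.
    refs:         {frame: [reference frames it predicts from]}
    proc_time:    {frame: encoding time}
    capture_time: {frame: time at which the frame becomes available}"""
    @lru_cache(maxsize=None)
    def finish(f):
        start = max([capture_time[f]] + [finish(r) for r in refs[f]])
        return start + proc_time[f]
    # Encoding latency of frame f = completion time minus capture time.
    return {f: finish(f) - capture_time[f] for f in refs}

# Toy IBP structure: B-frame 1 predicts from I-frame 0 and P-frame 2.
refs = {0: [], 2: [0], 1: [0, 2]}
proc = {0: 2.0, 1: 1.0, 2: 1.5}
cap = {0: 0.0, 1: 1.0, 2: 2.0}
print(encoding_latency(refs, proc, cap))  # frame 1 must wait for frames 0 and 2
```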
Abstract:
We show a procedure for constructing a probabilistic atlas based on affine moment descriptors. It uses a normalization procedure over the labeled atlas. The proposed linear registration is defined by closed-form expressions involving only geometric moments. This procedure applies both to atlas construction and to atlas-based segmentation. We model the likelihood term for each voxel and each label using parametric or nonparametric distributions, and the prior term is determined by applying the vote rule. The probabilistic atlas is built from the variability of our linear registration. We consider two segmentation strategies: a) applying the proposed affine registration to bring the target image into the coordinate frame of the atlas, or b) non-rigidly aligning the probabilistic atlas with the target image, after first aligning the atlas to the target image with our affine registration. Finally, we adopt a graph-cut Bayesian framework for implementing the atlas-based segmentation.
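A minimal sketch of the resulting per-voxel labelling step is given below: a vote-rule prior computed from the registered atlas label maps, combined with a Gaussian likelihood per label, with the label chosen by MAP. The graph-cut spatial regularization of the full framework is deliberately omitted, and all names are illustrative assumptions.

```python
import numpy as np

def map_label(intensities, atlas_labels, means, stds):
    """Per-voxel MAP labelling (sketch): vote-rule prior from N registered
    atlas label maps plus a Gaussian intensity likelihood per label.
    intensities:  (V,) target image intensities
    atlas_labels: (N, V) label maps from N registered atlases
    means, stds:  (L,) Gaussian likelihood parameters, one pair per label"""
    intensities = np.asarray(intensities, dtype=float)
    atlas_labels = np.asarray(atlas_labels)
    n_lab = len(means)
    # Vote-rule prior: fraction of atlases assigning each label to each voxel.
    prior = np.stack([(atlas_labels == l).mean(axis=0) for l in range(n_lab)])
    # Gaussian likelihood of the observed intensity under each label.
    lik = np.stack([np.exp(-0.5 * ((intensities - means[l]) / stds[l]) ** 2)
                    / stds[l] for l in range(n_lab)])
    return (prior * lik).argmax(axis=0)  # MAP label per voxel
```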
Abstract:
The study of cross-reactivity in allergy is key to both understanding the allergic response of many patients and providing them with a rational treatment. In the present study, protein microarrays and a co-sensitization graph approach were used in conjunction with an allergen microarray immunoassay. This enabled us to include a large number of proteins and a large number of patients, and to study sensitization profiles among members of the LTP family. Fourteen LTPs from the most frequent plant food-induced allergies in the geographical area studied were printed onto a microarray specifically designed for this research. 212 patients with fruit allergy and 117 food-tolerant pollen-allergic subjects were recruited from seven regions of Spain with different pollen profiles, and their sera were tested with the allergen microarray. This approach has proven itself to be a good tool to study cross-reactivity between members of the LTP family, and could become a useful strategy to analyze other families of allergens.
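A co-sensitization graph of the kind used here can be sketched as follows: given a binary patient-by-allergen sensitization matrix, allergens become nodes and an edge weight counts the patients sensitized to both allergens. This is a generic reconstruction under our own assumptions (the threshold and the toy data are invented), not the study's exact procedure.

```python
import numpy as np
import networkx as nx

def cosensitization_graph(sens, allergens, min_patients=5):
    """Nodes are allergens; an edge links two allergens when at least
    `min_patients` patients are sensitized to both, weighted by that count.
    sens: (patients x allergens) binary matrix of sensitization results."""
    sens = np.asarray(sens, dtype=bool)
    g = nx.Graph()
    g.add_nodes_from(allergens)
    for i in range(len(allergens)):
        for j in range(i + 1, len(allergens)):
            both = int(np.sum(sens[:, i] & sens[:, j]))
            if both >= min_patients:
                g.add_edge(allergens[i], allergens[j], weight=both)
    return g

# Toy example: 6 patients, 3 LTPs (1 = IgE positive).
sens = [[1, 1, 0], [1, 1, 0], [1, 1, 1], [0, 1, 1], [1, 1, 0], [1, 1, 0]]
g = cosensitization_graph(sens, ["Pru p 3", "Mal d 3", "Art v 3"], min_patients=4)
print(g.edges(data=True))  # one edge, co-sensitization count 5
```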
Abstract:
We propose a method to measure real-valued time series irreversibility which combines two different tools: the horizontal visibility algorithm and the Kullback-Leibler divergence. This method maps a time series to a directed network according to a geometric criterion. The degree of irreversibility of the series is then estimated by the Kullback-Leibler divergence (i.e. the distinguishability) between the in- and out-degree distributions of the associated graph. The method is computationally efficient and does not require any ad hoc symbolization process. We find that the method correctly distinguishes between reversible and irreversible stationary time series, including analytical and numerical studies of its performance for: (i) reversible stochastic processes (uncorrelated and Gaussian linearly correlated), (ii) irreversible stochastic processes (a discrete flashing ratchet in an asymmetric potential), (iii) reversible (conservative) and irreversible (dissipative) chaotic maps, and (iv) dissipative chaotic maps in the presence of noise. Two alternative graph functionals, the degree and the degree-degree distributions, can be used as the Kullback-Leibler divergence argument. The former is simpler and more intuitive and can be used as a benchmark, but in the case of an irreversible process with null net current, the degree-degree distribution has to be considered to identify the irreversible nature of the series.
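The two ingredients of the method can be sketched compactly. Below, a directed horizontal visibility graph is built from the series (node i links forward to node j when every value strictly between them lies below min(x[i], x[j])), and irreversibility is estimated as the KL divergence between the out- and in-degree distributions; skipping degree values absent from the in-distribution is a simplification of the full estimator, and the names are ours.

```python
import numpy as np
from collections import Counter

def hvg_degrees(x):
    """Directed horizontal visibility graph: out- and in-degree sequences."""
    n = len(x)
    k_out = np.zeros(n, dtype=int)
    k_in = np.zeros(n, dtype=int)
    for i in range(n):
        top = -np.inf                     # running max of x[i+1 .. j-1]
        for j in range(i + 1, n):
            if min(x[i], x[j]) > top:     # horizontal visibility criterion
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if x[j] >= x[i]:              # later points cannot see i
                break
    return k_out, k_in

def hvg_irreversibility(x):
    """KL divergence between out- and in-degree distributions of the HVG:
    close to zero for reversible series, positive for irreversible ones."""
    k_out, k_in = hvg_degrees(x)
    n = len(x)
    p, q = Counter(k_out.tolist()), Counter(k_in.tolist())
    return sum((c / n) * np.log(c / q[k]) for k, c in p.items() if q[k] > 0)
```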
Abstract:
Non-invasive quantitative assessment of right ventricular anatomical and functional parameters is a challenging task. We present a semi-automatic approach for right ventricle (RV) segmentation from 4D MR images in two variants, which differ in the amount of user interaction. The method consists of three main phases: first, foreground and background markers are generated from the user input; next, an over-segmented region image is obtained by applying a watershed transform; finally, these regions are merged using 4D graph cuts with an intensity-based boundary term. In the first variant the user outlines the inside of the RV wall in a few end-diastole slices; in the second, two marker pixels serve as the starting point for a statistical atlas application. Results were obtained by blind evaluation on 16 testing 4D MR volumes. They prove our method to be robust against marker location and place it favourably among existing approaches.
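The final merging phase can be illustrated with a two-terminal min cut over the watershed regions: marker-covered regions are hard-linked to the source or sink, and neighbouring regions are linked by an intensity-based boundary term. This is a simplified two-label sketch with invented weights, not the paper's 4D energy.

```python
import networkx as nx

def merge_regions(regions, adjacency, fg_seeds, bg_seeds, lam=1.0):
    """Binary graph cut over watershed regions (sketch of the merging phase).
    regions:   {region_id: mean intensity}
    adjacency: {(r1, r2): shared boundary length}
    fg_seeds / bg_seeds: sets of region ids covered by the markers"""
    g = nx.Graph()
    inf = float("inf")
    for r in regions:
        # Unary terms: seed regions are hard-constrained to their terminal.
        g.add_edge("S", r, capacity=inf if r in fg_seeds else 0.0)
        g.add_edge(r, "T", capacity=inf if r in bg_seeds else 0.0)
    for (r1, r2), length in adjacency.items():
        # Intensity-based boundary term: cheap to cut across strong edges.
        w = lam * length / (1.0 + abs(regions[r1] - regions[r2]))
        g.add_edge(r1, r2, capacity=w)
    cut_value, (fg, bg) = nx.minimum_cut(g, "S", "T")
    return fg - {"S"}  # regions labelled as right ventricle
```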
Abstract:
As a thermal separation method, distillation is one of the most important technologies in the chemical industry. Given its importance, it is no surprise that increasing efforts have been made to reduce its energy inefficiencies. A great deal of research is focused on the design and optimization of the Divided-Wall Column (DWC). Its industrial application is still limited owing to distrust of its controllability. Previous references have studied the decentralized control of the DWC, but few papers deal with Model Predictive Control (MPC). In this work we present a decentralized control scheme for a DWC along with its equivalent MPC scheme.
Abstract:
The road to the automation of agricultural processes passes through the safe operation of autonomous vehicles. This requirement is already a fact for ground mobile units, but it has not yet been well defined for aerial robots (UAVs), mainly because the relevant standards and legislation are quite diffuse or even nonexistent. Therefore, defining a common and global policy is the challenge to tackle, and this characterization has to be addressed from field experience. Accordingly, this paper presents the work done in this direction, based on an analysis of the most common sources of hazards when using UAVs for agricultural tasks. The work, based on the ISO 31000 standard, has been carried out by applying a three-step structure that integrates identification, assessment and reduction procedures. The present paper describes how this method has been applied to analyze previous accidents and malfunctions during UAV operations in order to obtain the real failure causes. This has made it possible to highlight common risks and hazardous sources and to propose specific guards and safety measures for the agricultural context.
Abstract:
Nowadays, the processing industry sector is going through a series of changes, including better management and the reduction of environmental impacts. Any productive process that aims at sustainable management is incomplete if the life-cycle sustainability of mineral resources is not taken into account. Raw materials for manufacturing, such as copper, aluminum, iron, gold, silver, silicon and titanium, are provided by mineral resource extraction processes. Those elements are necessary for the development of mankind and are obtained from the Earth through mineral extractive processes. Mineral extraction processes are operations which must take care of their environmental consequences. The extraction of huge volumes of rock for transformation into raw materials for industry must be optimized to reduce the ecological cost of the final product as far as possible. On a global scale, it makes no sense to design efficient manufacturing in the secondary (transformation) industry if, in the first steps of the supply chain (extraction), the impact exceeds the resource savings of the successive phases. The size of mining operations suggests an environmentally aggressive activity, but precisely because of its great impact it must be the first element to be considered. This idea implies the birth of a new concept: reducing economic and environmental cost at the same time. This work aims to reflect on the parameters that can be modified to reduce the energy cost of the process without increasing operational costs, while always ensuring the same production capacity. An efficient design of a mining operation which takes this idea into account does not imply an increase in operating cost. To reach this objective it is necessary to take a global view of the operation, so that all departments involved share common guidelines oriented towards the optimization of global energy costs. Sometimes a single operational cost must be increased to reduce the global cost. This work reviews different design parameters of surface mining, setting some key performance indicators (KPIs) which are estimated from an efficiency point of view. Those KPIs can be included in HQE policies as global indicators. The new concept developed is that a new criterion has to be applied in company policies: improve management by improving operational efficiency. That means it is better to use current resources (machinery, equipment, etc.) properly than to replace them with new ones that are not used correctly. In conclusion, through efficient management of current technologies in each extractive operation, an important reduction in energy use can be achieved looking downstream in the process. That implies a lower energy cost over the whole life cycle of the manufactured product.
Abstract:
All activities of an organization involve risks that should be managed. The risk management process aids decision making by taking account of uncertainty and the possibility of future events or circumstances (intended or unintended) and their effects on agreed objectives. With that idea, a new ISO standard has been drawn up: the recently issued ISO 31010 provides a structured process that identifies how objectives may be affected, and analyses the risk in terms of consequences and their probabilities before deciding whether further treatment is required. In this lecture, that ISO standard is adapted to open-pit blasting operations, focusing on environmental effects so that they can be managed properly. The technique used is Fault Tree Analysis (FTA), which is applied to all possible scenarios, providing blasting professionals with the tools to identify, analyze and manage the environmental effects of blasting operations. This lecture can also help to minimize each effect by studying each case. The paper can also be useful to project managers and Occupational Health and Safety (OH&S) departments, because blasting operations can be evaluated and compared with one another to determine the risks that should be managed in different case studies. The environmental effects studied are ground vibrations, flyrock and air overpressure (airblast). Sometimes blasting operations are carried out near populated areas, where environmental effects may impose several limitations on the use of explosives. In those cases, where these factors approach certain limits, national standards and regulations have to be applied.
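As a toy illustration of FTA in this setting, the sketch below evaluates the probability of a top event from AND/OR gates over basic events, assuming independence. The tree, event names and probabilities are entirely hypothetical, chosen only to mimic a flyrock scenario.

```python
def ft_probability(node, p_basic):
    """Evaluate a small fault tree (sketch): a node is either a basic event
    name or a tuple ('AND' | 'OR', [children]); events are assumed independent."""
    if isinstance(node, str):
        return p_basic[node]
    gate, children = node
    probs = [ft_probability(c, p_basic) for c in children]
    out = 1.0
    if gate == "AND":                 # all children must occur
        for p in probs:
            out *= p
        return out
    for p in probs:                   # OR: 1 - product of complements
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: flyrock reaching a populated area requires both a
# blast-design fault (OR of two causes) and a failure of the exclusion zone.
tree = ("AND", [("OR", ["insufficient_stemming", "overcharged_hole"]),
                "exclusion_zone_failure"])
p = {"insufficient_stemming": 0.02, "overcharged_hole": 0.01,
     "exclusion_zone_failure": 0.05}
print(ft_probability(tree, p))  # ~0.0015
```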
Abstract:
Aim of study: To review the present state of the art in relation to the main labour risks and the most relevant results of recent studies evaluating the safety and health conditions of forest harvesting work, as well as the best ways to reduce accidents. Area of study: It focuses mainly on developed countries, where the general concern about work risk prevention, together with the complex idiosyncrasy of forest work in harvesting operations, has led to a growing interest from the forest scientific and technical community. Material and methods: The main bibliographic and Internet references have been identified using common reference analysis tools, and their conclusions and recommendations have been comprehensively summarized. Main results: A collection of the principal references and their most important conclusions relating to the main accident risk factors, their causes and consequences, and the means used for their prevention, both instrumental and in the areas of training and business management, besides the influence of the growing mechanization of logging operations on those risks. Research highlights: Accident risk is higher in forest harvesting than in most other work sectors, and the main risk factors, such as experience, age, seasonality, training, protective equipment and degree of mechanization, have been identified and studied. The paper summarizes some relevant results, one of the principal ones being that proper entrepreneurial risk management is a key factor for success in minimizing labour risks.
Abstract:
This paper presents the model named Accepting Networks of Evolutionary Processors as an NP-problem solver inspired by biological DNA operations. A processor has a set of rules (splicing rules in this model), a multiset of objects and a set of filters. Rules can be applied in parallel, since there exists a large number of copies of the objects in the multiset. Processors can be arranged in a graph in order to solve a given problem. This paper shows a network configuration that solves the SAT problem using linear resources and time. A rule-based architecture for distributed environments, such as decision support systems, can be easily implemented using these networks of processors, as shown in the paper.
Abstract:
This paper presents an operational concept for Air Traffic Management, and in particular arrival management, in which aircraft are permitted to operate in a manner consistent with current optimal aircraft operating techniques. The proposed concept allows aircraft to descend in the fuel-efficient path-managed mode, with the arrival time not actively controlled. It is demonstrated how the associated uncertainty in the time dimension of the trajectory can be managed through the application of multiple metering points strategically chosen along the trajectory. The proposed concept does not make assumptions about aircraft equipage (e.g. time-of-arrival control), but aims at handling the mixed-equipage scenarios that will most likely remain far into the next decade and arguably beyond.