793 results for Probabilistic Aggregation Criteria
Abstract:
3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.
Abstract:
Final Master's project submitted for the degree of Master in Electronics and Telecommunications Engineering
Abstract:
Solvent extraction is considered a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions, as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet, and Pareto curves for different flow-sheets are compared. The positive weight sum approach, coupled with the sequential quadratic programming method, is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the optimal organic flow-rate.
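The weighted-sum scalarization described above can be sketched in a few lines. The following is a minimal illustration, not the paper's model: `recovery` and `purity` are hypothetical placeholder objectives over normalized hold-up, residence time and agitation speed, and SciPy's SLSQP solver stands in for the sequential quadratic programming method.

```python
# Minimal sketch of the positive-weight-sum approach coupled with SQP
# (SciPy's SLSQP); the objective functions are hypothetical placeholders
# for the real recovery/purity models of a mixer-settler flow-sheet.
import numpy as np
from scipy.optimize import minimize

def recovery(x):   # x = (hold-up, residence time, agitation speed) in [0, 1]
    return x[0] * x[1] * (0.5 + 0.5 * x[2])   # toy model: grows with all three

def purity(x):
    return 1.0 - 0.6 * x[0] * x[2]            # toy model: degrades as recovery grows

pareto = []
for w in np.linspace(0.05, 0.95, 19):         # sweep strictly positive weights
    # maximize w*recovery + (1-w)*purity  ==  minimize the negative
    res = minimize(lambda x: -(w * recovery(x) + (1 - w) * purity(x)),
                   x0=[0.5, 0.5, 0.5],
                   method="SLSQP",
                   bounds=[(0, 1)] * 3)
    pareto.append((recovery(res.x), purity(res.x)))
```

Each weight vector yields one point of the Pareto set; sweeping the weights traces the recovery/purity trade-off curve compared across flow-sheets.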
Abstract:
Master's degree in Cardiovascular Diagnostic and Intervention Technology
Abstract:
The current work can be seen as a starting point for the discussion of the problem of risk acceptance criteria in occupational environments. Some obstacles to the formulation and use of quantitative acceptance criteria were analyzed, and the long history of major-hazard accidents was also reviewed. This work shows that organizations can face several difficulties in formulating acceptance criteria and that the use of pre-defined acceptance criteria in risk assessment methodologies can be inadequate in some cases. It is urgent to define guidelines that can help organizations formulate risk acceptance criteria for occupational environments.
Abstract:
Purpose: To assess image quality using the PGMI (perfect, good, moderate, inadequate) scale in digital mammography examinations acquired on DR systems, to identify the main failures and propose corrective actions, and to evaluate the most typical breast density. Methods and Materials: Clinical image quality criteria were evaluated for mammograms acquired on 13 DR systems and classified according to the PGMI scale using the criteria described in the European Commission guidelines for radiographers. Breast density was assessed according to ACR recommendations. The data were collected on the acquisition system monitor to reproduce the daily practice of the radiographer. Results: The image quality criteria were evaluated in 3044 images. The criteria were fully achieved in 41% of the images, which were classified as P (perfect); 31% of the images were classified as M (moderate), 20% as G (good) and 9% as I (inadequate). The main causes of inadequate image quality were failure to include all breast tissue in the image and skin folds over the pectoral muscle and the infra-mammary angle. The highest number of failures occurred in MLO projections (809 out of 1022). The most represented breast type (36%) was type 2 (25-50% glandular tissue). Conclusion: Incorrect radiographic technique was frequently detected, suggesting potential training needs and poor communication between team members (radiographers and radiologists). Further correlations are necessary to identify the main causes of the failures, namely specific education and training in digital mammography and workload.
Abstract:
This work discusses the importance of renewable production forecasting in an island environment. A probabilistic forecast based on kernel density estimators is proposed. The aggregation of these forecasts allows determining the amount of thermal generation needed to schedule and operate the power grid of an island with high penetration of renewable generation. A case study based on the electric system of S. Miguel Island is presented. The results show that these forecast techniques are an imperative tool to help grid management.
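A minimal sketch of the kernel-density forecasting and aggregation idea, assuming per-source historical production samples; the series, the demand figure and the 95% coverage criterion are illustrative, not taken from the paper.

```python
# Per-source probabilistic forecasts via kernel density estimation,
# aggregated by Monte Carlo sampling; a conservative quantile then sets
# the thermal generation to schedule. All numbers are placeholders.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
wind_hist = rng.gamma(2.0, 3.0, 500)      # placeholder historical MW samples
hydro_hist = rng.gamma(1.5, 2.0, 500)

wind_kde = gaussian_kde(wind_hist)
hydro_kde = gaussian_kde(hydro_hist)

# Aggregate the probabilistic forecasts by sampling each density.
renewable = wind_kde.resample(10_000)[0] + hydro_kde.resample(10_000)[0]

demand = 25.0                              # forecast island demand for the hour, MW
# Schedule enough thermal capacity to cover demand in 95% of scenarios.
thermal_needed = max(0.0, demand - np.quantile(renewable, 0.05))
print(f"thermal generation to schedule: {thermal_needed:.1f} MW")
```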
Abstract:
In the last twenty years genetic algorithms (GAs) were applied in a plethora of fields, such as control, system identification, robotics, planning and scheduling, image processing, and pattern and speech recognition (Bäck et al., 1997). In robotics, the problems of trajectory planning, collision avoidance and manipulator structure design considering a single criterion have been solved using several techniques (Alander, 2003). Most engineering applications, however, require the optimization of several criteria simultaneously. Often the problems are complex, include discrete and continuous variables, and there is no prior knowledge about the search space. Such problems are considerably more complex, since they consider multiple design criteria simultaneously within the optimization procedure. This is known as multi-criteria (or multiobjective) optimization, which has been addressed successfully through GAs (Deb, 2001). The overall aim of multi-criteria evolutionary algorithms is to achieve a set of non-dominated optimal solutions known as the Pareto front. At the end of the optimization procedure, instead of a single optimal (or near-optimal) solution, the decision maker can select a solution from the Pareto front. Some of the key issues in multi-criteria GAs are: i) the number of objectives, ii) obtaining a Pareto front as wide as possible, and iii) achieving a uniformly spread Pareto front. Indeed, multi-objective techniques using GAs have been increasing in relevance as a research area. In 1989, Goldberg suggested the use of a GA to solve multi-objective problems, and since then other researchers have been developing new methods, such as the multi-objective genetic algorithm (MOGA) (Fonseca & Fleming, 1995), the non-dominated sorted genetic algorithm (NSGA) (Deb, 2001), and the niched Pareto genetic algorithm (NPGA) (Horn et al., 1994), among several other variants (Coello, 1998). In this work the trajectory planning problem considers: i) robots with 2 and 3 degrees of freedom (dof), ii) the inclusion of obstacles in the workspace, and iii) up to five criteria used to qualify the evolving trajectory, namely: joint traveling distance, joint velocity, end-effector Cartesian distance, end-effector Cartesian velocity, and energy involved. These criteria are used to minimize the joint and end-effector traveled distances, the trajectory ripple and the energy required by the manipulator to reach the destination point. Bearing these ideas in mind, this chapter addresses the planning of robot trajectories, meaning the development of an algorithm to find a continuous motion that takes the manipulator from a given starting configuration to a desired end position without colliding with any obstacle in the workspace. The chapter is organized as follows. Section 2 describes trajectory planning and several approaches proposed in the literature. Section 3 formulates the problem, namely the representation adopted to solve the trajectory planning and the objectives considered in the optimization. Section 4 studies the algorithm's convergence. Section 5 studies a 2R manipulator (i.e., a robot with two rotational joints/links) when the trajectory optimization considers two and five objectives. Sections 6 and 7 present the results for the 3R redundant manipulator with five objectives and for other complementary experiments, respectively. Finally, section 8 draws the main conclusions.
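A minimal sketch of the Pareto-dominance test at the heart of the multi-objective GAs cited above (MOGA, NSGA, NPGA); candidate trajectories are reduced here to their criteria vectors, whose values are purely illustrative.

```python
# Non-dominated filtering of a population of criteria vectors, e.g.
# (joint distance, joint velocity, Cartesian distance, Cartesian
# velocity, energy), all to be minimized. Values are illustrative.
import numpy as np

def dominates(a, b):
    """True if a is at least as good as b everywhere and strictly better somewhere."""
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_front(points):
    """Return the non-dominated subset of the population."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

population = [np.array(v) for v in
              [(1.0, 2.0, 3.0, 1.0, 5.0),
               (0.8, 2.5, 3.1, 0.9, 5.2),
               (1.2, 2.1, 3.5, 1.1, 5.6)]]   # dominated by the first vector
front = pareto_front(population)             # keeps the first two vectors
```

In a full GA, this test drives the ranking and selection steps; the non-dominated set that survives the final generation is the Pareto front presented to the decision maker.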
Abstract:
The activity of growing living bacteria was investigated using real-time and in situ rheology, in stationary and oscillatory shear. Two different strains of the human pathogen Staphylococcus aureus (strain COL and its isogenic cell wall autolysis mutant, RUSAL9) were considered in this work. For low bacterial density, strain COL forms small clusters, while the mutant, presenting deficient cell separation, forms irregular larger aggregates. In the early stages of growth, when subjected to a stationary shear, the viscosity of the cultures of both strains increases with the population of cells. As the bacteria reach the exponential phase of growth, the viscosity of the cultures of the two strains follows different and rich behaviors, with no counterpart in the optical density or in the population's colony-forming unit measurements. While the viscosity of the strain COL culture keeps increasing during the exponential phase and returns close to its initial value in the late phase of growth, where the population stabilizes, the viscosity of the mutant strain culture decreases steeply, still in the exponential phase, remains constant for some time, and increases again, reaching a constant plateau at a maximum value in the late phase of growth. These complex viscoelastic behaviors, which were observed to be shear-stress-dependent, are a consequence of two coupled effects: the continuous increase in cell density and the cells' changing interaction properties. The viscous and elastic moduli of the strain COL culture, obtained with oscillatory shear, exhibit power-law behaviors whose exponents depend on the bacterial growth stage. The viscous and elastic moduli of the mutant culture show complex behaviors, emerging from the different relaxation times associated with the large molecules of the medium and the self-organized structures of bacteria. Nevertheless, these behaviors reflect the bacterial growth stage.
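The power-law behavior of the moduli mentioned above can be written compactly; the prefactors and exponents below are illustrative symbols, with exponent values depending on the growth stage as the study describes.

```latex
% Power-law form of the oscillatory-shear moduli; G_0', G_0'', n', n''
% are illustrative fit parameters whose values depend on the growth stage.
G'(\omega) \sim G_0'\,\omega^{\,n'}, \qquad
G''(\omega) \sim G_0''\,\omega^{\,n''}
```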
Abstract:
This study focuses on the probabilistic modelling of the mechanical properties of prestressing strands, based on data collected from tensile tests carried out at the Laboratório Nacional de Engenharia Civil (LNEC), Portugal, for certification purposes, covering a period of about 9 years of production. The strands studied were produced by six manufacturers from four countries, namely Portugal, Spain, Italy and Thailand. The variability of the most important mechanical properties is examined and the results are compared with the recommendations of the Probabilistic Model Code, as well as the Eurocodes and earlier studies. The obtained results show a very low variability, which benefits structural safety. Based on those results, probabilistic models for the most important mechanical properties of prestressing strands are proposed.
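A minimal sketch of the kind of distribution fitting such probabilistic modelling involves, assuming a synthetic sample of tensile strengths; the normal model, the sample values and the 5% characteristic fractile are illustrative, not the paper's fitted models.

```python
# Fit a distribution to tensile-strength test results and inspect its
# coefficient of variation and characteristic value. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
f_p = rng.normal(1910.0, 30.0, 200)       # placeholder tensile strengths, MPa

mu, sigma = stats.norm.fit(f_p)           # maximum-likelihood normal fit
cov = sigma / mu                          # coefficient of variation
f_pk = stats.norm.ppf(0.05, mu, sigma)    # 5% characteristic value
print(f"mean={mu:.0f} MPa, CoV={cov:.3f}, characteristic value={f_pk:.0f} MPa")
```

A low CoV, as reported in the study, translates directly into a characteristic value close to the mean, which is what benefits structural safety.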
Abstract:
In this article, we present the first study on probabilistic tsunami hazard assessment for the Northeast (NE) Atlantic region related to earthquake sources. The methodology combines probabilistic seismic hazard assessment, tsunami numerical modeling, and statistical approaches. We consider three main tsunamigenic areas, namely the Southwest Iberian Margin, the Gloria, and the Caribbean. For each tsunamigenic zone, we derive the annual recurrence rate for each magnitude range, from Mw 8.0 up to Mw 9.0, at a regular interval, using the Bayesian method, which incorporates seismic information from historical and instrumental catalogs. A numerical code solving the shallow water equations is employed to simulate tsunami propagation and compute nearshore wave heights. The probability of exceeding a specific tsunami hazard level during a given time period is calculated using the Poisson distribution. The results are presented in terms of the probability of exceedance of a given tsunami amplitude for 100- and 500-year return periods. The hazard level varies along the NE Atlantic coast, being maximum along the northern segment of the Morocco Atlantic coast, the southern Portuguese coast, and the Spanish coast of the Gulf of Cadiz. We find that the probability that a maximum wave height exceeds 1 m somewhere in the NE Atlantic region reaches 60% and 100% for 100- and 500-year return periods, respectively. These probability values decrease to about 15% and 50%, respectively, when considering an exceedance threshold of 5 m for the same return periods of 100 and 500 years.
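The Poissonian exceedance computation mentioned above reduces to P = 1 - exp(-λτ) for an annual exceedance rate λ and exposure time τ. As a consistency check, the sketch below back-computes an implied annual rate from the reported 60% probability at 100 years; the rate itself is illustrative, not a value from the paper.

```python
# Poisson exceedance probability: P = 1 - exp(-lambda_ * tau).
import math

p_100 = 0.60                                  # reported exceedance probability, 100 yr
lambda_ = -math.log(1.0 - p_100) / 100.0      # implied annual rate, ~0.0092 /yr

for tau in (100, 500):
    p = 1.0 - math.exp(-lambda_ * tau)
    print(f"P(exceed 1 m in {tau} yr) = {p:.2f}")   # ~0.60 and ~0.99
```

The 500-year value comes out at about 0.99, consistent with the near-100% figure reported above.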
Abstract:
Clustering ensemble methods produce a consensus partition of a set of data points by combining the results of a collection of base clustering algorithms. In the evidence accumulation clustering (EAC) paradigm, the clustering ensemble is transformed into a pairwise co-association matrix, thus avoiding the label correspondence problem, which is intrinsic to other clustering ensemble schemes. In this paper, we propose a consensus clustering approach based on the EAC paradigm, which is not limited to crisp partitions and fully exploits the nature of the co-association matrix. Our solution determines probabilistic assignments of data points to clusters by minimizing a Bregman divergence between the observed co-association frequencies and the corresponding co-occurrence probabilities expressed as functions of the unknown assignments. We additionally propose an optimization algorithm to find a solution under any double-convex Bregman divergence. Experiments on both synthetic and real benchmark data show the effectiveness of the proposed approach.
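A minimal sketch of the evidence-accumulation step, assuming a k-means ensemble as the base clusterings; the dataset and ensemble parameters are illustrative, and the paper's Bregman-divergence optimization is not reproduced here.

```python
# Build the pairwise co-association matrix C of the EAC paradigm:
# C[i, j] is the fraction of base partitions placing points i and j
# in the same cluster. Dataset and ensemble settings are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=100, centers=3, random_state=0)
n = len(X)

C = np.zeros((n, n))
n_partitions = 20
for seed in range(n_partitions):                 # base clusterings with varying k
    k = int(np.random.default_rng(seed).integers(2, 8))
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    C += (labels[:, None] == labels[None, :])    # same-cluster indicator
C /= n_partitions                                # co-association frequencies
```

The paper's contribution then fits probabilistic cluster assignments to C by minimizing a double-convex Bregman divergence between these frequencies and the model's co-occurrence probabilities.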
Abstract:
As e-learning gradually evolved, many specialized and disparate systems appeared to fulfil the needs of teachers and students, such as repositories of learning objects, authoring tools, intelligent tutors and automatic evaluators. This heterogeneity raises interoperability issues, giving the standardization of content an important role in e-learning. This article presents a survey of current e-learning content aggregation standards, focusing on their internal organization and packaging. This study is part of an effort to choose the most suitable specifications and standards for an e-learning framework called Ensemble, defined as a conceptual tool to organize a network of e-learning systems and services for domains with complex evaluation.
Abstract:
Feature discretization (FD) techniques often yield adequate and compact representations of the data, suitable for machine learning and pattern recognition problems. These representations usually decrease the training time and yield higher classification accuracy, while allowing humans to better understand and visualize the data, as compared to the use of the original features. This paper proposes two new FD techniques. The first is based on the well-known Linde-Buzo-Gray quantization algorithm, coupled with a relevance criterion, and is able to perform unsupervised, supervised, or semi-supervised discretization. The second technique works in supervised mode and is based on the maximization of the mutual information between each discrete feature and the class label. Our experimental results on standard benchmark datasets show that these techniques scale up to high-dimensional data, attaining in many cases better accuracy than existing unsupervised and supervised FD approaches, while using fewer discretization intervals.
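A minimal sketch of the supervised criterion behind the second technique: candidate quantile-based discretizations of a single synthetic feature are scored by mutual information with the class label. The grid search below is illustrative, not the paper's actual algorithm.

```python
# Score candidate discretizations of one feature by the mutual
# information between the discrete feature and the class label.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 300)                      # binary class labels
x = y + rng.normal(0.0, 0.8, 300)                # one continuous feature

best_bins, best_mi = None, -np.inf
for n_bins in range(2, 9):                       # increasingly fine discretizations
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    x_disc = np.digitize(x, edges)               # map values to interval indices
    mi = mutual_info_score(y, x_disc)
    if mi > best_mi:
        best_bins, best_mi = n_bins, mi
print(f"best number of intervals: {best_bins} (MI = {best_mi:.3f})")
```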
Abstract:
Measuring the quality of a b-learning environment is critical to determining the success of a b-learning course. Several initiatives have recently been conducted on benchmarking and quality in e-learning. Despite these efforts to define and examine quality issues concerning online courses, defining an instrument to evaluate quality remains one of the key challenges for blended learning, since it incorporates both traditional and online instruction methods. For this paper, six frameworks for the quality assessment of technology-enhanced learning were examined and compared regarding similarities and differences. These frameworks aim at the same global objective: the quality of e-learning environments/products. They present different perspectives but also many common issues. Some of them are more specific and related to the course, while others are more global and related to institutional aspects. In this work we collected and arranged all the quality criteria identified, in order to obtain a more complete framework and determine whether it fits our b-learning environment. We also included elements related to our own b-learning research and experience, acquired over more than 10 years. As a result, we have created a new quality reference with a set of dimensions and criteria that should be taken into account when analyzing, designing, developing, implementing and evaluating a b-learning environment. Besides these perspectives on what to do when developing a b-learning environment, we have also included pedagogical issues, in order to give directions on how to do it so as to achieve successful learning. The information, concepts and procedures presented here support teachers and instructors who intend to validate the quality of their blended learning courses.