937 results for Optimal matching analysis.
Resumo:
Final Master's Project submitted for the degree of Master in Mechanical Engineering
Resumo:
Risk Based Inspection (RBI) is a risk-based methodology used to prioritize and manage the efforts of an inspection program, allowing resources to be allocated so as to provide a higher level of coverage on the physical assets with higher risk. The main goal of RBI is to increase equipment availability while improving or maintaining the accepted level of risk. This paper presents the concepts of risk and risk analysis and the RBI methodology, and shows an approach to determine the optimal inspection frequency for physical assets based on the potential risk and, mainly, on the quantification of the probability of failure. It makes use of some assumptions in a structured decision-making process. The proposed methodology allows an optimization of inspection intervals, deciding when the first inspection must be performed as well as the subsequent inspection intervals. A demonstrative example is also presented to illustrate the application of the proposed methodology.
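The inspection-scheduling idea in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual model: it assumes a Weibull probability-of-failure curve and a fixed consequence cost (both hypothetical parameters), and finds the first time at which risk = POF × consequence crosses an accepted threshold.

```python
import math

def weibull_pof(t: float, eta: float, beta: float) -> float:
    """Cumulative probability of failure at time t under a Weibull model."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def first_inspection_time(eta, beta, consequence, risk_target, step=0.1):
    """Earliest time (on a coarse grid) where POF x consequence >= target."""
    t = 0.0
    while weibull_pof(t, eta, beta) * consequence < risk_target:
        t += step
    return t

# Hypothetical asset: characteristic life 10 years, shape 2, failure
# consequence of 1e6, accepted risk of 5e4 -> first inspection time.
t1 = first_inspection_time(eta=10.0, beta=2.0, consequence=1e6, risk_target=5e4)
```

Subsequent intervals could be obtained the same way by restarting the clock (or updating the POF curve) after each inspection.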
Resumo:
Master's degree in Electrical and Computer Engineering - Autonomous Systems branch
Resumo:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing are enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (or intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
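The linear mixing model described above can be written as y = M a + n, with the abundance vector a non-negative and summing to one. A minimal numerical sketch (the random signatures and noise level are illustrative assumptions, not data from the chapter):

```python
import numpy as np

rng = np.random.default_rng(0)

n_bands, n_endmembers = 50, 3
M = rng.random((n_bands, n_endmembers))          # endmember signatures (columns)
a = np.array([0.6, 0.3, 0.1])                    # abundance fractions, sum to 1
y = M @ a + 0.01 * rng.standard_normal(n_bands)  # observed mixed pixel

# With M known, a least-squares fit recovers the abundances (here plain
# unconstrained lstsq as a simple stand-in for constrained least squares).
a_hat, *_ = np.linalg.lstsq(M, y, rcond=None)
```

With the number of substances and their spectra unknown, this inversion is no longer available and the problem becomes blind source separation, as discussed next.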
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix which minimizes the mutual information among the sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of the MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
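The skewer-projection idea behind PPI can be sketched in a few lines. This is a simplified illustration (random data, no MNF preprocessing, an arbitrary skewer count), not the full PPI implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

X = rng.random((1000, 50))                 # 1000 pixels, 50 spectral bands
n_skewers = 200
skewers = rng.standard_normal((50, n_skewers))

proj = X @ skewers                         # projection of every pixel on each skewer
scores = np.zeros(X.shape[0], dtype=int)
for j in range(n_skewers):
    scores[np.argmax(proj[:, j])] += 1     # extreme in the positive direction
    scores[np.argmin(proj[:, j])] += 1     # extreme in the negative direction

# Pixels most often found at an extreme are the candidate pure pixels.
purest = np.argsort(scores)[::-1][:5]
```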
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data in the least-squares sense [48, 49]. We note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
Resumo:
Most distributed generation and smart grid research works are dedicated to the study of network operation parameters, reliability, among other topics. However, many of these research works use traditional test systems, such as the IEEE test systems. This work proposes a voltage magnitude study in the presence of fault conditions, considering the realistic specifications found in countries like Brazil. The methodology considers a hybrid method of fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, and a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study that considers a real 12-bus sub-transmission network.
Resumo:
Most distributed generation and smart grid research works are dedicated to studies of network operation parameters, reliability, etc. However, many of these works use traditional test systems, for instance, the IEEE test systems. This paper proposes voltage magnitude and reliability studies in the presence of fault conditions, considering realistic conditions found in countries like Brazil. The methodology considers a hybrid method of fuzzy sets and Monte Carlo simulation based on fuzzy-probabilistic models, and a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study that considers a real 12-bus sub-transmission network.
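The Monte Carlo half of the hybrid method can be sketched as follows; the component names and failure probabilities below are hypothetical, and the fuzzy-probabilistic models and OPF-based remedial actions of the actual methodology are not reproduced here:

```python
import random

random.seed(42)

# Hypothetical per-trial failure probabilities for network components.
failure_prob = {"line_1": 0.02, "line_2": 0.05, "transformer": 0.01}

def sample_state():
    """Draw one network state: True means the component is faulted."""
    return {c: random.random() < p for c, p in failure_prob.items()}

# Estimate an illustrative outcome: the fraction of sampled states with
# at least one faulted component.
n_trials = 10_000
faulted = sum(any(sample_state().values()) for _ in range(n_trials))
estimate = faulted / n_trials
```

In the actual methodology each sampled fault state would be passed to the remedial action algorithm (optimal power flow) to evaluate the resulting voltage magnitudes.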
Resumo:
This paper analyzes the performance of two cooperative robot manipulators. In order to capture the working performance, we formulated several performance indices that measure manipulability, effort reduction and the equilibrium between the two robots. Based on the proposed indices, we determined the optimal values for the system parameters. Furthermore, we study the implementation of fractional-order algorithms in the position/force control of two cooperative robotic manipulators holding an object.
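A concrete example of the kind of performance index discussed is Yoshikawa's manipulability measure w = sqrt(det(J Jᵀ)), shown here for a single planar two-link arm (an illustrative geometry, not the paper's cooperative two-robot setup):

```python
import numpy as np

def jacobian_2link(theta1, theta2, l1=1.0, l2=1.0):
    """Geometric Jacobian of a planar two-link arm (end-effector position)."""
    s1, c1 = np.sin(theta1), np.cos(theta1)
    s12, c12 = np.sin(theta1 + theta2), np.cos(theta1 + theta2)
    return np.array([
        [-l1 * s1 - l2 * s12, -l2 * s12],
        [ l1 * c1 + l2 * c12,  l2 * c12],
    ])

def manipulability(J):
    """Yoshikawa manipulability measure: sqrt(det(J J^T))."""
    return np.sqrt(np.linalg.det(J @ J.T))

# For unit link lengths this reduces to |sin(theta2)|: maximal at 90 degrees.
w = manipulability(jacobian_2link(0.3, np.pi / 2))
```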
Resumo:
Atmospheric temperatures characterize the Earth as a slow-dynamics spatiotemporal system, revealing long memory and complex behavior. Temperature time series of 54 worldwide geographic locations are considered as representative of the Earth's weather dynamics. These data are then interpreted as the time evolution of a set of state space variables describing a complex system. The data are analyzed by means of multidimensional scaling (MDS) and the fractional state space portrait (fSSP). A centennial perspective covering the period from 1910 to 2012 allows MDS to identify similarities among different Earth locations. The multivariate mutual information is proposed to determine the "optimal" order of the time derivative for the fSSP representation. The fSSP emerges as a valuable alternative for visualizing system dynamics.
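The MDS embedding used above maps pairwise dissimilarities between temperature series to low-dimensional coordinates. A minimal sketch of classical (Torgerson) MDS on a toy 1-D configuration, not the paper's 54-location data:

```python
import numpy as np

def classical_mds(D: np.ndarray, dims: int) -> np.ndarray:
    """Embed points from a pairwise distance matrix D (classical MDS)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centered squared distances
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:dims]        # keep the largest eigenvalues
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))

# Toy example: recover 1-D coordinates from their mutual distances.
x = np.array([0.0, 1.0, 3.0, 7.0])
D = np.abs(x[:, None] - x[None, :])
coords = classical_mds(D, dims=1).ravel()
```

The recovered coordinates reproduce the original pairwise distances up to translation and reflection, which is all MDS promises.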
Resumo:
Electronic Markets have reached such a high level of complexity and sophistication that conventional software models have become inadequate. These markets are characterized as open, dynamic and competitive, and are composed of several independent and heterogeneous entities. Such entities play their roles autonomously, pursuing their own objectives, reacting to events in the environment in which they operate and interacting with one another. This reality has led the scientific community to take a special interest in the study of automated negotiation carried out by software agents [Zhang et al., 2011]. However, the diversity of the actors involved may lead to different conceptualizations of their needs and capabilities, giving rise to semantic incompatibilities that can hinder negotiation and prevent transactions that would satisfy the parties involved. New markets must therefore possess mechanisms that allow them to exhibit new capabilities, namely the ability to assist in the communication between the different agents. Accordingly, this work argues that markets should offer ontology services that facilitate interoperability between agents. However, humans tend to be reluctant to accept the conceptualizations of others unless they are convinced that they can get a good deal. In this context, the application and exploitation of relationships captured in social networks can result in the establishment of trust relationships between sellers and consumers and, at the same time, lead to increased negotiation efficiency and, consequently, greater satisfaction of the parties involved.
The AEMOS system is an agent-based electronic commerce platform that includes ontology services, more specifically ontology alignment services, including the recommendation of possible alignments between the ontologies of negotiation partners. The system also includes a social-network-based component, built by applying social network analysis techniques to information collected by the market, which improves the recommendation of alignments and assists the agents in their choice. This work presents the development and implementation of the AEMOS system, namely:
• A new model for agent-based electronic commerce that provides ontology services is proposed;
• Additionally, the use of emergent social networks to capture and exploit information about the relationships between the different business partners is proposed;
• An ontology services component is defined and implemented that is able to:
  o Suggest ontology alignments for pairs of agents;
  o Translate messages written according to one ontology into messages written according to another, using previously approved alignments;
  o Improve its own services by using the functionality provided by the social network component;
• A social network component is defined and implemented that:
  o Is able to build and manage a graph of proximity relations between agents, and of alignment-to-agent suitability relations, taking into account the agents' profiles, behavior and interactions, as well as the coverage and usage of the alignments;
  o Exploits and adapts social network analysis techniques and algorithms to the various phases of the electronic market's processes.
The implementation and experimentation of the proposed model demonstrate how collaboration between the different agents can be advantageous in improving the system's performance, and how the inclusion and combination of ontology services and social networks is reflected in the efficiency of transaction negotiation and in the dynamics of the market as a whole.
Resumo:
INTRODUCTION: Insulin resistance is the pathophysiological key to explaining metabolic syndrome. Although clearly useful, the Homeostasis Model Assessment index (an insulin resistance measurement) has not been systematically applied in clinical practice. One of the main reasons is the discrepancy in cut-off values reported in different populations. We sought to evaluate, in a Portuguese population, the ideal cut-off for the Homeostasis Model Assessment index and to assess its relationship with metabolic syndrome. MATERIAL AND METHODS: We selected a cohort of individuals admitted electively to a Cardiology ward with a BMI < 25 kg/m² and no abnormalities in glucose metabolism (fasting plasma glucose < 100 mg/dL and no diabetes). The 90th percentile of the Homeostasis Model Assessment index distribution was used to obtain the ideal cut-off for insulin resistance. We also selected a validation cohort of 300 individuals (no exclusion criteria applied). RESULTS: From 7 000 individuals, 1 784 remained after applying the exclusion criteria. The 90th percentile for the Homeostasis Model Assessment index was 2.33. In the validation cohort, applying that cut-off, 49.3% of individuals had insulin resistance. However, only 69.9% of the metabolic syndrome patients had insulin resistance according to that cut-off. By ROC curve analysis, the ideal cut-off for metabolic syndrome is 2.41. The Homeostasis Model Assessment index correlated with BMI (r = 0.371, p < 0.001) and is an independent predictor of the presence of metabolic syndrome (OR 19.4, 95% CI 6.6 - 57.2, p < 0.001). DISCUSSION: Our study showed that, in a Portuguese population of patients admitted electively to a Cardiology ward, 2.33 is the Homeostasis Model Assessment index cut-off for insulin resistance and 2.41 for metabolic syndrome. CONCLUSION: The Homeostasis Model Assessment index is directly correlated with BMI and is an independent predictor of metabolic syndrome.
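For reference, the Homeostasis Model Assessment index in conventional units is HOMA-IR = fasting glucose (mg/dL) × fasting insulin (µU/mL) / 405; the sketch below applies the study's cut-offs to a hypothetical patient:

```python
def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    """Homeostasis Model Assessment index (conventional units)."""
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

# Study cut-offs: 2.33 for insulin resistance, 2.41 for metabolic syndrome.
# Hypothetical patient: glucose 95 mg/dL, insulin 12 uU/mL.
value = homa_ir(95.0, 12.0)
insulin_resistant = value >= 2.33
```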
Resumo:
Dissertation submitted for the degree of Doctor in Biology, specialty of Molecular Biology
Resumo:
ABSTRACT: Financing is a critical factor in ensuring the optimal development and delivery of a mental health system. The primary method of financing worldwide is tax-based; however, many low-income countries depend on out-of-pocket payments. There is a report on Irish health care funding, but none that deals exclusively with mental health care. This paper analyses the various financial models that exist globally for financing the mental health sector, examines the impact of the various models on service users, especially in terms of relative 'financial burden', and provides a more detailed examination of the current mental health funding situation in Ireland. After extensive internet and hardcopy research on the above topics, the findings were analysed and a number of recommendations were reached. Mental health services should be free at the point of delivery to achieve universal coverage. Government tax-based funding or mandatory social insurance with government top-ups, as required, appears to be the optimal option, although there is no one funding system applicable everywhere. Out-of-pocket funding can create a crippling financial burden for service users. It is important to employ improved revenue collection systems, eliminate waste, provide equitable resource distribution, ring-fence mental health funding and cap the number of visits, where necessary. Political, economic, social and cultural factors play a role in funding decisions, and this can be clearly seen in the context of the current economic recession in Ireland. Only 33% of the Irish population has access to free public health care, and the number of health insurance policy holders has dramatically declined, resulting in increased out-of-pocket payments. This approach risks negatively impacting the social determinants of health, increasing health inequalities and harming economic productivity.
It is therefore important that the Irish government examine other options to provide funding for mental health services.
Resumo:
ABSTRACT - It is the purpose of the present thesis to emphasize, through a series of examples, the need for and value of appropriate pre-analysis of the impact of health care regulation. Specifically, the thesis presents three papers on the theme of regulation in different aspects of health care provision and financing. The first two consist of economic analyses of the impact of health care regulation, and the third comprises the creation of an instrument for supporting economic analysis of health care regulation, namely in the field of evaluation of health care programs. The first paper develops a model of health plan competition and pricing in order to understand the dynamics of health plan entry and exit in the presence of switching costs and alternative health premium payment systems. We build an explicit model of death spirals, in which profit-maximizing competing health plans find it optimal to adopt a pattern of increasing relative prices culminating in health plan exit. We find the steady-state numerical solution for the price sequence and the plan's optimal length of life through simulation and perform some comparative statics. This allows us to show that using risk-adjusted premiums and imposing price floors are effective at reducing death spirals and switching costs, while having employees pay a fixed share of the premium enhances death spirals and increases switching costs. Price regulation of pharmaceuticals is one of the cost control measures adopted by the Portuguese government, as in many European countries. When such regulation decreases the products' real price over time, it may create an incentive for product turnover. Using panel data for the period 1997 through 2003 on drug packages sold in Portuguese pharmacies, the second paper addresses the question of whether price control policies create an incentive for product withdrawal.
Our work builds on the product survival literature by accounting for unobservable product characteristics and heterogeneity among consumers when constructing quality, price control and competition indexes. These indexes are then used as covariates in a Cox proportional hazards model. We find that, indeed, price control measures increase the probability of exit, and that such an effect is not verified in the OTC market, where no such price regulation measures exist. We also find quality to have a significant positive impact on product survival. In the third paper, we develop a microsimulation discrete events model (MSDEM) for cost-effectiveness analysis of Human Immunodeficiency Virus treatment, simulating individual paths from antiretroviral therapy (ART) initiation to death. Four driving forces determine the course of events: CD4+ cell count, viral load, resistance and adherence. A novel feature of the model with respect to previous MSDEMs is that the distributions of time to event depend on individuals' characteristics and past history. Time to event was modeled using parametric survival analysis. Events modeled include: viral suppression, regimen switch due to virological failure, regimen switch due to other reasons, resistance development, hospitalization, AIDS events, and death. Disease progression is structured according to therapy lines and the model is parameterized with Portuguese observational cohort data. An application of the model is presented comparing the cost-effectiveness of ART initiation with two nucleoside analogue reverse transcriptase inhibitors (NRTI) plus one non-nucleoside reverse transcriptase inhibitor (NNRTI) to two NRTI plus a boosted protease inhibitor (PI/r) in HIV-1 infected individuals. We find 2NRTI+NNRTI to be a dominant strategy. Results predicted by the model reproduce those of the data used for parameterization and are in line with those published in the literature.
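The discrete-event structure described (individual time-to-event draws that depend on patient characteristics) can be sketched in miniature. All rates and covariate effects below are invented for illustration and bear no relation to the thesis's fitted parameters:

```python
import random

random.seed(7)

def simulate_individual(cd4_baseline: float) -> str:
    """Draw competing event times and report which event occurs first."""
    # Hypothetical rates: higher baseline CD4 -> faster viral suppression
    # and lower mortality (exponential stand-ins for parametric survival).
    rate_suppression = 0.5 + cd4_baseline / 1000.0
    rate_death = 0.05 * (500.0 / max(cd4_baseline, 1.0))
    t_supp = random.expovariate(rate_suppression)
    t_death = random.expovariate(rate_death)
    return "suppressed" if t_supp < t_death else "died"

outcomes = [simulate_individual(350.0) for _ in range(1000)]
share_suppressed = outcomes.count("suppressed") / len(outcomes)
```

A full MSDEM would chain many such draws per individual (suppression, switches, hospitalization, AIDS events) and attach costs and utilities to each path.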
Resumo:
Are return migrants more productive than non-migrants? If so, is it a causal effect or simply self-selection? The existing literature has not reached a consensus on the role of return migration for origin countries. To answer these research questions, an empirical analysis was performed based on household data collected in Cape Verde. One of the most common identification problems in the migration literature is the presence of migrant self-selection. In order to address potential selection bias, we use instrumental variable estimation, exploiting the variation provided by unemployment rates in migrant destination countries, and compare it with OLS and Nearest Neighbor Matching (NNM) methods. The results using the instrumental variable approach provide evidence of labour income gains due to return migration, while OLS underestimates the coefficient of interest. This bias points towards negative self-selection of return migrants on unobserved characteristics, although the different estimates cannot be distinguished statistically. Interestingly, migration duration and occupational changes after migration do not seem to influence post-migration income. There is weak evidence that return migrants from the United States have higher income gains caused by migration than those who returned from Portugal.
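The contrast between OLS and the instrumental-variable estimate can be illustrated on simulated data (not the Cape Verde survey): migration is negatively selected on unobservables, so OLS underestimates the true effect, while the instrument shifts migration but is independent of the error.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

z = rng.standard_normal(n)                        # instrument (e.g. destination unemployment)
u = rng.standard_normal(n)                        # unobserved ability / selection
m = 0.8 * z - 0.5 * u + rng.standard_normal(n)    # migration, negatively selected on u
y = 1.0 * m + u + rng.standard_normal(n)          # income; true migration effect = 1.0

beta_ols = (m @ y) / (m @ m)                      # biased down because cov(m, u) < 0
beta_iv = (z @ y) / (z @ m)                       # Wald / 2SLS estimate with one instrument
```

Here beta_ols lands well below 1.0 while beta_iv recovers the true coefficient, mirroring the direction of bias reported in the abstract.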
Resumo:
Hospital-acquired infections (HAIs) delay healing, prolong hospital stays, and increase both hospital costs and the risk of death. This study aims to estimate the extra length of stay and mortality rate attributable to each of the following HAIs: wound infection (WI); bloodstream infection (BSI); urinary infection (UI); and hospital-acquired pneumonia (HAP). The study population consisted of patients discharged from CHLC in 2014. Data were collected to identify demographic information, surgical operations, development of HAIs and their outcomes. The study used regressions and a matching strategy to compare cases (infected) and controls (uninfected). The matching criteria were: age, sex, week and type of admission, number of admissions, major diagnostic category and type of discharge. When compared to matched controls, cases with an HAI had a higher mortality rate and a greater length of stay. WI related to hip or knee surgery increased the mortality rate by 27.27% and the length of stay by 74.97 days. WI due to colorectal surgery caused an extra mortality rate of 10.69% and an excess length of stay of 20.23 days. BSI increased hospital stay by 28.80 days and the mortality rate by 32.27%. UI caused an average additional length of stay of 19.66 days and risk of death of 12.85%. HAP resulted in an extra hospital stay of 25.06 days and mortality rate of 24.71%. This study confirms the results of the previous literature that patients experiencing HAIs incur excess mortality and longer hospital stays, and, overall, it presents worse results compared with other countries.
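The matched case-control comparison can be sketched as follows; the records below are invented for illustration. Excess length of stay and excess mortality are averages of within-pair differences between each infected case and its matched uninfected control.

```python
# Hypothetical matched pairs: each case is paired with a control matched on
# the study's criteria (age, sex, admission type, diagnostic category, ...).
cases    = [{"los": 40, "died": 1}, {"los": 25, "died": 0}, {"los": 33, "died": 0}]
controls = [{"los": 12, "died": 0}, {"los": 10, "died": 0}, {"los": 14, "died": 0}]

extra_los = sum(c["los"] - k["los"] for c, k in zip(cases, controls)) / len(cases)
extra_mortality = (
    sum(c["died"] for c in cases) - sum(k["died"] for k in controls)
) / len(cases)
```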