51 results for Optimization framework

in Repositório Científico do Instituto Politécnico de Lisboa - Portugal


Relevance:

60.00%

Publisher:

Abstract:

Digital radiography detectors, based on different technological solutions, are currently available for clinical applications and widespread in clinical practice. Computed radiography (CR) and digital radiology systems have been available for clinical use for years, and the trend has clearly become digital: radiology departments have been changing from traditional screen-film technology to digital technology. This chapter is intended to give the reader a practical understanding of the key aspects of digital systems, namely the performance of the different technologies, image quality, dose, and patient safety/protection. A discussion of an optimization framework for digital systems is also provided.

Relevance:

30.00%

Publisher:

Abstract:

In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, calling it direct multisearch (DMS), which does not aggregate any of the objective functions. Our framework is inspired by the search/poll paradigm of direct-search methods of directional type and uses the concept of Pareto dominance to maintain a list of nondominated points (from which the new iterates or poll centers are chosen). The aim of our method is to generate as many points in the Pareto front as possible from the polling procedure itself, while keeping the whole framework general enough to accommodate other disseminating strategies, in particular, when using the (here also) optional search step. DMS generalizes to multiobjective optimization (MOO) all direct-search methods of directional type. We prove under the common assumptions used in direct search for single objective optimization that at least one limit point of the sequence of iterates generated by DMS lies in (a stationary form of) the Pareto front. However, extensive computational experience has shown that our methodology has an impressive capability of generating the whole Pareto front, even without using a search step. Two by-products of this paper are (i) the development of a collection of test problems for MOO and (ii) the extension of performance and data profiles to MOO, allowing a comparison of several solvers on a large set of test problems, in terms of their efficiency and robustness to determine Pareto fronts.
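The dominance filtering at the heart of DMS can be illustrated with a minimal sketch (the names below are illustrative, not the authors' implementation): a candidate point enters the list of nondominated points only if no stored point dominates it, and it evicts any stored points it dominates.

```python
def dominates(a, b):
    """True if point a Pareto-dominates point b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_nondominated(archive, candidate):
    """Insert candidate into the archive unless it is dominated;
    drop any archived points the candidate dominates."""
    if any(dominates(p, candidate) for p in archive):
        return archive  # candidate adds nothing
    return [p for p in archive if not dominates(candidate, p)] + [candidate]

# Example with two objectives, both minimized.
archive = []
for pt in [(3, 1), (1, 3), (2, 2), (2, 4), (0, 0)]:
    archive = update_nondominated(archive, pt)
print(archive)  # → [(0, 0)]
```

In DMS the surviving archive members are the candidate poll centers for the next iteration; the filtering itself is independent of how new trial points are produced.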

Relevance:

20.00%

Publisher:

Abstract:

In the early 1990s, companies began to feel the need for better access to information about their activities to support decision making. In the IT world, this gave rise to the Business Intelligence (BI) sector, initially composed of data warehousing and reporting tools. Over the years the concept of BI has evolved with business needs, making the analysis of organizations' activities and performance a critical aspect of their management. BI covers several areas, with reporting and data analysis being the ones that best meet the requirements for controlled access to business information and its processes. Nowadays, time and information are competitive advantages, and companies are increasingly concerned that the growing volume of information is becoming unsustainable, as the time needed to process it keeps increasing. For this reason, many software companies, such as Microsoft, IBM and Oracle, are fighting for a place in this expanding BI market. To be competitive, companies must be able to predict and respond to market needs in real time, rather than merely reacting to a need once it is already too late. BI products have a reputation for working only with stored historical data, so companies cannot rely on such solutions when a business requires near real time information. The latency introduced by a data warehouse is too high for acceptable performance. Hence the emergence of Business Activity Monitoring (BAM) technology, which provides near real time data analysis and alerts on business processes, using data sources such as Web Services, message queues, etc.
The concept of BAM was introduced in July 2001 by the Gartner organization as an event-driven extension of the BI area. BAM is defined as real-time access to business performance indicators, with the aim of increasing the speed and effectiveness of business processes. BAM solutions are becoming increasingly common and sophisticated.

Relevance:

20.00%

Publisher:

Abstract:

A previously developed model is used to numerically simulate real clinical cases of the surgical correction of scoliosis. This model consists of one-dimensional finite elements with spatial deformation in which (i) the column is represented by its axis; (ii) the vertebrae are assumed to be rigid; and (iii) the deformability of the column is concentrated in springs that connect the successive rigid elements. The metallic rods used for the surgical correction are modeled by beam elements with linear elastic behavior. To obtain the forces at the connections between the metallic rods and the vertebrae, geometrically non-linear finite element analyses are performed. The tightening sequence determines the magnitude of the forces applied to the patient's column, and it is desirable to keep those forces as small as possible. In this study, a Genetic Algorithm optimization is applied to this model in order to determine the sequence that minimizes the corrective forces applied during the surgery. This amounts to finding the optimal permutation of the integers 1, ..., n, where n is the number of vertebrae involved. We are thus faced with a combinatorial optimization problem isomorphic to the Traveling Salesman Problem. The fitness evaluation requires one computationally intensive finite element analysis per candidate solution, and thus a parallel implementation of the Genetic Algorithm is developed.
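A permutation-encoded GA of the kind described above can be sketched with standard repair-free operators; the `fitness` below is only a toy surrogate, since in the paper each evaluation is one full finite element analysis:

```python
import random

def order_crossover(p1, p2):
    """OX crossover: copy a random slice from p1 and fill the remaining
    positions in p2's order, so the child is always a valid permutation
    and no repair operator is needed."""
    n = len(p1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = p1[i:j]
    rest = [g for g in p2 if g not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

def swap_mutation(perm):
    """Exchange two positions; the result is still a permutation."""
    i, j = random.sample(range(len(perm)), 2)
    perm = perm[:]
    perm[i], perm[j] = perm[j], perm[i]
    return perm

# Toy surrogate fitness: in the paper this would be one computationally
# intensive finite element analysis per candidate tightening sequence.
def fitness(perm):
    return sum(abs(g - k) for k, g in enumerate(perm))

random.seed(0)
pop = [random.sample(range(10), 10) for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness)           # minimize
    elite = pop[:10]
    pop = elite + [swap_mutation(order_crossover(*random.sample(elite, 2)))
                   for _ in range(20)]
```

The child evaluations of each generation are independent of one another, which is what makes the parallel implementation mentioned in the abstract natural: each worker runs one FE analysis.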

Relevance:

20.00%

Publisher:

Abstract:

One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated from past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second mode corresponds to a motion compensated quality enhancement (MCQE) technique, where a low quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. The novel MCQE mode can be advantageous overall from the rate-distortion point of view for blocks where MCI produces SI with lower correlation, even if some rate has to be invested in the low quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
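A toy version of block-level motion compensated interpolation can clarify the MCI mode: assuming symmetric linear motion between the past and future reference frames, each SI block is the average of the best-matching block pair (block size, search range, and the SAD criterion below are illustrative simplifications, not the paper's algorithm).

```python
import numpy as np

def mci_side_information(past, future, block=8, search=4):
    """Toy motion compensated interpolation: for each block of the frame
    to estimate, find the symmetric vector (dy, dx) minimizing the SAD
    between past (shifted by -v) and future (shifted by +v), then average
    the two matched blocks. Frame sides are assumed multiples of `block`."""
    h, w = past.shape
    si = np.zeros_like(past, dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            best, best_pair = None, None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y0, x0 = y - dy, x - dx   # matched block in past
                    y1, x1 = y + dy, x + dx   # matched block in future
                    if min(y0, x0, y1, x1) < 0:
                        continue
                    if max(y0, y1) + block > h or max(x0, x1) + block > w:
                        continue
                    b0 = past[y0:y0 + block, x0:x0 + block].astype(float)
                    b1 = future[y1:y1 + block, x1:x1 + block].astype(float)
                    sad = np.abs(b0 - b1).sum()
                    if best is None or sad < best:
                        best, best_pair = sad, (b0, b1)
            si[y:y + block, x:x + block] = (best_pair[0] + best_pair[1]) / 2.0
    return si
```

The MCQE mode differs in that the decoder also has a coarse Intra version of the block, so the motion search can match against that hint instead of relying on pure interpolation.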

Relevance:

20.00%

Publisher:

Abstract:

Currently, there are no open source Business Intelligence (BI) tools to support financial management and analysis in companies according to the Portuguese accounting standards system (SNC). The specific characteristics of each business, together with the requirements imposed by the SNC, make it complex to build a generic financial framework that efficiently satisfies the financial analyses needed to manage a company. The goal of this project is to propose an OLAP-based framework capable of supporting accounting management and financial analysis, implemented exclusively with open source software, specifically the Pentaho platform. All accounting information, obtained from general accounting, cost accounting, budget management and financial analysis, is stored in a data mart. This data mart supports the full financial analysis, including the analysis of budget deviations and capital flows, giving companies a BI tool, compatible with the SNC, that helps them in decision making.

Relevance:

20.00%

Publisher:

Abstract:

The main goals of this work were the development of one-component polyurethane foams with superior flame-retardant properties (B1 and B2), applied by gun or by adapter/straw, and the optimization of a one-component winter polyurethane foam applied by gun. The work is divided into two distinct projects: (i) The first project consisted of developing one-component foams with flame-retardant properties (classified as B1 and B2 according to the German standard DIN 4102), applied by gun (GWB1 and GWB2) or by adapter/straw (AWB), using modified aromatic polyester polyols and halogenated flame-retardant additives. These foams should also exhibit acceptable properties at low temperatures. After testing several formulations, it was possible to develop an AWB2 foam with only 3.3% polyester polyol in the prepolymer and with properties equivalent to those of the best commercial foam even at 5/-10 (can temperature/foam curing temperature, in °C), with a flame height of only 11 cm. From two formulations (AWB2) that passed the B2 test, a GWB2 foam and a GWB1 foam were also obtained, with properties equivalent to those of the best competitor foam at -10/-10 and at 23/5, respectively, although they were not submitted to the B2 and B1 tests after the modifications made. (ii) The second project consisted of optimizing a one-component winter polyurethane foam applied by gun (GWB3). The initial foam had glass-bubble problems when dispensed from a full can, and this problem had to be overcome. It was solved by decreasing the LPG/DME ratio through an increase of the DME volume percentage in the prepolymer to 14%; however, the dimensional stability worsened slightly.
The reagent FCA 400 was removed from the previous formulation (6925) in an attempt to reduce the cost of the foam, yielding an acceptable foam at 23/23 and at 5/5, with a 4% reduction in production cost and a 5.5% reduction in the cost per litre of dispensed foam compared with its predecessor. Finally, the influence of the concentration of different surfactants in formulation 6925 was evaluated. The cell structure of the foam improved at higher surfactant concentrations, an effect more noticeable at lower temperatures (5/5). Of the surfactants studied, B 8871 showed the best performance at 5/5 at the lowest concentration, making it the best surfactant, while Struksilon 8003 proved to be the least suitable for this specific formulation, showing the worst overall results. The surfactants L-5351, L-5352 and B 8526 are also unsuitable for this formulation, since the resulting foams exhibit cell collapse, especially at 5/5. In the case of L-5351 and L-5352, this behaviour worsens at higher concentrations. In each project, benchmark tests were also carried out on selected commercial foams, with the main goal of comparing all the results of the foams developed in both projects with competitor foams.

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the problem of short-term hydro scheduling, particularly for head-dependent reservoirs in a competitive environment. We propose a new nonlinear optimization method that treats hydroelectric power generation as a function of both water discharge and head. Head-dependency is considered in short-term hydro scheduling in order to obtain more realistic and feasible results. The proposed method has been applied successfully to a case study based on one of the main Portuguese cascaded hydro systems, providing a higher profit at negligible additional computation time in comparison with a linear optimization method that ignores head-dependency.
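The head-dependency at the core of the method can be illustrated with the standard hydro power relation, where output depends on both discharge and head, and head in turn depends on reservoir storage. All coefficients below are illustrative, not the paper's case-study data:

```python
RHO_G = 1000 * 9.81  # water density (kg/m^3) times gravity (m/s^2)

def hydro_power(q, h, efficiency=0.9):
    """Power output in MW as a function of discharge q (m^3/s) and head h (m)."""
    return efficiency * RHO_G * q * h / 1e6

def head(storage, head_at_full=80.0, full_storage=1e6, sensitivity=20.0):
    """Toy head model: head falls linearly as the reservoir empties.
    All parameter values here are hypothetical."""
    return head_at_full - sensitivity * (1 - storage / full_storage)

# Same discharge, different storage level: the head-dependent model
# values the water differently, which a fixed-head linear model cannot.
print(hydro_power(100, head(1e6)))  # full reservoir
print(hydro_power(100, head(5e5)))  # half-empty reservoir
```

A fixed-head linear formulation prices every cubic metre of water the same; making power bilinear in discharge and head is what turns the problem nonlinear and, per the abstract, yields the higher profit.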

Relevance:

20.00%

Publisher:

Abstract:

Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements of applications such as video surveillance, sensor networks, and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it to a level as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some reference, decoded frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated in a transform-domain turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.

Relevance:

20.00%

Publisher:

Abstract:

Topology optimization consists in finding the spatial distribution of a given total volume of material such that the resulting structure has some optimal property, for instance, maximum structural stiffness or maximum fundamental eigenfrequency. In this paper, a Genetic Algorithm (GA) employing a tree-based representation is developed to generate initial feasible individuals that remain feasible upon crossover and mutation and therefore require no repair operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures, where the objective function is the maximization of stiffness or the maximization of the first or second eigenfrequency of a plate, all cases having a prescribed material volume constraint.
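The repair-free idea can be sketched with a set-based genome of fixed cardinality, which keeps the volume constraint satisfied by construction. This is a deliberate simplification of the paper's tree-based representation, which additionally maintains structural connectivity:

```python
import random

def crossover(a, b, k):
    """Child keeps the cells the parents agree on, then fills up to k
    cells from the symmetric difference: the volume constraint (exactly
    k filled cells) holds by construction, so no repair is needed."""
    common = a & b
    pool = list(a ^ b)
    random.shuffle(pool)
    return frozenset(list(common) + pool[:k - len(common)])

def mutate(cells, universe):
    """Swap one filled cell for one empty cell: volume is preserved."""
    cells = set(cells)
    cells.remove(random.choice(sorted(cells)))
    empty = [c for c in universe if c not in cells]
    cells.add(random.choice(empty))
    return frozenset(cells)
```

Because every individual produced by these operators already satisfies the material volume constraint, the GA never wastes evaluations on infeasible designs; the paper's tree encoding extends the same principle to connected topologies.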

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of the profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. An appropriate risk measure is also considered. The proposed approach is applied to a realistic case study based on a wind farm in Portugal. Finally, conclusions are duly drawn.
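The scenario modelling can be sketched as follows. The CVaR tail average is used here purely as an illustration of a risk measure on the profit distribution, since the abstract does not specify which measure the paper adopts:

```python
def expected_profit(scenarios):
    """scenarios: list of (probability, profit) pairs; probabilities sum to 1."""
    return sum(p * v for p, v in scenarios)

def cvar(scenarios, alpha=0.95):
    """Conditional value at risk of profit: the probability-weighted
    average profit over the worst (1 - alpha) tail of the distribution."""
    tail = 1.0 - alpha
    acc, remaining = 0.0, tail
    for p, v in sorted(scenarios, key=lambda s: s[1]):  # worst scenarios first
        take = min(p, remaining)
        acc += take * v
        remaining -= take
        if remaining <= 1e-12:
            break
    return acc / tail

# Hypothetical scenario set: (probability, profit).
scenarios = [(0.05, -100.0), (0.15, 10.0), (0.8, 50.0)]
print(expected_profit(scenarios))   # mean over scenarios
print(cvar(scenarios, alpha=0.95))  # average of the worst 5% tail
```

A risk-averse producer would then maximize a weighted combination of expected profit and CVaR when choosing its market offers across the scenario tree.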

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Nuclear Medicine.

Relevance:

20.00%

Publisher:

Abstract:

As is well known, competitive electricity markets require new computing tools for power companies operating in retail markets in order to enhance the management of their energy resources. Over the last few years, renewable penetration in micro-generation has increased and begun to co-exist with the existing power generation, giving rise to a new type of consumer. This paper develops a methodology for the management of aggregators. The aggregator establishes bilateral contracts with its clients in which the conditions for purchasing and selling energy are negotiated, not only in terms of prices but also of other conditions that allow more flexibility in the way generation and consumption are addressed. The aggregator agent needs a decision-support tool to compose and select its customers' portfolio in an optimal way, for a given level of profitability and risk.
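Portfolio composition under a profitability/risk trade-off can be sketched with a toy mean-variance score. The scoring rule, the independence assumption (contract risks simply add in variance), and all names below are illustrative, not the paper's model:

```python
from itertools import combinations

def portfolio_score(contracts, risk_aversion):
    """Expected margin minus a risk penalty; contract risks are assumed
    independent, so variances add (a simplifying assumption)."""
    mean = sum(c["margin"] for c in contracts)
    var = sum(c["std"] ** 2 for c in contracts)
    return mean - risk_aversion * var

def best_portfolio(candidates, risk_aversion):
    """Exhaustive search over client subsets (fine for a handful of clients)."""
    return max(
        (subset
         for r in range(1, len(candidates) + 1)
         for subset in combinations(candidates, r)),
        key=lambda s: portfolio_score(s, risk_aversion),
    )

# Hypothetical client contracts: expected margin and margin dispersion.
candidates = [
    {"name": "a", "margin": 10, "std": 1},
    {"name": "b", "margin": 5, "std": 4},
    {"name": "c", "margin": 2, "std": 1},
]
print([c["name"] for c in best_portfolio(candidates, risk_aversion=1.0)])
```

Raising `risk_aversion` pushes the selection toward low-dispersion clients, which is the profitability-versus-risk dial the abstract refers to; a real aggregator would replace the exhaustive search with an optimization model.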

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a solution to a highly constrained and non-convex economic dispatch (ED) problem using a meta-heuristic technique named Sensing Cloud Optimization (SCO) is presented. The proposed meta-heuristic is based on a cloud of particles whose central point represents the objective function value, while the remaining particles act as sensors that "fill" the search space and "guide" the central particle so that it moves in the best direction. To demonstrate its performance, a case study with multi-fuel units and valve-point effects is presented.
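The abstract does not give SCO's exact update rules, so the following is only a generic cloud-of-sensors sketch of the idea: sensor particles sample the neighbourhood of the central particle, which moves toward improving samples while the cloud contracts. It is a plain stochastic direct search, not the authors' algorithm:

```python
import random

def cloud_search(f, x0, radius=1.0, sensors=20, iters=200, shrink=0.95):
    """Sample 'sensor' points uniformly in a box around the central
    particle, move the centre to any sensor that improves the objective,
    and shrink the sampling radius over time."""
    centre, best = x0[:], f(x0)
    for _ in range(iters):
        for _ in range(sensors):
            probe = [c + random.uniform(-radius, radius) for c in centre]
            v = f(probe)
            if v < best:
                centre, best = probe, v
        radius *= shrink
    return centre, best

random.seed(0)
# Smooth toy objective; a real ED objective with valve-point effects is
# non-convex and non-smooth, which is why derivative-free methods are used.
sol, val = cloud_search(lambda x: sum(xi * xi for xi in x), [3.0, -2.0])
print(sol, val)
```

For the ED problem itself, each `f` evaluation would also have to fold in the generation limits and power-balance constraints, typically via penalty terms.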

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: Screening programs to detect visual abnormalities in children vary among countries. The aim of this study is to describe experts' perception of best-practice guidelines and a competency framework for visual screening in children. METHODS: A qualitative focus-group technique was applied during the Portuguese national orthoptic congress to obtain the perception of an expert panel of 5 orthoptists and 2 ophthalmologists with experience in visual screening for children (mean age 53.43 years, SD ± 9.40). The panel received in advance a script describing three tuning competency dimensions (instrumental, systemic, and interpersonal) for visual screening. The session was recorded in video and audio. Qualitative data were analyzed using a categorical technique. RESULTS: According to the experts' views, six tests (35.29%) have to be included in a visual screening: distance visual acuity test, cover test, bi-prism or 4/6(Δ) prism, fusion, ocular movements, and refraction. Screening should be performed according to the child's age, before and after 3 years of age (17.65%). The expert panel highlighted the influence of professional experience on the application of a screening protocol (23.53%) and showed concern about the control of false negatives (23.53%). Instrumental competencies were the most cited (54.09%), followed by interpersonal (29.51%) and systemic (16.4%). CONCLUSIONS: Orthoptists should have professional experience before starting to apply a screening protocol. False-negative results are a concern that has to be investigated more thoroughly. The proposed framework focuses on the core competencies highlighted by the expert panel. Competency programs could be important to develop better screening programs.