870 results for Combines
Abstract:
Dissertation presented for obtaining the Master's degree in Informatics by the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Ovarian cancer is the gynecological neoplasm with the highest mortality rate among the female population, despite the considerable progress made in its treatment. When diagnosed early, the survival rate is quite high, but primary ovarian cancer is asymptomatic and is most often diagnosed only at an advanced stage of the disease, resulting in an unfavorable prognosis. The lack of specificity of the therapeutic modalities, combined with the heterogeneity of cancer cells, has limited ovarian cancer therapy. Some of the most promising clinical advances in cancer treatment are therapies directed at specific targets, especially proteins overexpressed in various types of epithelial cells. In this context, radioimmunotherapy with monoclonal antibodies has been explored in epithelial carcinomas, which account for about 90% of ovarian neoplasms. This therapeutic strategy, which takes advantage of the combined action of the radiotoxicity associated with therapeutic radionuclides and the cytotoxic effects of the antibody, may constitute an alternative to conventional therapies, enhancing treatment efficacy. This article addresses aspects related to ovarian cancer, namely its diagnosis and therapy. It also briefly reviews some clinical studies in which the efficacy of monoclonal antibodies labeled with therapeutic radionuclides was evaluated in ovarian cancer.
Abstract:
In a real-world multiagent system, where the agents are faced with partial, incomplete, and intrinsically dynamic knowledge, conflicts are inevitable. Frequently, different agents have goals or beliefs that cannot hold simultaneously. Conflict-resolution methodologies have to be adopted to overcome such undesirable occurrences. In this paper we investigate the application of distributed belief revision techniques as the support for conflict resolution in the analysis of the validity of the candidate beams to be produced in the CERN particle accelerators. This CERN multiagent system contains a higher-hierarchy agent, the Specialist agent, which makes use of meta-knowledge (on how the conflicting beliefs have been produced by the other agents) in order to detect which beliefs should be abandoned. Upon solving a conflict, the Specialist instructs the involved agents to revise their beliefs accordingly. Conflicts in the problem domain are mapped into conflicting beliefs of the distributed belief revision system, where they can be handled by proven formal methods. This technique builds on well-established concepts and combines them in a new way to solve important problems. We find this approach generally applicable in several domains.
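As an illustration of the conflict-resolution step described above, here is a minimal Python sketch, with all names hypothetical and the formal distributed belief-revision machinery reduced to a single credibility score per belief, of a Specialist-style agent deciding which of two contradictory beliefs should be abandoned:

    from dataclasses import dataclass

    @dataclass
    class Belief:
        agent: str          # agent that produced the belief
        proposition: str    # e.g. "beam_valid"
        value: bool
        credibility: float  # meta-knowledge: strength of the derivation

    def resolve_conflicts(beliefs):
        """Return the beliefs the Specialist would ask agents to abandon."""
        by_prop = {}
        for b in beliefs:
            by_prop.setdefault(b.proposition, []).append(b)
        to_abandon = []
        for group in by_prop.values():
            if len({b.value for b in group}) > 1:    # contradictory values
                keep = max(group, key=lambda b: b.credibility)
                to_abandon += [b for b in group if b is not keep]
        return to_abandon

    conflict = [Belief("A1", "beam_valid", True, 0.9),
                Belief("A2", "beam_valid", False, 0.4)]
    print([b.agent for b in resolve_conflicts(conflict)])  # ['A2']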
Abstract:
In this article, we present the first study on probabilistic tsunami hazard assessment for the Northeast (NE) Atlantic region related to earthquake sources. The methodology combines probabilistic seismic hazard assessment, tsunami numerical modeling, and statistical approaches. We consider three main tsunamigenic areas, namely the Southwest Iberian Margin, the Gloria, and the Caribbean. For each tsunamigenic zone, we derive the annual recurrence rate for each magnitude range, from Mw 8.0 up to Mw 9.0 at a regular interval, using the Bayesian method, which incorporates seismic information from historical and instrumental catalogs. A numerical code solving the shallow water equations is employed to simulate the tsunami propagation and compute nearshore wave heights. The probability of exceeding a specific tsunami hazard level during a given time period is calculated using the Poisson distribution. The results are presented in terms of the probability of exceedance of a given tsunami amplitude for 100- and 500-year return periods. The hazard level varies along the NE Atlantic coast, being maximum along the northern segment of the Morocco Atlantic coast, the southern Portuguese coast, and the Spanish coast of the Gulf of Cadiz. We find that the probability that a maximum wave height exceeds 1 m somewhere in the NE Atlantic region reaches 60% and 100% for 100- and 500-year return periods, respectively. These probability values decrease to about 15% and 50%, respectively, when considering the exceedance threshold of 5 m for the same return periods.
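The exceedance probabilities quoted above follow directly from the Poisson occurrence model. A short Python sketch of that final step, using a purely illustrative annual rate rather than a value from the study:

    import math

    def poisson_exceedance(annual_rate, years):
        """Probability of at least one exceedance within `years`,
        assuming Poissonian occurrence at the given annual rate."""
        return 1.0 - math.exp(-annual_rate * years)

    # Illustrative annual rate of tsunamis whose maximum wave height
    # exceeds 1 m somewhere in the NE Atlantic (not the paper's value):
    rate = 0.009
    for T in (100, 500):
        print(T, f"{poisson_exceedance(rate, T):.0%}")  # ~59%, ~99%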
Abstract:
An adaptive antenna array combines the signal of each element, using some constraints to produce the radiation pattern of the antenna while maximizing the performance of the system. Direction-of-arrival (DOA) algorithms are applied to determine the directions of impinging signals, whereas beamforming techniques are employed to determine the appropriate weights for the array elements to create the desired pattern. In this paper, a detailed analysis of both categories of algorithms is made for the case of a planar antenna array. Several simulation results show that it is possible to point an antenna array in a desired direction based on the DOA estimation and the beamforming algorithms. A comparison of the algorithms' performance in terms of runtime and accuracy is made; both characteristics depend on the SNR of the incoming signal.
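As a hedged illustration of the beamforming side of this pipeline, the sketch below computes delay-and-sum weights that steer a uniform linear array (a one-dimensional simplification of the planar arrays studied here) toward an estimated DOA; the element count and half-wavelength spacing are arbitrary choices:

    import numpy as np

    def steering_vector(n_elements, d_over_lambda, theta_deg):
        """Steering vector of a uniform linear array with spacing d/lambda."""
        n = np.arange(n_elements)
        phase = 2j * np.pi * d_over_lambda * n * np.sin(np.radians(theta_deg))
        return np.exp(phase)

    def delay_and_sum_weights(n_elements, d_over_lambda, doa_deg):
        """Weights that point the main lobe at the estimated DOA."""
        a = steering_vector(n_elements, d_over_lambda, doa_deg)
        return a.conj() / n_elements

    w = delay_and_sum_weights(8, 0.5, doa_deg=30.0)
    print(abs(w @ steering_vector(8, 0.5, 30.0)))   # ~1.0 toward the DOA
    print(abs(w @ steering_vector(8, 0.5, -10.0)))  # < 1 elsewhere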
Abstract:
The need for better adaptation of networks to transported flows has led to research on new approaches such as content-aware networks and network-aware applications. In parallel, recent developments in multimedia and content-oriented services and applications, such as IPTV, video streaming, video on demand, and Internet TV, have reinforced interest in multicast technologies. IP multicast has not been widely deployed due to interdomain and QoS support problems; therefore, alternative solutions have been investigated. This article proposes a management-driven hybrid multicast solution that is multi-domain and media-oriented, and combines overlay multicast, IP multicast, and P2P. The architecture is developed in a content-aware network and network-aware application environment, based on light network virtualization. The multicast trees can be seen as parallel virtual content-aware networks, spanning a single or multiple IP domains, customized to the type of content to be transported while fulfilling the quality-of-service requirements of the service provider.
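A toy sketch of the per-domain decision such a hybrid implies: use native IP multicast where the domain supports it and fall back to P2P or overlay delivery otherwise. Field names and the peer threshold are invented for illustration only:

    def pick_transport(domain):
        """Choose the delivery mechanism for one domain of the tree."""
        if domain.get("ip_multicast"):
            return "ip-multicast"
        return "p2p" if domain.get("peers", 0) > 50 else "overlay"

    tree = [pick_transport(d) for d in (
        {"name": "D1", "ip_multicast": True},
        {"name": "D2", "peers": 120},
        {"name": "D3"},
    )]
    print(tree)  # ['ip-multicast', 'p2p', 'overlay']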
Abstract:
The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach to energy resource management. Energy resource management aims to obtain the optimal scheduling of the available resources, considering distributed generators, storage units, demand response, and EVs. The large number of resources makes the problem more complex: reaching the optimal solution can take several hours, whereas day-ahead scheduling requires a fast answer. Therefore, it is necessary to use adequate optimization techniques to determine the best solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines simulated annealing (SA) and ant colony optimization (ACO) techniques. The case study concerns different EV penetration levels. Comparisons with a previous SA approach and a deterministic technique are also presented. For the 2000-EV scenario, the proposed hybrid approach found a better solution than the previous SA version, resulting in a cost reduction of 1.94%. For this scenario, the proposed approach is approximately 94 times faster than the deterministic approach.
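The sketch below illustrates, under heavy simplification, how an SA acceptance rule can govern candidates produced by an ACO-style constructor. It omits pheromone updates and all problem-specific modeling, so it shows the shape of such a hybrid rather than the paper's actual method:

    import math
    import random

    def sa_accept(cost_new, cost_old, temperature):
        """Metropolis acceptance rule used by simulated annealing."""
        if cost_new <= cost_old:
            return True
        return random.random() < math.exp((cost_old - cost_new) / temperature)

    def hybrid_sa_aco(propose, cost, t0=100.0, alpha=0.95, iters=1000):
        """Toy hybrid: `propose` plays the ACO constructor; SA decides
        acceptance while the temperature cools geometrically."""
        current = propose()
        t = t0
        for _ in range(iters):
            candidate = propose()
            if sa_accept(cost(candidate), cost(current), t):
                current = candidate
            t *= alpha
        return current

    best = hybrid_sa_aco(lambda: [random.uniform(-1, 1) for _ in range(5)],
                         lambda x: sum(v * v for v in x))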
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. This paper presents a methodology to provide decision support to electricity market negotiating players. This model allows integrating different strategic approaches for electricity market negotiations and choosing the most appropriate one at each time, for each different negotiation context. The methodology is integrated into ALBidS (Adaptive Learning strategic Bidding System), a multiagent system that provides decision support to MASCEM's negotiating agents so that they can properly achieve their goals. ALBidS uses artificial intelligence methodologies and data analysis algorithms to provide effective adaptive learning capabilities to such negotiating entities. The main contribution is a methodology that combines several distinct strategies to build action proposals, so that the best one can be chosen at each time, depending on the context and simulation circumstances. The choosing process includes reinforcement learning algorithms, a mechanism for analyzing negotiation contexts, a mechanism for managing the system's efficiency/effectiveness balance, and a mechanism for defining competitor players' profiles.
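As a rough illustration of the choosing process, the sketch below keeps a running reward estimate per (context, strategy) pair and selects epsilon-greedily; ALBidS's actual reinforcement learning algorithms and context analysis mechanisms are considerably richer:

    import random

    class StrategySelector:
        """Pick the historically best strategy for a context, with
        occasional exploration (a stand-in for the RL-based choice)."""
        def __init__(self, strategies, epsilon=0.1):
            self.strategies = strategies
            self.epsilon = epsilon
            self.q = {}  # (context, strategy) -> running mean reward
            self.n = {}  # (context, strategy) -> observation count

        def choose(self, context):
            if random.random() < self.epsilon:
                return random.choice(self.strategies)
            return max(self.strategies,
                       key=lambda s: self.q.get((context, s), 0.0))

        def update(self, context, strategy, reward):
            key = (context, strategy)
            self.n[key] = self.n.get(key, 0) + 1
            old = self.q.get(key, 0.0)
            self.q[key] = old + (reward - old) / self.n[key]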
Abstract:
The proper disposal of the several types of waste produced in industrial activities increases production costs. As a consequence, it is common to develop strategies to reuse these wastes in the same or different processes, or to transform them for use in other processes. This work combines the need for new nanomaterial synthesis methods with the reduction of production costs by using citrus juice wastes (orange, lime, lemon, and mandarin) to produce a new added-value product: green zero-valent iron nanoparticles (nZVIs), which can be used in several applications, including environmental remediation. The results indicate that extracts of the tested fruit wastes (peel, albedo, and pulp fractions) can be used to produce nZVIs, showing that these wastes can be an added-value product. The resulting nZVIs had sizes ranging from 3 up to 300 nm and distinct reactivities (pulp > peel > albedo extracts). None of the studied nanoparticles presented a significant agglomeration/settling tendency when compared to similar nanoparticles, indicating that they remain in suspension and retain their reactivity.
Abstract:
Cellulosic lyotropic liquid crystals have long been regarded as potential materials for producing fibers competitive with spider silk or Kevlar, yet the processing of high-modulus materials from cellulose-based precursors has been hampered by their complex rheological behavior. In this work, using the Rheo-NMR technique, which combines deuterium NMR with rheology, we investigate the high shear rate regimes that may be of interest to the industrial processing of these materials. Whereas the low shear rate regimes were already investigated with this technique in different works [1-4], the high shear rate range still lacks a detailed study. This work focuses on the orientational order in the system, both under shear and during the subsequent relaxation after shear cessation, through the analysis of deuterium spectra of the deuterated solvent water. At the analyzed shear rates, the cholesteric order is suppressed and a flow-aligned nematic is observed, which, at the higher shear rates, develops after a certain time periodic perturbations that transiently annihilate the order in the system. During relaxation, the flow-aligned nematic starts losing order due to the onset of the cholesteric helices, leading to a period of very low order where cholesteric helices with different orientations form from the aligned nematic, followed in the final stage by an increase in order at long relaxation times, corresponding to the development of aligned cholesteric domains. This study sheds light on the complex rheological behavior of chiral nematic cellulose-based systems and opens ways to improve their processing.
Abstract:
Hard real-time multiprocessor scheduling has seen, in recent years, the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling to achieve efficient utilization of the system's processing resources with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based "task-splitting" scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, until now no unified schedulability theory existed for such algorithms; each one was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a more unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). The new theory is based on exact schedulability tests, thus also overcoming many sources of pessimism in existing analyses. In turn, since schedulability testing guides the task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and in response to the fact that many unrealistic assumptions present in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identified and modelled into the new analysis all overheads incurred by the algorithms in consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
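For intuition about how schedulability testing drives task assignment, the toy sketch below performs plain first-fit assignment against a utilization bound; it is not the exact S-EKG/NPS-F test, and the tasks that fail to fit here are exactly those a semi-partitioned scheme would split across processors:

    def first_fit_assign(task_utils, n_processors, bound=1.0):
        """Place each task on the first processor whose total
        utilization stays within `bound`; report the leftovers."""
        procs = [0.0] * n_processors
        unassigned = []
        for u in task_utils:
            for i in range(n_processors):
                if procs[i] + u <= bound:
                    procs[i] += u
                    break
            else:
                unassigned.append(u)  # would be "split" under NPS-F
        return procs, unassigned

    print(first_fit_assign([0.6, 0.5, 0.5, 0.4], 2))  # ([1.0, 1.0], [])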
Abstract:
This paper presents a new parallel implementation of a previously proposed hyperspectral coded aperture (HYCA) algorithm for compressive sensing on graphics processing units (GPUs). The HYCA method combines the ideas of spectral unmixing and compressive sensing, exploiting the high spatial correlation that can be observed in the data and the generally low number of endmembers needed to explain the data. The proposed implementation exploits the GPU architecture at a low level, thus taking full advantage of the computational power of GPUs by using shared memory and coalesced memory accesses. The proposed algorithm is evaluated not only in terms of reconstruction error but also in terms of computational performance, using two different GPU architectures by NVIDIA: the GeForce GTX 590 and the GeForce GTX TITAN. Experimental results using real data reveal significant speedups with regard to the serial implementation.
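A toy numpy sketch of the idea HYCA builds on, recovering compressive measurements through a low-dimensional unmixing model; the dimensions, random signatures, and noiseless least-squares recovery are all illustrative, and nothing here reflects the GPU kernels themselves:

    import numpy as np

    rng = np.random.default_rng(0)
    bands, measures, pixels = 50, 10, 1000     # illustrative sizes
    E = rng.random((bands, 3))                 # assumed endmember signatures
    A = rng.dirichlet(np.ones(3), pixels).T    # abundances, sum to one
    X = E @ A                                  # hyperspectral data (low rank)
    H = rng.standard_normal((measures, bands)) / np.sqrt(measures)
    Y = H @ X                                  # compressive measurements

    # Unmixing-based recovery: estimate abundances in the compressed
    # domain, then back-project through the known endmembers.
    A_hat, *_ = np.linalg.lstsq(H @ E, Y, rcond=None)
    X_hat = E @ A_hat
    print(f"RMSE: {np.sqrt(np.mean((X - X_hat) ** 2)):.2e}")  # ~0 (noiseless)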
Abstract:
Nowadays, the remarkable growth of the mobile device market has led to the need for location-aware applications. However, a person's location is sometimes difficult to obtain, since most of these devices only have a GPS (Global Positioning System) chip to retrieve location. To overcome this limitation and to provide location everywhere (even where a structured environment doesn't exist), a wearable inertial navigation system is proposed, which is a convenient way to track people in situations where other localization systems fail. The system combines pedestrian dead reckoning with GPS, using widely available, low-cost, and low-power hardware components. The system's innovation is the information fusion and the use of probabilistic methods to learn a person's gait behavior in order to correct, in real time, the drift errors produced by the sensors.
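A minimal sketch of the fusion idea, assuming a simple complementary correction of the dead-reckoned position toward intermittent GPS fixes; the probabilistic gait learning described above is omitted here:

    def fuse_position(pdr_pos, gps_pos, gps_valid, gain=0.02):
        """Pull the dead-reckoned position toward the GPS fix
        (when one is available) to bound accumulated drift."""
        if not gps_valid:
            return pdr_pos
        x, y = pdr_pos
        gx, gy = gps_pos
        return (x + gain * (gx - x), y + gain * (gy - y))

    position = (0.0, 0.0)
    for step in range(100):
        position = (position[0] + 0.7, position[1])  # stride-length update
        position = fuse_position(position, (step * 0.68, 0.05),
                                 gps_valid=(step % 10 == 0))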
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for obtaining the Master's degree in Informatics Engineering
Abstract:
Enterprise and Work Innovation Studies, 6, IET, pp. 9-51