998 results for "software evolution"


Relevance:

30.00%

Publisher:

Abstract:

The foreseen evolution of chip architectures towards higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler ensure schedulability. To this end, the contention management policy should be aware of on-line scheduling information.
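The kind of policy the abstract argues for can be pictured with a small sketch. The code below is a hypothetical deadline-aware contention manager (names and structure are ours, not the paper's): when two transactions conflict, the one whose task has the later absolute deadline is aborted, so retries are steered away from urgent tasks.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    task_id: str
    deadline: float   # absolute deadline of the owning real-time task
    retries: int = 0  # how often this transaction has been aborted

def resolve_conflict(attacker: Transaction, holder: Transaction) -> Transaction:
    """Deadline-aware contention management: the transaction whose task has
    the earlier absolute deadline wins; the loser is aborted and retried.
    Returns the losing transaction."""
    loser = attacker if attacker.deadline > holder.deadline else holder
    loser.retries += 1
    return loser
```

A real-time scheduler could feed current task deadlines into such a resolver at conflict time, which is the kind of on-line scheduling awareness the abstract calls for.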

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor in Informatics from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance:

30.00%

Publisher:

Abstract:

The acquisition of a Myocardial Perfusion Image (MPI) is of great importance for the diagnosis of coronary artery disease, since it makes it possible to evaluate which areas of the heart are not being properly perfused, at rest and under stress. This exam is greatly influenced by photon attenuation, which creates image artifacts and affects quantification. The acquisition of a Computerized Tomography (CT) image provides anatomical images that can be used to perform high-quality attenuation correction of the radiopharmaceutical distribution in the MPI image. Studies show that when hybrid imaging is used to diagnose coronary artery disease, specificity increases when evaluating the perfusion of the right coronary artery (RCA). Using an iterative reconstruction algorithm with resolution-recovery software, which balances image quality, administered activity and scanning time, we aim to evaluate the influence of attenuation correction on the MPI image and its outcome on perfusion quantification and image quality.

Relevance:

30.00%

Publisher:

Abstract:

Work presented within the scope of the Master's programme in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.

Relevance:

30.00%

Publisher:

Abstract:

Management and invoicing applications are an indispensable presence nowadays. Starting out as text-mode "MS-DOS" applications, they followed the evolution of operating systems, naturally adopting a graphical environment. While a few years ago only companies with a significant business volume owned invoicing software, it was gradually adopted by more and more companies and small businesses. The legislative changes introduced since 2011 led to widespread adoption by small and micro enterprises. The management-application market is saturated by the large national software producers: Primavera, Sage, etc. These applications, having been built for SMEs (small and medium enterprises) and even large companies, are excessively complex and costly for very small and micro enterprises. The business model of these software producers is primarily the sale of licences and maintenance contracts, in some cases through networks of agents. The goal of this project was to develop a low-cost, simple, cross-platform invoicing application, to be marketed on a rental basis to small and micro enterprises.

Relevance:

30.00%

Publisher:

Abstract:

A brain-computer interface (BCI) is simply a device that reads and analyses brain waves and converts them into actions on a computer. With the evolution of BCIs and their availability to the general public, it became possible to use BCIs for entertainment. In that context, this thesis presents a study of brain-computer interfaces: what they are, which types of BCI exist, their use for entertainment, their limitations, and the future of this kind of interface. An entertainment application controlled by a BCI (the Emotiv EPOC) was also built, consisting of a Pong-like game and a music player. Through the BCI, the music player classifies and recommends songs to the user. The thesis concludes that BCIs can be used for entertainment (games and content recommendation), although for games the traditional control devices (mouse and keyboard) were found to still offer far greater precision.

Relevance:

30.00%

Publisher:

Abstract:

The corporate world is becoming more and more competitive. This leads organisations to adapt to this reality by adopting more efficient processes, which result in lower costs as well as higher product quality. One of these processes consists in making proposals to clients, which necessarily include a cost estimation of the project. This estimation is the main focus of this project. In particular, one of the goals is to evaluate which estimation models best fit the Altran Portugal software factory, the organisation where the fieldwork of this thesis was carried out. There is no broad agreement about which type of estimation model is most suitable for software projects. In contexts where there is plenty of objective information available to be used as input to an estimation model, model-based methods usually yield better results than expert judgement. More frequently, however, this volume and quality of information is not available, which has a negative impact on the performance of model-based methods and favours the use of expert judgement. In practice, most organisations use expert judgement, making themselves dependent on the expert. A common problem is that the performance of the expert's estimation depends on their previous experience with similar projects. This means that when new types of projects arrive, the estimates will have unpredictable accuracy. Moreover, different experts will make different estimates, based on their individual experience. As a result, the company does not directly accumulate a continuously growing body of knowledge about how estimates should be carried out. Estimation models depend on the input information collected from previous projects, the size of the project database and the resources available. Altran currently does not store the input information from previous projects in a systematic way; it has a small project database and a team of experts.
Our work targets companies that operate in similar contexts. We start by gathering information from the organisation in order to identify which estimation approaches can be applied given the organisation's context. A gap analysis is used to understand what type of information the company would have to collect so that other approaches would become available. Based on our assessment, expert judgement is, in our opinion, the most adequate approach for Altran Portugal in the current context. We analysed past development and evolution projects from Altran Portugal and assessed their estimates. This resulted in the identification of common estimation deviations, errors and patterns, which led to the proposal of metrics that help estimators produce estimates by leveraging quantitative and qualitative information from past projects in a convenient way. This dissertation aims to contribute to more realistic estimates by identifying shortcomings in the current estimation process and by supporting the self-improvement of the process, gathering as much relevant information as possible from each finished project.
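Assessing the accuracy of past estimates, as described above, is commonly done with relative-error metrics. A standard choice (used here only as an illustration; the abstract does not name the thesis's metrics) is the Mean Magnitude of Relative Error:

```python
def mmre(actuals, estimates):
    """Mean Magnitude of Relative Error over a set of finished projects:
    the mean of |actual - estimate| / actual. Lower is better."""
    if len(actuals) != len(estimates) or not actuals:
        raise ValueError("need two equally sized, non-empty sequences")
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)
```

For example, projects that actually took 100 and 200 person-days, estimated at 110 and 180, give an MMRE of 0.10, i.e. a 10% average relative error.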

Relevance:

30.00%

Publisher:

Abstract:

Master's dissertation in Systems Engineering.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes and validates a model-driven software engineering technique for spreadsheets. The technique that we envision builds on the embedding of spreadsheet models under a widely used spreadsheet system. This means that we enable the creation and evolution of spreadsheet models under a spreadsheet system. More precisely, we embed ClassSheets, a visual language with a syntax similar to the one offered by common spreadsheets, that was created with the aim of specifying spreadsheets. Our embedding allows models and their conforming instances to be developed under the same environment. In practice, this convenient environment enhances evolution steps at the model level while the corresponding instance is automatically co-evolved. Finally, we have designed and conducted an empirical study with human users in order to assess our technique in production environments. The results of this study are promising and suggest that productivity gains are realizable under our model-driven spreadsheet development setting.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Hypertrophic Cardiomyopathy (HCM) is a genetically heterogeneous disease. One specific mutation in the MYBPC3 gene is highly prevalent in the centre-east of France, giving an opportunity to define the clinical profile of this specific mutation. METHODS: HCM probands were screened for mutations in the MYH7, MYBPC3, TNNT2 and TNNI3 genes. Carriers of the MYBPC3 IVS20-2A>G mutation were genotyped with 8 microsatellites flanking this gene. The age of this MYBPC3 mutation was inferred with the software ESTIAGE. The age at first symptom, diagnosis, first complication, first severe complication and the rate of sudden death were compared between carriers of the IVS20-2 mutation (group A) and carriers of all other mutations (group B) using time-to-event curves and the log-rank test. RESULTS: Out of 107 HCM probands, 45 had a single heterozygous mutation in one of the 4 tested sarcomeric genes, including 9 patients with the MYBPC3 IVS20-2A>G mutation. The IVS20-2 mutation in these 9 patients and their 25 mutation-carrier relatives was embedded in a common haplotype defined after genotyping 4 polymorphic markers on each side of the MYBPC3 gene. This result supports the hypothesis of a common ancestor. Furthermore, we estimated that the mutation occurred about 47 generations ago, approximately in the 10th century. We then compared the clinical profile of the IVS20-2 mutation carriers (group A) and the carriers of all other mutations (group B). Age at onset of symptoms was similar in the 34 group A cases and the 73 group B cases, but group A cases were diagnosed on average 15 years later (log-rank test p = 0.022). Age at first complication and first severe complication was delayed in group A vs group B cases, but the prevalence of sudden death and age at death were similar in both groups.
CONCLUSION: A founder mutation arising in about the 10th century in the MYBPC3 gene accounts for 8.4% of all HCM in centre-east France and results in a cardiomyopathy that starts late and evolves slowly, but with an apparent risk of sudden death similar to that of other sarcomeric mutations.

Relevance:

30.00%

Publisher:

Abstract:

Percutaneous transluminal renal angioplasty (PTRA) is an invasive technique that is costly and involves the risk of complications and renal failure. The ability of PTRA to reduce the administration of antihypertensive drugs has been demonstrated. A potentially greater benefit, which nevertheless remains to be proven, is the deferral of the need for chronic dialysis. The aim of the study (ANPARIA) was to assess the appropriateness of PTRA in terms of its impact on the evolution of renal function. A standardized expert-panel method was used to assess the appropriateness of medical treatment alone or medical treatment with revascularization in various clinical situations. The choice of revascularization by either PTRA or surgery was examined for each clinical situation. Analysis was based on a detailed literature review and on systematically elicited expert opinion, obtained during a two-round modified Delphi process. The study provides detailed responses on the appropriateness of PTRA for 1848 distinct clinical scenarios. Depending on the major clinical presentation, appropriateness of revascularization varied from 32% to 75% for individual scenarios (overall 48%). Uncertainty as to revascularization was 41% overall. When revascularization was appropriate, PTRA was favored over surgery in 94% of the scenarios, except in certain cases of aortic atheroma where surgery was the preferred choice. Kidney size >7 cm, absence of coexisting disease, acute renal failure, a high degree of stenosis (≥70%), and absence of multiple arteries were identified as predictive variables of favorable appropriateness ratings. Situations such as cardiac failure with pulmonary edema or acute thrombosis of the renal artery were defined as indications for PTRA. This study identified clinical situations in which PTRA or surgery is appropriate for renal artery disease. We built a decision tree which can be used via the Internet: the ANPARIA software (http://www.chu-clermontferrand.fr/anparia/).
In numerous clinical situations uncertainty remains as to whether PTRA prevents deterioration of renal function.

Relevance:

30.00%

Publisher:

Abstract:

Like numerous torrents in mountainous regions, the Illgraben creek (canton of Wallis, SW Switzerland) produces several debris flows almost every year. The total area of the active catchment is only 4.7 km², but large events ranging from 50'000 to 400'000 m³ are common (Zimmermann 2000). Consequently, the pathway of the main channel often changes suddenly. A single event can, for instance, fill the whole river bed and dig new several-metres-deep channels somewhere else (Bardou et al. 2003). Quantifying both the rhythm and the magnitude of these changes is very important to assess the variability of the bed's cross section and long profile. These parameters are indispensable for numerical modelling, as they should be considered as initial conditions. To monitor the channel evolution, an Optech ILRIS 3D terrestrial laser scanner (LIDAR) was used. LIDAR makes it possible to build a complete high-precision 3D model of the channel and its surroundings by scanning it from different viewpoints. The 3D data are processed and interpreted with the Polyworks software from Innovmetric Software Inc. Sequential 3D models allow for the determination of the variation in the bed's cross section and long profile. These data will afterwards be used to quantify the erosion and the deposition in the torrent reaches. To complete the chronological evolution of the landforms, precise digital terrain models, obtained by high-resolution photogrammetry based on old aerial photographs, will be used. A 500 m long section of the Illgraben channel was scanned on 18 August 2005 and on 7 April 2006. These two data sets make it possible to identify the changes in the channel that occurred during the winter season. An upcoming scanning campaign in September 2006 will allow for the determination of the changes during this summer. Preliminary results show huge variations in the pathway of the Illgraben channel, as well as important vertical and lateral erosion of the river bed.
Here we present the results for a river bank on the left (north-western) flank of the channel (Figure 1). For the August 2005 model, the scans from 3 viewpoints were superposed, whereas the April 2006 3D image was obtained by combining 5 separate scans. The bank was eroded essentially on its left part (up to 6.3 m), where it is hit by the river and the debris flows (Figures 2 and 3). A debris cone has also formed (Figure 3), which suggests that part of the bank erosion is due to shallow landslides. They probably occur when river erosion creates an undercut slope. These geometrical data allow for the monitoring of the alluvial dynamics (i.e. aggradation and degradation) on different time scales and of the influence of debris-flow occurrence on these changes. Finally, the resistance against erosion of the bed's cross section and long profile will be analysed to assess the variability of these two key parameters. This information may then be used in debris-flow simulation.
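The erosion and deposition quantities described above are typically obtained by differencing two gridded elevation models of the same reach. A minimal sketch of this step (our own illustration, not the Polyworks workflow used in the study):

```python
import numpy as np

def erosion_deposition(dem_before, dem_after, cell_area=1.0):
    """Difference two co-registered elevation grids of the same reach and
    return (eroded_volume, deposited_volume), in units of elevation x area.
    Cells that got lower count as erosion; cells that got higher, as deposition."""
    diff = np.asarray(dem_after, dtype=float) - np.asarray(dem_before, dtype=float)
    eroded = -diff[diff < 0].sum() * cell_area
    deposited = diff[diff > 0].sum() * cell_area
    return eroded, deposited
```

Running this on sequential scans of the channel gives the aggradation/degradation balance per monitoring interval.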

Relevance:

30.00%

Publisher:

Abstract:

Introduction. Nowadays, the concept of ontology (an explicit specification of a conceptualisation [Gruber, 1993]) is a key concept in knowledge-based systems in general and in the Semantic Web in particular. However, software agents do not always agree on the same conceptualisation, which justifies the existence of multiple ontologies, even when they address the same domain of discourse. To solve or minimise the interoperability problem between these agents, ontology mapping has proved to be a good solution. Ontology mapping is the process in which semantic relations are specified between entities of the source and target ontologies at the conceptual level; these relations can in turn be used to transform instances based on the source ontology into instances based on the target ontology. Motivation. In a dynamic environment such as the Semantic Web, agents change not only their data but also their structure and semantics (ontologies). This process, called ontology evolution, can be defined as the timely adaptation of an ontology to changes arising in the domain or in the objectives of the ontology itself, together with the consistent management of those changes [Stojanovic, 2004]; it can sometimes leave the mapping document inconsistent. In heterogeneous environments where interoperability between systems depends on the mapping document, the document must reflect the changes made to the ontologies. There are two solutions in this case: (i) generate a new mapping document (a demanding process in terms of time and computational resources) or (ii) adapt the mapping document, correcting invalid semantic relations and creating new relations where necessary (a process less demanding in time and computational resources, but highly dependent on information about the changes made).
The main objective of this work is the analysis, specification and development of the mapping-document evolution process, so that it reflects the changes made during the ontology evolution process. Context. This work was developed in the context of the MAFRA Toolkit. The MAFRA (MApping FRAmework) Toolkit is an application developed at GECAD that allows the declarative specification of semantic relations between entities of a source ontology and a target ontology, using the following main components: Concept Bridge, which represents a semantic relation between a source concept and a target concept; Property Bridge, which represents a semantic relation between one or more source properties and one or more target properties; and Service, applied to the Semantic Bridges (Property and Concept Bridges) to define how source instances are to be transformed into target instances. These concepts are specified in the SBO (Semantic Bridge Ontology) [Silva, 2004]. In the context of this work, a mapping document is an instantiation of the SBO, containing semantic relations between entities of the source ontology and of the target ontology. Mapping evolution process. The mapping evolution process is the process in which the entities of the mapping document are adapted to reflect any changes to the mapped ontologies, preserving as far as possible the semantics of the specified semantic relations. If the source and/or target ontologies change, some semantic relations may become invalid, or new relations may become necessary; the process therefore comprises two sub-processes: (i) correction of semantic relations and (ii) processing of new ontology entities. Processing new ontology entities requires discovering and computing similarities between entities and specifying relations according to the SBO ontology/language.
These phases ("similarity measure" and "semantic bridging") are implemented in the MAFRA Toolkit; the (semi-)automatic ontology mapping process is described in [Silva, 2004]. Correcting invalid SBO entities requires a good knowledge of the SBO ontology/language, of its entities and relations, and of all its constraints, i.e. of its structure and semantics. This procedure consists of (i) identifying the invalid SBO entities, (ii) determining the cause of their invalidity and (iii) correcting them in the best possible way. In this phase, information coming from the ontology evolution process was used with the aim of improving the quality of the whole process. Conclusions. Beyond the mapping evolution process that was developed, one of the most important outcomes of this work was the acquisition of deeper knowledge about ontologies, the ontology evolution process, mapping, etc., broadening horizons of knowledge and raising awareness of the complexity of the problem at hand, which makes it possible to foresee new challenges for the future.
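The first sub-process, correction of semantic relations, can be pictured with a toy example. The sketch below is our simplification (real SBO bridges also carry services and may relate multiple properties): it flags bridges that became invalid because a referenced entity was removed during ontology evolution.

```python
def correct_mapping(bridges, removed_entities):
    """Split a mapping document into still-valid bridges and bridges
    invalidated by ontology evolution (a referenced entity was removed).
    Each bridge is a (source_entity, target_entity) pair."""
    valid, invalid = [], []
    for src, tgt in bridges:
        if src in removed_entities or tgt in removed_entities:
            invalid.append((src, tgt))   # needs correction or removal
        else:
            valid.append((src, tgt))     # semantics preserved
    return valid, invalid
```

The invalid bridges would then be repaired or re-derived using change information from the ontology evolution process, as described above.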

Relevance:

30.00%

Publisher:

Abstract:

Introduction. This paper studies the state of research on Catalan literature between 1976 and 2003 by carrying out a bibliometric and social network analysis of PhD theses defended in Spain. It has a dual aim: to present results of interest for the discipline and to demonstrate the methodological efficacy of scientometric tools in the humanities, a field in which they are often neglected due to the difficulty of gathering data. Method. The analysis was performed on 151 records obtained from the TESEO database of PhD theses. The quantitative analysis made use of the UCINET and Pajek software packages. Authority control was performed on the records. Analysis. Descriptive statistics were used to describe the sample and the distribution of responses to each question. Sex differences on key questions were analysed using the Chi-squared test. Results. The value of the figures obtained is demonstrated. The information obtained on the topics and periods studied in the theses, and on the actors involved (doctoral students, thesis supervisors and members of defence committees), provides important insights into the mechanisms of humanities disciplines. The main research tendencies in Catalan literature are identified. It is observed that membership of the thesis defence committees follows Lotka's Law. Conclusions. Bibliometric analysis and social network analysis may be especially useful in the humanities and in other fields which lack scientometric data in comparison with the experimental sciences.
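Lotka's Law, mentioned in the results, predicts that the number of individuals appearing exactly n times is roughly proportional to 1/n². A sketch of how such a check could be run on committee-membership data (illustrative only, not the paper's actual computation):

```python
from collections import Counter

def lotka_distribution(memberships):
    """Given a list of names (one entry per committee seat occupied),
    return a map n -> number of people who sat on exactly n committees."""
    per_person = Counter(memberships)
    return Counter(per_person.values())

def lotka_expected(singles, n):
    """Expected count of people with n appearances under Lotka's Law,
    given the count of people with a single appearance."""
    return singles / n ** 2
```

Comparing the observed distribution against `lotka_expected` for each n is the usual way to judge the fit.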

Relevance:

30.00%

Publisher:

Abstract:

The application of adaptive antenna techniques to fixed-architecture base stations has been shown to offer wide-ranging benefits, including interference rejection capabilities and increased coverage and spectral efficiency. Unfortunately, the actual implementation of these techniques in mobile communication scenarios has traditionally been held back by two fundamental reasons. On one hand, the lack of flexibility of current transceiver architectures does not allow for the introduction of advanced add-on functionalities. On the other hand, the often oversimplified models for the spatio-temporal characteristics of the radio communications channel generally give rise to performance predictions that are, in practice, too optimistic. The advent of software radio architectures represents a big step toward the introduction of advanced receive/transmit capabilities. Thanks to their inherent flexibility and robustness, software radio architectures are the appropriate enabling technology for the implementation of array processing techniques. Moreover, given the exponential progression of communication standards in coexistence and their constant evolution, software reconfigurability will probably soon become the only cost-efficient alternative for transceiver upgrades. This article analyzes the requirements for the introduction of software radio techniques and array processing architectures in multistandard scenarios. It basically summarizes the conclusions and results obtained within the ACTS project SUNBEAM, proposing algorithms and analyzing the feasibility of implementation of innovative and software-reconfigurable array processing architectures in multistandard settings.
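The array processing this article builds on can be illustrated with the simplest such technique, a narrowband delay-and-sum beamformer (our illustrative sketch; the SUNBEAM algorithms are considerably more advanced):

```python
import numpy as np

def delay_and_sum(element_signals, element_positions, steer_angle, wavelength):
    """Narrowband delay-and-sum beamformer for a linear antenna array.
    element_signals: (n_elements, n_samples) complex baseband samples.
    element_positions: positions along the array axis (same units as
    wavelength). steer_angle: steering angle in radians from broadside.
    Phase-aligns every element toward steer_angle and averages."""
    positions = np.asarray(element_positions, dtype=float)
    signals = np.asarray(element_signals)
    # Phase advance seen by each element for a plane wave from steer_angle...
    phase = 2j * np.pi * positions * np.sin(steer_angle) / wavelength
    weights = np.exp(-phase)   # ...compensated by conjugate weights
    return weights @ signals / len(positions)
```

Steering toward a signal adds its element-wise copies coherently (array gain), while off-steer interferers add incoherently, which is the interference-rejection benefit named above.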