97 results for Information Search
Abstract:
Egéria was a 4th-century A.D. nun who undertook a long journey from her homeland, the Roman province of Gallaecia, to the Near East. Her itinerary, which describes the segment between Mount Sinai and Constantinople, reveals the enthusiasm that graced her original decision to embark on the journey and the determination with which she faced every stage. She kept a diary throughout her journey, probably one of the first known travel diaries. Her reports describe her observations and the splendor of the Christian cult sites. Her text is affectionately dedicated to her fellow nuns who remained in the west of the Empire, keeping their uniting bond strong. Our study aims to survey all references to visited sites in Egéria’s text, as the information contained therein serves as a precious descriptor of their locations, spatial organization and environment. Egéria visits unique, historical sites, which influence her writing style. She is, in fact, a pilgrim to a recently created historical site, the Holy Land. Egéria lived within a fundamental historical and artistic framework, that of the architectural forms of expression of early Christianity. Her words can be translated into images, as a script of partitions, a visualization of lights and ambiences, and a testimony of places that no longer exist. We hope to see, in Egéria’s written work, the images she observed, for her words are images. We expect a complementary approach among research methods: given by Archeology, composed by Architecture, and explained by the sensitive text of Egéria’s journey - a religious and artistic journey written from a powerful feminine point of view.
Abstract:
This paper studies the effects of monetary policy on mutual fund risk taking using a sample of Portuguese fixed-income mutual funds in the 2000-2012 period. First, I estimate time-varying measures of risk exposure (betas) for the individual funds, for the benchmark portfolio, and for a representative equally-weighted portfolio, through 24-month rolling regressions of a model with two systematic risk factors: interest rate risk (TERM) and default risk (DEF). In the second phase, using the estimated betas, I examine what portion of the risk exposure is in excess of the benchmark (active risk) and how it relates to monetary policy proxies (the one-month rate, the Taylor residual, the real rate, and the first principal component of a cross-section of government yields and rates). Using this methodology, I provide empirical evidence that Portuguese fixed-income mutual funds respond to accommodative monetary policy by significantly increasing their exposure, in excess of their benchmarks, to default risk and, to a lesser extent, to interest rate risk. I also find that the increase in funds’ risk exposure to gain a boost in return (search-for-yield) is more pronounced following the 2007-2009 global financial crisis, indicating that the current historically low interest rates may incentivize excessive risk taking. My results suggest that monetary policy affects the risk appetite of non-bank financial intermediaries.
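For concreteness, a minimal sketch of the kind of two-factor specification described above, with assumed notation (the exact factor construction and excess-return definition are not given in the abstract):

```latex
r_{i,t} - r_{f,t} = \alpha_i + \beta^{TERM}_{i}\,\mathit{TERM}_t + \beta^{DEF}_{i}\,\mathit{DEF}_t + \varepsilon_{i,t}
```

estimated over rolling 24-month windows so that the betas become time-varying; active risk at time $t$ can then be read as the difference between a fund's beta and the corresponding benchmark beta, e.g. $\beta^{DEF}_{i,t} - \beta^{DEF}_{bench,t}$.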
Abstract:
This work project focuses on developing new approaches to enhance Portuguese exports towards a defined German industry sector within the information technology and electronics fields. First and foremost, information was collected and a set of interviews with experts and top managers was performed in order to assess the demand of the German market while identifying compatible Portuguese supply capabilities. Among the main findings, Industry 4.0 presents itself as a valuable opportunity in the German market for Portuguese medium-sized companies with expertise in embedded systems for machinery and equipment companies. In order to achieve the purpose of the work project, an embedded systems platform targeting machinery and equipment companies was suggested, and several recommendations on how to implement it were developed. An alternative approach for this platform within the German market was also considered, namely the eHealth sector, with the purpose of enhancing current healthcare service provision.
Abstract:
Combinatorial Optimization Problems occur in a wide variety of contexts and are generally NP-hard. At a corporate level, solving these problems is of great importance, since they contribute to the optimization of operational costs. In this thesis we propose to solve the Public Transport Bus Assignment problem considering a heterogeneous fleet and line exchanges, a variant of the Multi-Depot Vehicle Scheduling Problem in which additional constraints are enforced to model a real-life scenario. The number of constraints involved and the large number of variables make it impracticable to solve to optimality using complete search techniques. Therefore, we explore metaheuristics, which sacrifice optimality to produce solutions in feasible time. More concretely, we focus on the development of algorithms based on a sophisticated metaheuristic, Ant Colony Optimization (ACO), which is based on a stochastic learning mechanism. For complex problems with a considerable number of constraints, sophisticated metaheuristics may fail to produce quality solutions in a reasonable amount of time. Thus, we developed parallel shared-memory (SM) synchronous ACO algorithms; however, synchronism gives rise to the straggler problem. We therefore proposed three SM asynchronous algorithms that break the original algorithm semantics and differ in the degree of concurrency allowed while manipulating the learned information. Our results show that our sequential ACO algorithms produced better solutions than a Restarts metaheuristic, that the ACO algorithms were able to learn, and that better solutions were achieved by increasing the amount of cooperation (number of search agents). Regarding parallel algorithms, our asynchronous ACO algorithms outperformed the synchronous ones in terms of speedup and solution quality, achieving speedups of 17.6x. The cooperation scheme imposed by asynchronism also achieved a better learning rate than the original one.
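As a concrete illustration of the core ACO mechanics mentioned above - pheromone-guided probabilistic solution construction and pheromone update - a minimal sketch for a generic task-to-vehicle assignment follows. It is not the thesis's algorithm; the cost model, matrix shapes and parameter values are assumptions.

```python
import random

def construct_solution(pheromone, heuristic, alpha=1.0, beta=2.0):
    """Assign each task to a vehicle by sampling proportionally to
    pheromone^alpha * heuristic^beta (the standard ACO construction rule)."""
    n_tasks, n_vehicles = len(pheromone), len(pheromone[0])
    solution = []
    for task in range(n_tasks):
        weights = [(pheromone[task][v] ** alpha) * (heuristic[task][v] ** beta)
                   for v in range(n_vehicles)]
        solution.append(random.choices(range(n_vehicles), weights=weights)[0])
    return solution

def update_pheromone(pheromone, solutions, costs, rho=0.1, q=1.0):
    """Evaporate all trails, then reinforce the assignments used by each
    solution in proportion to its quality (lower cost -> more pheromone)."""
    for row in pheromone:
        for v in range(len(row)):
            row[v] *= (1.0 - rho)
    for solution, cost in zip(solutions, costs):
        for task, vehicle in enumerate(solution):
            pheromone[task][vehicle] += q / cost

# Toy usage: 4 tasks, 2 vehicles, uniform pheromone and heuristic information,
# and an invented cost function just to exercise the update step.
pheromone = [[1.0, 1.0] for _ in range(4)]
heuristic = [[1.0, 1.0] for _ in range(4)]
ants = [construct_solution(pheromone, heuristic) for _ in range(10)]
update_pheromone(pheromone, ants, costs=[sum(s) + 1 for s in ants])
```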
Abstract:
Remote sensing - the acquisition of information about an object or phenomenon without making physical contact with it - is applied in a multitude of areas, including agriculture, forestry, cartography, hydrology, geology, meteorology and air traffic control, among many others. In agriculture, one application of this information is crop detection, which makes it easier to monitor existing crops and helps in a region’s strategic planning. In all of these areas, there is an ongoing search for better methods that allow us to obtain better results. For over forty years, the Landsat program has used satellites to collect spectral information from Earth’s surface, creating a historical archive unmatched in quality, detail, coverage, and length. The most recent satellite was launched on February 11, 2013, with a number of improvements over its predecessors. This project aims to compare classification methods for crop detection in Portugal’s Ribatejo region. State-of-the-art algorithms will be applied to this region and their performance analyzed.
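Purely as an illustration of the kind of supervised per-pixel classification such a comparison involves (not the specific algorithms compared in the project), a minimal sketch using a random forest over spectral band values; the synthetic band array, number of bands and crop labels are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical data: one row per pixel, one column per spectral band,
# with an integer crop label per pixel (all values invented for illustration).
rng = np.random.default_rng(0)
X = rng.random((1000, 7))          # 7 spectral bands (assumed)
y = rng.integers(0, 4, size=1000)  # 4 crop classes (assumed)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```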
Abstract:
With the growth of the Internet and the Semantic Web, together with improvements in communication speed and the rapid growth of storage capacity, the volume of data and information rises considerably every day. Because of this, in the last few years there has been a growing interest in structures for formal representation with suitable characteristics, such as the ability to organize data and information and to reuse their contents for the generation of new knowledge. Controlled Vocabularies, specifically Ontologies, stand out as one such representation structure with high potential. They not only allow for data representation, but also for the reuse of such data for knowledge extraction, coupled with its subsequent storage through relatively simple formalisms. However, to ensure that the knowledge in an ontology is always up to date, ontologies need maintenance. Ontology Learning is the area which studies the update and maintenance of ontologies. It is worth noting that the relevant literature already presents first results on the automatic maintenance of ontologies, but these are still at a very early stage. Human-based processes are still the usual way to update and maintain an ontology, which makes this a cumbersome task. The generation of new knowledge for ontology growth can be based on Data Mining, an area that studies techniques for data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources, using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and semantic relations present in an ontology. In order to verify the applicability of the proposed method, a proof of concept was developed and its results are presented, applied to the building and construction sector.
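Not the semi-automatic method proposed in this work, but as a minimal illustration of pattern-based discovery over unstructured text, the sketch below applies a classic Hearst-style "X such as Y" pattern to extract candidate concept/relation triples; the sample sentences are invented for the building and construction domain.

```python
import re

# Hypothetical unstructured source text (building and construction domain).
text = ("Structural materials such as concrete, steel and timber must be "
        "certified. Insulation products such as polystyrene reduce energy use.")

# Hearst-style pattern: "<hypernym> such as <hyponym>(, <hyponym>)* (and <hyponym>)?"
pattern = re.compile(r"(\w+(?: \w+)?) such as ((?:\w+(?:, )?)+(?: and \w+)?)")

candidates = []
for hypernym, hyponyms in pattern.findall(text):
    for hyponym in re.split(r", | and ", hyponyms):
        candidates.append((hyponym.strip(), "is-a", hypernym.lower()))

print(candidates)
# e.g. [('concrete', 'is-a', 'structural materials'), ...]
```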
Abstract:
In the early nineties, Mark Weiser wrote a series of seminal papers that introduced the concept of Ubiquitous Computing. According to Weiser, computers require too much attention from the user, drawing their focus away from the tasks at hand. Instead of being the centre of attention, computers should be so natural that they would vanish into the human environment, becoming not only truly pervasive but also effectively invisible and unobtrusive to the user. This calls not only for smaller, cheaper and lower-power computers, but also for equally convenient display solutions that can be harmoniously integrated into our surroundings. With the advent of Printed Electronics, new ways to link the physical and the digital worlds became available. By combining common printing techniques such as inkjet printing with electro-optical functional inks, it is becoming possible not only to mass-produce extremely thin, flexible and cost-effective electronic circuits but also to introduce electronic functionalities into products where they were previously unavailable. Indeed, Printed Electronics is enabling the creation of novel sensing and display elements for interactive devices, free of form-factor constraints. At the same time, the rising availability and affordability of digital fabrication technologies, namely 3D printers, to the average consumer is fostering a new industrial (digital) revolution and the democratisation of innovation. Nowadays, end-users are already able to custom-design and manufacture their own physical products on demand, according to their own needs. In the future, they will be able to fabricate interactive digital devices with user-specific form and functionality from the comfort of their homes. This thesis explores how task-specific, low-computation, interactive devices capable of presenting dynamic visual information can be created using Printed Electronics technologies, following an approach based on the ideals behind Personal Fabrication. Focus is given to the use of printed electrochromic displays as a medium for delivering dynamic digital information. According to the architecture of the displays, several approaches are highlighted and categorised. Furthermore, a pictorial computation model based on extended cellular automata principles is used to programme dynamic simulation models into matrix-based electrochromic displays. Envisaged applications include the modelling of physical, chemical, biological, and environmental phenomena.
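As an illustration of the kind of matrix update a cellular-automaton model implies for a matrix-based display (not the pictorial computation model developed in the thesis), a minimal sketch over a binary pixel grid with an assumed Game-of-Life-style rule:

```python
import numpy as np

def step(grid: np.ndarray) -> np.ndarray:
    """One synchronous update of a binary pixel matrix using a
    Game-of-Life-style neighbourhood rule (illustrative only)."""
    # Count the 8 neighbours of every cell, with wrap-around borders.
    neighbours = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    # A pixel switches on with exactly 3 neighbours, stays on with 2 or 3.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

# 8x8 pixel matrix with a simple initial pattern (a "blinker").
grid = np.zeros((8, 8), dtype=int)
grid[3, 2:5] = 1
for _ in range(3):
    grid = step(grid)
    print(grid, end="\n\n")
```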
Abstract:
Equity research report
Abstract:
This thesis examines the effects of macroeconomic factors on inflation level and volatility in the Euro Area in order to improve the accuracy of inflation forecasts with econometric modelling. Inflation aggregates for the EU as well as inflation levels of selected countries are analysed, and the differences between these inflation estimates and forecasts are documented. The research proposes alternative models depending on the focus and the scope of inflation forecasts. I find that models with a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) in mean process have better explanatory power for inflation variance than regular GARCH models. The significant coefficients differ across EU countries in comparison to the aggregate EU-wide forecast of inflation. The presence of more pronounced GARCH components in countries with more stressed economies indicates that inflation volatility in these countries is likely to occur as a result of economic stress. In addition, other economies in the Euro Area are found to exhibit a relatively stable variance of inflation over time. Therefore, when analysing EU inflation one has to take into consideration the large differences at the country level and examine them one by one.
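A minimal sketch of a GARCH(1,1)-in-mean specification of the kind referred to above, with assumed notation ($\pi_t$ for inflation, $x_{t-1}$ for macroeconomic regressors); the exact mean equation and regressor set used in the thesis are not given in the abstract:

```latex
\begin{align}
\pi_t &= \mu + \gamma' x_{t-1} + \lambda \sigma_t^2 + \varepsilon_t,
\qquad \varepsilon_t \mid \mathcal{F}_{t-1} \sim \mathcal{N}(0, \sigma_t^2), \\
\sigma_t^2 &= \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2 .
\end{align}
```

The in-mean term $\lambda \sigma_t^2$ is what lets the conditional variance feed back into the inflation level, which a regular GARCH model lacks.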
Abstract:
This paper attempts to prove that in the years 1735 to 1755 Venice was the birthplace and cradle of Modern architectural theory, generating a major crisis in classical architecture traditionally based on the Vitruvian assumption that it imitates early wooden structures in stone or in marble. According to its rationalist critics such as the Venetian Observant Franciscan friar and architectural theorist Carlo Lodoli (1690-1761) and his nineteenth-century followers, classical architecture is singularly deceptive and not true to the nature of materials, in other words, dishonest and fallacious. This questioning did not emanate from practising architects, but from Lodoli himself – a philosopher and educator of the Venetian patriciate – who had not been trained as an architect. The roots of this crisis lay in a new approach to architecture stemming from the new rationalist philosophy of the Enlightenment age with its emphasis on reason and universal criticism.
Abstract:
In a context of scarce resources, aggravated by the growing demand for health care and by the cost of the technological imperative, often wrongly confused with the ethical imperative, the pursuit of efficiency is increasingly relevant. To achieve and maintain efficiency, a deep knowledge of the effects of efficacious measures - true effectiveness - is needed from a systemic perspective, which implies a sharing of knowledge and concerted action between physicians and managers. Economic reasoning does not replace clinical judgement, but it reinforces the need to reconcile useful interventions with the personal and social cost they demand. Good information that generates knowledge is needed, together with irreplaceable common sense.
Abstract:
Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are manifold, but among them is the fact that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker that can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
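Dependent information flow types are a static typing discipline; purely as an intuition for how a field's security level can depend on a runtime value, the hypothetical sketch below computes labels from other fields and checks flows dynamically. It is not the type system or the prototype typechecker of this thesis.

```python
from dataclasses import dataclass

# Hypothetical lattice of security levels (information may only flow upward).
LEVELS = {"public": 0, "confidential": 1, "secret": 2}

@dataclass
class Record:
    owner: str
    kind: str      # e.g. "newsletter" or "medical" (invented categories)
    payload: str

    def level(self) -> str:
        # The security level of `payload` depends on the runtime value of
        # `kind` -- the intuition behind a value-dependent security label.
        return "secret" if self.kind == "medical" else "public"

def can_flow(src_level: str, dst_level: str) -> bool:
    """Allow a flow only if the destination is at least as secret as the source."""
    return LEVELS[src_level] <= LEVELS[dst_level]

r = Record(owner="alice", kind="medical", payload="blood test results")
print(can_flow(r.level(), "public"))   # False: would leak secret data
print(can_flow(r.level(), "secret"))   # True
```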
Abstract:
Search is now going beyond looking for factual information, and people wish to search for the opinions of others to help them in their own decision-making. Sentiment or opinion expressions are used by users to express their opinions and embody important pieces of information, particularly in online commerce. The main problem that the present dissertation addresses is how to model text to find meaningful words that express a sentiment. In this context, we investigate the viability of automatically generating a sentiment lexicon for opinion retrieval and sentiment classification applications. For this research objective we propose to capture sentiment words derived from online users’ reviews. In this approach, we tackle a major challenge in sentiment analysis, which is the detection of words that express subjective preference and of domain-specific sentiment words such as jargon. To this aim we present a fully generative method that automatically learns a domain-specific lexicon and is fully independent of external sources. Sentiment lexicons can be applied in a broad set of applications; however, popular recommendation algorithms have somehow been disconnected from sentiment analysis. Therefore, we present a study that explores the viability of applying sentiment analysis techniques to infer ratings in a recommendation algorithm. Furthermore, entities’ reputation is intrinsically associated with sentiment words that have a positive or negative relation with those entities. Hence, we provide a study that examines the viability of using a domain-specific lexicon to compute entities’ reputation. Finally, a recommendation system algorithm is improved with the use of sentiment-based ratings and entities’ reputation.
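Not the fully generative method presented in the dissertation, but as a minimal illustration of learning domain-specific word polarities from rating-labelled reviews, a smoothed log-odds sketch; the sample reviews and the rating threshold are assumptions.

```python
import math
from collections import Counter

# Hypothetical rating-labelled reviews (rating >= 4 treated as positive).
reviews = [
    ("battery life is superb and the lens is sharp", 5),
    ("sharp pictures, superb value", 4),
    ("the lens is blurry and the battery died", 1),
    ("blurry screen, died after a week", 2),
]

pos, neg = Counter(), Counter()
for text, rating in reviews:
    (pos if rating >= 4 else neg).update(text.split())

def polarity(word: str) -> float:
    """Smoothed log-odds of a word appearing in positive vs. negative reviews."""
    return math.log((pos[word] + 1) / (neg[word] + 1))

lexicon = {w: polarity(w) for w in set(pos) | set(neg)}
print(sorted(lexicon.items(), key=lambda kv: kv[1], reverse=True)[:5])
```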
Abstract:
Introduction: The use of cells and their properties for the treatment of cardiovascular disease is a promise for the future and perhaps the only way to overcome some of the shortcomings of current therapies. The cell delivery route most often used in research has been the intracoronary route, which gives the microcirculation special relevance, since it is where the first interaction with the native tissue occurs. Mesenchymal stem cells (MSC) have properties that make them particularly suitable for Cell Therapy, but their dimensions, larger than the diameter of the capillaries, have prompted controversy about the safety of their intracoronary delivery. Interventional cardiology currently has techniques that allow real-time, in vivo assessment of the state of the coronary microcirculation. The determination of the index of microcirculatory resistance (IMR) provides information about the small-vessel circulation, independently of the coronary circulation and of the hemodynamic status, but the clinical applicability of this knowledge is yet to be defined. Objectives: To clarify the potential of the IMR in studying the effects of intracoronary MSC transplantation. Population and Methods: Preclinical study with a swine model, developed in three phases. In Phase One, 8 healthy animals were used to study and validate the IMR assessment technique in our animal model. The IMR was determined with two different doses of papaverine to induce the maximal hyperaemic response (5 and 10 mg), and again after microcirculatory dysfunction was induced by intracoronary injection of Embozene microspheres 40 μm in diameter. In Phase Two, 18 healthy animals were randomized into a control group and a group receiving 30 × 10^6 MSC by intracoronary infusion. The IMR, aortic pressure, epicardial coronary flow and the occurrence of electrocardiographic changes were evaluated in a blinded fashion. In Phase Three, 18 animals with an induced acute myocardial infarction (AMI) were randomized into a control group, a group receiving conventionally expanded MSC, and a group receiving smaller MSC expanded with an innovative methodology. A dose/effect exploration was performed with a stepwise infusion of 10 × 10^6, 15 × 10^6 and 20 × 10^6 MSC, with determination of the IMR, aortic pressure, epicardial coronary flow and the occurrence of electrocardiographic abnormalities. Four weeks after cell delivery the IMR was measured again and the pathological study of the animals was performed, searching for evidence of neoangiogenesis and myocardial regeneration, or of a positive effect on the reparative response following the infarction. Results: In all three phases the animals remained hemodynamically and electrocardiographically stable, except for the ST elevation in V1-V3 observed after injection of the microspheres. In Phase One both doses of papaverine induced an effective hyperaemic response without meaningful differences in the IMR (distal pressure variation of -11.4 ± 5 and -10.6 ± 5 mmHg with the 5 and 10 mg doses, respectively; p = 0.5). With the injection of the microspheres the IMR rose on average by 310 ± 190%, to a mean value of 41.3 ± 16 U (p = 0.001). In Phase Two there were no significant differences in hemodynamic parameters, epicardial flow or electrocardiographic assessment between the two groups. The baseline IMR was similar, and after the intracoronary infusion there was a marked increase in the animals that received cells compared with the control group (8.8 ± 1 U vs. 14.2 ± 1.8 U, p = 0.02) and with their own baseline (112% increase, p = 0.008). In Phase Three there were again no significant differences in hemodynamic parameters, epicardial flow or electrocardiographic evaluation among the three groups. There was an increase in the IMR in the animals that received cells from the 2nd dose onwards (72% with the conventional cells and 108% with the innovative cells), which persisted with the 3rd dose (100% with the conventional cells and 88% with the innovative cells), statistically significant compared with the control group (p = 0.034 with the 2nd dose and p = 0.024 with the 3rd dose). Four weeks after delivery of the MSC the IMR fell in the two groups that received cells, to values overlapping those of the control group and the post-AMI values. In the pathological and histological evaluation of the explanted hearts there were no differences among the three groups. Conclusions: The IMR distinguishes changes in the coronary microcirculation caused by intracoronary delivery of MSC in the absence of changes in the other real-time clinical parameters of the coronary circulation. The IMR changes are progressive and allow a dose/effect evaluation, although it was not possible to detect differences between the two types of MSC. In our model, intracoronary injection of MSC was not associated with evidence of a beneficial effect on myocardial repair or regeneration after the AMI.
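Not stated in the abstract, but for context: the IMR is conventionally obtained from coronary pressure-wire thermodilution during maximal hyperaemia as

```latex
\mathrm{IMR} = P_d \times T_{mn}
```

where $P_d$ is the mean distal coronary pressure and $T_{mn}$ the hyperaemic mean transit time of a saline bolus; the product has units of mmHg·s and is usually reported simply in units (U), as in the values above.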