846 results for Open Information Extraction
Abstract:
Aims: Cataract surgery is one of the most commonly performed operations, but its overuse has been reported. The threshold for cataract surgery has become increasingly lenient; therefore, the selection process and surgical need have been questioned. The aim of this study was to evaluate the changes in patient-reported vision-related quality of life (VR-QoL) associated with cataract surgery.
Methods: A prospective cohort study was conducted. Consecutive patients referred to cataract clinics in an NHS unit in Scotland were identified. Those listed for surgery were invited to complete a validated questionnaire (TyPE) to measure VR-QoL pre- and post-operatively. TyPE has five different domains (near vision, distance vision, daytime driving, night-time driving, and glare) and a global score of vision. The influence of pre-operative visual acuity (VA) levels, vision, and lens status of the fellow eye on changes in VR-QoL were explored.
Results: A total of 320 listed patients were approached, of whom 36 were excluded. Among the 284 enrolled patients, 229 (81%) returned the questionnaire after surgery. Results revealed that the mean overall vision improved, as reported by patients. Improvements were also seen in all sub-domains of the questionnaire.
Conclusion: The majority of patients appear to have improvement in patient-reported VR-QoL, including those with good pre-operative VA and previous surgery to the fellow eye. VA thresholds may not capture the quality-of-life impact on patients. This information can assist clinicians in making more informed decisions when weighing the benefits of listing a patient for cataract extraction.
Abstract:
We investigate the link between information and thermodynamics embodied by Landauer’s principle in the open dynamics of a multipartite quantum system. Such irreversible dynamics is described in terms of a collisional model with a finite temperature reservoir. We demonstrate that Landauer’s principle holds, for such a configuration, in a form that involves the flow of heat dissipated into the environment and the rate of change of the entropy of the system. Quite remarkably, such a principle for heat and entropy power can be explicitly linked to the rate of creation of correlations among the elements of the multipartite system and, in turn, the non-Markovian nature of their reduced evolution. Such features are illustrated in two exemplary cases.
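The bound just described can be stated compactly. As a generic sketch in standard notation (not reproduced from the paper), with S(t) the von Neumann entropy of the system and Q(t) the heat dissipated into the reservoir at temperature T, the rate form of Landauer's principle reads

```latex
\dot{Q}(t) \;\ge\; -\,k_B T\,\dot{S}(t),
\qquad S(t) = -\operatorname{Tr}\!\left[\rho(t)\ln\rho(t)\right],
```

so any decrease of the system's entropy must be paid for by at least k_B T times that entropy rate in dissipated heat power.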
Abstract:
In many applications, and especially those where batch processes are involved, a target scalar output of interest is often dependent on one or more time series of data. With the exponential growth of data logging in modern industries, such time series are increasingly available for statistical modelling in soft sensing applications. In order to exploit time series data for predictive modelling, it is necessary to summarise the information they contain as a set of features to use as model regressors. Typically this is done in an unsupervised fashion using simple techniques such as computing statistical moments, principal components or wavelet decompositions, often leading to significant information loss and hence suboptimal predictive models. In this paper, a functional learning paradigm is exploited in a supervised fashion to derive continuous, smooth estimates of time series data (yielding aggregated local information), while simultaneously estimating a continuous shape function yielding optimal predictions. The proposed Supervised Aggregative Feature Extraction (SAFE) methodology can be extended to support nonlinear predictive models by embedding the functional learning framework in a Reproducing Kernel Hilbert Space setting. SAFE has a number of attractive features, including a closed-form solution and the ability to explicitly incorporate first- and second-order derivative information. Using simulation studies and a practical semiconductor manufacturing case study, we highlight the strengths of the new methodology with respect to standard unsupervised feature extraction approaches.
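A minimal sketch of the functional-regression idea behind such supervised feature extraction (not the authors' SAFE implementation; the Gaussian basis, the ridge penalty, and all parameter names are illustrative assumptions): the scalar target is modelled as y ≈ ∫ x(t)·β(t) dt, with the shape function β(t) expanded on a smooth basis and its weights fitted directly against y.

```python
import numpy as np

def gaussian_basis(t, centers, width):
    # Phi[i, j] = exp(-(t_i - c_j)^2 / (2 * width^2)): smooth, local basis functions.
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

def fit_functional_regression(X, t, y, n_basis=10, lam=1e-8):
    """Fit y ~ integral x(t) * beta(t) dt with beta(t) = sum_k w_k * phi_k(t).

    X: (n_samples, n_timepoints) time series on a shared grid t; y: scalar targets.
    Returns the basis weights w together with the basis parameters."""
    centers = np.linspace(t.min(), t.max(), n_basis)
    width = (t.max() - t.min()) / n_basis
    Phi = gaussian_basis(t, centers, width)             # (T, K)
    dt = t[1] - t[0]
    Z = (X @ Phi) * dt                                  # Z[i, k] ~ integral x_i(t) phi_k(t) dt
    # Ridge solution for the shape-function weights (closed form).
    w = np.linalg.solve(Z.T @ Z + lam * np.eye(n_basis), Z.T @ y)
    return w, centers, width

def predict(X, t, w, centers, width):
    Phi = gaussian_basis(t, centers, width)
    return ((X @ Phi) * (t[1] - t[0])) @ w
```

The supervision is in the last step: the basis coefficients are chosen to minimise prediction error on y, rather than to summarise x(t) in isolation as moments or principal components would.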
Abstract:
The exponential growth of the world population has led to an increase of settlements often located in areas prone to natural disasters, including earthquakes. Consequently, despite the important advances in the field of natural catastrophe modelling and risk mitigation actions, the overall human losses have continued to increase and unprecedented economic losses have been registered. In the research work presented herein, various areas of earthquake engineering and seismology are thoroughly investigated, and a case study application for mainland Portugal is performed. Seismic risk assessment is a critical link in the reduction of casualties and damages due to earthquakes. Recognition of this relation has led to a rapid rise in demand for accurate, reliable and flexible numerical tools and software. In the present work, an open-source platform for seismic hazard and risk assessment is developed. This software is capable of computing the distribution of losses or damage for an earthquake scenario (deterministic event-based) or earthquake losses due to all the possible seismic events that might occur within a region for a given interval of time (probabilistic event-based). This effort has been developed following an open and transparent philosophy and, therefore, it is available to any individual or institution. The estimation of seismic risk depends mainly on three components: seismic hazard, exposure and vulnerability. The latter component assumes special importance, as by intervening with appropriate retrofitting solutions it may be possible to decrease the seismic risk directly. The employment of analytical methodologies is fundamental in the assessment of structural vulnerability, particularly in regions where post-earthquake building damage data might not be available. Several common methodologies are investigated, and conclusions are drawn regarding the method that can provide an optimal balance between accuracy and computational effort.
In addition, a simplified approach based on displacement-based earthquake loss assessment (DBELA) is proposed, which allows for the rapid estimation of fragility curves, considering a wide spectrum of uncertainties. A novel vulnerability model for the reinforced concrete building stock in Portugal is proposed in this work, using statistical information collected from hundreds of real buildings. An analytical approach based on nonlinear time history analysis is adopted, and the impact of a set of key parameters is investigated, including the damage state criteria and the chosen intensity measure type. A comprehensive review of previous studies that contributed to the understanding of the seismic hazard and risk for Portugal is presented. An existing seismic source model was employed with recently proposed attenuation models to calculate probabilistic seismic hazard throughout the territory. The latter results are combined with information from the 2011 Building Census and the aforementioned vulnerability model to estimate economic loss maps for a return period of 475 years. These losses are disaggregated across the different building typologies, and conclusions are drawn regarding the types of construction most vulnerable to seismic activity.
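The three-component structure described above (hazard, exposure, vulnerability) can be sketched for a single deterministic scenario. This is a toy illustration, not the platform's implementation: the lognormal-CDF-shaped vulnerability curve and its parameters are invented for the example.

```python
import numpy as np
from math import erf, sqrt

def example_vuln_curve(im, theta=0.30, beta=0.6):
    # Hypothetical mean damage ratio (MDR) versus intensity measure (e.g. PGA in g),
    # shaped like a lognormal CDF with median theta and dispersion beta.
    im = np.maximum(np.asarray(im, dtype=float), 1e-12)
    z = np.log(im / theta) / (beta * sqrt(2.0))
    return 0.5 * (1.0 + np.vectorize(erf)(z))

def scenario_losses(intensities, exposure_values, vuln_curve=example_vuln_curve):
    """Deterministic event-based losses: loss_i = exposed value_i * MDR at site i."""
    mdr = np.clip(vuln_curve(intensities), 0.0, 1.0)
    return np.asarray(exposure_values, dtype=float) * mdr
```

A probabilistic event-based analysis would repeat this calculation over a stochastic event set and aggregate the resulting losses per return period.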
Abstract:
This paper presents a method of using the so-called "bacterial algorithm" (4,5) for extracting a fuzzy rule base from a training set. The newly proposed bacterial evolutionary algorithm (BEA) is presented. In our application, one bacterium corresponds to a fuzzy rule system.
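A minimal sketch of the two BEA operators, bacterial mutation and gene transfer, applied to a generic real-valued chromosome standing in for fuzzy rule parameters (population size, mutation scale, and the fitness interface are illustrative assumptions, not the paper's encoding):

```python
import random

def bacterial_mutation(bacterium, fitness, n_clones=4, sigma=0.1):
    # For each gene, try several mutated clones and keep the best variant.
    best = list(bacterium)
    for gene in range(len(best)):
        candidates = [list(best)]
        for _ in range(n_clones):
            clone = list(best)
            clone[gene] += random.gauss(0.0, sigma)
            candidates.append(clone)
        best = max(candidates, key=fitness)
    return best

def gene_transfer(population, fitness):
    # Copy a random gene from a bacterium in the better half into each worse one.
    population.sort(key=fitness, reverse=True)
    half = len(population) // 2
    for bad_idx in range(half, len(population)):
        donor = random.choice(population[:half])
        g = random.randrange(len(donor))
        population[bad_idx][g] = donor[g]
    return population

def bea(fitness, dim, pop_size=8, generations=40):
    """Run the bacterial evolutionary algorithm and return the best bacterium."""
    pop = [[random.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [bacterial_mutation(b, fitness) for b in pop]
        pop = gene_transfer(pop, fitness)
    return max(pop, key=fitness)
```

In the fuzzy-rule application each bacterium would encode a whole rule system, with fitness measured by its accuracy on the training set.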
Abstract:
This paper describes the implementation of a dual-axis positioning system controller. The system was designed for space-dependent ultrasound signal acquisition problems, such as pressure field mapping. The work developed can be grouped into two main areas: hardware and software. Each axis includes one stepper motor connected to a driver circuit, which is in turn connected to a processing unit. The graphical user interface is simple and clear for the user. The system resolution was computed as 127 µm, with an accuracy of 2.44 µm. Although the target application is ultrasound signal acquisition, the controller can be applied to other devices that have up to four stepper motors. The application was developed as open-source software, so it can be used or modified to fit different purposes.
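As a rough illustration of how such a controller maps a requested displacement onto whole motor steps (the function and its names are hypothetical; only the 127 µm resolution figure comes from the text):

```python
RESOLUTION_UM = 127.0  # linear displacement per motor step, as reported in the text

def plan_move(current_um, target_um, resolution_um=RESOLUTION_UM):
    """Return (steps, direction, reached_um) for one axis.

    Steps are integral, so the axis stops at the nearest reachable position;
    the residual error is bounded by half a step."""
    delta = target_um - current_um
    steps = round(abs(delta) / resolution_um)
    direction = 1 if delta >= 0 else -1
    reached_um = current_um + direction * steps * resolution_um
    return steps, direction, reached_um
```

For example, a move from 0 to 1000 µm resolves to 8 steps, landing at 1016 µm, an error of 16 µm, i.e. within half the 127 µm step size.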
Abstract:
This report describes, from a climatological perspective, the snowstorm that occurred from November 20 to 22, 2006.
Abstract:
Doctoral thesis, Informatics (Computer Science), Universidade de Lisboa, Faculdade de Ciências, 2015
Abstract:
Computer programming is as challenging to teach as it is to learn. Assessing the work of students and providing individualised feedback to all is time-consuming and error-prone for teachers, and frequently involves a time delay. The existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are appearing, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and its timely feedback. This paper provides a conceptual design model for a computer programming learning environment. This environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. This model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
Abstract:
The aim of this study was to undertake a comparative analysis of the practices and information behaviour of European information users who visit information units specialising in European information in Portugal and Spain. The study used a quantitative methodology based on a questionnaire containing closed questions and one open question. The questions covered the general sociological profile of the respondents and their use of European Document Centres, in addition to analysing aspects associated with information behaviour relating to European themes. The study therefore examined data on the preferred means and sources for accessing European information, types of documents and the subjects investigated most. The use of European databases and the Internet to access material on Europe was also studied, together with the reasons which users considered made it easy or difficult to access European information, and the aspects they valued most in accessing this information. The questionnaire was administered in European Document Centres in 2008 and 2010.
Abstract:
The tourism sector is a clearly growing area in Portugal, and one that has been developing its promotion and marketing strategy. However, it concerns itself only with performance and installed-capacity indicators (number of rooms, hotels, flights, stays), leaving statistical indicators in the background. The World Economic Forum's "Travel & Tourism Competitiveness Report 2013" ranks Portugal 72nd in terms of the quality and coverage of the statistical information available for the tourism sector; Spain, by comparison, ranks 3rd. A market strategy without an analytical basis to support a specific and objective framework of guidelines, with relevant knowledge of the target markets, is hardly comprehensible or even achievable. The implementation of a Business Intelligence structure that allows data to be collected and processed, making it possible to relate and substantiate the results obtained in the tourism sector, is fundamental and crucial to the creation of market strategies. These strategies are built from information about the tourists who visit us, and about potential tourists, so that they can be attracted in the future. Analysing tourists' characteristics and behavioural patterns makes it possible to define distinct profiles and thus detect market trends, in order to promote the most suitable products and services. The knowledge obtained makes it possible, on the one hand, to create and offer the most attractive products to tourists and, on the other, to inform them, in a targeted way, of the existence of those products. Thus, personalised recommendation which, based on knowledge of tourist profiles, suggests the best products proves to be an essential tool for capturing and expanding the market.
Abstract:
Nowadays, an accelerated process of technological evolution can be observed all over the globe. Companies, whether small, medium or large, are increasingly dependent on computerised systems to carry out their business processes, and consequently on the generation of business information in which, very often, the data bear no relationship to one another. Most conventional computer systems are not designed to manage and store strategic information, which prevents it from serving as a strategic resource for decision support. Decisions are therefore made based on the managers' experience, when they could be based on the historical facts stored by the various systems. Generally speaking, organisations have a great deal of data but, in most cases, extract little information from it, which is a problem in competitive markets. As organisations seek to evolve and outperform the competition in decision-making, the term Business Intelligence (BI) arises in this context. GisGeo Information Systems is a company that develops GIS-based (geographic information systems) software following an open-source tool philosophy. Its main product is based on the geographic localisation of various types of vehicles, on data collection, and consequently on its analysis (kilometres travelled, duration of a trip between two defined points, fuel consumption, etc.). This is the context of this project, whose objective is to give a different perspective to the existing data, crossing BI concepts with the system implemented in the company in accordance with its philosophy. The project covers some of the most important concepts underlying BI, such as the dimensional model, the data warehouse, the ETL process and OLAP, following Ralph Kimball's methodology.
Some of the main open-source tools on the market are also studied, together with their advantages and disadvantages relative to one another. In conclusion, the solution developed according to the criteria set out by the company is presented, as a proof of concept of the applicability of Business Intelligence to the field of Geographic Information Systems (GIS), using an open-source tool that supports data visualisation through dashboards.
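A toy sketch of the dimensional idea applied to fleet data of the kind described above (the schema, plate numbers, and figures are invented for illustration): a trip fact table keyed to a vehicle dimension, rolled up along the vehicle-type hierarchy into a dashboard-ready measure.

```python
from collections import defaultdict

# Hypothetical vehicle dimension table (surrogate key -> descriptive attributes).
vehicles = {
    1: {"plate": "AA-01-BB", "type": "van"},
    2: {"plate": "CC-02-DD", "type": "van"},
    3: {"plate": "EE-03-FF", "type": "truck"},
}

# Hypothetical trip fact table: (vehicle_id, km_travelled, fuel_litres).
trips = [(1, 120.0, 9.6), (2, 80.0, 6.0), (3, 300.0, 45.0)]

def rollup_by_vehicle_type(trips, vehicles):
    """OLAP-style rollup: aggregate trip facts per vehicle type."""
    agg = defaultdict(lambda: {"km": 0.0, "fuel_l": 0.0})
    for vid, km, fuel in trips:
        bucket = agg[vehicles[vid]["type"]]
        bucket["km"] += km
        bucket["fuel_l"] += fuel
    # Derive a dashboard measure: average consumption per 100 km.
    return {vtype: {**m, "l_per_100km": 100.0 * m["fuel_l"] / m["km"]}
            for vtype, m in agg.items()}
```

In a Kimball-style warehouse the same query would run against a star schema, but the aggregation logic is the same.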
Abstract:
The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In other instances still, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through the legislation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies demonstrated that technologies are, in fact, primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs—which determine the affordances of each technology—is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they propitiate particular behaviors and modes of organization but not others, requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. 
In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants—and therefore tend to assume different shapes and offer different possibilities. In the first of these approaches, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers. Thus, according to this approach, the technologies that are now so deeply woven into human experience, are primarily shaped by a relatively small number of producers. In recent years, however, a group of three interconnected interest groups—the makers, hackerspaces, and open source hardware communities—have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.
Abstract:
This study assesses gender differences in spatial and non-spatial relational learning and memory in adult humans behaving freely in a real-world, open-field environment. In Experiment 1, we tested the use of proximal landmarks as conditional cues allowing subjects to predict the location of rewards hidden in one of two sets of three distinct locations. Subjects were tested in two different conditions: (1) when local visual cues marked the potentially-rewarded locations, and (2) when no local visual cues marked the potentially-rewarded locations. We found that only 17 of 20 adults (8 males, 9 females) used the proximal landmarks to predict the locations of the rewards. Although females exhibited higher exploratory behavior at the beginning of testing, males and females discriminated the potentially-rewarded locations similarly when local visual cues were present. Interestingly, when the spatial and local information conflicted in predicting the reward locations, males considered both spatial and local information, whereas females ignored the spatial information. However, in the absence of local visual cues females discriminated the potentially-rewarded locations as well as males. In Experiment 2, subjects (9 males, 9 females) were tested with three asymmetrically-arranged rewarded locations, which were marked by local cues on alternate trials. Again, females discriminated the rewarded locations as well as males in the presence or absence of local cues. In sum, although particular aspects of task performance might differ between genders, we found no evidence that women have poorer allocentric spatial relational learning and memory abilities than men in a real-world, open-field environment.
Abstract:
Since the early 1970s, Canadians have expressed many concerns about the growth of government and its impact on their daily lives. The public has requested increased access to government documents and improved protection of the personal information which is held in government files and data banks. At the same time, both academics and practitioners in the field of public administration have become more interested in the values that public servants bring to their decisions and recommendations. Certain administrative values, such as accountability and integrity, have taken on greater relative importance. The purpose of this thesis is to examine the implementation of Ontario's access and privacy law. It centres on the question of whether or not the Freedom of Information and Protection of Privacy Act, 1987, (FIPPA) has answered the demand for open access to government while at the same time protecting the personal privacy of individual citizens. It also assesses the extent to which this relatively new piece of legislation has made a difference to the people of Ontario. The thesis presents an overview of the issues of freedom of information and protection of privacy in Ontario. It begins with the evolution of the legislation and a description of the law itself. It focuses on the structures and processes which have been established to meet the procedural and administrative demands of the Act. These structures and processes are evaluated in two ways. First, the thesis evaluates how open the Ontario government has become and, second, it determines how carefully the privacy rights of individuals are safeguarded. An analytical framework of administrative values is used to evaluate the overall performance of the government in these two areas. The conclusion is drawn that, overall, the Ontario government has effectively implemented the Freedom of Information and Protection of Privacy Act, particularly by providing access to most government-held documents.
The protection of individual privacy has proved to be not only more difficult to achieve, but more difficult to evaluate. However, the administrative culture of the Ontario bureaucracy is shown to be committed to ensuring that the access and privacy rights of citizens are respected.