911 results for Case Based Computing


Relevance:

30.00%

Publisher:

Abstract:

A cross-sectional survey on schistosomiasis was carried out in Comercinho (Minas Gerais State, Brazil), a town with 1474 inhabitants. Stool examinations (Kato-Katz method) were performed on 90% of the population, and physical examinations on 84% of the individuals over 2 years of age. Ecological and individual (case-control) analyses were used to investigate the relation between splenomegaly and S. mansoni egg counts in different age groups. In the ecological analysis there was a clear correspondence between a higher geometric mean egg count and a higher percentage of splenomegaly in the 5-9 and 10-12 year age groups. The individual analysis found that only in the youngest individuals (5-8 or 5-9 years old) was splenomegaly related to higher mean faecal egg counts, with a tendency for egg excretion to decrease in patients with splenomegaly as age increased. These results strongly suggest that the ecological data are a better indicator of the severity of schistosomiasis in endemic areas, as the decrease in egg excretion in patients with splenomegaly may be a confounding variable for the individual analysis.
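As an illustration of the egg-count statistic used in the ecological analysis, a minimal sketch computing the geometric mean of Kato-Katz egg counts per age group (the counts below are hypothetical, not the survey data):

```python
import math

def geometric_mean(egg_counts):
    """Geometric mean of positive egg counts (eggs per gram of stool)."""
    logs = [math.log(c) for c in egg_counts]
    return math.exp(sum(logs) / len(logs))

# Hypothetical egg counts for two age groups, for illustration only.
group_5_9 = [24, 96, 384, 48, 192]
group_10_12 = [48, 192, 768, 96, 384]

print(round(geometric_mean(group_5_9), 1))    # 96.0
print(round(geometric_mean(group_10_12), 1))  # 192.0
```

The geometric mean is preferred over the arithmetic mean here because egg counts are strongly right-skewed.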

Relevance:

30.00%

Publisher:

Abstract:

Students of a Cardiopulmonary Sciences curriculum in a Portuguese higher education institution have shown poor learning outcomes and low satisfaction in a course on lung function tests. A transmissive pedagogical approach, mainly based on lectures, was the common teaching practice. Aiming for a change, problem-based learning (PBL) was considered a powerful alternative and a contribution towards progressively innovating the curriculum. Purpose: to create PBL activities in a lung function tests course; to describe their implementation; to analyse the effects of PBL integration on students' performance and attitudes; and to characterize the generated learning environment.

Relevance:

30.00%

Publisher:

Abstract:

Extracting the semantic relatedness of terms is an important topic in several areas, including data mining, information retrieval and web recommendation. This paper presents an approach for computing the semantic relatedness of terms using the knowledge base of DBpedia, a community effort to extract structured information from Wikipedia. Several approaches to extracting semantic relatedness from Wikipedia using bag-of-words vector models are already available in the literature. The research presented in this paper explores a novel approach using paths on an ontological graph extracted from DBpedia. It is based on an algorithm for finding and weighting a collection of paths connecting concept nodes. This algorithm was implemented in a tool called Shakti that extracts relevant ontological data for a given domain from DBpedia using its SPARQL endpoint. To validate the proposed approach, Shakti was used to recommend web pages on a Portuguese social site devoted to alternative music, and the results of that experiment are reported in this paper.
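The path-based idea can be sketched as follows. This is a toy illustration with an invented graph and an assumed weighting (exponential decay with path length), not the actual algorithm implemented in Shakti:

```python
from collections import defaultdict

# Toy ontological graph; in the paper the graph is extracted from DBpedia
# via its SPARQL endpoint, and the node names here are invented.
edges = [
    ("Joy_Division", "Post-punk"),
    ("Post-punk", "New_Order"),
    ("Joy_Division", "Manchester"),
    ("New_Order", "Manchester"),
]
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

def relatedness(src, dst, max_len=3, decay=0.5):
    """Sum weights over simple paths of up to max_len edges between
    src and dst; shorter paths contribute more (weight = decay**edges)."""
    total = 0.0
    stack = [(src, [src])]
    while stack:
        node, path = stack.pop()
        if node == dst and len(path) > 1:
            total += decay ** (len(path) - 1)
            continue
        if len(path) - 1 >= max_len:
            continue
        for nxt in graph[node]:
            if nxt not in path:   # keep paths simple (no repeated nodes)
                stack.append((nxt, path + [nxt]))
    return total

print(relatedness("Joy_Division", "New_Order"))  # two 2-edge paths: 0.5
```

Two concepts connected by many short paths thus score higher than concepts linked only by long, indirect chains.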

Relevance:

30.00%

Publisher:

Abstract:

Paper presented at the 8th European Conference on Knowledge Management (ECKM 2007), Barcelona, 6-7 September 2007. URL: http://www.academic-conferences.org/eckm/eckm2007/eckm07-home.htm

Relevance:

30.00%

Publisher:

Abstract:

This paper appears in the International Journal of Information and Communication Technology Education, edited by Lawrence A. Tomei. Copyright 2007, IGI Global, www.igi-global.com. Posted by permission of the publisher. URL: http://www.idea-group.com/journals/details.asp?id=4287.

Relevance:

30.00%

Publisher:

Abstract:

Master's Thesis, Academic Year 2007/2008. European Master's Degree in Human Rights and Democratization (E.MA), European Inter-university Centre for Human Rights and Democratization (EIUC), Faculdade de Direito, Universidade Nova de Lisboa (UNL).

Relevance:

30.00%

Publisher:

Abstract:

Belief revision is a critical issue in real-world DAI applications. A multi-agent system not only has to cope with the intrinsic incompleteness and the constant change of the available knowledge (as in the case of its stand-alone counterparts), but also has to deal with possible conflicts between the agents' perspectives. Each semi-autonomous agent, designed as a combination of a problem solver and an assumption-based truth maintenance system (ATMS), was enriched with improved capabilities: a distributed context management facility allowing the user to dynamically focus on the more pertinent contexts, and a distributed belief revision algorithm with two levels of consistency. The contributions of this work include: (i) a concise representation of the shared external facts; (ii) a simple and innovative methodology to achieve distributed context management; and (iii) a reduced inter-agent data exchange format. The different levels of consistency adopted were based on the relevance of the data under consideration: higher-relevance data (detected inconsistencies) was granted global consistency, while less relevant data (system facts) was assigned local consistency. These abilities are fully supported by the standard ATMS functionalities.
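A minimal sketch of the ATMS labelling idea underlying such agents: each belief carries the sets of assumptions (environments) that support it, and any environment subsumed by a known inconsistency (a "nogood") is discarded. The names and data model below are illustrative, not the paper's implementation:

```python
# Illustrative ATMS-style belief labels; names are hypothetical.
class Belief:
    def __init__(self, name, environments):
        self.name = name
        # Each environment is a set of assumptions that justifies the belief.
        self.environments = [frozenset(e) for e in environments]

def consistent_environments(belief, nogoods):
    """Drop environments that contain a known-inconsistent assumption set."""
    return [env for env in belief.environments
            if not any(bad <= env for bad in nogoods)]

b = Belief("door_open", [{"sensor_a"}, {"sensor_b", "sensor_c"}])
nogoods = [frozenset({"sensor_b", "sensor_c"})]  # detected inconsistency
print(consistent_environments(b, nogoods))  # only the sensor_a support survives
```

A belief whose every environment is subsumed by a nogood loses all support and is retracted, which is the essence of assumption-based belief revision.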

Relevance:

30.00%

Publisher:

Abstract:

The Tagus estuary is bordered by the largest metropolitan area in Portugal, which includes Lisbon, the capital city. It has suffered the impact of several major tsunamis in the past, as shown by a recent revision of the catalogue of tsunamis that struck the Portuguese coast over the past two millennia. Hence, the exposure of the population and infrastructure established along the riverfront is a critical concern for the civil protection services. The main objectives of this work are to determine critical inundation areas in Lisbon and to quantify the associated severity through a simple index derived from the local maximum of the momentum flux per unit mass and width. The methodology is based on the mathematical modelling of a tsunami propagating along the estuary, resembling the one that occurred on 1 November 1755 following the Mw 8.5 Great Lisbon Earthquake. The simulation tool employed was STAV-2D, a shallow-flow solver coupled with conservation equations for fine solid phases, now featuring the novelty of discrete Lagrangian tracking of large debris. Different sets of initial conditions were studied, combining distinct tidal, atmospheric and fluvial scenarios, so that the civil protection services were provided with comprehensive information to devise public warning and alert systems and post-event mitigation interventions. For the most severe scenario, the results show a maximum inundation extent of 1.29 km at the Alcântara valley and water depths reaching nearly 10 m along Lisbon's riverfront.
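In shallow-flow terms, the momentum flux per unit mass and width is h·v², with h the water depth and v the flow velocity. A minimal sketch of such a severity index; the class thresholds below are hypothetical and used only to illustrate how the index would be binned, not the paper's calibration:

```python
def severity_index(depth_m, velocity_ms):
    """Momentum flux per unit mass and width: h * v**2 (units m^3/s^2)."""
    return depth_m * velocity_ms ** 2

def severity_class(hv2):
    # Hypothetical thresholds for illustration only; the paper derives its
    # own index from the local maximum of h * v**2 along the riverfront.
    if hv2 < 1.0:
        return "low"
    if hv2 < 10.0:
        return "moderate"
    return "high"

# 2 m of water moving at 3 m/s: index 2 * 3**2 = 18 m^3/s^2.
print(severity_class(severity_index(2.0, 3.0)))  # "high"
```

Because the index grows with the square of velocity, fast shallow flows can be rated as severe as slow deep ones, which matches its use as a damage proxy.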

Relevance:

30.00%

Publisher:

Abstract:

In this paper we exploit the nonlinear properties of SiC multilayer devices to design an optical processor for error detection that enables reliable delivery of four-wave-mixing spectral data over unreliable communication channels. The SiC optical processor is realized using a double pin/pin a-SiC:H photodetector with front and back biased optical gating elements. Visible pulsed signals are transmitted together in different bit sequences. The combined optical signal is analyzed. The data show that the background acts as a selector that picks one or more states by splitting portions of the input multi-optical signals across the front and back photodiodes. Boolean operations such as XOR and three-bit addition are demonstrated optically, showing that when one or all of the inputs are present the system behaves as an XOR gate, representing the SUM, while when two or three inputs are on, the system acts as an AND gate, indicating the presence of the CARRY bit. Additional parity logic operations are performed using four incoming pulsed communication channels that are transmitted and checked for errors together. As a simple example of this approach, we describe an all-optical processor for error detection and then provide an experimental demonstration of this idea. (C) 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
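The optical behaviour described corresponds to a full adder's truth logic: the SUM is the three-input XOR (high when one or all three inputs are on) and the CARRY is the majority function (high when two or three inputs are on). A short digital reference for the optical demonstration:

```python
def sum_bit(a, b, c):
    """Three-input XOR: high when exactly one or all three inputs are on."""
    return a ^ b ^ c

def carry_bit(a, b, c):
    """Majority function: high when two or more inputs are on."""
    return (a & b) | (a & c) | (b & c)

# Enumerate the full truth table of the one-bit full adder.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            print(a, b, c, "->", carry_bit(a, b, c), sum_bit(a, b, c))
```

Reading CARRY and SUM together as a two-bit number gives the arithmetic sum of the three input bits, which is the "three-bit addition" demonstrated optically.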

Relevance:

30.00%

Publisher:

Abstract:

Given the significant impact that cultural events may have on local communities and their inherent organizational complexity, it is important to understand their specificities. Most of the time, cultural events disregard marketing, and marketing is often distant from art. An analysis from an inside perspective might therefore bring significant returns to the organization of such an event. This paper considers the three editions (2011, 2012 and 2013) of a cultural event, Noc Noc, organized by a local association in the city of Guimarães, Portugal. Its format is based on analogous events, as Noc Noc intends to convert everyday spaces (homes, commercial outlets and a number of other buildings) into cultural spaces, processed and transformed by artists, hosts and audiences. By interviewing a sample of 20 people who have hosted this cultural event, sometimes doubling as artists, and by experiencing its three editions, this paper illustrates how the internal public understands this particular cultural event, analyzing specifically their motivations, ways of acting and participating, and their relationship with the public, with the organization of the event and with art in general. The results support that artists' and hosts' motivations, as well as their views of this particular cultural event, must be identified at a timely and appropriate moment in order to keep them participating, since low-budget cultural events such as this one may play a key role in small-scale cities.

Relevance:

30.00%

Publisher:

Abstract:

In this paper a new method for the self-localization of mobile robots, based on a PCA positioning sensor able to operate in unstructured environments, is proposed and experimentally validated. The proposed PCA extension is able to compute the eigenvectors from a set of signals corrupted by missing data. The sensor package considered in this work contains a 2D depth sensor pointed upwards at the ceiling, providing depth images with missing data. The resulting positioning sensor is then integrated in a Linear Parameter Varying mobile robot model to obtain a self-localization system, based on linear Kalman filters, with globally stable position error estimates. A study consisting of adding synthetic randomly corrupted data to the captured depth images revealed that this extended PCA technique is able to reconstruct the signals with improved accuracy. The self-localization system obtained is assessed in unstructured environments, and the methodologies are validated even under varying illumination conditions.
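A minimal sketch of the problem setting: handling missing entries before extracting a principal direction, here via simple mean imputation followed by power iteration on the covariance. This only illustrates the idea; the paper's PCA extension reconstructs the corrupted signals with a more careful scheme:

```python
# Missing entries are represented as None (as in depth images with holes).

def impute_column_means(rows):
    """Replace each None with the mean of its column's observed values."""
    cols = len(rows[0])
    means = []
    for j in range(cols):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return [[means[j] if r[j] is None else r[j] for j in range(cols)]
            for r in rows]

def leading_eigenvector(rows, iters=200):
    """Power iteration on the (unnormalised) covariance of the rows."""
    cols = len(rows[0])
    means = [sum(r[j] for r in rows) / len(rows) for j in range(cols)]
    centered = [[r[j] - means[j] for j in range(cols)] for r in rows]
    v = [1.0] * cols
    for _ in range(iters):
        w = [0.0] * cols
        for r in centered:
            dot = sum(r[j] * v[j] for j in range(cols))
            for j in range(cols):
                w[j] += dot * r[j]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy data with y roughly 2x and one missing value.
data = [[1.0, 2.0], [2.0, None], [3.0, 6.1], [4.0, 8.0]]
v = leading_eigenvector(impute_column_means(data))
print(v)  # close to the direction (1, 2), up to normalisation
```

Mean imputation biases the recovered direction when much data is missing, which is precisely the weakness that motivates PCA extensions designed for corrupted signals.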

Relevance:

30.00%

Publisher:

Abstract:

Glass fibre-reinforced plastics (GFRP) have been considered inherently difficult to recycle due both to the crosslinked nature of thermoset resins, which cannot be remoulded, and to the complex composition of the composite itself. Presently, most GFRP waste is landfilled, leading to negative environmental impacts and additional costs. With increasing awareness of environmental matters and the consequent desire to save resources, recycling would convert an expensive waste disposal problem into a profitable reusable material. In this study, efforts were made to recycle ground GFRP waste, proceeding from pultrusion production scrap, into new and sustainable composite materials. For this purpose, GFRP waste recyclates were incorporated into polyester-based mortars as fine aggregate and filler replacements at different load contents and particle size distributions. The potential recycling solution was assessed through the mechanical behaviour of the resulting GFRP-waste-modified polymer mortars. Results revealed that GFRP-waste-filled polymer mortars present improved flexural and compressive behaviour over unmodified polyester-based mortars, indicating the feasibility of reusing GFRP industrial waste in concrete-polymer composite materials.

Relevance:

30.00%

Publisher:

Abstract:

“Many-core” systems based on a Network-on-Chip (NoC) architecture offer various opportunities in terms of performance and computing capabilities, but at the same time they pose many challenges for the deployment of real-time systems, which must fulfil specific timing requirements at runtime. It is therefore essential to identify, at design time, the parameters that have an impact on the execution time of the tasks deployed on these systems, together with upper bounds on the other key parameters. The focus of this work is to determine an upper bound on the traversal time of a packet transmitted over the NoC infrastructure. Towards this aim, we first identify and explore some limitations of the existing recursive-calculus-based approaches to computing the Worst-Case Traversal Time (WCTT) of a packet. Then, we extend the existing model by integrating the characteristics of the tasks that generate the packets. For this extended model, we propose an algorithm called “Branch and Prune” (BP). Our proposed method provides tighter yet safe estimates than the existing recursive-calculus-based approaches. Finally, we introduce a more general approach, “Branch, Prune and Collapse” (BPC), which offers a configurable parameter providing a flexible trade-off between computational complexity and the tightness of the computed estimate. The recursive-calculus methods and BP are two special cases of BPC, obtained when the trade-off parameter is 1 or ∞, respectively. Through simulations, we analyze this trade-off, reason about the implications of certain choices, and provide case studies to observe the impact of task parameters on the WCTT estimates.
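The branch-and-prune idea, in toy form: branch over candidate interference scenarios for the analyzed packet and prune those that contradict what the generating tasks can actually release inside the analysis window. The data model below is invented for illustration and is far simpler than the paper's wormhole-NoC analysis:

```python
from itertools import combinations

def wctt_upper_bound(base_latency, interferers):
    """Toy WCTT bound. interferers is a list of
    (delay_per_packet, max_packets_in_window) pairs, one per
    potentially interfering flow; task characteristics cap how many
    packets a flow can inject inside the analysis window."""
    best = base_latency
    for k in range(len(interferers) + 1):
        for combo in combinations(interferers, k):   # branch
            # Prune scenarios requiring releases the tasks cannot produce.
            if any(n <= 0 for _, n in combo):
                continue
            best = max(best, base_latency + sum(d * n for d, n in combo))
    return best

# Third flow's task releases no packets in the window, so it is pruned:
print(wctt_upper_bound(10, [(3, 2), (5, 1), (4, 0)]))  # 10 + 3*2 + 5*1 = 21
```

A recursive-calculus-style bound would charge all three flows regardless of task behaviour, giving 25 here; integrating task characteristics is what tightens the estimate.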

Relevance:

30.00%

Publisher:

Abstract:

Even though Software Transactional Memory (STM) is one of the most promising approaches to simplifying concurrent programming, current STM implementations incur significant overheads that render them impractical for many real-sized programs. The key insight of this work is that we do not need to use the same costly barriers for all the memory managed by a real-sized application: if only a small fraction of the memory is under contention, lightweight barriers may be used for the rest. In this work, we propose a new solution based on adaptive object metadata (AOM) to promote the use of a fast path for accessing objects that are not under contention. We show that this approach makes the performance of an STM competitive with the best fine-grained lock-based approaches in some of the more challenging benchmarks. (C) 2015 Elsevier Inc. All rights reserved.
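The fast-path/slow-path idea behind adaptive object metadata can be sketched as follows. Class and method names are hypothetical, and a real STM would perform the metadata transition atomically with respect to concurrent accesses:

```python
import threading

class AdaptiveObject:
    """Illustrative sketch: an object starts with minimal metadata and is
    "inflated" with heavier synchronization only once contention appears."""

    def __init__(self, value):
        self.value = value
        self.contended = False   # lightweight metadata: a single flag
        self.lock = None

    def inflate(self):
        # Contention detected: switch this object to the slow path by
        # attaching per-object lock metadata.
        self.contended = True
        self.lock = threading.Lock()

    def read(self):
        if not self.contended:
            return self.value    # fast path: no barrier, no locking
        with self.lock:          # slow path: synchronized access
            return self.value

obj = AdaptiveObject(42)
print(obj.read())   # fast path
obj.inflate()
print(obj.read())   # slow path, same value
```

Since most objects in a real-sized program never become contended, almost all accesses stay on the cheap path, which is where the performance gain comes from.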

Relevance:

30.00%

Publisher:

Abstract:

Paper developed for the unit “Innovation Economics and Management” of the PhD programme in Technology Assessment at the Universidade Nova de Lisboa, 2009-10, under the supervision of Prof. Maria Luísa Ferreira.