820 results for approach to information systems


Relevance:

100.00%

Publisher:

Abstract:

Master's degree in Forest Engineering and Natural Resources - Instituto Superior de Agronomia - UL

Relevance:

100.00%

Publisher:

Abstract:

Objective: To demonstrate that complementary laboratory and imaging studies are unnecessary in first-time unprovoked seizures, since they do not change the evolution and prognosis of the disease; and to study our population, the incidence rate, and the proportion of our patients that were studied and given maintenance treatment, so it can be determined whether or not our population should follow the recommendations of the American Academy of Pediatrics and the Spanish Pediatric Association. Methods: An observational study including patients diagnosed with first-time unprovoked seizures. They were followed up by the emergency department; information was collected from their clinical histories, and the results of the different studies were compared between patients who suffered just one seizure and those who had recurrent seizures. Results: Thirty-one patients were included, 14 males and 17 females. The average age was 5.5 years. All patients (100%) were studied, and the groups were compared. The significant study was the electroencephalogram (EEG), with p=0.02 (significance p<0.05) and an incidence of 41%. Conclusions: The study and diagnosis of first-time unprovoked seizures is based on clinical manifestations. The EEG is important in the study and classification of unprovoked seizures. Our population has incidence and recurrence rates similar to those in the literature, and for that reason this study suggests that the diagnostic and therapeutic guidelines of the American Academy of Pediatrics and the Spanish Pediatric Association should be followed.

Relevance:

100.00%

Publisher:

Abstract:

Dissertation for the degree of Doctor in Design, presented at the Universidade de Lisboa - Faculdade de Arquitetura.

Relevance:

100.00%

Publisher:

Abstract:

Spectral identification of individual micro- and nano-sized particles by the sequential intervention of optical catapulting, optical trapping and laser-induced breakdown spectroscopy is presented [1]. The three techniques are used for different purposes. Optical catapulting (OC) serves to put the particulate material under inspection in aerosol form [2-4]. Optical trapping (OT) permits the isolation and manipulation of individual particles from the aerosol, which are subsequently analyzed by laser-induced breakdown spectroscopy (LIBS). Once catapulted, the dynamics of particle trapping depends on the laser beam characteristics (power and intensity gradient) and on the particle properties (size, mass and shape). Particles are stably trapped in air at atmospheric pressure and can be conveniently manipulated for precise positioning prior to LIBS analysis. The spectra acquired from the individually trapped particles permit a straightforward identification of the inspected material. The current work focuses on the development of a procedure for simultaneously acquiring dual information about the particle under study, via LIBS and time-resolved plasma images, by taking advantage of the aforementioned features of the OC-OT-LIBS instrument to align the multiple lines in a simple yet highly accurate way. The plasma imaging not only further reinforces the spectral data, but also allows a better comprehension of the chemical and physical processes involved in the laser-particle interaction. In addition, a thorough determination of the optimal excitation conditions, generating the most information from each laser event, was carried out alongside the determination of parameters such as the width of the optical trap and its stability as a function of laser power and laser wavelength. The extreme sensitivity of the presented OC-OT-LIBS technology allows a detection power of attograms for single/individual particle analysis.

Relevance:

100.00%

Publisher:

Abstract:

Sequence problems are among the most challenging interdisciplinary topics of our time. They are ubiquitous in science and daily life and occur, for example, in the form of DNA sequences encoding all the information of an organism, as text (natural or formal), or in the form of a computer program. Sequence problems therefore appear in many variations in computational biology (drug development), coding theory, data compression, and quantitative and computational linguistics (e.g. machine translation). In recent years, several proposals have appeared to formulate sequence problems such as the closest string problem (CSP) and the farthest string problem (FSP) as integer linear programming problems (ILPP). In the present talk we introduce a novel general approach to reduce the size of the ILPP by grouping isomorphous columns of the string matrix together. The approach is of practical use, since solving sequence problems is very time-consuming, in particular when the sequences are long.
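As an illustration of the column-grouping idea, the sketch below groups columns of a string matrix that coincide up to a renaming of alphabet symbols, so each class could share one variable block in the ILP with its multiplicity as a weight. The canonicalization used here (relabelling symbols in order of first appearance) is an assumption; the talk's exact grouping criterion and ILP formulation are not given in the abstract.

```python
from collections import Counter

def column_classes(strings):
    """Group the columns of the string matrix into isomorphism classes:
    two columns belong to the same class when one maps onto the other by
    a bijection of the alphabet. Returns class -> multiplicity."""
    classes = Counter()
    for j in range(len(strings[0])):
        column = [s[j] for s in strings]
        # canonical form: relabel symbols in order of first appearance
        relabel = {}
        canon = tuple(relabel.setdefault(c, len(relabel)) for c in column)
        classes[canon] += 1
    return classes

strings = ["ACGT", "ACGA", "TCGA"]
classes = column_classes(strings)  # 4 columns collapse into 3 classes
```

Columns two and three (all-identical symbols) fall into the same class, so an ILP over column classes would carry one variable block fewer than an ILP over raw columns.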

Relevance:

100.00%

Publisher:

Abstract:

Climate change, intensive use, and population growth are threatening the availability of water resources. New sources of water, better knowledge of existing ones, and improved water management strategies are of paramount importance. Groundwater is often considered a primary water source due to its advantages in terms of quantity, spatial distribution, and natural quality. Remote sensing techniques afford scientists a unique opportunity to characterize landscapes in order to assess groundwater resources, particularly in tectonically influenced areas. Aquifers in volcanic basins are considered the most productive aquifers in Latin America. Although topography is considered the primary driving force for groundwater flow in mountainous terrains, tectonic activity increases the complexity of these groundwater systems by altering the integrity of sedimentary rock units and the overlying drainage networks. Structural controls affect the primary hydraulic properties of the rock formations, developing barriers to flow in some cases and zones of preferential infiltration and subterranean flow in others. The study area focuses on the Quito Aquifer System (QAS) in Ecuador. The characterization of the hydrogeology started with a lineament analysis based on a combined remote sensing and digital terrain analysis approach. The application of classical tools for regional hydrogeological evaluation and shallow geophysical methods was useful to evaluate the impact of faulting and fracturing on the aquifer system. Given the spatial extension of the area and the complexity of the system, two levels of analysis were applied in this study. At the regional level, a lineament map was created for the QAS, and relationships between fractures, faults, lineaments, and the configuration of groundwater flow on the QAS were determined.
At the local level, in the Plateaus region of the QAS, a detailed lineament map was obtained using high-spatial-resolution satellite imagery and an aspect map derived from a digital elevation model (DEM). This map was complemented by the analysis of morphotectonic indicators and shallow geophysics to characterize fracture patterns. The development of the groundwater flow system was studied, drawing upon data pertaining to the aquifer system's physical characteristics and topography. Hydrochemistry was used to ascertain the groundwater evolution and verify the correspondence of the flow patterns proposed in the flow system analysis. Isotopic analysis was employed to verify the origin of the groundwater. The results of this study show that tectonism plays a very important role in the hydrology of the QAS. The results also demonstrate that faults influence a great deal of the topographic characteristics of the QAS and, subsequently, the configuration of groundwater flow. Moreover, for the Plateaus region, the results demonstrate that the aquifer flow systems are affected by secondary porosity. This is a new conceptualization of the functioning of the aquifers of the QAS that will significantly contribute to the development of better strategies for the management of this important water resource.

Relevance:

100.00%

Publisher:

Abstract:

The increasing resolution of numerical weather prediction models has allowed more and more realistic forecasts of atmospheric parameters. Due to the growing variability in the predicted fields, traditional verification methods are not always able to describe model ability, because they are based on a grid-point-by-grid-point matching between observation and prediction. Recently, new spatial verification methods have been developed with the aim of showing the benefit associated with high-resolution forecasts. Within the MesoVICT international project, the initial aim of this work is to compare the new techniques, remarking on their advantages and disadvantages. First of all, the MesoVICT basic examples, represented by synthetic precipitation fields, have been examined. Giving an error evaluation in terms of structure, amplitude and location of the precipitation fields, the SAL method has been studied more thoroughly than the other approaches, with its implementation in the core cases of the project. The verification procedure has concerned precipitation fields over central Europe: comparisons between the forecasts performed by the 00z COSMO-2 model and the VERA (Vienna Enhanced Resolution Analysis) have been made. The study of these cases has shown some weaknesses of the methodology examined; in particular, a correlation between the optimal domain size and the extension of the precipitation systems has been highlighted. In order to increase the ability of SAL, the original domain has been subdivided into three subdomains and the method applied again. Some limits have been found in cases in which at least one of the two fields does not show precipitation. The overall results for the subdomains have been summarized in scatter plots. With the aim of identifying systematic errors of the model, the variability of the three parameters has been studied for each subdomain.
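For reference, the amplitude (A) and first location term (L1) of SAL can be sketched as below. The object-based structure component S, which requires thresholding the field into precipitation objects, is omitted, and representing the fields as 2-D lists of rain rates is an assumption of this sketch, not the project's data format.

```python
import math

def sal_amplitude(forecast, observed):
    """Amplitude component A: normalised difference of the domain-mean
    precipitation. A lies in [-2, 2]; 0 is perfect, positive values mean
    the forecast overestimates total rainfall."""
    df = sum(map(sum, forecast)) / (len(forecast) * len(forecast[0]))
    do = sum(map(sum, observed)) / (len(observed) * len(observed[0]))
    return (df - do) / (0.5 * (df + do))

def sal_location_l1(forecast, observed):
    """First location term L1: distance between the fields' centres of
    mass, normalised by the domain diagonal (the largest possible
    distance within the domain)."""
    def centre(field):
        w = sum(map(sum, field))
        cy = sum(i * v for i, row in enumerate(field) for v in row) / w
        cx = sum(j * v for row in field for j, v in enumerate(row)) / w
        return cy, cx
    fy, fx = centre(forecast)
    oy, ox = centre(observed)
    diagonal = math.hypot(len(forecast), len(forecast[0]))
    return math.hypot(fy - oy, fx - ox) / diagonal
```

A forecast that doubles a uniform observed field yields A = 2/3 and L1 = 0, showing how the two components separate amplitude errors from displacement errors.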

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND Bovine tuberculosis (bTB) is a chronic infectious disease mainly caused by Mycobacterium bovis. Although eradication is a priority for the European authorities, bTB remains active or even increasing in many countries, causing significant economic losses. The integral consideration of epidemiological factors is crucial to more cost-effectively allocate control measures. The aim of this study was to identify the nature and extent of the association between TB distribution and a list of potential risk factors regarding cattle, wild ungulates and environmental aspects in Ciudad Real, a Spanish province with one of the highest TB herd prevalences. RESULTS We used a Bayesian mixed effects multivariable logistic regression model to predict TB occurrence in either domestic or wild mammals per municipality in 2007 by using information from the previous year. The municipal TB distribution and endemicity was clustered in the western part of the region and clearly overlapped with the explanatory variables identified in the final model: (1) incident cattle farms, (2) number of years of veterinary inspection of big game hunting events, (3) prevalence in wild boar, (4) number of sampled cattle, (5) persistent bTB-infected cattle farms, (6) prevalence in red deer, (7) proportion of beef farms, and (8) farms devoted to bullfighting cattle. CONCLUSIONS The combination of these eight variables in the final model highlights the importance of the persistence of the infection in the hosts, surveillance efforts and some cattle management choices in the circulation of M. bovis in the region. The spatial distribution of these variables, together with particular Mediterranean features that favour the wildlife-livestock interface may explain the M. bovis persistence in this region. 
Health authorities should allocate efforts towards the specific areas and epidemiological situations where the wildlife-livestock interface seems to critically hamper definitive bTB eradication.
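The prediction step of such a mixed-effects logistic model can be sketched as follows; the variable names and coefficient values are hypothetical, and the actual model in the study was fitted in a Bayesian framework with the eight covariates listed above plus municipality-level random effects.

```python
import math

def tb_probability(x, beta, intercept=0.0, municipality_effect=0.0):
    """Mixed-effects logistic prediction for one municipality:
    P(TB occurrence) = sigmoid(intercept + x . beta + u_j),
    where u_j is that municipality's random effect."""
    eta = intercept + sum(b * v for b, v in zip(beta, x)) + municipality_effect
    return 1.0 / (1.0 + math.exp(-eta))

# hypothetical covariates: incident farms, wild boar prevalence
risk = tb_probability([3.0, 0.4], [0.2, 1.5], intercept=-2.0)
```

With all linear-predictor terms at zero the model returns 0.5, and positive coefficients on risk factors push the predicted probability upward, which is the behaviour the final model's eight explanatory variables encode.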

Relevance:

100.00%

Publisher:

Abstract:

Assessment processes are essential to guarantee quality and continuous improvement of software in healthcare, as they measure software attributes over the lifecycle, verify the degree of alignment between the software and its objectives, and identify unpredicted events. This article analyses the use of an assessment model based on software metrics for three healthcare information systems from a public hospital that provides secondary and tertiary care in the region of Ribeirão Preto. Compliance with the metrics was investigated using questionnaires in guided interviews with the system analysts responsible for the applications. The outcomes indicate that most of the procedures specified in the model can be adopted to assess the systems that serve the organization, particularly with respect to the attributes of compatibility, reliability, safety, portability and usability.

Relevance:

100.00%

Publisher:

Abstract:

Maintenance of transport infrastructure assets is widely advocated as the key to minimizing current and future costs of the transportation network. While effective maintenance decisions are often a result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life-cycle. One essential aspect in the long-term perspective of transport infrastructure maintenance is to proactively estimate maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important to obtain an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and a possibility to make predictions of future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited to the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the selected analysis method in this dissertation, providing solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, displaying the sections in a road network that are weaker due to, e.g., subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear. Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments that have a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect on running time of speed restrictions on a Norwegian railway line using a generalized linear mixed model.
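A minimal sketch of the time-to-event idea behind Paper I, assuming a Weibull lifetime distribution; the parameter values here are illustrative, and the actual models include covariates (traffic, climate, pavement type) and, in Paper II, latent frailty terms.

```python
import math

def weibull_survival(t, shape, scale):
    """S(t) = exp(-(t/scale)**shape): probability that a pavement
    section survives (needs no major maintenance) past age t."""
    return math.exp(-((t / scale) ** shape))

def remaining_median_life(age, shape, scale):
    """Median remaining lifetime of a section that has survived to `age`:
    solves S(age + m) = 0.5 * S(age) for m by inverting the Weibull CDF."""
    s_now = weibull_survival(age, shape, scale)
    t = scale * (-math.log(0.5 * s_now)) ** (1.0 / shape)
    return t - age
```

With shape = 1 the Weibull reduces to the memoryless exponential, so the median remaining life is scale * ln 2 regardless of age; shape > 1 models the wear-out behaviour expected of pavements, where remaining life shrinks as the section ages.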

Relevance:

100.00%

Publisher:

Abstract:

The fisheries for mackerel scad, Decapterus macarellus, are particularly important in Cape Verde, constituting almost 40% of total catches at the peak of the fishery in 1997 and 1998 (3700 tonnes). Catches have been stable at a much lower level of about 2100 tonnes in recent years. Given the importance of mackerel scad in terms of catch weight and local food security, there is an urgent need for an updated assessment. Stock assessment was carried out using a Bayesian approach to biomass dynamic modelling. In order to tackle the problem of a non-informative CPUE series, the intrinsic rate of increase, r, was estimated separately, and the ratio B0/K, initial biomass relative to carrying capacity, was assumed based on available information. The results indicated that the current level of fishing is sustainable. The probability of collapse is low, particularly in the short term, and it is likely that biomass may increase further above Bmsy, indicating a healthy stock level. It would appear that it is relatively safe to increase catches even up to 4000 tonnes. However, the marginal posterior of r was almost identical to the prior, indicating that there is relatively little information content in the CPUE; this was also the case in relation to B0/K. There have been substantial increases in fishing efficiency which have not been adequately captured by the measure used for effort (days or trips), implying that the results may be overly optimistic and should be considered preliminary. (c) 2006 Elsevier B.V. All rights reserved.
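The deterministic core of a biomass dynamic (surplus production) model can be sketched as follows, assuming the Schaefer form; the abstract does not state which production function was used, and the Bayesian estimation of r, K and B0/K is not reproduced here.

```python
def project_biomass(b0, r, K, catches):
    """Schaefer surplus-production dynamics:
    B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t], floored at zero."""
    biomass = [b0]
    for c in catches:
        b = biomass[-1]
        biomass.append(max(b + r * b * (1.0 - b / K) - c, 0.0))
    return biomass

def msy(r, K):
    """Maximum sustainable yield under the Schaefer form: r*K/4."""
    return r * K / 4.0

def b_msy(K):
    """Biomass giving MSY: half the carrying capacity."""
    return K / 2.0
```

The sketch shows why Bmsy matters as a reference point: a stock held at K/2 with a constant catch equal to MSY remains in equilibrium, while catches above the surplus production drive the biomass down.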

Relevance:

100.00%

Publisher:

Abstract:

Dyscalculia is a brain-based condition that makes it hard to make sense of numbers and mathematical concepts. Some adolescents with dyscalculia cannot grasp basic number concepts. They work hard to learn and memorize basic number facts. They may know what to do in mathematics classes but do not understand why they are doing it; in other words, they miss the logic behind it. However, the condition may be worked on in order to decrease its degree of severity. For example, disMAT, an app developed for Android, may help children to apply mathematical concepts without much effort, which makes it a promising tool for dyscalculia treatment. Thus, this work focuses on the development of an intelligent system to estimate evidence of dyscalculia in children, based on data obtained on-the-fly with disMAT. The computational framework is built on top of a Logic Programming framework for Knowledge Representation and Reasoning, complemented with a Case-Based problem-solving approach to computing, which allows for the handling of incomplete, unknown, or even contradictory information.
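A toy sketch of the Case-Based retrieval step, in which unknown attribute values (None) are simply skipped when computing similarity, a crude stand-in for the paper's logic-programming treatment of incomplete information; the attribute names and labels are invented for illustration and do not come from disMAT.

```python
def similarity(case, query):
    """Fraction of matching attributes among those known in both
    records; attributes with unknown (None) values are ignored."""
    shared = [k for k, v in query.items()
              if v is not None and case.get(k) is not None]
    if not shared:
        return 0.0
    return sum(case[k] == query[k] for k in shared) / len(shared)

def retrieve(case_base, query):
    """Return the stored case most similar to the new observation."""
    return max(case_base, key=lambda case: similarity(case, query))

# hypothetical case base of past assessments
case_base = [
    {"number_sense": "low", "fact_recall": "low", "label": "strong evidence"},
    {"number_sense": "high", "fact_recall": "high", "label": "weak evidence"},
]
query = {"number_sense": "low", "fact_recall": None}  # incomplete observation
```

Despite the missing `fact_recall` value, retrieval still succeeds on the remaining attributes, which is the practical point of combining case-based reasoning with explicit handling of unknown information.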

Relevance:

100.00%

Publisher:

Abstract:

Nosocomial infections are a growing concern because they affect a large number of people and increase admission times in healthcare facilities. Additionally, their diagnosis is very difficult, requiring multiple medical exams. This work is therefore focused on the development of a clinical decision support system to prevent these events from happening. The proposed solution is unique in that it caters for the explicit treatment of incomplete, unknown, or even contradictory information on a logic programming basis which, to our knowledge, is done here for the first time.

Relevance:

100.00%

Publisher:

Abstract:

Waiting time at an intensive care unit is a key feature in the assessment of healthcare quality. Nevertheless, its estimation is a difficult task, not only due to the different factors with intricate relations among them, but also with respect to the available data, which may be incomplete, self-contradictory or even unknown. However, its prediction not only improves patients' satisfaction but also enhances the quality of the healthcare being provided. To fulfill this goal, this work aims at the development of a decision support system that allows one to predict how long a patient should remain at an emergency unit, taking into consideration all the remarks stated above. It is built on top of a Logic Programming approach to knowledge representation and reasoning, complemented with a Case-Based approach to computing.

Relevance:

100.00%

Publisher:

Abstract:

Declarative techniques such as Constraint Programming can be very effective in modeling and assisting management decisions. We present a method for managing university classrooms which extends the previous design of a Constraint-Informed Information System to generate timetables while dealing with spatial resource optimization issues. We seek to maximize space utilization along two dimensions: classroom use and occupancy rates. While we want to maximize the room use rate, we still need to satisfy the soft constraints which model students' and lecturers' preferences. We present a constraint logic programming-based local search method which relies on an evaluation function that combines room utilization and timetable soft preferences. Based on this, we developed a tool which we applied to the improvement of classroom allocation in a university. Comparing the results to the current timetables, obtained without optimizing space utilization, the initial version of our tool manages to reach a 30% improvement in space utilization while preserving the quality of the timetable for both students and lecturers.
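An evaluation function combining room utilization and soft-preference satisfaction might be sketched as below; the tuple encoding of an assignment and the weight `alpha` are assumptions of this sketch, not the tool's actual constraint-logic-programming representation.

```python
def evaluate(assignment, alpha=0.7):
    """Score a timetable for the local search.
    `assignment` is a list of tuples per scheduled class:
    (class_size, room_capacity, soft_prefs_met, soft_prefs_total).
    Score = alpha * mean occupancy rate + (1 - alpha) * preference rate,
    so moves that pack rooms better win only if they do not cost too
    many satisfied student/lecturer preferences."""
    occupancy = sum(min(size / cap, 1.0)
                    for size, cap, _, _ in assignment) / len(assignment)
    met = sum(m for _, _, m, _ in assignment)
    total = sum(t for _, _, _, t in assignment)
    preference = met / total if total else 1.0
    return alpha * occupancy + (1 - alpha) * preference

score = evaluate([(30, 60, 1, 2), (45, 45, 1, 2)])
```

A local search would accept a candidate room swap only when it raises this score, which is one way to trade occupancy gains against soft-constraint violations within a single objective.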