923 results for Data Systems
Abstract:
Networked control systems (NCSs) are spatially distributed systems in which communication between sensors, actuators, and controllers occurs over a shared band-limited digital communication network. The use of a shared communication network, in contrast to several dedicated independent connections, introduces new challenges, which become even more acute in large-scale, dense networked control systems. In this paper we investigate a recently introduced technique for gathering information from a dense sensor network for use in networked control applications. Efficiently obtaining an approximate interpolation of the sensed data offers a good trade-off between accuracy in the measurement of the input signals and the delay to actuation, both important aspects for the quality of control. We introduce a variation of the state-of-the-art algorithms which we prove performs better because it takes into account the changes of the input signal over time within the process of obtaining the approximate interpolation.
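A minimal sketch of the flavor of technique this abstract describes, assuming an inverse-distance interpolator whose sample weights also decay with sample age, so that changes of the input signal over time are reflected in the approximation; the function name and parameters are hypothetical, not taken from the paper:

```python
import math

def approx_field_value(query_xy, samples, t_now, p=2.0, tau=0.5):
    """Estimate the sensed field at query_xy from timestamped samples.

    Hypothetical sketch: inverse-distance weighting in which each sample
    is also discounted by its age, so stale readings from a fast-changing
    signal contribute less to the interpolated value.

    samples: iterable of (x, y, value, t_sample)
    p:       distance-weighting exponent
    tau:     time constant controlling how quickly old samples fade
    """
    num, den = 0.0, 0.0
    for x, y, value, t_sample in samples:
        d = math.hypot(query_xy[0] - x, query_xy[1] - y)
        if d == 0.0:
            return value  # exact hit on a sensor location
        age = max(t_now - t_sample, 0.0)
        w = (1.0 / d ** p) * math.exp(-age / tau)  # spatial x temporal weight
        num += w * value
        den += w
    return num / den

# Example: three sensors; the oldest reading is discounted the most.
samples = [(0.0, 0.0, 10.0, 0.9), (1.0, 0.0, 14.0, 1.0), (0.0, 1.0, 12.0, 0.2)]
print(approx_field_value((0.5, 0.5), samples, t_now=1.0))
```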
Abstract:
Introduction: the multimodality environment creates a requirement for greater understanding of the imaging technologies used, the limitations of these technologies, and how best to interpret the results, alongside dose optimization, the introduction of new techniques, and the gap between current practice and best practice. Incidental findings in low-dose CT images obtained as part of the hybrid imaging process are an increasing phenomenon with advancing CT technology, raising ethical and medico-legal dilemmas; understanding the limitations of these procedures is important when reporting images and recommending follow-up. A free-response observer performance study was used to evaluate lesion detection in low-dose CT images obtained during attenuation correction acquisitions for myocardial perfusion imaging on two hybrid imaging systems.
Abstract:
Dissertation presented to obtain the Ph.D. degree in Bioinformatics
Abstract:
Dissertation submitted to obtain the Master's Degree in Informatics Engineering
Abstract:
Dissertation submitted to obtain the Master's Degree in Informatics Engineering
Abstract:
Complex systems, i.e. systems composed of a large set of elements interacting in a non-linear way, are found all around us. In recent decades, different approaches have been proposed for understanding them, one of the most interesting being the complex network perspective. This legacy of 18th-century mathematical concepts proposed by Leonhard Euler remains current, and is increasingly relevant to real-world problems. In recent years, it has been demonstrated that network-based representations can yield relevant knowledge about complex systems. Nevertheless, several problems have been detected, mainly related to the degree of subjectivity involved in the creation and evaluation of such network structures. In this Thesis, we propose addressing these problems by means of different data mining techniques, thus obtaining a novel hybrid approach intermingling complex networks and data mining. Results indicate that such techniques can be effectively used to i) enable the creation of novel network representations, ii) reduce the dimensionality of the analyzed systems by pre-selecting the most important elements, iii) describe complex networks, and iv) assist in the analysis of different network topologies. The soundness of the approach is validated through different validation cases drawn from actual biomedical problems, e.g. the diagnosis of cancer from tissue analysis or the study of brain dynamics under different neurological disorders.
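As an illustration of how data mining can drive the creation of a network representation and pre-select the most important elements, here is a hypothetical sketch (the feature names, threshold, and ranking rule are invented, not the thesis's method): features are linked when their correlation exceeds a threshold, and the most connected elements are retained.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def correlation_network(features, threshold=0.7):
    """Link features i and j when |corr| exceeds threshold; return edge set."""
    names = list(features)
    edges = set()
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            if abs(pearson(features[u], features[v])) >= threshold:
                edges.add((u, v))
    return edges

def top_nodes(edges, k=2):
    """Pre-select the k most connected elements (simple degree ranking)."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return sorted(deg, key=deg.get, reverse=True)[:k]

features = {
    "gene_a": [1.0, 2.1, 3.2, 4.1],
    "gene_b": [0.9, 2.0, 2.9, 4.2],
    "gene_c": [4.0, 0.5, 3.5, 1.0],
}
edges = correlation_network(features)
print(edges, top_nodes(edges))
```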
Abstract:
Retail services are a major contributor to municipal budgets and an activity that affects perceived quality of life, especially for those with mobility difficulties (e.g. the elderly and low-income citizens). However, there is evidence of a decline in some of the services market towns provide to their citizens, a decline reported across the western world, from North America to Australia. The aim of this research was to understand retail decline and shed light on ways of addressing it, using a case study of Thornbury, a small town in the Southwest of England. Data were collected through two participatory approaches: photo-surveys and multicriteria mapping. Interpretation of the data relied on participants as analysts, but also on systems thinking (systems diagramming and social trap theory) for theory building. This research moves away from mainstream economic and town-planning perspectives by drawing on methods and concepts from anthropology and visual sociology (photo-surveys) and from decision-making and ecological economics (multicriteria mapping and social trap theory). In sum, this research has experimented with different methods, outside their usual context, to analyse retail decline in a small town. It developed a conceptual model of retail decline and identified the existence of conflicting goals and interests, their implications for retail decline, and their causes, most of which have received little attention in the literature. It also identified that some of the measures commonly used to deal with retail decline may themselves be contributing to its causes. Additionally, this research reviewed measures that can be used to deal with retail decline, discussed implications for policy-making, and reflected on the use of the data collection and analysis methods in the context of small to medium towns.
Abstract:
In recent years, many research efforts have been made to improve the design of ETL (Extract-Transform-Load) systems. ETL systems are considered very time-consuming, error-prone, and complex, involving several participants from different knowledge domains. ETL processes are among the most important components of a data warehousing system and are strongly influenced by the complexity of business requirements and by their change and evolution. These aspects influence not only the structure of the data warehouse but also the structures of the data sources involved. To minimize the negative impact of such variables, we propose the use of ETL patterns to build specific ETL packages. In this paper, we formalize this approach using BPMN (Business Process Model and Notation) to model ETL workflows at a more conceptual level, mapping them to real execution primitives through a domain-specific language that allows the generation of specific instances executable in a commercial ETL tool.
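A toy sketch of the pattern idea, not the paper's actual DSL: a declarative workflow specification (standing in for the BPMN/DSL layer) is expanded into executable primitives, so each pattern instance maps onto a concrete ETL step. All names here are hypothetical.

```python
# Each primitive is a generator transforming a stream of row dicts.

def extract(rows):
    yield from rows

def cleanse_nulls(rows, default=""):
    # Pattern: replace missing values with a default before loading.
    for row in rows:
        yield {k: (v if v is not None else default) for k, v in row.items()}

def surrogate_key(rows, key="id"):
    # Pattern: assign a warehouse-side surrogate key to each row.
    for sk, row in enumerate(rows, start=1):
        yield {**row, key: sk}

PRIMITIVES = {"extract": extract, "cleanse_nulls": cleanse_nulls,
              "surrogate_key": surrogate_key}

def run_workflow(spec, source):
    """Expand a declarative pattern list into a pipeline and run it."""
    stream = source
    for step in spec:
        name, params = step["pattern"], step.get("params", {})
        stream = PRIMITIVES[name](stream, **params)
    return list(stream)

workflow = [{"pattern": "extract"},
            {"pattern": "cleanse_nulls", "params": {"default": "n/a"}},
            {"pattern": "surrogate_key"}]
src = [{"name": "Ana", "city": None}, {"name": "Rui", "city": "Braga"}]
print(run_workflow(workflow, src))
```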
Abstract:
Published in "Information control in manufacturing 1998 : (INCOM'98) : advances in industrial engineering : a proceedings volume from the 9th IFAC Symposium, Nancy-Metz, France, 24-26 June 1998. Vol. 2"
Abstract:
Analog Single Event Transients (ASETs) are produced by the interaction of a heavy ion or a high-energy proton with a sensitive device in an analog circuit. The interaction of the ion with a bipolar or MOS field-effect transistor induces electron-hole pairs that cause spikes, which can propagate to the output of the analog component as transients capable of inducing system-level failures. The most severe problems due to this phenomenon occur in the space environment, which is very rich in heavy ions; typical cases are the on-board computers of satellites and other spacecraft. However, owing to the continuous shrinking of transistor dimensions (which brings increased sensitivity), the phenomenon has begun to be observed at sea level, caused mainly by the impact of atmospheric neutrons. These effects can cause severe problems for computing systems with analog interfaces from which they obtain data for processing, and they have become one of the most serious problems faced by designers of highly integrated systems; typical cases are Systems-on-Chip that include high-performance processing modules alongside analog interfaces.

The general objective of the project is to study the susceptibility of computing systems to ASETs in their analog sections, proposing strategies for error mitigation. The specific objectives are to:
- propose new ASET models based on device-level simulations solved by the finite element method;
- use these models to identify the sections most prone to producing errors, and hence the best candidates for radiation-hardening techniques;
- use these models to study the nature of the errors produced in data processing systems;
- propose novel solutions to mitigate these effects within the analog circuits themselves, preventing their propagation to the digital sections;
- propose solutions to mitigate the effects at the system level.

The project follows a bottom-up research procedure, starting from physical-level descriptions and then raising the level of abstraction at which the circuit is modelled. Physical modelling of the MOS devices, solved by the finite element method, is proposed; injecting charge into the sensitive regions of the models will make it possible to determine the profiles of the current pulses that must be injected at circuit level to emulate these effects. These procedures will be carried out for the different building blocks of the analog interfaces, proposing error-mitigation strategies at different levels.

The expected results of the project include hardware for error detection and tolerance to this type of event, increasing the dependability of information processing systems, as well as new data on radiation effects in semiconductors, new transient-fault models enabling circuit-level simulation of these events, and the identification of the sensitive regions of typical analog interfaces that must be radiation-hardened.
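At circuit level, the injected strike current is commonly emulated with a double-exponential pulse; a minimal sketch with illustrative (not project-specific) parameter values:

```python
import math

def set_current_pulse(t, q_coll=0.2e-12, tau_f=200e-12, tau_r=50e-12):
    """Double-exponential current pulse commonly used to model, at
    circuit level, the charge a particle strike injects at a node.

    q_coll: collected charge (C); tau_f/tau_r: fall/rise time constants (s).
    The prefactor normalizes the pulse so it integrates to q_coll.
    """
    if t < 0:
        return 0.0
    k = q_coll / (tau_f - tau_r)
    return k * (math.exp(-t / tau_f) - math.exp(-t / tau_r))

# Sample the pulse that would be injected at a sensitive node.
for t_ps in (0, 50, 100, 200, 400, 800):
    t = t_ps * 1e-12
    print(f"{t_ps:4d} ps  {set_current_pulse(t) * 1e6:8.3f} uA")
```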
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2014
Abstract:
ABSTRACT: BACKGROUND: Cardiovascular magnetic resonance (CMR) has favorable characteristics for the diagnostic evaluation and risk stratification of patients with known or suspected CAD. CMR utilization in CAD detection is growing fast; however, data on its cost-effectiveness are scarce. The goal of this study is to compare the costs of two strategies for detecting significant coronary artery stenoses in patients with suspected coronary artery disease (CAD): 1) performing CMR first to assess myocardial ischemia and/or infarct scar, referring only positive patients (defined as showing ischemia and/or infarct scar) to coronary angiography (CXA), versus 2) a hypothetical CXA performed in all patients as a single test to detect CAD. METHODS: A subgroup of the European CMR pilot registry was used, including 2,717 consecutive patients who underwent stress-CMR. Of these patients, 21% were positive for CAD (ischemia and/or infarct scar), 73% were negative, and 6% were uncertain and underwent additional testing. Diagnostic costs were evaluated using the invoicing costs of each test performed. Cost analysis was performed from a health care payer perspective in the German, United Kingdom, Swiss, and United States health care settings. RESULTS: In the public sectors of the German, United Kingdom, and Swiss health care systems, cost savings from the CMR-driven strategy were 50%, 25%, and 23%, respectively, versus outpatient CXA. If CXA was carried out as an inpatient procedure, cost savings were 46%, 50%, and 48%, respectively. In the United States context, cost savings were 51% when compared with inpatient CXA, but costs were 8% higher for CMR than for outpatient CXA. CONCLUSION: This analysis suggests that, from an economic perspective, the use of CMR should be encouraged as a management option for patients with suspected CAD.
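The comparison reduces to simple expected-value arithmetic. A sketch with illustrative prices (the registry's actual invoicing costs are not reproduced here); only the referral fractions come from the abstract:

```python
def expected_cost(c_cmr, c_cxa, p_referred=0.21 + 0.06):
    """Per-patient cost of a CMR-first gatekeeper strategy versus
    sending every patient straight to invasive angiography (CXA).

    p_referred: fraction referred on to CXA after CMR (positive plus
    uncertain cases, using the registry proportions quoted above).
    """
    gatekeeper = c_cmr + p_referred * c_cxa
    direct = c_cxa
    return gatekeeper, direct

# Illustrative (not actual registry) prices:
cmr_first, cxa_all = expected_cost(c_cmr=600.0, c_cxa=2500.0)
print(f"CMR-first: {cmr_first:.0f}  CXA-for-all: {cxa_all:.0f}  "
      f"saving: {100 * (1 - cmr_first / cxa_all):.0f}%")
```

With these invented prices the gatekeeper strategy costs 1,275 versus 2,500 per patient, a saving of about 49%, in line with the magnitudes the abstract reports.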
Abstract:
Secondary accident statistics can be useful for studying the impact of traffic incident management strategies. An easy-to-implement methodology is presented for classifying secondary accidents through data fusion of a police accident database with intranet incident reports. A current method for classifying secondary accidents uses a static threshold representing the spatial and temporal region of influence of the primary accident, such as two miles and one hour: an accident is considered secondary if it occurs upstream of the primary accident within that duration and queue. However, a static threshold may produce both false positives and false negatives because accident queues are constantly varying. The methodology presented in this report improves on the existing method by making the threshold dynamic. An incident progression curve is used to mark the end of the queue throughout the entire incident. Its development involves four steps: (1) processing the intranet incident reports; (2) filling in incomplete incident reports; (3) nonlinear regression of incident progression curves; and (4) merging individual incident progression curves into one master curve. To illustrate the methodology, 5,514 accidents from Missouri freeways were analyzed. The results show that secondary accidents identified by dynamic versus static thresholds can differ by more than 30%.
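A compact sketch of the dynamic-threshold test described above, using an invented piecewise-linear progression curve in place of the report's fitted nonlinear curves; all parameter values are illustrative:

```python
def queue_length(t_since_primary, growth=0.08, clear_start=45.0, clear_rate=0.15):
    """Hypothetical incident progression curve: the queue (miles) grows
    while the incident is active, then shrinks once clearance begins.
    Time is in minutes after the primary accident."""
    if t_since_primary < 0:
        return 0.0
    if t_since_primary <= clear_start:
        return growth * t_since_primary
    return max(growth * clear_start - clear_rate * (t_since_primary - clear_start), 0.0)

def is_secondary(primary_mile, primary_time, acc_mile, acc_time):
    """Dynamic-threshold test: the candidate must occur after the primary,
    upstream of it, and inside the back of queue predicted by the curve."""
    dt = acc_time - primary_time      # minutes after the primary
    dist = primary_mile - acc_mile    # miles upstream, assuming traffic flows toward higher mileposts
    return dt >= 0 and 0 <= dist <= queue_length(dt)

print(is_secondary(primary_mile=120.0, primary_time=0.0,
                   acc_mile=118.5, acc_time=30.0))   # True: inside the 2.4-mile queue
print(is_secondary(120.0, 0.0, 115.0, 30.0))          # False: beyond the back of queue
```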
Abstract:
This report evaluates the use of remotely sensed images in implementing the Iowa DOT linear referencing system (LRS), which is currently in the system architecture stage. The Iowa Department of Transportation is investing a significant amount of time and resources in the creation of an LRS. A significant portion of the implementation effort will be the creation of a datum, which includes geographically locating anchor points and then measuring the anchor-section distances between those anchor points. Currently, system architecture and the evaluation of different data collection methods for establishing the LRS datum are being performed for the DOT by an outside consulting team.
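A hypothetical sketch of the datum concept: anchor points with known coordinates, anchor sections with field-measured lengths, and events located by distance along a section. The class and field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class AnchorPoint:
    name: str
    x: float
    y: float

@dataclass
class AnchorSection:
    start: AnchorPoint
    end: AnchorPoint
    measured_len: float  # ground-measured distance, not the straight-line chord

def locate(section, offset):
    """Interpolate an event's coordinates at `offset` along the section.

    Linear interpolation is a simplification; a real datum would follow
    the digitized road geometry between the two anchor points.
    """
    f = offset / section.measured_len
    x = section.start.x + f * (section.end.x - section.start.x)
    y = section.start.y + f * (section.end.y - section.start.y)
    return x, y

a = AnchorPoint("AP-1", 0.0, 0.0)
b = AnchorPoint("AP-2", 3.0, 4.0)
sec = AnchorSection(a, b, measured_len=5.2)  # measured length exceeds the 5.0 chord
print(locate(sec, 2.6))  # an event halfway along the section
```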
Abstract:
We present a participant study that compares biological data exploration tasks using volume renderings of laser confocal microscopy data across three environments that vary in level of immersion: a desktop, a fishtank, and a cave system. For the tasks, data, and visualization approach used in our study, we found that subjects qualitatively preferred, and quantitatively performed better in, the cave compared with the fishtank and desktop. Subjects performed real-world biological data analysis tasks that emphasized understanding spatial relationships, including characterizing the general features in a volume, identifying colocated features, and reporting geometric relationships such as whether clusters of cells were coplanar. After analyzing data in each environment, subjects were asked to choose which environment they wanted to use to analyze additional data sets; subjects uniformly selected the cave environment.