921 results for graphical overlay


Relevance:

10.00%

Publisher:

Abstract:

This final year project presents the design principles and prototype implementation of BIMS (Biomedical Information Management System), a flexible software system that provides an infrastructure to manage all information required by biomedical research projects. The BIMS project was initiated to address several limitations in the medical data acquisition of some research projects in which Universitat Pompeu Fabra takes part. These limitations, rooted in the lack of control mechanisms to constrain the information submitted by clinicians, degrade data quality. BIMS can easily be adapted to manage information from a wide variety of clinical studies and is not limited to a given clinical specialty. The software can manage both textual information, such as clinical data (measurements, demographics, diagnostics, etc.), and several kinds of medical images (magnetic resonance imaging, computed tomography, etc.). Moreover, BIMS provides a web-based graphical user interface and is designed to be deployed in a distributed and multiuser environment. It is built on top of open source software products and frameworks. Specifically, BIMS has been used to represent all clinical data currently used within the CardioLab platform (an ongoing project managed by Universitat Pompeu Fabra), demonstrating that it is a solid software system that could fulfill the requirements of a real production environment.
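
Since the abstract emphasizes adapting to arbitrary studies and constraining what clinicians may submit, here is a hedged sketch of one common way to achieve that, an entity-attribute-value record plus a validation rule table. This is illustrative only; BIMS's actual schema, class names, and rules are not described in the abstract.

    # Hedged sketch, not BIMS's actual design: an entity-attribute-value
    # (EAV) layout stores clinical data for arbitrary studies as rows
    # tagged with study, subject, and attribute name, and a rule table
    # constrains submitted values. All names and ranges are invented.
    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class ClinicalRecord:
        study: str                  # e.g. "CardioLab"
        subject_id: str
        attribute: str              # e.g. "systolic_bp", "diagnosis"
        value: Union[str, float]

    records = [
        ClinicalRecord("CardioLab", "P001", "systolic_bp", 128.0),
        ClinicalRecord("CardioLab", "P001", "diagnosis", "arrhythmia"),
    ]

    # the kind of submission constraint the abstract motivates
    RANGES = {"systolic_bp": (60.0, 250.0)}

    def valid(rec: ClinicalRecord) -> bool:
        lo_hi = RANGES.get(rec.attribute)
        return lo_hi is None or (isinstance(rec.value, float)
                                 and lo_hi[0] <= rec.value <= lo_hi[1])

    print([valid(r) for r in records])   # [True, True]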

Relevance:

10.00%

Publisher:

Abstract:

Almost 30 years ago, Bayesian networks (BNs) were developed in the field of artificial intelligence as a framework intended to assist researchers and practitioners in applying the theory of probability to inference problems of more substantive size and, thus, to more realistic and practical problems. Since the late 1980s, Bayesian networks have also attracted researchers in forensic science, and this tendency has intensified considerably throughout the last decade. This review article provides an overview of the scientific literature describing research on Bayesian networks as a tool for studying, developing and implementing probabilistic procedures for evaluating the probative value of particular items of scientific evidence in forensic science. Primary attention is given to evaluative issues pertaining to forensic DNA profiling evidence, because this is one of the main categories of evidence whose assessment has been studied through Bayesian networks. The scope of topics is large and includes almost any aspect that relates to forensic DNA profiling. Typical examples are inference of source (or 'criminal identification'), relatedness testing, database searching and special trace evidence evaluation (such as mixed DNA stains or stains with low quantities of DNA). The perspective of the review presented here is not restricted exclusively to DNA evidence; it also includes relevant references and discussion on both the concept of Bayesian networks and their general usage in the legal sciences as one among several graphical approaches to evidence evaluation.
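
To make the kind of computation these networks support concrete, below is a minimal sketch of source-level evidence evaluation with a two-node network H -> E, evaluated by brute-force enumeration. The structure follows the generic pattern discussed in this literature; the random match probability and prior are invented illustrations, not values from any case or from the review.

    # Two-node Bayesian network: hypothesis H (suspect vs unknown source)
    # and evidence E (profile match). The numbers below are assumptions
    # chosen for illustration only.
    RANDOM_MATCH_PROB = 1e-9   # assumed profile frequency in the population
    PRIOR_SUSPECT = 0.5        # assumed prior Pr(H = suspect)

    # conditional probability table Pr(E = match | H)
    p_match_given = {
        "suspect": 1.0,                 # assume no typing error
        "unknown": RANDOM_MATCH_PROB,   # random match probability
    }

    def likelihood_ratio() -> float:
        """LR = Pr(E | H_p) / Pr(E | H_d) for the observed match."""
        return p_match_given["suspect"] / p_match_given["unknown"]

    def posterior_suspect(prior: float) -> float:
        """Pr(H = suspect | E = match) by enumeration over H."""
        joint_s = prior * p_match_given["suspect"]
        joint_u = (1 - prior) * p_match_given["unknown"]
        return joint_s / (joint_s + joint_u)

    print(f"LR = {likelihood_ratio():.3e}")
    print(f"posterior = {posterior_suspect(PRIOR_SUSPECT):.10f}")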

Relevance:

10.00%

Publisher:

Abstract:

Deterioration in portland cement concrete (PCC) pavements can occur due to distresses caused by a combination of traffic loads and weather conditions. Hot mix asphalt (HMA) overlay is the most commonly used rehabilitation technique for such deteriorated PCC pavements. However, the performance of these HMA overlaid pavements is hindered by the occurrence of reflective cracking, resulting in a significant reduction of pavement serviceability. Various fractured slab techniques, including rubblization, crack and seat, and break and seat, are used to minimize reflective cracking by reducing the slab action. However, the design of structural overlay thickness for cracked and seated and rubblized pavements is difficult because the resulting structure is neither a “true” rigid pavement nor a “true” flexible pavement. Existing design methodologies use empirical procedures based on the AASHO Road Test conducted from 1958 to 1960. However, the AASHO Road Test did not employ any fractured slab technique, and there are numerous limitations associated with extrapolating its results to HMA overlay thickness design for fractured PCC pavements. The main objective of this project is to develop a mechanistic-empirical (ME) approach to HMA overlay thickness design for fractured PCC pavements. In this design procedure, failure criteria, namely the tensile strain at the bottom of the HMA layer and the vertical compressive strain on the surface of the subgrade, are used to account for HMA fatigue and subgrade rutting, respectively. The developed ME design system is also implemented in a Visual Basic computer program. A partial validation of the design method with reference to an instrumented trial project (IA-141, Polk County) in Iowa is provided in this report. Tensile strain values at the bottom of the HMA layer collected from the FWD testing at this project site are in agreement with the results obtained from the developed computer program.
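
As a sketch of how such failure criteria feed a thickness design, the widely published Asphalt Institute transfer functions are shown below. The coefficients actually calibrated in this report may differ, and the strain inputs would come from a layered-elastic response model, so treat this purely as an illustration.

    # Illustrative ME failure checks for an HMA overlay, using the commonly
    # cited Asphalt Institute transfer functions (not necessarily the
    # calibration in this report). Strains in in/in, HMA modulus E in psi.

    def fatigue_repetitions(eps_t: float, e_hma_psi: float) -> float:
        """Allowable repetitions before HMA fatigue cracking, driven by
        the tensile strain at the bottom of the HMA layer."""
        return 0.0796 * (1.0 / eps_t) ** 3.291 * (1.0 / e_hma_psi) ** 0.854

    def rutting_repetitions(eps_v: float) -> float:
        """Allowable repetitions before excessive subgrade rutting, driven
        by the vertical compressive strain on the subgrade surface."""
        return 1.365e-9 * (1.0 / eps_v) ** 4.477

    # example strains of the size a response model might predict
    eps_t, eps_v, e_hma = 150e-6, 300e-6, 450_000.0
    n_f = fatigue_repetitions(eps_t, e_hma)
    n_r = rutting_repetitions(eps_v)
    print(f"fatigue life ~ {n_f:,.0f} repetitions")
    print(f"rutting life ~ {n_r:,.0f} repetitions")
    print("governing criterion:", "fatigue" if n_f < n_r else "rutting")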

Relevance:

10.00%

Publisher:

Abstract:

This project studies and tests the benefits of NX Remote Desktop technology for administrative use in the Finnish Meteorological Institute's existing Linux Terminal Server Project (LTSP) environment. This was motivated by the criticality of the system, caused by the growing number of users as the LTSP system expands. Although many support tasks can be done via a Secure Shell connection, testing graphical programs or desktop behaviour that way is impossible. First, the basic technologies behind NX Remote Desktop were studied, and then two candidate programs, FreeNX and NoMachine NX Server, were tested. Functionality and bandwidth demands were first tested in a closed local area network, and the results were studied. The better candidate was then installed on a virtual server simulating the actual LTSP server at the Finnish Meteorological Institute, and connections from the Internet were tested to identify any problems with firewalls and security policies. The results are reported in this study. Studying and testing the two candidates showed that NoMachine NX Server provides better customer support and documentation. The security requirements of the Finnish Meteorological Institute also had to be considered, and since updates, along with new development tools, are announced for the next version of the program, this version was chosen. The tests also show that although NoMachine promises a responsive connection over an average bandwidth of 20 kbit/s, at least double that is needed. This project gives an overview of available remote desktop products and their benefits. NX Remote Desktop technology is studied, and installation instructions are included. Testing was done in both the closed and the actual environment, and problems and suggestions are studied and analyzed. The installation on the actual LTSP server has not yet been made, but a virtual server has been set up in the same place from the point of view of network topology. This ensures that if the administrators are satisfied with the system, installing and setting up the system will proceed as described in this report.

Relevance:

10.00%

Publisher:

Abstract:

The growth-associated, presynaptic protein GAP-43 is important for axonal growth during brain development, for synaptic plasticity, and for axonal regeneration [Benowitz, Routtenberg, TINS 12 (1987) 527]. It has been speculated that such growth may be mediated by cytoskeletal proteins. However, the interaction of GAP-43 with proteins of the presynaptic terminal is poorly characterized. Here, we analyze GAP-43 binding to cytoskeletal proteins by two different biochemical assays, blot overlay and sedimentation. We find that immobilized brain spectrin (BS) is able to bind GAP-43. In contrast, little binding was observed to microtubule proteins and other elements of the cytoskeleton. Since GAP-43 is located presynaptically, it may bind to the presynaptic form of BS (SpIISigma1). It is attractive to think that such an interaction would participate in the structural plasticity observed in growth cones and adult synapses.

Relevance:

10.00%

Publisher:

Abstract:

Biplots are graphical displays of data matrices based on the decomposition of a matrix as the product of two matrices. Elements of these two matrices are used as coordinates for the rows and columns of the data matrix, with an interpretation of the joint presentation that relies on the properties of the scalar product. Because the decomposition is not unique, there are several alternative ways to scale the row and column points of the biplot, which can cause confusion amongst users, especially when software packages are not united in their approach to this issue. We propose a new scaling of the solution, called the standard biplot, which applies equally well to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. The standard biplot also handles data matrices with widely different levels of inherent variance. Two concepts taken from correspondence analysis are important to this idea: the weighting of row and column points, and the contributions made by the points to the solution. In the standard biplot one set of points, usually the rows of the data matrix, optimally represent the positions of the cases or sample units, which are weighted and usually standardized in some way unless the matrix contains values that are comparable in their raw form. The other set of points, usually the columns, is represented in accordance with their contributions to the low-dimensional solution. As for any biplot, the projections of the row points onto vectors defined by the column points approximate the centred and (optionally) standardized data. The method is illustrated with several examples to demonstrate how the standard biplot copes in different situations to give a joint map which needs only one common scale on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot readable. The proposal also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important.
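
As background for the scaling question, every biplot starts from the singular-value decomposition; the sketch below shows the generic principal-coordinate/standard-coordinate split for a centred matrix. It is not the standard biplot of this abstract (which additionally weights the points and rescales one set by contributions), only the shared machinery.

    # Minimal SVD biplot sketch: rows as principal coordinates, columns as
    # standard coordinates. This illustrates the (non-unique) decomposition
    # the abstract refers to, not the standard-biplot scaling itself.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 5))
    Xc = X - X.mean(axis=0)           # column-centre the data matrix

    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    F = U * s                         # row (case) coordinates: U @ diag(s)
    G = Vt.T                          # column (variable) coordinates: V

    # the scalar-product property: F @ G.T reconstructs the centred data,
    # and the first two columns of F and G give the 2-D biplot coordinates
    assert np.allclose(F @ G.T, Xc)
    rows_2d, cols_2d = F[:, :2], G[:, :2]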

Relevance:

10.00%

Publisher:

Abstract:

In recent years, thin whitetopping has evolved as a viable rehabilitation technique for deteriorated asphalt cement concrete (ACC) pavements. Numerous projects have been constructed and tested, allowing researchers to identify the important elements contributing to the projects’ successes. These elements include surface preparation, overlay thickness, synthetic fiber reinforcement usage, joint spacing, and joint sealing. Although the main factors affecting thin whitetopping performance have been identified by previous research, questions still existed as to the optimum design incorporating these variables. The objective of this research is to investigate the interaction between these variables over time. Laboratory testing and field testing were conducted to achieve the research objectives. Laboratory testing involved shear testing of the bond between the portland cement concrete (PCC) overlay and the ACC surface. Field testing involved falling weight deflectometer deflection responses, measurement of joint faulting and joint opening, and visual distress surveys on the 9.6-mile project. The project was located on Iowa Highway 13 extending north from the city of Manchester, Iowa, to Iowa Highway 3 in Delaware County. Variables investigated include ACC surface preparation, PCC thickness, slab size, synthetic fiber reinforcement usage, and joint spacing. This report documents the planning, construction, and performance of each variable in the time period from summer 2002 through spring 2006. The project has performed well with only minor distress identification since its construction.

Relevance:

10.00%

Publisher:

Abstract:

Internet traffic characteristics are increasingly complex due to the growing diversity of applications, drastic differences in user behaviour, the mobility of users and devices, the complexity of traffic generation and control mechanisms, and the growing diversity of access types and their capacities. In this scenario it is inevitable that network management becomes increasingly based on real-time traffic measurements. Because of the large amount of information that must be processed and stored, there is also a growing need for traffic measurement platforms to adopt a distributed architecture, enabling distributed storage, replication and efficient search of the measured data, possibly mimicking the Peer-to-Peer (P2P) paradigm. This dissertation describes the specification, implementation and testing of a traffic measurement system with a distributed P2P-style architecture, which provides network managers with a tool to remotely configure monitoring systems installed at various points of the network in order to carry out traffic measurements. The system can also be used in community-oriented networks where users can share the resources of their machines to let others perform measurements and share the resulting data. The system is based on an overlay network with a hierarchical structure organized into measurement areas. The overlay network comprises two types of nodes, called probes and super-probes, which perform the measurements and store their results. The super-probes are additionally responsible for linking measurement areas and for managing the exchange of messages between the network and the probes connected to them. The topology of the overlay network can change dynamically, with the insertion of new nodes and the removal of others, and with the promotion of probes to super-probes and vice versa, in response to changes in the available resources. Nodes store two types of measurement results: Light Data Files (LDFs) and Heavy Data Files (HDFs). LDFs hold information about the average round-trip delay from each super-probe to all elements connected to it and are replicated on all super-probes, providing a simple but easily accessible view of the state of the network. HDFs hold the detailed results of packet-level or flow-level measurements and may be replicated on some nodes of the network. The replicas are distributed across the network taking into account the resources available at the nodes, in order to guarantee resilience to failures. Users can configure measurements and search the results through an element called the client. Several evaluation tests of the system were carried out, demonstrating that it operates correctly and efficiently.
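
A rough sketch of the two-tier overlay just described is given below: probes attach to a super-probe, super-probes interconnect across measurement areas, and each new LDF entry (a round-trip delay summary) is replicated on every super-probe. All class and field names are invented for illustration; the dissertation's actual design is not reproduced.

    from dataclasses import dataclass, field

    @dataclass
    class Probe:
        node_id: str
        hdf_store: dict = field(default_factory=dict)   # detailed packet/flow results

    @dataclass
    class SuperProbe:
        node_id: str
        area: str                                       # measurement area
        probes: list = field(default_factory=list)      # attached probes
        peers: list = field(default_factory=list)       # other super-probes
        ldf: dict = field(default_factory=dict)         # avg RTT per element

        def record_rtt(self, element_id: str, rtt_ms: float) -> None:
            """Store an LDF entry and replicate it on all super-probes."""
            self.ldf[element_id] = rtt_ms
            for peer in self.peers:
                peer.ldf[element_id] = rtt_ms

    def promote(probe: Probe, home: SuperProbe, area: str) -> SuperProbe:
        """Promote a probe to super-probe when its resources allow."""
        home.probes.remove(probe)
        sp = SuperProbe(probe.node_id, area, peers=[home] + home.peers)
        for peer in sp.peers:
            peer.peers.append(sp)
        sp.ldf.update(home.ldf)         # inherit the replicated LDF view
        return sp

    # two areas, one super-probe each; a probe that later gets promoted
    a, b = SuperProbe("sp-a", "area-1"), SuperProbe("sp-b", "area-2")
    a.peers.append(b); b.peers.append(a)
    p = Probe("p-1"); a.probes.append(p)
    a.record_rtt("p-1", 12.5)
    c = promote(p, a, "area-3")
    print(b.ldf)                        # LDF entry visible on every super-probe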

Relevance:

10.00%

Publisher:

Abstract:

Asphalt pavement recycling has grown dramatically over the last few years as a viable technology to rehabilitate existing asphalt pavements. Iowa's current Cold In-place Recycling (CIR) practice utilizes a generic recipe specification to define the characteristics of the CIR mixture. As CIR continues to evolve, the desire to place CIR mixture with specific engineering properties requires the use of a mix design process. A new mix design procedure was developed for Cold In-place Recycling using foamed asphalt (CIR-foam) in consideration of its predicted field performance. The new laboratory mix design process was validated against various Reclaimed Asphalt Pavement (RAP) materials to determine its consistency over a wide range of RAP materials available throughout Iowa. The performance tests, which include dynamic modulus test, dynamic creep test and raveling test, were conducted to evaluate the consistency of a new CIR-foam mix design process to ensure reliable mixture performance over a wide range of traffic and climatic conditions. The “lab designed” CIR will allow the pavement designer to take the properties of the CIR into account when determining the overlay thickness.

Relevance:

10.00%

Publisher:

Abstract:

The forensic two-trace problem is a perplexing inference problem introduced by Evett (J Forensic Sci Soc 27:375-381, 1987). Different possible ways of wording the competing pair of propositions (i.e., one proposition advanced by the prosecution and one proposition advanced by the defence) led to different quantifications of the value of the evidence (Meester and Sjerps in Biometrics 59:727-732, 2003). Here, we re-examine this scenario with the aim of clarifying the interrelationships that exist between the different solutions, and in this way, produce a global vision of the problem. We propose to investigate the different expressions for evaluating the value of the evidence by using a graphical approach, i.e. Bayesian networks, to model the rationale behind each of the proposed solutions and the assumptions made on the unknown parameters in this problem.
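
For orientation, every solution discussed assigns the evidence a value of the generic likelihood-ratio form below; what distinguishes the competing quantifications is precisely how the propositions $H_p$ and $H_d$ are worded for the two stains, not the form itself:

$$ V = \frac{\Pr(E \mid H_p, I)}{\Pr(E \mid H_d, I)}, \qquad \frac{\Pr(H_p \mid E, I)}{\Pr(H_d \mid E, I)} = V \times \frac{\Pr(H_p \mid I)}{\Pr(H_d \mid I)}, $$

where $E$ denotes the findings and $I$ the background information.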

Relevance:

10.00%

Publisher:

Abstract:

Graphical displays which show inter-sample distances are important for the interpretation and presentation of multivariate data. Except when the displays are two-dimensional, however, they are often difficult to visualize as a whole. A device, based on multidimensional unfolding, is described for presenting some intrinsically high-dimensional displays in fewer, usually two, dimensions. This goal is achieved by representing each sample by a pair of points, say $R_i$ and $r_i$, so that a theoretical distance between the $i$-th and $j$-th samples is represented twice, once by the distance between $R_i$ and $r_j$ and once by the distance between $R_j$ and $r_i$. Self-distances between $R_i$ and $r_i$ need not be zero. The mathematical conditions for unfolding to exhibit symmetry are established. Algorithms for finding approximate fits, not constrained to be symmetric, are discussed and some examples are given.
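
A hedged numerical sketch of this doubled representation: given a target distance matrix $D$ and candidate point pairs $R_i$, $r_i$ in two dimensions, the code below measures how well the configuration reproduces each $d(i,j)$ through both $|R_i - r_j|$ and $|R_j - r_i|$. Fitting the configuration (the approximate-fit algorithms of the abstract) is deliberately omitted.

    import numpy as np

    def unfolding_stress(D: np.ndarray, R: np.ndarray, r: np.ndarray) -> float:
        """Sum of squared errors over both representations of each d(i, j)."""
        # rep[i, j] = distance between R_i and r_j
        rep = np.linalg.norm(R[:, None, :] - r[None, :, :], axis=-1)
        off = ~np.eye(len(D), dtype=bool)      # self-distances need not be zero
        return float((((rep - D) ** 2)[off]).sum())

    n = 4
    rng = np.random.default_rng(1)
    X = rng.normal(size=(n, 5))                # high-dimensional samples
    D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    R, r = rng.normal(size=(n, 2)), rng.normal(size=(n, 2))
    print(f"stress of a random 2-D configuration: {unfolding_stress(D, R, r):.2f}")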

Relevance:

10.00%

Publisher:

Abstract:

In order to interpret the biplot it is necessary to know which points, usually the variables, are the important contributors to the solution, and this information is available separately as part of the biplot's numerical results. We propose a new scaling of the display, called the contribution biplot, which incorporates this diagnostic directly into the graphical display, showing visually the important contributors and thus facilitating the biplot interpretation and often simplifying the graphical representation considerably. The contribution biplot can be applied to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA, in fact to any method based on the singular-value decomposition. In the contribution biplot one set of points, usually the rows of the data matrix, optimally represent the spatial positions of the cases or sample units, according to some distance measure that usually incorporates some form of standardization unless all data are comparable in scale. The other set of points, usually the columns, is represented by vectors that are related to their contributions to the low-dimensional solution. A fringe benefit is that usually only one common scale for row and column points is needed on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot legible. Furthermore, this version of the biplot also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important, when they are in fact contributing minimally to the solution.
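
A hedged sketch of the diagnostic itself: in the unweighted (plain SVD) case, the squared entries of each right singular vector sum to one and can be read as the variables' contributions to that axis. Weighted methods such as correspondence analysis bring point masses into the computation, and the contribution biplot then uses these values to scale the column points; none of that refinement is shown here.

    # Contributions of variables to each principal axis for an unweighted
    # SVD: each row of Vt is a unit vector, so its squared entries sum to 1
    # and give each variable's share of that axis.
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(30, 4)) * np.array([1.0, 1.0, 5.0, 0.1])
    Xc = X - X.mean(axis=0)

    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    contributions = Vt**2              # rows: axes, columns: variables

    for k in range(2):
        shares = ", ".join(f"{c:.2f}" for c in contributions[k])
        print(f"axis {k + 1} contributions: {shares}")   # each row sums to 1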

Relevance:

10.00%

Publisher:

Abstract:

This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.

Relevance:

10.00%

Publisher:

Abstract:

Statistical computing when input/output is driven by a graphical user interface is considered. A proposal is made for automatic control of computational flow, ensuring that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators, such as histograms or frequency polygons.
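
The idea can be sketched as a small dependency graph whose nodes cache their value and recompute only when something upstream has changed. The class below is an invented Python illustration; the paper's implementation language and its symbolic-manipulation details are not reproduced.

    from collections import Counter

    class Node:
        """A graph node that caches its value and tracks downstream users."""
        def __init__(self, compute, inputs=()):
            self.compute, self.inputs = compute, list(inputs)
            self.value, self.dirty = None, True
            self.dependents = []
            for node in self.inputs:
                node.dependents.append(self)

        def invalidate(self):
            """Mark this node and everything downstream as needing work."""
            self.dirty = True
            for node in self.dependents:
                node.invalidate()

        def get(self):
            """Recompute only if dirty; otherwise serve the cached value."""
            if self.dirty:
                self.value = self.compute(*(n.get() for n in self.inputs))
                self.dirty = False
            return self.value

    # data -> binned counts: in a GUI, editing the data (or a bin-width
    # control) would call invalidate(), and only the affected path reruns.
    data = Node(lambda: [1.0, 1.2, 3.5, 3.6, 3.9])
    counts = Node(lambda xs: Counter(int(x) for x in xs), inputs=[data])
    print(counts.get())   # computed
    print(counts.get())   # served from cache, nothing recomputed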

Relevance:

10.00%

Publisher:

Abstract:

A tool for user choice of the local bandwidth function for a kernel density estimate is developed using KDE, a graphical object-oriented package for interactive kernel density estimation written in LISP-STAT. The bandwidth function is a cubic spline, whose knots are manipulated by the user in one window, while the resulting estimate appears in another window. A real data illustration of this method raises concerns, because an extremely large family of estimates is available.
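
A minimal sketch of the estimator such a tool manipulates: a kernel density estimate whose bandwidth varies over the sample, with the bandwidth function supplied as an ordinary callable. The cubic-spline knots and the interactive LISP-STAT windows are not reproduced, and the particular bandwidth function below is an invented stand-in for what a user would shape by hand.

    import numpy as np

    def local_bandwidth_kde(x_grid, data, h):
        """f(x) = (1/n) * sum_i K((x - x_i) / h(x_i)) / h(x_i), Gaussian K."""
        h_i = h(data)                                   # bandwidth at each data point
        z = (x_grid[:, None] - data[None, :]) / h_i     # standardized distances
        k = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)  # Gaussian kernel
        return (k / h_i).mean(axis=1)

    rng = np.random.default_rng(3)
    data = np.concatenate([rng.normal(-2, 0.3, 200), rng.normal(2, 1.0, 200)])
    grid = np.linspace(-6.0, 6.0, 241)

    # invented local bandwidth: narrow near the tight left mode, wider elsewhere
    h = lambda x: 0.2 + 0.1 * np.abs(x + 2)
    density = local_bandwidth_kde(grid, data, h)
    print(f"mass on the grid ~ {(density * (grid[1] - grid[0])).sum():.3f}")  # near 1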