996 results for Process Visualization
Abstract:
A two-dimensional array based on single-photon avalanche diodes for triggered imaging systems is presented. The diodes are operated in the gated mode of acquisition to reduce the probability of detecting noise counts that interfere with photon arrival events. In addition, low reverse-bias overvoltages are used to lessen the dark count rate. Experimental results demonstrate that the prototype, fabricated in a standard HV-CMOS process, eliminates afterpulsing and offers a reduced dark count probability under the proposed modes of operation. The detector exhibits a dynamic range of 15 bits with short gated-"on" periods of 10 ns and a reverse-bias overvoltage of 1.0 V.
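As a rough numerical illustration (not a figure reported in the abstract), the benefit of a short gated-"on" window can be seen by estimating the chance that a dark count falls inside one 10 ns gate; the dark count rate used below is an assumed placeholder.

```python
# Hedged back-of-the-envelope example of why short gated-"on" windows
# suppress noise: the chance of a dark count falling inside the gate is
# roughly DCR * t_gate. The dark count rate below is an assumed figure,
# not a measurement from the reported prototype.
import math

dcr_hz = 20e3          # assumed dark count rate at low overvoltage
t_gate_s = 10e-9       # 10 ns gated-"on" period from the abstract

p_dark_per_gate = 1.0 - math.exp(-dcr_hz * t_gate_s)
print(f"dark-count probability per 10 ns gate ~ {p_dark_per_gate:.1e}")  # ~2e-4
```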
Abstract:
The need to advance our knowledge of the subatomic world has stimulated the development of new particle colliders. However, the objectives of the next generation of colliders set unprecedented challenges for detector performance. The purpose of this contribution is to present a two-dimensional array based on avalanche photodiodes operated in Geiger mode to track high-energy particles in future linear colliders. The array can function in a gated mode to reduce the probability of detecting noise counts that interfere with real events, and low reverse overvoltages are used to lessen the dark count rate. Experimental results demonstrate that the prototype, fabricated in a standard HV-CMOS process, exhibits increased efficiency and avoids sensor blindness when the proposed techniques are applied.
Abstract:
The coverage and volume of geo-referenced datasets are extensive and constantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in exploratory data analysis and in extracting the knowledge embedded in these data. However, the special characteristics of such data pose new challenges for visualization and clustering: complex structures, large numbers of samples, variables embedded in a temporal context, high dimensionality, and large variability in cluster shapes.

The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization that assist knowledge extraction from spatio-temporal geo-referenced data, thus improving decision-making processes. I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and one for exploratory visual data analysis, the Tree-structured Self-Organizing Map Component Planes. In addition, I present methodologies that, combined with FGHSON and the Tree-structured SOM Component Planes, integrate space and time seamlessly and simultaneously in order to extract knowledge embedded in a temporal context.

The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical, fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability in cluster shapes, variances, densities, and number of clusters. The most important characteristics of the FGHSON are: (1) it does not require an a priori setting of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, so when dealing with large datasets the processes can be distributed, reducing the computational cost; and (3) only three parameters are needed to set up the algorithm.

In the case of the Tree-structured SOM Component Planes, the novelty lies in the ability to create a structure that supports visual exploratory analysis of large, high-dimensional datasets. The algorithm builds a hierarchical structure of Self-Organizing Map Component Planes, arranging the projections of similar variables in the same branches of the tree. Hence, similarities in the variables' behavior (e.g., local correlations, maximal and minimal values, and outliers) can be easily detected.

Both FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving to be very efficient for the exploratory analysis and clustering of spatio-temporal datasets. In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and our original contribution, the FGHSON. Although these algorithms have been used in several areas, to my knowledge no previous work has applied and compared their performance on spatio-temporal geospatial data as presented in this thesis.

I also propose original methodologies to explore spatio-temporal geo-referenced datasets through time. Our approach uses time windows to capture temporal similarities and variations with the FGHSON clustering algorithm. The developed methodologies are applied in two case studies: in the first, the objective was to find similar agroecozones through time; in the second, to find similar environmental patterns shifted in time. Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production. Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface that integrates the FGHSON algorithm with Google Earth to show zones with similar agroecological characteristics.
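As a sketch only, the snippet below implements a plain self-organizing map training loop in Python, the soft competitive learning step that SOM-family methods such as GHSOM and the FGHSON build on; it is not the FGHSON itself, and all grid sizes, rates, and data are illustrative assumptions.

```python
# Minimal self-organizing map (SOM) sketch in NumPy. This is NOT the
# FGHSON algorithm from the thesis; it only illustrates the basic
# soft competitive learning update that SOM-family methods share.
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    # Unit coordinates on the map lattice and randomly initialized codebooks.
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    weights = rng.normal(size=(n_units, data.shape[1]))
    n_steps = epochs * len(data)
    for t, x in enumerate(data[rng.permutation(n_steps) % len(data)]):
        frac = t / n_steps
        lr = lr0 * (1 - frac)                  # learning-rate decay
        sigma = sigma0 * (1 - frac) + 0.5      # neighborhood shrinks over time
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)      # lattice distance to BMU
        h = np.exp(-d2 / (2 * sigma ** 2))                   # Gaussian neighborhood
        weights += lr * h[:, None] * (x - weights)           # soft competitive update
    return weights.reshape(grid[0], grid[1], -1)

# Usage: each plane som[:, :, k] is the component plane of variable k,
# the object that a tree-structured component-plane visualization groups.
som = train_som(np.random.rand(500, 4))
```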
Abstract:
Hepatitis C virus (HCV) replicates its genome in a membrane-associated replication complex, composed of viral proteins, replicating RNA and altered cellular membranes. We describe here HCV replicons that allow the direct visualization of functional HCV replication complexes. Viable replicons selected from a library of Tn7-mediated random insertions in the coding sequence of nonstructural protein 5A (NS5A) allowed the identification of two sites near the NS5A C terminus that tolerated insertion of heterologous sequences. Replicons encoding green fluorescent protein (GFP) at these locations were only moderately impaired for HCV RNA replication. Expression of the NS5A-GFP fusion protein could be demonstrated by immunoblot, indicating that the GFP was retained during RNA replication and did not interfere with HCV polyprotein processing. More importantly, expression levels were robust enough to allow direct visualization of the fusion protein by fluorescence microscopy. NS5A-GFP appeared as brightly fluorescing dot-like structures in the cytoplasm. By confocal laser scanning microscopy, NS5A-GFP colocalized with other HCV nonstructural proteins and nascent viral RNA, indicating that the dot-like structures, identified as membranous webs by electron microscopy, represent functional HCV replication complexes. These findings reveal an unexpected flexibility of the C-terminal domain of NS5A and provide tools for studying the formation and turnover of HCV replication complexes in living cells.
Abstract:
Ordering in a binary alloy is studied by means of a molecular-dynamics (MD) algorithm that makes it possible to reach the domain-growth regime. Results are compared with Monte Carlo simulations using a realistic vacancy-atom (MC-VA) mechanism. At low temperatures, fast growth with a dynamical exponent x > 1/2 is found for both MD and MC-VA. The study of a nonequilibrium ordering process with the two methods shows the importance of the nonhomogeneity of excitations in the system in determining its macroscopic kinetics.
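For readers unfamiliar with how a dynamical exponent such as x > 1/2 is extracted, the sketch below fits L(t) ~ t^x in log-log space; the domain sizes are synthetic placeholders, not output from the MD or MC-VA simulations.

```python
# Hedged sketch: estimating a domain-growth exponent x from L(t) ~ t^x
# by a least-squares fit in log-log space. The domain sizes below are
# synthetic placeholders, not data from the reported simulations.
import numpy as np

t = np.array([10., 20., 50., 100., 200., 500.])   # simulation times (arb. units)
L = 2.0 * t ** 0.55                                # assumed domain sizes L(t)

x, log_prefactor = np.polyfit(np.log(t), np.log(L), 1)
print(f"growth exponent x ~ {x:.2f}")              # > 1/2 would signal fast growth
```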
Abstract:
The historically reactive approach to identifying and mitigating safety problems involves selecting black spots or hot spots by ranking locations based on crash frequency and severity. This approach focuses mainly on the corridor level and does not take into consideration the exposure rate (vehicle miles traveled) or the socio-demographic information of the study area, both of which are very important in the transportation planning process. A larger analysis unit at the Transportation Analysis Zone (TAZ) level, or at the network planning level, should be used to address the future development needs of the community and to incorporate safety into the long-range transportation planning process. In this study, existing planning tools (such as the PLANSAFE models presented in NCHRP Report 546) were evaluated for forecasting safety in small and medium-sized communities, particularly as related to changes in socio-demographic characteristics, traffic demand, road network, and countermeasures. The research also evaluated the applicability of the Empirical Bayes (EB) method to network-level analysis and investigated the application of the United States Road Assessment Program (usRAP) protocols at the local urban road network level. The applicability of these three methods was evaluated for the City of Ames, Iowa. The outcome of this research is a systematic process and framework for considering road safety issues explicitly in the small and medium-sized community transportation planning process and for quantifying the safety impacts of new developments and policy programs. More specifically, quantitative safety may be incorporated into the planning process through effective visualization and increased awareness of safety issues (usRAP), the identification of high-risk locations with potential for improvement (usRAP maps and EB), countermeasures for high-risk locations (EB before-and-after study and PLANSAFE), and socio-economic and demographic induced changes at the planning level (PLANSAFE).
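A minimal sketch of the Empirical Bayes estimate referred to above, assuming a negative binomial safety performance function (SPF); the SPF prediction and overdispersion parameter are illustrative values, not results from this study.

```python
# Hedged sketch of the Empirical Bayes (EB) estimate used in network
# screening: it blends the safety performance function (SPF) prediction
# with the observed crash count. The SPF value and overdispersion
# parameter k below are illustrative assumptions.
def eb_expected_crashes(observed, spf_predicted, k):
    """EB estimate for one site, assuming a negative binomial SPF
    with variance mu + k*mu**2, so the weight is w = 1/(1 + k*mu)."""
    w = 1.0 / (1.0 + k * spf_predicted)
    return w * spf_predicted + (1.0 - w) * observed

# Example: a segment with 9 observed crashes where the SPF predicts 4.0
# crashes for similar sites (k = 0.6) is pulled back toward the mean.
print(eb_expected_crashes(observed=9, spf_predicted=4.0, k=0.6))  # ~7.5
```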
Abstract:
A previous study sponsored by the Smart Work Zone Deployment Initiative, “Feasibility of Visualization and Simulation Applications to Improve Work Zone Safety and Mobility,” demonstrated the feasibility of combining readily available, inexpensive software programs, such as SketchUp and Google Earth, with standard two-dimensional civil engineering design programs, such as MicroStation, to create animations of construction work zones. The animations reflect changes in work zone configurations as the project progresses, representing an opportunity to visually present complex information to drivers, construction workers, agency personnel, and the general public. The purpose of this study is to continue the work from the previous study to determine the added value and resource demands created by including more complex data, specifically traffic volume, movement, and vehicle type. This report describes the changes that were made to the simulation, including incorporating additional data and converting the simulation from a desktop application to a web application.
Abstract:
The Quadrennial Needs Study was developed to assist in identifying highway needs and distributing road funds in Iowa among the various highway entities. During the period 1978 to 1990, the process saw large shifts in needs and in the associated funding distribution for individual counties, with no apparent reason. This study investigated the reasons for such shifts and identified program inputs for which minor changes in input values can result in major shifts in needs, either up or down. The areas of concern were identified as the condition ratings for roads and structures, traffic volume and mix counts, and the assignment of construction cost areas. Eight counties exhibiting large shifts (greater than 30%) in needs over time were used to test the sensitivity of the variables; a ninth county was used as the baseline for the study. Recommendations are made for improvements in the process of data collection in the areas of road and structure condition rating, traffic, and the assignment of construction cost areas. Advice is also offered on how to account for changes in jurisdiction between successive studies. Maintenance cost area assignment and levels of maintenance service are identified as requiring additional detailed research.
Abstract:
A laboratory study has been conducted with two aims in mind. The first goal was to develop a description of how a cutting edge scrapes ice from the road surface. The second was to investigate the extent, if any, to which serrated blades are better than un-serrated or "classical" blades at ice removal. The tests were conducted in the Ice Research Laboratory at the Iowa Institute of Hydraulic Research of the University of Iowa, using a specialized testing machine with a hydraulic ram capable of attaining scraping velocities of up to 30 mph. To characterize the ice scraping process, the effects of scraping velocity, ice thickness, and blade geometry on the ice scraping forces were determined. Greater ice thickness led to more ice chipping (as opposed to pulverization at lower thicknesses) and thus lower loads; similar behavior was observed at higher velocities. The study of blade geometry included the effects of rake angle, clearance angle, and flat width. The latter two were found to be particularly important in developing a clear picture of the scraping process: as clearance angle decreases and flat width increases, the scraping loads show a marked increase, due to the need to re-compress pulverized ice fragments. The effect of serrations was to decrease the scraping forces. However, for the coarsest serrated blades (with the widest teeth and gaps) the quantity of ice removed was significantly less than for a classical blade, while finer serrations appear to match the ice removal of classical blades at lower scraping loads. Thus, one recommendation of this study is to examine the use of serrated blades in the field; preliminary work (Nixon and Potter, 1996) suggests such work will be fruitful. A second and perhaps more challenging result is that chipping of ice is preferable to pulverization; how such chipping can be forced to occur is at present an open question.
Abstract:
Laser-induced forward transfer (LIFT) is a laser direct-write technique that offers the possibility of printing patterns with high spatial resolution from a wide range of materials in a solid or liquid state, such as conductors, dielectrics, and biomolecules in solution. This versatility has made LIFT a very promising alternative to lithography-based processes for the rapid prototyping of biomolecule microarrays. Here, we study the transfer process through the LIFT of droplets of a solution suitable for microarray preparation. The laser pulse energy and beam size were systematically varied, and their effect on the transferred droplets was evaluated. Controlled transfers in which the deposited droplets displayed optimal features could be obtained by tuning these parameters. In addition, the transferred droplet volume displayed a linear dependence on the laser pulse energy. This dependence allowed the determination of a threshold energy density, independent of the laser focusing conditions, which acts as a necessary condition for transfer to occur; the corresponding sufficient condition is given by a different total-energy threshold for each laser beam dimension. The threshold energy density was found to be the parameter that determines the amount of liquid transferred per laser pulse, and there was no substantial loss of material to vaporization during the transfer.
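A hedged sketch of the two transfer conditions described above, expressed as a simple fluence/energy check together with an assumed linear volume-versus-energy relation; all numerical values are placeholders, not measured thresholds from this work.

```python
# Hedged illustration of the two transfer conditions: the pulse fluence
# (energy per unit area) must exceed a spot-size-independent threshold,
# and the total pulse energy must exceed a spot-size-dependent threshold.
# All numbers are placeholder assumptions.
import math

FLUENCE_THRESHOLD = 200e-3   # J/cm^2, assumed necessary condition

def transfer_occurs(pulse_energy_j, spot_diameter_cm, energy_threshold_j):
    spot_area = math.pi * (spot_diameter_cm / 2) ** 2
    fluence = pulse_energy_j / spot_area
    # Necessary: enough energy per unit area; sufficient: enough total energy.
    return fluence >= FLUENCE_THRESHOLD and pulse_energy_j >= energy_threshold_j

def droplet_volume_pl(pulse_energy_j, slope_pl_per_uj=2.0, offset_uj=1.0):
    """Assumed linear volume-vs-energy relation above the transfer threshold."""
    excess_uj = pulse_energy_j * 1e6 - offset_uj
    return max(0.0, slope_pl_per_uj * excess_uj)

print(transfer_occurs(5e-6, 50e-4, energy_threshold_j=3e-6))  # 5 uJ, 50 um spot -> True
print(droplet_volume_pl(5e-6))                                # ~8 pL
```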