921 results for graphical overlay
Abstract:
The country has witnessed a tremendous increase in vehicle population and axle loading during the last decade, leaving its road network overstressed and prone to premature failure. The type of deterioration present in the pavement should be considered when determining whether it has a functional or structural deficiency, so that the appropriate overlay type and design can be developed. Structural failure arises from conditions that adversely affect the load-carrying capability of the pavement structure; inadequate thickness, cracking, distortion and disintegration cause structural deficiency. Functional deficiency arises when the pavement does not provide a smooth riding surface and comfort to the user. This can be due to poor surface friction and texture, hydroplaning and splash from the wheel path, rutting, and excess surface distortion such as potholes, corrugation, faulting, blow-ups, settlement and heaves. Functional condition determines the level of service provided by the facility to its users at a particular time as well as the Vehicle Operating Costs (VOC), thus influencing the national economy. Prediction of pavement deterioration helps to assess the remaining effective service life (RSL) of the pavement structure on the basis of reduction in performance levels, and to evaluate alternative designs and rehabilitation strategies together with long-range funding requirements for pavement preservation. Such models can also predict the impact of a treatment on the condition of the sections. Infrastructure prediction models can be classified into four groups: primary response models, structural performance models, functional performance models and damage models. The factors affecting road deterioration are complex in nature and vary from place to place. 
Hence there is a need for a thorough study of the deterioration mechanism under varied climatic zones and soil conditions before arriving at a definite strategy of road improvement. Realizing the need for a detailed study involving all types of roads in the state with varying traffic and soil conditions, the present study has been attempted. This study attempts to identify the parameters that affect the performance of roads and to develop performance models suitable to Kerala conditions. A critical review of the various factors that contribute to pavement performance is presented, based on data collected from selected road stretches and from five Corporations of Kerala. These roads represent urban conditions as well as National Highways, State Highways and Major District Roads in suburban and rural conditions. This research work is a pursuit towards a study of the road condition of Kerala with respect to varying soil, traffic and climatic conditions, periodic performance evaluation of selected roads of representative types, and development of distress prediction models for roads of Kerala. In order to achieve this aim, the study is divided into two parts. The first part deals with the study of the pavement condition and subgrade soil properties of urban roads distributed across five Corporations of Kerala, namely Thiruvananthapuram, Kollam, Kochi, Thrissur and Kozhikode. From the 44 selected roads, 68 homogeneous sections were studied. The data collected on the functional and structural condition of the surface include pavement distress in terms of cracks, potholes, rutting, raveling and pothole patching. The structural strength of the pavement was measured as rebound deflection using Benkelman Beam deflection studies. In order to record the pavement layer details and determine the subgrade soil properties, trial pits were dug and the in-situ field density was found using the Sand Replacement Method. 
Laboratory investigations were carried out to determine the subgrade soil properties: soil classification, Atterberg limits, Optimum Moisture Content, Field Moisture Content and 4-day soaked CBR. The relative compaction in the field was also determined. Traffic details were collected by conducting a traffic volume count survey and an axle load survey. From the data thus collected, the strength of the pavement was calculated as a function of the layer coefficients and thicknesses and represented as the Structural Number (SN). This was further related to the CBR value of the soil to obtain the Modified Structural Number (MSN). The condition of the pavement was represented in terms of the Pavement Condition Index (PCI), which is a function of the surface distress at the time of the investigation and was calculated in the present study using the deduct value method developed by the U.S. Army Corps of Engineers. The influence of subgrade soil type and pavement condition on the relationship between MSN and rebound deflection was studied using appropriate plots for the predominant soil types and for classified values of the Pavement Condition Index. The relationship will help practicing engineers design the overlay thickness required for a pavement without conducting the BBD test. Regression analysis using SPSS was done with various trials to find the best-fit relationship between rebound deflection and CBR, and other soil properties for the Gravel, Sand, Silt and Clay fractions. The second part of the study deals with periodic performance evaluation of selected road stretches representing National Highways (NH), State Highways (SH) and Major District Roads (MDR), located in different geographical conditions and with varying traffic. Eight road sections, divided into 15 homogeneous sections, were selected for the study, and six sets of continuous periodic data were collected. 
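The structural indices mentioned above can be sketched in a few lines. This is a hedged illustration, not the thesis's own computation: the layer coefficients and thicknesses below are invented, and the subgrade term is the commonly cited HDM-III relation between MSN (there called SNC) and soaked CBR.

```python
import math

def structural_number(layers):
    """Structural Number: sum of layer coefficient x thickness (inches)."""
    return sum(a * d for a, d in layers)

def modified_structural_number(sn, cbr):
    """Modified SN adding a subgrade contribution from soaked CBR (%),
    per the widely used HDM-III form (an assumption here, not the thesis)."""
    log_cbr = math.log10(cbr)
    return sn + 3.51 * log_cbr - 0.85 * log_cbr ** 2 - 1.43

# Hypothetical 3-layer pavement: (layer coefficient, thickness in inches)
layers = [(0.40, 4.0), (0.14, 8.0), (0.11, 6.0)]
sn = structural_number(layers)                    # 3.38
msn = modified_structural_number(sn, cbr=8.0)
```

With these made-up layers and an 8% CBR, the subgrade term raises the index from 3.38 to roughly 4.43, which is the kind of MSN-versus-CBR relationship the study then correlates with rebound deflection.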
The periodic data collected include the functional and structural condition in terms of distress (potholes, pothole patches, cracks, rutting and raveling), skid resistance using a portable skid resistance pendulum, surface unevenness using a Bump Integrator, texture depth using the sand patch method, and rebound deflection using the Benkelman Beam. Baseline data for the study stretches were collected as one-time data, and pavement history was obtained as secondary data. Pavement drainage characteristics were collected in terms of camber or cross slope, measured using a camber board (slope meter) for the carriageway and shoulders, availability of a longitudinal side drain, presence of a valley, terrain condition, soil moisture content, water table data, High Flood Level, rainfall data, land use and cross slope of the adjoining land. These data were used to determine the drainage condition of the study stretches. Traffic studies were conducted, including classified volume counts and axle load studies. From the field data thus collected, the progression of each parameter was plotted for all the study roads and validated for accuracy. The Structural Number (SN) and Modified Structural Number (MSN) were calculated for the study stretches. Progression of the deflection, distress, unevenness, skid resistance and macrotexture of the study roads was evaluated. Since pavement deterioration is a complex phenomenon to which all the above factors contribute, pavement deterioration models were developed as nonlinear regression models using SPSS with the periodic data collected for all the above road stretches. General models were developed for cracking progression, raveling progression, pothole progression and roughness progression, and a model for construction quality was also developed. Calibration of the HDM-4 pavement deterioration models to local conditions was done using the data for cracking, raveling, potholes and roughness. Validation was done using the data collected in 2013. 
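The idea of fitting a progression model to periodic condition data can be sketched with stdlib Python. The thesis used SPSS nonlinear regression; the exponential roughness model and the observation values below are invented for illustration, fitted here by linearizing and applying ordinary least squares.

```python
import math

# Hypothetical periodic roughness observations: (age in years, IRI in m/km)
observations = [(0, 2.1), (1, 2.4), (2, 2.8), (3, 3.1), (4, 3.6), (5, 4.1)]

# Fit IRI(t) = IRI0 * exp(m*t) by linearizing: ln(IRI) = ln(IRI0) + m*t,
# then ordinary least squares on (t, ln IRI).
ts = [t for t, _ in observations]
ys = [math.log(iri) for _, iri in observations]
n = len(ts)
t_bar = sum(ts) / n
y_bar = sum(ys) / n
m = (sum((t - t_bar) * (y - y_bar) for t, y in zip(ts, ys))
     / sum((t - t_bar) ** 2 for t in ts))
iri0 = math.exp(y_bar - m * t_bar)

def predicted_iri(t):
    """Predicted roughness at pavement age t (years)."""
    return iri0 * math.exp(m * t)
</```

For these toy data the fitted growth rate is about 0.13 per year. Real deterioration models, as in the study and in HDM-4, are multivariate (traffic loading, strength, drainage), but the calibration loop is the same: fit parameters to observed progression, then validate against held-out years.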
The application of HDM-4 to compare different maintenance and rehabilitation options was studied, considering deterioration parameters such as cracking, potholes and raveling. The alternatives considered for analysis were a base alternative with crack sealing and patching, an overlay of 40 mm BC using ordinary bitumen, an overlay of 40 mm BC using Natural Rubber Modified Bitumen, and an overlay of Ultra Thin White Topping. Economic analysis of these options was done considering the Life Cycle Cost (LCC). The average speeds obtainable under these options were also compared. The results were in favour of Ultra Thin White Topping over flexible pavements. Hence, design charts were also plotted for the estimation of maximum wheel load stresses for different slab thicknesses under different soil conditions. The design charts show the maximum stress for a particular slab thickness and different soil conditions, incorporating different k values, and can be handy for a design engineer. Fuzzy rule based models developed for site-specific conditions were compared with regression models developed using SPSS. The Riding Comfort Index (RCI) was calculated and correlated with unevenness to develop a relationship. Relationships were also developed between the Skid Number and the macrotexture of the pavement. The effort made through this research work will help highway engineers understand the behaviour of flexible pavements under Kerala conditions and arrive at suitable maintenance and rehabilitation strategies. Key Words: Flexible Pavements – Performance Evaluation – Urban Roads – NH – SH and other roads – Performance Models – Deflection – Riding Comfort Index – Skid Resistance – Texture Depth – Unevenness – Ultra Thin White Topping
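The life-cycle cost comparison of alternatives reduces to discounting each option's cash flows to a net present value. A minimal sketch follows; the costs, timings, and discount rate are invented, so the ranking they produce is meaningless. The study's economic analysis, which also modeled deterioration and road user effects such as speed, favoured Ultra Thin White Topping.

```python
# Hypothetical life-cycle cost comparison of maintenance alternatives by
# net present value; all figures and the 10% discount rate are illustrative.
def npv(cash_flows, rate=0.10):
    """cash_flows: list of (year, cost); returns discounted total cost."""
    return sum(cost / (1 + rate) ** year for year, cost in cash_flows)

alternatives = {
    # routine crack sealing and patching every other year
    "base (crack seal + patch)": [(y, 4.0) for y in range(0, 20, 2)],
    # 40 mm BC overlay now and again at year 10
    "40 mm BC overlay": [(0, 25.0), (10, 25.0)],
    # ultra thin white topping: higher initial cost, little maintenance
    "ultra thin white topping": [(0, 40.0), (15, 3.0)],
}
lcc = {name: npv(flows) for name, flows in alternatives.items()}
cheapest = min(lcc, key=lcc.get)
```

With these arbitrary agency costs alone the base alternative comes out cheapest, which illustrates why an HDM-4 style analysis must include user costs and residual condition before comparing options.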
Abstract:
The resurgence of the enteric pathogen Vibrio cholerae, the causative organism of epidemic cholera, remains a major health problem in many developing countries like India. The southern Indian state of Kerala is endemic for cholera, and outbreaks follow a seasonal pattern in regions of endemicity. Marine aquaculture settings and mangrove environments of Kerala serve as reservoirs for V. cholerae. The non-O1/non-O139 environmental isolates of V. cholerae with an incomplete 'virulence cassette' are to be treated with caution, as they constitute a major reservoir of diverse virulence genes in the marine environment and play a crucial role in pathogenicity and horizontal gene transfer. The genes coding for cholera toxin are borne on, and can be infectiously transmitted by, CTXΦ, a filamentous lysogenic vibriophage. Temperate phages can provide crucial virulence and fitness factors affecting cell metabolism, bacterial adhesion, colonization, immunity, antibiotic resistance and serum resistance. The present study was an attempt to screen marine environments such as the aquafarms and mangroves of the coastal areas of Alappuzha and Cochin, Kerala, for the presence of lysogenic V. cholerae, and to study their pathogenicity and gene transfer potential. Phenotypic and molecular methods were used for identification of isolates as V. cholerae. Thirty-one isolates which were Gram-negative, oxidase-positive and fermentative, with or without gas production on MOF media, and which showed yellow-coloured colonies on TCBS (Thiosulfate Citrate Bile salt Sucrose) agar were segregated as vibrios. Twenty-two environmental V. cholerae strains of both O1 and non-O1/non-O139 serogroups showed the presence of lysogenic phages on induction with mitomycin C. They produced characteristic turbid plaques in a double agar overlay assay using the indicator strain V. cholerae El Tor MAK 757. 
PCR-based molecular typing with primers targeting specific conserved sequences in the bacterial genome demonstrated genetic diversity among these lysogen-containing non-O1 V. cholerae. Polymerase chain reaction was also employed as a rapid screening method to verify the presence of nine virulence genes, namely ctxA, ctxB, ace, hlyA, toxR, zot, tcpA, ninT and nanH, using gene-specific primers. The presence of the tcpA gene in ALPVC3 was alarming, as it indicates the possibility of an epidemic through acquisition of the cholera toxin genes. Differential induction studies using ΦALPVC3, ΦALPVC11, ΦALPVC12 and ΦEKM14 underlined the possibility of prophage induction in natural ecosystems due to abiotic factors such as antibiotics, pollutants, temperature and UV. The efficiency of induction of prophages varied considerably in response to the different inducing agents. The growth curve of the lysogenic V. cholerae used in the study varied drastically in the presence of strong prophage inducers such as antibiotics and UV. Bacterial cell lysis was directly proportional to the increase in phage number due to induction. Morphological characterization of the vibriophages by Transmission Electron Microscopy revealed hexagonal heads for all four phages. Vibriophage ΦALPVC3 exhibited the isometric head and contractile tail characteristic of the family Myoviridae, while phages ΦALPVC11 and ΦALPVC12 demonstrated the typical hexagonal head and non-contractile tail of the family Siphoviridae. ΦEKM14, the podophage, was distinguished by a short non-contractile tail and an icosahedral head. This work demonstrated that environmental parameters can influence the viability and cell adsorption rates of V. cholerae phages. Adsorption studies showed 100% adsorption of ΦALPVC3, ΦALPVC11, ΦALPVC12 and ΦEKM14 after 25, 30, 40 and 35 minutes respectively. Exposure to high temperatures ranging from 50°C to 100°C drastically reduced phage viability. 
The optimum concentration of NaCl required for the survival of the vibriophages was 0.5 M, except for ΦEKM14, for which it was 1 M. Survival of phage particles was maximum at pH 7-8. V. cholerae is assumed to have existed long before its human host, and the pathogenic clones may have evolved from aquatic forms which later colonized the human intestine by progressive acquisition of genes. This is supported by the fact that the vast majority of V. cholerae strains are still part of the natural aquatic environment. CTXΦ has played a critical role in the evolution of the pathogenicity of V. cholerae, as it can transmit the ctxAB genes. The unusual transformation of V. cholerae strains associated with epidemics and the emergence of V. cholerae O139 demonstrate the evolutionary success of the organism in attaining greater fitness. Genetic changes in pathogenic V. cholerae constitute a natural process for developing immunity within an endemically infected population. Alternative hosts and lysogenic environmental V. cholerae strains may potentially act as cofactors in promoting cholera phage "blooms" within aquatic environments, thereby influencing transmission of phage-sensitive, pathogenic V. cholerae strains by aquatic vehicles. Differential induction of the phages is a clear indication of the impact of environmental pollution and global changes on phage induction. The development of molecular biology techniques offers an accessible gateway for investigating the molecular events leading to genetic diversity in the marine environment. Using nucleic acids as targets, fingerprinting methods such as ERIC PCR and BOX PCR revealed that the marine environment harbours a potentially pathogenic group of bacteria with considerable genetic diversity. The distribution of virulence-associated genes in the environmental isolates of V. cholerae provides tangible material for further investigation. 
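Phage adsorption of the kind measured above is commonly modeled as first-order decay of the free-phage titre, P(t) = P0·exp(-k·B·t), where B is the host cell density and k the adsorption rate constant. The sketch below estimates k from two titres; all counts are hypothetical, not the study's data.

```python
import math

def adsorption_rate_constant(p0, pt, bacteria_per_ml, minutes):
    """First-order adsorption model: P(t) = P0 * exp(-k * B * t).
    Returns k (ml/min) from free-phage titres before and after mixing."""
    return math.log(p0 / pt) / (bacteria_per_ml * minutes)

def fraction_free(k, bacteria_per_ml, minutes):
    """Fraction of phage still unadsorbed after the given time."""
    return math.exp(-k * bacteria_per_ml * minutes)

# Hypothetical counts: 90% of phage adsorbed after 10 min on 1e8 cells/ml
k = adsorption_rate_constant(p0=1e6, pt=1e5, bacteria_per_ml=1e8, minutes=10.0)
```

Under this toy model, extrapolating the same k to 25 minutes leaves well under 1% of phage free, consistent with "100% adsorption" being reported once the free titre falls below detection.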
Nucleotide and protein sequence analysis, along with protein structure prediction, aids a better understanding of the variation in alleles of the same gene in different ecological niches and its impact on protein structure in attaining greater fitness of pathogens. The evidence of the co-evolution of virulence genes in toxigenic V. cholerae O1 from different lineages of environmental non-O1 strains is alarming. Transduction studies indicate that acquisition of these virulence genes by lateral gene transfer, although rare, does occur amongst non-O1/non-O139 V. cholerae and has a key role in diversification. All these considerations justify the need for an integrated approach towards the development of an effective surveillance system to monitor the evolution of V. cholerae strains with epidemic potential. The results presented in this study, considered together with the mechanism proposed above, strongly suggest that the bacteriophage also intervenes as a variable in shaping the cholera bacterium, a variable that cannot be ignored and that hints at possible future epidemics.
Abstract:
To reduce costs, many companies outsource services that are not part of their core competence to external service providers. This process is known as outsourcing. The resulting dependencies on the external providers are contractually regulated by means of Service Level Agreements (SLAs). The task of Service Level Management (SLM) is to monitor and ensure compliance with the contractually fixed quality-of-service parameters. Automated processing therefore requires a formal specification of SLAs. Since the market has produced a multitude of different SLM tools, practical problems arise from proprietary SLA formats and missing specification methods. The result is tool dependence and limited reusability of already specified SLAs. This thesis develops an approach for platform-independent Service Level Management. The goal is a unification of modeling, so that different management approaches can be integrated and a separation between the problem domain and the technology domain is achieved. Platform independence also gives the resulting models a high degree of long-term stability. Further goals of the thesis are to guarantee the reusability of modeled SLAs and to provide a process-oriented modeling methodology. Automated deployment of modeled SLAs is of decisive relevance for practical use. To achieve these goals, the principles of the Model Driven Architecture (MDA) are applied to the problem domain of Service Level Management. The central idea of the thesis is the definition of SLA patterns, which are configuration-independent abstractions of Service Level Agreements. These SLA patterns correspond to the Platform Independent Model (PIM) of the MDA. 
Through a suitable model transformation, an SLA instance is generated from an SLA pattern; it contains all necessary configuration information and is already in the format of the target platform. An SLA instance thus corresponds to the Platform Specific Model (PSM) of the MDA. The deployment of the SLA instances and the resulting configuration of the management system correspond to the Platform Specific Code (PSC) of the MDA. After this step, the management system is able to autonomously monitor the quality-of-service parameters agreed in the SLA. As part of this work, a UML extension was defined that enables the modeling of SLA patterns with a UML tool. Modeling can be done purely graphically or with the inclusion of the Object Constraint Language (OCL). For the practical realization of the approach, a management architecture was developed and implemented as a prototype. The overall approach was evaluated by means of a case study.
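The pattern-to-instance step can be sketched as a toy model transformation: a configuration-independent SLA pattern (the PIM) is merged with deployment-specific data and rendered in a target format (the PSM). The pattern fields and output syntax below are invented for illustration and do not reflect the thesis's UML-based notation.

```python
# A toy MDA-style transformation: platform-independent "SLA pattern" (PIM)
# plus configuration data yields a platform-specific SLA instance (PSM).
# All field names and the target syntax are invented for illustration.
SLA_PATTERN = {                      # configuration-independent abstraction
    "service": None,                 # filled in at instantiation time
    "metric": "availability",
    "operator": ">=",
    "threshold": None,
}

def instantiate(pattern, config):
    """Model transformation PIM -> PSM: merge pattern with configuration."""
    instance = dict(pattern)
    instance.update(config)
    return instance

def to_target_format(instance):
    """Render the instance in a (hypothetical) target-platform rule syntax."""
    return "{service}: {metric} {operator} {threshold}".format(**instance)

inst = instantiate(SLA_PATTERN, {"service": "mail", "threshold": "99.9%"})
rule = to_target_format(inst)        # "mail: availability >= 99.9%"
```

Swapping `to_target_format` for another renderer is what makes the pattern reusable across SLM tools, which is the point of the platform-independent layer.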
Abstract:
Success monitoring for agri-environmental programmes has so far mostly focused on individual plots or on programme-level, large-scale evaluations. There has been little research into how the measures affect the development of individual natural landscapes, and no studies have interpreted the interactions between the motives of land users on the one hand and land use and vegetation on the other. The dissertation "Effects of extensification and contractual nature-conservation programmes on the development of a 'just barely surviving agricultural landscape'" closes this gap. It explains the importance of the Hessian programmes HELP and HEKUL for the high proportion of grassland of high nature-conservation value in the Rommerode hill country west of the Meißner. The objects of study were the grassland vegetation and the farms, with their people and their motives. This subject matter required an approach combining social-science and natural-science methods in order to recognize relationships between farms and grassland vegetation. Extensive phytosociological surveys and interviews with the farm managers formed the basis for a plot database and further analyses. Interpreting the vegetation surveys in the context of farm decisions and motives required combining traditional approaches in a new way. The assessment of the programme effects was based on the plot database and on four scenarios of the area's future development under different programme continuations. A multitude of thematic maps was produced to present the surveys and results. The above-average proportion of grassland types of high conservation value on the programme areas and the interpretation of the scenarios demonstrated a high effectiveness of HELP and HEKUL in the area. 
Vegetation types of high conservation value are over-represented not only on the HELP contractual nature-conservation areas but also on the HEKUL grassland. The four scenarios showed that restricting HELP to protected areas, abolishing the HEKUL grassland extensification, or even scrapping both programmes without replacement would lead to a considerable deterioration of the nature-conservation situation. At the same time, it became clear that without the agriculturally difficult natural conditions and a rather large-scale agrarian structure, with full-time farms of above-average size and economic stability, the programme effects would not have been so pronounced. The fact that many farmers reject intensive agriculture out of inner conviction, and farm with a considerably lower nitrogen intensity than HEKUL requires, also had a reinforcing effect. The great importance of the individual motives of farm managers was also visible in the close relationships of individual grassland plant communities to certain farm types and even to individual farms; their description and interpretation yielded important insights into the socio-economic preconditions of different vegetation types. For the future development of Hessian agri-environmental support, the dissertation recommends the introduction and preferential use of result-oriented remuneration schemes, better consideration of lightly fertilized grassland through differentiated support for very extensive fertilization regimes, stronger modularization of the overall programme, and implementation of all measures under contract. Branch-specific grassland extensification should in future be offered in designated areas where an increased shift of farms to minimum maintenance under the DirektZahlVerpflV is to be expected.
Abstract:
TOSCANA is a graphical tool that supports the human-centered interactive processes of conceptual knowledge processing. The generality of the approach makes TOSCANA a universal tool applicable to a variety of domains; only the so-called conceptual scales have to be designed for new applications. The presentation shows how the use of abstract scales allows the reuse of formerly defined conceptual scales. Furthermore, it describes how thesauri and conceptual taxonomies can be integrated into the generation of conceptual scales.
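TOSCANA rests on Formal Concept Analysis: a conceptual scale is essentially a small formal context whose concept lattice is displayed. As a minimal illustration (not TOSCANA's implementation), the sketch below enumerates all formal concepts of a toy context by closing every object subset; the objects and attributes are invented.

```python
# Naive enumeration of the formal concepts of a small context, the
# mathematical basis of conceptual scales. Context data is a toy example.
from itertools import combinations

objects = ["o1", "o2", "o3"]
incidence = {              # object -> set of attributes it has
    "o1": {"a", "b"},
    "o2": {"b", "c"},
    "o3": {"b"},
}
attributes = {"a", "b", "c"}

def intent(extent):
    """Attributes shared by every object in the extent."""
    return set.intersection(*(incidence[o] for o in extent)) if extent else set(attributes)

def extent_of(att_set):
    """Objects possessing every attribute in att_set."""
    return {o for o in objects if att_set <= incidence[o]}

concepts = set()
for r in range(len(objects) + 1):
    for combo in combinations(objects, r):
        a = intent(set(combo))                    # derive the intent...
        e = extent_of(a)                          # ...and close to an extent
        concepts.add((frozenset(e), frozenset(a)))
```

This toy context yields four concepts, including the top concept ({o1, o2, o3}, {b}); drawing their order relation as a labelled line diagram gives exactly the kind of lattice TOSCANA renders. Real implementations use closure algorithms such as NextClosure rather than subset enumeration.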
Abstract:
The aim of this paper is to indicate how TOSCANA may be extended to allow graphical representations not only of concept lattices but also of concept graphs in the sense of Contextual Logic. The contextual-logic extension of TOSCANA requires the logical scaling of conceptual and relational scales, for which we propose the Peircean Algebraic Logic as reconstructed by R. W. Burch. As graphical representations we recommend, besides labelled line diagrams of concept lattices and Sowa's diagrams of conceptual graphs, particular information maps for utilizing background knowledge as much as possible. Our considerations are illustrated by a small information system about domestic flights in Austria.
Abstract:
This thesis presents a novel method for adapting human-machine interfaces to individual operators. By applying abstractions of evolutionary mechanisms such as selection, recombination and mutation in the EOGUI methodology (Evolutionary Optimization of Graphical User Interfaces), a computer-supported implementation of the method is provided for graphical user interfaces, in particular for industrial processes. The evolutionary optimization incorporates both objective, measurable quantities, such as selection frequencies and times, and the operators' subjective impressions, captured via online questionnaires. In this way, the visualization of systems is adapted to the needs and preferences of individual operators. In this work, the operator can choose, from four user interfaces of different levels of abstraction for the example process MIPS (MIschungsProzess-Simulation, a mixing-process simulation), the objects that best support him in process control. The EOGUI algorithm selects these objects, modifies them if necessary, and combines them into a new graphical user interface adapted to the operator. Using the MIPS process, experiments were conducted with the EOGUI methodology to examine the applicability, acceptance and effectiveness of the method for the control of industrial processes. The investigations largely show that the developed methodology for evolutionary optimization of human-machine interfaces does indeed adapt industrial process visualizations to the individual operator and improves process control.
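The selection/recombination/mutation loop that EOGUI abstracts can be sketched generically. This is a hedged stand-in, not EOGUI itself: the genome (one bit per GUI object), the fixed preference vector, and the fitness function are all invented, whereas EOGUI's fitness combines measured selection frequencies and times with questionnaire scores.

```python
import random

random.seed(42)

# Toy genome: one bit per GUI object, 1 = shown on the operator's interface.
TARGET = [1, 0, 1, 1, 0, 1, 0, 1]   # hypothetical stand-in for operator preference

def fitness(genome):
    # EOGUI combines objective measurements with subjective questionnaire
    # scores; here fitness is simply agreement with the invented preference.
    return sum(g == t for g, t in zip(genome, TARGET))

def evolve(pop_size=20, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, len(TARGET))  # one-point recombination
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]                # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the top half of the population is carried over unchanged, the best fitness never decreases, and the loop converges toward the preference vector, mirroring how EOGUI's interfaces converge toward an individual operator's choices over successive generations.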
Abstract:
DIADEM, created by THOMSON-CSF, is a methodology for specifying and developing user interfaces. It improves the productivity of the interface development process as well as the quality of the resulting interface. The method supports user interface development in three respects: (1) DIADEM defines the roles of the people involved and their tasks, and organises the sequence of activities; (2) it provides graphical formalisms supporting information exchange between people; (3) it offers a basic set of rules for optimum human-machine interfaces. The use of DIADEM in three areas (process control, sales support, and multimedia presentation) was observed and evaluated by our laboratory in the European project DIAMANTA (ESPRIT P20507). The method provides an open procedure that leaves room for adaptation to a specific application and environment. This paper gives an overview of DIADEM and shows how to extend its formalisms for developing multimedia interfaces.
Abstract:
Presentation at the 1997 Dagstuhl Seminar "Evaluation of Multimedia Information Retrieval", Norbert Fuhr, Keith van Rijsbergen, Alan F. Smeaton (eds.), Dagstuhl Seminar Report 175, 14.04. - 18.04.97 (9716). - Abstract: This presentation introduces ESCHER, a database editor which supports visualization in non-standard applications in engineering, science, tourism and the entertainment industry. It was originally based on the extended nested relational data model and is currently being extended to include object-relational properties like inheritance, object types, integrity constraints and methods. It serves as a research platform into areas such as multimedia and visual information systems, QBE-like queries, computer-supported cooperative work (CSCW) and novel storage techniques. In its role as a Visual Information System, a database editor must support browsing and navigation. ESCHER provides this access to data by means of so-called fingers. Fingers generalize the cursor paradigm of graphical and text editors: on the graphical display, a finger is reflected by a colored area corresponding to the object the finger is currently pointing at. In a table, more than one finger may point to objects, one of which is the active finger used for navigating through the table. The talk concentrates on giving examples of this type of navigation and discusses some of the architectural needs for fast object traversal and display. ESCHER is available as public-domain software from our ftp site in Kassel. The portable C source can easily be compiled for any machine running UNIX and OSF/Motif, in particular our working environments, IBM RS/6000 and Intel-based LINUX systems. A port to Tcl/Tk is under way.
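The finger concept, a generalized cursor into nested data of which several may exist with one active, can be illustrated with a toy sketch. The data model and the `Finger` API below are invented for illustration and bear no relation to ESCHER's actual C implementation.

```python
# A minimal sketch of "fingers": generalized cursors into a nested table.
# Several fingers may exist; one active finger drives navigation.
# The table contents and this API are invented for illustration.
table = {"hotels": [{"name": "Alpenhof", "rooms": [101, 102]},
                    {"name": "Seeblick", "rooms": [201]}]}

class Finger:
    def __init__(self, path):
        self.path = path             # sequence of keys/indices into the table

    def object(self, root):
        """Resolve the finger's path to the object it points at."""
        obj = root
        for step in self.path:
            obj = obj[step]
        return obj

fingers = [Finger(["hotels", 0]), Finger(["hotels", 1, "name"])]
active = fingers[0]                  # the active finger
active.path.append("rooms")          # navigate one level deeper
current = active.object(table)
```

Representing a finger as a path rather than an object reference is one plausible design: it keeps the cursor valid for redisplay and supports the fast traversal the talk mentions, since each navigation step only extends or truncates the path.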
Abstract:
Enhanced reality visualization is the process of enhancing an image by adding to it information which is not present in the original image. A wide variety of information can be added to an image, ranging from hidden lines or surfaces to textual or iconic data about a particular part of the image. Enhanced reality visualization is particularly well suited to neurosurgery. By rendering brain structures which are not visible, at the correct location in an image of a patient's head, the surgeon is essentially provided with X-ray vision: he can visualize the spatial relationship between brain structures before he performs a craniotomy, and during the surgery he can see what is under the next layer before he cuts through. Given a video image of the patient and a three-dimensional model of the patient's brain, the task of enhanced reality visualization is to render the model from the correct viewpoint and overlay it on the original image. The relationship between the coordinate frames of the patient, the patient's internal anatomy scans and the image plane of the camera observing the patient must be established; this problem is closely related to the camera calibration problem. This report presents a new approach to finding this relationship and develops a system for performing enhanced reality visualization in a surgical environment. Immediately prior to surgery, a few circular fiducials are placed near the surgical site, and an initial registration of video and internal data is performed using a laser scanner. Following this, our method is fully automatic, runs in nearly real time, is accurate to within a pixel, allows both patient and camera motion, automatically corrects for changes to the internal camera parameters (focal length, focus, aperture, etc.) and requires only a single image.
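Once the coordinate-frame relationship is known, the overlay step reduces to transforming each model point into the camera frame and projecting it through a pinhole model onto the image plane. A minimal sketch follows; the pose and intrinsic parameters are illustrative, not the report's calibration values.

```python
# Minimal pinhole-camera projection: map a 3D model point (in patient
# coordinates) to image pixels, the core step of overlaying rendered
# anatomy on video. All camera parameters here are illustrative.
def project(point3d, focal=500.0, cx=320.0, cy=240.0):
    """Project a camera-frame point (x, y, z), z > 0, to pixel (u, v)."""
    x, y, z = point3d
    return (focal * x / z + cx, focal * y / z + cy)

def transform(point3d, r, t):
    """Rigid transform into the camera frame: 3x3 rotation r, translation t."""
    return tuple(sum(r[i][j] * point3d[j] for j in range(3)) + t[i]
                 for i in range(3))

# Identity pose; model point 1 m in front of the camera, 10 cm to the right:
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
cam_pt = transform((0.1, 0.0, 1.0), identity, (0.0, 0.0, 0.0))
u, v = project(cam_pt)
```

Estimating the rotation, translation, and focal parameters from the circular fiducials is the calibration-like problem the report addresses; with those in hand, every rendered model vertex is mapped by exactly this projection before being composited over the video frame.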
Abstract:
This thesis presents the development of hardware, theory, and experimental methods to enable a robotic manipulator arm to interact with soils and estimate soil properties from interaction forces. Unlike the majority of robotic systems interacting with soil, our objective is parameter estimation, not excavation. To this end, we design our manipulator with a flat plate for easy modeling of interactions. By using a flat plate, we take advantage of the wealth of research on the similar problem of earth pressure on retaining walls. There are a number of existing earth pressure models, which typically provide estimates of force that stand in uncertain relation to the true force. A recent technique, known as numerical limit analysis, provides upper and lower bounds on the true force; its predictions are shown to be in good agreement with other accepted models. Experimental methods for plate insertion, soil-tool interface friction estimation, and control of applied forces on the soil are presented. In addition, a novel graphical technique for inverting the soil models is developed, which is an improvement over standard nonlinear optimization. This graphical technique utilizes the uncertainties associated with each set of force measurements to obtain all possible parameters which could have produced the measured forces. The system is tested on three cohesionless soils, two in a loose state and one in both a loose and a dense state. The results are compared with friction angles obtained from direct shear tests. The results highlight a number of key points. Common assumptions are made in soil modeling, most notably the Mohr-Coulomb failure law and perfectly plastic behavior. In the direct shear tests, a marked dependence of friction angle on the normal stress at low stresses is found, which has ramifications for any study of friction done at low stresses. 
In addition, gradual failures are often observed for vertical tools and tools inclined away from the direction of motion. After accounting for the change in friction angle at low stresses, the results show good agreement with the direct shear values.
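The classical earth-pressure models the thesis compares against can be illustrated with Rankine's coefficients, which give the active and passive limits on lateral force for a cohesionless soil. A simplified sketch under textbook assumptions (smooth vertical wall, horizontal backfill); the thesis's numerical limit analysis yields tighter bounds than this:

```python
import math

def rankine_coefficients(phi_deg):
    """Rankine active/passive earth-pressure coefficients for a cohesionless
    soil with friction angle phi (degrees): Ka = (1 - sin phi)/(1 + sin phi),
    Kp = 1/Ka."""
    s = math.sin(math.radians(phi_deg))
    ka = (1 - s) / (1 + s)
    return ka, 1.0 / ka

def wall_force(K, gamma, H):
    """Resultant lateral force per unit width on a wall of height H retaining
    soil of unit weight gamma: P = 0.5 * K * gamma * H**2."""
    return 0.5 * K * gamma * H ** 2

# phi = 30 degrees gives Ka = 1/3 and Kp = 3; the true force on the plate
# lies between the active and passive resultants.
ka, kp = rankine_coefficients(30.0)
```

Inverting such a model — recovering phi from a measured force — is the step the thesis's graphical technique addresses, propagating measurement uncertainty to the full set of admissible parameters.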
Resumo:
The next generations of both biological engineering and computer engineering demand that control be exerted at the molecular level. Creating, characterizing and controlling synthetic biological systems may provide us with the ability to build cells that are capable of a plethora of activities, from computation to synthesizing nanostructures. To develop these systems, we must have a set of tools not only for synthesizing systems, but also for designing and simulating them. The BioJADE project provides a comprehensive, extensible design and simulation platform for synthetic biology. BioJADE is a graphical design tool built in Java, utilizing a database back end, and supports a range of simulations using an XML communication protocol. BioJADE currently supports a library of over 100 parts with which it can compile designs into actual DNA, and then generate synthesis instructions to build the physical parts. The BioJADE project contributes several tools to Synthetic Biology. BioJADE is itself a powerful tool for synthetic biology designers. Additionally, we developed and now make use of a centralized BioBricks repository, which enables the sharing of BioBrick components between researchers, and vastly reduces the barriers to entry for aspiring Synthetic Biologists.
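The "compile designs into actual DNA" step can be pictured as flattening an ordered part list into one sequence. A toy sketch only: the part names, sequences, and junction scar below are hypothetical placeholders, not BioJADE's real part library or assembly format:

```python
# Toy "design compiler": flatten an ordered list of parts into a single
# DNA string. All names, sequences, and the scar are hypothetical.
SCAR = "TACTAG"  # placeholder junction sequence between adjacent parts

PARTS = {
    "promoter_X": "TTGACA",
    "rbs_Y": "AGGAGG",
    "cds_Z": "ATGAAATAA",
}

def compile_design(part_names, parts=PARTS, scar=SCAR):
    """Concatenate part sequences in design order, inserting the junction
    scar between consecutive parts."""
    return scar.join(parts[name] for name in part_names)

dna = compile_design(["promoter_X", "rbs_Y", "cds_Z"])
```

The compiled string is what a downstream step would turn into synthesis or assembly instructions.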
Resumo:
Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
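The forward pass that the paper treats as a special case of PIN inference can be written in a few lines. A standard textbook sketch (not the paper's notation), computing the observation likelihood by summing over all state paths:

```python
import numpy as np

def forward(pi, A, B, obs):
    """HMM forward algorithm: returns P(obs) summed over state sequences.

    pi: initial state distribution (N,); A: transition matrix (N,N);
    B: emission matrix (N,M); obs: sequence of observation indices.
    """
    alpha = pi * B[:, obs[0]]              # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]      # alpha_t = (alpha_{t-1} A) * b(o_t)
    return alpha.sum()                     # P(o_1, ..., o_T)

# Two-state toy model.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
p = forward(pi, A, B, [0, 1, 0])
```

Replacing the sum with a max (and tracking argmaxes) gives the Viterbi algorithm; in the PIN view both are instances of the same message-passing scheme on the model's graph.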
Resumo:
We present an overview of current research on artificial neural networks, emphasizing a statistical perspective. We view neural networks as parameterized graphs that make probabilistic assumptions about data, and view learning algorithms as methods for finding parameter values that look probable in the light of the data. We discuss basic issues in representation and learning, and treat some of the practical issues that arise in fitting networks to data. We also discuss links between neural networks and the general formalism of graphical models.
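The statistical view sketched here — parameter values that "look probable in the light of the data" — is maximum likelihood. A one-neuron illustration (a generic sketch, not the paper's examples): gradient ascent on the log-likelihood of a sigmoid unit, which is the same as gradient descent on cross-entropy:

```python
import numpy as np

def train_logistic(X, y, lr=0.5, steps=500):
    """Fit a single sigmoid unit by gradient ascent on the mean
    log-likelihood of binary labels y given inputs X."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))    # P(y=1 | x, w)
        w += lr * X.T @ (y - p) / len(y)    # gradient of mean log-likelihood
    return w

# Linearly separable toy data; second column is a constant bias feature.
X = np.array([[-2.0, 1.0], [-1.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = train_logistic(X, y)
```

Stacking such units and applying the chain rule to the same likelihood objective gives backpropagation, which is how the "parameterized graph" view connects to standard network training.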
Resumo:
Most of the recent literature on chaos and nonlinear dynamics is written either for popular science magazine readers or for advanced mathematicians. This paper gives a broad introduction to this interesting and rapidly growing field at a level that is between the two. The graphical and analytical tools used in the literature are explained and demonstrated, the rudiments of the current theory are outlined and that theory is discussed in the context of several examples: an electronic circuit, a chemical reaction and a system of satellites in the solar system.
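A standard entry point to the tools such introductions demonstrate is iterating a simple nonlinear map and watching nearby trajectories diverge. The logistic map below is a generic illustration of sensitive dependence on initial conditions, not one of the paper's three examples:

```python
def logistic_orbit(r, x0, n):
    """Iterate the logistic map x -> r * x * (1 - x) for n steps,
    returning the full orbit [x0, x1, ..., xn]."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# In the chaotic regime (r = 4), orbits started a billionth apart
# separate rapidly: the hallmark of sensitive dependence.
a = logistic_orbit(4.0, 0.2, 50)
b = logistic_orbit(4.0, 0.2 + 1e-9, 50)
```

Plotting such orbits, or the map's bifurcation diagram as r varies, reproduces the kind of graphical analysis the paper explains.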