Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency gives rise to a huge number of possible system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against these partial order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it exploits the nature of atomicity violations, considering only a pair of threads and accesses to a single shared variable at a time. However, predictive tools must trade off precision against coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
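The two-thread, single-variable focus can be illustrated with the classic unserializable access patterns from the concurrency-bug literature: a pair of accesses by one thread broken up by a conflicting access from another. The following is a minimal illustrative checker over a trace of hypothetical (thread, operation) events on one shared variable; it sketches the idea only and is not McPatom's actual algorithm.

```python
# The four classic unserializable (local, remote, local) patterns for
# two threads accessing one shared variable. Illustrative only.
UNSERIALIZABLE = {
    ("r", "w", "r"),  # remote write between two local reads
    ("w", "w", "r"),  # local read sees a half-finished update
    ("w", "r", "w"),  # remote read observes an intermediate value
    ("r", "w", "w"),  # local write may be lost
}

def atomicity_violations(trace):
    """Scan a trace of (thread_id, op) events, op in {'r', 'w'}, all on
    one shared variable, and report every (local, remote, local) index
    triple whose operation pattern is unserializable."""
    found = []
    for i, (t1, op1) in enumerate(trace):
        for j in range(i + 1, len(trace)):
            t2, op2 = trace[j]
            if t2 == t1:
                break  # next local access reached with no remote access between
            for k in range(j + 1, len(trace)):
                t3, op3 = trace[k]
                if t3 == t1:
                    if (op1, op2, op3) in UNSERIALIZABLE:
                        found.append((i, j, k))
                    break  # only the next local access closes the pattern
    return found
```

Considering only such pairs of threads and one variable at a time is what keeps the analysis small enough to scale.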
Abstract:
From a pragmatic point of view, it is people who make an organisation, but organisations are both people and structures, and not least, organisations develop culture. One of the significant features of the European Educational Research Association (EERA) as an organisation is that many of its activities are run by people on a voluntary basis. Apart from a small office, now in Berlin, which oversees and handles everyday management, participation on Council, reviewing and programming for ECER (European Conference on Educational Research), managing networks, and so on are all undertaken as voluntary work by academics from across Europe (and beyond). Of the large group of people currently sustaining these activities, many have participated from the beginning, but many others, after attending the conference once, returned and became engaged in the work, for instance within one of the networks. Among the many who participate in EERA activities there is a diversity of reasons for doing so, but something recurs in what people say about why they do it. One recurring idea is that the discursive norms of the organisation are enforced in a spirit of welcoming people and ideas; another is that there exists an intellectual generosity and egalitarianism which encourages newcomers to participate rather than protect themselves. We believe this says something about what EERA and ECER are about.
Abstract:
Purpose: Custom cranio-orbital implants have been shown to outperform their hand-shaped counterparts by restoring skull anatomy more accurately and by reducing surgery time. Designing a custom implant involves reconstructing a model of the patient's skull from their computed tomography (CT) scan. The healthy side of the skull model, contralateral to the damaged region, can then be used to design an implant plan. Designing implants for areas of thin bone, such as the orbits, is challenging due to the poor resolution of bone structures in CT data. This makes preoperative design time-intensive, since thin bone structures in CT data must be manually segmented. The objective of this thesis was to research methods for accurately and efficiently designing cranio-orbital implant plans, with a focus on the orbits, and to develop software that integrates these methods. Methods: The software consists of modules that use image and surface restoration approaches to enhance both the quality of the CT data and the reconstructed model. It enables users to input CT data and apply tools that output a skull model with restored anatomy, which can then be used to design the implant plan. The software was built on 3D Slicer, an open-source medical visualization platform, and was tested on CT data from thirteen patients. Results: Creating a skull model with restored anatomy using our software took an average of 0.33 ± 0.04 (SD) hours. In comparison, the manual segmentation method took between 3 and 6 hours. To assess the structural accuracy of the reconstructed models, the CT data from the thirteen patients were used to compare the models created with our software against those created manually. When the skull models were registered together, the difference between each pair of skulls was 0.4 ± 0.16 (SD) mm. Conclusions: We have developed software for designing custom cranio-orbital implant plans, with a focus on thin bone structures. The method decreases design time and achieves accuracy similar to the manual method.
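The 0.4 mm figure is a distance between registered skull models. A bare-bones sketch of such a comparison, assuming hypothetical vertex data and brute-force nearest-neighbour search (real pipelines work on meshes with spatial indexing, and this is not the thesis's actual evaluation code):

```python
import math

def mean_surface_distance(model_a, model_b):
    """Mean nearest-neighbour distance from each vertex of model_a
    (list of (x, y, z) tuples, in mm) to the closest vertex of
    model_b. Brute force, for illustration only."""
    return sum(
        min(math.dist(p, q) for q in model_b)  # closest point on model_b
        for p in model_a
    ) / len(model_a)
```

Given two registered models whose surfaces differ by a fraction of a millimetre, this metric summarizes that deviation as a single number.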
Abstract:
Due to design- and process-related factors, there are local variations in the microstructure and mechanical behaviour of cast components. This work establishes a Digital Image Correlation (DIC) based method for characterising and investigating the effects of such local variations on the behaviour of a high pressure die cast (HPDC) aluminium alloy. Plastic behaviour is studied using gradient-solidified samples, and characterisation models for the parameters of the Hollomon equation are developed based on microstructural refinement. Samples with controlled microstructural variations are produced, and the observed DIC strain field is compared with Finite Element Method (FEM) simulation results. The results show that the DIC-based method can characterise local mechanical behaviour with high accuracy. The microstructural variations are observed to cause a redistribution of strain during tensile loading. This redistribution can be predicted in FEM simulation by incorporating local mechanical behaviour via the developed characterisation model; a homogeneous FEM simulation is unable to predict the observed behaviour. The results motivate the application of a previously proposed simulation strategy, which can predict local variations in mechanical behaviour and incorporate them into FEM simulations as early as the design process for cast components.
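For reference, the Hollomon equation mentioned above relates true stress to true plastic strain through two material parameters; the characterisation models express these parameters as functions of microstructural refinement:

```latex
\sigma = K \, \varepsilon_p^{\,n}
```

where \(\sigma\) is the true stress, \(\varepsilon_p\) the true plastic strain, \(K\) the strength coefficient, and \(n\) the strain hardening exponent.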
Abstract:
The aim of this study was to establish guidelines on the optimization of biologic therapies for health professionals involved in the management of patients with rheumatoid arthritis (RA), ankylosing spondylitis (AS) and psoriatic arthritis (PsA). Recommendations were established via consensus by a panel of experts in rheumatology and hospital pharmacy, based on analysis of the available scientific evidence obtained from four systematic reviews and on the clinical experience of the panellists. The Delphi method was used to evaluate these recommendations, both among the panellists and among a wider group of rheumatologists. Previous concepts concerning better management of RA, AS and PsA were reviewed and, more specifically, guidelines for the optimization of the biologic therapies used to treat these diseases were formulated. Recommendations were made with the aim of establishing a plan for when and how to taper biologic treatment in patients with these diseases. The recommendations established herein aim not only to provide advice on how to improve the risk–benefit ratio and efficiency of such treatments, but also to reduce variability in the use of biologic therapies for rheumatic diseases in daily clinical practice.
Abstract:
Transportation research makes a difference for Iowans and the nation. Implementation of cost-effective research projects contributes to a transportation network that is safer, more efficient, and longer lasting. Working in cooperation with our partners from universities, industry, other states, and the FHWA, as well as participating in the Transportation Research Board (TRB), provides benefits for every facet of the DOT and allows us to serve our communities and the traveling public more effectively. Pooled fund projects allow the leveraging of funds for higher returns on investment. In 2010, Iowa led fifteen active pooled fund studies, participated in twenty-two others, and was wrapping up, reconciling, and closing out an additional six Iowa-led pooled fund studies. In addition, non-pooled-fund SPR projects included approximately twenty continued and nine new projects, plus over a dozen recurring initiatives such as the technical transfer/training program. Additional research is managed and conducted by the Office of Traffic and Safety and other departments in the Iowa DOT.
Abstract:
Gait analysis characterizes motor function, highlighting deviations from normal motor behavior related to an underlying pathology. The widespread use of wearable inertial sensors has opened the way to the evaluation of gait in ecological settings, and a variety of methodological approaches and algorithms have been proposed for characterizing gait from inertial measures (e.g. temporal parameters, motor stability and variability, specific pathological alterations). However, no comparative analysis of their performance (i.e. accuracy, repeatability) was yet available, in particular of how this performance is affected by extrinsic factors (i.e. sensor location, computational approach, analysed variable, environmental testing constraints) and intrinsic factors (i.e. functional alterations resulting from pathology). The aim of the present project was to comparatively analyze the influence of intrinsic and extrinsic factors on the performance of the numerous algorithms proposed in the literature for quantifying specific characteristics (i.e. timing, variability/stability) and alterations (i.e. freezing) of gait. Regarding extrinsic factors, the influence of sensor location, analyzed variable, and computational approach on the performance of a selection of gait segmentation algorithms, drawn from a literature review, was analysed under different environmental conditions (e.g. solid ground, sand, in water). Moreover, the influence of altered environmental conditions (i.e. in water) on the minimum number of strides necessary to obtain reliable estimates of gait variability and stability metrics was analyzed, complementing what is already available in the literature for overground gait in healthy subjects. Regarding intrinsic factors, the influence of a specific pathological condition (i.e. Parkinson's disease), with and without freezing, on the performance of the segmentation algorithms was analyzed. Finally, the analysis of the performance of algorithms for the detection of gait freezing showed that results depend on the domain of implementation and the IMU position.
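At their core, many gait segmentation algorithms detect recurring gait events in a quasi-periodic inertial signal. The following is a deliberately simple, generic sketch (upward threshold crossing on a synthetic signal) meant only to illustrate the idea; it is not any specific algorithm from the review, and the signal, sampling rate, and threshold are made up.

```python
import math

def detect_strides(signal, threshold):
    """Return indices where the signal crosses the threshold upward:
    a crude stand-in for gait-event detection (e.g. initial contact)
    on an angular-velocity signal."""
    events = []
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i]:
            events.append(i)
    return events

# Synthetic "gait" signal: 5 cycles of a 1 Hz sinusoid sampled at 100 Hz.
fs, f = 100, 1.0
sig = [math.sin(2 * math.pi * f * i / fs) for i in range(5 * fs)]
```

On this synthetic signal the detector finds one event per gait cycle; the comparative analyses above ask how such detections degrade when the sensor moves, the variable changes, or the walking surface does.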
Abstract:
In this Ph.D. project, original and innovative approaches to the quali-quantitative analysis of substances of abuse, as well as therapeutic agents with abuse potential and related compounds, were designed, developed and validated for application in forensic, clinical and pharmaceutical settings. All the parameters involved in the developed analytical workflows were carefully optimised, from sample collection through sample pretreatment to instrumental analysis. Advanced dried blood microsampling technologies were developed, capable of bringing several advantages to the method as a whole, such as a significant reduction in solvent use, feasible storage and transportation conditions, and enhanced analyte stability. At the same time, the use of capillary blood increases subject compliance and overall method applicability when exploiting such innovative technologies. Both the biological and non-biological samples involved in this project were subjected to pretreatment techniques optimised ad hoc for each target analyte, also making use of advanced microextraction techniques. Finally, original and advanced instrumental analytical methods were developed based on high- and ultra-high-performance liquid chromatography (HPLC, UHPLC) coupled to different detection means (mainly mass spectrometry, but also electrochemical and spectrophotometric detection for screening purposes), and on attenuated total reflectance Fourier transform infrared spectroscopy (ATR-FTIR) for solid-state analysis. Each method was designed to be highly selective and sensitive yet sustainable, and was validated according to international guidelines. All the methods developed herein proved suitable for the analysis of the compounds under investigation and may be useful tools in medicinal chemistry, pharmaceutical analysis, clinical studies and forensic investigations.
Abstract:
Astrocytes are the most numerous glial cell type in the mammalian brain and permeate the entire CNS, interacting with neurons, vasculature, and other glial cells. Astrocytes display intracellular calcium signals that encode information about local synaptic function, distributed network activity, and high-level cognitive functions. Several studies have investigated the calcium dynamics of astrocytes in sensory areas and have shown that these cells can encode sensory stimuli. Nevertheless, only recently has the neuroscientific community focused its attention on the role and functions of astrocytes in associative areas such as the hippocampus. In our first study, we used the information theory formalism to show that astrocytes in the CA1 area of the hippocampus, recorded with two-photon fluorescence microscopy during spatial navigation, encode spatial information that is complementary and synergistic to the information encoded by nearby "place cell" neurons. In our second study, we investigated computational aspects of applying the information theory formalism to astrocytic calcium data. To this end, we generated realistic simulations of calcium signals in astrocytes to determine optimal hyperparameters and procedures for the information measures, and applied them to real astrocytic calcium imaging data. Calcium signals of astrocytes are characterized by complex spatiotemporal dynamics occurring in subcellular parcels of the astrocytic domain, which makes studying these cells in two-photon calcium imaging recordings difficult. Moreover, current analytical tools that identify astrocytic subcellular regions are time-consuming and rely extensively on user-defined parameters. Here, we present Rapid Astrocytic calcium Spatio-Temporal Analysis (RASTA), a novel machine learning algorithm for spatiotemporal semantic segmentation of two-photon calcium imaging recordings of astrocytes, which operates without human intervention. We found that RASTA provides fast and accurate identification of astrocytic cell somata, processes, and cellular domains, extracting calcium signals from identified regions of interest across individual cells and populations of hundreds of astrocytes recorded in awake mice.
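The information theory formalism referred to above rests on quantities such as the Shannon mutual information between a discretized calcium signal and a behavioural variable (e.g. binned spatial position). The following is a bare-bones plug-in estimator, for illustration only; it is not the estimators or bias-correction procedures used in the thesis.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in bits for two equal-length
    sequences of discrete symbols (e.g. binarized calcium activity
    and binned position)."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * log2(p_joint / (p_x * p_y)), with counts folded in
        mi += p_joint * math.log2(c * n / (px[x] * py[y]))
    return mi
```

Perfectly dependent sequences give 1 bit for a balanced binary variable, and independent ones give 0; synergy and complementarity analyses build on extensions of this basic quantity.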
Abstract:
Natural events are a widely recognized hazard for industrial sites where relevant quantities of hazardous substances are handled, since they can trigger cascading events resulting in severe technological accidents (Natech scenarios): natural events may damage storage and process equipment containing hazardous substances, whose release can lead to major accident scenarios. The need to assess the risk associated with Natech scenarios is growing, and methodologies have been developed to quantify Natech risk, considering both point sources and linear sources such as pipelines. A key element of these procedures is the use of vulnerability models that estimate the damage probability of an equipment item or pipeline segment resulting from the impact of the natural event. Therefore, the first aim of the PhD project was to outline the state of the art of vulnerability models for equipment and pipelines subject to natural events such as floods, earthquakes, and wind. The project also aimed at developing new vulnerability models to fill gaps in the literature; in particular, vulnerability models for vertical equipment subject to wind and to flood were developed. Finally, to improve the calculation of Natech risk for linear sources, an original quantitative risk assessment methodology was developed for pipelines subject to earthquakes. Overall, the results obtained are a step forward in the quantitative risk assessment of Natech accidents. The tools developed open the way to the inclusion of new equipment in the analysis of Natech events, and the methodology for assessing linear risk sources such as pipelines provides an important tool for a more accurate and comprehensive assessment of Natech risk.
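Vulnerability models of the kind surveyed are commonly expressed as fragility curves: the damage probability as a lognormal CDF of a natural-hazard intensity measure (e.g. peak ground acceleration, flood depth, wind speed). A generic sketch under that common assumption, with illustrative parameter values; this is not one of the specific models developed in the project.

```python
import math

def damage_probability(intensity, median, beta):
    """Lognormal fragility curve: P(damage | IM = intensity).
    'median' is the intensity causing damage with probability 0.5,
    'beta' the lognormal standard deviation of the curve."""
    if intensity <= 0:
        return 0.0
    z = math.log(intensity / median) / beta
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

In a quantitative risk assessment, such a curve turns the severity of the natural event at each equipment item or pipeline segment into a damage probability, which is then combined with release and consequence models.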
Abstract:
El Niño-Southern Oscillation (ENSO) is the dominant climate phenomenon of the tropical Pacific Ocean, with large-scale environmental, climatic and socioeconomic influences. This thesis retraces the main steps taken toward understanding such a complex phenomenon. First, the mechanisms governing its dynamics are studied, up to the formulation of the mathematical model known as the Delayed Oscillator (DO) model, proposed by Suarez and Schopf in 1988. Then, to account for the chaotic nature of the system under study, the Stochastically Perturbed Parameterisation Tendencies (SPPT) scheme is introduced into the model. Finally, two examples of numerical solution of the DO are presented, with and without the correction provided by the SPPT scheme, assessing to what extent SPPT brings real improvements to the model.
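The Suarez-Schopf delayed oscillator describes the east-Pacific temperature anomaly T with a cubic local feedback and a delayed negative feedback from reflected ocean waves: in nondimensional form, dT/dt = T − T³ − αT(t − δ). A minimal forward-Euler sketch of the deterministic model follows; the parameter values are illustrative, the history over the startup delay is simply held at the initial value, and the SPPT-perturbed version is not shown.

```python
def delayed_oscillator(alpha=0.75, delta=6.0, dt=0.01, steps=5000, t0=0.1):
    """Integrate dT/dt = T - T**3 - alpha * T(t - delta) with forward
    Euler, storing the full trajectory so the delayed term can be
    read back from the history."""
    lag = int(delta / dt)  # delay expressed in time steps
    T = [t0]
    for i in range(steps):
        T_delayed = T[i - lag] if i >= lag else t0  # constant startup history
        T.append(T[i] + dt * (T[i] - T[i] ** 3 - alpha * T_delayed))
    return T
```

The cubic term bounds the anomaly while the delayed term, for suitable α and δ, keeps reversing its sign, which is the mechanism behind the model's ENSO-like oscillation.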
Abstract:
Amorphous glass/ZnO-Al/p(a-Si:H)/i(a-Si:H)/n(a-Si1-xCx:H)/Al imagers with different n-layer resistivities were produced by the plasma enhanced chemical vapour deposition (PE-CVD) technique. An image projected onto the sensing element creates spatially confined depletion regions that can be read out by scanning the photodiode with a low-power modulated laser beam. The essence of the scheme is the analog readout and the absence of semiconductor arrays or electrode potential manipulations to transfer the information coming from the transducer. The sensor output characteristics (sensitivity, linearity, blooming, resolution and signal-to-noise ratio) are correlated with the intensity of the optical image projected onto the sensor surface and analysed for different material compositions (0.5 < x < 1). The results show that the responsivity and the spatial resolution are limited by the conductivity of the doped layers. An enhancement of one order of magnitude in the image intensity signal and in the spatial resolution is achieved at 0.2 mW cm(-2) light flux by decreasing the n-layer conductivity by the same amount. A physical model supported by electrical simulation gives insight into the image-sensing technique used.
Abstract:
The aim of the study is to investigate the current situation, good practices, and operational problems of local small and medium-sized software companies supplying the forest industry. By identifying the current situation of these companies, it is possible to plan concrete development measures targeted at them. The work focuses on small and medium-sized software companies in south-eastern Finland that supply the forest industry. As a framework, the current situation of the forest industry, its development trends, and the impact of these trends on the information system needs of forest industry companies are presented, together with the qualitative research methods used in the study. The study is qualitative in nature and descriptive in its research approach. The research material consisted of 19 expert interviews and documentary material, and was analysed using a data-driven method and a case study approach. Based on the results, it was possible to describe characteristics common to the 10 local small and medium-sized software companies involved in the study. In addition, three different types of software company were identified, and the current situation, good practices, and operational problems of each type were characterised. The identified types of local small and medium-sized software company are: the specialiser, the anticipator, and the collector. The results provide a good starting point for identifying future development trends and planning operational development measures.