97 results for Software Architecture
Abstract:
How have changes in communications technology affected the way that misinformation spreads through a population and persists? To what extent do differences in the architecture of social networks affect the spread of misinformation, relative to the rates and rules by which individuals transmit or eliminate different pieces of information (cultural traits)? Here, we use analytical models and individual-based simulations to study how a 'cultural load' of misinformation can be maintained in a population under a balance between social transmission and selective elimination of cultural traits with low intrinsic value. While considerable research has explored how network architecture affects percolation processes, we find that the relative rates at which individuals transmit or eliminate traits can have much more profound impacts on the cultural load than differences in network architecture. In particular, the cultural load is insensitive to correlations between an individual's network degree and rate of elimination when these quantities vary among individuals. Taken together, these results suggest that changes in communications technology may have influenced cultural evolution more strongly through changes in the amount of information flow, rather than the details of who is connected to whom.
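As a rough illustration of the kind of individual-based simulation described above, the sketch below maintains a single low-value trait ("misinformation") on a social network under competing social transmission and selective elimination, and tracks the resulting cultural load. The graph type, rates and initial conditions are illustrative assumptions, not the authors' model.

```python
# Hypothetical individual-based sketch of a transmission/elimination balance
# on a social network (illustrative only; not the authors' exact model).
import random
import networkx as nx

def simulate_load(graph, p_transmit=0.3, p_eliminate=0.1, steps=200, seed=1):
    """Return the fraction of individuals carrying a low-value trait over time."""
    rng = random.Random(seed)
    carriers = {n: rng.random() < 0.5 for n in graph}    # initial carriers
    load = []
    for _ in range(steps):
        for n in graph:
            if carriers[n] and rng.random() < p_eliminate:
                carriers[n] = False                       # selective elimination
            elif not carriers[n]:
                nbrs = list(graph.neighbors(n))
                if nbrs and carriers[rng.choice(nbrs)] and rng.random() < p_transmit:
                    carriers[n] = True                    # social transmission
        load.append(sum(carriers.values()) / graph.number_of_nodes())
    return load

if __name__ == "__main__":
    g = nx.barabasi_albert_graph(500, 3)                  # one possible network architecture
    print("equilibrium cultural load ~", sum(simulate_load(g)[-50:]) / 50)
```

Varying p_transmit and p_eliminate versus swapping the graph generator is one simple way to compare the influence of rates against that of network architecture.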
Abstract:
Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging at many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008), yielding so-called Connectomes (Hagmann 2005, Sporns et al, 2005). They exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports investigations of such connectome data by both clinical researchers and neuroscientists. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and to access a multitude of scientific libraries. Results: Using a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve further node information (ConnectomeWiki) and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Arbitrary metadata can be stored for networks, allowing e.g. group-based analysis or meta-analysis.
* Python shell for scripting; application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008).
* Interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) processed 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; the connections shown are those that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant data types and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
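The abstract emphasizes that application data are exposed to a Python shell and that NetworkX and Matplotlib are directly usable. The sketch below shows what such a scripted analysis of a connectome network might look like; the file name is hypothetical and this is generic NetworkX/Matplotlib code, not the ConnectomeViewer API itself.

```python
# Illustrative scripting sketch: load a GraphML connectome network and
# inspect it with NetworkX and Matplotlib (file name is a placeholder).
import networkx as nx
import matplotlib.pyplot as plt

g = nx.read_graphml("average_connectome.graphml")   # network part of a connectome file
g = nx.Graph(g)                                      # treat as undirected for the measures below

print("nodes:", g.number_of_nodes(), "edges:", g.number_of_edges())
print("mean clustering:", nx.average_clustering(g))

degrees = [d for _, d in g.degree()]
plt.hist(degrees, bins=20)
plt.xlabel("node degree (ROI connections)")
plt.ylabel("count")
plt.title("Degree distribution of the connectome network")
plt.show()
```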
Abstract:
Antiresorptive agents such as bisphosphonates induce a rapid increase in BMD during the first year of treatment and partial maintenance of bone architecture. The Trabecular Bone Score (TBS), a new grey-level texture measurement that can be extracted from the DXA image, correlates with 3D parameters of bone micro-architecture. Aim: To evaluate the longitudinal effect of antiresorptive agents on spine BMD and on site-matched spine micro-architecture as assessed by TBS. Methods: From the BMD database for the Province of Manitoba, Canada, we selected women aged >50 with paired baseline and follow-up spine DXA examinations who had not received any prior HRT or other antiresorptive drug. Women were divided into two subgroups: (1) those not receiving any HRT or antiresorptive drug during follow-up (non-users) and (2) those receiving a non-HRT antiresorptive drug during follow-up (users) with high adherence (medication possession ratio >75%) according to a provincial pharmacy database system. Lumbar spine TBS was derived by the Bone Disease Unit, University of Lausanne, for each spine DXA examination using anonymized files (blinded to clinical parameters and outcomes). Effects of antiresorptive treatment on TBS and BMD at baseline and during a mean 3.7 years of follow-up were compared for users and non-users. Results were expressed as % change per year. Results: 1150 non-users and 534 users met the inclusion criteria. At baseline, users and non-users had a mean age of 62.2±7.9 vs 66.1±8.0 years and a mean BMI of 26.3±4.7 vs 24.7±4.0 kg/m², respectively. Antiresorptive drugs received by users were bisphosphonates (86%), raloxifene (10%) and calcitonin (4%). Significant differences in BMD change and TBS change were seen between users and non-users during follow-up (p<0.0001). Significant decreases from baseline in mean BMD and TBS (−0.36±0.05% per year and −0.31±0.06% per year, respectively) were seen for non-users (p<0.001). A significant increase from baseline in mean BMD was seen for users (+1.86±0.0% per year, p<0.0018). TBS of users also increased from baseline (+0.20±0.08% per year, p<0.001), but more slowly than BMD. Conclusion: We observed a significant increase in spine BMD and positive maintenance of bone micro-architecture as assessed by TBS with antiresorptive treatment, whereas the treatment-naïve group lost both density and micro-architecture. TBS appears to be responsive to treatment and could be suitable for monitoring micro-architecture. This article is part of a Special Issue entitled ECTS 2011. Disclosure of interest: M.-A. Krieg: None declared, A. Goertzen: None declared, W. Leslie: None declared, D. Hans: Consulting fees from Medimaps.
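A small worked example of the "% change per year" outcome used above, with made-up baseline and follow-up TBS values over the 3.7-year mean follow-up; it only illustrates the arithmetic, not the study data.

```python
# Hypothetical annualized percent change calculation (illustrative values only).
baseline_tbs, followup_tbs = 1.250, 1.259
years = 3.7
pct_change_per_year = (followup_tbs - baseline_tbs) / baseline_tbs / years * 100
print(f"TBS change: {pct_change_per_year:+.2f}% per year")   # about +0.19% per year
```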
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address application needs for uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike the CISC world, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC- and software-platform-based disruptions in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition among incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of global software support for complex systems. Thirdly, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.
Abstract:
The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are described, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample data sets, that allows both students and researchers to put the concepts into practice rapidly.
Abstract:
Based on fieldwork in medical archives, analysed in particular with notions drawn from ethnomethodology, this article revisits aspects of information architecture that are generally invisible, such as the activities and people who ensure its production and maintenance. Using the notion of document equipment ("équipement des documents"), we offer an incursion into the world of those who carry out these operations on a daily basis and who, through their activity and their specific competences, produce a situated information architecture. In particular, we discuss practices related to the digitization of documents within the context of a global architecture.
Abstract:
Abstract: This thesis is devoted to the analysis, modelling and visualization of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be considered a subfield of artificial intelligence concerned in particular with the development of techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted to be applied to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modelling tools. They can solve classification, regression and probability density modelling problems in high-dimensional spaces composed of spatially referenced informative variables ("geo-features") in addition to geographical coordinates. Moreover, they are ideal for implementation as decision-support tools for environmental questions ranging from pattern recognition to modelling and prediction, including automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and most popular machine learning algorithms are presented theoretically and implemented as software for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence, general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organizing maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are treated both with the traditional geostatistical approach, experimental variography, and according to the principles of machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations that allows the detection of spatial patterns describable by a statistic. The machine learning approach to ESDA is presented through the application of the k-nearest-neighbours method, which is very simple and has excellent interpretation and visualization qualities. An important part of the thesis deals with topical subjects such as the automatic mapping of spatial data. The general regression neural network is proposed to solve this task efficiently.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, for which the GRNN significantly outperforms all other methods, particularly in emergency situations. The thesis is composed of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools: Machine Learning Office. This software collection has been developed over the last 15 years and has been used for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a broad spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radioactive products and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualization were also developed, with care taken to create a user-friendly and easy-to-use interface. Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modelling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence. It mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modelling tools. They can find solutions to classification, regression and probability density modelling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision-support systems, for the purposes of environmental data mining including pattern recognition, modelling and prediction, as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to the software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks and mixture density networks. This set of models covers machine learning tasks such as classification, regression and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis.
In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, such as experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations which helps to understand the presence of spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters and has the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
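Since the GRNN is the model highlighted for automatic mapping, the sketch below shows a minimal GRNN, i.e. Nadaraya-Watson kernel regression with a Gaussian kernel, applied to synthetic spatial data. It is not the Machine Learning Office implementation; the bandwidth, coordinates and values are assumptions.

```python
# Minimal GRNN (Gaussian kernel regression) sketch for spatial interpolation
# on synthetic data; bandwidth sigma would normally be tuned, e.g. by cross-validation.
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma=0.1):
    """Predict values at query_xy as kernel-weighted averages of training observations."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)  # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))                                   # Gaussian weights
    return (w @ train_z) / w.sum(axis=1)                                   # normalized average

rng = np.random.default_rng(0)
xy = rng.random((200, 2))                                 # synthetic monitoring locations
z = np.sin(4 * xy[:, 0]) + 0.5 * xy[:, 1] + rng.normal(0, 0.05, 200)
grid = np.array([[x, y] for x in np.linspace(0, 1, 50) for y in np.linspace(0, 1, 50)])
z_hat = grnn_predict(xy, z, grid, sigma=0.08)
print("interpolated field: min %.2f, max %.2f" % (z_hat.min(), z_hat.max()))
```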
Abstract:
Abstract Human experience takes place in the line of mental time (MT) created through 'self-projection' of oneself to different time-points in the past or future. Here we manipulated self-projection in MT not only with respect to one's life events but also with respect to one's faces from different past and future time-points. Behavioural and event-related functional magnetic resonance imaging activity showed three independent effects characterized by (i) similarity between past recollection and future imagination, (ii) facilitation of judgements related to the future as compared with the past, and (iii) facilitation of judgements related to time-points distant from the present. These effects were found with respect to faces and events, and also suggest that brain mechanisms of MT are independent of whether actual life episodes have to be re-experienced or pre-experienced, recruiting a common cerebral network including the anteromedial temporal, posterior parietal, inferior frontal, temporo-parietal and insular cortices. These behavioural and neural data suggest that self-projection in time is a fundamental aspect of MT, relying on neural structures encoding memory, mental imagery and self.
Abstract:
Aim. Several software packages (SWP) and models have been released for the quantification of myocardial perfusion (MP). Although each of them has been validated against some reference, the question remains how well their values agree. The present analysis focused on a cross-comparison of three SWP for MP quantification in 13N-ammonia PET studies. Materials & Methods. 48 rest and stress MP 13N-ammonia PET studies of hypertrophic cardiomyopathy (HCM) patients (Sciagrà et al., 2009) were analysed with three SWP - Carimas, PMOD, and FlowQuant - by three observers blinded to each other's results. All SWP implement the one-tissue-compartment model (1TCM, DeGrado et al. 1996), and the first two also implement the two-tissue-compartment model (2TCM, Hutchins et al. 1990). A linear mixed model for repeated measures was fitted to the data; where appropriate, Bland-Altman plots were used as well. Reproducibility was assessed at the global, regional and segmental levels. Intraclass correlation coefficients (ICC) and differences between the SWP and between the models were obtained. ICC≥0.75 indicated excellent reproducibility, 0.4≤ICC<0.75 fair to good reproducibility, and ICC<0.4 poor reproducibility (Rosner, 2010). Results. When 1TCM MP values were compared, agreement between the SWP at the global and regional levels was excellent, except for Carimas vs. PMOD at RCA (ICC=0.715) and PMOD vs. FlowQuant at LCX (ICC=0.745), which were good. In the segmental analysis, agreement between all SWP was excellent in five segments (7, 12, 13, 16 and 17); in the remaining 12 segments agreement varied between the compared SWP. Carimas showed excellent agreement with FlowQuant in 13 segments and good agreement in four (1, 5, 6, 11: 0.687≤ICC≤0.73); Carimas had excellent agreement with PMOD in 11 segments, good agreement in five (4, 9, 10, 14, 15: 0.682≤ICC≤0.737), and poor agreement in segment 3 (ICC=0.341). PMOD had excellent agreement with FlowQuant in eight segments and substantial-to-good agreement in nine (1, 2, 3, 5, 6, 8-11: 0.585≤ICC≤0.738). Agreement between Carimas and PMOD for the 2TCM was good at the global level (ICC=0.745), excellent at LCX (0.780) and RCA (0.774), and good at LAD (0.662); agreement was excellent for ten segments, fair-to-substantial for segments 2, 3, 8, 14 and 15 (0.431≤ICC≤0.681), and poor for segments 4 (0.384) and 17 (0.278). Conclusions. The three SWP, used by different operators to analyse 13N-ammonia PET MP studies, provide results that agree well at the global and regional levels, and mostly well even at the segmental level. Agreement is better for the 1TCM. The poor agreement in segments 4 and 17 for the 2TCM needs further clarification.
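For readers unfamiliar with the kinetic model shared by the three packages, the sketch below simulates the one-tissue-compartment model, dCt/dt = K1*Ca(t) - k2*Ct(t), with a toy arterial input function. The input curve and rate constants are illustrative assumptions, not fitted 13N-ammonia values or any package's implementation.

```python
# Illustrative simulation of the one-tissue-compartment model (1TCM).
import numpy as np
from scipy.integrate import odeint

def ca(t):
    """Toy arterial input function (bolus-like), arbitrary units."""
    return t * np.exp(-t / 0.5)

def dct_dt(ct, t, k1, k2):
    # 1TCM: uptake from blood at rate K1, washout from tissue at rate k2
    return k1 * ca(t) - k2 * ct

t = np.linspace(0, 10, 500)                           # minutes
ct = odeint(dct_dt, 0.0, t, args=(0.8, 0.4))[:, 0]    # K1=0.8, k2=0.4 (illustrative)
print("peak tissue concentration: %.3f at t=%.2f min" % (ct.max(), t[ct.argmax()]))
```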
Abstract:
Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can now be evaluated in daily practice with the trabecular bone score (TBS). TBS is very simple to obtain by reanalysing a lumbar spine DXA scan. TBS has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. The OsteoLaus cohort (1400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the CoLaus cohort, which started in Lausanne in 2003; the main goal of CoLaus is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported here. We included 631 women: mean age 67.4 ± 6.7 years, BMI 26.1 ± 4.6 kg/m², mean lumbar spine BMD 0.943 ± 0.168 (T-score −1.4 SD), and TBS 1.271 ± 0.103. As expected, the correlation between BMD and site-matched TBS is low (r² = 0.16). The prevalence of vertebral fracture grade 2/3, major OP fracture and all OP fractures is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD for the different fracture categories, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS, respectively. Individually, a BMD T-score < −2.5 SD or a TBS < 1.200 identifies only 32 to 37% of women with an OP fracture. If we combine the two criteria (BMD < −2.5 SD or TBS < 1.200), 54 to 60% of women with an osteoporotic fracture are identified. As in the studies already published, these preliminary results confirm the partial independence between BMD and TBS. More importantly, combining TBS with BMD significantly increases the identification of women with prevalent OP fractures who would have been misclassified by BMD alone. For the first time we are able to obtain complementary information about fracture (VFA), density (BMD), and micro- and macro-architecture (TBS and HAS) from a single simple, cheap, low-ionizing-radiation device: DXA. Such complementary information is very useful for the patient in daily practice and, moreover, will likely have an impact on cost-effectiveness analyses.
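To make the combined-criterion calculation concrete, the sketch below computes, on synthetic data, the proportion of women with a prevalent fracture captured by BMD T-score < −2.5, by TBS < 1.200, and by their combination. All values are simulated and will not reproduce the cohort's 54-60% figure; the sketch only illustrates how such a combined screen is evaluated.

```python
# Hypothetical illustration of combining BMD and TBS thresholds (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
n = 631
t_score = rng.normal(-1.4, 1.0, n)                 # simulated lumbar spine BMD T-scores
tbs = rng.normal(1.271, 0.103, n)                  # simulated site-matched TBS values
fracture = rng.random(n) < 0.26                    # simulated prevalent OP fracture flag

by_bmd = t_score < -2.5
by_tbs = tbs < 1.200
combined = by_bmd | by_tbs
for label, mask in [("BMD alone", by_bmd), ("TBS alone", by_tbs), ("BMD or TBS", combined)]:
    frac = (mask & fracture).sum() / fracture.sum()
    print(f"{label}: identifies {100*frac:.0f}% of women with a prevalent fracture")
```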