944 results for Development tools


Relevance: 30.00%

Abstract:

This report covers state-of-the-art research efforts on infrastructure inventory and data collection, with sign inventory as a case study. The development of an agency-wide sign inventory is based on feature inventory and location information. With respect to location, a quick and simple location acquisition tool is critical to tying assets to an accurate location-referencing system. This research contrasts legacy referencing systems (route and milepost) with global positioning system (GPS) based techniques (latitude and longitude) integrated into a geographic information system (GIS) database. A summary comparison of field accuracies achieved with a variety of consumer-grade devices is also provided. This research and the data collection tools developed are critical to supporting the Iowa Department of Transportation (DOT) Statewide Sign Management System development effort. For the last two years, a Task Force has pursued a comprehensive effort to develop a sign management system that improves sign quality and manages all aspects of signage, from request and ordering through fabrication, installation, and maintenance to eventual removal, while providing the ability to budget for these key assets on a statewide basis. This effort supported the development of a sign inventory tool and marks the beginning of a sign management system that will support the Iowa DOT in making consistent, cost-effective, and objective decisions about signs and their maintenance.
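Comparing field accuracies across consumer-grade GPS devices ultimately reduces to measuring the distance between each recorded coordinate and a surveyed reference point. A minimal sketch of that computation (the great-circle haversine distance); the coordinates below are hypothetical, not from the report:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 lat/long points."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical survey-grade reference vs. a consumer-grade reading of one sign:
error_m = haversine_m(41.5868, -93.6250, 41.5868, -93.6249)
print(round(error_m, 1))  # horizontal error of the device, in meters
```

Repeating this over many signs and devices yields the kind of summary accuracy comparison the report describes.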

Relevance: 30.00%

Abstract:

Several accidents, some involving fatalities, have occurred on U.S. Highway 30 near the Archer Daniels Midland Company (ADM) Corn Sweeteners plant in Cedar Rapids, Iowa. A contributing factor to many of these accidents has been the large amounts of water (vapor and liquid) emitted from multiple sources at ADM's facility located along the south side of the highway. Weather and road closure data acquired from IDOT have been used to develop a database of meteorological conditions preceding and accompanying closure of Highway 30 in Cedar Rapids. An expert system and a FORTRAN program were developed as aids in decision making with regard to closure of Highway 30 near the plant. The computer programs were used for testing, evaluation, and final deployment. Reports indicate the decision tools have been successfully implemented and were judged to be helpful in forecasting road closures and in reducing costs and personnel time in monitoring the roadway.
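The abstract does not reproduce the actual closure rules, but the flavor of such a decision aid can be sketched as a small rule base; the thresholds, wind sector, and advisory levels below are illustrative assumptions, not the deployed Iowa DOT logic:

```python
# Hypothetical sketch of a rule-based closure advisory in the spirit of the
# expert system described above. All thresholds are illustrative assumptions.
def closure_advisory(temp_f, wind_dir_deg, wind_speed_mph, humidity_pct):
    """Return an advisory level for the highway segment near the plant.

    Icing-fog risk is assumed highest when the air is cold and saturated and a
    light wind carries plume moisture from the plant (south side) over the road.
    """
    wind_from_south = 135 <= wind_dir_deg <= 225  # plume blown toward the road
    saturated = humidity_pct >= 95
    if temp_f <= 32 and wind_from_south and saturated and wind_speed_mph < 10:
        return "CLOSE"     # icing fog likely to settle on the roadway
    if temp_f <= 40 and wind_from_south and saturated:
        return "MONITOR"   # dense fog possible; increase observation
    return "OPEN"

print(closure_advisory(temp_f=28, wind_dir_deg=180, wind_speed_mph=5, humidity_pct=98))
```

An expert system of this kind simply evaluates such rules against the incoming meteorological record, which is why historical closure data was needed to calibrate it.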

Relevance: 30.00%

Abstract:

A compilation of sample pages from various employment statistics reports issued by Iowa Workforce Development (IWD), including URLs for locating electronic versions of the reports on the IWD website.

Relevance: 30.00%

Abstract:

This paper describes the results of research into diverse areas of information technology applied to cartography. Its final result is a complete, custom geographic web information system, designed and implemented to manage archaeological information for the city of Tarragona. The goal of the platform is to present geographical and alphanumerical data in a web-focused application and to provide specific queries for exploring them. Among other tools, the following have been used: the PostgreSQL database management system together with its geographic extension PostGIS, the GeoServer map server, the GeoWebCache tile cache, the map viewer and map and satellite imagery from Google Maps, location imagery from Google Street View, and other open-source libraries. The technology was chosen based on an investigation of the project requirements and guided much of its development. Except for the Google Maps tools, which are not open source but are free, the entire design has been implemented with free and open-source tools.
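One way such a viewer pulls rendered map layers from GeoServer is the standard OGC WMS GetMap operation. A sketch of composing such a request; the host and `tarragona:archaeology` layer name are hypothetical:

```python
from urllib.parse import urlencode

# Sketch of building a WMS 1.1.1 GetMap request URL for a GeoServer instance.
# Host and layer are illustrative assumptions, not the project's actual names.
def wms_getmap_url(base, layer, bbox, width=256, height=256, srs="EPSG:4326"):
    params = {
        "service": "WMS",
        "version": "1.1.1",
        "request": "GetMap",
        "layers": layer,
        "bbox": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "width": width,
        "height": height,
        "srs": srs,
        "format": "image/png",
        "transparent": "true",
    }
    return f"{base}?{urlencode(params)}"

url = wms_getmap_url("http://example.org/geoserver/wms",
                     "tarragona:archaeology",  # hypothetical workspace:layer
                     (1.23, 41.10, 1.27, 41.13))
print(url)
```

In the architecture described, GeoWebCache would sit in front of GeoServer and serve cached tiles for exactly these requests.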

Relevance: 30.00%

Abstract:

ABSTRACT: BACKGROUND: Current tools for analgesia are often only partially successful; thus, investigations of new targets for pain therapy stimulate great interest. Consequent to peripheral nerve injury, c-Jun N-terminal kinase (JNK) activity in cells of the dorsal root ganglia (DRGs) and spinal cord is involved in triggering neuropathic pain. However, the relative contribution of distinct JNK isoforms is unclear. Using knockout mice for single isoforms, and blockade of JNK activity by a peptide inhibitor, we have used behavioral tests to analyze the contribution of JNK in the development of neuropathic pain after unilateral sciatic nerve transection. In addition, immunohistochemical labelling for the growth associated protein (GAP)-43 and Calcitonin Gene Related Peptide (CGRP) in DRGs was used to relate injury-related compensatory growth to altered sensory function. RESULTS: Peripheral nerve injury produced pain-related behavior on the ipsilateral hindpaw, accompanied by an increase in the percentage of GAP43-immunoreactive (IR) neurons and a decrease in the percentage of CGRP-IR neurons in the lumbar DRGs. The JNK inhibitor, D-JNKI-1, successfully modulated the effects of the sciatic nerve transection. The onset of neuropathic pain was not prevented by the deletion of a single JNK isoform, leading us to conclude that all JNK isoforms collectively contribute to maintain neuropathy. Autotomy behavior, typically induced by sciatic nerve axotomy, was absent in both the JNK1 and JNK3 knockout mice. CONCLUSIONS: JNK signaling plays an important role in regulating pain threshold: the inhibition of all of the JNK isoforms prevents the onset of neuropathic pain, while the deletion of a single JNK splice isoform mitigates established sensory abnormalities. JNK inactivation also has an effect on axonal sprouting following peripheral nerve injury.

Relevance: 30.00%

Abstract:

The main function of a roadway culvert is to convey drainage flow effectively during normal and extreme hydrologic conditions. This function is often impaired by sedimentation blockage of the culvert. This research sought to understand the mechanics of the sedimentation process at multi-box culverts and to develop self-cleaning systems that flush out sediment deposits using the power of drainage flows. The research entailed field observations, laboratory experiments, and numerical simulations. The specific role of each of these investigative tools is summarized below:
a) The field observations were aimed at understanding typical sedimentation patterns and their dependence on culvert geometry and hydrodynamic conditions during normal and extreme hydrologic events.
b) The laboratory experiments were used to model the sedimentation processes observed in situ and to test alternative self-cleaning concepts applied to culverts. The major tasks for the initial laboratory model study were to accurately replicate the culvert performance curves and the dynamics of the sedimentation process, and to provide benchmark data for validating the numerical simulations.
c) The numerical simulations enhanced the understanding of the sedimentation processes and aided in testing flow cases complementary to those conducted in the model, reducing the number of (more expensive) tests to be conducted in the laboratory.
Using the findings from the laboratory and simulation work, self-cleaning culvert concepts were developed and tested for a range of flow conditions. The alternative concepts were screened through experimental studies in a 1:20 scale model guided by numerical simulations. Performance studies were finally conducted in a 1:20 hydraulic model of the most promising design alternatives to verify that the proposed systems operate satisfactorily under conditions closer to natural scale.
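The 1:20 models mentioned above rely on Froude similarity, the standard scaling law for gravity-driven free-surface flow: a length ratio lambda implies a velocity ratio of lambda**0.5 and a discharge ratio of lambda**2.5. The ratios can be worked out directly (the 50 L/s model discharge below is an illustrative figure, not from the study):

```python
# Froude-similarity scale ratios for a 1:20 hydraulic model.
LAMBDA = 20.0  # prototype-to-model length ratio

velocity_ratio = LAMBDA ** 0.5    # prototype velocity / model velocity
time_ratio = LAMBDA ** 0.5        # prototype time scale / model time scale
discharge_ratio = LAMBDA ** 2.5   # prototype discharge / model discharge

# e.g. a model flow of 50 L/s (0.050 m^3/s) represents a prototype discharge of:
model_q_m3s = 0.050
prototype_q_m3s = model_q_m3s * discharge_ratio
print(round(velocity_ratio, 2), round(prototype_q_m3s, 1))
```

These ratios are what let a laboratory flume reproduce the culvert performance curves of a full-scale multi-box installation.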

Relevance: 30.00%

Abstract:

BACKGROUND: The model plant Arabidopsis thaliana (Arabidopsis) shows a wide range of genetic and trait variation among wild accessions. Because of its unparalleled biological and genomic resources, the potential of Arabidopsis for molecular genetic analysis of this natural variation has increased dramatically in recent years. SCOPE: Advanced genomics has accelerated molecular phylogenetic analysis and gene identification by quantitative trait loci (QTL) mapping and/or association mapping in Arabidopsis. In particular, QTL mapping utilizing natural accessions is now becoming a major strategy of gene isolation, offering an alternative to artificial mutant lines. Furthermore, the genomic information is used by researchers to uncover the signature of natural selection acting on the genes that contribute to phenotypic variation. The evolutionary significance of such genes has been evaluated in traits such as disease resistance and flowering time. However, although molecular hallmarks of selection have been found for the genes in question, a corresponding ecological scenario of adaptive evolution has been difficult to prove. Ecological strategies, including reciprocal transplant experiments and competition experiments, and utilizing near-isogenic lines of alleles of interest will be a powerful tool to measure the relative fitness of phenotypic and/or allelic variants. CONCLUSIONS: As the plant model organism, Arabidopsis provides a wealth of molecular background information for evolutionary genetics. Because genetic diversity between and within Arabidopsis populations is much higher than anticipated, combining this background information with ecological approaches might well establish Arabidopsis as a model organism for plant evolutionary ecology.

Relevance: 30.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In brief, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical coordinates and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems, for purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest to the geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation. The main algorithms and models considered are: the multilayer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation.
Exploratory data analysis (EDA) is the initial and a very important part of data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both the traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps detect the presence of spatial patterns, at least those describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbors (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.
The thesis consists of four chapters with the following structure: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed over the last 15 years and have been used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with care taken to make the software user-friendly and easy to use.
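At its core, the GRNN highlighted above is Nadaraya-Watson kernel regression: a prediction is a Gaussian-kernel-weighted average of the observed values. A toy sketch (the measurement points and kernel width sigma are illustrative, not the SIC 2004 data):

```python
import numpy as np

# Minimal sketch of a General Regression Neural Network (GRNN), i.e.
# Nadaraya-Watson kernel regression, for spatial interpolation.
def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """Predict at each query point as a Gaussian-weighted mean of y_train."""
    preds = []
    for q in np.atleast_2d(X_query):
        d2 = np.sum((X_train - q) ** 2, axis=1)       # squared distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))  # weighted average
    return np.array(preds)

# Toy 2-D "measurement" locations and values:
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])
print(grnn_predict(X, y, [[0.5, 0.5]], sigma=0.5))
```

The single hyperparameter sigma controls smoothness, which is why the GRNN is attractive for automatic mapping: tuning it can be fully automated (e.g. by cross-validation).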

Relevance: 30.00%

Abstract:

The microtubule-associated protein MAP2 is essential for development of early neuronal morphology and maintenance of adult neuronal morphology. Several splice variants exist, MAP2a-d, with a lack of MAP2a in cat brain. MAP2 is widely used as a neuronal marker. In this study we compared five monoclonal antibodies (MAbs) against MAP2. They show differences in the immunocytochemical distribution of MAP2 isoforms during development of the visual cortex and cerebellum of the cat. Local and temporal differences were seen with MAb AP18, an antibody directed against a phosphorylation-dependent epitope near the N-terminal end. In large pyramidal dendrites in visual cortex, the AP18 epitope remained in parts immunoreactive after treatment with alkaline phosphatase. Three MAbs, AP14, MT-01, and MT-02, recognized the central region of the MAP2b molecule, which is not present in MAP2c and 2d, and reacted with phosphorylation-independent epitopes. During the first postnatal week the immunostaining in cerebellum differed between antibodies in that some cellular elements in external and internal granular layers and Purkinje cells were stained to various degrees, whereas at later stages staining patterns were similar. At early stages, antibody MT-02 stained cell bodies and dendrites in cerebral cortex and cerebellum. With progressing maturation, immunoreactivity became restricted to distal parts of apical dendrites of pyramidal cells and was absent from perikarya and finer proximal dendrites in cortex. MT-02 did not stain MAP2 in cerebellum of adult animals. This study demonstrates that the immunocytochemical detection of MAP2 depends on modifications such as phosphorylation and conformational changes of the molecule, and that MAP2 staining patterns differ between MAbs. 
Phosphorylation and specific conformations in the molecule may be essential for modulating function and molecular stability of MAP2, and monoclonal antibodies against such sites may provide tools for studying the functional role of modifications.

Relevance: 30.00%

Abstract:

Today, most software development teams use free and open source software (FOSS) components, because they increase the speed and quality of development. Many open-source components are the de facto standard of their category. However, FOSS comes with licensing restrictions, and corporate organizations usually maintain a list of allowed and forbidden licenses. But how do you enforce this policy? How can you make sure that ALL files in your source depot either belong to you or fit your licensing policy? A first, preventive approach is to train the development team and raise its awareness of these licensing issues. Depending on the size of the team, this may be costly but necessary. However, it does not ensure that a single individual will not commit a forbidden icon or library and jeopardize the legal status of the whole release... if not the company, since software is becoming more and more a critical asset. Another approach is to verify what is included in the source repository and check whether it belongs to the open-source world. This can be done on the fly, whenever a new file is added to the source depot. It can also be part of the release process, as a verification step before publishing the release. In both cases, tools and databases exist to automate the detection process. We will present the various options for FOSS detection, how this process can be integrated into the "software factory", and how the results can be displayed in a usable and efficient way.
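The on-the-fly check described here can be sketched as a header scan against a license policy. Real detectors match code fingerprints against FOSS databases, so the patterns and policy below are only an illustrative first step, not a complete solution:

```python
import re

# Hypothetical sketch: before a file enters the source depot, scan its header
# for license names and compare them against the organization's policy.
FORBIDDEN = {
    "GPL-3.0": re.compile(r"GNU General Public License|GPL-?3", re.I),
}
ALLOWED = {
    "MIT": re.compile(r"\bMIT License\b", re.I),
    "Apache-2.0": re.compile(r"Apache License,? Version 2\.0", re.I),
}

def check_file(text):
    """Return ('forbidden'|'allowed'|'unknown', license name or None)."""
    for name, pat in FORBIDDEN.items():
        if pat.search(text):
            return "forbidden", name
    for name, pat in ALLOWED.items():
        if pat.search(text):
            return "allowed", name
    return "unknown", None  # unrecognized header: flag for manual review

print(check_file("/* Licensed under the Apache License, Version 2.0 */"))
```

Wired into a pre-commit hook or the release pipeline, a check like this implements the "verification step before publishing" mentioned above.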

Relevance: 30.00%

Abstract:

Plants are essential for human society: our daily food, construction materials and sustainable energy are derived from plant biomass. Yet despite this importance, many developmental aspects of plants are still poorly understood and represent a major challenge for science. With the emergence of high-throughput devices for genome sequencing and high-resolution imaging, data have never been so easy to collect, generating huge amounts of information. Computational analysis is one way to integrate these data and to reduce their apparent complexity to an appropriate scale of abstraction, with the aim of providing new answers and directing further research. This is the motivation behind this thesis: the application of descriptive and predictive analytics, combined with computational modeling, to problems of morphogenesis at the subcellular and organ scales.
One goal of this thesis is to elucidate how the interaction of the phytohormones auxin and brassinosteroid determines cell growth in the root apical meristem of Arabidopsis thaliana (Arabidopsis), the reference plant model for molecular studies. The pertinent information about signaling protein relationships was obtained from the literature to reconstruct the entire hormonal crosstalk network. Due to a lack of quantitative information, a qualitative modeling formalism was employed. This work confirmed the synergistic effect of the hormonal crosstalk on cell elongation, explained some paradoxical mutant phenotypes, and predicted a novel interaction between the BREVIS RADIX (BRX) protein and the transcription factor MONOPTEROS (MP), which turned out to be critical for the maintenance of the root meristem. On the same subcellular scale, another study in the monocot model Brachypodium distachyon (Brachypodium) revealed an alternative wiring of auxin-ethylene crosstalk as compared to Arabidopsis. In the latter, increasing interference with auxin biosynthesis results in progressively shorter roots. By contrast, a hypomorphic Brachypodium mutant in an enzyme of the auxin biosynthesis pathway, isolated in this study, displayed a dramatically longer seminal root. Morphometric analysis confirmed that more anisotropic cells (thinner and longer) are principally responsible for the mutant root phenotype. Further characterization pointed towards an inverted regulatory logic in the relation between ethylene signaling and auxin biosynthesis in Brachypodium as compared to Arabidopsis, which explains the phenotypic discrepancy.
Finally, the morphometric analysis of hypocotyl secondary growth applied in this study was performed with the image-processing pipeline of our quantitative histology method. During secondary growth, the hypocotyl reorganizes its primary bilateral symmetry into a radial symmetry of highly specialized tissues comprising several thousand cells, starting from a few dozen. At this scale, observation is only possible in thin cross-sections, severely hampering a comprehensive analysis of the morphodynamics involved. Our quantitative histology strategy overcomes this limitation: hypocotyl cross-sections were acquired as tiled high-resolution images, and their information content was extracted using custom high-throughput image processing and segmentation. Coupled with an automated cell-type recognition algorithm, this allows precise quantitative characterization of vascular development and reveals developmental patterns that were not evident from visual inspection, for example the steady interspacing of the phloem poles. Further analyses indicated a change in the growth anisotropy of cambial and phloem cells that appeared in phase with the expansion of the xylem. Combining genetic tools and computational modeling, we showed that the growth anisotropy axis of the peripheral tissue layers reorients only when the growth rate of the central tissue is higher than that of the periphery. This was confirmed by calculating the ratio of xylem to phloem growth rates throughout secondary growth: high ratios are indeed observed, concomitant with the homogenization of cambium anisotropy. These results suggest a self-organization mechanism, promoted by a gradient of division in the cambium, that generates a pattern of mechanical constraints which, in turn, reorients the growth anisotropy of peripheral tissues to sustain secondary growth.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Networked international product development is an important part of success in today's changing business world. To make operations more efficient, project activities must also be adapted to the international operating environment, and to remain competitive they must be improved continuously. One means of doing so is project learning, which can be promoted in many ways. This thesis focuses on the learning opportunities offered by developing project knowledge management. According to the literature, sharing project knowledge and exploiting it in subsequent projects is one of the prerequisites of project learning, and this has been taken as the central viewpoint of this study. To limit the scope of the research, the thesis specifically examines project learning between international product development projects. The aim is to present the key challenges of project learning and to find a concrete solution to address them. Product development activities and an internationally distributed project organization also face particular challenges, such as the dispersion of information, project personnel turnover, the confidentiality of information, and geographical constraints (e.g., time zones and site locations). These special challenges were taken into account in the search for a solution. The challenges were addressed with an information technology based solution, designed specifically around the needs and challenges of the case organization. The thesis examines the effect of the designed solution on project learning and how it responds to the identified challenges. The results showed that project learning did occur, although it was difficult to observe directly among the members of the research organization. Project learning can nevertheless be said to take place if project information is well organized and easily available to the whole project team, and these conditions, among others, were met. Project learning is generally seen as a challenging development area in the case organization.
A large part of the knowledge involved is so-called tacit knowledge, which is difficult or impossible to put into written form, so knowledge transfer remains largely dependent on personal interaction. Nevertheless, project learning can be developed through various operating models and methods. Such development requires resources, persistence, and time, and many changes may also require a change in organizational culture and influencing the members of the organization. Motivation, positive perceptions, and clear strategic goals create a solid foundation for developing project learning.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

In this master's thesis, a mechanical model driven by a variable-speed synchronous machine was developed. The model simulates the mechanics of power transmission and its torsional vibrations, and it was developed for the branched mechanics of a rolling mill and for the propulsion system of a tanker. First, the scope of the thesis was to clarify the concepts connected to the mechanical model: the variable-speed drive, the mechanics of power transmission, and the vibrations in the power transmission. Next, the pre-existing mechanical model, a straight shaft line with twelve moments of inertia, was extended to branched configurations covering the cases of parallel machines and parallel rolls. For more accurate simulation, the model was further expanded to up to thirty moments of inertia, and it was enhanced to handle a three-phase short circuit of the simulated machine. The mechanical model was then validated by comparing the results of the developed simulation tool against those of other simulation tools. The compared results were the natural frequencies and mode shapes of torsional vibration, the response to a load-torque step, and the stress in the mechanical system caused by the oscillation of the magnetic field arising from a three-phase short circuit. The results agreed well, and the mechanical model was validated for the compared cases. Further development would make the load torque time-dependent and allow two frequency converters and two FEM-modeled machines to be simulated in parallel.
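The torsional natural frequencies and mode shapes used in the validation can be computed from a lumped-parameter model: disks with given moments of inertia connected by torsional springs. The sketch below is our own illustration of that standard calculation, not the thesis's simulation tool; it solves the generalized eigenproblem K v = ω² J v for a straight shaft line:

```python
import numpy as np

def torsional_natural_frequencies(inertias, stiffnesses):
    """Undamped torsional natural frequencies (Hz) of a shaft line.

    inertias:    moments of inertia J_i [kg m^2] of the lumped disks
    stiffnesses: torsional stiffnesses k_i [Nm/rad] of the shaft
                 segments between consecutive disks
                 (len(stiffnesses) == len(inertias) - 1)

    Free vibration J theta'' + K theta = 0 gives K v = w^2 J v;
    the eigenvalues w^2 yield f = w / (2 pi), sorted ascending.
    """
    n = len(inertias)
    # Assemble the tridiagonal torsional stiffness matrix.
    K = np.zeros((n, n))
    for i, k in enumerate(stiffnesses):
        K[i, i] += k
        K[i + 1, i + 1] += k
        K[i, i + 1] -= k
        K[i + 1, i] -= k
    # Reduce to a symmetric standard eigenproblem via J^{-1/2} K J^{-1/2}.
    d = 1.0 / np.sqrt(np.asarray(inertias, dtype=float))
    A = (K * d).T * d
    w2 = np.clip(np.linalg.eigvalsh(A), 0.0, None)  # clip tiny negatives
    return np.sqrt(w2) / (2.0 * np.pi)

# Two 1 kg m^2 disks on a 1e4 Nm/rad shaft: a rigid-body mode at 0 Hz
# plus one torsional mode.
print(torsional_natural_frequencies([1.0, 1.0], [1e4]))
```

A free-free shaft line always has one zero-frequency rigid-body mode; the remaining eigenvalues are the torsional modes compared against the other simulation tools. Extending the assembly loop to arbitrary disk pairs would cover the branched (parallel machines / parallel rolls) cases.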

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Current trends such as globalization, the turbulence of our environment, rising standards of living, a growing need for security, and the speed of technological development all emphasize the need to anticipate change. To remain competitive, companies must collect, analyze, and exploit business information that supports them in anticipating the actions of authorities, competitors, and customers. Innovation and the development of new concepts, the assessment of competitors' actions, and customer needs, among other things, require anticipatory evaluation. Weak signals play a central role in how organizations prepare for future events. The purpose of this thesis is to create and develop an understanding and management of weak signals, and to develop a conceptual and practical approach for promoting anticipatory activity. The classification of weak-signal types is based on their characteristics with respect to time, strength, and integration into the business. The different types of weak signals and their features set the boundary conditions for collecting quality factors, and further for developing a quality system and a tool based on a mathematical model. The quality factors of weak signals have been gathered from all areas of the weak-signal concept. The analyzed and targeted quality variables make it possible to develop pre-analysis and ICT tools based on the use of the mathematical model. To achieve the objectives of the thesis, a literature review of Business Intelligence was first conducted. The weak-signal process and system are based on the assembled Business Intelligence system. Business integration and the development of a systematic method were examined as the key development areas. Collecting the methods and definitions of weak signals and integrating them into the defined process create the foundation of the new concept, to which the typology and quality factors are linked.
To enable the examination and deployment of practical activity, a Business Intelligence market survey (n=156) was carried out, together with a summary of other available market surveys. In-depth interviews (n=21) verified the validity of the qualitative analysis. In addition, four practical projects were analyzed, and their summaries were linked to the development of the new concept. The process can be divided into two classes: companies' market signals with a one-year anticipation horizon, and public-sector network projects developing a structure for anticipatory activity 7-15 years ahead. The research was limited mainly to the area of external information; IT tools and the development of the final quality system were left outside the scope of the study. The development of the weak-signal concept, the objective of the thesis, met the expectations set for it. The systematic examination and development of weak signals can be advanced by exploiting Business Intelligence systematics, which is used to support business planning in large companies. However, organizations have generally not exploited anticipatory weak-signal activity based on qualitative analysis. Realizing the benefits of systematically integrating external and internal information in support of SMEs would require significant investment in the form of public funding and development support. Foresight work has indeed produced numerous public-sector reports, but few practical implementations. On the other hand, the analyzed cases show that organizations do not necessarily need a dedicated project manager to develop business support; what is needed, however, is the right person to take business responsibility and commit to the matter.
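Purely as an illustration of how quality variables over the three classification dimensions (time, strength, business integration) might feed a mathematical scoring tool — the attribute scales and weights below are hypothetical placeholders, not the thesis's actual model — a weighted linear score could look like:

```python
from dataclasses import dataclass

@dataclass
class WeakSignal:
    name: str
    time_horizon: float  # 0 = immediate .. 1 = 7-15 years out
    strength: float      # 0 = barely detectable .. 1 = strong
    integration: float   # 0 = unconnected .. 1 = tied to the business

# Hypothetical weights; a real tool would calibrate these
# against the collected quality factors.
WEIGHTS = {"time_horizon": 0.2, "strength": 0.4, "integration": 0.4}

def priority(signal: WeakSignal) -> float:
    """Weighted linear score in [0, 1]; higher = act sooner.

    Near-term, strong, well-integrated signals score highest,
    so the time-horizon attribute is inverted before weighting.
    """
    return (WEIGHTS["time_horizon"] * (1.0 - signal.time_horizon)
            + WEIGHTS["strength"] * signal.strength
            + WEIGHTS["integration"] * signal.integration)

signals = [
    WeakSignal("competitor pilot", 0.1, 0.6, 0.8),
    WeakSignal("regulatory draft", 0.7, 0.3, 0.4),
]
for s in sorted(signals, key=priority, reverse=True):
    print(f"{s.name}: {priority(s):.2f}")
```

Ranking signals this way is one simple form the described pre-analysis step could take before qualitative review.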

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Knowing a company's internal interfaces makes it possible to manage the exchange of information throughout the organization. Turning an idea into a profitable innovation requires a seamless process chain and information flow running through the different parts of the organization. The aim of this thesis was to model the exchange of information between two functionally different parts of an organization. The information exchange was described as an interface, a knowledge interface. A three-dimensional organization model formed the main theory of the study. It was linked to the production and sales functions of the company, as well as to the new service development process created by the BestServ project. The new service development process was extended with the process model described in the ISO/IEC 15288 standard. Enterprise architecture frameworks were used as the basis for the modeling. The name "knowledge interface" reflects the view that knowledge is, by nature, something that exists between individuals or groups. However, modeling methods do not yet make it possible to model all the properties associated with knowledge. The knowledge interface model consists of three parts, two presented in graphical form and one as a table. The model can be used on its own or as part of an enterprise architecture. In industrial service business, both the knowledge-interface modeling method and the model created with it can help a machine-shop company understand its development needs and targets when it seeks a larger role in its customers' business through the provision of services. The knowledge interface model can help in modeling and managing an organization's information assets and knowledge, and thus in combining them into a whole that serves the company's strategy. Modeling knowledge interfaces offers knowledge-management research in business studies a methodology for studying innovation management and an organization's capacity for renewal.
Both research areas need more detailed information on, and means of managing, information flows, the exchange of information, and the use of the organization's knowledge assets.