958 results for Actor-network mapping


Relevance: 30.00%

Publisher:

Abstract:

The paper presents the Multiple Kernel Learning (MKL) approach as a modelling and exploratory data analysis tool and applies it to the problem of wind speed mapping. Support Vector Regression (SVR) is used to predict spatial variations of the mean wind speed from terrain features (slopes, terrain curvature, directional derivatives) generated at different spatial scales. Multiple Kernel Learning is applied to learn kernels for individual features and thematic feature subsets, both in the context of feature selection and optimal parameter determination. An empirical study on real-life data confirms the usefulness of MKL as a tool that enhances the interpretability of data-driven models.
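As an illustration of how per-feature kernels can enter a multi-kernel SVR, the sketch below combines one RBF kernel per feature with a precomputed-kernel SVR in scikit-learn. It is a minimal stand-in, not the paper's MKL solver: the data are synthetic, the feature names are hypothetical, and the kernel weights are chosen by a crude grid search rather than learned by MKL optimisation.

```python
# Minimal sketch of a multiple-kernel SVR for spatial regression.
# Synthetic data stand in for the paper's terrain features; the kernel
# weights are tuned by a simple grid search rather than by a full MKL solver.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
# Hypothetical feature groups: slope, curvature, directional derivative
X = rng.normal(size=(n, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.2 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def combined_kernel(A, B, weights, gammas):
    """Weighted sum of RBF kernels, one per feature (column)."""
    K = np.zeros((A.shape[0], B.shape[0]))
    for j, (w, g) in enumerate(zip(weights, gammas)):
        K += w * rbf_kernel(A[:, [j]], B[:, [j]], gamma=g)
    return K

gammas = [1.0, 1.0, 1.0]
best = None
# Crude search over kernel weights (a stand-in for MKL weight learning)
for w0 in (0.2, 0.5, 0.8):
    weights = [w0, (1 - w0) / 2, (1 - w0) / 2]
    K_tr = combined_kernel(X_tr, X_tr, weights, gammas)
    K_te = combined_kernel(X_te, X_tr, weights, gammas)
    model = SVR(kernel="precomputed", C=10.0).fit(K_tr, y_tr)
    score = model.score(K_te, y_te)
    if best is None or score > best[0]:
        best = (score, weights)

print("best R^2:", round(best[0], 3), "kernel weights:", best[1])
```

The learned (here: selected) kernel weights indicate how much each feature group contributes, which is the interpretability angle the abstract emphasises.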

Relevance: 30.00%

Publisher:

Abstract:

Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging at many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008) into so-called Connectomes (Hagmann 2005, Sporns et al, 2005). They exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports investigations of such connectome data by both clinical researchers and neuroscientists. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. The use of Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries. Results: Using a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve additional node information (ConnectomeWiki) and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Arbitrary metadata can be stored for networks, allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting; application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008).
* An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
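As a taste of the scripting workflow the abstract describes, the sketch below loads a GraphML network with NetworkX and computes a few complex-network measures. The file name is a hypothetical placeholder; the actual Connectome File Format bundles such a network together with surfaces, volumes, track data and metadata.

```python
# Minimal sketch of scripted connectome analysis in the spirit of the
# ConnectomeViewer workflow: load a GraphML network with NetworkX and
# compute a few complex-network measures. The file name is hypothetical.
import networkx as nx

G = nx.read_graphml("average_connectome.graphml")  # network part of a Connectome File

print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())

# Basic connectivity measures (a small subset of what NetworkX offers)
simple = nx.Graph(G)                      # coerce to a simple undirected graph
degree = dict(simple.degree())
clustering = nx.clustering(simple)
components = nx.number_connected_components(simple)

hubs = sorted(degree, key=degree.get, reverse=True)[:5]
print("top-degree nodes:", hubs)
print("mean clustering:", sum(clustering.values()) / len(clustering))
print("connected components:", components)
```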

Relevance: 30.00%

Publisher:

Abstract:

Automatic environmental monitoring networks supported by wireless communication technologies nowadays provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are the spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given the measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
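The sketch below illustrates the general topo-climatic workflow on synthetic data: rank DEM-derived features with a data-driven method, then fit a neural network on the selected features. The feature names and the random-forest ranking step are illustrative assumptions, not the article's exact procedure.

```python
# Minimal sketch of the general topo-climatic workflow: predict a climatic
# variable from DEM-derived features with a neural network, after a simple
# data-driven feature ranking. Data are synthetic stand-ins for the Swiss
# monitoring measurements and terrain features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 500
# Hypothetical features: elevation, slope, curvature, distance to valley floor
X = rng.normal(size=(n, 4))
# Synthetic "temperature" with a non-linear, inversion-like effect
y = -0.6 * X[:, 0] + 0.8 * np.tanh(X[:, 3]) + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Data-driven feature ranking (one of several possible selection schemes)
rf = RandomForestRegressor(n_estimators=200, random_state=1).fit(X_tr, y_tr)
ranking = np.argsort(rf.feature_importances_)[::-1]
print("feature ranking (most to least important):", ranking)

# Neural network trained on the selected (here: top-2) features
sel = ranking[:2]
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                                 random_state=1))
mlp.fit(X_tr[:, sel], y_tr)
print("test R^2:", round(mlp.score(X_te[:, sel], y_te), 3))
```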

Relevance: 30.00%

Publisher:

Abstract:

The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with the k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of mapping is controlled by the analysis of the raw data and the residuals using variography. Maps of probabilities of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. The real case study is based on the mapping of radioactively contaminated territories.
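A GRNN is essentially Gaussian-kernel (Nadaraya-Watson) regression, so its core and the cross-validation tuning of an isotropic bandwidth fit in a few lines. The sketch below uses synthetic 2-D data as a stand-in for the contamination measurements; the anisotropic case and the variography checks are omitted.

```python
# Minimal sketch of a General Regression Neural Network (Nadaraya-Watson
# kernel regression) with an isotropic bandwidth tuned by leave-one-out
# cross-validation, applied to synthetic 2-D "contamination" data.
import numpy as np

rng = np.random.default_rng(2)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))                 # sampling locations
values = np.sin(coords[:, 0]) + np.cos(coords[:, 1]) + 0.1 * rng.normal(size=n)

def grnn_predict(train_xy, train_z, query_xy, sigma):
    """GRNN prediction: Gaussian-kernel weighted mean of training values."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ train_z) / w.sum(axis=1)

def loo_rmse(sigma):
    """Leave-one-out RMSE for a given isotropic bandwidth."""
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        pred = grnn_predict(coords[mask], values[mask], coords[[i]], sigma)
        errs.append((pred[0] - values[i]) ** 2)
    return np.sqrt(np.mean(errs))

sigmas = np.linspace(0.1, 2.0, 20)
best_sigma = min(sigmas, key=loo_rmse)
print("tuned isotropic sigma:", round(best_sigma, 2))

# Prediction grid (the basis of the map)
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
zmap = grnn_predict(coords, values, grid, best_sigma).reshape(gx.shape)
print("predicted map shape:", zmap.shape)
```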

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a novel image classification scheme for benthic coral reef images that can be applied to both single-image and composite mosaic datasets. The proposed method can be configured to the characteristics (e.g., the size of the dataset, number of classes, resolution of the samples, color information availability, class types, etc.) of individual datasets. The proposed method uses completed local binary pattern (CLBP), grey level co-occurrence matrix (GLCM), Gabor filter response, and opponent angle and hue channel color histograms as feature descriptors. For classification, either k-nearest neighbor (KNN), neural network (NN), support vector machine (SVM) or probability density weighted mean distance (PDWMD) is used. The combination of features and classifiers that attains the best results is presented together with guidelines for selection. The accuracy and efficiency of our proposed method are compared with other state-of-the-art techniques using three benthic and three texture datasets. The proposed method achieves the highest overall classification accuracy of any of the tested methods and has moderate execution time. Finally, the proposed classification scheme is applied to a large-scale image mosaic of the Red Sea to create a completely classified thematic map of the reef benthos.
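To make the feature/classifier pairing concrete, the sketch below extracts LBP and GLCM descriptors with scikit-image (using plain LBP as a stand-in for CLBP and the graycomatrix/graycoprops names of scikit-image ≥ 0.19) and classifies synthetic texture patches with KNN. The patches and class labels are illustrative, not reef data.

```python
# Minimal sketch of texture-based patch classification in the spirit of the
# proposed scheme: LBP (standing in for CLBP) and GLCM descriptors feed a
# k-nearest-neighbour classifier. The images here are synthetic patches.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

def make_patch(kind, size=64):
    """Two hypothetical 'benthos' textures: smooth noise vs. stripes."""
    if kind == 0:
        return (rng.normal(0.5, 0.1, (size, size)).clip(0, 1) * 255).astype(np.uint8)
    stripes = (np.sin(np.arange(size) / 3.0)[None, :] + 1) / 2
    return ((stripes + 0.1 * rng.normal(size=(size, size))).clip(0, 1) * 255).astype(np.uint8)

def describe(img):
    """Concatenate an LBP histogram with a few GLCM statistics."""
    lbp = local_binary_pattern(img, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    stats = [graycoprops(glcm, p)[0, 0] for p in ("contrast", "homogeneity", "energy")]
    return np.concatenate([hist, stats])

labels = np.array([i % 2 for i in range(120)])
features = np.array([describe(make_patch(k)) for k in labels])

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=3)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("KNN accuracy on held-out patches:", knn.score(X_te, y_te))
```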

Relevance: 30.00%

Publisher:

Abstract:

Activity decreases, or deactivations, of midline and parietal cortical brain regions are routinely observed in human functional neuroimaging studies that compare periods of task-based cognitive performance with passive states, such as rest. It is now widely held that such task-induced deactivations index a highly organized "default-mode network" (DMN): a large-scale brain system whose discovery has had broad implications in the study of human brain function and behavior. In this work, we show that common task-induced deactivations from rest also occur outside of the DMN as a function of increased task demand. Fifty healthy adult subjects performed two distinct functional magnetic resonance imaging tasks that were designed to reliably map deactivations from a resting baseline. As primary findings, increases in task demand consistently modulated the regional anatomy of DMN deactivation. At high levels of task demand, robust deactivation was observed in non-DMN regions, most notably the posterior insular cortex. Deactivation of this region was directly implicated in a performance-based analysis of experienced task difficulty. Together, these findings suggest that task-induced deactivations from rest are not limited to the DMN and extend to brain regions typically associated with integrative sensory and interoceptive processes.

Relevance: 30.00%

Publisher:

Abstract:

The aim of the study was to determine how Bluetooth technology can affect the value network of the ICT sector and the roles of the actors in the field. The study used qualitative methods and had features of futures research, since exploratory methods were also employed. The study was largely based on literature and articles, on the basis of which scenarios of the future value network of the ICT sector were constructed. The study also included an empirical part, in which two group discussions were organized where actors in the field discussed the constructed scenarios and, more generally, the effects of Bluetooth on the value network. According to the study, the future value network of the ICT sector will be structured as a strategic value network led by different actors in different market situations. The strategic center of the network may change over time as Bluetooth applications multiply, and it may vary from one market area to another. Bluetooth creates a new communication channel alongside existing ones and can locally replace the use of current channels for data transfer. Bluetooth can give rise to numerous new business opportunities and can be used to provide value-added services. The greatest changes can be expected to affect the business of telecom operators and content providers.

Relevance: 30.00%

Publisher:

Abstract:

The second scientific meeting of the European systems genetics network for the study of complex genetic human disease using genetic reference populations (SYSGENET) took place at the Center for Cooperative Research in Biosciences in Bilbao, Spain, December 10-12, 2012. SYSGENET is funded by the European Cooperation in the Field of Scientific and Technological Research (COST) and represents a network of scientists in Europe that use mouse genetic reference populations (GRPs) to identify complex genetic factors influencing disease phenotypes (Schughart, Mamm Genome 21:331-336, 2010). About 50 researchers working in the field of systems genetics attended the meeting, which consisted of 27 oral presentations, a poster session, and a management committee meeting. Participants exchanged results, set up future collaborations, and shared phenotyping and data analysis methodologies. This meeting was particularly instrumental for conveying the current status of the US, Israeli, and Australian Collaborative Cross (CC) mouse GRP. The CC is an open source project initiated nearly a decade ago by members of the Complex Trait Consortium to aid the mapping of multigenetic traits (Threadgill, Mamm Genome 13:175-178, 2002). In addition, representatives of the International Mouse Phenotyping Consortium were invited to exchange ongoing activities between the knockout and complex genetics communities and to discuss and explore potential fields for future interactions.

Relevance: 30.00%

Publisher:

Abstract:

The ongoing global financial crisis has demonstrated the importance of a systemwide, or macroprudential, approach to safeguarding financial stability. An essential part of macroprudential oversight concerns the tasks of early identification and assessment of risks and vulnerabilities that eventually may lead to a systemic financial crisis. Effective tools are crucial, as they allow early policy actions to decrease or prevent the further build-up of risks or to otherwise enhance the shock-absorption capacity of the financial system. In the literature, three types of systemic risk can be identified: i) build-up of widespread imbalances, ii) exogenous aggregate shocks, and iii) contagion. Accordingly, the systemic risks are matched by three categories of analytical methods for decision support: i) early-warning, ii) macro stress-testing, and iii) contagion models. Stimulated by the prolonged global financial crisis, today's toolbox of analytical methods includes a wide range of innovative solutions to the two tasks of risk identification and risk assessment. Yet, the literature lacks a focus on the task of risk communication. This thesis discusses macroprudential oversight from the viewpoint of all three tasks: within analytical tools for risk identification and risk assessment, the focus is on a tight integration of means for risk communication. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. The overall task of this thesis is to represent high-dimensional data concerning financial entities on low-dimensional displays. The low-dimensional representations have two subtasks: i) to function as a display for individual data concerning entities and their time series, and ii) to serve as a basis to which additional information can be linked. The final nuance of the task is, however, set by the needs of the domain, data and methods. The following five questions comprise the subsequent steps addressed in this thesis: 1. What are the needs for macroprudential oversight? 2. What form do macroprudential data take? 3. Which data and dimension reduction methods hold most promise for the task? 4. How should the methods be extended and enhanced for the task? 5. How should the methods and their extensions be applied to the task? Based upon the Self-Organizing Map (SOM), this thesis not only creates the Self-Organizing Financial Stability Map (SOFSM), but also lays out a general framework for mapping the state of financial stability. This thesis also introduces three extensions to the standard SOM for enhancing the visualization and extraction of information: i) fuzzifications, ii) transition probabilities, and iii) network analysis. Thus, the SOFSM functions as a display for risk identification, on top of which risk assessments can be illustrated. In addition, this thesis puts forward the Self-Organizing Time Map (SOTM) to provide means for visual dynamic clustering, which in the context of macroprudential oversight concerns the identification of cross-sectional changes in risks and vulnerabilities over time. Rather than automated analysis, the aim of the visual means for identifying and assessing risks is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence, as well as external risk communication.
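To make the SOM basis concrete, the sketch below trains a minimal self-organizing map written directly in NumPy and projects each observation to its best-matching unit, i.e. to a position on a two-dimensional display. The data are synthetic stand-ins for multivariate financial-stability indicators; the SOFSM extensions (fuzzifications, transition probabilities, network analysis) are not shown.

```python
# Minimal self-organizing map in NumPy, illustrating the kind of
# low-dimensional display the SOFSM builds on. Data are synthetic
# stand-ins for multivariate financial-stability indicators.
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(size=(400, 6))          # 400 "economies", 6 indicators

rows, cols, dim = 8, 8, data.shape[1]
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter = 3000
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # Best-matching unit (BMU) of the sampled observation
    d = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(d), d.shape)
    # Decaying learning rate and neighbourhood radius
    lr = 0.5 * (1 - t / n_iter)
    radius = 1 + 3 * (1 - t / n_iter)
    dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=-1)
    h = np.exp(-dist2 / (2 * radius ** 2))[..., None]
    weights += lr * h * (x - weights)

# Project each observation to its BMU (its coordinate on the map display)
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - x, axis=2)),
                         (rows, cols)) for x in data]
print("first five map positions:", bmus[:5])
```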

Relevance: 30.00%

Publisher:

Abstract:

Continuous loading and unloading can cause the breakdown of cranes. In seeking a solution to this problem, the use of an intelligent control system for improving the fatigue life of cranes has been under study in mechatronics control since 1994. This research focuses on the use of neural networks as a possible way of developing an algorithm to map stresses on a crane. The intelligent algorithm was designed to be part of the crane's system; the design process started with SolidWorks and ANSYS, continued with co-simulation using MSC Adams software incorporated in MATLAB/Simulink, and ended with a MATLAB neural network (NN) for the optimization process. The flexibility of the boom accounted for the accuracy of the maximum stress results in the ADAMS model. The flexibility created in ANSYS produced more accurate results than the flexibility model in ADAMS/View using discrete links. The compatibility between the ADAMS and ANSYS software was paramount for the efficiency and accuracy of the results. Von Mises stress analysis was most suitable for this thesis work because the hydraulic boom was made from construction steel FE-510 of steel grade S355 with a yield strength of 355 MPa; the Von Mises theory is appropriate for further analysis owing to the ductility of the material and the repeated tensile and shear loading. Neural network predictions for the maximum stresses were then compared with the co-simulation results for accuracy, and the comparison showed that the neural network model predicted the maximum stresses on the boom with sufficient accuracy relative to the co-simulation.
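The sketch below shows the shape of such a neural-network surrogate: a regression network mapping load-case parameters to a maximum von Mises stress, which can then be checked against the 355 MPa yield strength. The input variables and the synthetic stress response are assumptions standing in for the ADAMS/ANSYS co-simulation outputs, and scikit-learn's MLPRegressor replaces the MATLAB NN toolbox used in the thesis.

```python
# Minimal sketch of a neural-network surrogate for maximum von Mises stress,
# in the spirit of the thesis workflow (co-simulation results -> NN model).
# Inputs and the stress formula here are synthetic stand-ins for the
# ADAMS/ANSYS co-simulation outputs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 400
# Hypothetical load cases: payload mass [kg], boom extension [m], luffing angle [deg]
X = np.column_stack([rng.uniform(100, 2000, n),
                     rng.uniform(2, 10, n),
                     rng.uniform(10, 80, n)])
# Synthetic "maximum stress" response [MPa] with an interaction effect
y = 0.05 * X[:, 0] * X[:, 1] * np.cos(np.radians(X[:, 2])) / 10 + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=5)

surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(64, 32),
                                       max_iter=5000, random_state=5))
surrogate.fit(X_tr, y_tr)
print("surrogate R^2 on held-out load cases:", round(surrogate.score(X_te, y_te), 3))

# Checking a predicted stress against the S355 yield strength of 355 MPa
case = np.array([[1500.0, 8.0, 30.0]])
print("predicted max stress [MPa]:", surrogate.predict(case)[0])
```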

Relevance: 30.00%

Publisher:

Abstract:

Acid sulfate (a.s.) soils constitute a major environmental issue. Severe ecological damage results from the considerable amounts of acidity and metals leached by these soils into the recipient watercourses. As even small hot spots may affect large areas of coastal waters, mapping represents a fundamental step in the management and mitigation of a.s. soil environmental risks (i.e. to target strategic areas). Traditional mapping in the field is time-consuming and therefore expensive. Additional, more cost-effective techniques thus have to be developed in order to narrow down and define in detail the areas of interest. The primary aim of this thesis was to assess different spatial modeling techniques for a.s. soil mapping, and the characterization of soil properties relevant for a.s. soil environmental risk management, using all available data: soil and water samples, as well as datalayers (e.g. geological and geophysical). Different spatial modeling techniques were applied at catchment or regional scale. Two artificial neural networks were assessed on the Sirppujoki River catchment (c. 440 km2) located in southwestern Finland, while fuzzy logic was assessed on several areas along the Finnish coast. Quaternary geology, aerogeophysics and slope data (derived from a digital elevation model) were utilized as evidential datalayers. The methods also required the use of point datasets (i.e. soil profiles corresponding to known a.s. or non-a.s. soil occurrences) for training and/or validation within the modeling processes. Applying these methods, various maps were generated: probability maps for a.s. soil occurrence, as well as predictive maps for different soil properties (sulfur content, organic matter content and critical sulfide depth). The two assessed artificial neural networks (ANNs) demonstrated good classification abilities for a.s. soil probability mapping at catchment scale. Slightly better results were achieved using a Radial Basis Function (RBF)-based ANN than with a Radial Basis Functional Link Net (RBFLN) method, narrowing down more accurately the most probable areas for a.s. soil occurrence and defining the least probable areas more precisely. The RBF-based ANN also demonstrated promising results for the characterization of different soil properties in the most probable a.s. soil areas at catchment scale. Since a.s. soil areas constitute highly productive lands for agricultural purposes, the combination of a probability map with more specific soil property predictive maps offers a valuable toolset to more precisely target strategic areas for subsequent environmental risk management. Notably, the use of laser scanning (i.e. Light Detection And Ranging, LiDAR) data enabled a more precise definition of a.s. soil probability areas, as well as of the soil property modeling classes for sulfur content and the critical sulfide depth. Given suitable training/validation points, ANNs can be trained to yield a more precise modeling of the occurrence of a.s. soils and their properties. By contrast, fuzzy logic represents a simple, fast and objective alternative to carry out preliminary surveys, at catchment or regional scale, in areas offering a limited amount of data. This method enables delimiting and prioritizing the most probable areas for a.s. soil occurrence, which can be particularly useful in the field. Being easily transferable from area to area, fuzzy logic modeling can be carried out at regional scale. Mapping at this scale would be extremely time-consuming through manual assessment.
The use of spatial modeling techniques enables the creation of valid and comparable maps, which represents an important development within the a.s. soil mapping process. The a.s. soil mapping was also assessed using water chemistry data for 24 different catchments along the Finnish coast (in all, covering c. 21,300 km2) which were mapped with different methods (i.e. conventional mapping, fuzzy logic and an artificial neural network). Two a.s. soil related indicators measured in the river water (sulfate content and sulfate/chloride ratio) were compared to the extent of the most probable areas for a.s. soils in the surveyed catchments. High sulfate contents and sulfate/chloride ratios measured in most of the rivers demonstrated the presence of a.s. soils in the corresponding catchments. The calculated extent of the most probable a.s. soil areas is supported by independent data on water chemistry, suggesting that the a.s. soil probability maps created with different methods are reliable and comparable.
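To illustrate the fuzzy-logic alternative, the sketch below combines three synthetic evidential rasters with fuzzy membership functions and the fuzzy gamma operator, producing an a.s. soil probability raster. The layers, membership functions and gamma value are illustrative assumptions, not the calibrated models of the thesis.

```python
# Minimal sketch of fuzzy-logic evidence combination for acid sulfate soil
# probability mapping. The evidential layers (sediment type, aerogeophysics,
# slope) are synthetic rasters and the membership functions are illustrative.
import numpy as np

rng = np.random.default_rng(6)
shape = (100, 100)

# Hypothetical evidential rasters on a common grid
fine_sediment = rng.uniform(0, 1, shape)       # fraction of fine-grained sediments
conductivity = rng.uniform(0, 60, shape)       # apparent conductivity [mS/m]
slope_deg = rng.uniform(0, 15, shape)          # slope [degrees]

# Fuzzy membership functions (0 = unfavourable, 1 = favourable for a.s. soils)
m_sediment = fine_sediment                               # already in [0, 1]
m_conductivity = np.clip(conductivity / 40.0, 0, 1)      # high conductivity favourable
m_slope = np.clip(1 - slope_deg / 10.0, 0, 1)            # flat terrain favourable

# Fuzzy gamma operator: compromise between fuzzy AND (product) and OR (algebraic sum)
gamma = 0.8
memberships = np.stack([m_sediment, m_conductivity, m_slope])
fuzzy_and = memberships.prod(axis=0)
fuzzy_or = 1 - (1 - memberships).prod(axis=0)
probability_map = fuzzy_and ** (1 - gamma) * fuzzy_or ** gamma

print("share of cells flagged as highly probable (>0.7):",
      round((probability_map > 0.7).mean(), 3))
```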

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this Master's thesis is to study value co-creation in an emerging value network. The main objective is to examine how value is co-created in a bio-based chemicals value network. The study provides insights into the value perceived by the different actors in the value network and sheds light on their motivations to commit to collaborative partnerships with other actors. The empirical study shows that value co-creation is the creation of mutual value for both parties of the relationship by combining their non-competing resources to achieve a common goal. Value co-creation happens in interactions, and trust, commitment and information sharing are essential prerequisites for it. Value co-creation is not only common value creation; it is also value that emerges for each actor as a result of the co-operation with the other actor. Even though the case companies define value mainly in economic terms, other value elements, such as the value of the partnership, knowledge transfer and innovation, are more important for value co-creation.

Relevance: 30.00%

Publisher:

Abstract:

Abstract 1: Social networks such as Twitter are often used for disseminating and collecting information during natural disasters, and the potential for their use in disaster management has been acknowledged. However, a more nuanced understanding of the communications that take place on social networks is required to integrate this information more effectively into disaster management processes. The type and value of information shared should be assessed, determining the benefits and issues, with credibility and reliability as known concerns. Mapping the tweets in relation to the modelled stages of a disaster can be a useful evaluation for determining the benefits and drawbacks of using data from social networks, such as Twitter, in disaster management. A thematic analysis of tweets' content, language and tone during the UK storms and floods of 2013/14 was conducted. Manual scripting was used to determine the official sequence of events and to classify the stages of the disaster into the phases of the Disaster Management Lifecycle, producing a timeline. Twenty-five topics discussed on Twitter emerged, and three key types of tweets, based on language and tone, were identified. The timeline represents the events of the disaster, according to the Met Office reports, classed into B. Faulkner's Disaster Management Lifecycle framework. Context is provided when the analysed tweets are observed against the timeline. This illustrates a potential basis and benefit for mapping tweets into the Disaster Management Lifecycle phases. Comparing the number of tweets submitted in each month with the timeline suggests that users tweet more as an event heightens and persists; furthermore, users generally express greater emotion and urgency in their tweets. This paper concludes that the thematic analysis of content on social networks, such as Twitter, can be useful in gaining additional perspectives for disaster management. It demonstrates that mapping tweets into the phases of a Disaster Management Lifecycle model can have benefits in the recovery phase, not just in the response phase, to potentially improve future policies and activities.

Abstract 2: The current execution of privacy policies, as a mode of communicating information to users, is unsatisfactory. Social networking sites (SNS) exemplify this issue, attracting growing concerns regarding their use of personal data and its effect on user privacy. This demonstrates the need for more informative policies. However, SNS lack the incentives required to improve policies, which is exacerbated by the difficulties of creating a policy that is both concise and compliant. Standardization addresses many of these issues, providing benefits for users and SNS, although it is only possible if policies share attributes which can be standardized. This investigation used thematic analysis and cross-document structure theory to assess the similarity of attributes between the privacy policies (as available in August 2014) of the six most frequently visited SNS globally. Using the Jaccard similarity coefficient, two types of attribute were measured: the clauses used by SNS and the coverage of forty recommendations made by the UK Information Commissioner's Office. The analysis showed that while similarity in the clauses used was low, similarity in the recommendations covered was high, indicating that SNS use different clauses to convey similar information. The analysis also showed that the low similarity in the clauses was largely due to differences in semantics, elaboration and functionality between SNS. Therefore, this paper proposes that the policies of SNS already share attributes, indicating the feasibility of standardization, and five recommendations are made to begin facilitating this, based on the findings of the investigation.
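Since the comparison rests on the Jaccard similarity coefficient, a short worked example helps: the coefficient is the size of the intersection of two attribute sets divided by the size of their union. The clause sets below are invented for illustration and are not taken from the actual SNS policies.

```python
# Minimal worked example of the Jaccard similarity coefficient as used to
# compare policy attributes: |A ∩ B| / |A ∪ B|. The clause sets below are
# illustrative, not taken from the actual SNS policies.
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two attribute sets."""
    return len(a & b) / len(a | b) if a or b else 1.0

policy_a = {"data collection", "cookies", "third-party sharing", "retention"}
policy_b = {"data collection", "cookies", "advertising", "children"}

print("shared clauses:", policy_a & policy_b)
print("Jaccard similarity:", round(jaccard(policy_a, policy_b), 2))  # 2 / 6 ≈ 0.33
```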

Relevance: 30.00%

Publisher:

Abstract:

The European research project TIDE (Tidal Inlets Dynamics and Environment) is developing and validating coupled models describing the morphological, biological and ecological evolution of tidal environments. The interactions between the physical and biological processes occurring in these regions require that the system be studied as a whole rather than as separate parts. Extensive use of remote sensing, including LiDAR, is being made to provide validation data for the modelling. This paper describes the different uses of LiDAR within the project and their relevance to the TIDE science objectives. LiDAR data have been acquired from three different environments: the Venice Lagoon in Italy, Morecambe Bay in England, and the Eden estuary in Scotland. LiDAR accuracy at each site has been evaluated using ground reference data acquired with differential GPS. A semi-automatic technique has been developed to extract tidal channel networks from LiDAR data, either used alone or fused with aerial photography. While the resulting networks may require some correction, the procedure does allow network extraction over large areas using objective criteria and reduces fieldwork requirements. The networks extracted may subsequently be used in geomorphological analyses, for example to describe the drainage patterns induced by networks and to examine the rate of change of networks. Estimation of the heights of the low and sparse vegetation on marshes is being investigated by analysing the statistical distribution of the measured LiDAR heights; species having different mean heights may be separated using the first-order moments of the height distribution.
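To illustrate the moment-based approach to vegetation height, the sketch below computes the first moments of a synthetic within-plot LiDAR height distribution and derives a rough canopy-height estimate by splitting ground and canopy returns at the mean. This is only an illustration of the idea of using height-distribution moments, not the TIDE procedure.

```python
# Minimal sketch of estimating low-vegetation height from the statistical
# distribution of LiDAR heights within a marsh plot, using simple moments.
# The point heights are synthetic stand-ins for real LiDAR returns.
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical plot: some returns from the ground, some from the canopy top
ground = rng.normal(loc=0.00, scale=0.03, size=600)     # metres above local ground
canopy = rng.normal(loc=0.35, scale=0.05, size=400)     # sparse ~35 cm vegetation
heights = np.concatenate([ground, canopy])

mean_h = heights.mean()
std_h = heights.std()
skew_h = ((heights - mean_h) ** 3).mean() / std_h ** 3

# Illustrative separation: returns above the mean are treated as canopy hits
canopy_fraction = (heights > mean_h).mean()
estimated_veg_height = heights[heights > mean_h].mean()

print(f"mean={mean_h:.3f} m, std={std_h:.3f} m, skewness={skew_h:.2f}")
print(f"canopy fraction={canopy_fraction:.2f}, "
      f"estimated vegetation height={estimated_veg_height:.2f} m")
```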