877 results for Analysis tools
Abstract:
This research proposes a method for systematically extracting technology intelligence (TI) from a large set of document data. To do this, the internal and external sources, in the form of documents, that might be valuable for TI are first identified. Then the existing techniques and software systems applicable to document analysis are examined. Finally, based on these reviews, a document-mining framework designed for TI is suggested and guidelines for software selection are proposed. The research output is expected to support intelligence operatives in finding suitable techniques and software systems for extracting value from document mining, and thus to facilitate effective knowledge management. Copyright © 2012 Inderscience Enterprises Ltd.
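As a purely illustrative sketch of the kind of technique such a document-mining framework might build on (the abstract does not specify one), the snippet below ranks candidate technology terms in a small corpus with TF-IDF; the corpus, and the use of scikit-learn, are assumptions, not the paper's method.

```python
# Minimal sketch: ranking candidate technology terms in a document set
# with TF-IDF, one of the standard text-mining techniques a TI pipeline
# might start from. Corpus and library choice are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "patent on solid-state battery electrolyte materials",
    "press release about battery gigafactory capacity expansion",
    "journal article on lithium anode degradation analysis",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)

# The top-weighted terms per document approximate its technological focus.
terms = vectorizer.get_feature_names_out()
for i, row in enumerate(tfidf.toarray()):
    top = sorted(zip(terms, row), key=lambda t: -t[1])[:3]
    print(f"doc {i}:", [t for t, _ in top])
```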
Abstract:
A substantial amount of the 'critical mass' of digital data available to scholarship contains place-names, and it is now recognised that spatial and temporal data points, including place-names, are a vital part of the e-research infrastructure that supports the use, re-use and advanced analysis of data using ICT tools and methods. Place-names can also be linked semantically to contribute to the web of data, to enrich content by linking existing data, and to identify new collections for digitization that strategically enhance existing digital collections. However, existing e-research projects rely on modern gazetteers, which limits them to the modern and near-contemporary. This workshop explored how to further integrate the wealth of historical place-name scholarship, and the digital resources it has generated within UK academia, so as to enable the integration of local knowledge over much longer periods.
Abstract:
This work assesses the success of Brunetta's reform (Legislative Decree no. 150/2009), a far-reaching reform aimed at improving both organizational and individual performance in Italian public administration through a specific planning and control process (the performance cycle) and, above all, through two new tools, the Performance Plan and the Performance Report. The success of the reform is assessed, with particular emphasis on local governments, by analyzing the diffusion and use of these new tools. The study follows a deductive-inductive methodology. After a study of managerial reforms in Italy and of the performance measurement literature, a model (the PerformEL Model) that local governments could follow to draw up the Performance Plan and Report as effective performance measurement tools was designed (deductive phase). Performance Plans 2011-2013 and Performance Reports 2011 downloaded from the websites of large Italian municipalities were then analyzed in the light of the PerformEL Model, to assess the diffusion of the documents and their coherence with legal requirements and suggestions from the literature (inductive phase). The data arising from the empirical analysis were used to evaluate the diffusion and effectiveness of these municipalities' Performance Plans and Reports as performance measurement tools, and thus to assess the success of the reform (feedback phase). The study shows a limited diffusion of the documents, which are mostly drawn up because they are compulsory or to gain legitimacy. The results testify to the failure of Brunetta's reform, at least with regard to local governments.
Abstract:
Integrating analysis and design models is a complex task due to differences between the models and the architectures of the toolsets used to create them. This complexity is increased with the use of many different tools for specific tasks during an analysis process. In this work various design and analysis models are linked throughout the design lifecycle, allowing them to be moved between packages in a way not currently available. Three technologies named Cellular Modeling, Virtual Topology and Equivalencing are combined to demonstrate how different finite element meshes generated on abstract analysis geometries can be linked to their original geometry. Cellular models allow interfaces between adjacent cells to be extracted and exploited to transfer analysis attributes such as mesh associativity or boundary conditions between equivalent model representations. Virtual Topology descriptions used for geometry clean-up operations are explicitly stored so they can be reused by downstream applications. Establishing the equivalence relationships between models enables analysts to utilize multiple packages for specialist tasks without worrying about compatibility issues or substantial rework.
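The abstract describes equivalencing only conceptually, so the following is a minimal, hypothetical sketch (plain Python, invented identifiers) of the core idea: an explicit equivalence map between entities of two model representations, used to carry an analysis attribute such as a boundary condition from a design face to its equivalent mesh faces.

```python
# Illustrative sketch only: "equivalencing" modelled abstractly as a
# bidirectional map between entities of two model representations.
# All names (face ids, the bc dict) are hypothetical.

design_to_analysis = {          # design face -> equivalent analysis faces
    "design_face_7": ["mesh_face_101", "mesh_face_102"],
}
analysis_to_design = {
    m: d for d, ms in design_to_analysis.items() for m in ms
}

boundary_conditions = {"design_face_7": {"type": "pressure", "value_mpa": 0.5}}

# Transfer the attribute across the equivalence relationship.
analysis_bcs = {}
for d_face, bc in boundary_conditions.items():
    for m_face in design_to_analysis.get(d_face, []):
        analysis_bcs[m_face] = bc

print(analysis_bcs)
```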
Abstract:
The present work deals with the development of robust numerical tools for Isogeometric Analysis, suitable for solid mechanics problems in the nonlinear regime. To that end, a new solid-shell element, based on the Assumed Natural Strain method, is proposed for the analysis of thin shell-like structures. The formulation is extensively validated on a set of well-known benchmark problems from the literature, in both the linear and nonlinear (geometric and material) regimes. An alternative formulation focused on alleviating the volumetric locking pathology in linear elastic problems is also proposed. In addition, an introductory study of contact mechanics in the context of Isogeometric Analysis is presented, with special focus on the implementation of the Point-to-Segment algorithm. All the methodologies presented in this work were implemented in an in-house code, together with several pre- and post-processing tools, and user subroutines for the commercial software Abaqus were also implemented.
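Since Isogeometric Analysis discretizations such as the solid-shell element above are built on NURBS/B-spline bases, a short self-contained sketch of the textbook Cox-de Boor recursion may help situate the formulation; this is standard material, not the thesis code.

```python
# Cox-de Boor recursion for B-spline basis functions, the building block
# of the NURBS bases used in Isogeometric Analysis. Textbook material.

def bspline_basis(i, p, u, knots):
    """Value of the i-th degree-p B-spline basis function at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    den1 = knots[i + p] - knots[i]
    if den1 > 0.0:
        left = (u - knots[i]) / den1 * bspline_basis(i, p - 1, u, knots)
    den2 = knots[i + p + 1] - knots[i + 1]
    if den2 > 0.0:
        right = (knots[i + p + 1] - u) / den2 * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

# Quadratic basis on an open knot vector; the values at u = 0.4 sum to 1
# (partition of unity), a property the analysis relies on.
knots = [0, 0, 0, 0.5, 1, 1, 1]
vals = [bspline_basis(i, 2, 0.4, knots) for i in range(4)]
print(vals, sum(vals))
```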
Abstract:
Si3N4 tools were coated with a thin diamond film in a Hot-Filament Chemical Vapour Deposition (HFCVD) reactor in order to machine grey cast iron. The wear behaviour of these tools in high-speed machining was the main subject of this work. Turning tests were performed combining cutting speeds of 500, 700 and 900 m min−1 with feed rates of 0.1, 0.25 and 0.4 mm rev−1, keeping the depth of cut constant at 1 mm. To evaluate tool behaviour during the turning tests, cutting forces were analyzed, and a significant increase with feed rate was verified. Diamond film removal occurred for the most severe set of cutting parameters. Adhesion of iron and manganese from the workpiece to the tool was also observed. Tests were performed on a CNC lathe equipped with a 3-axis dynamometer; results were collected and recorded by in-house software. Tool wear was analyzed with a Scanning Electron Microscope (SEM) equipped with an X-ray Energy Dispersive Spectroscopy (EDS) system, and surface analysis was performed with a profilometer.
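For reference, the standard turning relations connecting the parameters varied in these tests can be made concrete in a few lines; only the cutting speeds, feed rates and depth of cut come from the abstract, while the workpiece diameter is an assumed example value.

```python
# Standard turning relations applied to the abstract's test matrix.
# The workpiece diameter D is an assumption for illustration.
import math

def spindle_speed_rpm(v_c_m_min, diameter_mm):
    """n = 1000 * v_c / (pi * D)."""
    return 1000.0 * v_c_m_min / (math.pi * diameter_mm)

def mrr_cm3_min(v_c_m_min, f_mm_rev, a_p_mm):
    """Material removal rate: v_c * f * a_p, in cm^3/min."""
    return v_c_m_min * f_mm_rev * a_p_mm

D = 80.0                              # assumed workpiece diameter, mm
for v_c in (500, 700, 900):           # cutting speeds from the abstract
    for f in (0.1, 0.25, 0.4):        # feed rates from the abstract
        n = spindle_speed_rpm(v_c, D)
        print(v_c, f, round(n), mrr_cm3_min(v_c, f, 1.0))  # a_p = 1 mm
```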
Abstract:
Dissertation submitted to obtain the degree of Doctor in Electrical and Computer Engineering, specialization in Collaborative Networks.
Abstract:
The development of conceptual knowledge systems specifically requires knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented, and further developments are discussed.
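To make the framework concrete, here is a minimal, self-contained sketch of the derivation operators of formal concept analysis and a brute-force enumeration of the formal concepts of a toy context; the context is invented for illustration and is unrelated to the tools the paper surveys.

```python
# Derivation operators (') on a tiny formal context, and brute-force
# enumeration of its formal concepts (extent, intent). Toy data only.
from itertools import combinations

objects = {"duck", "dog", "carp"}
attrs = {"duck": {"flies", "swims"}, "dog": {"barks"}, "carp": {"swims"}}
all_attrs = {"flies", "swims", "barks"}

def intent(extent_set):
    """Attributes shared by all objects in the extent."""
    sets = [attrs[o] for o in extent_set]
    return set.intersection(*sets) if sets else set(all_attrs)

def extent(intent_set):
    """Objects having every attribute in the intent."""
    return {o for o in objects if intent_set <= attrs[o]}

# For any object set X, (X'', X') is a formal concept; enumerating all
# subsets therefore yields every concept of the context.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        i = intent(set(objs))
        e = extent(i)
        concepts.add((frozenset(e), frozenset(i)))

for e, i in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(e), sorted(i))
```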
Abstract:
This paper presents a tool for the analysis and regeneration of Web contents, implemented with XML and Java. At present, Web content is delivered from server to clients without taking the clients' characteristics into account. Heterogeneous and diverse characteristics, such as user preferences, the varying capacities of client devices, different types of access, the state of the network and the current load on the server, directly affect the behavior of Web services. At the same time, the growing use of multimedia objects in the design of Web contents ignores this diversity and heterogeneity, further hindering appropriate content delivery. The objective of the presented tool is therefore the processing of Web pages taking this heterogeneity into account and adapting contents in order to improve performance on the Web.
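The paper's implementation is in XML and Java; the sketch below merely illustrates, in Python and with invented profile fields and thresholds, the kind of rule-based decision such a tool makes when adapting content to client characteristics.

```python
# Hypothetical content-adaptation rule: choose delivery parameters from
# a client profile (bandwidth, screen size). Fields and thresholds are
# assumptions, not the paper's design.

def adapt_content(profile):
    """Return delivery parameters for a Web page given a client profile."""
    if profile["bandwidth_kbps"] < 256 or profile["screen_width"] < 480:
        return {"images": "thumbnail", "video": "omit", "markup": "minimal"}
    if profile["bandwidth_kbps"] < 2000:
        return {"images": "medium", "video": "low-bitrate", "markup": "full"}
    return {"images": "original", "video": "original", "markup": "full"}

print(adapt_content({"bandwidth_kbps": 128, "screen_width": 320}))
print(adapt_content({"bandwidth_kbps": 5000, "screen_width": 1920}))
```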
Abstract:
A set of NIH Image macro programs was developed to perform qualitative and quantitative analyses of digital stereo pictures produced by scanning electron microscopes. These tools were designed for image alignment, anaglyph representation, animation, reconstruction of true elevation surfaces, reconstruction of elevation profiles, true-scale elevation mapping and, on the quantitative side, surface area and roughness calculations. Limitations in processing time, scanning techniques and programming concepts are also discussed.
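Elevation reconstruction from SEM stereo pairs conventionally relies on the parallax relation z = p / (2 M sin(α/2)), where p is the parallax measured between the two images, M the magnification and α the tilt angle between the stereo pair; the worked example below applies it with illustrative numbers, which are not taken from the paper.

```python
# Standard stereo-SEM parallax-to-height relation, with made-up values.
import math

def elevation_um(parallax_mm, magnification, tilt_deg):
    """z = p / (2 * M * sin(alpha/2)), returned in micrometres."""
    p_um = parallax_mm * 1000.0          # measured parallax, micrometres
    return p_um / (2.0 * magnification * math.sin(math.radians(tilt_deg) / 2.0))

# 0.6 mm of parallax at 500x with a 10 degree tilt between exposures.
print(round(elevation_um(0.6, 500, 10.0), 2), "um")
```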
Abstract:
Design tools have existed for decades for standard step-index fibers, with analytical expressions for cutoff conditions as a function of core size, refractive indexes, and wavelength. We present analytical expressions for cutoff conditions for fibers with a ring-shaped propagation region. We validate our analytical expressions against numerical solutions, as well as via asymptotic analysis yielding the existing solutions for standard step-index fiber. We demonstrate the utility of our solutions for optimizing fibers supporting specific eigenmode behaviors of interest for spatial division multiplexing. In particular, we address large mode separation for orbital angular momentum modes and fibers supporting only modes with a single intensity ring.
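For comparison, the standard step-index cutoff condition that the paper's ring-fiber expressions recover asymptotically can be evaluated numerically in a few lines; the fiber parameters below are illustrative.

```python
# Step-index single-mode check: the fiber is single mode when the
# normalized frequency V = (2*pi*a/lambda)*sqrt(n1^2 - n2^2) lies below
# the LP11 cutoff, the first zero of J0, V_c ~ 2.405. Example values.
import math

def v_number(core_radius_um, wavelength_um, n_core, n_clad):
    na = math.sqrt(n_core**2 - n_clad**2)   # numerical aperture
    return 2.0 * math.pi * core_radius_um / wavelength_um * na

V = v_number(4.1, 1.55, 1.450, 1.445)
print(f"V = {V:.3f}, single-mode: {V < 2.405}")
```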
Abstract:
Human movement analysis (HMA) aims to measure a subject's ability to stand or walk. In the field of HMA, tests are performed daily in research laboratories, hospitals and clinics to diagnose a disease, distinguish between disease entities, monitor the progress of a treatment and predict the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. To achieve these purposes, clinicians and researchers use measurement devices such as force platforms, stereophotogrammetric systems, accelerometers and baropodometric insoles. This thesis focuses on the force platform (FP), and in particular on the quality assessment of FP data. The principal objective of our work was the design and experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows:
Chapter 1 describes the physical principles underlying FP operation: how these principles are used to build force transducers, such as strain gauges and piezoelectric transducers; the two categories of FPs, three- and six-component; signal acquisition (hardware structure) and signal calibration; and, briefly, the use of FPs in HMA for balance or gait analysis.
Chapter 2 describes inverse dynamics, the most common method in HMA, which uses the signals measured by an FP to estimate kinetic quantities such as joint forces and moments. These variables cannot be measured directly without highly invasive techniques, so they can only be estimated through indirect methods such as inverse dynamics. The chapter closes with a brief description of the sources of error in gait analysis.
Chapter 3 reviews the state of the art in FP calibration. The selected literature is grouped into sections describing: systems for the periodic control of FP accuracy; systems for error reduction in FP signals; and systems and procedures for constructing an FP. A calibration system previously designed by our group, based on the theoretical method proposed by ?, is described in detail: it was the starting point for the new system presented in this thesis.
Chapter 4 describes the new system in its three parts: 1) the algorithm; 2) the device; and 3) the calibration procedure required to perform the calibration process correctly. The characteristics of the algorithm were optimized through a simulation approach, and the results are presented, together with the different versions of the device.
Chapter 5 reports the experimental validation of the new system on four commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of the FPs in measuring the center of pressure of an applied force. The new system can estimate local and global calibration matrices, through which the non-linearity of the FPs was quantified and locally compensated. A non-linear calibration is also proposed, which compensates the non-linear effect in FP behaviour due to the bending of its upper plate. The experimental results are presented.
Chapter 6 examines the influence of FP calibration on the estimation of kinetic quantities with the inverse dynamics approach.
Chapter 7 presents the conclusions of the thesis: the need for a calibration of FPs and the consequent enhancement of kinetic data quality.
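As a minimal numerical sketch of the two steps this thesis revolves around (with made-up values, not the thesis data): applying a 6x6 calibration matrix to the raw force-platform outputs to recover the wrench, then computing the center of pressure from the calibrated signals, with the reference frame taken on the plate's top surface.

```python
# Calibration and centre-of-pressure computation on illustrative values.
import numpy as np

C = np.eye(6)                        # ideal (identity) calibration matrix
raw = np.array([10.0, 5.0, 700.0, 35.0, -70.0, 1.0])   # raw FP outputs

f = C @ raw                          # calibrated wrench
Fx, Fy, Fz, Mx, My, Mz = f

# COP with the reference frame on the plate's top surface:
cop_x = -My / Fz
cop_y = Mx / Fz
print(f"COP = ({cop_x * 1000:.1f} mm, {cop_y * 1000:.1f} mm)")
```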
Appendix: Calibration of the LC used in the presented system. Different calibration set-ups of a 3D force transducer are presented, and the optimal set-up is proposed, with particular attention to the compensation of non-linearities. The optimal set-up is verified by experimental results.
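A compact sketch of the least-squares identification underlying such a calibration (synthetic data, not the appendix's experimental set-up): given known applied loads F and the corresponding raw outputs V, estimate the calibration matrix C minimizing ||CV − F|| via the pseudo-inverse.

```python
# Least-squares calibration-matrix identification on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
C_true = np.eye(3) + 0.05 * rng.standard_normal((3, 3))   # 3D transducer

V = rng.standard_normal((3, 50))                          # 50 raw samples
F = C_true @ V + 0.001 * rng.standard_normal((3, 50))     # applied loads

# C = F V^+ minimizes ||C V - F|| in the least-squares sense.
C_est = F @ np.linalg.pinv(V)
print(np.round(C_est - C_true, 4))                        # near-zero residual
```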