958 results for Engineering -- Data processing
Abstract:
This paper discusses the issues that arise in multi-organisational collaborative groups (MOCGs) in the public sector and how a technology-based group support system (GSS) could assist individuals within these groups. MOCGs are commonly used in the public sector to find solutions to multifaceted social problems. Finding solutions for such problems is difficult because their scope lies outside the boundary of any single government agency. The standard approach to solving such problems is collaborative, involving a diverse range of stakeholders. Collaborative working can be advantageous, but it also introduces its own pressures. Conflicts can arise from the multiple contexts and goals of group members and of the organisations they represent. Trust, communication, and a shared interface are crucial to making any significant progress. A GSS could support these elements.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
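As an illustration of the REST-style interaction described above, the sketch below shows a client submitting a model run as a job resource, then polling it and fetching output while the run is in progress. The abstract does not give G-Rex's actual endpoints, so the base URL, resource names, and state strings here are hypothetical, not G-Rex's real API.

```python
# Hypothetical sketch of a REST job-submission pattern: endpoint names
# and job states are illustrative, not G-Rex's actual interface.

BASE = "http://example.org/grex"  # hypothetical service root

def job_url(service: str, job_id: str, resource: str = "") -> str:
    """Build a REST-style resource URL for a remote model run."""
    url = f"{BASE}/{service}/jobs/{job_id}"
    return f"{url}/{resource}" if resource else url

def poll_plan(states):
    """Action a client would take for each observed job state:
    stream new output mid-run, fetch the remainder when finished."""
    actions = []
    for s in states:
        if s == "running":
            actions.append("fetch-new-output")
        elif s == "finished":
            actions.append("fetch-final-output")
            break
    return actions

print(job_url("climate-model", "42", "output"))
print(poll_plan(["running", "running", "finished"]))
```

Transferring output while the run is in progress, as in `poll_plan`, is what keeps data from accumulating on the remote system.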
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
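The open international standards mentioned above include the OGC Web Map Service (WMS). As a sketch of how a client requests a rendered view of a four-dimensional dataset, the snippet below builds a standard WMS 1.3.0 GetMap URL; the parameter names come from the WMS specification, while the base URL and layer name are made up for illustration.

```python
from urllib.parse import urlencode

def wms_getmap(base, layer, bbox, time=None, elevation=None,
               width=512, height=512, fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL using the
    standard parameters; TIME and ELEVATION address the extra
    dimensions of 4-D environmental data."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "",
        "CRS": "CRS:84",
        "BBOX": ",".join(map(str, bbox)),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    if time is not None:
        params["TIME"] = time
    if elevation is not None:
        params["ELEVATION"] = elevation
    return base + "?" + urlencode(params)

url = wms_getmap("http://example.org/wms", "sea_temp",
                 (-180, -90, 180, 90), time="2008-01-01T00:00:00Z")
print(url)
```

Because the request is a plain URL, any standards-compliant client can fetch the same view, which is what enables the interoperability with third-party systems noted above.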
Abstract:
Multiple versions of information, and the problems they cause, are well documented in both academic research and industry best practice. Many solutions have proposed a single version of the truth, with Business Intelligence (BI) being adopted by many organizations. BI, however, is largely based on the collection of data and the processing and presentation of information to meet different stakeholders' requirements. This paper reviews Enterprise Intelligence, which promises to support decision-making based on a defined strategic understanding of the organization's goals and a unified version of the truth.
Abstract:
In recent years, there has been increasing interest in the adoption of emerging ubiquitous sensor network (USN) technologies for instrumentation within a variety of sustainability systems. USN is emerging as a sensing paradigm that the sustainability management field is newly considering as an alternative to traditional tethered monitoring systems. Researchers have found that USN is an exciting technology that should not be viewed simply as a substitute for tethered monitoring. In this study, we investigate how a movement monitoring measurement system for a complex building was developed as a research environment for USN and related decision-supportive technologies. To address the apparent danger of building movement, agent-mediated communication concepts were designed to autonomously manage large volumes of exchanged information. We additionally detail the design of the proposed system, including its principles, data processing algorithms, system architecture, and user interface specifics. Results of the tests and a case study demonstrate the effectiveness of the USN-based data acquisition system for real-time monitoring of movement operations.
Abstract:
Environment monitoring applications using Wireless Sensor Networks (WSNs) have received a lot of attention in recent years. In much of this research, tasks such as sensor data processing, decision making about environment states and events, and emergency message sending are performed by a remote server. This paper presents simulation results for a proposed cross-layer protocol for two different applications in which the reliability of delivered data, delay, and network lifetime need to be considered. A WSN designed for the proposed applications needs efficient MAC and routing protocols to guarantee the reliability of the data delivered from source nodes to the sink. A cross-layer protocol based on the design given in [1] has been extended and simulated for the proposed applications, with new features such as route discovery algorithms added. Simulation results show that the proposed cross-layer protocol can conserve node energy and provide the required performance in terms of network lifetime, delay, and reliability.
Abstract:
The deterpenation of bergamot essential oil can be performed by liquid-liquid extraction using hydrous ethanol as the solvent. A ternary mixture composed of 1-methyl-4-prop-1-en-2-yl-cyclohexene (limonene), 3,7-dimethylocta-1,6-dien-3-yl-acetate (linalyl acetate), and 3,7-dimethylocta-1,6-dien-3-ol (linalool), three major compounds commonly found in bergamot oil, was used to simulate this essential oil. Liquid-liquid equilibrium data were experimentally determined for systems containing essential oil compounds, ethanol, and water at 298.2 K and are reported in this paper. The experimental data were correlated using the NRTL and UNIQUAC models, and the mean deviations between calculated and experimental data were lower than 0.0062 in all systems, indicating the good descriptive quality of the molecular models. To verify the effect of the water mass fraction in the solvent and of the linalool mass fraction in the terpene phase on the distribution coefficients of the essential oil compounds, nonlinear regression analyses were performed, yielding mathematical models with correlation coefficients higher than 0.99. The results show that as the water content in the solvent phase increased, the distribution coefficient decreased, regardless of the type of compound studied. Conversely, as the linalool content increased, the distribution coefficients of the hydrocarbon terpene and the ester also increased. However, the linalool distribution coefficient was negatively affected when the terpene alcohol content increased in the terpene phase.
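The distribution coefficients discussed above are mass-fraction ratios between the two liquid phases, and solvent selectivity is the ratio of two such coefficients. A minimal sketch, using made-up (non-experimental) compositions purely for illustration:

```python
def distribution_coefficient(w_extract: float, w_raffinate: float) -> float:
    """k_i: mass fraction of compound i in the solvent (extract) phase
    divided by its mass fraction in the terpene (raffinate) phase."""
    return w_extract / w_raffinate

def selectivity(k_solute: float, k_diluent: float) -> float:
    """Solvent selectivity: ratio of two distribution coefficients."""
    return k_solute / k_diluent

# Illustrative mass fractions (not experimental data):
k_linalool = distribution_coefficient(0.12, 0.08)   # oxygenated compound
k_limonene = distribution_coefficient(0.02, 0.85)   # hydrocarbon terpene
print(k_linalool, selectivity(k_linalool, k_limonene))
```

A selectivity well above 1 is what makes hydrous ethanol useful for deterpenation: the oxygenated fragrance compounds partition into the solvent phase while the hydrocarbon terpene stays behind.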
Abstract:
Liquid-liquid equilibrium experimental data for refined sunflower seed oil, artificially acidified with commercial oleic acid or commercial linoleic acid, and a solvent (ethanol + water) were determined at 298.2 K. This set of experimental data, together with the experimental data from Cuevas et al.,(1) obtained from (283.2 to 333.2) K for systems containing degummed sunflower seed oil, was correlated using the NRTL and UNIQUAC models with temperature-dependent binary parameters. The deviations between experimental and calculated compositions presented average values of (1.13 and 1.41) % for the NRTL and UNIQUAC equations, respectively, indicating that the models were able to correctly describe the behavior of the compounds under different temperatures and degrees of solvent hydration.
Abstract:
The present paper reports phase equilibrium experimental data for two systems composed of peanut oil or avocado seed oil + commercial oleic acid + ethanol + water at 298.2 K and different water contents in the solvent. The addition of water to the solvent reduces the loss of neutral oil to the alcoholic phase and improves the solvent selectivity. The experimental data were correlated by the NRTL and UNIQUAC models. The global deviations between calculated and experimental values were 0.63 % and 1.08 %, respectively, for the systems containing avocado seed oil. For the systems containing peanut oil, those deviations were 0.65 % and 0.98 %, respectively. These results indicate that both models were able to correctly reproduce the experimental data, although the NRTL model presented a better performance.
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion, and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis. The score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis.
The first method was based on the Digital Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment, henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity; its score is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between HE subjects and the three patient groups (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83), and SDDV (0.55). CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD.
APEN was correlated with neither of the other two methods, indicating that it measures a different construct of upper limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
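The APEN measure described in the METHODS section follows Pincus's Approximate Entropy statistic. The sketch below implements that statistic with the tolerance set to a fraction of the series' standard deviation, mirroring the r = 0.2*SD setting quoted above (the study used window size m = 4 on spiral timing data; the short synthetic series here use m = 2 only so the example stays small):

```python
import math
import random

def approximate_entropy(u, m=4, r_frac=0.2):
    """Approximate Entropy: irregularity of a time series.
    m is the template (window) length; the tolerance r is
    r_frac times the series' standard deviation."""
    n = len(u)
    mean = sum(u) / n
    sd = (sum((x - mean) ** 2 for x in u) / n) ** 0.5
    r = r_frac * sd

    def phi(m):
        templates = [u[i:i + m] for i in range(n - m + 1)]
        counts = []
        for a in templates:
            c = sum(1 for b in templates
                    if max(abs(x - y) for x, y in zip(a, b)) <= r)
            counts.append(c / len(templates))
        return sum(math.log(c) for c in counts) / len(counts)

    return phi(m) - phi(m + 1)

# A regular (predictable) series should score lower than an
# irregular one:
regular = [i % 2 for i in range(60)]
random.seed(0)
irregular = [random.random() for _ in range(60)]
print(approximate_entropy(regular, m=2),
      approximate_entropy(irregular, m=2))
```

In the study the resulting score was additionally normalized by total drawing completion time before analysis; that step is omitted here.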
Abstract:
This work aims to determine a better methodology for predicting operational parameters for a new mixer-settler design for treating wastewater produced by the petroleum industry, called MDIF (Misturador-Decantador à Inversão de Fases / Mixer-Settler based on Phase Inversion, MSPI). The data for this research were obtained from the wastewater treatment unit, called MSPI-TU, installed at a wastewater treatment plant (WTP) of PETROBRAS/UO-RNCE. Determining the better methodology to predict the separation and extraction efficiency of the equipment contributes significantly to determining the optimum operating variables for the control of the unit. The study was based on a comparison among the experimental efficiency (E) obtained by operating the MSPI-TU, the efficiency obtained by the experimental design equation (Eplan) from the software Statistica Experimental Design® (version 7.0), and the efficiency obtained from a modeling equation based on a dimensional analysis (Ecalc). The results show that the experimental design equation gives a good prediction of the unit efficiencies with better data reliability, regarding the condition before a run. The average deviation between the statistical planning model equation and the experimental data was 0.13%. On the other hand, the efficiency calculated by the equation representing the dimensional analysis may show large relative deviations (up to 70%). Thus, the experimental design is confirmed as a reliable tool with regard to the experimental data processing of the MSPI-TU.
Abstract:
This work presents software developed to process solar radiation data. This software can be used in meteorological and climatic stations, and also as support for solar radiation measurements in research on solar energy availability, allowing data quality control, statistical calculations, and validation of models, as well as easy interchange of data. (C) 1999 Elsevier B.V. Ltd. All rights reserved.
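As a sketch of the kind of data quality control such software performs, the snippet below applies a simple range test that flags physically implausible global irradiance samples. The thresholds are illustrative assumptions (zero as a lower bound and the ~1367 W/m^2 solar constant as a crude upper bound), not the checks used by the software described above.

```python
def qc_flags(samples, lower=0.0, upper=1367.0):
    """Flag global irradiance samples (W/m^2) outside physically
    plausible limits. 1367 W/m^2 is the solar constant, used here
    as a crude upper bound; real QC would use a site- and
    time-dependent clear-sky limit."""
    return ["ok" if lower <= s <= upper else "bad" for s in samples]

print(qc_flags([850.0, -3.2, 1500.0, 0.0]))
```

Flagging rather than deleting suspect samples preserves the raw record, which matters when the same data are later used for model validation and interchange.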
Abstract:
The grinding process is usually the last finishing process of a precision component in the manufacturing industries. The process is used to manufacture parts of different materials, so it demands results such as low roughness, control of dimensional and shape errors, and optimum tool life, with minimum cost and time. Damage to the parts is very expensive, since the previous processes and the grinding itself are wasted when the part is damaged at this stage. This work aims to investigate the efficiency of digital signal processing tools applied to acoustic emission signals in order to detect thermal damage in the grinding process. To accomplish this goal, an experimental study was carried out over 15 runs in a surface grinding machine operating with an aluminum oxide grinding wheel and ABNT 1045 and VC131 steels. The acoustic emission signals were acquired from a fixed sensor placed on the workpiece holder. A high sampling rate acquisition system at 2.5 MHz was used to collect the raw acoustic emission signal instead of the root mean square value usually employed. In each test, the AE data were analyzed off-line, with the results compared to an inspection of each workpiece for burn and other metallurgical anomalies. A number of statistical signal processing tools have been evaluated.
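The abstract contrasts the raw high-rate AE signal with the root-mean-square value usually recorded. As a minimal sketch of that baseline statistic, the snippet below computes the RMS of a synthetic record, overall and per block, the per-block form being a simple way to track AE energy over a grinding pass (the signal values are made up):

```python
import math

def rms(signal):
    """Root-mean-square of an AE record: sqrt(mean of squares)."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def block_rms(signal, block):
    """RMS computed over consecutive blocks of the record."""
    return [rms(signal[i:i + block]) for i in range(0, len(signal), block)]

sig = [0.0, 3.0, -4.0, 0.0, 1.0, -1.0, 1.0, -1.0]  # synthetic AE samples
print(rms(sig))
print(block_rms(sig, 4))
```

Working from the raw 2.5 MHz record, as the study does, allows statistics beyond the RMS (e.g. higher-order moments and spectral measures) that an RMS-only acquisition discards.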
Abstract:
This paper addresses the problem of processing biological data such as cardiac beats, audio, and ultrasonic-range signals, calculating wavelet coefficients in real time with the processor clock running at the frequencies of present ASICs and FPGAs. The Parallel Filter Architecture for the DWT has been improved, calculating wavelet coefficients in real time with hardware reduced to 60%. The new architecture, which also processes the IDWT, is implemented with Radix-2 or Booth-Wallace constant multipliers. Including series memory register banks, an integrated-circuit signal analyzer for the ultrasonic range is presented.
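The architecture above computes the DWT and its inverse in hardware. As a software reference for what that transform pair does, the sketch below implements one level of the orthonormal Haar DWT/IDWT; the Haar wavelet is chosen here for brevity and is not necessarily the filter realized by the paper's multipliers.

```python
import math

S = math.sqrt(2.0)

def haar_dwt(x):
    """One level of the orthonormal Haar DWT: pairwise sums
    (approximation) and differences (detail), scaled by 1/sqrt(2).
    Assumes len(x) is even."""
    approx = [(a + b) / S for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) / S for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: perfectly reconstructs the input."""
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / S, (a - d) / S]
    return out

x = [4.0, 2.0, 5.0, 5.0]
a, d = haar_dwt(x)
y = haar_idwt(a, d)
print(a, d, y)
```

Exact reconstruction by the IDWT is the property that lets an analyzer of this kind process a signal in the wavelet domain and still recover the original waveform.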