Abstract:
Two chromatographic methods, gas chromatography with flame ionization detection (GC–FID) and liquid chromatography with ultraviolet detection (LC–UV), were used to determine furfuryl alcohol in several kinds of foundry resins, after application of an optimised extraction procedure. The GC method performed reliably regardless of resin type. Analysis by LC was suitable only for furanic resins: interferences in the phenolic resins prevented proper quantification by LC. Both methods gave accurate and precise results. Recoveries were >94%; relative standard deviations were ≤7% and ≤0.3% for the GC and LC methods, respectively. Agreement between the two methods was good (relative deviations ≤3%).
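The accuracy and precision figures quoted above (recovery, relative standard deviation) follow from standard formulas. A minimal sketch of how such figures are computed from replicate measurements of a spiked sample; all numbers below are illustrative, not data from the study:

```python
import statistics

def recovery_percent(measured: float, spiked: float) -> float:
    """Recovery: measured amount as a percentage of the known spiked amount."""
    return 100.0 * measured / spiked

def rsd_percent(replicates: list) -> float:
    """Relative standard deviation: sample std dev as a percentage of the mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative replicate measurements (mg/kg) of a sample spiked at 100 mg/kg
replicates = [96.1, 97.4, 95.8, 96.9]
mean = statistics.mean(replicates)
print(f"recovery = {recovery_percent(mean, 100.0):.1f}%")
print(f"RSD      = {rsd_percent(replicates):.2f}%")
```

With these toy values the recovery exceeds 94% and the RSD stays well under 7%, matching the acceptance thresholds the abstract reports.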
Abstract:
Phenol is a toxic compound present in a wide variety of foundry resins. Its quantification is important for characterizing the resins as well as for evaluating the free contaminants present in foundry wastes. Two chromatographic methods, liquid chromatography with ultraviolet detection (LC-UV) and gas chromatography with flame ionization detection (GC-FID), were developed for the analysis of free phenol in several foundry resins after a simple extraction procedure (30 min). Both chromatographic methods were suitable for the determination of phenol in the studied furanic and phenolic resins, showing good selectivity, accuracy (recovery 99–100%; relative deviations <5%), and precision (coefficients of variation <6%). The ASTM reference method used was found to be useful only for the analysis of phenolic resins, while the LC and GC methods were applicable to all the studied resins. The developed methods reduce the analysis time from 3.5 hours to about 30 min and can readily be used in routine quality control laboratories.
Abstract:
When exploring a virtual environment, realism depends mainly on two factors: realistic images and real-time feedback (motion, behaviour, etc.). In this context, the photorealism and physical validity of computer-generated images required by emerging applications, such as advanced e-commerce, still pose major challenges in rendering research, while the complexity of lighting phenomena demands powerful and predictable computing if time constraints are to be met. In this technical report we survey the state of the art in rendering, focusing on approaches, techniques and technologies that might enable real-time interactive web-based client-server rendering systems. The focus is on the end systems, not on the networking technologies used to interconnect client(s) and server(s).
Abstract:
Submitted for the degree of Doctor at the Universidade de Vigo, with international mention. Departamento de Informática.
Abstract:
Wind resource evaluation at two sites in Portugal was performed using the mesoscale modelling system Weather Research and Forecasting (WRF) and the microscale model commonly used within the wind power industry, the Wind Atlas Analysis and Application Program (WAsP). Wind measurement campaigns were conducted at the selected sites, allowing a comparison between in situ measurements and simulated wind, in terms of flow characteristics and energy yield estimates. Three methodologies were tested, aiming to provide an overview of their benefits and limitations for wind resource estimation. In the first methodology, the mesoscale model acts as a "virtual" wind measuring station: wind data computed by WRF for both sites was inserted directly as input to WAsP. In the second approach, the same procedure was followed, but the terrain influences induced by the mesoscale model's low-resolution terrain data were removed from the simulated wind data. In the third methodology, the simulated wind data was extracted at the top of the planetary boundary layer for both sites, to assess whether the use of geostrophic winds (which, by definition, are not influenced by the local terrain) brings any improvement in the models' performance. The results of these methodologies were compared with those from in situ measurements, in terms of mean wind speed, Weibull probability density function parameters and production estimates, considering the installation of one wind turbine at each site. Results showed that the second approach produces values closest to the measured ones, and fairly acceptable deviations in estimated annual production were found using this coupling technique. However, mesoscale output should not be used directly in wind farm siting projects, mainly due to the poor resolution of the mesoscale model terrain data.
Instead, the use of mesoscale output in microscale models should be seen as a valid alternative to in situ data, mainly for preliminary wind resource assessments, although mesoscale-microscale coupling in areas with complex topography should be applied with extreme caution.
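The link between Weibull parameters and production estimates mentioned above can be made concrete: the mean wind speed follows directly from the scale and shape parameters, and a crude annual yield is obtained by weighting a turbine power curve with the Weibull density. A minimal sketch; the site parameters and the toy power curve are hypothetical, not values from the study:

```python
import math

def weibull_mean_speed(c: float, k: float) -> float:
    """Mean of a Weibull distribution: v_mean = c * Gamma(1 + 1/k)."""
    return c * math.gamma(1.0 + 1.0 / k)

def weibull_pdf(v: float, c: float, k: float) -> float:
    """Weibull probability density for wind speed v (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def annual_energy_mwh(power_curve, c: float, k: float, dv: float = 0.5) -> float:
    """Crude annual yield: integrate P(v) * pdf(v) over speed bins, times 8760 h."""
    energy_kw, v = 0.0, 0.0
    while v < 30.0:
        energy_kw += power_curve(v) * weibull_pdf(v, c, k) * dv
        v += dv
    return energy_kw * 8760.0 / 1000.0  # kW -> MWh per year

def toy_power(v: float) -> float:
    """Hypothetical 2 MW turbine: cut-in 3 m/s, rated at 12 m/s, cut-out 25 m/s."""
    if v < 3.0 or v > 25.0:
        return 0.0
    return 2000.0 if v >= 12.0 else 2000.0 * ((v - 3.0) / 9.0) ** 3

c, k = 8.0, 2.0  # hypothetical scale (m/s) and shape parameters
print(f"mean wind speed = {weibull_mean_speed(c, k):.2f} m/s")
print(f"estimated yield = {annual_energy_mwh(toy_power, c, k):.0f} MWh/year")
```

For k = 2 (a Rayleigh-like distribution) and c = 8 m/s this gives a mean speed of about 7.09 m/s; the bin-sum integral is only a first-order estimate, which is consistent with the caution the abstract urges for preliminary assessments.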
Abstract:
The rapid increase in the use of microprocessor-based systems in critical areas, where failures imply risks to human lives, the environment or expensive equipment, has significantly increased the need for dependable systems, able to detect, tolerate and eventually correct faults. The verification and validation of such systems is frequently performed via fault injection, using various forms and techniques. However, as electronic devices get smaller and more complex, controllability and observability issues, and sometimes real-time constraints, make it harder to apply most conventional fault injection techniques. This paper proposes a fault injection environment and a scalable methodology to assist the execution of real-time fault injection campaigns, providing enhanced performance and capabilities. The proposed solutions are based on common and customized on-chip debug (OCD) mechanisms, present in many modern electronic devices, with the main objective of enabling the insertion of faults into microprocessor memory elements with minimum delay and intrusiveness. Different configurations were implemented, ranging from basic commercial off-the-shelf (COTS) microprocessors equipped with real-time OCD infrastructures to improved solutions based on modified interfaces and dedicated OCD circuitry that enhance fault injection capabilities and performance. All methodologies and configurations were evaluated and compared with respect to performance gain and silicon overhead.
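The elementary operation in such campaigns, flipping a bit in a memory element at a chosen moment, reduces to an XOR with a single-bit mask. A software-simulated sketch of that operation (the paper performs it through on-chip debug hardware, which this toy model does not capture):

```python
def inject_bit_flip(word: int, bit: int, width: int = 32) -> int:
    """Simulate a single-event upset: flip one bit of a memory word via XOR."""
    if not 0 <= bit < width:
        raise ValueError(f"bit index must be in [0, {width})")
    return word ^ (1 << bit)

original = 0b1010_0000
faulty = inject_bit_flip(original, 3)       # flip bit 3
print(f"{original:#010b} -> {faulty:#010b}")
# XOR is its own inverse: injecting the same fault again restores the word
assert inject_bit_flip(faulty, 3) == original
```

The self-inverse property is what lets a campaign controller remove an injected fault after observing its effect, a prerequisite for running many injections on the same target state.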
Abstract:
Engineering education practices have evolved not only due to natural changes in curricula contents and skills but also, more recently, due to the requirements imposed by the Bologna revision process. In addition, industry is becoming more demanding, as society grows ever more aware of the global needs and consequences of industrial practices. In this context, higher education needs not only to follow but also to lead these trends. Therefore, the School of Engineering of the Polytechnic Institute of Porto (ISEP), a Global Reporting Initiative (GRI) training partner in Portugal, prepared and presented its Sustainability Action Plan (PASUS), with the main objective of educating a new generation of engineers with Sustainable Development at the core of their graduation and MSc degrees. In this paper, the main strategies and activities of the plan, along with the strategic approach that guided its development and implementation, are presented in detail. Additionally, a reflection on the above-mentioned bridge between concept and application is established and justified within the framework of the action plan. Although in most situations there was no prior discussion or specific request, many of the graduation and post-graduation programmes offered by ISEP already include courses aligned with the PASUS philosophy. As a consequence, the number of Master theses, graduation projects and R&D projects that address sustainability problems has grown substantially, proof that for the ISEP community, sustainability really matters!
Abstract:
Recent studies of mobile Web trends show the continued explosion of mobile-friendly content. However, the large number and heterogeneity of mobile devices pose several challenges for Web programmers, who want automatic detection of context and adaptation of content to mobile devices. Hence, the device detection phase assumes an important role in this process. In this chapter, the authors compare the most widely used approaches to mobile device detection. Based on this study, they present an architecture for detecting and delivering uniform m-Learning content to students in a higher school. The authors focus mainly on the XML device capabilities repository and on the REST API Web Service for dealing with device data. For the former, the authors detail the respective capabilities schema and present a new caching approach. For the latter, they present an extension of the current API. Finally, the authors validate their approach with the overall data and statistics collected through the Google Analytics service, in order to better understand adherence to the mobile Web interface, its evolution over time, and its main weaknesses.
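A capability repository with a caching layer, as described above, can be approximated by a memoised lookup keyed on the User-Agent string. A minimal sketch; the device names, capability fields and default record below are hypothetical placeholders, not the authors' actual XML schema or API:

```python
from functools import lru_cache

# Hypothetical capability records, standing in for the XML repository
_REPOSITORY = {
    "ExamplePhone/1.0": {"screen_width": 360, "supports_js": True},
    "ExampleTablet/2.0": {"screen_width": 768, "supports_js": True},
}
_DEFAULT = {"screen_width": 320, "supports_js": False}  # fallback device profile

@lru_cache(maxsize=1024)
def device_capabilities(user_agent: str) -> tuple:
    """Resolve a User-Agent to device capabilities, caching repeated lookups."""
    caps = _REPOSITORY.get(user_agent, _DEFAULT)
    return tuple(sorted(caps.items()))  # tuples are hashable and cache-friendly

caps = dict(device_capabilities("ExamplePhone/1.0"))
print(caps["screen_width"])
```

Because the same handful of User-Agent strings dominates real traffic, even a small LRU cache avoids re-parsing the repository on almost every request, which is the rationale behind the caching approach the chapter describes.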
Abstract:
Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties and to generate new hypotheses for future experimentation. A good model and analysis of soil property variation, allowing us to draw sound conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key factors such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope and solar exposure. Obtaining good quality data from forest soils is predictably expensive, as it is labor intensive and demands substantial manpower and equipment, both in field work and in laboratory analysis. Moreover, the sampling scheme to be used in a forest-field data collection procedure is not simple to design, as the chosen sampling strategies depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collection depth, if no soil is found at all, or if large trees bar the soil collection. Consequently, the proficient design of a soil sampling campaign in a forest field is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted to study the spatial variation of some soil physical-chemical properties.
Two different sampling protocols were considered for monitoring two types of forest soil located in NW Portugal: umbric regosol and lithosol. Two different tools for sample collection were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results led us to conclude that monitoring forest soil for mathematical and statistical investigation requires a data collection procedure compatible with established protocols, but a pre-defined grid assumption often fails when the variability of the soil property is not uniform in space. In that case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into account in the mathematical procedure.
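Estimating a spatially correlated soil property at an unsampled location, as mentioned above, is often first attempted with inverse distance weighting before moving to geostatistical methods such as kriging. A minimal sketch with made-up sample points; the coordinates and pH values are purely illustrative:

```python
import math

def idw_estimate(samples, target, power: float = 2.0) -> float:
    """Inverse distance weighting: weighted mean of sampled values,
    with weights proportional to 1 / distance**power to the target."""
    num = den = 0.0
    for (x, y), value in samples:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return value  # target coincides with a sample point
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Made-up soil pH measurements at four grid locations (coordinates in metres)
samples = [((0, 0), 5.2), ((10, 0), 5.8), ((0, 10), 5.5), ((10, 10), 6.1)]
print(f"estimated pH at (5, 5): {idw_estimate(samples, (5, 5)):.2f}")
```

At the grid centre all four samples are equidistant, so the estimate collapses to their plain mean (5.65); closer to any one sample, that sample dominates. Unlike kriging, IDW ignores the spatial correlation structure, which is exactly why a non-uniform variability pattern, as the abstract notes, calls for adapting the sampling grid.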
Abstract:
Contemporary civilization, by its very nature, is highly demanding in everything concerning the comfort of buildings, whether for work or housing, and the need to save and rationalize the use of energy. Building thermal performance therefore assumes increased importance in professional practice and in teaching. To drive the improvement of building envelope solutions at this level, the work carried out here focused on the study of how infrared thermography works and on the importance of its use in the thermal inspection of buildings. Discovered at the beginning of the 19th century, with the first operational systems developed from the First World War onwards to determine surface temperature heterogeneities, this non-destructive technique makes it possible to identify anomalies that are not visible to the naked eye. By analysing these temperature variations it is possible to understand the problems and locate irregularities. This work is substantially based on the study of buildings. The analysis carried out aimed to perform thermographic-visual inspections with two approaches. On the one hand, to evaluate classrooms belonging to secondary schools, both rehabilitated and non-rehabilitated, all built between the 1960s and the 1990s, in order to diagnose construction pathologies using thermography. On the other hand, to analyse residential buildings, with the intention of evaluating the need for equipment complementary to thermographic inspections: the blower door system. The inspections were governed by the guidelines of the European standard EN 13187. Thermography is an important technique for in situ tests that require rapid execution, with the added advantage of providing results in real time, thus allowing a first analysis of the readings on site.
The thermographic inspection complemented by the blower door system also revealed the importance of auxiliary means in certain cases. The combination of these different techniques reduces the subjectivity of in situ analysis and increases the reliability of the diagnosis.
Abstract:
More than ever, the number of decision support methods and computer-aided diagnostic systems applied to various areas of medicine is increasing. In breast cancer research, much work has been done to reduce false positives when such systems are used as a double-reading method. In this study, we present a set of data mining techniques applied to building a decision support system for breast cancer diagnosis. The method is geared to assist clinical practice in identifying mammographic findings such as microcalcifications, masses and even normal tissue, in order to avoid misdiagnosis. A reliable database was used, with 410 images from about 115 patients, containing previous reviews performed by radiologists covering microcalcification, mass and normal tissue findings. Two feature extraction techniques were used: the gray level co-occurrence matrix and the gray level run length matrix. For classification, we considered various scenarios according to distinct patterns of lesions, and several classifiers, in order to determine the best performance in each case. The classifiers used were Naïve Bayes, Support Vector Machines, k-Nearest Neighbors and Decision Trees (J48 and Random Forests). The results in distinguishing mammographic findings revealed high positive predictive values (PPV) and very good accuracy. Related results on the classification of breast density and the BI-RADS® scale are also presented. The best predictive method for all tested groups was the Random Forest classifier, and the best performance was achieved in distinguishing microcalcifications. The conclusions based on the several tested scenarios represent a new perspective on breast cancer diagnosis using data mining techniques.
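The first feature-extraction step mentioned, the gray level co-occurrence matrix, simply counts how often each pair of gray levels occurs at a fixed pixel offset; texture features such as contrast and homogeneity are then derived from these counts. A small pure-Python sketch for a horizontal offset of one pixel (the 4x4 image below is made up; real mammograms would be quantised to many more gray levels):

```python
def glcm(image, levels, offset=(0, 1)):
    """Gray level co-occurrence matrix: counts of gray-level pairs (i, j)
    separated by the given (row, col) offset."""
    rows, cols = len(image), len(image[0])
    dr, dc = offset
    m = [[0] * levels for _ in range(levels)]
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1
    return m

# Toy 4x4 image with 3 gray levels
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 2, 2],
]
for row in glcm(image, levels=3):
    print(row)
```

Large diagonal entries mean neighbouring pixels tend to share the same gray level (smooth texture), while large off-diagonal entries indicate abrupt transitions; it is these statistics that let the classifiers listed above separate microcalcifications from masses and normal tissue.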