885 results for ORIENTED PYROLYTIC-GRAPHITE
Abstract:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates both to the applications' class structure and to the database structure. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as with related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure.
That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that know only the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects are not required to extend any superclass, implement an interface, or carry a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version of it at the database metadata layer. Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism can be extended, hence keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at the application and database levels.
The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out in order to validate the robustness of the prototype and the meta-model. To perform these tests, we used a small-size OO7 database, chosen for its data model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. To test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms. Using our framework, and with minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience using our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
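The class-versioning idea at the heart of this abstract can be illustrated with a small sketch. All names and data structures below are hypothetical, invented for illustration; the actual framework operates transparently through aspects at the JVM level. The sketch shows the essential mechanism: stored instances record the class-structure version they were persisted under, and converters adapt their fields in either direction, which is what makes the compatibility bidirectional.

```python
# Hypothetical illustration of class versioning with bidirectional
# instance adaptation; names and structure are NOT from the thesis.

class ClassVersionRegistry:
    """Keeps every known version of a class's field layout plus
    converter functions between adjacent versions."""
    def __init__(self):
        self.versions = {}      # version number -> tuple of field names
        self.converters = {}    # (from_version, to_version) -> function

    def register(self, version, fields):
        self.versions[version] = tuple(fields)

    def add_converter(self, src, dst, fn):
        self.converters[(src, dst)] = fn

    def adapt(self, instance_data, src, dst):
        """Adapt a stored instance (a dict of fields) from the version it
        was persisted under to the version the application expects."""
        if src == dst:
            return dict(instance_data)
        step = 1 if dst > src else -1
        data = dict(instance_data)
        for v in range(src, dst, step):
            data = self.converters[(v, v + step)](data)
        return data

# Version 1 stored a single "name" field; version 2 splits it in two.
registry = ClassVersionRegistry()
registry.register(1, ["name"])
registry.register(2, ["first_name", "last_name"])
registry.add_converter(1, 2, lambda d: {
    "first_name": d["name"].split()[0],
    "last_name": d["name"].split()[-1],
})
registry.add_converter(2, 1, lambda d: {
    "name": d["first_name"] + " " + d["last_name"],
})

old_record = {"name": "Ada Lovelace"}          # persisted by a v1 application
new_view = registry.adapt(old_record, 1, 2)    # read by a v2 application
round_trip = registry.adapt(new_view, 2, 1)    # read back by a v1 application
```

Because converters exist in both directions, a record written under either version remains readable by applications compiled against the other, which is the obliviousness property the abstract describes.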
Abstract:
Object-oriented modeling is spreading in current simulation of wastewater treatment plants through the use of the individual components of the process and their relations to define the underlying dynamic equations. In this paper, we describe the use of the free-software OpenModelica simulation environment for the object-oriented modeling of an activated sludge process under feedback control. The performance of the controlled system was analyzed both under normal conditions and in the presence of disturbances. The described object-oriented approach represents a valuable teaching tool, providing practical insight into the field of wastewater process control.
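As a rough sketch of what such a feedback-controlled simulation computes, consider a deliberately simplified single-state toy model: dissolved oxygen in one tank, raised by aeration and consumed at a constant uptake rate, with a proportional controller holding a setpoint. This is an invented stand-in for illustration only, not a port of the paper's OpenModelica model.

```python
# Toy single-tank model: dissolved oxygen (DO) rises with aeration and
# falls with a constant oxygen uptake rate; a proportional controller
# adjusts aeration to hold a DO setpoint. Illustrative only; not the
# actual activated sludge model from the paper.

def simulate(setpoint=2.0, kp=5.0, uptake=1.5, do0=0.0, dt=0.01, steps=2000):
    do = do0
    history = []
    for _ in range(steps):
        aeration = max(0.0, kp * (setpoint - do))  # proportional control law
        d_do = aeration - uptake                   # toy mass balance
        do += d_do * dt                            # explicit Euler step
        history.append(do)
    return history

trace = simulate()
final_do = trace[-1]   # settles at setpoint - uptake/kp (steady-state offset)
```

The steady-state offset (here 2.0 - 1.5/5.0 = 1.7) is the classic residual error of pure proportional control; a Modelica model would express the same mass balance declaratively and let the solver integrate it.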
Abstract:
These are the instructions for a programming assignment in the course Programming 3, taught at the University of Alicante, Spain. The objective of the assignment is to build an object-oriented version of Conway's Game of Life in Java. The assignment is divided into four sub-assignments.
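The core of any Game of Life implementation, whatever the class design, is the generation-update rule: a live cell survives with two or three live neighbours, and a dead cell with exactly three live neighbours is born. The assignment targets Java; the rule itself can be sketched compactly as follows.

```python
# Core rule of Conway's Game of Life. The assignment asks for an
# object-oriented Java version; this sketch only shows the update rule.
from collections import Counter

def step(live_cells):
    """Advance one generation. live_cells is a set of (x, y) tuples."""
    # Count how many live neighbours each candidate cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)
    }

# A "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 1), (1, 1), (2, 1)}
after_one = step(blinker)
after_two = step(after_one)   # back to the original pattern
```

Representing the board as a set of live coordinates keeps the grid unbounded; a typical object-oriented Java solution would instead encapsulate the board and cells as classes, but the rule applied per generation is the same.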
Abstract:
A method has been developed for the direct determination of Cu, Cd, Ni and Pb in aquatic humic substances (AHS) by graphite furnace atomic absorption spectrometry. AHS were isolated from water samples rich in organic matter, collected in the Brazilian Ecological Parks. All analytical curves presented good linear correlation coefficients. The limits of detection and quantification were in the ranges 2.5–16.7 µg g⁻¹ and 8.5–50.0 µg g⁻¹, respectively. The accuracy was determined using recovery tests; for all analytes, recovery percentages ranged from 93% to 98%, with a relative standard deviation of less than 4%. The results indicate that the proposed method is a suitable alternative for the direct determination of metals in AHS.
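The two figures of merit quoted above, recovery and relative standard deviation, are simple statistics over replicate measurements. A minimal sketch of how they are computed (the replicate values and spiked amount below are invented for illustration):

```python
# Recovery (%) and relative standard deviation (RSD, %) as used to
# validate an analytical method. Measurement values are invented examples.
import statistics

def recovery_percent(measured, spiked):
    """Percentage of a known spiked amount actually found."""
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    """Relative standard deviation (precision) of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

replicates = [9.4, 9.6, 9.5, 9.3]   # hypothetical replicate readings, ug/g
rec = recovery_percent(statistics.mean(replicates), spiked=10.0)  # 94.5 %
rsd = rsd_percent(replicates)       # well under the 4 % acceptance limit
```

A recovery close to 100% indicates accuracy (little analyte lost or gained by the procedure), while a low RSD indicates precision across replicates, which is why both are reported together.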
Abstract:
The production of natural extracts requires suitable processing conditions to maximize the preservation of the bioactive ingredients. Herein, a microwave-assisted extraction (MAE) process was optimized by means of response surface methodology (RSM) to maximize the recovery of phenolic acids and flavonoids and to obtain antioxidant ingredients from tomato. A 5-level full factorial Box-Behnken design was successfully implemented for MAE optimization, in which the processing time (t), temperature (T), ethanol concentration (Et) and solid/liquid ratio (S/L) were the relevant independent variables. The proposed model was validated based on the high values of the adjusted coefficient of determination and on the non-significant differences between experimental and predicted values. The global optimum processing conditions (t = 20 min; T = 180 °C; Et = 0%; S/L = 45 g/L) provided tomato extracts with high potential as nutraceuticals or as active ingredients in the design of functional foods. Additionally, the round tomato variety was highlighted as a source of added-value phenolic acids and flavonoids.
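The essence of RSM is fitting a second-order polynomial to the measured responses and reading the optimum off the fitted surface. A minimal one-variable sketch of that idea (a real Box-Behnken analysis fits all four factors plus their interactions; the data points below are invented):

```python
# Minimal response-surface idea: fit a quadratic model y = b0 + b1*x + b2*x^2
# to experimental points and locate the optimum analytically. Real RSM uses
# all factors and interaction terms; these data are invented for illustration.
import numpy as np

x = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # e.g. extraction time, min
y = np.array([5.0, 8.5, 10.0, 9.4, 6.8])      # e.g. phenolic yield, mg/g

b2, b1, b0 = np.polyfit(x, y, 2)              # least-squares quadratic fit
x_opt = -b1 / (2.0 * b2)                      # vertex of the fitted parabola
```

Model adequacy is then judged exactly as the abstract describes: by the adjusted coefficient of determination and by checking that predicted values do not differ significantly from experimental ones.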
Abstract:
2014
Abstract:
PhD in Economics.
Abstract:
In this work, results for the flexural strength and the thermal properties of interpenetrated graphite preforms infiltrated with Al-12wt%Si are discussed and compared to those for packed graphite particles. To make this comparison relevant, graphite particles of four sizes in the range 15–124 μm were obtained by grinding the graphite preform. The effects of the pressure applied to infiltrate the liquid alloy on composite properties were investigated. In spite of the largely different reinforcement volume fractions (90% by volume in the preform and around 50% in the particle compacts), most properties are similar. Only the coefficient of thermal expansion is 50% smaller in the preform composites. The thermal conductivity of the preform composites (slightly below 100 W/m K) may be increased by reducing the graphite content, alloying, or increasing the infiltration pressure. The strength of the particle composites follows the Griffith criterion if the defect size is identified with the particle diameter. On the other hand, the strength of the composites keeps increasing up to unusually high values of the infiltration pressure. This is consistent with the drainage curves measured in this work. Mg and Ti additions produce the most significant improvements in performance. Although extensive development work remains to be done, it may be concluded that both the mechanical and thermal properties make these materials suitable for the fabrication of engine pistons.
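The Griffith criterion invoked above ties fracture strength to flaw size; in its standard fracture-mechanics form (a textbook relation, not a formula taken from the paper):

\sigma_f = \frac{K_{Ic}}{Y\sqrt{\pi a}}

where \sigma_f is the fracture strength, K_{Ic} the fracture toughness, Y a dimensionless geometry factor, and a the critical defect size, here identified with the particle diameter. Strength therefore scales as 1/\sqrt{a}, which is why the finer graphite particle fractions are expected to yield stronger composites.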
Abstract:
Purpose: To evaluate the comparative efficiency of graphite furnace atomic absorption spectrometry (GFAAS) and hydride generation atomic absorption spectrometry (HGAAS) for trace analysis of arsenic (As) in natural herbal products (NHPs). Methods: Arsenic analysis in natural herbal products and a standard reference material was conducted using atomic absorption spectrometry (AAS), namely hydride generation AAS (HGAAS) and graphite furnace AAS (GFAAS). The samples were digested with HNO3–H2O2 in a ratio of 4:1 using microwave-assisted acid digestion. The methods were validated with the aid of the standard reference material 1515 Apple Leaves (SRM) from NIST. Results: Mean recovery for three different samples of NHPs using HGAAS and GFAAS ranged from 89.3% to 91.4% and from 91.7% to 93.0%, respectively. The difference between the two methods was insignificant for samples A (p = 0.5), B (p = 0.4) and C (p = 0.88). Relative standard deviation (RSD), i.e., precision, was 2.5–6.5% and 2.3–6.7% for the HGAAS and GFAAS techniques, respectively. Recovery of arsenic in the SRM was 98% and 102% by GFAAS and HGAAS, respectively. Conclusion: GFAAS demonstrates acceptable levels of precision and accuracy. Both techniques possess comparable accuracy and repeatability. Thus, the two methods are recommended as alternative approaches for trace analysis of arsenic in natural herbal products.
Abstract:
Graphene-based nanomaterials are a new class of technological materials of high interest to physicists, chemists and materials scientists. Graphene is a two-dimensional (2-D) sheet of carbon atoms in a hexagonal configuration, with atoms bonded by sp2 bonds. These bonds and this electron configuration provide the extraordinary properties of graphene, such as a very large surface area, a tunable band gap, high mechanical strength, and high elasticity and thermal conductivity [1]. Graphene has also been investigated for the preparation of composites with various semiconductors, such as TiO2, ZnO and CdS, aiming at enhanced photocatalytic activity for use in photochemical reactions such as water splitting or CO2-to-methanol conversion [2-3]. In this communication, the synthesis of porous graphene@TiO2 obtained from recycled graphite powder, supplied by ECOPIBA, is presented. This graphite was exfoliated using a nonionic surfactant (Triton X-100) and sonication. Titanium(IV) isopropoxide was used as the TiO2 source. After removing the surfactant with an HCl/n-propanol solution, a porous solid is obtained with a specific surface area of 358 m2 g-1. The solid was characterized by XRD, FTIR, XPS, EDX and TEM. Figure 1 shows the graphene 2-D layer bonded with TiO2 nanoparticles. When a water suspension of this material is exposed to UV-vis radiation, the water splitting reaction takes place and H2/O2 bubbles are observed (Figure 2).
Abstract:
With recent advances in remote sensing processing technology, it has become more feasible to begin analysis of the enormous historical archive of remotely sensed data. These historical data provide valuable information on a wide variety of topics which can influence the lives of millions of people if processed correctly and in a timely manner. One field that stands to benefit is landslide mapping and inventory. Such data provide a historical reference to those who live near high-risk areas, so that future disasters may be avoided. In order to properly map landslides remotely, an optimum method must first be determined. Historically, mapping has been attempted using pixel-based methods such as unsupervised and supervised classification. These methods are limited in that they characterize an image only spectrally, based on single-pixel values. This produces results prone to false positives, often without creating meaningful objects. Recently, several reliable methods of Object Oriented Analysis (OOA) have been developed which utilize a full range of spectral, spatial, textural, and contextual parameters to delineate regions of interest. A comparison of these two methods on a historical dataset of the landslide-affected city of San Juan La Laguna, Guatemala has demonstrated the benefits of OOA methods over unsupervised classification. Overall accuracies of 96.5% and 94.3% and F-scores of 84.3% and 77.9% were achieved for the OOA and unsupervised classification methods, respectively. The larger difference in F-score is a result of the low precision of unsupervised classification, caused by poor false-positive removal, the greatest shortcoming of this method.
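The gap between the near-identical accuracies and the clearly separated F-scores reported above follows directly from how the two metrics treat false positives. A minimal sketch of the computation (the pixel counts are invented for illustration):

```python
# Overall accuracy vs F-score: accuracy can stay high while false
# positives drag precision, and hence F-score, down. Counts are invented.

def accuracy(tp, fp, fn, tn):
    return (tp + tn) / (tp + fp + fn + tn)

def f_score(tp, fp, fn):
    precision = tp / (tp + fp)          # hurt by false positives
    recall = tp / (tp + fn)             # hurt by missed landslides
    return 2 * precision * recall / (precision + recall)

# Hypothetical pixel counts for a scene dominated by non-landslide background.
tp, fp, fn, tn = 80, 40, 10, 870

acc = accuracy(tp, fp, fn, tn)   # high: the large background class dominates
f1 = f_score(tp, fp, fn)         # lower: false positives cut precision
```

Because the true-negative background dwarfs the landslide class, accuracy barely registers the 40 false positives, while the F-score, which ignores true negatives, penalises them heavily. That is exactly why F-score separates the two classification methods more sharply than overall accuracy does.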
Abstract:
As users continually request additional functionality, software systems will continue to grow in complexity, as well as in their susceptibility to failures. Particularly for sensitive systems requiring higher levels of reliability, faulty system modules may increase development and maintenance costs. Hence, identifying them early would support the development of reliable systems through improved scheduling and quality control. Research effort to predict software modules likely to contain faults has, as a consequence, been substantial. Although a wide range of fault prediction models have been proposed, we remain far from having reliable tools that can be widely applied to real industrial systems. For projects with known fault histories, numerous research studies show that statistical models can provide reasonable estimates when predicting faulty modules using software metrics. However, as context-specific metrics differ from project to project, prediction across projects is difficult to achieve. Prediction models obtained from one project's experience are ineffective at predicting fault-prone modules when applied to other projects. Hence, taking full advantage of the existing work in the software development community has been substantially limited. As a step towards solving this problem, in this dissertation we propose a fault prediction approach that exploits existing prediction models, adapting them to improve their ability to predict faulty system modules across different software projects.
Abstract:
Purpose – The purpose of this empirical paper is to investigate internal marketing from a behavioural perspective. The impact of internal marketing behaviours, operationalised as an internal market orientation (IMO), on employees' marketing and other in-role behaviours (IRB) was examined. Design/methodology/approach – Survey data measuring IMO, market orientation and a range of constructs relevant to the nomological network in which they are embedded were collected from UK retail managers. These were tested to establish their psychometric properties, and the conceptual model was analysed using structural equation modelling, employing a partial least squares methodology. Findings – IMO has positive consequences for employees' market-oriented and other in-role behaviours. These, in turn, influence marketing success. Research limitations/implications – The paper provides empirical support for the long-held assumption that internal and external marketing are related and that organisations should balance their external focus with some attention to employees. Future research could measure the attitudes and behaviours of managers, employees and customers directly and explore the relationships between them. Practical implications – Firms must ensure that they do not put the needs of their employees second to those of managers and shareholders; managers must develop their listening skills and organisations must become more responsive to the needs of their employees. Originality/value – The paper contributes to the scarce body of empirical support for the role of internal marketing in service organisations. For researchers, this paper legitimises the study of internal marketing as a route to external market success; for managers, the study provides quantifiable evidence that focusing on employees' wants and needs impacts their behaviours towards the market. © 2010, Emerald Group Publishing Limited
Abstract:
Code patterns, including programming patterns and design patterns, are good references for programming language feature improvement and software re-engineering. However, to our knowledge, no existing research has attempted to detect code patterns based on code clone detection technology. In this study, we build upon previous work and propose to detect and analyze code patterns from a collection of open source projects using NiPAT technology. Because design patterns are most closely associated with object-oriented languages, we choose Java and Python projects to conduct our study. The tool we use for detecting patterns is NiPAT, a pattern detection tool originally developed for the TXL programming language, based on the NiCad clone detector. We extend NiPAT to the Java and Python programming languages. Then, we identify all the patterns from the pattern report and classify them into several different categories. At the end of the study, we analyze all the patterns and compare the differences between Java and Python patterns.
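Clone detectors in the NiCad family rest on one core idea: normalize code before comparison so that fragments that differ only in naming compare equal. The toy sketch below illustrates that idea with a crude identifier renaming; it is an invented illustration, not NiPAT's or NiCad's actual algorithm (which works on parsed, pretty-printed TXL grammars with configurable normalization and near-miss thresholds).

```python
# Toy illustration of normalization-based clone detection in the spirit
# of NiCad: identifiers are renamed to positional placeholders so that
# structurally identical fragments compare equal. NOT the real algorithm.
import re

KEYWORDS = {"def", "return", "if", "else", "for", "while", "in"}  # tiny subset

def normalize(code):
    """Replace every identifier with a consistent placeholder index."""
    names = {}
    def rename(match):
        word = match.group(0)
        if word in KEYWORDS:
            return word                     # keep language keywords as-is
        if word not in names:
            names[word] = "id%d" % len(names)
        return names[word]
    return re.sub(r"[A-Za-z_]\w*", rename, code)

a = "def area(w, h): return w * h"
b = "def size(x, y): return x * y"   # same structure, different names
c = "def area(w, h): return w + h"   # same names, different structure

clone = normalize(a) == normalize(b)       # True: structural clones
not_clone = normalize(a) == normalize(c)   # False: operator differs
```

A pattern detector built on top of such a clone detector then groups the clone classes and inspects their recurring shapes, which is the step NiPAT's pattern report corresponds to in this study.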