88 results for script-driven test program generation process
Abstract:
The design of supplementary damping controllers to mitigate the effects of electromechanical oscillations in power systems is a highly complex and time-consuming process, which requires a significant amount of knowledge on the part of the designer. In this study, the authors propose an automatic technique that takes the burden of tuning the controller parameters away from the power engineer and places it on the computer. Unlike other approaches that do the same based on robust control theories or evolutionary computing techniques, the proposed procedure uses an optimisation algorithm that works over a formulation of the classical tuning problem in terms of bilinear matrix inequalities. Using this formulation, it is possible to apply linear matrix inequality solvers to find a solution to the tuning problem via an iterative process, with the advantage that these solvers are widely available and have well-known convergence properties. The proposed algorithm is applied to tune the parameters of supplementary controllers for thyristor-controlled series capacitors placed in the New England/New York benchmark test system, aiming at the improvement of the damping factor of inter-area modes under several different operating conditions. The results of the linear analysis are validated by non-linear simulation and demonstrate the effectiveness of the proposed procedure.
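The iterative idea in the abstract above, that a bilinear problem becomes linear (and hence tractable by standard solvers) whenever one group of variables is frozen, can be illustrated with a toy alternating scheme. The sketch below is plain Python on an invented scalar bilinear least-squares problem; it shows the alternation principle only, not the authors' power-system formulation or the LMI machinery, and all names and values are illustrative.

```python
def alternate_bilinear(ts, ds, x=1.0, y=1.0, iters=50):
    """Minimize sum_i (x*y*t_i - d_i)^2 by alternating closed-form updates:
    with y frozen the problem is linear in x, and vice versa."""
    for _ in range(iters):
        # Fix y: least-squares update for x (a linear problem)
        x = sum(y * t * d for t, d in zip(ts, ds)) / sum((y * t) ** 2 for t in ts)
        # Fix x: least-squares update for y
        y = sum(x * t * d for t, d in zip(ts, ds)) / sum((x * t) ** 2 for t in ts)
    return x, y

# Data generated with x*y = 2, so the alternation should recover that product.
ts = [1.0, 2.0, 3.0]
ds = [2.0, 4.0, 6.0]
x, y = alternate_bilinear(ts, ds)
print(round(x * y, 6))  # 2.0
```

Only the product x*y is identifiable here, which mirrors why such alternating schemes need care about convergence; the abstract's appeal to solvers with well-known convergence properties addresses exactly that concern.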
Abstract:
There is no standardized test to assess the shear strength of the vertical interfaces of interconnected masonry walls. The approach used to evaluate this strength is normally indirect and often unreliable. The aim of this study is to propose a new test specimen to eliminate this deficiency. The main features of the proposed specimen are failure caused by shear stress on the vertical interface and a small number of units (blocks). The paper presents a numerical analysis based on the finite element method, with the purpose of showing the theoretical performance of the designed specimen in terms of its geometry, boundary conditions, and loading scheme, and describes an experimental program using the specimen built with full- and third-scale clay blocks. The main conclusions are that the proposed specimen is easy to build and is appropriate for evaluating the shear strength of the vertical interfaces of masonry walls.
Abstract:
Purpose - The purpose of this paper is to identify the key elements of a new rapid prototyping process, which involves layer-by-layer deposition of liquid-state material while using an ultraviolet line source to cure the deposited material. This paper reports studies on the behaviour of filaments, deposition accuracy, filament interaction and the functional feasibility of the system. Additionally, the author describes the proposed process, the equipment used for these studies and the material developed for this application. Design/methodology/approach - The research has been separated into three study areas in accordance with their goals. In the first, both the behaviour of the filaments and the deposition accuracy were studied. The design of the experiment is described with a focus on four response factors (bead width, filament quality, deposition accuracy and deposition continuity) as a function of three control factors (deposition height, deposition velocity and extrusion velocity). The author also studied the interaction between filaments as a function of bead centre distance. In addition, two test samples were prepared to serve as a proof of the methodology and to verify the functional feasibility of the process under study. Findings - The results show that the proposed process is functionally feasible, and that it is possible to identify the main effects of the control factors on the response factors. That analysis is used to predict the process condition as a function of its control parameters. Bead centre distances that result in specific behaviours were also identified. The types of interaction between filaments were analysed and sorted into: union, separation and indeterminate. Finally, the functional feasibility of the process was demonstrated by building two test parts.
Originality/value - This paper proposes a new rapid prototyping process and presents test studies related to this proposition. The author has focused on filament behaviour, deposition accuracy and the interaction between filaments, and has studied the functional feasibility of the process to provide new information that is also useful for the development of other rapid prototyping processes.
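The experimental design described in the abstract (three control factors driving four response factors) can be sketched as a two-level full-factorial run plan. The factor names follow the abstract, but the levels below are invented for illustration and are not the author's actual settings.

```python
from itertools import product

# Hypothetical two-level full-factorial design over the three control factors
# named in the abstract; levels are illustrative placeholders.
factors = {
    "deposition_height_mm": (0.2, 0.4),
    "deposition_velocity_mm_s": (10, 20),
    "extrusion_velocity_mm_s": (5, 10),
}

# Every combination of factor levels becomes one experimental run; responses
# (bead width, filament quality, etc.) would be measured for each run.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2^3 = 8 runs
```

Such a full factorial is the standard way to expose the main effects of each control factor on the response factors, which is exactly the analysis the Findings section reports.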
Abstract:
This paper deals with the use of simplified methods to predict methane generation in tropical landfills. Methane recovery data obtained on site as part of a research program being carried out at the Metropolitan Landfill, Salvador, Brazil, are analyzed and used to obtain field methane generation over time. Laboratory data from MSW samples of different ages are presented and discussed, and simplified procedures to estimate the methane generation potential, L(o), and the constant related to the biodegradation rate, k, are applied. The first-order decay method is used to fit field and laboratory results. It is demonstrated that, despite the assumptions and the simplicity of the adopted laboratory procedures, the values of L(o) and k obtained are very close to those measured in the field, making this kind of analysis very attractive for first-approach purposes. (C) 2008 Elsevier Ltd. All rights reserved.
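The first-order decay model mentioned in the abstract has a simple closed form: the generation rate per unit mass of waste is L(o)·k·exp(-k·t), and its integral approaches L(o) as the waste fully degrades. The sketch below uses illustrative parameter values, not the study's fitted results.

```python
import math

# First-order decay model for landfill methane (per unit mass of waste).
# L0 (m3 CH4 per Mg waste) and k (1/year) below are illustrative only.
def methane_rate(t_years, L0=70.0, k=0.2):
    """Generation rate q(t) = L0 * k * exp(-k * t)."""
    return L0 * k * math.exp(-k * t_years)

def cumulative_methane(t_years, L0=70.0, k=0.2):
    """Integral of the rate: L0 * (1 - exp(-k*t)), tending to L0."""
    return L0 * (1.0 - math.exp(-k * t_years))

print(round(methane_rate(0.0), 1))       # 14.0: the rate peaks at placement
print(round(cumulative_methane(30.0), 1))  # close to L0 after 30 years
```

Fitting L(o) and k to field or laboratory gas data, as the paper does, amounts to choosing the pair that makes these curves match the measured recovery over time.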
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. 
As a consequence of the knowledge gained with this work, many desirable improvements to the modelling software packages have been identified and are presented. A discussion on the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results obtained are very important for those involved in the development of modelling tools and systems, for requirement analysis and for providing insight into new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
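The basic modelling steps described above (combine occurrence points with environmental layers, generate a model, project it) can be made concrete with a minimal envelope ("bioclim"-style) model, one of the simplest niche-modelling algorithms. This plain-Python sketch is not the openModeller implementation, and the occurrence data and layer values are invented for illustration.

```python
# Minimal envelope niche model: a species is predicted present wherever every
# environmental variable falls inside the range observed at known occurrences.
def fit_envelope(occurrences):
    """occurrences: list of tuples of environmental values at presence points."""
    dims = range(len(occurrences[0]))
    mins = [min(v[i] for v in occurrences) for i in dims]
    maxs = [max(v[i] for v in occurrences) for i in dims]
    return mins, maxs

def predict(envelope, cell):
    mins, maxs = envelope
    return all(lo <= x <= hi for lo, x, hi in zip(mins, cell, maxs))

# Step 1: occurrence points sampled from two layers (e.g. temperature, rainfall)
occ = [(22.0, 1300.0), (24.5, 1500.0), (23.0, 1400.0)]
env = fit_envelope(occ)                # Step 2: model generation
print(predict(env, (23.5, 1450.0)))    # Step 3: projection -> True (inside)
print(predict(env, (30.0, 800.0)))     # -> False (outside the envelope)
```

Real workflows add the data conversion, algorithm selection, and model evaluation stages that the reference business process formalizes; this toy model only shows why occurrence points and raster layers are the two essential inputs.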
Abstract:
Coatings of a NiCrAlC intermetallic-based alloy were applied on AISI 316L stainless steel substrates using a high-velocity oxygen fuel torch. The influence of the spray parameters on friction and abrasive wear resistance was investigated using an instrumented rubber wheel abrasion test able to measure the friction forces. The corrosion behaviour of the coatings was studied with electrochemical techniques and compared with the corrosion resistance of the substrate material. Specimens prepared using lower O(2)/C(3)H(8) ratios showed smaller porosity values. The abrasion wear rate of the NiCrAlC coatings was much smaller than that described in the literature for bulk as-cast materials with similar composition, and one order of magnitude higher than that of the bulk cast and heat-treated (aged) NiCrAlC alloy. All coatings showed higher corrosion resistance than the AISI 316L substrate in HCl (5%) aqueous solution at 40 degrees C.
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug; however, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this end, a tool to automatically generate PD-based stimuli sources was developed, along with a second tool to generate functional coverage models that fit the PD-based input space exactly. Both the stimulus and coverage model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with the PD-based stimuli sources (still with a conventional coverage model), and a 56% reduction when combining the stimuli sources with their corresponding, automatically generated, coverage models.
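The core idea above, drawing constrained-random stimuli only from explicitly declared parameter domains so that invalid scenarios never enter the input space, and deriving the coverage model from exactly the same declaration, can be sketched in a few lines. This is a hedged illustration of the principle, not the paper's PD tools; the domain names and values are invented.

```python
import random
from itertools import product

# Declared parameter domains: only these values are legal, so every generated
# stimulus is valid by construction (invalid scenarios are excluded up front).
domains = {
    "opcode": ["READ", "WRITE"],
    "burst_len": [1, 4, 8],
}

def random_stimulus(rng):
    """Constrained-random stimulus: one legal value per declared parameter."""
    return {name: rng.choice(vals) for name, vals in domains.items()}

# Coverage model derived from the same domains: each cross of declared values
# is one coverage bin, so the model fits the legal input space exactly.
bins = {combo: 0 for combo in product(*domains.values())}

rng = random.Random(0)
for _ in range(200):
    s = random_stimulus(rng)
    bins[tuple(s[name] for name in domains)] += 1

coverage = sum(1 for hits in bins.values() if hits) / len(bins)
print(coverage)  # fraction of bins hit; reaches 1.0 once all 6 crosses occur
```

Because the stimulus source and the coverage bins come from one declaration, no simulation time is spent on stimuli that can never hit a coverage goal, which is the mechanism behind the reported simulation time reductions.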
Abstract:
Due to the several kinds of services that use the Internet and data network infrastructures, present-day networks are characterized by a diversity of traffic types with complex statistical properties, such as complex temporal correlation and non-Gaussian distributions. This temporal correlation may be characterized by Short Range Dependence (SRD) and Long Range Dependence (LRD). Models such as fGN (Fractional Gaussian Noise) may capture the LRD but not the SRD. This work presents two methods for traffic generation that synthesize approximate realizations of the self-similar fGN with an SRD random process. The first employs the IDWT (Inverse Discrete Wavelet Transform) and the second the IDWPT (Inverse Discrete Wavelet Packet Transform). The variance map concept was developed, which makes it possible to associate the LRD and SRD behaviours directly with the wavelet transform coefficients. The developed methods are extremely flexible and allow the generation of Gaussian time series with complex statistical behaviours.
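The paper's methods are wavelet-based (IDWT/IDWPT) and are not reproduced here. As a much simpler, standard-library-only illustration of how long-range-dependent traffic can be synthesized at all, the sketch below aggregates ON/OFF sources with heavy-tailed (Pareto) period lengths, a classical alternative construction; every parameter is illustrative.

```python
import random

def pareto_period(rng, alpha=1.4, xm=1.0):
    """Heavy-tailed period length via inverse-CDF sampling; shape alpha < 2
    is what produces long-range dependence in the aggregate traffic."""
    return max(1, int(xm / rng.random() ** (1.0 / alpha)))

def onoff_traffic(n_slots, n_sources=20, seed=0):
    """Aggregate n_sources ON/OFF sources; each transmits 1 unit/slot while ON."""
    rng = random.Random(seed)
    series = [0] * n_slots
    for _ in range(n_sources):
        t, on = 0, rng.random() < 0.5
        while t < n_slots:
            length = pareto_period(rng)
            if on:
                for i in range(t, min(t + length, n_slots)):
                    series[i] += 1
            t += length
            on = not on
    return series

traffic = onoff_traffic(1024)
print(len(traffic))  # 1024 time slots of aggregate load, between 0 and 20
```

The wavelet approach of the paper goes further: by shaping coefficient variances per scale (the variance map), it controls SRD and LRD jointly, which this simple construction cannot do.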
Abstract:
This paper presents two strategies for the upgrade of set-up generation systems for tandem cold mills. Even though these mills have been modernized mainly due to quality requirements, upgrades may also be made with the intention of replacing pre-calculated reference tables. In this case, the Bryant and Osborn mill model without an adaptive technique is proposed. As a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. The advantages and disadvantages of these two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.
Abstract:
The water diffusion attributable to concentration gradients is among the main mechanisms of water transport into the asphalt mixture. The transport of small molecules through polymeric materials is a very complex process, and no single model provides a complete explanation because of the small molecule's complex internal structure. The objective of this study was to experimentally determine the diffusion of water in different fine aggregate mixtures (FAM) using simple gravimetric sorption measurements. For the purposes of measuring the diffusivity of water, FAMs were regarded as a representative homogenous volume of the hot-mix asphalt (HMA). Fick's second law is generally used to model diffusion driven by concentration gradients in different materials. The concept of the dual mode diffusion was investigated for FAM cylindrical samples. Although FAM samples have three components (asphalt binder, aggregates, and air voids), the dual mode was an attempt to represent the diffusion process by only two stages that occur simultaneously: (1) the water molecules are completely mobile, and (2) the water molecules are partially mobile. The combination of three asphalt binders and two aggregates selected from the Strategic Highway Research Program's (SHRP) Materials Reference Library (MRL) were evaluated at room temperature [23.9 degrees C (75 degrees F)] and at 37.8 degrees C (100 degrees F). The results show that moisture uptake and diffusivity of water through FAM is dependent on the type of aggregate and asphalt binder. At room temperature, the rank order of diffusivity and moisture uptake for the three binders was the same regardless of the type of aggregate. However, this rank order changed at higher temperatures, suggesting that at elevated temperatures different binders may be undergoing a different level of change in the free volume. DOI: 10.1061/(ASCE)MT.1943-5533.0000190. (C) 2011 American Society of Civil Engineers.
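Fickian sorption driven by a concentration gradient has a standard series solution; for a plane sheet, the fractional uptake is M(t)/M(inf) = 1 - (8/pi^2) * sum over odd m of exp(-D m^2 pi^2 t / L^2) / m^2. The sketch below evaluates this single-mode solution only (the study's dual-mode model adds a second, partially mobile water population, which is not reproduced here), and the diffusivity and thickness values are invented.

```python
import math

def fractional_uptake(t, D, L, terms=100):
    """M(t)/M(inf) for Fickian diffusion into a sheet of thickness L,
    diffusivity D (single-mode series solution of Fick's second law)."""
    s = 0.0
    for n in range(terms):
        m = 2 * n + 1  # only odd terms appear in the series
        s += (8.0 / (m * m * math.pi ** 2)) * math.exp(-D * (m * math.pi / L) ** 2 * t)
    return 1.0 - s

# Illustrative values: uptake rises monotonically from ~0 toward 1 with time.
early = fractional_uptake(0.0, D=1e-6, L=0.01)   # ~0 (small truncation residue)
late = fractional_uptake(1e5, D=1e-6, L=0.01)    # ~1.0 (fully saturated)
print(early < late)
```

In a gravimetric experiment like the one described, D is obtained by fitting this curve (or its dual-mode extension) to the measured mass gain over time.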
Abstract:
Electron-beam-induced second harmonic generation (SHG) is studied in Er(3+)-doped PbO-GeO(2) glasses containing silver nanoparticles, with concentrations that are controlled by the heat treatment of the samples. The SHG is observed at T = 4.2 K using a p-polarized laser beam at 1064 nm. Enhancement of the SHG is observed in the samples that are submitted to electron beam incidence. The highest value of the nonlinear susceptibility, 2.08 pm/V, is achieved for the sample heat-treated for 72 h and submitted to an electron beam current of 15 nA. The samples that were not exposed to the electron beam present a susceptibility of approximately 0.5 pm/V.
Abstract:
The present investigation is the first part of an initiative to prepare a regional map of the natural abundance of selenium in various areas of Brazil, based on the analysis of bean and soil samples. Continuous-flow hydride generation electrothermal atomic absorption spectrometry (HG-ET AAS) with in situ trapping on an iridium-coated graphite tube has been chosen because of its high sensitivity and relative simplicity. The microwave-assisted acid digestion of bean and soil samples was tested for complete recovery of inorganic and organic selenium compounds (selenomethionine). The reduction of Se(VI) to Se(IV) was optimized in order to guarantee that there is no back-oxidation, which is of importance when digested samples are not analyzed immediately after the reduction step. The limits of detection and quantification of the method were 30 ng L(-1) Se and 101 ng L(-1) Se, respectively, corresponding to about 3 ng g(-1) and 10 ng g(-1), respectively, in the solid samples, considering a typical dilution factor of 100 for the digestion process. The results obtained for two certified food reference materials (CRM), soybean and rice, and for a soil and sediment CRM confirmed the validity of the investigated method. The selenium content found in a number of selected bean samples varied between 5.5 +/- 0.4 ng g(-1) and 1726 +/- 55 ng g(-1), and that in soil samples varied between 113 +/- 6.5 ng g(-1) and 1692 +/- 21 ng g(-1). (C) 2011 Elsevier B.V. All rights reserved.
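The conversion in the abstract from solution limits (ng/L) to solid-sample limits (ng/g) is a simple scaling by the dilution factor. The sketch below reproduces that arithmetic under the assumption, consistent with the stated factor of 100, that 1 g of digested sample is brought to 100 mL of solution.

```python
def lod_in_solid(lod_solution_ng_L, dilution_factor_mL_per_g=100):
    """Convert a detection limit in the measured solution (ng/L) to the
    equivalent limit in the solid sample (ng/g).
    ng/L -> ng/mL (divide by 1000), then scale by mL of solution per gram."""
    return lod_solution_ng_L / 1000.0 * dilution_factor_mL_per_g

print(lod_in_solid(30))           # 3.0 ng/g, matching the reported LOD
print(round(lod_in_solid(101), 1))  # 10.1 ng/g, i.e. about 10 ng/g as reported
```

The same factor applies in reverse when reporting measured concentrations of the digested extracts back on a dry-sample basis.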
Abstract:
The aim of the present study was to provide a numerical measure, through the process capability indexes (PCIs) C(p) and C(pk), of whether or not the manufacturing process can be considered capable of producing metamizol (500 mg) tablets. The indexes were also used as a statistical tool to prove the consistency of the tabletting process, making sure that the tablet weight and the content uniformity of metamizol comply with the preset requirements. In addition, ANOVA, the t-test and the test for equal variances were applied in this study, providing additional knowledge of the tabletting phase. The proposed statistical approach is therefore intended to assure more safety, precision and accuracy in the process validation analysis.
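The two capability indexes named above have simple definitions: C(p) = (USL - LSL) / 6σ compares the specification width to the process spread, and C(pk) = min(USL - μ, μ - LSL) / 3σ also penalizes off-center processes. The sketch below computes both in plain Python; the specification limits and tablet weights are invented for illustration, not the study's data.

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indexes from a sample of measurements."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)          # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)             # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability incl. centering
    return cp, cpk

# Hypothetical tablet weights (mg) around a 500 mg target, limits 500 +/- 25 mg
weights = [498, 502, 499, 501, 500, 503, 497, 500, 501, 499]
cp, cpk = cp_cpk(weights, lsl=475, usl=525)
print(cp > 1.33 and cpk > 1.33)  # True: conventionally deemed capable
```

When the process mean sits exactly on target, C(pk) equals C(p); any drift of the mean toward a limit lowers C(pk) while leaving C(p) unchanged, which is why both are reported together.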
Abstract:
In this preliminary study, eighteen p-substituted benzoic acid [(5-nitro-thiophen-2-yl)-methylene]-hydrazides with antimicrobial activity were evaluated against multidrug-resistant Staphylococcus aureus, correlating the three-dimensional characteristics of the ligands with their respective bioactivities. The computer programs Sybyl and CORINA were used, respectively, for the design and three-dimensional conversion of the ligands. Molecular interaction fields were calculated using the GRID program. Calculations using VolSurf resulted in a statistically consistent model with 48 structural descriptors, showing that hydrophobicity is a fundamental property in the analyzed biological response.
Abstract:
Few molecular studies have been devoted to the finger drop process that occurs during banana fruit ripening. Recent studies revealed the involvement of changes in the properties of cell wall polysaccharides in the pedicel rupture area. In this study, the expression of cell-wall-modifying genes was monitored in peel tissue during post-harvest ripening of Cavendish banana fruit at the median area (control zone) and compared with that in the pedicel rupture area (drop zone). To this end, three pectin methylesterase (PME) and seven xyloglucan endotransglycosylase/hydrolase (XTH) genes were isolated. The accumulation of their mRNAs, and of those of the polygalacturonase, expansin, and pectate lyase genes already isolated from banana, was examined. During post-harvest ripening, transcripts of all genes were detected in both zones, but accumulated differentially. MaPME1, MaPG1, and MaXTH4 mRNA levels did not change in either zone. Levels of MaPME3 and MaPG3 mRNAs increased greatly only in the control zone and at the late ripening stages. For the other genes, the main molecular changes occurred 1-4 d after ripening induction. MaPME2, MaPEL1, MaPEL2, MaPG4, MaXTH6, MaXTH8, MaXTH9, MaEXP1, MaEXP4, and MaEXP5 accumulated highly in the drop zone throughout ripening, in contrast to MaXTH3, MaXTH5, and MaEXP2. For the MaPG2, MaXET1, and MaXET2 genes, high accumulation in the drop zone was transient. The transcriptional data from all genes examined suggest that finger drop and peel softening involve similar mechanisms. These findings also led to the proposal of a sequence of molecular events leading to finger drop and to the suggestion of some candidate genes.