987 results for statistical methodology
Abstract:
This work presents a thermoeconomic optimization methodology for the analysis and design of energy systems. The methodology incorporates economic aspects of the exergy concept in order to develop a tool that assists in equipment selection and choice of operating mode, as well as in optimizing thermal plant design. It also reviews the concepts of exergy in general and of thermoeconomics, which combines the principles of the thermal sciences (thermodynamics, heat transfer and fluid mechanics) with engineering economics to rationalize investment, development and operation decisions for energy systems. The paper then develops a thermoeconomic methodology based on a simple mathematical model involving thermodynamic parameters and cost evaluation, defining the objective function as the exergetic production cost. The optimization problem is solved for two energy systems: first a vapour-compression refrigeration system, and then a cogeneration system using a backpressure steam turbine. (C) 2010 Elsevier Ltd. All rights reserved.
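The optimization framing in this abstract can be sketched in a few lines. The cost model below is invented for illustration and is not the paper's; only the structure, a unit exergetic production cost traded off against a design variable, follows the abstract.

```python
import numpy as np

# Toy illustration of the thermoeconomic objective (all coefficients assumed):
# a unit exergetic production cost combining a capital term that grows with
# the design variable and an exergy-destruction term that shrinks with it.

def exergetic_unit_cost(r):
    """Assumed cost [$ per GJ of product exergy] vs. a pressure ratio r."""
    capital = 0.8 * r        # capital cost rises with pressure ratio
    destruction = 12.0 / r   # exergy-destruction cost falls with it
    return capital + destruction

# Locate the cost-minimizing design on a grid
r = np.linspace(1.0, 10.0, 2000)
cost = exergetic_unit_cost(r)
r_opt = r[np.argmin(cost)]
```

With these made-up coefficients the minimum sits where the two marginal costs balance, analytically at r = sqrt(15) ≈ 3.87; the grid search recovers it without needing derivatives.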
Abstract:
BACKGROUND: The combined effects of vanillin and syringaldehyde on xylitol production by Candida guilliermondii were studied using response surface methodology (RSM). A 2^2 full-factorial central composite design was employed for the experimental design and analysis of the results. RESULTS: Maximum xylitol productivity (Q_p = 0.74 g L^-1 h^-1) and yield (Y_P/S = 0.81 g g^-1) can be attained by adding only vanillin, at 2.0 g L^-1, to the fermentation medium. The predicted values agreed closely with the experimental results (0.69 ± 0.04 g L^-1 h^-1 and 0.77 ± 0.01 g g^-1). C. guilliermondii converted vanillin completely after 24 h of fermentation, with a 94% yield of vanillyl alcohol. CONCLUSIONS: The bioconversion of xylose into xylitol by C. guilliermondii is strongly dependent on the combination of aldehydes and phenolics in the fermentation medium. Vanillin is a phenolic compound able to improve xylitol production by the yeast, and its conversion to vanillyl alcohol reveals the potential of this yeast for medium detoxification. (C) 2009 Society of Chemical Industry
Abstract:
Response surface methodology was used to evaluate the optimal time, temperature and oxalic acid concentration for simultaneous saccharification and fermentation (SSF) of corncob particles by Pichia stipitis CBS 6054. Fifteen pretreatment conditions were examined in a 2^3 full factorial design with six axial points: temperatures from 132 to 180 °C, times from 10 to 90 min and oxalic acid loadings from 0.01 to 0.038 g/g solids. Separate maxima were found for enzymatic saccharification and hemicellulose fermentation, with the condition for maximum saccharification being significantly more severe. Over the ranges examined, ethanol production was affected more by reaction temperature than by oxalic acid loading or reaction time: temperature was significant at the 95% confidence level, while oxalic acid and reaction time were significant at the 90% level. The highest ethanol concentration (20 g/L) was obtained after 48 h, with a volumetric production rate of 0.42 g ethanol L^-1 h^-1. The ethanol yield after SSF with P. stipitis was significantly higher than predicted from sequential saccharification and fermentation of substrate pretreated under the same condition. This was attributed to the secretion of beta-glucosidase by P. stipitis: during SSF, free extracellular beta-glucosidase activity was 1.30 pNPG U/g with P. stipitis, versus 0.66 pNPG U/g for saccharification without the yeast. Published by Elsevier Ltd.
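The response-surface workflow used in this and the previous abstract, fitting a second-order model to factorial data and locating its stationary point, can be sketched as follows. The data are synthetic and the factor names are placeholders, not the study's.

```python
import numpy as np

# Hypothetical RSM sketch: fit a two-factor second-order model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to coded factor levels, then find the stationary (optimum) point.

def fit_quadratic_surface(X, y):
    """Least-squares fit of a two-factor quadratic model; returns coefficients."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Synthetic response with a known maximum at (x1, x2) = (0.5, -0.25)
X = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
y = 10 - (X[:, 0] - 0.5) ** 2 - 2 * (X[:, 1] + 0.25) ** 2

b0, b1, b2, b11, b22, b12 = fit_quadratic_surface(X, y)
# Stationary point of the fitted surface (where the gradient vanishes)
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(H, -np.array([b1, b2]))
```

Because the synthetic response is exactly quadratic, the fitted stationary point recovers the planted optimum; with real factorial data the same machinery gives the predicted optimum condition.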
Abstract:
A hybrid system to automatically detect, locate and classify disturbances affecting power quality in an electrical power system is presented in this paper. The disturbances characterized are events from an actual power distribution system simulated with the ATP (Alternative Transients Program) software. The hybrid approach consists of two stages. In the first stage, the wavelet transform (WT) is used to detect disturbances in the system and to locate the time of their occurrence. When such an event is flagged, the second stage is triggered and various artificial neural networks (ANNs) are applied to classify the data measured during the disturbance(s). A computational logic combining WTs and ANNs, together with a graphical user interface (GUI) between the algorithm and its end user, was then implemented. The results obtained so far are promising and suggest that this approach could lead to a useful application in an actual distribution system. (C) 2009 Elsevier Ltd. All rights reserved.
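A minimal sketch of the first (detection) stage only, assuming a Haar-style detail, i.e. scaled first differences, as the disturbance detector; the waveform, sampling rate and threshold below are illustrative, not taken from the paper.

```python
import numpy as np

# Haar-style detail: scaled first differences, large where the signal jumps.
def haar_detail(signal):
    s = np.asarray(signal, dtype=float)
    return np.diff(s) / np.sqrt(2.0)

def detect_disturbance(signal, threshold):
    """Return the sample index of the first |detail| above threshold, or -1."""
    d = np.abs(haar_detail(signal))
    idx = np.flatnonzero(d > threshold)
    return int(idx[0]) if idx.size else -1

# A 50 Hz waveform with a sudden voltage sag at sample 400 (hypothetical event)
t = np.arange(0, 0.2, 1 / 3200)   # 640 samples at 3200 Hz
v = np.sin(2 * np.pi * 50 * t)
v[400:] *= 0.5                    # 50% sag
onset = detect_disturbance(v, threshold=0.2)
```

The smooth sinusoid produces small detail coefficients, while the step introduced by the sag produces one large coefficient at the onset, which is exactly the property the WT stage exploits before handing the windowed data to the ANN classifiers.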
Abstract:
Despite modern weed control practices, weeds continue to be a threat to agricultural production. Given the variability of weeds, a fuzzy-logic methodology is proposed for classifying the risk of infestation in agricultural zones. The inputs for the classification are attributes extracted from maps of weed seed production and weed coverage, estimated using kriging and map analysis, and from the percentage of surface infested by grass weeds, in order to account for weed species with a high rate of development and proliferation. The output of the classification predicts the risk of infestation of regions of the field for the next crop. The risk classification methodology integrates analysis techniques that may help reduce costs and improve weed control practices. Results for the risk classification of infestation in a maize crop field are presented. To illustrate the effectiveness of the proposed system, the risk of infestation over the entire field is checked against the yield loss map estimated by kriging and against the average yield loss estimated from a hyperbolic model.
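A toy fragment of the fuzzy-logic idea (the membership breakpoints and labels are assumptions for illustration, not the paper's rule base): triangular membership functions map a single attribute, weed coverage percentage, to low/medium/high risk degrees.

```python
# Triangular membership with feet at a and c and peak at b
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical membership functions for one input attribute (coverage, %)
def risk_memberships(coverage_pct):
    return {
        "low": tri(coverage_pct, -1e-9, 0.0, 40.0),
        "medium": tri(coverage_pct, 20.0, 50.0, 80.0),
        "high": tri(coverage_pct, 60.0, 100.0, 200.0),
    }

# Crisp label = class with the highest membership degree
def classify(coverage_pct):
    m = risk_memberships(coverage_pct)
    return max(m, key=m.get)
```

The real methodology fuses several such attributes (seed production, coverage, grass-weed percentage) through a rule base; this fragment shows only how one attribute is fuzzified and defuzzified to a risk label.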
Abstract:
This work presents a statistical study of the variability of the mechanical properties of hardened self-compacting concrete, including compressive strength, splitting tensile strength and modulus of elasticity. Comparing the experimental results with those derived from several codes and recommendations makes it possible to evaluate whether the hardened behaviour of self-compacting concrete is appropriately predicted by the existing formulations. The variables analyzed include maximum aggregate size and paste and gravel content. The self-compacting concretes analyzed presented variability measures in the same range as expected for conventional vibrated concrete, with all results within a 95% confidence level. Among the formulations for conventional concrete considered in this study, it was observed that a safe estimate of the modulus of elasticity can be obtained from the compressive strength, with lower-strength self-compacting concretes presenting higher safety margins. However, most codes overestimate the material's tensile strength. (C) 2010 Elsevier Ltd. All rights reserved.
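The two quantities at the heart of such a study, the scatter of strength results and a code-type prediction of modulus from strength, can be illustrated as below. The sample values are invented, and the modulus formula is the common ACI 318-type estimate, used here only as one example of a code formulation.

```python
import math

# Coefficient of variation of a set of strength results (sample statistics)
def coefficient_of_variation(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    return math.sqrt(var) / mean

# ACI 318-type empirical estimate of the modulus of elasticity, in MPa
def elastic_modulus_aci(fc_mpa):
    return 4700.0 * math.sqrt(fc_mpa)

strengths = [41.2, 39.8, 43.0, 40.5, 42.1]   # hypothetical fc results, MPa
cv = coefficient_of_variation(strengths)
E = elastic_modulus_aci(sum(strengths) / len(strengths))
```

A coefficient of variation of a few percent is the kind of "variability measure" the abstract compares against conventional vibrated concrete, and the study's safety-margin question is whether predictions like E stay on the safe side of the measured moduli.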
Abstract:
The importance of careful selection of the rocks used in building facade cladding is highlighted. A simple and viable methodology for the structural detailing of dimension stones and the verification of their global performance is presented, based on a Strap software simulation. The results demonstrated the applicability of the proposed structural dimensioning methodology, which provides a simple and effective tool for dimensioning rock slabs used in building facade cladding. The Strap software satisfactorily simulated the structural conditions of the stone slabs under the studied conditions, allowing the determination of alternative slab dimensions and the verification of the cladding strength at the supports.
Abstract:
In recent decades, the air traffic system has been changing to adapt to new social demands, mainly the safe growth of worldwide traffic capacity. These changes are governed by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In view of these challenges, and considering the main characteristics of CNS/ATM, this work proposes a methodology that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc.9689 [14], using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the Automatic Dependent Surveillance-Broadcast (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties: the FSPN formalism provides important modeling capabilities, and discrete-event simulation allows estimation of the desired safety metrics. (C) 2011 Elsevier Ltd. All rights reserved.
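A minimal discrete-event flavour of the "relative" comparison (all rates below are invented, and a real FSPN model is far richer): simulate hazardous-event arrivals for a legacy and a proposed system over the same horizon and compare the estimated rates, which is the kind of safety metric the simulated models yield.

```python
import random

# Count events of a Poisson process over a horizon by summing
# exponential inter-arrival times (the basic discrete-event step).
def simulate_event_count(rate_per_hour, horizon_hours, rng):
    t, events = 0.0, 0
    while True:
        t += rng.expovariate(rate_per_hour)
        if t > horizon_hours:
            return events
        events += 1

rng = random.Random(7)
horizon = 1e6                               # simulated operating hours
legacy_rate, proposed_rate = 1e-3, 4e-4     # assumed hazardous-event rates
legacy = simulate_event_count(legacy_rate, horizon, rng)
proposed = simulate_event_count(proposed_rate, horizon, rng)
relative_improvement = 1 - proposed / legacy
```

The "relative" method's verdict is exactly this kind of comparison: the proposed system is acceptable if its simulated safety metric is at least as good as the legacy system's.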
Abstract:
Dynamic vehicle behavior is used to identify safe traffic speed limits. The proposed methodology is based on the vehicle's vertical wheel contact force response, excited by measured pavement irregularities, in the frequency domain. A quarter-car model is used to identify the vehicle's dynamic behavior. The vertical elevation of an unpaved road surface was measured, and its roughness spectral density was quantified as ISO Level C. The vehicle inertance function was derived by weighting the vertical contact force transfer function by the pavement roughness spectral density function in the frequency domain, and the statistical contact load variation was obtained by integrating the vehicle inertance density function. The vehicle safety concept adopted is based on handling ability: the capacity to generate tangential forces at the wheel/road contact interface is the key to vehicle handling, and this capacity is related to tire/pavement contact forces. A contribution to establishing a traffic safety speed limit is obtained from the likelihood of loss of driveability. The results show that at speeds above 25 km/h, loss of tire contact becomes possible when traveling on the measured road type. DOI: 10.1061/(ASCE)TE.19435436.0000216. (C) 2011 American Society of Civil Engineers.
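A small piece of the quarter-car machinery, with assumed parameters (the masses and stiffnesses below are typical textbook values, not the paper's): the eigenvalues of M^-1 K give the undamped body-bounce and wheel-hop natural frequencies that shape the contact-force transfer function discussed above.

```python
import numpy as np

# Assumed quarter-car parameters (illustrative, not the paper's vehicle)
ms, mu = 400.0, 40.0        # sprung / unsprung mass, kg
ks, kt = 2.0e4, 2.0e5       # suspension / tire stiffness, N/m

# Undamped 2-DOF quarter-car: M x'' + K x = 0
M = np.diag([ms, mu])
K = np.array([[ks, -ks],
              [-ks, ks + kt]])
eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
freqs_hz = np.sort(np.sqrt(eigvals.real)) / (2 * np.pi)
body_bounce, wheel_hop = freqs_hz
```

For these values the body-bounce mode lands near 1 Hz and the wheel-hop mode near 12 Hz, the two resonances through which the measured ISO Level C roughness spectrum is filtered to obtain the contact-load variation.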
Abstract:
A methodology for rock-excavation structural-reliability analysis using Distinct Element Method numerical models is presented. The methodology addresses the limitation of conventional numerical models, which supply only point results and use fixed input parameters without considering their statistical errors. The analysis of rock-excavation stability must consider uncertainties arising from geological variability, from the choice of mechanical behaviour hypothesis, and from the parameters adopted in the construction of the numerical model. Such uncertainties can be analyzed in simple deterministic models, but a new methodology was developed for numerical models whose results are of several natures. The methodology is based on Monte Carlo simulations and uses principles of Paraconsistent Logic. It is demonstrated in the analysis of a final slope of a large surface mine.
Abstract:
This work describes the development of an engineering approach based on a toughness scaling methodology incorporating the effects of weld strength mismatch on crack-tip driving forces. The approach adopts a nondimensional Weibull stress, σ̄_w, as the near-tip driving force to correlate cleavage fracture across cracked weld configurations with different mismatch conditions, even though the loading parameter (measured by J) may vary widely due to mismatch and constraint variations. Application of the procedure to predict the failure strain of an overmatched girth weld made of an API X80 pipeline steel demonstrates the effectiveness of the micromechanics approach. Overall, the results lend strong support to the use of a Weibull stress based procedure in defect assessments of structural welds.
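The Weibull-stress idea reduces to one expression (the parameters below are illustrative, not calibrated to API X80): cleavage failure probability follows a two-parameter Weibull distribution in the Weibull stress.

```python
import math

# P_f = 1 - exp(-(sigma_w / sigma_u)^m): two-parameter Weibull model of
# cleavage failure probability, with shape modulus m and scaling stress sigma_u.
def cleavage_probability(sigma_w, sigma_u, m):
    return 1.0 - math.exp(-((sigma_w / sigma_u) ** m))

# Assumed (hypothetical) Weibull parameters
m_shape, sigma_u = 20.0, 2.0

# Well below the scaling stress the failure probability is tiny...
p_low = cleavage_probability(1.5, sigma_u, m_shape)
# ...and when the exponent equals ln 2, P_f is exactly one half.
p_half = cleavage_probability(sigma_u * math.log(2) ** (1 / m_shape), sigma_u, m_shape)
```

The large shape modulus makes P_f extremely sensitive to σ̄_w, which is why correlating configurations through the Weibull stress, rather than through J alone, matters under mismatch and constraint variations.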
Abstract:
In this paper, two different approaches for estimating the directional wave spectrum based on a vessel's first-order motions are discussed, and their predictions are compared to those provided by a wave buoy. The full-scale data were obtained in an extensive monitoring campaign on an FPSO unit operating in the Campos Basin, Brazil. The data included vessel motions, heading and tank loadings; wave field information was obtained by means of a heave-pitch-roll buoy installed in the vicinity of the unit. Two of the methods most widely used for this kind of analysis are considered, one based on Bayesian statistical inference, the other on a parametric representation of the wave spectrum. The performance of the two methods is compared, and their sensitivity to input parameters is discussed. This analysis complements a set of previous validations based on numerical and towing-tank results and allows a preliminary evaluation of the reliability of applying the methodology at full scale.
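A sketch of the "parametric representation" side only, assuming a two-parameter Bretschneider-type spectrum (the sea-state values are made up): the spectrum is parameterized by significant wave height Hs and peak frequency wp, and integrating it recovers Hs = 4*sqrt(m0), a quick self-consistency check on the parameterization.

```python
import numpy as np

# Two-parameter Bretschneider-type wave spectrum S(omega) [m^2 s]
def bretschneider(omega, hs, wp):
    return (5.0 / 16.0) * hs**2 * wp**4 * omega**-5 * np.exp(-1.25 * (wp / omega) ** 4)

hs, wp = 2.5, 0.6               # assumed sea state: Hs in m, wp in rad/s
omega = np.linspace(0.2, 4.0, 20000)
S = bretschneider(omega, hs, wp)

# Zeroth spectral moment by trapezoidal integration, then Hs = 4*sqrt(m0)
m0 = float(np.sum(0.5 * (S[1:] + S[:-1]) * np.diff(omega)))
hs_recovered = 4.0 * np.sqrt(m0)
```

In the motion-based estimation problem, parameters like hs and wp (plus direction and spreading) are the unknowns fitted so that the vessel's predicted first-order responses match the measured ones; the Bayesian alternative instead infers the spectrum nonparametrically with smoothness priors.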
Abstract:
Purpose - This paper seeks to identify collaboration elements and evaluate their intensity in the Brazilian supermarket retail chain, especially the manufacturer-retailer channel. Design/methodology/approach - A structured questionnaire was developed and administered to 125 representatives of suppliers of large supermarket chains. Statistical methods including multivariate analysis were employed. Variables were grouped into five indicators (joint actions, information sharing, interpersonal integration, gains and cost sharing, and strategic integration) to assess the degree of collaboration. Findings - The analyses showed that the interviewees considered interpersonal integration to be of greater importance to collaboration intensity than the other integration factors, such as gain or cost sharing or even strategic integration. Research limitations/implications - The research was conducted solely from the point of view of the industries that supply the large retail networks. The interviews were not conducted in pairs; that is, one questionnaire was not applied to the retail network and another to the partner industry. Practical implications - Companies should invest in periodic meetings with their partners to increase collaboration intensity, and should carry out technical visits to learn about their partners' logistic reality and thus make better operational decisions. Originality/value - The paper reveals which indicators produce greater collaboration intensity, and thus which are more relevant to more efficient logistics management.
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend toward Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but due to state explosion their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this end, a tool to automatically generate PD-based stimuli sources was developed, along with a second tool to generate functional coverage models that fit the PD-based input space exactly. Both the input stimuli and coverage model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with the PD-based stimuli sources (still with a conventional coverage model), and a 56% reduction when combining the stimuli sources with their corresponding, automatically generated, coverage models.
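A hedged sketch of the underlying idea, not the authors' tool: describe each input parameter by a domain of valid values, rule out invalid combinations with a constraint predicate, and draw constrained-random stimuli only from what remains, so no simulation time is spent on irrelevant test cases. The parameter names and constraint below are hypothetical.

```python
import random

# Hypothetical parameter domains for a bus-transfer stimulus
domains = {
    "burst_len": range(1, 9),
    "addr_align": (1, 2, 4, 8),
    "mode": ("single", "burst"),
}

# Example cross-parameter constraint: single-transfer mode implies burst length 1
def valid(stim):
    return stim["mode"] == "burst" or stim["burst_len"] == 1

# Rejection sampling over the domains: only valid combinations survive
def draw_stimulus(rng):
    while True:
        stim = {k: rng.choice(list(v)) for k, v in domains.items()}
        if valid(stim):
            return stim

rng = random.Random(1)
stimuli = [draw_stimulus(rng) for _ in range(200)]
```

A PD-style tool goes further by enumerating the valid sub-domains up front (avoiding rejection entirely) and by emitting a coverage model whose bins are exactly those sub-domains, which is what ties the two reported improvements together.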
Abstract:
The main objective of this project was to evaluate protein extraction from soybean flour in dairy whey using a multivariate statistical method with a 2^3 factorial design. The influence of three variables was considered: temperature, pH and sodium chloride percentage, against the process response variable (percentage of protein extracted). During extraction runs over time and temperature, treatment at 80 °C for 2 h gave the highest total protein (5.99%), and the percentage of protein extracted increased with heating time. The maximum of the function representing protein extraction was therefore analysed through the 2^3 factorial experiment. The results showed that all the variables were important to the extraction. The statistical analysis indicated that pH, temperature and sodium chloride percentage alone were not sufficient to characterize the extraction process, since the inflection point of the mathematical function could not be obtained; on the other hand, the mathematical model was significant as well as predictive.
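The 2^3 factorial analysis step can be sketched as follows (the response data are made up): with coded levels -1/+1 for temperature, pH and NaCl, each main effect is the mean response at the factor's high level minus the mean at its low level.

```python
import itertools

# The 8 runs of a 2^3 design in coded levels, ordered (T, pH, NaCl)
levels = list(itertools.product([-1, 1], repeat=3))

# Main effect of factor j = mean(y | level +1) - mean(y | level -1)
def main_effects(runs, responses):
    k = len(runs[0])
    effects = []
    for j in range(k):
        hi = [y for run, y in zip(runs, responses) if run[j] == 1]
        lo = [y for run, y in zip(runs, responses) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical extraction responses (%) for the 8 treatment combinations
y = [3.1, 4.0, 3.4, 4.3, 4.9, 5.8, 5.2, 6.1]
e_temp, e_ph, e_nacl = main_effects(levels, y)
```

With these invented responses, temperature has the largest main effect, which mirrors the abstract's observation that extraction rose with temperature; a 2^3 design alone, with only two levels per factor, cannot locate a curvature-based optimum, which is why the inflection point could not be obtained from it.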