72 results for statistical methodology
Abstract:
Dynamic vehicle behavior is used to identify safe traffic speed limits. The proposed methodology is based on the vehicle vertical wheel contact force response excited by measured pavement irregularities in the frequency domain. A quarter-car model is used to identify vehicle dynamic behavior. The vertical elevation of an unpaved road surface has been measured, and its roughness spectral density is quantified as ISO Level C. The vehicle inertance function was derived by weighting the vertical contact force transfer function by the pavement roughness spectral density function in the frequency domain. The statistical contact load variation is obtained by integrating the vehicle inertance density function. The vehicle safety concept is based on its handling properties: the ability to generate tangential forces at the wheel/road contact interface is the key to vehicle handling, and this ability depends on the tire/pavement contact forces. A contribution to establishing a traffic safety speed limit is obtained from the likelihood of loss of driveability. The results show that at speeds above 25 km/h loss of tire contact becomes likely when traveling on the measured road type. DOI: 10.1061/(ASCE)TE.1943-5436.0000216. (C) 2011 American Society of Civil Engineers.
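As an illustration of the kind of calculation this abstract describes, the sketch below integrates a quarter-car contact-force transfer function weighted by an ISO 8608 Class C roughness spectrum to estimate the dynamic contact-force variation and a Gaussian probability of contact loss; the vehicle parameters, roughness constant, and speed are assumed illustrative values, not those of the study.

```python
import numpy as np
from scipy.stats import norm

# Assumed quarter-car parameters (typical passenger-car values, illustrative)
ms, mu = 400.0, 40.0          # sprung / unsprung mass [kg]
ks, cs = 20e3, 1.5e3          # suspension stiffness [N/m] and damping [N s/m]
kt = 180e3                    # tire vertical stiffness [N/m]
g = 9.81
v = 25 / 3.6                  # travel speed [m/s]

# ISO 8608 displacement PSD: Gd(n) = Gd(n0) * (n/n0)^-2, n0 = 0.1 cycles/m;
# Class C geometric-mean roughness level Gd(n0) = 256e-6 m^3 (assumed)
Gd_n0, n0 = 256e-6, 0.1

f = np.linspace(0.2, 40.0, 4000)           # temporal frequency [Hz]
w = 2 * np.pi * f
S_road = Gd_n0 * ((f / v) / n0) ** -2 / v  # temporal PSD of road elevation [m^2/Hz]

# Frequency response of the dynamic contact force to road displacement,
# solved from the quarter-car equations of motion as a 2x2 complex system
H_force = np.empty(f.size, dtype=complex)
for i, wi in enumerate(w):
    a = ks + 1j * cs * wi
    A = np.array([[-ms * wi**2 + a, -a],
                  [-a, -mu * wi**2 + a + kt]])
    zs, zu = np.linalg.solve(A, np.array([0.0, kt]))  # response per unit road input
    H_force[i] = kt * (1.0 - zu)                      # dynamic contact force

# Contact-force standard deviation: integral of the weighted PSD
var_F = np.sum(np.abs(H_force) ** 2 * S_road) * (f[1] - f[0])
sigma_F = np.sqrt(var_F)
F_static = (ms + mu) * g

# Gaussian estimate of the probability of momentary tire contact loss
p_loss = norm.cdf(-F_static / sigma_F)
print(f"sigma_F = {sigma_F:.0f} N, static load = {F_static:.0f} N, "
      f"P(contact loss) = {p_loss:.2e}")
```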
Abstract:
A methodology for rock-excavation structural-reliability analysis that uses Distinct Element Method numerical models is presented. The methodology addresses the limitation of conventional numerical models, which supply only point results and use fixed input parameters without considering their statistical errors. The analysis of rock-excavation stability must consider uncertainties arising from geological variability, from the choice of mechanical behaviour hypothesis, and from the parameters adopted in constructing the numerical model. These uncertainties can be analyzed in simple deterministic models, but a new methodology was developed for numerical models whose results are of several natures. The methodology is based on Monte Carlo simulations and uses principles of Paraconsistent Logic. It is illustrated through the analysis of the final slope of a large surface mine.
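The Monte Carlo part of such a reliability analysis might look like the sketch below, where a hypothetical closed-form factor-of-safety expression stands in for the Distinct Element Method model and the input-parameter distributions are invented for illustration; the paraconsistent-logic treatment of the study is not reproduced.

```python
# Sketch: Monte Carlo estimate of the probability of failure of a rock slope.
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Assumed input-parameter distributions (hypothetical values)
friction_angle = rng.normal(35.0, 3.0, n_sim)                          # degrees
cohesion = rng.lognormal(mean=np.log(80e3), sigma=0.25, size=n_sim)    # Pa
unit_weight = rng.normal(26e3, 1e3, n_sim)                             # N/m^3

# Hypothetical limit-state: planar sliding of a block behind a vertical face,
# on a discontinuity dipping at 'beta', slope height H (per metre of width)
H, beta = 60.0, np.radians(40.0)
plane_length = H / np.sin(beta)
weight = 0.5 * unit_weight * H**2 / np.tan(beta)
driving = weight * np.sin(beta)
resisting = cohesion * plane_length + weight * np.cos(beta) * np.tan(np.radians(friction_angle))
fos = resisting / driving

p_failure = np.mean(fos < 1.0)
print(f"Estimated probability of failure: {p_failure:.4f}")
```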
Abstract:
This work describes the development of an engineering approach based upon a toughness scaling methodology incorporating the effects of weld strength mismatch on crack-tip driving forces. The approach adopts a nondimensional Weibull stress, σ̄_w, as the near-tip driving force to correlate cleavage fracture across cracked weld configurations with different mismatch conditions, even though the loading parameter (measured by J) may vary widely due to mismatch and constraint variations. Application of the procedure to predict the failure strain for an overmatched girth weld made of an API X80 pipeline steel demonstrates the effectiveness of the micromechanics approach. Overall, the results lend strong support to the use of a Weibull stress-based procedure in defect assessments of structural welds.
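A minimal sketch of how a Weibull stress could be computed from finite-element output is shown below; the Weibull modulus, reference volume, yield stress, and element data are all placeholders rather than values from the study.

```python
# Sketch: Weibull stress from maximum principal stresses over a process zone.
import numpy as np

m = 12.0            # assumed Weibull modulus
V0 = 1.0e-9         # assumed reference volume [m^3]
sigma_ys = 550e6    # yield stress used to nondimensionalize [Pa]

# Hypothetical element data from a finite-element solution: maximum principal
# stress and volume for each element near the crack tip.
rng = np.random.default_rng(0)
sigma1 = sigma_ys * (1.0 + 1.5 * rng.random(500))   # Pa
vol = rng.uniform(0.5e-9, 2.0e-9, 500)              # m^3

# Only elements stressed above yield contribute (a common process-zone choice;
# the study may define the zone differently).
mask = sigma1 > sigma_ys
sigma_w = (np.sum(sigma1[mask] ** m * vol[mask]) / V0) ** (1.0 / m)

sigma_w_bar = sigma_w / sigma_ys     # nondimensional Weibull stress
print(f"Weibull stress = {sigma_w/1e6:.0f} MPa, "
      f"nondimensional value = {sigma_w_bar:.2f}")
```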
Abstract:
In this paper, two different approaches for estimating the directional wave spectrum based on a vessel's first-order motions are discussed, and their predictions are compared to those provided by a wave buoy. The full-scale data were obtained in an extensive monitoring campaign based on an FPSO unit operating at Campos Basin, Brazil. Data included vessel motions, heading, and tank loadings. Wave field information was obtained by means of a heave-pitch-roll buoy installed in the vicinity of the unit. Two of the methods most widely used for this kind of analysis are considered, one based on Bayesian statistical inference, the other consisting of a parametric representation of the wave spectrum. The performance of both methods is compared, and their sensitivity to input parameters is discussed. This analysis complements a set of previous validations based on numerical and towing-tank results and allows for a preliminary evaluation of the reliability of applying the methodology at full scale.
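The parametric branch of such an analysis can be illustrated, under strong simplifying assumptions, by fitting a two-parameter Bretschneider spectrum to synthetic spectral estimates; the study itself estimates a full directional spectrum from vessel motions, which is not attempted here.

```python
# Sketch: least-squares fit of a Bretschneider spectrum to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def bretschneider(f, hs, fp):
    """Bretschneider spectral density S(f) [m^2 s] for Hs [m] and peak frequency fp [Hz]."""
    return (5.0 / 16.0) * hs**2 * fp**4 / f**5 * np.exp(-1.25 * (fp / f) ** 4)

# Synthetic "buoy" spectrum: Hs = 2.5 m, Tp = 10 s, with 10% multiplicative noise
f = np.linspace(0.04, 0.4, 80)
rng = np.random.default_rng(1)
s_obs = bretschneider(f, 2.5, 0.1) * (1 + 0.1 * rng.standard_normal(f.size))

popt, _ = curve_fit(bretschneider, f, s_obs, p0=[2.0, 0.12])
hs_fit, fp_fit = popt
print(f"Fitted Hs = {hs_fit:.2f} m, Tp = {1/fp_fit:.1f} s")
```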
Abstract:
Purpose - This paper seeks to identify collaboration elements and evaluate their intensity in the Brazilian supermarket retail chain, especially in the manufacturer-retailer channel. Design/methodology/approach - A structured questionnaire was developed and administered to 125 representatives of suppliers of large supermarket chains. Statistical methods including multivariate analysis were employed. Variables were grouped into five indicators (joint actions, information sharing, interpersonal integration, gain and cost sharing, and strategic integration) to assess the degree of collaboration. Findings - The analyses showed that the interviewees considered interpersonal integration to be of greater importance to collaboration intensity than the other integration factors, such as gain or cost sharing or even strategic integration. Research limitations/implications - The research was conducted solely from the point of view of the industries that supply the large retail networks. The interviews were not conducted in pairs; that is, there was no application of one questionnaire to the retail network and another to the partner industry. Practical implications - Companies should invest in conducting periodic meetings with their partners to increase collaboration intensity, and should carry out technical visits to learn about their partners' logistics reality and thus make better operational decisions. Originality/value - The paper reveals which indicators produce greater collaboration intensity, and thus which are more relevant to more efficient logistics management.
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug; however, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although they act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. For this purpose, a tool to automatically generate PD-based stimuli sources was developed. Additionally, a second tool was developed to generate functional coverage models that fit exactly to the PD-based input space. Both the input-stimulus and coverage-model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with the PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining the stimuli sources with their corresponding, automatically generated coverage models.
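The general idea of constrained-random stimulus generation with a coverage model restricted to the valid input space can be sketched as below; the parameter names, constraint, and rejection-sampling generator are hypothetical stand-ins, not the PD-based tools described in the work.

```python
# Sketch: constrained-random stimuli with a coverage model over valid scenarios.
import itertools
import random

random.seed(0)

# Hypothetical parameter domains for a bus-transaction generator
domains = {
    "burst_len": [1, 4, 8, 16],
    "mode": ["read", "write"],
    "alignment": ["aligned", "unaligned"],
}

def is_valid(tc):
    # Constraint removing invalid/irrelevant scenarios, e.g. long unaligned bursts
    return not (tc["burst_len"] == 16 and tc["alignment"] == "unaligned")

# Coverage model built only over the valid cross product of the domains
valid_space = [dict(zip(domains, combo))
               for combo in itertools.product(*domains.values())]
valid_space = [tc for tc in valid_space if is_valid(tc)]
covered = set()

def random_stimulus():
    """Draw test cases until one satisfies the constraints (rejection sampling)."""
    while True:
        tc = {k: random.choice(v) for k, v in domains.items()}
        if is_valid(tc):
            return tc

for _ in range(200):
    tc = random_stimulus()
    covered.add(tuple(sorted(tc.items())))
    # drive_dut(tc)  # the generated test case would be applied to the DUT here

coverage = len(covered) / len(valid_space)
print(f"Functional coverage: {coverage:.0%} of {len(valid_space)} valid scenarios")
```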
Abstract:
The objective of this project was to evaluate the protein extraction of soybean flour in dairy whey by a multivariate statistical method with a 2^3 factorial design. The influence of three variables was considered: temperature, pH, and percentage of sodium chloride, against the process-specific response (percentage of protein extraction). It was observed that, during protein extraction as a function of time and temperature, the treatments at 80 degrees C for 2 h presented the highest values of total protein (5.99%). The percentage of protein extraction increased with heating time. The maximum point of the function representing the protein extraction was therefore analysed by the 2^3 factorial experiment. The results showed that all the variables were important to the extraction. The statistical analysis indicated that the studied ranges of pH, temperature, and percentage of sodium chloride were not sufficient for optimizing the extraction process, since it was not possible to obtain the inflection point of the mathematical function; on the other hand, the mathematical model was significant as well as predictive.
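A sketch of how a first-order model with interactions is fitted to a 2^3 factorial design is given below; the coded design is standard, but the response values are invented for illustration and are not the study's measurements.

```python
# Sketch: fitting a first-order model with interactions to a 2^3 factorial
# design in coded units (-1/+1). Factors: temperature (x1), pH (x2), NaCl % (x3).
import itertools
import numpy as np

design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical protein-extraction percentages for the 8 runs
y = np.array([3.1, 4.0, 3.4, 4.6, 4.2, 5.4, 4.5, 6.0])

x1, x2, x3 = design.T
X = np.column_stack([np.ones(8), x1, x2, x3,
                     x1 * x2, x1 * x3, x2 * x3, x1 * x2 * x3])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
names = ["b0", "b1", "b2", "b3", "b12", "b13", "b23", "b123"]
for n, c in zip(names, coef):
    print(f"{n:>4} = {c:+.3f}")
# With 8 runs and 8 coefficients the model is saturated and has no curvature
# terms, which is why such a design alone cannot locate a maximum - consistent
# with the abstract's conclusion that the studied ranges were insufficient.
```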
Abstract:
This article considers alternative methods to calculate the fair premium rate of crop insurance contracts based on county yields. The premium rate was calculated using parametric and nonparametric approaches to estimate the conditional agricultural yield density. These methods were applied to a data set of county yields provided by the Brazilian Institute of Geography and Statistics (IBGE) for the period of 1990 through 2002, for soybean, corn, and wheat in the State of Paraná. In this article, we propose methodological alternatives for pricing crop insurance contracts, resulting in more accurate premium rates in a situation of limited data.
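A simplified version of the premium-rate calculation, comparing a parametric (normal) and a nonparametric (kernel) estimate of the yield density, is sketched below with a hypothetical yield series in place of the IBGE data.

```python
# Sketch: fair premium rate of a county-yield insurance contract under a
# parametric (normal) and a nonparametric (Gaussian kernel) yield density.
import numpy as np
from scipy import stats

yields = np.array([2.31, 2.55, 1.98, 2.72, 2.60, 2.10, 2.85, 2.40,
                   1.75, 2.68, 2.95, 2.20, 2.50])   # t/ha, hypothetical, detrended

coverage = 0.70                       # coverage level (70% of expected yield)
guarantee = coverage * yields.mean()
n_draws = 200_000
rng = np.random.default_rng(7)

# Parametric: normal density fitted to the historical yields
y_norm = rng.normal(yields.mean(), yields.std(ddof=1), n_draws)

# Nonparametric: Gaussian kernel density estimate, sampled directly
kde = stats.gaussian_kde(yields)
y_kde = kde.resample(n_draws, seed=7).ravel()

def premium_rate(simulated):
    indemnity = np.maximum(guarantee - simulated, 0.0)
    return indemnity.mean() / guarantee      # fair rate = expected loss / liability

print(f"Normal-density rate : {premium_rate(y_norm):.3%}")
print(f"Kernel-density rate : {premium_rate(y_kde):.3%}")
```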
Abstract:
Probable consequences of the mitigation of the citrus canker eradication methodology in Sao Paulo state. Recently, the Sao Paulo state government mitigated the citrus canker eradication methodology it had adopted since 1999. In April 2009, at least 99.8% of commercial sweet orange orchards were free of citrus canker in Sao Paulo state. Consequently, the mitigation of the eradication methodology reduced the high level of safety and the competitiveness of the citrus production sector in Sao Paulo state, Brazil. We therefore suggest the re-adoption of the same eradication methodology for citrus canker adopted in Sao Paulo from 1999 to 2009, or the adoption of a new methodology effective for citrus canker suppression, because new sample surveys detected citrus canker in >0.36% of orchards. This incidence threshold was calculated by using the Duncan test (P <= 0.05) to compare the yearly sample surveys conducted in Sao Paulo state to estimate citrus canker incidence between 1999 and 2009. The calculated minimum significant difference among sample surveys was 0.28%, and the lowest citrus canker incidence in Sao Paulo state was 0.08%, occurring in 2001. Thus, as an alternative, we suggest the adoption of a new eradication methodology for citrus canker suppression whenever a new sample survey detects >0.36% of orchards affected in Sao Paulo state, Brazil.
Abstract:
The objective of this study was to develop a dessert containing soy protein (SP) (1%, 2%, 3%) and guava juice (GJ) (22%, 27%, 32%) using Response Surface Methodology (RSM) as the optimisation technique. Water activity, physical stability, colour, acidity, pH, iron, and carotenoid contents were analysed. Affective tests were performed to determine the degree of liking of colour, creaminess, and acceptability. The results showed that GJ increased the values of redness, hue angle, chromaticity, acidity, and carotenoid content, while SP reduced water activity. Optimisation suggested a dessert containing 32% GJ and 1.17% SP as the best proportion of these components. This sample was considered a source of fibre, ascorbic acid, copper, and iron, and garnered scores above the level of 'slightly liked' for the sensory attributes. Moreover, RSM was shown to be an adequate approach for modelling the physicochemical parameters and the degree of liking of creaminess of the desserts. (C) 2010 Elsevier Ltd. All rights reserved.
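The following sketch fits a second-order response-surface model in the two factors (GJ and SP, in coded units) and locates its stationary point; the design points and response values are illustrative, not the study's data.

```python
# Sketch: second-order RSM model in two factors, guava juice (GJ) and soy
# protein (SP), with a hypothetical liking response on a 9-point scale.
import numpy as np

gj = np.array([-1, -1,  1,  1, -1.41, 1.41,  0,    0,    0, 0, 0])
sp = np.array([-1,  1, -1,  1,  0,    0,    -1.41, 1.41, 0, 0, 0])
y  = np.array([5.8, 5.1, 6.9, 6.0, 5.5, 6.6, 6.2, 5.4, 6.8, 6.7, 6.9])

X = np.column_stack([np.ones_like(gj), gj, sp, gj * sp, gj**2, sp**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12, b11, b22 = coef
print("y ~ {:.2f} {:+.2f}*GJ {:+.2f}*SP {:+.2f}*GJ*SP {:+.2f}*GJ^2 {:+.2f}*SP^2"
      .format(b0, b1, b2, b12, b11, b22))

# Stationary point of the fitted quadratic surface (candidate optimum)
B = np.array([[2 * b11, b12], [b12, 2 * b22]])
x_s = np.linalg.solve(B, -np.array([b1, b2]))
print(f"Stationary point (coded units): GJ = {x_s[0]:.2f}, SP = {x_s[1]:.2f}")
```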
Abstract:
Desserts made with soy cream, which are oil-in-water emulsions, are widely consumed by lactose-intolerant individuals in Brazil. In this regard, this study aimed at using response surface methodology (RSM) to optimize the sensory attributes of a soy-based emulsion over a range of pink guava juice (GJ: 22% to 32%) and soy protein (SP: 1% to 3%) contents. Water-holding capacity (WHC) and backscattering were analyzed after 72 h of storage at 7 degrees C. Furthermore, a rating test was performed to determine the degree of liking of color, taste, creaminess, appearance, and overall acceptability. The data showed that the samples were stable against gravity and storage. The models developed by RSM adequately described the creaminess, taste, and appearance of the emulsions. The response surface of the desirability function was used successfully in the optimization of the sensory properties of the dairy-free emulsions, suggesting that a product with 30.35% GJ and 3% SP was the best combination of these components. The optimized sample presented suitable sensory properties, in addition to being a source of dietary fiber, iron, copper, and ascorbic acid.
Abstract:
The antioxidant activity of natural and synthetic compounds was evaluated using five in vitro methods: ferric reducing/antioxidant power (FRAP), 2,2-diphenyl-1-picrylhydrazyl (DPPH), oxygen radical absorbance capacity (ORAC), oxidation of an aqueous dispersion of linoleic acid accelerated by azo-initiators (LAOX), and oxidation of a meat homogenate submitted to a thermal treatment (TBARS). All results were expressed as Trolox equivalents. The application of multivariate statistical techniques suggested that the phenolic compounds (caffeic acid, carnosic acid, genistein, and resveratrol), in addition to their high antioxidant activity measured by the DPPH, FRAP, and TBARS methods, showed the highest ability to react with the radicals in the ORAC methodology, compared to the other compounds evaluated in this study (ascorbic acid, erythorbate, tocopherol, BHT, Trolox, tryptophan, citric acid, EDTA, glutathione, lecithin, methionine, and tyrosine). This property was significantly correlated with the number of phenolic rings and the catecholic structure present in the molecule. Based on the multivariate analysis, it is possible to select compounds from different clusters and explore their antioxidant activity interactions in food products.
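The multivariate treatment could resemble the sketch below, which standardizes a compounds-by-assays activity matrix and clusters the compounds hierarchically; the activity values are invented for illustration and cover only a subset of the compounds named above.

```python
# Sketch: hierarchical clustering of compounds described by their Trolox-
# equivalent activities in the five assays (values are made up).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

compounds = ["caffeic acid", "carnosic acid", "genistein", "resveratrol",
             "ascorbic acid", "BHT", "tocopherol", "glutathione"]
# Rows: compounds; columns: FRAP, DPPH, ORAC, LAOX, TBARS (Trolox equivalents)
activity = np.array([
    [2.9, 3.1, 4.0, 1.2, 2.5],
    [2.5, 2.8, 3.6, 1.5, 2.7],
    [1.8, 2.0, 3.2, 0.9, 1.9],
    [2.2, 2.4, 3.8, 1.1, 2.1],
    [1.5, 1.7, 0.9, 0.6, 0.8],
    [1.2, 1.4, 1.1, 1.8, 1.0],
    [1.0, 1.1, 1.3, 1.6, 0.9],
    [0.4, 0.5, 0.6, 0.3, 0.4],
])

Z = linkage(zscore(activity, axis=0), method="ward")
groups = fcluster(Z, t=3, criterion="maxclust")
for name, g in sorted(zip(compounds, groups), key=lambda x: x[1]):
    print(f"cluster {g}: {name}")
```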
Abstract:
This study describes an accurate, sensitive, and specific chromatographic method for the simultaneous quantitative determination of lamivudine and zidovudine in human blood plasma, using stavudine as an internal standard. The chromatographic separation was performed using a C8 column (150 x 4.6 mm, 5 mu m) and ultraviolet absorbance detection at 270 nm with gradient elution. Two mobile phases were used: Phase A contained 10 mM potassium phosphate and 3% acetonitrile, whereas Phase B contained methanol. A linear gradient was used, with the A:B proportion varying from 98:2% to 72:28%. The drug extraction was performed with two 4 mL aliquots of ethyl acetate.
Abstract:
Exposure to oxygen may impair the functionality of probiotic dairy foods because probiotic bacteria are anaerobic, and the maintenance of their viability during storage, required to provide benefits to consumer health, is compromised. Glucose oxidase can constitute a potential alternative to increase the survival of probiotic bacteria in yogurt because it consumes the oxygen permeating into the pot during storage, thus making it possible to avoid the use of chemical additives. This research aimed to optimize the processing of probiotic yogurt supplemented with glucose oxidase using response surface methodology and to determine the levels of glucose and glucose oxidase that minimize the concentration of dissolved oxygen and maximize the Bifidobacterium longum count through the desirability function. The response surface methodology mathematical models adequately described the process, with adjusted determination coefficients of 83% for the oxygen model and 94% for the B. longum model. Linear and quadratic effects of glucose oxidase were observed in the oxygen model, whereas the B. longum count model showed a linear effect of glucose oxidase, followed by quadratic effects of glucose and of glucose oxidase. The desirability function indicated that 62.32 ppm of glucose oxidase and 4.35 ppm of glucose were the best combination of these components for the optimization of probiotic yogurt processing. An additional validation experiment was performed, and the results showed acceptable error between the predicted and experimental values.
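A sketch of a desirability-function optimization over the two factors (glucose oxidase and glucose) is given below; the quadratic response models and desirability ranges are hypothetical placeholders, not the fitted models of the study.

```python
# Sketch: desirability-function optimization over glucose oxidase (GOX) and
# glucose levels, minimizing dissolved oxygen and maximizing B. longum counts.
import numpy as np
from scipy.optimize import minimize

def oxygen(x):            # ppm of dissolved O2 (to be minimized); placeholder model
    gox, glu = x
    return 8.0 - 0.10 * gox + 0.0007 * gox**2 + 0.05 * glu

def b_longum(x):          # log CFU/g (to be maximized); placeholder model
    gox, glu = x
    return 6.0 + 0.030 * gox - 0.00022 * gox**2 + 0.15 * glu - 0.015 * glu**2

def desirability(x):
    # Linear Derringer-type desirabilities over assumed acceptable ranges
    d_o2 = np.clip((6.0 - oxygen(x)) / (6.0 - 1.0), 0, 1)       # smaller is better
    d_bl = np.clip((b_longum(x) - 6.0) / (8.0 - 6.0), 0, 1)     # larger is better
    return (d_o2 * d_bl) ** 0.5                                  # geometric mean

res = minimize(lambda x: -desirability(x), x0=[50.0, 5.0],
               bounds=[(0.0, 100.0), (0.0, 10.0)])
gox_opt, glu_opt = res.x
print(f"Optimal GOX = {gox_opt:.1f} ppm, glucose = {glu_opt:.1f} ppm, "
      f"overall desirability = {desirability(res.x):.2f}")
```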
Abstract:
Despite the increase in the use of natural compounds in place of synthetic derivatives as antioxidants in food products, the extent of this substitution is limited by cost constraints. Thus, the objective of this study was to explore synergism in the antioxidant activity of natural compounds for further application in food products. Three hydrosoluble compounds (x1 = caffeic acid, x2 = carnosic acid, and x3 = glutathione) and three liposoluble compounds (x1 = quercetin, x2 = rutin, and x3 = genistein) were mixed according to a simplex-centroid design. The antioxidant activity of the mixtures was analyzed by the ferric reducing antioxidant power (FRAP) and oxygen radical absorbance capacity (ORAC) methodologies, and activity was also evaluated in an oxidized mixed micelle prepared with linoleic acid (LAOX). Cubic polynomial models with predictive capacity were obtained when the mixtures were submitted to the LAOX methodology (ŷ = 0.56x1 + 0.59x2 + 0.04x3 + 0.41x1x2 - 0.41x1x3 - 1.12x2x3 - 4.01x1x2x3) for the hydrosoluble compounds, and to the FRAP methodology (ŷ = 3.26x1 + 2.39x2 + 0.04x3 + 1.51x1x2 + 1.03x1x3 + 0.29x2x3 + 3.20x1x2x3) for the liposoluble compounds. Optimization of the models suggested that a mixture containing 47% caffeic acid + 53% carnosic acid and a mixture containing 67% quercetin + 33% rutin were potential synergistic combinations for further evaluation using a food matrix.
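Taking the LAOX model for the hydrosoluble compounds as reported above, a simple grid search over the ternary simplex reproduces the kind of optimization described; only the model coefficients come from the abstract, the search itself is an illustrative sketch.

```python
# Sketch: evaluating the special-cubic mixture model reported for the
# hydrosoluble compounds (LAOX response) over a grid of the ternary simplex
# (x1 = caffeic acid, x2 = carnosic acid, x3 = glutathione).
import numpy as np

def y_laox(x1, x2, x3):
    return (0.56 * x1 + 0.59 * x2 + 0.04 * x3
            + 0.41 * x1 * x2 - 0.41 * x1 * x3 - 1.12 * x2 * x3
            - 4.01 * x1 * x2 * x3)

best, best_mix = -np.inf, None
step = 0.01
for i in range(int(1 / step) + 1):
    for j in range(int(1 / step) + 1 - i):
        x1, x2 = i * step, j * step
        x3 = 1.0 - x1 - x2
        val = y_laox(x1, x2, x3)
        if val > best:
            best, best_mix = val, (x1, x2, x3)

print("Best mixture (x1, x2, x3): "
      f"{best_mix[0]:.2f}, {best_mix[1]:.2f}, {best_mix[2]:.2f}; "
      f"predicted response = {best:.3f}")
```

The grid search locates a binary caffeic/carnosic acid blend near 46/54, consistent with the 47% + 53% optimum quoted in the abstract.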