941 results for Safety data recording


Relevância:

20.00%

Resumo:

This paper presents results of laboratory testing of unrestrained drying shrinkage, over a period of 154 days, of different concrete mixtures from the Brazilian production line that utilize ground granulated blast-furnace slag in their compositions. Three concrete mixtures with water/cement ratios of 0.78 (M1), 0.41 (M2) and 0.37 (M3) were studied. The experimental data were compared with analytical results from prediction models available in the literature: the ACI 209 model (ACI), the B3 model (B3), the Eurocode 2 model (EC2), the GL 2000 model (GL) and the Brazilian NBR 6118 model (NBR), and the accuracy of these models was evaluated against the experimental data. In addition, the development of the mechanical properties (compressive strength and modulus of elasticity) of the studied concrete mixtures was measured in the laboratory up to 126 days. From this study, it could be concluded that the ACI and GL models best approximated the experimental drying shrinkage data measured during the analyzed period.
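As an illustration of how such prediction models express shrinkage as a function of drying time, the following sketch implements the basic ACI 209 time function. The ultimate shrinkage strain (780e-6) and the 35-day constant are the standard ACI 209 defaults for 7-day moist curing, not values taken from this study.

```python
def aci209_shrinkage(t_days, eps_shu=780e-6, f=35.0):
    """ACI 209 drying-shrinkage strain after t_days of drying.

    eps_shu : ultimate shrinkage strain (ACI 209 default before
              correction factors); f = 35 days for 7-day moist curing.
    """
    return (t_days / (f + t_days)) * eps_shu

# Shrinkage approaches eps_shu asymptotically; at t = f it is half of eps_shu
print(aci209_shrinkage(35.0))
print(aci209_shrinkage(154.0))  # horizon used in the study
```

Correction factors for humidity, member size and mix composition would scale eps_shu in a full ACI 209 calculation.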

Relevância:

20.00%

Resumo:

The motivation for this research is to compare the dynamic behavior of a free railway wheelset during derailment with safety limits. For this purpose, a numerical simulation of a wheelset derailment under increasing lateral force is compared with the safety limit using different criteria. A simplified wheelset model is used to simulate derailments under different adhesion conditions. The contact force components, including the longitudinal and spin effects, are identified in a steady-state condition on the verge of derailment. The contact force ratios are used in a three-dimensional (3D) analytical formula to calculate the safety limits. Simulation results obtained with two contact methods were compared with published results, and the safety limit was identified with the two criteria. Results confirm Nadal's conservative character and show that the 3D analytical safety formula yields slightly higher safety limits for lower friction coefficients and smaller limits for high friction, in comparison with the simulation results obtained with Fastsim.
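The Nadal single-wheel criterion referred to above has a simple closed form, L/V = (tan δ − μ)/(1 + μ·tan δ). A minimal sketch follows; the flange angle and friction values are illustrative, not taken from the paper.

```python
import math

def nadal_limit(flange_angle_deg, mu):
    """Nadal single-wheel L/V derailment limit:
    L/V = (tan(delta) - mu) / (1 + mu * tan(delta))."""
    t = math.tan(math.radians(flange_angle_deg))
    return (t - mu) / (1.0 + mu * t)

# Higher friction lowers the admissible L/V ratio, which is why the
# criterion is conservative at high friction coefficients.
print(nadal_limit(68.0, 0.36))  # roughly 1.12 for a 68 deg flange angle
```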

Relevância:

20.00%

Resumo:

Dynamic vehicle behavior is used to identify safe traffic speed limits. The proposed methodology is based on the vehicle's vertical wheel contact force response, excited by measured pavement irregularities, in the frequency domain. A quarter-car model is used to identify vehicle dynamic behavior. The vertical elevation of an unpaved road surface was measured, and its roughness spectral density was quantified as ISO Level C. The vehicle inertance function was derived from the vertical contact force transfer function weighted by the pavement roughness spectral density in the frequency domain. The statistical contact load variation is obtained by integrating the vehicle inertance density function. The vehicle safety concept is based on handling ability: the ability to generate tangential forces at the wheel/road contact interface is the key to vehicle handling, and this ability is related to tire/pavement contact forces. A contribution to establishing a traffic safety speed limit is obtained from the likelihood of loss of driveability. The results show that at speeds faster than 25 km/h tire contact loss becomes likely when traveling on the measured road type. DOI: 10.1061/(ASCE)TE.1943-5436.0000216. (C) 2011 American Society of Civil Engineers.
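A sketch of the frequency-domain procedure described: a quarter-car transfer function from road elevation to dynamic tire contact force, weighted by an ISO 8608 class C roughness PSD and integrated to give the contact load standard deviation. All vehicle parameters are hypothetical; only the overall structure follows the abstract.

```python
import numpy as np

# Hypothetical quarter-car parameters (illustrative, not from the paper)
ms, mun = 400.0, 40.0               # sprung / unsprung mass [kg]
ks, cs, kt = 20e3, 1.5e3, 180e3     # suspension stiffness, damping, tire stiffness

def tire_force_frf(f):
    """|H(f)| from road elevation [m] to dynamic tire contact force [N]."""
    w = 2.0 * np.pi * np.asarray(f)
    a = ks + 1j * w * cs                                   # suspension term
    x2 = kt / (a + kt - mun * w**2 - a**2 / (a - ms * w**2))
    return np.abs(kt * (x2 - 1.0))                         # dynamic contact force

def contact_force_std(v, Gd0=256e-6, n0=0.1):
    """Std of contact force at speed v [m/s] on an ISO 8608 class C road.
    Temporal roughness PSD: Gd(f) = Gd0 * n0**2 * v / f**2."""
    f = np.linspace(0.5, 30.0, 4000)
    psd = Gd0 * n0**2 * v / f**2
    return float(np.sqrt(np.sum(tire_force_frf(f)**2 * psd) * (f[1] - f[0])))

print(contact_force_std(25 / 3.6))  # contact force scatter at 25 km/h
```

Because the temporal PSD scales linearly with speed, the contact force standard deviation grows with the square root of speed; comparing it against the static wheel load gives a contact-loss likelihood, in the spirit of the paper's criterion.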

Relevância:

20.00%

Resumo:

This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled. (C) 2010 Elsevier Ltd. All rights reserved.
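The series-configuration reliability described above (operator, machine-tool and cutting tool per operation, multiplied across the sequence) can be sketched as follows. The Weibull parameters and component reliability values are hypothetical, and the tool-change rule is a simple threshold check rather than the paper's algorithm.

```python
import math

def tool_reliability(t, beta, eta):
    """Weibull reliability of a cutting tool after cutting time t."""
    return math.exp(-((t / eta) ** beta))

def process_reliability(ops, r_operator=0.999, r_machine=0.995):
    """Series reliability of a sequence of machining operations.
    ops: list of (cutting_time, beta, eta) tuples, one per operation."""
    r = 1.0
    for t, beta, eta in ops:
        r *= r_operator * r_machine * tool_reliability(t, beta, eta)
    return r

def needs_change(t, beta, eta, target=0.90):
    """Flag a tool change when its reliability falls below the target."""
    return tool_reliability(t, beta, eta) < target

ops = [(10.0, 2.0, 60.0), (5.0, 1.5, 40.0)]  # turning, drilling (illustrative)
print(process_reliability(ops))
```

In a series configuration every added operation can only lower the overall reliability, which is why the cutting time per job and the operation sequence dominate the result.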

Relevância:

20.00%

Resumo:

Safety Instrumented Systems (SIS) are designed to prevent and/or mitigate accidents, avoiding undesirable high-potential-risk scenarios, assuring protection of people's health, protecting the environment and saving costs of industrial equipment. The design of these systems requires formal methods to ensure the safety requirements, but according to the material published in this area, no consolidated procedure has been identified for this task. In this sense, this article introduces a formal method for the diagnosis and treatment of critical faults based on Bayesian networks (BN) and Petri nets (PN). This approach considers diagnosis and treatment for each safety instrumented function (SIF), including a hazard and operability (HAZOP) study of the equipment or system under control. It also uses BN and Behavioral Petri nets (BPN) for diagnosis and decision-making, and PN for the synthesis, modeling and control to be implemented by a safety Programmable Logic Controller (PLC). An application example considering the diagnosis and treatment of critical faults is presented and illustrates the proposed methodology.

Relevância:

20.00%

Resumo:

In this paper, processing methods of Fourier optics implemented in a digital holographic microscopy system are presented. The proposed methodology is based on the ability of digital holography to carry out the complete reconstruction of the recorded wave front and, consequently, the determination of the phase and intensity distribution in any arbitrary plane located between the object and the recording plane. In this way, in digital holographic microscopy the field produced by the objective lens can be reconstructed along its propagation, allowing the reconstruction of the back focal plane of the lens, so that the complex amplitudes of the Fraunhofer diffraction, or equivalently the Fourier transform, of the light distribution across the object can be known. Manipulation of the Fourier transform plane makes it possible to design digital methods of optical processing and image analysis. The proposed method has great practical utility and represents a powerful tool in image analysis and data processing. The theoretical aspects of the method are presented, and its validity has been demonstrated using computer-generated holograms and simulated images of microscopic objects. (c) 2007 Elsevier B.V. All rights reserved.
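The Fourier-plane manipulation described can be mimicked numerically with FFTs: transform the field, apply a mask in the transform plane, and transform back. The object and the circular low-pass mask below are synthetic illustrations, not the paper's holographic data.

```python
import numpy as np

def fourier_filter(field, radius):
    """Low-pass filter a complex field by masking its Fourier plane."""
    F = np.fft.fftshift(np.fft.fft2(field))          # centered spectrum
    n = field.shape[0]
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    mask = (x**2 + y**2) <= radius**2                # circular aperture
    return np.fft.ifft2(np.fft.ifftshift(F * mask))

n = 64
obj = np.zeros((n, n))
obj[24:40, 24:40] = 1.0          # synthetic amplitude object
out = fourier_filter(obj, radius=8)
```

Inverting the mask gives a high-pass, edge-enhancing filter, the classic 4f-processor operation that a digitally accessible Fourier plane makes available.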

Relevância:

20.00%

Resumo:

Thermodynamic properties of bread dough (fusion enthalpy, apparent specific heat, initial freezing point and unfreezable water) were measured at temperatures from -40 °C to 35 °C using differential scanning calorimetry. The initial freezing point was also calculated based on the water activity of the dough. The apparent specific heat varied as a function of temperature: the specific heat in the freezing region varied from 1.7 to 23.1 J g⁻¹ °C⁻¹, and was constant at temperatures above freezing (2.7 J g⁻¹ °C⁻¹). Unfreezable water content varied from 0.174 to 0.182 g/g of total product. Values of heat capacity as a function of temperature were correlated using thermodynamic models. A modification for low-moisture foodstuffs (such as bread dough) was successfully applied to the experimental data. (C) 2010 Elsevier Ltd. All rights reserved.

Relevância:

20.00%

Resumo:

In this work, an axisymmetric two-dimensional finite element model was developed to simulate instrumented indentation testing of thin ceramic films deposited onto hard steel substrates. The level of film residual stress (sigma(r)), the film elastic modulus (E) and the film work hardening exponent (n) were varied to analyze their effects on indentation data. These numerical results were used to analyze experimental data obtained with titanium nitride coated specimens, in which the substrate bias applied during deposition was modified to obtain films with different levels of sigma(r). Good qualitative correlation was obtained when numerical and experimental results were compared, as long as all film properties were considered in the analyses, and not only sigma(r). The numerical analyses were also used to further understand the effect of sigma(r) on the mechanical properties calculated from instrumented indentation data. In this case, the hardness values obtained based on real or calculated contact areas are similar only when sink-in occurs, i.e. with high n or a high ratio Y/E, where Y is the yield strength of the film. In an additional analysis, four ratios (R/h(max)) between indenter tip radius and maximum penetration depth were simulated to analyze the combined effects of R and sigma(r) on the indentation load-displacement curves. In this case, sigma(r) did not significantly affect the load curve exponent, which was affected only by the indenter tip radius. On the other hand, the proportional curvature coefficient was significantly affected by sigma(r) and n. (C) 2010 Elsevier B.V. All rights reserved.

Relevância:

20.00%

Resumo:

For the first time, we introduce and study some mathematical properties of the Kumaraswamy Weibull distribution that is a quite flexible model in analyzing positive data. It contains as special sub-models the exponentiated Weibull, exponentiated Rayleigh, exponentiated exponential, Weibull and also the new Kumaraswamy exponential distribution. We provide explicit expressions for the moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are derived for the mean deviations, Bonferroni and Lorenz curves, reliability and Renyi entropy. The moments of the order statistics are calculated. We also discuss the estimation of the parameters by maximum likelihood. We obtain the expected information matrix. We provide applications involving two real data sets on failure times. Finally, some multivariate generalizations of the Kumaraswamy Weibull distribution are discussed. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
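A sketch of the distribution's CDF, in the four-parameter form F(x) = 1 − [1 − G(x)^a]^b with Weibull baseline G(x) = 1 − exp(−(λx)^c); the parameter values below are illustrative only. Setting a = b = 1 recovers the Weibull, and b = 1 alone gives the exponentiated Weibull sub-model.

```python
import math

def kw_weibull_cdf(x, a, b, c, lam):
    """Kumaraswamy Weibull CDF: F(x) = 1 - [1 - G(x)**a]**b,
    with Weibull baseline G(x) = 1 - exp(-(lam * x)**c)."""
    G = 1.0 - math.exp(-((lam * x) ** c))
    return 1.0 - (1.0 - G ** a) ** b

# a = b = 1 reduces to the ordinary Weibull CDF: 1 - exp(-(lam*x)**c)
print(kw_weibull_cdf(1.0, 1.0, 1.0, 2.0, 1.0))
```

The two extra shape parameters a and b act on the baseline CDF, which is what gives the family its flexibility over the plain Weibull.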

Relevância:

20.00%

Resumo:

Estimation of Taylor's power law for species abundance data may be performed by linear regression of the log empirical variances on the log means, but this method suffers from a problem of bias for sparse data. We show that the bias may be reduced by using a bias-corrected Pearson estimating function. Furthermore, we investigate a more general regression model allowing for site-specific covariates. This method may be efficiently implemented using a Newton scoring algorithm, with standard errors calculated from the inverse Godambe information matrix. The method is applied to a set of biomass data for benthic macrofauna from two Danish estuaries. (C) 2011 Elsevier B.V. All rights reserved.
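The naive log-log regression that the paper identifies as biased for sparse data can be sketched as follows, using synthetic negative binomial counts (all numbers illustrative); the bias-corrected Pearson estimating function itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic abundance counts at five sites; negative binomial counts obey
# variance = m + m**2/5, so the Taylor exponent lies between 1 and 2.
means, variances = [], []
for m in [2.0, 5.0, 10.0, 20.0, 50.0]:
    counts = rng.negative_binomial(5, 5.0 / (5.0 + m), size=200)
    means.append(counts.mean())
    variances.append(counts.var(ddof=1))

# Naive estimator: OLS of log variance on log mean
# (Taylor's law: variance = a * mean**b, so log V = log a + b log m)
b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
print(b)  # slope = Taylor exponent estimate
```

With many zero counts per site the empirical log variances become unstable, which is the sparse-data bias the estimating-function approach is designed to reduce.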

Relevância:

20.00%

Resumo:

Interval-censored survival data, in which the event of interest is not observed exactly but is only known to occur within some time interval, arise very frequently. In some situations, event times might be censored into different, possibly overlapping intervals of variable widths; in other situations, information is available for all units at the same observed visit times. In the latter case, interval-censored data are termed grouped survival data. Here we present alternative approaches for analyzing interval-censored data. We illustrate these techniques using a survival data set involving mango tree lifetimes, an example of grouped survival data.
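For the grouped case, where every unit is inspected at the same visit times, the classical actuarial life-table estimator applies. A minimal sketch with hypothetical counts follows (the mango-tree data are not reproduced here).

```python
def life_table_survival(deaths, withdrawals, n_start):
    """Actuarial life-table survival estimate for grouped data.
    deaths[i], withdrawals[i]: events and censorings in interval i;
    returns the estimated S(t) at the end of each interval."""
    s, at_risk, surv = 1.0, n_start, []
    for d, w in zip(deaths, withdrawals):
        eff = at_risk - w / 2.0        # standard half-withdrawal correction
        s *= 1.0 - d / eff             # conditional survival of the interval
        surv.append(s)
        at_risk -= d + w
    return surv

print(life_table_survival([5, 8, 4], [2, 1, 3], 100))
```

Because all units share the same inspection schedule, ties are resolved at the interval level, which is what distinguishes grouped data from general interval censoring.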

Relevância:

20.00%

Resumo:

This paper proposes a regression model considering the modified Weibull distribution. This distribution can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and Jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and we also present some ways to perform global influence. Besides, for different parameter settings, sample sizes and censoring percentages, various simulations are performed and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended for a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model checking based on the modified deviance residual are performed to select appropriate models. (c) 2008 Elsevier B.V. All rights reserved.
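The bathtub-shaped failure rate that motivates the modified Weibull distribution can be seen directly from its hazard function, h(t) = a·t^(b−1)·(b + λt)·e^(λt) in the Lai et al. form; with 0 < b < 1 and λ > 0 the hazard first decreases and then increases. The parameter values below are illustrative.

```python
import math

def mw_hazard(t, a, b, lam):
    """Modified Weibull hazard: h(t) = a * t**(b-1) * (b + lam*t) * exp(lam*t)."""
    return a * t ** (b - 1.0) * (b + lam * t) * math.exp(lam * t)

# With b = 0.5 and lam = 0.2 the hazard is bathtub-shaped:
# high early (infant mortality), low in mid-life, rising with wear-out.
for t in (0.05, 0.5, 5.0):
    print(t, mw_hazard(t, 1.0, 0.5, 0.2))
```

Setting λ = 0 recovers the ordinary Weibull hazard, which is monotone and therefore cannot produce the bathtub shape.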

Relevância:

20.00%

Resumo:

In this study, regression models are evaluated for grouped survival data when the effect of censoring time is considered in the model and the regression structure is modeled through four link functions. The methodology for grouped survival data is based on life tables, and the times are grouped in k intervals so that ties are eliminated. Thus, the data modeling is performed by considering the discrete models of lifetime regression. The model parameters are estimated by using the maximum likelihood and jackknife methods. To detect influential observations in the proposed models, diagnostic measures based on case deletion, termed global influence, and measures based on small perturbations in the data or in the model, termed local influence, are used. In addition to those measures, the total local influence estimate is also employed. Various simulation studies are performed to compare the performance of the four link functions of the regression models for grouped survival data under different parameter settings, sample sizes and numbers of intervals. Finally, a data set is analyzed by using the proposed regression models. (C) 2010 Elsevier B.V. All rights reserved.

Relevância:

20.00%

Resumo:

A four-parameter extension of the generalized gamma distribution capable of modelling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lies in its ability to model monotone and non-monotone failure rate functions, which are quite common in lifetime data analysis and reliability. The new distribution has a number of well-known lifetime special sub-models, such as the exponentiated Weibull, exponentiated generalized half-normal, exponentiated gamma and generalized Rayleigh, among others. We derive two infinite sum representations for its moments. We calculate the density of the order statistics and two expansions for their moments. The method of maximum likelihood is used for estimating the model parameters and the observed information matrix is obtained. Finally, a real data set from the medical area is analysed.

Relevância:

20.00%

Resumo:

Joint generalized linear models and double generalized linear models (DGLMs) were designed to model outcomes for which the variability can be explained using factors and/or covariates. When such factors operate, the usual normal regression models, which inherently exhibit constant variance, will under-represent variation in the data and hence may lead to erroneous inferences. For count and proportion data, such noise factors can generate a so-called overdispersion effect, and the use of binomial and Poisson models underestimates the variability and, consequently, incorrectly indicates significant effects. In this manuscript, we propose a DGLM from a Bayesian perspective, focusing on the case of proportion data, where the overdispersion can be modeled using a random effect that depends on some noise factors. The posterior joint density function was sampled using Markov chain Monte Carlo (MCMC) algorithms, allowing inference on the model parameters. An application to a data set on apple tissue culture is presented, for which it is shown that the Bayesian approach is quite feasible, even when limited prior information is available, thereby generating valuable insight for the researcher about the experimental results.