10 results for Materiali compositi, CFRP, Combined Loading Compression (CLC) test method
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Investigation of the fracture mode for hard and soft wheat endosperm was aimed at gaining a better understanding of the fragmentation process. Fracture mechanical characterization was based on the three-point bending test, which enables stable crack propagation to take place in small rectangular pieces of wheat endosperm. The crack length can be measured in situ by using an optical microscope with light illumination from the side of the specimen or from the back of the specimen. Two new techniques were developed and used to estimate the fracture toughness of wheat endosperm, a geometric approach and a compliance method. The geometric approach gave average fracture toughness values of 53.10 and 27.0 J m⁻² for hard and soft endosperm, respectively. Fracture toughness estimated using the compliance method gave values of 49.9 and 29.7 J m⁻² for hard and soft endosperm, respectively. Compressive properties of the endosperm in three mutually perpendicular axes revealed that the hard and soft endosperms are isotropic composites. Scanning electron microscopy (SEM) observation of the fracture surfaces and the energy-time curves of loading-unloading cycles revealed that there was plastic flow during crack propagation for both the hard and soft endosperms, and confirmed that the fracture mode is significantly related to the adhesion level between starch granules and the protein matrix.
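The compliance method mentioned in the abstract rests on the standard linear-elastic fracture mechanics relation G_c = P_c² / (2B) · dC/da, where C is the specimen compliance at crack length a, P_c the critical load, and B the specimen thickness. A minimal sketch of that calculation (the generic LEFM formula, not the paper's specific calibration; all names and values are illustrative):

```python
import numpy as np

def fracture_toughness_compliance(P_c, B, a, C):
    """Critical strain-energy release rate G_c from the compliance method:
    G_c = P_c^2 / (2 B) * dC/da, with dC/da estimated by finite differences
    from measured compliance C at crack lengths a."""
    a = np.asarray(a, float)
    C = np.asarray(C, float)
    dCda = np.gradient(C, a)          # numerical derivative of compliance
    return (P_c ** 2) / (2.0 * B) * dCda

# Illustrative: compliance growing linearly with crack length
a = np.linspace(0.001, 0.005, 5)      # crack lengths (m)
C = 2.0 * a                           # compliance (m/N), dC/da = 2
G = fracture_toughness_compliance(10.0, 0.004, a, C)
```

With a linear C(a) the derivative is exact, so every entry of `G` equals P_c²·(dC/da)/(2B); in practice C(a) comes from repeated loading-unloading cycles at measured crack lengths.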
Abstract:
Resistance baselines were obtained for the first generation anticoagulant rodenticides chlorophacinone and diphacinone using laboratory, caesarian-derived Norway rats (Rattus norvegicus) as the susceptible strain and the blood clotting response test method. The ED99 estimates for a quantal response were: chlorophacinone, males 0.86 mg kg−1, females 1.03 mg kg−1; diphacinone, males 1.26 mg kg−1, females 1.60 mg kg−1. The dose-response data also showed that chlorophacinone was significantly (p<0.0001) more potent than diphacinone for both male and female rats, and that male rats were more susceptible than females to both compounds (p<0.002). The ED99 doses were then given to groups of five male and five female rats of the Welsh and Hampshire warfarin-resistant strains. Twenty-four hours later, prothrombin times were slightly elevated in both strains but all the animals were classified as resistant to the two compounds, indicating cross-resistance from warfarin to diphacinone and chlorophacinone. When rats of the two resistant strains were fed for six consecutive days on baits containing either diphacinone or chlorophacinone, many animals survived, indicating that their resistance might enable them to survive treatments with these compounds in the field.
Abstract:
This study examines the numerical accuracy, computational cost, and memory requirements of self-consistent field theory (SCFT) calculations when the diffusion equations are solved with various pseudo-spectral methods and the mean field equations are iterated with Anderson mixing. The different methods are tested on the triply-periodic gyroid and spherical phases of a diblock-copolymer melt over a range of intermediate segregations. Anderson mixing is found to be somewhat less effective with the pseudo-spectral methods than when combined with the full-spectral method, but it nevertheless functions admirably well provided that a large number of histories is used. Of the different pseudo-spectral algorithms, the 4th-order one of Ranjan, Qin and Morse performs best, although not quite as efficiently as the full-spectral method.
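Anderson mixing itself is a generic fixed-point accelerator: it keeps a short history of iterates and residuals and combines them by least squares to produce the next guess. A minimal sketch of the idea on an arbitrary fixed-point map, not the SCFT mean-field equations themselves (the history depth `m` and the tiny regularisation constant are illustrative assumptions):

```python
import numpy as np

def anderson_mixing(g, w0, m=5, tol=1e-10, maxiter=200):
    """Anderson mixing for the fixed point w = g(w), keeping up to m histories.
    A generic sketch; SCFT codes mix the mean fields in essentially this way."""
    w = np.asarray(w0, float)
    W, R = [], []                                  # iterate and residual histories
    for _ in range(maxiter):
        gw = g(w)
        r = gw - w                                 # residual of the fixed-point map
        if np.linalg.norm(r) < tol:
            return w
        W.append(gw)
        R.append(r)
        if len(R) > m:                             # drop the oldest history entry
            W.pop(0)
            R.pop(0)
        Rm = np.column_stack(R)
        # least-squares mixing coefficients constrained to sum to one
        M = Rm.T @ Rm + 1e-12 * np.eye(Rm.shape[1])
        c = np.linalg.solve(M, np.ones(Rm.shape[1]))
        c /= c.sum()
        w = np.column_stack(W) @ c                 # mixed next iterate
    return w

# Illustrative linear fixed point: w = A w + b, solution w* = (I - A)^{-1} b
A = np.diag([0.9, 0.5])
b = np.array([1.0, 1.0])
w_star = anderson_mixing(lambda x: A @ x + b, np.zeros(2))
```

On this contractive linear map plain Picard iteration would need hundreds of steps at contraction factor 0.9; the history-based least-squares combination converges in a handful, which is why a large history helps in the abstract's setting.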
Abstract:
Many key economic and financial series are bounded either by construction or through policy controls. Conventional unit root tests are potentially unreliable in the presence of bounds, since they tend to over-reject the null hypothesis of a unit root, even asymptotically. So far, very little work has been undertaken to develop unit root tests which can be applied to bounded time series. In this paper we address this gap in the literature by proposing unit root tests which are valid in the presence of bounds. We present new augmented Dickey–Fuller type tests as well as new versions of the modified ‘M’ tests developed by Ng and Perron [Ng, S., Perron, P., 2001. Lag length selection and the construction of unit root tests with good size and power. Econometrica 69, 1519–1554] and demonstrate how these tests, combined with a simulation-based method to retrieve the relevant critical values, make it possible to control size asymptotically. A Monte Carlo study suggests that the proposed tests perform well in finite samples. Moreover, the tests outperform the Phillips–Perron type tests originally proposed in Cavaliere [Cavaliere, G., 2005. Limited time series with a unit root. Econometric Theory 21, 907–945]. An illustrative application to U.S. interest rate data is provided.
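The simulation-based retrieval of critical values can be illustrated in miniature: simulate the Dickey–Fuller t-statistic under a unit root regulated at known bounds, and read off the desired quantile of the simulated null distribution. This is only a toy sketch of the general idea, not the authors' procedure (no augmentation lags, a crude truncation at the bounds, and illustrative parameter values throughout):

```python
import numpy as np

def adf_t_stat(y):
    """Dickey-Fuller t-statistic (no augmentation, intercept included via
    demeaning) for the null hypothesis of a unit root."""
    dy = np.diff(y)
    ylag = y[:-1]
    dy = dy - dy.mean()
    ylag = ylag - ylag.mean()
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 2)
    return rho / np.sqrt(s2 / (ylag @ ylag))

def simulated_critical_value(T, lower, upper, alpha=0.05, reps=500, seed=0):
    """Approximate the alpha-level critical value of the DF t-statistic
    under a unit root process truncated at [lower, upper]."""
    rng = np.random.default_rng(seed)
    stats = np.empty(reps)
    for r in range(reps):
        e = rng.standard_normal(T)
        y = np.empty(T)
        y[0] = 0.0
        for t in range(1, T):
            y[t] = min(max(y[t - 1] + e[t], lower), upper)  # bounded random walk
        stats[r] = adf_t_stat(y)
    return np.quantile(stats, alpha)
```

Because the bounds pull the null distribution of the t-statistic to the left, the simulated critical value is more negative than the familiar unbounded Dickey–Fuller value of about -2.86 at the 5% level; using the standard table would therefore over-reject, exactly the problem the abstract describes.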
Abstract:
There is growing interest, especially for trials in stroke, in combining multiple endpoints in a single clinical evaluation of an experimental treatment. The endpoints might be repeated evaluations of the same characteristic or alternative measures of progress on different scales. Often they will be binary or ordinal, and those are the cases studied here. In this paper we take a direct approach to combining the univariate score statistics for comparing treatments with respect to each endpoint. The correlations between the score statistics are derived and used to allow a valid combined score test to be applied. A sample size formula is deduced and application in sequential designs is discussed. The method is compared with an alternative approach based on generalized estimating equations in an illustrative analysis and replicated simulations, and the advantages and disadvantages of the two approaches are discussed.
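The combination step can be sketched as follows: given per-endpoint score statistics that are each asymptotically standard normal under the null, a weighted sum standardised by its variance (computed from the correlation matrix of the statistics) yields a single combined test statistic. A minimal illustration with equal weights; note that the paper derives the correlations between the score statistics rather than taking them as given, so the matrix below is an assumed input:

```python
import numpy as np

def combined_score_test(z, corr):
    """Combine correlated standard-normal score statistics into one statistic.
    z: per-endpoint score statistics; corr: their correlation matrix.
    Uses an equally weighted sum, standardised by its variance."""
    z = np.asarray(z, float)
    corr = np.asarray(corr, float)
    w = np.ones_like(z)                 # equal weights (a simple choice)
    var = w @ corr @ w                  # variance of the weighted sum
    return (w @ z) / np.sqrt(var)
```

For two endpoints each giving z = 2.0 with correlation 0.5, the combined statistic is 4 / sqrt(3) ≈ 2.31, larger than either component alone; ignoring the positive correlation (dividing by sqrt(2) instead of sqrt(3)) would overstate the evidence.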
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble separates into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble over different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement on the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion of the mean value of the function. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
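For reference, the RAW filter modifies the classical Robert-Asselin displacement by splitting it between the current and next time levels with a parameter alpha (alpha = 1 recovers the Robert-Asselin filter). A minimal leapfrog sketch on a linear oscillator, with illustrative parameter values (the filter strength nu and alpha ≈ 0.53 are common textbook choices, not values taken from this dissertation):

```python
import numpy as np

def raw_leapfrog_step(x_prev, x_cur, f, dt, nu=0.2, alpha=0.53):
    """One leapfrog step dx/dt = f(x) with the Robert-Asselin-Williams filter.
    alpha = 1 gives the classical Robert-Asselin filter; alpha ~ 0.53 splits
    the displacement so the mean of the solution is largely preserved.
    Returns (filtered current value, corrected next value)."""
    x_next = x_prev + 2.0 * dt * f(x_cur)               # raw leapfrog step
    d = 0.5 * nu * (x_prev - 2.0 * x_cur + x_next)      # filter displacement
    return x_cur + alpha * d, x_next + (alpha - 1.0) * d

# Illustrative: oscillator dx/dt = i*omega*x, exact solution has |x| = 1
omega, dt = 1.0, 0.01
f = lambda x: 1j * omega * x
x_prev = 1.0 + 0j
x_cur = x_prev + dt * f(x_prev)                         # Euler start-up step
for _ in range(1000):
    x_prev, x_cur = raw_leapfrog_step(x_prev, x_cur, f, dt)
```

The filter damps the spurious computational mode of the leapfrog scheme while leaving the amplitude of the physical oscillation nearly untouched, which is the property behind the unchanged climatology reported above.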
Abstract:
Climate data are used in a number of applications including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. Available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, short periods of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-daily time scale. This involves quality control of the rain gauge data, generation of a locally calibrated version of the TAMSAT rainfall estimates, and combination of these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as, or better than, other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms.
There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
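The merging idea, reduced to its simplest form, is to compute gauge-minus-satellite residuals at the station locations, interpolate them across the grid, and add them to the satellite field. The toy sketch below uses inverse-distance weighting for the interpolation; the actual TAMSAT merging procedure is more sophisticated, and all names and parameters here are illustrative:

```python
import numpy as np

def merge_gauge_satellite(sat_grid, gauge_xy, gauge_vals, grid_xy, power=2.0):
    """Toy gauge-satellite merging by inverse-distance-weighted residuals.
    sat_grid:  satellite rainfall estimate at each grid cell, shape (n,)
    gauge_xy:  station coordinates, shape (m, 2)
    gauge_vals: station rainfall observations, shape (m,)
    grid_xy:   grid-cell coordinates, shape (n, 2)"""
    sat_grid = np.asarray(sat_grid, float)
    gauge_vals = np.asarray(gauge_vals, float)
    # satellite value at each station: take the nearest grid cell
    d_sg = np.linalg.norm(gauge_xy[:, None, :] - grid_xy[None, :, :], axis=2)
    resid = gauge_vals - sat_grid[d_sg.argmin(axis=1)]   # gauge minus satellite
    # inverse-distance weights from every grid cell to every station
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    w /= w.sum(axis=1, keepdims=True)
    merged = sat_grid + w @ resid                        # add interpolated residuals
    return np.maximum(merged, 0.0)                       # rainfall cannot be negative
```

Near stations the merged field reproduces the gauges (the residual weight concentrates on the closest station), while far from any station it relaxes back toward the satellite estimate; this mirrors the abstract's finding that the combined product helps most where stations are sparse.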
Abstract:
BACKGROUND: Monitoring of fruit and vegetable (F&V) intake is fraught with difficulties. Available dietary assessment methods are associated with considerable error, and the use of biomarkers offers an attractive alternative. Few studies to date have examined the use of plasma biomarkers to monitor or predict the F&V intake of volunteers consuming a wide range of intakes from both habitual F&V and manipulated diets. OBJECTIVE: This study tested the hypothesis that an integrated biomarker calculated from a combination of plasma vitamin C, cholesterol-adjusted carotenoid concentration and Ferric Reducing Antioxidant Power (FRAP) had more power to predict F&V intake than each individual biomarker. METHODS: Data from a randomized controlled dietary intervention study [FLAVURS (Flavonoids University of Reading Study); n = 154] in which the test groups observed sequential increases of 2.3, 3.2, and 4.2 portions of F&Vs every 6 wk across an 18-wk period were used in this study. RESULTS: An integrated plasma biomarker was devised that included plasma vitamin C, total cholesterol-adjusted carotenoids, and FRAP values, which better correlated with F&V intake (r = 0.47, P < 0.001) than the individual biomarkers (r = 0.33, P < 0.01; r = 0.37, P < 0.001; and r = 0.14, P = 0.099, respectively). Inclusion of urinary potassium concentration did not significantly improve the correlation. The integrated plasma biomarker predicted F&V intake more accurately than did plasma total cholesterol-adjusted carotenoid concentration, with the difference being significant at visit 2 (P < 0.001) and with a tendency to be significant at visit 1 (P = 0.07). CONCLUSION: Either plasma total cholesterol-adjusted carotenoid concentration or the integrated biomarker could be used to distinguish between high- and moderate-F&V consumers. This trial was registered at www.controlled-trials.com as ISRCTN47748735.
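One plausible way to build such an integrated biomarker is to standardise each marker to a z-score, average the z-scores, and correlate the composite with intake; averaging tends to cancel marker-specific noise, which is why a composite can correlate better than any single marker. The abstract does not give the exact construction used, so the sketch below is purely illustrative:

```python
import numpy as np

def integrated_biomarker(*markers):
    """Combine biomarkers into one score by averaging their z-scores.
    A plausible construction only; the paper's exact formula may differ."""
    zs = [(m - m.mean()) / m.std() for m in (np.asarray(m, float) for m in markers)]
    return np.mean(zs, axis=0)

def pearson_r(a, b):
    """Pearson correlation between two equal-length samples."""
    return np.corrcoef(np.asarray(a, float), np.asarray(b, float))[0, 1]

# Illustrative: two markers that are affine functions of intake
intake = np.arange(50, dtype=float)
composite = integrated_biomarker(intake, 2.0 * intake + 5.0)
```

Because z-scoring removes location and scale, markers on very different units (vitamin C concentration, carotenoids, FRAP) contribute on an equal footing to the composite.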
Abstract:
This study examines whether combined cognitive bias modification for interpretative biases (CBM-I) and computerised cognitive behaviour therapy (C-CBT) can produce enhanced positive effects on interpretation biases and social anxiety. Forty socially anxious students were randomly assigned to two conditions, an intervention group (positive CBM-I + C-CBT) or an active control (neutral CBM-I + C-CBT). At pre-test, participants completed measures of social anxiety, interpretative bias, cognitive distortions, and social and work adjustment. They were exposed to 6 × 30 min sessions of web-based interventions including three sessions of either positive or neutral CBM-I and three sessions of C-CBT, one session per day. At post-test and two-week follow-up, participants completed the baseline measures. Combined positive CBM-I + C-CBT produced less negative interpretations of ambiguous situations than neutral CBM-I + C-CBT. The results also showed that both positive CBM-I + C-CBT and neutral CBM-I + C-CBT reduced social anxiety and cognitive distortions and improved work and social adjustment. However, greater effect sizes were observed in the positive CBM-I + C-CBT condition than in the control. This indicates that adding positive CBM-I to C-CBT enhanced the training effects on social anxiety, cognitive distortions, and social and work adjustment compared with the neutral CBM-I + C-CBT condition.