904 results for Test data
Abstract:
The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately.

Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment.

Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under normal use conditions. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation. A second, un-aged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant.

Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years.
This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
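The 80% reliable life quoted above is the time by which 80% of specimens still survive. As a minimal sketch of how such a figure can be extrapolated from accelerated-test data, assuming Weibull-distributed failure times and a single hypothetical acceleration factor (not the Shapiro-Meeker graphical analysis or Cox log-linear model actually used in the study):

```python
import math

def weibull_reliable_life(eta, beta, reliability):
    """Time by which the given fraction of units still survives:
    R(t) = exp(-(t / eta) ** beta)  =>  t = eta * (-ln R) ** (1 / beta)."""
    return eta * (-math.log(reliability)) ** (1.0 / beta)

def use_level_life(accelerated_life, acceleration_factor):
    """Extrapolate a life measured under accelerated stress back to
    normal use conditions via a single acceleration factor."""
    return accelerated_life * acceleration_factor

# Hypothetical numbers, not from the study: Weibull scale eta = 2.0 years
# under stress, shape beta = 1.5, acceleration factor 12 between test
# and use conditions.
t80_stress = weibull_reliable_life(eta=2.0, beta=1.5, reliability=0.80)
t80_use = use_level_life(t80_stress, acceleration_factor=12)
```

With these illustrative parameters the use-level 80% reliable life is roughly an order of magnitude longer than the stressed-test life, which is the whole point of overstress testing.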
Abstract:
A pre-test, post-test, quasi-experimental design was used to examine the effects of student-centered and traditional models of reading instruction on outcomes of literal comprehension and critical thinking skills. The sample for this study consisted of 101 adult students enrolled in a high-level developmental reading course at a large, urban community college in the Southeastern United States. The experimental group consisted of 48 students, and the control group consisted of 53 students. Students in the experimental group were limited in the time spent reading a course text of basic skills, with instructors using supplemental materials such as poems, news articles, and novels. Discussions, the reading-writing connection, and student choice in material selection were also part of the student-centered curriculum. Students in the control group relied heavily on a course text and vocabulary text for reading material, with great focus placed on basic skills. Activities consisted primarily of multiple-choice questioning and quizzes. The instrument used to collect pre-test data was Descriptive Tests of Language Skills in Reading Comprehension; post-test data were taken from the Florida College Basic Skills Exit Test. A MANCOVA was used as the statistical method to determine if either model of instruction led to significantly higher gains in literal comprehension skills or critical thinking skills. A paired samples t-test was also used to compare pre-test and post-test means. The results of the MANCOVA indicated no significant difference between instructional models on scores of literal comprehension and critical thinking. Neither was there any significant difference in scores between subgroups of age (under 25 and 25 and older) and language background (native English speaker and second-language learner). 
The results of the t-test indicated, however, that students taught under both instructional models made significant gains on both literal comprehension and critical thinking skills from pre-test to post-test.
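The paired-samples t statistic used to compare pre-test and post-test means can be sketched as follows; the scores are hypothetical, and the study's MANCOVA is not reproduced here:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t_statistic(pre, post):
    """t statistic for paired samples: t = mean(d) / (stdev(d) / sqrt(n)),
    where d are the post-minus-pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical pre/post reading scores for four students.
t = paired_t_statistic([60, 55, 70, 65], [66, 59, 75, 72])
```

A large positive t (compared against the t distribution with n - 1 degrees of freedom) indicates a significant pre-to-post gain.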
Abstract:
The estimation of pavement layer moduli through the use of an artificial neural network is a new concept which provides a less strenuous strategy for backcalculation procedures. Artificial neural networks are biologically inspired models of the human nervous system, specifically designed to learn a mapping from inputs to outputs. This study demonstrates how an artificial neural network uses non-destructive pavement test data to determine flexible pavement layer moduli. The input parameters include plate loadings, corresponding sensor deflections, pavement surface temperature, pavement layer thicknesses and independently deduced pavement layer moduli.
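As a minimal sketch of the mapping such a network performs, here is a toy one-hidden-layer forward pass; the weights are arbitrary placeholders, whereas a real backcalculation network would be trained on measured deflection basins:

```python
import math

def mlp_forward(x, w1, b1, w2, b2):
    """Forward pass of a one-hidden-layer network with tanh activations:
    inputs (e.g. sensor deflections) map to outputs (e.g. layer moduli)."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(w2, b2)]

# Toy 2-input, 2-hidden, 1-output network with arbitrary weights.
moduli = mlp_forward([0.5, -0.2],
                     [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0],
                     [[0.7, 0.3]], [0.0])
```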
Abstract:
In the last 16 years, a segment of independent producers focused on onshore basins and shallow waters has emerged in Brazil. Among the challenges these companies face is the development of fields whose projects have a low net present value (NPV). The objective of this work was to use reservoir simulation to study the best technical and economic option for developing an oil field in the Brazilian Northeast. Real geology, reservoir, and production data were used to build the geological and simulation models. Because no PVT analysis was available, distillation test data known as true boiling points (TBP) were used to create a fluid model that generated the PVT data. After the history match, four development scenarios were simulated: extrapolation of production without new investments, conversion of a producing well to immiscible gas injection, drilling of a vertical well, and drilling of a horizontal well. From the financial point of view, gas injection is the alternative with the lowest added value, but it may become viable if there are environmental or regulatory restrictions on flaring or venting the produced gas from this field or neighboring accumulations. The recovery factors achieved by drilling vertical and horizontal wells are similar, but the horizontal well is a production-acceleration project; therefore, its incremental cumulative production discounted at the company's minimum attractive rate of return is higher. Depending on the Brent crude oil price and the drilling cost, this option can be technically and financially viable.
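The NPV comparison among such scenarios can be sketched as follows; the cash flows and discount rate below are hypothetical, for illustration only:

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows; cashflows[0] occurs at t = 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical cash flows (millions of US$) at a 10% discount rate: a
# horizontal well costs more up front but accelerates production.
horizontal_well = npv(0.10, [-10.0, 6.0, 5.0, 4.0, 3.0])
gas_injection = npv(0.10, [-3.0, 1.2, 1.2, 1.2, 1.2])
```

Production acceleration raises NPV precisely because earlier cash flows are discounted less.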
Abstract:
Chronic hepatitis C is the leading cause of chronic liver disease, of advanced-stage hepatocellular carcinoma (HCC), and of death related to liver disease. It evolves progressively over 20-30 years, at rates that vary with viral, host, and behavioral factors. This study evaluated the impact of hepatitis C on the lives of patients treated at a referral service in hepatology of the University Hospital Onofre Lopes - Liver Study Group - from May 1995 to December 2013. A retrospective evaluation of 10,304 records was performed in order to build a cohort of patients with hepatitis C in which all individuals had their diagnosis confirmed by a gold-standard molecular biology test. Data were obtained directly from patient charts and recorded in a purpose-built Excel spreadsheet, following a coding scheme covering the study variables: individual data and prognostic factors for the progression of chronic hepatitis C defined in the literature. The Research Ethics Committee approved the project. The results were analyzed with the chi-square test and Fisher's exact test to verify associations between variables; for the multivariate analysis, binomial logistic regression was used. For both tests, significance was set at p < 0.05 with 95% confidence. The prevalence of chronic hepatitis C in NEF was 4.96%. The prevalence of cirrhosis due to hepatitis C was 13.7%. The prevalence of diabetes in patients with hepatitis C was 8.78%, and in cirrhotic patients with hepatitis C, 38.0%. The prevalence of HCC was 5.45%. The clinical follow-up discontinuation rate was 67.5%. Mortality was 4.10% in confirmed cases without cirrhosis and 32.1% in cirrhotic patients. The factors associated with the development of cirrhosis were genotype 1 (p = 0.0015) and bilirubin > 1.3 mg% (p = 0.0017).
Factors associated with mortality were age over 35 years, treatment abandonment, diabetes, insulin use, AST > 60 IU, ALT > 60 IU, high total bilirubin, prolonged prothrombin time, high INR, low albumin, treatment withdrawal, cirrhosis, and hepatocellular carcinoma. The occurrence of diabetes mellitus increased the mortality of patients with hepatitis C six-fold. Variables associated with the diagnosis of cirrhosis were being a blood donor (odds ratio 0.24, p = 0.044) and being a professional athlete (odds ratio 0.18, p = 0.35). It is reasonable to consider re-evaluating the currently proposed screening models for chronic hepatitis C. The conditions of cirrhosis and diabetes modify the clinical course of patients with chronic hepatitis C, making it a disease with higher mortality. Being a blood donor or a professional athlete, by contrast, appears to be a protective factor that reduces the risk of cirrhosis, independent of alcohol consumption. Public policies providing more efficient access, reception, and resolution of care are needed for this population.
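A protective factor such as blood donation shows up as an odds ratio below 1. A minimal sketch of the 2x2-table statistics behind such figures; the counts used in the example are hypothetical, not the cohort's:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio of a 2x2 table [[a, b], [c, d]]: (a / b) / (c / d)."""
    return (a * d) / (b * c)

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table, without continuity
    correction: n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical exposure-by-outcome counts.
or_example = odds_ratio(10, 90, 5, 95)
chi2_example = chi_square_2x2(10, 90, 5, 95)
```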
Abstract:
Subspaces and manifolds are two powerful models for high dimensional signals. Subspaces model linear correlation and are a good fit to signals generated by physical systems, such as frontal images of human faces and multiple sources impinging at an antenna array. Manifolds model sources that are not linearly correlated, but where signals are determined by a small number of parameters. Examples are images of human faces under different poses or expressions, and handwritten digits with varying styles. However, there will always be some degree of model mismatch between the subspace or manifold model and the true statistics of the source. This dissertation exploits subspace and manifold models as prior information in various signal processing and machine learning tasks.
A near-low-rank Gaussian mixture model measures proximity to a union of linear or affine subspaces. This simple model can effectively capture the signal distribution when each class is near a subspace. This dissertation studies how the pairwise geometry between these subspaces affects classification performance. When model mismatch is vanishingly small, the probability of misclassification is determined by the product of the sines of the principal angles between subspaces. When the model mismatch is more significant, the probability of misclassification is determined by the sum of the squares of the sines of the principal angles. Reliability of classification is derived in terms of the distribution of signal energy across principal vectors. Larger principal angles lead to smaller classification error, motivating a linear transform that optimizes principal angles. This linear transformation, termed TRAIT, also preserves some specific features in each class, being complementary to a recently developed Low Rank Transform (LRT). Moreover, when the model mismatch is more significant, TRAIT shows superior performance compared to LRT.
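The angle-based error characterization above can be sketched numerically. The illustration below handles only one-dimensional subspaces (lines); general subspaces require the principal angles obtained from the SVD of U1^T U2:

```python
import math

def principal_angle_1d(u, v):
    """Principal angle between the lines spanned by vectors u and v."""
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm_u = math.sqrt(sum(ui * ui for ui in u))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    return math.acos(min(1.0, abs(dot) / (norm_u * norm_v)))

def misclassification_proxy(angles, small_mismatch):
    """Per the dissertation's result: the error is determined by the product
    of sines (vanishing mismatch) or by the sum of squared sines (larger
    mismatch) of the principal angles."""
    sines = [math.sin(t) for t in angles]
    if small_mismatch:
        out = 1.0
        for s in sines:
            out *= s
        return out
    return sum(s * s for s in sines)
```

Larger angles give larger sines, consistent with the claim that larger principal angles lead to smaller classification error.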
The manifold model enforces a constraint on the freedom of data variation. Learning features that are robust to data variation is very important, especially when the size of the training set is small. A learning machine with a large number of parameters, e.g., a deep neural network, can describe a very complicated data distribution well. However, it is also more likely to be sensitive to small perturbations of the data, and to suffer from degraded performance when generalizing to unseen (test) data.
From the perspective of the complexity of function classes, such a learning machine has a huge capacity (complexity), which tends to overfit. The manifold model provides a way of regularizing the learning machine so as to reduce the generalization error and therefore mitigate overfitting. Two different approaches to preventing overfitting are proposed, one from the perspective of data variation, the other from capacity/complexity control. In the first approach, the learning machine is encouraged to make decisions that vary smoothly for data points in local neighborhoods on the manifold. In the second approach, a graph adjacency matrix is derived for the manifold, and the learned features are encouraged to align with the principal components of this adjacency matrix. Experimental results on benchmark datasets show an obvious advantage of the proposed approaches when the training set is small.
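The graph-adjacency construction in the second approach can be sketched as follows; the k-nearest-neighbour rule and Gaussian edge weights are standard choices for approximating a data manifold by a graph, not necessarily the dissertation's exact settings:

```python
import math

def knn_adjacency(points, k, sigma=1.0):
    """Symmetric k-nearest-neighbour graph with Gaussian edge weights,
    a standard discrete stand-in for the data manifold."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    n = len(points)
    adj = [[0.0] * n for _ in range(n)]
    for i in range(n):
        nearest = sorted((j for j in range(n) if j != i),
                         key=lambda j: dist2(points[i], points[j]))[:k]
        for j in nearest:
            w = math.exp(-dist2(points[i], points[j]) / (2.0 * sigma ** 2))
            adj[i][j] = adj[j][i] = w
    return adj

# Toy 1-NN graph over three points on a line.
A = knn_adjacency([(0, 0), (1, 0), (5, 0)], k=1)
```

The principal components of such a matrix summarize the dominant neighbourhood structure, which the learned features are then encouraged to follow.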
Stochastic optimization makes it possible to track a slowly varying subspace underlying streaming data. By approximating local neighborhoods with affine subspaces, a slowly varying manifold can be efficiently tracked as well, even with corrupted and noisy data. The more local neighborhoods used, the better the approximation, but the higher the computational complexity. A multiscale approximation scheme is proposed, in which the local approximating subspaces are organized in a tree structure; splitting and merging of tree nodes then allows efficient control of the number of neighborhoods. The deviation of each datum from the learned model is estimated, yielding a series of statistics for anomaly detection. This framework extends the classical changepoint detection technique, which only works for one-dimensional signals. Simulations and experiments highlight the robustness and efficacy of the proposed approach in detecting an abrupt change in an otherwise slowly varying low-dimensional manifold.
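The per-datum deviation statistic can be sketched as the residual of a point against a learned orthonormal subspace basis; this is a generic sketch of the idea, not the multiscale tree implementation:

```python
import math

def subspace_residual(x, basis):
    """Distance from x to the span of an orthonormal basis: the norm of
    x minus its projection, usable as an anomaly score."""
    coeffs = [sum(xi * bi for xi, bi in zip(x, b)) for b in basis]
    proj = [sum(c * b[d] for c, b in zip(coeffs, basis))
            for d in range(len(x))]
    return math.sqrt(sum((xi - pi) ** 2 for xi, pi in zip(x, proj)))
```

A sudden jump in this residual over the data stream signals an abrupt change in the underlying subspace or manifold.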
Abstract:
This is the second part of an assessment of the primary energy conversion of oscillating water column (OWC) wave energy converters. The first part of the research extensively examined the hydrodynamic performance of OWC wave energy converters, targeting a reliable numerical assessment method. In this part, applying the air turbine power take-off (PTO) to the OWC device leads to a coupled model of the hydrodynamics and thermodynamics of OWC wave energy converters: under wave excitation, the varying air volume due to the motion of the internal water surface creates a reciprocating (alternately positive and negative) chamber pressure, while the chamber pressure, in turn, modifies the motions of the device and of the internal water surface. To this end, the thermodynamics of the air chamber is first examined, including air compressibility in the oscillating water column, for different types of air turbine PTO. The developed thermodynamics is then coupled with the hydrodynamics of the OWC wave energy converters. The proposed assessment method is applied to two generic OWC wave energy converters (one bottom-fixed and one floating), and the numerical results are compared to experimental results. The comparison with the model test data shows that this numerical method is capable of assessing the primary energy conversion of oscillating water column wave energy converters.
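A minimal sketch of the air-compressibility element of such a thermodynamic model, assuming a reversible adiabatic (isentropic) process in the chamber; the paper's actual formulation for each turbine PTO may differ:

```python
def chamber_gauge_pressure(p0, v0, v, gamma=1.4):
    """Gauge pressure of the chamber air under an isentropic assumption:
    p / p0 = (V0 / V) ** gamma, so the function returns p - p0.
    gamma = 1.4 is the heat-capacity ratio of air."""
    return p0 * ((v0 / v) ** gamma - 1.0)
```

As the internal water surface rises (V shrinks) the gauge pressure goes positive, and as it falls the pressure goes negative, producing the reciprocating chamber pressure described above.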
Abstract:
Air traffic management research lacks a framework for modelling the cost of resilience during disturbance. There is no universally accepted metric for cost resilience. The design of such a framework is presented and the modelling to date is reported. The framework allows performance assessment as a function of differential stakeholder uptake of strategic mechanisms designed to mitigate disturbance. Advanced metrics, cost- and non-cost-based, disaggregated by stakeholder sub-types, are described. A new cost resilience metric is proposed and exemplified with early test data.
Abstract:
In establishing the reliability of performance-related design methods for concrete relevant to resistance against chloride-induced corrosion, long-term experience of local materials and practices and detailed knowledge of the ambient and local micro-climate are critical. Furthermore, developing analytical models for performance-based design requires calibration against test data representative of actual conditions in practice. To this end, the current study presents results from full-scale concrete pier-stems under long-term exposure to a marine environment, with work focussing on XS2 (below mid-tide level), in which the concrete is regarded as fully saturated, and XS3 (tidal, splash and spray), in which the concrete is in an unsaturated condition. These exposures represent the zones where concrete structures are most susceptible to ionic ingress and deterioration. Chloride profiles and chloride transport behaviour are studied using both an empirical model (the erfc function) and a physical model (ClinConc). The time dependency of surface chloride concentration (Cs) and apparent diffusivity (Da) was established for the empirical model, whereas in the ClinConc model (originally based on saturated concrete), two new environmental factors were introduced for the XS3 exposure zone. Although XS3 is considered a single environmental exposure zone according to BS EN 206-1:2013, the work has highlighted that significant changes in chloride ingress are evident even within this zone. This study aims to update the parameters of both models for predicting the long-term transport behaviour of concrete subjected to exposure classes XS2 and XS3.
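The empirical erfc model named above has the closed form C(x, t) = Cs * erfc(x / (2 * sqrt(Da * t))). A minimal sketch, with units chosen for illustration:

```python
import math

def chloride_content(x_mm, t_year, c_s, d_a):
    """Empirical error-function chloride ingress model:
    C(x, t) = Cs * erfc(x / (2 * sqrt(Da * t))), with depth x in mm,
    age t in years, and apparent diffusivity Da in mm^2/year."""
    return c_s * math.erfc(x_mm / (2.0 * math.sqrt(d_a * t_year)))
```

The time dependency established in the study enters by making Cs and Da functions of t rather than constants.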
Abstract:
This letter presents novel behaviour-based tracking of people in low-resolution using instantaneous priors mediated by head-pose. We extend the Kalman Filter to adaptively combine motion information with an instantaneous prior belief about where the person will go based on where they are currently looking. We apply this new method to pedestrian surveillance, using automatically-derived head pose estimates, although the theory is not limited to head-pose priors. We perform a statistical analysis of pedestrian gazing behaviour and demonstrate tracking performance on a set of simulated and real pedestrian observations. We show that by using instantaneous `intentional' priors our algorithm significantly outperforms a standard Kalman Filter on comprehensive test data.
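The core of adaptively combining motion information with an instantaneous belief can be sketched, in scalar form, as a precision-weighted fusion of two Gaussians; the paper's full filter operates on state vectors with head-pose-derived priors, so this is only an illustrative reduction:

```python
def fuse_estimates(mu_motion, var_motion, mu_prior, var_prior):
    """Precision-weighted fusion of a Gaussian motion-model prediction
    with a Gaussian instantaneous prior (e.g. derived from head pose)."""
    w_m, w_p = 1.0 / var_motion, 1.0 / var_prior
    var = 1.0 / (w_m + w_p)
    mu = var * (w_m * mu_motion + w_p * mu_prior)
    return mu, var
```

When the prior is confident (small variance) it pulls the fused estimate toward where the person is looking; when it is uncertain, the motion model dominates, which is the adaptive behaviour described above.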
Abstract:
The development of alkali-activated binders with superior engineering properties and longer durability has emerged as an alternative to ordinary portland cement (OPC). It is possible to use alkali-activated natural pozzolans to prepare environmentally friendly geopolymer cement leading to the concept of sustainable development. This paper presents a summary of an experimental work that was conducted to determine mechanical strength, modulus of elasticity, ultrasonic pulse velocity, and shrinkage of different concrete mixtures prepared with alkali-activated Iranian natural pozzolans—namely Taftan andesite and Shahindej dacite, both with and without calcining. Test data were used for Taftan pozzolan to identify the effects of water-binder ratios (w/b) and curing conditions on the properties of the geopolymer concrete, whereas the influence of material composition was studied by activating Shahindej pozzolan both in the natural and calcined states. The results show that alkali-activated natural pozzolan (AANP) concretes develop moderate-to-high mechanical strength with a high modulus of elasticity and a shrinkage much lower than with OPC.
Abstract:
As noted in Part 1 of this report, the objective of the investigation was to apply principles of first-arrival seismic refraction to the problem of more quickly determining in-place dry density in highway materials. Part 2 of the report, contained herein, presents the results of additional laboratory development of test techniques as well as extensive field test data.
Abstract:
Research activities during this period concentrated on continuation of field and laboratory testing for the Dallas County test road. Stationary ditch collection of dust was eliminated because of inconsistent data and because of vandalism to collectors. Braking tests were developed and initiated to evaluate the influence of treatments on the braking and safety characteristics of the test sections. Dust testing was initiated for out-of-the-wheelpath as well as in-the-wheelpath conditions. Contrary to the results obtained during the summer and fall of 1987, the 1.5 percent bentonite treatment appears to be outperforming the other bentonite-treated sections after over a year of service. Overall dust reduction appears to average between 25 and 35 percent. Dallas County applied 300 tons per mile of Class A roadstone maintenance surfacing to the test road in August 1988. Test data indicate that the bentonite is capable of interacting and functioning to reduce dust generation of the new surfacing material. Again, the 1.5 percent bentonite treatment appeared the most effective. The fine-particulate bonding and aggregation mechanism of the bentonite appears recoverable from the environmental effects of winter and from alternating wet and dry road surface conditions. The magnesium chloride treatment appears capable of long-term (over one year) dust reduction and exhibited an overall average reduction in the range of 15 to 30 percent. The magnesium chloride treatment also appears capable of interacting with newly applied crushed stone to reduce dust generation. Two additional one-mile test roads were to have been constructed early this year. Due to an extremely dry spring and summer, construction scheduling was not possible until August. This would have allowed only minimal data collection.
Considering this and the fact that this was an atypically dry summer, it was our opinion that it would be in the best interest of the research project to extend the project (at no additional cost) for a period of one year. The two additional test roads will be constructed in early spring 1989 in Adair and Marion counties.
Abstract:
This thesis evaluates the rheological behaviour of asphalt mixtures containing different amounts of Reclaimed Asphalt (RA), and of the binders extracted from those mixtures. Generally, the use of RA is limited to certain amounts. The studied materials are Stone Mastic Asphalts, including a control sample with 0% RA and samples with RA rates of 30%, 60% and 100%. A second set of studied mixtures are Asphalt Concrete (AC) types, again with a control mix at 0% RA and mixture designs containing 30%, 60% and 90% reclaimed asphalt, which also contain additives. In addition to the bitumen samples extracted from the asphalt mixes, there are bitumen samples extracted directly from the original RA. To characterize the viscoelastic behaviour of the binders, Dynamic Shear Rheometer (DSR) tests were conducted on the bitumen specimens. The influence of the RA content on the bituminous binders is illustrated through master curves, black diagrams and Cole-Cole plots, with the experimental data regressed using the analogical 2S2P1D model and the analytical CA model. The advantage of the CA model lies in its limited number of parameters, making it a simple model to use. The 2S2P1D model is an analogical rheological model for predicting the linear viscoelastic properties of both asphalt binders and mixtures. To study the influence of RA on the mixtures, the Indirect Tensile Test (ITT) was conducted. The master curves of the different mixture samples are evaluated by regressing the test data points to a sigmoidal function; by comparing the master curves, the influence of the RA materials is studied. The thesis also focusses on the applicability of, and differences between, the CA and 2S2P1D models for the bitumen samples and the sigmoidal function for the mixtures, and presents the influence of the RA rate on the investigated model parameters.
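The sigmoidal function used for mixture master curves is commonly written as log|E*| = delta + alpha / (1 + exp(beta + gamma * log fr)); the symbol names below are the conventional ones and may not match the thesis's exact parameterization:

```python
import math

def sigmoidal_master_curve(log_fr, delta, alpha, beta, gamma):
    """Common sigmoidal master-curve form for mixture stiffness:
    log|E*| = delta + alpha / (1 + exp(beta + gamma * log_fr)),
    where log_fr is the log of reduced frequency."""
    return delta + alpha / (1.0 + math.exp(beta + gamma * log_fr))
```

Fitting delta, alpha, beta and gamma to ITT data points at different reduced frequencies yields the master curve; with gamma negative, the modulus increases with frequency as expected for asphalt mixtures.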
Abstract:
Background and Objectives: The measurement of salivary immunoglobulin A (IgA) is a useful and non-invasive method for measuring stress. Personality traits and rumination act as possible mediators in the relationship between psychological stressors and the immune system. This study aimed to evaluate the levels of salivary IgA under psychological stress and their relationship with rumination and five personality traits in medical students. Methods: In this cross-sectional study, 45 medical students who intended to take the final exam were selected by simple random sampling. Two months before the exam, under basal conditions, the NEO Personality Inventory-Short Form and the Emotional Control Questionnaire (ECQ) were completed. Saliva samples were taken from students under both basal conditions and exam-stress conditions. Salivary IgA was measured by an ELISA test. Data were analyzed using the paired-samples t-test, Pearson correlation analysis, and stepwise regression. Results: A significant reduction in salivary IgA levels was found under exam-stress conditions. A significant negative correlation was found between the traits of neuroticism and rumination and salivary IgA, as well as a significant positive correlation between openness to experience and emotional inhibition and salivary IgA. Openness to experience and emotional inhibition may predict a substantial share (34%) of the variance in salivary IgA under exam stress. Conclusions: Salivary IgA is reduced in response to exam stress. In addition, rumination and personality traits may reduce or increase stress effects on the immune system, particularly on salivary IgA.
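The Pearson correlation used to relate trait scores to salivary IgA can be sketched as follows; the input samples in the test are illustrative, not the study's data:

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples:
    covariance divided by the product of the standard deviations."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A negative r (e.g. neuroticism vs. IgA) means higher trait scores go with lower IgA; a positive r (e.g. openness vs. IgA) means the opposite.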