Abstract:
This paper explores analytically the contemporary pottery-making community of Pereruela (north-west Spain) that produces cooking pots from a mixture of red clay and kaolin. Analyses by different techniques (XRF, NAA, XRD, SEM and petrography) showed an extremely high variability for cooking ware pottery produced in a single production centre, by the same technology and using local clays. The main source of chemical variation is related to the use of different red clays and the presence of non-normally distributed inclusions of monazite. These two factors induce a high chemical variability, not only in the output of a single production centre but even in the paste of a single pot, to such an extent that chemical compositions from one "workshop", or even one "pot", could be classified as having different provenances. The implications for the chemical characterization and for provenance studies of archaeological ceramics are addressed.
Abstract:
The Iowa State University (ISU) Bridge Engineering Center (BEC) performed full-scale laboratory testing of the proposed paving notch replacement system. The objective of the testing program was to verify the structural capacity of the proposed precast paving notch system and to investigate the feasibility of the proposed solution. This report describes the laboratory testing procedure and discusses its results.
Abstract:
The 2011 Missouri River flooding caused significant damage to many geo-infrastructure systems including levees, bridge abutments/foundations, paved and unpaved roadways, culverts, and embankment slopes in western Iowa. The flooding resulted in closures of several interchanges along Interstate 29 and of more than 100 miles of secondary roads in western Iowa, causing severe inconvenience to residents and losses to local businesses. The main goals of this research project were to assist county and city engineers by deploying and using advanced technologies to rapidly assess the damage to geo-infrastructure and develop effective repair and mitigation strategies and solutions for use during future flood events in Iowa. The research team visited selected sites in western Iowa to conduct field reconnaissance, in situ testing on bridge abutment backfills that were affected by floods, flooded and non-flooded secondary roadways, and culverts. In situ testing was conducted shortly after the flood waters receded, and several months after flooding to evaluate recovery and performance. Tests included falling weight deflectometer, dynamic cone penetrometer, three-dimensional (3D) laser scanning, ground penetrating radar, and hand auger soil sampling. Field results indicated significant differences in roadway support characteristics between flooded and non-flooded areas. Support characteristics in some flooded areas recovered over time, while others did not. Voids were detected in culvert and bridge abutment backfill materials shortly after flooding and several months after flooding. A catalog of field assessment techniques and 20 potential repair/mitigation solutions are provided in this report. A flow chart relating the damages observed, assessment techniques, and potential repair/mitigation solutions is provided. These options are discussed for paved/unpaved roads, culverts, and bridge abutments, and are applicable for both primary and secondary roadways.
Abstract:
Several agencies specify AASHTO T283 as the primary test for field acceptance of moisture susceptibility in hot mix asphalt. When used in this application, logistical difficulties challenge its practicality, while its repeatability is routinely scrutinized by contractors. An alternative test is needed that can effectively screen mixtures based on expected performance. The ideal replacement can be validated with field performance, is repeatable, and allows for prompt reporting of results. Dynamic modulus, flow number, AASHTO T283, Hamburg wheel tracking device (HWTD), and moisture induced sensitivity test (MIST) testing were performed on plant-produced surface mixes in Iowa. Follow-up distress surveys were used to rank the mixes by their field performance. The rankings indicate that both the quantity of swelling from MIST conditioning and the submersed flow number matched the performance ranking of all but one mixture. Hamburg testing parameters also appear effective, namely the stripping inflection point and the ratio between the stripping slope and the creep slope. Dynamic modulus testing was the least effective, followed by AASHTO T283 and by ratios produced from flow number results of conditioned samples.
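To illustrate how laboratory rankings can be checked against field performance, the hypothetical sketch below computes a Spearman rank correlation between an invented MIST swelling ranking and an invented distress-survey ranking; the abstract does not state which statistic the study used, and all values here are made up.

```python
# Hypothetical illustration: agreement between a lab-test ranking of
# asphalt mixes and a field-performance ranking from distress surveys.
# All data are invented; the study's actual comparison method is not
# described in the abstract.
from scipy.stats import spearmanr

field_rank = [1, 2, 3, 4, 5]       # mixes ranked by observed field distress
mist_swell_rank = [1, 2, 4, 3, 5]  # same mixes ranked by MIST swelling

rho, p_value = spearmanr(field_rank, mist_swell_rank)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")  # rho near 1 = close agreement
```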
Abstract:
This report outlines current drug testing practices and the application of these practices to meet testing requirements.
Abstract:
The goal of this work is to develop a method to objectively compare the performance of a digital and a screen-film mammography system in terms of image quality. The method takes into account the dynamic range of the image detector, the detection of high and low contrast structures, the visualisation of the images and the observer response. A test object, designed to represent a compressed breast, was constructed from various tissue equivalent materials ranging from purely adipose to purely glandular composition. Different areas within the test object permitted the evaluation of low and high contrast detection, spatial resolution and image noise. All the images (digital and conventional) were captured using a CCD camera to include the visualisation process in the image quality assessment. A mathematical model observer (non-prewhitening matched filter), that calculates the detectability of high and low contrast structures using spatial resolution, noise and contrast, was used to compare the two technologies. Our results show that for a given patient dose, the detection of high and low contrast structures is significantly better for the digital system than for the conventional screen-film system studied. The method of using a test object with a large tissue composition range combined with a camera to compare conventional and digital imaging modalities can be applied to other radiological imaging techniques. In particular it could be used to optimise the process of radiographic reading of soft copy images.
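A minimal sketch of a non-prewhitening matched-filter observer is given below: the detectability index is the template's signal energy divided by the standard deviation of its response to noise-only images. The blob signal, white-noise model, and all parameters are invented for illustration and do not reproduce the study's actual observer.

```python
# Sketch of a non-prewhitening (NPW) matched-filter model observer.
# The signal template and white-noise model are illustrative assumptions.
import numpy as np

def npw_detectability(signal_template, noise_images):
    """d' = template signal energy / std of the template's response to noise."""
    s = signal_template.ravel()
    responses = noise_images.reshape(len(noise_images), -1) @ s  # filter outputs
    return float((s @ s) / responses.std(ddof=1))

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 32)
xx, yy = np.meshgrid(x, x)
signal = 0.5 * np.exp(-(xx**2 + yy**2) / 0.05)    # low-contrast blob (expected signal)
noise = rng.normal(0.0, 1.0, size=(200, 32, 32))  # noise-only realizations
print(f"NPW d' = {npw_detectability(signal, noise):.2f}")
```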
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability to obtain an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
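As a concrete illustration of how the two frameworks are applied together in practice, the sketch below runs a two-sample t test on invented data: the p value measures evidence against the null hypothesis, while the pre-chosen alpha level implements the Neyman-Pearson reject/accept decision.

```python
# Illustrative two-sample t test on invented data. The p value is the
# probability, under the null hypothesis of no effect, of a difference at
# least as extreme as the one observed -- not the probability that the
# null hypothesis is true.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
control = rng.normal(10.0, 2.0, size=30)  # no treatment effect
treated = rng.normal(11.0, 2.0, size=30)  # true effect of +1

t_stat, p_value = ttest_ind(treated, control)
alpha = 0.05  # Type I error level chosen before the experiment
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("reject H0" if p_value < alpha else "fail to reject H0")
```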
Abstract:
Gauguin's first attempts at still-life painting, around 1875, followed the Dutch tradition, influenced mainly by Manet's palette. But he did take occasional liberties in depicting flowers with more fluid colour and dynamic backgrounds. From 1879 his style shows the influence of the Impressionists: Pissarro in the landscapes and Degas in the composition of his still-lifes. He was also open to the new trends which were developing among artists in Paris and applied them in his paintings, using still-lifes as his main means for testing them. He did not escape the contemporary fascination with Japonism, and even experimented briefly with Pointillism in Still Life with Horse's Head. His stays in Brittany between 1886 and 1890 correspond to an extremely rich and innovative period for him, in which still-lifes served for increasing experimentation. "Fête Gloanec" and Three Puppies reflect his preoccupations: rejection of perspective, use of areas of flat colour, and mixed styles. These pictures amount to an aesthetic manifesto; many of them are also imbued with strong symbolism, as in the Portrait of Meyer de Haan, which is a melancholic reflection on the fall of man. In Still-Life with Japanese Print, frail blue flowers seem to come out of the head of the artist-martyr, a pure product of the painter's "restless imagination". Thus Gauguin showed that art is an "abstraction" through a genre which was reputed to lend itself with difficulty to anything other than mimesis. Although he moved away from still-life after 1890, Gauguin is one of the first artists to radically renew the role and status of still-life at the end of the 19th century, well before the Fauvists and Cubists.
Abstract:
BACKGROUND: In a simulation based on a pharmacokinetic model we demonstrated that increasing the half-life of erythropoiesis stimulating agents (ESAs) or shortening their administration interval decreases hemoglobin variability. The benefit of reducing the administration interval was, however, lessened by the variability induced by more frequent dosage adjustments. The purpose of this study was to analyze the reticulocyte and hemoglobin kinetics and variability under different ESAs and administration intervals in a collective of chronic hemodialysis patients. METHODS: The study was designed as an open-label, randomized, four-period cross-over investigation, including 30 patients under chronic hemodialysis at the regional hospital of Locarno (Switzerland) in February 2010 and lasting 2 years. Four subcutaneous treatment strategies (C.E.R.A. every 4 weeks (Q4W) and every 2 weeks (Q2W), Darbepoetin alfa Q4W and Q2W) were compared with each other. The mean square successive difference of hemoglobin, reticulocyte count, and ESA dose was used to quantify variability. We distinguished short- and long-term variability based, respectively, on the weekly and monthly successive differences. RESULTS: No difference was found in the mean values of the biological parameters (hemoglobin, reticulocytes, and ferritin) between the 4 strategies. ESA type did not affect hemoglobin or reticulocyte variability, but C.E.R.A. induced a more sustained reticulocyte response over time and increased the risk of hemoglobin overshooting (OR 2.7, p = 0.01). Shortening the administration interval lessened the amplitude of reticulocyte count fluctuations but resulted in more frequent ESA dose adjustments and in amplified reticulocyte and hemoglobin variability. The Q2W administration interval was, however, more favorable in terms of ESA dose, allowing a 38% C.E.R.A. dose reduction and no increase of Darbepoetin alfa. CONCLUSIONS: The reticulocyte dynamic was a more sensitive marker of time instability of the hemoglobin response under ESA therapy. The ESA administration interval had a greater impact on hemoglobin variability than the ESA type. The more protracted reticulocyte response induced by C.E.R.A. could explain both the observed higher risk of overshoot and the significant increase in efficacy when its administration interval was shortened. Trial registration: ClinicalTrials.gov NCT01666301.
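The mean square successive difference used here is simply the average of the squared changes between consecutive measurements; a minimal sketch with invented hemoglobin values (the study's actual sampling scheme is only summarized above):

```python
# Mean square successive difference (MSSD): the average squared change
# between consecutive measurements. Hemoglobin values are invented.
import numpy as np

def mssd(series):
    """MSSD = mean of squared first differences of the series."""
    return float(np.mean(np.diff(series) ** 2))

hb_weekly = [11.2, 11.5, 11.1, 11.8, 12.0, 11.4]  # g/dL, hypothetical weekly values
print(f"short-term (weekly) MSSD = {mssd(hb_weekly):.3f} (g/dL)^2")
```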
Abstract:
Histological subtyping and grading by malignancy are the cornerstones of the World Health Organization (WHO) classification of tumors of the central nervous system. They are intended to provide clinicians with guidance as to the expected course of disease and the choices of treatment to be made. Nonetheless, patients with histologically identical tumors may have very different outcomes, notably among patients with astrocytic and oligodendroglial gliomas of WHO grades II and III. In gliomas of adulthood, 3 molecular markers have undergone extensive studies in recent years: 1p/19q chromosomal codeletion, O(6)-methylguanine methyltransferase (MGMT) promoter methylation, and mutations of isocitrate dehydrogenase (IDH) 1 and 2. However, the assessment of these molecular markers has so far not been implemented in clinical routine because of the lack of therapeutic implications. In fact, these markers were considered to be prognostic irrespective of whether patients were receiving radiotherapy (RT), chemotherapy, or both (1p/19q, IDH1/2), or of limited value because testing is too complex and no chemotherapy alternative to temozolomide was available (MGMT). In 2012, this situation changed: long-term follow-up of the Radiation Therapy Oncology Group 9402 and European Organisation for Research and Treatment of Cancer 26951 trials demonstrated an overall survival benefit from the addition to RT of chemotherapy with procarbazine/CCNU/vincristine confined to patients with anaplastic oligodendroglial tumors with (vs without) 1p/19q codeletion. Furthermore, in elderly glioblastoma patients, the NOA-08 and the Nordic trial of RT alone versus temozolomide alone demonstrated a profound impact of MGMT promoter methylation on outcome by therapy and thus established MGMT as a predictive biomarker in this patient population. These recent results call for the routine implementation of 1p/19q and MGMT testing at least in subpopulations of malignant glioma patients and represent an encouraging step toward the development of personalized therapeutic approaches in neuro-oncology.
Abstract:
PURPOSE OF REVIEW: HIV targets primary CD4(+) T cells. The virus depends on the physiological state of its target cells for efficient replication, and, in turn, viral infection perturbs the cellular state significantly. Identifying the virus-host interactions that drive these dynamic changes is important for a better understanding of viral pathogenesis and persistence. The present review focuses on experimental and computational approaches to study the dynamics of viral replication and latency. RECENT FINDINGS: It was recently shown that only a fraction of the inducible latently infected reservoir is successfully induced upon stimulation in ex vivo models, while additional rounds of stimulation allow reactivation of more latently infected cells. This highlights the potential role of treatment duration and timing as important factors for successful reactivation of latently infected cells. The dynamics of HIV productive infection and latency have been investigated using transcriptome and proteome data. The cellular activation state has been shown to be a major determinant of viral reactivation success. Mathematical models of latency have been used to explore the decay dynamics of the latent viral reservoir. SUMMARY: Timing is an important component of biological interactions. Temporal analyses covering aspects of the viral life cycle are essential for gathering a comprehensive picture of HIV interaction with the host cell and untangling the complexity of latency. Understanding the dynamic changes tipping the balance between success and failure of HIV particle production might be key to eradicating the viral reservoir.
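As one example of the model class the review alludes to, the sketch below implements the simplest single-exponential decay model of the latent reservoir under suppressive therapy; the half-life and initial reservoir size are illustrative assumptions, not values taken from the reviewed studies.

```python
# Minimal single-exponential decay model of the latent HIV reservoir.
# Half-life and initial size are illustrative assumptions.
import numpy as np

def reservoir_size(t_months, initial_cells, half_life_months):
    """Latently infected cells remaining after t months of suppressive therapy."""
    decay_rate = np.log(2) / half_life_months
    return initial_cells * np.exp(-decay_rate * np.asarray(t_months, dtype=float))

months = [0, 12, 24, 48, 96]
print(reservoir_size(months, initial_cells=1e6, half_life_months=44.0))
```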
Abstract:
In this paper we report on the growth of thick films of magnetoresistive La2/3Sr1/3MnO3 by using spray and screen printing techniques on various substrates (Al2O3 and ZrO2). The growth conditions are explored in order to optimize the microstructure of the films. The films display a room-temperature magnetoresistance of 0.0012%/Oe in the 1 kOe field region. A magnetic sensor is described and tested.