978 results for Field Testing
Abstract:
This study investigates surface immobilization strategies for proteins and other biomolecules by the surface plasmon field-enhanced fluorescence spectroscopy (SPFS) technique. The recrystallization features of S-layer proteins and the possibility of combining S-layer lattice arrays with other functional molecules make this protein a prime candidate for supramolecular architectures. The recrystallization behavior on gold or on the secondary cell wall polymer (SCWP) was recorded by SPR. The optical thicknesses and surface densities for different protein layers were calculated. In DNA hybridization tests performed to discriminate different mismatches, recombinant S-layer-streptavidin fusion protein matrices showed their potential for new microarrays. Moreover, SCWP-coated gold chips, covered with a controlled and oriented assembly of S-layer fusion proteins, represent an even more sensitive fluorescence testing platform. Additionally, S-layer fusion proteins as the matrix for LHCII immobilization demonstrated clear advantages over routine approaches, supporting their use as a new strategy for biomolecular coupling. In the study of the SPFS hCG immunoassay, the biophysical and immunological characteristics of this glycoprotein hormone were presented first. After investigating the effect of the biotin thiol dilution on the coupling efficiency, an interfacial binding model comprising an appropriate binary SAM structure and the versatile streptavidin-biotin interaction was chosen as the basic supramolecular architecture for the fabrication of an SPFS-based immunoassay. Next, the affinity characteristics between different antibodies and hCG were measured via an equilibrium binding analysis, the first titration of such a high-affinity interaction by SPFS. The results agree very well with constants reported in the literature.
Finally, a sandwich assay and a competitive assay were selected as templates for SPFS-based hCG detection, and an excellent LOD of 0.15 mIU/ml was attained via the "one-step" sandwich method. Such high sensitivity not only fulfills clinical requirements, but is also better than that of most other biosensors. Fully understanding how LHCII complexes transfer sunlight energy directionally and efficiently to the reaction center is potentially useful for constructing biomimetic devices such as solar cells. After an introduction to the structural and spectroscopic features of LHCII, different surface immobilization strategies for LHCII were summarized. Among them, the strategy based on the His-tag and the immobilized metal (ion) affinity chromatography (IMAC) technique was of greatest interest and resulted in different kinds of home-fabricated His-tag chelating chips. Their substantial protein coupling capacity, maintenance of high biological activity, and remarkably repeatable binding ability on the same chip after regeneration were demonstrated. Moreover, different parameters related to the stability of surface-coupled reconstituted complexes, including sucrose, detergent, lipid, oligomerization, temperature and circulation rate, were evaluated in order to standardize the most effective immobilization conditions. In addition, partial lipid bilayers obtained from the fusion of LHCII-containing proteoliposomes on the surface were observed by the QCM technique. Finally, the inter-complex energy transfer between neighboring LHCIIs on a gold-protected silver surface, excited with a blue laser (λ = 473 nm), was recorded for the first time, and the factors influencing the energy transfer efficiency were evaluated.
Abstract:
The field of "computer security" is often considered something between art and science. This is partly due to the lack of widely agreed, standardized methodologies for evaluating the degree of security of a system. This dissertation contributes to this area by investigating the most common security testing strategies applied today and by proposing an enhanced methodology that can be applied to different threat scenarios with the same degree of effectiveness. Security testing methodologies are the first step towards standardized security evaluation processes and an understanding of how security threats evolve over time. This dissertation analyzes some of the most widely used methodologies, identifying differences and commonalities that are useful for comparing them and assessing their quality. The dissertation then proposes a new, enhanced methodology built by keeping the best of every analyzed methodology. The designed methodology is tested on different systems with very effective results, which is the main evidence that it could be applied in practical cases. Most of the dissertation discusses and demonstrates how the presented testing methodology can be applied to such different systems, and even used to evade security measures by inverting its goals and scopes. Real cases are often hard to find in methodology documents; by contrast, this dissertation presents real, practical cases with technical details about how to apply the methodology. Electronic voting systems are the first field test considered, with Pvote and Scantegrity as the two tested electronic voting systems. The usability and effectiveness of the designed methodology for electronic voting systems is demonstrated through this field-case analysis. Furthermore, reputation and antivirus engines have also been analyzed with similar results.
The dissertation concludes by presenting some general guidelines for building a coordination-based approach to electronic voting systems that improves security without decreasing system modularity.
Abstract:
Weak lensing experiments such as the future ESA-accepted mission Euclid aim to measure cosmological parameters with unprecedented accuracy. It is important to assess the precision that can be obtained in these measurements by applying analysis software to mock images that contain the many sources of noise present in real data. In this Thesis, we show a method to perform simulations of observations that produce realistic images of the sky according to the characteristics of the instrument and of the survey. We then use these images to test the performance of the Euclid mission. In particular, we concentrate on the precision of the photometric redshift measurements, which are key data for cosmic shear tomography. We calculate the fraction of the total observed sample that must be discarded to reach the required level of precision, equal to 0.05(1+z) for a galaxy with measured redshift z, for different ancillary ground-based observations. The results highlight the importance of u-band observations, especially to discriminate between low (z < 0.5) and high (z ~ 3) redshifts, and the need for good observing sites, with seeing FWHM < 1 arcsec. We then construct an optimal filter to detect galaxy clusters in photometric galaxy catalogues, and we test it on the COSMOS field, obtaining 27 lensing-confirmed detections. Applying this algorithm to mock Euclid data, we verify the possibility of detecting clusters with mass above 10^14.2 solar masses with a low rate of false detections.
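The redshift-precision requirement quoted above amounts to a simple selection cut on the catalogue. A minimal sketch of such a cut (this is an illustration, not the thesis pipeline; the array names and values are invented):

```python
import numpy as np

def photoz_selection(z_est, sigma_z, tol=0.05):
    """Boolean mask of galaxies whose photometric-redshift scatter
    meets the Euclid-like requirement sigma_z < tol * (1 + z)."""
    return sigma_z < tol * (1.0 + z_est)

# Illustrative catalogue: estimated redshifts and their 1-sigma errors.
z_est = np.array([0.3, 1.0, 2.9])
sigma_z = np.array([0.04, 0.12, 0.15])

mask = photoz_selection(z_est, sigma_z)
retained_fraction = mask.mean()  # fraction of the sample that survives the cut
```

The complement of `retained_fraction` is the discarded fraction discussed in the abstract.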
Abstract:
Natural stones have been widely used in the construction field since antiquity. Building materials undergo decay processes due to mechanical, chemical, physical and biological causes that can act together. Therefore, an interdisciplinary approach is required in order to understand the interaction between the stone and the surrounding environment. Utilization of buildings, inadequate restoration activities and, in general, anthropogenic weathering factors may contribute to this degradation process. For these reasons, in the last few decades new technologies and techniques have been developed and introduced in the restoration field. Consolidants are largely used in the restoration and conservation of cultural heritage in order to improve internal cohesion and to reduce the weathering rate of building materials. It is important to define the penetration depth of a consolidant in order to determine its efficacy. Impregnation mainly depends on the microstructure of the stone (i.e. its porosity) and on the properties of the product itself. Throughout this study, tetraethoxysilane (TEOS) applied to Globigerina limestone samples has been chosen as the object of investigation. After hydrolysis and condensation, TEOS deposits silica gel inside the pores, improving the cohesion of the grains. X-ray computed tomography has been used to characterize the internal structure of the limestone samples, treated and untreated with a TEOS-based consolidant. The aim of this work is to investigate the penetration depth and the distribution of the TEOS inside the porosity, using both traditional approaches and advanced X-ray tomographic techniques, the latter allowing the internal visualization of the materials in three dimensions. Fluid transport properties and porosity have been studied both at the macroscopic scale, by means of capillary uptake tests and radiography, and at the microscopic scale, investigated with X-ray Tomographic Microscopy (XTM).
This allows identifying changes in the porosity, by comparing images before and after the treatment, and locating the consolidant inside the stone. Tests were initially run at the University of Bologna, where characterization of the stone was carried out. The research then continued in Switzerland: X-ray tomography and radiography were performed at Empa, the Swiss Federal Laboratories for Materials Science and Technology, while XTM measurements with synchrotron radiation were run at the Paul Scherrer Institute in Villigen.
Abstract:
Despite the many proposed advantages of nanotechnology, there are increasing concerns about the potential adverse human health and environmental effects that the production of, and subsequent exposure to, nanoparticles (NPs) might pose. In regard to human health, these concerns are founded upon the plethora of knowledge gained from research on the effects observed following exposure to environmental air pollution. It is known that increased exposure to environmental air pollution can cause reduced respiratory health, as well as exacerbate pre-existing conditions such as cardiovascular disease and chronic obstructive pulmonary disease. Such disease states have also been associated with exposure to the NP component contained within environmental air pollution, raising concerns as to the effects of NP exposure. It is not only exposure to accidentally produced NPs, however, that should be approached with caution. Over the past decades, NPs have been specifically engineered for a wide range of consumer, industrial and technological applications. Owing to the inevitable human exposure to NPs resulting from their use in such applications, it is imperative to gain an understanding of how NPs interact with the human body. In vivo research provides a valuable model for gaining immediate and direct knowledge of human exposure to such xenobiotics. This research approach, however, has numerous limitations. Research using in vitro models has therefore increased, as these models provide an inexpensive and high-throughput alternative to in vivo research strategies. Despite such advantages, there are also various restrictions in regard to in vitro research.
Therefore, the aim of this review, in addition to providing a short perspective upon the field of nanotoxicology, is to discuss (1) the advantages and disadvantages of in vitro research and (2) how in vitro research may provide essential information pertaining to the human health risks posed by NP exposure.
Abstract:
In the past few decades, integrated circuits have become a major part of everyday life. Every circuit that is created needs to be tested for faults so that faulty circuits are not sent to end-users. The creation of these tests is time-consuming, costly and difficult to perform on larger circuits. This research presents a novel method for fault detection and test pattern reduction in integrated circuitry under test. By leveraging the FPGA's reconfigurability and parallel processing capabilities, a speed-up in fault detection can be achieved over previous computer simulation techniques. This work presents the following contributions to the field of stuck-at fault detection: We present a new method for inserting faults into a circuit netlist. Given any circuit netlist, our tool can insert multiplexers at the correct internal nodes to aid in fault emulation on reconfigurable hardware. We present a parallel method of fault emulation. The benefit of the FPGA is not only its ability to implement any circuit, but also its ability to process data in parallel. This research exploits this to create a more efficient emulation method that implements numerous copies of the same circuit in the FPGA. We present a new method to identify the most efficient faults. Most methods for determining the minimum number of inputs to cover the most faults require sophisticated software programs that use heuristics. By utilizing hardware, this research is able to process data faster and use a simpler method for minimizing inputs.
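The multiplexer-insertion idea can be illustrated in software before any mapping to an FPGA: each internal net gets a selector that either passes the fault-free value or forces a stuck-at value, and a test vector detects the fault when the faulty and fault-free outputs differ. A minimal simulation sketch (the netlist encoding, gate set and net names here are invented for illustration, not the tool's actual format):

```python
# Gate-level netlist as {output_net: (gate, input_nets)}, listed in
# topological order (dicts preserve insertion order in Python 3.7+).
NETLIST = {
    "n1": ("AND", ("a", "b")),
    "n2": ("OR", ("n1", "c")),
    "out": ("NOT", ("n2",)),
}

GATES = {
    "AND": lambda x, y: x & y,
    "OR": lambda x, y: x | y,
    "NOT": lambda x: 1 - x,
}

def evaluate(inputs, fault=None):
    """Simulate the netlist; fault = (net, stuck_value) emulates the
    inserted multiplexer selecting the forced value on that net."""
    values = dict(inputs)
    for net, (gate, ins) in NETLIST.items():
        v = GATES[gate](*(values[i] for i in ins))
        if fault is not None and fault[0] == net:
            v = fault[1]  # mux selects the stuck-at (faulty) value
        values[net] = v
    return values["out"]

def detects(vector, fault):
    """A test vector detects a fault iff the faulty and fault-free
    circuit outputs differ for that vector."""
    return evaluate(vector) != evaluate(vector, fault)
```

For example, the vector {a: 1, b: 1, c: 0} drives n1 to 1 in the fault-free circuit, so it exposes a stuck-at-0 fault on n1 at the primary output.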
Abstract:
Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating an adverse-effect NOEC (no observed effect concentration) or adverse-effect EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). Given current uncertainties, biomarker NOEC or biomarker EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to make increasing use of biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines).
While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable versus biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision-making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.
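The EC10 end point mentioned above has a closed form if one assumes a sigmoidal Hill-type concentration-response model; the model choice here is purely illustrative and not attributed to the authors or to regulatory practice:

```python
def ec_x(ec50, hill, x):
    """Effect concentration for an x% response under an assumed Hill
    model E(c) = c^h / (c^h + EC50^h); solves E(c) = x/100 for c."""
    p = x / 100.0
    return ec50 * (p / (1.0 - p)) ** (1.0 / hill)
```

By construction ec_x(ec50, h, 50) returns EC50 itself, and EC10 falls below EC50 for any positive Hill slope, which is why EC10 is the more conservative end point for deriving a PNEC.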
Abstract:
The GLAaS algorithm for pretreatment intensity-modulated radiation therapy absolute dose verification, based on the use of amorphous silicon detectors, as described in Nicolini et al. [G. Nicolini, A. Fogliata, E. Vanetti, A. Clivio, and L. Cozzi, Med. Phys. 33, 2839-2851 (2006)], was tested under a variety of experimental conditions to investigate its robustness, its performance, and the possibility of using it in different clinics. GLAaS was therefore tested on a low-energy Varian Clinac (6 MV) equipped with an amorphous silicon Portal Vision PV-aS500 with electronic readout IAS2, and on a high-energy Clinac (6 and 15 MV) equipped with a PV-aS1000 and IAS3 electronics. Tests were performed for three calibration conditions: A: adding buildup on top of the cassette such that SDD-SSD = d(max) and comparing measurements with corresponding doses computed at d(max); B: without adding any buildup on top of the cassette and considering only the intrinsic water-equivalent thickness of the electronic portal imaging device (0.8 cm); and C: without adding any buildup on top of the cassette but comparing measurements against doses computed at d(max). This procedure is similar to that usually applied when in vivo dosimetry is performed with solid-state diodes without sufficient buildup material. Quantitatively, the gamma index (gamma), as described by Low et al. [D. A. Low, W. B. Harms, S. Mutic, and J. A. Purdy, Med. Phys. 25, 656-660 (1998)], was assessed. The gamma index was computed for a distance to agreement (DTA) of 3 mm. The dose difference deltaD was set to 2%, 3%, and 4%. As a measure of the quality of the results, the fraction of field area with gamma larger than 1 (%FA) was scored. Results over a set of 50 test samples (including fields from head and neck, breast, prostate, anal canal, and brain cases) and from long-term routine usage demonstrated the robustness and stability of GLAaS.
In general, the mean values of %FA remain below 3% for deltaD equal to or larger than 3%, while they are slightly larger for deltaD = 2%, with %FA in the range from 3% to 8%. Since its introduction into routine practice, 1453 fields have been verified with GLAaS at the authors' institute (6 MV beam). Using a DTA of 3 mm and a deltaD of 4%, the authors obtained %FA = 0.9 +/- 1.1 for the entire data set while, stratifying according to the dose calculation algorithm, they observed %FA = 0.7 +/- 0.9 for fields computed with the analytical anisotropic algorithm and %FA = 2.4 +/- 1.3 for pencil-beam-based fields, a statistically significant difference between the two groups. If the data are stratified according to field splitting, they observed %FA = 0.8 +/- 1.0 for split fields and 1.0 +/- 1.2 for nonsplit fields, without any significant difference.
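The gamma evaluation used above can be sketched for a one-dimensional dose profile. This is a simplified illustration of the Low et al. metric and the %FA score, not the GLAaS implementation; the defaults mirror the 3 mm / 3% criteria:

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta=3.0, delta_d=0.03):
    """1-D gamma index (Low et al.): for each evaluated point, the
    minimum over reference points of sqrt((dx/DTA)^2 + (dD/deltaD)^2).
    Positions in mm; delta_d is a fraction of the maximum reference dose."""
    dd_norm = delta_d * d_ref.max()
    gam = np.empty_like(d_eval)
    for i, (x, d) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - x) / dta) ** 2      # spatial term
        dose2 = ((d_ref - d) / dd_norm) ** 2  # dose-difference term
        gam[i] = np.sqrt((dist2 + dose2).min())
    return gam

def percent_fa(gam):
    """Fraction of points (field area in 2-D) with gamma > 1, in percent."""
    return 100.0 * np.mean(gam > 1.0)
```

For identical reference and evaluated profiles the gamma index is zero everywhere and %FA = 0; a gross dose error drives gamma above 1 and %FA above 0.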
Abstract:
Rice (Oryza sativa L.) is an important cash crop in Honduras because of the rice lobby's size, willingness to protest, and ability to negotiate favorable price guarantees on a year-to-year basis. Despite the availability of inexpensive irrigation in the study area in Flores, La Villa de San Antonio, Comayagua, the rice farmers do not cultivate the crop using prescribed methods such as land leveling, puddling, and water conservation structures. Soil moisture (volumetric water content) was measured using a soil moisture probe after the termination of the first irrigation within the tillering/vegetative, panicle emergence/flowering, post-flowering/pre-maturation and maturation stages. Yield data were obtained by harvesting 1 m² plots at each soil moisture testing site. Data were analyzed to determine the influence of toposequential position along transects, slope, soil moisture, and farmer on yields. The results showed that toposequential position was more important than slope and soil moisture in determining yields. Soil moisture was not a significant predictor of rice yields. Irrigation politics, precipitation, and land tenure were proposed as the major explanatory variables for this result.
Abstract:
There has been a continuous evolutionary process in asphalt pavement design. In the beginning it was crude and based on past experience. Through research, empirical methods were developed based on material response to specific loading at the AASHO Road Test. Today, pavement design has progressed to a mechanistic-empirical method. This methodology takes into account the mechanical properties of the individual layers and uses empirical relationships to relate them to performance. The mechanical tests used as part of this methodology include dynamic modulus and flow number, which have been shown to correlate with field pavement performance. This thesis was based on a portion of a research project conducted at Michigan Technological University (MTU) for the Wisconsin Department of Transportation (WisDOT). The global scope of this project dealt with the development of a library of values for the mechanical properties of the asphalt pavement mixtures paved in Wisconsin. Additionally, a comparison of the current associated pavement design with that of the new AASHTO Design Guide was conducted. This thesis describes the development of the current pavement design methodology as well as the associated tests as part of a literature review. This report also details the materials that were sampled from field operations around the state of Wisconsin and their testing preparation and procedures. Testing was conducted on available round-robin and three Wisconsin mixtures, and the main results of the research were: The test history of the Superpave SPT (fatigue and permanent deformation dynamic modulus) does not affect the mean response for either dynamic modulus or flow number, but does increase the variability in the flow number test results.
The method of specimen preparation, compacting to test geometry versus sawing/coring to test geometry, does not statistically appear to affect the intermediate- and high-temperature dynamic modulus and flow number test results. The 2002 AASHTO Design Guide simulations support the findings of the statistical analyses that the method of specimen preparation did not impact the performance of the HMA as a structural layer as predicted by the Design Guide software. The methodologies for determining the temperature-viscosity relationship as stipulated by Witczak are sensitive to the viscosity test temperatures employed. An increase in asphalt binder content of 0.3% was found to actually increase the dynamic modulus at the intermediate and high test temperatures, as well as the flow number. This result was based on the testing that was conducted and contradicted previous research and the hypothesis put forth for this thesis; it should be used with caution and requires further review. Based on the limited results presented herein, the asphalt binder grade appears to have a greater impact on performance in the Superpave SPT than aggregate angularity. Dynamic modulus and flow number were shown to increase with traffic level (which requires an increase in aggregate angularity) and with a decrease in air voids, confirming the hypotheses regarding these two factors. Accumulated micro-strain at flow number, as opposed to flow number itself, appeared to be a promising measure for comparing the quality of specimens within a specific mixture. At the current time, the Design Guide and its associated software need to be further improved prior to implementation by owners/agencies.
Abstract:
BACKGROUND: Visual acuity serves as only a rough gauge of macular function. The aim was therefore to ascertain whether an assessment of the central visual field affords a closer insight into visual function after removal of epiretinal membranes and Infracyanine-Green- or Trypan-Blue-assisted peeling of the inner limiting membrane. PATIENTS AND METHODS: Forty-three patients undergoing pars-plana vitrectomy for the removal of epimacular membranes and dye-assisted peeling of the inner limiting membrane using either Infracyanine Green (n = 29; group 1) or Trypan Blue (n = 14; group 2) were monitored prospectively for 12 months. Preoperatively, and 1, 6 and 12 months postoperatively, distance and reading visual acuities were evaluated; the central visual field was assessed by automated static perimetry. RESULTS: Twelve months after surgery, distance and reading visual acuities had improved in both groups, but to a significant degree only in Trypan-Blue-treated eyes. The difference between the two groups was not significant. Likewise at this juncture, the mean size of the visual-field defect remained unchanged in Trypan-Blue-treated eyes (preoperative: 4.3 (SD 2.1) dB; 12 months: 4.0 (2.1) dB (p = 0.15)), but had increased in Infracyanine-Green-treated ones (from 5.3 (3.7) dB to 8.0 (5.2) dB (p = 0.027)). CONCLUSION: Unlike visual acuity, the central visual field had deteriorated in Infracyanine-Green-treated eyes but not in Trypan-Blue-treated eyes 12 months after surgery. Hence, as a predictor of functional outcome, testing of the central visual field may be a more sensitive gauge than visual acuity. Furthermore, Infracyanine Green may have a chronic and potentially clinically relevant effect on the macula which is not reflected in visual acuity.
Abstract:
Contagious caprine pleuropneumonia (CCPP) is a highly contagious disease caused by Mycoplasma capricolum subsp. capripneumoniae that affects goats in Africa and Asia. Currently available methods for the diagnosis of Mycoplasma infection, including cultivation, serological assays, and PCR, are time-consuming and require fully equipped stationary laboratories, which makes them incompatible with testing in the resource-poor settings that are most relevant to this disease. We report a rapid, specific, and sensitive assay employing isothermal DNA amplification using recombinase polymerase amplification (RPA) for the detection of M. capricolum subsp. capripneumoniae. We developed the assay using a specific target sequence in M. capricolum subsp. capripneumoniae, as found in the genome sequence of the field strain ILRI181 and the type strain F38, and further confirmed it in 10 field strains from different geographical regions. Detection limits corresponding to 5 × 10^3 and 5 × 10^4 cells/ml were obtained using genomic DNA and bacterial culture from M. capricolum subsp. capripneumoniae strain ILRI181, while no amplification was obtained from 71 related Mycoplasma isolates or from the Acholeplasma or Pasteurella isolates, demonstrating a high degree of specificity. The assay produces a fluorescent signal within 15 to 20 min and worked well using pleural fluid obtained directly from CCPP-positive animals without prior DNA extraction. We demonstrate that the diagnosis of CCPP can be achieved, with a short sample preparation time and a simple read-out device that can be powered by a car battery, in <45 min in a simulated field setting.
Abstract:
Prenatal diagnosis is traditionally made via invasive procedures such as amniocentesis and chorionic villus sampling (CVS). However, both procedures carry a risk of complications, including miscarriage. Many groups have spent years searching for a way to diagnose a chromosome aneuploidy without putting the fetus or the mother at risk for complications. Non-invasive prenatal testing (NIPT) for chromosome aneuploidy became commercially available in the fall of 2011, with detection rates similar to those of invasive procedures for the common autosomal aneuploidies (Palomaki et al., 2011; Ashoor et al., 2012; Bianchi et al., 2012). Eventually NIPT may become the diagnostic standard of care and reduce invasive-procedure-related losses (Palomaki et al., 2011). The integration of NIPT into clinical practice has the potential to revolutionize prenatal diagnosis; however, it also raises some crucial issues for practitioners. Although the test is now clinically available, no studies have examined the physicians who will be ordering the testing or referring patients to practitioners who do. This study aimed to evaluate the attitudes of OB/GYNs and how they are incorporating the test into clinical practice. Our study shows that most physicians are offering this new, non-invasive technology to their patients, and that their practices are congruent with the literature and available professional society opinions. Those physicians who do not offer NIPT to their patients would like more literature on the topic as well as instructive guidelines from their professional societies. Additionally, this study shows that the practices and attitudes of MFMs and OBs differ. Our population feels that the incorporation of NIPT will change their practices by reducing the number of invasive procedures, possibly replacing maternal serum screening, and that it will simplify prenatal diagnosis.
However, those physicians who do not offer NIPT to their patients are not quite sure how the test will affect their clinical practice. From this study we are able to glean how physicians are incorporating this new technology into their practice and how they feel about the addition to their repertoire of tests. This knowledge gives insight as to how to best move forward with the quickly changing field of prenatal diagnosis.
Abstract:
The goal of the AEgIS experiment is to measure the gravitational acceleration of antihydrogen – the simplest atom consisting entirely of antimatter – with an ultimate precision of 1%. We plan to verify the Weak Equivalence Principle (WEP), one of the fundamental laws of nature, with an antimatter beam. The experiment consists of a positron accumulator, an antiproton trap and a Stark accelerator in a solenoidal magnetic field to form and accelerate a pulsed beam of antihydrogen atoms towards a free-fall detector. The antihydrogen beam passes through a moiré deflectometer to measure the vertical displacement due to the gravitational force. A position- and time-sensitive hybrid detector registers the annihilation points of the antihydrogen atoms and their time-of-flight. The detection principle has been successfully tested with antiprotons and a miniature moiré deflectometer coupled to a nuclear emulsion detector.