823 results for "Corrosion testing and monitoring"
Abstract:
The purpose of this investigation was to develop and implement a general-purpose VLSI (Very Large Scale Integration) test module based on an FPGA (Field Programmable Gate Array) system to verify the mechanical behavior and performance of MEMS sensors, with associated corrective capabilities, and to make use of the evolving SystemC, a new open-source HDL (Hardware Description Language), for the design of the FPGA functional units. SystemC is becoming widely accepted as a platform for modeling, simulating, and implementing systems consisting of both hardware and software components. In this investigation, a dual-axis accelerometer (ADXL202E) and a temperature sensor (TMP03) were used for test module verification. The test module measurements were analyzed for repeatability and reliability, and then compared to the sensor datasheets. Ideas for further study were identified based on the results analysis. ASIC (Application Specific Integrated Circuit) design concepts were also pursued.
Abstract:
Over the last decade, advances and innovations from silicon photonics technology have been observed in the telecommunications and computing industries. This technology, which employs silicon as an optical medium, relies on current CMOS microelectronics fabrication processes to enable medium-scale integration of many nano-photonic devices into photonic integrated circuitry. However, other fields of research, such as optical sensor processing, can also benefit from silicon photonics technology, especially for sensors in which the physical measurement is wavelength encoded.

In this research work, we present the design and application of a thermally tuned silicon photonic device as an optical sensor interrogator. The main device is a micro-ring resonator filter 10 μm in diameter. A photonic design toolkit was developed based on open-source software from the research community. With these tools it was possible to estimate the resonance and spectral characteristics of the filter. From the obtained design parameters, a 7.8 × 3.8 mm optical chip was fabricated using standard micro-photonics techniques. In order to tune the ring resonance, Nichrome micro-heaters were fabricated on top of the device. Some of the fabricated devices were systematically characterized and their tuning responses determined. From the measurements, a ring resonator with a free spectral range of 18.4 nm and a bandwidth of 0.14 nm was obtained. Using just 5 mA, it was possible to tune the device resonance by up to 3 nm.

To apply the device as a sensor interrogator, a model of wavelength estimation based on measuring the time interval between peaks was developed, and simulations were carried out to assess its performance. To test the technique, an experiment using a fiber Bragg grating (FBG) optical sensor was set up; estimates of the sensor's wavelength shift under axial strain agreed with spectrum analyzer measurements to within 22 pm.

Results from this study imply that signals from FBG sensors can be processed with good accuracy using a micro-ring device, with the advantages of its compact size, scalability, and versatility. The system also has additional applications, such as processing optical wavelength shifts from integrated photonic sensors and tracking resonances from laser sources.
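To make the interrogation principle concrete, the following is a toy numpy sketch of the time-interval-between-peaks idea, assuming the heater sweeps the ring resonance linearly in time so that the instant of the photodetector peak maps back to the FBG wavelength. The sweep rate, start wavelength, and Lorentzian line shape are illustrative assumptions; only the 18.4 nm free spectral range and 0.14 nm bandwidth come from the abstract.

    # Toy model: a linearly swept micro-ring resonance interrogating an FBG.
    # All parameter values are illustrative, not the paper's.
    import numpy as np

    FSR_NM = 18.4                  # free spectral range (from the abstract)
    BW_NM = 0.14                   # filter bandwidth (from the abstract)
    SWEEP_START_NM = 1550.0        # assumed resonance position at t = 0
    SWEEP_RATE_NM_PER_S = 3.0e3    # assumed thermal sweep rate

    def detector_trace(fbg_wavelength_nm, t):
        """Photodetector signal as the ring resonance sweeps past the FBG line."""
        ring_nm = SWEEP_START_NM + SWEEP_RATE_NM_PER_S * t
        # Lorentzian line shape peaking where the ring resonance meets the FBG
        return 1.0 / (1.0 + ((ring_nm - fbg_wavelength_nm) / (BW_NM / 2)) ** 2)

    t = np.linspace(0.0, 1e-3, 100_000)          # one 3 nm sweep
    assert SWEEP_RATE_NM_PER_S * t[-1] < FSR_NM  # stay within one FSR so the peak is unambiguous
    trace = detector_trace(1551.2, t)            # FBG shifted by strain (assumed value)
    t_peak = t[np.argmax(trace)]                 # time of the transmission peak
    estimate = SWEEP_START_NM + SWEEP_RATE_NM_PER_S * t_peak
    print(f"estimated FBG wavelength: {estimate:.4f} nm")   # ~1551.2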
Abstract:
The lead author, Nimai Senapati (postdoc), was funded by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 262060 (ExpeER). The research leading to these results received funding principally from the ANR (ANR-11-INBS-0001), AllEnvi, and CNRS-INSU. We would like to thank the National Research Infrastructure 'Agro-écosystèmes, Cycles Biogéochimiques et Biodiversité' (SOERE-ACBB, http://www.soere-acbb.com/fr/) for their support in the field experiment. We are deeply indebted to Christophe deBerranger and Xavier Charrier for their substantial technical assistance, and to Patricia Laville for her valuable suggestions regarding N2O flux estimation.
Abstract:
Assays that assess cell-mediated immune responses performed under Good Clinical Laboratory Practice (GCLP) guidelines are required to provide specific and reproducible results. Defined validation procedures are required to establish the Standard Operating Procedure (SOP), including pass and fail criteria, and to implement positivity criteria. However, little to no guidance is provided on how to perform longitudinal assessment of the key reagents utilized in the assay. Through the External Quality Assurance Program Oversight Laboratory (EQAPOL), an Interferon-gamma (IFN-γ) Enzyme-Linked Immunosorbent Spot (ELISpot) assay proficiency testing program is administered. A limit of acceptable within-site variability was estimated after six rounds of proficiency testing (PT). Previously, a send-out-specific within-site variability limit was calculated for each PT based on the dispersion (variance/mean) of the nine replicate wells of data. Now an overall 'dispersion limit' for within-site variability in the ELISpot PT program has been calculated as a dispersion of 3.3. The utility of this metric was assessed using a control sample to calculate the within-experiment (precision) and between-experiment (accuracy) variability, to determine whether the dispersion limit could be applied to bridging studies (studies that assess lot-to-lot variation of key reagents) for comparing results with new lots against results with old lots. Finally, simulations were conducted to explore how this dispersion limit could guide the number of replicate wells needed for within- and between-experiment variability and the appropriate donor reactivity (number of antigen-specific cells) to be used when evaluating new reagents. Our bridging study simulations indicate that using a minimum of six replicate wells of a control donor sample with reactivity of at least 150 spot-forming cells per well is optimal. To determine significant lot-to-lot variation, the 3.3 dispersion limit should be used for both between- and within-experiment variability.
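As a concrete reading of the metric, here is a minimal Python sketch of the dispersion check: the variance-to-mean ratio of replicate spot counts is compared against the limit. The 3.3 limit, the six-replicate bridging design, and the 150 spot-forming-cells target come from the abstract; the counts themselves are invented example data.

    # Dispersion = variance / mean of replicate ELISpot spot counts.
    # Example counts are invented; the 3.3 limit is from the abstract.
    import numpy as np

    DISPERSION_LIMIT = 3.3

    def dispersion(spot_counts):
        """Variance-to-mean ratio of replicate wells."""
        counts = np.asarray(spot_counts, dtype=float)
        return counts.var(ddof=1) / counts.mean()

    old_lot = [152, 160, 148, 155, 158, 150]   # six replicate wells, old reagent lot
    new_lot = [149, 162, 151, 157, 146, 159]   # six replicate wells, new reagent lot

    # within-experiment (precision): each lot's replicates on their own
    within_ok = all(dispersion(lot) <= DISPERSION_LIMIT for lot in (old_lot, new_lot))
    # between-experiment (accuracy): old- and new-lot replicates pooled
    between = dispersion(old_lot + new_lot)
    print(f"within-experiment dispersion within limit: {within_ok}")
    print(f"between-lot dispersion: {between:.2f} (limit {DISPERSION_LIMIT})")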
Abstract:
Family health history (FHH) in the context of risk assessment has been shown to positively impact risk perception and behavior change. The added value of genetic risk testing is less certain. The aim of this study was to determine the impact of Type 2 Diabetes (T2D) FHH and genetic risk counseling on behavior and its cognitive precursors. Subjects were non-diabetic patients randomized to counseling that included FHH with or without T2D genetic testing. Measurements included weight, BMI, and fasting glucose at baseline and 12 months, and surveys of behavior and cognitive precursors (T2D risk perception and perceived control over disease development) at baseline, 3, and 12 months. 391 subjects enrolled, of whom 312 completed the study. Behavioral and clinical outcomes did not differ across FHH or genetic risk, but cognitive precursors did. Higher FHH risk was associated with a stronger perceived T2D risk (pKendall < 0.001) and with a perception of "serious" risk (pKendall < 0.001). Genetic risk did not influence risk perception overall, but it was correlated with an increased perception of "serious" risk for subjects at moderate (pKendall = 0.04) and average (pKendall = 0.01) FHH risk, though not for the high FHH risk group. Perceived control over T2D risk was high and not affected by FHH or genetic risk. FHH appears to have a strong impact on the cognitive precursors of behavior change, suggesting it could be leveraged to enhance risk counseling, particularly when lifestyle change is desirable. Genetic risk altered perceptions about the seriousness of T2D risk in those with moderate and average FHH risk, suggesting that FHH could be used to selectively identify individuals who may benefit from genetic risk testing.
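For readers unfamiliar with the statistic reported above, the sketch below shows the kind of ordinal association test behind the pKendall values, using scipy.stats.kendalltau on invented data; the category codings are assumptions, not the study's instruments.

    # Kendall's tau between ordinal FHH risk and perceived T2D risk.
    # Data and codings are invented for illustration only.
    from scipy.stats import kendalltau

    fhh_risk       = [0, 0, 1, 1, 2, 2, 2, 0, 1, 2]   # 0 = average, 1 = moderate, 2 = high
    perceived_risk = [1, 2, 2, 3, 4, 5, 4, 1, 3, 5]   # survey scale, 1 = low .. 5 = high

    tau, p_value = kendalltau(fhh_risk, perceived_risk)
    print(f"Kendall tau = {tau:.2f}, p = {p_value:.4f}")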
Abstract:
In an attempt to reduce the heart failure epidemic, screening and prevention will become an increasing focus of management in the wider at-risk population. Refining risk prediction through the use of biomarkers, in isolation or in combination, is emerging as a critical step in this process. The utility of biomarkers to identify disease manifestations before the onset of symptoms and detrimental myocardial damage is proving to be valuable. In addition, biomarkers that predict the likelihood and rate of disease progression over time will help streamline and focus clinical efforts and therapeutic strategies. Importantly, several recent early intervention studies using biomarker strategies are promising and indicate that not only new-onset heart failure but also the development of other cardiovascular conditions can be reduced.
Abstract:
The BlackEnergy malware targeting critical infrastructures has a long history. It evolved over time from a simple DDoS platform into quite sophisticated plug-in-based malware. The plug-in architecture has a persistent malware core with easily installable attack-specific modules for DDoS, spamming, info-stealing, remote access, boot-sector formatting, etc. BlackEnergy has been involved in several high-profile cyber-physical attacks, including the Ukraine power grid attack in December 2015. This paper investigates the evolution of BlackEnergy and its cyber attack capabilities. It presents a basic cyber attack model used by BlackEnergy for targeting industrial control systems. In particular, the paper analyzes the cyber threats BlackEnergy poses to synchrophasor-based systems, which are used for real-time control and monitoring functionality in the smart grid. Several BlackEnergy-based attack scenarios are investigated by exploiting vulnerabilities in two widely used synchrophasor communication standards: (i) IEEE C37.118 and (ii) IEC 61850-90-5. Specifically, the paper addresses reconnaissance, DDoS, man-in-the-middle, and replay/reflection attacks on IEEE C37.118 and IEC 61850-90-5. Further, the paper investigates protection strategies for the detection and prevention of BlackEnergy-based cyber-physical attacks.
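As a toy illustration of one defensive check relevant to the replay attacks discussed above, the sketch below flags synchrophasor data frames whose timestamps fail to advance. SOC (second-of-century) and FRACSEC (fraction of second) are genuine IEEE C37.118 header fields, but the frame representation and the detection rule here are simplified assumptions, not the paper's mechanism.

    # Naive replay detection for synchrophasor data frames: timestamps
    # must strictly advance. The frame layout is a simplified stand-in.
    from dataclasses import dataclass

    @dataclass
    class DataFrame:
        soc: int        # second-of-century timestamp (C37.118 SOC field)
        fracsec: int    # fraction-of-second counts (C37.118 FRACSEC field)
        payload: bytes

    def is_replay(prev: DataFrame, cur: DataFrame) -> bool:
        """Flag a frame whose timestamp does not advance past the previous one."""
        return (cur.soc, cur.fracsec) <= (prev.soc, prev.fracsec)

    frames = [
        DataFrame(1_000_000, 100, b"..."),
        DataFrame(1_000_000, 200, b"..."),
        DataFrame(1_000_000, 100, b"..."),   # replayed copy of an earlier frame
    ]
    for prev, cur in zip(frames, frames[1:]):
        if is_replay(prev, cur):
            print(f"possible replay: SOC={cur.soc} FRACSEC={cur.fracsec}")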
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Recent developments in automation, robotics, and artificial intelligence have pushed these technologies into wider use, and driverless transport systems are already state of the art on certain legs of transportation. This has prompted the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast operational costs, and thus enable comparisons between manned and autonomous cargo vessels. The building process of the model is also described and discussed. Furthermore, the model aims to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data were collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing, which enables both realistic cost estimation and forecasting as well as the identification of critical success factors, thanks to the process orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques. As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it can be argued that activity-based life cycle costing is a suitable method for cost estimation and forecasting in the case of autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs this way, it was argued that the activity-based LCC model is able to facilitate learning from, and continuous improvement of, the autonomous vessel. As for the building process of the model, an individual approach was chosen, while still using the implementation and model-building steps presented in the existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
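A minimal sketch of the activity-based Monte Carlo idea described above, under invented assumptions (the activities, distributions, and figures are placeholders, not the consortium's model): each activity's annual cost is sampled repeatedly and summed over the vessel's life cycle, yielding a cost distribution rather than a single point estimate.

    # Activity-based life cycle cost with Monte Carlo sampling.
    # All activities and cost figures are invented placeholders.
    import numpy as np

    rng = np.random.default_rng(seed=42)
    YEARS, RUNS = 25, 10_000

    activities = {                          # activity -> (mean EUR/year, std dev)
        "remote operation centre": (400_000, 60_000),
        "maintenance":             (900_000, 200_000),
        "insurance":               (250_000, 40_000),
        "port and fairway fees":   (300_000, 50_000),
    }

    totals = np.zeros(RUNS)
    for mean, std in activities.values():
        # one draw per year per run, summed over the whole life cycle
        totals += rng.normal(mean, std, size=(RUNS, YEARS)).sum(axis=1)

    print(f"expected life cycle cost: {totals.mean():,.0f} EUR")
    print(f"5th-95th percentile: {np.percentile(totals, [5, 95]).round(0)}")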
Abstract:
International research on both the intended and unintended outcomes and effects of high-stakes testing shows that the impact of high-stakes tests has important consequences for the participants in the respective educational systems. The purpose of this special issue is to examine the implementation of high-stakes testing in different national school systems and to consider its effects in view of the concept of Educational Governance. (DIPF/Orig.)
Abstract:
Invasive insects that successfully establish in introduced areas can significantly alter natural communities. These pests require specific establishment criteria (e.g. host suitability) that, when known, can help quantify potential damage to infested areas. The emerald ash borer (Agrilus planipennis [Coleoptera: Buprestidae]) is an invasive phloem-feeding pest responsible for the death of millions of ash trees (Fraxinus spp. L.). Over 200 surviving ash trees were previously identified in the Huron-Clinton Metroparks in southeast Michigan. The trees were assessed over a four-year period, and a hierarchical cluster analysis was performed on dieback, vigor, and the presence of signs and symptoms in order to place trees into one of three tolerance groups. The clustering of trees with different responses to emerald ash borer attack suggests that there are different tolerance levels among North American ash trees in southeastern Michigan; the groups were designated apparently tolerant, intermediate, and not tolerant. Adult landing rates and evidence of adult emergence were significantly lower in the apparently tolerant group than in the not tolerant group, but larval survival from eggs placed on trees did not differ between tolerance groups. Therefore, it appears that apparently tolerant trees survive because they are less attractive to adult beetles, which results in fewer eggs being laid on them. Trees in the apparently tolerant group remained of higher vigor over the four years of the study. North American ash may survive the emerald ash borer epidemic due to natural variation and inherent resistance, despite the lack of co-evolutionary history with the beetle.
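To illustrate the analysis step named above, here is a hedged scipy sketch: trees described by dieback, vigor, and sign/symptom measures are standardized and cut into three clusters. The data, scaling, and Ward linkage are illustrative assumptions; only the three measures and the three-group outcome come from the abstract.

    # Hierarchical clustering of trees into three tolerance groups.
    # Measurements are invented; Ward linkage and z-scaling are assumptions.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # rows: trees; columns: % canopy dieback, vigor rating, sign/symptom count
    trees = np.array([
        [10, 4, 1],
        [15, 4, 0],
        [55, 2, 5],
        [60, 2, 6],
        [35, 3, 3],
        [30, 3, 2],
    ], dtype=float)

    # standardize columns so no single measure dominates the distances
    z = (trees - trees.mean(axis=0)) / trees.std(axis=0)

    groups = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
    print(groups)   # e.g. [1 1 2 2 3 3]: one label per tree, three groups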