914 results for Testing and Debugging
Abstract:
A system for the NDT (non-destructive testing) of the integrity of composite materials and of adhesive bonds has been developed to meet industrial requirements. The vibration techniques used were found to be applicable to the development of fluid measuring transducers. The vibrational spectra of thin rectangular bars were used for the NDT work. A machined cut in a bar had a significant effect on the spectrum, but a genuine crack gave an unambiguous response at high amplitudes: the generation of fretting crack noise at frequencies far above that of the drive. A specially designed vibrational decrement meter which, in effect, measures mechanical energy loss enabled a numerical classification of material adhesion to be obtained. This was used to study bars which had been flame- or plasma-sprayed with a variety of materials, and it has become a useful tool in optimising coating methods. A direct industrial application was to classify piston rings of high-performance I.C. engines. Each consists of a cast iron ring with a channel into which molybdenum, a good bearing surface, is sprayed. The NDT classification agreed quite well with the destructive test normally used. The techniques and equipment used for the NDT work were applied to the development of the tuning-fork transducers investigated by Hassan into commercial density and viscosity devices. Using narrowly spaced, large-area tines, a thin lamina of fluid is trapped between them. It stores a large fraction of the vibrational energy which, acting as an inertial load, reduces the frequency. Magnetostrictive and piezoelectric effects, used separately or in combination, enable the fork to be operated through a flange. This allows it to be used in pipeline or 'dipstick' applications. Using a different tine geometry, the viscosity loading can be made predominant. This, as well as the signal decrement of the density transducer, makes a practical viscometer.
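The density transducer above works because the trapped fluid lamina acts as an inertial load that lowers the fork's resonant frequency. As a rough illustration only (this is not the thesis's stated calibration; the two-constant form ρ = A·(f₀/f)² − B is an assumption typical of vibrating-element densitometers), the fork can be calibrated against two reference fluids of known density:

```python
def calibrate(f_air, rho1, f1, rho2, f2):
    """Fit rho = A*(f_air/f)**2 - B from two reference fluids
    (densities rho1, rho2 giving resonant frequencies f1, f2)."""
    x1 = (f_air / f1) ** 2
    x2 = (f_air / f2) ** 2
    A = (rho1 - rho2) / (x1 - x2)
    B = A * x1 - rho1
    return A, B

def density(f, f_air, A, B):
    """Density of an unknown fluid from the measured resonant frequency f."""
    return A * (f_air / f) ** 2 - B
```

Once A and B are fixed, a single frequency reading in the unknown fluid yields its density, which is what makes the flange-mounted fork usable in-line.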
Abstract:
The airway epithelium is the first point of contact in the lung for inhaled material, including infectious pathogens and particulate matter, and protects against toxicity from these substances by trapping and clearance via the mucociliary escalator, the presence of a protective barrier with tight junctions, and the initiation of a local inflammatory response. The inflammatory response involves recruitment of phagocytic cells to neutralise and remove any invading materials and is often modelled using rodents. However, development of valid in vitro airway epithelial models is of great importance due to the restrictions on animal studies for cosmetic compound testing implicit in the 7th Amendment to the European Union Cosmetics Directive. Further, rodent innate immune responses differ fundamentally from those of humans. Pulmonary endothelial cells and leukocytes are also involved in the innate response initiated during pulmonary inflammation. Co-culture models of the airways, in particular where epithelial cells are cultured at air-liquid interface with the presence of tight junctions and differentiated mucociliary cells, offer a solution to this problem. Ideally, validated models will allow for detection of early biomarkers of response to exposure and investigation of the inflammatory response during exposure. This thesis describes the approaches taken towards developing an in vitro epithelial/endothelial cell model of the human airways and the identification of biomarkers of response to exposure to xenobiotics. The model comprised normal human primary microvascular endothelial cells and the bronchial epithelial cell line BEAS-2B or normal human bronchial epithelial cells. BEAS-2B were chosen because their characterisation at air-liquid interface is limited but they are robust in culture, and were thereby predicted to provide a more reliable test system. Proteomics analysis was undertaken on challenged cells to investigate biomarkers of exposure.
BEAS-2B morphology at air-liquid interface was characterised and compared with that of normal human bronchial epithelial cells. The results indicate that BEAS-2B cells at an air-liquid interface form tight junctions, as shown by expression of the tight junction protein zonula occludens-1; to this author's knowledge, this is the first time this result has been reported. The inflammatory response of BEAS-2B air-liquid interface mono-cultures (measured as secretion of the inflammatory mediators interleukin-8 and -6) to Escherichia coli lipopolysaccharide or particulate matter (fine and ultrafine titanium dioxide) was comparable to published data for epithelial cells. Cells were also exposed to polymers of "commercial interest" which were in the nanoparticle range (referred to as particles hereafter). BEAS-2B mono-cultures showed an increased secretion of inflammatory mediators after challenge. Inclusion of microvascular endothelial cells resulted in protection against LPS- and particle-induced epithelial toxicity, measured as cell viability and inflammatory response, indicating the importance of co-cultures for investigations into toxicity. Two-dimensional proteomic analysis of lysates from particle-challenged cells failed to identify biomarkers of toxicity due to assay interference and experimental variability. Separately, decreased plasma concentrations of serine protease inhibitors and of the negative acute-phase proteins transthyretin, histidine-rich glycoprotein and alpha2-HS glycoprotein were identified as potential biomarkers of methyl methacrylate/ethyl methacrylate/butyl acrylate treatment in rats.
Abstract:
This article describes approaches to the automation of testing and documentation in information systems with a graphical user interface. A combination of data mining methods and finite-state-machine theory is used for test automation. Automated creation of software documentation is based on the use of metadata in the documented system; the metadata is built on a graph model. The described approaches improve the performance and quality of the testing and documentation processes.
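The finite-state-machine idea behind GUI test automation can be sketched briefly: screens are states, user actions are transitions, and test cases are action sequences derived from the model. The screen names and actions below are hypothetical, and breadth-first search is just one simple way to obtain a shortest test case reaching each screen (the article's own method is not specified here):

```python
from collections import deque

# Hypothetical GUI model: (screen, action) -> next screen.
FSM = {
    ("login", "submit"): "dashboard",
    ("dashboard", "open_settings"): "settings",
    ("settings", "back"): "dashboard",
    ("dashboard", "logout"): "login",
}

def action_paths(fsm, start):
    """Breadth-first search returning a shortest action sequence
    reaching each screen from `start` (one test case per screen)."""
    paths = {start: []}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        for (src, action), dst in fsm.items():
            if src == state and dst not in paths:
                paths[dst] = paths[state] + [action]
                queue.append(dst)
    return paths
```

Replaying each returned sequence against the real application, and checking that the observed screen matches the model's prediction, gives a basic model-based test suite.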
Abstract:
An essential condition of competitiveness and profitable operation is consumer satisfaction, one of whose defining elements is the relationship between perceived and expected quality. Quality expectations have also been formulated with regard to the internet, one of the defining channels of our time; this is how the definition of online service quality, and, connected with it, the measurement of online-customer satisfaction, came to play a significant role. The aim of this study is to provide a literature review on the topic, to examine the E-S-QUAL and E-RecS-QUAL online-customer satisfaction scales known from the literature, to test their validity under Hungarian conditions and, by making the modifications that appear necessary, to create a scale usable in Hungary. As the basis of online-customer satisfaction measurement, the study surveys theories of consumer perception and evaluation of online service quality, after which the various measurement methods are presented, with a prominent role given to the E-S-QUAL and E-RecS-QUAL scales, which count among the most widely applied methods. The review focuses on websites on which purchases can also be made, and the research was carried out among the customers of a major Hungarian online bookshop. ______ Over the last decade the business-to-consumer online market has been growing very fast. In the marketing literature many studies have been created focusing on understanding and measuring e-service quality (e-sq) and online-customer satisfaction. The aim of the study is to summarize these concepts, analyse the relationship between e-sq and customer loyalty, which increases the competitiveness of companies, and to create a valid and reliable scale for the Hungarian market for measuring online-customer satisfaction.
The basis of the empirical study is the E-S-QUAL and its companion scale, the E-RecS-QUAL, which are widely used multi-item scales measuring e-sq along seven dimensions: efficiency, system availability, fulfilment, privacy, responsiveness, compensation, and contact. The study focuses on the websites customers use to shop online.
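Scales of this kind are typically scored by averaging the Likert items that load on each dimension. A minimal scoring sketch, with hypothetical item codes and 1-5 responses (the actual E-S-QUAL items and any weighting used in the study may differ):

```python
# Hypothetical item-to-dimension mapping for the four core
# E-S-QUAL dimensions; item codes are illustrative only.
DIMENSIONS = {
    "efficiency": ["EFF1", "EFF2", "EFF3"],
    "system_availability": ["SYS1", "SYS2"],
    "fulfilment": ["FUL1", "FUL2"],
    "privacy": ["PRI1", "PRI2"],
}

def dimension_scores(responses):
    """Average the 1-5 Likert items belonging to each dimension."""
    return {
        dim: sum(responses[item] for item in items) / len(items)
        for dim, items in DIMENSIONS.items()
    }
```

Per-dimension means like these are what a validity test on a new market (here, Hungarian online bookshop customers) would compare across samples.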
Abstract:
This study identifies and describes HIV Voluntary Counseling and Testing (VCT) among middle-aged and older Latinas. The rate of new cases of HIV in people aged 45 and older is rapidly increasing, with a 40.6% increase in the number of older Latinas infected with HIV between 1998 and 2002. Despite this increase, there is a paucity of research on this population. This research seeks to address the gap through a secondary data analysis of Latina women. The aim of this study is twofold: (1) to develop and empirically test a multivariate model of VCT utilization for middle-aged and older Latinas; (2) to test how the three individual components of the Andersen Behavioral Model impact VCT for middle-aged and older Latinas. The study is organized around the three major domains of the Andersen Behavioral Model of service use: (a) predisposing factors; (b) enabling characteristics; and (c) need. Logistic regression using structural equation modeling techniques was used to test multivariate relationships of variables on VCT for a sample of 135 middle-aged and older Latinas residing in Miami-Dade County, Florida. Over 60% of participants had been tested for HIV. Provider endorsement was found to be the strongest predictor of VCT (odds ratio [OR] = 6.38), followed by having a clinic as a regular source of healthcare (OR = 3.88). Significant negative associations with VCT included self-rated health status (OR = 0.592), age (OR = 0.927), Spanish proficiency (OR = 0.927), number of sexual partners (OR = 0.613) and consumption of alcohol during sexual activity (OR = 0.549). As this line of inquiry provides a critical glimpse into the VCT of older Latinas, recommendations for enhanced service provision and research are offered.
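The effect sizes above are odds ratios from logistic regression. As a reminder of the metric itself, an odds ratio and its approximate 95% confidence interval (Woolf method) can be computed directly from a 2×2 table; the counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and approximate 95% CI from a 2x2 table:
    a = endorsed & tested,     b = endorsed & not tested,
    c = not endorsed & tested, d = not endorsed & not tested."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi
```

An OR above 1 (e.g. provider endorsement) means the predictor raises the odds of testing; an OR below 1 (e.g. alcohol during sexual activity) means it lowers them.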
Abstract:
The purpose of this investigation was to develop and implement a general-purpose VLSI (Very Large Scale Integration) test module based on an FPGA (Field Programmable Gate Array) system to verify the mechanical behavior and performance of MEMS sensors, with associated corrective capabilities, and to make use of the evolving SystemC, a new open-source HDL (Hardware Description Language), for the design of the FPGA functional units. SystemC is becoming widely accepted as a platform for modeling, simulating and implementing systems consisting of both hardware and software components. In this investigation, a dual-axis accelerometer (ADXL202E) and a temperature sensor (TMP03) were used for the test module verification. Results of the test module measurements were analyzed for repeatability and reliability, and then compared to the sensor datasheets. Ideas for further study were identified based on the results analysis. ASIC (Application Specific Integrated Circuit) design concepts were also pursued.
Abstract:
The lead author, Nimai Senapati (postdoc), was funded by the European Community's Seventh Framework Programme (FP2012-2015) under grant agreement no. 262060 (ExpeER). The research leading to these results received funding principally from the ANR (ANR-11-INBS-0001), AllEnvi and CNRS-INSU. We would like to thank the National Research Infrastructure 'Agro-écosystèmes, Cycles Biogéochimiques et Biodiversité' (SOERE-ACBB, http://www.soere-acbb.com/fr/) for their support in the field experiment. We are deeply indebted to Christophe deBerranger and Xavier Charrier for their substantial technical assistance, and to Patricia Laville for her valuable suggestions regarding N2O flux estimation.
Abstract:
Assays that assess cell-mediated immune responses performed under Good Clinical Laboratory Practice (GCLP) guidelines are required to provide specific and reproducible results. Defined validation procedures are required to establish the Standard Operating Procedure (SOP), include pass and fail criteria, and implement positivity criteria. However, little to no guidance is provided on how to perform longitudinal assessment of the key reagents utilized in the assay. Through the External Quality Assurance Program Oversight Laboratory (EQAPOL), an Interferon-gamma (IFN-γ) Enzyme-linked immunosorbent spot (ELISpot) assay proficiency testing program is administered. A limit of acceptable within-site variability was estimated after six rounds of proficiency testing (PT). Previously, a send-out-specific within-site variability limit was calculated for each PT based on the dispersion (variance/mean) of the nine replicate wells of data. Now an overall 'dispersion limit' for within-site variability in the ELISpot PT program has been calculated as a dispersion of 3.3. The utility of this metric was assessed using a control sample to calculate the within-experiment (precision) and between-experiment (accuracy) variability, to determine whether the dispersion limit could be applied to bridging studies (studies that assess lot-to-lot variations of key reagents) for comparing the accuracy of results with new lots against results with old lots. Finally, simulations were conducted to explore how this dispersion limit could guide the number of replicate wells needed for within- and between-experiment variability and the appropriate donor reactivity (number of antigen-specific cells) to be used for the evaluation of new reagents. Our bridging study simulations indicate that using a minimum of six replicate wells of a control donor sample with a reactivity of at least 150 spot-forming cells per well is optimal. To determine significant lot-to-lot variations, use the 3.3 dispersion limit for between- and within-experiment variability.
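The dispersion metric here is simply the variance-to-mean ratio of replicate well counts. A minimal sketch of applying the 3.3 limit (sample variance is assumed, since the abstract does not specify the estimator):

```python
from statistics import mean, variance

# Within-site variability limit from the EQAPOL ELISpot PT program.
DISPERSION_LIMIT = 3.3

def dispersion(spot_counts):
    """Variance-to-mean ratio of replicate ELISpot well counts.
    (Sample variance is an assumption; the abstract does not specify.)"""
    return variance(spot_counts) / mean(spot_counts)

def within_limit(spot_counts, limit=DISPERSION_LIMIT):
    """True when replicate-well variability is acceptably low."""
    return dispersion(spot_counts) <= limit
```

In a bridging study, both the old-lot and new-lot replicate sets would be required to fall under the limit before their means are compared.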
Abstract:
Family health history (FHH) in the context of risk assessment has been shown to positively impact risk perception and behavior change. The added value of genetic risk testing is less certain. The aim of this study was to determine the impact of Type 2 Diabetes (T2D) FHH and genetic risk counseling on behavior and its cognitive precursors. Subjects were non-diabetic patients randomized to counseling that included FHH with or without T2D genetic testing. Measurements included weight, BMI and fasting glucose at baseline and 12 months, and surveys of behavior and its cognitive precursors (T2D risk perception and control over disease development) at baseline, 3, and 12 months. 391 subjects enrolled, of whom 312 completed the study. Behavioral and clinical outcomes did not differ across FHH or genetic risk, but cognitive precursors did. Higher FHH risk was associated with a stronger perceived T2D risk (pKendall < 0.001) and with a perception of "serious" risk (pKendall < 0.001). Genetic risk did not influence risk perception, but was correlated with an increase in perception of "serious" risk for moderate (pKendall = 0.04) and average FHH risk subjects (pKendall = 0.01), though not for the high FHH risk group. Perceived control over T2D risk was high and not affected by FHH or genetic risk. FHH appears to have a strong impact on cognitive precursors of behavior change, suggesting it could be leveraged to enhance risk counseling, particularly when lifestyle change is desirable. Genetic risk was able to alter perceptions about the seriousness of T2D risk in those with moderate and average FHH risk, suggesting that FHH could be used to selectively identify individuals who may benefit from genetic risk testing.
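The pKendall values above come from Kendall rank-correlation tests between ordinal variables such as FHH risk category and perceived risk. A minimal tau-a sketch (ties are ignored here; the study presumably used a tie-corrected variant with an associated p-value):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall tau-a: (concordant - discordant) / n(n-1)/2.
    Tied pairs contribute nothing (a simplifying assumption)."""
    n = len(x)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)
```

A tau near +1 means higher FHH risk categories consistently go with higher perceived risk, which is the pattern the study reports.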
Abstract:
In an attempt to reduce the heart failure epidemic, screening and prevention will become an increasing focus of management in the wider at-risk population. Refining risk prediction through the use of biomarkers in isolation or in combination is emerging as a critical step in this process. The utility of biomarkers to identify disease manifestations before the onset of symptoms and detrimental myocardial damage is proving to be valuable. In addition, biomarkers that predict the likelihood and rate of disease progression over time will help streamline and focus clinical efforts and therapeutic strategies. Importantly, several recent early intervention studies using biomarker strategies are promising and indicate that not only can new-onset heart failure be reduced but also the development of other cardiovascular conditions.
Abstract:
Ageing and deterioration of infrastructure is a challenge facing transport authorities. In particular, there is a need for increased bridge monitoring in order to provide adequate maintenance and to guarantee acceptable levels of transport safety. The Intelligent Infrastructure group at Queen's University Belfast (QUB) is working on a number of aspects of infrastructure monitoring, and this paper presents summarised results from three distinct monitoring projects carried out by this group. Firstly, the findings from a project on next-generation Bridge Weigh-in-Motion (B-WIM) are reported; this includes full-scale field testing using fibre-optic strain sensors. Secondly, results from early-phase testing of a computer vision system for bridge deflection monitoring are reported. This research seeks to exploit recent advances in image processing technology with a view to developing contactless bridge monitoring approaches. Considering the logistical difficulty of installing sensors on a 'live' bridge, contactless monitoring has some inherent advantages over conventional contact-based sensing systems. Finally, the last section of the paper presents some recent findings on drive-by bridge monitoring. In practice, a drive-by monitoring system will likely require GPS to allow the response of a given bridge to be identified; this study looks at the feasibility of using low-cost GPS sensors for this purpose, via field trials. The three topics outlined above cover a spectrum of SHM approaches, namely wired monitoring, contactless monitoring and drive-by monitoring.
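For drive-by monitoring, deciding which GPS fixes belong to a particular bridge reduces to a distance check against the bridge's known location. A minimal sketch, assuming a simple radial geofence around a hypothetical bridge centre (the paper's actual fix-matching method is not described here):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def on_bridge(fix, bridge_centre, half_length_m):
    """Flag a (lat, lon) fix as 'on the bridge' when it lies within
    half the span length of the bridge centre."""
    return haversine_m(*fix, *bridge_centre) <= half_length_m
```

The positional error of low-cost GPS receivers then translates directly into how tight this geofence can be, which is essentially the feasibility question the field trials address.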
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08