896 results for Software testing. Test generation. Grammars
Abstract:
BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results of current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF, and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all showing LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values using both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was premature recognition of the end-of-test during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977. © 2015 Wiley Periodicals, Inc.
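The paired comparison described above can be sketched as follows. This is an illustration only: the LCI values below are invented for demonstration and are not data from the study, and the t statistic is computed by hand from the paired differences.

```python
# Hedged sketch: a paired t-test comparing LCI values from the automatic
# vs. the refined manual analysis modus. All values are invented
# illustration data, not measurements from the paper.
import math
import statistics

lci_auto = [7.1, 7.8, 8.0, 7.5, 8.2, 7.9, 7.4, 8.1]    # automatic modus
lci_manual = [7.3, 9.4, 9.1, 7.6, 9.6, 8.0, 7.5, 9.2]  # manual modus

# Paired t statistic: mean of within-pair differences over its standard error.
diffs = [m - a for a, m in zip(lci_auto, lci_manual)]
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(len(diffs)))

# LCI above 9 is the elevation threshold used in the abstract.
elevated = sum(1 for v in lci_manual if v > 9)
print(f"paired t = {t_stat:.2f}, measurements with elevated manual LCI: {elevated}")
```

A large positive t here reflects the abstract's finding: the manual modus systematically yields higher LCI than the automatic modus in the same measurements.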
Abstract:
Background. Molecular tests for breast cancer (BC) risk assessment have been reimbursed by health insurers in Switzerland since the beginning of 2015. The main current role of these tests is to help oncologists decide about the usefulness of adjuvant chemotherapy in patients with early-stage endocrine-sensitive and human epidermal growth factor receptor 2 (HER2)-negative BC. These gene expression signatures aim at predicting the risk of recurrence in this subgroup. One of them (OncotypeDx/OT) also predicts the distant metastasis rate with or without the addition of cytotoxic chemotherapy to endocrine therapy. The clinical utility of these tests, in addition to existing so-called "clinico-pathological" prognostic and predictive criteria (e.g. stage, grade, biomarker status), is still debated. We report a single-center, one-year experience of the use of one molecular test (OT) in clinical decision making. Methods. We extracted from the CHUV Breast Cancer Center database the total number of BC cases with estrogen-receptor-positive (ER+), HER2-negative early breast cancer (node-negative (pN0) disease or micrometastases in up to 3 lymph nodes) operated on between September 2014 and August 2015. For the cases from this group in which a molecular test had been decided by the tumor board, we collected the clinicopathologic parameters, the initial tumor board decision, and the final adjuvant systemic therapy decision. Results. A molecular test (OT) was done in 12.2% of patients with ER+, HER2-negative early BC. The median age was 57.4 years and the median invasive tumor size was 1.7 cm. These patients were classified by ODX testing (Recurrence Score) into low-, intermediate-, and high-risk groups in 27.2%, 63.6% and 9% of cases, respectively. Treatment recommendations changed in 18.2% of cases, predominantly from chemotherapy plus endocrine therapy to endocrine treatment alone.
Of 8 patients originally recommended chemotherapy, 25% were recommended endocrine treatment alone after receiving the Recurrence Score result. Conclusions. Though reimbursed by health insurers since January 2015, molecular tests are used moderately in our institution, as per the decision of the multidisciplinary tumor board. They are mainly used to obtain complementary confirmation supporting a decision against chemotherapy. The OncotypeDx Recurrence Score results were in the intermediate group in 66% of the 9 tested cases but helped avoid chemotherapy in 2 patients during the last 12 months.
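The risk-group classification underlying these percentages can be sketched as below. The cutoffs (<18 low, 18-30 intermediate, >=31 high) are the traditionally published Oncotype DX Recurrence Score thresholds, not stated in the abstract itself, and the scores are hypothetical, not the study's cases.

```python
# Minimal sketch: mapping Oncotype DX Recurrence Scores to risk groups.
# Cutoffs are the traditional published thresholds (an assumption here);
# the scores are a hypothetical cohort, not data from the study.
from collections import Counter

def risk_group(recurrence_score: int) -> str:
    if recurrence_score < 18:
        return "low"
    if recurrence_score <= 30:
        return "intermediate"
    return "high"

scores = [12, 21, 25, 19, 30, 24, 28, 17, 35, 22, 26]  # hypothetical cohort
groups = Counter(risk_group(s) for s in scores)
for g in ("low", "intermediate", "high"):
    print(f"{g}: {groups[g]}/{len(scores)} ({100 * groups[g] / len(scores):.1f}%)")
```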
Abstract:
The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed to suspicious lesions rather than taken randomly. The subject of this review is a computer-assisted strategy, MRI/US fusion software-based targeted biopsy, and its performance compared to other sampling methods. Different devices with different methods to register MR images to live TRUS are currently in use to enable software-based targeted biopsy. The main clinical indications of MRI/US fusion software-based targeted biopsy are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy, and the follow-up of patients under active surveillance. Some studies have compared MRI/US fusion software-based targeted versus standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease than standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and diagnostic ability seems to favor the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added to the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness have been verified, the real issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.
Abstract:
A major problem in developmental neurotoxicity (DNT) risk assessment is the lack of toxicological hazard information for most compounds. Therefore, new approaches are being considered to provide adequate experimental data that allow regulatory decisions. This process requires a matching of regulatory needs on the one hand and the opportunities provided by new test systems and methods on the other. Alignment of academically and industrially driven assay development with regulatory needs in the field of DNT is a core mission of the International STakeholder NETwork (ISTNET) in DNT testing. The first meeting of ISTNET was held in Zurich on 23-24 January 2014 in order to explore the application of the adverse outcome pathway (AOP) concept to practical DNT testing. AOPs were considered promising tools to promote test systems development according to regulatory needs. Moreover, the AOP concept was identified as an important guiding principle for assembling predictive integrated testing strategies (ITSs) for DNT. The recommended road map towards AOP-based DNT testing follows a stepwise approach, operating initially with incomplete AOPs for compound grouping and focussing on key events of neurodevelopment. Next steps to be considered in follow-up activities are the use of case studies to further apply the AOP concept in regulatory DNT testing, making use of AOP intersections (common key events) for economic development of screening assays, and addressing the transition from qualitative descriptions to quantitative network modelling.
Abstract:
Mobile phone networks are constantly evolving, offering their customers new services and faster data connections. Testing of the various network protocols relies on telecommunications analyzers, which allow the information flowing across the different interfaces of mobile networks to be examined in detail. The purpose of this work was to design and implement, using the ICONIX process, the test software used in testing a remote monitoring analyzer. The design work comprised the process phases of requirements definition, analysis and preliminary design, and detailed design. The implementation consisted of the corresponding programming work and unit testing. The results of the work were the various diagrams and the program code produced during design and implementation. In addition, the test software was used in functional and performance tests of the remote monitoring analyzer, on the basis of which the fitness of the implemented test software was assessed. The test software was found suitable for testing the remote monitoring analyzer, as both the functional tests and the load tests were completed successfully with it. The ICONIX process was found suitable for designing the test software, even though the test software differs in its operating principle from the example applications used in the sources describing the process. The design phases took time for someone unaccustomed to the process, but on the other hand the resulting plans did not need to be changed during the implementation phase, and the programming work was very straightforward.
Abstract:
Different types of aerosolization and deagglomeration testing systems exist for studying the properties of nanomaterial powders and their aerosols. However, results are dependent on the specific methods used. In order to have well-characterized aerosols, we require a better understanding of how system parameters and testing conditions influence the properties of the aerosols generated. In the present study, four experimental setups delivering different aerosolization energies were used to test the resultant aerosols of two distinct nanomaterials (hydrophobic and hydrophilic TiO2). The reproducibility of results within each system was good. However, the number concentrations and size distributions of the aerosols created varied across the four systems; for number concentrations, e.g., from 10^3 to 10^6 #/cm^3. Moreover, distinct differences were also observed between the two materials with different surface coatings. The article discusses how system characteristics and other pertinent conditions modify the test results. We propose using air velocity as a suitable proxy for estimating energy input levels in aerosolization systems. The information derived from this work will be especially useful for establishing standard operating procedures for testing nanopowders, as well as for estimating their release rates under different energy input conditions, which is relevant for occupational exposure.
Abstract:
AIM: To develop and test the Parental PELICAN Questionnaire, an instrument to retrospectively assess parental experiences and needs during their child's end-of-life care. BACKGROUND: To offer appropriate care for dying children, healthcare professionals need to understand the illness experience from the family perspective. A questionnaire specific to the end-of-life experiences and needs of parents losing a child is needed to evaluate the perceived quality of paediatric end-of-life care. DESIGN: This is an instrument development study applying mixed methods based on recommendations for questionnaire design and validation. METHOD: The Parental PELICAN Questionnaire was developed in four phases between August 2012 and March 2014: phase 1: item generation; phase 2: validity testing; phase 3: translation; phase 4: pilot testing. Psychometric properties were assessed after applying the Parental PELICAN Questionnaire in a sample of 224 bereaved parents in April 2014. Validity testing covered evidence based on content, internal structure, and relations to other variables. RESULTS: The Parental PELICAN Questionnaire consists of approximately 90 items in four slightly different versions accounting for particularities of the four diagnostic groups. The questionnaire's items were structured according to six quality domains described in the literature. Evidence of initial validity and reliability could be demonstrated with the involvement of healthcare professionals and bereaved parents. CONCLUSION: The Parental PELICAN Questionnaire holds promise as a measure to assess parental experiences and needs and is applicable to a broad range of paediatric specialties and settings. Future validation is needed to evaluate its suitability in different cultures.
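One common internal-consistency statistic used in this kind of questionnaire validation is Cronbach's alpha. The abstract does not state which reliability statistics were computed, so the following is only a generic illustration with invented item responses.

```python
# Hedged illustration: Cronbach's alpha, a standard internal-consistency
# measure for questionnaire scales. The item responses are invented;
# the abstract does not specify the statistics actually used.
import statistics

def cronbach_alpha(items):
    """items: list of per-item response lists, all of equal length."""
    k = len(items)
    item_vars = [statistics.variance(item) for item in items]
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent total score
    return (k / (k - 1)) * (1 - sum(item_vars) / statistics.variance(totals))

# Four hypothetical 5-point items answered by six respondents.
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [3, 5, 2, 4, 1, 5],
    [4, 5, 3, 5, 2, 4],
]
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.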
Abstract:
Irritant dermatitis is described as a reversible, non-immunological reaction characterized by lesions of highly variable appearance, ranging from simple redness to blister formation or even necrosis, accompanied by pruritus or a burning sensation following the application of a chemical substance. The traditional predictive test for skin irritation has been, since the 1940s, the Draize test. This test consists of applying a chemical substance to the shaved skin of a rabbit for 4 h and checking at 24 h whether clinical signs of irritation are present. Although questionable from both an ethical and a qualitative standpoint, this method currently remains the most widely used. Since the early 2000s, new in vitro methods have been developed, such as the reconstructed human epidermis (RHE) model. This is a multilayer of well-differentiated keratinocytes obtained from a culture of donated oocytes. However, besides being very costly, this method achieves at best only 76% agreement with the in vivo human test. There is therefore a need to develop a new in vitro method that better reproduces the anatomical and physiological reality found in vivo. Our objective was to develop this new in vitro method. To this end, we worked with human skin harvested directly after abdominoplasty. After preparation with a dermatome, a knife with an adjustable blade for cutting skin to the desired thickness, the skin is mounted in a diffusion cell system. The stratum corneum is then optimally exposed to 1 ml of the test chemical for 4 h. The skin sample is then fixed in formaldehyde to allow the preparation of standard hematoxylin and eosin slides. Irritation is then assessed according to the histopathological criteria of spongiosis, necrosis, and cellular vacuolization.
The results of this first battery of tests are more than promising. Compared with the in vivo results, we obtained 100% concordance for the same 4 substances tested, whether irritant or non-irritant, which is superior to the reconstructed human epidermis model (76%). Moreover, the coefficient of variation across the 3 different series is below 0.1, which indicates good reproducibility within a single laboratory. In the future, this method will have to be tested with a larger number of chemical substances, and its reproducibility evaluated across different laboratories. But this very encouraging first evaluation opens valuable avenues for the future of irritation testing.
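The within-laboratory reproducibility check mentioned above, a coefficient of variation (CV = standard deviation / mean) below 0.1 across the three series, can be sketched as follows. The series values are hypothetical summary scores, not data from the study.

```python
# Minimal sketch of the reproducibility criterion: CV across three
# measurement series, with CV < 0.1 taken as good within-laboratory
# reproducibility. The values are hypothetical, not the study's data.
import statistics

series_scores = [0.82, 0.78, 0.80]  # hypothetical summary score per series
cv = statistics.stdev(series_scores) / statistics.mean(series_scores)
print(f"CV = {cv:.3f}, reproducible: {cv < 0.1}")
```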
Abstract:
Identification of CD8+ cytotoxic T lymphocyte (CTL) epitopes has traditionally relied upon testing of overlapping peptide libraries for their reactivity with T cells in vitro. Here, we pursued deep ligand sequencing (DLS) as an alternative method of directly identifying those ligands that are epitopes presented to CTLs by the class I human leukocyte antigens (HLA) of infected cells. Soluble class I HLA-A*11:01 (sHLA) was gathered from HIV-1 NL4-3-infected human CD4+ SUP-T1 cells. HLA-A*11:01 harvested from infected cells was immunoaffinity purified and acid boiled to release heavy and light chains from peptide ligands that were then recovered by size-exclusion filtration. The ligands were first fractionated by high-pH high-pressure liquid chromatography and then subjected to separation by nano-liquid chromatography (nano-LC)–mass spectrometry (MS) at low pH. Approximately 10 million ions were selected for sequencing by tandem mass spectrometry (MS/MS). HLA-A*11:01 ligand sequences were determined with PEAKS software and confirmed by comparison to spectra generated from synthetic peptides. DLS identified 42 viral ligands presented by HLA-A*11:01, and 37 of these were previously undetected. These data demonstrate that (i) HIV-1 Gag and Nef are extensively sampled, (ii) ligand length variants are prevalent, particularly within Gag and Nef hot spots where ligand sequences overlap, (iii) noncanonical ligands are T cell reactive, and (iv) HIV-1 ligands are derived from de novo synthesis rather than endocytic sampling. Next-generation immunotherapies must factor these nascent HIV-1 ligand length variants and the finding that CTL-reactive epitopes may be absent during infection of CD4+ T cells into strategies designed to enhance T cell immunity.
Abstract:
Objective: The present study was aimed at evaluating the viability of replacing 18F with 99mTc in dose calibrator linearity testing. Materials and Methods: The test was performed with sources of 99mTc (62 GBq) and 18F (12 GBq) whose activities were measured down to values lower than 1 MBq. Ratios and deviations between experimental and theoretical 99mTc and 18F source activities were calculated and subsequently compared. Results: Mean deviations between experimental and theoretical 99mTc and 18F source activities were 0.56 (± 1.79)% and 0.92 (± 1.19)%, respectively. The mean ratio between activities indicated by the device for the 99mTc source as measured with the equipment pre-calibrated to measure 99mTc and 18F was 3.42 (± 0.06), and for the 18F source this ratio was 3.39 (± 0.05); these values were constant over the measurement time. Conclusion: The results of the linearity test using 99mTc were compatible with those obtained with the 18F source, indicating the viability of utilizing either radioisotope in dose calibrator linearity testing. This information, together with the high potential for radiation exposure and the costs involved in 18F acquisition, suggests 99mTc as the element of choice for dose calibrator linearity tests in centers that use 18F, without any detriment to the procedure or to the quality of the nuclear medicine service.
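The comparison of experimental against theoretical activities rests on the radioactive decay law A(t) = A0 · exp(-ln(2) · t / T½). The following sketch computes a theoretical activity and a percent deviation; the half-life used is the commonly tabulated value for 99mTc (about 6.01 h), and the measured reading is invented, not from the study.

```python
# Hedged sketch of the decay-based linearity comparison. The half-life is
# the commonly tabulated value for 99mTc; the measured activity is invented.
import math

def theoretical_activity(a0, t_hours, half_life_hours):
    """Decay law: A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return a0 * math.exp(-math.log(2) * t_hours / half_life_hours)

def percent_deviation(measured, theoretical):
    return 100.0 * (measured - theoretical) / theoretical

T_TC99M = 6.01   # hours, tabulated 99mTc half-life
a0 = 62.0        # GBq, initial 99mTc activity as in the study
t = 12.02        # hours, i.e. exactly two half-lives

theo = theoretical_activity(a0, t, T_TC99M)  # two half-lives -> a0 / 4
dev = percent_deviation(15.6, theo)          # 15.6 GBq: hypothetical reading
print(f"theoretical = {theo:.2f} GBq, deviation = {dev:+.2f}%")
```

The linearity test repeats this comparison as the source decays over several half-lives, checking that the deviation stays within tolerance down to the lowest activities.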
Abstract:
Many educators and educational institutions have yet to integrate web-based practices into their classrooms and curricula. As a result, it can be difficult to prototype and evaluate approaches to transforming classrooms from static endpoints to dynamic, content-creating nodes in the online information ecosystem. But many scholastic journalism programs have already embraced the capabilities of the Internet for virtual collaboration, dissemination, and reader participation. Because of this, scholastic journalism can act as a test-bed for integrating web-based sharing and collaboration practices into classrooms. Student Journalism 2.0 was a research project to integrate open copyright licenses into two scholastic journalism programs, to document outcomes, and to identify recommendations and remaining challenges for similar integrations. Video and audio recordings of two participating high school journalism programs informed the research. In describing the steps of our integration process, we note some important legal, technical, and social challenges. Legal worries such as uncertainty over copyright ownership could lead districts and administrators to disallow open licensing of student work. Publication platforms among journalism classrooms are far from standardized, making any integration of new technologies and practices difficult to achieve at scale. And teachers and students face challenges re-conceptualizing the role their class work can play online.
Abstract:
This master's thesis investigates the suitability of a three-dimensional machine vision system for testing mobile phone subassemblies in mass production. The goal was to obtain information about the usability of the 3D system in terms of performance and capacity, and to determine the system's flexibility when the products under test are changed. Another aim was to find out whether the 3D system solves the problems encountered with the 2D system currently in use. The performance of the system was studied by carrying out test runs, the results of which were analyzed with statistical quality control software. The actual development work was done by a subcontractor specializing in machine vision systems, with experience of 3D testing in other fields of industry. In the testing of mobile phone subassemblies, however, the quality criteria and accuracy requirements proved too demanding to be met with a test system based on 3D laser triangulation.
Abstract:
One of the primary goals for food packages is to protect food against a harmful environment, especially oxygen and moisture. The gas transmission rate is the total gas transport through the package, both by permeation through the package material and by leakage through pinholes and cracks. The shelf life of a product can be extended if the food is stored in a gas-tight package. Thus there is a need to test the gas tightness of packages. There are several tightness testing methods, and they can be broadly divided into destructive and nondestructive methods. One of the most sensitive ways to detect leaks is a nondestructive tracer gas technique. Carbon dioxide, helium and hydrogen are the most commonly used tracer gases. Hydrogen is the lightest and the smallest of all gases, which allows it to escape rapidly from leak areas. The low background concentration of H2 in air (0.5 ppm) enables sensitive leak detection. With a hydrogen leak detector it is also possible to locate leaks, which is not possible with many other tightness testing methods. The experimental work focused on investigating the factors which affect the measurement results obtained with the H2 leak detector. Reasons for false results were also sought, so that they could be avoided in upcoming measurements. From the results of these experiments, an appropriate measurement practice was created in order to obtain correct and repeatable results. The most important requirement for good measurement results is to keep the probe of the detector tightly against the leak. Because of its high diffusion rate, the H2 concentration decreases quickly when the probe is held further away from the leak area; the measured H2 leaks would then be incorrect, and small leaks could go undetected. In the experimental part, hydrogen, oxygen and water vapour transmissions through laser-beam reference holes (diameters 1-100 μm) were also measured and compared.
With the H2 leak detector it was possible to detect even leakage through a 1 μm (diameter) hole within a few seconds. Water vapour did not penetrate even the largest reference hole (100 μm), even under tropical conditions (38 °C, 90% RH), whereas some O2 transmission occurred through the reference holes larger than 5 μm. Thus water vapour transmission does not have a significant effect on food deterioration if the diameter of the leak is less than 100 μm, but small leaks (5-100 μm) are more harmful for food products that are sensitive to oxidation.
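The detection-margin idea behind the tracer gas method, that hydrogen's low ambient background (~0.5 ppm) lets even small leak signals stand out, can be sketched as a simple threshold check. The 3x-background margin and the readings below are assumptions for illustration, not values from the text.

```python
# Toy sketch: flagging H2 readings that stand out against the ~0.5 ppm
# ambient background. The 3x margin and the readings are illustrative
# assumptions, not values from the study.
H2_BACKGROUND_PPM = 0.5

def is_leak(reading_ppm, margin=3.0):
    """Flag a reading as a leak when it exceeds `margin` times background."""
    return reading_ppm > margin * H2_BACKGROUND_PPM

readings = [0.5, 0.6, 4.2, 0.7, 12.5]  # ppm, hypothetical scan along a seal
leaks = [r for r in readings if is_leak(r)]
print(f"leak readings: {leaks}")
```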
Abstract:
Control of regional government budgets is important in a monetary union, as lower tiers of government have fewer incentives to consolidate debt. According to the Fiscal Theory of the Price Level, unsustainable non-Ricardian fiscal policies eventually force monetary policy to adjust. Hence, uncoordinated and non-regulated regional fiscal policies would threaten price stability for the monetary union as a whole. However, the union central bank is not without defense. A federal government that internalises the spillover effect of non-Ricardian fiscal policies on the price level can offset non-Ricardian regional fiscal policies. A federal government, which taxes and transfers resources between regions, may compensate for unsustainable regional fiscal policies so as to keep fiscal policy Ricardian on aggregate. Following Canzoneri et al. (2001), we test the validity of the Fiscal Theory of the Price Level for both federal and regional governments in Germany. We find evidence of a spillover effect of unsustainable policies on the price level for other Länder. However, the German federal government offsets this effect on the price level by running Ricardian policies. These results have implications for the regulation of fiscal policies in the EMU.
Abstract:
The objective of this thesis is to shed light on the vertical vibration of granular materials for potential interest in the power generation industry. The main focus is investigating the drag force and frictional resistance that influence the movement of a granular material (in the form of glass beads) contained in a vessel subjected to sinusoidal oscillation. The thesis is divided into three parts: theoretical analysis, experiments and computer simulations. The theoretical part of this study presents the underlying physical phenomena of the vibration of granular materials. Experiments are designed to determine fundamental parameters that contribute to the behavior of vibrating granular media. Numerical simulations include the use of three different software applications: FLUENT, LS-DYNA and ANSYS Workbench. The goal of these simulations is to test theoretical and semiempirical models for granular materials in order to validate their compatibility with the experimental findings, to assist in predicting their behavior, and to estimate quantities that are hard to measure in the laboratory.
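For sinusoidal vessel oscillation x(t) = A sin(2πft), the peak acceleration is A(2πf)² and the dimensionless acceleration Γ = A(2πf)²/g is the standard control parameter for vibrated granular media; Γ > 1 is roughly where the bed can leave the vessel floor. The sketch below uses these textbook relations with made-up amplitude and frequency, not parameters from the thesis.

```python
# Illustrative sketch (not from the thesis): dimensionless acceleration
# Gamma for a sinusoidally vibrated vessel. Amplitude and frequency are
# hypothetical values chosen for demonstration.
import math

g = 9.81            # m/s^2, gravitational acceleration
amplitude = 0.002   # m  (hypothetical)
freq = 25.0         # Hz (hypothetical)

peak_accel = amplitude * (2 * math.pi * freq) ** 2  # A * (2*pi*f)^2
gamma = peak_accel / g
print(f"peak acceleration = {peak_accel:.2f} m/s^2, Gamma = {gamma:.2f}")
```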