986 results for testing tools


Relevance:

30.00%

Publisher:

Abstract:

Most studies investigating the determinants of R&D investment consider pooled estimates. However, if the parameters are heterogeneous, pooled coefficients may not provide reliable estimates of individual industry effects. Hence, pooled parameters may conceal valuable information that could help target government policy tools more efficiently across heterogeneous industries. There is little evidence to date on the decomposition of the determinants of R&D investment by industry. Moreover, the existing work does not distinguish between those R&D determinants for which pooling may be valid and those for which it is not. In this paper, we test the pooling assumption for a panel of manufacturing industries and find that pooling is valid only for output fluctuations, adjustment costs and interest rates. Incorporating the test results into our model, we find that government funding is significant only for low-tech R&D. Foreign R&D and skilled labour matter only in high-tech sectors. These results have important implications for R&D policy.
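As an illustration of the kind of poolability check described above, the sketch below contrasts a pooled OLS fit with industry-specific fits via a Chow-type F-test. The DataFrame columns and regressor names are hypothetical, and this is not the paper's exact specification.

```python
# Minimal sketch of a Chow-type poolability test for panel data, assuming a
# DataFrame `df` with columns "industry", "rd" (R&D investment) and the
# hypothetical regressors below; an illustration, not the authors' model.
import pandas as pd
import statsmodels.api as sm
from scipy import stats

REGRESSORS = ["output", "adjustment_cost", "interest_rate"]  # hypothetical names

def poolability_test(df: pd.DataFrame) -> tuple[float, float]:
    """F-test of equal slopes across industries (pooled vs. industry-specific OLS)."""
    y, X = df["rd"], sm.add_constant(df[REGRESSORS])
    rss_pooled = sm.OLS(y, X).fit().ssr

    rss_separate, n_groups = 0.0, 0
    for _, g in df.groupby("industry"):
        yg, Xg = g["rd"], sm.add_constant(g[REGRESSORS])
        rss_separate += sm.OLS(yg, Xg).fit().ssr
        n_groups += 1

    k = X.shape[1]                        # parameters per industry model (incl. constant)
    df1 = (n_groups - 1) * k              # restrictions imposed by pooling
    df2 = len(df) - n_groups * k          # residual degrees of freedom
    f_stat = ((rss_pooled - rss_separate) / df1) / (rss_separate / df2)
    p_value = stats.f.sf(f_stat, df1, df2)
    return f_stat, p_value
```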

Relevance:

30.00%

Publisher:

Abstract:

Veterinary medicines (VMs) from the agricultural industry can enter the environment in a number of ways. These include direct exposure through aquaculture, accidental spillage and disposal, and indirect entry by leaching from manure or runoff after treatment. Many compounds used in animal treatments have ecotoxic properties that may have chronic or sometimes lethal effects when they come into contact with non-target organisms. VMs enter the environment in mixtures, potentially having additive effects. Traditional ecotoxicology tests are used to determine the lethal and sometimes reproductive effects on freshwater and terrestrial organisms. However, the organisms used in ecotoxicology tests can be unrepresentative of the populations that are likely to be exposed to the compound in the environment. Most often, the tests address single-compound toxicity, but mixture effects may be significant and should be included in ecotoxicology testing.

This work investigates the use, measured environmental concentrations (MECs) and potential impact of sea lice treatments on salmon farms in Scotland. Alternative methods for ecotoxicology testing, including mixture toxicity and the use of in silico techniques to predict the chronic impact of VMs on different species of aquatic organisms, were also investigated. The Scottish Environment Protection Agency (SEPA) provided information on the use of five sea lice treatments from 2008-2011 on Scottish salmon farms. This information was combined, using ArcGIS 10.1, with recently available data on sediment MECs for the years 2009-2012 provided by SEPA. In-depth analysis of these data showed that, of a total of 55 sites, 30 had a MEC higher than the maximum allowable concentration (MAC) set by SEPA for emamectin benzoate and 7 had a MEC higher than the MAC for teflubenzuron. A number of sites up to 16 km away from the nearest salmon farm reported as using either emamectin benzoate or teflubenzuron tested positive for the two treatments. There was no relationship between current direction and the distribution of the sea lice treatments, nor was there any evidence for alternative sources of the compounds, e.g. land treatments. The sites with MECs higher than the MAC could pose a risk to non-target organisms and disrupt the species dynamics of the area. There was evidence that some marine protected sites might be at risk of exposure to these compounds.

To complement this work, the acute mixture toxicity of the five sea lice treatments, plus one major metabolite, 3-phenoxybenzoic acid (3PBA), was measured using an assay based on the bioluminescent bacterium Aliivibrio fischeri. When exposed to the five sea lice treatments and 3PBA, A. fischeri responded to 3PBA, emamectin benzoate and azamethiphos, as well as to combinations of the three. To establish any additive effect of the sea lice treatments, the efficacy of two mixture prediction equations, concentration addition (CA) and independent action (IA), was tested using the results from single-compound dose-response curves. In this instance IA was the more effective prediction method, with a linear regression confidence interval of 82.6% compared with 22.6% for CA.

In silico molecular docking was carried out to predict the chronic effects of 15 VMs (including the five used for sea lice control). Molecular docking has been proposed as an alternative screening method for the chronic effects of large animal treatments on non-target organisms.
The oestrogen receptor alpha (ERα) of 7 non-target bony fish and of the African clawed frog Xenopus laevis was modelled using SwissModel. These models were then ‘docked’, using AutoDock 4, to oestradiol, the synthetic oestrogen ethinylestradiol, two known xenoestrogens, dichlorodiphenyltrichloroethane (DDT) and bisphenol A (BPA), the anti-oestrogen breast cancer treatment tamoxifen, and the 15 VMs. Based on the results of this work, four VMs were identified as possible xenoestrogens or anti-oestrogens: cypermethrin, deltamethrin, fenbendazole and teflubenzuron. Further investigation of these four VMs using in vitro assays has been suggested as future work. A modified recombinant yeast oestrogen screen (YES) was attempted using the cDNA of the ERα of the zebrafish Danio rerio and the rainbow trout Oncorhynchus mykiss. Due to time constraints and difficulties with the cloning protocols, this work could not be completed. Such in vitro assays would allow further investigation of the oestrogenic potential of the highlighted VMs. In conclusion, VMs used as sea lice treatments, such as teflubenzuron and emamectin benzoate, may be more persistent and have a wider range in the environment than previously thought. Mixtures of sea lice treatments have been found to persist together in the environment, and the effects of these mixtures on the bacterium A. fischeri can be predicted using the IA equation. Finally, molecular docking may be a suitable tool to predict chronic endocrine-disrupting effects and identify varying degrees of impact on the ERα of nine species of aquatic organisms.
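For reference, the two mixture-prediction models named above can be written down in a few lines. The sketch below shows the standard concentration addition and independent action formulas with placeholder numbers, not the thesis' measured dose-response data.

```python
# Minimal sketch of the two mixture-prediction models compared in the abstract:
# concentration addition (CA) and independent action (IA). The single-compound
# values below are placeholders, not measured results.
import numpy as np

def ca_ec50_mix(fractions, ec50s):
    """Concentration addition: mixture EC50 from component EC50s,
    where `fractions` are the relative proportions of each compound."""
    fractions, ec50s = np.asarray(fractions), np.asarray(ec50s)
    return 1.0 / np.sum(fractions / ec50s)

def ia_effect_mix(effects):
    """Independent action: combined effect from individual effects
    (each expressed as a fraction between 0 and 1)."""
    effects = np.asarray(effects)
    return 1.0 - np.prod(1.0 - effects)

# Example: three compounds in equal proportions (hypothetical numbers)
print(ca_ec50_mix([1/3, 1/3, 1/3], ec50s=[0.2, 1.5, 4.0]))
print(ia_effect_mix([0.10, 0.25, 0.05]))   # ~0.36 combined inhibition
```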

Relevance:

30.00%

Publisher:

Abstract:

Background: It is important to assess the clinical competence of nursing students to gauge their educational needs. Competence can be measured by self-assessment tools; however, Anema and McCoy (2010) contend that currently available measures should be further psychometrically tested.
Aim: To test the psychometric properties of the Nursing Competencies Questionnaire (NCQ) and the Self-Efficacy in Clinical Performance (SECP) clinical competence scales.
Method: A non-randomly selected sample of n = 248 second-year nursing students completed the NCQ, SECP and demographic questionnaires (June and September 2013). Mokken Scaling Analysis (MSA) was used to investigate structural validity and scale properties; convergent and discriminant validity and reliability were also tested for each scale.
Results: MSA identified the NCQ as a unidimensional scale with a strong scale scalability coefficient (Hs = 0.581) but limited item rankability (HT = 0.367). For the SECP, MSA suggested that the scale could potentially be split into two unidimensional scales (SECP28 and SECP7), each with good/reasonable scalability as a summed scale but negligible/very limited scale rankability (SECP28: Hs = 0.55, HT = 0.211; SECP7: Hs = 0.61, HT = 0.049). Analysis of between-cohort differences and NCQ/SECP scores produced evidence of discriminant and convergent validity; good internal reliability was also found: NCQ α = 0.93, SECP28 α = 0.96 and SECP7 α = 0.89.

Discussion: In line with previous research, further evidence of the NCQ’s reliability and validity was demonstrated. However, as the SECP findings are new and the sample is small with reference to Straat and colleagues (2014), the SECP results should be interpreted with caution and verified on a second sample.
Conclusions: Measurement of perceived self-competence could start early in a nursing programme to support students’ development of clinical competence. Further testing of the SECP scale with larger nursing student samples from different programme years is indicated.

References:
Anema, M.G. and McCoy, J.K. (2010) Competency-Based Nursing Education: Guide to Achieving Outstanding Learner Outcomes. New York: Springer.
Straat, J.H., van der Ark, L.A. and Sijtsma, K. (2014) Minimum Sample Size Requirements for Mokken Scale Analysis. Educational and Psychological Measurement, 74(5), 809-822.
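As a small illustration of the reliability analysis reported above, the sketch below computes Cronbach's alpha for a matrix of item scores. The data are simulated, and the Mokken scalability coefficients (Hs, HT) would require dedicated software such as the R mokken package, which is not reproduced here.

```python
# Minimal sketch of an internal-consistency (Cronbach's alpha) check, assuming
# `items` is an (n_respondents x n_items) array of Likert-type scores.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a matrix of item scores (rows = respondents)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                           # number of items
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Example with random Likert-type data (illustration only; random responses
# give an alpha near zero, unlike the correlated items of a real scale)
rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(248, 28))        # 248 students, 28 items
print(round(cronbach_alpha(demo), 2))
```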

Relevance:

30.00%

Publisher:

Abstract:

Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested and maintained, these shared components are repeatedly manipulated. As a result there are often significant overlaps and synergies across and among the different test efforts of different component-based systems. However, in practice, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve the test effectiveness for each component as well as for each component-based software system by using information obtained when testing across multiple components. To achieve this goal I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying the techniques. The dissertation research is organized in three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured exists. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes to implement different collaborative testing algorithms and applied them to large actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components. With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.
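A minimal sketch of the core idea follows, assuming nothing about the dissertation's actual infrastructure: a shared registry of test outcomes keyed by component, version and test, which lets testers of different systems skip work already done elsewhere and see faults found by others.

```python
# Illustrative sketch (not the dissertation's tooling) of a shared registry of
# test outcomes keyed by (component, version, test).
from dataclasses import dataclass

@dataclass(frozen=True)
class TestKey:
    component: str
    version: str
    test_id: str

class SharedTestRegistry:
    """In-memory stand-in for the shared data infrastructure described above."""
    def __init__(self):
        self._results: dict[TestKey, bool] = {}

    def report(self, key: TestKey, passed: bool) -> None:
        self._results[key] = passed

    def already_tested(self, key: TestKey) -> bool:
        return key in self._results

    def failures_for(self, component: str) -> list[TestKey]:
        # Lets other teams see compatibility faults found elsewhere.
        return [k for k, ok in self._results.items()
                if k.component == component and not ok]

# Usage: a tester checks the registry before re-running a shared component's suite.
registry = SharedTestRegistry()
key = TestKey("libparser", "2.1.0", "test_unicode_input")   # hypothetical names
if not registry.already_tested(key):
    registry.report(key, passed=True)
```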

Relevance:

30.00%

Publisher:

Abstract:

In Brazil, human and canine visceral leishmaniasis (CVL) caused by Leishmania infantum have undergone urbanisation since 1980, constituting a public health problem, and serological tests are the tools of choice for identifying infected dogs. Until recently, the Brazilian zoonoses control program recommended enzyme-linked immunosorbent assays (ELISA) and indirect immunofluorescence assays (IFA) as the screening and confirmatory methods, respectively, for the detection of canine infection. The purpose of this study was to estimate the accuracy of ELISA and IFA in parallel or serial combinations. The reference standard comprised the results of direct visualisation of parasites in histological sections, immunohistochemical testing, or isolation of the parasite in culture. Samples from 98 cases and 1,327 non-cases were included. Individually, the tests presented sensitivities of 91.8% and 90.8%, and specificities of 83.4% and 53.4%, for ELISA and IFA, respectively. When the tests were used in parallel combination, sensitivity reached 99.2%, while specificity dropped to 44.8%. When they were used in serial combination (ELISA followed by IFA), decreased sensitivity (83.3%) and increased specificity (92.5%) were observed. The serial testing approach improved specificity with a moderate loss in sensitivity. This strategy could partially fulfill the needs of public health services and dog owners for a more accurate diagnosis of CVL.
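The parallel and serial figures reported above follow from the individual sensitivities and specificities if the two tests are assumed to behave independently given the true infection status; the short check below reproduces them under that simplifying assumption.

```python
# Worked check of the parallel/serial figures, assuming conditional independence
# of the two tests given true infection status (a simplification).
def parallel(se1, sp1, se2, sp2):
    """Positive if either test is positive."""
    se = 1 - (1 - se1) * (1 - se2)
    sp = sp1 * sp2
    return se, sp

def serial(se1, sp1, se2, sp2):
    """Positive only if both tests (ELISA, then IFA) are positive."""
    se = se1 * se2
    sp = 1 - (1 - sp1) * (1 - sp2)
    return se, sp

elisa = (0.918, 0.834)
ifa = (0.908, 0.534)
print(parallel(*elisa, *ifa))  # ~ (0.992, 0.445) vs. reported 99.2% / 44.8%
print(serial(*elisa, *ifa))    # ~ (0.834, 0.923) vs. reported 83.3% / 92.5%
```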

Relevance:

30.00%

Publisher:

Abstract:

The Internet has changed the way in which organizations communicate with their publics, and museums are no exception. The consolidation of Web 2.0 has not only given museums access to a powerful new tool for disseminating information, but has also involved significant changes in the relationship between institutions and their publics, facilitating and enhancing the interaction between them. The overall objective of this paper is to analyze the degree of interactivity implemented in the websites of major international art museums, in order to assess whether museums are evolving towards more dialogic systems in relation to their publics. The results indicate that museums still have a low level of interactivity on their websites, both in the tools used to present information and in the resources available for interaction with virtual visitors. However, it was also observed that museums are progressively implementing interactive and dialogic resources, suggesting a clear trend towards new ways of managing these platforms in order to establish more participatory and collaborative communication systems with virtual users.

Relevance:

30.00%

Publisher:

Abstract:

Technology has an important role in children's lives and education. Based on several projects developed with ICT in both Early Childhood Education (3-6 years old) and Primary Education (6-10 years old) since 1997, the authors argue that research and educational practices need to "go outside", addressing ways to connect technology with outdoor education. The experience with the projects and initiatives developed supported a conceptual framework, theoretically informed and developed and discussed with several partners throughout the years. Three main principles, or axes, have emerged: strengthening Children's Participation, promoting Critical Citizenship and establishing strong Connections to Pedagogy and Curriculum. In this paper, these axes are presented and discussed in relation to the challenge posed by Outdoor Education to the way ICT in Early Childhood and Primary Education is understood, promoted and researched. The paper is exploratory, attempting to connect theoretical and conceptual contributions from Early Childhood Pedagogy with contributions from ICT in Education. The research-based knowledge available is still scarce, mostly based on studies developed with other purposes. The paper therefore focuses on the connections and interpellations between the concepts established through the theoretical framework and draws on almost 20 years of experience with large- and small-scale action-research projects on ICT in schools. The most recent one is already testing the conceptual framework by supporting children in non-formal contexts to explore vineyards and the cycle of wine production with several ICT tools. Approaching Outdoor Education as an arena where pedagogical and cultural dimensions influence decisions and practices, the paper seeks to argue that the three axes are relevant in supporting a stronger connection between technology and the outdoors.

Relevance:

30.00%

Publisher:

Abstract:

The world currently faces a paradox in terms of accessibility for people with disabilities. While digital technologies hold immense potential to improve their quality of life, the majority of web content still exhibits critical accessibility issues. This PhD thesis addresses this challenge through two interconnected research branches. The first introduces a groundbreaking approach to improving web accessibility by rethinking how it is approached, making it more accessible itself. It involves the development of:
1. AX, a declarative framework of web components that enforces the generation of accessible markup by means of static analysis.
2. An innovative accessibility testing and evaluation methodology, which communicates test results by exploiting concepts that developers are already familiar with (visual rendering and mouse operability) to convey the accessibility of a page. This methodology is implemented through the SAHARIAN browser extension.
3. A11A, a categorized and structured collection of curated accessibility resources, aimed at helping their intended audiences discover and use them.
The second branch focuses on unleashing the full potential of digital technologies to improve accessibility in the physical world. The thesis proposes the SCAMP methodology to make scientific artifacts accessible to blind and visually impaired individuals and to the general public. It enhances the natural characteristics of objects, making them more accessible through interactive, multimodal, and multisensory experiences. Additionally, the prototype of A11yVT, a system supporting accessible virtual tours, is presented. It provides blind and visually impaired individuals with the features necessary to explore unfamiliar indoor environments, while maintaining universal design principles that make it suitable for use by the general public. The thesis extensively discusses the theoretical foundations, design, development, and unique characteristics of these innovative tools. Usability tests with the intended target audiences demonstrate the effectiveness of the proposed artifacts, suggesting their potential to significantly improve the current state of accessibility.
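To make the testing idea concrete, the sketch below shows one static accessibility check of the kind such tools perform, flagging img elements without a textual alternative. It is an illustration only, not the actual AX or SAHARIAN implementation.

```python
# Minimal sketch of one static accessibility check: collect <img> tags that
# lack an alt attribute. Markup below is a made-up example.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags without an alt attribute."""
    def __init__(self):
        super().__init__()
        self.violations: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            if "alt" not in attr_names:
                self.violations.append(self.get_starttag_text())

markup = '<main><img src="chart.png"><img src="logo.png" alt="Museum logo"></main>'
checker = MissingAltChecker()
checker.feed(markup)
print(checker.violations)   # ['<img src="chart.png">']
```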

Relevance:

30.00%

Publisher:

Abstract:

Knowledge graphs and ontologies are closely related concepts in the field of knowledge representation. In recent years, knowledge graphs have gained increasing popularity and serve as essential components in many knowledge engineering projects that view them as crucial to their success. The conceptual foundation of a knowledge graph is provided by ontologies. Ontology modeling is an iterative engineering process that consists of steps such as the elicitation and formalization of requirements and the development, testing, refactoring, and release of the ontology. Testing the ontology is a crucial and occasionally overlooked step of the process, owing to the lack of integrated tools to support it. As a result of this gap in the state of the art, ontology testing is carried out manually, which requires a considerable amount of time and effort from the ontology engineers. The lack of tool support is also evident in the requirements elicitation process. In this respect, the rise in the adoption and accessibility of knowledge graphs allows for the development and use of automated tools to assist with the elicitation of requirements from such a complementary source of data. Therefore, this doctoral research focuses on developing methods and tools that support the requirements elicitation and testing steps of an ontology engineering process. To support ontology testing, we have developed XDTesting, a web application integrated with the GitHub platform that serves as an ontology testing manager. Concurrently, to support the elicitation and documentation of competency questions, we have defined and implemented RevOnt, a method to extract competency questions from knowledge graphs. Both methods are evaluated through their implementations, and the results are promising.
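As an illustration of the testing step described above, a competency question can be formalised as a SPARQL query and run against the ontology. The sketch below uses rdflib with a placeholder file name and a generic query; it is not the XDTesting or RevOnt implementation.

```python
# Minimal sketch: a competency question expressed as a SPARQL query; the test
# passes if the ontology can answer it. File name and query are placeholders.
from rdflib import Graph

COMPETENCY_QUESTION = "Which classes declare at least one subclass?"
SPARQL = """
SELECT DISTINCT ?cls WHERE {
  ?sub <http://www.w3.org/2000/01/rdf-schema#subClassOf> ?cls .
}
"""

def test_competency_question(ontology_path: str) -> bool:
    g = Graph()
    g.parse(ontology_path)            # format inferred from the file extension
    rows = list(g.query(SPARQL))
    return len(rows) > 0              # pass if the ontology answers the question

print(test_competency_question("ontology.ttl"))
```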

Relevance:

30.00%

Publisher:

Abstract:

In pursuit of aligning with the European Union's ambitious target of achieving a carbon-neutral economy by 2050, researchers, vehicle manufacturers, and original equipment manufacturers have been at the forefront of exploring cutting-edge technologies for internal combustion engines. The introduction of these technologies has significantly increased the effort required to calibrate the models implemented in engine control units. Consequently, the development of tools that reduce the costs and time required during the experimental phases has become imperative. Additionally, to comply with ever-stricter limits on CO2 emissions, it is crucial to develop advanced control systems that enhance traditional engine management systems in order to reduce fuel consumption. Furthermore, the introduction of new homologation cycles, such as the real driving emissions cycle, compels manufacturers to bridge the gap between engine operation in laboratory tests and real-world conditions. Within this context, this thesis showcases the performance and cost benefits achievable through the implementation of an auto-adaptive closed-loop control system, leveraging in-cylinder pressure sensors in a heavy-duty diesel engine designed for mining applications. Additionally, the thesis explores the promising prospect of real-time self-adaptive machine learning models, particularly neural networks, to develop an automatic system that uses in-cylinder pressure sensors for the precise calibration of the target combustion phase and optimal spark advance in spark-ignition engines. To facilitate the application of these combustion-feedback-based algorithms in production, the thesis discusses the results obtained from the development of a cost-effective sensor for indirect cylinder pressure measurement. Finally, to ensure the quality control of the proposed affordable sensor, the thesis provides a comprehensive account of the design and validation process for a piezoelectric washer test system.
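As an illustration of the closed-loop idea described above, the sketch below adjusts spark advance towards a target combustion phase (CA50) using feedback assumed to come from an in-cylinder pressure based indicating system. The gains, limits and measurement interface are invented for the example and are not the thesis' calibration.

```python
# Illustrative sketch (not the thesis' control system) of a closed-loop
# correction of spark advance towards a target CA50.
TARGET_CA50_DEG = 8.0      # target CA50, degrees after top dead centre (assumed)
KI = 0.05                  # integral gain, deg spark advance per deg CA50 error (assumed)
SA_MIN, SA_MAX = 5.0, 35.0 # calibration limits for spark advance (assumed)

def update_spark_advance(current_sa: float, measured_ca50: float) -> float:
    """One integral-controller step: move CA50 towards the target by re-phasing spark."""
    error = measured_ca50 - TARGET_CA50_DEG   # positive error -> combustion too late
    new_sa = current_sa + KI * error          # advance spark to phase combustion earlier
    return min(max(new_sa, SA_MIN), SA_MAX)   # clamp to calibration limits

# Example: cycle-by-cycle adaptation with made-up CA50 measurements
sa = 20.0
for ca50 in [12.1, 10.4, 9.2, 8.5]:
    sa = update_spark_advance(sa, ca50)
    print(round(sa, 2))
```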

Relevance:

20.00%

Publisher:

Abstract:

High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) the Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather newly identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, GO biological processes and KEGG pathway enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We have developed IIS by integrating diverse databases, in response to the need for appropriate tools for the systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.
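As a small illustration of the Interactome-module output, the sketch below builds a toy interaction network and exports it in a format Cytoscape can import. Since networkx has no XGMML writer, GraphML is used instead, and all entries are hypothetical.

```python
# Minimal sketch: build an interaction network with node/edge attributes and
# export it for visualization in Cytoscape. All entries are made up.
import networkx as nx

g = nx.Graph()
g.add_node("YFG1", kind="protein", expression=2.3)
g.add_node("YFG2", kind="protein", expression=0.7)
g.add_node("glucose", kind="metabolite")
g.add_edge("YFG1", "YFG2", source="two-hybrid")
g.add_edge("YFG2", "glucose", source="metabolomics")

# Simple topological metrics of the kind gathered by the Interactome module
print(nx.degree_centrality(g))

nx.write_graphml(g, "interactome.graphml")   # GraphML can be imported into Cytoscape
```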

Relevance:

20.00%

Publisher:

Abstract:

This chapter provides a short review of the physics, applications, and perspectives of quantum dots (QDs). The main advantage of QDs over bulk semiconductors is that size becomes a control parameter for tailoring the optical properties of new materials. Size changes the confinement energy, which alters the optical properties of the material, such as absorption, refractive index, and emission bands. Therefore, by using QDs one can make several kinds of optical devices. One class of devices transforms electrons into photons, for use as active optical components in illumination and displays. Other devices transform photons into electrons, to produce QD solar cells or photodetectors. At the biomedical interface, the application of QDs, which is the most important aspect in this book, is based on fluorescence, which essentially transforms photons into photons of different wavelengths. This chapter introduces parameters important for the biophotonic applications of QDs, such as photostability, excitation and emission profiles, and quantum efficiency. We also present perspectives for the use of QDs in fluorescence lifetime imaging (FLIM) and Förster resonance energy transfer (FRET), so useful in modern microscopy, and show how to take advantage of the usually unwanted blinking effect to perform super-resolution microscopy.
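As a worked illustration of how size controls the confinement energy, the sketch below evaluates the kinetic (particle-in-a-sphere) term of the Brus model for a spherical dot; the effective masses are typical CdSe-like values chosen only for illustration, and the Coulomb correction is omitted.

```python
# Worked illustration: effective-mass estimate of the confinement contribution
# to the band-gap shift of a spherical dot of radius R. Parameters are
# CdSe-like placeholders, not values from the chapter.
import math

HBAR = 1.054_571_817e-34     # J*s
M_E = 9.109_383_7015e-31     # electron rest mass, kg
E_CHARGE = 1.602_176_634e-19 # J per eV

def confinement_shift_eV(radius_nm: float, me_eff: float = 0.13, mh_eff: float = 0.45) -> float:
    """Kinetic term of the Brus model: (hbar^2 pi^2 / 2R^2)(1/me* + 1/mh*)."""
    r = radius_nm * 1e-9
    shift_joule = (HBAR**2 * math.pi**2 / (2 * r**2)) * (1 / (me_eff * M_E) + 1 / (mh_eff * M_E))
    return shift_joule / E_CHARGE

for r in (1.5, 2.5, 4.0):    # smaller dots -> larger blue-shift of absorption/emission
    print(r, "nm ->", round(confinement_shift_eV(r), 2), "eV")
```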

Relevance:

20.00%

Publisher:

Abstract:

Chronic myeloid leukemia (CML) requires strict daily compliance with oral medication and regular blood and bone marrow control tests. The objective was to evaluate CML patients' perceptions about the disease, their access to information regarding diagnosis, monitoring and treatment, adverse effects, and the associations of these variables with patients' demographics, region and healthcare access. This was a prospective cross-sectional study among CML patients registered with the Brazilian Lymphoma and Leukemia Association (ABRALE). CML patients receiving treatment through the public healthcare system were interviewed by telephone. Among the 1,102 patients interviewed, the symptoms most frequently leading them to seek medical care were weakness or fatigue. One third were diagnosed by means of routine tests. The time that elapsed between first symptoms and seeking medical care was 42.28 ± 154.21 days. Most patients had been tested at least once for the Philadelphia chromosome, but 43.2% did not know the result. 64.8% had had polymerase chain reaction testing for the BCR/ABL gene every three months. 47% believed that CML could be controlled, but 33.1% believed that there was no treatment. About 24% reported occasionally stopping their medication. Imatinib was associated with nausea, cramps and muscle pain. Self-reported treatment adherence was significantly associated with a normalized blood count, and positively associated with imatinib. There is a lack of information or understanding about disease-monitoring tools among Brazilian CML patients; they are diagnosed quickly and have good access to treatment. Correct comprehension of CML control tools is impaired in Brazilian patients.

Relevance:

20.00%

Publisher:

Abstract:

In about 50% of first-trimester spontaneous abortions, the cause remains undetermined after standard cytogenetic investigation. We evaluated the usefulness of array-CGH in diagnosing chromosome abnormalities in products of conception from first-trimester spontaneous abortions. Cell culture was carried out in short- and long-term cultures of 54 specimens, and cytogenetic analysis was successful in 49 of them. Cytogenetic abnormalities (numerical and structural) were detected in 22 (44.89%) specimens. Subsequently, array-CGH based on large-insert clones spaced at ~1 Mb intervals over the whole genome was used in 17 cases with a normal G-banding karyotype. This revealed chromosome aneuploidies in three additional cases, giving a final total of 51% of cases in which an abnormal karyotype was detected. In keeping with other recently published work, this study shows that array-CGH detects abnormalities in a further ~10% of spontaneous abortion specimens considered normal using standard cytogenetic methods. As such, the array-CGH technique may be a suitable complementary test to cytogenetic analysis in cases with a normal karyotype.