977 results for Testing Framework


Relevance:

30.00%

Publisher:

Abstract:

This report on The Potential of Mode of Action (MoA) Information Derived from Non-testing and Screening Methodologies to Support Informed Hazard Assessment resulted from a workshop organised within OSIRIS (Optimised Strategies for Risk Assessment of Industrial Chemicals through Integration of Non-test and Test Information), a project partly funded by the EU Commission within the Sixth Framework Programme. The workshop was held in Liverpool, UK, on 30 October 2008, with 35 attendees. The goal of the OSIRIS project is to develop integrated testing strategies (ITS) fit for use in the REACH system, which would enable a significant increase in the use of non-testing information for regulatory decision making and thus minimise the need for animal testing. One way to improve the evaluation of chemicals may be to categorise them by mechanism or mode of toxic action. Defining such groups can enhance read-across possibilities and priority setting for certain toxic modes, or for the chemical structures responsible for these modes. Overall, this may reduce in vivo testing on organisms by combining available data on mode of action and focusing on the potentially most toxic groups. In this report, the possibilities of a mechanistic approach to assist in and guide ITS are explored, and the differences between the human health and environmental areas are summarised.

Relevance:

30.00%

Publisher:

Abstract:

The use of biomarkers to infer drug response in patients is being actively pursued, yet significant challenges with this approach, including the complicated interconnection of pathways, have limited its application. Direct empirical testing of tumor sensitivity would arguably provide a more reliable predictive value, although it has garnered little attention, largely owing to the technical difficulties associated with this approach. We hypothesize that the application of recently developed microtechnologies, coupled with more complex 3-dimensional cell cultures, could provide a model to address some of these issues. As a proof of concept, we developed a microfluidic device in which spheroids of the serous epithelial ovarian cancer cell line TOV112D are entrapped and assayed for their chemoresponse to carboplatin and paclitaxel, two therapeutic agents routinely used for the treatment of ovarian cancer. To index the chemoresponse, we analyzed the spatiotemporal evolution of the mortality fraction, as judged by vital dyes and confocal microscopy, within spheroids subjected to different drug concentrations and treatment durations inside the microfluidic device. To reflect microenvironment effects, we tested how exogenous extracellular matrix and serum supplementation during spheroid formation affected the chemotherapeutic response. Spheroids displayed augmented chemoresistance compared with monolayer cultures. This resistance was further increased by the simultaneous presence of both extracellular matrix and a high serum concentration during spheroid formation. Following exposure to chemotherapeutics, cell death profiles were not uniform throughout the spheroid: the highest cell death fraction was found at the center of the spheroid and the lowest at the periphery. Collectively, the results demonstrate the validity of the approach and provide the basis for further investigation of chemotherapeutic responses in ovarian cancer using microfluidics technology. In the future, such microdevices could provide the framework to assay drug sensitivity in a timeframe suitable for clinical decision making.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: The implementation of genomic-based medicine is hindered by unresolved questions regarding data privacy and the delivery of interpreted results to health-care practitioners. We used DNA-based prediction of HIV-related outcomes as a model to explore critical issues in clinical genomics. METHODS: We genotyped 4,149 markers in HIV-positive individuals. Variants allowed for prediction of 17 traits relevant to HIV medical care, inference of patient ancestry, and imputation of human leukocyte antigen (HLA) types. Genetic data were processed under a privacy-preserving framework using homomorphic encryption, and clinical reports describing potentially actionable results were delivered to health-care providers. RESULTS: A total of 230 patients were included in the study. We demonstrated the feasibility of encrypting a large number of genetic markers, inferring patient ancestry, computing monogenic and polygenic trait risks, and reporting results under privacy-preserving conditions. The average execution time of a multimarker test on encrypted data was 865 ms on a standard computer. The proportion of tests returning potentially actionable genetic results ranged from 0 to 54%. CONCLUSIONS: The model of implementation presented herein informs strategies to deliver genomic test results for clinical care. Data encryption to ensure privacy helps to build patient trust, a key requirement on the road to genomic-based medicine. Genetics in Medicine advance online publication, 14 January 2016; doi:10.1038/gim.2015.167.
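The privacy-preserving computation described above relies on homomorphic encryption. As a toy illustration of the underlying idea only (not the study's actual scheme, parameters, or data), the following Python sketch uses textbook Paillier encryption, whose additive homomorphism lets a weighted sum such as a polygenic score be evaluated over encrypted genotype dosages; the primes, genotypes, and weights are all invented:

```python
import math
import random

def encrypt(n, m):
    """Paillier encryption with generator g = n + 1 (textbook variant)."""
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(1 + n, m, n2) * pow(r, n, n2) % n2

def decrypt(n, lam, mu, c):
    """Recover m via the L-function: L(x) = (x - 1) // n."""
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

# Toy key material (real deployments use ~2048-bit primes).
p, q = 1009, 1013
n = p * q
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)          # valid because g = n + 1

genotypes = [0, 1, 2, 1]      # allele dosages (the sensitive data)
weights = [5, 3, 2, 7]        # integer-scaled effect sizes (public)

cts = [encrypt(n, g) for g in genotypes]

# Additive homomorphism: the product of c_i^{w_i} decrypts to sum(w_i * g_i),
# so the score is computed without ever seeing the plaintext genotypes.
n2 = n * n
score_ct = 1
for c, w in zip(cts, weights):
    score_ct = score_ct * pow(c, w, n2) % n2

score = decrypt(n, lam, mu, score_ct)
# score == 5*0 + 3*1 + 2*2 + 7*1 == 14
```

The key property exercised here is that multiplication of ciphertexts corresponds to addition of plaintexts, which is what makes linear genetic risk models a natural fit for this kind of encryption.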

Relevance:

30.00%

Publisher:

Abstract:

Conventional wisdom on the insufficiency of existing WTO disciplines on export restrictions has triggered momentum on the issue. In this book, Ilaria Espa offers a comprehensive analysis of the scope and coverage of WTO disciplines on export restrictions in light of emerging case law. She investigates whether such rules still provide a credible and effective framework capable of preventing abuses in the use of export restrictive measures on critical minerals and metals during a period of economic crisis and change in international trade patterns. Giving a broad overview of the export restrictions applied to these materials, Espa identifies distinctive features in the proliferation of export barriers and analyses the existing WTO rules to reveal their gaps and inconsistencies. She goes on to present solutions based upon her findings with the aim of bringing more coherence and equity to WTO rules on the export side.

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on the IES-UPM experience from 2006 to 2010 in the characterization of PV arrays of commercial large PV plants installed in Spain within the framework of the profitable economic scenarios associated with feed-in tariff laws. This experience extends to 200 MW and has provided valuable lessons for minimizing uncertainty, which plays a key role in quality assurance procedures. The paper deals not only with classic I–V measurements but also with watt-metering-based procedures. Particular attention is paid to the selection of irradiance and cell temperature sensors.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: This paper presents the elements and evaluation methods that should be included in a framework for evaluating the achievements and impacts of transport projects supported in EC Framework Programmes (FP). Further, the paper discusses the possibilities of such an evaluation framework for producing recommendations regarding future transport research and policy objectives, as well as mutual learning as a basis for strategic long-term planning. Methods: The paper describes the two-dimensional evaluation methodology developed in the course of the FP7 METRONOME project. The dimensions are: (1) achievement of project objectives and targets at different levels, and (2) research project impacts according to four impact groups. The methodology uses four complementary approaches in evaluation, namely evaluation matrices, coordinator questionnaires, lead-user interviews and workshops. Results: Based on testing of the methodology with a sample of FP5 and FP6 projects, the main results relating to the rationale, implementation and achievements of FP projects are presented. In general, achievement of objectives in both FPs was good. The strongest impacts were identified within the impact group of management and coordination. The scientific and end-user impacts of the projects were also adequate, but wider societal impacts were quite modest. The paper concludes with a discussion of both the theoretical and practical implications of the proposed methodology and presents some relevant future research needs.

Relevance:

30.00%

Publisher:

Abstract:

As the use of recommender systems becomes more consolidated on the Net, there is an increasing need for an evaluation framework for collaborative filtering measures and methods that can test not only prediction and recommendation results but also aspects until now considered secondary, such as the novelty of the recommendations and the users' trust in them. This paper provides: (a) measures to evaluate the novelty of the users' recommendations and the trust in their neighborhoods; (b) equations that formalize and unify the collaborative filtering process and its evaluation; and (c) a framework based on the above elements that enables evaluation of the quality results of any collaborative filtering approach applied to the desired recommender system, using four graphs: quality of the predictions, of the recommendations, of the novelty, and of the trust.
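To make two of these evaluation dimensions concrete, the following Python sketch is a deliberately simplified stand-in for the paper's formal equations: prediction quality is measured as mean absolute error against held-out ratings, and novelty as the inverse popularity of recommended items. The rating data and the mean-based predictor are invented for illustration:

```python
# Ratings: user -> {item: rating}. Toy data, purely illustrative.
ratings = {
    "u1": {"a": 5, "b": 3, "c": 4},
    "u2": {"a": 4, "b": 2, "d": 5},
    "u3": {"b": 3, "c": 5, "d": 4},
}

def predict(user, item):
    """Mean rating of the item over the other users — a deliberately
    simple stand-in for a k-NN collaborative-filtering predictor."""
    vals = [r[item] for u, r in ratings.items() if u != user and item in r]
    return sum(vals) / len(vals) if vals else None

def mae(user):
    """Prediction quality: mean absolute error over the user's known ratings."""
    errs = []
    for item, true_rating in ratings[user].items():
        pred = predict(user, item)
        if pred is not None:
            errs.append(abs(pred - true_rating))
    return sum(errs) / len(errs)

def novelty(recommended):
    """Novelty: items rated by fewer users score closer to 1."""
    n_users = len(ratings)
    pops = [sum(item in r for r in ratings.values()) for item in recommended]
    return sum(1 - p / n_users for p in pops) / len(pops)

quality = mae("u1")               # lower is better
nov = novelty(["a", "d"])         # higher is more novel
```

Plotting such measures against neighborhood size or recommendation-list length is what yields the kind of quality graphs the paper describes.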

Relevance:

30.00%

Publisher:

Abstract:

In this dissertation, after establishing that neither the definition of Agile methodologies nor the current tools that support them, such as Scrum or XP, give guidance for the stages of software development prior to the first development iteration, we studied the state of the art of Inception techniques, that is, techniques for dealing with this early phase of a project that help guide its development. From the analysis of these Inception techniques, we defined what we consider the essential properties of an Inception framework. With that list at hand, we found that no current Inception framework supported all of these features, and that no software application on the market did so either. Finally, having confirmed these gaps, we defined the Inception framework "Agile Incepti-ON", with all the practices necessary to meet the requirements specified above. In addition, a software application, called "Agile Dojo", was developed to support the practices defined in the Inception framework.

Relevance:

30.00%

Publisher:

Abstract:

Context: Empirical Software Engineering (ESE) replication researchers need to store and manipulate experimental data for several purposes, in particular analysis and reporting. Current research needs also call for the sharing and preservation of experimental data. In previous work, we analyzed Replication Data Management (RDM) needs and proposed a novel concept, the Empirical Ecosystem, to address current deficiencies in RDM approaches. The Empirical Ecosystem provides replication researchers with a common framework that transparently integrates heterogeneous local data sources. A typical situation where it applies is when several members of a research group, or several collaborating research groups, need to share and access each other's experimental results. However, to apply the Empirical Ecosystem concept and deliver all of its promised benefits, the software architectures and tools that can properly support it must first be analyzed.

Relevance:

30.00%

Publisher:

Abstract:

In the EU circuit (especially the European Parliament, the Council and Coreper), as well as in the national parliaments of the EU Member States, one observes a powerful tendency to regard 'subsidiarity' as a 'political' issue. Moreover, subsidiarity is frequently seen as a one-way street: powers going 'back to' Member States. Both interpretations are at least partly flawed and less than helpful when looking for practical ways to deal with subsidiarity at both the EU and Member State levels. The present paper shows that subsidiarity as a principle is profoundly 'functional' in nature and, hence, is and must be a two-way principle. A functional subsidiarity test is developed, and its application is illustrated for a range of policy issues in the internal market in its widest sense, for equity, and for macro-economic stabilisation questions in European integration. Misapplications of 'subsidiarity' are also demonstrated. For a good understanding, the fact that subsidiarity is a functional, two-way principle means neither that elected politicians should not have the final (political!) say (for which they are accountable), nor that subsidiarity tests, even if properly conducted, cannot and will not be politicised once the results enter the policy debate. Such politicisation forms a natural run-up to decision-making by those elected for it. But the quality and reasoning of the test, as well as the structuring of the information in a logical sequence (in accordance with the current protocol and with the one in the constitutional treaty), is likely to be directly helpful for decision-makers confronted with complicated and often specialised proposals. EU debates and decision-making are therefore best served by separating the functional subsidiarity test (prepared by independent professionals) from the final political decision itself.
If the test were accepted Union-wide, it would also assist national parliaments in conducting comparable tests in a relatively short period, as the basis for possible joint action (as suggested by the constitutional treaty). The core of the paper explains how the test is formulated and applied. A functional approach to subsidiarity in the framework of European representative democracy seeks to find the optimal assignment of regulatory or policy competences to the various tiers of government. In the final analysis, this is about structures facilitating the highest possible welfare in the Union, in the fundamental sense that preferences and needs are best satisfied. What is required for such an analysis is no less than a systematic cost/benefit framework to assess the (de)merits of (de)centralisation in the EU.

Relevance:

30.00%

Publisher:

Abstract:

Architectural decisions are often encoded in the form of constraints and guidelines. Non-functional requirements can be ensured by checking the conformance of the implementation against this kind of invariant. Conformance checking is often a costly and error-prone process that involves the use of multiple tools differing in effectiveness, complexity and scope of applicability. To reduce the overall effort entailed by this activity, we propose a novel approach that supports verification of human-readable declarative rules through the use of adapted off-the-shelf tools. Our approach consists of a rule specification DSL, called Dicto, and a tool coordination framework, called Probo. The approach has been implemented in a prototype that will shortly be evaluated.
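By way of illustration, the kind of declarative architectural rule targeted here might constrain which modules may depend on which others. The following Python sketch is a hypothetical mini-checker over an invented dependency graph and rule vocabulary; it is not the actual Dicto DSL or the Probo framework:

```python
# Invented module dependency graph: module -> set of modules it depends on.
deps = {
    "ui": {"service"},
    "service": {"persistence"},
    "persistence": set(),
}

# Invented rule vocabulary in the spirit of human-readable declarative rules:
# (subject, constraint, target).
rules = [
    ("ui", "cannot-depend-on", "persistence"),   # layering invariant
    ("service", "must-depend-on", "persistence"),
]

def check(rule):
    """Return True when the dependency graph satisfies the rule."""
    subject, kind, target = rule
    has_dep = target in deps.get(subject, set())
    return has_dep if kind == "must-depend-on" else not has_dep

violations = [r for r in rules if not check(r)]
# An empty list means the implementation conforms to both rules.
```

In a real toolchain, the dependency graph would be extracted by an off-the-shelf static-analysis tool and the rules parsed from the DSL; the point of the sketch is only the shape of the conformance check itself.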

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

Vector error-correction models (VECMs) have become increasingly important in their application to financial markets. Standard full-order VECM models assume non-zero entries in all their coefficient matrices. However, applications of VECM models to financial market data have revealed that zero entries are often a necessary part of efficient modelling. In such cases, the use of full-order VECM models may lead to incorrect inferences. Specifically, if indirect causality or Granger non-causality exists among the variables, the use of over-parameterised full-order VECM models may weaken the power of statistical inference. In this paper, it is argued that the zero–non-zero (ZNZ) patterned VECM is a more straightforward and effective means of testing for both indirect causality and Granger non-causality. For a ZNZ patterned VECM framework for time series integrated of order two, we provide a new algorithm to select cointegrating and loading vectors that can contain zero entries. Two case studies demonstrate the usefulness of the algorithm in tests of purchasing power parity and in a three-variable system involving the stock market.
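The ZNZ idea can be sketched numerically. The following Python example is a crude illustration, not the paper's selection algorithm: it simulates a bivariate cointegrated system whose true loading matrix already contains a zero (so the second variable does not adjust to disequilibrium), estimates the loadings by OLS, and then zeroes out negligible entries with a simple threshold that stands in for a proper significance test:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate system in error-correction form:
#   dy_t = alpha * (beta' y_{t-1}) + eps_t,  beta = (1, -1)'.
# Only y1 adjusts (alpha_2 = 0), i.e. the true pattern is ZNZ.
T = 500
alpha = np.array([[-0.3], [0.0]])
beta = np.array([[1.0], [-1.0]])
y = np.zeros((T, 2))
for t in range(1, T):
    ect = beta.T @ y[t - 1]                  # error-correction term
    y[t] = y[t - 1] + (alpha @ ect).ravel() + rng.normal(0, 0.1, 2)

# Full-order OLS of dy_t on the (here assumed known) cointegrating term.
dy = np.diff(y, axis=0)
x = y[:-1] @ beta                            # (T-1) x 1 regressor
alpha_hat = np.linalg.lstsq(x, dy, rcond=None)[0].T   # 2 x 1 loadings

# ZNZ step: zero out loadings below a crude magnitude threshold.
alpha_znz = np.where(np.abs(alpha_hat) > 0.05, alpha_hat, 0.0)
# The first estimated loading should land near -0.3 and the second near 0,
# so the ZNZ pattern encodes the weak exogeneity of y2.
```

Recovering a zero in the loading matrix is exactly what a Granger non-causality restriction looks like in this framework; the paper's contribution is a principled algorithm for selecting such patterns, which this threshold only caricatures.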

Relevance:

30.00%

Publisher:

Abstract:

Evolutionary change results from selection acting on genetic variation. For migration to be successful, many different aspects of an animal's physiology and behaviour need to function in a coordinated way. Changes in one migratory trait are therefore likely to be accompanied by changes in other migratory and life-history traits. At present, we have some knowledge of the pressures that operate at the various stages of migration, but we know very little about the extent of genetic variation in the various aspects of the migratory syndrome. As a consequence, our ability to predict which species is capable of what kind of evolutionary change, and at what rate, is limited. Here, we review how our evolutionary understanding of migration may benefit from a quantitative-genetic approach and present a framework for studying the causes of phenotypic variation. We review past research, which has mainly studied single migratory traits in captive birds, and discuss how this work could be extended to study genetic variation in the wild and to account for genetic correlations and correlated selection. In the future, reaction-norm approaches may become very important, as they allow the study of genetic and environmental effects on phenotypic expression, and of their interactions, within a single framework. We advocate making more use of repeated measurements on single individuals to study the causes of among-individual variation in the wild, as such measurements are easier to obtain than data on relatives and can provide valuable information for identifying and selecting traits. This approach will be particularly informative if it involves systematic testing of individuals under different environmental conditions. We propose extending this research agenda by using optimality models to predict levels of variation and covariation among traits and constraints. This may help us to select traits in which we might expect genetic variation, and to identify the most informative environmental axes. We also recommend expanding research beyond the passerine model, as it does not apply to birds, such as geese, in which cultural transmission of spatio-temporal information is an important determinant of migration patterns and their variation.
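A standard way to quantify among-individual variation from the repeated measurements advocated above is the repeatability (intraclass correlation), R = V_among / (V_among + V_within), estimated from one-way ANOVA variance components. The following Python sketch uses invented measurements (e.g. a migratory timing trait scored three times per individual); the bird names and values are purely illustrative:

```python
# Individual -> repeated measurements of one migratory trait (invented data).
data = {
    "bird1": [10.1, 10.4, 9.9],
    "bird2": [14.0, 13.6, 14.2],
    "bird3": [12.2, 12.5, 11.9],
}

k = 3  # measurements per individual (balanced design)
groups = list(data.values())
grand = sum(sum(g) for g in groups) / sum(len(g) for g in groups)

# One-way ANOVA sums of squares.
ss_among = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
ms_among = ss_among / (len(groups) - 1)
ms_within = ss_within / (len(groups) * (k - 1))

# Variance components and repeatability.
v_within = ms_within
v_among = (ms_among - ms_within) / k
repeatability = v_among / (v_among + v_within)
# High repeatability (close to 1) means most variation is among individuals,
# setting an upper bound on the heritable variation one might hope to find.
```

Because repeatability only requires repeated measures on the same individuals, not pedigree data, it is the cheaper first step the review recommends before committing to full quantitative-genetic designs.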

Relevance:

30.00%

Publisher:

Abstract:

The use of multiple partial viewpoints is recommended for specification. We believe they can also be useful for devising testing strategies. In this paper, we use Object-Z to formally specify concurrent Java components from viewpoints based on the separation of application and synchronisation concerns inherent in Java monitors. We then use the Test-Template Framework on the Object-Z viewpoints to devise a strategy for testing the components. When combining the test templates for the different viewpoints, we focus on the observable behaviour of the application to systematically derive a practical testing strategy. The Producer-Consumer and Readers-Writers problems are considered as case studies.
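Although the case studies concern Java monitors specified in Object-Z, the idea of testing a concurrent component through its observable application behaviour can be sketched in any language. The following Python example is an invented illustration (not derived from the paper's test templates): it drives a bounded Producer-Consumer buffer with a producer and a consumer thread and then checks only the externally observable outcome, the sequence of consumed items:

```python
import queue
import threading

def run(n_items, capacity):
    """Exercise a bounded buffer; return the observable consumed sequence."""
    buf = queue.Queue(maxsize=capacity)  # the monitor under test
    consumed = []

    def producer():
        for i in range(n_items):
            buf.put(i)                   # blocks when the buffer is full

    def consumer():
        for _ in range(n_items):
            consumed.append(buf.get())   # blocks when the buffer is empty

    threads = [threading.Thread(target=producer),
               threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return consumed

# Observable behaviour for a single producer/consumer pair over a FIFO
# buffer: every produced item is consumed exactly once, in order.
out = run(100, capacity=5)
# out == list(range(100))
```

The synchronisation concern (blocking on full/empty) is exercised implicitly by the small capacity, while the assertion itself mentions only the application concern — mirroring the separation of viewpoints the paper exploits.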