960 results for completeness


Relevance: 10.00%

Abstract:

BACKGROUND: The English Improving Access to Psychological Therapies (IAPT) initiative aims to make evidence-based psychological therapies for depression and anxiety disorders more widely available in the National Health Service (NHS). 32 IAPT services based on a stepped care model were established in the first year of the programme. We report the reliable recovery rates achieved by patients treated in these services and identify predictors of recovery at the patient level, at the service level, and as a function of compliance with National Institute for Health and Care Excellence (NICE) treatment guidelines. METHOD: Data from 19,395 patients who were clinical cases at intake, attended at least two sessions, had at least two outcome scores and had completed their treatment during the period were analysed. Outcome was assessed with the Patient Health Questionnaire depression scale (PHQ-9) and the Generalised Anxiety Disorder scale (GAD-7). RESULTS: Data completeness was high for a routine cohort study: over 91% of treated patients had paired (pre-post) outcome scores. Overall, 40.3% of patients were reliably recovered at post-treatment, 63.7% showed reliable improvement and 6.6% showed reliable deterioration. Most patients received treatments recommended by NICE; when a treatment not recommended by NICE was provided, recovery rates were reduced. Service characteristics that predicted higher reliable recovery rates were: a high average number of therapy sessions; higher step-up rates among individuals who started with low-intensity treatment; larger services; and a larger proportion of experienced staff. CONCLUSIONS: Compliance with the IAPT clinical model is associated with enhanced rates of reliable recovery.
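
As a concrete illustration of the outcome definitions used above, the following minimal sketch classifies one patient's paired pre/post scores. The caseness cutoffs (PHQ-9 >= 10, GAD-7 >= 8) and reliable-change thresholds (6 points on the PHQ-9, 4 on the GAD-7) are the conventional IAPT values and are assumptions here, not figures taken from this abstract.

```python
# Minimal sketch of IAPT-style "reliable recovery" classification, assuming
# the conventional caseness cutoffs (PHQ-9 >= 10, GAD-7 >= 8) and
# reliable-change thresholds (PHQ-9: 6 points, GAD-7: 4 points).

def classify_outcome(phq_pre, phq_post, gad_pre, gad_post):
    """Classify one patient's paired pre/post scores."""
    case_pre = phq_pre >= 10 or gad_pre >= 8      # clinical case at intake
    case_post = phq_post >= 10 or gad_post >= 8
    # Reliable change: movement larger than plausible measurement error
    improved = (phq_pre - phq_post >= 6) or (gad_pre - gad_post >= 4)
    deteriorated = (phq_post - phq_pre >= 6) or (gad_post - gad_pre >= 4)

    if case_pre and not case_post and improved:
        return "reliably recovered"
    if deteriorated and not improved:
        return "reliably deteriorated"
    if improved:
        return "reliably improved"
    return "no reliable change"

print(classify_outcome(phq_pre=18, phq_post=7, gad_pre=12, gad_post=5))
# -> "reliably recovered"
```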

Relevance: 10.00%

Abstract:

Incorporating an emerging therapy as a new randomisation arm in a clinical trial that is open to recruitment would be desirable to researchers, regulators and patients, ensuring that the trial remains current, that new treatments are evaluated as quickly as possible, and that the time and cost of determining optimal therapies are minimised. It may take many years to run a clinical trial from concept to reporting within a rapidly changing drug development environment; hence, for trials to be most useful in informing policy and practice, it is advantageous for them to be able to adapt to emerging therapeutic developments. This paper reports a comprehensive literature review on methodologies for, and practical examples of, amending an ongoing clinical trial by adding a new treatment arm. Relevant methodological literature describing the statistical considerations required when making this specific type of amendment is identified, and the key statistical concepts involved in planning the addition of a new treatment arm are extracted, assessed and summarised. For completeness, this includes an assessment of statistical recommendations within general adaptive design guidance documents. Examples of ongoing confirmatory trials designed within the frequentist framework that have added an arm in practice are reported, and the details of each amendment are reviewed. An assessment is made of how well the relevant statistical considerations were addressed in practice, and of the related implications. The literature review confirmed that there is currently no clear methodological guidance on this topic, although such guidance would help this efficient design amendment to be used more frequently and appropriately in practice. Eight confirmatory trials were identified that had added a treatment arm, suggesting that trials can benefit from this amendment and that it is practically feasible; however, the trials were not always able to address the key statistical considerations, often leading to uninterpretable or invalid outcomes. If the statistical concepts identified within this review are considered and addressed during the design of a trial amendment, it is possible to effectively assess a new treatment arm within an ongoing trial without compromising the original trial outcomes.
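
One of the key statistical considerations when a new arm shares the control group is controlling the familywise type-I error across the now-multiple comparisons. The sketch below uses a plain Bonferroni split, deliberately the simplest correction, whereas the review surveys a range of approaches; all effect sizes and variances are hypothetical.

```python
# Minimal sketch: split the familywise two-sided alpha across k comparisons
# with a shared control (Bonferroni) and recompute per-arm sample size via
# the standard normal-approximation formula. All inputs are hypothetical.
from scipy import stats

def per_arm_n(effect, sd, alpha_fw, n_comparisons, power=0.80):
    alpha = alpha_fw / n_comparisons              # Bonferroni-adjusted alpha
    z_a = stats.norm.ppf(1 - alpha / 2)
    z_b = stats.norm.ppf(power)
    return 2 * ((z_a + z_b) * sd / effect) ** 2   # n per arm

# Original two-arm design vs. the same trial amended to two active arms:
print(f"{per_arm_n(effect=5, sd=12, alpha_fw=0.05, n_comparisons=1):.1f}")  # ~90.4
print(f"{per_arm_n(effect=5, sd=12, alpha_fw=0.05, n_comparisons=2):.1f}")  # ~109.5
```

The increase from roughly 90 to 110 patients per arm is the price of preserving the familywise error rate after the amendment, which is one reason the review stresses planning the addition at the design stage.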

Relevance: 10.00%

Abstract:

Purpose – The purpose of this paper is to shed light on the practice of incomplete corporate disclosure of quantitative greenhouse gas (GHG) emissions, and to investigate whether external stakeholder pressure influences the existence, and separately the completeness, of voluntary GHG emissions disclosures by 431 European companies. Design/methodology/approach – A classification of reporting completeness is developed with respect to the scope, type and reporting boundary of GHG emissions, based on the guidelines of the GHG Protocol, the Global Reporting Initiative and the Carbon Disclosure Project. Logistic regression analysis is applied to examine whether proxies for exposure to climate change concerns from different stakeholder groups influence the existence and/or completeness of quantitative GHG emissions disclosure. Findings – From 2005 to 2009, on average only 15 percent of companies that disclose GHG emissions report them in a manner the authors consider complete. Results of the regression analyses suggest that external stakeholder pressure is a determinant of the existence, but not the completeness, of emissions disclosure. The findings are consistent with stakeholder theory arguments that companies respond to external stakeholder pressure to report GHG emissions, but also with legitimacy theory claims that firms can use carbon disclosure, in this case the incomplete reporting of emissions, as a symbolic act to address legitimacy exposures. Practical implications – Bringing corporate GHG emissions disclosure in line with recommended guidelines will require either more direct stakeholder pressure or, perhaps, a mandated disclosure regime. In the meantime, users of the data will need to consider carefully the relevance of the reported data and develop the competencies needed to detect and control for its incompleteness. A more troubling concern is that stakeholders may instead grow to accept less-than-complete disclosure. Originality/value – The paper represents the first large-scale empirical study of the completeness of companies' disclosure of quantitative GHG emissions, and the first to analyse these disclosures in the context of stakeholder pressure and its relation to legitimation.
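
The two-stage logistic design described above can be sketched as follows. This is a minimal illustration on synthetic data: the predictor names (media_pressure, regulator_pressure, size) are hypothetical stand-ins for the paper's actual stakeholder-pressure proxies and controls, and the coefficients generating the data are invented; only the sample size (431 firms) and the 15 percent completeness rate echo the abstract.

```python
# Minimal two-stage sketch: model existence of disclosure on the full sample,
# then completeness on the disclosing subsample. Synthetic, hypothetical data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 431                                      # firms, as in the abstract
df = pd.DataFrame({
    "media_pressure": rng.normal(size=n),    # hypothetical pressure proxy
    "regulator_pressure": rng.normal(size=n),
    "size": rng.normal(size=n),              # hypothetical control
})
inv_logit = lambda x: 1 / (1 + np.exp(-x))
df["discloses"] = (rng.random(n) <
                   inv_logit(0.8 * df["media_pressure"] + 0.5 * df["size"])).astype(int)
df["complete"] = (df["discloses"].astype(bool) & (rng.random(n) < 0.15)).astype(int)

# Stage 1: existence of disclosure, full sample.
exists = smf.logit("discloses ~ media_pressure + regulator_pressure + size",
                   data=df).fit(disp=0)
# Stage 2: completeness, disclosing subsample only.
complete = smf.logit("complete ~ media_pressure + regulator_pressure + size",
                     data=df[df["discloses"] == 1]).fit(disp=0)
print(exists.params, complete.params, sep="\n")
```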

Relevance: 10.00%

Abstract:

The aim of the present work is to study the potential short-term atmospheric and biospheric influence of gamma-ray bursts on the Earth. We focus on the ultraviolet flash at the planet's surface, which occurs as a result of the retransmission of the gamma radiation through the atmosphere; this would be the only important short-term effect on life. We mostly consider the Archean and Proterozoic eons, and for completeness we also comment on the Phanerozoic. Accordingly, we consider atmospheres with oxygen levels ranging from 10^-5 to 1 times the present atmospheric level, representing different moments in the history of the rise of oxygen. Ecological consequences and some strategies to estimate their importance are outlined.
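
A toy calculation conveys why the oxygen level matters for the surface UV flash: the ozone shield, which tracks atmospheric O2, attenuates UV roughly per Beer-Lambert. The power-law scaling of ozone column with O2 and the reference optical depth below are illustrative assumptions, not values from the paper.

```python
# Toy Beer-Lambert sketch of surface UV transmission vs. O2 level. Both the
# present-day optical depth and the ozone-vs-O2 exponent are assumptions.
import numpy as np

def surface_uv_fraction(o2_pal, tau_present=20.0, exponent=0.6):
    """Fraction of top-of-atmosphere UV reaching the ground.

    o2_pal      : O2 level in units of the present atmospheric level (PAL)
    tau_present : assumed present-day UV optical depth of the ozone column
    exponent    : assumed scaling of ozone column with O2 level
    """
    tau = tau_present * o2_pal ** exponent    # toy ozone-column scaling
    return np.exp(-tau)                       # Beer-Lambert attenuation

for pal in (1e-5, 1e-3, 1e-1, 1.0):
    print(f"O2 = {pal:>7.0e} PAL -> UV transmitted: {surface_uv_fraction(pal):.2e}")
```

Under these toy assumptions an Archean-like atmosphere (10^-5 PAL) transmits essentially all of the reprocessed UV, while a present-day ozone column blocks it almost entirely, which is why the early eons are the interesting regime.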

Relevance: 10.00%

Abstract:

We employ the recently installed near-infrared Multi-Conjugate Adaptive Optics demonstrator (MAD) to determine the basic properties of a newly identified, old and distant Galactic open cluster (FSR 1415). The MAD facility approaches the diffraction limit remarkably well, reaching a resolution of 0.07 arcsec (in K) that is uniform over a field of ~1.8 arcmin in diameter, and provides photometry that is 50 per cent complete at K ≈ 19. This corresponds to about 2.5 mag below the cluster main-sequence turn-off. This high-quality data set allows us to derive an accurate heliocentric distance of 8.6 kpc, a metallicity close to solar and an age of ~2.5 Gyr. In addition, the depth of the data allows us to reconstruct (completeness-corrected) mass functions (MFs), indicating a relatively massive cluster with a flat core MF. The Very Large Telescope/MAD capabilities will therefore provide fundamental data for identifying and analysing other faint and distant open clusters in the third and fourth Galactic quadrants.
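
The heliocentric distance comes from the standard distance-modulus relation, sketched below. The apparent modulus and the K-band extinction in the example are illustrative values chosen only so the result lands near the paper's 8.6 kpc; they are not taken from the paper.

```python
# Minimal sketch of the standard distance-modulus relation,
# m - M = 5 log10(d_pc) - 5 + A_K, solved for the distance.
import math

def heliocentric_distance_kpc(apparent_dm, a_k):
    """Distance in kpc from an apparent distance modulus and K-band extinction."""
    true_dm = apparent_dm - a_k               # extinction-corrected modulus
    return 10 ** ((true_dm + 5) / 5) / 1000.0

# Illustrative inputs (hypothetical), giving a distance near the quoted value:
print(f"{heliocentric_distance_kpc(apparent_dm=15.1, a_k=0.43):.1f} kpc")  # 8.6 kpc
```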

Relevance: 10.00%

Abstract:

We present a detailed description of the Voronoi Tessellation (VT) cluster finder algorithm in 2+1 dimensions, which improves on past implementations of this technique. The need for cluster finder algorithms able to produce reliable cluster catalogs up to redshift 1 or beyond and down to 10^13.5 solar masses is paramount, especially in light of upcoming surveys aiming at cosmological constraints from galaxy cluster number counts. We build the VT in photometric redshift shells and use the two-point correlation function of the galaxies in the field both to determine the density threshold for detection of cluster candidates and to establish their significance. This allows us to detect clusters in a self-consistent way, without any assumptions about their astrophysical properties. We apply the VT to mock catalogs which extend to redshift 1.4, reproducing the ΛCDM cosmology and the clustering properties observed in Sloan Digital Sky Survey data. An objective estimate of the cluster selection function, in terms of the completeness and purity as a function of mass and redshift, is as important as having a reliable cluster finder; we measure these quantities by matching the VT cluster catalog with the mock truth table. We show that the VT can produce a cluster catalog with completeness and purity above 80% for redshifts up to ~1 and masses down to ~10^13.5 solar masses.
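
The completeness/purity estimate via truth-table matching can be sketched as follows. The one-to-one matching scheme, the flat-sky separation, and the matching tolerances are hypothetical simplifications of whatever procedure the paper actually uses.

```python
# Minimal sketch: match detected clusters to a mock truth table within a
# (hypothetical) angular radius and redshift window, one-to-one, then report
# completeness (matched truth fraction) and purity (matched detection fraction).
import numpy as np

def completeness_purity(detected, truth, max_sep_deg=0.5, max_dz=0.05):
    """detected, truth: (N, 3) arrays of (ra, dec, z), angles in degrees."""
    truth_used = np.zeros(len(truth), dtype=bool)
    det_matched = np.zeros(len(detected), dtype=bool)
    for i, (ra, dec, z) in enumerate(detected):
        sep = np.hypot((truth[:, 0] - ra) * np.cos(np.radians(dec)),
                       truth[:, 1] - dec)               # flat-sky approximation
        ok = (sep < max_sep_deg) & (np.abs(truth[:, 2] - z) < max_dz) & ~truth_used
        j = np.flatnonzero(ok)
        if j.size:                                      # claim the first match
            truth_used[j[0]] = True
            det_matched[i] = True
    return truth_used.mean(), det_matched.mean()        # completeness, purity

# Toy demo: 200 true clusters, 160 recovered with noise plus 20 false detections.
rng = np.random.default_rng(1)
truth = rng.uniform([0.0, -5.0, 0.1], [10.0, 5.0, 1.4], size=(200, 3))
detected = np.vstack([truth[:160] + rng.normal(0, 0.01, (160, 3)),
                      rng.uniform([0, -5, 0.1], [10, 5, 1.4], (20, 3))])
c, p = completeness_purity(detected, truth)
print(f"completeness = {c:.2f}, purity = {p:.2f}")      # ~0.80 and ~0.9
```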

Relevance: 10.00%

Abstract:

Policy hierarchies and automated policy refinement are powerful approaches to simplifying the administration of security services in complex network environments. A crucial issue for the practical use of these approaches is ensuring the validity of the policy hierarchy: since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e. necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying upon these validation conditions and upon axioms about the representativeness of the model, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
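
The two validation criteria can be made concrete on a toy rule model, far simpler than the Diagram of Abstract Subsystems formalism: consistency means no derived rule contradicts the abstract decision it refines, and completeness means every abstract rule is implemented by at least one derived rule. The rule format and refinement map below are hypothetical.

```python
# Minimal sketch of consistency/completeness checks for a two-level policy
# hierarchy. Rules map (subject, action, target) -> decision; `refines` maps
# each concrete rule key to the abstract rule it is derived from.

def validate(abstract_rules, concrete_rules, refines):
    # Consistency: no derived rule may contradict its abstract decision.
    consistent = all(
        abstract_rules.get(refines[k]) in (None, d)   # None: no abstract rule
        for k, d in concrete_rules.items()
    )
    # Completeness: every abstract rule is implemented by some derived rule.
    covered = {refines[k] for k in concrete_rules}
    complete = covered >= abstract_rules.keys()
    return consistent, complete

abstract = {("staff", "read", "db"): "permit"}
concrete = {("alice", "select", "db1"): "permit",
            ("bob", "select", "db1"): "permit"}
refines = {k: ("staff", "read", "db") for k in concrete}
print(validate(abstract, concrete, refines))          # (True, True)
```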

Relevance: 10.00%

Abstract:

In this paper, we consider the classical problem of complete test generation for deterministic finite-state machines (FSMs) in a more general setting. The first generalization is that the number of states in implementation FSMs can be smaller than that of the specification FSM; previous work deals only with the case when the implementation FSMs are allowed to have the same number of states as the specification FSM. This generalization gives the test designer more options: when traditional methods trigger a test explosion for large specification machines, tests with a lower, but still guaranteed, fault coverage can be generated instead. The second generalization is that tests can be generated starting from a user-defined test suite, by incrementally extending it until the desired fault coverage is achieved. Solving the generalized test derivation problem, we formulate sufficient conditions for test suite completeness that are weaker than the existing ones, and use them to elaborate an algorithm applicable both to extending user-defined test suites to achieve the desired fault coverage and to test generation. We present experimental results indicating that the proposed algorithm allows a trade-off between the length and the fault coverage of test suites.
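
The fault-coverage view underlying this work can be sketched with a toy Mealy-machine encoding: a test suite is complete for a fault domain if every faulty implementation in the domain is distinguished from the specification by some test. The encoding and the one-state mutant below are illustrative, not the paper's construction.

```python
# Minimal sketch: measure the fault coverage of a test suite as the fraction
# of mutant implementation FSMs distinguished from the specification.
# fsm: {(state, input): (next_state, output)}, deterministic Mealy machine.

def run(fsm, word, start=0):
    """Return the output sequence the machine produces on `word`."""
    out, s = [], start
    for a in word:
        s, o = fsm[(s, a)]
        out.append(o)
    return out

def coverage(spec, mutants, tests):
    killed = sum(any(run(m, t) != run(spec, t) for t in tests) for m in mutants)
    return killed / len(mutants)

spec = {(0, "a"): (1, 0), (0, "b"): (0, 0),
        (1, "a"): (0, 1), (1, "b"): (1, 0)}
# An implementation with fewer states than the specification (one vs. two):
mutant = {(0, "a"): (0, 0), (0, "b"): (0, 0)}
print(coverage(spec, [mutant], tests=["aa", "ab"]))   # 1.0; "aa" kills the mutant
```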

Relevance: 10.00%

Abstract:

Complex networks obtained from real-world systems are often characterized by incompleteness and noise, consequences of imperfect sampling as well as artifacts in the acquisition process. Because the characterization, analysis and modeling of complex systems underlain by complex networks are critically affected by the quality and completeness of the initial structures, it becomes imperative to devise methodologies for identifying and quantifying the effects of sampling on the network structure. One way to evaluate these effects is through an analysis of the sensitivity of complex network measurements to perturbations in the topology of the network. In this paper, measurement sensitivity is quantified in terms of the relative entropy between the respective distributions. Three particularly important kinds of progressive perturbation to the network are considered, namely edge suppression, addition and rewiring. The measurements allowing the best balance of stability (smaller sensitivity to perturbations) and discriminability (separation between different network topologies) are identified with respect to each type of perturbation. The analysis covers eight different measurements applied to six complex network models and three real-world networks. This approach allows one to choose the appropriate measurements in order to obtain accurate results for networks where sampling bias cannot be avoided, a very frequent situation in research on complex networks.
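
The sketch below illustrates the core procedure for one perturbation type: apply progressive edge suppression to a model network and track the relative entropy (KL divergence) between the original and perturbed distributions of a measurement. The degree distribution stands in for the eight measurements the paper studies, and the perturbation fractions are arbitrary.

```python
# Minimal sketch: relative entropy of the degree distribution under
# progressive edge suppression on a Barabasi-Albert model network.
import networkx as nx
import numpy as np

def degree_dist(g, kmax):
    hist = np.bincount([d for _, d in g.degree()], minlength=kmax + 1).astype(float)
    hist += 1e-9                      # smoothing so every KL term is finite
    return hist / hist.sum()

rng = np.random.default_rng(1)
g0 = nx.barabasi_albert_graph(1000, 3, seed=1)
kmax = max(d for _, d in g0.degree())
p = degree_dist(g0, kmax)

edges = list(g0.edges())
for frac in (0.05, 0.10, 0.20):
    g = g0.copy()
    idx = rng.choice(len(edges), size=int(frac * len(edges)), replace=False)
    g.remove_edges_from(edges[i] for i in idx)    # progressive edge suppression
    q = degree_dist(g, kmax)
    kl = float(np.sum(p * np.log(p / q)))         # relative entropy D(p || q)
    print(f"{frac:.0%} edges removed: KL = {kl:.4f}")
```

Edge addition and rewiring follow the same pattern with the removal step swapped out; a stable measurement is one whose KL curve grows slowly with the perturbation fraction.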

Relevance: 10.00%

Abstract:

Thermophilic endo-1,3(4)-beta-glucanase (laminarinase) from Rhodothermus marinus was crystallized by the hanging-drop vapor diffusion method. The needle-like crystals belong to space group P2₁ and contain two protein molecules in the asymmetric unit, with a solvent content of 51.75%. Diffraction data were collected to a resolution of 1.95 Å, resulting in a data set with an overall R-merge of 10.4% and a completeness of 97.8%. Analysis of the structure factors revealed pseudomerohedral twinning of the crystals, with a twin fraction of approximately 42%.
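
A solvent-content figure like the 51.75% quoted above is conventionally obtained from the Matthews coefficient (Matthews, 1968). The formula in the sketch is standard; the unit-cell volume and protein mass are hypothetical values chosen only to land near the quoted figure.

```python
# Minimal sketch of the standard Matthews solvent-content calculation:
# V_M = V_cell / (Z * n_mol * MW) in A^3/Da, solvent fraction = 1 - 1.23/V_M.

def solvent_fraction(v_cell_A3, z_asu_per_cell, n_mol_asu, mw_Da):
    v_m = v_cell_A3 / (z_asu_per_cell * n_mol_asu * mw_Da)   # Matthews coefficient
    return 1.0 - 1.23 / v_m

# Hypothetical P2(1) cell (Z = 2) with two molecules per asymmetric unit:
print(f"{solvent_fraction(3.16e5, z_asu_per_cell=2, n_mol_asu=2, mw_Da=31000):.1%}")
# -> 51.7%, close to the quoted 51.75%
```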

Relevance: 10.00%

Abstract:

Inspired by recent work on approximations of classical logic, we present a method that approximates several modal logics in a modular way. Our starting point is the limitation of the degree n of introspection that is allowed, thus generating modal n-logics. A semantics for n-logics is presented, in which formulas are evaluated with respect to paths rather than possible worlds. A tableau-based proof system, n-SST, is presented, and soundness and completeness are shown for the approximations of the modal logics K, T, D, S4 and S5.

Relevance: 10.00%

Abstract:

For first-order Horn clauses without equality, resolution is complete with an arbitrary selection of a single literal in each clause [dN 96]. Here we extend this result to the case of clauses with equality for superposition-based inference systems. Our result is a generalization of the result given in [BG 01]. We answer their question about the completeness of a superposition-based system for general clauses with an arbitrary selection strategy, provided there exists a refutation without applications of the factoring inference rule.

Relevance: 10.00%

Abstract:

Until recently, First-Order Temporal Logic (FOTL) has been only partially understood. While it is well known that the full logic has no finite axiomatisation, a more detailed analysis of fragments of the logic was not previously available. However, a breakthrough by Hodkinson et al., identifying a finitely axiomatisable fragment, termed the monodic fragment, has led to improved understanding of FOTL. Yet, in order to utilise these theoretical advances, it is important to have appropriate proof techniques for the monodic fragment. In this paper, we modify and extend the clausal temporal resolution technique, originally developed for propositional temporal logics, to enable its use in such monodic fragments. We develop a specific normal form for monodic formulae in FOTL, and provide a complete resolution calculus for formulae in this form. Not only is this clausal resolution technique useful as a practical proof technique for certain monodic classes, but the approach also provides increased understanding of the monodic fragment. In particular, we show how several features of monodic FOTL can be established as corollaries of the completeness result for the clausal temporal resolution method. These include definitions of new decidable monodic classes, simplification of existing monodic classes by reductions, and completeness of clausal temporal resolution in the case of monodic logics with expanding domains, a case of much significance in both theory and practice.
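
As a brief illustration of the monodicity restriction (standard in the FOTL literature, not specific to this paper): a formula is monodic if every subformula whose outermost operator is temporal has at most one free variable.

```latex
% Monodic: the temporal subformula \Box Q(x) has one free variable.
\forall x \, \bigl( P(x) \rightarrow \Box\, Q(x) \bigr)
% Not monodic: the temporal subformula \Box R(x, y) has two free variables.
\forall x \, \forall y \, \bigl( R(x, y) \rightarrow \Box\, R(x, y) \bigr)
```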

Relevance: 10.00%

Abstract:

We prove the completeness of the regular strategy of derivations for superposition-based calculi. The regular strategy was pioneered by Kanger in [Kan63], who proposed that all equality inferences take place before all other steps in the proof. We show that the strategy remains complete under the elimination of tautologies. An implication of our result is the completeness of non-standard selection functions by which, in non-relational clauses, only equality literals (and all of them) are selected.

Relevance: 10.00%

Abstract:

First-order temporal logic is a concise and powerful notation, with many potential applications in both Computer Science and Artificial Intelligence. While the full logic is highly complex, recent work on monodic first-order temporal logics has identified important enumerable and even decidable fragments. Although a complete and correct resolution-style calculus has already been suggested for this specific fragment, this calculus involves constructions too complex to be of practical value. In this paper, we develop a machine-oriented clausal resolution method which features radically simplified proof search. We first define a normal form for monodic formulae and then introduce a novel resolution calculus that can be applied to formulae in this normal form. By careful encoding, parts of the calculus can be implemented using classical first-order resolution and can, thus, be efficiently implemented. We prove correctness and completeness results for the calculus and illustrate it on a comprehensive example. An implementation of the method is briefly discussed.