166 results for compression set


Relevance:

20.00%

Publisher:

Abstract:

The European Cystic Fibrosis Society Clinical Trial Network (ECFS-CTN) has established a Standardization Committee to undertake a rigorous evaluation of promising outcome measures with regard to use in multicentre clinical trials in cystic fibrosis (CF). The aim of this article is to present a review of the literature on the clinimetric properties of the infant raised-volume rapid thoracic compression (RVRTC) technique in the context of CF, to summarise the consensus within the group on feasibility, and to answer key questions regarding the promotion of this technique to surrogate endpoint status.

METHODS: A literature search (from 1985 onwards) identified 20 papers that met inclusion criteria of RVRTC use in infants with CF. Data were extracted and tabulated regarding repeatability, validity, correlation with other outcome measures, responsiveness and reference values. A working group discussed the tables and answered 4 key questions.

RESULTS: Overall, RVRTC, in particular forced expiratory volume in 0.5 s, showed good clinimetric properties despite the presence of individual variability. Few studies showed a relationship between RVRTC and inflammation and infection, and to date data remain limited regarding the responsiveness of RVRTC after an intervention. Concerns were raised regarding feasibility in multi-centre studies and the availability of reference values.

CONCLUSION: The ECFS-CTN Working Group considers that RVRTC cannot be used as a primary outcome in clinical trials in infants with CF before universal standardization of this measurement is achieved and inter-institutional networking is in place. We currently advise its use in phase I/II trials and as a secondary endpoint in phase III studies. We emphasise the need for (1) more short-term variability and longitudinal 'natural history' studies, and (2) robust reference values for commercially available devices.

Relevance:

20.00%

Publisher:

Abstract:

Although Answer Set Programming (ASP) is a powerful framework for declarative problem solving, it cannot intuitively handle situations in which some rules are uncertain, or in which it is more important to satisfy some constraints than others. Possibilistic ASP (PASP) is a natural extension of ASP in which certainty weights are associated with each rule. In this paper we contrast two different views on interpreting the weights attached to rules. Under the first view, weights reflect the certainty with which we can conclude the head of a rule when its body is satisfied. Under the second view, weights reflect the certainty that a given rule restricts the considered epistemic states of an agent in a valid way, i.e., the certainty that the rule itself is correct. The first view gives rise to a set of weighted answer sets, whereas the second view gives rise to a weighted set of classical answer sets.

Relevance:

20.00%

Publisher:

Abstract:

Answer Set Programming (ASP) is a popular framework for modelling combinatorial problems. However, ASP cannot be used easily for reasoning about uncertain information. Possibilistic ASP (PASP) is an extension of ASP that combines possibilistic logic and ASP. In PASP a weight is associated with each rule, where this weight is interpreted as the certainty with which the conclusion can be established when the body is known to hold. As such, it allows us to model and reason about uncertain information in an intuitive way. In this paper we present new semantics for PASP in which rules are interpreted as constraints on possibility distributions. Special models of these constraints are then identified as possibilistic answer sets. In addition, since ASP is a special case of PASP in which all the rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not been previously considered in the literature. In addition to introducing and motivating the semantics of weak disjunction, we also pinpoint its computational complexity. In particular, while the complexity of most reasoning tasks coincides with standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.

Relevance:

20.00%

Publisher:

Abstract:

Boolean games are a framework for reasoning about the rational behaviour of agents, whose goals are formalized using propositional formulas. They offer an attractive alternative to normal-form games, because they allow for a more intuitive and more compact encoding. Unfortunately, however, there is currently no general, tailor-made method available to compute the equilibria of Boolean games. In this paper, we introduce a method for finding the pure Nash equilibria based on disjunctive answer set programming. Our method is furthermore capable of finding the core elements and the Pareto optimal equilibria, and can easily be modified to support other forms of optimality, thanks to the declarative nature of disjunctive answer set programming. Experimental results clearly demonstrate the effectiveness of the proposed method.
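To make the equilibrium notion concrete, the sketch below (in Python, using an invented two-agent game; it is not the paper's disjunctive ASP encoding) enumerates strategy profiles by brute force and keeps those from which no agent can improve the satisfaction of its own goal by changing only the variables it controls.

```python
from itertools import product

# Toy two-agent Boolean game (invented for illustration, not from the paper):
# agent 1 controls p, agent 2 controls q.
controls = {1: ["p"], 2: ["q"]}
goals = {
    1: lambda v: v["p"] or v["q"],   # agent 1 wants p or q
    2: lambda v: v["q"],             # agent 2 wants q
}
variables = [x for vs in controls.values() for x in vs]

def profiles():
    """Enumerate all truth assignments to the controlled variables."""
    for bits in product([False, True], repeat=len(variables)):
        yield dict(zip(variables, bits))

def utility(agent, profile):
    """1 if the agent's goal formula is satisfied, else 0."""
    return int(goals[agent](profile))

def deviations(agent, profile):
    """All profiles the agent can reach by changing only its own variables."""
    own = controls[agent]
    for bits in product([False, True], repeat=len(own)):
        deviation = dict(profile)
        deviation.update(zip(own, bits))
        yield deviation

def is_pure_nash(profile):
    return all(utility(a, profile) >= utility(a, d)
               for a in controls for d in deviations(a, profile))

equilibria = [p for p in profiles() if is_pure_nash(p)]
print(equilibria)   # here: the two profiles in which q is True
```

A dedicated ASP-based method, as in the paper, avoids this exponential enumeration and can additionally filter for core elements or Pareto optimal equilibria declaratively.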

Relevance:

20.00%

Publisher:

Abstract:

Possibilistic answer set programming (PASP) extends answer set programming (ASP) by attaching to each rule a degree of certainty. While such an extension is important from an application point of view, existing semantics are not well-motivated, and do not always yield intuitive results. To develop a more suitable semantics, we first introduce a characterization of answer sets of classical ASP programs in terms of possibilistic logic where an ASP program specifies a set of constraints on possibility distributions. This characterization is then naturally generalized to define answer sets of PASP programs. We furthermore provide a syntactic counterpart, leading to a possibilistic generalization of the well-known Gelfond-Lifschitz reduct, and we show how our framework can readily be implemented using standard ASP solvers.
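As a reference point for the possibilistic generalization mentioned above, the following is a minimal, illustrative sketch of the classical Gelfond-Lifschitz construction (not the paper's possibilistic variant): a candidate set of atoms is an answer set exactly when it equals the least model of its reduct. The toy program is invented for illustration.

```python
from itertools import chain, combinations

# A ground normal rule is (head, positive_body, negative_body).
# Toy program: a :- not b.   b :- not a.   c :- a.
program = [
    ("a", [], ["b"]),
    ("b", [], ["a"]),
    ("c", ["a"], []),
]
atoms = {x for h, pos, neg in program for x in [h, *pos, *neg]}

def reduct(rules, candidate):
    """Gelfond-Lifschitz reduct: drop rules whose negative body clashes
    with the candidate set, then delete the remaining negative bodies."""
    return [(h, pos) for h, pos, neg in rules
            if not any(n in candidate for n in neg)]

def least_model(definite_rules):
    """Least model of a negation-free program by forward chaining."""
    model = set()
    changed = True
    while changed:
        changed = False
        for h, pos in definite_rules:
            if h not in model and all(p in model for p in pos):
                model.add(h)
                changed = True
    return model

def stable_models(rules):
    """Answer sets: candidates equal to the least model of their reduct."""
    subsets = chain.from_iterable(
        combinations(sorted(atoms), r) for r in range(len(atoms) + 1))
    return [set(s) for s in subsets
            if set(s) == least_model(reduct(rules, set(s)))]

print(stable_models(program))   # the two answer sets {'b'} and {'a', 'c'}
```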

Relevance:

20.00%

Publisher:

Abstract:

Answer set programming is a form of declarative programming that has proven very successful in succinctly formulating and solving complex problems. Although mechanisms for representing and reasoning with the combined answer set programs of multiple agents have already been proposed, the actual gain in expressivity when adding communication has not been thoroughly studied. We show that allowing simple programs to talk to each other results in the same expressivity as adding negation-as-failure. Furthermore, we show that the ability to focus on one program in a network of simple programs results in the same expressivity as adding disjunction in the head of the rules.

Relevance:

20.00%

Publisher:

Abstract:

Fuzzy answer set programming (FASP) is a generalization of answer set programming to continuous domains. However, as it cannot readily take uncertainty into account, FASP is not suitable as a basis for approximate reasoning and cannot easily be used to derive conclusions from imprecise information. To cope with this, we propose an extension of FASP based on possibility theory. The resulting framework allows us to reason about uncertain information in continuous domains, and thus also about information that is imprecise or vague. We propose a syntactic procedure, based on an immediate consequence operator, and provide a characterization in terms of minimal models, which allows us to straightforwardly implement our framework using existing FASP solvers.
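To illustrate the fixpoint-style procedure referred to above, here is a minimal sketch of an immediate consequence operator for a positive fuzzy program, assuming the minimum t-norm for rule bodies and rule weights that cap the derived truth degree; the connectives and the possibilistic layer of the paper itself are more general, and the toy rules are invented.

```python
# Toy positive fuzzy program (invented): each rule is
# (head, body_atoms, weight); the body is read with the minimum t-norm
# and the weight caps the degree derived for the head.
rules = [
    ("wet", ["rain"], 1.0),
    ("slippery", ["wet"], 0.8),
    ("rain", [], 0.7),          # a fact with degree 0.7
]
atoms = {a for h, body, _ in rules for a in [h, *body]}

def immediate_consequence(interpretation):
    """One application of the T_P operator: each atom receives the best
    support given to it by any rule under the current interpretation."""
    new = dict(interpretation)
    for head, body, weight in rules:
        body_value = min([interpretation[b] for b in body], default=1.0)
        new[head] = max(new[head], min(weight, body_value))
    return new

# Kleene-style iteration from the interpretation mapping everything to 0;
# for a positive program this converges to the least fuzzy model.
interpretation = {a: 0.0 for a in atoms}
while True:
    updated = immediate_consequence(interpretation)
    if updated == interpretation:
        break
    interpretation = updated

print(interpretation)   # rain: 0.7, wet: 0.7, slippery: 0.7
```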

Relevance:

20.00%

Publisher:

Abstract:

Hidden Markov models (HMMs) are widely used probabilistic models of sequential data. As with other probabilistic models, they require the specification of local conditional probability distributions, whose assessment can be difficult and error-prone, especially when data are scarce or costly to acquire. The imprecise HMM (iHMM) generalizes HMMs by allowing the quantification to be done by sets of, instead of single, probability distributions. iHMMs have the ability to suspend judgment when there is not enough statistical evidence, and can serve as a sensitivity analysis tool for standard non-stationary HMMs. In this paper, we consider iHMMs under the strong independence interpretation, for which we develop efficient inference algorithms to address standard HMM usage such as the computation of likelihoods and most probable explanations, as well as performing filtering and predictive inference. Experiments with real data show that iHMMs produce more reliable inferences without compromising the computational efficiency.
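For orientation, in the precise case the filtering task mentioned above reduces to the standard HMM forward recursion, sketched below with made-up numbers; the imprecise HMM instead works with sets of transition and emission matrices and propagates lower and upper probability bounds rather than point values.

```python
import numpy as np

# Toy two-state HMM with two observation symbols (numbers invented).
initial = np.array([0.6, 0.4])            # P(X_1)
transition = np.array([[0.7, 0.3],        # P(X_{t+1} | X_t)
                       [0.2, 0.8]])
emission = np.array([[0.9, 0.1],          # P(Y_t | X_t)
                     [0.3, 0.7]])

def forward_filter(observations):
    """Return P(X_t | y_1..y_t) for each t (normalized forward messages)."""
    beliefs = []
    alpha = initial * emission[:, observations[0]]
    alpha /= alpha.sum()
    beliefs.append(alpha)
    for obs in observations[1:]:
        alpha = (transition.T @ alpha) * emission[:, obs]
        alpha /= alpha.sum()
        beliefs.append(alpha)
    return beliefs

for t, belief in enumerate(forward_filter([0, 0, 1]), start=1):
    print(f"t={t}: {belief}")
```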

Relevance:

20.00%

Publisher:

Abstract:

Many problems in artificial intelligence can be encoded as answer set programs (ASP) in which some rules are uncertain. ASP programs with incorrect rules may have erroneous conclusions, but due to the non-monotonic nature of ASP, omitting a correct rule may also lead to errors. To derive the most certain conclusions from an uncertain ASP program, we thus need to consider all situations in which some, none, or all of the least certain rules are omitted. This corresponds to treating some rules as optional and reasoning about which conclusions remain valid regardless of the inclusion of these optional rules. While a version of possibilistic ASP (PASP) based on this view has recently been introduced, no implementation is currently available. In this paper we propose a simulation of the main reasoning tasks in PASP using (disjunctive) ASP programs, allowing us to take advantage of state-of-the-art ASP solvers. Furthermore, we identify how several interesting AI problems can be naturally seen as special cases of the considered reasoning tasks, including cautious abductive reasoning and conformant planning. As such, the proposed simulation enables us to solve instances of the latter problem types that are more general than what current solvers can handle.
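The core reasoning task, treating some rules as optional and keeping only conclusions that survive every choice of including or omitting them, can be illustrated with the brute-force sketch below; for brevity it is restricted to negation-free rules (so each rule set has a unique least model) and uses an invented toy program, whereas the paper's simulation compiles the task into disjunctive ASP instead of enumerating subsets.

```python
from itertools import chain, combinations

# Negation-free rules as (head, body); certain rules always apply,
# optional rules may or may not be included (invented toy program).
certain = [("alarm", ["smoke"]), ("smoke", [])]
optional = [("faulty_sensor", []), ("ignore", ["faulty_sensor"])]

def least_model(rules):
    """Forward chaining to the least model of a negation-free program."""
    model = set()
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

def subsets(items):
    return chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))

# Cautious conclusions: atoms derived no matter which optional rules are kept.
models = [least_model(certain + list(extra)) for extra in subsets(optional)]
cautious = set.intersection(*models)
print(cautious)   # {'smoke', 'alarm'}: derivable under every choice
```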

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To assess the bacterial contamination risk in cataract surgery associated with mechanical compression of the lid margin immediately after sterilization of the ocular surface.

Setting: Department of Cataract, Zhongshan Ophthalmic Center, Sun Yat-sen University, Guangzhou, China.

Design: Prospective randomized controlled double-masked trial.

Methods: Patients with age-related cataract were randomly assigned to 1 of 2 groups. In Group A (153 eyes), the lid margin was compressed and scrubbed for 360 degrees 5 times with a dry sterile cotton-tipped applicator immediately after ocular sterilization and before povidone-iodine irrigation of the conjunctival sac. Group B (153 eyes) had identical sterilization but no lid scrubbing. Samples from the lid margin, liquid in the collecting bag, and aqueous humor were collected for bacterial culture. Primary outcome measures included the rate of positive bacterial culture for the above samples. The species of bacteria isolated were recorded.

Results: Group A and Group B each comprised 153 eyes. The positive rate of lid margin cultures was 54.24%. The positive rate of cultures for liquid in the collecting bag was significantly higher in Group A (23.53%) than in Group B (9.80%) (P=.001). The bacterial species cultured from the collecting bag in Group B were the same as those from the lid margin in Group A. The positive culture rate of aqueous humor in both groups was 0%.

Conclusion: Mechanical compression of the lid margin immediately before and during cataract surgery increased the risk for bacterial contamination of the surgical field, perhaps due to secretions from the lid margin glands.

Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned.

Relevance:

20.00%

Publisher:

Abstract:

Seafloor massive sulfide (SMS) mining will likely occur at hydrothermal systems in the near future. Alongside their mineral wealth, SMS deposits also have considerable biological value. Active SMS deposits host endemic hydrothermal vent communities, whilst inactive deposits support communities of deep water corals and other suspension feeders. Mining activities are expected to remove all large organisms and suitable habitat in the immediate area, making vent endemic organisms particularly at risk from habitat loss and localised extinction. As part of environmental management strategies designed to mitigate the effects of mining, areas of seabed need to be protected to preserve biodiversity that is lost at the mine site and to preserve communities that support connectivity among populations of vent animals in the surrounding region. These "set-aside" areas need to be biologically similar to the mine site and be suitably connected, mostly by transport of larvae, to neighbouring sites to ensure exchange of genetic material among remaining populations. Establishing suitable set-asides can be a formidable task for environmental managers; however, the application of genetic approaches can aid set-aside identification, suitability assessment and monitoring. There are many genetic tools available, including analysis of mitochondrial DNA (mtDNA) sequences (e.g. COI or other suitable mtDNA genes) and appropriate nuclear DNA markers (e.g. microsatellites, single nucleotide polymorphisms), environmental DNA (eDNA) techniques and microbial metagenomics. When used in concert with traditional biological survey techniques, these tools can help to identify species, assess the genetic connectivity among populations and assess the diversity of communities. How these techniques can be applied to set-aside decision making is discussed and recommendations are made for the genetic characteristics of set-aside sites. A checklist for environmental regulators forms a guide to aid decision making on the suitability of set-aside design and assessment using genetic tools. This non-technical primer document represents the views of participants in the VentBase 2014 workshop.

Relevance:

20.00%

Publisher:

Abstract:

BaH (and its isotopomers) is an attractive molecular candidate for laser cooling to ultracold temperatures and a potential precursor for the production of ultracold gases of hydrogen and deuterium. The theoretical challenge is to simulate the laser cooling cycle as reliably as possible, and this paper addresses the generation of a highly accurate ab initio $^{2}\Sigma^+$ potential for such studies. The performance of various basis sets within the multi-reference configuration-interaction (MRCI) approximation with the Davidson correction (MRCI+Q) is tested and taken to the Complete Basis Set (CBS) limit. It is shown that the calculated molecular constants using a 46-electron effective core potential (ECP) and even-tempered augmented polarized core-valence basis sets (aug-pCV$n$Z-PP, $n$ = 4 and 5), but only including three active electrons in the MRCI calculation, are in excellent agreement with the available experimental values. The predicted dissociation energy $D_e$ for the X$^2\Sigma^+$ state (extrapolated to the CBS limit) is 16895.12 cm$^{-1}$ (2.094 eV), which agrees to within 0.1% with a revised experimental value of <16910.6 cm$^{-1}$, while the calculated $r_e$ is within 0.03 pm of the experimental result.
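The abstract does not state which CBS extrapolation formula was applied; purely as an illustration, a common two-point scheme for correlation energies assumes $E(n) = E_{\mathrm{CBS}} + A\,n^{-3}$ and solves for $E_{\mathrm{CBS}}$ from two cardinal numbers (here $n$ = 4 and 5, matching the basis sets above); the energies in the sketch are placeholders.

```python
# Generic two-point CBS extrapolation, assuming E(n) = E_CBS + A / n**3
# (the paper's actual scheme is not given in the abstract; all numbers
# below are placeholders, not results from the paper).
def cbs_two_point(e_small, e_large, n_small, n_large):
    """Solve E(n) = E_CBS + A/n^3 for E_CBS from two points."""
    w_small, w_large = n_small ** 3, n_large ** 3
    return (e_large * w_large - e_small * w_small) / (w_large - w_small)

# Placeholder correlation energies (hartree) for n = 4 and n = 5.
print(cbs_two_point(-0.245, -0.251, 4, 5))
```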

Relevance:

20.00%

Publisher:

Abstract:

The duration compression effect is a phenomenon in which prior adaptation to a spatially circumscribed dynamic stimulus results in the duration of subsequent subsecond stimuli presented in the adapted region being underestimated. There is disagreement over the frame of reference within which the duration compression phenomenon occurs. One view holds that the effect is driven by retinotopically tuned mechanisms located at early stages of visual processing, and an alternate position is that the mechanisms are spatiotopic and occur at later stages of visual processing (MT+). We addressed the retinotopic-spatiotopic question by using adapting stimuli (drifting plaids) that are known to activate global-motion mechanisms in area MT. If spatiotopic mechanisms contribute to the duration compression effect, drifting plaid adaptors should be well suited to revealing them. Following adaptation, participants were tasked with estimating the duration of a 600 ms random dot stimulus, whose direction was identical to the pattern direction of the adapting plaid, presented at either the same retinotopic or the same spatiotopic location as the adaptor. Our results reveal significant duration compression in both conditions, pointing to the involvement of both retinotopically tuned and spatiotopically tuned mechanisms in the duration compression effect.