Abstract:
Objective: Community surveys have shown that many otherwise well individuals report delusional-like experiences. The authors examined psychopathology during childhood and adolescence as a predictor of delusional-like experiences in young adulthood. ---------- Method: The authors analyzed prospective data from the Mater-University of Queensland Study of Pregnancy, a birth cohort of 3,617 young adults born between 1981 and 1983. Psychopathology was measured at ages 5 and 14 using the Child Behavior Checklist (CBCL) and at age 14 using the Youth Self-Report (YSR). Delusional-like experiences were measured at age 21 using the Peters Delusional Inventory. The association between childhood and adolescent symptoms and later delusional-like experiences was examined using logistic regression. ---------- Results: High CBCL scores at ages 5 and 14 predicted high levels of delusional-like experiences at age 21 (odds ratios for the highest versus the other quartiles combined were 1.25 and 1.85, respectively). Those with YSR scores in the highest quartile at age 14 were nearly four times as likely to have high levels of delusional-like experiences at age 21 (odds ratio=3.71). Adolescent-onset psychopathology and continuous psychopathology through both childhood and adolescence strongly predicted delusional-like experiences at age 21. Hallucinations at age 14 were significantly associated with delusional-like experiences at age 21. The general pattern of associations persisted when adjusted for previous drug use or the presence of nonaffective psychoses at age 21. ---------- Conclusion: Psychopathology during childhood and adolescence predicts adult delusional-like experiences. Understanding the biological and psychosocial factors that influence this developmental trajectory may provide clues to the pathogenesis of psychotic-like experiences.
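The odds ratios reported above compare the highest quartile against the other three quartiles combined, which reduces to the familiar cross-product ratio of a 2x2 table. A minimal sketch (the counts below are invented for illustration and are not the study's data):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio for a 2x2 table: odds of the outcome among the exposed
    (here, highest-quartile scorers) divided by the odds among the rest."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts: 60/200 high-quartile subjects with the outcome vs 90/1100 others
print(round(odds_ratio(60, 200, 90, 1100), 2))  # → 3.67
```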
Abstract:
Controlled rate thermal analysis (CRTA) technology offers better resolution and a more detailed interpretation of the decomposition processes of a clay mineral such as sepiolite by approaching equilibrium decomposition conditions: the slow transfer of heat to the sample is eliminated as a controlling parameter of the decomposition process. Constant-rate, non-isothermal decomposition reveals the changes in the sepiolite as it is converted to an anhydride. In the dynamic experiment, two dehydration steps are observed over the temperature ranges ~20–170 and 170–350 °C, and three dehydroxylation steps are observed over the temperature ranges 201–337, 337–638 and 638–982 °C. The CRTA technology enables the separation of these thermal decomposition steps.
Abstract:
The evolution of organisms that cause healthcare acquired infections (HAI) puts extra stress on hospitals already struggling with rising costs and demands for greater productivity and cost containment. Infection control can save scarce resources, lives, and possibly a facility's reputation, but statistics and epidemiology are not always sufficient to make the case for the added expense. Economics and Preventing Healthcare Acquired Infection presents a rigorous analytic framework for dealing with this increasingly serious problem. ----- Engagingly written for the economics non-specialist, and brimming with tables, charts, and case examples, the book lays out the concepts of economic analysis in clear, real-world terms so that infection control professionals or infection preventionists will gain competence in developing analyses of their own, and be confident in the arguments they present to decision-makers. The authors: ----- Ground the reader in the basic principles and language of economics. ----- Explain the role of health economists in general and in terms of infection prevention and control. ----- Introduce the concept of economic appraisal, showing how to frame the problem, evaluate and use data, and account for uncertainty. ----- Review methods of estimating and interpreting the costs and health benefits of HAI control programs and prevention methods. ----- Walk the reader through a published economic appraisal of an infection reduction program. ----- Identify current and emerging applications of economics in infection control. ----- Economics and Preventing Healthcare Acquired Infection is a unique resource for practitioners and researchers in infection prevention, control and healthcare economics. It offers a valuable alternative perspective for professionals in health services research, healthcare epidemiology, healthcare management, and hospital administration.
----- Written for: Professionals and researchers in infection control, health services research, hospital epidemiology, healthcare economics, healthcare management, hospital administration; Association of Professionals in Infection Control (APIC), Society for Healthcare Epidemiologists of America (SHEA)
Abstract:
A data-driven background dataset refinement technique was recently proposed for SVM-based speaker verification. This method selects a refined SVM background dataset from a set of candidate impostor examples after individually ranking examples by their relevance. This paper extends this technique to the refinement of the T-norm dataset for SVM-based speaker verification. The independent refinement of the background and T-norm datasets provides a means of investigating the sensitivity of SVM-based speaker verification performance to the selection of each of these datasets. Using refined datasets provided improvements of 13% in min. DCF and 9% in EER over the full set of impostor examples on the 2006 SRE corpus, with the majority of these gains due to refinement of the T-norm dataset. Similar trends were observed for the unseen data of the NIST 2008 SRE.
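The refinement above targets the cohort used for test normalisation (T-norm), which standardises a trial score against the score distribution of the same test utterance over a cohort of impostor models. A minimal sketch of the T-norm step itself (the cohort scores are invented for illustration; the refinement algorithm is not shown):

```python
import numpy as np

def t_norm(raw_score, cohort_scores):
    """Test-normalisation: standardise a trial score by the mean and standard
    deviation of the same test segment scored against a cohort of T-norm models."""
    mu = np.mean(cohort_scores)
    sigma = np.std(cohort_scores)
    return (raw_score - mu) / sigma

cohort = np.array([0.1, -0.2, 0.05, -0.1, 0.0])  # hypothetical cohort scores
normalised = t_norm(1.2, cohort)
```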
Abstract:
This paper presents Scatter Difference Nuisance Attribute Projection (SD-NAP) as an enhancement to NAP for SVM-based speaker verification. While standard NAP may inadvertently remove desirable speaker variability, SD-NAP explicitly de-emphasises this variability by incorporating a weighted version of the between-class scatter into the NAP optimisation criterion. Experimental evaluation of SD-NAP with a variety of SVM systems on the 2006 and 2008 NIST SRE corpora demonstrates that SD-NAP provides improved verification performance over standard NAP in most cases, particularly at the EER operating point.
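Standard NAP, which SD-NAP extends, removes a learned nuisance subspace from the SVM expansion space via the projection P = I - UU^T. A minimal sketch of that projection (the nuisance directions here are hypothetical, and the SD-NAP weighting of the between-class scatter is not shown):

```python
import numpy as np

def nap_projection(nuisance_dirs):
    """Build the NAP projection P = I - U U^T that removes the subspace
    spanned by the (orthonormal) nuisance direction columns of U."""
    U = np.asarray(nuisance_dirs)            # shape (dim, k)
    return np.eye(U.shape[0]) - U @ U.T

# Toy 3-D example: suppose session variability lies along the first axis
U = np.array([[1.0], [0.0], [0.0]])
P = nap_projection(U)
v = np.array([2.0, 3.0, 4.0])
projected = P @ v                            # nuisance component removed
```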
Abstract:
It is known that adenosine 5'-triphosphate (ATP) is a cotransmitter in the heart. Additionally, ATP is released from ischemic and hypoxic myocytes. Therefore, cardiac-derived sources of ATP have the potential to modify cardiac function. ATP activates P2X(1-7) and P2Y(1-14) receptors; however, the presence of P2X and P2Y receptor subtypes in strategic cardiac locations such as the sinoatrial node has not been determined. An understanding of P2X and P2Y receptor localization would facilitate investigation of purine receptor function in the heart. Therefore, we used quantitative PCR and in situ hybridization to measure the expression of mRNA of all known purine receptors in rat left ventricle, right atrium and sinoatrial node (SAN), and human right atrium and SAN. Expression of mRNA for all the cloned P2 receptors was observed in the ventricles, atria, and SAN of the rat. However, their abundance varied in different regions of the heart. P2X(5) was the most abundant of the P2X receptors in all three regions of the rat heart. In rat left ventricle, P2Y(1), P2Y(2), and P2Y(14) mRNA levels were highest for P2Y receptors, while in right atrium and SAN, P2Y(2) and P2Y(14) levels were highest, respectively. We extended these studies to investigate P2X(4) receptor mRNA in heart from rats with coronary artery ligation-induced heart failure. P2X(4) receptor mRNA was upregulated by 93% in SAN (P < 0.05), while a trend towards an increase was also observed in the right atrium and left ventricle (not significant). Thus, P2X(4)-mediated effects might be modulated in heart failure. mRNA for P2X(4-7) and P2Y(1,2,4,6,12-14), but not P2X(2,3) and P2Y(11), was detected in human right atrium and SAN. In addition, mRNA for P2X(1) was detected in human SAN but not human right atrium. In human right atrium and SAN, P2X(4) and P2X(7) mRNA was the highest for P2X receptors. 
P2Y(1) and P2Y(2) mRNA were the most abundant for P2Y receptors in the right atrium, while P2Y(1), P2Y(2), and P2Y(14) were the most abundant P2Y receptor subtypes in human SAN. This study shows a widespread distribution of P2 receptor mRNA in rat heart tissues but a more restricted presence and distribution of P2 receptor mRNA in human atrium and SAN. This study provides further direction for the elucidation of P2 receptor modulation of heart rate and contractility.
Abstract:
In Orissa state, India, the DakNet system supports asynchronous Internet communication between an urban hub and rural nodes. DakNet is noteworthy in many respects, not least in how the system leverages existing transport infrastructure. Wi-Fi transceivers mounted on local buses send and receive user data from roadside kiosks, for later transfer to/from the Internet via wireless protocols. This store-and-forward system allows DakNet to offer asynchronous communication capacity to rural users at low cost. The original ambition of the DakNet system was to provide email and SMS facilities to rural communities. Our 2008 study of the communicative ecology surrounding the DakNet system revealed that this ambition has now evolved – in response to market demand – to the extent that e-shopping (rather than email) has become the primary driver behind the DakNet offer.
Abstract:
This paper presents a novel approach to estimating the confidence interval of speaker verification scores, which is used to minimise the utterance length required to produce a confident verification decision. The confidence estimation method is further extended to address both the high correlation between consecutive frame scores and robustness with very limited training samples. The proposed technique drastically reduces the typical data requirements for producing confident decisions in an automatic speaker verification system. When evaluated on the NIST 2005 SRE, the early verification decision method demonstrates that an average of 5–10 seconds of speech is sufficient to produce verification rates approaching those previously achieved using an average of more than 100 seconds of speech.
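One way to realise such an early decision rule is to stop accumulating frames once a confidence interval on the mean frame score clears the decision threshold. A rough sketch under a normal approximation; the `corr_factor` effective-sample-size correction is only a crude stand-in for the paper's treatment of correlated frame scores:

```python
import numpy as np

def mean_ci(frame_scores, z=1.96, corr_factor=1.0):
    """Normal-approximation confidence interval for the mean frame score.
    corr_factor > 1 shrinks the effective sample size to allow (crudely)
    for correlation between consecutive frame scores."""
    s = np.asarray(frame_scores)
    n_eff = len(s) / corr_factor
    half = z * s.std(ddof=1) / np.sqrt(n_eff)
    return s.mean() - half, s.mean() + half

def early_decision(frame_scores, threshold, **kw):
    """Decide as soon as the whole interval clears the threshold."""
    lo, hi = mean_ci(frame_scores, **kw)
    if lo > threshold:
        return "accept"
    if hi < threshold:
        return "reject"
    return "undecided"
```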
Abstract:
Intuitively, any 'bag of words' approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. First, the term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than just the document's initial distribution. A secondary contribution is to investigate the practical application of this representation as queries become increasingly verbose. In the experiments (based on Lemur's search engine substrate) the default query model was replaced by the stable distribution of the query. Modeling the query this way alone already resulted in significant improvements over a standard language model baseline. The results were on a par with or better than those of more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach becomes.
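The stationary distribution of an ergodic chain built from co-occurrence counts can be computed by power iteration after row-normalising the counts into transition probabilities. A minimal sketch on a two-term toy chain (the counts are invented for illustration):

```python
import numpy as np

def stationary_distribution(cooc, iters=10_000, tol=1e-12):
    """Stationary distribution of the Markov chain induced by a term
    co-occurrence matrix, via power iteration from a uniform start.
    Ergodicity guarantees a unique limit independent of the start."""
    P = cooc / cooc.sum(axis=1, keepdims=True)   # rows -> transition probs
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        nxt = pi @ P
        if np.abs(nxt - pi).sum() < tol:
            break
        pi = nxt
    return pi

cooc = np.array([[1.0, 3.0],
                 [2.0, 2.0]])                    # toy 2-term co-occurrence counts
pi = stationary_distribution(cooc)
```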
Abstract:
Collaborative tagging helps users organize, share and retrieve information easily and quickly. Because collaborative tagging information reflects users' personal preferences, it can be used to recommend personalized items. This paper proposes a novel tag-based collaborative filtering approach for recommending personalized items to users of online communities that are equipped with tagging facilities. Based on the distinctive three-dimensional relationships among users, tags and items, a new similarity measure is proposed to generate the neighborhood of users with similar tagging behavior instead of similar implicit ratings. Promising experimental results show that, by using the tagging information, the proposed approach outperforms the standard user- and item-based collaborative filtering approaches.
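One simple instantiation of tagging-behaviour similarity is cosine similarity between users' tag-frequency vectors; the paper's three-dimensional user-tag-item measure is richer, so treat this as an illustrative baseline with invented data:

```python
import math

def tag_vector(user_tags):
    """Bag-of-tags frequency vector for one user."""
    v = {}
    for tag in user_tags:
        v[tag] = v.get(tag, 0) + 1
    return v

def cosine_sim(a, b):
    """Cosine similarity between two users' sparse tag-frequency vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(x * x for x in a.values()))
    nb = math.sqrt(sum(x * x for x in b.values()))
    return dot / (na * nb) if na and nb else 0.0

alice = tag_vector(["python", "ml", "ml", "data"])   # hypothetical users
bob = tag_vector(["ml", "data", "stats"])
similarity = cosine_sim(alice, bob)
```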
Abstract:
Anomalous dynamics in complex systems have gained much interest in recent years. In this paper, a two-dimensional anomalous subdiffusion equation (2D-ASDE) is considered. Two numerical methods for solving the 2D-ASDE are presented. Their stability, convergence and solvability are discussed. A new multivariate extrapolation is introduced to improve the accuracy. Finally, numerical examples are given to demonstrate the effectiveness of the schemes and confirm the theoretical analysis.
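Finite-difference schemes for subdiffusion equations typically discretise the fractional time derivative using Grünwald-Letnikov weights, which satisfy a simple recurrence. A sketch of that single ingredient (not the paper's full 2D schemes or the multivariate extrapolation):

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)^k * C(alpha, k), generated by the
    recurrence w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k). These weight
    the history terms when discretising a fractional derivative of order alpha."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

# For alpha = 1 the weights collapse to the ordinary first difference 1, -1, 0, 0, ...
w_half = gl_weights(0.5, 5)   # typical subdiffusion orders lie in (0, 1)
```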
Abstract:
Association rule mining has made many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a big concern and has drawn increasing attention recently. One problem with the quality of the discovered association rules is the huge size of the extracted rule set. Often, a huge number of rules can be extracted from a dataset, but many of them are redundant with respect to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to solving this problem. In this paper, we first propose a definition of redundancy; we then propose a concise representation, called the Reliable basis, for representing non-redundant association rules, covering both exact and approximate rules. An important contribution of this paper is the proposal to use the certainty factor as the criterion for measuring the strength of the discovered association rules. With this criterion, we can determine the boundary between redundancy and non-redundancy, ensuring that as many redundant rules as possible are eliminated without reducing the inference capacity of, or the belief in, the remaining non-redundant rules. We prove that redundancy elimination based on the proposed Reliable basis does not reduce belief in the extracted rules. We also prove that all association rules can be deduced from the Reliable basis. The Reliable basis is therefore a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules.
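The certainty factor is commonly defined from a rule's confidence and the support of its consequent. A sketch of one standard (Shortliffe-style) formulation, which may differ in detail from the paper's definition:

```python
def certainty_factor(conf, supp_b):
    """Certainty factor of a rule A -> B, from its confidence conf = P(B|A)
    and the support supp_b = P(B) of the consequent. Positive values mean A
    increases belief in B; negative values mean A decreases it; 0 means A is
    uninformative about B."""
    if conf > supp_b:
        return (conf - supp_b) / (1.0 - supp_b)
    if conf < supp_b:
        return (conf - supp_b) / supp_b
    return 0.0

# A rule whose confidence (0.8) exceeds the base rate of B (0.5) strengthens belief
cf = certainty_factor(0.8, 0.5)
```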
Abstract:
One of the new challenges in aeronautics is combining and accounting for multiple disciplines while considering uncertainties or variability in the design parameters or operating conditions. This paper describes a methodology for robust multidisciplinary design optimisation under uncertainty in the operating conditions. The methodology, based on canonical evolutionary algorithms, is enhanced by coupling it with an uncertainty analysis technique. The paper illustrates the use of this methodology on two practical test cases involving Unmanned Aerial Systems (UAS), which are ideal candidates due to the multi-physics involved and the variability of the missions to be performed. Results obtained from the optimisation show that the method is effective in finding useful Pareto non-dominated solutions and demonstrate the value of robust design techniques.
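A common way to make an evolutionary objective robust is to penalise the spread of performance over sampled operating conditions, so that flat, insensitive designs are preferred over sharply tuned ones. A toy sketch of that idea (the mean-plus-k-sigma form and the toy performance models are assumptions for illustration, not the paper's formulation):

```python
import random

def robust_fitness(design, performance, n_samples=200, sigma=0.05, k=1.0, seed=0):
    """Robust objective (to minimise): mean performance over perturbed operating
    conditions plus k standard deviations, penalising sensitive designs."""
    rng = random.Random(seed)
    vals = [performance(design, 1.0 + rng.gauss(0.0, sigma)) for _ in range(n_samples)]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean + k * var ** 0.5

# Two hypothetical drag-like models: 'steep' is better at the nominal condition
# but degrades quickly away from it; 'flat' is slightly worse but insensitive.
flat = lambda d, c: 1.0 + 0.1 * (c - 1.0) ** 2
steep = lambda d, c: 0.99 + 50.0 * (c - 1.0) ** 2
```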
Abstract:
Train scheduling is a complex and time-consuming task of vital importance. To schedule trains more accurately and efficiently than current techniques permit, a novel hybrid job shop approach has been proposed and implemented. Unique characteristics of train scheduling are first incorporated into a disjunctive graph model of train operations. A constructive algorithm that utilises this model is then developed. The constructive algorithm is a general procedure that builds a schedule using insertion, backtracking and dynamic route selection mechanisms. It provides significant search capability and is valid for any objective criterion. Simulated Annealing and Local Search meta-heuristic improvement algorithms are also adapted and extended. An important feature of these approaches is a new compound perturbation operator, consisting of many unitary moves, that allows trains to be shifted feasibly and more easily within the solution. A numerical investigation and case study demonstrate that high-quality solutions are obtainable on real-sized applications.
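The Simulated Annealing improvement phase with a compound perturbation operator can be skeletonised as follows; the block-shift move and toy quadratic cost are illustrative stand-ins for the train-scheduling specifics:

```python
import math
import random

def anneal(schedule, cost, perturb, t0=10.0, cooling=0.95, iters=500, seed=0):
    """Simulated annealing skeleton: worse candidates are accepted with
    probability exp(-delta / T), and T decays geometrically."""
    rng = random.Random(seed)
    best = cur = schedule
    t = t0
    for _ in range(iters):
        cand = perturb(cur, rng)
        delta = cost(cand) - cost(cur)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            cur = cand
            if cost(cur) < cost(best):
                best = cur
        t *= cooling
    return best

targets = (0, 10, 20, 30)                                  # hypothetical ideal departures
cost = lambda s: sum((a - b) ** 2 for a, b in zip(s, targets))

def shift_block(s, rng):
    """Compound move: shift a random contiguous block of trains together by +/-1,
    so several trains move in one perturbation rather than one at a time."""
    i = rng.randrange(len(s)); j = rng.randrange(i, len(s))
    d = rng.choice((-1, 1))
    return tuple(x + d if i <= k <= j else x for k, x in enumerate(s))

best = anneal((5, 5, 5, 5), cost, shift_block)
```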
Abstract:
Privacy enhancing protocols (PEPs) are a family of protocols that allow secure exchange and management of sensitive user information. They are important in preserving users’ privacy in today’s open environment. Proof of the correctness of PEPs is necessary before they can be deployed. However, the traditional provable security approach, though well established for verifying cryptographic primitives, is not applicable to PEPs. We apply the formal method of Coloured Petri Nets (CPNs) to construct an executable specification of a representative PEP, namely the Private Information Escrow Bound to Multiple Conditions Protocol (PIEMCP). Formal semantics of the CPN specification allow us to reason about various security properties of PIEMCP using state space analysis techniques. This investigation provides us with preliminary insights for modeling and verification of PEPs in general, demonstrating the benefit of applying the CPN-based formal approach to proving the correctness of PEPs.
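The state space analysis underpinning such verification rests on enumerating all reachable markings of the net. A minimal sketch for a plain (uncoloured) place/transition net; CPNs add typed tokens and arc expressions on top of this idea:

```python
from collections import deque

def reachable_markings(initial, transitions):
    """Breadth-first state-space exploration of a place/transition net.
    A marking is a tuple of token counts (one slot per place); each transition
    is a (consume, produce) pair of per-place counts, enabled when every input
    place holds enough tokens."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            if all(have >= need for have, need in zip(m, consume)):
                nxt = tuple(h - c + p for h, c, p in zip(m, consume, produce))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

# Toy 2-place net: t1 moves the token P0 -> P1, t2 moves it back
transitions = [((1, 0), (0, 1)),
               ((0, 1), (1, 0))]
states = reachable_markings((1, 0), transitions)
```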