979 results for consistency


Relevance: 10.00%

Abstract:

Recent years have witnessed a rapid growth in the demand for streaming video over the Internet, exposing challenges in coping with heterogeneous device capabilities and varying network throughput. When we couple this rise in streaming with the growing number of portable devices (smartphones, tablets, laptops), we see an ever-increasing demand for high-definition video online while on the move. Wireless networks are inherently characterised by restricted shared bandwidth and relatively high error rates, thus presenting a challenge for the efficient delivery of high-quality video. Additionally, mobile devices can support and demand a range of video resolutions and qualities. This demand for mobile streaming highlights the need for adaptive video streaming schemes that can adjust to available bandwidth and device heterogeneity, and can provide graceful changes in video quality while respecting viewer satisfaction. In this context, the use of well-known scalable media streaming techniques, commonly known as scalable coding, is an attractive solution and the focus of this thesis. In this thesis we investigate the transmission of existing scalable video models over a lossy network and determine how the variation in viewable quality is affected by packet loss. This work focuses on leveraging the benefits of scalable media while reducing the effects of data loss on achievable video quality. The overall approach is focused on the strategic packetisation of the underlying scalable video and how best to utilise error resiliency to maximise viewable quality. In particular, we examine the manner in which scalable video is packetised for transmission over lossy networks and propose new techniques that reduce the impact of packet loss on scalable video by selectively choosing how to packetise the data and which data to transmit. We also exploit redundancy techniques, such as error resiliency, to enhance stream quality by ensuring a smooth play-out with fewer changes in achievable video quality. The contributions of this thesis are the creation of new segmentation and encapsulation techniques which increase the viewable quality of existing scalable models by fragmenting and re-allocating the video sub-streams based on user requirements, available bandwidth and variations in loss rates. We offer new packetisation techniques which reduce the effects of packet loss on viewable quality by leveraging the increase in the number of frames per group of pictures (GOP) and by providing equality of data in every packet transmitted per GOP. These provide novel mechanisms for packetisation and error resiliency, as well as new applications for existing techniques such as interleaving and Priority Encoded Transmission. We also introduce three new scalable coding models, which offer a balance between transmission cost and consistency of viewable quality.
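To make the interleaving idea concrete, the following Python fragment is a minimal sketch, not the packetiser developed in the thesis: a hypothetical round-robin interleaver that spreads pieces of every frame in a GOP evenly across packets, so that losing one packet degrades all frames slightly rather than removing whole frames.

```python
# Illustrative only: frame sizes, GOP length and packet count are invented,
# and this is a generic interleaver, not the thesis's packetisation scheme.

def interleave_gop(frames, n_packets):
    """Spread chunks of every frame in one GOP across n_packets round-robin,
    so each packet carries roughly the same share of each frame."""
    packets = [[] for _ in range(n_packets)]
    for frame_idx, frame in enumerate(frames):
        chunk = max(1, len(frame) // n_packets)                 # rough chunk size
        pieces = [frame[i:i + chunk] for i in range(0, len(frame), chunk)]
        for piece_idx, piece in enumerate(pieces):
            packets[(frame_idx + piece_idx) % n_packets].append((frame_idx, piece))
    return packets

# toy GOP: 8 frames of dummy byte payloads
gop = [bytes([f]) * 120 for f in range(8)]
pkts = interleave_gop(gop, n_packets=4)
print([sum(len(piece) for _, piece in pkt) for pkt in pkts])    # payload bytes per packet
```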

Relevance: 10.00%

Abstract:

Background: Spirituality is fundamental to all human beings, existing within a person and developing until death. This research sought to operationalise spirituality in a sample of individuals with chronic illness. A review of the conceptual literature identified three dimensions of spirituality: connectedness, transcendence, and meaning in life. A review of the empirical literature identified one instrument that measures the three dimensions together. Yet, recent appraisals of this instrument highlighted issues with item formulation and limited evidence of reliability and validity. Aim: The aim of this research was to develop a theoretically grounded instrument to measure spirituality, the Spirituality Instrument-27 (SpI-27). A secondary aim was to psychometrically evaluate this instrument in a sample of individuals with chronic illness (n=249). Methods: A two-phase design was adopted. Phase one consisted of the development of the SpI-27 based on item generation from a concept analysis, a literature review, and an instrument appraisal. The second phase established the psychometric properties of the instrument and included: a qualitative descriptive design to establish content validity; a pilot study to evaluate the mode of administration; and a descriptive correlational design to assess the instrument’s reliability and validity. Data were analysed using SPSS (Version 18). Results: Exploratory factor analysis yielded a final five-factor solution with 27 items. These five factors were labelled: Connectedness with Others, Self-Transcendence, Self-Cognisance, Conservationism, and Connectedness with a Higher Power. Cronbach’s alpha coefficients ranged from 0.823 to 0.911 for the five factors, and 0.904 for the overall scale, indicating high internal consistency. Paired-sample t-tests, intra-class correlations, and weighted kappa values supported the temporal stability of the instrument over 2 weeks. A significant positive correlation was found between the SpI-27 and the Spirituality Index of Well-Being, providing evidence for convergent validity. Conclusion: This research addresses a call for a theoretically grounded instrument to measure spirituality.
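As a concrete illustration of the internal-consistency statistic reported above, here is a small, hedged Python sketch that computes Cronbach's alpha from an item-response matrix; the responses are simulated, and the SpI-27 items themselves are not used.

```python
# Cronbach's alpha for one subscale (rows = respondents, columns = items).
import numpy as np

def cronbach_alpha(items):
    """items: array of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

rng = np.random.default_rng(0)
latent = rng.normal(size=(249, 1))                          # one underlying factor
responses = latent + rng.normal(scale=0.7, size=(249, 6))   # six correlated items
print(round(cronbach_alpha(responses), 3))                   # high alpha for correlated items
```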

Relevance: 10.00%

Abstract:

The original solution to the high failure rate of software development projects was the imposition of an engineering approach to software development, with processes aimed at providing a repeatable structure to maintain a consistency in the ‘production process’. Despite these attempts at addressing the crisis in software development, others have argued that the rigid processes of an engineering approach did not provide the solution. The Agile approach to software development strives to change how software is developed. It does this primarily by relying on empowered teams of developers who are trusted to manage the necessary tasks, and who accept that change is a necessary part of a development project. The use of, and interest in, Agile methods in software development projects has expanded greatly, yet this has been predominantly practitioner driven. There is a paucity of scientific research on Agile methods and how they are adopted and managed. This study aims at addressing this paucity by examining the adoption of Agile through a theoretical lens. The lens used in this research is that of double loop learning theory. The behaviours required in an Agile team are the same behaviours required in double loop learning; therefore, a transition to double loop learning is required for a successful Agile adoption. The theory of triple loop learning highlights that power factors (or power mechanisms in this research) can inhibit the attainment of double loop learning. This study identifies the negative behaviours - potential power mechanisms - that can inhibit the double loop learning inherent in an Agile adoption, to determine how the Agile processes and behaviours can create these power mechanisms, and how these power mechanisms impact on double loop learning and the Agile adoption. This is a critical realist study, which acknowledges that the real world is a complex one, hierarchically structured into layers. An a priori framework is created to represent these layers, which are categorised as: the Agile context, the power mechanisms, and double loop learning. The aim of the framework is to explain how the Agile processes and behaviours, through the teams of developers and project managers, can ultimately impact on the double loop learning behaviours required in an Agile adoption. Four case studies provide further refinement to the framework, with changes required due to observations which were often different to what existing literature would have predicted. The study concludes by explaining how the teams of developers, the individual developers, and the project managers, working with the Agile processes and required behaviours, can inhibit the double loop learning required in an Agile adoption. A solution is then proposed to mitigate these negative impacts. Additionally, two new research processes are introduced to add to the Information Systems research toolkit.

Relevance: 10.00%

Abstract:

Recent evidence that echinoids of the genus Echinometra have moderate visual acuity that appears to be mediated by their spines screening off-axis light suggests that the urchin Strongylocentrotus purpuratus, with its higher spine density, may have even more acute spatial vision. We analyzed the movements of 39 specimens of S. purpuratus after they were placed in the center of a featureless tank containing a round, black target that had an angular diameter of 6.5 deg. or 10 deg. (solid angles of 0.01 sr and 0.024 sr, respectively). An average orientation vector for each urchin was determined by testing the animal four times, with the target placed successively at bearings of 0 deg., 90 deg., 180 deg. and 270 deg. (relative to magnetic east). The urchins showed no significant unimodal or axial orientation relative to any non-target feature of the environment or relative to the changing position of the 6.5 deg. target. However, the urchins were strongly axially oriented relative to the changing position of the 10 deg. target (mean axis from -1 to 179 deg.; 95% confidence interval ±12 deg.; P<0.001, Moore's non-parametric version of the Hotelling test), with 10 of the 20 urchins tested against that target choosing an average bearing within 10 deg. of either the target center or its opposite direction (two would be expected by chance). In addition, the average length of the 20 target-normalized bearings for the 10 deg. target (each the vector sum of the bearings for the four trials) was far higher than would be expected by chance (P<10⁻¹⁰; Monte Carlo simulation), showing that each urchin, whether it moved towards or away from the target, did so with high consistency. These results strongly suggest that S. purpuratus detected the 10 deg. target, responding either by approaching it or fleeing it. Given that the urchins did not appear to respond to the 6.5 deg. target, it is likely that the 10 deg. target was close to the minimum detectable size for this species. Interestingly, measurements of the spine density of the regions of the test that faced horizontally predicted a similar visual resolution (8.3±0.5 deg. for the interambulacrum and 11±0.54 deg. for the ambulacrum). The function of this relatively low, but functional, acuity - on par with that of the chambered Nautilus and the horseshoe crab - is unclear but, given the bimodal response, is likely to be related to both shelter seeking and predator avoidance.
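The consistency measure used here is circular: an animal's bearings are summed as unit vectors and the length of the resulting mean vector is compared with a Monte Carlo null distribution. The following Python sketch, with invented bearings, shows the shape of that calculation; it is not the study's analysis code.

```python
# Mean resultant length of a set of compass bearings, with a Monte Carlo null.
import numpy as np

def mean_resultant_length(bearings_deg):
    """1.0 for identical bearings, near 0 for random headings."""
    rad = np.deg2rad(bearings_deg)
    return np.hypot(np.cos(rad).mean(), np.sin(rad).mean())

observed = mean_resultant_length([172.0, 178.0, 181.0, 175.0])   # hypothetical four trials

rng = np.random.default_rng(1)
null = np.array([mean_resultant_length(rng.uniform(0, 360, size=4))
                 for _ in range(100_000)])                        # random-heading null
p_value = (null >= observed).mean()
print(f"R = {observed:.3f}, Monte Carlo P ~ {p_value:.5f}")
```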

Relevance: 10.00%

Abstract:

BACKGROUND: Writing plays a central role in the communication of scientific ideas and is therefore a key aspect of researcher education, ultimately determining the success and long-term sustainability of their careers. Despite the growing popularity of e-learning, we are not aware of any existing study comparing on-line vs. traditional classroom-based methods for teaching scientific writing. METHODS: Forty-eight participants from medical, nursing and physiotherapy backgrounds in the US and Brazil were randomly assigned to two groups (n = 24 per group): an on-line writing workshop group (on-line group), in which participants used virtual communication, Google Docs and standard writing templates, and a standard writing guidance group (standard group), in which participants received standard instruction without the aid of virtual communication and writing templates. Two outcomes were evaluated: manuscript quality, assessed as the primary outcome measure using scores on a six-subgroup quality scale (SSQS), and satisfaction, rated on a Likert scale. To control for observer variability, inter-observer reliability was assessed using Fleiss's kappa. A post-hoc analysis comparing rates of communication between mentors and participants was performed. Nonparametric tests were used to assess intervention efficacy. RESULTS: Excellent inter-observer reliability among three reviewers was found, with an Intraclass Correlation Coefficient (ICC) agreement = 0.931882 and ICC consistency = 0.932485. The on-line group had better overall manuscript quality (p = 0.0017; SSQSavg score 75.3 ± 14.21, range 37 to 94) than the standard group (47.27 ± 14.64, range 20 to 72). Participant satisfaction was higher in the on-line group (4.3 ± 0.73) than in the standard group (3.09 ± 1.11) (p = 0.001). The standard group also had fewer communication events than the on-line group (0.91 ± 0.81 vs. 2.05 ± 1.23; p = 0.0219). CONCLUSION: Our protocol for on-line scientific writing instruction was superior to standard face-to-face instruction in terms of writing quality and student satisfaction. Future studies should evaluate the protocol's efficacy in larger longitudinal cohorts involving participants from different language backgrounds.
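The two ICC flavours quoted above (absolute agreement vs. consistency) can be computed directly from the mean squares of a subjects-by-raters table. The sketch below uses synthetic scores from three hypothetical reviewers, not the study's data, and implements the average-measure forms ICC(C,k) and ICC(A,k).

```python
# ICC consistency (two-way mixed) and ICC absolute agreement (two-way random),
# average-measure forms, from a subjects x raters ratings matrix.
import numpy as np

def icc_consistency_agreement(ratings):
    """ratings: shape (n_subjects, k_raters); returns (ICC(C,k), ICC(A,k))."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((ratings - ratings.mean(axis=1, keepdims=True)
            - ratings.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    icc_c = (ms_rows - ms_err) / ms_rows                                  # consistency
    icc_a = (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)       # agreement
    return icc_c, icc_a

rng = np.random.default_rng(2)
true_quality = rng.normal(70, 15, size=24)                                # 24 manuscripts
scores = np.column_stack([true_quality + rng.normal(0, 5, 24) for _ in range(3)])
print(icc_consistency_agreement(scores))
```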

Relevance: 10.00%

Abstract:

While genome-wide gene expression data are generated at an increasing rate, the repertoire of approaches for pattern discovery in these data is still limited. Identifying subtle patterns of interest in large amounts of data (tens of thousands of profiles) associated with a certain level of noise remains a challenge. A microarray time series was recently generated to study the transcriptional program of the mouse segmentation clock, a biological oscillator associated with the periodic formation of the segments of the body axis. A method related to Fourier analysis, the Lomb-Scargle periodogram, was used to detect periodic profiles in the dataset, leading to the identification of a novel set of cyclic genes associated with the segmentation clock. Here, we applied four distinct mathematical methods to the same microarray time series dataset to identify significant patterns in gene expression profiles. These methods, called Phase consistency, Address reduction, Cyclohedron test and Stable persistence, are based on different conceptual frameworks that are either hypothesis- or data-driven. Some of the methods, unlike Fourier transforms, are not dependent on the assumption of periodicity of the pattern of interest. Remarkably, these methods blindly identified the expression profiles of known cyclic genes as the most significant patterns in the dataset. Many candidate genes predicted by more than one approach appeared to be true positive cyclic genes and will be of particular interest for future research. In addition, these methods predicted novel candidate cyclic genes that were consistent with previous biological knowledge and experimental validation in mouse embryos. Our results demonstrate the utility of these novel pattern detection strategies, notably for the detection of periodic profiles, and suggest that combining several distinct mathematical approaches to analyze microarray datasets is a valuable strategy for identifying genes that exhibit novel, interesting transcriptional patterns.
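For readers unfamiliar with the Lomb-Scargle periodogram mentioned above, the short Python sketch below shows the basic SciPy call on a simulated, unevenly sampled expression profile; it illustrates the method only and is not the study's pipeline.

```python
# Lomb-Scargle periodogram of a simulated, unevenly sampled oscillating profile.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 300, size=40))        # irregular sampling times (minutes)
period = 120.0                                   # a 2-hour oscillation
expression = np.sin(2 * np.pi * t / period) + 0.3 * rng.normal(size=t.size)

periods = np.linspace(30, 240, 500)
ang_freqs = 2 * np.pi / periods                  # lombscargle expects angular frequencies
power = lombscargle(t, expression - expression.mean(), ang_freqs)
print(f"best period ~ {periods[np.argmax(power)]:.0f} min")
```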

Relevance: 10.00%

Abstract:

People often do not realize they are being influenced by an incidental emotional state. As a result, decisions based on a fleeting incidental emotion can become the basis for future decisions and hence outlive the original cause for the behavior (i.e., the emotion itself). Using a sequence of ultimatum and dictator games, we provide empirical evidence for the enduring impact of transient emotions on economic decision making. Behavioral consistency and false consensus are presented as potential underlying processes. © 2009 Elsevier Inc. All rights reserved.

Relevance: 10.00%

Abstract:

Religious congruence refers to consistency among an individual’s religious beliefs and attitudes, consistency between religious ideas and behavior, and religious ideas, identities, or schemas that are chronically salient and accessible to individuals across contexts and situations. Decades of anthropological, sociological, and psychological research establish that religious congruence is rare, but much thinking about religion presumes that it is common. The religious congruence fallacy occurs when interpretations or explanations unjustifiably presume religious congruence. I illustrate the ubiquity of religious incongruence, show how the religious congruence fallacy distorts thinking about religion, and outline an approach to help overcome the fallacy.

Relevance: 10.00%

Abstract:

Gliomagenesis is driven by a complex network of genetic alterations, and while the glioma genome has been a focus of investigation for many years, critical gaps in our knowledge of this disease remain. The identification of novel molecular biomarkers remains a focus of the greater cancer community as a method to improve the consistency and accuracy of pathological diagnosis. In addition, novel molecular biomarkers are urgently needed for the identification of targets that may ultimately result in novel therapeutics aimed at improving glioma treatment. Through the identification of new biomarkers, laboratories will focus future studies on the molecular mechanisms that underlie glioma development. Here, we report a series of genomic analyses identifying novel molecular biomarkers in multiple histopathological subtypes of glioma and refine the classification of malignant gliomas. We have completed a large-scale analysis of the WHO grade II-III astrocytoma exome and report frequent mutations in the chromatin modifier alpha thalassemia/mental retardation X-linked (ATRX), isocitrate dehydrogenase 1 and 2 (IDH1 and IDH2), and tumor protein p53 (TP53) as the most frequent genetic mutations in low-grade astrocytomas. Furthermore, by analyzing the status of recurrently mutated genes in 363 brain tumors, we establish that highly recurrent gene mutational signatures are an effective tool for stratifying patients into homogeneous groups with distinct outcomes, and are thereby capable of predicting prognosis. Next, we establish mutations in the promoter of telomerase reverse transcriptase (TERT) as a frequent genetic event in gliomas and in tissues with low rates of self-renewal. We identify the TERT promoter as the most frequently mutated locus in primary glioblastoma. Additionally, we show that TERT promoter mutations in combination with IDH1 and IDH2 mutations are able to delineate distinct clinical tumor cohorts and are capable of predicting median overall survival more effectively than standard histopathological diagnosis alone. Taken together, these data advance our understanding of the genetic alterations that underlie the transformation of glial cells into neoplasms, and we provide novel genetic biomarkers and multi-gene mutational signatures that can be utilized to refine the classification of malignant gliomas and provide opportunities for improved diagnosis.

Relevance: 10.00%

Abstract:

A total of 30 undergraduates recalled the same 20 autobiographical memories at two sessions separated by 2 weeks. At each session they dated their memories and rated them on 18 properties commonly studied in autobiographical memory experiments. Individuals showed moderate stability in their ratings on the 18 scales (r ≈ .5), with consistency of dating being much higher (r = .96). There was more stability in the individuals' average rating on each scale (r ≈ .8), even when the averages were calculated on different memories in the different sessions. The results are consistent with a constructive view of autobiographical memory, in which stable individual differences in cognitive style are important.
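A toy simulation makes the aggregate-stability point concrete: ratings of individual memories correlate only moderately across sessions, while each person's average rating is much more stable. The Python sketch below uses invented parameters, not the study's data.

```python
# Item-level vs. person-average test-retest stability in a simple simulation.
import numpy as np

rng = np.random.default_rng(5)
n_people, n_memories = 30, 20
trait = rng.normal(size=(n_people, 1))                   # stable individual rating style

def rate_session():
    return trait + rng.normal(size=(n_people, n_memories))  # memory-level noise each session

s1, s2 = rate_session(), rate_session()
item_r = np.corrcoef(s1.ravel(), s2.ravel())[0, 1]            # single-rating stability
mean_r = np.corrcoef(s1.mean(axis=1), s2.mean(axis=1))[0, 1]  # person-average stability
print(f"item-level r ~ {item_r:.2f}, person-average r ~ {mean_r:.2f}")
```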

Relevance: 10.00%

Abstract:

BACKGROUND: The wealth of phenotypic descriptions documented in the published articles, monographs, and dissertations of phylogenetic systematics is traditionally reported in a free-text format, and it is therefore largely inaccessible for linkage to biological databases for genetics, development, and phenotypes, and difficult to manage for large-scale integrative work. The Phenoscape project aims to represent these complex and detailed descriptions with rich and formal semantics that are amenable to computation and integration with phenotype data from other fields of biology. This entails reconceptualizing the traditional free-text characters into the computable Entity-Quality (EQ) formalism using ontologies. METHODOLOGY/PRINCIPAL FINDINGS: We used ontologies and the EQ formalism to curate a collection of 47 phylogenetic studies on ostariophysan fishes (including catfishes, characins, minnows, knifefishes) and their relatives with the goal of integrating these complex phenotype descriptions with information from an existing model organism database (zebrafish, http://zfin.org). We developed a curation workflow for the collection of character, taxonomic and specimen data from these publications. A total of 4,617 phenotypic characters (10,512 states) for 3,449 taxa, primarily species, were curated into EQ formalism (for a total of 12,861 EQ statements) using anatomical and taxonomic terms from teleost-specific ontologies (Teleost Anatomy Ontology and Teleost Taxonomy Ontology) in combination with terms from a quality ontology (Phenotype and Trait Ontology). Standards and guidelines for consistently and accurately representing phenotypes were developed in response to the challenges that were evident from two annotation experiments and from feedback from curators. CONCLUSIONS/SIGNIFICANCE: The challenges we encountered and many of the curation standards and methods for improving consistency that we developed are generally applicable to any effort to represent phenotypes using ontologies. This is because an ontological representation of the detailed variations in phenotype, whether between mutant and wildtype, among individual humans, or across the diversity of species, requires a process by which a precise combination of terms from domain ontologies is selected and organized according to logical relations. The efficiencies that we have developed in this process will be useful for any attempt to annotate complex phenotypic descriptions using ontologies. We also discuss some ramifications of EQ representation for the domain of systematics.
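To illustrate the EQ formalism described above, here is a tiny, purely illustrative Python data structure pairing an entity term with a quality term for one character state; the ontology IDs are placeholders rather than real TAO/PATO/TTO identifiers.

```python
# One free-text character state re-expressed as an Entity-Quality annotation.
from dataclasses import dataclass

@dataclass
class EQStatement:
    taxon: str      # taxon the state was scored for
    entity: str     # anatomical entity term (e.g. from a teleost anatomy ontology)
    quality: str    # quality term (e.g. from a phenotype/trait ontology)
    source: str     # publication and character/state number

# free text: "Character 12, state 1: anterior dorsal-fin spine serrated"
stmt = EQStatement(
    taxon="TTO:0000000 (placeholder species ID)",
    entity="TAO:0000000 (placeholder 'dorsal-fin spine')",
    quality="PATO:0000000 (placeholder 'serrated')",
    source="Example et al., character 12, state 1",
)
print(stmt)
```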

Relevance: 10.00%

Abstract:

As a psychological principle, the golden rule represents an ethic of universal empathic concern. It is, surprisingly, present in the sacred texts of virtually all religions, and in philosophical works across eras and continents. Building on the literature demonstrating a positive impact of prosocial behavior on well-being, the present study investigates the psychological function of universal empathic concern in Indian Hindus, Christians, Muslims and Sikhs.

I develop a measure of the centrality of the golden rule-based ethic, within an individual’s understanding of his or her religion, that is applicable to all theistic religions. I then explore the consistency of its relationships with psychological well-being and other variables across religious groups.

Results indicate that this construct, named Moral Concern Religious Focus, can be reliably measured in disparate religious groups, and consistently predicts well-being across them. With measures of Intrinsic, Extrinsic and Quest religious orientations in the model, only Moral Concern and religiosity predict well-being. Moral Concern alone mediates the relationship between religiosity and well-being, and explains more variance in well-being than religiosity alone. The relationship between Moral Concern and well-being is mediated by increased preference for prosocial values, more satisfying interpersonal relationships, and greater meaning in life. In addition, across religious groups Moral Concern is associated with better self-reported physical and mental health, and more compassionate attitudes toward oneself and others.

Two additional types of religious focus are identified: Personal Gain, representing the motive to use religion to improve one’s life, and Relationship with God. Personal Gain is found to predict reduced preference for prosocial values, less meaning in life, and lower quality of relationships. It is associated with greater interference of pain and physical or mental health problems with daily activities, and lower self-compassion. Relationship with God is found to be associated primarily with religious variables and greater meaning in life.

I conclude that individual differences in the centrality of the golden rule and its associated ethic of universal empathic concern may play an important role in explaining the variability in associations between religion, prosocial behavior and well-being noted in the literature.

Relevance: 10.00%

Abstract:

BACKGROUND: Anticoagulation can reduce quality of life, and different models of anticoagulation management might have different impacts on satisfaction with this component of medical care. Yet, to our knowledge, there are no scales measuring quality of life and satisfaction with anticoagulation that can be generalized across different models of anticoagulation management. We describe the development and preliminary validation of such an instrument - the Duke Anticoagulation Satisfaction Scale (DASS). METHODS: The DASS is a 25-item scale addressing the (a) negative impacts of anticoagulation (limitations, hassles and burdens); and (b) positive impacts of anticoagulation (confidence, reassurance, satisfaction). Each item has 7 possible responses. The DASS was administered to 262 patients currently receiving oral anticoagulation. Scales measuring generic quality of life, satisfaction with medical care, and tendency to provide socially desirable responses were also administered. Statistical analysis included assessment of item variability, internal consistency (Cronbach's alpha), scale structure (factor analysis), and correlations between the DASS and demographic variables, clinical characteristics, and scores on the above scales. A follow-up study of 105 additional patients assessed test-retest reliability. RESULTS: 220 subjects answered all items. Ceiling and floor effects were modest, and 25 of the 27 proposed items grouped into 2 factors (positive impacts, negative impacts, this latter factor being potentially subdivided into limitations versus hassles and burdens). Each factor had a high degree of internal consistency (Cronbach's alpha 0.78-0.91). The limitations and hassles factors consistently correlated with the SF-36 scales measuring generic quality of life, while the positive psychological impact scale correlated with age and time on anticoagulation. The intra-class correlation coefficient for test-retest reliability was 0.80. CONCLUSIONS: The DASS has demonstrated reasonable psychometric properties to date. Further validation is ongoing. To the degree that dissatisfaction with anticoagulation leads to decreased adherence, poorer INR control, and poor clinical outcomes, the DASS has the potential to help identify reasons for dissatisfaction (and positive satisfaction), and thus help to develop interventions to break this cycle. As an instrument designed to be applicable across multiple models of anticoagulation management, the DASS could be crucial in the scientific comparison between those models of care.
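As an illustration of the factor-structure check described above, the sketch below runs a rotated factor analysis on simulated 25-item responses built from two latent factors. It uses scikit-learn's FactorAnalysis, whose rotation option is available in recent versions; it is a generic example, not the DASS analysis itself.

```python
# Exploratory check of a two-factor structure on simulated item responses.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(6)
n, n_items = 262, 25
factors = rng.normal(size=(n, 2))          # e.g. "negative impacts", "positive impacts"
loadings = np.zeros((2, n_items))
loadings[0, :15] = 0.8                     # first 15 items load on factor 1
loadings[1, 15:] = 0.8                     # remaining items load on factor 2
responses = factors @ loadings + rng.normal(scale=0.5, size=(n, n_items))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(responses)
print(np.round(fa.components_.T[:5], 2))   # loadings of the first five items
```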

Relevance: 10.00%

Abstract:

In most diffusion tensor imaging (DTI) studies, images are acquired with either a partial-Fourier or a parallel partial-Fourier echo-planar imaging (EPI) sequence, in order to shorten the echo time and increase the signal-to-noise ratio (SNR). However, eddy currents induced by the diffusion-sensitizing gradients can often lead to a shift of the echo in k-space, resulting in three distinct types of artifacts in partial-Fourier DTI. Here, we present an improved DTI acquisition and reconstruction scheme, capable of generating high-quality and high-SNR DTI data without eddy current-induced artifacts. This new scheme consists of three components, respectively addressing the three distinct types of artifacts. First, a k-space energy-anchored DTI sequence is designed to recover eddy current-induced signal loss (i.e., Type 1 artifact). Second, a multischeme partial-Fourier reconstruction is used to eliminate artificial signal elevation (i.e., Type 2 artifact) associated with the conventional partial-Fourier reconstruction. Third, a signal intensity correction is applied to remove artificial signal modulations due to eddy current-induced erroneous T2*-weighting (i.e., Type 3 artifact). These systematic improvements will greatly increase the consistency and accuracy of DTI measurements, expanding the utility of DTI in translational applications where quantitative robustness is much needed.

Relevance: 10.00%

Abstract:

Posttraumatic Stress Disorder is a diagnosis related to the past. Pre-traumatic stress reactions, as measured by intrusive involuntary images of possible future stressful events and their associated avoidance and increased arousal, have been overlooked in the PTSD literature. Here we introduce a scale that measures pre-traumatic stress reactions providing a clear future-oriented parallel to the posttraumatic stress reactions described in the diagnostic criteria for PTSD. We apply this pre-traumatic stress reactions checklist (PreCL) to Danish soldiers before, during, and after deployment to Afghanistan. The PreCL has good internal consistency and is highly correlated with a standard measure of PTSD symptoms. The PreCL as answered before the soldiers' deployment significantly predicted level of PTSD symptoms during and after their deployment, while controlling for baseline PTSD symptoms and combat exposure measured during and after deployment. The findings have implications for the conceptualization of PTSD, screening, and treatment.
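The prediction analysis described above amounts to a regression of later PTSD symptoms on the pre-deployment PreCL score while controlling for baseline symptoms and combat exposure. The statsmodels sketch below uses simulated data and hypothetical variable names purely to show the form of such a model.

```python
# Predict post-deployment symptoms from a pre-deployment score, controlling for covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 200
df = pd.DataFrame({
    "precl_pre": rng.normal(size=n),        # pre-deployment PreCL (simulated)
    "pcl_baseline": rng.normal(size=n),     # baseline PTSD symptoms (simulated)
    "combat_exposure": rng.normal(size=n),  # combat exposure (simulated)
})
# simulate a post-deployment outcome partly driven by the pre-deployment score
df["pcl_post"] = (0.4 * df.precl_pre + 0.5 * df.pcl_baseline
                  + 0.3 * df.combat_exposure + rng.normal(scale=0.8, size=n))

model = smf.ols("pcl_post ~ precl_pre + pcl_baseline + combat_exposure", data=df).fit()
print(model.params.round(2))
```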