865 results for Filmic approach methods


Relevance: 30.00%

Abstract:

This paper describes observational research and verbal protocol methods, how these methods are applied and integrated within different contexts, and how they complement each other. The first case study focuses on nurses' interaction during bandaging of patients' lower legs. To maintain research rigor, a triangulation approach was applied that links observations of current procedures, a 'talk-aloud' protocol during interaction, and a retrospective protocol. Maps of interactions demonstrated that some nurses bandage more intuitively than others: nurses who bandage intuitively assemble long sequences of bandaging actions, while nurses who bandage less intuitively 'focus-shift' between bandaging actions. Thus different levels of expertise were identified. The second case study consists of two laboratory experiments. It focuses on analysing and comparing software and product design teams and how they approached a design problem, and is based on observational and verbal data analysis. The coding scheme evolved during the analysis of each team's activity and is identical for all teams. The structure of knowledge captured from the analysis of the design teams' maps of interaction is identified. The significance of this work lies in its methodological approach. The maps of interaction are instrumental for understanding the activities and interactions of the people observed: by examining them, it is possible to draw conclusions about interactions, the structure of the knowledge captured, and levels of expertise. This research approach is transferable to other design domains, and designers will be able to transfer the interaction-map outcomes to the systems and services they design.

Relevance: 30.00%

Abstract:

Background: To derive preference-based measures from various condition-specific descriptive health-related quality of life (HRQOL) measures, a general two-stage method has evolved: 1) an item from each domain of the HRQOL measure is selected to form a health state classification system (HSCS); 2) a sample of health states is valued and an algorithm derived for estimating the utility of all possible health states. The aim of this analysis was to determine whether confirmatory or exploratory factor analysis (CFA, EFA) should be used to derive a cancer-specific utility measure from the EORTC QLQ-C30. Methods: Data were collected with the QLQ-C30v3 from 356 patients receiving palliative radiotherapy for recurrent or metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter based on a conceptual model (the established domain structure of the QLQ-C30: physical, role, emotional, social and cognitive functioning, plus several symptoms) and clinical considerations (views of both patients and clinicians about issues relevant to HRQOL in cancer). The dimensions determined by each method were then subjected to item response theory analysis, including Rasch analysis. Results: CFA results generally supported the proposed conceptual model, with residual correlations requiring only minor adjustments (namely, the introduction of two cross-loadings) to improve model fit (incremental χ²(2) = 77.78, p < .001). Although EFA revealed a structure similar to the CFA, some items had loadings that were difficult to interpret. Further assessment of dimensionality with Rasch analysis aligned the EFA dimensions more closely with the CFA dimensions. Three items exhibited floor effects (>75% of observations at the lowest score), six exhibited misfit to the Rasch model (fit residual > 2.5), none exhibited disordered item response thresholds, and four exhibited differential item functioning (DIF) by gender or cancer site. Upon inspection of the remaining items, three were considered relatively less clinically important than the other nine. Conclusions: CFA appears more appropriate than EFA, given the well-established structure of the QLQ-C30 and its clinical relevance. Further, the confirmatory approach produced more interpretable results than the exploratory approach. Other aspects of the general method remain largely the same. The revised method will be applied to a large number of datasets as part of the international and interdisciplinary project to develop a multi-attribute utility instrument for cancer (MAUCa).
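As a rough illustration of the exploratory step described in this abstract, the sketch below runs a five-factor EFA-style extraction with scikit-learn. The data, the five-factor choice and all names are placeholders for illustration, not the authors' analysis (which also involved CFA and Rasch modelling).

```python
# Minimal sketch of exploratory factor extraction on questionnaire items,
# assuming `responses` is an (n_patients x n_items) matrix of item scores.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
responses = rng.integers(1, 5, size=(356, 30)).astype(float)  # placeholder data

# Five factors loosely mirror the QLQ-C30 functioning domains;
# a varimax rotation aids interpretation of the loadings.
efa = FactorAnalysis(n_components=5, rotation="varimax")
efa.fit(responses)

loadings = efa.components_.T  # (n_items x n_factors) loading matrix
for i, row in enumerate(loadings):
    print(f"item {i:2d}: dominant factor {np.argmax(np.abs(row))}")
```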

Relevance: 30.00%

Abstract:

Exponential growth of genomic data over the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated and moved toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through transcriptional regulatory network (TRN) structures, which model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics and, used in conjunction with comparative genomics, have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can reduce the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In a preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship observed between promoter strength and TFs grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggests support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results strongly encourage further investigation as more experimentally confirmed data become available.
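A minimal sketch of the kind of one-sided comparison reported above, using made-up promoter strength scores; it illustrates only the statistical test, not the study's data or pipeline.

```python
# Hypothetical one-sided Welch t-test: are promoters with activator sites
# weaker (lower strength score) than promoters with repressor sites?
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
activator_promoters = rng.normal(0.45, 0.1, 40)  # placeholder strength scores
repressor_promoters = rng.normal(0.50, 0.1, 40)

t, p = stats.ttest_ind(activator_promoters, repressor_promoters,
                       equal_var=False, alternative="less")
print(f"t = {t:.3f}, one-sided p = {p:.3f}")  # compare against alpha = 0.1
```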
Much of the remainder of the thesis concerns a machine learning study of binding site prediction using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains, which revealed interesting strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees are constructed from the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on inferring the regulatory network for each target genome from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the 'core regulatory set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
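To make the spectrum-kernel approach concrete, here is a small, self-contained sketch under toy assumptions: each sequence is mapped to a k-mer count vector, and the kernel is the inner product of those vectors, fed to an SVM as a precomputed Gram matrix. The sequences and labels are invented; the thesis's actual feature attributes and data are not reproduced here.

```python
# Sketch of a k-spectrum kernel SVM for short DNA sites (illustrative only).
from itertools import product
import numpy as np
from sklearn.svm import SVC

K = 3
KMERS = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=K))}

def spectrum(seq: str) -> np.ndarray:
    """Count vector over all K-mers occurring in the sequence."""
    v = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        v[KMERS[seq[i:i + K]]] += 1
    return v

seqs = ["TTGACATTTT", "TTGACGTTAT", "ACGTACGTAC", "GGGCCCGGGC"]  # toy examples
y = np.array([1, 1, 0, 0])  # 1 = putative binding site, 0 = background

X = np.array([spectrum(s) for s in seqs])
gram = X @ X.T                      # spectrum kernel = dot product of count vectors
clf = SVC(kernel="precomputed").fit(gram, y)
print(clf.predict(X @ X.T))         # kernel between test and training sequences
```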

Relevance: 30.00%

Abstract:

Mixed methods research is the use of qualitative and quantitative methods in the same study to gain a more rounded and holistic understanding of the phenomena under investigation. This type of research approach is gaining popularity in the nursing literature as a way to understand the complexity of nursing care and as a means to enhance evidence-based practice. This paper introduces nephrology nurses to mixed methods research, its terminology, and its application to nephrology nursing. Five common mixed methods designs are described, highlighting the purposes, strengths and weaknesses of each design. Examples of mixed methods research are given to illustrate its wide application to nursing and its usefulness in nephrology nursing research.

Relevance: 30.00%

Abstract:

This paper is concerned with applying a particle-based approach to simulate the micro-level cellular structural changes of plant cells during drying. The objective of the investigation was to relate micro-level structural properties such as cell area, diameter and perimeter to the change in the moisture content of the cell. The model assumes a simplified cell consisting of two basic components: the cell wall and the cell fluid. The cell fluid is assumed to be a Newtonian fluid with a higher viscosity than water, and the cell wall is assumed to be a visco-elastic solid boundary located around the cell fluid. The cell fluid is modelled with the Smoothed Particle Hydrodynamics (SPH) technique, and the cell wall with a Discrete Element Method (DEM). The developed model is two-dimensional but accounts for three-dimensional physical properties of real plant cells. Drying is simulated as a reduction of fluid mass, and the model is used to predict the above-mentioned structural properties as a function of cell fluid mass. Model predictions are found to be in fairly good agreement with experimental data in the literature, and the particle-based approach is demonstrated to be suitable for numerical studies of drying-related structural deformations. A sensitivity analysis is also included to demonstrate the influence of key model parameters on model predictions.
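As a hedged illustration of the SPH component only, the snippet below computes particle densities with a standard 2D cubic-spline kernel; particle masses, positions and smoothing length are placeholders rather than the paper's calibration, and the DEM wall model is omitted.

```python
# Illustrative 2D SPH density summation with a cubic-spline kernel.
import numpy as np

def cubic_spline_w(r: np.ndarray, h: float) -> np.ndarray:
    """Standard 2D cubic-spline kernel with normalisation 10 / (7 pi h^2)."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h * h)
    w = np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2 - q)**3, 0.0))
    return sigma * w

rng = np.random.default_rng(2)
pos = rng.uniform(0, 1e-4, size=(200, 2))   # fluid particle positions (m)
mass = np.full(200, 1e-12)                  # per-particle mass (kg), placeholder
h = 1.2e-5                                  # smoothing length (m), placeholder

# Density of each particle: rho_i = sum_j m_j * W(|r_i - r_j|, h)
r_ij = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
rho = (mass[None, :] * cubic_spline_w(r_ij, h)).sum(axis=1)
print(f"mean (2D) density: {rho.mean():.3e} kg/m^2")
```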

Relevance: 30.00%

Abstract:

We compare the consistency of choices in two methods used to elicit risk preferences, at both the aggregate and the individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while the results are (roughly) consistent at the aggregate (subject pool) level, behavior is far from consistent at the individual (within-subject) level. Within each method, as well as across methods, we observe low correlations. This again questions the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.
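A minimal sketch of the within-subject consistency check described above, using invented switch-point data; it shows only the correlation computation, not the experimental design.

```python
# Spearman correlation between risk measures elicited twice per subject
# (e.g. Holt-Laury switch rows over nine decisions), with made-up data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
first_elicitation = rng.integers(1, 10, size=100)   # switch row, decisions 1..9
second_elicitation = rng.integers(1, 10, size=100)

rho, p = spearmanr(first_elicitation, second_elicitation)
print(f"within-subject Spearman rho = {rho:.2f} (p = {p:.2f})")
```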

Relevance: 30.00%

Abstract:

Stormwater is a potential and readily available alternative source of potable water in urban areas. However, its direct use is severely constrained by the presence of toxic pollutants such as heavy metals (HMs). The presence of HMs in stormwater is of concern because of their chronic toxicity and persistent nature. In addition to human health impacts, metals can contribute to adverse ecosystem health impacts in receiving waters. Therefore, the ability to predict the levels of HMs in stormwater is crucial for monitoring stormwater quality and for the design of effective treatment systems. Unfortunately, the current laboratory methods for determining HM concentrations are resource intensive and time consuming. In this paper, applications of multivariate data analysis techniques are presented to identify potential surrogate parameters which can be used to determine HM concentrations in stormwater. Accordingly, partial least squares was applied to identify a suite of physicochemical parameters which can serve as indicators of HMs. Datasets with varied characteristics, such as land use and particle size distribution of solids, were analyzed to validate the efficacy of the influencing parameters. Iron, manganese, total organic carbon, and inorganic carbon were identified as the predominant parameters that correlate with HM concentrations. The practical extension of the study outcomes to urban stormwater management is also discussed.
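The sketch below illustrates the flavour of the partial least squares step with scikit-learn; the variable names, dimensions and data are assumptions for illustration, not the study's dataset.

```python
# PLS regression: predict heavy metal concentrations from cheap-to-measure
# physicochemical surrogates (Fe, Mn, TOC, inorganic carbon), illustrative data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
X = rng.random((60, 4))          # columns: Fe, Mn, TOC, IC (surrogates)
Y = X @ rng.random((4, 3)) + 0.05 * rng.random((60, 3))  # e.g. Zn, Cu, Pb

pls = PLSRegression(n_components=2)
pls.fit(X, Y)
print("R^2 on training data:", round(pls.score(X, Y), 3))
```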

Relevance: 30.00%

Abstract:

Aim: The purpose of this research is to examine School Based Youth Health Nurses' experience of a true health promotion approach. Background: The School Based Youth Health Nurse Program is a state-wide school nursing initiative in Queensland, Australia. The program employs more than 120 full-time and fractional school nurses who provide health services in state high schools. The role incorporates two primary components: individual health consultations and health promotion strategies. Design/Methods: This study is a retrospective inquiry generated from a larger qualitative research project about the experience of school based youth health nursing. The original methodology was phenomenography. In-depth interviews were conducted with sixteen school nurses recruited through purposeful and snowball sampling. This study accesses a specific set of raw data about School Based Youth Health Nurses' experience of a true health promotion approach. The Ottawa Charter for Health Promotion (1986) is used as a framework for deductive analysis. Results: The findings indicate school nurses have neither an adverse nor an affirmative conceptual experience of a true health promotion approach, and an adverse operational experience of a true health promotion approach based on the action areas of the Ottawa Charter. Conclusions: The findings of this research are important because they challenge the notion that school nurses are the most appropriate health professionals to undertake a true health promotion approach. If school nurses are the most appropriate health professionals to do a true health promotion approach, there are implications for recruitment, training and qualifications. If they are not, who are the most appropriate health professionals to do school health promotion? Implications for Practice: These findings can be applied to other models of school nursing in Australia which emphasise a true health promotion approach, because they relate specifically to school nurses' experience of such an approach.

Relevance: 30.00%

Abstract:

This paper proposes the use of Bayesian approaches with the cross likelihood ratio (CLR) as a criterion for speaker clustering within a speaker diarization system, using eigenvoice modeling techniques. The CLR has previously been shown to be an effective decision criterion for speaker clustering using Gaussian mixture models. Recently, eigenvoice modeling has become an increasingly popular technique, due to its ability to adequately represent a speaker based on sparse training data, as well as to better capture differences in speaker characteristics. The integration of eigenvoice modeling into the CLR framework, to capitalize on the advantages of both techniques, has also been shown to be beneficial for the speaker clustering task. Building on that success, this paper proposes the use of Bayesian methods to compute the conditional probabilities used in computing the CLR, thus effectively combining the eigenvoice-CLR framework with the advantages of a Bayesian approach to the diarization problem. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show improved clustering performance, with a 33.5% relative improvement in overall Diarization Error Rate (DER) compared to the baseline system.
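For intuition, here is a toy sketch of a CLR-style comparison in which simple Gaussian mixture models stand in for the paper's eigenvoice models and a third mixture stands in for a universal background model; the data, model sizes and merge threshold are all invented.

```python
# Illustrative cross likelihood ratio (CLR) between two speaker clusters,
# using plain GMMs in place of the paper's eigenvoice models.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
feats_a = rng.normal(0.0, 1.0, size=(300, 12))    # MFCC-like features, cluster a
feats_b = rng.normal(0.3, 1.0, size=(300, 12))    # cluster b
ubm_feats = rng.normal(0.15, 1.2, size=(600, 12)) # stand-in background data

gmm_a = GaussianMixture(4, random_state=0).fit(feats_a)
gmm_b = GaussianMixture(4, random_state=0).fit(feats_b)
ubm = GaussianMixture(4, random_state=0).fit(ubm_feats)

# CLR: how much better each cluster is explained by the other cluster's
# model than by the background model (average log-likelihood per frame).
clr = ((gmm_b.score(feats_a) - ubm.score(feats_a)) +
       (gmm_a.score(feats_b) - ubm.score(feats_b)))
print(f"CLR = {clr:.3f}  (merge clusters if above a tuned threshold)")
```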

Relevance: 30.00%

Abstract:

Vietnam has a unique culture, which is revealed in the way people have built and designed their traditional housing. Vietnamese dwellings reflect occupants' activities in their everyday lives while adapting to tropical climatic conditions shaped by seasonal monsoons. These characteristics of Vietnamese dwellings are said to have remained unchanged until the economic reform of 1986, when Vietnam experienced accelerated development based on a market-oriented economy. New housing types, including modern shop-houses, detached houses, and apartments, have been designed in many places, particularly to satisfy dwellers' new lifestyles in Vietnamese cities. Contemporary housing, mostly designed by architects, has reflected rules of spatial organisation that support occupants' social activities. However, contemporary housing spaces seem unsustainable in relation to socio-cultural values because they have been influenced by globalism, which advocates the use of homogeneous spatial patterns, modern technologies, materials and construction methods. This study investigates the rules of spaces in Vietnamese houses built before and after the reform to define the socio-cultural implications for Vietnamese housing design. Firstly, it describes occupants' views of their current dwellings in terms of indoor comfort conditions and social activities in spaces. Then, it examines the use of spaces in pre-reform Vietnamese housing through occupants' activities and material applications. Finally, it discusses the organisation of spaces in both pre- and post-reform housing to understand how Vietnamese housing has been designed for occupants to live, act, work, and conduct traditional activities. Understanding spatial organisation is a way to identify characteristics of the occupants' lived spaces created from the conceived space designed by designers. The characteristics of the housing spaces will inform designers of ways to design future Vietnamese housing in response to cultural contexts. The study applied an abductive approach to the investigation of housing spaces. It used a conceptual framework drawing on Henri Lefebvre's (1991) theory to understand space as the main factor constituting the language of design, and the principles of semiotics to examine spatial structure in housing as a language used in everyday life. The study involved a door-knocking survey of 350 households in four regional cities of Vietnam, analysed statistically to interpret occupancy conditions and occupants' levels of comfort, together with a process of data selection and collection covering fourteen cases of housing in the three main climatic regions of the country for analysis of spatial organisation and housing characteristics. The study found that there has been a shift in the relationship of spaces from pre- to post-reform Vietnamese housing. It also identified that the space for guest welcoming and family activity has been the central space of Vietnamese housing. Based on the relationships of the central space with the others, theoretical models were proposed for three types of contemporary Vietnamese housing. The models will be significant in adapting housing design to Vietnamese conditions and achieving socio-environmental characteristics, because they were developed from the occupants' requirements for their social activities.
Another contribution of the study is the use of methodological concepts to understand the language of living spaces. Further work will be needed to test future Vietnamese housing designs developed from applications of the models.

Relevance: 30.00%

Abstract:

This paper presents an efficient method using the system state sampling technique in Monte Carlo simulation for reliability evaluation of multi-area power systems at Hierarchical Level One (HLI). System state sampling is one of the common methods used in Monte Carlo simulation, but its CPU time and memory requirements can be a problem. A combination of analytical and Monte Carlo methods, known as the hybrid method and presented in this paper, can enhance the efficiency of the solution. The load model can be incorporated either by sampling or by enumeration; both cases are examined in this paper by applying the methods to the Roy Billinton Test System (RBTS).
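A minimal sketch of the state sampling idea at HLI, under invented unit data and a fixed load level rather than the RBTS parameters or the paper's hybrid scheme:

```python
# State-sampling Monte Carlo sketch for generation adequacy (HLI flavour):
# sample unit up/down states, compare available capacity with the load.
import numpy as np

rng = np.random.default_rng(6)
capacity = np.array([40, 40, 20, 20, 10, 10], dtype=float)  # MW per unit
forced_outage_rate = np.array([0.03, 0.03, 0.025, 0.025, 0.02, 0.02])
load = 110.0  # MW, single fixed load level for simplicity

n_samples = 200_000
up = rng.random((n_samples, capacity.size)) >= forced_outage_rate  # True = in service
available = (up * capacity).sum(axis=1)

lolp = np.mean(available < load)                            # loss of load probability
eens = np.mean(np.clip(load - available, 0, None)) * 8760   # MWh/yr, rough estimate
print(f"LOLP = {lolp:.5f}, EENS ~ {eens:.1f} MWh/yr")
```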

Relevance: 30.00%

Abstract:

In this study, natural convection heat transfer and buoyancy-driven flows have been investigated in a right-angled triangular enclosure. The heater is located on the bottom wall, the inclined wall is cooler, and the remaining walls are maintained as adiabatic. The governing equations of natural convection are solved through a finite volume approach, in which buoyancy is modeled via the Boussinesq approximation. The effects of different parameters such as Rayleigh number, aspect ratio, Prandtl number and heater location are considered. Results show that heat transfer increases when the heater is moved toward the right corner of the enclosure. It is also revealed that increasing the Rayleigh number strengthens the free convection regime and consequently increases the heat transfer rate. Moreover, enclosures with larger aspect ratios have larger Nusselt numbers. For better insight, streamlines and isotherms are shown.
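For orientation, the snippet below evaluates the Rayleigh and Prandtl numbers that govern the regime described above; the property values are placeholders for an air-like fluid, not the paper's cases.

```python
# Rayleigh number for buoyancy-driven convection in an enclosure,
# with illustrative (air-like) property values.
g = 9.81          # gravity (m/s^2)
beta = 3.4e-3     # thermal expansion coefficient (1/K)
dT = 10.0         # heater-to-cold-wall temperature difference (K)
L = 0.1           # characteristic enclosure length (m)
nu = 1.6e-5       # kinematic viscosity (m^2/s)
alpha = 2.2e-5    # thermal diffusivity (m^2/s)

Ra = g * beta * dT * L**3 / (nu * alpha)
Pr = nu / alpha
print(f"Ra = {Ra:.3e}, Pr = {Pr:.2f}")  # higher Ra -> stronger convection
```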

Relevance: 30.00%

Abstract:

Collaborative methods are promising tools for solving complex security tasks. In this context, the authors present the security overlay framework CIMD (Collaborative Intrusion and Malware Detection), which enables participants to state objectives and interests for joint intrusion detection and to find groups for the exchange of security-related data, such as monitoring or detection results, accordingly; the authors refer to these groups as detection groups. First, the authors present and discuss a tree-oriented taxonomy for the representation of nodes within the collaboration model. Second, they introduce and evaluate an algorithm for the formation of detection groups. After conducting a vulnerability analysis of the system, the authors demonstrate the validity of CIMD by examining two different scenarios, inspired by sociology, where collaboration is advantageous compared with the non-collaborative approach. They evaluate the benefit of CIMD by simulation in a novel packet-level simulation environment called NeSSi (Network Security Simulator) and give a probabilistic analysis for the scenarios.
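The toy sketch below conveys only the grouping idea: nodes state interests as attribute sets (echoing the taxonomy) and nodes with identical sets share a detection group. It is an illustration of the concept, not the CIMD group-formation algorithm.

```python
# Toy detection-group formation: group nodes whose stated interest
# attributes match exactly (illustration only).
from collections import defaultdict

nodes = {
    "n1": frozenset({"os:linux", "detector:snort"}),
    "n2": frozenset({"os:linux", "detector:snort"}),
    "n3": frozenset({"os:windows", "detector:av"}),
    "n4": frozenset({"os:linux", "detector:snort"}),
}

groups = defaultdict(list)
for node, interests in nodes.items():
    groups[interests].append(node)   # identical interest sets share a group

for interests, members in groups.items():
    print(sorted(interests), "->", members)
```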

Relevance: 30.00%

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system can therefore be developed to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, several indices, usually both qualitative and quantitative, are employed. However, it may not be possible to synthesize these indices directly, and different degrees of interaction may exist among them. In existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices so as to facilitate comparisons among them. Then, the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
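As a loose illustration of the vague-set representation: an index value can be given as an interval [t, 1 - f] defined by a truth membership t and a false membership f. The simple weighted score below is illustrative only; the paper's model uses a vague-valued fuzzy measure precisely to capture the index interactions that this additive aggregation ignores.

```python
# Illustrative vague-set scoring of candidate black-start strategies.
# Each index value is a pair (t, f) with membership interval [t, 1 - f];
# the classic score of a vague value is t - f, in [-1, 1].
strategies = {
    # hypothetical (t, f) pairs for, e.g., restoration speed and risk
    "plan_A": [(0.7, 0.2), (0.5, 0.3)],
    "plan_B": [(0.6, 0.1), (0.6, 0.2)],
}
weights = [0.6, 0.4]  # hypothetical index weights

for name, values in strategies.items():
    score = sum(w * (t - f) for w, (t, f) in zip(weights, values))
    print(f"{name}: weighted vague score = {score:.2f}")
```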

Relevance: 30.00%

Abstract:

In this paper we propose and evaluate a speaker attribution system using a complete-linkage clustering method. Speaker attribution refers to the annotation of a collection of spoken audio based on speaker identities. This can be achieved using diarization and speaker linking. The main challenge associated with attribution is achieving computational efficiency when dealing with large audio archives. Traditional agglomerative clustering methods with model merging and retraining are not feasible for this purpose. This has motivated the use of linkage clustering methods without retraining. We first propose a diarization system using complete-linkage clustering and show that it outperforms traditional agglomerative and single-linkage clustering based diarization systems with a relative improvement of 40% and 68%, respectively. We then propose a complete-linkage speaker linking system to achieve attribution and demonstrate a 26% relative improvement in attribution error rate (AER) over the single-linkage speaker linking approach.
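A minimal sketch of complete-linkage clustering over a pairwise speaker-distance matrix with SciPy; the distances are random placeholders standing in for model-based distances between speaker segments, and the cut threshold is arbitrary.

```python
# Complete-linkage clustering of speaker segments from a pairwise
# distance matrix (illustrative data only).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(7)
n = 8
d = rng.random((n, n))
d = (d + d.T) / 2            # symmetrise the placeholder distances
np.fill_diagonal(d, 0.0)

z = linkage(squareform(d), method="complete")      # complete-linkage dendrogram
labels = fcluster(z, t=0.8, criterion="distance")  # cut at a tuned threshold
print("speaker labels:", labels)
```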