804 results for "video as a research tool"
Abstract:
Aims: To establish the sensitivity and reliability of objective image analysis in direct comparison with subjective grading of bulbar hyperaemia. Methods: Images of the same eyes were captured across a range of bulbar hyperaemia caused by vasodilation. The progression was recorded and 45 images extracted. The images were objectively analysed on 14 occasions using previously validated edge-detection and colour-extraction techniques. They were also graded by 14 eye-care practitioners (ECPs) and 14 non-clinicians (NCLs) using the Efron scale. Six ECPs repeated the grading on three separate occasions. Results: Subjective grading was only able to differentiate images with differences in grade of 0.70-1.03 Efron units (sensitivity of 0.30-0.53), compared with 0.02-0.09 Efron units for the objective techniques (sensitivity of 0.94-0.99). Significant differences were found between ECPs, and individual repeats were also inconsistent (p<0.001). Objective analysis was 16 times more reliable than subjective analysis. The NCLs used wider ranges of the scale but were more variable than the ECPs, implying that training may have an effect on grading. Conclusions: Objective analysis may offer a new gold standard in anterior ocular examination, and should be developed further as a clinical research tool to allow more highly powered analyses and to enhance the clinical monitoring of anterior eye disease.
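As a hedged illustration of what an edge-detection and colour-extraction pipeline of this kind can look like, the sketch below computes an edge density and a relative-redness value with OpenCV; the thresholds and metric definitions are assumptions for illustration, not the validated techniques used in the study.

```python
# Sketch of an objective bulbar-hyperaemia measurement (illustrative only):
# edge density approximates vessel coverage; relative redness approximates
# the colour-extraction step. Thresholds and metrics are assumptions.
import cv2
import numpy as np

def hyperaemia_metrics(image_path: str) -> tuple[float, float]:
    img = cv2.imread(image_path)                  # BGR conjunctiva image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)              # Canny thresholds assumed
    edge_density = np.count_nonzero(edges) / edges.size
    b, g, r = [c.astype(float) for c in cv2.split(img)]
    relative_redness = float(np.mean(r / (r + g + b + 1e-9)))
    return edge_density, relative_redness
```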
Abstract:
This article uses a research project into the online conversations of sex offenders and the children they abuse to further the arguments for the acceptability of experimental work as a research tool for linguists. The research reported here contributes to the growing body of work within linguistics that has found experimental methods to be useful in answering questions about representation and constraints on linguistic expression (Hemforth 2013). The wider project examines online identity assumption in online paedophile activity and the policing of such activity, and involves dealing with the linguistic analysis of highly sensitive sexual grooming transcripts. Within the linguistics portion of the project, we examine theories of idiolect and identity through analysis of the ‘talk’ of perpetrators of online sexual abuse, and of the undercover officers that must assume alternative identities in order to investigate such crimes. The essential linguistic question in this article is methodological and concerns the applicability of experimental work to exploration of online identity and identity disguise. Although we touch on empirical questions, such as the sufficiency of linguistic description that will enable convincing identity disguise, we do not explore the experimental results in detail. In spite of the preference within a range of discourse analytical paradigms for ‘naturally occurring’ data, we argue that not only does the term prove conceptually problematic, but in certain contexts, and particularly in the applied forensic context described, a rejection of experimentally elicited data would limit the possible types and extent of analyses. Thus, it would restrict the contribution that academic linguistics can make in addressing a serious social problem.
Abstract:
In an overcapacity world, where customers can choose from many similar products to satisfy their needs, enterprises are looking for new approaches and tools that can help them not only maintain but also increase their competitive edge. Innovation, flexibility, quality, and service excellence are required to, at the very least, survive the ongoing transition that industry is experiencing from mass production to mass customization. To help these enterprises, this research develops a Supply Chain Capability Maturity Model named S(CM)2. The model is intended to describe, analyze, and improve the supply chain management operations of an enterprise; it provides a clear roadmap for enterprise improvement, covering multiple views and abstraction levels of the supply chain, together with tools to aid the firm in making improvements. The principal research tool applied is the Delphi method, which systematically gathered the knowledge and experience of eighty-eight experts in Mexico. The model is validated through a case study and interviews with experts in supply chain management. The resulting contribution is a holistic model of the supply chain that integrates multiple perspectives and provides a systematic procedure for improving a company's supply chain operations.
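As a hedged aside on the Delphi method itself, the sketch below shows one common way of aggregating a round of expert ratings (per-item median plus an interquartile-range consensus rule); the threshold and data are invented and do not reflect this study's actual protocol.

```python
# Illustrative Delphi-round aggregation: per-item median and IQR across
# experts, with a simple (assumed) IQR-based consensus rule.
import numpy as np

def delphi_round_summary(scores: np.ndarray, iqr_threshold: float = 1.0):
    """scores: experts x items matrix of Likert ratings for one round."""
    median = np.median(scores, axis=0)
    q75, q25 = np.percentile(scores, [75, 25], axis=0)
    iqr = q75 - q25
    consensus = iqr <= iqr_threshold   # items stable enough to close out
    return median, iqr, consensus

# Example: 5 experts rating 3 maturity practices on a 1-5 scale.
scores = np.array([[4, 2, 5], [4, 3, 5], [5, 2, 4], [4, 1, 5], [4, 3, 5]])
print(delphi_round_summary(scores))
```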
Abstract:
Purpose – The purpose of this paper is to provide an analysis of sistematización's use as a research tool in the operationalization of a "neighborhood approach" to the implementation of disaster risk reduction (DRR) in informal urban settlements. Design/methodology/approach – The first section highlights sistematización's historical origins in Latin America in the fields of popular adult education, social work, and rural development. The second explains why sistematización was made a required component of project implementation. The third section addresses the approach to sistematización used. The final section discusses how this experience contributes to both sistematización's theoretical development and its practical application as a methodology. Findings – The introduction of sistematización as a research tool facilitated real-time assessment of project implementation, providing timely information that positively influenced decision-making processes. This ongoing feedback, collective learning, and open exchange of know-how between NGOs and partner institutions allowed for the evaluation of existing practices and the development of new ways of collaborating to address disaster risk in complex and dynamic urban environments. Practical implications – Sistematización transcends the narrow focus of traditional monitoring and evaluation on final results, emphasizing a comprehensive understanding of processes and contexts. Originality/value – Its use in the implementation of DRR initiatives in informal urban environments is particularly novel, highlighting the capacity of the methodology to be tailored to a variety of needs, in this case bridging the gap between NGOs, local governments, and vulnerable communities, as well as between urban, development, and disaster risk management planning.
Abstract:
Biodiversity citizen science projects are growing in number, size, and scope, and are gaining recognition as valuable data sources that build public engagement. Yet publication rates indicate that citizen science is still infrequently used as a primary tool for conservation research and the causes of this apparent disconnect have not been quantitatively evaluated. To uncover the barriers to the use of citizen science as a research tool, we surveyed professional biodiversity scientists (n = 423) and citizen science project managers (n = 125). We conducted three analyses using non-parametric recursive modeling (random forest), using questions that addressed: scientists' perceptions and preferences regarding citizen science, scientists' requirements for their own data, and the actual practices of citizen science projects. For all three analyses we identified the most important factors that influence the probability of publication using citizen science data. Four general barriers emerged: a narrow awareness among scientists of citizen science projects that match their needs; the fact that not all biodiversity science is well-suited for citizen science; inconsistency in data quality across citizen science projects; and bias among scientists for certain data sources (institutions and ages/education levels of data collectors). Notably, we find limited evidence to suggest a relationship between citizen science projects that satisfy scientists' biases and data quality or probability of publication. These results illuminate the need for greater visibility of citizen science practices with respect to the requirements of biodiversity science and show that addressing bias among scientists could improve application of citizen science in conservation.
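A minimal sketch of the style of analysis described, assuming scikit-learn's random forest; the survey file and column names below are invented stand-ins for the actual questionnaire items.

```python
# Sketch of a random-forest analysis of survey responses against a binary
# "published with citizen-science data" outcome (names are hypothetical).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("scientist_survey.csv")           # hypothetical data file
X = df[["awareness", "data_quality_rating", "collector_age_bias"]]
y = df["published_with_cs_data"]                   # 0/1 outcome

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X, y)

# Rank the survey factors by their influence on publication probability.
for name, imp in sorted(zip(X.columns, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```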
Abstract:
This study aims to understand the growth dynamics of the city of Ceará-Mirim, considering the aspects that define its current urban sprawl. As a methodological approach, we adopted the concept of space as both a product and a producer of social relations, constituted by objects and actions that interact in a dialectical process over time. As a research tool, a bibliographical study was conducted covering the historical aspects of land use and occupation, in order to identify the regional agents that explain the city's current urban configuration. Secondary data on the municipality's economic activities and their impact on the local social structure were then collected. Among these aspects, the sugar economy, even in decline, emerged as a defining boundary of urban growth. At the regional scale, other factors were discussed in terms of the urban influence processes forming Greater Natal (RMNatal), into which Ceará-Mirim is integrated at a very low level according to the Metropolis Observatory (2012). We point out, however, that in recent years metropolitan-scale facilities have been established, especially along the BR 406 growth vector. These developments follow the logic of the real estate sector, while a possible "metropolization" is being promoted through investments in the city's expansion area.
Abstract:
Private Higher Education Institutions (HEIs) are embedded in a market where competitiveness is a key factor. To remain competitive, HEIs need proactive and innovative strategies, especially to understand their main customers, the students, with regard to their expectations about the quality of the HEI. This study evaluates private higher education institutions in the city of Natal/RN with respect to the strategies adopted to remain in the market, based on the quality perceived by students. The research was conducted at two private institutions in Natal, using an exploratory approach to guide a survey in which a questionnaire was applied to senior students in Business, Accounting, and Law. The research tool maps five dimensions: (1) teaching, covering methods and teaching tools; (2) teachers, specifying quality attributes related to faculty; (3) infrastructure, describing the HEI environment; (4) services, evaluating the quality of the HEI's processes; and (5) intangibles, relating to student satisfaction. The results were analyzed using descriptive statistical techniques in the Statistical Package for the Social Sciences (SPSS). The first stage of the analysis characterizes the overall sample and each HEI and course descriptively, followed by univariate analysis of the HEIs and bivariate analysis of the correlation among factors using Spearman's correlation coefficient. The results were used to compose an importance-versus-performance matrix, which was compared with the guidelines of the Ministry of Education and Culture (MEC). Finally, these comparisons allowed the identification of the factors most important to HEI quality and the institutions' level of performance on each attribute of the quality dimensions.
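A brief sketch of the bivariate step, assuming SciPy for Spearman's coefficient, followed by a toy importance-versus-performance quadrant rule; the ratings and cut-off below are illustrative assumptions, not the study's data.

```python
# Sketch: Spearman correlation between two quality attributes, then a simple
# importance-vs-performance quadrant classification (cut-off is assumed).
from scipy.stats import spearmanr

teaching = [4, 5, 3, 4, 2, 5, 4]        # hypothetical student ratings
satisfaction = [4, 5, 2, 4, 3, 5, 3]
rho, p = spearmanr(teaching, satisfaction)
print(f"Spearman rho={rho:.2f}, p={p:.3f}")

def quadrant(importance: float, performance: float, cut: float = 3.5) -> str:
    """Classic importance-performance quadrants (cut-off is an assumption)."""
    if importance >= cut:
        return "keep up the good work" if performance >= cut else "concentrate here"
    return "possible overkill" if performance >= cut else "low priority"

print(quadrant(importance=4.6, performance=2.9))   # -> "concentrate here"
```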
Evaluation of the impacts of using the hospital management system at the Hospital Universitário Onofre Lopes
Abstract:
This study was motivated by the need to understand the possible causes of the differing results achieved in the implementation of a Computerised Management System (CMS) in a federal university hospital in northeastern Brazil, and in particular to understand the factors that produced different outcomes across groups even though the same implementation methodology was used. Considering the involvement of managers, health professionals, and other staff, as well as the organizational structure in place during the implementation period, the study sought to capture these participants' perceptions of how the CMS developed during deployment in their own group or sector and in the organization as a whole. The methodology used was content analysis, which provides a rich set of methodological tools for evaluating discourse, making it possible to analyse the unknown and the subjective with scientific rigour and, in the end, to understand the disparity in the results of the CMS implementation. The research tool was a semi-structured interview within a qualitative approach, as suggested by the literature. The episodic interview format was adopted in order to elicit more narrative accounts of the participants' practical experience throughout the CMS deployment process in the hospital. Three groups of professionals and one group of managers were interviewed, all with higher education in their professions and all of whom participated in the entire implementation process from the beginning. Bardin's methodology (2009) was followed in all phases of data treatment and interpretation, from which three categories emerged: "Thought and Knowledge", "Practices and Changes", and "Obtained Results". From "Thought and Knowledge" emerged three subcategories: "Administrative", "Institutional", and "IT Knowledge". From "Practices and Changes" emerged three subcategories: "Reality Prior to the CMS", "The IT Project and the Implementation of the CMS", and "Impacts of the CMS Implementation". From "Obtained Results" emerged three subcategories: "Benefits Promoted by the CMS", "Dissatisfaction Observed", and "Level of Use and Understanding of the CMS". It was observed that the lack of integration among sectors was a determining problem in the implementation of the CMS. The implementation project was not well dimensioned or publicized within the institution, and the different leadership models and sectoral objectives influenced the course of the implementation process. We conclude that a CMS should consolidate organizational practices that are already institutionalized and promote integration among sectors, rather than support isolated, personalistic practices within individual sectors of the institution.
Abstract:
This paper derives a theoretical framework for consideration of both the technologically driven dimensions of mobile payment solutions and the associated value proposition for customers. Banks promote traditional payment instruments whose value proposition is the management of risk for both consumers and merchants. These instruments are centralised, costly, and lack decision support functionality. The ubiquity of the mobile phone has provided a decentralised platform for managing payment processes in a new way, but the value proposition for customers has yet to be elaborated clearly. This inertia has stalled the design of sustainable revenue models for a mobile payments ecosystem. Merchants and consumers in the meantime are being seduced by the convenience of online and mobile payment solutions. Adopting the purchase and payment process as the unit of analysis, the current mobile payment landscape is reviewed with respect to the creation and consumption of customer value. From this analysis, a framework is derived juxtaposing customer value, related to what is being paid for, with payment integration, related to how payments are being made. The framework provides a theoretical and practical basis for considering the contribution of mobile technologies to the payment industry. The framework is then used to describe the components of a mobile payments pilot project run with a trial population of 250 students on a campus in Ireland. In this manner, weaknesses in the value proposition for consumers and merchants were highlighted. Limitations of the framework as a research tool are also discussed.
Abstract:
Government communication is an important management tool during a public health crisis, but understanding its impact is difficult. Strategies may be adjusted in reaction to developments on the ground and it is challenging to evaluate the impact of communication separately from other crisis management activities. Agent-based modeling is a well-established research tool in social science to respond to similar challenges. However, there have been few such models in public health. We use the example of the TELL ME agent-based model to consider ways in which a non-predictive policy model can assist policy makers. This model concerns individuals’ protective behaviors in response to an epidemic, and the communication that influences such behavior. Drawing on findings from stakeholder workshops and the results of the model itself, we suggest such a model can be useful: (i) as a teaching tool, (ii) to test theory, and (iii) to inform data collection. We also plot a path for development of similar models that could assist with communication planning for epidemics.
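The abstract above describes rather than specifies the TELL ME model, so the following is only a toy illustration of the general technique: agents whose probability of adopting protective behaviour rises with communication intensity and perceived prevalence. Every parameter and rule here is an invented assumption, not the model's actual specification.

```python
# Toy agent-based model: protective behaviour spreads with communication
# intensity and perceived prevalence. All parameters are illustrative.
import random

random.seed(0)
N, STEPS = 1000, 30
protected = [False] * N
infected = [random.random() < 0.02 for _ in range(N)]   # static prevalence

for t in range(STEPS):
    comms = 0.05 + 0.01 * t            # assumed rising campaign intensity
    risk = sum(infected) / N           # agents' perceived prevalence
    for i in range(N):
        if not protected[i]:
            # adoption probability grows with communication and risk
            protected[i] = random.random() < 0.1 * min(1.0, comms + 2.0 * risk)

print(f"protective agents after {STEPS} steps: {sum(protected)}/{N}")
```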
Abstract:
This paper describes a substantial effort to build a real-time interactive multimodal dialogue system with a focus on emotional and non-verbal interaction capabilities. The work is motivated by the aim to provide technology with competences in perceiving and producing the emotional and non-verbal behaviours required to sustain a conversational dialogue. We present the Sensitive Artificial Listener (SAL) scenario as a setting which seems particularly suited to the study of emotional and non-verbal behaviour, since it requires only very limited verbal understanding on the part of the machine. This scenario allows us to concentrate on non-verbal capabilities without having to address at the same time the challenges of spoken language understanding, task modelling, and so on. We first summarise three prototype versions of the SAL scenario, in which the behaviour of the Sensitive Artificial Listener characters was determined by a human operator. These prototypes served to verify the effectiveness of the SAL scenario and allowed us to collect the data required for building system components for analysing and synthesising the respective behaviours. We then describe the fully autonomous integrated real-time system we created, which combines incremental analysis of user behaviour, dialogue management, and synthesis of speaker and listener behaviour of a SAL character displayed as a virtual agent. We discuss principles that should underlie the evaluation of SAL-type systems. Since the system is designed for modularity and reuse, and since it is publicly available, the SAL system has potential as a joint research tool in the affective computing research community.
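Purely as an assumed illustration of the analyse, manage, synthesise loop such an architecture implies, the skeleton below sketches the three modules; all class and method names are hypothetical and do not reflect the real SAL system's API.

```python
# Skeleton of an analyse -> dialogue-manage -> synthesise loop for a
# listener agent. All names are hypothetical, not the real SAL API.
class BehaviourAnalyser:
    def analyse(self, frame):
        # e.g. estimate user arousal/valence from audio-visual input
        return {"arousal": 0.2, "valence": -0.1}

class DialogueManager:
    def decide(self, state):
        # pick a listener response (backchannel, expression) from the state
        return "nod" if state["valence"] < 0 else "smile"

class BehaviourSynthesiser:
    def render(self, action):
        print(f"agent performs: {action}")

analyser, manager, synthesiser = (BehaviourAnalyser(), DialogueManager(),
                                  BehaviourSynthesiser())
for frame in range(3):                 # stand-in for the real-time stream
    synthesiser.render(manager.decide(analyser.analyse(frame)))
```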
Abstract:
The identification of subjects at high risk for Alzheimer's disease is important for prognosis and early intervention. We investigated the polygenic architecture of Alzheimer's disease and the accuracy of Alzheimer's disease prediction models, including and excluding the polygenic component in the model. This study used genotype data from the powerful dataset comprising 17 008 cases and 37 154 controls obtained from the International Genomics of Alzheimer's Project (IGAP). Polygenic score analysis tested whether the alleles identified as associated with disease in one sample set were significantly enriched in the cases relative to the controls in an independent sample. The disease prediction accuracy was investigated in a subset of the IGAP data, a sample of 3049 cases and 1554 controls (for whom APOE genotype data were available), by means of sensitivity, specificity, area under the receiver operating characteristic curve (AUC), and positive and negative predictive values. We observed significant evidence for a polygenic component enriched in Alzheimer's disease (P = 4.9 × 10⁻²⁶). This enrichment remained significant after APOE and other genome-wide associated regions were excluded (P = 3.4 × 10⁻¹⁹). The best prediction accuracy, AUC = 78.2% (95% confidence interval 77-80%), was achieved by a logistic regression model with APOE, the polygenic score, sex, and age as predictors. In conclusion, Alzheimer's disease has a significant polygenic component, which has predictive utility for Alzheimer's disease risk and could be a valuable research tool complementing experimental designs, including preventative clinical trials, stem cell selection, and high/low risk clinical studies. In modelling a range of sample disease prevalences, we found that polygenic scores almost double case prediction from chance, with increased prediction at the polygenic extremes.
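A minimal sketch of the style of prediction model reported, assuming scikit-learn: logistic regression over APOE, polygenic score, sex, and age, scored by AUC. The input file and column names are invented for illustration; the actual IGAP analysis is not reproduced here.

```python
# Sketch: case/control prediction from APOE, polygenic score, sex and age,
# evaluated by AUC (file and column names are hypothetical).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("igap_subset.csv")
X = df[["apoe_e4_count", "polygenic_score", "sex", "age"]]
y = df["case"]                                      # 1 = Alzheimer's case

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}")
```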
Abstract:
We report on the development of a Java-based application devised to support collaborative learning of Art concepts and ideas over the Internet. Starting from an examination of the pedagogy of both Art education and collaborative learning, we propose principles which are useful for the design and construction of a "lightweight" software application that supports interactive Art learning in groups. The application makes the "dynamics" of an artwork explicit, and supports group interaction with simple messaging and "chat" facilities. It may be used to facilitate the learning and teaching of Art, but also as a research tool to investigate the learning of Art and the development and dynamics of collaborating groups. An evaluation of a pilot study of the use of our system with a group of 20 schoolchildren is presented.