450 results for methodologies
Abstract:
The existence of any film genre depends on the effective operation of distribution networks. Contingencies of distribution play an important role in determining the content of individual texts and the characteristics of film genres; they enable new genres to emerge at the same time as they impose limits on generic change. This article sets out an alternative way of doing genre studies, based on an analysis of distributive circuits rather than film texts or generic categories. Our objective is to provide a conceptual framework that can account for the multiple ways in which distribution networks leave their traces on film texts and audience expectations, with specific reference to international horror networks, and to offer some preliminary suggestions as to how distribution analysis can be integrated into existing genre studies methodologies.
Abstract:
As a result of the growing adoption of Business Process Management (BPM) technology, different stakeholders need to understand and agree upon the process models that are used to configure BPM systems. However, BPM users have problems dealing with the complexity of such models. The challenge, therefore, is to improve the comprehension of process models. While a substantial amount of literature is devoted to this topic, there is no overview of the various mechanisms that exist for managing complexity in (large) process models. It is thus hard to obtain comparative insight into the degree of support offered for various complexity-reducing mechanisms by state-of-the-art languages and tools. This paper focuses on complexity-reduction mechanisms that affect the abstract syntax of a process model, i.e. the structure of a process model. These mechanisms are captured as patterns, so that they can be described in their most general form and in a language- and tool-independent manner. The paper concludes with a comparative overview of the degree of support for these patterns offered by state-of-the-art languages and language implementations.
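To make the idea of a structural complexity-reduction pattern concrete, here is a toy Python sketch, not taken from the paper and with all names invented, of one such mechanism: collapsing a fragment of a process model into a single subprocess node, which shrinks the top-level model while preserving its behaviour at the fragment boundary.

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    nodes: set = field(default_factory=set)
    edges: set = field(default_factory=set)  # (source, target) pairs

def extract_subprocess(p, fragment, name):
    """Replace `fragment` (a set of nodes) with one subprocess node,
    rewiring edges that cross the fragment boundary to the new node."""
    sub = Process(set(fragment),
                  {(s, t) for (s, t) in p.edges
                   if s in fragment and t in fragment})
    nodes = (p.nodes - fragment) | {name}
    edges = set()
    for s, t in p.edges:
        if s in fragment and t in fragment:
            continue  # internal edge, now hidden inside the subprocess
        edges.add((name if s in fragment else s,
                   name if t in fragment else t))
    return Process(nodes, edges), sub

p = Process({"A", "B", "C", "D"}, {("A", "B"), ("B", "C"), ("C", "D")})
top, sub = extract_subprocess(p, {"B", "C"}, "Sub1")
print(top.edges)  # {('A', 'Sub1'), ('Sub1', 'D')}
```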
Abstract:
The overall theme for the 4th Biennial International Network of Indigenous Health Knowledge and Development (INIHKD) Conference was ‘Knowing Our Roots: Indigenous Medicines, Health Knowledges and Best Practices’. Conference activities were grouped around the following broad themes: • building of Indigenous research capacity, partnerships and workforce; • sharing of innovative, traditional and contemporary Indigenous knowledges, especially with respect to culturally-grounded interventions and evidence-based “best and promising practices”; • identification of successful Indigenous health policy solutions; and • sharing of ethical, Indigenous-based research protocols and methodologies. This keynote plenary presentation focused on 'best practice' in research, asking the questions: What kind of research will I do? What kind of research will I be? What is the contribution that I will make? What will be my legacy?
Abstract:
Queensland University of Technology (QUT) is a multidisciplinary university in Brisbane, Queensland, Australia, and has 40,000 students and 1,700 researchers. Notable eResearch infrastructure includes the QUT ePrints repository, Microsoft QUT Research Centre, the OAK (Open Access to Knowledge) Law Project, Cambia and leading research institutes.

The Australian Government, via the Australian National Data Service (ANDS), is funding institutions to identify and describe their research datasets, to develop and populate data repositories and collaborative infrastructure, and to seed the Australian Research Data Commons. QUT is currently broadening its range of research support services, including those that support the management of research data, in recognition of the value of these datasets as products of the research process and in order to maximize the potential for reuse. QUT is integrating Library and High Performance Computing (HPC) services to achieve its research support goals.

The Library and HPC released an online survey, using Key Survey, to 1,700 researchers in September 2009. A comprehensive range of eResearch practices and skills was presented for response, grouped into the areas of scholarly communication and open access publishing, collaborative technologies, data collection and management, and computation and visualization tools. Researchers were asked to rate their skill level on each practice. 254 responses were received over two weeks. Eight focus groups were also held with 35 higher degree research (HDR) students and staff to provide additional qualitative feedback. A similar survey was released to 100 support staff, and 73 responses were received.

Preliminary results from the researcher survey and focus groups indicate a gap between current eResearch practices and the potential for researchers to engage in eResearch practices. Researchers are more likely to seek advice from their peers than from support staff. HDR students are more positive about eResearch practices and are more willing to learn new ways of conducting research. An account of the survey methodology, the results obtained, and proposed strategies to embed eResearch practices and skills across and within the research disciplines will be provided.
Abstract:
Vendors provide reference process models as consolidated, off-the-shelf solutions that capture best practices in a given industry domain. Customers can then adapt these models to suit their specific requirements. Traditional process flexibility approaches facilitate this operation, but do not fully address it, as they do not sufficiently take into account controlled change guided by vendors’ reference models. The tension between the customer’s freedom to adapt reference models and the ability to incorporate vendor-initiated reference model changes with relatively low effort thus needs to be carefully balanced. This paper introduces process extensibility as a new paradigm for customising reference processes and managing their evolution over time. Process extensibility mandates a clear recognition of the different responsibilities and interests of reference model vendors and consumers, and is concerned with keeping the effort of customer-side reference model adaptations low while allowing sufficient room for model change.
Abstract:
This paper describes the approach taken to the clustering task at INEX 2009 by a group at the Queensland University of Technology. The Random Indexing (RI) K-tree has been used with a representation that is based on the semantic markup available in the INEX 2009 Wikipedia collection. The RI K-tree is a scalable approach to clustering large document collections. This approach has produced quality clustering when evaluated using two different methodologies.
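As a rough illustration of the Random Indexing representation step (the K-tree itself is omitted), the following Python sketch is a minimal, hypothetical implementation: each term gets a sparse ternary index vector derived from a stable hash, and a document vector is the sum of its terms' index vectors, yielding fixed-dimension input suitable for a clustering structure such as the K-tree. It is not the INEX submission's actual code.

```python
import hashlib
import numpy as np

def index_vector(term, dim=1000, nonzeros=10):
    """Sparse ternary index vector for a term: a few +1/-1 entries, rest zero.
    Seeding from a stable hash gives each term the same vector on every run."""
    seed = int(hashlib.md5(term.encode()).hexdigest(), 16) % (2**32)
    rng = np.random.default_rng(seed)
    v = np.zeros(dim)
    positions = rng.choice(dim, size=nonzeros, replace=False)
    v[positions[: nonzeros // 2]] = 1.0
    v[positions[nonzeros // 2:]] = -1.0
    return v

def document_vector(terms, dim=1000):
    """Random Indexing: a document vector is the sum of its terms' vectors."""
    doc = np.zeros(dim)
    for term in terms:
        doc += index_vector(term, dim)
    return doc

docs = [["random", "indexing", "k", "tree"], ["wikipedia", "semantic", "markup"]]
X = np.vstack([document_vector(d) for d in docs])
print(X.shape)  # (2, 1000): fixed-dimension vectors ready for clustering
```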
Abstract:
In this paper we introduce the Reaction Wheel Pendulum, a novel mechanical system consisting of a physical pendulum with a rotating bob. This system has several attractive features both from a pedagogical standpoint and from a research standpoint. From a pedagogical standpoint, the dynamics are the simplest among the various pendulum experiments available so that the system can be introduced to students earlier in their education. At the same time, the system is nonlinear and underactuated so that it can be used as a benchmark experiment to study recent advanced methodologies in nonlinear control, such as feedback linearization, passivity methods, backstepping and hybrid control. In this paper we discuss two control approaches for the problems of swingup and balance, namely, feedback linearization and passivity based control. We first show that the system is locally feedback linearizable by a local diffeomorphism in state space and nonlinear feedback. We compare the feedback linearization control with a linear pole-placement control for the problem of balancing the pendulum about the inverted position. For the swingup problem we discuss an energy approach based on collocated partial feedback linearization, and passivity of the resulting zero dynamics. A hybrid/switching control strategy is used to switch between the swingup and the balance control. Experimental results are presented.
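For orientation, the standard reaction wheel pendulum model from the literature takes the following form. This is a sketch using conventional symbols, which may differ from the paper's notation, and sign conventions vary with where the pendulum angle is measured from.

```latex
% q_1: pendulum angle, q_2: wheel angle, u: motor torque on the wheel,
% \bar{J}: total inertia about the pivot, J_r: wheel inertia,
% \bar{m}\bar{l}: lumped mass-length term.
\begin{align}
  \bar{J}\,\ddot{q}_1 + J_r\,\ddot{q}_2 - \bar{m}\bar{l}\,g\sin q_1 &= 0,\\
  J_r\,(\ddot{q}_1 + \ddot{q}_2) &= u.
\end{align}
% The pendulum equation is unactuated: the wheel's reaction torque is the
% only handle on q_1, which is what makes the system underactuated.
% A typical energy-based swingup law (one common form, not necessarily the
% paper's exact law) regulates the pendulum energy E toward its upright
% value E_0:
\begin{equation}
  u = -k\,(E - E_0)\,\dot{q}_1,\qquad
  E = \tfrac{1}{2}\bar{J}\,\dot{q}_1^2 + \bar{m}\bar{l}\,g\,(\cos q_1 - 1).
\end{equation}
```

Balance about the upright is then handled by a separate linear or feedback-linearizing controller, with a switch between the two modes, as the abstract's hybrid/switching strategy describes.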
Abstract:
An Asset Management (AM) life-cycle constitutes a set of processes that align with the development, operation and maintenance of assets, in order to meet the desired requirements and objectives of the stakeholders of the business. The scope of AM is often broad within an organization due to the interactions between its internal elements such as human resources, finance, technology, engineering operation, information technology and management, as well as external elements such as governance and environment. Due to the complexity of the AM processes, it has been proposed that, in order to optimize asset management activities, process modelling initiatives should be adopted. Although organisations adopt AM principles and carry out AM initiatives, most do not document or model their AM processes, let alone enact their processes (semi-)automatically using a computer-supported system. There is currently a lack of knowledge describing how to model AM processes in a methodical and suitable manner so that the processes are streamlined and optimized and are ready for deployment in a computerised way. This research aims to overcome this deficiency by developing an approach that will aid organisations in constructing AM process models quickly and systematically whilst using the most appropriate techniques, such as workflow technology. Currently, there is a wealth of information within the individual domains of AM and workflow. Both fields are gaining significant popularity in many industries, thus fuelling the need for research exploring the possible benefits of their cross-disciplinary applications. This research therefore investigates these two domains to exploit the application of workflow to the modelling and execution of AM processes. Specifically, it investigates appropriate methodologies for applying workflow techniques to AM frameworks. One of the benefits of applying workflow models to AM processes is the ability to adapt to, and enable, both ad-hoc and evolutionary changes over time. In addition, this can automate an AM process as well as support the coordination and collaboration of the people involved in carrying out the process. A workflow management system (WFMS) can be used to support the design and enactment (i.e. execution) of processes and cope with changes that occur to the process during enactment. So far, little literature documents a systematic approach to modelling the characteristics of AM processes. In order to obtain a workflow model for AM processes, commonalities and differences between different AM processes need to be identified. This is the fundamental step in developing a conscientious workflow model for AM processes. Therefore, the first stage of this research focuses on identifying the characteristics of AM processes, especially AM decision making processes. The second stage is to review a number of contemporary workflow techniques and choose a suitable technique for application to AM decision making processes. The third stage is to develop an intermediate, improved AM decision process definition that refines the current process description and is ready for modelling using the workflow language selected in the previous stage. All these lead to the fourth stage, where a workflow model for an AM decision making process is developed. The process model is then deployed (semi-)automatically in a state-of-the-art WFMS, demonstrating the benefits of applying workflow technology to the domain of AM.
Given that the information in the AM decision making process is captured at an abstract level within the scope of this work, the deployed process model can be used as an executable guideline for carrying out an AM decision process in practice. Moreover, it can be used as a vanilla system that, once enriched with information from a specific AM decision making process (e.g. in the case of a building construction or a power plant maintenance), is able to support the automation of such a process in a more elaborate way.
Abstract:
Undertaking empirical research on crime and violence can be a tricky enterprise fraught with ethical, methodological, intellectual and legal implications. This chapter takes readers on a reflective journey through the qualitative methodologies I used to research sex work in Kings Cross, miscarriages of justice, female delinquency, sexual violence, and violence in rural and regional settings over a period of nearly 30 years. Reflecting on these experiences, the chapter explores and analyses the reality of doing qualitative field research, the role of the researcher, the politics of subjectivity, the exercise of power, and the ‘muddiness’ of the research process, which is often overlooked in sanitised accounts of the research process (Byrne-Armstrong, Higgs and Horsfall, 2001; Davies, 2000).
Abstract:
Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no one best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment, are important in allocating scarce resources and making sound decisions. Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular, flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. This process divides the infrastructure management process over time into self-contained modules, each based on a particular set of activities, with the information flows between them defined by their interfaces and relationships. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, using individual infrastructure assets and their related projects as the basis of the network analysis process. It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises and then weights objectives, using a paired comparison process (a minimal worked sketch follows this abstract), ensures that the objectives to be met are both kept to the minimum number required and fairly weighted. Qualitative variables are incorporated into the weighting and scoring process, with utility functions proposed where there is risk or a trade-off situation applies. Variability is considered important in the infrastructure life cycle; the approach used is based on analytical principles but incorporates randomness in variables where required.
The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided that boundary conditions and requirements for linkages to other modules are met. Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects and consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the requirement to use judgment where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
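The sketch promised above: one common way to turn paired comparisons into objective weights is to score each pairwise preference, sum each objective's "wins", and normalise. This Python fragment is purely illustrative; the goals and preference matrix are invented, and the thesis's exact weighting procedure may differ.

```python
import numpy as np

# Hypothetical example: four outcome goals compared pairwise.
# P[i, j] = 1 if goal i is preferred to goal j, 0.5 for a tie, 0 otherwise.
goals = ["cost", "safety", "service level", "asset condition"]
P = np.array([
    [0.5, 0.0, 0.0, 1.0],
    [1.0, 0.5, 1.0, 1.0],
    [1.0, 0.0, 0.5, 1.0],
    [0.0, 0.0, 0.0, 0.5],
])

# Row sums count each goal's "wins"; normalising yields weights that
# sum to one and can be used to score alternatives against the goals.
weights = P.sum(axis=1) / P.sum()
for goal, w in zip(goals, weights):
    print(f"{goal}: {w:.3f}")
```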
Abstract:
To investigate the meaning and understanding of domestic food preparation within the lived experience of the household's main food preparer, this ethnographic study used a combination of qualitative and quantitative methodologies. Data were collected from three sources: the literature; an in-store survey of 251 food shoppers chosen at random while shopping during both peak and off-peak shopping periods at metropolitan supermarkets; and semi-structured interviews with the principal food shopper and food preparer of 15 different Brisbane households. Male and female respondents, representing a cross-section of socio-economic groupings, ranged in age from 19 to 79 years and were all from English-speaking backgrounds. Changes in paid labour force participation, income and education have increased the value of the respondents' time, instigating massive changes in the way they shop, cook and eat. Much of their food preparation has moved from the domestic kitchen into the kitchens of other food establishments. For both sexes, the dominant motivating force behind these changes is a combination of their self-perceived lack of culinary skill, lack of enjoyment of cooking and lack of motivation to cook. The females in paid employment emphasise all factors, particularly the latter two, significantly more than the non-employed females. All factors are of increasing importance for individuals aged less than 35 years and, conversely, of significantly diminished importance to older respondents. Overall, it is the respondents aged less than 25 years who indicate the lowest cooking frequency and/or least cooking ability. Inherent in this latter group is an indifference to the art/practice of preparing food. Increasingly, all respondents want to do less cooking and/or get the cooking over with as quickly as possible. Convenience is a powerful lure by which to spend less time in the kitchen, and there is an apparent willingness to pay a premium for it. Because children today are increasingly unlikely to be taught to cook, addressing the food skills deficit and encouraging individuals to cook for themselves are significant issues confronting health educators. These issues are suggested as appropriate subjects of future research.
Abstract:
This study focused on a group of primary school teachers as they implemented a variety of intervention actions within their class programs aimed towards supporting the reduction of high levels of communication apprehension (CA) among students. Six teachers and nine students, located across three primary schools, four year levels, and six classes, participated in this study. For reasons of confidentiality, the schools, principals, parents, teachers, teacher assistants, and students who were involved in this study were given fictitious names. The following research question was explored in this study: What intervention actions can primary school teachers implement within their class programs that support the reduction of high CA levels among students? Throughout this study the term CA referred to "an individual's level of fear or anxiety associated with either real or anticipated (oral) communication with another person or persons" (McCroskey, 1984, p. 13). The sources of CA were explained with reference to McCroskey's state-trait continuum. The distinctions between high and appropriate levels of CA were determined conceptually and empirically. The education system within which this study was conducted promoted the philosophy of inclusion and the practices of inclusive schooling. Teachers employed in this system were encouraged to create class programs inclusive of and successful for all students. Consequently, the conceptual framework within which this study was conducted was based around the notion of inclusion. Action research and case study research were the methodologies used in the study. Case studies described teachers' action research as they responded to the challenge of executing intervention actions within their class programs directed towards supporting the reduction of high CA levels among students. Consequently, the teachers, and not the researcher, were the central characters in each of the case studies. Three principal data collection instruments were used in this study: the Personal Report of Communication Fear (PRCF) scale, semistructured interviews, and dialogue journals. The PRCF scale was the screening tool used to identify a pool of students eligible for the study. Data relevant to the students involved in the study were gathered during semistructured interviews and throughout the dialogue journaling process. Dialogue journaling provided the opportunity for regular contact between teachers and the researcher, a sequence to teacher and student intervention behaviours, and a permanent record of teacher and student growth and development. The majority of teachers involved in this study endeavoured to develop class programs inclusive of all students. These teachers acknowledged the importance of modifying aspects of their class programs in response to the diverse and often multiple needs of individual students with high levels of CA. Numerous conclusions were drawn regarding practical ways that the teachers in this study supported the reduction of high CA levels among students. What this study has shown is that teachers can incorporate intervention actions within their class programs aimed towards supporting students in lowering their high levels of CA. While no teacher developed an identical approach to intervention, similarities and differences were evident among teachers regarding their selection, interpretation, and implementation of intervention actions.
Actions that teachers enacted within their class programs emerged from numerous fields of research, including CA, inclusion, social skills, behaviour teaching, co-operative learning, and quality schools. Each teacher's knowledge of and familiarity with these research fields influenced their preference for and commitment to particular intervention actions. Additional factors, including each teacher's paradigm of inclusion and exclusion, contributed towards their choice of intervention actions. Possible implications of these conclusions were noted with reference to teachers, school administrators, support personnel, system personnel, teacher educators, parents, and researchers.
Abstract:
This research has established, through ultrasound, near infrared spectroscopy and biomechanics experiments, parameters and parametric relationships that can form the framework for quantifying the integrity of the articular cartilage-on-bone laminate, and objectively distinguish between normal/healthy and abnormal/degenerated joint tissue, with a focus on articular cartilage. This has been achieved by: 1. using traditional experimental methods to produce new parameters for cartilage assessment; 2. using novel methodologies to develop new parameters; and 3. investigating the interrelationships between mechanical, structural and molecular properties to identify and select those parameters and methodologies that can be used in a future arthroscopic probe based on points 1 and 2. By combining the molecular, micro- and macro-structural characteristics of the tissue with its mechanical properties, we arrive at a set of critical benchmarking parameters for viable and early-stage non-viable cartilage. The interrelationships between these characteristics, examined using a multivariate analysis based on principal components analysis, multiple linear regression and general linear modeling, could then determine those parameters and relationships which have the potential to be developed into a future clinical device. Specifically, this research has found that the ultrasound and near infrared techniques can subsume the mechanical parameters and combine to characterise the tissue at the molecular, structural and mechanical levels over the full depth of the cartilage matrix. It is the opinion of this thesis that enabling the determination of the precise area of influence of a focal defect or disease in the joint, and demarcating the boundaries of articular cartilage with different levels of degeneration around a focal defect, will lead to better surgical decisions that advance the processes of joint management and treatment. Providing the basis for a surgical tool, this research will contribute to the enhancement and quantification of arthroscopic procedures, extending to post-treatment monitoring; as a research tool, it will enable a robust method for evaluating developing (particularly focalised) treatments.
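As a rough sketch of the kind of multivariate pipeline named above (principal components analysis feeding a multiple linear regression), the following Python fragment uses synthetic stand-ins for the spectral and mechanical measurements; the data, dimensions and scikit-learn toolchain are assumptions, not the thesis's actual workflow.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic stand-ins: rows are cartilage samples, columns are spectral
# features (e.g. near infrared absorbances); y is a mechanical property.
rng = np.random.default_rng(42)
X = rng.normal(size=(60, 200))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)

# PCA compresses the many correlated spectral features into a few
# components; multiple linear regression then relates those components
# to the mechanical response.
model = make_pipeline(PCA(n_components=5), LinearRegression())
model.fit(X, y)
print(f"R^2 on training data: {model.score(X, y):.3f}")
```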
Abstract:
This book focuses on practical applications for using adult and embryonic stem cells in the pharmaceutical development process. It emphasizes new technologies to help overcome the bottlenecks in developing stem cells as therapeutic agents. A key reference for professionals working in stem cell science, it presents the general principles and methodologies in stem cell research and covers topics such as derivation and characterization of stem cells, stem cell culture and maintenance, stem cell engineering, applications of high-throughput screening, and stem cell genetic modification, with their use for drug delivery.