958 results for consensus methods
Abstract:
Parametric and generative modelling methods are ways of making computer models more flexible and of formalising domain-specific knowledge. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the base for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then describes an open standard through which the interchange of components could be implemented. As an illustrative example of generative design, Frazer's 'Reptiles' project from 1968 is reinterpreted.
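The flexibility claimed for parametric models can be illustrated with a minimal, hypothetical sketch (the class and parameter names are invented for illustration and are not part of the IFC standard): geometry is expressed as a function of named parameters, so changing one parameter regenerates dependent values instead of requiring manual remodelling.

```python
class ParametricBox:
    """A toy parametric model: derived quantities are computed from
    named parameters, so edits propagate automatically (hypothetical
    example, not an IFC construct)."""

    def __init__(self, width, depth, height):
        self.params = {"width": width, "depth": depth, "height": height}

    def set(self, name, value):
        # Changing a parameter is the whole edit; dependent values
        # regenerate on demand rather than being remodelled by hand.
        self.params[name] = value

    @property
    def volume(self):
        p = self.params
        return p["width"] * p["depth"] * p["height"]


box = ParametricBox(2.0, 3.0, 4.0)
initial_volume = box.volume      # 24.0
box.set("height", 5.0)           # early-design iteration: one change
updated_volume = box.volume      # 30.0, regenerated automatically
```

This is the sense in which parametric representations allow quicker iteration than fixed geometry: the design intent (the dependency between parameters and outputs) is retained across changes.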
Abstract:
Qualitative Health Psychology aims to contribute to the debate about the nature of psychology and of science through ‘an examination of the role of qualitative research within health psychology’ (p. 3). The editors, in bringing together contributors from the UK, Ireland, Canada, Brazil, New Zealand and Australia, have compiled a text that reflects different uses of qualitative health research in diverse social and cultural contexts. Structured into three parts, the book encompasses key theoretical and methodological issues in qualitative research in its attempt to encourage broad epistemological debate within health psychology.
Abstract:
The appearance of poststructuralism as a research methodology in public health literature raises questions about the history and purpose of this research. We examine (a) some aspects of the history of qualitative methods and their place within larger social and research domains, and (b) the purposes of public health research that employs poststructuralist philosophy, delineating the methodological issues that require consideration in positing a poststructural analysis. We argue against poststructuralism becoming a research methodology deployed to seize the public health debate, rather than being employed for its own particular critical strengths.
Abstract:
This dissertation analyses how physical objects are translated into digital artworks using techniques which can lead to ‘imperfections’ in the resulting digital artwork that are typically removed to arrive at a ‘perfect’ final representation. The dissertation discusses the adaptation of existing techniques into an artistic workflow that acknowledges and incorporates the imperfections of translation into the final pieces. It presents an exploration of the relationship between physical and digital artefacts and the processes used to move between the two. The work explores the 'craft' of digital sculpting and the technology used in producing what the artist terms ‘a naturally imperfect form’, incorporating knowledge of traditional sculpture, an understanding of anatomy and an interest in the study of bones (Osteology). The outcomes of the research are presented as a series of digital sculptural works, exhibited as a collection of curiosities in multiple mediums, including interactive game spaces, augmented reality (AR), rapid prototype prints (RP) and video displays.
Abstract:
1. Autonomous acoustic recorders are widely available and can provide a highly efficient method of species monitoring, especially when coupled with software to automate data processing. However, the adoption of these techniques is restricted by a lack of direct comparisons with existing manual field surveys. 2. We assessed the performance of autonomous methods by comparing manual and automated examination of acoustic recordings with a field-listening survey, using commercially available autonomous recorders and custom call detection and classification software. We compared the detection capability, time requirements, areal coverage and weather condition bias of these three methods using an established call monitoring programme for a nocturnal bird, the little spotted kiwi (Apteryx owenii). 3. The autonomous recorder methods had very high precision (>98%) and required <3% of the time needed for the field survey. They were less sensitive, with visual spectrogram inspection recovering 80% of the total calls detected and automated call detection 40%, although this recall increased with signal strength. The areal coverage of the spectrogram inspection and automatic detection methods was 85% and 42% of the field survey, respectively. The methods using autonomous recorders were more adversely affected by wind and did not show the positive association between ground moisture and call rates that was apparent from the field counts. However, all methods produced the same result for the most important conservation information from the survey: the annual change in calling activity. 4. Autonomous monitoring techniques incur different biases from manual surveys and so can yield different ecological conclusions if sampling is not adjusted accordingly. Nevertheless, the sensitivity, robustness and high accuracy of automated acoustic methods demonstrate that they offer a suitable and extremely efficient alternative to field observer point counts for species monitoring.
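The precision and recall figures reported above follow the standard detection-evaluation definitions; a minimal sketch, using hypothetical counts chosen to be in the spirit of the reported values rather than the study's actual data:

```python
def precision_recall(true_positives, false_positives, total_true_calls):
    """Precision: the share of detections that are real calls.
    Recall (sensitivity): the share of all real calls that were detected."""
    detections = true_positives + false_positives
    precision = true_positives / detections if detections else 0.0
    recall = true_positives / total_true_calls if total_true_calls else 0.0
    return precision, recall


# Hypothetical counts: 100 real calls, 80 detected plus 1 false alarm,
# roughly matching ">98% precision, 80% recall" for spectrogram inspection.
p, r = precision_recall(true_positives=80, false_positives=1,
                        total_true_calls=100)
```

Note that the study's "precision (>98%)" refers to how trustworthy each automated detection is, while the 80%/40% figures are recall, i.e. how many of the calls each method recovers.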
Abstract:
Background: There is growing consensus that a multidisciplinary, comprehensive and standardised process for assessing the fitness of older patients for chemotherapy should be undertaken to determine appropriate cancer treatment. Aim: This study tested a model of cancer care for the older patient incorporating Comprehensive Geriatric Assessment (CGA), which aimed to ensure that 'fit' individuals amenable to active treatment were accurately identified; 'vulnerable' patients more suitable for modified or supportive regimens were determined; and 'frail' individuals who would benefit most from palliative regimens were also identified and offered the appropriate level of care. Methods: A consecutive series of n=178 patients aged >65 years was recruited from a major Australian cancer centre. The following instruments were administered by an oncogeriatric nurse prior to treatment: Vulnerable Elders Survey-13; Cumulative Illness Rating Scale (Geriatric); Malnutrition Screening Tool; Mini-Mental State Examination; Geriatric Depression Scale; Barthel Index; and Lawton Instrumental Activities of Daily Living Scale. Scores from these instruments were aggregated to predict patient fitness, vulnerability or frailty for chemotherapy. Physicians provided a concurrent (blinded) prediction of patient fitness, vulnerability or frailty based on their clinical assessment. Data were also collected on actual patient outcomes (e.g. treatment completed as predicted, treatment reduced) during monthly audits of patient trajectories. Data analysis: Data analysis is underway. A sample of 178 is adequate to detect, with 90% power, kappa coefficients of agreement between CGA and physician assessments of K>0.90 ("almost perfect agreement"). Primary endpoints comprise a) whether the nurse-led CGA determination of fit, vulnerable or frail agrees with the oncologist's assessments of fit, vulnerable or frail and b) whether the CGA and physician assessments accurately predict actual patient outcomes.
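The planned agreement analysis rests on Cohen's kappa, the chance-corrected agreement between two raters. A minimal sketch for two raters classifying patients as fit, vulnerable or frail (the example labels below are invented for illustration, not study data):

```python
from collections import Counter


def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed - expected) / (1 - expected), where
    'expected' is the agreement two independent raters would reach
    by chance given their marginal label frequencies."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)


# Invented example: nurse-led CGA vs physician classification.
nurse = ["fit", "fit", "vulnerable", "frail"]
physician = ["fit", "fit", "vulnerable", "vulnerable"]
kappa = cohens_kappa(nurse, physician)  # 0.6 for this toy example
```

Values of K>0.90, the study's power target, are conventionally read as "almost perfect agreement" on the Landis and Koch scale.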
Conclusion An oncogeriatric nurse-led model of care is currently being developed from the results. We conclude with a discussion of the pivotal role of nurses in CGA-based models of care.
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which yields centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirements. Miss-detected incorrect integers lead to a hazardous result and should be strictly controlled; in ambiguity resolution, the miss-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory and is theoretically rigorous. The criteria table for the ratio test is computed from extensive data simulations, and real-time users can determine the ratio test criterion by looking it up. This method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not yet been discussed. In this paper, a general ambiguity validation model is derived based on hypothesis test theory, the fixed failure rate approach is introduced, and, in particular, the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method given a proper stochastic model.
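The ratio test itself can be sketched in a few lines. In the sketch below the threshold value is purely illustrative; in the fixed failure rate approach described above it would instead be looked up from the precomputed criteria table for the given model strength and required failure rate, rather than chosen empirically:

```python
def ratio_test(best_sq_norm, second_sq_norm, threshold):
    """Accept the fixed integer ambiguity solution only when the
    second-best integer candidate fits markedly worse than the best:
    ratio = second-best residual norm / best residual norm >= threshold.
    Otherwise fall back to the float (unfixed) solution."""
    return (second_sq_norm / best_sq_norm) >= threshold


# Illustrative threshold of 3 (an empirical value sometimes used in
# practice, not one from the paper's criteria table): strong separation
# between candidates is accepted, weak separation is rejected.
accept_strong = ratio_test(best_sq_norm=1.0, second_sq_norm=4.0, threshold=3.0)
accept_weak = ratio_test(best_sq_norm=1.0, second_sq_norm=2.0, threshold=3.0)
```

Rejecting the fix when the ratio is small is exactly what bounds the failure rate: ambiguous cases, where a wrong integer vector fits almost as well as the best one, are never committed to.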
Abstract:
Overview:
- Development of mixed methods research
- Benefits and challenges of "mixing"
- Different models
- Good design
- Two examples
- How to report?
- Have a go!
Abstract:
Over a seven-year period, Mark Radvan directed a suite of children's theatre productions adapted from the original Tashi stories by Australian writers Anna and Barbara Fienberg. The Tashi Project's repertoire of plays was performed to over 40,000 children aged between 3 and 10 years old, and their carers, in seasons at the Out of the Box Festival, at Brisbane Powerhouse and in venues across Australia in two interstate tours in 2009 and 2010. The project investigated how best to combine an exploration of theatrical forms and conventions with a performance style evolved in a specially developed training program and a deliberate positioning of young children as audiences capable of sophisticated readings of action, symbol, theme and character. The results of this project show that when brought into an appropriate relationship with the theatre artists, young children aged 3-5 can engage with sophisticated narrative forms, and with the right contextual framing they enjoy heightened dramatic and emotional tension, bringing to the event sustained and highly engaged concentration. Older children aged 6-10 also bring sustained and heightened engagement to the same stories, provided that other more sophisticated dramatic elements, such as character, theme and style, are woven into the construction of the performances.
Abstract:
Universities are increasingly challenged by the emerging global higher education market, facilitated by advances in Information and Communication Technologies (ICT). This requires them to reconsider their mission and direction in order to function effectively and efficiently, and to be responsive to changes in their environment. In the face of increasing demands and competitive pressures, universities, like other organisations, seek to continuously innovate and improve their performance. Universities are considering co-operating or sharing, both internally and externally, in a wide range of areas to achieve cost effectiveness and improvements in performance. Shared services are an effective model for re-organising to reduce costs, increase quality and create new capabilities. Shared services are not limited to the Higher Education (HE) sector: organisations across different sectors are adopting shared services, in particular for support functions such as Finance, Accounting, Human Resources and Information Technology. While shared services have been around for more than three decades, commencing in the 1970s in the banking sector and then being adopted by other sectors, the area is under-researched, with little consensus on even the most fundamental issues, as basic as defining what shared services are. Moreover, the interest in shared services within Higher Education is a global phenomenon. This study on shared services is situated within the Higher Education sector of Malaysia, and originated as an outcome of a national project (2005-2007) conducted by the Ministry of Higher Education (MOHE) entitled "Knowledge, Information Communication Technology Strategic Plan (KICTSP) for Malaysian Public Higher Education", in which progress towards more collaboration via shared services was a key recommendation.
The study's primary objective was to understand the nature and potential of ICT shared services, in particular in the Malaysian HE sector, by laying a foundation in terms of definitions, typologies and a research agenda, and by deriving theoretically based conceptualisations of the potential benefits of shared services, the success factors, and the issues of pursuing shared services. The study embarked on this objective with a literature review and a pilot case study as a means to further define the context of the study, given the under-researched status of ICT shared services and of shared services in Higher Education. This context definition phase revealed a range of unaddressed issues, including a lack of common understanding of what shared services are, how they are formed, what objectives they fulfil, and who is involved. The study thus embarked on a further investigation of a more foundational nature, with an exploratory phase that aimed to address these gaps, in which a detailed archival analysis of shared services literature within the IS context was conducted to better understand shared services from an IS perspective. The IS literature on shared services was analysed in depth to report on the current status of shared services research in the IS domain; in particular, definitions, objectives, stakeholders, the notion of sharing, theories used, and research methods applied were analysed, which provided a firmer base for this study's design. The study also conducted a detailed content analysis of 36 cases (globally) of shared services implementations in the HE sector to better understand how shared services are structured within the HE sector and what is being shared. The results of the context definition phase and the exploratory phase formed a firm basis for the multiple case studies phase, which was designed to address the primary goals of this study (as presented above).
Three case sites within the Malaysian HE sector were included in this analysis, resulting in empirically supported theoretical conceptualisations of shared services success factors, issues and benefits. A range of contributions are made through this study. First, the detailed archival analysis of shared services in Information Systems (IS) demonstrated the dearth of research on shared services within Information Systems. While the existing literature was synthesised to contribute towards an improved understanding of shared services in the IS domain, the areas that are as yet under-developed and require further exploration are identified and presented as a proposed research agenda for the field. This study also provides theoretical considerations and methodological guidelines to support the research agenda and to enable better empirical research in this domain. A number of literature-based a priori frameworks (e.g. on the forms of sharing, shared services stakeholders, etc.) are derived in this phase, contributing to practice and research with early conceptualisations of critical aspects of shared services. Furthermore, the comprehensive archival analysis design presented and executed here is an exemplary approach: a systematic, pre-defined and tool-supported method to extract, analyse and report literature, documented as guidelines that can be applied to other similar literature analyses, with particular attention to supporting novice researchers. Second, the content analysis of 36 shared services initiatives in the Higher Education sector identified eight different types of structural arrangements for shared services, as observed in practice, and the salient dimensions along which those types can be usefully differentiated. Each of the eight structural arrangement types is defined and demonstrated through case examples, with further descriptive details and insights into what is shared and how the sharing occurs.
This typology, grounded in secondary empirical evidence, can serve as a useful analytical tool for researchers investigating the shared services phenomenon further, and for practitioners considering the introduction or further development of shared services. Finally, the multiple case studies conducted in the Malaysian Higher Education sector provided a further empirical basis to instantiate the conceptual frameworks and typology derived from the prior phases, and developed an empirically supported (i) framework of issues and challenges, (ii) preliminary theory of shared services success, and (iii) benefits framework for shared services in the Higher Education sector.
Abstract:
The impact-induced deposition of Al13 clusters with icosahedral structure on a Ni(0 0 1) surface was studied by molecular dynamics (MD) simulation using Finnis–Sinclair potentials. The incident kinetic energy (Ein) ranged from 0.01 to 30 eV per atom. The structural and dynamical properties of Al clusters on Ni surfaces were found to depend strongly on the impact energy. At very low energy, the Al cluster deposited on the surface as an intact molecule; however, the original icosahedral structure was transformed to an fcc-like one owing to the interaction and the structural mismatch between the Al cluster and the Ni surface. With increasing impact energy, the cluster was severely deformed on contact with the substrate and then broken up by the dense collision cascade, with the cluster atoms finally spreading across the surface. When the impact energy was higher than 11 eV, defects such as Al substitutions and Ni ejections were observed. The simulation indicated that there exists an optimum energy range suitable for layer-by-layer Al epitaxial growth. In addition, at higher impact energies, the atomic exchange between Al and Ni atoms favours surface alloying.
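The time propagation behind such MD simulations is typically a velocity-Verlet step. A generic sketch follows, with an arbitrary user-supplied force function standing in for the Finnis–Sinclair potential used in the study (the function names and the free-particle example are illustrative, not from the paper):

```python
def velocity_verlet_step(pos, vel, force, mass, dt, force_fn):
    """One velocity-Verlet integration step per coordinate, the
    standard time-reversible MD propagator:
      v(t+dt/2) = v(t) + (dt/2) F(t)/m
      x(t+dt)   = x(t) + dt * v(t+dt/2)
      v(t+dt)   = v(t+dt/2) + (dt/2) F(t+dt)/m
    force_fn maps positions to forces (Finnis-Sinclair in the study)."""
    vel_half = [v + 0.5 * dt * f / mass for v, f in zip(vel, force)]
    new_pos = [x + dt * v for x, v in zip(pos, vel_half)]
    new_force = force_fn(new_pos)
    new_vel = [v + 0.5 * dt * f / mass for v, f in zip(vel_half, new_force)]
    return new_pos, new_vel, new_force


# Sanity check: a free particle (zero force) drifts at constant velocity.
pos, vel, force = velocity_verlet_step(
    pos=[0.0], vel=[1.0], force=[0.0],
    mass=1.0, dt=0.1, force_fn=lambda p: [0.0] * len(p))
```

Repeating this step over many femtosecond-scale increments, with the incident kinetic energy set through the initial velocities, is how deposition trajectories such as those above are generated.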
Abstract:
Quantitative market data has traditionally been used throughout marketing and business as a tool to inform and direct design decisions. However, in our changing economic climate, businesses need to innovate and create products their customers will love. Deep customer insight methods move beyond simply questioning customers and aim to provoke true emotional responses in order to reveal new opportunities that go beyond functional product requirements. This paper explores traditional market research methods and compares them to methods used to gain deep customer insights. This study reports on a collaborative research project with seven small to medium enterprises and four multinational organisations. Firms were introduced to a design-led innovation approach and were taught the different methods for gaining deep customer insights. Interviews were conducted to understand the experience and outcomes of pre-existing research methods and deep customer insight approaches. The findings conclude that deep customer insights are unlikely to be revealed through traditional market research techniques. The theoretical outcome of this study is a complementary methods matrix, providing guidance on appropriate research methods in accordance with a project's timeline.
Abstract:
BACKGROUND: Numerous strategies are available to prevent surgical site infections in hip arthroplasty, but there is no consensus on which might be the best. This study examined infection prevention strategies currently recommended for patients undergoing hip arthroplasty. METHODS: Four clinical guidelines on infection prevention/orthopedics were reviewed. Infection control practitioners, infectious disease physicians, and orthopedic surgeons were consulted through structured interviews and an online survey. Strategies were classified as "highly important" if they were recommended by at least one guideline and ranked as significantly or critically important by ≥75% of the experts. RESULTS: The guideline review yielded 28 infection prevention measures, with 7 identified by experts as being highly important in this context: antibiotic prophylaxis, antiseptic skin preparation of patients, hand/forearm antisepsis by surgical staff, sterile gowns/surgical attire, ultraclean/laminar air operating theatres, antibiotic-impregnated cement, and surveillance. Controversial measures included antibiotic-impregnated cement and, considering recent literature, laminar air operating theatres. CONCLUSIONS: Some of these measures may already be accepted as routine clinical practice, whereas others are controversial. Whether these practices should be continued for this patient group will be informed by modeling the cost-effectiveness of infection prevention strategies. This will allow predictions of long-term health and cost outcomes and thus inform decisions on how to best use scarce health care resources for infection control.
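The two-stage classification rule above (guideline recommendation AND ≥75% expert support) can be expressed directly; the measure names and votes in the sketch are hypothetical examples, not the study's data:

```python
def highly_important(measures, guideline_recs, expert_votes, quorum=0.75):
    """Classify a measure as 'highly important' when at least one
    guideline recommends it AND at least `quorum` of experts rated it
    significantly or critically important (vote 1) rather than not (0)."""
    selected = []
    for measure in measures:
        recommended = any(measure in recs for recs in guideline_recs)
        votes = expert_votes.get(measure, [])
        support = sum(votes) / len(votes) if votes else 0.0
        if recommended and support >= quorum:
            selected.append(measure)
    return selected


# Hypothetical inputs: one measure passes both stages; the other has
# full expert support but appears in no guideline, so it is excluded.
measures = ["antibiotic prophylaxis", "double gloving"]
guideline_recs = [{"antibiotic prophylaxis"}]
expert_votes = {"antibiotic prophylaxis": [1, 1, 1, 0],
                "double gloving": [1, 1, 1, 1]}
shortlist = highly_important(measures, guideline_recs, expert_votes)
```

Requiring both sources of evidence is what makes the resulting shortlist conservative: neither guideline endorsement nor expert opinion alone is sufficient.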
Abstract:
The methodology undertaken, the channel model and the system model created for developing a novel adaptive equalization method and a novel channel tracking method for the uplink of MU-MIMO-OFDM systems are presented in this paper. The results show that the channel tracking method achieves 97% accuracy, while the training-based initial channel estimation method performs comparatively poorly in estimating the actual channel.