924 results for Iterative probing
Abstract:
This paper presents and discusses organisational barriers and opportunities arising from the dissemination of design-led innovation within a leading Australian airport corporation. This research is part of a larger action research program which aims to integrate design as a strategic capability through design-led innovation within Australian businesses. Findings reveal that there is an opportunity to employ the theoretical framework and tools of design-led innovation in practice to build collaborative idea generation by involving customers and stakeholders in the proposal of new-to-world propositions. The iterative gathering of deep customer insights also provided an opportunity to build a greater understanding of stakeholders and customers, strengthening continuing business partnerships through co-design. Challenges to the design-led approach include resistance to the exploratory nature of gathering deep customer insights, the testing of long-held assumptions and market data, and the disruption of an organisational mindset geared toward the risk aversion instilled within the aviation industry. The implication of these findings is that design-led innovation can provide the critical platform for a business to grow and sustain the internal design capabilities necessary to challenge prevailing assumptions about how its business model operates to deliver value to customers and stakeholders alike. The platform of design-led innovation also provides an avenue to support a cultural transformation towards anticipating future needs, which is necessary for establishing a position of leadership within the broader economic environment.
Abstract:
Expert searchers engage with information as information brokers, researchers, reference librarians, information architects, faculty who teach advanced search, and in a variety of other information-intensive professions. Their experiences are characterized by a profound understanding of information concepts and skills, and by an agile ability to apply this knowledge to interacting with, and having an impact on, the information environment. This study explored the learning experiences of searchers to understand the acquisition of search expertise. The research question was: What can be learned about becoming an expert searcher from the learning experiences of proficient novice searchers and highly experienced searchers? The key objectives were: (1) to explore the existence of threshold concepts in search expertise; (2) to improve our understanding of how search expertise is acquired and how novice searchers, intent on becoming experts, can learn to search in more expertlike ways. The participant sample drew from two population groups: (1) highly experienced searchers with a minimum of 20 years of relevant professional experience, including LIS faculty who teach advanced search, information brokers, and search engine developers (11 subjects); and (2) MLIS students who had completed coursework in information retrieval and online searching and demonstrated exceptional ability (9 subjects). Using these two groups allowed a nuanced understanding of the experience of learning to search in expertlike ways, with data from those who search at a very high level as well as those who may be actively developing expertise. The study used semi-structured interviews, search tasks with think-aloud narratives, and talk-after protocols. Searches were screen-captured with simultaneous audio-recording of the think-aloud narrative. Data were coded and analyzed both manually and with NVivo 9. Grounded theory allowed categories and themes to emerge from the data. Categories represented conceptual knowledge and attributes of expert searchers. In accordance with grounded theory method, once theoretical saturation was achieved, during the final stage of analysis the data were viewed through the lenses of existing theoretical frameworks. For this study, threshold concept theory (Meyer & Land, 2003) was used to explore which concepts might be threshold concepts. Threshold concepts have been used to explore transformative learning portals in subjects ranging from economics to mathematics. A threshold concept has five defining characteristics: transformative (causing a shift in perception), irreversible (unlikely to be forgotten), integrative (unifying separate concepts), troublesome (initially counter-intuitive), and, in some cases, bounded. Themes that emerged provided evidence of four concepts which had the characteristics of threshold concepts. These were: (1) information environment, in which the total information environment is perceived and understood; (2) information structures, in which content, index structures, and retrieval algorithms are understood; and (3) information vocabularies, fluency in search behaviors related to language, including natural language, controlled vocabulary, and finesse in using proximity, truncation, and other language-based tools. The fourth threshold concept was concept fusion, the integration of the other three threshold concepts, further defined by three properties: visioning (anticipating next moves), being light on one's 'search feet' (the dancing property), and a profound ontological shift (identity as searcher).
In addition to the threshold concepts, findings were reported that were not concept-based, including the praxes and traits of expert searchers. A model of search expertise is proposed that places the four threshold concepts at its core and integrates the traits and praxes elicited from the study, attributes long recognized in LIS research as present in professional searchers. The research provides a deeper understanding of the transformative learning experiences involved in the acquisition of search expertise. It adds to our understanding of search expertise in the context of today's information environment and has implications for teaching advanced search, for research more broadly within library and information science, and for methodologies used to explore threshold concepts.
Abstract:
Australian governments face the twin challenges of dealing with extreme weather-related disasters (such as floods and bushfires) and adapting to the impacts of climate change. These challenges are connected, so any response would benefit from a more integrated approach across and between the different levels of government. This report summarises the findings of an NCCARF-funded project that addresses this problem. The project undertook a three-way comparative case study of the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods. It collected data from the official inquiry reports into each of these events, and conducted new interviews and workshops with key stakeholders. The findings of this project included recommendations that range from the conceptual to the practical. First, it was argued that a reconceptualisation of terms such as ‘community’ and ‘resilience’ is necessary to allow for more tailored responses to varying circumstances. Second, it was suggested that the high level of uncertainty inherent in disaster risk management and climate change adaptation requires a more iterative approach to policymaking and planning. Third, specific institutional reforms were proposed, including: 1) a new funding mechanism that would encourage collaboration between and across different levels of government, as well as promoting partnerships with business and the community; 2) improving community engagement through new resilience grants run by local councils; 3) embedding climate change researchers within disaster risk management agencies to promote institutional learning; and 4) creating an inter-agency network that encourages collaboration between organisations.
Abstract:
This paper discusses the fast-emerging challenges for Malay and Muslim sexual minority storytellers in the face of an aggressive state-sponsored Islamisation of a constitutionally secular Malaysia. I examine the case of Azwan Ismail, a gay Malay and Muslim Malaysian who took part in the local ‘It Gets Better’ Project, initiated in December 2010 by Seksualiti Merdeka (an annual sexuality rights festival), and who suffered an onslaught of hostile comments from fellow Malay Muslims. In this paper, I ask how a message aimed at discouraging suicidal tendencies among sexual minority teenagers can go so wrong. In discussing the contradictions between Azwan’s constructions of self and the expectations others have of him, I highlight the challenges for Azwan’s existential self. For storytellers who are vulnerable if visible, the inevitable sharing of a personal story with unintended and hostile audiences when placed online can have significant repercussions. The purist Sunni Islam agenda in Malaysia not only rejects the human rights of the sexual minority in Malaysia but has influenced, and is often a leading hostile voice in, both regional and international blocs. This self-righteous and supremacist political Islam fosters a more disabling environment for vulnerable, minority communities and their human rights. It creates a harsher reality for the sexual minority that manifests in State-endorsed discrimination, compulsory counselling, forced rehabilitation and their criminalisation. It places the right of the sexual minority to live within such a community in doubt. I draw on existing literature on how personal stories have historically been used to advance human rights. Also included are the significance and implications of the work by social psychologists in explaining this loss of credibility of personal stories. I then advance an analytical framework that will allow storytelling, as a very individual form of witnessing, to reclaim and regain its ‘truth to power’.
Abstract:
Bayesian networks (BNs) provide a statistical modelling framework which is ideally suited for modelling the many factors and components of complex problems such as healthcare-acquired infections. The methicillin-resistant Staphylococcus aureus (MRSA) organism is particularly troublesome since it is resistant to standard treatments for Staph infections. Overcrowding and understaffing are believed to increase infection transmission rates and also to inhibit the effectiveness of disease control measures. Clearly the mechanisms behind MRSA transmission and containment are very complicated and control strategies may only be effective when used in combination. BNs are growing in popularity in general and in medical sciences in particular. A recent Current Contents search of the number of published BN journal articles showed a fivefold increase in general and a sixfold increase in medical and veterinary science from 2000 to 2009. This chapter introduces the reader to Bayesian network (BN) modelling and an iterative modelling approach to build and test the BN created to investigate the possible role of high bed occupancy on transmission of MRSA while simultaneously taking into account other risk factors.
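To make the iterative BN modelling approach concrete, the following is a minimal sketch of a discrete Bayesian network relating bed occupancy and staffing to MRSA transmission, written with the pgmpy library. The network structure, variable names and all probability values are illustrative assumptions for this sketch, not figures from the chapter.

```python
# Minimal illustrative Bayesian network: bed occupancy and staffing as
# parents of MRSA transmission. All probabilities are invented for
# illustration; they are not estimates from the chapter.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("BedOccupancy", "Transmission"),
                         ("Staffing", "Transmission")])

# Priors: state 0 = normal, state 1 = high occupancy / understaffed.
cpd_occupancy = TabularCPD("BedOccupancy", 2, [[0.6], [0.4]])
cpd_staffing = TabularCPD("Staffing", 2, [[0.7], [0.3]])

# P(Transmission | BedOccupancy, Staffing); columns follow the evidence
# order (occupancy, staffing) = (0,0), (0,1), (1,0), (1,1).
cpd_transmission = TabularCPD(
    "Transmission", 2,
    [[0.95, 0.85, 0.80, 0.55],   # no transmission
     [0.05, 0.15, 0.20, 0.45]],  # transmission
    evidence=["BedOccupancy", "Staffing"],
    evidence_card=[2, 2])

model.add_cpds(cpd_occupancy, cpd_staffing, cpd_transmission)
assert model.check_model()

# Query the transmission risk under high bed occupancy.
inference = VariableElimination(model)
print(inference.query(["Transmission"], evidence={"BedOccupancy": 1}))
```

An iterative modelling cycle would revisit such a structure repeatedly: adding risk factors, re-eliciting the conditional probabilities, and re-testing the model's predictions against expert judgement and data.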
Abstract:
Sector-wide interest in Reframe, QUT’s Evaluation Framework, continues, with a number of institutions requesting finer details as QUT embeds the new approach to evaluation across the university in 2013. This interest, both national and international, has warranted a collegial response from QUT: drawing upon its experience of developing Reframe to distil and offer Kaleidoscope back to the sector. The word Reframe is a relevant reference for QUT’s specific re-evaluation, reframing and adoption of a new approach to evaluation, whereas Kaleidoscope reflects the unique lens through which any other institution will need to view its own cultural specificity and local context, via an extensive user-led stakeholder engagement approach, when introducing new approaches to learning and teaching evaluation. Kaleidoscope’s objective is for QUT to develop its research-based stakeholder approach by distilling the successful experience of the Reframe project into a transferable set of guidelines for use by other tertiary institutions across the sector. These guidelines will assist others to design, develop, and deploy their own culturally specific widespread organisational change, informed by stakeholder engagement and organisational buy-in. It is intended that these guidelines will promote, support and enable other tertiary institutions to embark on their own evaluation projects and maximise impact. Kaleidoscope offers an institutional case study of widespread organisational change underpinned by Reframe’s (i) evidence-based methodology; (ii) research, including a published environmental scan, a literature review (Alderman et al., 2012), the development of a conceptual model (Alderman et al., in press 2013), project management principles (Alderman & Melanie, 2012) and national conference peer reviews; and (iii) a year-long strategic project with national outreach to collaboratively engage in the development of a draft set of National Guidelines. Kaleidoscope aims to inform Higher Education evaluation policy development through national stakeholder engagement and the finalisation of the proposed National Guidelines. In conjunction with the conference paper, the authors will present draft Guidelines and a Framework ready for external peer review by evaluation practitioners from the Higher Education sector, as part of Kaleidoscope’s dissemination strategy (Hinton & Gannaway, 2011) applying illuminative evaluation theory (Parlett & Hamilton, 1976) through conference workshops and ongoing discussions (Shapiro et al., 1983; Jacobs, 2000). The initial National Guidelines will be distilled from the Policy, Protocols, and incorporated Business Rules of Reframe: QUT’s Evaluation Framework. It is intended that the outcomes of Kaleidoscope are owned by and reflect sectoral engagement, including iterative evaluation through multiple avenues of dissemination and collaboration across the Higher Education sector. The dissemination strategy, with the inclusion of illuminative evaluation methodology, provides an inclusive opportunity for other institutions and stakeholders across the Higher Education sector to give voice through the information-gathering component of evaluating the draft Guidelines, providing a comprehensive understanding of the complex realities experienced across the Higher Education sector and thereby ‘illuminating’ both the shared and unique lenses and contexts.
This process will enable any final guidelines developed to have broader applicability, greater acceptance, enhanced sustainability and additional relevance for the Higher Education sector, and to support their adoption and adaptation by any single institution for its local context.
Abstract:
Osteocytes are the mature cells within bone and act as its mechanosensors. The mechanical properties of osteocytes play an important role in fulfilling these functions. However, little research has been done to investigate the mechanical deformation properties of single osteocytes. Atomic Force Microscopy (AFM) is a state-of-the-art experimental facility for high-resolution imaging of tissues, cells and other surfaces, as well as for probing the mechanical properties of samples both qualitatively and quantitatively. In this paper, an AFM-based experimental study is first used to obtain force-indentation curves of single round osteocytes. A porohyperelastic (PHE) model of a single osteocyte is then developed, using inverse finite element analysis (FEA) to identify and extract mechanical properties from the experimental results. It has been found that the PHE model is a good candidate for biomechanics studies of osteocytes.
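As a rough illustration of the inverse identification step, the sketch below fits a stiffness parameter to a force-indentation curve by least squares. A closed-form Hertzian contact model stands in for the porohyperelastic finite element forward solve, and the tip radius, Poisson's ratio, synthetic data and modulus values are assumptions invented for this example, not the paper's model or measurements.

```python
# Illustrative inverse fit of an indentation modulus. The Hertz model
# below is a stand-in for the porohyperelastic FE forward model; all
# parameter values are invented for the example.
import numpy as np
from scipy.optimize import least_squares

R = 2.5e-6   # assumed spherical tip radius (m)
NU = 0.5     # assumed Poisson's ratio (incompressible)

def hertz_force(delta, modulus):
    # Hertzian sphere-on-flat contact force for indentation depth delta.
    return (4.0 / 3.0) * (modulus / (1.0 - NU**2)) * np.sqrt(R) * delta**1.5

# Synthetic "experimental" force-indentation curve standing in for AFM data.
rng = np.random.default_rng(0)
delta = np.linspace(0.0, 500e-9, 50)
f_exp = hertz_force(delta, modulus=4000.0) + rng.normal(0.0, 5e-12, delta.size)

# Inverse step: adjust the modulus until the forward model matches the data.
fit = least_squares(lambda p: hertz_force(delta, p[0]) - f_exp,
                    x0=[1000.0], bounds=(0.0, np.inf))
print(f"identified modulus: {fit.x[0]:.0f} Pa")
```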
Abstract:
Articular cartilage is a load-bearing tissue that consists of proteoglycan macromolecules entrapped between collagen fibrils in a three-dimensional architecture. To date, the search for mathematical models to represent the biomechanics of such a system continues without providing a fitting description of its functional response to load at the micro-scale level. We believe that the major complication arose when cartilage was first envisaged as a multiphasic model with distinguishable components, and that quantifying those components and searching for the laws that govern their interaction is inadequate. To the thesis of this paper, cartilage as a bulk is as much a continuum as is the response of its components to external stimuli. For this reason, we framed the fundamental question as to what would be the mechano-structural functionality of such a system in the total absence of one of its key constituents: proteoglycans. To answer this, hydrated normal and proteoglycan-depleted samples were tested under confined compression while finite element models were reproduced, for the first time, based on the structural microarchitecture of the cross-sectional profile of the matrices. These micro-porous in silico models served as virtual transducers, providing an internal, non-invasive probing mechanism beyond experimental capabilities to render the micromechanics of the matrices and several other properties such as permeability and orientation. The results demonstrated that load transfer was closely related to the microarchitecture of the hyperelastic models representing solid skeleton stress and fluid response based on the state of the collagen network, with and without the swollen proteoglycans. In other words, the stress gradient during deformation was a function of the structural pattern of the network and acted in concert with the position-dependent compositional state of the matrix. This reveals that the interaction between indistinguishable components in real cartilage is superimposed by its microarchitectural state, which directly influences macromechanical behavior.
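The load sharing between pore fluid and solid skeleton that the paper probes can be illustrated with the classic one-dimensional consolidation analogue of confined compression. The finite difference sketch below is a generic textbook model with invented parameters, not the paper's micro-FE approach.

```python
# 1D consolidation sketch: a classic poroelastic analogue of confined
# compression, showing load transfer from pore fluid to the solid
# skeleton over time. Parameters are illustrative only and unrelated
# to the paper's micro-FE cartilage models.
import numpy as np

c_v = 1e-9      # assumed consolidation coefficient (m^2/s)
height = 1e-3   # assumed sample thickness (m), drained at the top
n, dt, steps = 50, 0.05, 20000

z = np.linspace(0.0, height, n)
dz = z[1] - z[0]
p = np.ones(n)  # normalized excess pore pressure carries the load at t=0

for _ in range(steps):
    lap = np.zeros(n)
    lap[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dz**2
    p += dt * c_v * lap
    p[0] = 0.0     # drained boundary: fluid escapes, pressure vanishes
    p[-1] = p[-2]  # impermeable base: zero-flux mirror condition

# As pore pressure dissipates, the solid skeleton picks up the load.
print(f"mean normalized pore pressure remaining: {p.mean():.3f}")
```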
Abstract:
The objectives of this chapter are to examine the contribution of public spaces to the wellbeing of young people, and how participatory action research can be used as a process tool for contextually responsive, collaborative, iterative and multi-method community-level practice. The addition of a spatial frame to practice can open up unexpected alliances and opportunities for enhancing people's wellbeing.
Abstract:
Asking why is an important foundation of inquiry and fundamental to the development of reasoning skills and learning. Despite this, and despite the relentless and often disruptive nature of innovations in information and communications technology (ICT), sophisticated tools that directly support this basic act of learning appear to be undeveloped, not yet recognized, or in the very early stages of development. Why is this so? To this question, there is no single satisfactory answer; instead, numerous plausible explanations and related questions arise. After learning something, however, explaining why can be revealing of a person’s understanding (or lack of it). What then differentiates explanation from information; and, explanatory from descriptive content? What ICT scaffolding might support inquiry instigated by why-questioning? What is the role of reflective practice in inquiry-based learning? These and other questions have emerged from this investigation and underscore that why-questions often propagate further questions and are a catalyst for cognitive engagement and dialogue. This paper reports on a multi-disciplinary, theoretical investigation that informs the broad discourse on e-learning and points to a specific frontier for design and development of e-learning tools. Probing why reveals that versatile and ambiguous semantics present the core challenge – asking, learning, knowing, understanding, and explaining why.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which can give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as authoritative sources. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, as with American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect tweets and make them available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
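As an illustration of the content analysis and user profiling filters described in the first paper, the sketch below scores incoming tweets and keeps only as many as responders can handle. The keywords, weights and account handles are hypothetical.

```python
# Toy triage filter for crisis tweets: combines a keyword-based content
# score with a simple user-profile score. All keywords, weights, and
# handles are hypothetical.
from dataclasses import dataclass

CRISIS_KEYWORDS = {"flood": 3.0, "trapped": 5.0, "evacuate": 4.0, "injured": 4.0}
AUTHORITATIVE_USERS = {"emergency_ops", "city_council"}  # hypothetical handles

@dataclass
class Tweet:
    user: str
    text: str
    followers: int

def score(tweet: Tweet) -> float:
    text = tweet.text.lower()
    content = sum(w for kw, w in CRISIS_KEYWORDS.items() if kw in text)
    # Profile score: known authorities rank highest; otherwise use the
    # follower count as a weak proxy for being an information amplifier.
    profile = 5.0 if tweet.user in AUTHORITATIVE_USERS else min(tweet.followers / 10_000, 2.0)
    return content + profile

def triage(tweets: list[Tweet], capacity: int) -> list[Tweet]:
    # Keep only the highest-scoring tweets, matched to responder capacity.
    return sorted(tweets, key=score, reverse=True)[:capacity]

stream = [Tweet("emergency_ops", "Evacuate low-lying areas now", 90_000),
          Tweet("resident42", "My street is flooded, two people trapped", 150),
          Tweet("brand_promo", "Flash sale today only!", 30_000)]
for t in triage(stream, capacity=2):
    print(t.user, "->", round(score(t), 1))
```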
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. To this end, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of effectively retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, to improve the accuracy and efficiency of WebPut. Furthermore, several optimization techniques are proposed to reduce the cost of estimating the confidence of imputation queries at both the tuple level and the database level. Experiments based on several real-world data collections demonstrate not only the effectiveness of WebPut compared to existing approaches, but also the efficiency of our proposed algorithms and optimization techniques.
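The greedy, confidence-ordered scheduling idea can be sketched as follows. Here a static lookup table stands in for WebPut's web search and extraction machinery, and all records, keys and confidence values are invented for illustration.

```python
# Toy version of confidence-ordered greedy imputation. A static lookup
# table stands in for WebPut's web queries; records, keys, and
# confidences are all invented for illustration.

# (known value, missing attribute) -> (candidate value, confidence)
KB = {("Brisbane", "country"): ("Australia", 0.95),
      ("Brisbane", "timezone"): ("AEST", 0.80),
      ("Zurich", "country"): ("Switzerland", 0.90)}

def best_imputation(record, attr):
    # Formulate the "query" from a known attribute (here: the city).
    return KB.get((record.get("city"), attr), (None, 0.0))

def greedy_impute(records):
    missing = [(r, a) for r in records for a, v in list(r.items()) if v is None]
    while missing:
        # Score every outstanding missing value and fill the most
        # confident one first; earlier fills can enable later queries.
        scored = [(best_imputation(r, a), r, a) for r, a in missing]
        (value, conf), record, attr = max(scored, key=lambda s: s[0][1])
        if value is None:
            break  # nothing left that any query can answer
        record[attr] = value
        missing.remove((record, attr))

rows = [{"city": "Brisbane", "country": None, "timezone": None},
        {"city": "Zurich", "country": None}]
greedy_impute(rows)
print(rows)
```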
Abstract:
There is currently a wide range of research into the recent introduction of student response systems in higher education and tertiary settings (Banks 2006; Kay and Le Sange 2009; Beatty and Gerace 2009; Lantz 2010; Sprague and Dahl 2009). However, most of this pedagogical literature has generated ‘how to’ approaches regarding the use of ‘clickers’, keypads, and similar response technologies. There are currently no systematic reviews of the effectiveness of ‘GoSoapBox’, a more recent and increasingly popular student response system, regarding its capacity to enhance critical thinking and achieve sustained learning outcomes. With rapid developments in teaching and learning technologies across all undergraduate disciplines, there is a need to obtain comprehensive, evidence-based advice on these types of technologies, their uses, and overall efficacy. This paper addresses this current gap in knowledge. Our teaching team, in an undergraduate Sociology and Public Health unit at the Queensland University of Technology (QUT), introduced GoSoapBox as a mechanism for discussing controversial topics, such as sexuality, gender, economics, religion, and politics during lectures, and for taking opinion polls on social and cultural issues affecting human health. We also used this new teaching technology to allow students to interact with each other during class, on both social and academic topics, and to generate discussions and debates during lectures. The paper reports on a data-driven study into how this interactive online tool worked to improve engagement and the quality of academic work produced by students. This paper will, firstly, cover the recent literature reviewing student response systems in tertiary settings. Secondly, it will outline the theoretical framework used to generate this pedagogical research. In keeping with the social and collaborative features of Web 2.0 technologies, Bandura’s Social Learning Theory (SLT) is applied here to investigate the effectiveness of GoSoapBox as an online tool for improving learning experiences and the quality of academic output by students. Bandura has emphasised the Internet as a tool for ‘self-controlled learning’ (Bandura 2001), as it provides the education sector with an opportunity to reconceptualise the relationship between learning and thinking (Glassman & Kang 2011). Thirdly, we describe the methods used to implement GoSoapBox in our lectures and tutorials, which aspects of the technology we drew on for learning purposes, and the methods for obtaining feedback from students about the effectiveness or otherwise of this tool. Fourthly, we report findings from an examination of all student and staff activity on GoSoapBox, as well as reports from students about its benefits and limitations as a learning aid. We then present a theoretical model, produced via an iterative analytical process between SLT and our data analysis, for use by academics and teachers across the undergraduate curriculum. The model has implications for all teachers considering the use of student response systems to improve the learning experiences of their students. Finally, we consider some of the negative aspects of GoSoapBox as a learning aid.
Abstract:
In this paper, a model-predictive control (MPC) method is detailed for the control of nonlinear systems with stability considerations. It is assumed that the plant is described by a local input/output ARX-type model, with the control potentially included in the premise variables, which enables the control of systems that are nonlinear in both the state and the control input. Additionally, for the case of set-point regulation, a suboptimal controller is derived which has the dual purpose of ensuring stability and enabling finite-iteration termination of the iterative procedure used to solve the nonlinear optimization problem that determines the control signal.
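For a flavour of the receding-horizon optimization involved, here is a generic sketch of nonlinear MPC for a toy ARX-type model. The model coefficients, cost weights and horizon are assumptions, and the paper's stability constraints and suboptimal controller are not reproduced.

```python
# Generic receding-horizon MPC sketch for a toy nonlinear ARX model.
# The model, weights, and horizon are illustrative assumptions; this
# omits the paper's stability constraints and suboptimal controller.
import numpy as np
from scipy.optimize import minimize

def arx_step(y, u):
    # Toy nonlinear ARX-type plant: nonlinear in the state.
    return 0.8 * y - 0.2 * y**3 + 0.5 * u

def horizon_cost(u_seq, y0, ref, weight=0.1):
    # Sum of squared tracking error plus a control-effort penalty.
    y, cost = y0, 0.0
    for u in u_seq:
        y = arx_step(y, u)
        cost += (y - ref) ** 2 + weight * u ** 2
    return cost

def mpc_control(y0, ref, horizon=10):
    # Solve the finite-horizon problem, apply only the first input.
    result = minimize(horizon_cost, np.zeros(horizon), args=(y0, ref))
    return result.x[0]

y = 1.5  # initial condition
for _ in range(20):
    y = arx_step(y, mpc_control(y, ref=0.0))
print(f"state after 20 steps: {y:.4f}")
```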
Abstract:
Introduction Multidisciplinary models of organising and providing care have been proposed to decrease the health services gap between urban and rural populations, but health workforce shortages exist across most professions and are further exacerbated by maldistribution. Flexibility and expansion of the range of tasks that a health professional can undertake have been proposed. Dispensing doctors (DDs) are one such example: as part of their routine medical practice, DDs are able to both prescribe and dispense medicines to their patients. The granting of a dispensing licence to a doctor is intended to improve rural community access to medicines where there is no pharmacy within a reasonable distance. Method An iterative, qualitative descriptive methodology was used to identify factors which influenced DDs’ practice. Qualitative data were collected by in-depth face-to-face and telephone interviews with DDs. A combination of processes, qualitative content analysis and constant comparison, was used to analyse the interview transcripts thematically. Member checking and separate coding were utilised to ensure rigour. Results Thirty-one interviews were conducted. The respondents universally acknowledged that the main reasons for dispensing were the convenience and benefit of their patients and to ensure continuity of care. DDs’ communities were generally more isolated and smaller when compared to those of their non-dispensing counterparts. DD respondents viewed their dispensary as a service to the community. Peer pressure on prescribing was a key factor in self-regulating prescribing and dispensing. Conclusion DDs fulfil an important area of unmet need by providing continuity of pharmaceutical care, but the practice is hindered by significant barriers.