859 results for Real-world Context
Abstract:
A theoretical basis is required for comparing key features and critical elements in wild fisheries and aquaculture supply chains under a changing climate. Here we develop a new quantitative metric that is analogous to indices used to analyse food-webs and identify key species. The Supply Chain Index (SCI) identifies critical elements as those with both large throughput rates and high connectivity. The sum of the scores for a supply chain provides a single metric that roughly captures both the resilience and connectedness of a supply chain. Standardised scores can facilitate cross-comparisons both under current conditions and under a changing climate. Identification of key elements along the supply chain may assist in informing adaptation strategies to reduce anticipated future risks posed by climate change. The SCI also provides information on the relative stability of different supply chains based on whether there is a fairly even spread in the individual scores of the top few key elements, compared with a more critical dependence on a few key individual supply chain elements. We use as a case study the Australian southern rock lobster Jasus edwardsii fishery, which is challenged by a number of climate change drivers such as impacts on recruitment and growth due to changes in large-scale and local oceanographic features. The SCI identifies airports, processors and Chinese consumers as the key elements in the lobster supply chain that merit attention to enhance stability and potentially enable growth. We also apply the index to four additional real-world Australian commercial fishery supply chains and two aquaculture industry supply chains to highlight the utility of a systematic method for describing supply chains. Overall, our simple methodological approach to empirically based supply chain research provides an objective method for comparing the resilience of supply chains and highlighting components that may be critical.
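The abstract does not state the SCI formula, so the following is only a minimal sketch of the kind of computation it describes: score each supply chain element by its throughput and connectivity, then sum the element scores into a single chain-level metric. The graph, the weights, and the element-score rule (throughput times degree) are illustrative assumptions, not the published index.

```python
# Illustrative SCI-style scoring (assumed form: element score = throughput x
# connectivity; the published index may differ). Chain and weights are made up.
import networkx as nx

# Hypothetical lobster supply chain; edge weights are annual throughput (tonnes).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("fishers", "processors", 1200),
    ("processors", "airports", 1100),
    ("airports", "Chinese consumers", 1000),
    ("processors", "domestic retail", 100),
])

def element_score(g, node):
    """Throughput passing through the node times its connectivity (link count)."""
    throughput = g.in_degree(node, weight="weight") + g.out_degree(node, weight="weight")
    return throughput * g.degree(node)

scores = {n: element_score(G, n) for n in G}
print(sorted(scores.items(), key=lambda kv: -kv[1]))  # key elements first
print("chain-level SCI:", sum(scores.values()))       # single summary metric
```

Under these made-up weights, ranking the element scores plays the role the paper describes: surfacing the few elements (processors, airports, end consumers) on which the chain critically depends.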
Abstract:
BACKGROUND Many koala populations around Australia are in serious decline, with a substantial component of this decline in some Southeast Queensland populations attributed to the impact of Chlamydia. A Chlamydia vaccine for koalas is in development and has shown promise in early trials. This study contributes to implementation preparedness by simulating vaccination strategies designed to reverse population decline and by identifying which age and sex category it would be most effective to target. METHODS We used field data to inform the development and parameterisation of an individual-based stochastic simulation model of a koala population in which Chlamydia is endemic. The model took into account transmission, morbidity and mortality caused by Chlamydia infections. We calibrated the model to characteristics of typical Southeast Queensland koala populations. As there is uncertainty about the effectiveness of the vaccine in real-world settings, a variety of potential vaccine efficacies, half-lives and dosing schedules were simulated. RESULTS Assuming other threats remain constant, current population declines could be reversed in around 5-6 years if female koalas aged 1-2 years are targeted, average vaccine protective efficacy is 75%, and vaccine coverage is around 10% per year. At lower vaccine efficacies the immunological effects of boosting become important: at 45% vaccine efficacy, population decline is predicted to reverse in 6 years under optimistic boosting assumptions but in 9 years under pessimistic boosting assumptions. Terminating a successful vaccination programme at 5 years would lead to a rise in Chlamydia prevalence towards pre-vaccination levels. CONCLUSION For a range of vaccine efficacy levels it is projected that population decline due to endemic Chlamydia can be reversed under realistic dosing schedules, potentially in just 5 years. However, a vaccination programme might need to continue indefinitely in order to maintain Chlamydia prevalence at a sufficiently low level for population growth to continue.
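The study's calibrated model is not reproduced in the abstract; the sketch below only illustrates the general shape of an individual-based stochastic simulation with targeted vaccination. Every rate, age class and parameter value is an illustrative placeholder rather than the Southeast Queensland calibration, and vaccine protection is modelled crudely as all-or-nothing with no waning or boosting.

```python
# Toy individual-based stochastic simulation of endemic Chlamydia with
# vaccination of 1-2 year old females. All parameters are placeholders,
# not the study's calibrated values.
import random

INFECT_SCALE, DISEASE_DEATH_P, BASE_DEATH_P, BIRTH_P = 0.75, 0.10, 0.05, 0.35
VACC_COVERAGE, VACC_EFFICACY = 0.10, 0.75  # yearly coverage; protective efficacy

class Koala:
    def __init__(self, age=0):
        self.age, self.female = age, random.random() < 0.5
        self.infected, self.protected = False, False

def step(pop):
    prevalence = sum(k.infected for k in pop) / max(len(pop), 1)
    nxt = []
    for k in pop:
        # Target the age/sex category the paper identifies: females aged 1-2.
        if k.female and 1 <= k.age <= 2 and random.random() < VACC_COVERAGE:
            k.protected = random.random() < VACC_EFFICACY  # all-or-nothing
        if not k.infected and not k.protected and random.random() < INFECT_SCALE * prevalence:
            k.infected = True
        if random.random() < BASE_DEATH_P + (DISEASE_DEATH_P if k.infected else 0):
            continue  # death removes the animal
        k.age += 1
        nxt.append(k)
        if k.female and 2 <= k.age <= 10 and not k.infected and random.random() < BIRTH_P:
            nxt.append(Koala())  # uninfected joey
    return nxt

pop = [Koala(age=random.randint(0, 10)) for _ in range(500)]
for k in random.sample(pop, 150):
    k.infected = True
for year in range(1, 11):
    pop = step(pop)
    print(year, "population:", len(pop), "infected:", sum(k.infected for k in pop))
```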
Abstract:
Developing innovative library services requires a real-world understanding of faculty members' desired curricular goals. This study aimed to develop a comprehensive and deeper understanding of Purdue's nutrition science and political science faculties' expectations for student learning related to information and data information literacies. Course syllabi were examined using grounded theory techniques, which not only allowed us to identify how faculty were addressing information and data information literacies in their courses, but also enabled us to understand the interconnectedness of these literacies to other departmental intentions for student learning, such as developing a professional identity or learning to conduct original research. The holistic understanding developed through this research provides the necessary information for designing and suggesting information literacy and data information literacy services to departmental faculty in ways supportive of curricular learning outcomes.
Abstract:
This paper explores the changing employment expectations that frame the early professional work experiences of young planners in Australia. In particular, it considers the rising popularity of pre-graduation professional work experience as a precursor to formal entry into the workforce as a practising planner. This shift is being driven in part by employer expectations that graduates will already have ‘real world’ and relevant work experience. However, an equally significant driver appears to be a growing desire for early career and graduate planners to find ways to distinguish themselves from their peers in an increasingly tight labour market. Using data from an ongoing research project into the formative work experiences of young people, this paper describes the three main types of pre-graduation professional work experience undertaken by young planners. It highlights the potential challenges and benefits of pre-graduation work experience from a legal, social and ethical perspective, as well as from the perspective of young planners themselves. The paper concludes by reflecting on the role of the planning profession – employers, peak bodies and planning educators – in managing the tensions between producing ‘work ready’ graduates and safeguarding the employment conditions of early career planning professionals.
Abstract:
Document clustering is one of the prominent methods for mining important information from the vast amount of data available on the web. However, document clustering generally suffers from the curse of dimensionality. Fortunately, in high-dimensional space data points tend to be concentrated in some regions of clusters. We take advantage of this phenomenon by introducing a novel concept of dynamic cluster representation, termed loci. Clusters’ loci are efficiently calculated using documents’ ranking scores generated from a search engine. We propose a fast loci-based semi-supervised document clustering algorithm that uses clusters’ loci instead of conventional centroids for assigning documents to clusters. Empirical analysis on real-world datasets shows that the proposed method produces cluster solutions of promising quality and is substantially faster than several benchmarked centroid-based semi-supervised document clustering methods.
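The paper's exact locus computation is not given in the abstract; one plausible reading, sketched below, is that a cluster's locus is a rank-weighted centroid restricted to its highest-scoring documents, with documents then assigned by cosine similarity to the nearest locus. The vectors, scores and top-k rule here are illustrative assumptions, not the published algorithm.

```python
# One plausible locus computation: a rank-weighted centroid over a cluster's
# top-scoring documents. Vectors, scores and top_k are illustrative assumptions.
import numpy as np

def cluster_locus(doc_vectors, rank_scores, top_k=10):
    """Weight documents by search-engine ranking score and keep only the
    densest (top-scoring) part of the cluster, unlike a plain centroid."""
    order = np.argsort(rank_scores)[::-1][:top_k]
    w = np.asarray(rank_scores)[order]
    return (doc_vectors[order] * w[:, None]).sum(axis=0) / w.sum()

def assign(docs, loci):
    """Assign each document to the cluster whose locus is most cosine-similar."""
    dn = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    cn = loci / np.linalg.norm(loci, axis=1, keepdims=True)
    return (dn @ cn.T).argmax(axis=1)

rng = np.random.default_rng(0)
docs = rng.random((100, 50))                   # toy TF-IDF-like vectors
seeds = {0: slice(0, 10), 1: slice(10, 20)}    # semi-supervised seed documents
loci = np.stack([cluster_locus(docs[s], rng.random(10)) for s in seeds.values()])
print(assign(docs, loci)[:20])
```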
Abstract:
From Kurt Vonnegut to Stephen King, many novelists use metanarrative techniques to insert fictional versions of themselves in the stories they tell. The function of deploying such techniques is often to draw attention to the liminal space between the fictional constructs inherent in the novel as a form and the real world from which the constructs draw inspiration, and in which, indeed, they are read by an audience. For emerging writers working in short form narratives, however, the structural demands of the short story or flash fiction limit the depth to which similar techniques can be deployed. Slow Napalm, the first in a series of short stories, works to overcome the structural limitations of a succinct form by developing a fractured fictional version of the author over a number of pieces published across a range of sites. The cumulative effect is a richer metanarrative textual arrangement that also allows the individual short stories to function independently.
Abstract:
A major question in current network science is how to understand the relationship between the structure and functioning of real networks. Here we present a comparative network analysis of 48 wasp and 36 human social networks. We compared the centralisation and small-world character of these interaction networks and studied how these properties change over time. We compared the interaction networks of (1) two congeneric wasp species (Ropalidia marginata and Ropalidia cyathiformis), (2) the queen-right (with the queen) and queen-less (without the queen) networks of wasps, (3) the four network types obtained by combining (1) and (2) above, and (4) wasp networks with the social networks of children in 36 classrooms. We found perfect (100%) centralisation in a queen-less wasp colony and nearly perfect centralisation in several other queen-less wasp colonies; a perfectly centralised interaction network is highly unusual in the literature on real-world networks. Differences between the interaction networks of the two wasp species are smaller than differences between the networks describing their different colony conditions. Also, the differences between different colony conditions are larger than the differences between wasp and children networks. For example, the structure of queen-right R. marginata colonies is more similar to children's social networks than to that of their queen-less colonies. We conclude that network architecture depends more on the functioning of the particular community than on taxonomic differences (either between two wasp species or between wasps and humans).
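The abstract does not define the centralisation measure; a standard index consistent with "perfect (100%) centralisation" is Freeman degree centralisation, which reaches 1 exactly when one individual interacts with everyone else (a star). A minimal sketch, assuming this measure (the paper may use a different index):

```python
# Freeman degree centralisation: 1.0 for a star (one individual interacting
# with all others, as in the perfectly centralised queen-less colony) and 0.0
# for a clique. Assumed measure; the paper may use a different index.
import networkx as nx

def degree_centralisation(g):
    n = g.number_of_nodes()
    degrees = [d for _, d in g.degree()]
    return sum(max(degrees) - d for d in degrees) / ((n - 1) * (n - 2))

star = nx.star_graph(9)         # 10 individuals, one hub
clique = nx.complete_graph(10)  # everyone interacts with everyone
print(degree_centralisation(star))    # 1.0 -> "perfect (100%) centralisation"
print(degree_centralisation(clique))  # 0.0 -> fully decentralised
```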
Abstract:
From Kurt Vonnegut to Stephen King, many novelists use metanarrative techniques to insert fictional versions of themselves in the stories they tell. The function of deploying such techniques is often to draw attention to the liminal space between the fictional constructs inherent in the novel as a form and the real world from which the constructs draw inspiration, and in which, indeed, they are read by an audience. For emerging writers working in short form narratives, however, the structural demands of the short story or flash fiction limit the depth to which similar techniques can be deployed. 'The Joke', the second in a series of short stories, works to overcome the structural limitations of a succinct form by developing a fractured fictional version of the author over a number of pieces published across a range of sites. The cumulative effect is a richer metanarrative textual arrangement that also allows the individual short stories to function independently.
Abstract:
From Kurt Vonnegut to Stephen King, many novelists use metanarrative techniques to insert fictional versions of themselves in the stories they tell. The function of deploying such techniques is often to draw attention to the liminal space between the fictional constructs inherent in the novel as a form and the real world from which the constructs draw inspiration, and in which, indeed, they are read by an audience. For emerging writers working in short form narratives, however, the structural demands of the short story or flash fiction limit the depth to which similar techniques can be deployed. Experimental People, the third in a series of short stories, works to overcome the structural limitations of a succinct form by developing a fractured fictional version of the author over a number of pieces published across a range of sites. The cumulative effect is a richer metanarrative textual arrangement that also allows the individual short stories to function independently.
Abstract:
With the introduction of the PCEHR (Personally Controlled Electronic Health Record), the Australian public is being asked to accept greater responsibility for the management of their health information. However, the implementation of the PCEHR has seen poor adoption rates, underscored by criticism from stakeholders with concerns about transparency, accountability, privacy, confidentiality, governance, and limited capabilities. This study adopts an ethnographic lens to observe how information is created and used during the patient journey, and the social factors impacting the adoption of the PCEHR at the micro-level, in order to develop a conceptual model that will encourage the sharing of patient information within the cycle of care. Objective: This study aims, firstly, to establish a basic understanding of healthcare professional attitudes toward a national platform for sharing patient summary information in the form of a PCEHR, and secondly, to map the flow of patient-related information as it traverses a patient’s personal cycle of care. An ethnographic approach was therefore used to bring a “real world” lens to information flow in a series of case studies in the Australian healthcare system and to discover themes and issues that are important from the patient’s perspective. Design: Qualitative study utilising ethnographic case studies. Setting: Case studies were conducted with primary and allied healthcare professionals located in Brisbane, Queensland, between October 2013 and July 2014. Results: In the first dimension, healthcare professionals’ concerns about trust and medico-legal issues related to patient control and information quality, together with the lack of clinical value available with the PCEHR, emerged as significant barriers to use. The second dimension of the study, which mapped patient information flow, identified information quality issues, clinical workflow inefficiencies and interoperability misconceptions, resulting in duplication of effort, unnecessary manual processes, data quality and integrity issues and an over-reliance on the understanding and communication skills of the patient. Conclusion: Opportunities for process efficiencies, improved data quality and increased patient safety emerge with the adoption of an appropriate information sharing platform. More importantly, large-scale eHealth initiatives must be aligned with the value proposition of individual stakeholders in order to achieve widespread adoption. Leveraging the Australian national eHealth infrastructure and the PCEHR, we offer a practical example of a service-driven digital ecosystem suitable for co-creating value in healthcare.
Abstract:
Electric-motored personal mobility devices (PMDs) are appearing on Australian roads. While legal to import and own, their use by adult riders within the road transport system is typically illegal. However, these devices could provide an answer to traffic congestion by getting people out of cars for short trips (“first-and-last mile” travel). City of Ryde council, Macquarie University, and Transport for NSW examined PMD use within the road transport system. Stage 1 of the project examined PMD use within a controlled pedestrian environment on the Macquarie University campus. Three PMD categories were used: one-wheelers (an electric unicycle, the Solowheel); two-wheelers (an electric scooter, the Egret); and three-wheelers (the Qugo). The two-wheeled PMD was most effective in terms of flexibility, whereas the three-wheeled PMD was most effective in terms of speed. One-wheeled PMD riders were very satisfied with their device, especially at speed, but significant training and practice were required. Two-wheeled PMD riders had less difficulty navigating through pedestrian precincts and favoured the manoeuvrability of the device, as its relative narrowness made it easier to use on a diversity of path widths. The usability of all PMDs was compromised by the weight of the devices, difficulties in ascending steeper gradients, portability, and parking. This was a limited trial, with a small number of participants and within a unique environment. However, agreement has been reached for a Stage 2 extension into the Macquarie Park business precinct for further real-world trials within a fully functional road transport system.
Abstract:
Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks (graphs with many thousands of nodes, where an undirected edge between two nodes does not indicate the direction of influence) and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data. Results: The structure learning task is cast as a sparse linear regression problem, which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a Linear Program (LP). A bound on the Generalization Error of this approach is given in terms of the Leave-One-Out Error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first and/or second ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known regulatory associations. In each S. cerevisiae LP-SLGN, the number of nodes with a particular degree follows an approximate power law, suggesting that its degree distribution is similar to those observed in real-world networks. Inspection of these LP-SLGNs suggests biological hypotheses amenable to experimental verification. Conclusion: A statistically robust and computationally efficient LP-based method for estimating the topology of a large sparse undirected graph from high-dimensional data yields representations of genetic networks that are biologically plausible and useful abstractions of the structures of real genetic networks. Analysis of the statistical and topological properties of learned LP-SLGNs may have practical value; for example, genes with high random-walk betweenness, a measure of the centrality of a node in a graph, are good candidates for intervention studies and hence for integrated computational-experimental investigations designed to infer more realistic and sophisticated probabilistic directed graphical model representations of genetic networks. The LP-based solutions of the sparse linear regression problem described here may provide a method for learning the structure of transcription factor networks from transcript profiling and transcription factor binding motif data.
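The paper solves the LASSO problem via a linear program; the sketch below substitutes a generic coordinate-descent LASSO solver for the LP formulation but follows the same per-gene recipe implied by the abstract: regress each gene on all others and keep nonzero coefficients as undirected edges. The data and regularisation strength are toy placeholders.

```python
# Per-gene sparse regression: regress each gene on all others and keep nonzero
# coefficients as undirected edges. Uses scikit-learn's coordinate-descent
# LASSO rather than the paper's linear-programming formulation; data are toy.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 200))  # 60 transcript profiles x 200 genes

def lasso_network(X, alpha=0.05):
    n_genes = X.shape[1]
    edges = set()
    for j in range(n_genes):
        others = np.delete(np.arange(n_genes), j)
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        for k in others[np.abs(coef) > 1e-6]:
            edges.add(frozenset((j, int(k))))  # undirected: drop direction
    return edges

print(len(lasso_network(X)), "edges in the sparse undirected network")
```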
Abstract:
Purpose – To date, business models have remained the creation of management; however, the authors believe that designers should be critically approaching, challenging and creating new business models as part of their practice. This belief points to a new era in which business model constructs become the design brief of the future and fuel design and innovation to work together at the strategic level of an organisation. Design/methodology/approach – The purpose of this paper is to explore and investigate business model design. The research followed a deductive, structured qualitative content analysis approach utilizing a predetermined categorization matrix. The analysis of forty business cases uncovered commonalities in the key strategic drivers behind these innovative business models. Findings – Five business model typologies were derived from this content analysis, from which quick prototypes of new business models can be created. Research limitations/implications – This research suggests there is no “one right” model; rather, through experimentation, the generation of many unique and diverse concepts can result in greater possibilities for future innovation and sustained competitive advantage. Originality/value – This paper builds upon the emerging research into, and exploration of, the importance and relevance of dynamic, design-driven approaches to the creation of innovative business models. It aims to synthesize knowledge gained from real-world examples into a tangible, accessible and provocative framework that provides new prototyping templates to aid the process of business model experimentation.
Abstract:
This thesis is a comparative case study in Japanese video game localization for the video games Sairen, Sairen 2 and Sairen Nyûtoransurêshon, and the English-language localized versions of the same games as published in Scandinavia and Australia/New Zealand. All games are developed by Sony Computer Entertainment Inc. and published exclusively for PlayStation 2 and PlayStation 3 consoles. The fictional world of the Sairen games draws much influence from Japanese history, as well as from popular and contemporary culture, and in doing so caters mainly to a Japanese audience. For localization, i.e. the adaptation of a product to make it accessible to users outside the market it was originally intended for, this is a challenging issue. Video games are media of entertainment, and therefore localization practice must preserve the games’ effects on the players’ emotions. Further, video games are digital products comprised of a multitude of distinct elements, some of which are part of the game world, while others regulate the connection between the player as part of the real world and the game as a digital medium. As a result, video game localization is also a practice that has to cope with the technical restrictions inherent to the medium. The main theory used throughout the thesis is Anthony Pym’s framework for localization studies, which considers the user of the localized product a defining part of the localization process. This concept presupposes that localization is an adaptation performed to make a product better suited for use in a specific reception situation. Pym also addresses the fact that certain products may resist distribution into certain reception situations because of their content, and that certain aspects of localization aim to reduce this resistance through significant alterations of the original product. While Pym developed his ideas mainly with regular software in mind, they can also be adapted well to the study of video games from a localization angle. Since modern video games are highly complex entities that often switch between interactive and non-interactive modes, Pym’s ideas are adapted throughout the thesis to suit the particular elements being studied. Instances analyzed in this thesis include menu screens, video clips, in-game action and websites. The main research questions focus on how the games’ rules influence localization, and how the games’ fictional domain influences localization. Because there are so many peculiarities inherent to the medium of the video game, other theories are introduced as well to complement the research at hand. These include Lawrence Venuti’s discussions of foreignizing and domesticating translation methods for literary translation, and Jesper Juul’s definition of games. Additionally, knowledge gathered from interviews with video game localization professionals in Japan during September and October 2009 is also utilized in this study. Apart from answering the aforementioned research questions, one of this thesis’ aims is to enrich the still rather small field of game localization studies, and in particular the study of Japanese video games, one of Japan’s most successful cultural exports.
Abstract:
The methodology of designing normative terminological products has been described in several guides and international standards. However, this methodology is not always applicable to designing translation-oriented terminological products which differ greatly from normative ones in terms of volume, function, and primary target group. This dissertation has three main goals. The first is to revise and enrich the stock of concepts and terms required in the process of designing an LSP dictionary for translators. The second is to detect, classify, and describe the factors which determine the characteristics of an LSP dictionary for translators and affect the process of its compilation. The third goal is to provide recommendations on different aspects of dictionary design. The study is based on an analysis of dictionaries, dictionary reviews, literature on translation-oriented lexicography, material from several dictionary projects, and the results of questionnaires. Thorough analysis of the concept of a dictionary helped us to compile a list of designable characteristics of a dictionary. These characteristics include target group, function, links to other resources, data carrier, list of lemmata, information about the lemmata, composition of other parts of the dictionary, compression of the data, structure of the data, and access structure. The factors which determine the characteristics of a dictionary have been divided into those derived from the needs of the intended users and those reflecting the restrictions of the real world (e.g. characteristics of the data carrier and organizational factors) and attitudes (e.g. traditions and scientific paradigms). The designer of a dictionary is recommended to take the intended users' needs as the starting point and aim at finding the best compromise between the conflicting factors. When designing an LSP dictionary, much depends on the level of knowledge of the intended users about the domain in question as well as their general linguistic competence, LSP competence, and lexicographic competence. This dissertation discusses the needs of LSP translators and the role of the dictionary in the process of translation of an LSP text. It also emphasizes the importance of planning lexicographic products and activities, and addresses many practical aspects of dictionary design.