122 results for one-to-many mapping
Abstract:
This study asks the central question, ‘Are social entrepreneurs using foresight to create innovation based on triple bottom line sustainability measures?’ and, if so, how? Sustainability is the emerging criterion for evaluating many aspects of the social world, including corporate governance, health systems, economics, social welfare and the environment. At the same time, innovation is one of the key factors in the constitution of our social worlds, be this legislative, organisational, social or technical change. Therefore, it appears that the drive toward sustainability should be coupled with an emphasis on innovation – in particular, creating innovation toward sustainability. Yet unexamined assumptions lie behind such language. Sustainability is a concept situated within the context of ‘the future’, requiring one to ask ‘what is the future?’ – in essence, a utilisation of the strategic capacity for foresight. Foresight, moreover, ranges from the tacit, assumed personal foresight of the ordinary individual to the specialised foresight of the professional forecaster, scenario planner or foresight practitioner.
Abstract:
It is acknowledged around the world that many university students struggle with learning to program (McCracken et al., 2001; McGettrick et al., 2005). In this paper, we describe how we have developed a research programme to systematically study and incrementally improve our teaching. We have adopted a research programme with three elements: (1) a theory that provides an organising framework for defining the types of phenomena and data of interest, (2) data on how the class as a whole performs on formative assessment tasks framed from within the organising framework, and (3) data from one-on-one think-aloud sessions to establish why students struggle with some of those in-class formative assessment tasks. We teach introductory computer programming, but this three-element structure of our research is applicable to many areas of engineering education research.
Abstract:
A whole-genome scan was conducted to map quantitative trait loci (QTL) for BSE resistance or susceptibility. Cows from four half-sib families were included and 173 microsatellite markers were used to construct a 2835-cM (Kosambi) linkage map covering 29 autosomes and the pseudoautosomal region of the sex chromosome. Interval mapping by linear regression was applied and extended to a multiple-QTL analysis approach that used identified QTL on other chromosomes as cofactors to increase mapping power. In the multiple-QTL analysis, two genome-wide significant QTL (BTA17 and X/Y ps) and four genome-wide suggestive QTL (BTA1, 6, 13, and 19) were revealed. The QTL identified here using linkage analysis do not overlap with regions previously identified using TDT analysis. One factor that may explain the disparity between the results is that a more extensive data set was used in the present study. Furthermore, methodological differences between TDT and linkage analyses may affect the power of these approaches.
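The abstract above describes interval mapping by linear regression, in which the phenotype is regressed on the probability of inheriting a particular sire allele at each tested map position. The sketch below is only a simplified illustration of that idea, with hypothetical positions and data; a real analysis would use an F-statistic and permutation-derived genome-wide significance thresholds rather than the raw regression slope.

```python
def ols_slope(x, y):
    """Least-squares slope and intercept of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def scan_positions(prob_by_position, phenotypes):
    """At each map position, regress phenotype on the probability of
    inheriting the sire's putative QTL allele; return the position with
    the largest absolute allele-substitution effect, plus all effects."""
    effects = {}
    for pos, probs in prob_by_position.items():
        slope, _ = ols_slope(probs, phenotypes)
        effects[pos] = slope
    best = max(effects, key=lambda p: abs(effects[p]))
    return best, effects
```

Positions where the inheritance probabilities track the phenotype closely yield large slopes; flat or uncorrelated positions yield slopes near zero.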
Abstract:
Introduction: The delivery of health care in the 21st century will look like no other in the past. The fast-paced technological advances being made will need to transition from the information age into clinical practice. The phenomenon of e-Health is the over-arching form of information technology, and telehealth is one arm of that phenomenon. The uptake of telehealth, both in Australia and overseas, has changed the face of health service delivery to many rural and remote communities for the better, removing what is known as the tyranny of distance. Many studies have evaluated the satisfaction and cost-benefit analysis of telehealth across organisational aspects as well as the various adaptations of clinical pathways, and this is the predominant focus of most studies published to date. However, whilst many researchers have commented on the need to improve and attend to the communication and relationship-building aspects of telehealth, no studies have examined this further. The aim of this study was to identify patient and clinician experiences, concerns, behaviours and perceptions of the telehealth interaction, and to develop a training tool to assist clinicians to improve their interaction skills. Methods: A mixed methods design combining quantitative (survey analysis and data coding) and qualitative (interview analysis) approaches was adopted. The study comprised four phases: first, a qualitative exploration of the needs of clients (patients) and clinicians within a telehealth consultation, followed by the design, development, piloting, and quantitative and qualitative evaluation of the telehealth communication training program. Qualitative data were collected and analysed during Phase 1 of this study to describe and define the missing 'communication and rapport building' aspects within telehealth.
These data were then used in Phase 2 to develop a self-paced, interactive communication training program that enhanced clinicians' existing skills. Phase 3 evaluated the training program with 26 clinicians, with results recorded pre- and post-training, whilst Phase 4 piloted the program, with a view to future recommendations, with a patient group within a Queensland Health setting at two rural hospitals. Results: Comparisons of pre- and post-training data on 1) effective communication styles, 2) involvement in the communication training package, 3) satisfaction, and 4) health outcomes indicated differences in effective communication style and increased satisfaction, but no difference in health outcomes for this patient group. The post-training results revealed that over half of the participants (n = 17, 65%) were more responsive to non-verbal cues and were better able to reflect and respond to looks of anxiousness and confusion from a 'patient' within a telehealth consultation. It was also found during post-training evaluations that clinicians had enhanced their therapeutic communication, with greater attention to their own body posture, eye contact and presentation. More time was spent looking at the 'patient', with direct eye contact increasing by 35 seconds, and less time was spent looking down at paperwork, which decreased by 20 seconds. Overall, 73% of the clinicians were satisfied with the training program and 61% strongly agreed that they recognised areas of their communication that needed improving during a telehealth consultation. For the patient group there was a significant difference post-training in rapport, with the mean score rising from 42 (SD = 28, n = 27) to 48 (SD = 5.9, n = 24).
For communication comfort of the patient group there was a significant difference between the pre- and post-training scores, t(10) = 27.9, p = .002, meaning that overall the patients felt less inhibited whilst talking to the clinicians and felt more understood. Conclusion: The aim of this study was to explore the characteristics of good patient-clinician communication and the unmet training needs for telehealth consultations. The study developed a training program that was specific to telehealth consultations and not dependent on a 'trainer' to deliver the content. In light of the existing literature this is the first of its kind and a valuable contribution to the research on this topic. The training program was found to be effective in improving clinicians' communication style and increased the satisfaction of patients within an e-health environment. This study has also challenged the historical myth that telehealth cannot be part of empathic, patient-centred care because of its technology tag.
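The pre/post comparison reported above is the kind of result produced by a paired-samples t-test. The following is a minimal sketch of how such a statistic is computed, using made-up scores rather than the study's data.

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic: the mean of the pairwise differences
    divided by its standard error. Returns (t, degrees of freedom)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)  # standard error of the mean difference
    return mean / se, n - 1
```

The p-value would then be obtained from the t distribution with the returned degrees of freedom, for example via `scipy.stats.t.sf`.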
Abstract:
The continuous growth of XML data poses a great concern in the area of XML data management. The need to process large amounts of XML data brings complications to many applications, such as information retrieval, data integration and many others. One way of simplifying this problem is to break the massive amount of data into smaller groups by applying clustering techniques. However, XML clustering is an intricate task that may involve processing both the structure and the content of XML data in order to identify similar XML data. This research presents four clustering methods: two utilizing the structure of XML documents, and the other two utilizing both the structure and the content. The two structural clustering methods have different data models: one is based on a path model and the other on a tree model. These methods employ rigid similarity measures which aim to identify corresponding elements between documents with different or similar underlying structure. The two clustering methods that utilize both structural and content information vary in how the structure and content similarities are combined. One calculates document similarity using a linear weighting combination of structure and content similarities; the content similarity in this method is based on a semantic kernel. The other calculates the distance between documents by a non-linear combination of the structure and content of XML documents using a semantic kernel. Empirical analysis shows that the structure-only clustering method based on the tree model is more scalable than the one based on the path model, as the tree similarity measure does not need to visit the parents of an element many times. Experimental results also show that the clustering methods perform better with the inclusion of content information on most test document collections.
To further the research, the structural clustering method based on the tree model is extended and employed in XML transformation. Results from the experiments show that the proposed transformation process is faster than the traditional transformation system, which translates and converts the source XML documents sequentially. The schema matching process of XML transformation also produces a better matching result in a shorter time.
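The linear weighting combination of structure and content similarity described above can be sketched as follows. Here Jaccard similarity over sets of root-to-leaf paths stands in for the thesis's structural measure, and the weight `alpha` is a hypothetical parameter, so this is illustrative only, not the actual method.

```python
def path_similarity(paths_a, paths_b):
    """Jaccard similarity between two documents' sets of root-to-leaf
    element paths (a simple stand-in for a structural similarity measure)."""
    if not paths_a and not paths_b:
        return 1.0
    return len(paths_a & paths_b) / len(paths_a | paths_b)

def combined_similarity(struct_sim, content_sim, alpha=0.6):
    """Linear weighting combination of structure and content similarity;
    alpha controls how much the structural component dominates."""
    return alpha * struct_sim + (1 - alpha) * content_sim
```

A non-linear combination, as in the fourth method, would replace the weighted sum with, for example, a kernel evaluated jointly over the structure and content representations.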
Abstract:
A major challenge for robot localization and mapping systems is maintaining reliable operation in a changing environment. Vision-based systems in particular are susceptible to changes in illumination and weather, and the same location at another time of day may appear radically different to a feature-based visual localization system. One approach for mapping changing environments is to create and maintain maps that contain multiple representations of each physical location in a topological framework or manifold. However, this requires the system to be able to correctly link two or more appearance representations to the same spatial location, even though the representations may appear quite dissimilar. This paper proposes a method of linking visual representations from the same location without requiring a visual match, thereby allowing vision-based localization systems to create multiple appearance representations of physical locations. The most likely position on the robot path is determined using particle filter methods based on dead reckoning data and recent visual loop closures. In order to avoid erroneous loop closures, the odometry-based inferences are only accepted when the inferred path's end point is confirmed as correct by the visual matching system. Algorithm performance is demonstrated using an indoor robot dataset and a large outdoor camera dataset.
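The odometry-plus-loop-closure idea above can be illustrated with a toy one-dimensional particle filter: particles are propagated by the dead-reckoning motion model, reweighted by a likelihood centred on a recent visual loop closure, and the weighted mean gives the most likely position. The noise parameters here are hypothetical, and this sketch only conveys the principle, not the paper's algorithm.

```python
import math
import random

def propagate(particles, odom, noise_std):
    """Dead-reckoning motion update: shift each particle by the odometry
    reading plus zero-mean Gaussian noise."""
    return [p + odom + random.gauss(0.0, noise_std) for p in particles]

def estimate(particles, loop_closure_pos, sigma):
    """Weight particles with a Gaussian likelihood centred on the last
    visual loop closure, then return the weighted-mean position."""
    weights = [math.exp(-0.5 * ((p - loop_closure_pos) / sigma) ** 2)
               for p in particles]
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, particles)) / total
```

In a full system the particles would live on the robot's path through a topological map, and the estimate would only be committed once the visual matcher confirmed the inferred end point.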
Abstract:
Contemporary course designers in schools and faculties of Education are finding themselves dancing to many tunes, arguably too many tunes, in order to have their initial teacher education courses accredited by external agencies whilst satisfying internal approval processes and, critically, maintaining the philosophical integrity of their programs and their institutional watermarks. The “tunes” here are the agendas driven by, and the demands made by, distinct independent agencies. The external agencies influencing Education include: TEQSA (Tertiary Education Quality and Standards Agency), which will assure alignment to the AQF (Australian Qualifications Framework); professional bodies such as AITSL (Australian Institute for Teaching and School Leadership), which now accredits all pre-service teacher Education courses across Australia and assures alignment with the Australian Professional Standards for Teachers; and the state and territory regulatory authorities that have an impact within a specific jurisdiction, for example, the Queensland College of Teachers (QCT) and the Teacher Registration Board of Western Australia (TRBWA). This paper – whose findings have been arrived at through a year-long OLT National Teaching Fellowship – will outline the complex and competing agendas currently at play and focus on the disjuncture evident in the fundamental defining of who is a “graduate.” It will also attempt to identify where there are synergies between the complex demands being made. It will argue that there are too many “tunes”, and that the task of finding a balance between compliance and delivering effective initial teacher education may not be possible because of the cacophony of their conflicting demands.
Abstract:
Whole System Design is increasingly being seen as one of the most cost-effective ways to both increase the productivity and reduce the negative environmental impacts of an engineered system. A focus on design is critical, as the output from this stage of the project locks in most of the economic and environmental performance of the designed system throughout its life, which can span from a few years to many decades. Indeed, it is now widely acknowledged that all designers – particularly engineers, architects and industrial designers – need to be able to understand and implement a whole system design approach. This book provides a clear design methodology, based on leading efforts in the field, and is supported by worked examples that demonstrate how advances in energy, materials and water productivity can be achieved through applying an integrated approach to sustainable engineering. Chapters 1–5 outline the approach and explain how it can be implemented to enhance the established Systems Engineering framework. Chapters 6–10 demonstrate, through detailed worked examples, the application of the approach to industrial pumping systems, passenger vehicles, electronics and computer systems, temperature control of buildings, and domestic water systems.
Abstract:
Loss of cell-cell adhesion in carcinoma cells may be an important step in the acquisition of an invasive, metastatic phenotype. We have examined the expression of the epithelial-specific cell adhesion molecule uvomorulin (E-cadherin, cell-CAM 120/80, L-CAM) in human breast cancer cell lines. We find that fibroblastoid, highly invasive, vimentin-expressing breast cancer cell lines do not express uvomorulin. Of the more epithelial-appearing, less invasive, keratin-expressing breast cancer cell lines, some express uvomorulin, and some do not. We examined the morphologies of the cell lines in the reconstituted basement membrane matrix Matrigel and measured the ability of the cells to traverse a Matrigel-coated filter as in vitro models for detachment of carcinoma cells from neighboring cells and invasion through basement membrane into surrounding tissue. Colonies of uvomorulin-positive cells have a characteristic fused appearance in Matrigel, whereas uvomorulin-negative cells appear detached. Cells which are uvomorulin negative and vimentin positive have a stellate morphology in Matrigel. We show that uvomorulin is responsible for the fused colony morphology in Matrigel since treatment of uvomorulin-positive MCF-7 cells with an antibody to uvomorulin caused the cells to detach from one another but did not induce invasiveness in these cells, as measured by their ability to cross a Matrigel-coated polycarbonate filter in a modified Boyden chamber assay. Two uvomorulin-negative, vimentin-negative cell lines are also not highly invasive as measured by this assay. We suggest that loss of uvomorulin-mediated cell-cell adhesion may be one of many changes involved in the progression of a carcinoma cell to an invasive phenotype.
Abstract:
This project is led by scientists in conservation decision appraisal and brings together a group of experts working across the Lake Eyre Basin (LEB). The LEB covers a sixth of Australia, with an array of globally significant natural values that are threatened by invasive plants, among other things. Managers at various levels are investing in attempts to control, contain and eradicate these invasive plant species, under severe time and resource limitations. To date there has been no basin-wide assessment of which weed management strategies and locations provide the best investments for maximising outcomes for biodiversity per unit cost. Further, there has been no assessment of the extent of ecosystem intactness that may be lost without effective invasive plant species management strategies. Given that there are insufficient resources to manage all invasive plant species everywhere, this information has the potential to improve current investment decisions. Here, we provide a prioritisation of invasive plant management strategies in the LEB. Prioritisation was based on cost-effectiveness for biodiversity benefits. We identify the key invasive plant species to target to protect ecosystem intactness across the bioregions of the LEB, the level of investment required and the likely reduction in invasive species dominance gained per dollar spent on each strategy. Our focus is on strategies that are technically and socially feasible and reduce the likelihood that high-impact invasive plant species will dominate native ecosystems, and therefore change their form and function. The outputs of this work are designed to help guide decision-making and further planning and investment in weed management for the Basin. Experts in weed management, policy-making, community engagement, biodiversity and natural values of the Basin attended a workshop and agreed upon 12 strategies to manage invasive plants.
The strategies focused primarily on 10 weeds that were considered to have a high potential for broad, significant impacts on natural ecosystems in the next 50 years and for which feasible management strategies could be defined. Each strategy consisted of one or more supporting actions, many of which were spatially linked to IBRA (Interim Biogeographical Regionalisation of Australia) bioregions. The first strategy was an over-arching recommendation for improved mapping, information sharing, education and extension efforts in order to facilitate the more specific weed management strategies. The 10 more specific weed management strategies targeted the control and/or eradication of the following high-impact exotic plants: mesquite, parkinsonia, rubber vine, bellyache bush, cacti, mother of millions, chinee apple, athel pine and prickly acacia, as well as a separate strategy for eradicating all invasive plants from one key threatened ecological community, the GAB (Great Artesian Basin dependent) mound springs. Experts estimated the expected biodiversity benefit of each strategy as the reduction in area that an invasive plant species is likely to dominate over a 50-year period, where dominance was defined as more than 30% coverage at a site. Costs were estimated in present-day terms over 50 years, largely during follow-up discussions after the workshop. Cost-effectiveness was then calculated for each strategy in each bioregion by dividing the average expected benefit by the average annual costs. Overall, the total cost of implementing the 12 invasive plant strategies over the next 50 years was estimated at $1.7 billion. It was estimated that implementation of these strategies would result in a reduction of invasive plant dominance by 17 million ha (a potential 32% reduction), roughly 14% of the LEB. If only targeting Weeds of National Significance (WONS), the total cost was estimated to be $113 million over the next 50 years.
It was estimated that eradicating all invasive plant species from the Great Artesian Basin mound springs threatened ecological community would cost $2.3 million over the next 50 years. Prevention and awareness programs were another key strategy targeted across the Basin, estimated at $17.5 million in total over 50 years. Controlling, eradicating and containing buffel grass was the most expensive strategy, at over $1.5 billion over 50 years; this strategy was estimated to result in a reduction in buffel grass dominance of a million ha in areas where this species is identified as an environmental problem. Buffel grass has been deliberately planted across the Basin for pasture production and is by far the most widely distributed exotic species. Its management is contentious, having economic value to many graziers while posing serious threats to biodiversity and sites of high cultural and conservation interest. The strategy for containing and locally eradicating buffel grass was challenging to cost on the basis of expert knowledge, possibly because of the dual nature of this species as a valued pastoral grass and environmental weed. Based on our conversations with experts, it appears that control and eradication programs for this species in conservation areas are growing rapidly, and that information on the most cost-effective strategies for this species will continue to develop over time. The top five most cost-effective strategies for the entire LEB were for the management of: 1) parkinsonia, 2) chinee apple, 3) mesquite, 4) rubber vine and 5) bellyache bush. Chinee apple and mother of millions are not WONS and have comparatively small populations within the semi-arid bioregions of Queensland. Experts felt that there was an opportunity to eradicate these species before they had the chance to develop into high-impact species within the LEB. Prickly acacia was estimated to have one of the highest benefits, but the costs of this strategy were high, so it was ranked 7th overall.
The buffel grass strategy was ranked the lowest (10th) in terms of cost-effectiveness. The top five most cost-effective strategies within and across the bioregions were the management of: 1) parkinsonia in the Channel Country, 2) parkinsonia in the Desert Uplands, 3) mesquite in the Mitchell Grass Downs, 4) parkinsonia in the Mitchell Grass Downs, and 5) mother of millions in the Desert Uplands. Although actions for several invasive plant species like parkinsonia and prickly acacia were concentrated in the Queensland part of the LEB, the actions involved investing in containment zones to prevent the spread of these species into other states. In the NT and SA bioregions of the LEB, the management of athel pine, parkinsonia and cacti were the main strategies. While outside the scientific research goals of the study, this work highlighted a number of important incidental findings that led us to make the following recommendations for future research and implementation of weed management in the Basin:
• Ongoing stakeholder engagement, extension and participation is required to ensure this prioritisation effort has a positive impact on on-ground decision-making and planning.
• Short-term funding for weed management was identified as a major reason for the failure of current efforts; future funding therefore needs to be secure and ongoing.
• Improved mapping and information sharing is essential to implement effective weed management.
• Due to uncertainties in the outcomes and impacts of management options, strategies should be implemented as part of an adaptive management program.
The information provided in this report can be used to guide investment in controlling high-impact invasive plant species for the benefit of biodiversity conservation. We do not present a final prioritisation of invasive plant strategies for the LEB, and we have not addressed the cultural, socio-economic or spatial components necessary for an implementation plan.
Cost-effectiveness depends on the objectives used; in our case we used the intactness of ecosystems as a surrogate for expected biodiversity benefits, measured by the extent that each invasive plant species is likely to dominate in a bioregion. When other relevant factors for implementation are considered the priorities may change and some actions may not be appropriate in some locations. We present the costs, ecological benefits and cost-effectiveness of preventing, containing, reducing and eradicating the dominance of high impact invasive plants through realistic management actions over the next 50 years. In doing so, we are able to estimate the size of the weed management problem in the LEB and provide expert-based estimates of the likely outcomes and benefits of implementing weed management strategies. The priorities resulting from this work provide a prospectus for guiding further investment in management and in improving information availability.
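The report's cost-effectiveness calculation (average expected benefit divided by average annual cost, then ranking strategies) can be sketched as follows. The species names are drawn from the abstract, but the benefit and cost figures here are hypothetical, for illustration only.

```python
def rank_strategies(strategies):
    """strategies: {name: (expected_benefit_Mha, avg_annual_cost_M_dollars)}.
    Computes cost-effectiveness as benefit per $M per year and returns
    (names sorted highest-CE first, the CE scores)."""
    ce = {name: benefit / cost
          for name, (benefit, cost) in strategies.items()}
    return sorted(ce, key=ce.get, reverse=True), ce
```

Note how a strategy with a high benefit but a very high cost (as reported for prickly acacia) can still rank below cheaper strategies with smaller benefits.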
Abstract:
Australia has had two recent public apologies, one to the ‘Stolen Generation’ of Aboriginal and Torres Strait Islander Australians and the second to the ‘Forgotten Australians’ – people who had been removed from their parents as children and institutionalized. Both acts occurred at a time when there was no Internet, and people’s stories took years to collect and decades for their weight to carry the public momentum required to gain a public apology. Now, in a digital age, the reports and the testimonies held within them are available for all to read on the Internet. We all now know what happened, and formal public apologies ensued. Both public apologies also draw attention to an emerging intersection between digital technologies, personal historical stories and public apology. Research has identified the potential of digital narrative, such as digital storytelling and videoed oral histories, to assist in the production of digital narratives that can help to present the multiple voices and viewpoints of those affected by these subjects co-creatively (Burgess et al., pp. 152-153). Not all Australians, however, have the access or the skills to use digital tools so as to benefit from these technologies – especially Indigenous Australians. While the Federal Government is committed to helping Australians enjoy digital confidence and digital media literacy skills, experience inclusive digital participation and benefit through online engagement (Department of Broadband, Communications and the Digital Economy, 2009), there are many initiatives that can also be undertaken locally by State-funded institutions, such as libraries. This paper highlights the outcomes of recent empirical projects undertaken at the State Library of Queensland (SLQ), focusing in particular on digital initiatives in family history practices by Indigenous users, and on a digital story project instigated by SLQ in response to the public apology to the Stolen Generation.
Abstract:
This thesis aimed to identify cytokine markers associated with chlamydial infection and disease in the koala, a species facing many threats to its survival, of which Chlamydia pecorum infection is a major one. To identify immunological markers associated with chlamydial infection and disease in koalas, key cytokines such as TNF alpha, IL10, IFN gamma and IL17A were cloned and sequenced, and quantitative real-time PCR (qrtPCR) assays were subsequently developed. The thesis provides preliminary data on the role of these cytokines in koala chlamydial disease; further longitudinal studies are required to confirm the role played by cytokines in pathology and protection against C. pecorum infection in the koala.
Abstract:
In stark contrast to its horticultural origins, modern genetics is an extremely technology-driven field. Almost all the major advances in the field over the past 20 years have followed technological developments that have permitted change in study designs. The development of PCR in the 1980s led to RFLP mapping of monogenic diseases. The development of fluorescent-tagged genotyping methods led to linkage mapping approaches for common diseases that dominated the 1990s. The development of microarray SNP genotyping has led to the genome-wide association study era of the new millennium. And now the development of next-generation sequencing technologies is about to open up a new era of gene-mapping, enabling many potential new study designs. This review aims to present the strengths and weaknesses of the current approaches, and present some new ideas about gene-mapping approaches that are likely to advance our knowledge of the genes involved in heritable bone traits such as bone mineral density (BMD) and fracture.
Abstract:
Similar to most other creative industries, the evolution of the music industry is heavily shaped by media technologies. This was equally true in 1999, when the global recorded music industry had experienced two decades of continuous growth, largely driven by the rapid transition from vinyl records to Compact Discs. The transition encouraged avid music listeners to purchase much of their music collections all over again in order to listen to their favourite music with ‘digital sound’. As a consequence of this successful product innovation, recorded music sales (unit measure) more than doubled between the early 1980s and the end of the 1990s. It was against this backdrop that the first peer-to-peer file sharing service was developed and released to the mainstream music market in 1999 by the college student Shawn Fanning. The service was named Napster, and it marks the beginning of an era that is now a classic example of how an innovation is able to disrupt an entire industry and make large swathes of existing industry competences obsolete. File sharing services such as Napster, followed by a range of similar services in its path, reduced physical unit sales in the music industry to levels that had not been seen since the 1970s. The severe impact of the internet on physical sales shocked many music industry executives, who spent much of the 2000s vigorously trying to reverse the decline and make the disruptive technologies go away. In the end, they learned that their efforts were to no avail: the impact on the music industry proved to be transformative, irreversible and, to many music industry professionals, also devastating. Thousands of people lost their livelihoods, and large and small music companies have folded or been forced into mergers or acquisitions. But as always during periods of disruption, the past 15 years have also been very innovative, spurring a plethora of new music business models.
These new business models have mainly emerged outside the music industry, and the innovators have often been required to be both persuasive and persistent in order to gain acceptance from the risk-averse and cash-poor music industry establishment. Apple was one such change agent: in 2003 it became the first company to open up a functioning and legal market for online music. iTunes Music Store was the first online retail outlet able to offer the music catalogues of all the major music companies; it used an entirely novel pricing model, and it allowed consumers to de-bundle the music album and buy only the songs that they actually liked. Songs had previously been bundled by physical necessity as discs or cassettes, but with iTunes Music Store, the institutionalized album bundle slowly started to fall apart. The consequences had an immediate impact on music retailing, and within just a few years many brick-and-mortar record stores were forced out of business in markets across the world. The transformation also had disruptive consequences beyond music retailing, redefining music companies’ organizational structures, work processes and routines, as well as professional roles. iTunes Music Store was in one sense a disruptive innovation, but it was at the same time relatively incremental, since the major labels’ positions and power structures remained largely unscathed. The rights holders still controlled their intellectual properties, and the structures that guided the royalties paid per song sold were predictable, transparent and in line with established music industry practices.
Abstract:
Faced with the difficulty of propagating and synthesizing information from conceptual to embodiment design, this paper introduces a function-oriented, axiom-based conceptual modeling scheme. Default logic reasoning is exploited for the recognition and reconstitution of conceptual product geometric and topological information. The proposed product modeling system and reasoning approach demonstrate a methodology of "structural variation design", which is verified in the implementation of a GPAL (Green Product All Life-cycle) CAD system. The GPAL system includes major enhancement modules: a mechanism layout sketching method based on fuzzy logic, a knowledge-based function-to-form mapping mechanism, and a conceptual form reconstitution paradigm based on default geometric reasoning. A mechanical hand design example shows a more than 20-fold increase in design efficiency with these enhancement modules in the GPAL system on a general 3D CAD platform.