Abstract:
If Australian scientists are to fully and actively participate in international scientific collaborations utilising online technologies, policies and laws must support the data access and reuse objectives of these projects. To date, Australia lacks a comprehensive policy and regulatory framework for environmental information and data generally. Instead there exists a series of unconnected Acts that adopt historically-based, sector-specific approaches to the collection, use and reuse of environmental information. This paper sets out the findings of an analysis of a representative sample of Australian statutes relating to environmental management and protection to determine the extent to which they meet best practice criteria for access to and reuse of environmental information established in international initiatives. It identifies issues that need to be addressed in the legislation governing environmental information to ensure that Australian scientists are able to fully engage in international research collaborations.
Abstract:
Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly (Aleurodicus dispersus) and red banded mango caterpillar (Deanolis sublimbalis). Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time. Estimates of spiralling whitefly extent are examined at local, district and state-wide scales.
The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
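The probability-of-freedom logic that underlies such surveillance programs can be illustrated with a minimal, non-hierarchical sketch. The prior probability of infestation and the detection sensitivity below are assumed fixed and known, which the thesis's hierarchical MCMC models deliberately do not assume; this is only the basic Bayesian update behind "negative surveys support area freedom".

```python
def posterior_infested(prior, sensitivity, n_negative_surveys):
    """Probability an area is still infested after a run of negative surveys.

    Illustrative only: assumes a fixed prior probability of infestation and a
    known, constant detection sensitivity, ignoring the spread dynamics and
    parameter uncertainty the thesis models hierarchically via MCMC.
    """
    # Chance an infested area is missed by every survey.
    miss = (1.0 - sensitivity) ** n_negative_surveys
    num = prior * miss
    return num / (num + (1.0 - prior))

# Each additional negative survey shifts belief toward area freedom.
for n in range(4):
    p = posterior_infested(prior=0.2, sensitivity=0.6, n_negative_surveys=n)
    print(f"after {n} negative surveys: P(infested) = {p:.3f}")
```

With no surveys the posterior equals the prior; the rate of decline thereafter is governed entirely by detection sensitivity, which is why the thesis's treatment of uncertain sensitivity matters for demonstrating area freedom.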
Abstract:
Effective academic workforce staff development remains a challenge in higher education. This thesis-by-publication examined the importance of alternative paradigms for academic staff development, focusing specifically on arts-based learning as a non-traditional approach to transformative learning for management and self-development within the business of higher education. The research question was whether the facilitation of staff development through the practice of arts-based transformational learning supported academic aims in higher education, based on data obtained from participants in the academic staff development program at one Australian university over a three-year period. Over that period, eighty academics participated in one large metropolitan Australian university’s arts-based academic development program. The research approach required analysis of transcribed one-on-one hermeneutic-based conversations with fifteen self-selected academics, five from each year, and with a focus group of twenty other self-selected academics from all three years. The study’s findings provided evidence supporting the need for academic staff development that prepares academics to be engaged and creative and therefore more likely to be responsive to emerging issues and to be innovative in the presence of constraints, including organisational constraints. Analysis of the qualitative conversation transcripts indicated that arts-based lifelong learning processes gave participants a perception of enhanced capability for self-creation and clarity of transformational action in academic career management. The study presented a new and innovative Artful Learning Wave Trajectory learning model to engender academic professional artistry. The findings provided developers with support for using a non-traditional strategy of transformational learning.
Abstract:
Throughout this workshop session we have looked at various configurations of Sage as well as using the Sage UI to run Sage applications (e.g. the image viewer). More advanced usage of Sage has been demonstrated using a Sage-compatible version of Paraview highlighting the potential of parallel rendering. The aim of this tutorial session is to give a practical introduction to developing visual content for a tiled display using the Sage libraries. After completing this tutorial you should have the basic tools required to develop your own custom Sage applications. This tutorial is designed for software developers; intermediate programming knowledge is assumed, along with some introductory OpenGL. You will be required to write small portions of C/C++ code to complete this worksheet. However, if you do not feel comfortable writing code (or have never written in C or C++), we will be on hand throughout this session, so feel free to ask for some help. We have a number of machines in this lab running a VNC client to a virtual machine running Fedora 12. You should all be able to log in with the username “escience” and password “escience10”. Some of the commands in this worksheet require you to run them as the root user, so note the password as you may need to use it a few times. If you need to access the Internet, then use the username “qpsf01”, password “escience10”.
Abstract:
This article argues for exploring lesbian, gay, bisexual, and transgender (LGBT) young people’s experiences with police. While research examines how factors such as indigeneity influence young people’s experiences with police, how sexuality and/or gender identity mediates these relationships remains largely unexplored. Key bodies of research suggest a need to explore this area further, including: literature documenting links between homophobic violence against LGBT young people and outcomes such as homelessness that fall within the ambit of policing work; research showing reluctance of LGBT communities to report crime to police; international research documenting homophobic police attitudes and Australian research demonstrating arguably homophobic court outcomes; and research outlining increasing police support of LGBT communities. Drawing on these bodies of literature, this article argues that how LGBT young people experience policing warrants further research.
Abstract:
Automation technology can provide construction firms with a number of competitive advantages. Technology strategy guides a firm's approach to all technology, including automation. Engineering management educators, researchers, and construction industry professionals need improved understanding of how technology affects results, and how to better target investments to improve competitive performance. A more formal approach to the concept of technology strategy can benefit construction managers in their efforts to remain competitive in increasingly hostile markets. This paper recommends consideration of five specific dimensions of technology strategy within the overall parameters of market conditions, firm capabilities and goals, and stage of technology evolution. Examples of the application of this framework in the formulation of technology strategy are provided for CAD applications, co-ordinated positioning technology and advanced falsework and formwork mechanisation to support construction field operations. Results from this continuing line of research can assist managers in making complex and difficult decisions regarding reengineering construction processes using new construction technology, and benefit future researchers by providing new tools for analysis. Through managing technology to best suit the existing capabilities of their firm, and addressing the market forces, engineering managers can better face the increasingly competitive environment in which they operate.
Abstract:
Cell-based therapies, as they apply to tissue engineering and regenerative medicine, require cells capable of self-renewal and differentiation, and a prerequisite is the ability to prepare an effective dose of ex vivo expanded cells for autologous transplants. The in vivo identification of a source of physiologically relevant cell types suitable for cell therapies therefore figures as an integral part of tissue engineering. Stem cells serve as a reserve for biological repair, having the potential to differentiate into a number of specialised cell types within the body; they therefore represent the most useful candidates for cell-based therapies. The primary goal of stem cell research is to produce cells that are both patient specific and have properties suitable for the specific conditions they are intended to remedy. From a purely scientific perspective, stem cells allow scientists to gain a deeper understanding of developmental biology and regenerative therapies. Stem cells have acquired a number of uses in regenerative medicine, immunotherapy and gene therapy, but it is in the area of tissue engineering that they generate most excitement, primarily as a result of their capacity for self-renewal and pluripotency. A unique feature of stem cells is their ability to maintain an uncommitted quiescent state in vivo and then, once triggered by conditions such as disease, injury or natural wear and tear, serve as a reservoir and natural support system to replenish lost cells. Although these cells retain the plasticity to differentiate into various tissues, being able to control this differentiation process is still one of the biggest challenges facing stem cell research. In an effort to harness the potential of these cells a number of studies have been conducted using both embryonic/foetal and adult stem cells.
The use of embryonic stem cells (ESCs) has been hampered by strong ethical and political concerns, despite their perceived versatility due to their pluripotency. Ethical issues aside, other concerns raised with ESCs relate to the possibility of tumorigenesis, immune rejection and complications with immunosuppressive therapies, all of which add layers of complication to the application of ESCs in research and have led to the search for alternative sources of stem cells. The adult tissues of higher organisms harbour cells, termed adult stem cells, that are reminiscent of unprogrammed stem cells. A number of sources of adult stem cells have been described. Bone marrow is by far the most accessible source of two potent populations of adult stem cells, namely haematopoietic stem cells (HSCs) and bone marrow mesenchymal stem cells (BMSCs). Autologously harvested adult stem cells can, in contrast to embryonic stem cells, readily be used in autografts, since immune rejection is not an issue; and their use in scientific research has not attracted the ethical concerns that have attached to embryonic stem cells. The major limitation to their use, however, is the fact that adult stem cells are exceedingly rare in most tissues. This makes identifying and isolating these cells problematic, bone marrow being perhaps the only notable exception. Unlike the case of HSCs, there are as yet no rigorous criteria for characterizing MSCs. A changing view of the pluripotency of MSCs in recent studies has expanded their potential application; however, the underlying molecular pathways which impart the features distinctive to MSCs remain elusive. Furthermore, the sparse in vivo distribution of these cells imposes a clear limitation on their study in vitro. Also, when MSCs are cultured in vitro, there is a loss of the in vivo microenvironment, resulting in a progressive decline in proliferation potential and multipotentiality.
This is further exacerbated with increased passage numbers in culture, characterized by the onset of senescence-related changes. As a consequence, it is necessary to establish protocols for generating large numbers of MSCs without affecting their differentiation potential. MSCs are capable of differentiating into mesenchymal tissue lineages, including bone, cartilage, fat, tendon, muscle, and marrow stroma. Recent findings indicate that adult bone marrow may also contain cells that can differentiate into the mature, nonhematopoietic cells of a number of tissues, including cells of the liver, kidney, lung, skin, gastrointestinal tract, and myocytes of heart and skeletal muscle. MSCs can readily be expanded in vitro, can be genetically modified by viral vectors and can be induced to differentiate into specific cell lineages by changing the microenvironment – properties which make these cells ideal vehicles for cellular gene therapy. MSCs can also exert profound immunosuppressive effects via modulation of both cellular and innate immune pathways, and this property allows them to overcome the issue of immune rejection. Despite the many attractive features associated with MSCs, there are still many hurdles to overcome before these cells are readily available for use in clinical applications. The main concern relates to in vivo characterization and identification of MSCs. The lack of a universal biomarker, sparse in vivo distribution, and a steady age-related decline in their numbers make it necessary to decipher the reprogramming pathways and critical molecular players which govern the characteristics unique to MSCs. This book presents a comprehensive insight into the biology of adult stem cells and their utility in current regeneration therapies. The adult stem cell populations reviewed in this book include bone marrow derived MSCs, adipose derived stem cells (ASCs), umbilical cord blood stem cells, and placental stem cells.
Features such as MSC circulation and trafficking, neuroprotective properties, and the nurturing roles and differentiation potential of multiple lineages are discussed in detail. In terms of therapeutic applications, the strengths of MSCs are presented and their roles in the treatment of diseases such as osteoarthritis, Huntington’s disease, periodontal degeneration, and in pancreatic islet transplantation are discussed. An analysis comparing osteoblast differentiation of umbilical cord blood stem cells and MSCs is reviewed, as is a comparison of human placental stem cells and ASCs, covering isolation, identification and therapeutic applications of ASCs in bone, cartilage and myocardial regeneration. It is my sincere hope that this book will update the reader on the research progress of MSC biology and the potential use of these cells in clinical applications. It will be the best reward for all contributors to this book if their efforts herein in some way help readers in their study, research, and career development.
Abstract:
This paper argues for the need for critical reading comprehension in an era of accountability that often promotes reading comprehension as readily assessable through students answering multiple choice questions on unseen texts. Based upon a one-year study investigating literacy in Years 4–9, the ways strong-performing primary schools develop serious and in-depth reading for learning are explored. School and teacher features which allow for the development of sophisticated pedagogical repertoires and space for critical reading comprehension, without losing the complexity of curriculum offerings, are outlined. How one experienced middle primary teacher operates strategically, ethically and critically in supporting her ESL students to learn to read is illustrated. The teacher’s work is situated within the complex accountability demands faced by classroom teachers. This was accomplished by a teacher whose pedagogical repertoire has been assembled across a career teaching in low-SES, high-ESL communities in a school with a balanced literacy program and a high level of collegial support. Risks for schools and teachers whose circumstances work against their capacities for prioritisation and strategic decision-making are identified and discussed.
Abstract:
In recent years the Australian tertiary education sector may be said to be undergoing a vocational transformation. Vocationalism, that is, an emphasis on learning directed at work-related outcomes, is increasingly shaping the nature of tertiary education. This paper reports some findings to date of a project that seeks to identify the key issues faced by students, industry and university partners engaged in the provision of WIL within an undergraduate program offered by the Creative Industries faculty of a major metropolitan university. Here, those findings are focussed on some of the motivations and concerns of the industry partners who make their workplaces available for student internships. Businesses are not universities and do not perceive themselves as primarily learning institutions. However, their perspectives on work integrated learning and their contributions to it need to be understood more fully at practical and conceptual levels of learning provision. This paper and the findings presented here suggest that the diversity of industry partner motivations and concerns contributing to WIL provision requires that universities understand and appreciate those partners as contributors with them to a culture of learning provision and support. These industry partner contributions need to be understood as valuing work as learning, not work as something that needs to be integrated with learning to make that learning more authentic and thereby more vocational.
Abstract:
This article reports on a project to embed information literacy skills development in a first-year undergraduate business course at an Australian university. In accordance with prior research suggesting that first-year students are over-confident about their skills, the project used an optional online quiz to allow students to pre-test their information literacy skills. The students' lower than expected results subsequently encouraged greater skill development. However, not all students elected to undertake the first quiz. A final assessable information literacy quiz increased the levels of student engagement, suggesting that skill development activities need to be made assessable. We found that undertaking the information literacy quizzes resulted in a statistically significant improvement in students' information literacy skills from the pre-test to the post-test. This research therefore extends previous research by providing an effective means of delivering information literacy skill development to large cohorts of first-year students.
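The "statistically significant improvement" from pre-test to post-test reported above is typically established with a paired test on matched scores. A stdlib-only sketch follows; the scores are invented for illustration and are not the study's data, and the test statistic would still need to be compared against a t-distribution with n − 1 degrees of freedom.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic for matched pre-test and post-test scores.

    Illustrative only; real studies would also report the p-value from
    the t-distribution with len(pre) - 1 degrees of freedom.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Made-up quiz scores for eight students, not taken from the article.
pre  = [55, 60, 48, 62, 58, 50, 65, 57]
post = [68, 71, 60, 70, 66, 63, 74, 69]
t = paired_t(pre, post)
print(f"t = {t:.2f} on {len(pre) - 1} degrees of freedom")
```

A large positive t here would correspond to the article's finding that post-test scores exceeded pre-test scores by more than chance alone would explain.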
Abstract:
Earlier research developed theoretically-based aggregate metrics for technology strategy and used them to analyze California bridge construction firms (Hampson, 1993). Determinants of firm performance, including trend in contract awards, market share and contract awards per employee, were used as indicators for competitive performance. The results of this research were a series of refined theoretically-based measures for technology strategy and a demonstrated positive relationship between technology strategy and competitive performance within the bridge construction sector. This research showed that three technology strategy dimensions—competitive positioning, depth of technology strategy, and organizational fit—show very strong correlation with the competitive performance indicators of absolute growth in contract awards, and contract awards per employee. Both researchers and industry professionals need improved understanding of how technology affects results, and how to better target investments to improve competitive performance in particular industry sectors. This paper builds on the previous research findings by evaluating the strategic fit of firms' approach to technology with industry segment characteristics. It begins with a brief overview of the background regarding technology strategy. The major sections of the paper describe niches and firms in an example infrastructure construction market, analyze appropriate technology strategies, and describe managerial actions to implement these strategies and support the business objectives of the firm.
Abstract:
A basic element in advertising strategy is the choice of an appeal. In business-to-business (B2B) marketing communication, a long-standing approach relies on literal and factual, benefit-laden messages. Given the highly complex, costly and involved processes of business purchases, such approaches are certainly understandable. This project challenges the traditional B2B approach and asks if an alternative approach—using symbolic messages that operate at a more intrinsic or emotional level—is effective in the B2B arena. As an alternative to literal (factual) messages, there is an emerging body of literature that asserts stronger, more enduring results can be achieved through symbolic messages (imagery or text) in an advertisement. The present study contributes to this stream of research. From a theoretical standpoint, the study explores differences in literal-symbolic message content in B2B advertisements. There has been much discussion—mainly in the consumer literature—of the ability of symbolic messages to motivate a prospect to process advertising information by necessitating more elaborate processing and comprehension. Business buyers are regarded as less receptive to indirect or implicit appeals because their purchase decisions are based on direct evidence of product superiority. It is argued here that these same buyers may be equally influenced by advertising that stimulates internally-directed motivation, feelings and cognitions about the brand. Thus far, studies on the effects of literalism and symbolism are fragmented, and few focus on the B2B market. While there have been many studies on the effects of symbolism, no adequate scale exists to measure the continuum of literalism-symbolism. Therefore, a first task for this study was to develop such a scale. Following scale development, content analysis of 748 B2B print advertisements was undertaken to investigate whether differences in literalism-symbolism led to higher advertising performance.
Variations across time and industry were also measured. From a practical perspective, the results challenge the prevailing B2B practice of relying on literal messages. While definitive support was not established for the use of symbolic message content, literal messages also failed to predict advertising performance. If the ‘fact, benefit-laden’ assumption within B2B advertising cannot be supported, then other approaches used in the business-to-consumer (B2C) sector, such as symbolic messages, may also be appropriate in business markets. Further research will need to test the potential effects of such messages, thereby building a revised foundation that can help drive advances in B2B advertising. Finally, the study offers a contribution to the growing body of knowledge on symbolism in advertising. While the specific focus of the study relates to B2B advertising, the Literalism-Symbolism scale developed here provides a reliable measure to evaluate literal and symbolic message content in all print advertisements. The value of this scale in advancing our understanding of message strategy may be significant in future consumer and business advertising research.
Abstract:
As online social spaces continue to grow in importance, the complex relationship between users and the private providers of the platforms continues to raise increasingly difficult questions about legitimacy in online governance. This article examines two issues that go to the core of legitimate governance in online communities: how are rules enforced and punishments imposed, and how should the law support legitimate governance and protect participants from the illegitimate exercise of power? Because the rules of online communities are generally ultimately backed by contractual terms of service, the imposition of punishment for the breach of internal rules exists in a difficult conceptual gap between criminal law and the predominantly compensatory remedies of contractual doctrine. When theorists have addressed the need for the rules of virtual communities to be enforced, a dichotomy has generally emerged between the appropriate role of criminal law for 'real' crimes, and the private, internal resolution of 'virtual' or 'fantasy' crimes. In this structure, the punitive effect of internal measures is downplayed and the harm that can be caused to participants by internal sanctions is systemically undervalued.
Abstract:
A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with the use of machine learning algorithms which use examples of fault-prone and non-fault-prone modules to develop predictive models of quality. In order to learn the numerical mapping between module and classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality. In this work, data is obtained from two sources, the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which works best. Two machine learning algorithms are applied to the data—Naive Bayes and the Support Vector Machine—and predictive results are compared to those of previous efforts and found to be superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined based on the sum of ranks over features.
A novel extension of this method is also described based on an observed polarising of points by class when rank sum is applied to training data to convert it into 2D rank sum space. SVM is applied to this transformed data to produce models the parameters of which can be set according to trade-off curves to obtain a particular performance trade-off.
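One of the two learners applied in this work, Naive Bayes, can be sketched over software metrics in a few stdlib-only lines. The metrics below (lines of code, cyclomatic complexity) and their values are invented for illustration; the thesis draws its features from the NASA Metrics Data Program and Eclipse datasets, and a Gaussian likelihood is only one common choice.

```python
import math
from collections import defaultdict
from statistics import mean, stdev

def train_gnb(rows, labels):
    """Fit a Gaussian Naive Bayes model: per-class prior plus a
    (mean, stdev) pair per feature. Toy sketch, not the thesis's pipeline."""
    by_class = defaultdict(list)
    for row, y in zip(rows, labels):
        by_class[y].append(row)
    model = {}
    for y, group in by_class.items():
        prior = len(group) / len(rows)
        # One (mean, stdev) per feature column; guard against zero variance.
        stats = [(mean(col), stdev(col) or 1e-6) for col in zip(*group)]
        model[y] = (prior, stats)
    return model

def predict(model, row):
    """Pick the class with the highest log-posterior under independence."""
    def log_post(prior, stats):
        lp = math.log(prior)
        for x, (mu, sd) in zip(row, stats):
            lp += -math.log(sd * math.sqrt(2 * math.pi)) - (x - mu) ** 2 / (2 * sd ** 2)
        return lp
    return max(model, key=lambda y: log_post(*model[y]))

# Toy modules described by (lines of code, cyclomatic complexity).
rows   = [(120, 4), (90, 3), (400, 18), (350, 15), (60, 2), (500, 22)]
labels = ["ok", "ok", "faulty", "faulty", "ok", "faulty"]
model = train_gnb(rows, labels)
print(predict(model, (380, 16)))  # a large, complex module
```

The conditional-independence assumption between metrics is clearly violated in practice (lines of code and complexity correlate), which is part of why feature selection before learning, as compared in the thesis, matters.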
Abstract:
In today’s electronic world, vast amounts of knowledge are stored within many datasets and databases. Often the default format of this data means that the knowledge within is not immediately accessible, but rather has to be mined and extracted. This requires automated tools, and they need to be effective and efficient. Association rule mining is one approach to obtaining the knowledge stored within datasets/databases, which includes frequent patterns and association rules between the items/attributes of a dataset with varying levels of strength. However, this is also association rule mining’s downside: the number of rules that can be found is usually very large. In order to use the association rules (and the knowledge within them) effectively, the number of rules needs to be kept manageable; thus it is necessary to have a method to reduce the number of association rules without losing knowledge in the process. Thus the idea of non-redundant association rule mining was born. A second issue with association rule mining is determining which rules are interesting. The standard approach has been to use support and confidence, but these have their limitations. Approaches which use information about the dataset’s structure to measure association rules are limited, but could yield useful association rules if tapped. Finally, while it is important to be able to obtain interesting association rules from a dataset in a manageable number, it is equally important to be able to apply them in a practical way, where the knowledge they contain can be taken advantage of. Association rules show items/attributes that appear together frequently. Recommendation systems also look at patterns and items/attributes that occur together frequently in order to make a recommendation to a person. It should therefore be possible to bring the two together. In this thesis we look at these three issues and propose approaches to help.
For discovering non-redundant rules we propose enhanced approaches to rule mining in multi-level datasets that allow hierarchically redundant association rules to be identified and removed without information loss. For discovering interesting association rules based on the dataset’s structure, we propose three measures for use in multi-level datasets. Lastly, we propose and demonstrate an approach that allows association rules to be practically and effectively used in a recommender system, while at the same time improving the recommender system’s performance. This becomes especially evident when looking at the user cold-start problem for a recommender system; in fact, our proposal helps to address this serious problem facing recommender systems.
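The support and confidence measures whose limitations this thesis addresses can be stated in a few lines. The market-basket transactions below are invented for illustration, and the sketch deliberately omits the structural and multi-level measures the thesis contributes.

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item in itemset."""
    itemset = set(itemset)
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Estimated P(consequent | antecedent): support of the combined
    itemset divided by support of the antecedent alone."""
    return (support(transactions, set(antecedent) | set(consequent))
            / support(transactions, antecedent))

# Toy market-basket data, not drawn from the thesis.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]
print(support(transactions, {"bread", "milk"}))       # 2 of 4 transactions
print(confidence(transactions, {"bread"}, {"milk"}))  # 2 of 3 bread baskets
```

Even this tiny example hints at the rule-explosion problem: every subset of every frequent itemset generates candidate rules, which is what motivates the non-redundant mining developed in the thesis.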