Abstract:
Expert knowledge is used widely in the science and practice of conservation because of the complexity of problems, the relative lack of data, and the imminent nature of many conservation decisions. Expert knowledge is substantive information on a particular topic that is not widely known by others. An expert is someone who holds this knowledge and who is often deferred to in its interpretation. We refer to predictions by experts of what may happen in a particular context as expert judgments. In general, an expert-elicitation approach consists of five steps: deciding how information will be used, determining what to elicit, designing the elicitation process, performing the elicitation, and translating the elicited information into quantitative statements that can be used in a model or directly to make decisions. This last step is known as encoding. Considerations in eliciting expert knowledge include determining how to work with multiple experts and how to combine multiple judgments, minimizing bias in the elicited information, and verifying the accuracy of expert information. We highlight structured elicitation techniques that, if adopted, will improve the accuracy and information content of expert judgment and ensure uncertainty is captured accurately. We suggest four aspects of an expert-elicitation exercise be examined to determine its comprehensiveness and effectiveness: study design and context, elicitation design, elicitation method, and elicitation output. Just as the reliability of empirical data depends on the rigor with which it was acquired, so too does that of expert knowledge.
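To make the encoding and combination steps concrete, here is a minimal sketch of a linear opinion pool; the three experts, their elicited (lowest, best, highest) triplets and the equal weighting are invented for illustration and are not drawn from any study.

    # Hypothetical elicited judgments: each expert states a lowest plausible
    # value, a best estimate and a highest plausible value for a quantity,
    # e.g. a species' probability of persistence over 20 years.
    judgments = {
        "expert_A": (0.30, 0.50, 0.70),
        "expert_B": (0.40, 0.55, 0.65),
        "expert_C": (0.20, 0.45, 0.80),
    }

    # Equal weights are the simplest combination rule; calibration scores
    # earned on test questions with known answers could be substituted.
    weights = {name: 1.0 / len(judgments) for name in judgments}

    def linear_pool(judgments, weights):
        """Combine experts by a weighted average of each elicited point."""
        return tuple(
            sum(weights[e] * judgments[e][k] for e in judgments)
            for k in range(3)
        )

    low, best, high = linear_pool(judgments, weights)
    print(f"pooled estimate: {best:.2f} (plausible range {low:.2f}-{high:.2f})")

A real elicitation would typically fit a full distribution to each expert's triplet before pooling, since averaging interval endpoints alone can understate disagreement between experts.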
Abstract:
A Maintenance Test Section Survey (MTSS) was conducted as part of a Peer State Review of the Texas Maintenance Program held October 5–7, 2010. The purpose of the MTSS was to conduct a field review of 34 highway test sections and obtain participants' opinions about pavement, roadside, and maintenance conditions. The goal was to cross-reference, or benchmark, TxDOT's maintenance practices against those used by selected peer states. Representatives from six peer states (California, Georgia, Kansas, Missouri, North Carolina, and Washington) were invited to Austin to attend a 3-day Peer State Review of TxDOT Maintenance Practices Workshop and to participate in a field survey of a number of pre-selected one-mile roadway sections. It should be emphasized that the objective of the survey was not to evaluate and grade or score TxDOT's road network, but rather to determine whether the selected roadway sections met acceptable standards of service as perceived by Directors of Maintenance or senior maintenance managers from the peer states...
Abstract:
The most common software analysis tools available for measuring fluorescence images handle two-dimensional (2D) data, rely on manual settings for inclusion and exclusion of data points, and use computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximation and assumptions of the original model-based stereology (1), even in complex tissue sections (2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are composed of dendrites, axons and spines (a tree-like structure). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells (3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable to biological applications, the Imaris developers created Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because it ideally builds cell surfaces without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for structures of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.).
These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, the method allows researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to quantify morphological changes in cell dynamics.
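The platform itself is built on Imaris XT and MATLAB, but the core idea of measuring objects with no pre-defined shape can be sketched in Python; the synthetic stack, the global threshold and the use of NumPy/SciPy below are illustrative stand-ins, not the published pipeline.

    import numpy as np
    from scipy import ndimage

    # Synthetic stand-in for a confocal stack: a 3D intensity array (z, y, x).
    rng = np.random.default_rng(0)
    stack = rng.random((40, 256, 256))
    stack[10:25, 60:120, 60:140] += 1.0   # a bright amorphous "cell" region

    # Segment by a global threshold; no shape model is assumed, so arbitrary
    # morphologies are kept, unlike filament- or sphere-based detection.
    mask = stack > 0.9

    # Label connected 3D components and measure each object's volume (voxels)
    # and centroid; a voxel size would convert these to physical units.
    labels, n = ndimage.label(mask)
    index = range(1, n + 1)
    volumes = ndimage.sum_labels(mask, labels, index=index)
    centroids = ndimage.center_of_mass(mask, labels, index)

    for i, (v, c) in enumerate(zip(volumes, centroids), start=1):
        print(f"object {i}: volume={int(v)} voxels, centroid={c}")

Because connected-component labeling imposes no cell or filament model, discontinuous and amorphous structures are measured as they are, which is the property the dedicated shape-based modules described above lack.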
Abstract:
Server consolidation using virtualization technology has become an important way to improve the energy efficiency of data centers, and virtual machine placement is the key problem in server consolidation. Many approaches to virtual machine placement have been proposed in the past few years. However, existing approaches consider only the energy consumed by the physical machines in a data center and ignore the energy consumed by its communication network. That network energy consumption is not trivial, and should therefore be taken into account in virtual machine placement to make the data center more energy-efficient. In this paper, we propose a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the servers and the communication network in the data center. Experimental results show that the genetic algorithm performs well on test problems of different kinds and scales well as the problem size increases.
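The paper's formulation is not reproduced here, but the shape of such an algorithm can be sketched as follows; the power model constants, the random traffic matrix and the GA operator choices are all illustrative assumptions rather than the authors' design.

    import random

    random.seed(1)
    N_VMS, N_PMS = 12, 4
    vm_load = [random.uniform(0.1, 0.4) for _ in range(N_VMS)]
    # Hypothetical pairwise traffic rates between VMs (upper triangle only).
    traffic = [[random.uniform(0, 1) if i < j else 0.0 for j in range(N_VMS)]
               for i in range(N_VMS)]

    def energy(assign):
        """Joint cost: server energy plus network energy (toy models)."""
        loads = [0.0] * N_PMS
        for vm, pm in enumerate(assign):
            loads[pm] += vm_load[vm]
        if any(l > 1.0 for l in loads):            # capacity violated
            return float("inf")
        # Idle power plus load-proportional power for each switched-on server.
        server = sum(100 + 150 * l for l in loads if l > 0)
        # Traffic between co-located VMs is free; cross-server traffic costs.
        network = sum(traffic[i][j] * (assign[i] != assign[j])
                      for i in range(N_VMS) for j in range(i + 1, N_VMS))
        return server + 10 * network

    def ga(pop_size=40, generations=200, mut_rate=0.1):
        pop = [[random.randrange(N_PMS) for _ in range(N_VMS)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=energy)
            nxt = pop[:2]                          # elitism
            while len(nxt) < pop_size:
                a, b = random.sample(pop[:10], 2)  # mate among the fittest
                cut = random.randrange(1, N_VMS)   # one-point crossover
                child = a[:cut] + b[cut:]
                for k in range(N_VMS):             # per-gene mutation
                    if random.random() < mut_rate:
                        child[k] = random.randrange(N_PMS)
                nxt.append(child)
            pop = nxt
        return min(pop, key=energy)

    best = ga()
    print("placement:", best, "energy:", round(energy(best), 1))

The network term rewards co-locating VM pairs that exchange heavy traffic, so the search must trade consolidating onto few servers against keeping communication local, which is exactly the tension a server-only objective misses.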
Abstract:
Neighbourhood liveability is usually measured either by subjective indicators, using surveys of residents' perceptions, or by objective means, using secondary data or relative weights for objective indicators of the urban environment. Rarely have objective and subjective indicators been related to one another in order to understand what constitutes a liveable urban neighbourhood, both spatially and behaviourally. This paper explores the use of qualitative (diaries, in-depth interviews) and quantitative (Global Positioning Systems, Geographical Information Systems mapping) liveability research data to examine the perceptions and behaviour of 12 older residents living in six high-density urban areas of Brisbane. Older urban Australians are one of the two principal groups highly attracted to high-density urban living. The strength of the relationship between the qualitative and quantitative measures was examined, and the results indicate a weak relationship between subjective and objective indicators. Linking the two methods (quantitative and qualitative) is important in obtaining a greater understanding of human behaviour and the lived world of older urban Australians, and in providing a wider picture of the urban neighbourhood.
Abstract:
Here we present a sequential Monte Carlo approach to Bayesian sequential design that incorporates model uncertainty. The methodology is demonstrated through the development and implementation of two model discrimination utilities, mutual information and total separation, but it can also be applied more generally to other experimental aims. A sequential Monte Carlo algorithm is run for each rival model (in parallel) and provides a convenient estimate of the marginal likelihood of each model given the data, which can be used for model comparison and in the evaluation of utility functions. A major benefit of this approach is that it requires very little problem-specific tuning and is computationally efficient compared to full Markov chain Monte Carlo approaches. This research is motivated by applications in drug development and chemical engineering.
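As a rough sketch of how an SMC run per model yields the marginal likelihood used for comparison, consider two rival models for i.i.d. Gaussian observations; the models, the priors and the omission of resample-move (MCMC) steps and of the discrimination utilities themselves are simplifications for illustration, not the paper's algorithm.

    import numpy as np

    rng = np.random.default_rng(42)
    data = rng.normal(0.8, 1.0, size=30)      # observations to be explained

    def smc_log_evidence(loglike, prior_sample, n=2000):
        """Data-tempered SMC: add one observation at a time, reweight,
        resample; returns an estimate of log p(y_1:T | model)."""
        theta = prior_sample(n)
        logw = np.zeros(n)
        log_z = 0.0
        for y in data:
            incr = loglike(theta, y)          # log p(y_t | theta_i)
            w = np.exp(logw - logw.max()); w /= w.sum()
            # log of the weighted mean of incremental weights (log-sum-exp).
            log_z += np.log(np.sum(w * np.exp(incr - incr.max()))) + incr.max()
            logw += incr
            wn = np.exp(logw - logw.max()); wn /= wn.sum()
            if 1.0 / np.sum(wn ** 2) < n / 2:  # effective sample size check
                theta = theta[rng.choice(n, size=n, p=wn)]
                logw = np.zeros(n)
        return log_z

    # Rival models: M1 has an unknown mean with a N(0, 1) prior;
    # M2 fixes the mean at zero, so its "particles" are degenerate.
    log_z1 = smc_log_evidence(
        lambda th, y: -0.5 * (y - th) ** 2 - 0.5 * np.log(2 * np.pi),
        lambda n: rng.normal(0.0, 1.0, n))
    log_z2 = smc_log_evidence(
        lambda th, y: np.full_like(th, -0.5 * y ** 2 - 0.5 * np.log(2 * np.pi)),
        lambda n: np.zeros(n))
    p_m1 = 1.0 / (1.0 + np.exp(log_z2 - log_z1))  # equal prior model weights
    print(f"log Z1={log_z1:.2f}  log Z2={log_z2:.2f}  P(M1|y)={p_m1:.3f}")

In a design setting, estimates like P(M1|y) would be computed for candidate future observations and fed into a utility such as mutual information to score each candidate design.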
Abstract:
Australian higher education institutions (HEIs) have entered a new phase of regulation and accreditation which includes performance-based funding relating to the participation and retention of students from social and cultural groups previously underrepresented in higher education. However, in addressing these priorities, it is critical that HEIs do not further disadvantage students from certain groups by singling them out for attention because of their social or cultural backgrounds, circumstances which are largely beyond students' control. In response, many HEIs are focusing effort on university-wide approaches to enhancing the student experience, because such approaches enhance the engagement, success and retention of all students and, in doing so, particularly benefit students from underrepresented groups. Measuring and benchmarking the student experiences and engagement that arise from these efforts is well supported by extensive collections of student experience survey data. However, no comparable instrument exists that measures the capability of institutions to influence and/or enhance student experiences, where capability is an indication of how well an organisational process does what it is designed to do (Rosemann & de Bruin, 2005). We have proposed that the concept of a maturity model (Marshall, 2010; Paulk, 1999) may be useful as a way of assessing the capability of HEIs to provide and implement student engagement, success and retention activities, and we are currently articulating a Student Engagement, Success and Retention Maturity Model (SESR-MM) (Clarke, Nelson & Stoodley, 2012; Nelson, Clarke & Stoodley, 2012). Our research addresses this gap by facilitating the development of an SESR-MM instrument that aims (i) to enable institutions to assess the capability of their current student engagement and retention programs and strategies to influence and respond to student experiences within the institution; and (ii) to provide institutions with the opportunity to understand various practices across the sector with a view to further improving programs and practices relevant to their context. Our research extends the generational approach which has been useful in considering the evolutionary nature of the first year experience (FYE) (Wilson, 2009). Three generations have been identified and explored: first generation approaches that focus on co-curricular strategies (e.g. orientation and peer programs); second generation approaches that focus on curriculum (e.g. pedagogy, curriculum design, and learning and teaching practice); and third generation approaches, also referred to as transition pedagogy, that focus on the production of an institution-wide, integrated, holistic and intentional blend of curricular and co-curricular activities (Kift, Nelson & Clarke, 2010). Our research also moves beyond assessments of students' experiences to focus on assessing institutional processes and their capability to influence student engagement. In essence, we propose to develop and use the maturity model concept to produce an instrument that will indicate the capability of HEIs to manage and improve student engagement, success and retention programs and strategies.
The issues explored in this workshop are (i) whether the maturity model concept can be usefully applied to provide a measure of institutional capability for SESR; (ii) whether the SESR-MM can be used to assess the maturity of a particular set of institutional practices; and (iii) whether a collective assessment of an institution's SESR capabilities can provide an indication of the maturity of the institution's SESR activities. The workshop will be approached in three stages. Firstly, participants will be introduced to the key characteristics of maturity models, followed by a discussion of the SESR-MM and the processes involved in its development. Secondly, participants will be provided with resources to facilitate the development of a maturity model and an assessment instrument for a range of institutional processes and related practices. In the final stage of the workshop, participants will "assess" the capability of these practices to provide a collective assessment of the maturity of these processes.

References

Australian Council for Educational Research. (n.d.). Australasian Survey of Student Engagement. Retrieved from http://www.acer.edu.au/research/ausse/background

Clarke, J., Nelson, K., & Stoodley, I. (2012, July). The Maturity Model concept as framework for assessing the capability of higher education institutions to address student engagement, success and retention: New horizon or false dawn? A Nuts & Bolts presentation at the 15th International Conference on the First Year in Higher Education, "New Horizons," Brisbane, Australia.

Department of Education, Employment and Workplace Relations. (n.d.). The University Experience Survey. Advancing quality in higher education information sheet. Retrieved from http://www.deewr.gov.au/HigherEducation/Policy/Documents/University_Experience_Survey.pdf

Kift, S., Nelson, K., & Clarke, J. (2010). Transition pedagogy: A third generation approach to FYE. A case study of policy and practice for the higher education sector. The International Journal of the First Year in Higher Education, 1(1), 1-20.

Marshall, S. (2010). A quality framework for continuous improvement of e-Learning: The e-Learning Maturity Model. Journal of Distance Education, 24(1), 143-166.

Nelson, K., Clarke, J., & Stoodley, I. (2012). An exploration of the Maturity Model concept as a vehicle for higher education institutions to assess their capability to address student engagement: A work in progress. Submitted for publication.

Paulk, M. (1999). Using the Software CMM with good judgment. ASQ Software Quality Professional, 1(3), 19-29.

Wilson, K. (2009, June–July). The impact of institutional, programmatic and personal interventions on an effective and sustainable first-year student experience. Keynote address presented at the 12th Pacific Rim First Year in Higher Education Conference, "Preparing for Tomorrow Today: The First Year as Foundation," Townsville, Australia. Retrieved from http://www.fyhe.com.au/past_papers/papers09/ppts/Keithia_Wilson_paper.pdf
Abstract:
There is still no comprehensive information strategy governing access to and reuse of public sector information on a nationwide basis, across all levels of government (local, state and federal) in Australia. This is the case both for public sector materials generally and for spatial data in particular. Nevertheless, the last five years have seen significant developments in information policy and practice, which have considerably lessened the barriers that previously impeded the accessibility and reusability of a great deal of spatial and other material held by public sector agencies. Much of the impetus for change has come from the spatial community, which has for many years been a proponent of the view "that government held information, and in particular spatial information, will play an absolutely critical role in increasing the innovative capacity of this nation."1 However, the potential of government spatial data to contribute to innovation will remain unfulfilled without reform of policies on access and reuse, as well as of the pervasive practices of public sector data custodians who have relied on government copyright to justify imposing restrictive conditions on its use.
Abstract:
Open Educational Resources (OER) are teaching, learning and research materials that have been released under an open licence permitting online access and re-use by others. The 2012 Paris OER Declaration encourages the open licensing of educational materials produced with public funds. Digital data and datasets produced as a result of scientific and non-scientific research are an increasingly important category of educational materials. This paper discusses the legal challenges presented when publicly funded research data is made available as OER, arising from intellectual property rights, confidentiality and information privacy laws, and the lack of a legal duty to ensure data quality. If these legal challenges are not understood, addressed and effectively managed, they may impede and restrict access to and re-use of research data. This paper identifies some of the legal challenges that need to be addressed and describes 10 proposed best practices recommended for adoption so that publicly funded research data can be made available for access and re-use as OER.
Abstract:
Members of the World Trade Organisation (WTO) are obliged to implement the Agreement on Trade-Related Aspects of Intellectual Property Rights 1994 (TRIPS), which establishes minimum standards for the protection and enforcement of intellectual property rights. Almost two decades after TRIPS was adopted at the conclusion of the Uruguay Round of trade negotiations, it is widely accepted that intellectual property systems in developing and least-developed countries must be consistent with, and serve, their development needs and objectives. In adopting the Development Agenda in 2007, the World Intellectual Property Organisation (WIPO) emphasised the importance to developing and least-developed countries of being able to obtain access to knowledge and technology and to participate in collaborations and exchanges with research and scientific institutions in other countries. Access to knowledge, information and technology is crucial if creativity and innovation are to be fostered in developing and least-developed countries. It is particularly important that developing and least-developed countries give effect to their TRIPS obligations by implementing intellectual property systems and adopting intellectual property management practices that enable them to benefit from knowledge flows and support their engagement in international research and science collaborations. However, developing and least-developed countries did not participate in the deliberations leading to the adoption in 2004 by Organisation for Economic Co-operation and Development (OECD) member countries of the Ministerial Declaration on Access to Research Data from Public Funding, nor have they formulated policies on access to publicly funded research outputs such as those developed by the National Institutes of Health in the United States, the United Kingdom Research Councils or the Australian National Health and Medical Research Council. These issues are considered from the viewpoint of Malaysia, a developing country whose economy has grown strongly in recent years. In the absence of an established policy covering access to the outputs of publicly funded research, data sharing and licensing practices remain fragmented, and obtaining access to research data requires arrangements to be negotiated with individual data owners and custodians. Given the potential for restrictions on access to impact negatively on scientific progress and development in Malaysia, measures are required to ensure that access to knowledge and research results is facilitated. This paper proposes a policy framework for Malaysia's public research universities that recognises intellectual property rights while enabling the open access to research data that is essential for innovation and development. It also considers how intellectual property rights in research data can be managed in order to give effect to the policy's open access objectives.
Abstract:
Buildings are key mediators between human activity and the environment around them, but details of energy usage and activity in buildings are often poorly communicated and understood. ECOS is an Eco-Visualization project that aims to contextualize the energy generation and consumption of a green building across a variety of different climates. The ECOS project is being developed for a large public interactive space installed in the new Science and Engineering Centre of the Queensland University of Technology, which is dedicated to delivering interactive science education content to the public. This paper focuses on how design can develop ICT solutions from large data sets to create meaningful engagement with environmental data.
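One routine step in turning a large building-data feed into display-ready content is temporal aggregation; the sketch below, with an invented week of minute-level meter readings and column names, shows the kind of pandas down-sampling a public visualization of consumption versus generation could sit on top of.

    import numpy as np
    import pandas as pd

    # Hypothetical one-week feed of minute-level readings (kW) for a building.
    idx = pd.date_range("2013-03-04", periods=7 * 24 * 60, freq="min")
    rng = np.random.default_rng(7)
    daylight = np.clip(np.sin(2 * np.pi * (idx.hour.to_numpy() - 6) / 24), 0, None)
    df = pd.DataFrame({
        "consumption_kw": 40 + 25 * daylight + rng.normal(0, 2, len(idx)),
        "generation_kw": 30 * daylight,          # e.g. rooftop photovoltaics
    }, index=idx)

    # Down-sample to hourly means so a public display can animate a full day
    # at a glance, and derive the net figure the visualization contextualizes.
    hourly = df.resample("h").mean()
    hourly["net_kw"] = hourly["consumption_kw"] - hourly["generation_kw"]
    print(hourly.head())

Aggregating close to the data source keeps the interactive display responsive, since the installation only ever renders a few hundred summary points rather than the raw feed.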
Abstract:
Key decisions at the collection, pre-processing, transformation, mining and interpretation phases of any knowledge discovery from databases (KDD) process depend heavily on assumptions and theoretical perspectives relating to the type of task to be performed and the characteristics of the data sourced. In this article, we compare and contrast the theoretical perspectives and assumptions taken in data mining exercises in the legal domain with those adopted in data mining in Traditional Chinese Medicine (TCM) and allopathic medicine. The juxtaposition yields insights for the application of KDD to TCM.
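To make those phases concrete, here is a toy end-to-end pipeline; the dataset and the scikit-learn estimator choices are illustrative stand-ins, and each choice embeds exactly the kind of assumption the article argues differs between the legal and medical domains.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.tree import DecisionTreeClassifier

    # Collection: a stand-in dataset; a legal or TCM study would source case
    # records or clinical observations here, under its own assumptions.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Pre-processing/transformation: scaling is one choice among many, and the
    # right one depends on the characteristics of the data sourced.
    # Mining: a shallow decision tree, chosen because its rules are
    # inspectable, which matters when domain experts must interpret them.
    kdd = Pipeline([("transform", StandardScaler()),
                    ("mine", DecisionTreeClassifier(max_depth=3, random_state=0))])
    kdd.fit(X_train, y_train)

    # Interpretation: held-out accuracy is only a proxy; the task type fixed
    # upstream (classification here) constrains what "knowledge" is found.
    print("held-out accuracy:", round(kdd.score(X_test, y_test), 3))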
Abstract:
Evaluation practices in the Higher Education sector have been criticised for having unclear purpose and principles; ignoring the complexity and changing nature of learning and teaching and the environments in which they occur; relying almost exclusively on student ratings of teachers working in classroom settings; lacking reliability and validity; using data for inappropriate purposes; and focusing on accountability and marketing rather than the improvement of learning and teaching. In response to similar criticism from stakeholders, in 2011 Queensland University of Technology (QUT) began a project that aims to reframe the organisation's approach to the evaluation of learning and teaching. This paper describes the existing evaluation system; the emergence and early development of the project; and the formulation of a conceptual framework identifying key dimensions of evaluation. It then compares the draft framework with other conceptualisations and models of evaluation identified in the literature to determine its validity and suitability for supporting QUT's plans for the future. Overall, the paper represents a structured evaluation of the REFRAME project at a particular point in its lifecycle. Given that the project follows an evidence-based, practice-led process and applies an ongoing action research cycle, the findings are presented in the belief that QUT's experience is broadly applicable to other institutions contemplating change in their evaluation of learning and teaching.