Abstract:
Bi-2212 tapes are prepared by a combination of dip-coating and partial melt processing. We investigate the effect of re-melting these tapes by partial melting followed by slow cooling on their structure and superconducting properties. Microstructural studies of re-melted samples show that they have the same overall composition as partially melted tapes. However, the fractional volumes of the secondary phases differ, and the amounts and distribution of the secondary phases have a significant effect on the critical current. The critical current of Bi-2212/Ag tapes depends strongly on the maximum processing temperature. Initial J_c's of the tapes, which are partially melted, then slowly solidified under optimum conditions and finally post-annealed in an inert atmosphere, are up to 10.4 × 10^3 A/cm^2. It is found that the maximum processing temperature at initial partial melting has an influence on the optimum re-heat treatment conditions for the tapes. Re-melted tapes processed under optimum conditions recover their superconducting properties after post-annealing in an inert atmosphere: the J_c values of the tapes are about 80-110% of the initial J_c's of those tapes.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, although research teams can save time and money by recruiting citizens to volunteer their time and skills for data analysis, the reliability of contributed data varies considerably. Data reliability issues are significant in Citizen Science because of the quantity and diversity of the people and devices involved. Participants may submit low-quality, misleading, inaccurate, or even malicious data. Therefore, finding ways to improve data reliability has become an urgent demand. This study aims to investigate techniques for enhancing the reliability of data contributed by citizens to scientific research projects, especially acoustic sensing projects. In particular, we propose to design a reputation framework to enhance data reliability, and we investigate critical elements that designers should be aware of when developing new reputation systems.
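One common family of reputation systems the abstract alludes to scores contributors from the history of their accepted and rejected submissions. The sketch below is an illustrative beta-reputation scheme, not the framework the study proposes; the class name, prior counts, and update rule are all assumptions.

```python
# Hypothetical beta-reputation score for a citizen-science contributor:
# each submission judged correct or incorrect updates pseudo-counts, and
# the score is the expected probability of a correct submission.

class Reputation:
    """Beta-distribution reputation: score = good / (good + bad)."""

    def __init__(self):
        self.good = 1.0   # prior pseudo-count of correct submissions
        self.bad = 1.0    # prior pseudo-count of incorrect submissions

    def record(self, correct: bool) -> None:
        """Update the counts after a submission is vetted."""
        if correct:
            self.good += 1
        else:
            self.bad += 1

    def score(self) -> float:
        return self.good / (self.good + self.bad)

r = Reputation()
for outcome in [True, True, True, False]:
    r.record(outcome)
print(round(r.score(), 2))  # (1+3) / (2+4) -> 0.67
```

Scores like this can then weight each participant's data during aggregation, so that low-reputation contributions influence the result less.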
Abstract:
Process-oriented thinking has become the major paradigm for managing companies and other organizations. The push for better processes has become even more intense due to rapidly evolving client needs, borderless global markets and innovations swiftly penetrating the market. Thus, education is decisive for successfully introducing and implementing Business Process Management (BPM) initiatives. However, BPM education has been an area of challenge. This special issue aims to present current research on various aspects of BPM education. It is an initial effort to consolidate better practices, experiences and pedagogical outcomes grounded in empirical evidence, contributing to the three pillars of education: learning, teaching, and disseminating knowledge in BPM.
Abstract:
A video detailing our new virtual world BPMN process modelling tool, developed by Erik Poppe. It enables better situational awareness through the use of remotely connected avatars and a shared 3D process diagram.
Abstract:
A video detailing three process model visualisation configurations integrated into an agent-driven virtual world simulation.
Abstract:
Process mining is the research area concerned with knowledge discovery from information system event logs. Within process mining, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is highly valuable for the development and assessment of both process discovery and conformance techniques.
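To make the idea of a conformance metric concrete, the toy function below computes one very coarse measure: the fraction of log traces the model can replay exactly. Real conformance checking (e.g. in ProM) relies on token replay or alignments; this set-based version is only an illustrative sketch, and the example model and log are invented.

```python
# Minimal illustration of a conformance metric: the share of traces in an
# event log that a model accepts. The model is represented naively as the
# set of traces it allows; real techniques use Petri-net replay instead.

def trace_fitness(log, model_language):
    """Fraction of log traces found in the model's trace language."""
    accepted = sum(1 for trace in log if tuple(trace) in model_language)
    return accepted / len(log)

model = {("a", "b", "c"), ("a", "c", "b")}   # traces the model allows
log = [["a", "b", "c"], ["a", "c", "b"],
       ["a", "b", "b"], ["a", "b", "c"]]      # observed behaviour

print(trace_fitness(log, model))  # 3 of 4 traces fit -> 0.75
```

A framework like the one described would compute many such metrics (fitness, precision, generalization, simplicity) side by side over the same log and model, so results are comparable and repeatable.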
Abstract:
This work has led to the development of empirical mathematical models that quantitatively predict changes of morphology in osteocyte-like cell lines (MLO-Y4) in culture. MLO-Y4 cells were cultured at low density and the changes in morphology were recorded over 11 hours. Cell area and three dimensionless shape features, including aspect ratio, circularity and solidity, were then determined using widely accepted image analysis software (ImageJ). Based on the data obtained from the image analysis, mathematical models were developed using non-linear regression. The developed models accurately predict the morphology of MLO-Y4 cells for different culture times and can therefore be used as a reference for analyzing MLO-Y4 cell morphology changes in various biological/mechanical studies, as necessary.
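As a hedged illustration of the non-linear regression step, the sketch below fits a saturating-growth curve area(t) = A·(1 − exp(−k·t)) to synthetic cell-area measurements by a coarse grid search. The model form, parameter names and data are assumptions for demonstration only, not the models developed in the study.

```python
import math

# Fit area(t) = A * (1 - exp(-k * t)) to (time, area) data by brute-force
# least squares over a parameter grid. Illustrative only: a real analysis
# would use a proper non-linear least-squares solver.

def fit_saturating_growth(times, areas, A_grid, k_grid):
    """Return the (A, k) pair minimising the sum of squared errors."""
    best = None
    for A in A_grid:
        for k in k_grid:
            sse = sum((a - A * (1 - math.exp(-k * t))) ** 2
                      for t, a in zip(times, areas))
            if best is None or sse < best[0]:
                best = (sse, A, k)
    return best[1], best[2]

# Synthetic "measurements" generated from A=100, k=0.5 over 11 hours
times = [1, 2, 4, 6, 8, 11]
areas = [100 * (1 - math.exp(-0.5 * t)) for t in times]

A_grid = [90 + i for i in range(21)]       # candidate A: 90..110
k_grid = [0.1 * i for i in range(1, 11)]   # candidate k: 0.1..1.0
A, k = fit_saturating_growth(times, areas, A_grid, k_grid)
print(A, k)  # recovers the generating parameters A=100, k=0.5
```

Once fitted, such a model serves exactly the reference role the abstract describes: for any culture time t it predicts the expected cell area, against which measured morphology changes can be compared.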
Abstract:
Since 2000 there has been pressure on education systems to develop in students a number of competences that are described as generic. This pressure stems from studies of the changing nature of work in the now-dominant Knowledge Society. The DeSeCo project identified a number of these competences and listed them under the headings of communicative, analytical and personal. They include thinking, creativity, communication skills, knowing how to learn, working in teams, adapting to change, and problem solving. These competences pose a substantial challenge to the manner in which education as a whole, and science education in particular, has hitherto been generally conceived. It is now common to find their importance acknowledged in new formulations of the curriculum. The paper reviews a number of these curriculum documents and how they have tried to relate these competences to the teaching and learning of science, a subject with its own very specific content for learning. It is suggested that the challenge provides an opportunity for a reconstruction of the teaching and learning of science in schools that will increase its effectiveness for more students.
Abstract:
The management and improvement of business processes is a core topic of the information systems discipline. The persistent demand in corporations across all industry sectors for increased operational efficiency and innovation, an emerging set of established and evaluated methods, tools, and techniques, as well as the quickly growing body of academic and professional knowledge, are indicative of the standing that Business Process Management (BPM) has nowadays. During the last decades, intensive research has been conducted with respect to the design, implementation, execution, and monitoring of business processes. Comparatively little attention, however, has been paid to questions related to organizational issues such as the adoption, usage, implications, and overall success of BPM approaches, technologies, and initiatives. This research gap motivated us to edit a corresponding special focus issue of the journal BISE/WIRTSCHAFTSINFORMATIK. We are happy to present a selection of three research papers and a state-of-the-art paper in the scientific section of the issue at hand. As these papers differ in the topics they investigate, the research methods they apply, and the theoretical foundations they build on, the diversity within the BPM field becomes evident. The academic papers are complemented by an interview with Phil Gilbert, IBM’s Vice President for Business Process and Decision Management, who reflects on the relationship between business processes and the data flowing through them, the need to establish a process context for decision making, and the calibration of BPM efforts toward executives who see processes as a means to an end, rather than a first-order concept in its own right.
Abstract:
Widespread adoption by electricity utilities of Non-Conventional Instrument Transformers, such as optical or capacitive transducers, has been limited by the lack of a standardised interface and of multi-vendor interoperability. Low power analogue interfaces are being replaced by IEC 61850-9-2 and IEC 61869-9 digital interfaces that use Ethernet networks for communication. These ‘process bus’ connections achieve significant cost savings by simplifying connections between the switchyard and control rooms; however, the in-service performance when these standards are employed is largely unknown. The performance of real-time Ethernet networks and time synchronisation was assessed using a scale model of a substation automation system. The test bed was constructed from commercially available timing and protection equipment supplied by a range of vendors. Test protocols were developed to thoroughly evaluate the performance of Ethernet networks and network-based time synchronisation. The suitability of IEEE Std 1588 Precision Time Protocol (PTP) as a synchronising system for sampled values was tested in the steady state and under transient conditions. Similarly, the performance of hardened Ethernet switches designed for substation use was assessed under a range of network operating conditions. This paper presents test methods that use a precision Ethernet capture card to accurately measure PTP and network performance. These methods can be used for product selection and to assess ongoing system performance as substations age. Key findings on the behaviour of multi-function process bus networks are presented. System level tests were performed using a Real Time Digital Simulator and a transformer protection relay with sampled value and Generic Object Oriented Substation Events (GOOSE) capability. These include the interactions between sampled values, PTP and GOOSE messages.
Our research has demonstrated that several protocols can be used on a shared process bus, even with very high network loads. This should provide confidence that this technology is suitable for transmission substations.
Abstract:
Communication processes are vital in the lifecycle of BPM projects. With this in mind, much research has been performed into facilitating this key component between stakeholders. Among the methods used to support this process are personalised process visualisations. In this paper, we review the development of this visualisation trend and then propose a theoretical analysis framework based upon communication theory. We use this framework to provide theoretical support for the conjecture that 3D virtual worlds are powerful tools for communicating personalised visualisations of processes within a workplace. Meta-requirements are then derived and applied, via 3D virtual world functionalities, to generate example visualisations containing personalised aspects, which we believe enhance communication between analysts and stakeholders in BPM process (re)design activities.
Abstract:
The Beauty Leaf tree (Calophyllum inophyllum) is a potential source of non-edible vegetable oil for producing future-generation biodiesel because of its ability to grow in a wide range of climate conditions, its easy cultivation, high fruit production rate, and the high oil content of the seed. This plant occurs naturally in the coastal areas of Queensland and the Northern Territory in Australia, and is also widespread in south-east Asia, India and Sri Lanka. Although Beauty Leaf is traditionally used as a source of timber and as an ornamental plant, its potential as a source of second-generation biodiesel is yet to be exploited. In this study, the extraction process from the Beauty Leaf oil seed has been optimised in terms of seed preparation, moisture content and oil extraction methods. The two methods considered for extracting oil from the seed kernel are mechanical extraction using an electric-powered screw press, and chemical extraction using n-hexane as an oil solvent. The study found that seed preparation has a significant impact on oil yields, especially in the screw press extraction method. Kernels prepared to 15% moisture content provided the highest oil yields for both extraction methods. Mechanical extraction using the screw press can produce oil from correctly prepared product at low cost; however, overall this method is ineffective, with relatively low oil yields. Chemical extraction was found to be a very effective method of oil extraction for its consistent performance and high oil yield, but the cost of production was relatively high due to the high cost of the solvent. However, a solvent recycling system could be implemented to reduce the production cost of Beauty Leaf biodiesel. The findings of this study are expected to serve as the basis from which industrial-scale biodiesel production from Beauty Leaf can be developed.
Abstract:
The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students’ conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and ‘coverage’, respectively. The results allowed us to directly compare students’ abilities to characterise a computational problem, as a unit test suite, and to develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students’ unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students’ testing abilities lag well behind their coding skills.
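The tests-as-specification idea the abstract builds on can be sketched briefly: a small test suite, written before any implementation, pins down what the program must do. The task here (run-length encoding) and all names are illustrative assumptions, not the problems used in the study.

```python
# Tests-first sketch: spec_rle() is the "requirements specification",
# written before rle() itself; an implementation is correct when the
# spec passes. The task is an invented example.

def rle(s: str) -> str:
    """Run-length encode a string: 'aaab' -> 'a3b1'."""
    if not s:
        return ""
    out, run, count = [], s[0], 1
    for ch in s[1:]:
        if ch == run:
            count += 1
        else:
            out.append(f"{run}{count}")
            run, count = ch, 1
    out.append(f"{run}{count}")
    return "".join(out)

def spec_rle():
    # Written first, these assertions define the required behaviour.
    assert rle("") == ""               # empty input stays empty
    assert rle("aaabcc") == "a3b1c2"   # runs encoded as char + count
    assert rle("x") == "x1"            # a single character is a run of 1

spec_rle()  # the implementation above satisfies the spec
```

In the study's setting, the students' suites would be graded on coverage (how much required behaviour they pin down) while their implementations are graded on correctness against a reference specification.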
Abstract:
Automated process discovery techniques aim to extract models from information system logs in order to shed light on the business processes supported by these systems. Existing techniques in this space are effective when applied to relatively small or regular logs, but otherwise generate large and spaghetti-like models. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. The result is a collection of process models -- each one representing a variant of the business process -- as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically by means of subprocess extraction. The proposed technique allows users to set a desired bound on the complexity of the produced models. Experiments on real-life logs show that the technique produces collections of models that are up to 64% smaller than those extracted under the same complexity bounds by applying existing trace clustering techniques.
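The trace-clustering step described above can be illustrated with a deliberately simple rule: group traces that use exactly the same set of activities, then discover one model per group. The grouping rule and example log are assumptions for demonstration; real trace clustering uses richer similarity measures, and real discovery produces process models rather than trace sets.

```python
from collections import defaultdict

# Toy trace clustering: split an event log into clusters of traces that
# share the same activity set, so each cluster captures one process
# variant and can be fed to a discovery algorithm separately.

def cluster_by_activity_set(log):
    """Group traces by the (unordered) set of activities they contain."""
    clusters = defaultdict(list)
    for trace in log:
        clusters[frozenset(trace)].append(trace)
    return list(clusters.values())

log = [["a", "b", "c"],       # variant 1, ordering 1
       ["a", "c", "b"],       # variant 1, ordering 2
       ["a", "d"],            # variant 2
       ["a", "d", "d"]]       # variant 2 with a repeated activity
clusters = cluster_by_activity_set(log)
print(len(clusters))  # two variants -> 2 clusters
```

The paper's two-way technique goes further than this per-variant split: within each variant it also extracts common subprocesses hierarchically, repeating both splits until each model falls under the user-set complexity bound.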