830 results for blended workflow


Relevance:

10.00%

Publisher:

Abstract:

Interdisciplinary research is often funded by national government initiatives or large corporate sponsorship and, as such, demands periodic reporting on the use of those funds. For reasons of accountability, governance and communication to the taxpayer, knowledge of the outcomes of the research needs to be measured and understood. The interdisciplinary approach to research raises many challenges for impact reporting. This presentation considers best-practice workflow models and methodologies. Novel methodologies that can be added to the usual metrics of academic publications include analysis of the percentage share of total publications in a subject or keyword field, identification of the most cited publication in a key-phrase category, analysis of who has cited or reviewed the work, and benchmarking of these data against others in the same category. At QUT, interest in how collaborative networking is trending in a research theme has led to the creation of some useful co-authorship graphs that demonstrate the network positions of authors and the strength of their scientific collaborations within a group. The scale of international collaboration is also worth including in the assessment. However, despite all of the tools and techniques available, the most useful way researchers can help themselves and the process is to set up and maintain their researcher identifier and profile.
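
As an illustration only (not QUT’s actual tooling), a co-authorship graph of the kind described can be assembled from publication author lists; the author names below are placeholders and the networkx library is assumed to be available:

    import networkx as nx
    from itertools import combinations

    # Placeholder publication data: each entry lists the authors of one paper.
    papers = [["Smith", "Chen", "Patel"], ["Smith", "Chen"], ["Patel", "Nguyen"]]

    g = nx.Graph()
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            if g.has_edge(a, b):
                g[a][b]["weight"] += 1  # edge weight = number of co-authored papers
            else:
                g.add_edge(a, b, weight=1)

    # Network position (degree centrality) and collaboration strength (edge weights).
    print(nx.degree_centrality(g))
    print(sorted(g.edges(data="weight")))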

Relevance:

10.00%

Publisher:

Abstract:

MapReduce frameworks such as Hadoop are well suited to handling large sets of data that can be processed separately and independently, with canonical applications in information retrieval and sales record analysis. Rapid advances in sequencing technology have ensured an explosion in the availability of genomic data, with a consequent rise in the importance of large-scale comparative genomics, often involving operations and data relationships that deviate from the classical MapReduce structure. This work examines the application of Hadoop to patterns of this nature, using as our focus a well-established workflow for identifying promoters (binding sites for regulatory proteins) across multiple gene regions and organisms, coupled with the unifying step of assembling these results into a consensus sequence. Our approach demonstrates the utility of Hadoop for problems of this nature, showing how the tyranny of the "dominant decomposition" can be at least partially overcome. It also demonstrates how load balance and the granularity of parallelism can be optimized by pre-processing that splits and reorganizes input files, allowing a wide range of related problems to be brought under the same computational umbrella.
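
A minimal sketch of the kind of decomposition involved, assuming Hadoop Streaming and an input of one tab-separated gene region per line; the motifs, file names and naive string matching below are placeholders rather than the promoter models and workflow used in the paper:

    #!/usr/bin/env python3
    # Hypothetical Hadoop Streaming job: count occurrences of candidate promoter
    # motifs across gene-region sequences. The same script serves as mapper and
    # reducer, e.g.:
    #   hadoop jar hadoop-streaming.jar -files motif_count.py \
    #     -input regions.tsv -output motif_counts \
    #     -mapper "motif_count.py map" -reducer "motif_count.py reduce"
    import sys

    MOTIFS = ["TATAAT", "TTGACA"]  # placeholder motifs, not the paper's models

    def mapper():
        for line in sys.stdin:
            try:
                region_id, seq = line.rstrip("\n").split("\t", 1)
            except ValueError:
                continue  # skip malformed records
            seq = seq.upper()
            for motif in MOTIFS:
                start = seq.find(motif)
                while start != -1:
                    print(f"{motif}\t1")
                    start = seq.find(motif, start + 1)

    def reducer():
        # Hadoop delivers the mapper output grouped and sorted by key.
        current, total = None, 0
        for line in sys.stdin:
            key, value = line.rstrip("\n").split("\t", 1)
            if key != current:
                if current is not None:
                    print(f"{current}\t{total}")
                current, total = key, 0
            total += int(value)
        if current is not None:
            print(f"{current}\t{total}")

    if __name__ == "__main__":
        mapper() if sys.argv[1:] == ["map"] else reducer()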

Relevance:

10.00%

Publisher:

Abstract:

The complete structural elucidation of complex lipids, including glycerophospholipids, using only mass spectrometry represents a major challenge to contemporary analytical technologies. Here, we demonstrate that product ions arising from the collision-induced dissociation (CID) of the [M + Na]+ adduct ions of phospholipids can be isolated and subjected to subsequent gas-phase ozonolysis, known as ozone-induced dissociation (OzID), in a linear ion-trap mass spectrometer. The resulting CID/OzID experiment yields abundant product ions that are characteristic of the acyl substitution on the glycerol backbone (i.e., sn-position). This approach is shown to differentiate sn-positional isomers, such as the regioisomeric phosphatidylcholine pair PC 16:0/18:1 and PC 18:1/16:0. Importantly, CID/OzID provides a sensitive diagnostic for the existence of an isomeric mixture in a given sample. This is of very high value for the analysis of tissue extracts, since CID/OzID analyses can reveal changes in the relative abundance of isomeric constituents even between different tissues from the same animal. Finally, we demonstrate the ability to assign carbon-carbon double bond positions to individual acyl chains at specific backbone positions by adding subsequent CID and/or OzID steps to the workflow, and show that this can be achieved in a single step using a hybrid triple quadrupole-linear ion trap mass spectrometer. This unique approach represents the most complete and specific structural analysis of lipids by mass spectrometry demonstrated to date and is a significant step towards comprehensive top-down lipidomics. Grant Numbers: ARC/DP0986628, ARC/FT110100249, ARC/LP110200648

Relevance:

10.00%

Publisher:

Abstract:

Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of processes in the Earth’s core and focus on four types of coupling that underpin fundamental instabilities in the Earth: thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth’s surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional need to consider microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation). Another challenge is the factor of time, which implies that the geomaterial is often very far from initial yield and flows on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. Unlike consistency plasticity, this framework allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands emerge from the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales and are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which has puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a metre-wide ductile shear zone, while the centre of the thrust shows a mm- to cm-wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The metre-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.
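
For orientation, the two yardsticks named above can be written in their standard textbook forms (generic definitions, not notation taken from the review itself): the Rayleigh number compares buoyancy against viscous and thermal damping, and each THMC process contributes a diffusion length scale set by its diffusivity and a characteristic process time:

    \mathrm{Ra} = \frac{g\,\alpha\,\Delta T\,h^{3}}{\nu\,\kappa}, \qquad
    \ell_{i} = \sqrt{\kappa_{i}\,\tau}, \quad i \in \{T, H, M, C\}

where g is gravity, \alpha the thermal expansivity, \Delta T the temperature contrast over the layer thickness h, \nu the kinematic viscosity, \kappa the thermal diffusivity, \kappa_{i} the diffusivity of the respective THMC process and \tau its characteristic time.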

Relevance:

10.00%

Publisher:

Abstract:

The Business Process Management domain has evolved at a dramatic pace over the past two decades and the notion of the business process has become a ubiquitous part of the modern business enterprise. Most organizations now view their operations in terms of business processes and manage these processes in the same way as other corporate assets. In recent years, an increasingly broad range of generic technology has become available for automating business processes. This is part of a growing trend in the software engineering field over the past 40 years, in which aspects of functionality that are potentially reusable on a widespread basis have coalesced into generic software components. Figure 2.1 illustrates this trend and shows how software systems have evolved from the monolithic applications of the 1960s, often developed in their entirety by a single development team, to today’s offerings, which are based on the integration of a range of generic technologies with only a small component of the application actually developed from scratch. In the 1990s, generic functionality for the automation of business processes first became commercially available in the form of workflow technology and subsequently evolved into the broader field of business process management systems (BPMS). This technology alleviated the need to develop process support within applications from scratch and provided a variety of off-the-shelf options on which these requirements could be based. The demand for this technology was significant, and it is estimated that by 2000 there were well over 200 distinct workflow offerings in the market, each with a distinct conceptual foundation. Anticipating the difficulties that would be experienced by organizations seeking to utilize and integrate distinct workflow offerings, the Workflow Management Coalition (WfMC), an industry group formed to advance technology in this area, proposed a standard reference model for workflow technology with an express desire to seek a common platform for achieving workflow interoperation.

Relevance:

10.00%

Publisher:

Abstract:

A business process is often modeled using some kind of directed flow graph, which we call a workflow graph. The Refined Process Structure Tree (RPST) is a technique for workflow graph parsing, i.e., for discovering the structure of a workflow graph, which has various applications. In this paper, we provide two improvements to the RPST. First, we propose an alternative way to compute the RPST that is simpler than the one developed originally. In particular, the computation reduces to constructing the tree of the triconnected components of a workflow graph in the special case when every node has at most one incoming or at most one outgoing edge. Such graphs occur frequently in applications. Second, we extend the applicability of the RPST. Originally, the RPST was applicable only to graphs with a single source and a single sink such that the completed version of the graph is biconnected. We lift both restrictions, so the RPST becomes applicable to arbitrary directed graphs in which every node is on a path from some source to some sink. This includes graphs with multiple sources and/or sinks and disconnected graphs.
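
As a small illustration of the structural condition behind the first improvement (this is only the degree check, not the RPST computation itself, and networkx is assumed purely for the graph representation):

    import networkx as nx

    def has_simple_rpst_shape(g: nx.DiGraph) -> bool:
        """True if every node has at most one incoming or at most one outgoing edge."""
        return all(g.in_degree(n) <= 1 or g.out_degree(n) <= 1 for n in g.nodes)

    # Example workflow graph: a split followed by a join.
    g = nx.DiGraph([("start", "split"), ("split", "a"), ("split", "b"),
                    ("a", "join"), ("b", "join"), ("join", "end")])
    print(has_simple_rpst_shape(g))  # True: the split has one input, the join one output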

Relevance:

10.00%

Publisher:

Abstract:

Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behavioural equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. Still, these notions require exponential computation and yield only a Boolean result. In many cases, however, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for computing causal behavioural profiles using structural decomposition for sound free-choice workflow systems whose unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
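
As intuition for the relations such a profile records (order, exclusiveness, interleaving), the sketch below derives them from example traces; the article itself computes profiles from the process model via structural decomposition rather than from logs, so this is only an approximation for illustration:

    from itertools import combinations

    def trace_based_profile(traces):
        """Approximate behavioural relations from example traces:
        '->' / '<-' strict order, '||' interleaving, '+' exclusiveness."""
        activities = sorted({a for t in traces for a in t})
        follows = set()  # (a, b) if a occurs before b in some trace
        for t in traces:
            for i, a in enumerate(t):
                for b in t[i + 1:]:
                    follows.add((a, b))
        profile = {}
        for a, b in combinations(activities, 2):
            ab, ba = (a, b) in follows, (b, a) in follows
            profile[(a, b)] = "||" if ab and ba else "->" if ab else "<-" if ba else "+"
        return profile

    traces = [["register", "check", "approve"], ["register", "check", "reject"]]
    print(trace_based_profile(traces))
    # e.g. ('approve', 'reject') maps to '+', ('check', 'reject') maps to '->'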

Relevance:

10.00%

Publisher:

Abstract:

Identification of behavioural contradictions is an important aspect of software engineering, in particular for checking the consistency between a business process model used as a system specification and a corresponding workflow model used as an implementation. In this paper, we propose causal behavioural profiles, which capture essential behavioural information such as order, exclusiveness, and causality between pairs of activities, as the basis for a consistency notion. Existing notions of behavioural equivalence, such as bisimulation and trace equivalence, might also be applied as consistency notions, but they require exponential computation. Our novel concept of causal behavioural profiles provides a weaker behavioural consistency notion that can be computed efficiently using structural decomposition techniques for sound free-choice workflow systems whose unstructured net fragments are acyclic or can be traced back to S- or T-nets.

Relevance:

10.00%

Publisher:

Abstract:

There have now been two decades of rhetoric on the need for culturally and contextually appropriate perspectives in international education. However, the extent to which courses, provision and pedagogy have truly reflected differences in cultural characteristics and learning preferences is still open to question. Little attention has been paid to these matters in quality assurance frameworks. This chapter discusses these issues and draws upon Hofstede’s cultural dimensions framework and studies into Asian pedagogy and uses of educational technology. It proposes a benchmark and performance indicators for assuring cultural, contextual, educational and technological appropriateness in the provision of transnational distance education in Asia by Australian universities.

Relevance:

10.00%

Publisher:

Abstract:

This is a case study of a young university striving to generate and sustain a vibrant research training culture. The university’s research training framework is informed by a belief in a project management approach to achieving successful research candidature. This has led to the definition and reporting of key milestones during candidature. In turn, these milestones have generated a range of training programs to support Higher Degree Research (HDR) students to meet them in a timely fashion. Each milestone focuses on a specific set of skills blended with support for developing different parts of the doctoral thesis. Data on student progress and completion provide evidence of the role that the milestones and training play in supporting timely completion. A university-wide reporting cycle generated data on the range of workshops and training provided to HDR students and supervisors. The report provided details of thesis topic and format, as well as participation in research training events and participant evaluation of those events. Analysis of the data led to recommendations and comments on the strengths and weaknesses of the current research training program, and the discussion considered strategies and drivers for future enhancements. In particular, the paper reflects on the significant potential role of centrally curated knowledge systems in supporting HDR student and supervisor access, engagement and success. The research training program was developed using blended learning as a model. It covered face-to-face workshops as well as online modules, supplemented by web portals that offered a range of services to inform and educate students and supervisors and included opportunities for students to interact with each other. Topics ranged from the research life cycle, writing and publication, ethics, managing research data, managing copyright, and project management to the use of software and the University’s Code of Conduct for Research. The challenges discussed included: How to reach off-campus students and those studying in external modes? How best to promote events to potential participants? How long and what format is best for face-to-face sessions? What online resources best supplement face-to-face offerings? Is there a place for peer-based learning and what form should this take? These questions are raised by a relatively young university seeking to build and sustain a vibrant research culture. The rapid growth in enrolments in recent years has challenged previous one-to-one models of support. This review of research training is timely in seeking strategies to address changing research training support capacity and student needs. Part of the discussion focuses on supervisory training, noting that good supervision is the one remaining place where one-to-one support is provided. Ensuring that supervisors are appropriately equipped to address student expectations is considered in the context of the research training provisions. The paper concludes with reflection on the challenges faced and recommended ways forward as the number of research students grows into the future.

Relevance:

10.00%

Publisher:

Abstract:

New push-pull copolymers based on thiophene (donor) and benzothiadiazole (acceptor) units, poly[4,7-bis(3-dodecylthiophene-2-yl)benzothiadiazole-co-thiophene] (PT3B1) and poly[4,7-bis(3-dodecylthiophene-2-yl)benzothiadiazole-co-benzothiadiazole] (PT2B2), are designed and synthesized via Stille and Suzuki coupling routes, respectively. Gel permeation chromatography shows number-average molecular weights of 31100 and 8400 g mol⁻¹ for the two polymers, respectively. Both polymers show absorption throughout a wide range of the UV-vis region, from 300 to 650 nm. A significant red shift of the absorption edge is observed in thin films compared to solutions of the copolymers; the optical band gap is in the range of 1.7 to 1.8 eV. Cyclic voltammetry indicates reversible oxidation and reduction processes, with HOMO energy levels calculated to be in the range of 5.2 to 5.4 eV. Upon testing both materials in organic field-effect transistors (OFETs), PT3B1 showed a hole mobility of 6.1 × 10⁻⁴ cm² V⁻¹ s⁻¹, while PT2B2 did not show any field-effect transport. Both copolymers displayed a photovoltaic response when combined with a methanofullerene as an electron acceptor. The best performance was achieved when the copolymer PT3B1 was blended with [70]PCBM in a 1:4 ratio, exhibiting a short-circuit current of 7.27 mA cm⁻², an open-circuit voltage of 0.85 V, and a fill factor of 41%, yielding a power conversion efficiency of 2.54% under simulated air mass (AM) 1.5 global (1.5G) illumination conditions (100 mW cm⁻²). Similar devices utilizing PT2B2 in place of PT3B1 demonstrated reduced performance, with a short-circuit current of 4.8 mA cm⁻², an open-circuit voltage of 0.73 V, and a fill factor of 30%, resulting in a power conversion efficiency of roughly 1.06%.
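
The reported efficiency is consistent with the standard relation between short-circuit current density, open-circuit voltage, fill factor and incident power (a textbook identity, not anything specific to this paper):

    \mathrm{PCE} = \frac{J_{\mathrm{sc}}\,V_{\mathrm{oc}}\,\mathrm{FF}}{P_{\mathrm{in}}}
                 = \frac{7.27\ \mathrm{mA\,cm^{-2}} \times 0.85\ \mathrm{V} \times 0.41}{100\ \mathrm{mW\,cm^{-2}}}
                 \approx 2.5\%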

Relevance:

10.00%

Publisher:

Abstract:

With the growing proliferation of statute law, the skill of statutory interpretation is an increasingly important aspect of legal practice. Despite its importance, statutory interpretation can be a challenging area of law to teach to undergraduate law students, who may find the topic dry and disengaging when taught through traditional methods. Such disengagement may adversely affect knowledge retention, particularly if the material is taught in the first or second year of study and not explicitly reinforced in subsequent years. Concern over the present standard of statutory interpretation skills exhibited by practitioners has prompted the Chief Justice of the Supreme Court of Queensland to contact law schools, enquiring how and to what extent statutory interpretation is being taught...

Relevance:

10.00%

Publisher:

Abstract:

Planning studio pedagogy has long been a part of planning education and has recently re-emerged as a topic of investigation. Scholarship has: 1) critically examined the fluctuating popularity of studio teaching and its changing role in contemporary planning curricula in the USA and New Zealand; 2) challenged conceptualizations of the traditional studio and considered how emerging strategies for blended and online learning and ‘real-world engagement’ are producing new modes of studio delivery; 3) considered the benefits and outcomes of studio teaching; and 4) provided recommendations for teaching practice by critically analysing studio experiences in different contexts (Aitken-Rose & Dixon, 2009; Balassiano, 2011; Balassiano & West, 2012; Balsas, 2012; Dandekar, 2009; Heumann & Wetmore, 1984; Higgins, Thomas & Hollander, 2010; Lang, 1983; Long, 2012; Németh & Long, 2012; Winkler, 2013). Twenty-three universities in Australia offer accredited planning degrees, yet data about the use of studio teaching in planning programs are limited. How, when and why are studio pedagogies used? Where studio is not part of the curriculum, why not, and has this had any impact on student outcomes? What are the opportunities and limitations of new models of studio teaching for student, academic, professional and institutional outcomes? This paper presents early ideas from a QUT seed grant on the use of studio teaching in Australian planning education, aiming to gain a better understanding of the different roles of studio teaching in planning curricula at a national level, and of the opportunities and challenges for this pedagogical mode in the face of the dilemmas facing planning education.

Relevance:

10.00%

Publisher:

Abstract:

This research contributes a fully operational approach for managing business process risk in near real-time. The approach consists of a language for defining risks on top of process models, a technique to detect such risks as they eventuate during the execution of business processes, a recommender system for making risk-informed decisions, and a technique to automatically mitigate detected risks when they are no longer tolerable. By incorporating risk management elements into all stages of the business process lifecycle, this work contributes to the effective integration of the fields of Business Process Management and Risk Management.
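
A highly simplified sketch of how such runtime risk detection and mitigation might be wired together; the condition, tolerance, likelihood function and mitigation action below are illustrative placeholders, not the thesis’s actual risk language or components:

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class ProcessInstance:
        case_id: str
        attributes: Dict[str, float] = field(default_factory=dict)

    @dataclass
    class Risk:
        name: str
        condition: Callable[[ProcessInstance], bool]   # is the risk present?
        tolerance: float                               # maximum acceptable likelihood
        mitigation: Callable[[ProcessInstance], None]  # applied when no longer tolerable

    def on_event(instance: ProcessInstance, event: Dict[str, float],
                 risks: List[Risk], likelihood: Callable[[ProcessInstance], float]) -> None:
        """Update the running instance with a new event and check every registered risk."""
        instance.attributes.update(event)
        for risk in risks:
            if risk.condition(instance) and likelihood(instance) > risk.tolerance:
                risk.mitigation(instance)

    # Example: flag an order-fulfilment case that has run over its cost budget.
    overrun = Risk(
        name="cost overrun",
        condition=lambda i: i.attributes.get("cost", 0.0) > i.attributes.get("budget", float("inf")),
        tolerance=0.5,
        mitigation=lambda i: print(f"escalate case {i.case_id}"),
    )
    case = ProcessInstance("C42", {"budget": 1000.0})
    on_event(case, {"cost": 1200.0}, [overrun], likelihood=lambda i: 0.8)  # prints the escalation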

Relevance:

10.00%

Publisher:

Abstract:

Introduction

The professional doctorate is specifically designed for professionals investigating real-world problems and issues relevant to a profession, industry and/or the community. The focus is scholarly research into professional practices. The research programme bridges academia and the professions, and offers doctoral candidates the opportunity to investigate issues relevant to their own practices and to apply these understandings to their professional contexts. The study on which this article is based sought to track the scholarly skill development of a cohort of professional doctoral students who commenced the course in January 2008 at an Australian university. Because they hold positions of responsibility and are time-poor, many doctoral students have difficulty transitioning from professional practitioner to researcher and scholar. The struggle many experience is in the development of a theoretical or conceptual standpoint for argumentation (Lesham, 2007; Weese et al., 1999). It was thought that a scaffolded learning environment drawing on a blended learning approach, incorporating face-to-face intensive blocks and collaborative knowledge-building tools such as wikis, would provide a data source for understanding the development of scholarly skills. Wikis, weblogs and similar social networking software have the potential to support communities to share, learn, create and collaborate. Each candidate in the 2008 cohort was encouraged to develop a wiki page to provide the participants and the teaching team members with textual indicators of progress. Learning tasks were scaffolded, with the expectation that candidates would complete them via the wikis and that cohort members would comment on each other’s work, together with the supervisor and/or teaching team member allocated to each candidate. The supervisor is responsible for supervising the candidate’s work through to submission of the thesis for examination, and the teaching team member provides support to both the supervisor and the candidate through to confirmation. This paper reports on the learning journey of a cohort of doctoral students during the first seven months of their professional doctoral programme to determine whether there had been any qualitative shifts in understandings, expectations and perceptions regarding their developing knowledge and skills. The paper is grounded in the literature pertaining to doctoral studies and examines the structure of the professional doctoral programme. Following this is a discussion of the qualitative study that helped to unearth key themes regarding the participants’ learning journey.