973 results for science process skills.
Abstract:
Business process management (BPM) is becoming the dominant management paradigm. Business process modelling is central to BPM, and the resultant business process model is the core artefact guiding subsequent process change. Model quality thus sits at the centre, mediating between the modelling effort and the related, growing investment in ultimate process improvements. Nonetheless, though research interest in the properties that differentiate high-quality process models is longstanding, there have been no prior reports of a valid, operationalised, holistic measure of business process model quality. In response to this gap, this paper reports the validation of a Business Process Model Quality measurement model, conceptualised as a single-order, formative index. Such a measurement model has value as the dependent variable in rigorously researching the drivers of model quality; as an antecedent of ultimate process improvements; and potentially as an economical comparator and diagnostic for practice.
Abstract:
Peeling is an essential phase of the post-harvest and processing industry; however, undesirable processing losses are unavoidable and have always been a main concern of the food processing sector. There are three methods of peeling fruits and vegetables, namely mechanical, chemical and thermal, depending on the class and type of fruit. By comparison, mechanical methods are the most preferred; mechanical peeling does not create any harmful effects on the tissue and it keeps the edible portion of the produce fresh. The main disadvantages of mechanical peeling are the rate of material loss and deformation. Obviously, reducing material losses and increasing the quality of the process have a direct effect on the overall efficiency of the food processing industry; this calls for more study of the technological aspects of these operations. In order to enhance the effectiveness of food industrial practices it is essential to have a clear understanding of material properties and the behaviour of tissues under industrial processes. This paper presents a scheme of research that seeks to examine tissue damage of tough-skinned vegetables during mechanical peeling by developing a novel FE model of the process using an explicit dynamic finite element analysis approach. A computer model of the mechanical peeling process will be developed in this study to simulate the energy consumption and the stress-strain interactions between cutter and tissue. Available finite element software and methods will be applied to establish the model. Improving knowledge of the interactions and variables involved in food operations, particularly in the peeling process, is the main objective of the proposed study. Understanding these interrelationships will help researchers and designers of food processing equipment to develop new and more efficient technologies.
The presented work reviews the available literature and previous work done in this area of research, and identifies current gaps in the modelling and simulation of food processes.
Abstract:
Aspect orientation is an important approach to addressing the complexity of cross-cutting concerns in Information Systems. This approach encapsulates these concerns separately and composes them into the main module when needed. Although different works show how this separation should be performed in process models, their composition remains an open area. In this paper, we demonstrate the semantics of a service which enables this composition. The result can also be used as a blueprint to implement the service to support aspect orientation in the Business Process Management area.
Abstract:
Reducing complexity in Information Systems is a main concern in both research and industry. One strategy for reducing complexity is separation of concerns. This strategy advocates separating various concerns, like security and privacy, from the main concern. It results in less complex, more easily maintainable, and more reusable Information Systems. Separation of concerns is addressed through the Aspect Oriented paradigm. This paradigm has been well researched and implemented in programming, where languages such as AspectJ have been developed. However, research on aspect orientation for Business Process Management is still in its infancy. While some efforts have been made proposing Aspect Oriented Business Process Modelling, it has not yet been investigated how to enact such process models in a Workflow Management System. In this paper, we define a set of requirements that specifies the execution of aspect oriented business process models. We create a Coloured Petri Net specification for the semantics of a so-called Aspect Service that fulfils these requirements. Such a service extends the capability of a Workflow Management System with support for the execution of aspect oriented business process models. The design specification of the Aspect Service is also inspected through state space analysis.
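The composition (weaving) of separately modelled concerns can be illustrated on a toy process represented as an ordered task list. This is our own simplified sketch, not the paper's Coloured Petri Net semantics; the function and task names are invented for illustration:

```python
def weave(process, aspects):
    """Toy 'aspect service': weave cross-cutting advice tasks into a main
    process, modelled here as an ordered list of task names.

    Each aspect is (advice_task, pointcut_predicate, position), where
    position is "before" or "after" the matched main-process task."""
    woven = []
    for task in process:
        for advice, pointcut, position in aspects:
            if position == "before" and pointcut(task):
                woven.append(advice)
        woven.append(task)
        for advice, pointcut, position in aspects:
            if position == "after" and pointcut(task):
                woven.append(advice)
    return woven

# hypothetical logging concern applied after every payment-related task
log_payments = ("log", lambda t: t.startswith("pay"), "after")
print(weave(["receive order", "pay invoice", "ship"], [log_payments]))
# ['receive order', 'pay invoice', 'log', 'ship']
```

The separation is preserved: the main process never mentions logging, and the same aspect definition can be woven into any number of process models.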
Abstract:
This paper examines the effects and origins of balanced skills among nascent entrepreneurs. In a first step we apply Lazear’s jack-of-all-trades theory to investigate performance effects of a balanced skill set. Second, we investigate potential sources of balanced skills, thereby testing the investment hypothesis against the endowment hypothesis. Analyzing data on high-potential nascent projects, we find support for the notion that balanced skills are important for making progress in the venture creation process. Regarding the origins of balanced skills, the data support both hypotheses. In line with the investment hypothesis, an early interest in an entrepreneurial career and prior managerial and entrepreneurial experience are significantly related to a more balanced skill set. Supporting the endowment hypothesis, an entrepreneurial personality profile indicating entrepreneurial talent is correlated with a balanced skill set. Our results thus hint at the need for theories on the origins of a balanced skill set that integrate both views.
Abstract:
In managing their operations, organizations have traditionally focused on economic imperatives in terms of time, cost, efficiency, and quality. In doing so, they have been a major contributor to environmental degradation caused by resource consumption, greenhouse emissions, and wastage. As a consequence, organizations are increasingly encouraged to improve their operations also from an ecological perspective, and thus to consider environmental sustainability as an additional management imperative. In order to lessen their impact on the natural environment, organizations must design and implement environmentally sustainable processes, which we call the challenge of Green Business Process Management (Green BPM). This chapter elaborates on the challenge and perspective of Green BPM, and explores the contributions that business process management can provide to creating environmentally sustainable organizations. Our key premise is that business as well as information technology managers need to engage in a process-focused discussion to enable a common, comprehensive understanding of organizational processes, and the process-centered opportunities for making these processes, and ultimately the organization as a process-centric entity, “green.” Through our review of the key BPM capability areas and how they can be framed in terms of environmental sustainability considerations, we provide an overview and introduction to the subsequent chapters in this book.
Abstract:
A biomass pretreatment process was developed using acidified ionic liquid (IL) solutions containing 10-30% water. Pretreatment of sugarcane bagasse at 130°C for 30 min with an aqueous 1-butyl-3-methylimidazolium chloride (BMIMCl) solution containing 1.2% HCl resulted in a glucan digestibility of 94-100% after 72 h of enzymatic hydrolysis. HCl was found to be a more effective catalyst than H₂SO₄ or FeCl₃. Increasing the acid concentration (from 0.4% to 1.2%) and the reaction temperature (from 90 to 130°C) increased glucan digestibility. The glucan digestibility of the solid residue obtained with acidified BMIMCl solution that had been reused three times was >97%. The addition of water to ILs for pretreatment could significantly reduce IL solvent costs and allow for increased biomass loadings, making pretreatment by ILs a more economic proposition.
Abstract:
The conversion of biomass waste, in the form of date seed waste, into activated carbon and biofuel by a fixed bed pyrolysis reactor is the focus of this study, yielding gaseous, liquid, and solid products. The date seed in particle form is pyrolysed in an externally heated fixed bed reactor with nitrogen as the carrier gas. The reactor is heated from 400°C to 600°C. A maximum liquid yield of 50 wt.% and a char yield of 30 wt.% are obtained at a reactor bed temperature of 500°C with a running time of 120 minutes. The oil is found to possess a favourable flash point and reasonable density and viscosity. The higher calorific value is found to be 28.636 MJ/kg, which is significantly higher than that of other biomass-derived oils. Decolourisation of 85–97% is recorded for the textile effluent and 75–90% for the tannery effluent, in both cases decreasing with increasing temperature. The prepared activated carbon showed good adsorption capacity for diluted textile and tannery effluents.
Abstract:
In this paper, the goal of identifying disease subgroups based on differences in observed symptom profiles is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. Here, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by the placement of the Dirichlet Process (DP) on the unknown components of a mixture model, allowing for the expression of uncertainty about the partitioning of observed data into homogeneous subgroups. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS).
Keywords: clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
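The partition prior that a DP induces over subgroups can be illustrated with the Chinese Restaurant Process, in which the number of clusters is not fixed in advance. A minimal pure-Python sketch (function and parameter names are our own, not from the paper):

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a partition of n items from the Chinese Restaurant Process,
    the partition distribution induced by a Dirichlet Process with
    concentration parameter alpha."""
    rng = random.Random(seed)
    assignments = []   # cluster index of each item
    counts = []        # size of each existing cluster
    for i in range(n):
        # item i joins existing cluster k with prob counts[k] / (i + alpha),
        # or opens a new cluster with prob alpha / (i + alpha)
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                assignments.append(k)
                counts[k] += 1
                break
        else:
            assignments.append(len(counts))
            counts.append(1)
    return assignments

labels = crp_partition(100, alpha=1.0)
print(max(labels) + 1)  # number of clusters is inferred, not pre-specified
```

Larger alpha favours more clusters; in a full DPM each cluster would additionally carry its own symptom-profile distribution, with the partition inferred from data rather than sampled from the prior alone.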
Abstract:
Wayfinding is the process of finding one’s way to a destination, in a familiar or unfamiliar setting, using any cues given by the environment. Due to its ubiquity in everyday life, wayfinding appears on the surface to be a simply characterised and understood process; however, this very ubiquity and the resulting need to refine and optimise wayfinding have led to a great number of studies that have revealed that it is in fact a deeply complex exercise. In this paper we examine the motivations for investigating wayfinding, with particular attention paid to the unique challenges faced in transportation hubs, and discuss the associated principles and factors involved as they have been perceived from different research perspectives. We also review the approaches used to date in the modelling of wayfinding in various contexts. We attempt to draw together the different perspectives applied to wayfinding and postulate the importance of wayfinding and the need to understand this seemingly simple, but concurrently complex, process.
Abstract:
A model has been developed to track the flow of cane constituents through the milling process. While previous models have tracked the flow of fibre, brix and water through the process, this model tracks the soluble and insoluble solid cane components using modelling theory and experimental data, assisting in further understanding the flow of constituents into mixed juice and final bagasse. The work provided an opportunity to understand the factors which affect the distribution of the cane constituents between juice and bagasse. Application of the model should lead to improvements in the overall performance of the milling train.
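Tracking constituents through a milling unit amounts to a per-constituent mass balance: each mill splits every incoming component between a juice stream and a bagasse stream. A hypothetical sketch of one such split (the constituent names follow the abstract; the extraction fractions and flow rates are invented for illustration, not the paper's data):

```python
def mill_pass(feed, extraction):
    """Split each constituent of the feed between juice and bagasse
    according to a per-constituent extraction fraction (0..1).

    feed: {constituent: mass flow}, extraction: {constituent: fraction}.
    Mass is conserved: juice + bagasse == feed for every constituent."""
    juice = {c: m * extraction.get(c, 0.0) for c, m in feed.items()}
    bagasse = {c: m - juice[c] for c, m in feed.items()}
    return juice, bagasse

# illustrative cane feed (t/h) and extraction fractions for one mill
cane = {"brix": 14.0, "fibre": 13.0, "water": 70.0}
fractions = {"brix": 0.70, "water": 0.65, "fibre": 0.02}
juice, bagasse = mill_pass(cane, fractions)
```

A milling train would chain several such passes, feeding each mill's bagasse (plus any imbibition water) into the next, which is where a model like the one described helps quantify what ends up in mixed juice versus final bagasse.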
Abstract:
Expert knowledge is used widely in the science and practice of conservation because of the complexity of problems, relative lack of data, and the imminent nature of many conservation decisions. Expert knowledge is substantive information on a particular topic that is not widely known by others. An expert is someone who holds this knowledge and who is often deferred to in its interpretation. We refer to predictions by experts of what may happen in a particular context as expert judgments. In general, an expert-elicitation approach consists of five steps: deciding how information will be used, determining what to elicit, designing the elicitation process, performing the elicitation, and translating the elicited information into quantitative statements that can be used in a model or directly to make decisions. This last step is known as encoding. Some of the considerations in eliciting expert knowledge include determining how to work with multiple experts and how to combine multiple judgments, minimizing bias in the elicited information, and verifying the accuracy of expert information. We highlight structured elicitation techniques that, if adopted, will improve the accuracy and information content of expert judgment and ensure uncertainty is captured accurately. We suggest four aspects of an expert elicitation exercise be examined to determine its comprehensiveness and effectiveness: study design and context, elicitation design, elicitation method, and elicitation output. Just as the reliability of empirical data depends on the rigor with which it was acquired, so too does that of expert knowledge.
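One common way to combine multiple experts' judgments during the encoding step is a linear opinion pool, i.e. a weighted average of elicited probabilities. A minimal sketch (the weights and probability values are purely illustrative, and this is only one of several aggregation schemes discussed in the elicitation literature):

```python
def linear_opinion_pool(judgments, weights=None):
    """Combine experts' elicited probabilities for the same event by
    weighted averaging; equal weights are used when none are given."""
    n = len(judgments)
    if weights is None:
        weights = [1.0 / n] * n
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(w * p for w, p in zip(weights, judgments))

# three experts' hypothetical probabilities that a population persists
print(linear_opinion_pool([0.6, 0.75, 0.9]))  # 0.75
```

Unequal weights let a facilitator down-weight judgments known to be less calibrated, which connects to the abstract's point about verifying the accuracy of expert information.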
Abstract:
The representation of business process models has been a continuing research topic for many years now. However, many process model representations have not developed beyond minimally interactive 2D icon-based representations of directed graphs and networks, with little or no annotation for information overlays. With the rise of desktop computers and commodity mobile devices capable of supporting rich interactive 3D environments, we believe that much of the research performed in computer-human interaction, virtual reality, games and interactive entertainment has much potential in areas of BPM: to engage, provide insight, and promote collaboration amongst analysts and stakeholders alike. This initial visualization workshop seeks to initiate the development of a high quality international forum to present and discuss research in this field. Via this workshop, we intend to create a community to unify and nurture the development of process visualization topics as a continuing research area.
Abstract:
Background: Outside the mass-spectrometer, proteomics research does not take place in a vacuum. It is affected by policies on funding and research infrastructure. Proteomics research both impacts and is impacted by potential clinical applications. It provides new techniques & clinically relevant findings, but the possibilities for such innovations (and thus the perception of the potential for the field by funders) are also impacted by regulatory practices and the readiness of the health sector to incorporate proteomics-related tools & findings. Key to this process is how knowledge is translated. Methods: We present preliminary results from a multi-year social science project, funded by the Canadian Institutes of Health Research, on the processes and motivations for knowledge translation in the health sciences. The proteomics case within this wider study uses qualitative methods to examine the interplay between proteomics science and regulatory and policy makers regarding clinical applications of proteomics. Results: Adopting an interactive format to encourage conference attendees’ feedback, our poster focuses on deficits in effective knowledge translation strategies from the laboratory to policy, clinical, & regulatory arenas. An analysis of the interviews conducted to date suggests five significant choke points: the changing priorities of funding agencies; the complexity of proteomics research; the organisation of proteomics research; the relationship of proteomics to genomics and other omics sciences; and conflict over the appropriate role of standardisation. Conclusion: We suggest that engagement with aspects of knowledge translation, such as those mentioned above, is crucially important for the eventual clinical application of proteomics science on any meaningful scale.
Abstract:
New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The “error budget” is set by the performance requirements of each application. The “expenditure” of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a “bottom up” component testing approach combined with “top down” system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
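The "error budget" bookkeeping described above can be sketched as a worst-case linear sum of per-component timing errors checked against the application's overall limit. The component names and microsecond figures below are hypothetical, not measurements from the paper:

```python
def time_error_budget(budget_us, contributions_us):
    """Sum worst-case per-component timing errors (in microseconds) and
    report total spent plus remaining margin against the overall budget."""
    spent = sum(contributions_us.values())
    return spent, budget_us - spent

# hypothetical worst-case error 'expenditure' along one PTP chain, in µs
chain = {
    "grandmaster": 0.1,
    "transparent_clock_1": 0.05,
    "transparent_clock_2": 0.05,
    "slave_clock": 0.2,
}
spent, margin = time_error_budget(1.0, chain)  # 1 µs sampled-value budget
print(f"spent {spent:.2f} us, margin {margin:.2f} us")
```

A positive margin tells the system designer the chain meets the 1 µs sampled value requirement with headroom; adding a component whose measured error exceeds the margin flags the design before any "top down" system test is run.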