960 results for Process Steam Consumption
Abstract:
The design and development of process-aware information systems is often supported by specifying requirements as business process models. Although this approach is generally accepted as an effective strategy, it remains a fundamental challenge to adequately validate these models given the diverging skill sets of domain experts and system analysts. As domain experts often do not feel confident in judging the correctness and completeness of process models that system analysts create, the validation often has to resort to a discourse in natural language. In order to support such a discourse appropriately, so-called verbalization techniques have been defined for different types of conceptual models. However, there is currently no sophisticated technique available that is capable of generating natural-looking text from process models. In this paper, we address this research gap and propose a technique for generating natural language texts from business process models. A comparison with manually created process descriptions demonstrates that the generated texts are superior in terms of completeness, structure, and linguistic complexity. An evaluation with users further demonstrates that the texts are very understandable and effectively allow the reader to infer the process model semantics. Hence, the generated texts represent a useful input for process model validation.
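As a rough illustration of the idea, here is a minimal sketch in Python (hypothetical names throughout; the paper's technique handles arbitrary process model structures, not just simple sequences) that renders a toy sequential model as English sentences:

```python
# Minimal sketch (illustrative only, not the authors' technique): render a
# toy sequential process model as simple natural-language sentences.

from dataclasses import dataclass

@dataclass
class Activity:
    role: str    # performing resource, e.g. "clerk"
    verb: str    # action, e.g. "check"
    obj: str     # business object, e.g. "invoice"

def verbalize(activities):
    """Turn a sequence of activities into one sentence per activity."""
    sentences = []
    for i, act in enumerate(activities):
        opener = "First," if i == 0 else "Then,"
        sentences.append(f"{opener} the {act.role} {act.verb}s the {act.obj}.")
    return " ".join(sentences)

model = [Activity("clerk", "check", "invoice"),
         Activity("manager", "approve", "payment")]
print(verbalize(model))
# First, the clerk checks the invoice. Then, the manager approves the payment.
```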
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift-register-based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
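To illustrate the general principle of a shift-register-based stream cipher masking data (a toy Fibonacci LFSR, not the actual CSA-SC design, which uses a more complex multi-register construction):

```python
# Toy sketch of the general idea (not the actual CSA-SC algorithm): a
# linear-feedback shift register produces a keystream that masks data via XOR.

def lfsr_keystream(state, taps, width, nbits):
    """Yield nbits from a Fibonacci LFSR of the given width and tap positions."""
    for _ in range(nbits):
        out = state & 1
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1          # feedback = XOR of tapped bits
        state = (state >> 1) | (fb << (width - 1))
        yield out

def mask(data: bytes, key: int) -> bytes:
    """XOR the data with the keystream (encryption equals decryption)."""
    bits = lfsr_keystream(key, taps=(0, 2, 3, 5), width=16, nbits=8 * len(data))
    out = bytearray()
    for byte in data:
        ks = 0
        for i in range(8):
            ks |= next(bits) << i
        out.append(byte ^ ks)
    return bytes(out)

ct = mask(b"video frame", key=0xACE1)
assert mask(ct, key=0xACE1) == b"video frame"   # same keystream decrypts
```

The attacks mentioned in the abstract exploit the fact that different keys or initialisation values can lead the register into the same internal state (state convergence) or into time-shifted versions of the same keystream.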
Abstract:
The value of information technology (IT) is often realized when it continues to be used after users' initial acceptance. However, previous research on continuing IT usage is limited in that it dismisses the importance of mental goals in directing users' behaviors and inadequately accommodates the group context of users. This in-progress paper offers a synthesis of several streams of literature to conceptualize continuing IT usage as a multilevel construct and to view IT usage behavior as directed and energized by a set of mental goals. Drawing on self-regulation theory from social psychology, this paper proposes a process model positioning continuing IT usage as multiple-goal pursuit. An agent-based modeling approach is suggested to further explore the causal and analytical implications of the proposed process model.
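A minimal agent-based sketch of what multiple-goal pursuit could look like in simulation (the goal names, payoff dynamics, and stopping rule below are entirely hypothetical; the paper only proposes the approach):

```python
# Hypothetical agent-based sketch: each agent keeps using an IT system while
# it helps close the gap on the currently dominant mental goal.

import random

class User:
    def __init__(self):
        self.goals = {"task_efficiency": 1.0, "social_approval": 1.0}  # gaps to close
        self.using = True

    def step(self):
        goal = max(self.goals, key=self.goals.get)    # pursue the largest gap
        progress = random.uniform(0.0, 0.3)           # stochastic payoff of use
        self.goals[goal] = max(0.0, self.goals[goal] - progress)
        self.using = sum(self.goals.values()) > 0.1   # stop once goals are met

users = [User() for _ in range(100)]
for _ in range(20):
    for u in users:
        if u.using:
            u.step()
print(f"{sum(u.using for u in users)} of 100 users still using the system")
```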
Abstract:
Fault identification in industrial machines is a topic of major importance from an engineering point of view. In fact, the ability to identify not only the type, but also the severity and the position of a fault that has occurred along a shaft-line allows quick maintenance and shortens downtime. This is particularly important in the power generation industry, where the units are often several tens of meters long and where the rotors are enclosed by heavy, pressure-sealed casings. In this paper, an industrial experimental case is presented, related to the identification of unbalance on a large steam turbine of about 1.3 GW belonging to a nuclear power plant. The case history is analyzed by considering the vibrations measured by the condition monitoring system of the unit. A model-based method in the frequency domain, developed by the authors, is introduced in detail and then used to identify the position and severity of the fault along the shaft-line. The complete model of the unit (the rotor, modeled by means of finite elements; the bearings, modeled by linearized damping and stiffness coefficients; and the foundation, modeled by means of pedestals) is analyzed and discussed before being used for the fault identification. The actual fault was assessed by inspection during scheduled maintenance, and excellent correspondence was found with the fault identified by the authors' proposed method. Finally, a complete discussion is presented of the effectiveness of the method, even in the presence of a machine model that is not finely tuned and with only a few measuring planes available for the machine vibration.
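The identification step can be illustrated with a hedged least-squares sketch (synthetic data; the authors' actual method is more elaborate): for each candidate node of the shaft-line model, fit an equivalent unbalance force to the measured 1X vibrations and keep the node with the smallest residual.

```python
# Hedged sketch of frequency-domain, model-based unbalance identification
# (the general least-squares idea, not the authors' exact formulation).

import numpy as np

rng = np.random.default_rng(0)
n_meas, n_nodes = 6, 20
# H[:, j]: complex receptances from a unit 1X force at node j to the sensors,
# random here as a stand-in for the finite-element rotor/bearing/foundation model.
H = rng.normal(size=(n_meas, n_nodes)) + 1j * rng.normal(size=(n_meas, n_nodes))

true_node, true_force = 7, 3.0 + 1.5j            # synthetic fault for the demo
x_meas = H[:, true_node] * true_force
x_meas += 0.05 * (rng.normal(size=n_meas) + 1j * rng.normal(size=n_meas))

best = min(
    range(n_nodes),
    key=lambda j: np.linalg.norm(
        x_meas - H[:, j] * np.vdot(H[:, j], x_meas) / np.vdot(H[:, j], H[:, j])
    ),
)
f_hat = np.vdot(H[:, best], x_meas) / np.vdot(H[:, best], H[:, best])
print(f"identified node {best}, force {f_hat:.2f} (true: node {true_node})")
```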
Abstract:
The US National Institute of Standards and Technology (NIST) showed that, in 2004, owners and operations managers bore two thirds of the total industry cost burden from inadequate interoperability in construction projects from inception to operation, amounting to USD 10.6 billion. Building Information Modelling (BIM) and similar tools were identified by Engineers Australia in 2005 as potential instruments to significantly reduce this sum, which in Australia could amount to a total industry-wide cost burden of AUD 12 billion. Public sector road authorities in Australia have a key responsibility in driving initiatives to reduce greenhouse gas emissions from the construction and operation of transport infrastructure. However, as previous research has shown, the Environmental Impact Assessment process, typically used for project approvals and permitting based on the project designs available at the consent stage, lacks Key Performance Indicators (KPIs) that include long-term impact factors and the transfer of information throughout the project life cycle. In the building construction industry, BIM is widely used to model sustainability KPIs such as energy consumption, and is integrated with facility management systems. This paper proposes that a similar use of BIM in the early design phases of transport infrastructure could provide: (i) productivity gains through improved interoperability and documentation; (ii) the opportunity to carry out detailed cost-benefit analyses leading to significant operational cost savings; (iii) coordinated planning of street and highway lighting with other energy and environmental considerations; (iv) measurable KPIs that include long-term impact factors and are transferable throughout the project life cycle; and (v) the opportunity to integrate design documentation with whole-of-life sustainability targets.
Abstract:
This project explores the issues confronted when authoring a previously authored story, one received from history. Using the defection of the Soviet spies Vladimir and Evdokia Petrov as its focal point, it details how a screenwriter addresses issues arising in the adaptation of both fictional and biographical representations suitable for contemporary cinema. Textual fidelity, concepts of interpretation, aesthetics and audience, the negotiation of factual and fictional imperatives, authorial visibility and invisibility, and moral and ethical conundrums are all addressed, and a set of guiding principles emerges from this practice-led investigation.
Abstract:
The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e. sensors based on different physical processes and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as the modalities may complement each other. However, they may also react differently to certain materials or environmental conditions, leading to catastrophic fusion. In this paper, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. We first compute distinct continuous surface representations for each sensing modality, with uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations to separate consistent data (i.e. data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together, using another GPIS process, and the rest of the data can be combined as appropriate. The approach is first validated using synthetic data. We then demonstrate its benefit using a mobile robot, equipped with a laser scanner and a radar, operating in an outdoor environment in the presence of large clouds of airborne dust and smoke.
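The consistency test can be sketched in one dimension (illustrative only; the paper works with implicit surfaces in 3D): fit one Gaussian process per modality, then gate on how far the posteriors disagree relative to their combined uncertainty.

```python
# Minimal 1-D sketch of the consistency-test idea: fit a GP per modality and
# flag query points where the posteriors disagree beyond their joint uncertainty.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 30).reshape(-1, 1)
surface = np.sin(x).ravel()
y_laser = surface + 0.05 * rng.normal(size=30)
y_radar = surface.copy()
y_radar[20:] += 1.5                          # radar sees a different target here
y_radar += 0.1 * rng.normal(size=30)

gp_laser = GaussianProcessRegressor(kernel=RBF(1.0), alpha=0.05**2).fit(x, y_laser)
gp_radar = GaussianProcessRegressor(kernel=RBF(1.0), alpha=0.1**2).fit(x, y_radar)

xq = np.linspace(0, 10, 100).reshape(-1, 1)
m1, s1 = gp_laser.predict(xq, return_std=True)
m2, s2 = gp_radar.predict(xq, return_std=True)

consistent = np.abs(m1 - m2) < 2.0 * np.sqrt(s1**2 + s2**2)   # ~95% gate
print(f"{consistent.mean():.0%} of query points consistent; fuse those, "
      "handle the rest separately")
```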
Abstract:
Food is a vital foundation of all human life. It is essential to a myriad of political, socio-cultural, economic and environmental practices throughout history. As Kaplan [1] contends, “the scholarship on food has real pedigree.” Today, practices of food production, consumption and distribution have the potential to go through immensely transformative shifts as network technologies become increasingly embedded in every domain of contemporary life. This presents unique opportunities for further scholarly exploration on this topic, which this special issue intends to address. Information and communication technologies (ICTs) are one of the pillars of contemporary global functionality and sustenance and undoubtedly will continue to present new challenges and opportunities for the future. As such, this special issue of Futures has been brought together to address challenges and opportunities at the intersection of food and ICTs. In particular, the edition asks, what are the key roles that network technologies play in re-shaping social and economic networks of food?
Abstract:
Large arrays and networks of carbon nanotubes, both single- and multi-walled, feature many superior properties which offer excellent opportunities for various modern applications, ranging from nanoelectronics, supercapacitors, photovoltaic cells, and energy storage and conversion devices to gas sensors, biosensors, and nanomechanical and biomedical devices. At present, arrays and networks of carbon nanotubes are mainly fabricated from pre-fabricated, separated nanotubes by solution-based techniques. However, the intrinsic structure of the nanotubes (mainly, the level of structural defects), on which the best performance in nanotube-based applications depends, is often damaged during array/network fabrication by the surfactants, chemicals, and sonication involved in the process. As a result, the performance of the functional devices may be significantly degraded. In contrast, directly synthesized nanotube arrays/networks can preclude the adverse effects of solution-based processing and largely preserve the excellent properties of the pristine nanotubes. Owing to their advantages of scaled-up production and precise positioning of the grown nanotubes, catalytic and catalyst-free chemical vapor deposition (CVD), as well as plasma-enhanced chemical vapor deposition (PECVD), are the most promising methods for the direct synthesis of the nanotubes.
Abstract:
This book constitutes the proceedings of the Second Asia Pacific Conference on Business Process Management held in Brisbane, QLD, Australia, in July 2014. In all, 33 contributions from 12 countries were submitted. After each submission was reviewed by at least three Program Committee members, nine full papers were accepted for publication in this volume. These nine papers cover various topics that can be categorized under four main research focuses in BPM: process mining, process modeling and repositories, process model comparison, and process analysis.
Abstract:
Variations in the treatment of patients with similar symptoms across different hospitals substantially impact the quality and costs of healthcare. Consequently, it is important to understand the similarities and differences between the practices of different hospitals. This paper presents a case study on the application of process mining techniques to measure and quantify the differences in the treatment of patients presenting with chest pain symptoms across four South Australian hospitals. Our case study focuses on cross-organisational benchmarking of processes and their performance. Techniques such as clustering, process discovery, performance analysis, and scientific workflows were applied to facilitate such comparative analyses. Lessons learned in overcoming unique challenges in cross-organisational process mining, such as ensuring population comparability, data granularity comparability, and experimental repeatability, are also presented.
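One benchmarking step, comparing case throughput times across hospitals, can be sketched with pandas (hypothetical column names and data; the case study uses dedicated process mining tooling):

```python
# Sketch of a performance-benchmarking step: per-hospital median case
# throughput time computed from a flat event log.

import pandas as pd

log = pd.DataFrame({
    "hospital":  ["A", "A", "A", "B", "B", "B"],
    "case_id":   [1, 1, 1, 2, 2, 2],
    "activity":  ["triage", "ecg", "discharge"] * 2,
    "timestamp": pd.to_datetime([
        "2023-01-01 10:00", "2023-01-01 10:30", "2023-01-01 14:00",
        "2023-01-01 09:00", "2023-01-01 11:00", "2023-01-02 01:00",
    ]),
})

durations = (
    log.groupby(["hospital", "case_id"])["timestamp"]
       .agg(lambda ts: ts.max() - ts.min())     # case throughput time
       .groupby(level="hospital").median()      # per-hospital median
)
print(durations)
```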
Abstract:
The purpose of this study is to discover the significant factors causing the bubble defect on the outsoles manufactured by the Case Company. The bubble defect occurs approximately 1.5 per cent of the time, or in 36 pairs per day. To understand this problem, experimental studies are undertaken to identify the various factors, such as injector temperature and mould temperature, that affect the production of waste. The work presented in this paper comprises a review of the relevant literature on the Six Sigma DMAIC improvement process, quality control tools, and the design of experiments. After experimentation following the Six Sigma process, the results showed that the defect occurred in approximately 0.5 per cent of the products, or in 12 pairs per day; this decreased the production cost from 6,120 AUD per month to 2,040 AUD per month. This research aimed to reduce the amount of waste in men's flat outsoles; hence, the outcome of the research presented in this paper can be used as a guide for applying the appropriate process to each type of outsole.
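The reported figures are internally consistent, as a quick back-of-the-envelope check shows (the daily output of 2,400 pairs is inferred from 36 defective pairs at a 1.5 per cent rate, and the 30-day month is an assumption):

```python
# Back-of-the-envelope check of the abstract's figures.

daily_output = 36 / 0.015                 # 2,400 pairs per day (inferred)
defects_after = daily_output * 0.005      # 12 pairs per day at 0.5 per cent
cost_per_pair = 6120 / (36 * 30)          # ~5.67 AUD, assuming a 30-day month
print(daily_output, defects_after, round(defects_after * 30 * cost_per_pair))
# 2400.0 12.0 2040  -> consistent with the quoted 2,040 AUD per month
```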
Abstract:
Increased interest in the area of process improvement prompted Rabobank Group ICT to examine its own Change process in order to improve its competitiveness. The group is looking for answers about the effectiveness of the changes applied as part of this process, with particular interest in the presence of predictive patterns and their parameters. We conducted an analysis of the event log using well-established process mining techniques (i.e. the Fuzzy Miner). The results of the analysis conducted on the process log show that a visible impact of these changes is missing.
Abstract:
Artemisinin (ART)-based combination therapy (ACT) is used as the first-line treatment of uncomplicated falciparum malaria in over 100 countries and is the cornerstone of malaria control and elimination programs in these areas. However, despite the high potency and rapid parasite-killing action of ART derivatives, there is a high rate of recrudescence associated with ART monotherapy, and recrudescence is not uncommon even when ACT is used. Compounding this problem are reports that some parasites in Cambodia, a known focus of drug resistance, have decreased in vivo sensitivity to ART. This raises serious concerns for the development of ART resistance in the field, even though no major phenotypic or genotypic changes have yet been identified in these parasites. In this article we review the available data on the characteristics of ART and its effects on Plasmodium falciparum parasites, and present a hypothesis to explain the high rate of recrudescence associated with this potent class of drugs and the current enigma surrounding ART resistance.
Abstract:
This paper evaluates the suitability of sequence classification techniques for analyzing deviant business process executions based on event logs. Deviant process executions are those that deviate in a negative or positive way with respect to normative or desirable outcomes, such as non-compliant executions or executions that undershoot or exceed performance targets. We evaluate a range of feature types and classification methods in terms of their ability to accurately discriminate between normal and deviant executions both when deviances are infrequent (unbalanced) and when deviances are as frequent as normal executions (balanced). We also analyze the ability of the discovered rules to explain potential causes and contributing factors of observed deviances. The evaluation results show that feature types extracted using pattern mining techniques only slightly outperform those based on individual activity frequency. The results also suggest that more complex feature types ought to be explored to achieve higher levels of accuracy.
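The activity-frequency baseline mentioned above can be sketched with scikit-learn (hypothetical traces and labels; the paper evaluates a broader range of feature types and classifiers):

```python
# Minimal sketch of the activity-frequency baseline: encode each trace by how
# often each activity occurs, then train an interpretable classifier to
# separate normal from deviant executions.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.tree import DecisionTreeClassifier, export_text

traces = ["register check approve pay",
          "register check reject",
          "register check check approve pay",
          "register approve pay"]          # skips the check step: deviant
labels = ["normal", "normal", "normal", "deviant"]

vec = CountVectorizer().fit(traces)        # one feature per activity name
X = vec.transform(traces)                  # activity-frequency matrix
clf = DecisionTreeClassifier(max_depth=3).fit(X, labels)

# The extracted rules hint at causes of deviance (here: a missing "check").
print(export_text(clf, feature_names=vec.get_feature_names_out().tolist()))
```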