Abstract:
Carbon microcoils (CMCs) have been coated with a nickel-phosphorus (Ni-P) film using an electroless plating process, with sodium hypophosphite as the reducing agent in an alkaline bath. CMC composites have potential applications as microwave absorption materials. The morphology, elemental composition and phases of the coating layer on the CMCs and Ni-coated CMCs were investigated by scanning electron microscopy (SEM), energy dispersive spectroscopy (EDS) and X-ray diffraction (XRD), respectively. The effects of plating-bath parameters such as pH, temperature and coating time on the phosphorus content and deposition rate of the electroless Ni-P coating were studied. The results revealed that a continuous, uniform, low-phosphorus nickel coating was deposited on the surface of the CMCs after 20 min of plating at pH 9.0 and a bath temperature of 70 °C. The as-deposited coatings, containing approximately 4.5 wt.% phosphorus, were found to consist of a mix of nano- and microcrystalline phases. The mean particle size of the Ni-P nanoparticles on the outer surface of the CMCs was around 11.9 nm. The deposition rate was found to increase moderately with increasing pH, whereas the phosphorus content of the deposit decreased significantly. Moreover, the coating material underwent a phase transition between amorphous and crystalline structures. The thickness of the deposit and the deposition rate can be controlled through careful variation of the coating time and plating bath temperature.
Abstract:
The angular distribution of microscopic ion fluxes around nanotubes arranged into a dense ordered pattern on the surface of a substrate is studied by means of multiscale numerical simulation. The Monte Carlo technique was used to show that the ion current density is distributed nonuniformly around carbon nanotubes arranged into a dense rectangular array. The nonuniformity factor of the ion current flux reaches 7 in dense (5 × 10¹⁸ m⁻³) plasmas for a nanotube radius of 25 nm, and tends to 1 at plasma densities below 1 × 10¹⁷ m⁻³. The results obtained suggest that the local density of carbon adatoms on the nanotube side surface, in areas facing the adjacent nanotubes of the pattern, can be high enough to lead to additional wall formation and thus cause the single- to multiwall structural transition, as well as other as yet unexplained nanoscience phenomena.
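For intuition, a minimal, purely geometric Monte Carlo sketch is given below (in Python; not the paper's multiscale code). It estimates the relative ballistic ion flux around one tube in a row, with neighbouring tubes acting as shadowing obstacles; the 25 nm radius follows the abstract, while the 100 nm pitch and the neglect of sheath-field ion focusing, which the paper's simulation does resolve, are simplifying assumptions.

```python
import numpy as np

# Toy 2D Monte Carlo: relative ballistic ion flux onto the surface of a
# central nanotube (disk cross-section) shadowed by two neighbours in a
# row.  Purely geometric; real sheath-field ion focusing is ignored.

R = 25e-9          # nanotube radius (m), as quoted in the abstract
PITCH = 100e-9     # assumed centre-to-centre spacing (hypothetical)
NEIGHBOURS = [np.array([-PITCH, 0.0]), np.array([PITCH, 0.0])]

def occluded(p, d):
    """True if the ray p + t*d (t > 0) hits a neighbouring tube."""
    for c in NEIGHBOURS:
        oc = p - c
        b = np.dot(oc, d)
        disc = b * b - (np.dot(oc, oc) - R * R)
        if disc >= 0.0 and -b - np.sqrt(disc) > 0.0:
            return True
    return False

def flux_profile(n_bins=36, n_rays=2000, rng=np.random.default_rng(0)):
    """Relative ion flux per angular bin on the central tube surface."""
    flux = np.zeros(n_bins)
    for i, theta in enumerate(np.linspace(0, 2 * np.pi, n_bins, endpoint=False)):
        normal = np.array([np.cos(theta), np.sin(theta)])
        p = R * normal                      # point on the tube surface
        for _ in range(n_rays):
            # cosine-weighted directions over the outward half-space,
            # traced in reverse: an unblocked outward ray means an ion
            # arriving from that direction reaches the surface point
            phi = np.arcsin(rng.uniform(-1.0, 1.0))
            ca, sa = np.cos(phi), np.sin(phi)
            d = np.array([ca * normal[0] - sa * normal[1],
                          sa * normal[0] + ca * normal[1]])
            if not occluded(p, d):
                flux[i] += 1.0
    return flux / flux.max()

profile = flux_profile()
print("geometric nonuniformity factor:", profile.max() / profile.min())
```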
Abstract:
Previous studies have found that the lateral posterior fusiform gyri respond more robustly to pictures of animals than to pictures of man-made objects, and have suggested that these regions encode the visual properties characteristic of animals. We suggest that such effects actually reflect the processing demands that arise when items with similar representations must be finely discriminated. In a positron emission tomography (PET) study of category verification with colored photographs of animals and vehicles, there was robust animal-specific activation in the lateral posterior fusiform gyri when stimuli were categorized at an intermediate level of specificity (e.g., dog or car). However, when the same photographs were categorized at a more specific level (e.g., Labrador or BMW), these regions responded equally strongly to animals and vehicles. We conclude that the lateral posterior fusiform gyri do not encode domain-specific representations of animals or visual properties characteristic of animals. Instead, these regions are strongly activated whenever an item must be discriminated from many close visual or semantic competitors. Apparent category effects arise because, at an intermediate level of specificity, animals have more visual and semantic competitors than do artifacts.
Abstract:
The design and development of process-aware information systems is often supported by specifying requirements as business process models. Although this approach is generally accepted as an effective strategy, it remains a fundamental challenge to adequately validate these models given the diverging skill sets of domain experts and system analysts. As domain experts often do not feel confident in judging the correctness and completeness of process models that system analysts create, validation often has to fall back on a discourse in natural language. To support such a discourse appropriately, so-called verbalization techniques have been defined for various types of conceptual models. However, no sophisticated technique is currently available that is capable of generating natural-looking text from process models. In this paper, we address this research gap and propose a technique for generating natural language texts from business process models. A comparison with manually created process descriptions demonstrates that the generated texts are superior in terms of completeness, structure, and linguistic complexity. An evaluation with users further demonstrates that the texts are highly understandable and effectively allow the reader to infer the process model semantics. Hence, the generated texts represent a useful input for process model validation.
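As a toy illustration of the idea, and emphatically not the authors' technique, which handles full process model structures and far richer linguistic realization, the sketch below verbalizes a simple linear sequence of activities; the model format and connector words are invented for illustration.

```python
# Toy sketch: verbalizing a linear process model as running text.
# Input format (role, verb, object) is hypothetical; real process models
# carry gateways, loops, and lanes that this sketch does not handle.

CONNECTORS = ["First", "Then", "Afterwards"]

def verbalize(activities):
    """Render a sequence of (role, verb, object) activities as sentences."""
    sentences = []
    for i, (role, verb, obj) in enumerate(activities):
        if i == len(activities) - 1:
            conn = "Finally"
        else:
            conn = CONNECTORS[min(i, len(CONNECTORS) - 1)]
        sentences.append(f"{conn}, the {role} {verb}s the {obj}.")
    return " ".join(sentences)

process = [("clerk", "register", "order"),
           ("manager", "approve", "order"),
           ("clerk", "ship", "order")]
print(verbalize(process))
# First, the clerk registers the order. Then, the manager approves the
# order. Finally, the clerk ships the order.
```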
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift-register-based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
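The notion that slid states yield shifted keystreams can be illustrated on a bare LFSR; this is a deliberately simplified stand-in, not CSA-SC itself. If one register state equals another clocked forward by k steps, the two keystreams are k-bit shifts of each other; the weakness demonstrated in the paper is that CSA-SC's initialisation can map distinct key/IV pairs onto such related states.

```python
# Toy LFSR (not CSA-SC): slid internal states produce shifted keystreams.

def lfsr(state, taps, n):
    """Clock a Fibonacci LFSR over GF(2) n times; return (output, state)."""
    out, s = [], list(state)
    for _ in range(n):
        out.append(s[-1])                 # output bit
        fb = 0
        for t in taps:                    # feedback = XOR of tapped bits
            fb ^= s[t]
        s = [fb] + s[:-1]                 # shift in the feedback bit
    return out, s

taps = [0, 2]                             # toy feedback polynomial
state_a = [1, 0, 0, 1]
_, state_b = lfsr(state_a, taps, 3)       # state_b = state_a clocked 3 steps

ks_a, _ = lfsr(state_a, taps, 16)
ks_b, _ = lfsr(state_b, taps, 16)
assert ks_a[3:] == ks_b[:13]              # keystream B = keystream A shifted by 3
print("slid states give shifted keystreams:", ks_a, ks_b, sep="\n")
```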
Abstract:
The value of information technology (IT) is often realized only when it continues to be used after users' initial acceptance. However, previous research on continuing IT usage is limited in that it dismisses the importance of mental goals in directing users' behaviors and inadequately accommodates the group context of users. This in-progress paper offers a synthesis of several streams of literature to conceptualize continuing IT usage as a multilevel construct and to view IT usage behavior as directed and energized by a set of mental goals. Drawing on self-regulation theory from social psychology, the paper proposes a process model that positions continuing IT usage as multiple-goal pursuit. An agent-based modeling approach is suggested to further explore the causal and analytical implications of the proposed process model.
Abstract:
This project explores the issues confronted when authoring a previously authored story, one received from history. Using the defection of the Soviet spies Vladimir and Evdokia Petrov as its focal point, it details how a screenwriter addresses issues arising in the adaptation of both fictional and biographical representations suitable for contemporary cinema. Textual fidelity and concepts of interpretation, aesthetics and audience, the negotiation of factual and fictional imperatives, authorial visibility and invisibility, and moral and ethical conundrums are addressed, and a set of guiding principles emerges from this practice-led investigation.
Abstract:
The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e. sensors relying on different physical processes and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as the modalities may complement each other. However, they may react differently to certain materials or environmental conditions, leading to catastrophic fusion. In this paper, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. We first compute distinct continuous surface representations for each sensing modality, with uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations to separate consistent data (i.e. data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together using another GPIS process, and the rest of the data can be combined as appropriate. The approach is first validated using synthetic data. We then demonstrate its benefit using a mobile robot, equipped with a laser scanner and a radar, operating in an outdoor environment in the presence of large clouds of airborne dust and smoke.
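A minimal one-dimensional sketch of this pipeline is shown below, using scikit-learn GPs on a synthetic range profile in place of full GP implicit surfaces; the two-sigma consistency threshold, the kernel, and the dust-corrupted laser segment are all illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# 1D stand-in for the pipeline: one GP per modality, a local consistency
# test between their predictions, then a fused GP over consistent data.

rng = np.random.default_rng(1)
x_laser = rng.uniform(0, 10, 40)[:, None]
x_radar = rng.uniform(0, 10, 40)[:, None]
truth = lambda x: np.sin(x).ravel()
y_laser = truth(x_laser) + 0.05 * rng.standard_normal(40)
y_radar = truth(x_radar) + 0.10 * rng.standard_normal(40)
dusty = (x_laser.ravel() > 6) & (x_laser.ravel() < 8)
y_laser[dusty] += 1.5          # simulated dust: laser sees a phantom target

kernel = RBF(1.0) + WhiteKernel(0.01)
gp_laser = GaussianProcessRegressor(kernel).fit(x_laser, y_laser)
gp_radar = GaussianProcessRegressor(kernel).fit(x_radar, y_radar)

# consistency test: keep laser points where both GPs agree to within
# two combined standard deviations
mu_l, sd_l = gp_laser.predict(x_laser, return_std=True)
mu_r, sd_r = gp_radar.predict(x_laser, return_std=True)
consistent = np.abs(mu_l - mu_r) < 2.0 * np.hypot(sd_l, sd_r)

# fuse the consistent laser data with the radar data in a third GP
gp_fused = GaussianProcessRegressor(kernel).fit(
    np.vstack([x_laser[consistent], x_radar]),
    np.concatenate([y_laser[consistent], y_radar]))
print(f"kept {consistent.sum()}/{len(consistent)} laser points as consistent")
```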
Abstract:
Large arrays and networks of carbon nanotubes, both single- and multi-walled, feature many superior properties which offer excellent opportunities for various modern applications, ranging from nanoelectronics, supercapacitors, photovoltaic cells, and energy storage and conversion devices, to gas sensors and biosensors, nanomechanical and biomedical devices, etc. At present, arrays and networks of carbon nanotubes are mainly fabricated from pre-fabricated, separated nanotubes by solution-based techniques. However, the intrinsic structure of the nanotubes (mainly, the level of structural defects), which is required for the best performance in nanotube-based applications, is often damaged during array/network fabrication by the surfactants, chemicals, and sonication involved in the process. As a result, the performance of the functional devices may be significantly degraded. In contrast, directly synthesized nanotube arrays/networks can preclude the adverse effects of solution-based processing and largely preserve the excellent properties of the pristine nanotubes. Owing to their advantages of scaled-up production and precise positioning of the grown nanotubes, catalytic and catalyst-free chemical vapor deposition (CVD), as well as plasma-enhanced chemical vapor deposition (PECVD), are the methods most promising for the direct synthesis of the nanotubes.
Abstract:
This book constitutes the proceedings of the Second Asia Pacific Conference on Business Process Management held in Brisbane, QLD, Australia, in July 2014. In all, 33 contributions from 12 countries were submitted. After each submission was reviewed by at least three Program Committee members, nine full papers were accepted for publication in this volume. These nine papers cover various topics that can be categorized under four main research focuses in BPM: process mining, process modeling and repositories, process model comparison, and process analysis.
Abstract:
Variations in the treatment of patients with similar symptoms across different hospitals substantially impact the quality and cost of healthcare. Consequently, it is important to understand the similarities and differences between the practices of different hospitals. This paper presents a case study on the application of process mining techniques to measure and quantify the differences in the treatment of patients presenting with chest pain symptoms across four South Australian hospitals. Our case study focuses on cross-organisational benchmarking of processes and their performance. Techniques such as clustering, process discovery, performance analysis, and scientific workflows were applied to facilitate such comparative analyses. Lessons learned in overcoming unique challenges in cross-organisational process mining, such as ensuring population comparability, data granularity comparability, and experimental repeatability, are also presented.
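As a hedged illustration of the cross-organisational comparison idea, using the open-source pm4py library rather than the authors' toolchain, and with hypothetical log file names, one can discover a model per hospital and cross-replay every log against every model:

```python
import pm4py

# Hypothetical per-hospital event logs of chest-pain treatments (XES files).
hospitals = ["hospital_a", "hospital_b", "hospital_c", "hospital_d"]
logs = {h: pm4py.read_xes(f"{h}_chest_pain.xes") for h in hospitals}

# Discover one process model (Petri net) per hospital.
models = {h: pm4py.discover_petri_net_inductive(log) for h, log in logs.items()}

# Cross-replay: high fitness off the diagonal suggests similar treatment
# processes; low fitness highlights cross-hospital practice variation.
for h_log, log in logs.items():
    for h_model, (net, im, fm) in models.items():
        fit = pm4py.fitness_token_based_replay(log, net, im, fm)
        print(f"{h_log} log vs {h_model} model: "
              f"fitness={fit['log_fitness']:.2f}")
```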
Abstract:
The purpose of this study is to discover the significant factors causing the bubble defect in the outsoles manufactured by the Case Company. The bubble defect occurs approximately 1.5 per cent of the time, or in 36 pairs per day. To understand this problem, experimental studies were undertaken to identify the factors, such as injector temperature and mould temperature, that affect the production of waste. The work presented in this paper comprises a review of the relevant literature on the Six Sigma DMAIC improvement process, quality control tools, and the design of experiments. After the experimentation following the Six Sigma process, the results showed that the defect occurred in approximately 0.5 per cent of the products, or in 12 pairs per day; this decreased the production cost from 6,120 AUD per month to 2,040 AUD per month. This research aimed to reduce the amount of waste in men's flat outsoles; hence, the outcome of the research presented in this paper can serve as a guide for applying the appropriate process to each type of outsole.
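The quoted figures are internally consistent, as the quick check below confirms: 36 defective pairs at a 1.5 per cent rate imply a daily output of 2,400 pairs, of which 0.5 per cent is exactly the quoted 12 pairs, the same threefold reduction reflected in the monthly cost.

```python
# Consistency check of the abstract's figures (no external data).
daily_output = 36 / 0.015        # 36 defective pairs at 1.5 % -> 2400 pairs/day
defects_after = daily_output * 0.005
print(daily_output, defects_after)            # 2400.0 12.0 (matches abstract)
print(6120 / 2040, 0.015 / 0.005)             # both threefold reductions: 3.0 3.0
```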
Abstract:
The increased interest in the area of process improvement prompted Rabobank Group ICT to examine its own Change process in order to improve its competitiveness. The group is looking for answers about the effectiveness of the changes applied as part of this process, with particular interest in the presence of predictive patterns and their parameters. We conducted an analysis of the event log using well-established process mining techniques (i.e., the Fuzzy Miner). The results of the analysis conducted on the process log show that a visible impact is missing.
Abstract:
Artemisinin (ART) based combination therapy (ACT) is used as the first-line treatment for uncomplicated falciparum malaria in over 100 countries and is the cornerstone of malaria control and elimination programs in these areas. However, despite the high potency and rapid parasite-killing action of ART derivatives, there is a high rate of recrudescence associated with ART monotherapy, and recrudescence is not uncommon even when ACT is used. Compounding this problem are reports that some parasites in Cambodia, a known focus of drug resistance, have decreased in vivo sensitivity to ART. This raises serious concerns about the development of ART resistance in the field, even though no major phenotypic or genotypic changes have yet been identified in these parasites. In this article we review the available data on the characteristics of ART and its effects on Plasmodium falciparum parasites, and present a hypothesis to explain the high rate of recrudescence associated with this potent class of drugs and the current enigma surrounding ART resistance.
Abstract:
This paper evaluates the suitability of sequence classification techniques for analyzing deviant business process executions based on event logs. Deviant process executions are those that deviate in a negative or positive way with respect to normative or desirable outcomes, such as non-compliant executions or executions that undershoot or exceed performance targets. We evaluate a range of feature types and classification methods in terms of their ability to accurately discriminate between normal and deviant executions both when deviances are infrequent (unbalanced) and when deviances are as frequent as normal executions (balanced). We also analyze the ability of the discovered rules to explain potential causes and contributing factors of observed deviances. The evaluation results show that feature types extracted using pattern mining techniques only slightly outperform those based on individual activity frequency. The results also suggest that more complex feature types ought to be explored to achieve higher levels of accuracy.
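A minimal sketch of the simplest feature type evaluated, individual activity frequencies fed to an off-the-shelf classifier, is shown below; the synthetic traces, labels, and choice of random forest are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a labelled event log: deviant traces repeat the
# "test" activity more often than normal traces in this toy generator.
ACTIVITIES = ["register", "triage", "test", "treat", "discharge"]
rng = np.random.default_rng(0)

def random_trace(deviant):
    reps = rng.integers(3, 6) if deviant else rng.integers(1, 3)
    return ["register", "triage"] + ["test"] * reps + ["treat", "discharge"]

traces = [random_trace(d) for d in (True, False) * 100]   # balanced classes
labels = np.array([1, 0] * 100)

# Individual-activity-frequency features: one count per activity per trace.
X = np.array([[trace.count(a) for a in ACTIVITIES] for trace in traces])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```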