999 results for Wax-modeling.
Abstract:
Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach to modeling processes in the context of dynamic environments and adaptive behavior of process participants. The approach allows execution constraints to be defined that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. The flexible process graph focuses on what can be done to perform a process; process participants' routing decisions are based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can connect any number of nodes. Hypergraphs are used to formally define the execution semantics of processes. We provide a process scenario to motivate and illustrate the approach.
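As a rough illustration of that formal grounding (a minimal sketch with hypothetical names, not the paper's formalism), a hyperedge can be stored as a set of activity nodes, and an edge can be treated as enabled when all of its activities appear in the current process state:

```python
# Minimal hypergraph sketch (hypothetical names, not the paper's formalism):
# a hyperedge is a frozenset of activity nodes, and an edge is considered
# "enabled" when all of its activities are present in the current state.

class Hypergraph:
    def __init__(self):
        self.nodes = set()
        self.edges = []  # each edge: frozenset of nodes

    def add_edge(self, *nodes):
        self.nodes.update(nodes)
        self.edges.append(frozenset(nodes))

    def enabled_edges(self, state):
        """Return hyperedges whose nodes are all contained in `state`."""
        state = set(state)
        return [e for e in self.edges if e <= state]


hg = Hypergraph()
hg.add_edge("check_order", "reserve_stock")            # binary edge
hg.add_edge("pack", "print_label", "notify_customer")  # ternary edge

print(hg.enabled_edges({"check_order", "reserve_stock", "pack"}))
# only the two-node edge is enabled in this state
```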
Abstract:
Lean construction and building information modeling (BIM) are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond the degree to which they might be improved by applying either of these paradigms independently. Using a matrix that juxtaposes BIM functionalities with prescriptive lean construction principles, 56 interactions have been identified, all but four of which represent constructive interaction. Although evidence has been found for the majority of these, the matrix is not considered complete but rather a framework for research to explore the degree of validity of the interactions. Construction executives, managers, designers, and developers of information technology systems for construction can also benefit from the framework as an aid to recognizing the potential synergies when planning their lean and BIM adoption strategies.
Abstract:
Unstable density-driven flow can lead to enhanced solute transport in groundwater. Only recently has the complex fingering pattern associated with free convection been documented in field settings. Electrical resistivity (ER) tomography has been used to capture a snapshot of convective instabilities at a single point in time, but a thorough transient analysis is still lacking in the literature. We present the results of a 2 year experimental study at a shallow aquifer in the United Arab Emirates that was designed to specifically explore the transient nature of free convection. ER tomography data documented the presence of convective fingers following a significant rainfall event. We demonstrate that the complex fingering pattern had completely disappeared a year after the rainfall event. The observation is supported by an analysis of the aquifer halite budget and hydrodynamic modeling of the transient character of the fingering instabilities. Modeling results show that the transient dynamics of the gravitational instabilities (their initial development, infiltration into the underlying lower-density groundwater, and subsequent decay) are in agreement with the timing observed in the time-lapse ER measurements. All experimental observations and modeling results are consistent with the hypothesis that a dense brine that infiltrated into the aquifer from a surficial source was the cause of free convection at this site, and that the finite nature of the dense brine source and dispersive mixing led to the decay of instabilities with time. This study highlights the importance of the transience of free convection phenomena and suggests that these processes are more rapid than was previously understood.
Abstract:
Designing systems for multiple stakeholders requires frequent collaboration with those stakeholders from the start. In many cases, at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design, where non-formal techniques supported strong collaboration, resulting in a deep understanding of requirements and of the feasibility of solutions.
Abstract:
Motivation: Task analysis for designing modern collaborative work needs a more fine-grained approach, especially in a complex task domain such as collaborative scientific authoring, where a single overall goal can only be accomplished by collaboration between multiple roles, each requiring its own expertise. We analyzed and reconsidered roles, activities, and objects when designing for complex collaboration contexts. Our main focus is a generic approach to designing for multiple roles and subtasks in a domain with a shared overall goal, which requires a detailed approach. Collaborative authoring is our current example. This research is incremental: an existing task analysis approach (GTA) is reconsidered by applying it to a case of complex collaboration. Our analysis shows that designing for collaboration indeed requires a refined approach to task modeling: GTA will, in future, need to consider tasks at the lowest level that can be delegated or mandated. These tasks need to be analyzed and redesigned in more detail, along with the relevant task objects.
Abstract:
Process choreographies describe interactions between different business partners and the dependencies between these interactions. While different proposals have been made for capturing choreographies at the implementation level, it remains unclear how choreographies should be described at a conceptual level. Although the Business Process Modeling Notation (BPMN) is already in use for describing choreographies in terms of interconnected interface behavior models, this paper introduces interaction modeling using BPMN. Such interaction models do not suffer from incompatibility issues and are better suited for human modelers. BPMN extensions are proposed, and a mapping from interaction models to interface behavior models is presented.
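As a hedged illustration of how the two modeling styles relate (hypothetical names, not the paper's mapping), an interaction model can be projected onto per-participant interface behavior by turning each interaction into a send step for the sender and a receive step for the receiver:

```python
# Hedged sketch of projecting an interaction model onto interface behavior
# models: each interaction (sender, receiver, message) becomes a "send" step
# for the sender and a "receive" step for the receiver. Names are illustrative.

def project(interactions):
    behaviors = {}
    for sender, receiver, message in interactions:
        behaviors.setdefault(sender, []).append(("send", message, receiver))
        behaviors.setdefault(receiver, []).append(("receive", message, sender))
    return behaviors


choreography = [
    ("Buyer", "Seller", "order"),
    ("Seller", "Buyer", "invoice"),
    ("Buyer", "Seller", "payment"),
]

for role, steps in project(choreography).items():
    print(role, steps)  # ordered send/receive steps per participant
```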
Abstract:
This paper presents a novel framework to further advance the recent trend of using query decomposition and high-order term relationships in query language modeling, taking into account terms implicitly associated with different subsets of query terms. Existing approaches, most notably the language model based on the Information Flow method, are, however, unable to capture multiple levels of association and also suffer from a high computational overhead. In this paper, we propose to compute association rules from pseudo-feedback documents that are segmented into variable-length chunks via multiple sliding windows of different sizes. Extensive experiments have been conducted on various TREC collections, and our approach significantly outperforms a baseline Query Likelihood language model, the Relevance Model, and the Information Flow model.
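A minimal sketch of the chunking idea (illustrative only, not the paper's exact procedure): pseudo-feedback text is segmented with sliding windows of several sizes, and term subsets that co-occur within a chunk are counted as crude associations:

```python
# Illustrative sketch (not the paper's exact method): segment pseudo-feedback
# text into overlapping chunks with sliding windows of several sizes, then
# count how often term subsets co-occur within a chunk as crude associations
# relating query terms to other terms.

from collections import Counter
from itertools import combinations

def sliding_chunks(tokens, window_sizes=(4, 8), step=2):
    for w in window_sizes:
        for i in range(0, max(1, len(tokens) - w + 1), step):
            yield tokens[i:i + w]

def itemset_counts(tokens, max_size=2):
    counts = Counter()
    for chunk in sliding_chunks(tokens):
        terms = sorted(set(chunk))
        for k in range(1, max_size + 1):
            counts.update(combinations(terms, k))
    return counts

doc = "query expansion uses feedback documents to find related terms".split()
for itemset, support in itemset_counts(doc).most_common(5):
    print(itemset, support)  # most frequent term sets across all chunks
```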
Abstract:
A unique high-temporal-frequency dataset from an irrigated cotton-wheat rotation was used to test the ability of the agroecosystem model DayCent to simulate daily N2O emissions from sub-tropical vertisols under different irrigation intensities. DayCent was able to simulate the effect of different irrigation intensities on N2O fluxes and yield, although it tended to overestimate seasonal fluxes during the cotton season. DayCent accurately predicted soil moisture dynamics and the timing and magnitude of high fluxes associated with fertilizer additions and irrigation events. At the daily scale we found a good correlation between predicted and measured N2O fluxes (r² = 0.52), confirming that DayCent can be used to test agricultural practices for mitigating N2O emissions from irrigated cropping systems. A 25-year scenario analysis indicated that N2O losses from irrigated cotton-wheat rotations on black vertisols in Australia can be substantially reduced by an optimized fertilizer and irrigation management system (i.e., frequent irrigation and avoidance of excessive fertilizer application), while sustaining maximum yield potentials.
Abstract:
This paper presents the modeling and motion-sensorless direct torque and flux control of a novel dual-airgap axial-flux permanent-magnet machine optimized for use in flywheel energy storage system (FESS) applications. Independent closed-loop torque and stator flux regulation are performed in the stator flux (x-y) reference frame via two PI controllers. This facilitates fast torque dynamics, which is critical for energy charging and discharging in the FESS. As FESS applications demand high-speed operation, a new field-weakening algorithm is proposed in this paper. Flux weakening is achieved autonomously once the y-axis voltage exceeds the available inverter voltage. An inherently speed-sensorless stator flux observer, immune to stator resistance variations and dc-offset effects, is also proposed for accurate flux and speed estimation. The proposed observer eliminates the rotary encoder, which in turn reduces the overall weight and cost of the system while improving its reliability. The effectiveness of the proposed control scheme has been verified by simulations and experiments on a machine prototype.
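A minimal sketch of the described field-weakening trigger (illustrative gains and limits, not the authors' design): the stator-flux reference is reduced whenever the y-axis voltage command exceeds the voltage the inverter can supply, and recovers toward its rated value otherwise:

```python
# Hedged sketch of the flux-weakening trigger described above (illustrative
# gains and limits, not the authors' tuning): the stator-flux reference is
# reduced whenever the y-axis voltage command exceeds the voltage the
# inverter can supply, and recovers toward its rated value otherwise.

def flux_weakening_step(flux_ref, v_y_cmd, v_dc, flux_rated,
                        k_fw=0.002, flux_min=0.2):
    v_max = v_dc / 3 ** 0.5           # rough available inverter voltage
    error = abs(v_y_cmd) - v_max
    if error > 0:                     # over-modulation: weaken the flux
        flux_ref -= k_fw * error
    else:                             # headroom: recover toward rated flux
        flux_ref += 0.1 * k_fw * (flux_rated - flux_ref)
    return min(max(flux_ref, flux_min), flux_rated)


flux = 0.9  # Wb (rated, hypothetical value)
for v_y in (280.0, 340.0, 360.0, 300.0):  # hypothetical y-axis voltage commands
    flux = flux_weakening_step(flux, v_y, v_dc=560.0, flux_rated=0.9)
    print(round(flux, 4))
```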
Abstract:
This paper presents the modeling and position-sensorless vector control of a dual-airgap axial-flux permanent-magnet (AFPM) machine optimized for use in flywheel energy storage system (FESS) applications. The proposed AFPM machine has two sets of three-phase stator windings but requires only a single power converter to control both the electromagnetic torque and the axial levitation force. Proper controllability of the latter is crucial, as it can be used to minimize the vertical bearing stress and thereby improve the efficiency of the FESS. The method for controlling both the speed and the axial displacement of the machine is discussed. An inherently speed-sensorless observer is also proposed for speed estimation. The proposed observer eliminates the rotary encoder, which in turn reduces the overall weight and cost of the system while improving its reliability. The effectiveness of the proposed control scheme has been verified by simulations and experiments on a prototype machine.
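As a hedged sketch of the overall control idea (generic PI loops with illustrative gains, not the authors' controller), one regulator can track speed by commanding torque while another tracks axial displacement by commanding levitation force, with both commands handed to the single converter:

```python
# Hedged sketch (generic PI loops, not the authors' controller): one regulator
# tracks speed by commanding electromagnetic torque, another tracks the axial
# displacement by commanding levitation force, and both commands are handed to
# the single power converter mentioned above. Gains are illustrative.

class PI:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt, self.integral = kp, ki, dt, 0.0

    def step(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral


speed_ctrl = PI(kp=0.8, ki=5.0, dt=1e-3)     # speed error -> torque command
axial_ctrl = PI(kp=200.0, ki=50.0, dt=1e-3)  # displacement error -> force command

torque_cmd = speed_ctrl.step(error=100.0 - 95.0)   # rad/s tracking error
force_cmd = axial_ctrl.step(error=0.0 - 0.2e-3)    # keep rotor centered (m)
print(torque_cmd, force_cmd)  # both passed to the single converter's modulator
```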
Abstract:
This paper introduces a novel cage induction generator and presents a mathematical model through which its behavior can be accurately predicted. The proposed generator system employs a three-phase cage induction machine and generates single-phase, constant-frequency electricity at varying rotor speeds without an intermediate inverter stage. The technique uses any one of the three stator phases of the machine as the excitation winding and the remaining two phases, connected in series, as the power winding. The two-series-connected-and-one-isolated (TSCAOI) phase winding configuration magnetically decouples the two sets of windings, enabling independent control. Electricity is generated through the power winding at both sub- and super-synchronous speeds with appropriate excitation applied to the isolated winding at any frequency of generation. A dynamic mathematical model, which accurately predicts the behavior of the proposed generator, is also presented and implemented in MATLAB/Simulink. Experimental results of a 2-kW prototype generator under various operating conditions are presented, together with theoretical results, to demonstrate the viability of TSCAOI power generation. The proposed generator is simple, capable of both storage and retrieval of energy through its excitation winding, and is expected to be suitable for applications such as small wind turbines and microhydro systems.
Abstract:
Graphene and carbon nanotubes are the most promising nanomaterials for application in various modern nanodevices. Nanotubes and graphene were successfully produced in a single process using a magnetically enhanced arc discharge in a helium atmosphere between carbon and metal electrodes. A 3-D fluid model has been used to investigate the discharge parameters.
Abstract:
Cancer is a disease of signal transduction in which dysregulation of the network of intracellular and extracellular signaling cascades is sufficient to thwart the cell's finely tuned biochemical control mechanisms. A keen interest in the mathematical modeling of cell signaling networks and the regulation of signal transduction has emerged in recent years, and has produced a glimmer of insight into the sophisticated feedback control and network regulation operating within cells. In this review, we present an overview of published theoretical studies on the control aspects of signal transduction, emphasizing the role and importance of mechanisms such as ‘ultrasensitivity’ and feedback loops. We emphasize that these exquisite and often subtle control strategies represent the key to orchestrating ‘simple’ signaling behaviors within the complex intracellular network, while regulating the trade-off between sensitivity and robustness to internal and external perturbations. Through a consideration of these apparent paradoxes, we explore how the basic homeostasis of the intracellular signaling network, in the face of carcinogenesis, can lead to neoplastic progression rather than cell death. A simple mathematical model is presented, furnishing a vivid illustration of how ‘control-oriented’ models of the deranged signaling networks in cancer cells may suggest improved treatment strategies, including patient-tailored combination therapies, with the potential for reduced toxicity and more robust and potent antitumor activity.
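A generic illustration of the two mechanisms named above (arbitrary parameters, not the review's model): a Hill-type term produces an ultrasensitive, switch-like response, and a simple negative-feedback loop pulls the output back toward a steady state:

```python
# Generic illustration (not the review's model): a Hill-type activation term
# gives an ultrasensitive, switch-like response, and a simple negative
# feedback loop pulls the output back toward a steady state. Parameters are
# arbitrary and purely for illustration.

import numpy as np

def hill(s, K=1.0, n=4):
    """Ultrasensitive (switch-like) response for a large Hill coefficient n."""
    return s ** n / (K ** n + s ** n)

def simulate_feedback(stimulus, k_deg=1.0, dt=0.01, t_end=20.0):
    """dx/dt = hill(stimulus - x) - k_deg * x : crude negative feedback."""
    x, trace = 0.0, []
    for _ in np.arange(0.0, t_end, dt):
        dx = hill(max(stimulus - x, 0.0)) - k_deg * x
        x += dt * dx
        trace.append(x)
    return np.array(trace)

print(hill(0.5), hill(1.0), hill(2.0))   # sigmoidal dose response
print(simulate_feedback(2.0)[-1])        # output settles near a steady state
```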
Abstract:
In this paper, a novel data-driven approach to monitoring systems that operate under variable operating conditions is described. The method is based on characterizing the degradation process via a set of operation-specific hidden Markov models (HMMs), whose hidden states represent the unobservable degradation states of the monitored system and whose observable symbols represent the sensor readings. Using the HMM framework, modeling, identification, and monitoring methods are detailed that allow one to identify an HMM of degradation for each operation from mixed-operation data and to perform operation-specific monitoring of the system. Using a large data set provided by a major manufacturer, the new methods are applied to a semiconductor manufacturing process running multiple operations in a production environment.
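A minimal sketch of operation-specific HMM scoring (illustrative parameters, not the paper's identification procedure): each operation keeps its own transition and emission matrices, and a forward pass scores a window of discretized sensor readings while tracking the posterior of a 'degraded' state:

```python
# Minimal sketch of operation-specific HMM monitoring (illustrative parameters,
# not the identification procedure from the paper): each operation has its own
# transition and emission matrices, and the forward algorithm scores how likely
# a window of discretized sensor readings is under that operation's model.
# Low likelihood, or high posterior mass in the "degraded" state, flags the system.

import numpy as np

def forward(obs, start, trans, emit):
    """Return log-likelihood and final state posterior for a discrete HMM."""
    alpha = start * emit[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik, alpha

# Hypothetical 2-state model (0 = healthy, 1 = degraded), 3 sensor symbols.
models = {
    "op_A": (np.array([0.9, 0.1]),
             np.array([[0.95, 0.05], [0.0, 1.0]]),
             np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])),
}

readings = [0, 0, 1, 2, 2]  # discretized sensor symbols for operation "op_A"
log_lik, posterior = forward(readings, *models["op_A"])
print(log_lik, posterior)   # posterior[1] ~ probability of the degraded state
```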