998 results for process specification


Relevance:

30.00%

Publisher:

Abstract:

Abundant serpentinite seamounts are found along the outer high of the Mariana forearc at the top of the inner slope of the trench. One of them, Conical Seamount, was drilled at Sites 778, 779, and 780 during Leg 125. The rocks recovered at Holes 779A and 780C, respectively, on the flanks and at the summit of the seamount, include moderately serpentinized depleted harzburgites and some dunites. These rocks exhibit evidence of resorption of the orthopyroxene, when present, and the local presence of highly calcic diopside in veins oblique to the main high-temperature foliation of the rock. The peridotites, initially well-foliated with locally poikiloblastic textures, show overprints of a two-stage deformation history: (1) a high-temperature (>1000°C), low-stress (0.02 GPa), homogeneous deformation that has led to the present porphyroclastic textures displayed by the rocks and (2) heterogeneous ductile shearing at a much higher stress (0.05 GPa). This heterogeneous shearing probably records a single tectonic event, because it began at high temperatures, producing dynamic recrystallization of olivine in the shear zone, and ended at low temperatures in the stability field of chlorite and serpentine. In a few samples, olivine shows evidence of quasi-hydrostatic recrystallization at a very high temperature. Here, we propose that this recrystallization was related to fluid/magma percolation, a process that can also account for the resorption of the orthopyroxene and for the late crystallization of diopside veins in the rock. The impregnation by fluid or magma, development of the main high-temperature, low-stress deformation, and subsequent migration recrystallization of olivine probably occurred in a mantle fragment involved in the arc formation. In addition, this mantle has preserved structures that may have formed earlier in the oceanic lithosphere upon which the arc formed. Heterogeneous ductile shear zones in the peridotites may have developed during uplift. The "cold" deformation may have taken place during the diapiric rise of hot mantle that underwent subsequent serpentinization, or during gliding along normal faults associated with extension of the eastern margin of the forearc.

Relevance:

30.00%

Publisher:

Abstract:

The goal of the ontology requirements specification activity is to state why the ontology is being built, what its intended uses are, who the end users are, and which requirements the ontology should fulfill. This chapter presents detailed methodological guidelines for specifying ontology requirements efficiently. These guidelines will help ontology engineers to capture ontology requirements and produce the ontology requirements specification document (ORSD). The ORSD plays a key role during the ontology development process because it facilitates, among other activities, (1) the search and reuse of existing knowledge resources with the aim of reengineering them into ontologies, (2) the search and reuse of ontological resources (ontologies, ontology modules, ontology statements, as well as ontology design patterns), and (3) the verification of the ontology throughout its development.
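As a rough illustration of what such a document captures, an ORSD can be viewed as structured data whose competency questions later double as verification checks. The sketch below is hypothetical; the field names are invented and are not the chapter's actual template.

```python
# Hypothetical sketch of an ontology requirements specification
# document (ORSD); field names are illustrative, not the chapter's
# actual template.
orsd = {
    "purpose": "Represent scholarly publications for a semantic search service.",
    "scope": "Journal and conference papers; authorship; venues.",
    "intended_users": ["ontology engineers", "search-service developers"],
    "intended_uses": ["semantic annotation", "query expansion"],
    "non_functional_requirements": ["labels in English", "reuse existing vocabularies"],
    # Functional requirements are commonly captured as competency
    # questions the ontology must be able to answer.
    "competency_questions": [
        "Which authors have published at a given venue?",
        "Which publications cite a given publication?",
    ],
}

# Each competency question later doubles as a verification check:
# the finished ontology should support a query answering it.
for cq in orsd["competency_questions"]:
    print("verify ontology answers:", cq)
```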

Relevance:

30.00%

Publisher:

Abstract:

All living organisms require accurate mechanisms to faithfully inherit their genetic material during cell division. The centromere is a unique locus on each chromosome that supports a multiprotein structure called the kinetochore. During mitosis, the kinetochore is responsible for connecting chromosomes to spindle microtubules, allowing faithful segregation of the duplicated genome. In most organisms, centromere position and function are defined not by the local DNA sequence context but rather by an epigenetic chromatin-based mechanism. Centromere protein A (CENP-A) is central to this process, as chromatin assembled from this histone H3 variant is essential for assembly of the centromere complex, as well as for its epigenetic maintenance. As a major determinant of centromere function, CENP-A assembly requires tight control, both in its specificity for the centromere and in the timing of assembly. In the last few years, there have been several new insights into the molecular mechanisms that allow this process to occur. We review these here and discuss the general implications of the mechanism of cell cycle coupling of centromere inheritance.

Relevance:

30.00%

Publisher:

Abstract:

Background and Aims The morphogenesis and architecture of a rice plant, Oryza sativa, are critical factors in the yield equation, but they are not well studied because of the lack of appropriate tools for 3D measurement. The architecture of rice plants is characterized by a large number of tillers and leaves. The aims of this study were to specify rice plant architecture and to find appropriate functions to represent its 3D growth across all growth stages. Methods A japonica-type rice, 'Namaga', was grown in pots under outdoor conditions. A 3D digitizer was used to measure the rice plant structure at intervals from the young seedling stage to maturity. The L-system formalism was applied to create '3D virtual rice' plants, incorporating models of phenological development and leaf emergence period as a function of temperature and photoperiod, which were used to determine the timing of tiller emergence. Key Results The relationships between the nodal positions and leaf lengths, leaf angles and tiller angles were analysed and used to determine growth functions for the models. The '3D virtual rice' reproduces the structural development of isolated plants and provides a good estimation of the tillering process and of the accumulation of leaves. Conclusions The results indicated that the '3D virtual rice' has the potential to demonstrate differences in structure and development between cultivars and under different environmental conditions. Future work necessary to reflect both cultivar and environmental effects on model performance, and to link the model with physiological models, is proposed in the discussion.
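The L-system idea underlying the '3D virtual rice' is easy to illustrate in miniature. The sketch below is a hypothetical bracketed L-system whose single rewriting rule stands in for tiller and leaf emergence; the actual model's rules, driven by temperature and photoperiod, are far richer.

```python
# Minimal bracketed L-system: "T" is a tiller axis, "L" a leaf, and
# "[" / "]" push and pop a branching point. The single production is
# a toy stand-in for the temperature- and photoperiod-driven rules
# of the '3D virtual rice'.
RULES = {"T": "T[L]T"}  # hypothetical production, not the paper's

def rewrite(axiom: str, steps: int) -> str:
    """Apply the rewriting rules `steps` times in parallel."""
    s = axiom
    for _ in range(steps):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

print(rewrite("T", 1))  # T[L]T
print(rewrite("T", 2))  # T[L]T[L]T[L]T
```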

Relevance:

30.00%

Publisher:

Abstract:

We describe the creation process of the Minimum Information Specification for In Situ Hybridization and Immunohistochemistry Experiments (MISFISHIE). Modeled after the existing minimum information specification for microarray data, we created a new specification for gene expression localization experiments, initially to facilitate data sharing within a consortium. After successful use within the consortium, the specification was circulated to members of the wider biomedical research community for comment and refinement. After a period in which many new requirements were suggested, a final phase was needed to exclude those requirements deemed inappropriate as minimum requirements for all experiments. The full specification will soon be published as a version 1.0 proposal to the community, after which a fuller discussion must take place so that the final specification can be achieved with the involvement of the whole community. This paper is part of the special issue of OMICS on data standards.

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with product performance and specification in new product development. There are many different definitions of performance and specification in the literature. These are reviewed, and a new classification scheme for product performance is proposed. The link between performance and specification is discussed in detail using a new model of the new product development process. The new model involves two stages, each containing three main phases, and is useful for making decisions with regard to product performance and specification.

Relevance:

30.00%

Publisher:

Abstract:

Workflow technology has delivered effectively for a large class of business processes, providing the requisite control and monitoring functions. At the same time, this technology has been the target of much criticism due to its limited ability to cope with dynamically changing business conditions, which require business processes to be adapted frequently, and/or its limited ability to model business processes which cannot be entirely predefined. Requirements indicate the need for generic solutions in which a balance between process control and flexibility may be achieved. In this paper we present a framework that allows the workflow to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. This framework is based on the notion of process constraints. Whereas process constraints may be specified for any aspect of the workflow, such as structural or temporal aspects, our focus in this paper is on a constraint that allows dynamic selection of activities for inclusion in a given instance. We call these cardinality constraints, and this paper discusses their specification and validation requirements.
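As a rough sketch of the idea (the paper's formal constraint language is not reproduced here, and the class below is invented for illustration), a cardinality constraint restricts how many activities from a candidate pool may be selected into one instance, and can be validated when the instance is specified:

```python
# Hypothetical sketch: a cardinality constraint restricts how many
# activities from a candidate pool may be included in one instance.
from dataclasses import dataclass

@dataclass(frozen=True)
class CardinalityConstraint:
    pool: frozenset   # activities eligible for dynamic selection
    minimum: int      # at least this many must be included
    maximum: int      # at most this many may be included

    def validate(self, selected: set) -> bool:
        chosen = selected & self.pool
        return self.minimum <= len(chosen) <= self.maximum

# E.g. an instance must include one or two of three review activities.
c = CardinalityConstraint(frozenset({"peer_review", "lead_review", "audit"}), 1, 2)
print(c.validate({"peer_review", "file_claim"}))            # True
print(c.validate({"peer_review", "lead_review", "audit"}))  # False
```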

Relevance:

30.00%

Publisher:

Abstract:

A major challenge in teaching software engineering to undergraduates is that most students have limited industry experience, so the problems addressed are unknown and hence unappreciated. Issues of scope prevent a realistic software engineering experience, and students often graduate with a simplistic view of software engineering’s challenges. Problems and Programmers (PnP) is a competitive, physical card game that simulates the software engineering process from requirements specification to product delivery. Deliverables are abstracted, allowing a focus on process issues and for lessons to be learned in a relatively short time. The rules are easy to understand and the game’s physical nature allows for face-to-face interaction between players. The game’s developers have described PnP in previous publications, but this paper reports the game’s use within a larger educational scheme. Students learn and play PnP, and then are required to create a software requirements specification based on the game. Finally, students reflect on the game’s strengths and weaknesses and their experiences in an individual essay. The paper discusses this approach, students’ experiences and overall outcomes, and offers an independent, critical look at the game, its use, and potential improvements.

Relevance:

30.00%

Publisher:

Abstract:

Workflow technology is currently being deployed in quite diverse domains. However, the element of change is present in some degree and form in almost all domains. A workflow implementation that does not support the process of change will not benefit the organization in the long run. Change can be manifested in different forms in workflow processes. In this paper, we first present a categorization of workflow change characteristics and divide workflow processes into dynamic, adaptive and flexible processes. We define flexibility as the ability of the workflow process to execute on the basis of a loosely or partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. To provide a modeling framework that offers true flexibility, we need to consider the factors that influence the paths of (unique) instances together with the process definition. We advocate an approach that aims at making the process of change part of the workflow process itself. We introduce the notion of an open instance that consists of a core process and several pockets of flexibility, and present a framework based on this notion, which makes use of special build activities that provide the functionality to integrate the process of defining a change into the open workflow instance.
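The open-instance idea can be caricatured as follows; this is a hypothetical sketch, not the paper's framework, and the names are invented. A core process carries named pockets, and a build activity concretizes a pocket at runtime, making that fragment of the model specific to the instance:

```python
# Hypothetical sketch of an "open instance": a core process carrying
# named pockets of flexibility; names and API are invented.
class OpenInstance:
    def __init__(self, core):
        self.core = core      # ordered activity names and pocket markers
        self.pockets = {}     # pocket marker -> concrete activity list

    def build(self, pocket, activities):
        """Build activity: concretize one pocket for this instance only."""
        self.pockets[pocket] = list(activities)

    def effective_model(self):
        """The instance-specific model: core with built pockets expanded."""
        model = []
        for step in self.core:
            if step.startswith("pocket:"):
                model.extend(self.pockets.get(step, []))  # may still be open
            else:
                model.append(step)
        return model

inst = OpenInstance(["receive_claim", "pocket:assessment", "notify_client"])
inst.build("pocket:assessment", ["site_visit", "expert_report"])
print(inst.effective_model())
# ['receive_claim', 'site_visit', 'expert_report', 'notify_client']
```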

Relevance:

30.00%

Publisher:

Abstract:

A number of integrations of the state-based specification language Object-Z and the process algebra CSP have been proposed in recent years. In developing such integrations, a number of semantic decisions have to be made. In particular, what happens when an operation's precondition is not satisfied? Is the operation blocked, i.e., prevented from occurring, or can it occur with an undefined result? Also, are outputs from operations angelic, satisfying the environment's constraints on them, or are they demonic and not influenced by the environment at all? In this paper we discuss the differences between the models, and show that by adopting a blocking model of preconditions together with an angelic model of outputs one can specify systems at higher levels of abstraction.
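Outside any formal notation, the distinction can be caricatured as follows; this hypothetical sketch is merely illustrative and is not the semantics of Object-Z or CSP. In the blocking model a failed precondition refuses the operation, while in the non-blocking model the operation proceeds with an undefined result:

```python
# Illustrative only: withdrawing from an account, with precondition
# amount <= balance.

class Blocked(Exception):
    """In the blocking model a failed precondition refuses the event."""

def withdraw_blocking(balance: int, amount: int) -> int:
    if amount > balance:       # precondition fails
        raise Blocked()        # the operation is prevented from occurring
    return balance - amount

def withdraw_nonblocking(balance: int, amount: int):
    if amount > balance:       # precondition fails
        return None            # the operation occurs, result undefined
    return balance - amount

print(withdraw_nonblocking(10, 25))   # None: occurred, outcome undefined
try:
    withdraw_blocking(10, 25)
except Blocked:
    print("refused")                  # blocked: the event never occurred
```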

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the use of the Business Process Execution Language for Web Services (BPEL4WS/BPEL) for managing scientific workflows. This work is the result of our attempt to adopt a Service Oriented Architecture in order to perform Web-service-based simulation of metal vapor lasers. Scientific workflows can be more demanding in their requirements than business processes. In the context of addressing these requirements, the features of the BPEL4WS specification, widely regarded as the de facto standard for orchestrating Web services in business workflows, are discussed. A typical use case, the calculation of electric field potential and intensity distributions, is discussed as an example of building a BPEL process to perform distributed simulation composed of loosely coupled services.
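BPEL itself is XML and is not reproduced here; purely as a hypothetical sketch, the orchestration pattern that a BPEL sequence expresses, invoking loosely coupled services in order as in the field-potential use case, has roughly the following shape. The endpoints and payloads are invented for illustration:

```python
# Hypothetical sketch of the pattern a BPEL <sequence> of <invoke>
# activities expresses; endpoints and payloads are invented.
import json
from urllib import request

def invoke(endpoint: str, payload: dict) -> dict:
    """One <invoke>: POST a JSON message to a partner service."""
    req = request.Request(
        endpoint,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

def simulate(params: dict) -> dict:
    # <sequence>: mesh generation, then field solution, then
    # post-processing, each step consuming the previous step's result.
    mesh = invoke("http://example.org/mesh", params)
    field = invoke("http://example.org/solve", {"mesh": mesh})
    return invoke("http://example.org/postprocess", {"field": field})
```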

Relevance:

30.00%

Publisher:

Abstract:

We develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition methodology differs from previous studies in that both the informational variance and the noise variance are time-varying. Furthermore, the covariance of the informational component and the noisy component is no longer restricted to be zero. The resultant measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resultant seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
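The essays' estimated models are not reproduced here, but the contrast between deterministic and autoregressive conditional seasonal filtering, and the informativeness ratio of essay 2, can be sketched numerically. All parameter values below are invented for illustration:

```python
# Hypothetical numeric sketch, not the essays' estimated models.
import numpy as np

rng = np.random.default_rng(0)
bins, days = 48, 250                       # intraday bins, trading days
u = np.linspace(-1.0, 1.0, bins)
s = 1.0 + 0.5 * u**2                       # fixed U-shaped seasonal
r = rng.normal(0.0, s, size=(days, bins))  # simulated intraday returns

r_det = r / s                              # deterministic filtering

# Autoregressive conditional variant: the seasonal evolves day by day,
# so a changing intraday pattern is tracked rather than held fixed.
phi = 0.9                                  # invented persistence parameter
s_t, r_ar = s.copy(), np.empty_like(r)
for d in range(days):
    r_ar[d] = r[d] / s_t
    s_t = phi * s_t + (1 - phi) * np.abs(r[d])

# Essay 2's informativeness measure: informational variance over total
# return variance; the component values here are invented.
var_info, var_noise, cov = 0.8, 0.3, 0.1
informativeness = var_info / (var_info + var_noise + 2 * cov)
print(round(informativeness, 3))           # 0.615
```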

Relevance:

30.00%

Publisher:

Abstract:

Hybridisation is a systematic process through which the characteristic features of hybrid logic, at both the syntactic and the semantic levels, are developed on top of an arbitrary logic framed as an institution. It also captures the construction of first-order encodings of such hybridised institutions into theories in first-order logic. The method was originally developed to build suitable logics for the specification of reconfigurable software systems on top of whatever logic is used to describe the local requirements of each system configuration. Hybridisation has, however, a broader scope, providing a fresh example of yet another development in combining and reusing logics driven by a problem from Computer Science. This paper offers an overview of this method, proposes some new extensions, namely the introduction of full quantification leading to the specification of dynamic modalities, and exemplifies its potential through a didactical application. It is discussed how hybridisation can be successfully used in a formal specification course in which students progress from equational to hybrid specifications in a uniform setting, integrating paradigms, combining data and behaviour, and dealing appropriately with systems evolution and reconfiguration.
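For readers unfamiliar with the hybrid features referred to above, the characteristic machinery is standard: nominals name individual states, satisfaction operators evaluate a formula at a named state, and the quantified extension adds binders over state variables. In generic form (not this paper's notation):

```latex
% Nominals and satisfaction operators (generic hybrid logic, not this
% paper's notation): a nominal i names a single state, and @_i evaluates
% a formula at that state regardless of the current one.
\[
  M, w \models @_i\,\varphi
  \quad\text{iff}\quad
  M, w_i \models \varphi,
  \qquad\text{where } w_i \text{ is the state named by } i.
\]
% Full quantification over state variables adds binders such as
\[
  M, w \models \forall x.\,\varphi
  \quad\text{iff}\quad
  M[x \mapsto v], w \models \varphi \text{ for every state } v.
\]
```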

Relevance:

30.00%

Publisher:

Abstract:

Home Automation holds the potential of realizing cost savings for end users while reducing the carbon footprint of domestic energy consumption. Yet, adoption is still very low. The high cost of vendor-supplied home automation systems is a major prohibiting factor. Open-source systems such as FHEM, Domoticz, and OpenHAB are a cheaper alternative and can drive the adoption of home automation. Moreover, they have the advantage of not being limited to a single vendor or communication technology, which gives end users flexibility in the choice of devices to include in their installation. However, interaction with devices having diverse communication technologies can be inconvenient for users, thus limiting the utility they derive from the system. For application developers, creating applications that interact with the several technologies in such systems is not a consistent process. Hence, there is a need for a common description mechanism that makes interaction smooth for end users and enables application developers to build home automation applications in a consistent and uniform way. This thesis proposes such a description mechanism within the context of an open-source home automation system, FHEM, together with a system concept for its application. A mobile application was developed as a proof of concept of the proposed description mechanism, and the results of the implementation are reflected upon.
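The thesis's actual description format is not shown here; as a hypothetical sketch, a technology-neutral device descriptor of the kind such a mechanism provides might look as follows. The field names are invented and are not FHEM's real configuration syntax:

```python
# Hypothetical common description layer over heterogeneous
# home-automation technologies; field names are invented and are
# not FHEM's actual configuration syntax.
DEVICES = [
    {
        "id": "living_room_lamp",
        "technology": "zwave",        # underlying protocol, hidden from apps
        "capabilities": {"on_off": True, "dim": True},
    },
    {
        "id": "hallway_thermostat",
        "technology": "homematic",
        "capabilities": {"set_temperature": True},
    },
]

def devices_with(capability: str) -> list:
    """Uniform query: applications ask by capability, not technology."""
    return [d["id"] for d in DEVICES if d["capabilities"].get(capability)]

print(devices_with("on_off"))   # ['living_room_lamp']
```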