939 results for process model collection


Relevance: 30.00%

Abstract:

In our work we integrate a formalism for knowledge representation with a formalism for process representation as a way to specify and regulate the overall activity of a multi-cellular agent. The result of this approach is XP,N, a new formalism in which a distributed system can be modeled as a collection of interrelated sub-nets sharing a common explicit control structure. Each sub-net represents a system of asynchronous concurrent threads modeled by a set of transitions. XP,N combines local state and control with interaction and hierarchy to achieve a high level of abstraction and to model the complex relationships between all the components of a distributed system. Viewed as a tool, XP,N provides a carefully devised conflict-resolution strategy that intentionally mimics the genetic regulatory mechanism an organic cell uses to select the next genes to process.
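
To make the abstract's ingredients concrete, here is a hypothetical sketch of sub-nets of concurrent transitions coordinated by one shared, explicit control structure with a conflict-resolution step. The class names and the priority rule are illustrative assumptions, not the actual XP,N definition.

```python
# Hypothetical sketch: sub-nets of concurrent transitions under a shared
# control structure that resolves conflicts; illustrative only, not the
# published XP,N semantics.
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    guard: callable           # transition is enabled when guard(state) is True
    action: callable          # state -> new state
    priority: float = 0.0     # used by the toy conflict-resolution strategy

@dataclass
class SubNet:
    name: str
    transitions: list = field(default_factory=list)

    def enabled(self, state):
        return [t for t in self.transitions if t.guard(state)]

class ControlStructure:
    """Shared explicit control: gathers enabled transitions from all sub-nets
    and selects one to fire, mimicking a regulatory selection step."""
    def __init__(self, subnets):
        self.subnets = subnets

    def step(self, state):
        candidates = [t for net in self.subnets for t in net.enabled(state)]
        if not candidates:
            return state, None
        chosen = max(candidates, key=lambda t: t.priority)  # toy conflict resolution
        return chosen.action(state), chosen.name
```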

Relevance: 30.00%

Abstract:

Published in "Information control in manufacturing 1998 (INCOM'98): advances in industrial engineering: a proceedings volume from the 9th IFAC Symposium, Nancy-Metz, France, 24-26 June 1998, Vol. 2".

Relevance: 30.00%

Abstract:

Doctoral thesis in Science and Engineering of Polymers and Composites.

Relevance: 30.00%

Abstract:

Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line because of the number of variables that must be monitored: with many univariate charts running in parallel on a high-volume process, the false-alarm rate becomes unacceptably high. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research is conducted on a manufacturing line that evaluates the performance of a fuel cell and has three stages of production assembly contributing to the final product performance. Product performance is assessed by power and energy measurements taken at various time points throughout the discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated on both individual and batch observations. Control charts based on Hotelling's T² are compared with other multivariate methods, such as Principal Component Analysis (PCA); PCA was identified as the most suitable method. Score, T² and DModX control charts are constructed from the PCA model. Diagnostic procedures using contribution plots are also discussed for out-of-control points detected by these charts; such plots enable the investigator to perform root cause analysis. Multivariate batch techniques are compared with the individual observations typically seen on continuous processes, and recommendations are given for introducing multivariate techniques appropriate for most high-volume processes.
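
A minimal sketch of the PCA-based monitoring workflow the abstract describes (T² and a DModX-style residual statistic, with per-variable contribution data for diagnosis), using NumPy/scikit-learn on synthetic stand-in data; the variable names and the percentile-based control limits are illustrative assumptions, not the thesis's actual implementation.

```python
# PCA-based multivariate monitoring sketch: Hotelling's T^2 in the score space
# plus an SPE (DModX-style) residual statistic, with contribution data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))          # in-control reference data
X_new = rng.normal(size=(50, 10))
X_new[25] += 4.0                              # inject one out-of-control sample

mean, std = X_train.mean(0), X_train.std(0)
Z_train = (X_train - mean) / std              # autoscale with training statistics
Z_new = (X_new - mean) / std

pca = PCA(n_components=3).fit(Z_train)
scores = pca.transform(Z_new)

# Hotelling's T^2: variance-scaled squared distance in the score space.
t2 = (scores**2 / pca.explained_variance_).sum(axis=1)

# SPE residual: squared distance from the sample to the PCA model plane.
residuals = Z_new - pca.inverse_transform(scores)
spe = (residuals**2).sum(axis=1)

# Empirical control limits from the training data (99th percentile).
t2_train = (pca.transform(Z_train)**2 / pca.explained_variance_).sum(axis=1)
spe_train = ((Z_train - pca.inverse_transform(pca.transform(Z_train)))**2).sum(axis=1)
t2_lim, spe_lim = np.percentile(t2_train, 99), np.percentile(spe_train, 99)

for i in np.where((t2 > t2_lim) | (spe > spe_lim))[0]:
    # Contribution-plot data: each variable's share of the residual,
    # the starting point for root cause analysis.
    print(f"sample {i}: T2={t2[i]:.1f}, SPE={spe[i]:.1f}, contributions={residuals[i]**2}")
```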

Relevance: 30.00%

Abstract:

This paper presents an endogenous growth model in which research activity is financed by intermediaries that are able to reduce the incidence of researchers' moral hazard. It is shown that financial activity is growth-promoting because it increases research productivity. It is also found that a subsidy to the financial sector may have larger growth effects than a direct subsidy to research. Moreover, due to the presence of moral hazard, increasing the subsidy rate to R&D may reduce the growth rate. I show that there exists a negative relation between the financing of innovation and the process of capital accumulation. Concerning welfare, the presence of two externalities of opposite sign stemming from financial activity may cause the no-tax equilibrium to provide an inefficient level of financial services. Thus, policies oriented to balancing the effects of the two externalities will be welfare-improving.

Relevance: 30.00%

Abstract:

Report for the scientific sojourn carried out at the Model-based Systems and Qualitative Reasoning Group (Technical University of Munich) from September until December 2005. Constructed wetlands (CWs), or modified natural wetlands, are used all over the world as wastewater treatment systems for small communities because they can provide high treatment efficiency with low energy consumption and low construction, operation and maintenance costs. Their treatment process is very complex because it includes physical, chemical and biological mechanisms such as microorganism oxidation, microorganism reduction, filtration, sedimentation and chemical precipitation, and these processes can be influenced by different factors. In order to guarantee the performance of CWs, an operation and maintenance program must be defined for each Wastewater Treatment Plant (WWTP). The main objective of this project is to provide computer support for defining the most appropriate operation and maintenance protocols to guarantee the correct performance of CWs. To this end, models have been defined that represent knowledge about CWs: the components involved in the sanitation process, the relations among these units, and the processes that remove pollutants. Horizontal subsurface flow CWs are chosen as a case study, and filtration is selected as the first process to be modelled. The goal, however, is to represent the process knowledge in such a way that it can be reused for other types of WWTP.

Relevance: 30.00%

Abstract:

Description of a costing model developed by a digital production librarian to determine the cost of putting an item into the Claremont Colleges Digital Library at the Claremont University Consortium. This case study includes variables such as material types and funding sources, data collection methods, and formulas and calculations for analysis. The model is useful for grant applications, cost allocations, and budgeting by digital project coordinators and digital library projects.
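
A hypothetical worked example of the kind of per-item cost calculation such a model performs; every variable name, rate, and category below is an illustrative assumption, not the Claremont model's actual figures.

```python
# Hypothetical per-item digitization costing sketch; the categories, rates,
# and overhead factor are illustrative assumptions only.
def cost_per_item(staff_hours, hourly_rate, equipment_cost, items_digitized,
                  overhead_rate=0.2):
    """Return the fully loaded cost of putting one item into the digital library."""
    labor = staff_hours * hourly_rate
    equipment_share = equipment_cost / items_digitized   # amortize equipment over the batch
    direct = labor / items_digitized + equipment_share
    return direct * (1 + overhead_rate)                  # add institutional overhead

# e.g. 120 staff hours at $25/h plus $3,000 of equipment spread over 500 items
print(f"${cost_per_item(120, 25.0, 3000.0, 500):.2f} per item")
```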

Relevance: 30.00%

Abstract:

Combined media on photographic paper, 51¼" × 83". Private collection.

Relevance: 30.00%

Abstract:

Social Accounting Matrices (SAMs) are normally used to analyse the income generation process, but they are also useful for analysing cost transmission and price formation mechanisms. Roland-Holst and Sancho (1995) used the SAM structure to analyse price and cost linkages through a representation of the interdependence between activities, households and factors. This paper analyses the cost transmission mechanisms further by adding the capital account to the endogenous components of the Roland-Holst and Sancho approach, thereby capturing the responses of prices to exogenous shocks in savings and investment. I also present an additive decomposition of the global price effects into categories of interdependence that isolates the impact on price levels of shocks in the capital account. A 1994 Social Accounting Matrix is used for an empirical application to the Catalan economy.

Keywords: social accounting matrix, cost linkages, price transmission, capital account. JEL classification: C63, C69, D59.
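
For orientation, the standard SAM cost-transmission (price) model that this line of work builds on can be sketched as follows; the notation is generic, not necessarily the paper's own.

```latex
% Generic SAM price model: A is the matrix of endogenous SAM expenditure
% coefficients, v' the unit exogenous cost vector, p' the price vector.
\[
  p' = p'A + v' \quad\Longrightarrow\quad p' = v'(I - A)^{-1}.
\]
% An exogenous cost shock \(\Delta v'\) is transmitted to prices through the
% multiplier matrix \((I - A)^{-1}\), which additive decompositions then split
% into categories of interdependence.
```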

Relevance: 30.00%

Abstract:

One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to suggest. In particular, we postulate that runs of very large returns can be predictable over small time periods. To show this, we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. The model is new in the literature on nonlinear processes. Its novelty resides in two features that distinguish it from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with General Motors stock price data covering two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. Analyzing the periods around these crises, we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second.

JEL classification: C12; C15; C22; C51. Keywords and phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models.
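
As a hedged reconstruction of the model class named in the abstract, a three-regime TAR(3,1) with the lagged shock as threshold variable and GARCH(1,1) errors could be written as follows; the notation is generic and the thresholds are left unspecified, so this is a sketch of the model class rather than the paper's estimated specification.

```latex
% Three-regime threshold AR(1), with the previous shock as threshold variable,
% and GARCH(1,1) conditional variance; a generic sketch of the model class.
\[
  r_t = \phi_0^{(j)} + \phi_1^{(j)} r_{t-1} + \varepsilon_t,
  \qquad
  j = \begin{cases}
    1, & \varepsilon_{t-1} \le c_1,\\
    2, & c_1 < \varepsilon_{t-1} \le c_2,\\
    3, & \varepsilon_{t-1} > c_2,
  \end{cases}
\]
\[
  \varepsilon_t = \sigma_t z_t,
  \qquad
  \sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2 .
\]
```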

Relevance: 30.00%

Abstract:

The oxalate-carbonate pathway involves the oxidation of calcium oxalate to low-magnesium calcite and represents a potential long-term terrestrial sink for atmospheric CO2. In this pathway, bacterial oxalate degradation is associated with a strong local alkalinization and subsequent carbonate precipitation. In order to test whether this process occurs in soil, the roles of bacteria, fungi and calcium oxalate amendments were studied using microcosms. In a model system with sterile soil amended with laboratory cultures of oxalotrophic bacteria and fungi, the addition of calcium oxalate induced a distinct pH shift and led to the final precipitation of calcite; however, the simultaneous presence of bacteria and fungi was essential to drive this pH shift. Growth of oxalotrophic bacteria and fungi was confirmed by qPCR on the frc (oxalotrophic bacteria) and 16S rRNA genes and by quantification of ergosterol (active fungal biomass), respectively. The experiment was replicated in microcosms with non-sterilized soil, in which the bacterial and fungal contributions to oxalate degradation were evaluated by treatments with specific biocides (cycloheximide and bronopol). Results showed that the autochthonous microflora oxidized calcium oxalate and induced a significant soil alkalinization. Moreover, the data confirmed the results from the model soil, showing that bacteria are essentially responsible for the pH shift but require the presence of fungi for their oxalotrophic activity. The combined results highlight that the interaction between bacteria and fungi is essential to drive metabolic processes in complex environments such as soil.

Relevance: 30.00%

Abstract:

HIV latency is a major obstacle to curing infection. Current strategies to eradicate HIV aim at increasing transcription of the latent provirus. In the present study we observed that latently infected CD4+ T cells from HIV-infected individuals failed to produce viral particles upon ex vivo exposure to SAHA (vorinostat), despite effective inhibition of histone deacetylases. To identify steps that were not susceptible to the action of SAHA or other latency-reverting agents, we used a primary CD4+ T cell model, joint host and viral RNA sequencing, and a virus-encoded reporter. This model served to investigate the characteristics of latently infected cells, the dynamics of HIV latency, and the process of reactivation induced by various stimuli. During latency, we observed persistence of viral transcripts but only limited viral translation. Similarly, the reactivating agents SAHA and disulfiram successfully increased viral transcription but failed to effectively enhance viral translation, mirroring the ex vivo data. This study highlights the importance of post-transcriptional blocks as one mechanism leading to HIV latency that needs to be relieved in order to purge the viral reservoir.

Relevance: 30.00%

Abstract:

In 1903, the eastern slope of Turtle Mountain (Alberta) was affected by a 30 million m³ rockslide, known as the Frank Slide, that resulted in more than 70 casualties. Assuming that the main discontinuity sets, including bedding, control part of the slope morphology, the structural features of Turtle Mountain were investigated using a digital elevation model (DEM). Using new landscape analysis techniques, we identified three main joint and fault sets, in agreement with the sets identified through field observations. These DEM-based techniques confirm and refine the most recent geological model of the Frank Slide: the rockslide initiated along bedding and a fault at the base of the slope and propagated up-slope by a regressive process following a surface composed of pre-existing discontinuities. The DEM analysis also permits the identification of important geological structures along the 1903 slide scar. Based on the so-called Sloping Local Base Level (SLBL), a method permitting a geometric interpretation of the failure surface from a DEM, we estimated the presently unstable volumes in the main scar delimited by the cracks and around the southern area of the scar (South Peak). Finally, we propose a mechanism for the progressive failure of the rock mass that considers gently dipping wedges (30°): the prisms or wedges defined by two discontinuity sets permit the creation of a failure surface by progressive failure, and such structures are commonly observed in recent rockslides. The method is efficient and is recommended as a preliminary analysis prior to field investigation.
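
For intuition, the SLBL idea can be sketched as an iterative routine on elevation data: each cell is lowered toward the mean of its neighbours minus a tolerance, and the eroded thickness integrates to the unstable volume. The 1-D toy below follows the commonly published formulation of this principle and is not the authors' exact implementation.

```python
# Minimal 1-D sketch of the Sloping Local Base Level (SLBL) principle:
# iteratively lower each interior cell toward the mean of its neighbours
# minus a tolerance C. Illustrative only; the published method operates on
# 2-D DEMs with additional controls.
import numpy as np

def slbl_1d(z, c=0.05, max_iter=10000, tol=1e-6):
    """Return the SLBL base surface for an elevation profile z."""
    base = z.astype(float).copy()
    for _ in range(max_iter):
        target = 0.5 * (base[:-2] + base[2:]) - c   # neighbour mean minus tolerance
        new = np.minimum(base[1:-1], target)        # only ever lower the surface
        if np.max(base[1:-1] - new) < tol:
            break
        base[1:-1] = new
    return base

z = np.array([10, 12, 15, 20, 26, 30, 28, 27, 26, 25], dtype=float)  # toy profile (m)
base = slbl_1d(z)
cell = 1.0                                          # cell width (m)
print("unstable cross-section area:", ((z - base) * cell).sum())
```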

Relevance: 30.00%

Abstract:

Animal models of infective endocarditis (IE) induced by high-grade bacteremia revealed the pathogenic roles of Staphylococcus aureus surface adhesins and platelet aggregation in the infection process. In humans, however, S. aureus IE possibly occurs through repeated bouts of low-grade bacteremia from a colonized site or intravenous device. Here we used a rat model of IE induced by continuous low-grade bacteremia to explore further the contributions of S. aureus virulence factors to the initiation of IE. Rats with aortic vegetations were inoculated by continuous intravenous infusion (0.0017 ml/min over 10 h) with 10^6 CFU of Lactococcus lactis pIL253 or a recombinant L. lactis strain expressing an individual S. aureus surface protein (ClfA, FnbpA, BCD, or SdrE) conferring a particular adhesive or platelet-aggregation property. Vegetation infection was assessed 24 h later. Plasma was collected at 0, 2, and 6 h postinoculation to quantify the expression of tumor necrosis factor (TNF), interleukin-1α (IL-1α), IL-1β, IL-6, and IL-10. The percentage of vegetation infection relative to that with strain pIL253 (11%) increased when binding to fibrinogen was conferred on L. lactis (ClfA strain) (52%; P = 0.007) and increased further with adhesion to fibronectin (FnbpA strain) (75%; P < 0.001). Expression of fibronectin binding alone was not sufficient to induce IE (BCD strain) (10% infection). Platelet aggregation increased the risk of vegetation infection (SdrE strain) (30%). Conferring adhesion to fibrinogen and fibronectin favored IL-1β and IL-6 production. Our results, with a model of IE induced by low-grade bacteremia resembling human disease, extend the essential role of fibrinogen binding in the initiation of S. aureus IE. Triggering of platelet aggregation or an inflammatory response may contribute to or promote the development of IE.

Relevance: 30.00%

Abstract:

This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors that the basic RBC model does not explain. The results of marginal likelihood ratio tests reveal that the more general error model significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
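
Schematically, the hybrid approach augments the model-implied series with a flexible residual process; the sketch below uses generic notation (not the paper's exact system) and shows the VARMA(1,1) case against which the VAR and AR restrictions are tested.

```latex
% Observables = RBC solution + residual process u_t; the hybrid approach lets
% u_t follow a VARMA process rather than the VAR/AR alternatives.
\[
  y_t = \hat{y}_t(\theta) + u_t,
  \qquad
  u_t = \Phi u_{t-1} + e_t + \Theta e_{t-1},
  \qquad
  e_t \sim \mathcal{N}(0,\Sigma),
\]
% where \(\hat{y}_t(\theta)\) is the model-implied series and marginal
% likelihoods compare the VARMA specification with the VAR (\(\Theta = 0\))
% and AR (diagonal \(\Phi\), \(\Theta = 0\)) restrictions.
```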