924 results for Distributed process model


Relevance:

30.00%

Publisher:

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of core metabolic model; (7) generation of biomass composition reaction; (8) completion of draft metabolic model; (9) curation of metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
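
As an illustration of the kind of prediction such a reconstruction enables, the following minimal sketch uses COBRApy to flag candidate essential genes in a draft model. The file name draft_model.xml and the 5% growth threshold are assumptions made for the example; they are not part of the chapter.

```python
# Hedged sketch: gene essentiality prediction from a draft genome-scale model.
import cobra
from cobra.flux_analysis import single_gene_deletion

# "draft_model.xml" is a placeholder for a draft reconstruction in SBML format
model = cobra.io.read_sbml_model("draft_model.xml")

# Baseline growth on the current in silico medium (biomass reaction as objective)
baseline = model.optimize().objective_value

# Knock out each gene in turn; genes whose loss drops growth below 5% of the
# baseline (an illustrative cut-off) are flagged as candidate essential genes
deletions = single_gene_deletion(model)
essential = deletions[deletions["growth"] < 0.05 * baseline]
print(f"{len(essential)} of {len(model.genes)} genes predicted essential")
```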

Relevance:

30.00%

Publisher:

Abstract:

In our work we have chosen to integrate a formalism for knowledge representation with a formalism for process representation as a way to specify and regulate the overall activity of a multi-cellular agent. The result of this approach is XP,N, another formalism, wherein a distributed system can be modeled as a collection of interrelated sub-nets sharing a common explicit control structure. Each sub-net represents a system of asynchronous concurrent threads modeled by a set of transitions. XP,N combines local state and control with interaction and hierarchy to achieve a high level of abstraction and to model the complex relationships between all the components of a distributed system. Viewed as a tool, XP,N provides a carefully devised conflict resolution strategy that intentionally mimics the genetic regulatory mechanism used in an organic cell to select the next genes to process.
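
The toy sketch below is not XP,N itself; it merely illustrates the kind of weighted conflict resolution the abstract alludes to: among enabled, conflicting transitions of a sub-net, one is selected stochastically according to a regulatory-style weight. All class names, place/transition names and weights are invented for illustration.

```python
# Toy Petri-net-style sub-net with a weighted conflict-resolution rule.
import random

class Transition:
    def __init__(self, name, inputs, outputs, weight=1.0):
        self.name, self.inputs, self.outputs, self.weight = name, inputs, outputs, weight

    def enabled(self, marking):
        # A transition is enabled when every input place holds enough tokens
        return all(marking.get(p, 0) >= n for p, n in self.inputs.items())

    def fire(self, marking):
        for p, n in self.inputs.items():
            marking[p] -= n
        for p, n in self.outputs.items():
            marking[p] = marking.get(p, 0) + n

def step(marking, transitions):
    """Pick one enabled transition, weighting the choice like a regulatory switch."""
    enabled = [t for t in transitions if t.enabled(marking)]
    if not enabled:
        return None
    chosen = random.choices(enabled, weights=[t.weight for t in enabled])[0]
    chosen.fire(marking)
    return chosen.name

marking = {"p1": 1, "p2": 1}
transitions = [
    Transition("t1", {"p1": 1}, {"p3": 1}, weight=3.0),   # favoured branch
    Transition("t2", {"p1": 1, "p2": 1}, {"p4": 1}),      # conflicting branch
]
print(step(marking, transitions), marking)
```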

Relevance:

30.00%

Publisher:

Abstract:

Published in "Information control in manufacturing 1998 : (INCOM'98) : advances in industrial engineering : a proceedings volume from the 9th IFAC Symposium, Nancy-Metz, France, 24-26 June 1998. Vol. 2"

Relevance:

30.00%

Publisher:

Abstract:

Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line because of the number of variables that require monitoring. Given the high-volume nature of the process, the elevated false-alarm rate of univariate methods can present problems. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research presented is conducted on a manufacturing line that evaluates the performance of a fuel cell. It has three stages of production assembly that contribute to the final end-product performance. The product performance is assessed by power and energy measurements taken at various time points throughout the discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated using individual and batch observations. Modern multivariate control charts based on Hotelling's T2 are compared to other multivariate methods, such as Principal Component Analysis (PCA). The latter, PCA, was identified as the most suitable method. Control charts such as scores, T2 and DModX charts are constructed from the PCA model. Diagnostic procedures using contribution plots for out-of-control points detected by these charts are also discussed; these plots enable the investigator to perform root cause analysis. Multivariate batch techniques are compared to the individual observations typically seen in continuous processes. Recommendations for the introduction of multivariate techniques appropriate for most high-volume processes are also covered.
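
As a hedged illustration of the PCA-based monitoring approach described above, the sketch below fits a PCA model on in-control reference data and computes Hotelling's T2 on the retained scores for new observations. The data, the number of components and the 95% limit are placeholders, not values from the study.

```python
# Sketch: PCA model on reference data, Hotelling's T2 on new observations.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy import stats

X_ref = np.random.randn(200, 10)          # in-control reference data (placeholder)
X_new = np.random.randn(20, 10) + 0.5     # new production data (placeholder)

scaler = StandardScaler().fit(X_ref)
pca = PCA(n_components=3).fit(scaler.transform(X_ref))

def hotelling_t2(X):
    # T2 = sum of squared scores scaled by the variance of each retained component
    scores = pca.transform(scaler.transform(X))
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

# Approximate 95% control limit based on the F distribution
n, a = X_ref.shape[0], pca.n_components_
limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.95, a, n - a)
print(np.where(hotelling_t2(X_new) > limit)[0])   # indices of out-of-control points
```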

Relevance:

30.00%

Publisher:

Abstract:

Background: Although nutritional, metabolic and cardiovascular abnormalities are commonly seen in experimental studies of obesity, it is uncertain whether these effects result from the treatment or from body adiposity. Objective: To evaluate the influence of treatment and body composition on metabolic and cardiovascular aspects in rats receiving a high saturated fat diet. Methods: Sixteen Wistar rats were used, distributed into two groups: a control (C) group, treated with an isocaloric diet (2.93 kcal/g), and an obese (OB) group, treated with a high-fat diet (3.64 kcal/g). The study period was 20 weeks. Analyses of nutritional behavior, body composition, glycemia, cholesterolemia, lipemia, systolic arterial pressure, echocardiography, and cardiac histology were performed. Results: The high-fat diet was associated with manifestations of obesity, accompanied by changes in glycemia, cardiomyocyte hypertrophy, and myocardial interstitial fibrosis. After adjusting for adiposity, the metabolic effects were normalized, whereas differences in morphometric changes between groups were maintained. Conclusion: Body adiposity has a stronger association with metabolic disturbances in obese rodents, whereas the high-fat dietary intervention is more related to cardiac morphological changes in experimental models of diet-induced obesity.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an endogenous growth model in which research activity is financed by intermediaries that are able to reduce the incidence of researchers' moral hazard. It is shown that financial activity is growth promoting because it increases research productivity. It is also found that a subsidy to the financial sector may have larger growth effects than a direct subsidy to research. Moreover, due to the presence of moral hazard, increasing the subsidy rate to R&D may reduce the growth rate. I show that there exists a negative relation between the financing of innovation and the process of capital accumulation. Concerning welfare, the presence of two externalities of opposite sign stemming from financial activity may cause the no-tax equilibrium to provide an inefficient level of financial services. Thus, policies oriented toward balancing the effects of the two externalities will be welfare improving.

Relevance:

30.00%

Publisher:

Abstract:

Report for the scientific sojourn carried out at the Model-based Systems and Qualitative Reasoning Group (Technical University of Munich) from September until December 2005. Constructed wetlands (CWs), or modified natural wetlands, are used all over the world as wastewater treatment systems for small communities because they can provide high treatment efficiency with low energy consumption and low construction, operation and maintenance costs. Their treatment process is very complex because it includes physical, chemical and biological mechanisms such as microorganism oxidation, microorganism reduction, filtration, sedimentation and chemical precipitation. Moreover, these processes can be influenced by different factors. In order to guarantee the performance of CWs, an operation and maintenance program must be defined for each Wastewater Treatment Plant (WWTP). The main objective of this project is to provide computer support for the definition of the most appropriate operation and maintenance protocols to guarantee the correct performance of CWs. To achieve this, the definition of models that represent knowledge about CWs has been proposed: the components involved in the sanitation process, the relations among these units, and the processes that remove pollutants. Horizontal subsurface flow CWs are chosen as a case study, and the filtration process is selected as the first process-modelling application. However, the goal is to represent the process knowledge in such a way that it can be reused for other types of WWTP.

Relevance:

30.00%

Publisher:

Abstract:

Social Accounting Matrices (SAMs) are normally used to analyse the income generation process. They are also useful, however, for analysing cost transmission and price formation mechanisms. On the price side, Roland-Holst and Sancho (1995) used the SAM structure to analyse price and cost linkages through a representation of the interdependence between activities, households and factors. This paper is a further analysis of the cost transmission mechanisms, in which I add the capital account to the endogenous components of the Roland-Holst and Sancho approach. By doing this I capture the responses of prices to exogenous shocks in savings and investment. I also present an additive decomposition of the global price effects into categories of interdependence that isolates the impact on price levels of shocks in the capital account. I use a 1994 Social Accounting Matrix for an empirical application to the Catalan economy. Keywords: social accounting matrix, cost linkages, price transmission, capital account. JEL Classification: C63, C69, D59.
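
A stylized numerical sketch of the dual (price) SAM logic discussed above is given below: exogenous unit-cost shocks are transmitted to the prices of the endogenous accounts through the transposed multiplier matrix. The 3x3 coefficient matrix and the shock vector are toy numbers, not the 1994 Catalan SAM used in the paper.

```python
# Toy cost-transmission (dual) calculation for a SAM price model.
import numpy as np

# Average expenditure shares of the endogenous accounts (each column sums to
# less than 1; the remainder is paid to exogenous accounts)
A = np.array([[0.20, 0.30, 0.10],
              [0.25, 0.10, 0.30],
              [0.15, 0.20, 0.05]])

# Exogenous unit-cost shock to each endogenous account (illustrative values)
c = np.array([0.05, 0.00, 0.02])

# Price (cost) multipliers: p' = c' (I - A)^(-1)
M = np.linalg.inv(np.eye(3) - A)
p = c @ M
print(np.round(p, 4))   # resulting proportional price changes by account
```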

Relevance:

30.00%

Publisher:

Abstract:

One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to imply. In particular, we postulate that runs of very large returns can be predictable over short time periods. To show this, we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that distinguish it from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way, the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with General Motors stock price data covering two crises that had a substantial impact on financial markets worldwide: Black Monday in October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second. JEL classification: C12; C15; C22; C51. Keywords and phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models.
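
The sketch below simulates a process in the spirit of the specification described above: a three-regime threshold AR(1) whose regime is selected by the previous period's standardized shock, with GARCH(1,1) errors. The parameter values and the threshold are invented for illustration and are not the paper's estimates.

```python
# Illustrative simulation of a threshold AR(1) with GARCH(1,1) errors, where
# the threshold variable is the previous period's standardized shock.
import numpy as np

rng = np.random.default_rng(0)
T = 1000
r = np.zeros(T); eps = np.zeros(T); h = np.full(T, 1e-4)

# AR coefficients per regime (large negative shock, central, large positive shock)
phi = {"low": -0.4, "mid": 0.05, "high": -0.3}
omega, alpha, beta = 1e-6, 0.08, 0.90      # GARCH(1,1) parameters (illustrative)
thr = 2.0                                   # threshold, in conditional std deviations

for t in range(1, T):
    h[t] = omega + alpha * eps[t-1]**2 + beta * h[t-1]
    z = eps[t-1] / np.sqrt(h[t-1])          # threshold variable: previous standardized shock
    regime = "low" if z < -thr else "high" if z > thr else "mid"
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    r[t] = phi[regime] * r[t-1] + eps[t]

print(r[:5])
```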

Relevance:

30.00%

Publisher:

Abstract:

The oxalate-carbonate pathway involves the oxidation of calcium oxalate to low-magnesium calcite and represents a potential long-term terrestrial sink for atmospheric CO2. In this pathway, bacterial oxalate degradation is associated with strong local alkalinization and subsequent carbonate precipitation. In order to test whether this process occurs in soil, the roles of bacteria, fungi and calcium oxalate amendments were studied using microcosms. In a model system with sterile soil amended with laboratory cultures of oxalotrophic bacteria and fungi, the addition of calcium oxalate induced a distinct pH shift and led to the final precipitation of calcite. However, the simultaneous presence of bacteria and fungi was essential to drive this pH shift. Growth of both oxalotrophic bacteria and fungi was confirmed by qPCR targeting the frc (oxalotrophic bacteria) and 16S rRNA genes and by quantification of ergosterol (active fungal biomass), respectively. The experiment was replicated in microcosms with non-sterilized soil. In this case, the bacterial and fungal contributions to oxalate degradation were evaluated by treatments with specific biocides (cycloheximide and bronopol). Results showed that the autochthonous microflora oxidized calcium oxalate and induced significant soil alkalinization. Moreover, the data confirmed the results from the model soil, showing that bacteria are essentially responsible for the pH shift but require the presence of fungi for their oxalotrophic activity. The combined results highlight that the interaction between bacteria and fungi is essential to drive metabolic processes in complex environments such as soil.

Relevance:

30.00%

Publisher:

Abstract:

HIV latency is a major obstacle to curing infection. Current strategies to eradicate HIV aim at increasing transcription of the latent provirus. In the present study we observed that latently infected CD4+ T cells from HIV-infected individuals failed to produce viral particles upon ex vivo exposure to SAHA (vorinostat), despite effective inhibition of histone deacetylases. To identify steps that were not susceptible to the action of SAHA or other latency-reversing agents, we used a primary CD4+ T cell model, joint host and viral RNA sequencing, and a virus-encoded reporter. This model served to investigate the characteristics of latently infected cells, the dynamics of HIV latency, and the process of reactivation induced by various stimuli. During latency, we observed persistence of viral transcripts but only limited viral translation. Similarly, the reactivating agents SAHA and disulfiram successfully increased viral transcription but failed to effectively enhance viral translation, mirroring the ex vivo data. This study highlights the importance of post-transcriptional blocks as one mechanism leading to HIV latency that needs to be relieved in order to purge the viral reservoir.

Relevance:

30.00%

Publisher:

Abstract:

In 1903, the eastern slope of Turtle Mountain (Alberta) was affected by a 30 million m³ rockslide, known as the Frank Slide, that resulted in more than 70 casualties. Assuming that the main discontinuity sets, including bedding, control part of the slope morphology, the structural features of Turtle Mountain were investigated using a digital elevation model (DEM). Using new landscape analysis techniques, we have identified three main joint and fault sets. These results are in agreement with the sets identified through field observations. Landscape analysis techniques using a DEM confirm and refine the most recent geological model of the Frank Slide. The rockslide was initiated along bedding and a fault at the base of the slope and propagated upslope by a regressive process following a surface composed of pre-existing discontinuities. The DEM analysis also permits the identification of important geological structures along the 1903 slide scar. Based on the so-called Sloping Local Base Level (SLBL), the presently unstable volumes were estimated in the main scar delimited by the cracks and around the southern area of the scar (South Peak). The SLBL is a method permitting a geometric interpretation of the failure surface based on a DEM. Finally, we propose a mechanism for the progressive failure of the rock mass that considers gently dipping wedges (30°). The prisms or wedges defined by two discontinuity sets permit the creation of a failure surface by progressive failure. Such structures are more commonly observed in recent rockslides. This method is efficient and is recommended as a preliminary analysis prior to field investigation.

Relevance:

30.00%

Publisher:

Abstract:

Animal models of infective endocarditis (IE) induced by high-grade bacteremia revealed the pathogenic roles of Staphylococcus aureus surface adhesins and platelet aggregation in the infection process. In humans, however, S. aureus IE possibly occurs through repeated bouts of low-grade bacteremia from a colonized site or intravenous device. Here we used a rat model of IE induced by continuous low-grade bacteremia to explore further the contributions of S. aureus virulence factors to the initiation of IE. Rats with aortic vegetations were inoculated by continuous intravenous infusion (0.0017 ml/min over 10 h) with 10^6 CFU of Lactococcus lactis pIL253 or a recombinant L. lactis strain expressing an individual S. aureus surface protein (ClfA, FnbpA, BCD, or SdrE) conferring a particular adhesive or platelet aggregation property. Vegetation infection was assessed 24 h later. Plasma was collected at 0, 2, and 6 h postinoculation to quantify the expression of tumor necrosis factor (TNF), interleukin 1α (IL-1α), IL-1β, IL-6, and IL-10. The percentage of vegetation infection relative to that with strain pIL253 (11%) increased when binding to fibrinogen was conferred on L. lactis (ClfA strain) (52%; P = 0.007) and increased further with adhesion to fibronectin (FnbpA strain) (75%; P < 0.001). Expression of fibronectin binding alone was not sufficient to induce IE (BCD strain) (10% of infection). Platelet aggregation increased the risk of vegetation infection (SdrE strain) (30%). Conferring adhesion to fibrinogen and fibronectin favored IL-1β and IL-6 production. Our results, with a model of IE induced by low-grade bacteremia, resembling human disease, extend the essential role of fibrinogen binding in the initiation of S. aureus IE. Triggering of platelet aggregation or an inflammatory response may contribute to or promote the development of IE.

Relevance:

30.00%

Publisher:

Abstract:

A preliminary study of the pharmacokinetic parameters of t-Butylaminoethyl disulfide was performed after administration of two different single doses (35 and 300 mg/kg) of either the cold or the labelled drug. Plasma or blood samples were treated with dithiothreitol and perchloric acid and, after filtration, submitted to further purification with an anionic resin. In the final step, the drug was retained on a cationic resin column, eluted with 1 M NaCl and detected according to the method of Ellman (1958). Alternatively, radioactive drug was detected by liquid scintillation counting. The results corresponding to the smaller dose of total drug suggested pharmacokinetic behavior consistent with a one-compartment open model with the following parameters: area under the intravenous curve (AUC i.v.): 671 ± 14; AUC oral: 150 ± 40 µg·min·ml⁻¹; elimination rate constant: 0.071 min⁻¹; biological half-life: 9.8 min; distribution volume: 0.74 ml/g. For the higher dose, the results seemed to obey a more complex, undetermined model. Combining the results, the occurrence of dose-dependent pharmacokinetic behavior is suggested, the drug being rapidly absorbed and rapidly eliminated, with the elimination process related mainly to metabolization. The drug seems to be more toxic when administered i.v. because by this route it escapes first-pass metabolism while being quickly distributed to tissues. The maximum tolerated blood level seems to be around 16 µg/ml.
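
Two of the reported low-dose figures can be cross-checked directly; the short sketch below recovers the biological half-life from the elimination rate constant and computes an approximate oral bioavailability from the AUC ratio. The bioavailability figure is an illustrative derived quantity, not a value reported in the study.

```python
# Consistency check of the reported one-compartment parameters.
import math

ke = 0.071                  # elimination rate constant, min^-1 (reported)
t_half = math.log(2) / ke   # ~9.8 min, matching the reported biological half-life
F = 150 / 671               # AUC_oral / AUC_iv ~ 0.22 (approximate oral bioavailability, derived)
print(round(t_half, 1), round(F, 2))
```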

Relevance:

30.00%

Publisher:

Abstract:

This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite the VARMA specification setting the RBC model a more difficult task, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
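
A hedged sketch of the error-modelling step is shown below: a VARMA(1,1) process is fitted to the residuals left unexplained by the core RBC model and compared, via log-likelihood, with a VAR(1) alternative. The residual series is simulated noise standing in for the actual model errors; this is not the authors' estimation code.

```python
# Sketch: fit VARMA(1,1) vs VAR(1) to (placeholder) RBC model residuals.
import numpy as np
from statsmodels.tsa.statespace.varmax import VARMAX

rng = np.random.default_rng(1)
resid = rng.standard_normal((200, 2))       # placeholder for the model's unexplained errors

varma = VARMAX(resid, order=(1, 1)).fit(disp=False)    # VARMA(1,1) error process
var   = VARMAX(resid, order=(1, 0)).fit(disp=False)    # VAR(1) alternative
print(varma.llf - var.llf)                  # log-likelihood gain from the MA terms
```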