991 results for Capture methods


Relevance: 20.00%

Abstract:

The use of genome-scale metabolic models has been rapidly increasing in fields such as metabolic engineering. An important part of a metabolic model is the biomass equation, since this reaction ultimately determines the predictive capacity of the model in terms of essentiality and flux distributions. Thus, to obtain a reliable metabolic model, the biomass precursors and their coefficients must be as precise as possible. Ideally, the biomass composition would be determined experimentally, but when no experimental data are available it is established by approximation to closely related organisms. Computational methods, however, can extract some information from the genome, such as amino acid and nucleotide compositions. The main objectives of this study were to compare the biomass composition of several organisms and to evaluate how biomass precursor coefficients affect the predictability of several genome-scale metabolic models, by comparing predictions with experimental data from the literature. To that end, the biomass macromolecular composition was determined experimentally, and the amino acid composition was estimated both experimentally and computationally for several organisms. Sensitivity analysis studies were also performed with the Escherichia coli iAF1260 metabolic model concerning specific growth rates and flux distributions. The results suggest that the macromolecular composition is conserved among related organisms. In contrast, experimental amino acid composition data show no clear similarity among related organisms. It was also observed that the impact of the macromolecular composition on specific growth rates and flux distributions is larger than that of the amino acid composition, even when data from closely related organisms are used.
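
A minimal sketch of the kind of sensitivity analysis described above, using the open-source COBRApy package (an assumption; the abstract does not name its software). The model file name, the biomass reaction id and the perturbed metabolite id are placeholders to be adjusted to the actual iAF1260 distribution.

```python
import cobra

# Load a local copy of the iAF1260 model (hypothetical file name)
model = cobra.io.read_sbml_model("iAF1260.xml")
# Assumed BiGG-style id for the iAF1260 core biomass reaction
biomass = model.reactions.get_by_id("BIOMASS_Ec_iAF1260_core_59p81M")
base_growth = model.optimize().objective_value

# Perturb one biomass precursor coefficient (L-glutamate, assumed id)
met = model.metabolites.get_by_id("glu__L_c")
original = biomass.metabolites[met]
for scale in (0.8, 0.9, 1.1, 1.2):
    # combine=True adds to the current coefficient, so pass the difference
    biomass.add_metabolites({met: original * scale - original}, combine=True)
    growth = model.optimize().objective_value
    print(f"scale={scale:.1f}  growth={growth:.4f}  (base={base_growth:.4f})")
    # Restore the original coefficient before the next perturbation
    biomass.add_metabolites({met: original - biomass.metabolites[met]}, combine=True)
```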

Relevance: 20.00%

Abstract:

The suitability of a total-length-based minimum capture size and of different protection regimes was investigated for the gooseneck barnacle Pollicipes pollicipes shellfishery in northern Spain. For this analysis, individuals collected from 10 sites under different fishery protection regimes (permanently open, seasonally closed, and permanently closed) were used. First, we applied a non-parametric regression model to explore the relationship between the capitulum Rostro-Tergum (RT) size and the Total Length (TL). Important heteroskedastic disturbances were detected for this relationship, demonstrating a high variability of TL with respect to RT. This result substantiates, by means of a mathematical model, the unsuitability of a TL-based minimum size. Due to these disturbances, an alternative growth-based minimum capture size of 26.3 mm RT (23 mm RC) was estimated using the first derivative of a kernel-based non-parametric regression model for the relationship between RT and dry weight. For this purpose, data from the permanently protected area were used to avoid bias due to the fishery. Second, the size-frequency distribution similarity was computed using an MDS analysis for the studied sites to evaluate the effectiveness of the protection regimes. The results of this analysis indicated a positive effect of the permanent protection, while no effect of the seasonal closure was detected. This result needs to be interpreted with caution, because the current harvesting based on a potentially unsuitable minimum capture size may dampen the efficacy of the seasonal protection regime.
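
A minimal sketch of the growth-based estimation step described above: fit a kernel-based non-parametric regression of dry weight on RT size and inspect its first derivative. The data file and column names are placeholders, and the selection rule applied to the derivative is only a plausible stand-in for the authors' criterion.

```python
import numpy as np
import pandas as pd
from statsmodels.nonparametric.kernel_regression import KernelReg

df = pd.read_csv("barnacles_protected_site.csv")  # hypothetical data file
rt = df["rt_mm"].to_numpy()                       # capitulum RT size (mm)
weight = df["dry_weight_g"].to_numpy()            # dry weight (g)

kr = KernelReg(endog=weight, exog=rt, var_type="c")  # one continuous predictor
grid = np.linspace(rt.min(), rt.max(), 200)
mean, mfx = kr.fit(grid)                          # fitted mean and its derivative

# Plausible stand-in rule: the size at which marginal weight gain per mm peaks
candidate = grid[np.argmax(mfx[:, 0])]
print(f"candidate minimum capture size: {candidate:.1f} mm RT")
```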

Relevance: 20.00%

Abstract:

Here we focus on factor analysis from a best-practices point of view, investigating the factor structure of neuropsychological tests and using the results to illustrate how to choose a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the other to test the "best fit" model via confirmatory factor analysis (CFA). For the exploratory step, three extraction methods (maximum likelihood, principal axis factoring and principal components) and two rotation methods (orthogonal and oblique) were used. This methodology allowed us to explore how different cognitive/psychological tests correlated with and discriminated between dimensions, indicating that, to capture latent structures in similar sample sizes and measures with approximately normally distributed data, reflective models with oblimin rotation may prove the most adequate.
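
A minimal sketch of the exploratory step described above, using the third-party factor_analyzer package (an assumption; the abstract does not name its software). Maximum-likelihood extraction with oblimin rotation matches the solution the abstract recommends; the data file, column set and the number of factors are placeholders.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from factor_analyzer import FactorAnalyzer

data = pd.read_csv("neuropsych_scores.csv")        # hypothetical test scores
efa_half, cfa_half = train_test_split(data, test_size=0.5, random_state=0)

# Maximum-likelihood extraction with an oblique (oblimin) rotation
fa = FactorAnalyzer(n_factors=3, method="ml", rotation="oblimin")
fa.fit(efa_half)

loadings = pd.DataFrame(fa.loadings_, index=data.columns)
print(loadings.round(2))                           # inspect the pattern matrix
# cfa_half would then be used to test the retained model via CFA
# (e.g., in a dedicated SEM package).
```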

Relevance: 20.00%

Abstract:

"Series title: Springerbriefs in applied sciences and technology, ISSN 2191-530X"

Relevance: 20.00%

Abstract:

"Series title: Springerbriefs in applied sciences and technology, ISSN 2191-530X"

Relevance: 20.00%

Abstract:

Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity and diabetes. In recent years, the development of novel techniques for genome sequencing and other high-throughput methods, together with advances in Bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues in the human body, it is imperative to develop tissue-specific metabolic models. Methods to generate these models automatically, based on generic human metabolic models and a plethora of omics data, have been proposed. However, their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics and metabolomics data together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that the omics data sources overlap poorly and, in some cases, are even contradictory. Additionally, the generated hepatocyte metabolic models are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
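
A minimal sketch of the consistency check mentioned above: quantify how well gene sets derived from different omics sources agree, here with a simple Jaccard index. The input files, and the idea of reducing each source to a set of "active" genes, are assumptions for illustration.

```python
def jaccard(a: set, b: set) -> float:
    """Size of the intersection over the size of the union."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def load_gene_set(path: str) -> set:
    with open(path) as fh:
        return {line.strip() for line in fh if line.strip()}

sources = {
    "transcriptomics": load_gene_set("hepatocyte_transcriptomics.txt"),  # hypothetical
    "proteomics": load_gene_set("hepatocyte_proteomics.txt"),            # hypothetical
    "literature": load_gene_set("hepatocyte_literature.txt"),            # hypothetical
}

# Pairwise agreement between data sources
names = list(sources)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        print(f"{x} vs {y}: Jaccard = {jaccard(sources[x], sources[y]):.2f}")
```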

Relevance: 20.00%

Abstract:

This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, making it possible to attain routing configurations that are robust to changes in the traffic demands and that keep the network stable even in the presence of link failure events. The illustrative results presented clearly corroborate the usefulness of the proposed automated framework, along with the devised optimization methods.
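
A minimal sketch of the kind of fitness evaluation such a framework needs: given one candidate set of OSPF link weights, route all demands over shortest paths and report the worst link utilisation, both for the intact network and under every single-link failure. The topology, demand format and objective pair are placeholders; the real framework evaluates such objectives inside an MOEA, and a directed multigraph would be needed for a faithful OSPF model.

```python
import networkx as nx

def max_utilisation(g: nx.Graph, demands: dict) -> float:
    """Route each (src, dst) -> volume demand over shortest paths by weight."""
    load = {e: 0.0 for e in g.edges}
    for (s, t), volume in demands.items():
        path = nx.shortest_path(g, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            edge = (u, v) if (u, v) in load else (v, u)
            load[edge] += volume
    return max(load[e] / g.edges[e]["capacity"] for e in g.edges)

def evaluate(g: nx.Graph, demands: dict) -> tuple:
    """Return the two objectives: normal-state and worst single-failure congestion."""
    normal = max_utilisation(g, demands)
    worst_failure = 0.0
    for e in list(g.edges):
        h = g.copy()
        h.remove_edge(*e)
        if nx.is_connected(h):                 # skip partitioning failures for brevity
            worst_failure = max(worst_failure, max_utilisation(h, demands))
    return normal, worst_failure               # minimised jointly by the MOEA
```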

Relevance: 20.00%

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of core metabolic model; (7) generation of biomass composition reaction; (8) completion of draft metabolic model; (9) curation of metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
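
A minimal sketch of step 7 above: assembling a biomass composition reaction with COBRApy (an assumption; the chapter does not prescribe a specific toolkit). The metabolite ids and the coefficients (mmol/gDW) are illustrative placeholders, not a real biomass composition.

```python
from cobra import Metabolite, Model, Reaction

model = Model("draft_model")
protein = Metabolite("protein_c", compartment="c")
dna = Metabolite("dna_c", compartment="c")
rna = Metabolite("rna_c", compartment="c")
atp = Metabolite("atp_c", compartment="c")
adp = Metabolite("adp_c", compartment="c")
pi = Metabolite("pi_c", compartment="c")

biomass = Reaction("BIOMASS")
biomass.name = "Biomass composition reaction"
# Negative coefficients are consumed precursors; the ATP/ADP/Pi terms stand
# for growth-associated maintenance (the 40 is a placeholder value).
biomass.add_metabolites({
    protein: -0.55, dna: -0.03, rna: -0.20,   # macromolecular fractions
    atp: -40.0, adp: 40.0, pi: 40.0,          # growth-associated ATP cost
})
model.add_reactions([biomass])
model.objective = "BIOMASS"                   # maximise growth in FBA
```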

Relevance: 20.00%

Abstract:

"Series: Solid mechanics and its applications, vol. 226"

Relevance: 20.00%

Abstract:

OBJECTIVE: To assess the incidence of problems requiring reprogramming of atrioventricular pacemakers in a long-term follow-up, and the causes of this procedure. METHODS: During the period from May 1998 to December 1999, 657 patients were retrospectively studied. An actuarial curve was drawn for the event of stimulation-mode reprogramming. RESULTS: The follow-up period ranged from 12 to 178 months (mean = 81 months). Eighty-two (12.4%) patients underwent reprogramming of the stimulation mode, as follows: 63 (9.5%) changed to VVI,(R/C); 10 (1.5%) changed to DVI,C; 6 (0.9%) changed to VDD,C; and 3 (0.5%) changed to DOO. The causes of the reprogramming were as follows: arrhythmia conducted by the pacemaker in 39 (37.6%) patients; loss of atrial sensitivity or capture, or both, in 39 (38.6%) patients; and microfracture of the atrial electrode in 5 (4.9%) patients. The probability of remaining free from stimulation-mode reprogramming after 15 years was 58%. CONCLUSION: In a long-term follow-up, the atrioventricular pacemaker showed a low incidence of complications and a high probability of permanence in the DDD,C mode, and the most common cause of reprogramming was arrhythmia conducted by the pacemaker.
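
A minimal sketch of the actuarial (survival) analysis described above, using the third-party lifelines package (an assumption; the study does not name its software). Durations are months of follow-up and the event is stimulation-mode reprogramming; the data file and column names are placeholders.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("pacemaker_followup.csv")     # hypothetical follow-up data
kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["reprogrammed"])

# Estimated probability of remaining free from reprogramming at 15 years
print(kmf.predict(180))                        # 180 months = 15 years
```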

Relevance: 20.00%

Abstract:

"Series title: Computational methods in applied sciences, ISSN1871-3033, vol. 42"

Relevance: 20.00%

Abstract:

OBJECTIVE: This study was performed to observe the number of pacemakers that had never been reprogrammed after implantation, and the effect of optimised output programming on the estimated longevity of pulse generators in patients with pacemakers. METHODS: Sixty patients with Telectronics Reflex pacemakers were evaluated in a pacemaker clinic, from the beginning of its activities in June 1998 until March 1999. Telemetry was performed during the first clinic visit, and we observed how many pulse generators retained the manufacturer's nominal output settings, indicating the absence of reprogramming until that date. After evaluation of the capture threshold, the pacemakers were reprogrammed with a safety margin of 2 to 2.5:1, and we compared the estimated longevity based on battery current at the manufacturer's settings with that based on the settings achieved after reprogramming. RESULTS: In 95% of the cases, the original programmed setting had never been reprogrammed before the patients attended the pacemaker clinic. Reprogramming the pacemaker prolonged the estimated pulse generator life by 19.7±15.6 months (35.5%). CONCLUSION: The majority of the pacemakers evaluated had never been reprogrammed. Estimated pulse generator longevity can be prolonged significantly using this simple, safe, efficacious, and cost-effective procedure.
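
A minimal sketch of the longevity arithmetic implied above: an estimated pulse-generator lifetime proportional to battery capacity over current drain, so that lowering the output settings lowers the drain and extends the estimate. The capacity and current values are illustrative, not taken from the study.

```python
HOURS_PER_YEAR = 8766          # average year, including leap days

def longevity_years(capacity_mah: float, drain_ua: float) -> float:
    """Battery capacity (mAh) divided by current drain (uA converted to mA)."""
    return capacity_mah / (drain_ua / 1000.0) / HOURS_PER_YEAR

before = longevity_years(capacity_mah=1800, drain_ua=28)   # nominal settings
after = longevity_years(capacity_mah=1800, drain_ua=21)    # reduced output
print(f"{(after - before) * 12:.1f} extra months ({after / before - 1:.0%})")
```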

Relevance: 20.00%

Abstract:

OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method for blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques: 4 by the indirect method and 4 with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached classification B for systolic and A for diastolic blood pressure, according to one of these protocols. The mean differences between blood pressure values obtained with the two methods were -2.3 ± 7.9 mmHg for systolic and 0.97 ± 5.5 mmHg for diastolic blood pressure. Therefore, we considered this type of device approved according to the selected criteria. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
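
A minimal sketch of the grading logic behind validation protocols of this kind, such as the British Hypertension Society's, which grade a device by the share of device-vs-reference differences falling within 5, 10 and 15 mmHg. The thresholds below follow the commonly cited BHS table and are stated as an assumption, not a quotation of the study's protocol.

```python
import numpy as np

# Cumulative % of absolute differences required within 5 / 10 / 15 mmHg
BHS_GRADES = {"A": (60, 85, 95), "B": (50, 75, 90), "C": (40, 65, 85)}

def bhs_grade(device: np.ndarray, reference: np.ndarray) -> str:
    """Return the best BHS grade whose three thresholds are all met."""
    diff = np.abs(device - reference)
    within = tuple(100.0 * np.mean(diff <= t) for t in (5, 10, 15))
    for grade, minima in BHS_GRADES.items():   # checked best grade first
        if all(w >= m for w, m in zip(within, minima)):
            return grade
    return "D"
```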

Relevance: 20.00%

Abstract:

In recent decades, increased interest has been evidenced in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scales. This chapter presents the applicability of statistical and probabilistic methods, such as the maximum likelihood method and Bayesian methods, to the representation of timber's mechanical properties and to their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness and strength, and can hierarchically consider information obtained through different non-destructive, semi-destructive or destructive tests. The bases and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
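
A minimal sketch of a Bayesian update of the kind described above: a normal prior on mean bending strength (for example, from a coarser scale of the hierarchy or a strength-class datasheet) combined with destructive test results via the conjugate normal-normal model. All numbers are illustrative.

```python
import numpy as np

# Prior belief about the mean bending strength (MPa)
prior_mean, prior_sd = 40.0, 5.0
sigma = 6.0                                # assumed known test scatter (std dev)

tests = np.array([35.2, 41.8, 38.5, 44.0, 37.1])   # placeholder test results
n, xbar = len(tests), tests.mean()

# Conjugate update: precision-weighted average of prior mean and sample mean
post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma**2)
post_mean = post_var * (prior_mean / prior_sd**2 + n * xbar / sigma**2)
print(f"posterior mean = {post_mean:.1f} MPa, sd = {post_var**0.5:.2f} MPa")
```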