881 results for Large-scale Analysis
Abstract:
Recent large-scale analyses of mainly full-length cDNA libraries generated from a variety of mouse tissues indicated that almost half of all representative cloned sequences did not contain an apparent protein-coding sequence, and were putatively derived from non-protein-coding RNA (ncRNA) genes. However, many of these clones were singletons and the majority were unspliced, raising the possibility that they may be derived from genomic DNA or unprocessed pre-mRNA contamination during library construction, or alternatively represent nonspecific transcriptional noise. Here we show, using reverse transcriptase-dependent PCR, microarray, and Northern blot analyses, that many of these clones were derived from genuine transcripts of unknown function whose expression appears to be regulated. The ncRNA transcripts have larger exons and fewer introns than protein-coding transcripts. Analysis of the genomic landscape around these sequences indicates that some cDNA clones were produced not from terminal poly(A) tracts but from internal priming sites within longer transcripts, only a minority of which are encompassed by known genes. A significant proportion of these transcripts exhibit tissue-specific expression patterns, as well as dynamic changes in their expression in macrophages following lipopolysaccharide stimulation. Taken together, the data provide strong support for the conclusion that ncRNAs are an important, regulated component of the mammalian transcriptome.
Abstract:
Large-scale gene discovery has been performed for the grass fungal endophytes Neotyphodium coenophialum, Neotyphodium lolii, and Epichloe festucae. The resulting sequences have been annotated by comparison with public DNA and protein sequence databases and using intermediate gene ontology annotation tools. Endophyte sequences have also been analysed for the presence of simple sequence repeat and single nucleotide polymorphism molecular genetic markers. Sequences and annotation are maintained within a MySQL database that may be queried using a custom web interface. Two cDNA-based microarrays have been generated from this genome resource. They permit the interrogation of 3806 Neotyphodium genes (Nchip (TM) microarray), and 4195 Neotyphodium and 920 Epichloe genes (EndoChip (TM) microarray), respectively. These microarrays provide tools for high-throughput transcriptome analysis, including genome-specific gene expression studies, profiling of novel endophyte genes, and investigation of the host grass-symbiont interaction. Comparative transcriptome analysis in Neotyphodium and Epichloe was performed. (c) 2006 Elsevier
Abstract:
Background: Changes in brain gene expression are thought to be responsible for the tolerance, dependence, and neurotoxicity produced by chronic alcohol abuse, but there has been no large scale study of gene expression in human alcoholism. Methods: RNA was extracted from postmortem samples of superior frontal cortex of alcoholics and nonalcoholics. Relative levels of RNA were determined by array techniques. We used both cDNA and oligonucleotide microarrays to provide coverage of a large number of genes and to allow cross-validation for those genes represented on both types of arrays. Results: Expression levels were determined for over 4000 genes and 163 of these were found to differ by 40% or more between alcoholics and nonalcoholics. Analysis of these changes revealed a selective reprogramming of gene expression in this brain region, particularly for myelin-related genes which were downregulated in the alcoholic samples. In addition, cell cycle genes and several neuronal genes were changed in expression. Conclusions: These gene expression changes suggest a mechanism for the loss of cerebral white matter in alcoholics as well as alterations that may lead to the neurotoxic actions of ethanol.
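A minimal sketch of the 40%-difference screen described above, with hypothetical gene names and expression values (not data from the study):

```python
def differs_by_40_percent(a, b):
    """Return True if expression levels a and b differ by 40% or more,
    i.e. the larger is at least 1.4 times the smaller."""
    lo, hi = sorted((a, b))
    return lo == 0 or hi / lo >= 1.4

# Hypothetical mean expression levels (alcoholic vs. control) per gene.
expression = {
    "MBP":   (55.0, 100.0),   # myelin-related, downregulated
    "CCND1": (150.0, 100.0),  # cell cycle, upregulated
    "ACTB":  (102.0, 100.0),  # housekeeping, essentially unchanged
}

changed = [g for g, (alc, ctrl) in expression.items()
           if differs_by_40_percent(alc, ctrl)]
print(changed)  # MBP (100/55 ≈ 1.82) and CCND1 (1.5) pass; ACTB (1.02) does not
```

Applied genome-wide, a filter of this shape would reduce the >4000 measured genes to the 163 reported as differentially expressed.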
Abstract:
Formal methods have significant benefits for developing safety critical systems, in that they allow for correctness proofs, model checking safety and liveness properties, deadlock checking, etc. However, formal methods do not scale very well and demand specialist skills, when developing real-world systems. For these reasons, development and analysis of large-scale safety critical systems will require effective integration of formal and informal methods. In this paper, we use such an integrative approach to automate Failure Modes and Effects Analysis (FMEA), a widely used system safety analysis technique, using a high-level graphical modelling notation (Behavior Trees) and model checking. We inject component failure modes into the Behavior Trees and translate the resulting Behavior Trees to SAL code. This enables us to model check if the system in the presence of these faults satisfies its safety properties, specified by temporal logic formulas. The benefit of this process is tool support that automates the tedious and error-prone aspects of FMEA.
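The fault-injection-and-check idea can be illustrated in miniature. The sketch below uses plain Python breadth-first state exploration on a toy tank controller instead of Behavior Trees, SAL, or temporal logic; the system, failure mode, and safety property are all hypothetical:

```python
from collections import deque

def reachable_violation(stuck_low, max_level=10):
    """Exhaustively explore the state space of a toy tank controller and
    return a state violating the safety property 'level <= 4', if any.
    The controller pumps while the sensor reads below the setpoint (3)."""
    SAFE_LIMIT = 4
    seen, queue = set(), deque([0])            # state = tank level
    while queue:
        level = queue.popleft()
        if level in seen:
            continue
        seen.add(level)
        if level > SAFE_LIMIT:
            return level                       # counterexample state
        reading = 0 if stuck_low else level    # injected failure mode
        nxt = min(level + 1, max_level) if reading < 3 else level
        if nxt not in seen:
            queue.append(nxt)
    return None

print(reachable_violation(stuck_low=False))  # None: nominal system is safe
print(reachable_violation(stuck_low=True))   # 5: failure mode violates the property
```

A model checker such as SAL performs the same exhaustive reachability analysis symbolically, over far larger state spaces and against temporal logic formulas rather than a single level bound.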
Abstract:
This paper presents a method to analyze the first-order eigenvalue sensitivity with respect to the operating parameters of a power system. The method is based on explicitly expressing the system state matrix in terms of sub-matrices. The eigenvalue sensitivity is calculated from the explicitly formed system state matrix. The 4th-order generator model and 4th-order exciter system model are used to form the system state matrix. A case study using the New England 10-machine 39-bus system is provided to demonstrate the effectiveness of the proposed method. The method can be applied to eigenvalue sensitivity analysis of large-scale power systems with respect to operating parameters.
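The standard first-order sensitivity formula that underlies such analyses, dλi/dp = ψiᴴ(∂A/∂p)φi / (ψiᴴφi) with right eigenvectors φ and left eigenvectors ψ, can be sketched as follows on a toy 2x2 state matrix (not the paper's 4th-order machine model):

```python
import numpy as np

def eigenvalue_sensitivity(A, dA_dp):
    """First-order sensitivity d(lambda_i)/dp = psi_i^H (dA/dp) phi_i / (psi_i^H phi_i)
    for each eigenvalue of the state matrix A, using right (phi) and
    left (psi) eigenvectors."""
    lam, Phi = np.linalg.eig(A)
    Psi = np.linalg.inv(Phi).conj().T          # columns are left eigenvectors
    sens = np.array([
        (Psi[:, i].conj() @ dA_dp @ Phi[:, i]) /
        (Psi[:, i].conj() @ Phi[:, i])
        for i in range(len(lam))
    ])
    return lam, sens

# Toy "state matrix" depending on a parameter p: A(p) = A0 + p * dA.
A0 = np.array([[-1.0, 0.5], [0.0, -2.0]])
dA = np.array([[1.0, 0.0], [0.0, 0.0]])        # dA/dp perturbs only a11
lam, sens = eigenvalue_sensitivity(A0, dA)
print(lam)   # eigenvalues -1 and -2
print(sens)  # perturbing a11 moves only the mode at -1 (sensitivity 1)

# Cross-check against a finite difference on the eigenvalues.
eps = 1e-6
lam_eps = np.linalg.eigvals(A0 + eps * dA)
print(np.sort(lam_eps.real) - np.sort(lam.real))  # approx. eps per matching mode
```

The same formula applies unchanged to the full system state matrix; the paper's contribution lies in forming that matrix and its parameter derivatives explicitly from the sub-matrices.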
Abstract:
This article explores consumer Web-search satisfaction. It commences with a brief overview of the concepts of consumer information search and consumer satisfaction. Consumer Web adoption issues are then briefly discussed and the importance of consumer search satisfaction is highlighted in relation to the adoption of the Web as an additional source of consumer information. Research hypotheses are developed and the methodology of a large-scale consumer experiment to record consumer Web-search behaviour is described. The hypotheses are tested and the data explored in relation to post-Web-search satisfaction. The results suggest that consumer post-Web-search satisfaction judgments may be derived from subconscious judgments of Web-search efficiency, an empirical calculation of which is problematic in unlimited information environments such as the Web. The results are discussed and a future research agenda is briefly outlined.
Abstract:
An inherent weakness in the management of large-scale projects is the failure to achieve the scheduled completion date. When projects are planned with the objective of on-time completion, the initial planning plays a vital role in the successful achievement of project deadlines. Cost and quality are additional priorities when such projects are being executed. This article proposes a methodology for achieving the planned duration of a project through risk analysis with the application of a Monte Carlo simulation technique. The methodology is demonstrated using a case application of a cross-country petroleum pipeline construction project.
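The Monte Carlo approach can be sketched as repeatedly sampling activity durations and accumulating them along the critical path. The activities and triangular-distribution parameters below are hypothetical, not taken from the case study:

```python
import random

# Hypothetical pipeline-project activities: (name, optimistic, most likely,
# pessimistic duration in days), executed sequentially for simplicity.
activities = [
    ("route survey",      20, 30, 50),
    ("pipe laying",       60, 80, 130),
    ("welding & testing", 25, 35, 60),
    ("commissioning",     10, 15, 30),
]

def simulate_duration(rng):
    """One Monte Carlo trial: sample each activity from a triangular
    distribution and sum along the (serial) critical path."""
    return sum(rng.triangular(lo, hi, mode) for _, lo, mode, hi in activities)

rng = random.Random(42)  # fixed seed for reproducibility
trials = sorted(simulate_duration(rng) for _ in range(10_000))

p50 = trials[len(trials) // 2]
p90 = trials[int(len(trials) * 0.9)]
print(f"median completion: {p50:.0f} days; 90th percentile: {p90:.0f} days")
```

Reading completion dates off the simulated distribution, rather than summing single-point estimates, is what lets the risk analysis attach a confidence level to the scheduled completion date.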
Abstract:
Rural electrification projects and programmes in many countries have suffered from design, planning, implementation and operational flaws as a result of ineffective project planning and lack of systematic project risk analysis. This paper presents a hierarchical risk-management framework for effectively managing large-scale development projects. The proposed framework first identifies, with the involvement of stakeholders, the risk factors for a rural electrification programme at three different levels (national, state and site). Subsequently it develops a qualitative risk-prioritising scheme through probability and severity mapping and provides mitigating measures for the most vulnerable risks. The study concludes that the hierarchical risk-management approach provides an effective framework for managing large-scale rural electrification programmes. © IAIA 2007.
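A probability-severity prioritising scheme of the kind described can be sketched as below; the risk register, levels, and 1-5 scales are hypothetical, and the paper's actual mapping may differ:

```python
# Hypothetical risk register for a rural electrification programme;
# probability and severity are scored on a 1-5 qualitative scale.
risks = [
    ("national", "policy reversal",       2, 5),
    ("state",    "funding delay",         4, 4),
    ("site",     "equipment theft",       3, 2),
    ("site",     "grid-connection delay", 4, 3),
]

def prioritise(risks):
    """Rank risks by a simple probability x severity score, highest first.
    This is one common qualitative mapping, used here for illustration."""
    return sorted(risks, key=lambda r: r[2] * r[3], reverse=True)

for level, name, p, s in prioritise(risks):
    print(f"{p * s:>2}  [{level}] {name}")
```

Mitigation effort is then concentrated on the risks at the top of the ranking, at whichever hierarchical level (national, state or site) they were identified.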
Abstract:
To determine the factors influencing the distribution of β-amyloid (Aβ) deposits in Alzheimer's disease (AD), the spatial patterns of the diffuse, primitive, and classic Aβ deposits were studied from the superior temporal gyrus (STG) to sector CA4 of the hippocampus in six sporadic cases of the disease. In cortical gyri and in the CA sectors of the hippocampus, the Aβ deposits were distributed either in clusters 200-6400 µm in diameter that were regularly distributed parallel to the tissue boundary or in larger clusters greater than 6400 µm in diameter. In some regions, smaller clusters of Aβ deposits were aggregated into larger 'superclusters'. In many cortical gyri, the density of Aβ deposits was positively correlated with distance below the gyral crest. In the majority of regions, clusters of the diffuse, primitive, and classic deposits were not spatially correlated with each other. In two cases, double immunolabelled to reveal the Aβ deposits and blood vessels, the classic Aβ deposits were clustered around the larger-diameter vessels. These results suggest a complex pattern of Aβ deposition in the temporal lobe in sporadic AD. A regular distribution of Aβ deposit clusters may reflect the degeneration of specific cortico-cortical and cortico-hippocampal pathways and the influence of the cerebral blood vessels. Large-scale clustering may reflect the aggregation of deposits in the depths of the sulci and the coalescence of smaller clusters.
Abstract:
Much research is currently centred on the detection of damage in structures using vibrational data. The work presented here examined several areas of interest in support of a practical technique for identifying and locating damage within bridge structures using apparent changes in their vibrational response to known excitation. The proposed goals of such a technique included the need for the measurement system to be operated on site by a minimum number of staff and that the procedure should be as non-invasive to the bridge traffic-flow as possible. Initially the research investigated changes in the vibrational bending characteristics of two series of large-scale model bridge-beams in the laboratory and these included ordinary-reinforced and post-tensioned, prestressed designs. Each beam was progressively damaged at predetermined positions and its vibrational response to impact excitation was analysed. For the load-regime utilised, the results suggested that the induced damage manifested itself as a function of the span of a beam rather than a localised area. A power-law relating apparent damage with the applied loading and prestress levels was then proposed, together with a qualitative vibrational measure of structural damage. In parallel with the laboratory experiments a series of tests were undertaken at the sites of a number of highway bridges. The bridges selected had differing types of construction and geometric design including composite-concrete, concrete slab-and-beam, concrete-slab with supporting steel-troughing constructions together with regular-rectangular, skewed and heavily-skewed geometries. Initial investigations were made of the feasibility and reliability of various methods of structure excitation including traffic and impulse methods. It was found that localised impact using a sledge-hammer was ideal for the purposes of this work and that a cartridge 'bolt-gun' could be used in some specific cases.
Abstract:
Dedicated short range communications (DSRC) has been regarded as one of the most promising technologies to provide robust communications for large-scale vehicle networks. It is designed to support both road safety and commercial applications. Road safety applications will require reliable and timely wireless communications. However, as the medium access control (MAC) layer of DSRC is based on the IEEE 802.11 distributed coordination function (DCF), it is well known that the random channel access based MAC cannot provide guaranteed quality of service (QoS). It is very important to understand the quantitative performance of DSRC in order to make better decisions on its adoption, control, adaptation, and improvement. In this paper, we propose an analytic model to evaluate DSRC-based inter-vehicle communication. We investigate the impacts of the channel access parameters associated with the different services, including arbitration inter-frame space (AIFS) and contention window (CW). Based on the proposed model, we analyze the successful message delivery ratio and channel service delay for broadcast messages. The proposed analytical model can provide a convenient tool to evaluate inter-vehicle safety applications and analyze the suitability of DSRC for road safety applications.
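As a rough illustration of why a random-access MAC cannot guarantee broadcast delivery, the sketch below uses a common slotted approximation with per-slot transmit probability tau = 2/(CW+1), a simplification rather than the paper's full analytic model:

```python
def broadcast_success_ratio(n_vehicles, cw):
    """Probability that a tagged broadcast transmission sees no concurrent
    transmission from the other n-1 vehicles in the same slot.
    Uses the common approximation tau = 2/(CW + 1) for the per-slot
    transmit probability; broadcasts are never retransmitted."""
    tau = 2.0 / (cw + 1)
    return (1.0 - tau) ** (n_vehicles - 1)

# Delivery ratio degrades with vehicle density and improves with larger CW.
for n in (10, 50, 100):
    print(n,
          round(broadcast_success_ratio(n, cw=15), 3),
          round(broadcast_success_ratio(n, cw=63), 3))
```

Even this crude model shows the core QoS trade-off the paper quantifies: enlarging CW raises the delivery ratio in dense networks but stretches the channel service delay.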
Abstract:
The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
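The pipeline described (compute speech features, select the most informative subset, map it to UPDRS) can be sketched with a simple correlation-based feature ranking; the feature names and values below are hypothetical, and the study's actual selection and regression algorithms are more sophisticated:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical dysphonia features per recording and clinician UPDRS scores.
features = {
    "jitter":  [0.4, 0.9, 1.3, 1.8, 2.2],
    "shimmer": [3.0, 2.9, 3.2, 3.1, 3.0],   # nearly constant: uninformative
}
updrs = [12, 20, 29, 38, 45]

# Feature selection: keep the feature most correlated with UPDRS; a
# regression model would then map the selected features to UPDRS scores.
best = max(features, key=lambda f: abs(pearson_r(features[f], updrs)))
print("selected feature:", best)
```

Robust feature selection over thousands of recordings serves the same purpose as this toy ranking: discarding signal-processing outputs that carry no information about symptom severity before fitting the regression.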
Abstract:
Biomass-To-Liquid (BTL) is one of the most promising low carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass, the so-called “second generation biofuels” that, unlike first generation biofuels, have the ability to make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil and they were characterised by different fuel synthesis processes including: Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method, was also carried out to examine how the uncertainty in the input parameters of the cost model could affect the output (i.e. production cost) of the model. This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. 
Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if government tax take was reduced by approximately 33% or a subsidy of £55/t dry biomass was available, transport biofuels could be competitive with conventional fuels. Large scale biofuel production may be possible in the long term through subsidies, fuels price rises and legislation.
Abstract:
Maize is the main staple food for most Kenyan households, and it predominates where smallholder, as well as large-scale, farming takes place. In the sugarcane-growing areas of Western Kenya, farmers face pressure over whether to grow food crops or sugarcane, the main cash crop. Further, with small and diminishing land sizes, the question of productivity and efficiency, both for cash and food crops, is of great importance. This paper therefore uses a two-step estimation technique (DEA meta-frontier and Tobit regression) to highlight the inefficiencies in maize cultivation, and their causes, in Western Kenya.
Abstract:
Aims - To build a population pharmacokinetic model that describes the apparent clearance of tacrolimus and the potential demographic, clinical and genetically controlled factors that could lead to inter-patient pharmacokinetic variability within children following liver transplantation. Methods - The present study retrospectively examined tacrolimus whole blood pre-dose concentrations (n = 628) of 43 children during their first year post-liver transplantation. Population pharmacokinetic analysis was performed using the non-linear mixed effects modelling program (NONMEM) to determine the population mean parameter estimate of clearance and influential covariates. Results - The final model identified time post-transplantation and the CYP3A5*1 allele as influential covariates on tacrolimus apparent clearance according to the following equation: TVCL = 12.9 x (Weight/13.2)^0.35 x EXP(-0.0058 x TPT) x EXP(0.428 x CYP3A5), where TVCL is the typical value for apparent clearance, TPT is time post-transplantation in days and CYP3A5 is 1 where the *1 allele is present and 0 otherwise. The population estimate and inter-individual variability (%CV) of tacrolimus apparent clearance were found to be 0.977 l h−1 kg−1 (95% CI 0.958, 0.996) and 40.0%, respectively, while the residual variability between the observed and predicted concentrations was 35.4%. Conclusion - Tacrolimus apparent clearance was influenced by time post-transplantation and CYP3A5 genotype. The results of this study, once confirmed by a large-scale prospective study, can be used in conjunction with therapeutic drug monitoring to recommend tacrolimus dose adjustments that take into account not only body weight but also genetic and time-related changes in tacrolimus clearance.
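The final covariate model quoted in the abstract translates directly into code (units as reported in the abstract; the function name is our own):

```python
from math import exp

def tacrolimus_apparent_clearance(weight_kg, days_post_transplant, cyp3a5_star1):
    """Typical value of tacrolimus apparent clearance from the abstract's
    final model:
        TVCL = 12.9 * (Weight/13.2)**0.35
                    * exp(-0.0058 * TPT) * exp(0.428 * CYP3A5)
    where CYP3A5 is 1 if the *1 allele is present, else 0."""
    cyp = 1 if cyp3a5_star1 else 0
    return (12.9 * (weight_kg / 13.2) ** 0.35
            * exp(-0.0058 * days_post_transplant)
            * exp(0.428 * cyp))

# At the reference point (13.2 kg child, day 0, no *1 allele) every factor
# is 1, so clearance reduces to the base value 12.9.
print(tacrolimus_apparent_clearance(13.2, 0, False))   # 12.9
print(tacrolimus_apparent_clearance(13.2, 0, True))    # 12.9 * e**0.428
```

The structure makes the covariate effects explicit: clearance decays slowly with time post-transplantation and is raised by a constant factor when the CYP3A5*1 allele is present.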