872 results for Large-scale experiments


Relevance: 90.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 90.00%

Abstract:

The effects of harvesting of callianassid shrimp (Trypaea australiensis) on the abundance and composition of macrobenthic assemblages in unvegetated sediments of a subtropical coastal embayment in Queensland, Australia, were examined using a combination of sampling and manipulative experiments. First, the abundance and composition of the benthic infauna in an area regularly used for the collection of shrimp for bait by recreational anglers was compared with multiple reference areas. Second, a BACI design, with multiple reference areas, was used to examine the short-term effects of harvesting on the benthic assemblages from an intensive commercialised fishing competition. Third, a large-scale, controlled manipulative experiment, where shrimp were harvested from 10,000 m² plots at intensities commensurate with those from recreational and commercial operators, was done to determine the impacts on different components of the infaunal assemblage. Only a few benthic taxa showed significant declines in abundance in response to the removal of ghost shrimp from the unvegetated sediments. There was evidence, however, of more subtle effects, with changes in the degree of spatial variation (patchiness) of several taxa as a result of harvesting. Groups such as capitellid polychaetes, gammarid amphipods and some bivalves were significantly more patchy in their distribution in areas subjected to harvesting than in reference areas, at a scale of tens of metres. This scale corresponds to the patterns of movement and activity of recreational harvesters working in these areas. In contrast, patchiness in the abundance of ghost shrimp decreased significantly under harvesting at scales of hundreds of metres, in response to harvesters focussing their efforts on areas with greater numbers of burrow entrances, leading to a more even distribution of the animals. 
Controlled experimental harvesting caused declines in the abundance of soldier crabs (Mictyris longicarpus), polychaetes and amphipods and an increase in the spatial patchiness of polychaetes. Populations of ghost shrimp were, however, resilient to harvesting over extended periods of time. In conclusion, harvesting of ghost shrimp for bait by recreational and commercial fishers causes significant but localised impacts on a limited range of benthic fauna in unvegetated sediments, including changes in the degree of spatial patchiness in their distribution. (c) 2005 Elsevier B.V. All rights reserved.

Relevance: 90.00%

Abstract:

Recent large-scale analyses of mainly full-length cDNA libraries generated from a variety of mouse tissues indicated that almost half of all representative cloned sequences did not contain an apparent protein-coding sequence, and were putatively derived from non-protein-coding RNA (ncRNA) genes. However, many of these clones were singletons and the majority were unspliced, raising the possibility that they may be derived from genomic DNA or unprocessed pre-mRNA contamination during library construction, or alternatively represent nonspecific transcriptional noise. Here we show, using reverse transcriptase-dependent PCR, microarray, and Northern blot analyses, that many of these clones were derived from genuine transcripts of unknown function whose expression appears to be regulated. The ncRNA transcripts have larger exons and fewer introns than protein-coding transcripts. Analysis of the genomic landscape around these sequences indicates that some cDNA clones were produced not from terminal poly(A) tracts but from internal priming sites within longer transcripts, only a minority of which are encompassed by known genes. A significant proportion of these transcripts exhibit tissue-specific expression patterns, as well as dynamic changes in their expression in macrophages following lipopolysaccharide stimulation. Taken together, the data provide strong support for the conclusion that ncRNAs are an important, regulated component of the mammalian transcriptome.

Relevance: 90.00%

Abstract:

On a global scale, basalts from mid-ocean ridges are strikingly more homogeneous than basalts from intraplate volcanism. The observed geochemical heterogeneity argues strongly for the existence of distinct reservoirs in the Earth's mantle. It is an unresolved problem of geodynamics how these findings can be reconciled with large-scale convection. We review observational constraints and investigate the stirring properties of numerical models of mantle convection. Conditions in the early Earth may have supported layered convection with rapid stirring in the upper layers. Material that has been altered near the surface is transported downwards by small-scale convection. Thereby a layer of homogeneous depleted material develops above pristine mantle. As the mantle cools over Earth history, the effects leading to layering become reduced and the models show the large-scale convection favoured for the Earth today. Laterally averaged, the upper mantle below the lithosphere is least affected by material that has experienced near-surface differentiation. The geochemical signature obtained during the previous episode of small-scale convection may be preserved there for the longest time. Additionally, stirring is less effective in the high-viscosity layer of the central lower mantle [1, 2], supporting the survival of medium-scale heterogeneities there. These models are the first, using 3-D spherical geometry and mostly Earth-like parameters, to address the suggested change of convective style. Although the models are still far from reproducing our planet, we find that this proposal might be helpful towards reconciling geochemical and geophysical constraints.

Relevance: 90.00%

Abstract:

In empirical studies of evolutionary algorithms, it is usually desirable to evaluate and compare algorithms using as many different parameter settings and test problems as possible, in order to have a clear and detailed picture of their performance. Unfortunately, the total number of experiments required may be very large, which often makes such research work computationally prohibitive. In this paper, the application of a statistical method called racing is proposed as a general-purpose tool to reduce the computational requirements of large-scale experimental studies in evolutionary algorithms. Experimental results are presented that show that racing typically requires only a small fraction of the cost of an exhaustive experimental study.
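The racing idea described above can be illustrated with a minimal, hypothetical sketch (not the paper's actual procedure): candidates are evaluated problem by problem, and any candidate whose running mean cost is significantly worse than the current leader's is dropped early, saving the remaining evaluations.

```python
import random
import statistics

def race(candidates, problems, min_trials=5, z=2.0):
    """Evaluate candidates problem by problem; after min_trials, drop any
    candidate whose mean cost is worse than the leader's by more than z
    standard errors (lower cost is better)."""
    scores = {c: [] for c in candidates}
    alive = list(candidates)
    for trial, problem in enumerate(problems, start=1):
        for c in alive:
            scores[c].append(c(problem))
        if trial < min_trials or len(alive) == 1:
            continue
        means = {c: statistics.mean(scores[c]) for c in alive}
        errs = {c: statistics.stdev(scores[c]) / trial ** 0.5 for c in alive}
        leader = min(alive, key=means.get)
        # keep a candidate only if its interval still overlaps the leader's
        alive = [c for c in alive
                 if means[c] - z * errs[c] <= means[leader] + z * errs[leader]]
    return min(alive, key=lambda c: statistics.mean(scores[c]))
```

Here a candidate is any callable mapping a problem instance to a cost; clearly inferior configurations are eliminated after a handful of trials rather than being run on every test problem.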

Relevance: 90.00%

Abstract:

As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers into further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.

Relevance: 90.00%

Abstract:

Rural electrification projects and programmes in many countries have suffered from design, planning, implementation and operational flaws as a result of ineffective project planning and lack of systematic project risk analysis. This paper presents a hierarchical risk-management framework for effectively managing large-scale development projects. The proposed framework first identifies, with the involvement of stakeholders, the risk factors for a rural electrification programme at three different levels (national, state and site). Subsequently it develops a qualitative risk-prioritising scheme through probability and severity mapping and provides mitigating measures for the most vulnerable risks. The study concludes that the hierarchical risk-management approach provides an effective framework for managing large-scale rural electrification programmes. © IAIA 2007.
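A probability-and-severity mapping of the kind described above is commonly reduced to a product score; a minimal sketch follows, in which the 1-5 scales and the example risks are invented for illustration and are not taken from the paper.

```python
# Hypothetical qualitative risk-prioritising sketch: each risk is rated for
# probability and severity on a 1-5 scale, and their product ranks the risks.
def prioritise(risks):
    """risks: list of (name, probability, severity) tuples, ratings 1-5.
    Returns the risks sorted from most to least critical."""
    return sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

# Invented national-level risks for a rural electrification programme:
national = [
    ("policy reversal", 2, 5),        # score 10
    ("currency devaluation", 3, 3),   # score 9
    ("fuel price shock", 4, 4),       # score 16
]
ranked = prioritise(national)
```

The same scoring can be applied independently at the state and site levels, with mitigation effort directed at the risks that surface at the top of each list.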

Relevance: 90.00%

Abstract:

Genome sequences from many organisms, including humans, have been completed, and high-throughput analyses have produced burgeoning volumes of 'omics' data. Bioinformatics is crucial for the management and analysis of such data and is increasingly used to accelerate progress in a wide variety of large-scale and object-specific functional analyses. Refined algorithms enable biotechnologists to follow 'computer-aided strategies' based on experiments driven by high-confidence predictions. In order to address compound problems, current efforts in immuno-informatics and reverse vaccinology are aimed at developing and tuning integrative approaches and user-friendly, automated bioinformatics environments. This will herald a move to 'computer-aided biotechnology': smart projects in which time-consuming and expensive large-scale experimental approaches are progressively replaced by prediction-driven investigations.

Relevance: 90.00%

Abstract:

The focus of this research was defined by a poorly characterised filtration train employed to clarify culture broth containing monoclonal antibodies secreted by GS-NSO cells: the filtration train blinded unpredictably and the ability of the positively charged filters to adsorb DNA from process material was unknown. To direct the development of an assay to quantify the ability of depth filters to adsorb DNA, the molecular weight of DNA from a large-scale, fed-batch, mammalian cell culture vessel was evaluated as process material passed through the initial stages of the purification scheme. High molecular weight DNA was substantially cleared from the broth after passage through a disc stack centrifuge, and the remaining low molecular weight DNA was largely unaffected by passage through a series of depth filters and a sterilising grade membrane. Removal of high molecular weight DNA was shown to be coupled with clarification of the process stream. The DNA from cell culture supernatant showed a pattern of internucleosomal cleavage of chromatin when fractionated by electrophoresis, but the presence of both necrotic and apoptotic cells throughout the fermentation meant that the origin of the fragmented DNA could not be unequivocally determined. An intercalating fluorochrome, PicoGreen, was selected for development of a suitable DNA assay because of its ability to respond to low molecular weight DNA. It was assessed for its ability to determine the concentration of DNA in clarified mammalian cell culture broths containing pertinent monoclonal antibodies. Fluorescent signal suppression was ameliorated by sample dilution or by performing the assay above the pI of secreted IgG. The source of fluorescence in clarified culture broth was validated by incubation with RNase A and DNase I. At least 89.0% of fluorescence was attributable to nucleic acid, and pre-digestion with RNase A was shown to be a requirement for successful quantification of DNA in such samples. 
Application of the fluorescence-based assay resulted in characterisation of the physical parameters governing adsorption of DNA by various positively charged depth filters and membranes in test solutions, and of the DNA adsorption profile of the manufacturing-scale filtration train. Buffers that reduced or neutralised the depth filter or membrane charge, and those that impeded hydrophobic interactions, were shown to affect their operational capacity, demonstrating that DNA was adsorbed by a combination of electrostatic and hydrophobic interactions. Production-scale centrifugation of harvest broth containing therapeutic protein resulted in the reduction of total DNA in the process stream from 79.8 μg ml⁻¹ to 9.3 μg ml⁻¹, whereas the concentration of DNA in the supernatant of pre- and post-filtration samples was only marginally reduced: from 6.3 to 6.0 μg ml⁻¹ respectively. Hence the filtration train was shown to be ineffective in DNA removal. Historically, blinding of the depth filters had been unpredictable, with data such as numbers of viable cells, non-viable cells, product titre, or process shape (batch, fed-batch, or draw and fill) failing to inform on the durability of depth filters in the harvest step. To investigate this, key fouling contaminants were identified by challenging depth filters with the same mass of one of the following: viable healthy cells, cells that had died by the process of apoptosis, and cells that had died through the process of necrosis. The pressure increase across a Cuno Zeta Plus 10SP depth filter was 2.8 and 16.5 times more sensitive to debris from apoptotic and necrotic cells respectively, when compared to viable cells. The condition of DNA released into the culture broth was assessed. Necrotic cells released predominantly high molecular weight DNA, in contrast to apoptotic cells, which released chiefly low molecular weight DNA. 
The blinding of the filters was found to be largely unaffected by variations in the particle size distribution of material in, and viscosity of, the solutions with which they were challenged. The exceptional response of the depth filters to necrotic cells may suggest the cause of the previously noted unpredictable filter blinding, whereby a number of necrotic cells have a more significant impact on the life of a depth filter than a similar number of viable or apoptotic cells. In a final set of experiments the pressure drop caused by non-viable necrotic culture broths which had been treated with DNase I or benzonase was found to be smaller when compared to untreated broths: the abilities of the enzyme-treated cultures to foul the depth filter were reduced by 70.4% and 75.4% respectively, indicating the importance of DNA in the blinding of the depth filter studied.
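A fluorochrome-based quantification of the kind described above ultimately rests on a linear calibration of fluorescence against DNA standards. The sketch below uses invented standard concentrations and readings, not the thesis's data, to show the inversion step.

```python
import numpy as np

# Hypothetical PicoGreen-style calibration: fluorescence rises approximately
# linearly with DNA concentration, so a least-squares line maps a measured
# signal back to an estimated concentration.
standards_ng_ml = np.array([0.0, 25.0, 50.0, 100.0, 200.0])   # assumed standards
fluorescence = np.array([50.0, 540.0, 1055.0, 2040.0, 4070.0])  # made-up readings

slope, intercept = np.polyfit(standards_ng_ml, fluorescence, 1)

def dna_concentration(signal):
    """Invert the calibration line to estimate DNA concentration (ng/ml)."""
    return (signal - intercept) / slope
```

Sample signals falling outside the standards' range would, in practice, be diluted back into it, which is also how the thesis handled fluorescent signal suppression.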

Relevance: 90.00%

Abstract:

Much research is currently centred on the detection of damage in structures using vibrational data. The work presented here examined several areas of interest in support of a practical technique for identifying and locating damage within bridge structures using apparent changes in their vibrational response to known excitation. The proposed goals of such a technique included the need for the measurement system to be operated on site by a minimum number of staff and that the procedure should be as non-invasive to the bridge traffic-flow as possible. Initially the research investigated changes in the vibrational bending characteristics of two series of large-scale model bridge-beams in the laboratory, and these included ordinary-reinforced and post-tensioned, prestressed designs. Each beam was progressively damaged at predetermined positions and its vibrational response to impact excitation was analysed. For the load-regime utilised, the results suggested that the induced damage manifested itself as a function of the span of a beam rather than a localised area. A power-law relating apparent damage to the applied loading and prestress levels was then proposed, together with a qualitative vibrational measure of structural damage. In parallel with the laboratory experiments a series of tests were undertaken at the sites of a number of highway bridges. The bridges selected had differing types of construction and geometric design, including composite-concrete, concrete slab-and-beam, and concrete-slab with supporting steel-troughing constructions, together with regular-rectangular, skewed and heavily-skewed geometries. Initial investigations were made of the feasibility and reliability of various methods of structure excitation, including traffic and impulse methods. It was found that localised impact using a sledge-hammer was ideal for the purposes of this work and that a cartridge 'bolt-gun' could be used in some specific cases.

Relevance: 90.00%

Abstract:

This thesis investigates the soil-pipeline interactions associated with the operation of large-diameter chilled gas pipelines in Britain: frost/pipe heave and ground cracking. The investigation was biased towards the definition of the mechanism of ground cracking and the parameters which influence its generation and subsequent development, especially its interaction with frost heave. The study involved a literature review, a questionnaire, a large-scale test and small-scale laboratory model experiments. The literature review concentrated on soil-pipeline interactions and frost action; frost/pipe heave was often reported but ground cracking was seldom reported. A questionnaire was circulated within British Gas to gain further information on these interactions. The replies indicated that if frost/pipe heave was reported, ground cracking was also likely to be observed. These soil-pipeline interactions were recorded along 19% of pipelines in the survey and were more likely along the larger diameter, higher flow pipelines. A large-scale trial along a 900 mm pipeline was undertaken to assess the soil thermal, hydraulic and stress regimes, together with pipe and ground movements. Results indicated that cracking occurred intermittently along the pipeline during periods of rapid frost/pipe heave and ground movement and that frozen annulus growth produced a ground surface profile that was approximated by a normal probability distribution curve. This curve indicates maximum tensile strain directly over the pipe centre. Finally a small-scale laboratory model was operated to further define the ground cracking mechanism. Ground cracking was observed at small upward ground surface movement, and with continued movement the ground crack increased in width and depth. At the end of the experiments internal soil failure planes slanting upwards and away from the frozen annulus were noted. 
The suggested mechanism for ground cracking involved frozen annulus growth producing tensile strain in the overlying unfrozen soil, which when sufficient produced a crack.
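The link between a normal-curve heave profile and maximum tensile strain over the pipe centre can be checked numerically: surface bending strain scales with the curvature of the profile, which for a Gaussian is most negative (hogging) at the centreline. The amplitude and spread below are assumed values for illustration, not measurements from the trial.

```python
import numpy as np

# Illustrative Gaussian heave profile w(x) = A * exp(-x**2 / (2 * sigma**2));
# A (peak heave) and sigma (spread) are assumptions, not trial data.
A, sigma = 0.05, 2.0                      # 50 mm peak heave, 2 m spread
x = np.linspace(-10.0, 10.0, 2001)        # transverse distance from pipe centre (m)
w = A * np.exp(-x**2 / (2 * sigma**2))

# Surface bending (tensile) strain scales with curvature w''(x); a numerical
# second derivative locates where hogging, and hence tension, peaks.
curvature = np.gradient(np.gradient(w, x), x)
peak_tension_x = x[np.argmin(curvature)]  # most negative curvature = pipe centreline
```

The most negative curvature falls at x = 0, directly over the pipe, consistent with the observation that cracks open above the pipe centre.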

Relevance: 90.00%

Abstract:

When composing stock portfolios, managers frequently choose among hundreds of stocks. The stocks' risk properties are analyzed with statistical tools, and managers try to combine these to meet the investors' risk profiles. A recently developed tool for performing such optimization is called full-scale optimization (FSO). This methodology is very flexible for investor preferences, but because of computational limitations it has until now been infeasible to use when many stocks are considered. We apply the artificial intelligence technique of differential evolution to solve FSO-type stock selection problems of 97 assets. Differential evolution finds the optimal solutions by self-learning from randomly drawn candidate solutions. We show that this search technique makes large-scale problems computationally feasible and that the solutions retrieved are stable. The study also gives further merit to the FSO technique, as it shows that the solutions suit investor risk profiles better than portfolios retrieved from traditional methods.
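The combination described above can be sketched with a minimal DE/rand/1/bin implementation applied to an FSO-style objective. Everything here is illustrative: the return sample is synthetic, the portfolio is scaled down from 97 to 4 assets, and the power-utility objective with gamma = 3 is an assumed investor preference, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy return history for 4 hypothetical assets (rows are periods).
returns = rng.normal(0.01, 0.05, size=(250, 4))

def utility(weights):
    """FSO-style objective: score weights on the empirical return sample
    directly, here with power (CRRA) utility and an assumed gamma of 3."""
    w = np.abs(weights) / np.abs(weights).sum()   # long-only, weights sum to 1
    wealth = 1.0 + returns @ w
    gamma = 3.0
    return np.mean(wealth ** (1 - gamma) / (1 - gamma))

def differential_evolution(f, dim, pop=30, gens=200, F=0.8, CR=0.9):
    """Minimal DE/rand/1/bin: mutate random triples of candidates, cross over
    with the current individual, and keep the trial if it improves f."""
    X = rng.random((pop, dim))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            a, b, c = X[rng.choice(pop, size=3, replace=False)]
            mutant = a + F * (b - c)
            trial = np.where(rng.random(dim) < CR, mutant, X[i])
            f_trial = f(trial)
            if f_trial > fit[i]:                  # maximise utility
                X[i], fit[i] = trial, f_trial
    return X[np.argmax(fit)]

best = differential_evolution(utility, dim=4)
best_weights = np.abs(best) / np.abs(best).sum()
```

Because FSO evaluates the full empirical distribution rather than summary statistics, the objective can encode any investor preference; DE only ever needs the ability to compare two candidate weight vectors.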

Relevance: 90.00%

Abstract:

Sentiment analysis, or opinion mining, aims to use automated tools to detect subjective information such as opinions, attitudes, and feelings expressed in text. This paper proposes a novel probabilistic modeling framework called the joint sentiment-topic (JST) model, based on latent Dirichlet allocation (LDA), which detects sentiment and topic simultaneously from text. A reparameterized version of the JST model called Reverse-JST, obtained by reversing the sequence of sentiment and topic generation in the modeling process, is also studied. Although JST is equivalent to Reverse-JST without a hierarchical prior, extensive experiments show that when sentiment priors are added, JST performs consistently better than Reverse-JST. Moreover, unlike supervised approaches to sentiment classification, which often fail to produce satisfactory performance when shifting to other domains, the weakly supervised nature of JST makes it highly portable to other domains. This is verified by the experimental results on data sets from five different domains, where the JST model even outperforms existing semi-supervised approaches in some of the data sets despite using no labeled documents. Moreover, the topics and topic sentiment detected by JST are indeed coherent and informative. We hypothesize that the JST model can readily meet the demand of large-scale sentiment analysis from the web in an open-ended fashion.
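The "sentiment then topic" generation order that distinguishes JST from Reverse-JST can be made concrete by sampling from the generative story. The sketch below uses tiny, invented dimensions and draws each token's sentiment label first, then a topic conditioned on it, then a word; it illustrates the model structure only, not the paper's inference procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions, purely illustrative: S sentiment labels, T topics per
# sentiment, V vocabulary words, one document of doc_len tokens.
S, T, V, doc_len = 2, 3, 20, 50

pi = rng.dirichlet(np.ones(S))                # document's sentiment mixture
theta = rng.dirichlet(np.ones(T), size=S)     # topic mixture per sentiment
phi = rng.dirichlet(np.ones(V), size=(S, T))  # word distribution per (sentiment, topic)

def generate_document():
    """JST's generative story: for each token, draw a sentiment label l ~ pi,
    then a topic z ~ theta[l] conditioned on l, then a word w ~ phi[l, z]."""
    tokens = []
    for _ in range(doc_len):
        l = rng.choice(S, p=pi)
        z = rng.choice(T, p=theta[l])
        w = rng.choice(V, p=phi[l, z])
        tokens.append((l, z, w))
    return tokens

doc = generate_document()
```

Reverse-JST would simply swap the first two draws (topic first, then sentiment conditioned on it); the sentiment priors the paper adds act on the Dirichlet parameters behind phi.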

Relevance: 90.00%

Abstract:

This study presents a computational fluid dynamic (CFD) study of Dimethyl Ether (DME) gas adsorptive separation and steam reforming (DME-SR) in a large scale Circulating Fluidized Bed (CFB) reactor. The CFD model is based on Eulerian-Eulerian dispersed flow and solved using commercial software (ANSYS FLUENT). Hydrogen is currently receiving increasing interest as an alternative source of clean energy and has high potential applications, including the transportation sector and power generation. CFD modelling has attracted considerable recognition in the engineering sector, leading to its use as a tool for process design and optimisation in many industrial processes. In most cases, these processes are difficult or expensive to study in lab-scale experiments. CFD provides a cost-effective methodology to gain detailed information down to the microscopic level. The main objectives in this project are to: (i) develop a predictive model using the ANSYS FLUENT (CFD) commercial code to simulate the flow hydrodynamics, mass transfer, reactions and heat transfer in a large scale dual fluidized bed system for combined gas separation and steam reforming processes; (ii) implement a suitable adsorption model in the CFD code, through a user defined function, to predict selective separation of a gas from a mixture; (iii) develop a model for dimethyl ether steam reforming (DME-SR) to predict hydrogen production; (iv) carry out detailed parametric analysis in order to establish ideal operating conditions for future industrial application. The project originated from a real industrial case problem in collaboration with the industrial partner Dow Corning (UK) and was jointly funded by the Engineering and Physical Sciences Research Council (UK) and Dow Corning. The research examined gas separation by adsorption in a bubbling bed, as part of a dual fluidized bed system. 
The adsorption process was simulated based on the kinetics derived from the experimental data produced as part of a separate PhD project completed under the same fund. The kinetic model was incorporated in the FLUENT CFD tool as a pseudo-first order rate equation; some of the parameters for the pseudo-first order kinetics were obtained using MATLAB. The modelling of the DME adsorption in the designed bubbling bed was performed for the first time in this project and highlights the novelty of the investigations. The simulation results were analysed to provide understanding of the flow hydrodynamics, reactor design and optimum operating conditions for efficient separation. Bubbling bed validation by estimation of bed expansion and the solid and gas distribution from simulation agreed well with trends seen in the literature. Parametric analysis of the adsorption process demonstrated that increasing fluidizing velocity reduced adsorption of DME. This is a result of the reduction in gas residence time, which appears to have a greater effect than the solid residence time. The removal efficiency of DME from the bed was found to be more than 88%. Simulation of the DME-SR in FLUENT CFD was conducted using selected kinetics from the literature, implemented in the model using an in-house developed user defined function. The validation of the kinetics was achieved by simulating a case to replicate an experimental study of a laboratory scale bubbling bed by Vicente et al [1]. Good agreement was achieved for the validation of the models, which were then applied to the DME-SR in the large scale riser section of the dual fluidized bed system. This is the first study to use the selected DME-SR kinetics in a circulating fluidized bed (CFB) system and for the geometry size proposed for the project. As a result, the simulation produced the first detailed data on the spatial variation and final gas product in such an industrial scale fluidized bed system. 
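The pseudo-first order rate equation mentioned above has the closed form q(t) = qe(1 − exp(−kt)), and its linearised form ln(qe − q) = ln(qe) − kt is how such rate constants are commonly extracted from uptake data. The sketch below uses invented parameter values, not the project's fitted kinetics.

```python
import numpy as np

# Pseudo-first order adsorption sketch: dq/dt = k * (qe - q) integrates to
# q(t) = qe * (1 - exp(-k * t)). qe and k below are illustrative values only.
qe, k = 2.5, 0.12           # assumed equilibrium loading and rate constant

def loading(t):
    """Adsorbed amount at time t under pseudo-first order kinetics."""
    return qe * (1.0 - np.exp(-k * t))

t = np.linspace(0.0, 60.0, 7)
q = loading(t)

# The linearised form ln(qe - q) = ln(qe) - k*t recovers k by a straight-line
# fit (last point excluded, where qe - q is nearly zero).
slope, _ = np.polyfit(t[:-1], np.log(qe - q[:-1]), 1)
k_recovered = -slope
```

In the CFD implementation, a rate expression of this form would be evaluated per cell inside the user defined function, with the fitted k supplied as a model constant.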
The simulation results provided insight into the flow hydrodynamics, reactor design and optimum operating conditions. The solid and gas distribution in the CFB showed good agreement with the literature. The parametric analysis showed that increases in temperature and in the steam to DME molar ratio increased the production of hydrogen due to the increased DME conversion, whereas an increase in the space velocity was found to have an adverse effect. Increasing temperature from 200 °C to 350 °C increased DME conversion from 47% to 99%, while hydrogen yield increased substantially from 11% to 100%. The CO2 selectivity decreased from 100% to 91% due to the water gas shift reaction favouring CO at higher temperatures. The higher conversions observed as the temperature increased were reflected in the quantities of unreacted DME and methanol in the product gas, where both decreased to very low values of 0.27 mol% and 0.46 mol% respectively at 350 °C. Increasing the steam to DME molar ratio from 4 to 7.68 increased the DME conversion from 69% to 87%, while the hydrogen yield increased from 40% to 59%. The CO2 selectivity decreased from 100% to 97%. Decreasing the space velocity from 37104 ml/g/h to 15394 ml/g/h increased the DME conversion from 87% to 100% while increasing the hydrogen yield from 59% to 87%. The parametric analysis suggests that the operating condition for maximum hydrogen yield is in the region of 300 °C and a steam/DME molar ratio of 5. The analysis of the industrial sponsor's case, for the given flow and composition of the gas to be treated, suggests that 88% of the DME can be adsorbed from the bubbling bed, consequently producing 224.4 t/y of hydrogen in the riser section of the dual fluidized bed system. The process also produces 1458.4 t/y of CO2 and 127.9 t/y of CO as part of the product gas. 
The developed models and the parametric analysis carried out in this study provide essential guidelines for the future design of DME-SR at industrial level; in particular, this work has been of tremendous importance to the industrial collaborator in drawing conclusions and planning for future potential implementation of the process at an industrial scale.

Relevance: 90.00%

Abstract:

An alternative explanation to that of classical slope stability theory is proposed for the modes of large-scale failures of open pit walls, making use of the concept of a transition zone, which is described by a modified Prandtl's prism.