980 results for sequential reduction processes


Relevance:

30.00%

Publisher:

Abstract:

Ship recycling has been considered the best means to dispose of an obsolete ship. The current state of the art in technology, combined with the demands for sustainable development from the global maritime industrial sector, has transformed the erstwhile 'ship breaking' scrap business into a modern industry that dismantles ships and recycles/reuses the dismantled products in the supply chain of the pre-owned product market, following the principles of recycling. Industries will have to formulate a set of best practices and blend them with their engineering activities to produce better-quality products, improve productivity and achieve improved performance related to sustainable development. Improved performance by industries from a sustainable development perspective is accomplished only by implementing the 4E principles, i.e., eco-friendliness, engineering efficiency, energy conservation and ergonomics, in their core operations. The present study has carried out a comprehensive investigation into various ship recycling operations in order to formulate a set of best practices. Being the ultimate life cycle stage of a ship, ship recycling activities incorporate certain commercial procedures well in advance to facilitate the objectives of dismantling and recycling/reusing various parts of the vessel. Thorough knowledge of these background procedures in ship recycling is essential for examining and understanding the industrial business operations associated with it. As a first step, the practices followed in merchant shipping operations regarding the decision on decommissioning have been compiled and made available in the thesis. A brief description of the positioning methods and the important preparations for the most feasible ship recycling method, i.e., the beaching method, has been provided as part of the outline of the background information.
Available sources of guidelines, codes, rules and regulations for ship recycling have been compiled and included in the discussion. A very brief summary of practices in major ship recycling destinations has been prepared and listed to provide an overview of global ship recycling activities. The present status of ship recycling, treated as a full-fledged engineering industry, has been brought out to establish the need for looking into the development of best practices. The major engineering attributes of the ship as a unique engineering product and the significant factors influencing her life cycle stage operations have been studied and added to the information base on ship recycling. The role of the ship recycling industry as an important player in global sustainable development efforts has been reviewed by analysing the benefits of ship recycling. A brief synopsis of the state of the art of ship recycling in major international ship recycling centres has also been incorporated into the background knowledgebase on ship recycling processes. Publications available in this field have been reviewed and classified into five subject categories, viz., infrastructure for recycling yards and methods of dismantling; rules regarding ship recycling activities; environmental and safety aspects of ship recycling; role of naval architects and ship classification societies; and application of information technology and demand forecasting. The inferences from the literature survey have been summarised and recorded. Notable observations among the inferences include the need to create a comprehensive knowledgebase on ship recycling and implement it effectively in the industry, and the insignificant involvement of naval architects and shipbuilding engineers in the ship recycling industry.
These two important inferences and the message they convey have been addressed with due importance in the subsequent part of the present study. As a part of the study, the importance of demand forecasting in ship recycling has been introduced and presented. A sample input of ship recycling data for the implementation of computer-based methods of demand forecasting has been presented in this section of the thesis. The interdisciplinary nature of the engineering processes involved in ship recycling has been identified as one of the important features of this industry. The present study has identified more than a dozen major stakeholders in ship recycling, each with their own interests and roles. It has also been observed that most ship recycling activity is carried out in South East Asian countries, where beach-based ship recycling is done in yards without proper infrastructure support. A model of beach-based ship recycling has been developed, and the roles, responsibilities and mutual interactions of the elements of the system have been documented as a part of the study. Subsequently, the need for a wide knowledgebase on ship recycling activities, as pointed out by the literature survey, has been addressed. The information base and sources of expertise required to build a broad knowledgebase on ship recycling operations have been identified and tabulated. Eleven important ship recycling processes have been identified, and a brief sketch of the steps involved in these processes has been examined and addressed in detail. Based on these findings, a detailed sequential disassembly process plan for ship recycling has been prepared and charted. Having initially established the need for best practices in ship recycling, the present study identifies the development of a user-friendly expert system for the ship recycling process as one of the constituents of the proposed best practices.
A user-friendly expert system has been developed for beach-based ship recycling processes, named the Ship Recycling Recommender (SRR). Two important functions of SRR, the first for the 'Administrators', the stakeholders at the helm of ship recycling affairs, and the second for the 'Users', the stakeholders who execute the actual dismantling, have been presented by highlighting the steps involved in the execution of the software. The important outputs generated, i.e., recommended practices for ship dismantling processes and safe handling information on the materials present onboard, have been presented with the help of ship recycling reports generated by the expert system. A brief account of the necessity of a ship recycling work content estimation as part of the best practices has been presented in the study, supported by a detailed work estimation schedule in one of the appendices. As mentioned earlier, a definite lack of involvement of naval architects has been observed in the development of methodologies for improving the status of the ship recycling industry. The present study puts forward a holistic approach that reviews the status of ship recycling not simply as the end-of-life activity of all 'time expired' vessels, but as a focal point integrating all life cycle activities. A new engineering design philosophy targeting sustainable development of the marine industrial domain, named design for ship recycling, has been identified, formulated and presented. A new model of the ship life cycle has been proposed by adding a few stages to the traditional life cycle after analysing their critical role in accomplishing clean and safe end-of-life and partial dismantling of ships. Two applications of design for ship recycling, viz., the recyclability of ships and their products and the allotment of a Green Safety Index for ships, have been presented as part of the implementation of the philosophy in actual practice.
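The expert system's core function, mapping a dismantled item or onboard material to a recommended handling practice, can be pictured as a rule lookup. The sketch below is a minimal illustration of that idea only; the rules, item names and recommendation texts are invented for illustration and are not drawn from the actual SRR knowledge base.

```python
# Minimal sketch of a rule-based recommender in the spirit of SRR.
# All rules and recommendation texts below are illustrative assumptions.

RULES = {
    "asbestos": "Wet removal by licensed personnel; sealed bags; no hot work nearby.",
    "fuel_oil_tank": "Obtain gas-free certification before any cutting; drain and clean first.",
    "steel_plate": "Cut into transportable sections; route to re-rolling mill.",
}

def recommend(item: str) -> str:
    """Return the recommended handling practice for a dismantled item."""
    return RULES.get(item, "No rule found: escalate to the recycling administrator.")

for item in ("asbestos", "steel_plate", "unknown_widget"):
    print(f"{item}: {recommend(item)}")
```

A real system would of course layer vessel-specific data, safety regulations and report generation on top of such a lookup.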

Relevance:

30.00%

Publisher:

Abstract:

Pollution of water with pesticides has become a threat to man, materials and the environment. Pesticides released to the environment reach water bodies through run-off. Industrial wastewater from pesticide manufacturing industries contains pesticides at higher concentrations and is hence a major source of water pollution. Pesticides create many health and environmental hazards, including diseases such as cancer, liver and kidney disorders, reproductive disorders, foetal death and birth defects. Conventional wastewater treatment plants based on biological treatment are not efficient enough to remove these compounds to the desired level. Most pesticides are phyto-toxic, i.e., they kill the microorganisms responsible for degradation, and are recalcitrant in nature. Advanced oxidation processes (AOPs) are a class of oxidation techniques in which hydroxyl radicals are employed for the oxidation of pollutants. AOPs have the ability to totally mineralise organic pollutants to CO2 and water. Different methods are employed for the generation of hydroxyl radicals in AOP systems. Acetamiprid is a neonicotinoid insecticide widely used to control sucking insects on crops such as leafy vegetables, citrus fruits, pome fruits, grapes, cotton and ornamental flowers. It is now recommended as a substitute for organophosphorous pesticides. Since its use is increasing, its presence is increasingly found in the environment. It has high water solubility, is not easily biodegradable and has the potential to pollute surface and ground waters. Here, the use of AOPs for the removal of acetamiprid from wastewater has been investigated. Five methods were selected for the study based on a literature survey and preliminary experiments: the Fenton process, UV treatment, the UV/H2O2 process, photo-Fenton and photocatalysis using TiO2. Undoped TiO2 and TiO2 doped with Cu and Fe were prepared by the sol-gel method.
Characterisation of the prepared catalysts was done by X-ray diffraction, scanning electron microscopy, differential thermal analysis and thermogravimetric analysis. The influence of the major operating parameters on the removal of acetamiprid has been investigated. All experiments were designed using the central composite design (CCD) of response surface methodology (RSM). Model equations were developed for Fenton, UV/H2O2, photo-Fenton and photocatalysis for predicting acetamiprid removal and total organic carbon (TOC) removal under different operating conditions. The quality of the models was analysed by statistical methods, and experimental validations were done to confirm it. Optimum conditions obtained by experiment were verified against those obtained using the response optimiser. The Fenton process is the simplest and oldest AOP, in which hydrogen peroxide and iron are employed for the generation of hydroxyl radicals. The influence of H2O2 and Fe2+ on acetamiprid removal and TOC removal by the Fenton process was investigated, and it was found that removal increases with increasing H2O2 and Fe2+ concentration. For an initial acetamiprid concentration of 50 mg/L, 200 mg/L H2O2 and 20 mg/L Fe2+ at pH 3 were found to be optimum for acetamiprid removal. For UV treatment the effect of pH was studied, and it was found that pH does not have much effect on the removal rate. Addition of H2O2 to the UV process increased the removal rate because of the hydroxyl radicals formed by photolysis of H2O2. An H2O2 concentration of 110 mg/L at pH 6 was found to be optimum for acetamiprid removal. With photo-Fenton, a drastic reduction in treatment time was observed, together with a ten-fold reduction in the amount of reagents required: an H2O2 concentration of 20 mg/L and an Fe2+ concentration of 2 mg/L were found to be optimum at pH 3. With TiO2 photocatalysis, an improvement in the removal rate was noticed compared with UV treatment.
The effect of Cu and Fe doping on the photocatalytic activity under UV light was studied, and it was observed that Cu doping enhanced the removal rate slightly while Fe doping decreased it. Maximum acetamiprid removal was observed for an optimum catalyst loading of 1000 mg/L and a Cu concentration of 1 wt%. It was noticed that the mineralisation efficiency of the processes is low compared with the acetamiprid removal efficiency; this may be due to stable intermediate compounds formed during degradation. Kinetic studies were conducted for all the treatment processes, and it was found that all processes follow pseudo-first-order kinetics. Kinetic constants were determined from the experimental data for all the processes and half-lives were calculated. The rate of reaction was in the order photo-Fenton > UV/H2O2 > Fenton > TiO2 photocatalysis > UV. Operating costs were calculated for the processes, and it was found that photo-Fenton removes acetamiprid at the lowest operating cost and in the least time. A kinetic model was developed for the photo-Fenton process using elementary reaction data and mass balance equations for the species involved in the process. The variation of acetamiprid concentration with time for different H2O2 and Fe2+ concentrations at pH 3 can be predicted using this model. The model was validated by comparing the simulated concentration profiles with those obtained from experiments. This study established the viability of the selected AOPs for the removal of acetamiprid from wastewater. Of the AOPs studied, photo-Fenton gives the highest removal efficiency at the lowest operating cost within the shortest time.
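Pseudo-first-order kinetics means the concentration decays as C(t) = C0·exp(-kt), so k is the slope of ln(C0/C) against t and the half-life is ln 2 / k. A minimal sketch of that calculation on synthetic data; the rate constant below is an assumed illustration, not a value from the study.

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Least-squares slope of ln(C0/C) versus t, i.e. the pseudo-first-order k."""
    c0 = concentrations[0]
    ys = [math.log(c0 / c) for c in concentrations]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(times, ys))
    den = sum((x - mx) ** 2 for x in times)
    return num / den

# Synthetic decay generated with k = 0.05 min^-1 (illustrative only)
t = [0, 5, 10, 20, 40]
c = [50 * math.exp(-0.05 * ti) for ti in t]

k = pseudo_first_order_k(t, c)
half_life = math.log(2) / k
print(f"k = {k:.3f} min^-1, t1/2 = {half_life:.1f} min")
```

With real data the fit would be done on measured concentrations, and deviation from a straight line in ln(C0/C) vs t would signal a departure from pseudo-first-order behaviour.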

Relevance:

30.00%

Publisher:

Abstract:

The current development of international climate policy requires Germany to reduce its greenhouse gas emissions. The most important greenhouse gas is carbon dioxide, which is released into the atmosphere by the combustion of fossil fuels. The reduction targets can in principle be achieved by cutting emissions and by creating carbon sinks. Sinks here denote the biological storage of carbon in soils and forests. An important factor influencing these processes is the spatial dynamics of land use in a region. In this thesis the model system HILLS is developed and used to simulate these complex interactions in the federal state of Hesse. The aim is to use HILLS not only to analyse the current state but also to investigate scenarios of future regional land-use development and its effect on the carbon budget up to 2020. To represent the spatial and temporal dynamics of land use in Hesse, the model LUCHesse is developed. Its task is to simulate the relevant processes on a 1 km² grid, with the rates of change prescribed exogenously as area trends at the level of the Hessian districts. LUCHesse consists of sub-models for the processes: (A) expansion of settlement and commercial areas, (B) structural change in the agricultural sector, and (C) establishment of new forest areas (afforestation). Each sub-model comprises methods to assess the site suitability of the grid cells for different land-use classes and to assign the prescribed trends to those grid cells best suited to a given land-use class. The sub-models are validated against statistical data for the period 1990 to 2000. As the result of a simulation run, digital maps of the land-use distribution in Hesse are produced for discrete time steps.
To simulate carbon storage, a modified version of the ecosystem model Century is developed (GIS-Century). It allows a controlled simulation run in annual steps and supports the integration of the model as a component of the HILLS model system. Several application schemes are developed for GIS-Century with which the effect of arable land set-aside, afforestation and the management of existing forests on carbon storage can be investigated. The model and the application schemes are validated against field and literature data. HILLS implements a sequential coupling of LUCHesse with GIS-Century. The spatial coupling takes place on the 1 km² grid; the temporal coupling is achieved by introducing a land-use vector that describes the land-use change of a grid cell over the simulation period. In addition, HILLS integrates both models into a geographic information system (GIS) via a service- and database-oriented concept, so that the GIS functions for spatial data storage and processing can be used. As an application of the model system, a reference scenario for Hesse with a time horizon of 2020 is computed. For the agricultural sector the scenario assumes an implementation of the AGENDA 2000 policy, which leads to extensive set-aside of arable land, while for settlement and commercial areas as well as afforestation the current trends of areal expansion are extrapolated. With HILLS it is now possible to quantify the effect of these land-use changes on biological carbon storage. While the expansion of settlement areas is identified as a carbon source (37 kt C/a), the most important sink is found in the management of existing forest areas (794 kt C/a).
Furthermore, the set-aside of arable land (26 kt C/a) and afforestation (29 kt C/a) lead to additional carbon storage. For carbon storage in soils, the simulation experiments show very clearly that this sink is only of limited duration.
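The sequential coupling idea, a land-use vector per 1 km² grid cell driving a per-year carbon routine, can be sketched as follows. The land-use classes and per-cell flux values below are invented placeholders for illustration, not HILLS parameters.

```python
# Sketch of HILLS-style temporal coupling: each grid cell carries a
# land-use vector (one entry per simulated year); a per-year carbon
# routine is applied according to that entry. Flux values are assumed,
# sign convention: positive = sink.

CARBON_FLUX = {  # kt C per cell and year (illustrative numbers only)
    "settlement": -0.010,
    "set_aside": 0.005,
    "afforestation": 0.006,
    "managed_forest": 0.020,
}

def simulate_cell(land_use_vector):
    """Accumulate the carbon balance of one grid cell over the run."""
    return sum(CARBON_FLUX[use] for use in land_use_vector)

# A cell that is set-aside for 3 years, then afforested for 2 years
cell = ["set_aside"] * 3 + ["afforestation"] * 2
print(f"cell balance: {simulate_cell(cell):+.3f} kt C")
```

In the real system the per-year step is a full GIS-Century run whose soil and vegetation pools carry over between years, rather than a fixed flux per class.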

Relevance:

30.00%

Publisher:

Abstract:

An improved understanding of soil organic carbon (Corg) dynamics in interaction with the mechanisms of soil structure formation is important in terms of sustainable agriculture and the reduction of the environmental costs of agricultural ecosystems. However, information on the physical and chemical processes influencing the formation and stabilization of water-stable aggregates in association with Corg sequestration is scarce. Long-term soil experiments are important in evaluating open questions about management-induced effects on soil Corg dynamics in interaction with soil structure formation. The objectives of the present thesis were: (i) to determine the long-term impacts of different tillage treatments on the interaction between macro-aggregation (>250 µm) and light fraction (LF) distribution and on C sequestration in plots differing in soil texture and climatic conditions; (ii) to determine the impact of different tillage treatments on temporal changes in the size distribution of water-stable aggregates and on macro-aggregate turnover; and (iii) to evaluate macro-aggregate rebuilding in soils with varying initial Corg contents, organic matter (OM) amendments and clay contents in a short-term incubation experiment. Soil samples were taken at 0-5 cm, 5-25 cm and 25-40 cm depth from up to four commercially used fields located in arable loess regions of eastern and southern Germany after 18-25 years of different tillage treatments with almost identical experimental setups per site. At each site, one large field with spatially homogeneous soil properties was divided into three plots, and one of the following three tillage treatments was carried out in each plot: (i) conventional tillage (CT) with annual mouldboard ploughing to 25-30 cm, (ii) mulch tillage (MT) with a cultivator or disc harrow 10-15 cm deep, and (iii) no tillage (NT) with direct drilling. The crop rotation at each site consisted of sugar beet (Beta vulgaris L.) - winter wheat (Triticum aestivum L.) - winter wheat.
Crop residues were left on the field and crop management was carried out following the regional standards of agricultural practice. To investigate the above-mentioned research objectives, three experiments were conducted: experiment (i) was performed with soils sampled from four sites in April 2010 (wheat stand); experiment (ii) was conducted with soils sampled from three sites in April 2010, September 2011 (after harvest or sugar beet stand), November 2011 (after tillage) and April 2012 (bare soil or wheat stand); and an incubation study (experiment (iii)) was performed with soil sampled from one site in April 2010. Based on the aforementioned research objectives and experiments, the main findings were: (i) Consistent results were found between the four long-term tillage fields, which varied in texture and climatic conditions. Correlation analysis of the yields of macro-aggregates against the yields of free LF (≤1.8 g cm-3) and occluded LF, respectively, suggested that the effective litter translocation into greater soil depths and the higher litter input under CT and MT compensated in the long term for the higher physical impact of the tillage equipment compared with NT. The Corg stocks (kg Corg m-2) in 522 kg of soil, based on the equivalent soil mass approach (CT: 0-40 cm, MT: 0-38 cm, NT: 0-36 cm), increased in the order CT (5.2) = NT (5.2) < MT (5.7). The significantly (p ≤ 0.05) highest Corg stocks under MT were probably a result of high crop yields in combination with reduced physical tillage impact and effective litter incorporation, resulting in a Corg sequestration rate of 31 g C m-2 yr-1. (ii) Significantly higher yields of macro-aggregates (g kg-1 soil) under NT (732-777) and MT (680-726) than under CT (542-631) were generally restricted to the 0-5 cm sampling depth for all sampling dates. Temporal changes in aggregate size distribution were only small, and no tillage-induced net effect was detectable.
Thus, we assume that the physical impact of the tillage equipment was only small, or that the impact was compensated by greater soil mixing and effective litter translocation into greater soil depths under CT, which probably resulted in high re-aggregation. (iii) The short-term incubation study showed that macro-aggregate yields (g kg-1 soil) were higher after 28 days in soils receiving OM (121.4-363.0) than in the control soils (22.0-52.0), accompanied by higher contents of microbial biomass carbon and ergosterol. The highest soil respiration rates after OM amendment occurred within the first three days of incubation, indicating that macro-aggregate formation is a fast process. Most of the rebuilt macro-aggregates were formed within the first seven days of incubation (42-75%). Nevertheless, formation was ongoing throughout the entire 28 days of incubation, as indicated by higher soil respiration rates at the end of the incubation period in OM-amended soils than in the control soils. At the same time, decreasing carbon contents within macro-aggregates over time indicated that newly occluded OM within the rebuilt macro-aggregates served as a Corg source for the microbial biomass. The different clay contents played only a minor role in macro-aggregate formation under the particular conditions of the incubation study. Overall, no net changes in macro-aggregation were identified in the short term. Furthermore, no indications of an effective long-term Corg sequestration under NT in comparison with CT were found. The interaction of soil disturbance, litter distribution and fast re-aggregation suggested that a distinct steady state in terms of soil aggregation was established for each tillage treatment. However, continuous application of MT, with its combination of reduced physical tillage impact and effective litter incorporation, may offer some potential for improving soil structure and may therefore prevent incorporated LF from rapid decomposition, resulting in higher C sequestration in the long term.
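The equivalent soil mass approach compares Corg stocks in a fixed mass of soil per unit area rather than to a fixed depth: layers are summed top-down and a pro-rata share of the deepest layer is taken. A minimal sketch with invented layer data; the 522 kg target echoes the mass used in the study, but the layer masses and concentrations below are illustrative assumptions.

```python
def corg_stock_equivalent_mass(layers, target_mass):
    """Corg stock (kg C m-2) contained in a fixed soil mass (kg m-2).

    layers: list of (layer_mass_kg_m2, corg_concentration_g_per_kg),
    ordered from the surface downwards. A pro-rata share of the layer
    in which the target mass is reached is included.
    """
    stock, mass_so_far = 0.0, 0.0
    for layer_mass, conc in layers:
        take = min(layer_mass, target_mass - mass_so_far)
        stock += take * conc / 1000.0  # g C -> kg C
        mass_so_far += take
        if mass_so_far >= target_mass:
            break
    return stock

# Invented layer data (mass per m², Corg in g/kg), illustration only
layers = [(75.0, 14.0), (300.0, 10.0), (200.0, 6.0)]
print(f"{corg_stock_equivalent_mass(layers, 522.0):.2f} kg C m-2")
```

Comparing stocks at equal mass rather than equal depth avoids the bias introduced when tillage changes bulk density between treatments.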

Relevance:

30.00%

Publisher:

Abstract:

Autonomous vehicles are increasingly being used in mission-critical applications, and robust methods are needed for controlling these inherently unreliable and complex systems. This thesis advocates the use of model-based programming, which allows mission designers to program autonomous missions at the level of a coach or wing commander. To support such a system, this thesis presents the Spock generative planner. To generate plans, Spock must be able to piece together vehicle commands and team tactics that have a complex behavior represented by concurrent processes. This is in contrast to traditional planners, whose operators represent simple atomic or durative actions. Spock represents operators using the RMPL language, which describes behaviors using parallel and sequential compositions of state and activity episodes. RMPL is useful for controlling mobile autonomous missions because it allows mission designers to quickly encode expressive activity models using object-oriented design methods and an intuitive set of activity combinators. Spock is also significant in that it uniformly represents operators and plan-space processes in terms of Temporal Plan Networks, which support temporal flexibility for robust plan execution. Finally, Spock is implemented as a forward progression optimal planner that walks monotonically forward through plan processes, closing any open conditions and resolving any conflicts. This thesis describes the Spock algorithm in detail, along with example problems and test results.
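One way to picture RMPL's parallel and sequential composition of episodes is through the duration bounds they induce on a Temporal Plan Network: a sequence sums the [lower, upper] bounds of its episodes, while a parallel composition completes with its slowest branch. The sketch below is a deliberate simplification (real TPNs carry full simple temporal constraints, choices and symbolic conditions); the episode names and bounds are invented.

```python
# Toy model of RMPL-style composition over (lower, upper) duration bounds.

def sequence(*episodes):
    """Duration bounds of episodes executed one after another."""
    return (sum(lo for lo, _ in episodes), sum(hi for _, hi in episodes))

def parallel(*episodes):
    """Duration bounds when episodes run concurrently and all must finish."""
    return (max(lo for lo, _ in episodes), max(hi for _, hi in episodes))

# Invented episode bounds, in minutes
transit = (10, 15)
survey = (5, 8)
upload = (1, 2)

# transit and survey run in parallel, then the upload follows
mission = sequence(parallel(transit, survey), upload)
print(mission)
```

The flexibility that TPNs preserve is exactly this interval structure: the executive can dispatch any schedule consistent with the bounds rather than a single fixed timeline.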

Relevance:

30.00%

Publisher:

Abstract:

Background: This study describes a bioinformatics approach designed to identify Plasmodium vivax proteins potentially involved in reticulocyte invasion. Specifically, different protein training sets were built and tuned based on different biological parameters, such as experimental evidence of secretion and/or involvement in invasion-related processes. A profile-based sequence method supported by hidden Markov models (HMMs) was then used to build classifiers to search for biologically-related proteins. The transcriptional profile of the P. vivax intra-erythrocyte developmental cycle was then screened using these classifiers. Results: A bioinformatics methodology for identifying potentially secreted P. vivax proteins was designed using sequence redundancy reduction and probabilistic profiles. This methodology led to identifying a set of 45 proteins that are potentially secreted during the P. vivax intra-erythrocyte development cycle and could be involved in cell invasion. Thirteen of the 45 proteins have already been described as vaccine candidates; there is experimental evidence of protein expression for 7 of the 32 remaining ones, while no previous studies of expression, function or immunology have been carried out for the additional 25. Conclusions: The results support the idea that probabilistic techniques like profile HMMs improve similarity searches. Also, different adjustments such as sequence redundancy reduction using Pisces or Cd-Hit allowed data clustering based on rational reproducible measurements. This kind of approach for selecting proteins with specific functions is highly important for supporting large-scale analyses that could aid in the identification of genes encoding potential new target antigens for vaccine development and drug design. The present study has led to targeting 32 proteins for further testing regarding their ability to induce protective immune responses against P. vivax malaria.
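As a simplified stand-in for the profile-based scoring idea, the sketch below builds a position-specific log-odds profile from a toy alignment and scores candidate sequences against it. Real profile HMMs (e.g. as implemented in HMMER) add insert and delete states and principled background models; the alignment, pseudocount scheme and background frequency here are all assumptions for illustration.

```python
import math

# Toy alignment of related sequences (invented, 4 columns)
ALIGNMENT = ["MKLV", "MKIV", "MRLV"]
BACKGROUND = 0.05  # uniform background over 20 amino acids

def build_profile(alignment):
    """Per-column residue probabilities with a crude pseudocount."""
    profile = []
    for i in range(len(alignment[0])):
        counts = {}
        for seq in alignment:
            counts[seq[i]] = counts.get(seq[i], 0) + 1
        total = len(alignment) + 1  # +1 reserves mass for unseen residues
        profile.append({aa: (n + 0.05) / total for aa, n in counts.items()})
    return profile

def log_odds(seq, profile):
    """Sum of per-position log(p / background); higher = more profile-like."""
    floor = 0.05 / (len(ALIGNMENT) + 1)  # probability for unseen residues
    return sum(math.log(col.get(aa, floor) / BACKGROUND)
               for aa, col in zip(seq, profile))

profile = build_profile(ALIGNMENT)
print(log_odds("MKLV", profile), log_odds("AAAA", profile))
```

A classifier in the spirit of the study would threshold such scores (computed by a full HMM) to flag candidate secreted or invasion-related proteins.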

Relevance:

30.00%

Publisher:

Abstract:

The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal-to-decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A, as well as undertaking its own experimentation to explore key processes within the climate system. It worked at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change, especially how climate change pertains to extreme events, and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which were conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help understand model uncertainty (rather than scenario or initial-condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds its results back to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.

Relevance:

30.00%

Publisher:

Abstract:

An investigation is made of the impact of a full linearized physical (moist) parameterization package on extratropical singular vectors (SVs) using the ECMWF integrated forecasting system (IFS). Comparison is made for one particular period with a dry physical package including only vertical diffusion and surface drag. The crucial extra ingredient in the full package is found to be the large-scale latent heat release. Consistent with basic theory, its inclusion results in a shift to smaller horizontal scales and enhanced growth for the SVs. Whereas, for the dry SVs, T42 resolution is sufficient, the moist SVs require T63 to resolve their structure and growth. A 24-h optimization time appears to be appropriate for the moist SVs because of the larger growth of moist SVs compared with dry SVs. Like dry SVs, moist SVs tend to occur in regions of high baroclinicity, but their location is also influenced by the availability of moisture. The most rapidly growing SVs appear to enhance or reduce large-scale rain in regions ahead of major cold fronts. The enhancement occurs in and ahead of a cyclonic perturbation and the reduction in and ahead of an anticyclonic perturbation. Most of the moist SVs for this situation are slightly modified versions of the dry SVs. However, some occur in new locations and have particularly confined structures. The most rapidly growing SV is shown to exhibit quite linear behavior in the nonlinear model as it grows from 0.5 to 12 hPa in 1 day. For 5 times this amplitude the structure is similar but the growth is about half as the perturbation damps a potential vorticity (PV) trough or produces a cutoff, depending on its sign.
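Formally, the singular vectors discussed here are those of the linearized propagator M over the optimization interval: the leading right singular vector is the initial perturbation with maximal energy growth, and the leading singular value is that growth factor. A minimal numerical sketch using power iteration on MᵀM; the 2×2 matrix is an arbitrary stand-in, not an IFS operator.

```python
# Leading singular value/vector of a small "propagator" by power
# iteration on the normal matrix A^T A. Pure-Python, illustration only.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def leading_singular(A, iters=200):
    """Return (sigma_max, unit right singular vector) via power iteration."""
    At = transpose(A)
    x = [1.0, 1.0]
    for _ in range(iters):
        y = matvec(At, matvec(A, x))        # y = A^T A x
        norm = sum(v * v for v in y) ** 0.5
        x = [v / norm for v in y]
    sigma = sum(v * v for v in matvec(A, x)) ** 0.5  # ||A x|| for unit x
    return sigma, x

M = [[2.0, 1.0], [0.0, 1.0]]  # arbitrary illustrative linear propagator
sigma, v = leading_singular(M)
print(f"max energy growth factor: {sigma:.4f}")
```

For M = [[2, 1], [0, 1]] the exact answer is sigma = sqrt(3 + sqrt(5)), which the iteration recovers; in the forecasting context the matrix-vector products are supplied by the tangent-linear and adjoint models rather than an explicit matrix.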

Relevance:

30.00%

Publisher:

Abstract:

The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.
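The efficiency gain of sequential dosing comes from letting each step's outcome choose the next dose. The sketch below is a deliberately crude caricature of a fixed-dose scheme: one animal per step, stepping up through the fixed dose levels until clear toxicity is seen. The decision rule and the hypothetical substance's threshold are invented and do not reproduce the actual FDP protocol.

```python
# Caricature of sequential fixed-dose testing (illustrative only).

FIXED_DOSES = [5, 50, 300, 2000]  # mg/kg fixed dose levels

def run_fixed_dose(shows_toxicity):
    """Step up through FIXED_DOSES, one animal per step.

    shows_toxicity: callable dose -> bool (evident toxicity observed).
    Returns (discriminating dose, number of animals used).
    """
    for used, dose in enumerate(FIXED_DOSES, start=1):
        if shows_toxicity(dose):
            return dose, used
    return FIXED_DOSES[-1], len(FIXED_DOSES)

# Hypothetical substance with a toxicity threshold between 50 and 300 mg/kg
dose, n = run_fixed_dose(lambda d: d >= 300)
print(f"evident toxicity at {dose} mg/kg after {n} animals")
```

Because each dose decision uses the previous outcome, the expected number of animals is far below that of a fixed-sample design probing every dose level in parallel.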

Relevance:

30.00%

Publisher:

Abstract:

A sequential study design generally makes more efficient use of available information than a fixed sample counterpart of equal power. This feature is gradually being exploited by researchers in genetic and epidemiological investigations that utilize banked biological resources and in studies where time, cost and ethics are prominent considerations. Recent work in this area has focussed on the sequential analysis of matched case-control studies with a dichotomous trait. In this paper, we extend the sequential approach to a comparison of the associations within two independent groups of paired continuous observations. Such a comparison is particularly relevant in familial studies of phenotypic correlation using twins. We develop a sequential twin method based on the intraclass correlation and show that use of sequential methodology can lead to a substantial reduction in the number of observations without compromising the study error rates. Additionally, our approach permits straightforward allowance for other explanatory factors in the analysis. We illustrate our method in a sequential heritability study of dysplasia that allows for the effect of body mass index and compares monozygotes with pairs of singleton sisters. Copyright (c) 2006 John Wiley & Sons, Ltd.
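The intraclass correlation underlying the method can be computed from the one-way ANOVA decomposition of paired observations: for pairs, ICC = (MSB - MSW) / (MSB + MSW). A minimal sketch comparing two invented groups of pairs; the phenotype values are illustrative, not study data.

```python
def intraclass_correlation(pairs):
    """One-way ANOVA intraclass correlation for pairs (group size k = 2)."""
    n = len(pairs)
    grand = sum(a + b for a, b in pairs) / (2 * n)
    # between-pair mean square: k * sum((pair mean - grand)^2) / (n - 1)
    msb = sum(2 * ((a + b) / 2 - grand) ** 2 for a, b in pairs) / (n - 1)
    # within-pair mean square: sum((a - b)^2 / 2) / n
    msw = sum((a - b) ** 2 / 2 for a, b in pairs) / n
    return (msb - msw) / (msb + msw)

# Invented phenotype pairs: the first group is more alike within pairs
monozygotic = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.2), (4.0, 3.9)]
singletons = [(1.0, 2.8), (2.0, 0.9), (3.0, 4.1), (4.0, 2.2)]

icc_mz = intraclass_correlation(monozygotic)
icc_s = intraclass_correlation(singletons)
print(f"ICC monozygotic: {icc_mz:.3f}, ICC singleton sisters: {icc_s:.3f}")
```

In the sequential version, such ICC estimates would be compared between groups at interim looks, stopping the study as soon as the evidence crosses a predefined boundary.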

Relevância:

30.00%

Publicador:

Resumo:

The electrochemistry of Pt nanostructured electrodes is investigated using hydrodynamic modulated voltammetry (HMV). Here, a liquid crystal templating process is used to produce platinum-modified electrodes with a range of surface areas (roughness factor 42.4-280.8). The electroreduction of molecular oxygen at these nanostructured platinum surfaces is used to demonstrate the ability of HMV to discriminate between faradaic and nonfaradaic electrode reactions. The HMV approach shows that the reduction of molecular oxygen experiences considerable signal loss within the high pseudocapacitive region of the voltammetry. Evidence for the contribution of the double layer to transient mass transfer events is presented. In addition, a model circuit and appropriate theoretical analysis are used to illustrate the transient responses of a time-variant faradaic component. This, in conjunction with the experimental evidence, shows that, far from being a passive component in this system, the double layer can contribute to HMV faradaic reactions under certain conditions.

Relevância:

30.00%

Publicador:

Resumo:

Diabetes, like many diseases and biological processes, is not mono-causal. On the one hand, multifactorial studies with complex experimental designs are required for its comprehensive analysis. On the other hand, the data from these studies often include a substantial amount of redundancy, such as proteins that are typically represented by a multitude of peptides. Coping simultaneously with both complexities (experimental and technological) makes data analysis a challenge for bioinformatics.

Relevância:

30.00%

Publicador:

Resumo:

This paper introduces new insights into the hydrochemical functioning of lowland river systems using field-based spectrophotometric and electrode technologies. The streamwater concentrations of nitrogen species and phosphorus fractions were measured at hourly intervals on a continuous basis at two contrasting sites on tributaries of the River Thames – one draining a rural catchment, the River Enborne, and one draining a more urban system, The Cut. The measurements complement those from an existing network of multi-parameter water quality sondes maintained across the Thames catchment and from weekly monitoring based on grab samples. The results of the sub-daily monitoring show that streamwater phosphorus concentrations display highly complex dynamics under storm conditions, depending on antecedent catchment wetness, and that diurnal phosphorus and nitrogen cycles occur under low-flow conditions. The diurnal patterns highlight the dominance of sewage inputs in controlling streamwater phosphorus and nitrogen concentrations at low flows, even at a distance of 7 km from the nearest sewage treatment works in the rural River Enborne. The time of sample collection is therefore important when judging water quality against ecological thresholds or standards. Exhaustion of the phosphorus supply from diffuse and multiple septic-tank sources during storm events was evident, and load estimation was not improved by sub-daily monitoring beyond that achieved by daily sampling, because of the eventual reduction in the phosphorus mass entering the stream during events. The results highlight the utility of sub-daily water quality measurements, and the discussion considers the practicalities and challenges of in situ, sub-daily monitoring.
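The influence of sampling frequency and sample timing on load estimates can be illustrated with synthetic data: the load is the time integral of concentration times discharge, and a daily grab sample taken at a fixed time of day can be biased when the constituent has a diurnal cycle. The series below are hypothetical illustrations, not data from the study.

```python
import numpy as np

# Hourly synthetic records for a 3-day window: discharge q (m3/s) and
# phosphorus concentration c (mg/L), both with a diurnal cycle.
# c [mg/L] * q [m3/s] = 1 g/s, so summing over hourly steps (3600 s)
# and dividing by 1000 gives the load in kg.
hours = np.arange(72)
q = 1.0 + 0.5 * np.sin(2 * np.pi * hours / 24)     # diurnal discharge
c = 0.10 + 0.05 * np.cos(2 * np.pi * hours / 24)   # diurnal P concentration

load_hourly = np.sum(c * q) * 3600 / 1000          # kg over 3 days

# Daily "grab sample" estimate: one concentration sample per day, taken at
# the same hour each day, scaled by the daily mean flow (86400 s per day).
daily_c = c[::24]
daily_q = q.reshape(3, 24).mean(axis=1)
load_daily = np.sum(daily_c * daily_q) * 86400 / 1000

print(f"hourly-resolved load: {load_hourly:.2f} kg")
print(f"daily grab-sample load: {load_daily:.2f} kg")
```

Because the grab samples here happen to coincide with the daily concentration peak, the daily estimate overstates the load, illustrating why the time of sample collection matters when concentrations cycle diurnally.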

Relevância:

30.00%

Publicador:

Resumo:

This dissertation deals with aspects of sequential data assimilation (in particular, ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved without adding further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of the state space. The advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analysed, and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
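The mechanics of the RAW filter can be sketched on a linear oscillation test problem, dx/dt = iωx, integrated with the leapfrog scheme. The Robert-Asselin displacement is shared between the current and the next time level, weighted by a parameter α: α = 1 recovers the classical Robert-Asselin filter, while α ≈ 0.53 (the value suggested by Williams) largely removes the spurious damping of the physical mode. The parameter values below are illustrative, not those used in SPEEDY.

```python
import numpy as np

def leapfrog_raw(omega=1.0, dt=0.2, nsteps=200, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = i*omega*x with the RAW filter.
    alpha = 1 recovers the classical Robert-Asselin filter."""
    x_prev = 1.0 + 0.0j                      # filtered value at step n-1
    x_curr = np.exp(1j * omega * dt)         # exact solution at the first step
    for _ in range(nsteps):
        x_next = x_prev + 2.0 * dt * 1j * omega * x_curr   # leapfrog step
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)    # RA displacement
        x_prev = x_curr + alpha * d          # filter the current level
        x_curr = x_next - (1.0 - alpha) * d  # share remainder with next level
    return abs(x_curr)

# The exact amplitude is 1. The RA filter (alpha = 1) damps the physical
# mode noticeably; the RAW filter (alpha = 0.53) damps it far less.
amp_ra = leapfrog_raw(alpha=1.0)
amp_raw = leapfrog_raw(alpha=0.53)
print(f"RA  amplitude after 200 steps: {amp_ra:.3f}")
print(f"RAW amplitude after 200 steps: {amp_raw:.3f}")
```

With α = 1 the full displacement is applied to the current level, reproducing the familiar Robert-Asselin damping; splitting it between two levels is what preserves the amplitude of the physical mode while still controlling the computational mode.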

Relevância:

30.00%

Publicador:

Resumo:

The primary role of land surface models embedded in climate models is to partition the available surface energy into upward radiative, sensible and latent heat fluxes. The partitioning of evapotranspiration (ET) is of fundamental importance: as a major component of the total surface latent heat flux, ET affects the simulated surface water balance and the related energy balance, and consequently the feedbacks with the atmosphere. In this context it is also crucial to credibly represent the CO2 exchange between ecosystems and their environment. In this study, JULES, the land surface model used in UK weather and climate models, has been evaluated for temperate Europe. Compared with eddy covariance flux measurements, the CO2 uptake by the ecosystem is underestimated and the ET overestimated. In addition, the contribution to ET from soil evaporation and intercepted-water evaporation far outweighs the contribution of plant transpiration. To alleviate these biases, adaptations have been implemented in JULES, based on key literature references. These adaptations have improved the simulation of the spatio-temporal variability of the fluxes and the accuracy of the simulated GPP and ET, including its partitioning, and resulted in a shift of the seasonal soil moisture cycle. These adaptations are expected to increase the fidelity of climate simulations over Europe. Finally, the extreme summer of 2003 was used as an evaluation benchmark for the use of the model in climate change studies. The improved model captures the impact of the 2003 drought on carbon assimilation and on the water use efficiency of the plants; it does, however, underestimate the 2003 GPP anomalies. The simulations showed that a reduction of evaporation from the interception and soil reservoirs, albeit not of transpiration, largely explained the good correlation between the carbon and water flux anomalies observed during 2003.
This demonstrates the importance of being able to discriminate the response of the individual components of the ET flux to environmental forcing.
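The quantities compared in such an evaluation are straightforward once the ET components are available: the component fractions describe the partitioning, and the ratio of GPP to ET gives the ecosystem water use efficiency. A minimal sketch with hypothetical flux values (not model or site data):

```python
# Hypothetical daily fluxes for illustration (mm/day of water;
# 1 mm/day is equivalent to 1 kg of water per m2 per day).
transpiration = 1.8
soil_evap     = 0.7
interception  = 0.5

et  = transpiration + soil_evap + interception   # total evapotranspiration
gpp = 6.0                                        # g C m-2 day-1 (hypothetical)

# Partitioning of ET into its components
fractions = {name: flux / et for name, flux in
             [("transpiration", transpiration),
              ("soil evaporation", soil_evap),
              ("interception", interception)]}

# Ecosystem water use efficiency: carbon gained per unit water lost
wue = gpp / et

print(fractions)
print(f"WUE = {wue:.2f} g C per kg H2O")
```

In the biased simulation described above, the soil evaporation and interception fractions would dominate; the implemented adaptations shift the balance towards transpiration.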