861 results for Ecosystem management -- Queensland -- Johnstone (Shire) -- Data processing.
Abstract:
Metabolomics is a rapidly growing research field that studies the response of biological systems to environmental factors, disease states and genetic modifications. It aims at measuring the complete set of endogenous metabolites, i.e. the metabolome, in a biological sample such as plasma or cells. Because metabolites are the intermediates and end products of biochemical reactions, metabolite compositions and metabolite levels in biological samples can provide a wealth of information on ongoing processes in a living system. Due to the complexity of the metabolome, metabolomic analysis poses a challenge to analytical chemistry. Adequate sample preparation is critical to accurate and reproducible analysis, and the analytical techniques must have high resolution and sensitivity to allow detection of as many metabolites as possible. Furthermore, as the information contained in the metabolome is immense, the data sets collected in metabolomic studies are very large. To extract the relevant information from such large data sets, efficient data processing and multivariate data analysis methods are needed. In the research presented in this thesis, metabolomics was used to study mechanisms of polymeric gene delivery to retinal pigment epithelial (RPE) cells. The aim of the study was to detect differences in metabolomic fingerprints between transfected cells and non-transfected controls, and thereafter to identify the metabolites responsible for the discrimination. The plasmid pCMV-β was introduced into RPE cells using the vector polyethyleneimine (PEI). The samples were analyzed using high performance liquid chromatography (HPLC) and ultra performance liquid chromatography (UPLC) coupled to a triple quadrupole (QqQ) mass spectrometer (MS). The software MZmine was used for raw data processing, and principal component analysis (PCA) was used for statistical data analysis.
The results revealed differences in metabolomic fingerprints between transfected cells and non-transfected controls. However, reliable fingerprinting data could not be obtained because of low analysis repeatability, so no attempt was made to identify the metabolites responsible for the discrimination between sample groups. Repeatability and accuracy of analyses can be improved by protocol optimization, but in this study optimization of the analytical methods was hindered by the very small number of samples available. In conclusion, this study demonstrates that obtaining reliable fingerprinting data is technically demanding, and that protocols need to be thoroughly optimized before the goal of gaining information on mechanisms of gene delivery can be reached.
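The fingerprinting approach described above, a sample-by-metabolite peak intensity matrix projected with PCA, can be sketched as follows. This is a minimal pure-Python illustration with invented peak intensities and group sizes; the thesis itself used MZmine and dedicated statistical software.

```python
# Minimal PCA fingerprinting sketch (pure Python, invented peak intensities).
# Rows = samples (3 "transfected", 3 "control"), columns = metabolite peak areas.
data = [
    [120.0, 30.0, 5.0, 80.0],   # transfected
    [118.0, 32.0, 6.0, 78.0],   # transfected
    [122.0, 29.0, 5.5, 81.0],   # transfected
    [90.0, 60.0, 5.2, 79.0],    # control
    [92.0, 58.0, 5.8, 80.5],    # control
    [88.0, 61.0, 5.1, 78.5],    # control
]

def mean_center(matrix):
    """Subtract each column mean so PCA captures variance, not offsets."""
    n = len(matrix)
    means = [sum(row[j] for row in matrix) / n for j in range(len(matrix[0]))]
    return [[row[j] - means[j] for j in range(len(row))] for row in matrix]

def first_principal_component(matrix, iters=200):
    """Power iteration on the covariance matrix -> dominant eigenvector."""
    x = mean_center(matrix)
    m = len(x[0])
    cov = [[sum(row[i] * row[j] for row in x) / (len(x) - 1)
            for j in range(m)] for i in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

def scores(matrix, component):
    """Project mean-centered samples onto the component (PC1 scores)."""
    x = mean_center(matrix)
    return [sum(a * b for a, b in zip(row, component)) for row in x]

pc1 = first_principal_component(data)
t = scores(data, pc1)
print(t)  # the two sample groups separate along PC1
```

With well-separated group profiles like these, the transfected and control samples fall on opposite sides of zero along PC1; in the study, low repeatability meant real data did not cluster this cleanly.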
Abstract:
Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze for all of them in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate-mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution.
Hydrophilic interaction liquid chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with the plasma volume expanders (PVE) dextran and hydroxyethyl starch (HES). Specificity of the assay was improved by removing interfering matrix compounds with size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be used extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore, LC-TOFMS can be exploited widely in doping control, reducing the need for several separate analysis techniques.
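A reverse database search of the kind described above can be sketched as a match on retention time and accurate mass. The compound entries, tolerance values and detected peaks below are illustrative only, and the isotope-match criterion is omitted for brevity.

```python
# Hedged sketch of a reverse database search: each detected peak is matched
# against a reference table on retention time (RT) and accurate mass.
# All entries, tolerances and peak values here are invented for illustration.

REFERENCE = [
    # (name, expected RT in min, monoisotopic m/z of the protonated molecule)
    ("morphine",   2.10, 286.1438),
    ("codeine",    3.45, 300.1594),
    ("salbutamol", 1.80, 240.1594),
]

RT_TOL = 0.2       # retention-time window, min
MASS_TOL = 0.001   # mass window, Da (1 mDa)

def screen(peaks):
    """Return reference entries matched by any detected (rt, m/z) peak."""
    matched = []
    for name, ref_rt, ref_mz in REFERENCE:
        for rt, mz in peaks:
            if abs(rt - ref_rt) <= RT_TOL and abs(mz - ref_mz) <= MASS_TOL:
                matched.append((name, rt, (mz - ref_mz) * 1000))  # error in mDa
                break
    return matched

detected = [(2.08, 286.1441), (5.90, 310.2020)]
hits = screen(detected)
print(hits)  # only the first peak matches a reference entry
```

A production screen would also verify the isotope pattern and report the results in a spreadsheet, as the thesis describes.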
Abstract:
Over the past few years, studies of cultured neuronal networks have opened up avenues for understanding the ion channels, receptor molecules and synaptic plasticity that may form the basis of learning and memory. Hippocampal neurons from rats are dissociated and cultured on a surface containing a grid of 64 electrodes. The signals from these 64 electrodes are acquired using a fast data acquisition system, MED64 (Alpha MED Sciences, Japan), at a sampling rate of 20,000 samples per second with a precision of 16 bits per sample. A few minutes of acquired data run into a few hundred megabytes. Because the volume of data is huge, the data processing for neural analysis is highly compute-intensive; the major processing requirements are noise removal, pattern recovery, pattern matching and clustering. In order to interface a neuronal colony to the physical world, these computations need to be performed in real time, and a single processor such as a desktop computer may not be adequate. Parallel computing is a method used to satisfy the real-time computational requirements of a neuronal system that interacts with an external world, while increasing the flexibility and scalability of the application. In this work, we developed a parallel neuronal system using a multi-node digital signal processing system. With 8 processors, the system is able to compute and map incoming signals, segmented over a period of 200 ms, into an action in a trained cluster system in real time.
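The segment-and-distribute scheme described above can be sketched as follows. The per-window feature computed here (RMS amplitude) is a placeholder for the actual pattern-matching pipeline, and the signal values are synthetic; only the window size (200 ms at 20 kHz) and the worker count come from the text.

```python
# Illustrative sketch: a continuous recording is cut into 200 ms windows
# (4000 samples at 20 kHz) and the windows are processed in parallel by a
# pool of 8 workers. RMS is a crude activity proxy, standing in for the
# real noise-removal / pattern-matching / clustering stages.
from concurrent.futures import ThreadPoolExecutor
import math

SAMPLING_RATE = 20_000                         # samples per second
WINDOW = SAMPLING_RATE * 200 // 1000           # 4000 samples per 200 ms window

def rms(window):
    """Root-mean-square amplitude of one window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def process_parallel(signal, workers=8):
    """Split the signal into windows and compute features concurrently."""
    windows = [signal[i:i + WINDOW] for i in range(0, len(signal), WINDOW)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(rms, windows))

# One second of synthetic data: quiet baseline with a burst in window 2.
signal = [0.01] * (WINDOW * 5)
for i in range(2 * WINDOW, 3 * WINDOW):
    signal[i] = 0.5
features = process_parallel(signal)
print(features)  # the burst window stands out
```

On a multi-node DSP system the windows would be distributed across physical processors rather than threads, but the decomposition of the stream into fixed 200 ms units is the same.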
Abstract:
Structural Health Monitoring has gained wide acceptance in the recent past as a means to monitor a structure and provide an early warning of an unsafe condition using real-time data. Utilization of structurally integrated, distributed sensors to monitor the health of a structure through accurate interpretation of sensor signals and real-time data processing can greatly reduce the inspection burden. The rapid improvement of fiber optic sensor technology for strain, vibration, ultrasonic and acoustic emission measurements in recent times makes it a feasible alternative to the traditional strain gauges, PVDF and conventional piezoelectric sensors used for Non Destructive Evaluation (NDE) and Structural Health Monitoring (SHM). Optical fiber-based sensors offer advantages over conventional strain gauges and PZT devices in terms of size, ease of embedment, immunity from electromagnetic interference (EMI) and potential for multiplexing a number of sensors. The objective of this paper is to demonstrate acoustic wave sensing using an Extrinsic Fabry-Perot Interferometric (EFPI) sensor on GFRP composite laminates. For this purpose, experiments were initially carried out for strain measurement with fiber optic sensors on GFRP laminates with intentionally introduced holes of different sizes as defects. The results obtained from these experiments are presented in this paper. Numerical modeling has been carried out to obtain the relationship between defect size and strain.
Abstract:
The term Structural Health Monitoring has gained wide acceptance in the recent past as a means to monitor a structure and provide an early warning of an unsafe condition using real-time data. Utilization of structurally integrated, distributed sensors to monitor the health of a structure through accurate interpretation of sensor signals and real-time data processing can greatly reduce the inspection burden. The rapid improvement of Fiber Bragg Grating sensor technology for strain, vibration and acoustic emission measurements in recent times makes it a feasible alternative to the traditional strain gauge transducers and conventional piezoelectric sensors used for Non Destructive Evaluation (NDE) and Structural Health Monitoring (SHM). Optical fiber-based sensors offer advantages over conventional strain gauges, PVDF film and PZT devices in terms of size, ease of embedment, immunity from electromagnetic interference (EMI) and potential for multiplexing a number of sensors. The objective of this paper is to demonstrate the feasibility of the Fiber Bragg Grating sensor and compare its utility with conventional strain gauges and PVDF film sensors. For this purpose, experiments are being carried out in the laboratory on a composite wing of a mini air vehicle (MAV). In this paper, the results obtained from these preliminary experiments are discussed.
Abstract:
Ultrasonic C-Scan is used very often to detect flaws and defects in composite components arising during fabrication, as well as damage resulting from service conditions. Evaluation and characterization of defects and damage in composites require experience and a good understanding of the material, as composites are distinctly different in composition and behavior from conventional metallic materials. The failure mechanisms in composite materials are quite complex: they involve the interaction of matrix cracking, fiber-matrix interface debonding, fiber pullout, fiber fracture and delamination. Generally all of them occur together, making stress and failure analysis very complex. Under low-velocity impact loading, delamination is observed to be a major failure mode. In composite materials the ultrasonic waves suffer high acoustic attenuation and scattering, making data interpretation difficult. However, these difficulties can be overcome to a great extent by proper selection of the probe; by probe parameter settings such as pulse width, pulse amplitude, pulse repetition rate, delay, blanking and gain; and by data processing, which includes image processing performed on the image obtained by the C-Scan.
Abstract:
A stimulus quantity is constructed from a similarity criterion number. Based on experimental results of initiation under different impact conditions, the expectation and variance of the stimulus are estimated by the maximum likelihood method under a normality assumption, and the processing results are discussed. By treating the impact-initiation data of explosive charges as sensitivity test data with non-repeated stimulus levels, the maximum likelihood method is applied to the explosive experimental data. The results show that a similarity law exists for simulated impact-initiation experiments of explosive charges, and that the similarity-criterion statistical method can be used to process the experimental data.
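The estimation described above, maximum likelihood fitting of a normal response curve to go/no-go initiation outcomes at non-repeated stimulus levels, can be sketched as follows. The trial data are invented, and a coarse grid search stands in for a proper optimizer.

```python
# Hedged sketch: go/no-go outcomes at non-repeated stimulus levels, fitted
# by maximum likelihood under the normality assumption P(fire at s) =
# Phi((s - mu) / sigma). Trial values are invented; a grid search replaces
# a real optimizer for the sake of a self-contained example.
import math

# (stimulus level, initiated?), each level tested once
trials = [(1.0, 0), (1.4, 0), (1.6, 0), (1.8, 1),
          (2.0, 0), (2.2, 1), (2.6, 1), (3.0, 1)]

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def log_likelihood(mu, sigma):
    """Bernoulli log-likelihood of the observed fire/no-fire outcomes."""
    ll = 0.0
    for s, fired in trials:
        p = norm_cdf((s - mu) / sigma)
        p = min(max(p, 1e-12), 1.0 - 1e-12)   # guard against log(0)
        ll += math.log(p) if fired else math.log(1.0 - p)
    return ll

best = None
for mu in [1.0 + 0.02 * i for i in range(150)]:        # 1.00 .. 3.98
    for sigma in [0.05 + 0.02 * j for j in range(100)]:  # 0.05 .. 2.03
        ll = log_likelihood(mu, sigma)
        if best is None or ll > best[0]:
            best = (ll, mu, sigma)

_, mu_hat, sigma_hat = best
print(mu_hat, sigma_hat)  # estimated mean and spread of the sensitivity
```

The overlap between the fired outcome at 1.8 and the non-fired outcome at 2.0 keeps the variance estimate away from zero; with perfectly separated data the normal-assumption MLE degenerates, which is one reason sensitivity tests are designed to probe the transition region.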
Abstract:
A low-temperature control apparatus for fatigue experiments and its crack measuring system were developed and used for offshore structural steel A131 under conditions of low temperature and random sea-ice loading. The experimental procedures and data processing are described, and a universal random-data processing software package for fatigue crack propagation (FCP) under spectrum loading was written. Several specific features of random ice-induced FCP that differ from constant-amplitude FCP behaviour were identified, and the effect of temperature on ice-induced FCP was pointed out, with emphasis on an easily neglected aspect of designing platforms for sea ice. Finally, the differences in FCP behaviour between sea-ice and ocean-wave loading were presented.
Abstract:
Almost all extreme events lasting less than several weeks that significantly impact ecosystems are weather related. This review examines the response of estuarine systems to intense short-term perturbations caused by major weather events such as hurricanes. Current knowledge concerning these effects is limited to relatively few studies where hurricanes and storms impacted estuaries with established environmental monitoring programs. Freshwater inputs associated with these storms were found to initially result in increased primary productivity. When hydrographic conditions are favorable, bacterial consumption of organic matter produced by the phytoplankton blooms and deposited during the initial runoff event can contribute to significant oxygen deficits during subsequent warmer periods. Salinity stress and habitat destruction associated with freshwater inputs, as well as anoxia, adversely affect benthic populations and fish. In contrast, mobile invertebrate species such as shrimp, which have a short life cycle and the ability to migrate during the runoff event, initially benefit from the increased primary productivity and decreased abundance of fish predators. Events studied so far indicate that estuaries rebound in one to three years following major short-term perturbations. However, repeated storm events without sufficient recovery time may cause a fundamental shift in ecosystem structure (Scavia et al. 2002). This is a scenario consistent with the predicted increase in hurricanes for the east coast of the United States. More work on the response of individual species to these stresses is needed so management of commercial resources can be adjusted to allow sufficient recovery time for affected populations.
Abstract:
ENGLISH: Comparison of physical and biological environmental factors affecting the aggregation of tunas with the success of fishing by the commercial fleets requires that catch and effort data be examined in greater detail than has been presented in these publications. Consequently, the United States Bureau of Commercial Fisheries Biological Laboratory, San Diego, to serve the needs of its program of research on causes of variations in tuna abundance, made arrangements with the Tuna Commission to summarize these catch and effort data by month, by one-degree area and by fishing vessel size-class, for the years 1951-1960 for bait boats and 1953-1960 for purse-seiners. The present paper describes the techniques employed in summarizing these data by automatic data processing methods. It also presents the catch and effort information by months, by five-degree areas and by certain combinations of five-degree areas for use by fishermen, industry personnel and research agencies. Because of space limitations and other considerations, the one-degree tabulations are not included but are available at the Tuna Commission and Bureau laboratories. SPANISH: La comparación de los factores ambientales físicos y biológicos que afectan la agrupación del atún, con el éxito obtenido en la pesca por las flotas comerciales, requiere que los datos sobre la captura y el esfuerzo sean examinados con mayor detalle de lo que han sido presentados en estas publicaciones.
En consecuencia, el Laboratorio Biológico del Buró de Pesquerías Comerciales de los Estados Unidos, situado en San Diego, a fin de llenar los requisitos de su programa de investigación sobre las causas de las variaciones en la abundancia del atún, hizo arreglos con la Comisión del Atún para sumarizar esos datos sobre la captura y el esfuerzo por meses, por áreas de un grado, por clases de tamaño de las embarcaciones de pesca durante los años 1951-1960 en lo que concierne a los barcos de carnada y durante el período 1953-1960 en lo que respecta a los barcos rederos. El presente trabajo describe la técnica empleada en la sumarización de dichos datos mediante métodos automáticos de manejo de datos. También se da aquí la información sobre la captura y el esfuerzo por meses, por áreas de cinco grados y ciertas combinaciones de áreas de cinco grados para el uso de los pescadores, del personal de la industria y de las oficinas de investigación. Por falta de espacio y otras razones, las tabulaciones de las áreas de un grado no han sido incluídos en este trabajo, pero están a la disposición de quien tenga interés en los laboratorios de la Comisión del Atún y del Buró.
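The month-by-area summarization described above can be sketched as a simple group-and-total pass over logbook records. The record layout, field values and the five-degree snapping helper below are invented for illustration; the original work was done on the automatic data processing equipment of the era.

```python
# Sketch of the summarization step: raw records are grouped by month and
# five-degree area, and catch and effort are totaled per group. All field
# names and sample values are invented for illustration.
from collections import defaultdict

# (year, month, latitude, longitude, catch in tons, effort in days fished)
records = [
    (1955, 3, -2.0, -92.0, 14.0, 3.0),
    (1955, 3, -4.5, -91.0, 9.0, 2.0),
    (1955, 3, -7.0, -92.5, 5.0, 1.5),
    (1955, 4, -2.5, -93.0, 11.0, 2.5),
]

def five_degree_area(lat, lon):
    """Snap a position to the corner of its five-degree square."""
    return (int(lat // 5) * 5, int(lon // 5) * 5)

totals = defaultdict(lambda: [0.0, 0.0])   # key -> [catch, effort]
for year, month, lat, lon, catch, effort in records:
    key = (year, month, five_degree_area(lat, lon))
    totals[key][0] += catch
    totals[key][1] += effort

for key, (catch, effort) in sorted(totals.items()):
    print(key, "catch:", catch, "effort:", effort,
          "CPUE:", round(catch / effort, 2))
```

Summing at one-degree resolution instead only changes the snapping helper; the grouping logic is identical, which is why both tabulations could be produced from the same records.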
Abstract:
Radar services are occasionally affected by wind farms. This paper presents a comprehensive description of the effects that a wind farm may cause on the different radar services, and it compiles a review of the recent research results regarding the mitigation techniques to minimize this impact. Mitigation techniques to be applied at the wind farm and on the radar systems are described. The development of thorough impact studies before the wind farm is installed is presented as the best way to analyze in advance the potential for interference, and subsequently identify the possible solutions to allow the coexistence of wind farms and radar services.
Abstract:
The discrimination of stocks and separate reproductive units within fish species to facilitate fisheries management based on biological data has always been a challenge for fisheries biologists. We describe the use of three different molecular genetic techniques to detect genetic differences between stocks and closely related species. Direct sequencing of the mitochondrial ND3 gene describes the relationship between different aquaculture strains and natural populations of rainbow trout, and revealed genetic homogeneity within the hatchery strains. Microsatellite analyses were used to explore the differences between redfish species of the genus Sebastes and to verify population structure within S. mentella and S. marinus. This led to unequivocal discrimination of the species and an indication of population structure within those species in the North Atlantic. The Amplified Fragment Length Polymorphism (AFLP) methodology revealed genetic differences between Baltic and North Sea dab (Limanda limanda) and a possible population structure within the North Sea.
Abstract:
Fish research institutes in Europe have made considerable efforts to develop rapid, objective sensory methods for the evaluation of fish freshness. The Quality Index Method (QIM) has been recommended for a European initiative regarding standardisation and harmonisation of sensory evaluation of fish, and QIM schemes have been developed for various common European fish species. Research has now provided the industry with a convenient, objective and powerful tool for measuring the freshness of fish kept in ice. Further research is needed to evaluate the applicability of QIM for fish handled, stored and processed under different conditions. However, for the progress and development of QIM it is now very important that the fish sector implements QIM in fish auctions and in the quality management systems of fish processing plants.
Abstract:
Future coastal management practices require that a holistic, ecosystem management approach be adopted. Coastal ecosystems, however, present a variety of specific and unique challenges relative to open ocean systems. In particular, interactions with the seabed significantly influence the coastal ecosystem. Observing technologies must be developed and employed to incorporate seafloor interactions, processes and habitat diversity into research and management activities. An ACT Workshop on Seabed Sensor Technology was held February 1-3, 2006 in Savannah, Georgia, to summarize the current state of sensor technologies applicable to examining and monitoring the coastal seabed, including the near-bed benthic boundary layer and surface sediment layer. Workshop participants were specifically charged to identify current sensors in use, recommend improvements to these systems and to identify areas for future development and activities that would advance the use of sensor technology in the observation, monitoring and management of the coastal benthic environment. (pdf contains 23 pages)