Abstract:
Based on the theory of the pumping well test, a transient injection well test is proposed in this paper. Its design method and scope of application are discussed in detail. Mathematical models are developed for the short-time and long-time transient injection tests, respectively. A double-logarithm type-curve matching method is introduced for analyzing field transient injection test data. Together these establish a set of methods for transient injection test design, execution, and data analysis. Several field tests were analyzed, and the results show that the test model and method are suitable for the transient injection test and can be applied to real engineering problems.
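Double-logarithm type-curve matching of this kind follows the same logic as classical pumping-test analysis: a dimensionless theoretical curve is plotted on log-log axes and field drawdown data are shifted to overlay it. As a generic illustration (the Theis well function from pumping-test theory, not necessarily the authors' injection model), a minimal sketch:

```python
import math

def well_function(u, terms=60):
    """Theis well function W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!),
    computed from its series expansion (adequate for small and moderate u)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    s = -gamma - math.log(u)
    sign, fact = 1.0, 1.0
    for n in range(1, terms + 1):
        fact *= n                      # n!
        s += sign * u**n / (n * fact)
        sign = -sign
    return s

# Dimensionless type curve: W(u) versus 1/u on log-log axes. Field data
# (drawdown versus time at the same log scales) are translated until they
# overlay this curve; the offset yields transmissivity and storativity.
type_curve = [(1.0 / u, well_function(u)) for u in (1e-4, 1e-3, 1e-2, 1e-1, 1.0)]
```

The same matching procedure applies to injection, with drawdown replaced by the pressure buildup of the injected water.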
Abstract:
Poster presented at the 10th Symposium on Aquatic Microbial Ecology (SAME10), September 2-7, 2007, Faro.
Abstract:
A means of assessing the effectiveness of methods used in the numerical solution of various linear ill-posed problems is outlined. Two methods, Tikhonov's method of regularization and the quasireversibility method of Lattès and Lions, are appraised from this point of view.
The former method provides a useful means of incorporating a constraint into numerical algorithms. The analysis suggests that the approach can be generalized to embody constraints other than those employed by Tikhonov. This is effected, and the general "T-method" is the result.
A T-method is used on an extended version of the backwards heat equation with spatially variable coefficients. Numerical computations based upon it are performed.
The statistical method developed by Franklin is shown to have an interpretation as a T-method. This interpretation, although somewhat loose, does explain some empirical convergence properties which are difficult to pin down via a purely statistical argument.
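As a concrete illustration of the starting point for such methods, the following sketch shows zeroth-order Tikhonov regularization applied to an ill-conditioned linear system (a generic example, not the generalized T-method of the text):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Zeroth-order Tikhonov regularization: minimize ||Ax - b||^2 + lam * ||x||^2
    by solving the normal equations (A^T A + lam * I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Classic ill-conditioned test case: the 8x8 Hilbert matrix, whose condition
# number (~1e10) makes the unregularized solution extremely noise-sensitive.
n = 8
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
x_true = np.ones(n)
b_noisy = A @ x_true + 1e-8        # small data perturbation
x_reg = tikhonov_solve(A, b_noisy, lam=1e-8)
```

The penalty term lam * ||x||^2 is the simplest choice of constraint; generalizing it to other quadratic functionals is precisely the direction the T-method takes.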
Abstract:
Planetary atmospheres exist in a seemingly endless variety of physical and chemical environments, and an equally diverse set of methods exists by which we can study and characterize atmospheric composition. In order to better understand the fundamental chemistry and physical processes underlying all planetary atmospheres, my research of the past four years has focused on two distinct topics. First, I focused on the data analysis and spectral retrieval of observations obtained by the Ultraviolet Imaging Spectrograph (UVIS) instrument onboard the Cassini spacecraft while in orbit around Saturn. These observations consisted of stellar occultation measurements of Titan's upper atmosphere, probing the chemical composition in the region 300 to 1500 km above Titan's surface. I examined the relative abundances of Titan's two most prevalent chemical species, nitrogen and methane. I also focused on the aerosols that are formed through chemistry involving these two major species, and determined the vertical profiles of aerosol particles as a function of time and latitude. Moving beyond our own solar system, my second topic of investigation involved analysis of infrared light curves from the Spitzer Space Telescope, obtained as it measured the light from stars hosting planets of their own. I focused on both transit and eclipse modeling during Spitzer data reduction and analysis. In my initial work, I utilized the data to search for transits of planets a few Earth masses in size. In more recent research, I analyzed secondary eclipses of three exoplanets and constrained the range of possible temperatures and compositions of their atmospheres.
Abstract:
The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz to 5 kHz. Direct detection of these space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe.
The initial phase of LIGO started in 2002, and since then data were collected during six science runs. Instrument sensitivity improved from run to run thanks to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010.
In parallel with commissioning and data analysis on the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014.
This thesis describes the results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. It also discusses new techniques and tools developed at the 40 m prototype, including adaptive filtering, estimation of quantization noise in digital filters, and the design of isolation kits for ground seismometers.
The first part of this thesis is devoted to methods for bringing the interferometer into the linear regime, where collection of data becomes possible. The states of the longitudinal and angular controls of the interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail.
Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysical data that must be calibrated in units of meters or strain. The second part of this thesis describes the online calibration technique set up at both observatories to monitor the quality of the collected data in real time. A sensitivity analysis was performed to understand and eliminate noise sources in the instrument.
The coupling of noise sources into the gravitational-wave channel can be reduced if robust feedforward and optimal feedback control loops are implemented. The last part of this thesis describes static and adaptive feedforward noise cancellation techniques applied to the Advanced LIGO interferometers and tested at the 40 m prototype. Applications of optimal time-domain feedback control techniques and estimators to aLIGO control loops are also discussed.
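Adaptive feedforward cancellation of this kind is commonly built on least-mean-squares (LMS) adaptive filters: a witness sensor measures the disturbance, an FIR filter adapts to predict how it couples into the target channel, and the prediction is subtracted. The following is a generic sketch of that idea, not the aLIGO implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def lms_feedforward(witness, target, taps=16, mu=0.01):
    """Adapt an FIR filter so the witness channel predicts the noise seen in
    the target channel; subtracting the prediction leaves the residual."""
    w = np.zeros(taps)
    residual = np.empty(len(target))
    for n in range(len(target)):
        # Most-recent-first tap vector, zero-padded at the start of the record.
        x = witness[max(0, n - taps + 1):n + 1][::-1]
        x = np.pad(x, (0, taps - len(x)))
        y = w @ x                      # predicted coupled noise
        e = target[n] - y              # cancelled residual
        w += 2 * mu * e * x            # LMS gradient step
        residual[n] = e
    return residual, w

# Synthetic demonstration: the target contains a weak signal plus witness
# noise filtered through an unknown 2-tap coupling path.
noise = rng.standard_normal(20000)
signal = 0.01 * np.sin(2 * np.pi * 0.01 * np.arange(20000))
coupled = 0.8 * noise + 0.3 * np.concatenate(([0.0], noise[:-1]))
residual, w = lms_feedforward(noise, signal + coupled)
```

After convergence the filter taps approach the coupling path coefficients, and the residual retains the signal while the coupled noise is suppressed.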
Commissioning work is still ongoing at the sites. The first science run of Advanced LIGO is planned for September 2015 and will last three to four months. It will be followed by a set of small instrument upgrades installed on a time scale of a few months. The second science run will start in spring 2016 and last about six months. Since the current sensitivity of Advanced LIGO is already better than that of the initial detectors by more than a factor of 3, and keeps improving on a monthly basis, the upcoming science runs have a good chance of yielding the first direct detection of gravitational waves.
Abstract:
A study is made of the accuracy of electronic digital computer calculations of ground displacement and response spectra from strong-motion earthquake accelerograms. This involves an investigation of methods of the preparatory reduction of accelerograms into a form useful for the digital computation and of the accuracy of subsequent digital calculations. Various checks are made for both the ground displacement and response spectra results, and it is concluded that the main errors are those involved in digitizing the original record. Differences resulting from various investigators digitizing the same experimental record may become as large as 100% of the maximum computed ground displacements. The spread of the results of ground displacement calculations is greater than that of the response spectra calculations. Standardized methods of adjustment and calculation are recommended, to minimize such errors.
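The response spectrum calculation discussed above amounts to time-stepping a damped single-degree-of-freedom oscillator through the accelerogram for each structural period and recording the peak response. A minimal modern sketch of that computation (illustrative only, not the authors' procedure or numerical scheme):

```python
import math

def sdof_peak_displacement(accel, dt, period, zeta=0.05):
    """Peak relative displacement of a damped SDOF oscillator driven at its
    base by the ground acceleration record `accel`:
        u'' + 2*zeta*wn*u' + wn^2*u = -a_g(t)
    integrated with an explicit central-difference scheme (stable for dt < period/pi)."""
    wn = 2 * math.pi / period
    c1 = 1 / dt**2 + zeta * wn / dt
    c2 = 2 / dt**2 - wn**2
    c3 = 1 / dt**2 - zeta * wn / dt
    u_prev, u, peak = 0.0, 0.0, 0.0
    for a_g in accel:
        u_next = (c2 * u - c3 * u_prev - a_g) / c1
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# Displacement response spectrum: peak response over a grid of periods.
# Toy accelerogram: a 1 Hz sinusoid sampled at dt = 0.01 s for 20 s.
record = [math.sin(2 * math.pi * 0.01 * t) for t in range(2000)]
spectrum = {T: sdof_peak_displacement(record, 0.01, T) for T in (0.2, 0.5, 1.0, 2.0)}
```

For the toy record, the spectral ordinate peaks at the 1 s period, where the oscillator resonates with the 1 Hz excitation.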
Studies are made of the spread of response spectral values about their mean. The distribution is investigated experimentally by Monte Carlo techniques using an electric analog system with white noise excitation, and histograms are presented indicating the dependence of the distribution on the damping and period of the structure. Approximate distributions are obtained analytically by confirming and extending existing results with accurate digital computer calculations. A comparison of the experimental and analytical approaches indicates good agreement for low damping values where the approximations are valid. A family of distribution curves to be used in conjunction with existing average spectra is presented. The combination of analog and digital computations used with Monte Carlo techniques is a promising approach to the statistical problems of earthquake engineering.
Methods of analysis of very small earthquake ground motion records obtained simultaneously at different sites are discussed. The advantages of Fourier spectrum analysis for certain types of studies and methods of calculation of Fourier spectra are presented. The digitizing and analysis of several earthquake records is described and checks are made of the dependence of results on digitizing procedure, earthquake duration and integration step length. Possible dangers of a direct ratio comparison of Fourier spectra curves are pointed out and the necessity for some type of smoothing procedure before comparison is established. A standard method of analysis for the study of comparative ground motion at different sites is recommended.
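The warning above about direct ratios of Fourier spectra can be made concrete: raw amplitude spectra have near-zero ordinates whose ratios are wildly unstable, so some smoothing is applied first. A minimal sketch using a running-mean smoother (an illustration, not the smoothing procedure the study recommends):

```python
import numpy as np

def amplitude_spectrum(x, dt):
    """One-sided Fourier amplitude spectrum of a record sampled at interval dt."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), dt)
    return freqs, np.abs(X) * dt   # dt scaling approximates the continuous transform

def smooth(amp, half_width=3):
    """Running-mean smoothing applied before forming spectral ratios, so that
    near-zero raw ordinates in the denominator do not blow up the ratio."""
    kernel = np.ones(2 * half_width + 1)
    kernel /= kernel.sum()
    return np.convolve(amp, kernel, mode="same")

# Example: a 5 Hz sinusoid sampled at 100 Hz for 2 s.
t = np.arange(0, 2, 0.01)
freqs, amp = amplitude_spectrum(np.sin(2 * np.pi * 5 * t), 0.01)
amp_smooth = smooth(amp)
```

A site-to-site comparison would then use the ratio of the two smoothed spectra rather than the raw ones.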
Abstract:
I. The attenuation of sound due to particles suspended in a gas was first calculated by Sewell and later by Epstein in their classical works on the propagation of sound in a two-phase medium. In their work, and in more recent works which include calculations of sound dispersion, the calculations were made for systems in which there was no mass transfer between the two phases. In the present work, mass transfer between phases is included in the calculations.
The attenuation and dispersion of sound in a two-phase condensing medium are calculated as functions of frequency. The medium in which the sound propagates consists of a gaseous phase, a mixture of inert gas and condensable vapor, which contains condensable liquid droplets. The droplets, which interact with the gaseous phase through the interchange of momentum, energy, and mass (through evaporation and condensation), are treated from the continuum viewpoint. Limiting cases, for flow either frozen or in equilibrium with respect to the various exchange processes, help demonstrate the effects of mass transfer between phases. Included in the calculation is the effect of thermal relaxation within droplets. Pressure relaxation between the two phases is examined, but is not included as a contributing factor because it is of interest only at much higher frequencies than the other relaxation processes. The results for a system typical of sodium droplets in sodium vapor are compared to calculations in which there is no mass exchange between phases. It is found that the maximum attenuation is about 25 per cent greater and occurs at about one-half the frequency for the case which includes mass transfer, and that the dispersion at low frequencies is about 35 per cent greater. Results for different values of latent heat are compared.
II. In the flow of a gas-particle mixture through a nozzle, a normal shock may exist in the diverging section of the nozzle. In Marble’s calculation for a shock in a constant area duct, the shock was described as a usual gas-dynamic shock followed by a relaxation zone in which the gas and particles return to equilibrium. The thickness of this zone, which is the total shock thickness in the gas-particle mixture, is of the order of the relaxation distance for a particle in the gas. In a nozzle, the area may change significantly over this relaxation zone so that the solution for a constant area duct is no longer adequate to describe the flow. In the present work, an asymptotic solution, which accounts for the area change, is obtained for the flow of a gas-particle mixture downstream of the shock in a nozzle, under the assumption of small slip between the particles and gas. This amounts to the assumption that the shock thickness is small compared with the length of the nozzle. The shock solution, valid in the region near the shock, is matched to the well known small-slip solution, which is valid in the flow downstream of the shock, to obtain a composite solution valid for the entire flow region. The solution is applied to a conical nozzle. A discussion of methods of finding the location of a shock in a nozzle is included.
Abstract:
At the present time hydrobiological indicators are widely used for the control of surface water quality. Results of applying the methods suggested at the 1st Soviet-American seminar (1975), the development of improved methods, and an estimation of their usefulness under various conditions are presented in this report. Among the criteria permitting an estimation of the degree and character of changes in water quality, and of their connection with the functioning of river ecosystems in general, biological testing of natural waters appears to be the most universal; it is carried out in two main directions, ecological and physiological. This study summarises approaches in both directions.
Abstract:
The lack of information concerning the preservation of ovarian material of fish species inhibits standardization of methods for determining fecundity and measuring oocytes. The effects of four preservatives (10% phosphate-buffered formalin, modified Gilson’s solution, 70% ethanol, and freezing) on ovarian material weight and oocyte size were quantified for prespawning Atlantic cod (Gadus morhua), haddock (Melanogrammus aeglefinus), and American plaice (Hippoglossoides platessoides). Effects of preservation were similar between Atlantic cod and haddock but different between Atlantic cod and American plaice for nearly all comparisons. Although all treatments affected the weight of ovarian material, freezing caused the most change and formalin caused the least. Such significant species-specific effects should be quantified in the calculation of life history characteristics, such as fecundity, to minimize error. This is one of few studies dedicated to evaluating the effects of preservation on oocytes and ovarian material, and the first to evaluate multiple preservatives across several species.
Abstract:
As compared to crops and livestock, the genetic enhancement of fish is in its infancy. While significant progress has been achieved in the genetic improvement of temperate fish such as salmonids, no efforts were made until the late 1980s for the genetic improvement of tropical finfish, which account for about 90 percent of global aquaculture production. This paper traces the history of the Genetic Improvement of Farmed Tilapia (GIFT) project initiated in 1988 by the WorldFish Center and its partners for the development of methods for genetic enhancement of tropical finfish using Nile tilapia (Oreochromis niloticus) as a test species. It also describes the impacts of the project on the adoption of these methods for other species and the dissemination of improved breeds in several countries in Asia and the Pacific.
Abstract:
The Age and Growth Program at the Alaska Fisheries Science Center is tasked with providing age data in order to improve the basic understanding of the ecology and fisheries dynamics of Alaskan fish species. The primary focus of the Age and Growth Program is to estimate ages from otoliths and other calcified structures for age-structured modeling of commercially exploited stocks; however, the program has recently expanded its interests to include numerous studies on topics ranging from age estimate validation to the growth and life history of non-target species. Because so many applications rely upon age data and particularly upon assurances as to their accuracy and precision, the Age and Growth Program has developed this practical guide to document the age determination of key groundfish species from Alaskan waters. The main objective of this manual is to describe techniques specific to the age determination of commercially and ecologically important species studied by the Age and Growth Program. The manual also provides general background information on otolith morphology, dissection, and preparation, as well as descriptions of methods used to measure precision and accuracy of age estimates. This manual is intended not only as a reference for age readers at the AFSC and other laboratories, but also to give insight into the quality of age estimates to scientists who routinely use such data.
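A widely used index for the precision of repeated age readings is the Beamish-Fournier average percent error (APE). The sketch below is illustrative of that standard statistic, not necessarily the exact measure adopted by the Age and Growth Program:

```python
def average_percent_error(readings):
    """Beamish-Fournier APE: for each fish, the mean absolute deviation of its
    repeated age readings from their mean, expressed as a fraction of that
    mean; the result is averaged over fish and reported as a percentage."""
    per_fish = []
    for ages in readings:
        mean_age = sum(ages) / len(ages)
        per_fish.append(sum(abs(a - mean_age) for a in ages) / len(ages) / mean_age)
    return 100.0 * sum(per_fish) / len(per_fish)

# Hypothetical example: two independent readers, three fish.
readings = [(10, 12), (5, 5), (8, 9)]
ape = average_percent_error(readings)
```

Lower APE indicates better agreement among readers; values are typically compared against a laboratory reference level when accepting age data for stock assessment.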
Abstract:
Understanding the regulatory mechanisms that are responsible for an organism's response to environmental change is an important issue in molecular biology. A first and important step towards this goal is to detect genes whose expression levels are affected by altered external conditions. A range of methods to test for differential gene expression, both in static as well as in time-course experiments, have been proposed. While these tests answer the question whether a gene is differentially expressed, they do not explicitly address the question when a gene is differentially expressed, although this information may provide insights into the course and causal structure of regulatory programs. In this article, we propose a two-sample test for identifying intervals of differential gene expression in microarray time series. Our approach is based on Gaussian process regression, can deal with arbitrary numbers of replicates, and is robust with respect to outliers. We apply our algorithm to study the response of Arabidopsis thaliana genes to an infection by a fungal pathogen using a microarray time series dataset covering 30,336 gene probes at 24 observed time points. In classification experiments, our test compares favorably with existing methods and provides additional insights into time-dependent differential expression.
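The machinery underlying such a test can be sketched compactly: fit a Gaussian process to the two conditions jointly (one shared function) and independently (one function each), then compare marginal likelihoods; intervals where the independent fits score markedly higher are candidates for differential expression. This is a generic GP sketch in that spirit, not the authors' algorithm:

```python
import numpy as np

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential kernel k(a, b) = var * exp(-(a - b)^2 / (2 * length^2))."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def log_marginal(x, y, noise=0.1):
    """Log marginal likelihood of a zero-mean GP with RBF kernel and iid noise."""
    K = rbf(x, x) + noise**2 * np.eye(len(x))
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (y @ np.linalg.solve(K, y) + logdet + len(y) * np.log(2 * np.pi))

# Hypothetical expression profiles at 12 time points: the "treated" series is
# clearly shifted relative to the control.
x = np.linspace(0.0, 4.0, 12)
y_control = np.sin(x)
y_treated = np.sin(x) + 2.0

independent = log_marginal(x, y_control) + log_marginal(x, y_treated)
shared = log_marginal(np.concatenate([x, x]),
                      np.concatenate([y_control, y_treated]))
```

Scanning this comparison over sub-intervals of the time axis, rather than the whole series, is what localizes *when* a gene is differentially expressed.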
Abstract:
The primary objective of this project, “the Assessment of Existing Information on Atlantic Coastal Fish Habitat”, is to inform conservation planning for the Atlantic Coastal Fish Habitat Partnership (ACFHP). ACFHP is recognized as a Partnership by the National Fish Habitat Action Plan (NFHAP), whose overall mission is to protect, restore, and enhance the nation’s fish and aquatic communities through partnerships that foster fish habitat conservation. This project is a cooperative effort of the NOAA/NOS Center for Coastal Monitoring and Assessment (CCMA) Biogeography Branch and ACFHP. The Assessment includes three components: (1) a representative bibliographic and assessment database, (2) a Geographic Information System (GIS) spatial framework, and (3) a summary document with a description of methods, analyses of habitat assessment information, and recommendations for further work. The spatial bibliography was created by linking the bibliographic table, developed in Microsoft Excel and exported to SQL Server, with the spatial framework, developed in ArcGIS and exported to GoogleMaps. The bibliography is a comprehensive, searchable database of over 500 selected documents and data sources on Atlantic coastal fish species and habitats. Key information captured for each entry includes basic bibliographic data, spatial footprint (e.g. waterbody or watershed), species and habitats covered, and electronic availability. Information on habitat condition indicators, threats, and conservation recommendations is extracted from each entry and recorded in a separate linked table. The spatial framework is a functional digital map based on polygon layers of watersheds and estuarine and marine waterbodies derived from NOAA’s Coastal Assessment Framework, MMS/NOAA’s Multipurpose Marine Cadastre, and other sources, providing spatial reference for all of the documents cited in the bibliography.
Together, the bibliography and assessment tables and their spatial framework provide a powerful tool to query and assess available information through a publicly available web interface. They were designed to support the development of priorities for ACFHP’s conservation efforts within a geographic area extending from Maine to Florida, and from coastal watersheds seaward to the edge of the continental shelf. The Atlantic Coastal Fish Habitat Partnership has made initial use of the Assessment of Existing Information. Though it has not yet applied the AEI in a systematic or structured manner, it expects to find further uses as the draft conservation strategic plan is refined, and as regional action plans are developed. It also provides a means to move beyond an “assessment of existing information” towards an “assessment of fish habitat”, and is being applied towards the National Fish Habitat Action Plan (NFHAP) 2010 Assessment. Beyond the scope of the current project, there may be application to broader initiatives such as Integrated Ecosystem Assessments (IEAs), Ecosystem Based Management (EBM), and Marine Spatial Planning (MSP).
Abstract:
Samples of the commercially and recreationally important West Australian dhufish (Glaucosoma hebraicum) were obtained from the lower west coast of Australia by a variety of methods. Fish <300 mm TL were caught over flat, hard substrata and low-lying limestone reefs, whereas larger fish were caught over larger limestone and coral reef formations. Maximum total lengths, weights, and ages were 981 mm, 15.3 kg, and 39 years, respectively, for females and 1120 mm, 23.2 kg, and 41 years, respectively, for males. The von Bertalanffy growth curves for females and males were significantly different. The values for L∞, k, and t0 in the von Bertalanffy growth equations were 929 mm, 0.111/year, and –0.141 years, respectively, for females, and 1025 mm, 0.111/year, and –0.052 years, respectively, for males. Preliminary estimates of total mortality indicated that G. hebraicum is now subjected to a level of fishing pressure that must be of concern to fishery managers. Glaucosoma hebraicum, which spawns between November and April and predominantly between December and March, breeds at a wide range of depths and is a multiple spawner. The L50’s for females and males at first maturity, i.e. 301 and 320 mm, respectively, were attained by about the end of the third year of life and are well below the minimum legal length (MLL) of 500 mm. Because females and males did not reach the MLL until the end of their seventh and sixth years of life, respectively, they would have had, on average, the opportunity of spawning during four and three spawning seasons, respectively, before they reached the MLL. However, because G. hebraicum caught in water depths >40 m typically die upon release, a MLL is of limited use for conserving this species. 
Alternative approaches, such as restricting fishing activity in highly fished areas, reducing daily bag limits for recreational fishermen, introducing quotas or revising specific details of certain commercial hand-line licences (or doing both) are more likely to provide effective conservation measures.
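The growth parameters reported above fully determine length at age through the von Bertalanffy equation L(t) = L∞ (1 − e^(−k (t − t0))), and inverting it recovers the ages at which fish reach a given length, such as the 500 mm minimum legal length. A sketch using the study's reported parameter values:

```python
import math

def von_bertalanffy(t, L_inf, k, t0):
    """Von Bertalanffy length-at-age: L(t) = L_inf * (1 - exp(-k * (t - t0)))."""
    return L_inf * (1.0 - math.exp(-k * (t - t0)))

def age_at_length(L, L_inf, k, t0):
    """Inverse growth curve: t = t0 - ln(1 - L / L_inf) / k, for L < L_inf."""
    return t0 - math.log(1.0 - L / L_inf) / k

# Parameters reported above for Glaucosoma hebraicum.
female = dict(L_inf=929.0, k=0.111, t0=-0.141)
male = dict(L_inf=1025.0, k=0.111, t0=-0.052)

# Approximate age at which females reach the 500 mm minimum legal length.
t_mll_female = age_at_length(500.0, **female)
```

The computed value falls near the end of the seventh year of life, consistent with the figure quoted in the abstract.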
Abstract:
There is increasing adoption of computer-based tools to support the product development process. Tools include computer-aided design, computer-aided manufacture, systems engineering and product data management systems. The fact that companies choose to invest in tools might be regarded as evidence that tools, in aggregate, are perceived to possess business value through their application to engineering activities. Yet the ways in which value accrues from tool technology are poorly understood.
This report records the proceedings of an international workshop during which some novel approaches to improving our understanding of this problem of tool valuation were presented and debated. The value of methods and processes was also discussed. The workshop brought together British, Dutch, German and Italian researchers. The presenters included speakers from industry and academia (the University of Cambridge, the University of Magdeburg and the Politechnico de Torino).
The work presented showed great variety. Research methods included case studies, questionnaires, statistical analysis, semi-structured interviews, deduction, inductive reasoning, and the recording of anecdotes and analogies. The presentations drew on financial investment theory, the industrial experience of workshop participants, discussions with students developing tools, modern economic theories and speculation on the effects of company capabilities.