837 results for Measure of time
Abstract:
NOAA’s Coral Reef Conservation Program (CRCP) develops coral reef management priorities by bringing together various partners to better understand threats to coral reef ecosystems, with the goal of conserving, protecting, and restoring these resources. The place-based and ecosystem-based management approaches employed by the CRCP require that spatially explicit information about benthic habitats and fish utilization be available to characterize coral reef ecosystems and set conservation priorities. To accomplish this, seafloor habitat mapping of coral reefs around the U.S. Virgin Islands (USVI) and Puerto Rico has been ongoing since 2004. In 2008, fishery acoustics surveys were added to NOAA survey missions in the USVI and Puerto Rico to assess fish distribution and abundance in relation to benthic habitats in high-priority conservation areas. NOAA’s National Centers for Coastal Ocean Science (NCCOS) have developed fisheries acoustics survey capabilities onboard the NOAA ship Nancy Foster to complement the CRCP seafloor habitat mapping effort spearheaded by the Center for Coastal Monitoring and Assessment Biogeography Branch (CCMA-BB). The integration of these activities has evolved on the Nancy Foster over the three years summarized in this report, and a strategy for improved operations and products has emerged over that time. Not only has the concurrent operation of multibeam and fisheries acoustics surveys been beneficial in optimizing ship time and resources, but this joint effort has also advanced an integrated approach to characterizing bottom and mid-water habitats and the fishes associated with them. CCMA conducts multibeam surveys to systematically map and characterize coral reef ecosystems, resulting in products such as high-resolution bathymetric maps, backscatter information, and benthic habitat classification maps. These products focus on benthic features and the live-bottom habitats associated with them.
NCCOS Centers (the Center for Coastal Fisheries and Habitat Research and the Center for Coastal Environmental Health and Biomolecular Research) characterize coral reef ecosystems by using fisheries acoustics methods to capture biological information through the entire water column. Spatially explicit information on marine resources derived from fisheries acoustics surveys, such as maps of fish density, supports marine spatial planning strategies and decision making by providing a biological metric for evaluating coral reef ecosystems and assessing impacts from pollution, fishing pressure, and climate change. Data from fisheries acoustics surveys address management needs by providing a measure of biomass in management areas, detecting spatial and temporal responses in distribution relative to natural and anthropogenic impacts, and identifying hotspots that support high fish abundance or fish aggregations. Fisheries acoustics surveys conducted alongside multibeam mapping efforts inherently couple water column data with information on benthic habitats and provide information on the heterogeneity of both benthic habitats and biota in the water column. This information helps resource managers understand how fishes are organized around habitat structure and the scale at which these relationships are important. Where resource managers require place-based assessments of the location of critical habitats along with high abundances of fish, concurrent multibeam and fisheries acoustics surveys serve as an important tool for characterizing and prioritizing coral reef ecosystems. This report summarizes the evolution of fisheries acoustics surveys onboard the NOAA ship Nancy Foster from 2008 to 2010, in conjunction with multibeam data collection, aimed at characterizing benthic and mid-water habitats in high-priority conservation areas around the USVI and Puerto Rico.
It also serves as a resource for the continued development of consistent data products derived from acoustic surveys. By focusing on the activities of 2010, this report highlights the progress made to date and illustrates the potential application of fisheries data derived from acoustic surveys to the management of coral reef ecosystems.
Abstract:
Our analyses of observer records reveal that abundance estimates are strongly influenced by the timing of longline operations in relation to dawn and dusk and by soak time, the amount of time that baited hooks are available in the water. Catch data will underestimate the total mortality of several species because hooked animals are “lost at sea”: they fall off, are removed, or escape from the hook before the longline is retrieved. For example, longline segments with soak times of 20 hours were retrieved with fewer skipjack tuna and seabirds than segments with soak times of 5 hours. The mortality of some seabird species is up to 45% higher than previously estimated. The effects of soak time and timing vary considerably between species. Soak time and exposure to dusk periods have strong positive effects on the catch rates of many species. In particular, the catch rates of most shark and billfish species increase with soak time. At the end of longline retrieval, for example, expected catch rates for broadbill swordfish are four times those at the beginning of retrieval. Survival of an animal while it is hooked on the longline appears to be an important factor in determining whether it is eventually brought on board the vessel. Catch rates of species that survive being hooked (e.g. blue shark) increase with soak time. In contrast, skipjack tuna and seabirds are usually dead at the time of retrieval. Their catch rates decline with time, perhaps because scavengers can easily remove hooked animals that are dead. The results of our study have important implications for fishery management and assessments that rely on longline catch data. A reduction in soak time since longlining commenced in the 1950s has introduced a systematic bias into estimates of mortality levels and abundance. The abundance of species like seabirds has been overestimated in recent years.
Simple modifications to procedures for data collection, such as recording the number of hooks retrieved without baits, would greatly improve mortality estimates.
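The opposing soak-time effects described above can be illustrated with a toy hook-occupancy simulation. All rates below are purely hypothetical (not taken from the study): hooks catch animals while baited, bait is also lost without a catch, and species that die on the hook can then be removed by scavengers, so their retrieved catch declines with soak time while survivors accumulate.

```python
def retrieved_catch(soak_hours, catch_rate=0.02, bait_loss=0.3, scavenge=0.0):
    """Expected animals per hook still on the line at retrieval.

    All rates are per hour and purely illustrative:
      catch_rate -- chance a baited hook hooks an animal
      bait_loss  -- chance a baited hook loses its bait without a catch
      scavenge   -- chance a hooked, dead animal is removed by scavengers
                    (zero for species that survive on the hook)
    """
    baited = 1.0   # fraction of hooks still baited and unoccupied
    hooked = 0.0   # expected animals still attached at retrieval
    for _ in range(soak_hours):
        newly_hooked = baited * catch_rate
        baited -= baited * (catch_rate + bait_loss)
        hooked = hooked * (1.0 - scavenge) + newly_hooked
    return hooked

# Species that survive on the hook (e.g. blue shark): longer soaks catch more.
surviving_5h, surviving_20h = retrieved_catch(5), retrieved_catch(20)

# Species dead on the hook (e.g. skipjack tuna, seabirds): scavenging
# outweighs the dwindling rate of new hookings, so longer soaks retrieve fewer.
dead_5h, dead_20h = retrieved_catch(5, scavenge=0.15), retrieved_catch(20, scavenge=0.15)
```

Under these toy rates the surviving species' retrieved catch rises with soak time while the dead-on-hook species' catch falls, reproducing the qualitative pattern reported for blue shark versus skipjack tuna and seabirds.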
Abstract:
Bycatch, or the incidental catch of nontarget organisms during fishing operations, is a major issue in U.S. shrimp trawl fisheries. Because bycatch is typically discarded at sea, total bycatch is usually estimated by extrapolating from an observed bycatch sample to the entire fleet with either mean-per-unit or ratio estimators. Using both field observations of commercial shrimp trawlers and computer simulations, I compared five methods for generating bycatch estimates that have been used in past studies (a mean-per-unit estimator and four forms of the ratio estimator): 1) the mean fish catch per unit of effort, where unit effort was a proxy for sample size; 2) the mean of the individual fish-to-shrimp ratios; 3) the ratio of mean fish catch to mean shrimp catch; 4) the mean of the ratios of fish catch per time fished (a variable measure of effort); and 5) the ratio of mean fish catch to mean time fished. For field data, the different methods used to estimate bycatch of Atlantic croaker, spot, and weakfish yielded extremely different results, with no discernible pattern in the estimates by method, geographic region, or species. Simulated fishing fleets were used to compare bycatch estimated by the five methods with “actual” (simulated) bycatch. Simulations were conducted by using both normal and delta lognormal distributions of fish and shrimp and employed a range of values for several parameters, including mean catches of fish and shrimp, variability in the catches of fish and shrimp, variability in fishing effort, number of observations, and correlations between fish and shrimp catches. Results indicated that only the mean-per-unit estimators provided statistically unbiased estimates, whereas all other methods overestimated bycatch. The mean of the individual fish-to-shrimp ratios, the method used in the South Atlantic Bight before the 1990s, gave the most biased estimates.
Because of the statistically significant two- and three-way interactions among parameters, it is unlikely that estimates generated by one method can be converted or corrected to estimates made by another method; therefore, bycatch estimates obtained with different methods should not be compared directly.
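The five estimators enumerated above can be written out directly. A minimal sketch, with illustrative function and key names, of the per-tow statistics that each method computes from an observer sample:

```python
def bycatch_estimators(fish, shrimp, effort):
    """The five per-tow statistics compared above (names are illustrative).

    fish   -- observed fish (bycatch) catch for each observed tow
    shrimp -- observed shrimp catch for each tow
    effort -- time fished for each tow
    """
    def mean(xs):
        return sum(xs) / len(xs)

    return {
        # 1) mean fish catch per unit of effort (the tow as the unit)
        "mean_per_unit": mean(fish),
        # 2) mean of the individual fish-to-shrimp ratios
        "mean_of_ratios_shrimp": mean([f / s for f, s in zip(fish, shrimp)]),
        # 3) ratio of mean fish catch to mean shrimp catch
        "ratio_of_means_shrimp": mean(fish) / mean(shrimp),
        # 4) mean of the ratios of fish catch per time fished
        "mean_of_ratios_time": mean([f / t for f, t in zip(fish, effort)]),
        # 5) ratio of mean fish catch to mean time fished
        "ratio_of_means_time": mean(fish) / mean(effort),
    }
```

To expand to the fleet, each statistic would be multiplied by the corresponding fleet total (number of tows, total shrimp landings, or total time fished). Note that methods 2 and 3 (and likewise 4 and 5) disagree whenever catches vary between tows, which is one source of the divergent field estimates.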
Abstract:
In this paper a recently published finite element method, which combines domain decomposition with a novel technique for solving nonlinear magnetostatic finite element problems, is described. It is then shown how the method can be extended to, and optimised for, the solution of time-domain problems. © 1999 IEEE.
Abstract:
A receding horizon steering controller is presented, capable of pushing an oversteering nonlinear vehicle model to its handling limit while travelling at constant forward speed. The controller is able to optimise the vehicle path, using a computationally efficient and robust technique, so that the vehicle progression along a track is maximised as a function of time. The resultant method forms part of the solution to the motor racing objective of minimising lap time. © 2011 AACC American Automatic Control Council.
Abstract:
Biological diversity of an ecosystem is considered a reliable measure of the state of health of the ecosystem. In Uganda's large lakes, the Victoria and Kyoga, the past three decades have been characterized by profound changes in fish species composition following the introduction of the piscivorous Nile perch (Ogutu-Ohwayo 1990). Over 300 haplochromine cichlid species comprising a wide range of trophic groups were lost, along with a host of non-cichlid fishes which occupied virtually all available ecological niches in the lakes (Witte 1992). A second major ecological event has been the gradual nutrient enrichment of the water bodies (eutrophication) from diffuse and point sources, while at the same time pollutants have also gained entrance into the water systems in pace with industrial development and human population increases in the lake basins. Eutrophication and pollution have drastically altered the physical and chemical character of the water medium in which different fauna and flora thrive. In Lake Victoria these alterations have resulted in changes of algal species composition from a pristine community dominated by chlorophytes and diatoms (Melosira etc.) to one composed largely of blue-green algae or Cyanobacteria (Microcystis, Anabaena, Planktolyngbya etc.) (Mugidde 1993, Hecky 1993).
Abstract:
This paper presents an efficient algorithm for robust network reconstruction of Linear Time-Invariant (LTI) systems in the presence of noise, estimation errors and unmodelled nonlinearities. The method here builds on previous work [1] on robust reconstruction to provide a practical implementation with polynomial computational complexity. Following the same experimental protocol, the algorithm obtains a set of structurally-related candidate solutions spanning every level of sparsity. We prove the existence of a magnitude bound on the noise, which if satisfied, guarantees that one of these structures is the correct solution. A problem-specific model-selection procedure then selects a single solution from this set and provides a measure of confidence in that solution. Extensive simulations quantify the expected performance for different levels of noise and show that significantly more noise can be tolerated in comparison to the original method. © 2012 IEEE.
Abstract:
We present a method for characterizing the propagation of the magnetic flux in an artificially drilled bulk high-temperature superconductor (HTS) during a pulsed-field magnetization. As the magnetic pulse penetrates the cylindrical sample, the magnetic flux density is measured simultaneously in 16 holes by means of microcoils that are placed across the median plane, i.e. at an equal distance from the top and bottom surfaces, and close to the surface of the sample. We discuss the time evolution of the magnetic flux density in the holes during a pulse and measure the time taken by the external magnetic flux to reach each hole. Our data show that the flux front moves faster in the median plane than on the surface when penetrating the sample edge; it then proceeds faster along the surface than in the bulk as it penetrates the sample further. Once the pulse is over, the trapped flux density inside the central hole is found to be about twice as large in the median plane as on the surface. This ratio is confirmed by modelling.
Abstract:
We examine theoretically the transient displacement flow and density stratification that develops within a ventilated box after two localized floor-level heat sources of unequal strengths are activated. The heat input is represented by two non-interacting turbulent axisymmetric plumes of constant buoyancy fluxes B1 and B2 > B1. The box connects to an unbounded quiescent external environment of uniform density via openings at the top and base. A theoretical model is developed to predict the time evolution of the dimensionless depths λj and mean buoyancies δj of the 'intermediate' (j = 1) and 'top' (j = 2) layers leading to steady state. The flow behaviour is classified in terms of a stratification parameter S, a dimensionless measure of the relative forcing strengths of the two buoyant layers that drive the flow. We find that dδ1/dτ ∝ 1/λ1 and dδ2/dτ ∝ 1/λ2, where τ is a dimensionless time. When S 1, the intermediate layer is shallow (small λ1), whereas the top layer is relatively deep (large λ2) and, in this limit, δ1 and δ2 evolve on two characteristically different time scales. This produces a time lag and gives rise to a 'thermal overshoot', during which δ1 exceeds its steady value and attains a maximum during the transients; a flow feature we refer to, in the context of a ventilated room, as 'localized overheating'. For a given source strength ratio ψ = B1/B2, we show that thermal overshoots are realized for dimensionless opening areas A < Aoh and are strongly dependent on the time history of the flow. We establish the region of {A, ψ} space where rapid development of δ1 results in δ1 > δ2, giving rise to a bulk overturning of the buoyant layers. Finally, some implications of these results, specifically to the ventilation of a room, are discussed. © Cambridge University Press 2013.
Abstract:
The human cervix is an important mechanical barrier in pregnancy which must withstand the compressive and tensile forces generated from the growing fetus. Premature cervical shortening resulting from premature cervical remodeling and alterations of cervical material properties are known to increase a woman's risk of preterm birth (PTB). To understand the mechanical role of the cervix during pregnancy and to potentially develop indentation techniques for in vivo diagnostics to identify women who are at risk for premature cervical remodeling and thus preterm birth, we developed a spherical indentation technique to measure the time-dependent material properties of human cervical tissue taken from patients undergoing hysterectomy. In this study we present an inverse finite element analysis (IFEA) that optimizes material parameters of a viscoelastic material model to fit the stress-relaxation response of excised tissue slices to spherical indentation. Here we detail our IFEA methodology, report compressive viscoelastic material parameters for cervical tissue slices from nonpregnant (NP) and pregnant (PG) hysterectomy patients, and report slice-by-slice data for whole cervical tissue specimens. The material parameters reported here for human cervical tissue can be used to model the compressive time-dependent behavior of the tissue within a small strain regime of 25%.
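The IFEA described above couples a full finite element model of the indentation to an optimizer; as a much-simplified illustration of the parameter-fitting idea only, a one-term exponential (Prony-series-style) relaxation function can be fitted to a force-relaxation curve by direct search. Everything below (function names, parameter values, the synthetic data) is hypothetical, not the authors' model or data.

```python
import math

def relax_force(t, f_inf, f1, tau):
    """One-term exponential relaxation: F(t) = f_inf + f1 * exp(-t / tau)."""
    return f_inf + f1 * math.exp(-t / tau)

def fit_relaxation(times, forces, taus):
    """Grid-search tau; solve f_inf and f1 by linear least squares per tau."""
    best = None
    for tau in taus:
        # For fixed tau the model is linear in (f_inf, f1): F = f_inf + f1*e.
        e = [math.exp(-t / tau) for t in times]
        n = len(times)
        se, see = sum(e), sum(x * x for x in e)
        sf, sef = sum(forces), sum(x * y for x, y in zip(e, forces))
        f1 = (n * sef - se * sf) / (n * see - se * se)
        f_inf = (sf - f1 * se) / n
        err = sum((relax_force(t, f_inf, f1, tau) - f) ** 2
                  for t, f in zip(times, forces))
        if best is None or err < best[0]:
            best = (err, f_inf, f1, tau)
    return best[1:]

# Synthetic "indentation hold" data generated from known parameters.
times = [0.1 * i for i in range(1, 50)]
truth = (0.5, 1.0, 2.0)
forces = [relax_force(t, *truth) for t in times]
f_inf, f1, tau = fit_relaxation(times, forces, taus=[0.5 * k for k in range(1, 11)])
```

In the actual inverse analysis the "model evaluation" step is a finite element simulation of the spherical indentation rather than a closed-form curve, but the optimization loop has the same structure: propose parameters, simulate the relaxation response, and minimize the misfit to the measured force history.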
Abstract:
The process of multielectron transfer from a Na-4 cluster induced by highly charged C6+, C4+, C2+ and C+ ions is studied using the method of time-dependent density functional theory within the local density approximation, combined with the use of pseudopotentials. The evolution of the dipole moment and of the emitted electrons in Na-4 is obtained, and the time-dependent probabilities for the various charges are deduced. It is shown that the Na-4 cluster is strongly ionized by C6+ and that the number of emitted electrons per atom of Na-4 is larger than that of Na-2 under the same conditions. The detailed behaviour of the electrons emitted from Na-4 differs from that of Na-2, which is possibly related to the difference in structure between the two clusters.
Abstract:
To study the brittle-ductile transition (BDT) of polypropylene (PP)/ethylene-propylene-diene monomer (EPDM) blends induced by size, temperature, and time, the toughness of the PP/EPDM blends was investigated over wide ranges of EPDM content, temperature, and strain rate. The toughness of the blends was determined from the tensile fracture energy of side-edge notched samples. The concept of interparticle distance (ID) was introduced into this study to probe the size effect on the BDT of PP/EPDM blends, whereas the effect of time corresponded to that of strain rate. The BDT induced by size, temperature, and time was observed in plots of the fracture energy versus ID, temperature, and strain rate. The critical BDT temperatures for various EPDM contents at different initial strain rates were obtained from these transitions. The critical interparticle distance (IDc) increased nonlinearly with increasing temperature, and when the initial strain rate was lower, the IDc was larger. Moreover, the variation of the reciprocal of the initial strain rate with the reciprocal of temperature followed different straight lines for the various EPDM contents, and these straight lines had the same slope.
Abstract:
The influences of seven organic modifiers, including urea, methanol (MeOH), dioxane (DIO), tetrahydrofuran (THF), acetonitrile (ACN), 1-propanol (1-PrOH) and 2-propanol (2-PrOH), on solute retention and electrokinetic migration in micellar electrokinetic capillary chromatography (MEKC) are investigated with sodium dodecyl sulfate (SDS) micelles as the pseudostationary phase. It is observed that, within the limited concentration ranges used in the MEKC systems, the effect of organic modifier concentration C on retention can be described by the equation log k' = log k'w − SC for most binary aqueous-organic buffers, but deviations from this retention equation are observed with ACN and particularly THF as organic modifiers. With the parameter S as a measure of the elutropic strength, the elutropic strength of the organic modifiers is found to follow a general order urea
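Because log k' is linear in the modifier concentration C, the elutropic-strength parameter S is simply the (negated) slope of that line, which can be estimated from retention factors measured at two modifier concentrations. A minimal sketch; the function name and the numerical values are hypothetical, not from the study:

```python
import math

def elutropic_strength(c1, k1, c2, k2):
    """Slope S of log k' = log k'w - S*C from two (C, k') measurements.

    C is the organic-modifier concentration and k' the measured retention
    factor; a larger S means the modifier reduces retention more strongly.
    """
    return (math.log10(k1) - math.log10(k2)) / (c2 - c1)

# Hypothetical retention factors at 5% and 15% modifier:
S = elutropic_strength(5.0, 8.0, 15.0, 2.0)
```

With more than two concentrations, S would instead be obtained by linear regression of log k' on C, which also reveals the deviations from linearity reported for ACN and THF.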
Abstract:
A fundamental problem in artificial intelligence is obtaining coherent behavior in rule-based problem solving systems. A good quantitative measure of coherence is time behavior; a system that never, in retrospect, applied a rule needlessly is certainly coherent; a system suffering from combinatorial blowup is certainly behaving incoherently. This report describes a rule-based problem solving system for automatically writing and improving numerical computer programs from specifications. The specifications are in terms of "constraints" among inputs and outputs. The system has solved program synthesis problems involving systems of equations, determining that methods of successive approximation converge, transforming recursion to iteration, and manipulating power series (using differing organizations, control structures, and argument-passing techniques).
Abstract:
The vehicle navigation problem studied in Bell (2009) is revisited and a time-dependent reverse Hyperstar algorithm is presented. This minimises the expected time of arrival at the destination, and all intermediate nodes, where expectation is based on a pessimistic (or risk-averse) view of unknown link delays. This may also be regarded as a hyperpath version of the Chabini and Lan (2002) algorithm, which itself is a time-dependent A* algorithm. Links are assigned undelayed travel times and maximum delays, both of which are potentially functions of the time of arrival at the respective link. The driver seeks probabilities for link use that minimise his/her maximum exposure to delay on the approach to each node, leading to the determination of the pessimistic expected time of arrival. Since the context considered is vehicle navigation where the driver is not making repeated trips, the probability of link use may be interpreted as a measure of link attractiveness, so a link with a zero probability of use is unattractive while a link with a probability of use equal to one will have no attractive alternatives. A solution algorithm is presented and proven to solve the problem provided the node potentials are feasible and a FIFO condition applies for undelayed link travel times. The paper concludes with a numerical example.
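The full reverse Hyperstar algorithm maintains hyperpaths, link-use probabilities, and time-dependent labels; those details are beyond a short sketch. As a deliberately simplified, static illustration of the risk-averse idea only (worst-case exposure to delay, no hyperpath splitting and no time dependence), one can label nodes backwards from the destination by Dijkstra's algorithm on the worst-case link times t + d. All names and data below are hypothetical:

```python
import heapq

def pessimistic_times(links, dest):
    """Worst-case arrival labels by backward Dijkstra.

    links: {(i, j): (undelayed_time, max_delay)} for each directed link.
    Returns, for every reachable node, the minimum over paths of the sum
    of t + d along the path to dest -- a conservative single-path bound,
    not the hyperpath expectation computed by reverse Hyperstar.
    """
    # Reverse adjacency: for each head node j, its incoming links.
    into = {}
    for (i, j), (t, d) in links.items():
        into.setdefault(j, []).append((i, t + d))
    label = {dest: 0.0}
    heap = [(0.0, dest)]
    while heap:
        u_j, j = heapq.heappop(heap)
        if u_j > label.get(j, float("inf")):
            continue  # stale heap entry
        for i, worst in into.get(j, []):
            cand = u_j + worst
            if cand < label.get(i, float("inf")):
                label[i] = cand
                heapq.heappush(heap, (cand, i))
    return label

# Tiny example: the route via "b" is faster undelayed but exposed to a
# large maximum delay, so the pessimistic labelling prefers the route via "c".
links = {("a", "b"): (1.0, 0.0), ("b", "d"): (1.0, 5.0),
         ("a", "c"): (2.0, 0.5), ("c", "d"): (2.0, 0.5)}
labels = pessimistic_times(links, "d")
```

In the algorithm of the paper, the delay exposure is instead spread over a set of attractive links at each node, yielding the link-use probabilities described above, and both t and d become functions of the arrival time at the link.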