930 results for "data availability"
Abstract:
The primary objective of this project, “the Assessment of Existing Information on Atlantic Coastal Fish Habitat”, is to inform conservation planning for the Atlantic Coastal Fish Habitat Partnership (ACFHP). ACFHP is recognized as a Partnership by the National Fish Habitat Action Plan (NFHAP), whose overall mission is to protect, restore, and enhance the nation’s fish and aquatic communities through partnerships that foster fish habitat conservation. This project is a cooperative effort of the NOAA/NOS Center for Coastal Monitoring and Assessment (CCMA) Biogeography Branch and ACFHP. The Assessment includes three components: 1. a representative bibliographic and assessment database; 2. a Geographic Information System (GIS) spatial framework; and 3. a summary document with a description of methods, analyses of habitat assessment information, and recommendations for further work. The spatial bibliography was created by linking the bibliographic table, developed in Microsoft Excel and exported to SQL Server, with the spatial framework, developed in ArcGIS and exported to Google Maps. The bibliography is a comprehensive, searchable database of over 500 selected documents and data sources on Atlantic coastal fish species and habitats. Key information captured for each entry includes basic bibliographic data, spatial footprint (e.g., waterbody or watershed), species and habitats covered, and electronic availability. Information on habitat condition indicators, threats, and conservation recommendations is extracted from each entry and recorded in a separate linked table. The spatial framework is a functional digital map based on polygon layers of watersheds and estuarine and marine waterbodies derived from NOAA’s Coastal Assessment Framework, the MMS/NOAA Multipurpose Marine Cadastre, and other sources, providing a spatial reference for all of the documents cited in the bibliography.
Together, the bibliography and assessment tables and their spatial framework provide a powerful tool to query and assess available information through a publicly available web interface. They were designed to support the development of priorities for ACFHP’s conservation efforts within a geographic area extending from Maine to Florida, and from coastal watersheds seaward to the edge of the continental shelf. The Atlantic Coastal Fish Habitat Partnership has made initial use of the Assessment of Existing Information (AEI). Though it has not yet applied the AEI in a systematic or structured manner, it expects to find further uses as the draft conservation strategic plan is refined and as regional action plans are developed. The AEI also provides a means to move beyond an “assessment of existing information” towards an “assessment of fish habitat”, and is being applied towards the National Fish Habitat Action Plan (NFHAP) 2010 Assessment. Beyond the scope of the current project, there may be applications to broader initiatives such as Integrated Ecosystem Assessments (IEAs), Ecosystem-Based Management (EBM), and Marine Spatial Planning (MSP).
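The linked-table design described in this abstract lends itself to a simple relational query. As a minimal sketch (the table and column names below are invented for illustration and are not the AEI's actual schema), a bibliographic table keyed to a waterbody table can answer "what has been documented for this waterbody?":

```python
import sqlite3

# Miniature of the design: a bibliographic table linked to a spatial-framework
# table, queried by waterbody. Schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE waterbody (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE document (
    id INTEGER PRIMARY KEY, title TEXT, species TEXT, waterbody_id INTEGER,
    FOREIGN KEY (waterbody_id) REFERENCES waterbody (id))""")
cur.execute("INSERT INTO waterbody VALUES (1, 'Chesapeake Bay'), (2, 'Long Island Sound')")
cur.executemany("INSERT INTO document VALUES (?, ?, ?, ?)", [
    (1, "Oyster reef habitat survey", "Crassostrea virginica", 1),
    (2, "Striped bass spawning study", "Morone saxatilis", 1),
    (3, "Eelgrass decline report", "Zostera marina", 2),
])
# Spatial query: all documents whose footprint is Chesapeake Bay
cur.execute("""SELECT d.title FROM document d
               JOIN waterbody w ON d.waterbody_id = w.id
               WHERE w.name = ? ORDER BY d.id""", ("Chesapeake Bay",))
titles = [row[0] for row in cur.fetchall()]
```

The same join, run against a polygon layer instead of a name column, is what the GIS front end performs when a user clicks a watershed.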
Abstract:
Catch and effort data and some biological characteristics of the deep-water spiny lobster P. delagoae, collected between August 1980 and December 1981, are presented. The work was conducted on board a commercial vessel off the coast of Mozambique.
Abstract:
Clare, A. and King, R.D. (2002) Machine learning of functional class from phenotype data. Bioinformatics 18(1), 160-166.
Abstract:
Many Web applications walk the thin line between the need for dynamic data and the need to meet user performance expectations. In environments where funds are not available to constantly upgrade hardware in line with user demand, alternative approaches need to be considered. This paper introduces a ‘data farming’ model whereby dynamic data, which is ‘grown’ in operational applications, is ‘harvested’ and ‘packaged’ for various consumer markets. Like any well-managed agricultural operation, crops are harvested according to historical and perceived demand as inferred by a self-optimising process. This approach aims to make enhanced use of available resources through better utilisation of system downtime, thereby improving application performance and increasing the availability of key business data.
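The harvest-on-demand idea can be sketched in a few lines. This is our own illustrative reading of the model, not the authors' implementation: a 'grower' produces expensive dynamic data, consumers are served the last harvested snapshot, and the harvest interval adapts to observed demand:

```python
import time

class DataFarm:
    """Toy sketch of a 'data farming' cache (names are ours, not the paper's):
    dynamic data is periodically harvested into a static package, and the
    harvest interval is tuned by a simple self-optimising rule."""
    def __init__(self, grower, interval=60.0):
        self.grower = grower            # callable producing fresh (expensive) data
        self.interval = interval        # seconds between harvests
        self.package = grower()         # last harvested snapshot
        self.harvested_at = time.monotonic()
        self.hits = 0

    def consume(self):
        self.hits += 1
        now = time.monotonic()
        elapsed = now - self.harvested_at
        if elapsed >= self.interval:
            # Self-optimising step: shorten the interval when demand was high,
            # lengthen it when the crop went stale largely unconsumed.
            rate = self.hits / max(elapsed, 1e-9)
            self.interval = max(5.0, min(600.0, self.interval / (1.0 + rate)))
            self.package = self.grower()
            self.harvested_at = now
            self.hits = 0
        return self.package

# Usage: an expensive 'grower' is only re-run when a harvest is due.
counter = {"runs": 0}
def grow():
    counter["runs"] += 1
    return "package #%d" % counter["runs"]

farm = DataFarm(grow, interval=1e9)     # effectively never re-harvest
snapshot = farm.consume()               # served from the existing crop
```

In a real deployment the harvest would run during system downtime, which is where the paper's resource-utilisation gain comes from.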
Abstract:
This paper examines the influence of exit availability on evacuation time for a narrow-body aircraft under certification trial conditions using computer simulation. A narrow-body aircraft which has previously passed the certification trial is used as the test configuration. While maintaining the certification requirement of 50% of the available exits, six different exit configurations are examined. These include the standard certification configuration (one exit from each exit pair) and five other exit configurations based on commonly occurring exit combinations found in accidents. These configurations are based on data derived from the AASK database, and the evacuation simulations are performed using the airEXODUS evacuation simulation software. The results show that the certification practice of using half the available exits, predominantly down one side of the aircraft, is neither statistically relevant nor challenging. For the aircraft cabin layout examined, the exit configuration used in the certification trial produces the shortest egress times. Furthermore, three of the six exit combinations investigated result in predicted egress times in excess of 90 seconds, suggesting that the aircraft would not satisfy the certification requirement under these conditions.
Abstract:
The Continuous Plankton Recorder has been deployed on a seasonal basis in the north Pacific since 2000, accumulating a database of abundance measurements for over 290 planktonic taxa in over 3,500 processed samples. There is an additional archive of over 10,000 samples available for further analyses. Exxon Valdez Oil Spill Trustee Council financial support has contributed to about half of this tally, through four projects funded since 2002. Time series of zooplankton variables for sub-regions of the survey area are presented together with abstracts of eight papers published using data from these projects. The time series cover a period when the dominant climate signal in the north Pacific, the Pacific Decadal Oscillation (PDO), switched with unusual frequency between warm/positive states (pre-1999 and 2003-2006) and cool/negative states (1999-2002 and 2007). The CPR data suggest that cool (PDO negative) years show higher biomass on the shelf and lower biomass in the open ocean, while the reverse is true in warm (PDO positive) years, with lower shelf biomass (except 2005) and higher oceanic biomass. In addition, there was a delay in the plankton increase on the Alaskan shelf in the colder spring of 2007, compared to the warmer springs of the preceding years. In warm years, smaller species of copepods which lack lipid reserves are also more common. Availability of zooplankton prey to higher trophic levels (including those that society values highly) is therefore dependent on the timing of increase and peak abundance, ease of capture and nutritional value.
Previously published studies using these data highlight the wide-ranging applicability of CPR data and include collaborative studies on: phenology in the key copepod species Neocalanus plumchrus; descriptions of distributions of decapod larvae and euphausiid species; the effects of hydrographic features such as mesoscale eddies and the North Pacific Current on plankton populations; and a molecular-based investigation of macro-scale population structure in N. cristatus. The future funding situation is uncertain, but the value of the data and studies accumulated so far is considerable and sets a strong foundation for further studies on plankton dynamics and interactions with higher trophic levels in the northern Gulf of Alaska.
Abstract:
It has been hypothesized that changes in zooplankton community structure over the past four decades led to reduced growth and survival of prerecruit Atlantic cod (Gadus morhua) and that this was a key factor underlying poor year classes, contributing to stock collapse, and inhibiting the recovery of stocks around the UK. To evaluate whether observed changes in plankton abundance, species composition and temperature could have led to periods of poorer growth of cod larvae, we explored the effect of prey availability and temperature on early larval growth using an empirical trophodynamic model. Prey availability was parameterized using species abundance data from the Continuous Plankton Recorder. Our model suggests that the observed changes in plankton community structure in the North Sea may have had less impact on cod larval growth, at least for the first 40 days following hatching, than previously suggested. At least in the short term, environmental and prey conditions should be able to sustain growth of cod larvae and environmental changes acting on this early life stage should not limit stock recovery.
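A deliberately minimal sketch of what such an empirical trophodynamic growth model might look like (the functional forms and every parameter value here are invented for illustration; they are not the authors' model): the daily growth increment scales with a saturating prey-availability term and a Q10 temperature factor, iterated over the first 40 days after hatching:

```python
def daily_growth(weight_mg, prey_density, temp_c,
                 half_sat=50.0, g_max_coef=0.03, q10=2.0, t_ref=10.0):
    """Illustrative growth increment (mg/day): a Holling type-II prey term
    scaled by a Q10 temperature factor. All parameters are made up."""
    prey_factor = prey_density / (half_sat + prey_density)   # saturates in [0, 1)
    temp_factor = q10 ** ((temp_c - t_ref) / 10.0)           # Q10 scaling
    return g_max_coef * weight_mg * prey_factor * temp_factor

def grow_larva(w0_mg, prey, temps, days=40):
    """Integrate daily growth over the first `days` days post-hatch."""
    w = w0_mg
    for d in range(days):
        w += daily_growth(w, prey[d], temps[d])
    return w

# 40 days of constant conditions: moderate prey density, 8 degrees C
final_mg = grow_larva(0.05, [100.0] * 40, [8.0] * 40)
```

Driving `prey` with CPR species-abundance series, as the study does, would then let one compare modelled growth across decades of changing plankton community structure.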
Abstract:
The Continuous Plankton Recorder (CPR) dataset on fish larvae has an extensive spatio-temporal coverage that allows the responses of fish populations to past changes in climate variability, including abrupt changes such as regime shifts, to be investigated. The newly available dataset offers a unique opportunity to investigate long-term changes over decadal scales in the abundance and distribution of fish larvae in relation to physical and biological factors. A principal component analysis (PCA) using 7 biotic and abiotic parameters is applied to investigate the impact of environmental changes in the North Sea on 5 selected taxa of fish larvae during the period 1960 to 2004. The analysis revealed 4 periods of time (1960–1976; 1977–1982; 1983–1996; 1997–2004) reflecting 3 different ecosystem states. The larvae of clupeids, sandeels, dab and gadoids seemed to be affected mainly by changes in the plankton ecosystem, while the larvae of migratory species such as Atlantic mackerel responded more to hydrographic changes. Climate variability seems more likely to influence fish populations through bottom-up control, via a cascading effect from changes in the North Atlantic Oscillation (NAO) impacting on the hydrodynamic features of the North Sea, in turn impacting on the plankton available as prey for fish larvae. The responses and adaptability of fish larvae to changing environmental conditions, particularly to changes in prey availability, are complex and species-specific. This complexity is enhanced by fishing effects interacting with climate effects, and this study supports furthering our understanding of such interactions before attempting to predict how fish populations respond to climate variability.
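The PCA step can be illustrated on standardised environmental time series. The sketch below uses two invented variables rather than the paper's seven, because a 2x2 covariance matrix has a closed-form leading eigenvector; the scores along that eigenvector are the first principal component, whose shifts over time are what delineate ecosystem states:

```python
import math

def standardise(xs):
    """Zero-mean, unit-variance scaling (population standard deviation)."""
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [(x - m) / s for x in xs]

def first_pc_2d(a, b):
    """First principal component of two standardised series, in closed form."""
    a, b = standardise(a), standardise(b)
    n = len(a)
    caa = sum(x * x for x in a) / n          # = 1 after standardising
    cbb = sum(y * y for y in b) / n          # = 1 after standardising
    cab = sum(x * y for x, y in zip(a, b)) / n
    # Leading eigenvalue/eigenvector of [[caa, cab], [cab, cbb]]
    lam = (caa + cbb) / 2 + math.sqrt(((caa - cbb) / 2) ** 2 + cab ** 2)
    v = (cab, lam - caa) if abs(cab) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(v[0], v[1])
    v = (v[0] / norm, v[1] / norm)
    scores = [x * v[0] + y * v[1] for x, y in zip(a, b)]
    return lam, v, scores

sst = [8.1, 8.3, 8.0, 9.2, 9.5, 9.8, 10.1]   # invented temperature series
zoo = [120.0, 115.0, 130.0, 90.0, 85.0, 70.0, 60.0]  # invented abundance series
lam, vec, scores = first_pc_2d(sst, zoo)
```

With 7 parameters the same eigen-decomposition is done numerically, but the interpretation is identical: the variance of the scores equals the leading eigenvalue.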
Abstract:
Recent experimental neutron diffraction data and ab initio molecular dynamics simulation of the ionic liquid dimethylimidazolium chloride ([dmim]Cl) have provided a structural description of the system at the molecular level. However, partial radial distribution functions calculated from the latter, when compared to previous classical simulation results, highlight some limitations in the structural description offered by force field-based simulations. With the availability of ab initio data it is possible to improve the classical description of [dmim]Cl by using the force matching approach, and the strategy for fitting complex force fields in their original functional form is discussed. A self-consistent optimization method for the generation of classical potentials of general functional form is presented and applied, and a force field that better reproduces the observed first-principles forces is obtained. When used in simulation, it predicts structural data which more faithfully reproduce those observed in the ab initio studies. Some possible refinements to the technique, its application, and the general suitability of common potential energy functions used within many ionic liquid force fields are discussed.
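The core of force matching is a least-squares fit of classical-model forces to reference forces. A deliberately minimal sketch (one linear parameter in made-up units; real force fields require nonlinear optimisation over many parameters and full atomic configurations):

```python
def fit_prefactor(distances, ref_forces):
    """Closed-form least squares for F_model(r) = k * g(r) with g(r) = 1/r^2:
    minimising sum_i (F_i - k*g_i)^2 gives k = sum(g_i*F_i) / sum(g_i^2)."""
    g = [1.0 / r ** 2 for r in distances]
    return sum(gi * fi for gi, fi in zip(g, ref_forces)) / sum(gi * gi for gi in g)

# Synthetic 'reference' (stand-in for ab initio) forces with k_true = 3.0
r_samples = [1.0, 1.5, 2.0, 2.5, 3.0]
f_ref = [3.0 / r ** 2 for r in r_samples]
k_fit = fit_prefactor(r_samples, f_ref)
```

The self-consistent method in the paper iterates an analogous fit over the full force-field functional form until the classical forces stop improving against the ab initio set.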
Testing the stability of the benefit transfer function for discrete choice contingent valuation data
Abstract:
This paper examines the stability of the benefit transfer function across 42 recreational forests in the British Isles. A working definition of reliable function transfer is put forward, and a suitable statistical test is provided. A novel split-sample method is used to test the sensitivity of the models' log-likelihood values to the removal of contingent valuation (CV) responses collected at individual forest sites. We find that a stable function improves our measure of transfer reliability, but not by much. We conclude that, in empirical studies on transferability, considerations of function stability are secondary to the availability and quality of site attribute data. Modellers can study the advantages of transfer function stability vis-a-vis the value of additional information on recreation site attributes. (c) 2008 Elsevier GmbH. All rights reserved.
Abstract:
Flutter prediction as currently practiced is usually deterministic, with a single structural model used to represent an aircraft. By using interval analysis to take into account structural variability, recent work has demonstrated that small changes in the structure can lead to very large changes in the altitude at which flutter occurs (Marques, Badcock, et al., J. Aircraft, 2010). In this follow-up work we examine the same phenomenon using probabilistic collocation (PC), an uncertainty quantification technique which can efficiently propagate multivariate stochastic input through a simulation code, in this case an eigenvalue-based fluid-structure stability code. The resulting analysis predicts the consequences of an uncertain structure on the incidence of flutter in probabilistic terms: information that could be useful in planning flight tests and assessing the risk of structural failure. The uncertainty in flutter altitude is confirmed to be substantial. Assuming that the structural uncertainty represents an epistemic uncertainty regarding the structure, it may be reduced with the availability of additional information, for example aeroelastic response data from a flight test. Such data are used to update the structural uncertainty using Bayes' theorem. The consequent flutter uncertainty is significantly reduced across the entire Mach number range.
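The Bayesian-updating step can be sketched on a grid (every number below is invented; the paper's forward model is an eigenvalue-based fluid-structure code, not this toy): a flat prior over a structural parameter is narrowed by a noisy flight-test observation, and the posterior standard deviation shrinks:

```python
import math

def bayes_update(grid, prior, observed, model, noise_sd):
    """Grid-based Bayes' theorem: posterior proportional to prior * Gaussian
    likelihood of the observation given the model's prediction at each point."""
    like = [math.exp(-0.5 * ((observed - model(x)) / noise_sd) ** 2) for x in grid]
    post = [p * l for p, l in zip(prior, like)]
    z = sum(post)
    return [p / z for p in post]

def mean_sd(grid, weights):
    m = sum(x * w for x, w in zip(grid, weights))
    v = sum((x - m) ** 2 * w for x, w in zip(grid, weights))
    return m, math.sqrt(v)

grid = [0.8 + 0.001 * i for i in range(401)]   # stiffness factor, 0.8..1.2
prior = [1.0 / len(grid)] * len(grid)          # flat (epistemic) prior
model = lambda k: 10.0 * k                     # toy aeroelastic-response model
posterior = bayes_update(grid, prior, observed=10.3, model=model, noise_sd=0.5)
m0, s0 = mean_sd(grid, prior)
m1, s1 = mean_sd(grid, posterior)              # s1 < s0: uncertainty reduced
```

The same mechanism, with the stability code as `model` and probabilistic collocation handling the propagation, is what shrinks the flutter-altitude uncertainty across the Mach range.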
Abstract:
Recent years have witnessed rapidly increasing interest in the topic of incremental learning. Unlike conventional machine learning settings, the data targeted by incremental learning become available continuously over time. Accordingly, it is desirable to be able to abandon the traditional assumption that representative training data are available during the training period to develop decision boundaries. Under scenarios of continuous data flow, the challenge is how to transform the vast amount of raw stream data into information and knowledge representations, and to accumulate experience over time to support future decision-making. In this paper, we propose a general adaptive incremental learning framework named ADAIN that is capable of learning from continuous raw data, accumulating experience over time, and using such knowledge to improve future learning and prediction performance. Detailed system-level architecture and design strategies are presented in this paper. Simulation results over several real-world data sets are used to validate the effectiveness of this method.
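The incremental-learning setting can be made concrete with a minimal sketch (this is generic online logistic regression, not the ADAIN framework): the model is updated one example at a time as the stream arrives, with no representative training set held up front:

```python
import math
import random

class OnlineLogistic:
    """Streaming logistic-regression classifier: one SGD step per example."""
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        z = self.b + sum(wi * xi for wi, xi in zip(self.w, x))
        return 1.0 / (1.0 + math.exp(-z))

    def partial_fit(self, x, y):
        # Gradient of the log-loss for a single (x, y) pair
        err = self.predict_proba(x) - y
        self.w = [wi - self.lr * err * xi for wi, xi in zip(self.w, x)]
        self.b -= self.lr * err

# Simulated stream: the label is 1 when x0 + x1 > 1 (an invented concept)
random.seed(0)
clf = OnlineLogistic(2)
for _ in range(5000):
    x = [random.random(), random.random()]
    clf.partial_fit(x, 1.0 if x[0] + x[1] > 1.0 else 0.0)
```

A framework like ADAIN layers experience accumulation on top of this basic loop so that earlier knowledge informs later decision boundaries.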
Abstract:
EUROCHIP (European Cancer Health Indicators Project) focuses on understanding inequalities in cancer burden, care and survival through the indicators "stage at diagnosis," "cancer treatment delay" and "compliance with cancer guidelines" as the most important indicators. Our study aims to provide insight into whether cancer registries collect well-defined variables to determine these indicators in a comparable way. Eighty-six general European population-based cancer registries (PBCRs) from 32 countries responded to the questionnaire, which was developed by EUROCHIP in collaboration with the ENCR (European Network of Cancer Registries) and EUROCOURSE. Only 15% of all the PBCRs in the EU had all three indicators available. The indicator "stage at diagnosis" was gathered for at least one cancer site by 81% (using TNM in 39%). Variables for the indicator "cancer treatment delay" were collected by 37%. The availability of type of treatment (30%), surgery date (36%), starting date of radiotherapy (26%) and starting date of chemotherapy (23%) meant that only 15% of the PBCRs were able to compile the indicator "compliance with guidelines". Lack of access to data sources and of qualified staff were the major reasons for not collecting all the variables. In conclusion, based on self-reporting, only a few of the participating PBCRs had data available which could be used for clinical audits, evaluation of cancer care projects, survival analyses and monitoring of national cancer control strategies. Extra efforts should be made to improve this very efficient tool for comparing cancer burden and the effects of national cancer plans across Europe and for learning from each other. © 2012 UICC.
Abstract:
Objective: Several surveillance definitions of influenza-like illness (ILI) have been proposed, based on the presence of symptoms. Symptom data can be obtained from patients, medical records, or both. Past research has found that agreement between health record data and self-report varies depending on the specific symptom. We therefore aimed to explore the implications of using data on influenza symptoms extracted from medical records, similar data collected prospectively from outpatients, and the combined data from both sources as predictors of laboratory-confirmed influenza. Methods: Using data from the Hutterite Influenza Prevention Study, we calculated: 1) the sensitivity, specificity and predictive values of individual symptoms within surveillance definitions; 2) how frequently surveillance definitions correlated with laboratory-confirmed influenza; and 3) the predictive value of surveillance definitions. Results: Of the 176 participants with reports from both participants and medical records, 142 (81%) were tested for influenza and 37 (26%) were PCR positive for influenza. Fever (alone) and fever combined with cough and/or sore throat were highly correlated with being PCR positive for influenza for all data sources. ILI surveillance definitions based on symptom data from medical records only, or from both medical records and self-report, were better predictors of laboratory-confirmed influenza, with higher odds ratios and positive predictive values. Discussion: The choice of data source to determine ILI will depend on the patient population, the outcome of interest, the availability of the data source, and the use for clinical decision making, research, or surveillance. © Canadian Public Health Association, 2012.
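The screening statistics named in the Methods reduce to ratios over a 2x2 table of definition-positive/negative against PCR-positive/negative. A sketch with invented counts (not the study's data):

```python
def screening_stats(tp, fp, fn, tn):
    """Standard 2x2 screening metrics for a symptom-based case definition
    evaluated against a laboratory gold standard (here, PCR)."""
    sensitivity = tp / (tp + fn)   # definition-positive among true cases
    specificity = tn / (tn + fp)   # definition-negative among non-cases
    ppv = tp / (tp + fp)           # true cases among definition-positives
    npv = tn / (tn + fn)           # non-cases among definition-negatives
    return sensitivity, specificity, ppv, npv

# Invented 2x2 table: ILI definition result vs. PCR result
sens, spec, ppv, npv = screening_stats(tp=30, fp=20, fn=7, tn=85)
```

Note that PPV, unlike sensitivity and specificity, depends on influenza prevalence in the tested population, which is one reason the choice of data source and patient population matters for surveillance use.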