973 results for cost prediction
Abstract:
As defined, the modeling procedure is quite broad. For example, the chosen compartments may contain a single organism, a population of organisms, or an ensemble of populations. A population compartment, in turn, could be homogeneous or possess structure in size or age. Likewise, the mathematical statements may be deterministic or probabilistic in nature, linear or nonlinear, autonomous or able to possess memory. Examples of all types appear in the literature. In practice, however, ecosystem modelers have focused on particular types of model constructions. Most analyses treat compartments which are nonsegregated (populations or trophic levels) and homogeneous. The accompanying mathematics is, for the most part, deterministic and autonomous. Despite the enormous effort that has gone into such ecosystem modeling, there remains a paucity of models that meet the rigorous validation criteria which might be applied to a model of a mechanical system. Most ecosystem models are short on predictive ability. Even classical examples, such as the Lotka-Volterra predator-prey scheme, have not spawned validated models.
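The Lotka-Volterra predator-prey scheme named above is the standard example of a deterministic, autonomous, nonlinear compartment model. A minimal sketch (parameter values are illustrative, not drawn from the abstract):

```python
# Lotka-Volterra predator-prey system, integrated with explicit Euler steps.
#   dx/dt = alpha*x - beta*x*y   (prey)
#   dy/dt = delta*x*y - gamma*y  (predator)
# All parameter values below are illustrative assumptions.

def lotka_volterra_step(x, y, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5, dt=0.001):
    """Advance one Euler step of the deterministic, autonomous LV system."""
    dx = (alpha * x - beta * x * y) * dt
    dy = (delta * x * y - gamma * y) * dt
    return x + dx, y + dy

def simulate(x0=10.0, y0=5.0, steps=100_000):
    """Integrate from (x0, y0); returns the final prey and predator levels."""
    x, y = x0, y0
    for _ in range(steps):
        x, y = lotka_volterra_step(x, y)
    return x, y
```

The sketch illustrates the point made in the abstract: the model is fully deterministic and easy to simulate, yet that says nothing about whether its trajectories match any measured predator-prey data.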
Abstract:
Data have been collected on fisheries catch and effort trends since the latter half of the 1800s. With current trends in declining stocks and stricter management regimes, data need to be collected and analyzed over shorter periods and at finer spatial resolution than in the past. New methods of electronic reporting may reduce the lag time in data collection and provide more accurate spatial resolution. In this study I evaluated the differences between fish dealer and vessel reporting systems for federal fisheries in the US New England and Mid-Atlantic areas. Using data on landing date, report date, gear used, port landed, number of hauls, number of fish sampled and species quotas from available catch and effort records, I compared dealer and vessel electronically collected data against paper-collected dealer and vessel data to determine if electronically collected data are timelier and more accurate. To determine if vessel or dealer electronic reporting is more useful for management, I determined differences in timeliness and accuracy between vessel and dealer electronic reports. I also compared the cost and efficiency of these new methods with less technology-intensive reporting methods, using available cost data and surveys of seafood dealers for cost information. Using this information I identified potentially unnecessary duplication of effort and identified applications in ecosystem-based fisheries management. This information can be used to guide the decisions of fisheries managers in the United States and other countries that are attempting to identify appropriate fisheries reporting methods for the management regimes under consideration. (PDF contains 370 pages)
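The timeliness comparison described above reduces to measuring the lag between landing date and report date per reporting system. A minimal sketch of that comparison; the record layout, field names, and dates are hypothetical, not taken from the study's data:

```python
from datetime import date

# Hypothetical records; field names and dates are illustrative assumptions.
reports = [
    {"system": "dealer_electronic", "landing": date(2012, 6, 1), "report": date(2012, 6, 2)},
    {"system": "dealer_paper",      "landing": date(2012, 6, 1), "report": date(2012, 6, 9)},
    {"system": "vessel_electronic", "landing": date(2012, 6, 1), "report": date(2012, 6, 3)},
]

def mean_lag_days(records, system):
    """Mean lag in days between landing and report for one reporting system."""
    lags = [(r["report"] - r["landing"]).days for r in records if r["system"] == system]
    return sum(lags) / len(lags)
```

With data of this shape, comparing `mean_lag_days` across systems gives the timeliness measure; the study's accuracy comparison would additionally require matching fields (gear, port, hauls) between paired dealer and vessel reports.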
Abstract:
The recent application of large-eddy simulation (LES) to particle-laden turbulence requires that LES with a subgrid-scale (SGS) model accurately predict particle distributions. Usually, an SGS particle model is used to recover the small-scale structures of the velocity field. In this study, we propose a rescaling technique to recover the effects of small-scale motions on the preferential concentration of inertial particles. The technique is used to simulate particle distributions in isotropic turbulence by LES and produces results consistent with direct numerical simulation (DNS). Key words: particle distribution, particle-laden turbulence, large-eddy simulation, subgrid scale model.
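The abstract does not give the rescaling formula, so as a generic sketch only: inertial particle motion under Stokes drag can be advanced using the resolved LES velocity plus a rescaled subgrid fluctuation. The rescaling factor `k_sgs` below is a placeholder assumption, not the authors' model:

```python
def advance_particle(x, v, u_les, u_sgs_fluct, tau_p, k_sgs=1.0, dt=1e-3):
    """One explicit Euler step of an inertial particle under Stokes drag.

    x, v        : particle position and velocity (1-D floats for brevity)
    u_les       : resolved (filtered) fluid velocity at the particle
    u_sgs_fluct : modeled subgrid velocity fluctuation at the particle
    tau_p       : particle response time (sets the Stokes number)
    k_sgs       : placeholder rescaling factor for the subgrid contribution
    """
    u_seen = u_les + k_sgs * u_sgs_fluct   # fluid velocity "seen" by the particle
    a = (u_seen - v) / tau_p               # Stokes drag acceleration
    v_new = v + a * dt
    x_new = x + v_new * dt
    return x_new, v_new
```

The point of such a correction is that preferential concentration depends on the small-scale strain and vorticity that plain LES filters out; without some reconstruction or rescaling of `u_sgs_fluct`, LES underpredicts particle clustering relative to DNS.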
Abstract:
This study has investigated the medium- to long-term costs to Higher Education Institutions (HEIs) of the preservation of research data and developed guidance for HEFCE and institutions on these issues. It has provided an essential methodological foundation on research data costs for the forthcoming HEFCE-sponsored feasibility study for a UK Research Data Service. It will also assist HEIs and funding bodies wishing to establish strategies and TRAC costings for long-term data management and archiving. The rising tide of digital research data raises issues of access, curation and preservation for HEIs, and within the UK a growing number of research funders are now implementing policies requiring researchers to submit data management, preservation or data sharing plans with their funding applications.
Abstract:
Organised by Knowledge Exchange and the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'. The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information which a wide range of institutions must preserve emphasises the need for robust cost modelling. Such models should enable these institutions to assess what resources are needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. To justify the costs, institutions also need to describe the expected benefits of preserving digital information. This workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges of preserving research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place at the Danish Agency for Culture, directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.