921 results for Artificial Information Models


Relevance: 30.00%

Publisher:

Abstract:

Organised by Knowledge Exchange & the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'. [Image: participants in a break-out discussion during the workshop on cost models.]

The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information that a wide range of institutions must preserve underlines the need for robust cost modelling. Such models should enable these institutions both to assess the resources needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. To justify the costs, institutions also need to describe the expected benefits of preserving digital information.

This workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges of preserving research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place at the Danish Agency for Culture and was planned directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.

Relevance: 30.00%

Publisher:

Abstract:

Accurate and precise estimates of age and growth rates are essential parameters in understanding the population dynamics of fishes. Some of the more sophisticated stock assessment models, such as virtual population analysis, require age and growth information to partition catch data by age. Stock assessment efforts by regulatory agencies are usually directed at specific fisheries that are being heavily exploited and are suspected of being overfished. Interest in stock assessment of some of the oceanic pelagic fishes (tunas, billfishes, and sharks) has developed only over the last decade, during which exploitation has increased steadily in response to growing worldwide demand for these resources.

Traditionally, the age of fishes has been estimated by enumerating growth bands on skeletal hardparts, by length-frequency analysis, through tag-and-recapture studies, and by raising fish in enclosures. However, the problems involved in determining the age of some of the oceanic pelagic fishes are unique compared with other species. For example, sampling these large, highly mobile fishes is difficult because of their size, their extensive distributions throughout the world's oceans, and, for some, such as the marlins, infrequent catches. In addition, the movements of oceanic pelagic fishes often transect temperate as well as tropical oceans, making interpretation of growth bands on skeletal hardparts more difficult than for more sedentary temperate species. Many oceanic pelagics are also long-lived, attaining ages in excess of 30 yr, and more often than not their life cycles do not lend themselves easily to artificial propagation and culture. These factors contribute to the difficulty of determining ages and are generally characteristic of this group: the tunas, billfishes, and sharks. Accordingly, the rapidly growing international interest in managing oceanic pelagic fishes, as well as the unique difficulties in ageing these species, prompted us to hold this workshop.
Our two major objectives for this workshop are to: 1) encourage the interchange of ideas on this subject, and 2) establish the "state of the art." A total of 65 scientists from 10 states in the continental United States and Hawaii, three provinces in Canada, France, the Republic of Senegal, Spain, Mexico, Ivory Coast, and New South Wales (Australia) attended the workshop held at the Southeast Fisheries Center, Miami, Fla., 15-18 February 1982. Our first objective, encouraging the interchange of ideas, is well illustrated in the summaries of the Round Table Discussions and in the Glossary, which defines terms used in this volume. The majority of the workshop participants agreed that the lack of validation of age estimates, and of the means to accomplish it, is a serious problem preventing advances in assessing the age and growth of fishes, particularly oceanic pelagics. The alternatives relating to the validation problem were exhaustively reviewed during the Round Table Discussions and are a major highlight of this workshop. How well we accomplished our second objective, to establish the "state of the art" on age determination of oceanic pelagic fishes, will probably best be judged on the basis of these proceedings and whether future research efforts are directed at the problem areas we have identified. In order to produce high-quality papers, workshop participants served as referees for the manuscripts published in this volume. Several papers given orally at the workshop, and included in these proceedings, were summarized from full-length manuscripts that have been submitted to or published in other scientific outlets; these papers are designated as SUMMARY PAPERS. In addition, the SUMMARY PAPER designation was also assigned to workshop papers that represented very preliminary or initial stages of research, cursory progress reports, data-poor papers, or brief reviews of general topics.
Bilingual abstracts were included for all papers that required translation. We gratefully acknowledge the support of everyone involved in this workshop. Funding was provided by the Southeast Fisheries Center, and Jack C. Javech prepared the scientific illustrations appearing on the cover, between major sections, and in the Glossary. (PDF file contains 228 pages.)
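For orientation, growth in fisheries work of this kind is conventionally summarized by the von Bertalanffy growth function, L(t) = L∞ · (1 − e^(−k·(t − t₀))). The sketch below uses hypothetical parameter values, not values from these proceedings, purely to illustrate the model's shape:

```python
import math

def von_bertalanffy(age, l_inf, k, t0):
    """Predicted length at a given age under the von Bertalanffy
    growth function: L(t) = L_inf * (1 - exp(-k * (t - t0)))."""
    return l_inf * (1.0 - math.exp(-k * (age - t0)))

# Hypothetical parameters for illustration only (not from the proceedings):
# asymptotic length 300 cm, growth coefficient 0.15 / yr, t0 = -0.5 yr.
for age in (1, 5, 10, 30):
    print(f"age {age:2d} yr -> {von_bertalanffy(age, 300.0, 0.15, -0.5):6.1f} cm")
```

Note how growth decelerates toward the asymptotic length, which is one reason long-lived pelagics (ages in excess of 30 yr) are hard to age from size alone: older age classes pile up at nearly the same length.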

Relevance: 30.00%

Publisher:

Abstract:

[EN] The generation of spikes by neurons is energetically a costly process, and evaluating the metabolic energy required to maintain the signaling activity of neurons is a challenge of practical interest. Neuron models are frequently used to represent the dynamics of real neurons, but hardly ever to evaluate the electrochemical energy required to maintain that dynamics. This paper discusses the interpretation of a Hodgkin-Huxley circuit as an energy model for real biological neurons and uses it to evaluate the consumption of metabolic energy in the transmission of information between neurons coupled by electrical synapses, i.e., gap junctions. We show that for a single postsynaptic neuron, maximum energy efficiency, measured in bits of mutual information per molecule of adenosine triphosphate (ATP) consumed, requires maximum energy consumption. For groups of parallel postsynaptic neurons, we determine values of the synaptic conductance at which the energy efficiency of the transmission presents clear maxima at relatively low values of metabolic energy consumption. Contrary to what might be expected, the best performance occurs at a low energy cost.
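As a back-of-the-envelope illustration of the kind of energy bookkeeping this abstract refers to (a generic textbook-style estimate, not the authors' Hodgkin-Huxley model), the ATP cost of one spike can be approximated from the Na+ load that the Na+/K+ pump must subsequently extrude, at one ATP per three Na+ ions. Every numerical value below is an assumption chosen for illustration:

```python
# Rough estimate of the ATP cost of one action potential from the
# sodium load, assuming the Na+/K+-ATPase extrudes 3 Na+ per ATP.
# All numbers are illustrative assumptions, not values from the paper.

E_CHARGE = 1.602e-19   # elementary charge (C)
CM = 1e-6              # membrane capacitance per area, 1 uF/cm^2 (F/cm^2)
AREA = 1e-4            # membrane area, ~10^4 um^2 (cm^2)
DELTA_V = 0.1          # ~100 mV spike amplitude (V)
OVERLAP = 4.0          # excess Na+ entry from overlapping Na+/K+ currents (assumed)

# Minimum charge to depolarize the membrane by DELTA_V, inflated by the
# overlap factor to account for simultaneous K+ efflux during the spike.
na_charge = CM * AREA * DELTA_V * OVERLAP   # coulombs of Na+ entering
na_ions = na_charge / E_CHARGE              # number of Na+ ions
atp_per_spike = na_ions / 3.0               # 3 Na+ extruded per ATP

print(f"Na+ ions per spike: {na_ions:.2e}")
print(f"ATP per spike:      {atp_per_spike:.2e}")
```

Under these assumptions the estimate lands in the vicinity of 10^8 ATP per spike, which gives a sense of scale for an efficiency measured in bits of mutual information per ATP molecule.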

Relevance: 30.00%

Publisher:

Abstract:

Learning probability distributions from data is a ubiquitous problem in Statistics and Artificial Intelligence. Over the last few decades, several algorithms have been proposed for learning probability distributions based on decomposable models, owing to their advantageous theoretical properties. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, k, which controls the complexity of the model. Unfortunately, the problem of learning a maximum likelihood decomposable model with a given maximum clique size is NP-hard for k > 2. In this work, we propose a family of algorithms that approximates this problem with a worst-case computational complexity of O(k · n^2 log n), where n is the number of random variables involved. The structures of the decomposable models that solve the maximum likelihood problem are called maximal k-order decomposable graphs. Our proposals, called fractal trees, construct a sequence of maximal i-order decomposable graphs, for i = 2, ..., k, in k − 1 steps. At each step, the algorithms follow a divide-and-conquer strategy based on the particular features of this type of structure. Additionally, we propose a prune-and-graft procedure which transforms a maximal k-order decomposable graph into another one of higher likelihood. We have implemented two particular fractal tree algorithms, called parallel fractal tree and sequential fractal tree, which can be considered a natural extension of Chow and Liu's algorithm from k = 2 to arbitrary values of k. Both algorithms have been compared against other efficient approaches in artificial and real domains and show competitive performance on the maximum likelihood problem. Owing to their low computational complexity, they are especially suitable for high-dimensional domains.
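For context, the k = 2 base case that fractal trees generalize is Chow and Liu's algorithm: weight each pair of variables by their empirical mutual information and keep a maximum-weight spanning tree. A minimal sketch of that base case (the toy data and function names are this note's own, not the paper's):

```python
import math
from itertools import combinations

def mutual_information(data, i, j):
    """Empirical mutual information (in nats) between discrete variables i, j."""
    n = len(data)
    pij, pi, pj = {}, {}, {}
    for row in data:
        pij[(row[i], row[j])] = pij.get((row[i], row[j]), 0) + 1
        pi[row[i]] = pi.get(row[i], 0) + 1
        pj[row[j]] = pj.get(row[j], 0) + 1
    mi = 0.0
    for (a, b), c in pij.items():
        p_ab, p_a, p_b = c / n, pi[a] / n, pj[b] / n
        mi += p_ab * math.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(data, n_vars):
    """Maximum-likelihood tree (the k = 2 decomposable model): a maximum
    spanning tree under pairwise mutual information, via Kruskal's rule."""
    edges = sorted(((mutual_information(data, i, j), i, j)
                    for i, j in combinations(range(n_vars), 2)), reverse=True)
    parent = list(range(n_vars))
    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                  # keep the edge unless it closes a cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy data: X2 is a copy of X0 while X1 is unrelated, so the learned
# tree should contain the edge (0, 2).
data = [(0, 1, 0), (1, 0, 1), (0, 0, 0), (1, 1, 1), (0, 1, 0), (1, 0, 1)]
print(chow_liu_tree(data, 3))
```

The fractal tree algorithms of the paper then extend this idea from trees (cliques of size 2) to maximal k-order decomposable graphs for arbitrary k.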

Relevance: 30.00%

Publisher:

Abstract:

[EN] This paper is based on the following project:

Relevance: 30.00%

Publisher:

Abstract:

According to the Millennium Ecosystem Assessment’s chapter “Coastal Systems” (Agardy and Alder 2005), 40% of the world population falls within 100 km of the coast. Agardy and Alder report that population densities in coastal regions are three times those of inland regions and demographic forecasts suggest a continued rise in coastal populations. These high population levels can be partially traced to the abundance of ecosystem services provided in the coastal zone. While populations benefit from an abundance of services, population pressure also degrades existing services and leads to increased susceptibility of property and human life to natural hazards. In the face of these challenges, environmental administrators on the coast must pursue agendas which reflect the difficult balance between private and public interests. These decisions include maintaining economic prosperity and personal freedoms, protecting or enhancing the existing flow of ecosystem services to society, and mitigating potential losses from natural hazards. (PDF contains 5 pages)

Relevance: 30.00%

Publisher:

Abstract:

This thesis is concerned with spatial filtering. What is its utility in tone reproduction? Does it exist in vision, and if so, what constraints does it impose on the nervous system?

Tone reproduction is just the art and science of taking a picture and then displaying it. The sensors available to capture an image have a greater dynamic range than the media that may be used to display it. Conventionally, spatial filtering is used to boost contrast; it ameliorates the loss of contrast that results when the sensor signal range is scaled down to fit the display range. In this thesis, a type of nonlinear spatial filtering is discussed that results in direct range reduction without range scaling. This filtering process is instantiated in a real-time image processor built using analog CMOS VLSI.

Spatial filtering must be applied with care in both artificial and natural vision systems. It is argued that the nervous system does not simply filter linearly across an image. Rather, the way that we see things implies that the nervous system filters nonlinearly. Further, many models for color vision include a high-pass filtering step in which the DC information is lost. A real-time study of filtering in color space leads to the conclusion that the nervous system is not that simple, and that it maintains DC information by referencing to white.
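A software sketch of this kind of nonlinear range reduction (a Retinex-style divisive normalization written for illustration, not the analog CMOS circuit described in the thesis): each pixel is divided by a local average, so the output encodes local contrast rather than absolute signal level, and range reduction happens without globally scaling the signal.

```python
def box_blur(img, radius):
    """Simple box blur; img is a list of rows of floats."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += img[yy][xx]
                        count += 1
            out[y][x] = total / count
    return out

def range_reduce(img, radius=1, eps=1e-3):
    """Nonlinear range reduction: divide each pixel by its local average,
    so output depends on local contrast rather than absolute level
    (a Retinex-like sketch, not the thesis circuit)."""
    local = box_blur(img, radius)
    return [[p / (m + eps) for p, m in zip(prow, mrow)]
            for prow, mrow in zip(img, local)]

# A bright region and a dark region with the same relative contrast:
# interior pixels of both regions map to similar output values even
# though the input spans three orders of magnitude.
img = [[1000.0, 1000.0, 1.0, 1.0],
       [1200.0, 1000.0, 1.2, 1.0],
       [1000.0, 1000.0, 1.0, 1.0]]
out = range_reduce(img, radius=1)
```

Interior pixels of the bright region (around 1000) and the dark region (around 1) both come out near 1.0, while a linear rescaling would have crushed the dark region's contrast entirely; boundary pixels between the regions are the price this local scheme pays.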
