849 results for alternative economic networks
Abstract:
Organised by Knowledge Exchange & the Nordbib programme, 11 June 2012, 8:30-12:30, Copenhagen, adjacent to the Nordbib conference 'Structural frameworks for open, digital research'. The Knowledge Exchange and the Nordbib programme organised a workshop on cost models for the preservation and management of digital collections. The rapid growth of the digital information which a wide range of institutions must preserve emphasizes the need for robust cost modelling. Such models should enable these institutions both to assess what resources are needed to sustain their digital preservation activities and to compare different preservation solutions in order to select the most cost-efficient alternative. In order to justify the costs, institutions also need to describe the expected benefits of preserving digital information. The workshop provided an overview of existing models and demonstrated the functionality of some of the current cost tools. It considered the specific economic challenges with regard to the preservation of research data and addressed the benefits of investing in the preservation of digital information. Finally, the workshop discussed international collaboration on cost models. The aim of the workshop was to facilitate understanding of the economics of data preservation and to discuss the value of developing an international benchmarking model for the costs and benefits of digital preservation. The workshop took place at the Danish Agency for Culture and was planned directly prior to the Nordbib conference 'Structural frameworks for open, digital research'.
Abstract:
The cod stock in the Western Baltic Sea is assessed as overfished according to the definitions of the UN World Summit on Sustainable Development in Johannesburg in 2002, and the European Fisheries Council therefore enforced a multi-annual management plan in 2007. Our medium-term simulations over the next 10 years assume stock productivity similar to that of the past four decades and indicate that the goals of the management plan can be achieved through TAC and consistent effort regulations. Taking account of the uncertainty in recruitment patterns, the target average fishing mortality of F = 0.6 per year for age groups 3-6 years, as defined in the management plan, is indicated to exceed the sustainable levels consistent with high long-term yields and a low risk of depletion. The stipulated constraint of ±15% on annual TAC variations will dominate future fisheries management and implies a high recovery potential of the stock through continued reductions in fishing mortality. Scientific assessment of sustainable levels of exploitation, and their consideration in the plan, is strongly advised, taking account of uncertainties attributed to environmental and biological effects. We recommend that our study be complemented with economic impact assessments, including effects on by-catch species, which have been disregarded in this study. It is further demonstrated that the goals of the management plan can alternatively be achieved by mesh size adaptations. Realizing the required reductions in fishing mortality through mesh size increases would also, after a few years, avoid discards of undersized fish by means of improved selectivity, another important element of the Common Fisheries Policy. However, it is emphasized that technical regulations since 1990 have failed to affect the by-catch and discards of juvenile cod. In any case, meaningful implementation of the multi-annual management plan through stringent control and enforcement appears critical.
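A minimal projection sketch, with invented parameters and a deliberately simplified surplus-production model rather than the assessment model underlying the study, can illustrate how a fixed F target interacts with the ±15% cap on annual TAC changes:

```python
import numpy as np

# Illustrative only: hypothetical parameters, not the ICES assessment model.
# Biomass follows a Schaefer-type surplus-production update, the catch is set
# from a target fishing mortality, and the annual TAC may change by at most
# +/-15% relative to the previous year.

r, K = 0.5, 200_000        # assumed intrinsic growth rate, carrying capacity (t)
B = 60_000                 # assumed starting biomass (t), depleted
F_target = 0.6             # target fishing mortality from the plan
tac_prev = 25_000          # assumed last agreed TAC (t)

for year in range(1, 11):
    tac_wanted = B * (1 - np.exp(-F_target))                    # catch implied by F target
    tac = np.clip(tac_wanted, 0.85 * tac_prev, 1.15 * tac_prev) # +/-15% constraint
    catch = min(tac, B)
    B = max(B + r * B * (1 - B / K) - catch, 0.0)               # surplus production update
    tac_prev = tac
    print(f"year {year:2d}: TAC = {tac:8.0f} t, biomass = {B:9.0f} t")
```

In toy runs like this the ±15% constraint, rather than the F target itself, typically governs how quickly catches can adjust, which is the dynamic the abstract highlights.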
Abstract:
28 p.
Abstract:
Kainji Lake Basin is the first man-made lake in Nigeria, with a surface area of 1,270 km². Since its creation in 1968, research activities have been carried out on the biological, socio-economic, hydrological and limnological characteristics of the water body. Extension activities concentrated on the dissemination of proven technologies developed by the research scientists. Most of the socio-economic and extension activities focused on fishermen, as women were regarded as homemakers whose activities were concentrated in the home; the situation was further compounded by the Islamic injunction of seclusion. The intervention of NGKLFPP in 1993 introduced many changes into the research and extension activities directed at the beneficiaries of the project, because women were considered a major stakeholder around the lake area. This paper describes the project's introduction of alternative income-generating activities for women around Kainji Lake. The intervention has improved the living standard of women and, to a certain extent, reduced poverty among women in the area.
Abstract:
4 p.
Abstract:
22 p.
Abstract:
Nowadays enterprises, and especially SMEs, are immersed in a very difficult economic situation. They therefore need new and innovative tools to compete in that environment. Integrating Web 2.0 and social networks into companies' marketing strategies could be the key to success. If social networks are well managed, they can contribute a great deal to enterprise plans. Moreover, social networks are very attractive from an economic point of view, as companies can find most of their customers on them.
Abstract:
We develop and test a method to estimate relative abundance from catch and effort data using neural networks. Most stock assessment models use time series of relative abundance as their major source of information on abundance levels. These time series of relative abundance are frequently derived from catch-per-unit-of-effort (CPUE) data using generalized linear models (GLMs). GLMs are used to attempt to remove variation in CPUE that is not related to the abundance of the population. However, GLMs are restricted in the types of relationships they allow between CPUE and the explanatory variables. An alternative approach is to use structural models based on scientific understanding to develop complex non-linear relationships between CPUE and the explanatory variables. Unfortunately, the scientific understanding required to develop these models may not be available. In contrast to structural models, neural networks use the data themselves to estimate the structure of the non-linear relationship between CPUE and the explanatory variables. Therefore, neural networks may provide a better alternative when the structure of the relationship is uncertain. We use simulated data based on a habitat-based method to test the neural network approach and to compare it with the GLM approach. Cross-validation and simulation tests show that the neural network performed better than nominal effort and the GLM approach, although the improvement over GLMs is not substantial. We applied the neural network model to CPUE data for bigeye tuna (Thunnus obesus) in the Pacific Ocean.
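As a rough illustration of the comparison described above (not the authors' habitat-based simulation; the covariates and data-generating process here are invented), the following sketch standardises simulated CPUE data with both a log-linear model and a small neural network and compares cross-validated fits:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000

# Invented covariates: a year effect (true abundance trend) plus depth and
# sea-surface temperature that affect catchability non-linearly.
year = rng.integers(0, 20, n)
depth = rng.uniform(50, 400, n)
sst = rng.uniform(15, 30, n)

abundance = np.exp(-0.05 * year)                                # declining stock
catchability = np.exp(-((sst - 24) / 4) ** 2) * (depth / 400) ** 0.5
cpue = abundance * catchability * rng.lognormal(0.0, 0.3, n)

X = np.column_stack([year, depth, sst])
y = np.log(cpue)

# Log-linear stand-in for a GLM: linear in the covariates on the log scale.
glm = LinearRegression()
# Neural network: learns the non-linear catchability surface from the data.
nnet = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0),
)

for name, model in [("GLM", glm), ("NN ", nnet)]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name} cross-validated R^2: {score:.3f}")
```

Because the simulated catchability is non-linear in temperature, the network can capture structure the linear model misses, which is the kind of gain (and its limits) the abstract reports.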
Abstract:
Over the last 30 years, culture has played an increasingly decisive role in the political, economic and social spheres. The cultural goods industry has shown growing strength, even in the face of crises, and multinational agencies, together with nation states, have used cultural projects with the aim of fostering the economic and social development of several regions of the world. However, at the same time as culture plays such a predominant role, it has been shaped according to the dictates of market logic. For this reason, bodies such as UNESCO point to the need for countries to foster a heterogeneous cultural production that escapes the single models of the cultural industries. This work analyses the Cultura Viva programme, implemented under the Lula government, whose main action is the Pontos de Cultura. Drawing on concepts from Gilles Deleuze, Félix Guattari and Antonio Negri, it discusses in what ways Cultura Viva constitutes an alternative to the model of cultural promotion governed by these market dictates. By proposing a different form of production, organisation and management of the cultural sphere, the Cultura Viva programme is part of a policy that is more open to the different groups and movements of Brazilian society.
Abstract:
In this report we analyze the Topic 5 report’s recommendations for reducing nitrogen losses to the Gulf of Mexico (Mitsch et al. 1999). We indicate the relative costs and cost-effectiveness of different control measures, and the potential benefits within the Mississippi River Basin. For major nonpoint sources, such as agriculture, we examine both national and basin costs and benefits. Based on the Topic 2 economic analysis (Diaz and Solow 1999), the direct measurable dollar benefits to Gulf fisheries of reducing nitrogen loads from the Mississippi River Basin are very limited at best. Although restoring the ecological communities in the Gulf may be significant over the long term, we do not currently have the information needed to estimate the benefits of such measures for the Gulf’s long-term health. For these reasons, we assume that measures to reduce nitrogen losses to the Gulf will ultimately prove beneficial, and we concentrate on analyzing the cost-effectiveness of alternative reduction strategies. We recognize that important public decisions are seldom made on the basis of strict benefit–cost analysis, especially when complete benefits cannot be estimated. We look at different approaches, and different levels of these approaches, to identify those that are cost-effective and those that have limited undesirable secondary effects, such as reduced exports, which may result in lost market share. We concentrate on the measures highlighted in the Topic 5 report and are also guided by the source identification information in the Topic 3 report (Goolsby et al. 1999). Nonpoint sources, which are responsible for the bulk of the nitrogen, receive most of our attention. We consider restrictions on nitrogen fertilizer levels, and restoration of wetlands and riparian buffers for denitrification. We also examine giving more emphasis to nitrogen control in regions contributing a greater share of the nitrogen load.
Abstract:
Implications of the fish export trade for the people and the fisheries resource of Lake Victoria, Uganda, were examined. Eight fish processing factories and ninety fishers were analyzed in terms of the socio-economic characteristics of the fishers and the economic characteristics of the fish factories. Results indicated that industrial fish processors in Uganda are presently the main link between the artisanal fisher-folk and the overseas export markets. Their entry into the market has stabilized and expanded the fisher-folk market and average earnings. Fishers attributed improvements in incomes and living standards (76%) to positive changes in the fish market (78%) over the last 5 years (1994-1999). Ugandan fisher-folk communities are not seriously affected by the Nile perch exports (73%) because they normally have easy access to cheap fish at prices much lower than urban prices and depend mainly on alternative fish species of lower export value. The price of Nile perch positively influences the price of Tilapia.
Abstract:
We quantify the conditions that might trigger widespread adoption of alternative fuel vehicles (AFVs) to support energy policy. Empirical review shows that early adopters are heterogeneous, motivated by financial benefits, environmental appeal, new technology, and vehicle reliability. A probabilistic Monte Carlo simulation model is used to assess consumer heterogeneity for early and mass-market adopters. For early adopters, full battery electric vehicles (BEVs) are competitive but unable to surpass diesels or hybrids because of their purchase price premium and the lack of charging availability. For mass adoption, simulations indicate that if the purchase price premium of a BEV closes to within 20% of an in-class internal combustion engine (ICE) vehicle, combined with a 60% increase in refuelling availability relative to the incumbent system, BEVs become competitive. However, this depends on a mass market that values the fuel economy and CO2 reduction benefits associated with BEVs. We also find that the largest influence on early adoption is financial benefit rather than pro-environmental behaviour, suggesting that AFVs should be marketed by appealing to economic benefits combined with pro-environmental behaviour to motivate adoption. Monte Carlo simulations combined with scenarios can give insight into diffusion dynamics for other energy demand-side technologies. © 2012 Elsevier Inc.
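The flavour of such a Monte Carlo treatment of consumer heterogeneity can be sketched as follows; the attribute weights, the assumed fuel and CO2 benefits, and the premium/refuelling scenarios are placeholder values, not the study's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n_consumers = 100_000

# Invented heterogeneity: each simulated buyer weighs the purchase price
# premium, fuel-cost savings, CO2 reduction and refuelling availability
# differently when comparing a BEV against an in-class ICE vehicle.
w_price = rng.lognormal(0.0, 0.5, n_consumers)    # sensitivity to the premium
w_fuel = rng.lognormal(0.0, 0.5, n_consumers)     # value placed on fuel savings
w_co2 = rng.lognormal(-0.5, 0.7, n_consumers)     # value placed on CO2 reduction
w_refuel = rng.lognormal(0.0, 0.5, n_consumers)   # need for refuelling access

def bev_adoption_share(price_premium, refuel_availability):
    """Fraction of simulated consumers whose BEV 'utility' beats the ICE baseline."""
    utility = (-w_price * price_premium
               + w_fuel * 0.30          # assumed relative fuel-cost saving
               + w_co2 * 0.20           # assumed relative CO2 benefit
               - w_refuel * (1.0 - refuel_availability))
    return (utility > 0).mean()

# Early-market conditions versus the 20% premium / improved refuelling scenario.
print("50% premium, 30% refuelling:", bev_adoption_share(0.50, 0.30))
print("20% premium, 60% refuelling:", bev_adoption_share(0.20, 0.60))
```

Sweeping the premium and refuelling inputs across scenarios is what lets this kind of model identify the thresholds at which adoption shares change sharply.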
Abstract:
Recent work in sensor databases has focused extensively on distributed query problems, notably distributed computation of aggregates. Existing methods for computing aggregates broadcast queries to all sensors and use in-network aggregation of responses to minimize messaging costs. In this work, we focus on uniform random sampling across nodes, which can serve both as an alternative building block for aggregation and as an integral component of many other useful randomized algorithms. Prior to our work, the best existing proposals for uniform random sampling of sensors involve contacting all nodes in the network. We propose a practical method which is only approximately uniform, but contacts a number of sensors proportional to the diameter of the network instead of its size. The approximation achieved is tunably close to exact uniform sampling, and only relies on well-known existing primitives, namely geographic routing, distributed computation of Voronoi regions and von Neumann's rejection method. Ultimately, our sampling algorithm has the same worst-case asymptotic cost as routing a point-to-point message, and thus it is asymptotically optimal among request/reply-based sampling methods. We provide experimental results demonstrating the effectiveness of our algorithm on both synthetic and real sensor topologies.
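A centralised toy version of the idea (draw a uniform location in the deployment area, 'route' to the sensor owning that location's Voronoi cell, then apply a von Neumann rejection step to cancel the cell-area bias) might look like the sketch below; the real protocol performs the routing and Voronoi computation in a distributed fashion, which this sketch does not attempt:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical deployment: sensors scattered in the unit square.
sensors = rng.uniform(0.0, 1.0, size=(50, 2))

# Estimate each sensor's Voronoi cell area by Monte Carlo: the fraction of
# random points for which it is the nearest sensor.
probe = rng.uniform(0.0, 1.0, size=(100_000, 2))
nearest = np.argmin(((probe[:, None, :] - sensors[None, :, :]) ** 2).sum(-1), axis=1)
cell_area = np.bincount(nearest, minlength=len(sensors)) / len(probe)
a_min = cell_area[cell_area > 0].min()

def sample_sensor():
    """Approximately uniform sensor sample: geographic hit plus rejection."""
    while True:
        p = rng.uniform(0.0, 1.0, size=2)                 # uniform location
        i = int(np.argmin(((sensors - p) ** 2).sum(-1)))  # 'route' to the cell owner
        if rng.random() < a_min / cell_area[i]:           # von Neumann rejection
            return i

draws = [sample_sensor() for _ in range(10_000)]
counts = np.bincount(draws, minlength=len(sensors))
print("min/max selection frequency:", counts.min(), counts.max())
```

The rejection step accepts sensor i with probability proportional to 1/area(i), so a sensor's large cell no longer makes it more likely to be chosen, which is exactly the bias the paper's method corrects.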
Abstract:
The quality of available network connections can often have a large impact on the performance of distributed applications. For example, document transfer applications such as FTP, Gopher and the World Wide Web suffer increased response times as a result of network congestion. For these applications, the document transfer time is directly related to the available bandwidth of the connection. Available bandwidth depends on two things: 1) the underlying capacity of the path from client to server, which is limited by the bottleneck link; and 2) the amount of other traffic competing for links on the path. If measurements of these quantities were available to the application, the current utilization of connections could be calculated. Network utilization could then be used as a basis for selection from a set of alternative connections or servers, thus providing reduced response time. Such a dynamic server selection scheme would be especially important in a mobile computing environment in which the set of available servers is frequently changing. In order to provide these measurements at the application level, we introduce two tools: bprobe, which provides an estimate of the uncongested bandwidth of a path; and cprobe, which gives an estimate of the current congestion along a path. These two measures may be used in combination to provide the application with an estimate of available bandwidth between server and client thereby enabling application-level congestion avoidance. In this paper we discuss the design and implementation of our probe tools, specifically illustrating the techniques used to achieve accuracy and robustness. We present validation studies for both tools which demonstrate their reliability in the face of actual Internet conditions; and we give results of a survey of available bandwidth to a random set of WWW servers as a sample application of our probe technique. We conclude with descriptions of other applications of our measurement tools, several of which are currently under development.
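How the two probe estimates combine into a server-selection decision can be shown with a small sketch; the numbers and helper names below are illustrative stand-ins, not the actual bprobe/cprobe implementations:

```python
from dataclasses import dataclass

@dataclass
class PathEstimate:
    """Illustrative stand-in for the two probe measurements of one path."""
    server: str
    bottleneck_capacity_mbps: float   # what a bprobe-style tool would report
    competing_traffic_mbps: float     # what a cprobe-style tool would report

    @property
    def available_bandwidth_mbps(self) -> float:
        # Available bandwidth = bottleneck capacity minus competing traffic.
        return max(self.bottleneck_capacity_mbps - self.competing_traffic_mbps, 0.0)

def pick_server(paths, document_size_mb):
    """Choose the replica with the lowest predicted transfer time."""
    def transfer_time(p):
        bw = p.available_bandwidth_mbps
        return float("inf") if bw == 0 else document_size_mb * 8 / bw
    return min(paths, key=transfer_time)

paths = [
    PathEstimate("mirror-a.example.org", 10.0, 7.5),   # fast but congested link
    PathEstimate("mirror-b.example.org", 4.0, 0.5),    # slower but quiet link
]
best = pick_server(paths, document_size_mb=5.0)
print(best.server, f"{best.available_bandwidth_mbps:.1f} Mb/s available")
```

The point of the example is that the congested high-capacity path loses to the quieter low-capacity one, which is the server-selection behaviour the probes are meant to enable.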
Abstract:
The development of ultra-high-speed (~20 Gsamples/s) analogue-to-digital converters (ADCs), and the delayed deployment of 40 Gbit/s transmission due to the economic downturn, have stimulated the investigation of digital signal processing (DSP) techniques for the compensation of optical transmission impairments. In the future, DSP will offer an entire suite of tools to compensate for optical impairments and facilitate the use of advanced modulation formats. Chromatic dispersion is a very significant impairment for high-speed optical transmission. This thesis investigates a novel electronic method of dispersion compensation which allows cost-effective, accurate detection of the amplitude and phase of the optical field in the radio frequency domain. The first electronic dispersion compensation (EDC) schemes accessed only the amplitude information using square-law detection and achieved an increase in transmission distances. This thesis presents a method that uses a frequency-sensitive filter to estimate the phase of the received optical field; in conjunction with the amplitude information, the entire field can then be digitised using ADCs. This allows DSP technologies to take the next step in optical communications without requiring complex coherent detection, which is of particular interest in metropolitan area networks. The full-field receiver investigated requires only an additional asymmetric Mach-Zehnder interferometer and a balanced photodiode to achieve a 50% increase in EDC reach compared with amplitude-only detection.
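Once both amplitude and phase of the field are available, chromatic dispersion can be removed digitally by applying the inverse of the fibre's frequency-domain transfer function; a minimal sketch of that step, with invented fibre and sampling parameters and ignoring how the phase is actually estimated, is:

```python
import numpy as np

# Invented parameters for illustration only.
n_samples = 4096
sample_rate = 20e9                  # 20 Gsample/s ADC
length_m = 80e3                     # 80 km of standard single-mode fibre
D = 17e-6                           # dispersion in s/m^2 (17 ps/nm/km)
wavelength = 1550e-9
c = 3e8
beta2 = -D * wavelength**2 / (2 * np.pi * c)   # group-velocity dispersion

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], n_samples)   # toy binary field
field_tx = symbols.astype(complex)

omega = 2 * np.pi * np.fft.fftfreq(n_samples, d=1 / sample_rate)
H_fibre = np.exp(1j * beta2 / 2 * omega**2 * length_m)   # dispersion of the link

field_rx = np.fft.ifft(np.fft.fft(field_tx) * H_fibre)           # "received" field
field_eq = np.fft.ifft(np.fft.fft(field_rx) * np.conj(H_fibre))  # EDC: inverse filter

print("residual error after compensation:", np.max(np.abs(field_eq - field_tx)))
```

Because dispersion is an all-pass phase distortion, the inverse filter recovers the field almost exactly in this idealised case; the practical difficulty addressed by the thesis is obtaining the phase needed to form that complex field in the first place.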