906 results for multispecies gillnet small-scale fishery


Relevance:

100.00%

Publisher:

Abstract:

Unlike most studies of rural poverty in Colombia, this study uses an asset-based approach to investigate the determinants of rural poverty. In particular, it examines the existence of local non-convexities in the income-generation process, the degree of concentration of households within certain ranges of asset accumulation, and the presence of differentiated marginal returns to assets. On this basis, it provides prima facie evidence of the existence of a poverty trap in the rural sector, opening a promising line of research on the topic that can contribute substantially to its understanding and to better design of social and sectoral policy.

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Because occupational accidents rank among the leading causes of absenteeism, disability and even death, this study describes accident rates in a manufacturing company over the period 2010 to 2014, considering factors such as severity and type of injury, type of work activity, and accident recurrence. Objectives: To establish the distribution of occupational accidents occurring in a manufacturing company in the period 2010 to 2014 by age, gender, work area and type of injury. Materials and methods: A cross-sectional study was carried out in which worker and company characteristics were analysed and associations were computed to establish the risk factors for accident recurrence. The setting was a food-manufacturing company with a total of 950 employees at risk of occupational accidents. A total of 338 accidents occurring between 2010 and 2014 were selected. Accidents were analysed by gender, work area and type of injury to determine their distribution across these factors. Bivariate analyses were then performed by means of statistical associations, using the Chi-squared statistic and parametric or non-parametric tests according to the normality of the quantitative variables. SPSS version 22 was used for the analysis. Results: The study found that the proportion of accidents during the study period relative to the number of workers was 35.6%, and 28.8% of the 950 workers (274 workers) had accidents. The median age was 35 years, and accidents were more frequent among women (55.6%). The work area with the highest number of accidents was manufacturing (75.7%).
The proportion of women whose accidents occurred in the manufacturing area exceeded the corresponding proportion of men by 23.4%. Regarding the type of injury, accidents most frequently affected the upper limbs; by gender, women had 2.02% more upper-limb accidents than men. The recurrence analyses determined that 16.8% of the workers had more than one accident, one worker had five accidents over the study period, and age was significantly associated with accident recurrence.
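The bivariate step this abstract describes (a Chi-squared test of independence between a categorical factor and accident recurrence) can be sketched in plain Python. The 2x2 counts below are illustrative only, not the study's data, and the hand-rolled Pearson statistic stands in for the SPSS procedure the authors used.

```python
def chi_square_2x2(table):
    """Pearson chi-squared statistic for a 2x2 contingency table
    (rows: factor present/absent; columns: recurrent accident yes/no)."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Illustrative counts only (not the study's data): 338 accidents
# cross-classified by a hypothetical age-group factor and recurrence.
table = [[40, 120], [17, 161]]
stat = chi_square_2x2(table)
# With 1 degree of freedom, chi2 > 3.84 rejects independence at the 5% level.
print(stat > 3.84)
```

In practice a library routine would also return the p-value and expected frequencies; the point here is only the shape of the observed-versus-expected computation.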

Relevance:

100.00%

Publisher:

Abstract:

By the end of the summer, the EU will launch new crisis management missions in the Horn of Africa, Niger and South Sudan. In this CEPS Commentary, Giovanni Faleg and Steven Blockmans question whether the new deployments will revive the EU’s persona as a global security actor. The authors point out that, without the backing of a comprehensive security strategy rationale, the EU’s re-engagement as a crisis manager that opts for small-scale operations will be seen as a continuation of its sleepwalking through a changing geostrategic landscape.

Relevance:

100.00%

Publisher:

Abstract:

The impact of humidity observations on forecast skill is explored by producing a series of global forecasts using initial data derived from the ERA-40 reanalysis system, in which all humidity data have been removed during the data assimilation. The new forecasts have been compared with the original ERA-40 analyses and forecasts made from them. Both sets of forecasts show virtually identical prediction skill in the extratropics and the tropics. Differences between the forecasts are small and amplify at a characteristic rate. There are larger differences in temperature and geopotential in the tropics, but the differences are small-scale and unstructured and have no noticeable effect on the skill of the wind forecasts. The results highlight the currently very limited impact of the humidity observations, used to produce the initial state, on the forecasts.
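The kind of skill comparison summarized above reduces, at its simplest, to scoring each forecast set against the verifying analysis. A minimal root-mean-square-error check is sketched below; the height values are invented for illustration and are not ERA-40 data.

```python
import math

def rmse(forecast, analysis):
    """Root-mean-square error of a forecast field against its verifying analysis."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, analysis))
                     / len(forecast))

# Illustrative 500 hPa geopotential height values (m), not ERA-40 data:
analysis   = [5500.0, 5520.0, 5480.0, 5510.0]
forecast_a = [5502.0, 5518.0, 5483.0, 5508.0]   # e.g. humidity data assimilated
forecast_b = [5503.0, 5517.0, 5484.0, 5507.0]   # e.g. humidity data withheld
print(rmse(forecast_a, analysis), rmse(forecast_b, analysis))
```

"Virtually identical skill" in the abstract's sense would correspond to the two RMSE curves lying essentially on top of each other at every forecast lead time.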

Relevance:

100.00%

Publisher:

Abstract:

Data from four recent reanalysis projects [ECMWF, NCEP-NCAR, NCEP-Department of Energy (DOE), NASA] have been diagnosed at the scale of synoptic weather systems using an objective feature tracking method. The tracking statistics indicate that, overall, the reanalyses correspond very well in the Northern Hemisphere (NH) lower troposphere, although differences in the spatial distribution of mean intensities show that the ECMWF reanalysis is systematically stronger in the main storm track regions but weaker around major orographic features. A direct comparison of the track ensembles indicates a number of systems with a broad range of intensities that compare well among the reanalyses. In addition, a number of small-scale weak systems are found that have no correspondence among the reanalyses or that only correspond upon relaxing the matching criteria, indicating possible differences in location and/or temporal coherence. These are distributed throughout the storm tracks, particularly in the regions known for small-scale activity, such as secondary development regions and the Mediterranean. For the Southern Hemisphere (SH), agreement is found to be generally less consistent in the lower troposphere, with significant differences in both track density and mean intensity. The systems that correspond between the various reanalyses are considerably reduced, and those that do not match span a broad range of storm intensities. Relaxing the matching criteria indicates that there is a larger degree of uncertainty in both the location of systems and their intensities compared with the NH. At upper-tropospheric levels, significant differences in the level of activity occur between the ECMWF reanalysis and the other reanalyses in both the NH and SH winters. This occurs due to a lack of coherence in the apparent propagation of the systems in ERA15 and appears most acute above 500 hPa. This is probably due to the use of optimal interpolation data assimilation in ERA15.
Also shown are results based on using the same techniques to diagnose the tropical easterly wave activity. Results indicate that the wave activity is sensitive not only to the resolution and assimilation methods used but also to the model formulation.

Relevance:

100.00%

Publisher:

Abstract:

Different systems, different purposes – but how do they compare as learning environments? We undertook a survey of students at the University, asking whether they learned from their use of the systems, whether they made contact with other students through them, and how often they used them. Although it was a small-scale survey, the results are quite enlightening and quite surprising. Blackboard is populated with learning material, has all the students on a module signed up to it, is a safe environment (in terms of Acceptable Use and some degree of staff monitoring) and provides privacy within the learning group (plus lecturer and relevant support staff). Facebook, on the other hand, has no learning material, only some of the students using the system, and, on the face of it, the opportunity for slips in privacy and potential bullying, because its Acceptable Use policy is more lax than an institutional one and breaches must be dealt with on an exception basis, when reported. So why do more students find people on their courses through Facebook than Blackboard? And why are up to 50% of students reporting that they have learned from using Facebook? Interviews indicate that students in subjects which use seminars are using Facebook to facilitate working groups – they can set up private groups which give them the privacy to discuss ideas in an environment which is perceived as safer than Blackboard can provide: no staff interference, unless they choose to invite staff in, and the opportunity to select who in the class can engage. The other striking finding is the difference in use between the genders. Males are using Blackboard more frequently than females, whilst the reverse is true for Facebook. Interviews suggest that this may have something to do with needing to access lecture notes… Overall, though, it appears that there is little relationship between the time spent engaging with Blackboard and reports that students have learned from it.
Because Blackboard is our central repository for notes, any contact is likely to result in some learning. Facebook, however, shows a clear relationship between frequency of use and perception of learning – and our students post frequently to Facebook. Whilst much of this is probably trivia and social chit-chat, the educational elements of it are, de facto, constructivist in nature. Further questions need to be answered: is the reason the students learn from Facebook that they are creating content which others will see and comment on? Is it because they can engage in a dialogue, without the risk of interruption by others?

Relevance:

100.00%

Publisher:

Abstract:

Sudden stratospheric warmings (SSWs) are usually considered to be initiated by planetary wave activity. Here it is asked whether small-scale variability (e.g., related to gravity waves) can lead to SSWs given a certain amount of planetary wave activity that is by itself not sufficient to cause an SSW. A highly vertically truncated version of the Holton–Mass model of stratospheric wave–mean flow interaction, recently proposed by Ruzmaikin et al., is extended to include stochastic forcing. In the deterministic setting, this low-order model exhibits multiple stable equilibria, corresponding to the undisturbed vortex and the SSW state, respectively. Momentum forcing due to quasi-random gravity wave activity is introduced as an additive noise term in the zonal momentum equation. Two distinct approaches are pursued to study the stochastic system. First, the system, initialized at the undisturbed state, is numerically integrated many times to derive statistics of the first passage times of the system undergoing a transition to the SSW state. Second, the Fokker–Planck equation corresponding to the stochastic system is solved numerically to derive the stationary probability density function of the system. Both approaches show that even small to moderate strengths of the stochastic gravity wave forcing can be sufficient to cause an SSW in cases where the deterministic system would not have predicted one.
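The first approach described above (repeated integrations of a noise-forced bistable system to collect first-passage times) can be sketched generically. The double-well drift below is a stand-in assumption, not the truncated Holton–Mass equations, which would require the full model; it illustrates only how additive noise drives transitions between two stable equilibria.

```python
import random

def first_passage_time(sigma, dt=0.01, x0=-1.0, barrier=1.0,
                       max_steps=200_000, seed=0):
    """Euler-Maruyama integration of dx = -V'(x) dt + sigma dW for the
    double-well potential V(x) = (x^2 - 1)^2 / 4, starting in the left well.
    Returns the first time x reaches the right well, or None if it never does."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    sqdt = dt ** 0.5
    for _ in range(max_steps):
        x += -x * (x * x - 1.0) * dt + sigma * sqdt * rng.gauss(0.0, 1.0)
        t += dt
        if x >= barrier:
            return t
    return None

# Repeating the integration gives first-passage statistics; stronger
# stochastic forcing should shorten the mean transition time.
times_weak = [first_passage_time(0.35, seed=s) for s in range(20)]
times_strong = [first_passage_time(0.6, seed=s) for s in range(20)]
```

The analogue of the paper's second approach would be to solve the corresponding one-dimensional Fokker–Planck equation for the stationary density, which concentrates probability in both wells.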

Relevance:

100.00%

Publisher:

Abstract:

Finite computing resources limit the spatial resolution of state-of-the-art global climate simulations to hundreds of kilometres. In neither the atmosphere nor the ocean are small-scale processes such as convection, clouds and ocean eddies properly represented. Climate simulations are known to depend, sometimes quite strongly, on the resulting bulk-formula representation of unresolved processes. Stochastic physics schemes within weather and climate models have the potential to represent the dynamical effects of unresolved scales in ways that conventional bulk-formula representations cannot. The application of stochastic physics to climate modelling is a rapidly advancing, important and innovative topic. The latest research findings are gathered together in the Theme Issue for which this paper serves as the introduction.

Relevance:

100.00%

Publisher:

Abstract:

Aerosols from anthropogenic and natural sources have been recognized as having an important impact on the climate system. However, the small size of aerosol particles (ranging from 0.01 to more than 10 μm in diameter) and their influence on solar and terrestrial radiation make them difficult to represent within the coarse resolution of general circulation models (GCMs), such that small-scale processes, for example, sulfate formation and conversion, need parameterizing. It is the parameterization of emissions, conversion, and deposition and the radiative effects of aerosol particles that causes uncertainty in their representation within GCMs. The aim of this study was to perturb aspects of a sulfur cycle scheme used within a GCM to represent the climatological impacts of sulfate aerosol derived from natural and anthropogenic sulfur sources. It was found that perturbing volcanic SO2 emissions and the scavenging rate of SO2 by precipitation had the largest influence on the sulfate burden. When these parameters were perturbed, the sulfate burden ranged from 0.73 to 1.17 TgS for 2050 sulfur emissions (A2 Special Report on Emissions Scenarios (SRES)), comparable with the range in sulfate burden across all the Intergovernmental Panel on Climate Change SRESs. Thus, the results here suggest that the range in sulfate burden due to model uncertainty is comparable with scenario uncertainty. Despite the large range in sulfate burden there was little influence on the climate sensitivity, which had a range of less than 0.5 K across the ensemble. We hypothesize that this small effect was partly associated with high sulfate loadings in the control phase of the experiment.
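The perturbed-parameter design described here (vary uncertain source and sink parameters, record the spread in the sulfate burden) can be sketched with a toy response function. The functional form and every coefficient below are illustrative assumptions, not the GCM's sulfur scheme; only the sampling pattern is the point.

```python
import random

def toy_sulfate_burden(volcanic_so2, scavenging_rate):
    """Illustrative response only: burden (TgS) rises with the SO2 source
    strength and falls as wet scavenging removes SO2 before conversion.
    Both arguments are multiplicative factors on the default values (1.0)."""
    return 0.95 * (1.0 + 0.3 * (volcanic_so2 - 1.0)) \
                / (1.0 + 0.5 * (scavenging_rate - 1.0))

rng = random.Random(42)
# Perturb both parameters independently by up to +/-50% of their defaults.
ensemble = [toy_sulfate_burden(rng.uniform(0.5, 1.5), rng.uniform(0.5, 1.5))
            for _ in range(100)]
print(min(ensemble), max(ensemble))
```

The min-max spread of the ensemble plays the role of the 0.73 to 1.17 TgS burden range reported in the abstract, which in the study was then compared against the across-scenario spread.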

Relevance:

100.00%

Publisher:

Abstract:

This article describes the development and evaluation of the U.K.’s new High-Resolution Global Environmental Model (HiGEM), which is based on the latest climate configuration of the Met Office Unified Model, known as the Hadley Centre Global Environmental Model, version 1 (HadGEM1). In HiGEM, the horizontal resolution has been increased to 0.83° latitude × 1.25° longitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. Multidecadal integrations of HiGEM, and the lower-resolution HadGEM, are used to explore the impact of resolution on the fidelity of climate simulations. Generally, SST errors are reduced in HiGEM. Cold SST errors associated with the path of the North Atlantic drift improve, and warm SST errors are reduced in upwelling stratocumulus regions, where the simulation of low-level cloud is better at higher resolution. The ocean model in HiGEM allows ocean eddies to be partially resolved, which dramatically improves the representation of sea surface height variability. In the Southern Ocean, most of the heat transport in HiGEM is achieved by resolved eddy motions, which replace the parameterized eddy heat transport in the lower-resolution model. HiGEM is also able to more realistically simulate small-scale features in the wind stress curl around islands and oceanic SST fronts, which may have implications for oceanic upwelling and ocean biology. Higher resolution in both the atmosphere and the ocean allows coupling to occur on small spatial scales. In particular, the small-scale interaction recently seen in satellite imagery between the atmosphere and tropical instability waves in the tropical Pacific Ocean is realistically captured in HiGEM. Tropical instability waves play a role in improving the simulation of the mean state of the tropical Pacific, which has important implications for climate variability.
In particular, all aspects of the simulation of ENSO (spatial patterns, the time scales at which ENSO occurs, and global teleconnections) are much improved in HiGEM.

Relevance:

100.00%

Publisher:

Abstract:

Problem structuring methods or PSMs are widely applied across a range of variable but generally small-scale organizational contexts. However, it has been argued that they are seen and experienced less often in areas of wide-ranging and highly complex human activity - specifically those relating to sustainability, environment, democracy and conflict (or SEDC). In an attempt to plan, track and influence human activity in SEDC contexts, the authors in this paper make the theoretical case for a PSM, derived from various existing approaches. They show how it could make a contribution in a specific practical context - within sustainable coastal development projects around the Mediterranean which have utilized systemic and prospective sustainability analysis or, as it is now known, Imagine. The latter is itself a PSM, but one which is 'bounded' within the limits of the project to help deliver the required 'deliverables' set out in the project blueprint. The authors argue that sustainable development projects would benefit from a deconstruction of process by those engaged in the project and suggest one approach that could be taken - a breakout from a project-bounded PSM to an analysis that embraces the project itself. The paper begins with an introduction to the sustainable development context and literature and then goes on to illustrate the issues by grounding the debate within a set of projects facilitated by Blue Plan for Mediterranean coastal zones. The paper goes on to show how the analytical framework could be applied and what insights might be generated.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes some of the results of a detailed farm-level survey of 32 small-scale cotton farmers in the Makhathini Flats region of South Africa. The aim was to assess and measure some of the impacts (especially in terms of savings in pesticide and labour as well as benefits to human health) attributable to the use of insect-tolerant Bt cotton. The study reveals a direct cost benefit for Bt growers of SAR416 ($51) per hectare per season due to a reduction in the number of insecticide applications. Cost savings emerged in the form of lower requirements for pesticide, but also important were reduced requirements for water and labour. The reduction in the number of sprays was particularly beneficial to women who do some spraying and children who collect water and assist in spraying. The increasing adoption rate of Bt cotton appears to have a health benefit measured in terms of reported rates of accidental insecticide poisoning. These appear to be declining as the uptake of Bt cotton increases. However, the understanding of refugia and their management by local farmers are deficient and need improving. Finally, Bt cotton growers emerge as more resilient in absorbing price fluctuations.

Relevance:

100.00%

Publisher:

Abstract:

This paper summarizes some of the geoarchaeological evidence for early arable agriculture in Britain and Europe, and introduces new evidence for small-scale but very intensive cultivation in the Neolithic, Bronze Age and Iron Age in Scotland. The Scottish examples demonstrate that, from the Neolithic to the Iron Age, midden heaps were sometimes ploughed in situ; this means that, rather than spreading midden material onto the fields, the early farmers simply ran an ard over their compost heaps and sowed the resulting plots. The practice appears to have been common in Scotland, and may also have occurred in England. Neolithic cultivation of a Mesolithic midden is suggested, based on thin-section analysis of the middens at Northton, Harris. The fertility of the Mesolithic middens may partly explain why Neolithic farmers re-settled Mesolithic sites in the Northern and Western Isles.

Relevance:

100.00%

Publisher:

Abstract:

The promotion of technologies seen to be aiding in the attainment of agricultural sustainability has been popular amongst Northern-based development donors for many years. One of these, botanical insecticides (e.g., those based on neem, pyrethrum and tobacco), has been a particular favorite, as they are equated with being 'natural' and hence less damaging to human health and the environment. This paper describes the outcome of interactions between one non-government organisation (NGO), the Diocesan Development Services (DDS), based in Kogi State, Nigeria, and a major development donor based in Europe, which led to the establishment of a programme designed to promote the virtues of a tobacco-based insecticide to small-scale farmers. The Tobacco Insecticide Programme (TIP) began in the late 1980s and ended in 2001, absorbing significant quantities of resources in the process. TIP began with exploratory investigations of efficacy on the DDS seed multiplication farm, followed by stages of researcher-managed and farmer-managed on-farm trials. A survey in 2002 assessed adoption of the technology by farmers. While yield benefits from using the insecticide were nearly always positive and statistically significant relative to an untreated control, they were not as good as those of commercial insecticides. However, adoption of the tobacco insecticide by local farmers was poor. The paper discusses the reasons for poor adoption, including relative benefits in gross margin, and uses the TIP example to explore the differing power relationships that exist between donors, their field partners and farmers. (C) 2004 by The Haworth Press, Inc. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

These notes have been issued on a small scale in 1983 and 1987 and on request at other times. This issue follows two items of news. First, Walter Colquitt and Luther Welsh found the 'missed' Mersenne prime M110503 and advanced the frontier of complete Mp-testing to 139,267. In so doing, they terminated Slowinski's significant string of four consecutive Mersenne primes. Secondly, a team of five established a non-Mersenne number as the largest known prime. This result terminated the 1952-89 reign of Mersenne primes. All the original Mersenne numbers with p < 258 were factorised some time ago. The Sandia Laboratories team of Davis, Holdridge & Simmons, with some little assistance from a CRAY machine, cracked M211 in 1983 and M251 in 1984. They contributed their results to the 'Cunningham Project', care of Sam Wagstaff. That project is now moving apace thanks to developments in technology, factorisation and primality testing. New levels of computer power and new computer architectures motivated by the open-ended promise of parallelism are now available. Once again, the suppliers may be offering free buildings with the computer. However, the Sandia '84 CRAY-1 implementation of the quadratic-sieve method is now outpowered by the number-field sieve technique. This is deployed on either purpose-built hardware or large syndicates, even distributed world-wide, of collaborating standard processors. New factorisation techniques of both special and general applicability have been defined and deployed. The elliptic-curve method finds large factors with helpful properties, while the number-field sieve approach is breaking down composites with over one hundred digits. The material is updated on an occasional basis to follow the latest developments in primality-testing large Mp and factorising smaller Mp; all dates derive from the published literature or referenced private communications. Minor corrections, additions and changes merely advance the issue number after the decimal point.
The reader is invited to report any errors and omissions that have escaped the proof-reading, to answer the unresolved questions noted and to suggest additional material associated with this subject.
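The complete Mp-testing referred to in these notes relies on the Lucas–Lehmer test. A minimal Python version is sketched below; it is exact, though hopelessly slow at the p ≈ 139,267 frontier the notes mention, where dedicated FFT-based squaring is required.

```python
def is_mersenne_prime(p):
    """Lucas-Lehmer test: for an odd prime p, M_p = 2^p - 1 is prime
    iff s_{p-2} == 0, where s_0 = 4 and s_{k+1} = s_k^2 - 2 (mod M_p)."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# M11 = 2047 = 23 * 89 is composite; M13 = 8191 is prime.
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23) if is_mersenne_prime(p)])
# → [3, 5, 7, 13, 17, 19]
```

Each step is a single modular squaring, which is why Mersenne candidates can be tested far beyond the sizes reachable by general-purpose primality proofs.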