916 results for Illegal arms transfers


Relevance:

10.00%

Publisher:

Abstract:

This paper analyzes whether a minimum wage can be an optimal redistribution policy when distorting taxes and lump-sum transfers are also available in a competitive economy. We build a static general equilibrium model with a Ramsey planner making decisions on taxes, transfers, and the minimum wage level. Workers are assumed to differ only in their productivity. We find that optimal redistribution may imply the use of a minimum wage. The key factor driving our results is the reaction of the demand for low-skilled labor to the minimum wage law. Hence, an optimal minimum wage appears most likely when low-skilled households are scarce, when the complementarity between the two types of workers is large, or when the difference in productivity is small. The main contribution of the paper is a modelling approach that allows us to adopt analysis and solution techniques widely used in recent public finance research. Moreover, this modelling strategy is flexible enough to allow for extensions that incorporate dynamics into the model.
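
As a purely illustrative sketch (my notation, not taken from the paper), the planner's problem in such a setting can be written as choosing taxes, a lump-sum transfer, and a wage floor to maximize social welfare subject to the resource constraint and to the minimum wage binding on the demand for low-skilled labor:

\max_{\tau,\,T,\,\bar w}\ \sum_{i \in \{s,u\}} \lambda_i\, U(c_i, \ell_i)
\quad \text{s.t.} \quad \sum_i c_i \le F(L_s, L_u),
\qquad w_u = \partial F / \partial L_u \ \ge\ \bar w,

where s and u index high- and low-productivity workers and \bar w is the minimum wage; the demand response of L_u to \bar w is the channel the abstract identifies as decisive.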

Relevance:

10.00%

Publisher:

Abstract:

Population characteristics of largemouth bass (Micropterus salmoides L.), including growth, body condition (relative weight), size structure, survival, and fecundity, were examined in relation to the abundance of submersed aquatic vegetation (SAV) coverage (primarily hydrilla, Hydrilla verticillata (L.f.) Royle) in three major embayments of Lake Seminole, Georgia. Relative weight, fecundity, and growth of largemouth bass in the Spring Creek embayment (76% areal SAV coverage) were considerably lower than in the Chattahoochee and Flint river arms, which had lower SAV coverages (26% and 32%). Fish took 1.8 years longer to reach 406 mm in Spring Creek than in the Chattahoochee-Flint arms. Consequently, fish were smaller in Spring Creek than in the Chattahoochee-Flint arms. In addition, owing to slower growth rates and a lower fecundity-to-body weight relation, we predicted a 47% reduction in total potential ova production in Spring Creek compared to the other two reservoir embayments. The annual survival rate of 3- to 10-year-old largemouth bass was higher in Spring Creek (84%) than in the Chattahoochee-Flint arms (72%), suggesting lower harvest and/or reduced accessibility of larger fish to angling in dense vegetation. Contrary to our expectations, the fit between number-at-age and age in a catch-curve regression was weaker for fish collected in Spring Creek, suggesting that greater recruitment variability has occurred over time in this highly vegetated embayment. In Lake Seminole, spatial differences in largemouth bass population characteristics were associated with disparate levels of SAV. Our data suggest that a reduction in hydrilla, with maintenance of an intermediate level of SAV in Spring Creek, should improve the largemouth bass population in this arm of the reservoir.
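
For readers unfamiliar with the technique, a catch-curve regression estimates annual survival from the slope of log numbers-at-age against age. A minimal sketch with made-up numbers (not the Lake Seminole data):

import numpy as np

# Hypothetical catch-at-age data for ages 3-10 (illustrative only).
ages = np.arange(3, 11)
numbers_at_age = np.array([120, 95, 80, 66, 55, 46, 38, 32])

# Catch-curve regression: ln(N_age) = a + b*age; annual survival S = e^b.
b, a = np.polyfit(ages, np.log(numbers_at_age), 1)
survival = np.exp(b)
print(f"annual survival ~ {survival:.2f}, annual mortality ~ {1 - survival:.2f}")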

Relevance:

10.00%

Publisher:

Abstract:

Toxic chemicals can enter the marine environment through numerous routes: stormwater runoff, industrial point source discharges, municipal wastewater discharges, atmospheric deposition, accidental spills, illegal dumping, pesticide applications, and agricultural practices. Once they enter a receiving system, toxicants often become bound to suspended particles and increase in density sufficiently to sink to the bottom. Sediments are one of the major repositories of contaminants in aquatic environments. Furthermore, if they become sufficiently contaminated, sediments can act as sources of toxicants to important biota. Sediment quality data are direct indicators of the health of coastal aquatic habitats. Sediment quality investigations conducted by the National Oceanic and Atmospheric Administration (NOAA) and others have indicated that toxic chemicals are found in the sediments and biota of some estuaries in South Carolina and Georgia (NOAA, 1992). This report documents the toxicity of sediments collected within five selected estuaries: Savannah River, Winyah Bay, Charleston Harbor, St. Simons Sound, and Leadenwah Creek (Figure 1). (PDF contains 292 pages)

Relevance:

10.00%

Publisher:

Abstract:

This paper studies the macroeconomic effects of a permanent increase in foreign aid in a model that takes environmental quality into account. We develop a dynamic equilibrium model in which both public investment in infrastructure and environmental protection can be financed using domestic resources and international aid programs. The framework considers four scenarios for international aid: untied aid, aid fully tied to infrastructure, aid fully tied to abatement, and aid equally tied to both types of expenditures. We find that the effects of the transfers may depend on (i) the structural characteristics of the recipient country (the elasticity of substitution in production and its dependence on the environment and natural resources) and (ii) how recipient countries distribute their public expenditure. These results underscore the importance of these factors when deciding how and to what extent to tie aid to infrastructure and/or pollution abatement.
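
To make the role of the elasticity of substitution concrete, one illustrative specification (an assumption on my part, not necessarily the paper's exact functional form) is a CES technology combining produced inputs with environmental quality:

Y = A\left[\alpha X^{\rho} + (1-\alpha)\, E^{\rho}\right]^{1/\rho},
\qquad \sigma = \frac{1}{1-\rho},

where X bundles private capital and public infrastructure, E is environmental quality, and \sigma is the elasticity of substitution; the lower \sigma, the more strongly aid tied to abatement rather than infrastructure affects output.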

Relevance:

10.00%

Publisher:

Abstract:

Fishing gears and methods that target the Clupeids are the atalla lift net, light attraction, the midwater trawl, and the dala (Clupeid beach seine). The dala has been the most lucrative gear for Clupeids in the recent past. However, it has been declared a wholly illegal gear: with a mesh size of 0.1 mm, it indiscriminately catches undersized (juvenile/larval) commercial fish. A ban on the gear was therefore mounted through the promulgation and implementation of special fisheries laws for Kainji Lake (Nigeria). Although use of the dala decreased under this management approach, it is currently increasing again. It is therefore suggested that a ban on the dala may not be the answer to indiscriminate fishing; instead, modification of the gear and methods for harvesting Clupeids should be the main objective.

Relevance:

10.00%

Publisher:

Abstract:

For a long time, the Tanzanian Fisheries Department has managed Tanzanian fisheries without incorporating other stakeholders within its management framework. On Lake Victoria, the persistent use of illegal fishing gear and declining catches have led the government to realize that this system of fisheries management may no longer be viable and to seek to incorporate fishing communities into the management structure. Through the creation of beach management units (BMUs), the Fisheries Department has sought to persuade fishing communities to implement and enforce Tanzania's fishing regulations and to monitor the fishery. In this paper we explore a recently gathered data set that yields information on, among other things, how Tanzanian fishing communities perceive the state of their resource base, how they view their relationship with the Fisheries Department, the efficacy of fishing regulations, and other variables. We draw on a series of criteria developed by Ostrom (1990) for institutional 'robustness' to explore various areas of institutional development on Lake Victoria and to try to anticipate how the BMUs will fare. We argue that many socio-political and economic factors will determine how communities receive and perceive their responsibilities towards government-imposed administrative structures at the local level; these responsibilities will become 'socialized' such that they vary from place to place. While this may bode well for problems of heterogeneity, it does not necessarily mean that fisheries management objectives on Lake Victoria will be met.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents a novel framework for state estimation in the context of robotic grasping and manipulation. The overall estimation approach is based on fusing various visual cues for manipulator tracking, namely appearance- and feature-based, shape-based, and silhouette-based visual cues. Similarly, a framework is developed to fuse not only the above visual cues but also kinesthetic cues such as force-torque and tactile measurements for in-hand object pose estimation. The cues are extracted from multiple sensor modalities and are fused in a variety of Kalman filters.
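
As a minimal illustration of this kind of cue fusion (a generic Kalman measurement update, not the thesis code; all names and noise values are invented):

import numpy as np

def kalman_update(x, P, z, H, R):
    # Standard Kalman measurement update for state x with covariance P.
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # corrected state estimate
    P = (np.eye(len(x)) - K @ H) @ P    # corrected covariance
    return x, P

# Toy 3-DOF pose state [x, y, theta]; each visual cue observes the full pose.
x, P, H = np.zeros(3), np.eye(3) * 0.5, np.eye(3)

# Fuse a shape-based cue, then a noisier silhouette-based cue, sequentially.
for z, sigma in [(np.array([0.10, -0.05, 0.02]), 0.05),
                 (np.array([0.12, -0.04, 0.00]), 0.10)]:
    x, P = kalman_update(x, P, z, H, np.eye(3) * sigma**2)
print(x, np.diag(P))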

A hybrid estimator is developed to estimate both a continuous state (robot and object states) and discrete states, called contact modes, which specify how each finger contacts a particular object surface. A static multiple model estimator is used to compute and maintain this mode probability. The thesis also develops an estimation framework for estimating model parameters associated with object grasping. Dual and joint state-parameter estimation are explored for parameter estimation of a grasped object's mass and center of mass. Experimental results demonstrate simultaneous object localization and center of mass estimation.

Dual-arm estimation is developed for two-arm robotic manipulation tasks. Two types of filters are explored; the first is an augmented filter that contains both arms in the state vector, while the second runs two filters in parallel, one for each arm. These two frameworks and their performance are compared in a dual-arm task of removing a wheel from a hub.

This thesis also presents a new method for action selection involving touch. This next best touch method selects, from the available actions for interacting with an object, the one expected to gain the most information. The algorithm employs information theory to compute an information gain metric based on a probabilistic belief suitable for the task. An estimation framework is used to maintain this belief over time. Kinesthetic measurements such as contact and tactile measurements are used to update the state belief after every interactive action. Simulation and experimental results are demonstrated using next best touch for object localization, specifically of a door handle on a door. The next best touch theory is extended to model parameter determination. Since many objects within a particular object category share the same rough shape, principal component analysis may be used to parametrize the object mesh models. These parameters can then be estimated with the same action selection technique, choosing the touch action that best both localizes the object and estimates its parameters. Simulation results are then presented involving localizing and determining a parameter of a screwdriver.
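
A sketch of the information-gain computation behind such a selection rule (generic expected entropy reduction over a discrete belief; the action names and outcome models below are invented, not the thesis implementation):

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_info_gain(belief, likelihoods):
    # likelihoods[k, i] = p(outcome k | hypothesis i) for one candidate action.
    h0, gain = entropy(belief), 0.0
    for lik in likelihoods:              # enumerate measurement outcomes
        pz = float(np.dot(lik, belief))  # predictive probability of outcome
        if pz > 0:
            posterior = lik * belief / pz          # Bayes update for this outcome
            gain += pz * (h0 - entropy(posterior))
    return gain

belief = np.full(4, 0.25)  # uniform belief over four pose hypotheses
actions = {                # rows: contact / no-contact outcome per hypothesis
    "probe_left":  np.array([[0.9, 0.1, 0.5, 0.5], [0.1, 0.9, 0.5, 0.5]]),
    "probe_right": np.array([[0.8, 0.8, 0.2, 0.1], [0.2, 0.2, 0.8, 0.9]]),
}
best = max(actions, key=lambda a: expected_info_gain(belief, actions[a]))
print("next best touch:", best)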

Lastly, the next best touch theory is further extended to model classes. Instead of estimating parameters, object class determination is incorporated into the information gain metric calculation. The touch action is then selected to best discern between the possible model classes. Simulation results are presented to validate the theory.

Relevance:

10.00%

Publisher:

Abstract:

Rates for A(e, e'p) on the nuclei ^2H, C, Fe, and Au have been measured at momentum transfers Q^2 = 1, 3, 5, and 6.8 (GeV/c)^2. We extract the nuclear transparency T, a measure of the importance of final state interactions (FSI) between the outgoing proton and the recoil nucleus. Some calculations based on perturbative QCD predict an increase in T with momentum transfer, a phenomenon known as Color Transparency. No statistically significant rise is seen in the present experiment.
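
For reference, the nuclear transparency is conventionally defined (a standard definition, not quoted from this thesis) as the ratio of the measured (e, e'p) yield to the yield expected in the plane-wave impulse approximation, in which the outgoing proton suffers no final state interactions:

T(Q^2) = \frac{\sigma_{\mathrm{measured}}(Q^2)}{\sigma_{\mathrm{PWIA}}(Q^2)},

so T → 1 would indicate vanishing FSI, and Color Transparency would appear as a rise of T with Q^2.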

Relevance:

10.00%

Publisher:

Abstract:

Despite the complexity of biological networks, we find that certain common architectures govern network structures. These architectures impose fundamental constraints on system performance and create tradeoffs that the system must balance in the face of uncertainty in the environment. This means that while a system may be optimized for a specific function through evolution, the optimal achievable state must respect these constraints. One such constraining architecture is autocatalysis, as seen in many biological networks including glycolysis and ribosomal protein synthesis. Using a minimal model, we show that ATP autocatalysis in glycolysis imposes stability and performance constraints and that the experimentally well-studied glycolytic oscillations are in fact a consequence of a tradeoff between error minimization and stability. We also show that additional complexity in the network results in increased robustness. Ribosome synthesis is also autocatalytic, since ribosomes must be used to make more ribosomal proteins. When ribosomes have higher protein content, the autocatalysis is increased. We show that this autocatalysis destabilizes the system, slows down its response, and constrains its performance. On a larger scale, transcriptional regulation of whole organisms also follows architectural constraints, as can be seen in the differences between bacterial and yeast transcription networks. We show that the degree distribution of the bacterial transcription network follows a power law while that of the yeast network follows an exponential distribution. We then explore previously proposed evolutionary models and show that neither the preferential linking model nor the duplication-divergence model of network evolution generates the power-law, hierarchical structure found in bacteria. However, in real biological systems the generation of new nodes occurs through both duplication and horizontal gene transfer, and we show that a biologically reasonable combination of the two mechanisms generates the desired network.
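
As a small illustration of the distinction drawn here between power-law and exponential degree distributions (synthetic data and simple least-squares fits, not the paper's analysis):

import numpy as np

def fit_power_law(k, counts):
    # p(k) ~ k^(-gamma) is linear in log-log coordinates.
    slope, _ = np.polyfit(np.log(k), np.log(counts), 1)
    return -slope                    # estimate of gamma

def fit_exponential(k, counts):
    # p(k) ~ exp(-k/kc) is linear in semi-log coordinates.
    slope, _ = np.polyfit(k, np.log(counts), 1)
    return -1.0 / slope              # estimate of the cutoff kc

k = np.arange(1, 51)
counts = 1000.0 * k**-2.1            # synthetic "bacteria-like" power-law data
print("power-law exponent gamma ~", round(fit_power_law(k, counts), 2))
print("exponential cutoff kc ~", round(fit_exponential(k, counts), 2))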

Relevance:

10.00%

Publisher:

Abstract:

We have measured inclusive electron-scattering cross sections for targets of ^4He, C, Al, Fe, and Au, for kinematics spanning the quasi-elastic peak, with squared four-momentum transfers (q^2) between 0.23 and 2.89 (GeV/c)^2. Additional data were measured for Fe with q^2 up to 3.69 (GeV/c)^2. These cross sections were analyzed for the y-scaling behavior expected from a simple impulse-approximation model, and they are found to approach a scaling limit at the highest q^2. The approach to scaling in q^2 is compared with a calculation for infinite nuclear matter, and relationships between the scaling function and nucleon momentum distributions are discussed. Deviations from perfect scaling are used to set limits on possible changes in the size of nucleons inside the nucleus.
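
For context, in the standard y-scaling picture (a textbook formulation, not quoted from this thesis) y is the minimum longitudinal momentum of the struck nucleon, and in the simplest impulse-approximation limit the scaling function is tied to the nucleon momentum distribution n(k) by

F(y) = 2\pi \int_{|y|}^{\infty} n(k)\, k \, dk,

which is why deviations from scaling carry information about effects beyond the simple model, such as final-state interactions or modified nucleon size.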

Relevance:

10.00%

Publisher:

Abstract:

Iterative in situ click chemistry (IISCC) is a robust general technology for the development of high-throughput, inexpensive protein detection agents. In IISCC, the target protein acts as a template and catalyst, assembling its own ligand from modular blocks of peptides. This process of ligand discovery is iterated to add peptide arms, developing a multivalent ligand with increased affinity and selectivity. Peptide-based protein capture agents (PCCs) should ideally have the same degree of selectivity and specificity as a monoclonal antibody, along with improved chemical stability. We had previously reported developing a PCC agent against bovine carbonic anhydrase II (bCAII) that could replace a polyclonal antibody. To further enhance the affinity or specificity of the PCC agent, I explore branching the peptide arms to develop branched PCC agents against bCAII. The developed branched capture agents have two- to threefold higher affinities for the target protein. In the second part of my thesis, I describe the epitope targeting strategy, a strategy for directing the development of a peptide ligand against a specific region or fragment of a protein. The strategy is successfully demonstrated by developing PCC agents with low-nanomolar binding affinities that target the C-terminal hydrophobic motif of Akt2 kinase. One of the developed triligands inhibits the kinase activity of Akt. This suggests that, if targeted against the right epitope, PCC agents can also influence the functional properties of the protein. The exquisite control of the epitope targeting strategy is further demonstrated by developing a cyclic ligand against Akt2. The cyclic ligand acts as an inhibitor by itself, without any iteration of the ligand discovery process. The epitope targeting strategy is a cornerstone of the IISCC technology and opens up new opportunities for the development of protein detection agents and of modulators of protein function.

Relevance:

10.00%

Publisher:

Abstract:

The fishery of Lake Victoria became a major commercial fishery with the introduction of the Nile perch in the 1950s and 1960s. Biological and population characteristics point to a fishery under intense fishing pressure, attributed to increased capacity and the use of illegal fishing gears. Studies conducted between 1998 and 2000 recommended capturing fish within a slot size of 50 to 85 cm TL to sustain the fishery. Samples from Kenyan and Ugandan factories in 2008 showed that 50% and 71% of the individuals processed, respectively, were below the slot size. This study revealed that fish below and above the slot have continued to be caught and processed, confirming that the slot size is hardly adhered to by either the fishers or the processors. The paper explores why the slot size has not been a successful tool in the management of the Nile perch and suggests strategies to sustain the fishery.

Relevance:

10.00%

Publisher:

Abstract:

Faults can slip either aseismically or through episodic seismic ruptures, but we still do not understand the factors that determine the partitioning between these two modes of slip. This challenge can now be addressed thanks to the dense geodetic and seismological networks that have been deployed in various areas of active tectonics. The data from such networks, as well as modern remote sensing techniques, allow the spatial and temporal variability of the slip mode to be documented and give some insight. This is the approach taken in this study, which is focused on the Longitudinal Valley Fault (LVF) in Eastern Taiwan. This fault is particularly appropriate since its very fast slip rate (about 5 cm/yr) is accommodated by both seismic and aseismic slip. Deformation of anthropogenic features shows that aseismic creep accounts for a significant fraction of fault slip near the surface, but this fault has also released energy seismically, producing five M_w > 6.8 earthquakes in 1951 and 2003. Moreover, owing to the thrust component of slip, the fault zone is exhumed, which allows investigation of deformation mechanisms. In order to put constraints on the factors that control the mode of slip, we apply a multidisciplinary approach that combines modeling of geodetic observations, structural analysis, and numerical simulation of the "seismic cycle". Analyzing a dense set of geodetic and seismological data across the Longitudinal Valley, including campaign-mode GPS, continuous GPS (cGPS), leveling, accelerometric, and InSAR data, we document the partitioning between seismic and aseismic slip on the fault. For the period 1992 to 2011, we find that about 80-90% of slip on the LVF in the 0-26 km seismogenic depth range is actually aseismic. The clay-rich Lichi Mélange is identified as the key factor promoting creep at shallow depth. Microstructural investigations show that deformation within the fault zone must have resulted from a combination of frictional sliding at grain boundaries, cataclasis, and pressure solution creep. Numerical modeling of earthquake sequences has been performed to investigate the possibility of reproducing the results from the kinematic inversion of geodetic and seismological data on the LVF. We first investigate the different modeling strategies that have been developed to explore the role and relative importance of different factors in the manner in which slip accumulates on faults. We compare the results of quasi-dynamic simulations and fully dynamic ones, and we conclude that ignoring the transient wave-mediated stress transfers would be inappropriate. We therefore carry out fully dynamic simulations and succeed in qualitatively reproducing the wide range of observations for the southern segment of the LVF. We conclude that the spatio-temporal evolution of fault slip on the Longitudinal Valley Fault over 1997-2011 is consistent to first order with the predictions of a simple model in which a velocity-weakening patch is embedded in a velocity-strengthening area.
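
For context, the fault-strength description commonly used in such earthquake-sequence simulations (the standard rate-and-state formulation with the aging law, assumed here rather than quoted from the thesis) is

\tau = \sigma \left[ \mu_0 + a \ln\frac{V}{V_0} + b \ln\frac{V_0\,\theta}{D_c} \right],
\qquad \frac{d\theta}{dt} = 1 - \frac{V\theta}{D_c},

where V is the slip rate, \theta a state variable, and D_c a characteristic slip distance; patches with a - b < 0 are velocity-weakening and can nucleate earthquakes, while regions with a - b > 0 are velocity-strengthening and tend to creep, which is the contrast invoked in the model above.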

Relevance:

10.00%

Publisher:

Abstract:

In the quest to develop viable designs for third-generation optical interferometric gravitational-wave detectors, one strategy is to monitor the relative momentum or speed of the test-mass mirrors rather than monitoring their relative position. The most straightforward design for a speed-meter interferometer that accomplishes this is described and analyzed in Chapter 2. This design (due to Braginsky, Gorodetsky, Khalili, and Thorne) is analogous to a microwave-cavity speed meter conceived by Braginsky and Khalili. A mathematical mapping between the microwave speed meter and the optical interferometric speed meter is developed and used to show (in accord with the speed being a quantum nondemolition observable) that in principle the interferometric speed meter can beat the gravitational-wave standard quantum limit (SQL) by an arbitrarily large amount, over an arbitrarily wide range of frequencies. However, in practice, to reach or beat the SQL, this specific speed meter requires exorbitantly high input light power. The physical reason for this is explored, along with other issues such as constraints on performance due to optical dissipation.

Chapter 3 proposes a more sophisticated version of a speed meter. This new design requires only a modest input power and appears to be a fully practical candidate for third-generation LIGO. It can beat the SQL (the approximate sensitivity of second-generation LIGO interferometers) over a broad range of frequencies (~10 to 100 Hz in practice) by a factor h/h_SQL ~ √(W^SQL_circ / W_circ). Here W_circ is the light power circulating in the interferometer arms and W^SQL_circ ≃ 800 kW is the circulating power required to beat the SQL at 100 Hz (the LIGO-II power). If squeezed vacuum (with a power-squeeze factor e^(-2R)) is injected into the interferometer's output port, the SQL can be beat with a much reduced laser power: h/h_SQL ~ √(W^SQL_circ / (W_circ e^(2R))). For realistic parameters (e^(2R) ≃ 10 and W_circ ≃ 800 to 2000 kW), the SQL can be beat by a factor of ~3 to 4 from 10 to 100 Hz. [However, as the power increases in these expressions, the speed meter becomes more narrow-band; additional power and re-optimization of some parameters are required to maintain the wide band.] By performing frequency-dependent homodyne detection on the output (with the aid of two kilometer-scale filter cavities), one can markedly improve the interferometer's sensitivity at frequencies above 100 Hz.

Chapters 2 and 3 are part of an ongoing effort to develop a practical variant of an interferometric speed meter and to combine the speed meter concept with other ideas to yield a promising third-generation interferometric gravitational-wave detector that entails low laser power.

Chapter 4 is a contribution to the foundations for analyzing sources of gravitational waves for LIGO. Specifically, it presents an analysis of the tidal work done on a self-gravitating body (e.g., a neutron star or black hole) in an external tidal field (e.g., that of a binary companion). The change in the mass-energy of the body as a result of the tidal work, or "tidal heating," is analyzed using the Landau-Lifshitz pseudotensor and the local asymptotic rest frame of the body. It is shown that the work done on the body is gauge invariant, while the body-tidal-field interaction energy contained within the body's local asymptotic rest frame is gauge dependent. This is analogous to Newtonian theory, where the interaction energy is shown to depend on how one localizes gravitational energy, but the work done on the body is independent of that localization. These conclusions play a role in analyses, by others, of the dynamics and stability of the inspiraling neutron-star binaries whose gravitational waves are likely to be seen and studied by LIGO.

Relevance:

10.00%

Publisher:

Abstract:

Cross sections for the photoproduction of neutral pi, eta, rho, and phi mesons on hydrogen have been measured at the Stanford Linear Accelerator Center using a missing-mass spectrometer technique. The data cover photon energies between 5.0 and 17.8 GeV and four-momentum transfer squared t between -0.12 and -1.38 (GeV/c)^2.

Pion differential cross sections at lower energies show a peak at low momentum transfers, a distinctive dip and secondary maximum for t in the region -0.4 to -0.9 (GeV/c)^2, and a smooth decrease at higher momentum transfers. As photon energy increases, the dip becomes less pronounced, in contradiction to the expectations of simple Regge theories based on the exchange of omega and B trajectories only.

Eta photoproduction was measured only below 10 GeV. The cross section has about the same magnitude as the pion production cross section, but decreases exponentially with t, showing no dip.

Rho mesons appear to be diffractively produced. The differential cross section varies approximately as exp(8.5t + 2t^2). It falls slowly with energy, decreasing about 35 percent from 6 GeV to 17.8 GeV. A simple quark-model relation appears to describe the data well.

Phi meson cross sections are also consistent with diffractive production. The differential cross section varies approximately as exp(4t). The cross section tends to decrease slightly with photon energy.