827 results for Adoption constraints
Abstract:
Conservation Agriculture (CA) is mostly referred to in the literature as having three principles at the core of its identity: minimum soil disturbance, permanent organic soil cover and crop diversity. This farming package has been described as suitable for improving the yields and livelihoods of smallholders in semi-arid regions of Kenya, which since the colonial period have been heavily subjected to tillage. Our study is based on a qualitative approach that followed local meanings and understandings of soil fertility, rainfall and CA in Ethi and Umande, located in the semi-arid region of Laikipia, Kenya. Farm visits, 53 semi-structured interviews and informal talks were carried out from April to June 2015. The Ethi and Umande locations were part of a resettlement programme after Kenya's independence that brought together people from different farming contexts. Since the 1970s–80s, the state and NGOs have promoted several approaches to control erosion and boost soil fertility. In this context, CA has also been preferentially promoted since 2007. Interviewees were well acquainted with soil erosion and the methods to control it. Today, rainfall amount and distribution are identified as the major constraints on crop performance. Soil fertility is understood as being under control, since farmers use several methods to boost it (inorganic fertilisers, manure, terraces, agroforestry, vegetation barriers). CA is recognised as delivering better yields, but it does not perform well under severe drought and does not provide yields as high as ‘promised’ in promotion campaigns. Moreover, CA is mainly understood as “cultivating with chemicals”, “kulima na dawa” in Kiswahili. A dominant view is that CA is about minimum tillage and the use of pre-emergence herbicides. It is relevant to reflect on what kind of CA is being promoted and whether elements like soil cover and crop rotation are given due attention. CA based on these two ideas, minimum tillage and the use of herbicides, can hardly stand as a programme to be promoted and up-scaled. Therefore, CA appears not to be recognised as a convincing approach to improving livelihoods in Laikipia.
Abstract:
Doctoral Thesis in Business Sciences.
Abstract:
Use of new technologies, such as virtual reality (VR), is important to corporations, yet understanding of their successful implementation is insufficiently developed. In this paper a case study is used to analyse the introduction of VR use in a British housebuilding company. Although the implementation was not successful in the manner initially anticipated, the study provides insight into the process of change, the constraints that inhibit implementation and the relationship between new technology and work organization. Comparison is made with the early use of CAD, and similarities and differences between the empirical findings of the case study and the previous literature are discussed.
Abstract:
The recent process of accelerated expansion of the Brazilian economy was driven by exports and fixed capital formation. Although the pace of growth was more robust than in the 1990s, certain macroeconomic constraints on its continuation in the long run persist, such as the exchange rate overvaluation (in particular since 2005) and, more generally, the modus operandi of monetary policy. Such constraints may jeopardize the sustainability of the current pace of growth. We therefore argue that Brazil still lies in a trap made up of high interest rates and a low exchange rate. Eliminating the exchange rate misalignment would bring about a great increase in the rate of interest, which in turn would impact negatively upon investment and hence upon the sustainability of long-run economic growth. We outline a set of policy measures to eliminate this trap, in particular the adoption of an implicit target for the exchange rate, capital controls and the abandonment of the present regime of inflation targeting. Recent events seem to point in this direction.
Abstract:
The purpose of this paper is to explore the use of automated inventory management systems (IMS) and identify the stage of technology adoption for restaurants in Aruba. A case study analysis involving twelve members of the Aruba Gastronomic Association was conducted using a qualitative research design to gather information on approaches currently used, as well as the reasons and perceptions managers/owners have for using or not using automated systems in their facilities. This is the first study conducted using the Aruba restaurant market; therefore, two technology adoption models were applied to integrate the critical factors relevant to the study. Major findings indicated that the use of automated IMS in restaurants is limited, underscoring the lack of technology adoption in this area. The results also indicated that the two major reasons restaurants are not adopting IMS technology are budgetary constraints and service support. This study is important for two reasons: (1) its results can serve as a baseline for future IMS adoption studies, not only for Aruba’s restaurant industry but also for other Caribbean destinations and the U.S.; and (2) it provides insight into the additional training and support help needed in hospitality technology services.
Abstract:
Government call centers (311) were first created to reduce the volume of non-emergency calls that were being placed to emergency 911 call centers. The number of 311 call centers increased from 57 in 2008 to about 300 in 2013. Considering that there are over 2,700 municipal government units across the United States, the adoption rate of 311 centers is arguably low. This dissertation is an examination of the adoption of 311 call centers by municipal governments. My focus is specifically on why municipal governments adopt 311 and on identifying which barriers result in the non-adoption of 311 call centers. This dissertation is possibly the first study to examine the adoption of 311 call centers in the United States. The study identified several significant factors in the adoption and non-adoption of 311 government call centers. The following factors were significant in adoption: managerial support, financial constraints, organizational responsiveness, strategic plan placement, and a technology champion. The following factors were significant barriers that resulted in non-adoption: no demand from citizens, start-up costs, annual operating costs, unavailability of funding, and no obvious need for one. If local government entities that do not have a 311 government call center decide to adopt one, this study will help them identify the conditions that need to be in place for successful adoption to occur. Local government officials would first need to address the barriers to setting up 311 call centers.
Abstract:
We present a re-analysis of the Geneva-Copenhagen survey, which benefits from the infrared flux method to improve the accuracy of the derived stellar effective temperatures and uses the latter to build a consistent and improved metallicity scale. Metallicities are calibrated on high-resolution spectroscopy and checked against four open clusters and a moving group, showing excellent consistency. The new temperature and metallicity scales provide a better match to theoretical isochrones, which are used for a Bayesian analysis of stellar ages. With respect to previous analyses, our stars are on average 100 K hotter and 0.1 dex more metal rich, shifting the peak of the metallicity distribution function to around the solar value. From Strömgren photometry we are able to derive for the first time a proxy for [α/Fe] abundances, which enables us to perform a tentative dissection of the chemical thin and thick disc. We find evidence that the latter is composed of an old, mildly but systematically α-enhanced population that extends to super-solar metallicities, in agreement with spectroscopic studies. Our revision offers the largest existing kinematically unbiased sample of the solar neighbourhood that contains full information on kinematics, metallicities, and ages, and thus provides better constraints on the physical processes relevant in the build-up of the Milky Way disc, enabling a better understanding of the Sun in a Galactic context.
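As a schematic illustration of the Bayesian age machinery mentioned above (a generic formulation, not necessarily the authors' exact one), the age posterior for each star marginalizes the isochrone likelihood over mass and metallicity:

$$ p(\tau \mid d) \;\propto\; \pi(\tau) \int L\big(d \mid \tau, M, \mathrm{[Fe/H]}\big)\, \pi(M)\, \pi(\mathrm{[Fe/H]})\, dM\, d\mathrm{[Fe/H]}, $$

where d collects the observables (effective temperature, metallicity, absolute magnitude), τ is the age, M the stellar mass, and π(·) denotes the priors. Sharper T_eff and [Fe/H] scales directly tighten this posterior.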
Abstract:
We discuss the dynamics of the Universe within the framework of the massive graviton cold dark matter scenario (MGCDM), in which gravitons are geometrically treated as massive particles. In this modified gravity theory, the main effect of the gravitons is to alter the density evolution of the cold dark matter component in such a way that the Universe evolves to an accelerating expanding regime, as presently observed. Tight constraints on the main cosmological parameters of the MGCDM model are derived by performing a joint likelihood analysis involving the recent supernovae type Ia data, the cosmic microwave background shift parameter, and the baryonic acoustic oscillations as traced by the Sloan Digital Sky Survey red luminous galaxies. The linear evolution of small density fluctuations is also analyzed in detail. It is found that the growth factor of the MGCDM model differs slightly (~1-4%) from the one provided by the conventional flat ΛCDM cosmology. The growth rates of clustering predicted by the MGCDM and ΛCDM models are confronted with observations, and the corresponding best-fit values of the growth index (γ) are also determined. By using the expectations of realistic future X-ray and Sunyaev-Zeldovich cluster surveys, we derive the dark matter halo mass function and the corresponding redshift distribution of cluster-size halos for the MGCDM model. Finally, we also show that the Hubble flow differences between the MGCDM and ΛCDM models provide a halo redshift distribution departing significantly from those predicted by other dark energy models. These results suggest that the MGCDM model can be observationally distinguished from ΛCDM and also from a large number of dark energy models recently proposed in the literature.
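For context, the growth index γ quoted above enters the standard parametrization of the linear growth rate of matter perturbations δ:

$$ f(z) \;\equiv\; \frac{d\ln\delta}{d\ln a} \;\simeq\; \Omega_m(z)^{\gamma}, \qquad \gamma_{\Lambda\mathrm{CDM}} \simeq \tfrac{6}{11} \approx 0.55, $$

so a ~1-4% difference in the growth factor shows up as a small shift in the best-fit γ relative to the ΛCDM value.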
Abstract:
We discuss the properties of homogeneous and isotropic flat cosmologies in which the present accelerating stage is powered only by the gravitationally induced creation of cold dark matter (CCDM) particles (Ω_m = 1). For some matter creation rates proposed in the literature, we show that the main cosmological functions, such as the scale factor of the universe, the Hubble expansion rate, the growth factor, and the cluster formation rate, are analytically defined. The best CCDM scenario has only one free parameter, and our joint analysis involving baryonic acoustic oscillations + cosmic microwave background (CMB) + SNe Ia data yields Ω̃_m = 0.28 ± 0.01 (1σ), where Ω̃_m is the observed matter density parameter. In particular, this implies that the model has no dark energy but the part of the matter that is effectively clustering is in good agreement with the latest determinations from the large-scale structure. The growth of perturbations and the formation of galaxy clusters in such scenarios are also investigated. Despite the fact that both scenarios may share the same Hubble expansion, we find that matter creation cosmologies predict stronger small-scale dynamics, which implies a faster growth rate of perturbations with respect to the usual ΛCDM cosmology. Such results point to the possibility of a crucial observational test confronting CCDM with ΛCDM scenarios through a more detailed analysis involving CMB, weak lensing, as well as the large-scale structure.
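As a hedged sketch of the mechanism (the standard construction in the CCDM literature; the abstract itself does not spell it out), particle creation at a rate Γ modifies the matter continuity equation and induces a negative creation pressure:

$$ \dot{\rho}_m + 3H\rho_m = \Gamma\,\rho_m, \qquad p_c = -\frac{\Gamma}{3H}\,\rho_m, $$

so for creation rates with Γ/3H approaching a constant at late times, the creation pressure drives acceleration much as a dark energy component would, with no Λ term.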
Abstract:
Aims. We calculate the theoretical event rate of gamma-ray bursts (GRBs) from the collapse of massive first-generation (Population III; Pop III) stars. The Pop III GRBs could be super-energetic, with isotropic energies up to E_iso ≳ 10^(55-57) erg, providing a unique probe of the high-redshift Universe. Methods. We consider both the so-called Pop III.1 stars (primordial) and Pop III.2 stars (primordial but affected by radiation from other stars). We employ a semi-analytical approach that considers inhomogeneous hydrogen reionization and the chemical evolution of the intergalactic medium. Results. We show that Pop III.2 GRBs occur more than 100 times more frequently than Pop III.1 GRBs, and thus should be suitable targets for future GRB missions. Interestingly, our optimistic model predicts an event rate that is already constrained by the current radio transient searches. We expect ~10-10^4 radio afterglows above ~0.3 mJy on the sky with ~1 year variability and mostly without GRBs (orphans), which are detectable by ALMA, EVLA, LOFAR, and SKA, while we expect to observe at most N < 20 GRBs per year integrated over z > 6 for Pop III.2 and N < 0.08 per year integrated over z > 10 for Pop III.1 with EXIST, and N < 0.2 Pop III.2 GRBs per year integrated over z > 6 with Swift.
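Schematically (generic notation, not necessarily the authors' exact formulation), such an event-rate calculation convolves the Pop III star formation rate density ψ_III(z) with a GRB formation efficiency and the comoving volume element:

$$ \frac{dN_{\mathrm{GRB}}}{dt\,dz} \;=\; \eta_{\mathrm{GRB}}\; f_{\mathrm{beam}}\; \psi_{\mathrm{III}}(z)\; \frac{dV/dz}{1+z}, $$

where η_GRB is the number of GRBs produced per unit stellar mass formed, f_beam accounts for jet beaming, and the (1+z) factor converts to the observer-frame rate.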
Abstract:
The kinematic approach to cosmological tests provides direct evidence for the present accelerating stage of the Universe that does not depend on the validity of general relativity or on the matter-energy content of the Universe. In this context, we consider here a linear two-parameter expansion for the deceleration parameter, q(z) = q_0 + q_1 z, where q_0 and q_1 are arbitrary constants to be constrained by the Union supernovae data. By assuming a flat Universe, we find that the best fit to the pair of free parameters is (q_0, q_1) = (-0.73, 1.5), whereas the transition redshift is z_t = 0.49 (+0.14/-0.07 at 1σ; +0.54/-0.12 at 2σ). This kinematic result is in agreement with some independent analyses and more easily accommodates many dynamical flat models (like ΛCDM).
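The quoted transition redshift follows directly from the fitted expansion: setting the deceleration parameter to zero at the onset of acceleration gives

$$ q(z_t) = q_0 + q_1 z_t = 0 \;\;\Longrightarrow\;\; z_t = -\frac{q_0}{q_1} = \frac{0.73}{1.5} \approx 0.49, $$

matching the best-fit value reported above.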
Abstract:
This paper reports results from a search for ν_μ → ν_e transitions by the MINOS experiment based on a 7 × 10^20 protons-on-target exposure. Our observation of 54 candidate ν_e events in the far detector with a background of 49.1 ± 7.0(stat) ± 2.7(syst) events predicted by the measurements in the near detector requires 2 sin^2(2θ_13) sin^2 θ_23 < 0.12 (0.20) at the 90% C.L. for the normal (inverted) mass hierarchy at δ_CP = 0. The experiment sets the tightest limits to date on the value of θ_13 for nearly all values of δ_CP for the normal neutrino mass hierarchy and maximal sin^2(2θ_23).
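To unpack the quoted bound: for maximal atmospheric mixing, sin^2 θ_23 = 1/2, the prefactor 2 sin^2 θ_23 equals one, so the normal-hierarchy limit reduces to a direct constraint on θ_13:

$$ 2\sin^2(2\theta_{13})\sin^2\theta_{23} < 0.12 \;\;\xrightarrow{\;\sin^2\theta_{23}=1/2\;}\;\; \sin^2(2\theta_{13}) < 0.12. $$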
Abstract:
For Au + Au collisions at 200 GeV, we measure neutral pion production with good statistics for transverse momentum, p_T, up to 20 GeV/c. A fivefold suppression is found, which is essentially constant for 5 < p_T < 20 GeV/c. Experimental uncertainties are small enough to constrain any model-dependent parametrization for the transport coefficient of the medium, e.g., q̂ in the parton quenching model. The spectral shape is similar for all collision classes, and the suppression does not saturate in Au + Au collisions.
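For reference (the standard definition, not restated in the abstract), such suppression is quantified by the nuclear modification factor

$$ R_{AA}(p_T) \;=\; \frac{dN_{AA}/dp_T}{\langle N_{\mathrm{coll}}\rangle\; dN_{pp}/dp_T}, $$

so a fivefold suppression corresponds to R_AA ≈ 0.2, roughly flat across 5 < p_T < 20 GeV/c here.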
Abstract:
The PHENIX experiment has measured the suppression of semi-inclusive single high-transverse-momentum π^0's in Au+Au collisions at √s_NN = 200 GeV. The present understanding of this suppression is in terms of energy loss of the parent (fragmenting) parton in a dense color-charge medium. We have performed a quantitative comparison between various parton energy-loss models and our experimental data. Statistical uncertainties, as well as point-to-point uncorrelated and correlated systematic uncertainties, are taken into account in the comparison. We detail this methodology and the resulting constraints on the model parameters, such as the initial color-charge density dN_g/dy, the medium transport coefficient q̂, or the initial energy-loss parameter ε_0. We find that high-transverse-momentum π^0 suppression in Au+Au collisions has sufficient precision to constrain these model-dependent parameters at the ±20-25% (one standard deviation) level. These constraints include only the experimental uncertainties, and further studies are needed to compute the corresponding theoretical uncertainties.
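A minimal sketch of how correlated systematics typically enter such a comparison, assuming the common nuisance-parameter construction (the paper's exact treatment may differ):

$$ \chi^2(\epsilon_b) \;=\; \sum_i \frac{\big[y_i + \epsilon_b\,\sigma_{b,i} - \mu_i(\theta)\big]^2}{\tilde{\sigma}_i^{\,2}} \;+\; \epsilon_b^{\,2}, $$

where y_i are the measured points, σ_b,i the correlated systematic offsets, σ̃_i the uncorrelated (statistical plus point-to-point) errors, μ_i(θ) the model prediction for energy-loss parameters θ (e.g., dN_g/dy or q̂), and ε_b a unit-normal nuisance parameter minimized over.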