980 results for C14.907.489
Abstract:
Panzootics such as highly pathogenic avian influenza and Rift Valley fever have originated in the South, largely among poor communities. At a global level, approximately two-thirds of individuals living on less than US$2 per day keep livestock. Consequently, there is a need to better target animal health interventions for poverty reduction using an evidence-based approach. The paper therefore offers a three-step prioritisation framework using calculations derived from standard poverty measures: the poverty gap and the headcount ratio. Data from 265 poor livestock-keeping households in Kenya informed the study. The results demonstrate that, across a spectrum of producers, dependence upon particular species varies. Furthermore, the same livestock disease has differing impacts on the depth and severity of poverty. Consequently, animal health interventions need to be targeted according to both the species kept and the disease's impact on the depth and severity of poverty.
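The two poverty measures underpinning the framework are standard; a minimal sketch of how they are computed from household income data (the figures below are illustrative, not the study's Kenyan data):

```python
# Minimal sketch of the two standard poverty measures the framework builds on.
# The household incomes and the poverty line below are illustrative only.

def headcount_ratio(incomes, z):
    """Share of households whose income falls below the poverty line z."""
    poor = [y for y in incomes if y < z]
    return len(poor) / len(incomes)

def poverty_gap(incomes, z):
    """Average normalised shortfall (z - y)/z, counting the non-poor as zero;
    captures the depth of poverty, not just its incidence."""
    return sum(max(z - y, 0.0) / z for y in incomes) / len(incomes)

if __name__ == "__main__":
    incomes = [1.20, 0.80, 2.50, 1.90, 0.60, 3.10]  # US$ per day, illustrative
    z = 2.00                                         # poverty line, US$ per day
    print(f"Headcount ratio: {headcount_ratio(incomes, z):.2f}")
    print(f"Poverty gap index: {poverty_gap(incomes, z):.2f}")
```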
Abstract:
Complexity is integral to planning today. Everyone and everything seem to be interconnected, causality appears ambiguous, unintended consequences are ubiquitous, and information overload is a constant challenge. The nature of complexity, the consequences of it for society, and the ways in which one might confront it, understand it and deal with it in order to allow for the possibility of planning, are issues increasingly demanding analytical attention. One theoretical framework that can potentially assist planners in this regard is Luhmann's theory of autopoiesis. This article uses insights from Luhmann's ideas to understand the nature of complexity and its reduction, thereby redefining issues in planning, and explores the ways in which management of these issues might be observed in actual planning practice via a reinterpreted case study of the People's Planning Campaign in Kerala, India. Overall, this reinterpretation leads to a different understanding of the scope of planning and planning practice, telling a story about complexity and systemic response. It allows the reinterpretation of otherwise familiar phenomena, both highlighting the empirical relevance of the theory and providing new and original insight into particular dynamics of the case study. This not only provides a greater understanding of the dynamics of complexity, but also produces advice to help planners implement structures and processes that can cope with complexity in practice.
Abstract:
This paper presents a study of the transient changes occurring in temperature, moisture content and oil content during so-called “post-frying drainage”—the period for which a product is held in the head space of the fryer after it is removed from the oil. Since most of the oil adhering to the product penetrates into the structure during this period, this paper examines the effects of applying vacuum during drainage (1.33 kPa) to maintain the product temperature consistently above the water saturation temperature corresponding to the prevailing pressure (11 °C), which potentially eliminates water condensation and prevents the occluded surface oil from penetrating into the product structure. Draining under vacuum significantly lowers the oil content of potato chips, by 38% compared to atmospheric drainage. This finding is further confirmed by confocal laser scanning microscopy (CLSM) images, which show that the boundary between the core and the crust regions is clearly visible in the case of vacuum drainage, whereas in the case of atmospheric drainage the oil is distributed throughout the structure. Unfortunately, the same approach did not reduce the oil content of French fries—the oil content of the vacuum-drained product was found to be similar to that of the product drained under atmospheric pressure. This is because the reduction in oil content only occurs when there is net moisture evaporation from the product and the evaporation rate is sufficient to force the oil out of the product; this was clearly not the case with French fries. The CLSM images show that the oil distribution in the products drained under atmospheric pressure and under vacuum was similar.
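As a rough consistency check on the figures quoted (1.33 kPa and roughly 11 °C), the water saturation temperature at the drainage pressure can be estimated with the Antoine equation; the coefficients below are the commonly tabulated ones for water over roughly 1–100 °C and are not taken from the paper.

```python
import math

# Antoine equation for water, log10(P[mmHg]) = A - B / (C + T[degC]),
# with commonly tabulated coefficients valid roughly over 1-100 degC.
# Back-of-the-envelope check only, not the authors' method.
A, B, C = 8.07131, 1730.63, 233.426

def saturation_temperature_c(p_kpa):
    """Invert the Antoine equation to get the saturation temperature (degC)."""
    p_mmhg = p_kpa * 760.0 / 101.325
    return B / (A - math.log10(p_mmhg)) - C

print(f"T_sat at 1.33 kPa ~ {saturation_temperature_c(1.33):.1f} degC")  # ~11 degC
```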
Abstract:
In this paper we extend the well-known Leinfelder–Simader theorem on the essential self-adjointness of singular Schrödinger operators to arbitrary complete Riemannian manifolds. This improves some earlier results of Shubin, Milatovic and others.
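For context, the classical Euclidean result being extended is usually stated as follows; the hypotheses are reproduced from memory of Leinfelder and Simader (1981) and should be checked against the original:

```latex
% Classical Leinfelder--Simader theorem (Euclidean case), commonly cited as:
% for a magnetic potential a \in L^4_{loc}(\mathbb{R}^n;\mathbb{R}^n) with
% \nabla\cdot a \in L^2_{loc}(\mathbb{R}^n) and a potential
% 0 \le V \in L^2_{loc}(\mathbb{R}^n), the Schr\"odinger operator
\[
  H \;=\; (-i\nabla - a)^2 + V
\]
% is essentially self-adjoint on C_c^\infty(\mathbb{R}^n). The present paper
% replaces \mathbb{R}^n by an arbitrary complete Riemannian manifold.
```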
Abstract:
Studying peptide amphiphiles (PAs), we investigate the influence of alkyl chain length on the aggregation behavior of the collagen-derived peptide KTTKS, which has applications ranging from antiwrinkle cosmetic creams to potential uses in regenerative medicine. We have studied the synthetic peptide amphiphiles C14-KTTKS (myristoyl-Lys-Thr-Thr-Lys-Ser) and C18-KTTKS (stearoyl-Lys-Thr-Thr-Lys-Ser) to investigate their physicochemical properties in detail. It is presumed that the hydrophobic chain in these self-assembling peptide amphiphiles enhances peptide permeation across the skin compared to KTTKS alone. Subsequently, Cn-KTTKS should act as a prodrug and release the peptide by enzymatic cleavage. Our results should be useful in the further development of molecules with collagen-stimulating activity.
Abstract:
Pervasive computing is a continually and rapidly growing field, although it still remains in relative infancy. The possible applications for the technology are numerous, and they stand to fundamentally change the way users interact with technology. However, alongside these are equally numerous potential undesirable effects and risks. The lack of empirical, naturalistic data from the real world makes studying the true impacts of this technology difficult. This paper describes how two independent research projects shared such valuable empirical data on the relationship between pervasive technologies and users. Each project had different aims and adopted different methods, but both successfully used the same data and arrived at the same conclusions. This paper demonstrates the benefit of sharing research data in multidisciplinary pervasive computing research where real-world implementations are not widely available.
Abstract:
In this paper I analyze general equilibrium in a random Walrasian economy. Dependence among agents is introduced in the form of dependency neighborhoods. Under uncertainty, an agent may fail to survive due to a meager endowment in a particular state (a direct effect), as well as due to an unfavorable equilibrium price system at which the value of the endowment falls short of the minimum needed for survival (an indirect terms-of-trade effect). To illustrate the main result, I compute the stochastic limit of the equilibrium price and the probability of survival of an agent in a large Cobb-Douglas economy.
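As an illustration of the kind of computation described in the last sentence (a deliberately simplified stand-in, not the author's model), the sketch below simulates a large pure-exchange economy with identical Cobb-Douglas preferences, where equilibrium prices are proportional to the expenditure shares divided by mean endowments and an agent survives if the value of her endowment covers an assumed minimal expenditure:

```python
import numpy as np

# Illustrative sketch only: a large pure-exchange economy with identical
# Cobb-Douglas preferences and random endowments. The shares, endowment
# distribution and survival threshold are assumptions for the example.

rng = np.random.default_rng(0)

n_agents, n_goods = 100_000, 3
alpha = np.array([0.5, 0.3, 0.2])          # Cobb-Douglas expenditure shares
endowments = rng.lognormal(mean=0.0, sigma=0.7, size=(n_agents, n_goods))

# With identical Cobb-Douglas preferences and market clearing, prices are
# proportional to alpha_k / (mean endowment of good k); by the law of large
# numbers this converges as the economy grows (average wealth normalised to 1).
prices = alpha / endowments.mean(axis=0)

wealth = endowments @ prices               # value of each agent's endowment
c_min = 0.4                                # assumed minimal expenditure to survive
survival_prob = (wealth >= c_min).mean()

print("limit-like equilibrium prices:", np.round(prices, 3))
print(f"fraction of agents surviving: {survival_prob:.3f}")
```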
Abstract:
In 1594, major decisions were made by the governors of London and the country about plays and playing. We need to learn what lay behind these events, such as what led James Burbage to build his Blackfriars theater in 1596. That initial fiasco might tell us much about what lay behind Shakespeare’s decision to join the new Chamberlain’s Men in 1594 and his subsequent commitment to them as a full-time playwright. When the Globe burned down in 1613, a majority of the shareholders decided to rebuild it at great cost, but Shakespeare withdrew. The rebuilding was old-fashioned thinking, reverting to the company’s desire, asserted in 1594, to play indoors in winter, which helps to clarify their decisions and Shakespeare’s own—to write plays rather than more long poems. The few surviving papers of the Privy Council and the London mayoralty from the time suggest that one of the two new companies of 1594 preferred to play indoors during the winter instead of at their allocated open playhouses in the suburbs. They tried to renew this traditional practice, first in 1594 and again in 1596 when James Burbage built the indoor Blackfriars playhouse for them. The renewal of the Globe in 1614 was part of the same thinking, although Shakespeare evidently opted out of the decision.
Abstract:
Land-use changes can alter the spatial population structure of plant species, which may in turn affect the attractiveness of flower aggregations to different groups of pollinators at different spatial scales. To assess how pollinators respond to spatial heterogeneity of plant distributions, and whether honeybees affect visitation by other pollinators, we used an extensive data set comprising ten plant species and their flower visitors from five European countries. In particular, we tested the hypothesis that the composition of the flower visitor community, in terms of visitation frequencies by different pollinator groups, was affected by the spatial plant population structure, viz. area and density measures, at a within-population (‘patch’) and among-population (‘population’) scale. We found that patch area and population density were the spatial variables that best explained the variation in visitation frequencies within the pollinator community. Honeybees had higher visitation frequencies in larger patches, while bumblebees and hoverflies had higher visitation frequencies in sparser populations. Solitary bees had higher visitation frequencies in sparser populations and smaller patches. We also tested the hypothesis that honeybees affect the composition of the pollinator community by altering the visitation frequencies of other groups of pollinators. There was a positive relationship between the visitation frequencies of honeybees and bumblebees, while the relationship with hoverflies and solitary bees varied (positive, negative or no relationship) depending on the plant species under study. The overall conclusion is that the spatial structure of plant populations affects different groups of pollinators in contrasting ways at both the local (‘patch’) and the larger (‘population’) scale, and that honeybees affect flower visitation by other pollinator groups in various ways, depending on the plant species under study. These contrasting responses emphasize the need to investigate the entire pollinator community when the effects of landscape change on plant–pollinator interactions are studied.
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load-balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature as the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested on a parallel computing system with 64 processors and in simulations with 1024 processing elements.
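For concreteness, a minimal sketch of the straightforward data-parallel formulation described above, in which every iteration ends with a global reduction (here an mpi4py Allreduce); the data and parameters are placeholders, and the tree-based, communication-free variant proposed in the work is not shown:

```python
from mpi4py import MPI
import numpy as np

# Straightforward parallel k-means: data are split uniformly across ranks,
# each rank accumulates partial sums locally, and a global reduction
# (Allreduce) combines them at every iteration -- the synchronisation step
# whose cost the paper seeks to remove. Data and parameters are placeholders.

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

k, dim, n_local, n_iters = 8, 4, 10_000, 20
rng = np.random.default_rng(rank)
local_data = rng.normal(size=(n_local, dim))

# All ranks start from the same centroids (broadcast from rank 0).
centroids = np.empty((k, dim))
if rank == 0:
    centroids[:] = local_data[:k]
comm.Bcast(centroids, root=0)

for _ in range(n_iters):
    # Assign each local point to its nearest centroid.
    dists = np.linalg.norm(local_data[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)

    # Local partial sums and counts per cluster.
    local_sums = np.zeros((k, dim))
    local_counts = np.zeros(k)
    for j in range(k):
        mask = labels == j
        local_sums[j] = local_data[mask].sum(axis=0)
        local_counts[j] = mask.sum()

    # Global reduction: the per-iteration communication bottleneck.
    global_sums = np.zeros_like(local_sums)
    global_counts = np.zeros_like(local_counts)
    comm.Allreduce(local_sums, global_sums, op=MPI.SUM)
    comm.Allreduce(local_counts, global_counts, op=MPI.SUM)

    # Update centroids (empty clusters are left at the origin in this sketch).
    centroids = global_sums / np.maximum(global_counts, 1)[:, None]
```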
Abstract:
Purpose – This study aims to examine the moderating effects of the external environment and organisational structure on the relationship between business-level strategy and organisational performance. Design/methodology/approach – The focus of the study is on manufacturing firms in the UK belonging to the electrical and mechanical engineering sectors, and the respondents were CEOs. Both objective and subjective measures were used to assess performance. Non-response bias was assessed statistically, and appropriate measures were taken to minimise the impact of common method variance (CMV). Findings – The results indicate that environmental dynamism and hostility act as moderators of the relationship between business-level strategy and relative competitive performance. In low-hostility environments a cost-leadership strategy, and in high-hostility environments a differentiation strategy, lead to better performance compared with competitors. In highly dynamic environments a cost-leadership strategy, and in low-dynamism environments a differentiation strategy, are more helpful in improving financial performance. Organisational structure moderates the relationship of both strategic types with return on sales (ROS). However, in the case of return on assets (ROA), the moderating effect of structure was found only in its relationship with the cost-leadership strategy. A mechanistic structure is helpful in improving the financial performance of organisations adopting either a cost-leadership or a differentiation strategy. Originality/value – Unlike many other empirical studies, this study makes an important contribution to the literature by examining in detail the moderating effects of both environment and structure on the relationship between business-level strategy and performance, using moderated regression analysis.
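For readers unfamiliar with the technique, moderated regression tests whether the strategy-performance slope changes with the environment by including an interaction term; the sketch below uses simulated placeholder variables, not the study's data or measures:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Generic moderated-regression sketch: does environmental dynamism moderate
# the link between a differentiation strategy and performance (e.g. ROS)?
# All variables are simulated placeholders, not the study's data.

rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "differentiation": rng.normal(size=n),   # strategy emphasis (standardised)
    "dynamism": rng.normal(size=n),          # environmental dynamism (standardised)
})
# Simulate a performance measure whose strategy slope depends on dynamism.
df["ros"] = (0.3 * df["differentiation"] + 0.2 * df["dynamism"]
             - 0.4 * df["differentiation"] * df["dynamism"]
             + rng.normal(scale=0.5, size=n))

# The moderation test is the significance of the interaction coefficient.
model = smf.ols("ros ~ differentiation * dynamism", data=df).fit()
print(model.summary().tables[1])
```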
Abstract:
Rigorous upper bounds are derived that limit the finite-amplitude growth of arbitrary nonzonal disturbances to an unstable baroclinic zonal flow in a continuously stratified, quasi-geostrophic, semi-infinite fluid. Bounds are obtained both on the depth-integrated eddy potential enstrophy and on the eddy available potential energy (APE) at the ground. The method used to derive the bounds is essentially analogous to that used in Part I of this study for the two-layer model: it relies on the existence of a nonlinear Liapunov (normed) stability theorem, which is a finite-amplitude generalization of the Charney-Stern theorem. As in Part I, the bounds are valid both for conservative (unforced, inviscid) flow and for forced-dissipative flow when the dissipation is proportional to the potential vorticity in the interior, and to the potential temperature at the ground. The character of the results depends on the dimensionless external parameter γ = f₀²ξ/(β₀N²H), where ξ is the maximum vertical shear of the zonal wind, H is the density scale height, and the other symbols have their usual meaning. When γ ≫ 1, corresponding to “deep” unstable modes (vertical scale ≈ H), the bound on the eddy potential enstrophy is just the total potential enstrophy in the system; but when γ ≪ 1, corresponding to “shallow” unstable modes (vertical scale ≈ γH), the eddy potential enstrophy can be bounded well below the total amount available in the system. In neither case can the bound on the eddy APE prevent a complete neutralization of the surface temperature gradient, which is in accord with numerical experience. For the special case of the Charney model of baroclinic instability, and in the limit of infinitesimal initial eddy disturbance amplitude, the bound states that the dimensionless eddy potential enstrophy cannot exceed (γ + 1)²/(24γ²h) when γ ≥ 1, or 1/(6γh) when γ ≤ 1; here h = HN/(f₀L) is the dimensionless scale height and L is the width of the channel. These bounds are very similar to (though of course generally larger than) ad hoc estimates based on baroclinic-adjustment arguments. The possibility of using these kinds of bounds for eddy-amplitude closure in a transient-eddy parameterization scheme is also discussed.
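Restated in LaTeX from the quantities defined above (with the symbol for the dimensionless eddy potential enstrophy chosen here as a placeholder), the small-amplitude bound for the Charney model reads:

```latex
\[
  \mathcal{E}_{\mathrm{eddy}} \;\le\;
  \begin{cases}
    \dfrac{(\gamma+1)^{2}}{24\,\gamma^{2} h}, & \gamma \ge 1,\\[2ex]
    \dfrac{1}{6\,\gamma h},                  & \gamma \le 1,
  \end{cases}
  \qquad
  \gamma = \frac{f_0^{2}\,\xi}{\beta_0 N^{2} H},
  \qquad
  h = \frac{H N}{f_0 L}.
\]
```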
Abstract:
We present an experiment designed to study the psychological basis for the willingness to accept (WTA)–willingness to pay (WTP) gap. Specifically, we conduct a standard WTA–WTP economic experiment to replicate the gap and include in it five additional instruments to try to follow the psychological processes producing it. These instruments are designed to measure five psychological constructs we consider especially relevant: (1) attitudes, (2) feelings, (3) familiarity with the target good, (4) risk attitudes, and (5) personality. Our results provide important new insights into the psychological foundations of the WTA–WTP disparity, which can be used to organize some major previous results and cast serious doubts on the claim that the gap might be just a consequence of inappropriate experimental practice.