888 results for Future issues


Relevance:

30.00%

Publisher:

Abstract:

Includes bibliography

Relevance:

30.00%

Publisher:

Abstract:

Increasing human demands on soil-derived ecosystem services require reliable data on global soil resources for sustainable development. The soil organic carbon (SOC) pool is a key indicator of soil quality, as it affects essential biological, chemical and physical soil functions such as nutrient cycling, pesticide and water retention, and soil structure maintenance. However, information on the SOC pool and its temporal and spatial dynamics is unbalanced. Even in well-studied regions with a pronounced interest in environmental issues, information on soil carbon (C) is inconsistent. Several activities for the compilation of global soil C data are under way. However, different approaches to soil sampling and chemical analyses make even regional comparisons highly uncertain. Often, the procedures used so far have not allowed reliable estimation of the total SOC pool, partly because the available knowledge is focused on upper soil horizons that are not clearly defined, and the contribution of subsoil to SOC stocks has received less consideration. Quantifying SOC pool changes over time is even more difficult. SOC consists of variable amounts of labile and recalcitrant molecules of plant, microbial and animal origin that are often operationally defined. An active soil expert community needs to agree on protocols for soil surveying and laboratory procedures to obtain reliable SOC pool estimates. Already established long-term ecological research sites, where SOC changes are quantified and the underlying mechanisms are investigated, are potential backbones for regional, national and international SOC monitoring programs. © 2013 Elsevier B.V.

Relevance:

30.00%

Publisher:

Abstract:

This manuscript provides an overview of past wildlife contraception efforts and discusses the current state of research. Two fertility control agents, an avian reproductive inhibitor containing the active ingredient nicarbazin and an immunocontraceptive vaccine, have received regulatory approval from the Environmental Protection Agency and are commercially available in the USA. OvoControl G Contraceptive Bait for Canada Geese and OvoControl for pigeons are delivered as oral baits. An injectable immunocontraceptive vaccine (GonaCon Immunocontraceptive Vaccine) was registered with the Environmental Protection Agency for use in female white-tailed deer in September 2009. Both products are labeled for use in urban/suburban areas where these species are overabundant. Several other compounds that could hold promise in the future are currently being tested for use in wildlife in the USA, Europe, Australia and New Zealand. The development and use of reproductive inhibitors for resolving human–wildlife conflicts will depend on a number of factors, including meeting the requirements of regulatory agencies for use in the environment and the biological and economic feasibility of their use. Use will also depend on health and safety issues and on public acceptance of the techniques.

Relevance:

30.00%

Publisher:

Abstract:

Electrospinning has become a widely implemented technique for the generation of nonwoven mats that are useful in tissue engineering and filter applications. The overriding factor that has contributed to the popularity of this method is the ease with which fibers with submicron diameters can be produced. Fibers on that size scale are comparable to the protein filaments observed in the extracellular matrix. The apparatus and procedures for conducting electrospinning experiments are ostensibly simple. While it is rarely reported in the literature on this topic, any experience with this method of fiber spinning reveals substantial ambiguities in how the process can be controlled to generate reproducible results. The simplicity of the procedure belies the complexity of the physical processes that determine the electrospinning process dynamics. In this article, three process domains within the physical domain of charge interaction are identified as important in electrospinning: (a) creation of charge carriers, (b) charge transport, and (c) residual charge. The initial event that enables electrospinning is the generation of a region of excess charge in the fluid that is to be electrospun. The electrostatic forces that develop on this region of charged fluid in the presence of a high potential result in the ejection of a fluid jet that solidifies into the resulting fiber. The transport of charge from the charged solution to the grounded collection device produces some of the observed current. That transport can occur along the fluid jet and through the atmosphere surrounding the electrospinning apparatus. Charges created in the fluid that are not dissipated remain in the solidified fiber as residual charges. The physics of each of these domains in the electrospinning process is summarized in terms of the current understanding, and possible sources of ambiguity in the implementation of this technique are indicated.
Directions for future research to further articulate the behavior of the electrospinning process are suggested. (C) 2012 American Institute of Physics. [doi: 10.1063/1.3682464]

Relevance:

30.00%

Publisher:

Abstract:

This thesis consists of four self-contained essays in economics.

Tournaments and unfair treatment. This paper introduces the negative feelings associated with the perception of being unfairly treated into a tournament model and examines the impact of these perceptions on workers' efforts and their willingness to work overtime. The effect of unfair treatment on workers' behavior is ambiguous in the model in that two countervailing effects arise: a negative impulsive effect and a positive strategic effect. The impulsive effect implies that workers react to the perception of being unfairly treated by reducing their level of effort. The strategic effect implies that workers raise this level in order to improve their career opportunities and thereby avoid feeling even more unfairly treated in the future. An empirical test of the model using survey data from a Swedish municipal utility shows that the overall effect is negative. This suggests that employers should consider the negative impulsive effect of unfair treatment on effort and overtime in designing contracts and determining promotions.

Late careers in Sweden between 1970 and 2000. In this essay Swedish workers' late careers between 1970 and 2000 are studied. The aim is to examine older workers' career patterns and whether they have changed during this period. For example, is there a difference in career mobility or labor market exiting between cohorts? What affects the late career, and does this differ between cohorts? The analysis shows that between 1970 and 2000 the late careers of Swedish workers comprised few job changes and consisted more of "trying to keep the job you had in your mid-fifties" than of climbing up the promotion ladder. There are no cohort differences in this pattern. Also, a large fraction of older workers exited the labor market before the normal retirement age of 65. During the 1970s and the first part of the 1980s, 56 percent of older workers made an early exit and the average drop-out age was 63. During the late 1980s and the 1990s the share of older workers who made an early exit had risen to 76 percent and the average drop-out age had dropped to 61.5. Different factors affected the probability of an early exit between 1970 and 2000. For example, skills affected the risk of exiting the labor market during the 1970s and up to the mid-1980s, but not in the late 1980s or the 1990s. During the first period older workers in the lowest occupations or with the lowest level of education were more likely to exit the labor market than more highly skilled workers. In the second period older workers at all levels of skill had the same probability of leaving the labor market.

The growth and survival of establishments: does gender segregation matter? We empirically examine the employment dynamics that arise in Becker's (1957) model of labor market discrimination. According to the model, firms that employ a large fraction of women will be relatively more profitable due to lower wage costs, and thus enjoy a greater probability of surviving and growing by underselling other firms in the competitive product market. In order to test these implications, we use a unique Swedish matched employer-employee data set. We find that female-dominated establishments do not enjoy any greater probability of surviving and do not grow faster than other establishments. Additionally, we find that establishments that are integrated in terms of gender, age and education levels are more successful than other establishments. Thus, attempts by legislators to integrate firms along all dimensions of diversity may have positive effects on the growth and survival of firms.

Risk and overconfidence: gender differences in financial decision-making as revealed in the TV game-show Jeopardy. We have used unique data from the Swedish version of the TV show Jeopardy to uncover gender differences in financial decision-making by looking at the contestants' final wagering strategies. After ruling out empirical best responses, which do appear in Jeopardy in the US, a simple model is derived to show that risk preferences and the subjective and objective probabilities of answering correctly (individual and group competence) determine wagering strategies. The empirical model shows that, on average, women adopt more conservative and diversified strategies, while men's strategies aim for the greatest gains. Further, women's strategies are more responsive to the competence measures, which suggests that they are less overconfident. Together these traits make women more successful players. These results are in line with earlier findings on gender and financial trading.

Relevance:

30.00%

Publisher:

Abstract:

The hard X-ray band (10-100 keV) has so far been observed only by collimated and coded aperture mask instruments, with a sensitivity and an angular resolution about two orders of magnitude lower than those of current X-ray focusing telescopes operating below 10-15 keV. Technological advances in X-ray mirrors and detection systems now make it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new deep view of some of the most energetic phenomena in the Universe. In order to reach a sensitivity of 1 μCrab in the 10-40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed to be applicable to any high-energy mission for which the shielding and instrument performances are required. It allows the user to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their paths and filtering the simulated background counts as in a real observation in space. Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft proton scattering by the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the obtained results can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to identify the major sources of background in LEO. All the results are original contributions to the assessment studies of the cited missions, as part of the background groups' activities.
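The core idea of a background simulator of this kind, i.e. tracking particles and filtering the recorded energy deposits as if they were a real observation, can be illustrated with a toy Monte Carlo. This is a minimal sketch, not the actual BoGEMMS physics: the spectral index, energy range and detector parameters are all illustrative assumptions.

```python
import random

def simulate_background(n_events, det_area_cm2, exposure_s,
                        band=(10.0, 40.0), seed=42):
    """Toy Monte Carlo: draw energy deposits from a power-law-like
    spectrum and keep only those falling inside the detector band,
    mimicking the event-filtering step of a full Geant4 simulation."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_events):
        # Inverse transform sampling of dN/dE ~ E^-2.5 above 1 keV
        # (index and minimum energy are illustrative assumptions).
        e = (1.0 - rng.random()) ** (-1.0 / 1.5)
        if band[0] <= e <= band[1]:
            kept.append(e)
    # Normalize to a count rate per unit area, as for a real observation.
    rate = len(kept) / (det_area_cm2 * exposure_s)
    return kept, rate
```

A full simulator would of course propagate each particle through the payload geometry; the point here is only the "filter counts as an observation" step that turns raw simulated hits into an in-band background rate.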

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a collection of essays on innovation in the service sector. This structure serves the purpose of singling out some of the relevant issues and trying to tackle them, first reviewing the state of the literature and then proposing a way forward. Three relevant issues have therefore been selected: (i) the definition of innovation in the service sector and the connected question of the measurement of innovation; (ii) the issue of productivity in services; (iii) the classification of innovative firms in the service sector. Addressing the first issue, chapter II shows how the initial breadth of the original Schumpeterian definition of innovation was narrowed and then passed from manufacturing to the service sector in a reduced, technology-centred form. Chapter III tackles the issue of productivity in services, discussing the difficulties of measuring productivity in a context where the output is often immaterial. We reconstruct the dispute over Baumol's cost disease argument and propose two different ways forward for research on productivity in services: redefining the output along the lines of a characteristics approach; and redefining the inputs, in particular analysing which kind of input is worth saving. Chapter IV derives an integrated taxonomy of innovative service and manufacturing firms, using data from the 2008 CIS survey for Italy. This taxonomy is based on the enlarged definition of "innovative firm" deriving from the Schumpeterian definition of innovation and classifies firms using cluster analysis techniques. The result is a four-cluster solution in which firms are differentiated by the breadth of the innovation activities in which they are involved. Chapter V reports the main conclusions of each preceding chapter and the points worthy of further research.

Relevance:

30.00%

Publisher:

Abstract:

The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package; the problem of the aliasing effect in the generation of low-resolution maps; and the comparison of the angular power spectrum (APS) extraction performances of the optimal QML method, implemented in the code called BolPol, and the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyze the TT parity and low-variance anomalies in the Planck maps, which show a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, photonic crystals, is exploited to develop a new polarization splitter device, and its performance is compared to that of the devices used nowadays.
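The map coordinate changes mentioned above reduce, at their core, to rotating each pixel's direction vector between celestial frames; the HEALPix machinery then resamples the map on the rotated pixel grid. A minimal sketch of the underlying rotation (this is not the HEALPix implementation; only the standard J2000 mean obliquity value is used):

```python
import math

OBLIQUITY_DEG = 23.439  # mean obliquity of the ecliptic (J2000)

def equatorial_to_ecliptic(vec):
    """Rotate a unit direction vector from equatorial to ecliptic
    coordinates: a rotation about the x-axis (toward the vernal
    equinox) by the obliquity of the ecliptic."""
    eps = math.radians(OBLIQUITY_DEG)
    x, y, z = vec
    return (x,
            math.cos(eps) * y + math.sin(eps) * z,
            -math.sin(eps) * y + math.cos(eps) * z)
```

For example, the north celestial pole (0, 0, 1) maps to a direction with ecliptic latitude 90° minus the obliquity, as expected; a full map rotation would apply this to every pixel center and interpolate the map values.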

Relevance:

30.00%

Publisher:

Abstract:

Our growing understanding of the human mind and cognition and the development of neurotechnology have triggered debate around cognitive enhancement in neuroethics. The dissertation examines the normative issues of memory enhancement, and focuses on two issues: (1) the distinction between memory treatment and enhancement; and (2) how the issue of authenticity concerns memory interventions, including memory treatments and enhancements.

The first part consists of a conceptual analysis of the concepts required for normative considerations. First, the representational nature and the function of memory are discussed. Memory is regarded as a special form of self-representation resulting from a constructive process. Next, the concepts of selfhood, personhood, and identity are examined and a conceptual tool, the autobiographical self-model (ASM), is introduced. An ASM is a collection of mental representations of the system's relations with its past and potential future states. Third, the debate between objectivist and constructivist views of health is considered. I argue for a phenomenological account of health, which is based on the primacy of illness and negative utilitarianism.

The second part presents a synthesis of the relevant normative issues based on the conceptual tools developed. I argue that memory enhancement can be distinguished from memory treatment using a demarcation regarding the existence of memory-related suffering. That is, memory enhancements are, under standard circumstances and without any unwilling suffering or potential suffering resulting from the alteration of memory functions, interventions that aim to manipulate memory function based on the self-interests of the individual. I then consider the issue of authenticity, namely whether memory intervention or enhancement endangers "one's true self". By analyzing two conceptions of authenticity, authenticity as self-discovery and authenticity as self-creation, I propose that authenticity should be understood in terms of the satisfaction of the functional constraints of an ASM: synchronic coherence, diachronic coherence, and global veridicality. This framework provides clearer criteria for considering the relevant concerns and allows us to examine the moral values of authenticity.

Relevance:

30.00%

Publisher:

Abstract:

This manuscript focuses on development assistance players’ efforts to cooperate, coordinate and collaborate on projects of mutual interest. I target the case of the cross-sectoral and international Media Issues Group designed to reform and develop the media sector in Bosnia and Herzegovina. I identify and categorize variables that influenced interorganizational relationships to summarize lessons learned and potentially inform similar interventions. This work suggests that cooperation, coordination and collaboration are constrained by contextual, strategic and procedural variables. Through participant narrative based on observation and interviews, this work clarifies the nuances within these three sets of variables for potential extrapolation to other settings. Perhaps more importantly, it provides lessons learned that can inform future international community interventions in market development activities.

Relevance:

30.00%

Publisher:

Abstract:

Acquired haemophilia is an autoimmune disorder characterised by autoantibody formation against coagulation factor VIII. Immunosuppressive treatments including steroids, cytotoxic drugs, rituximab or combinations thereof have been used to eradicate autoantibodies. Very few prospective studies evaluating the use of these treatments exist. Here, we performed a survey among 73 physicians from 57 haemophilia treatment centres in order to describe current practice patterns and critical issues for future research in acquired haemophilia. The results demonstrate a high diversity of first- and second-line treatments. Factors influencing the treatment decision were the underlying disorder, severity of bleeding and inhibitor titre. Frequently used first-line treatments were steroids plus cyclophosphamide (44%) and steroids alone (11%). Second-line treatment was most often rituximab (30%), with or without steroids and/or cyclophosphamide. Most participants indicated that they would change from first- to second-line treatment after 4 weeks in case of failure to obtain partial remission (31%), continued bleeding (40%) or continued severe bleeding requiring bypass treatment (59%). Immunoadsorption was preferred for first- and second-line treatment by 10% and 9% of participants, respectively. These results highlight critical issues in the field. Open questions and directions for future research are discussed.

Relevance:

30.00%

Publisher:

Abstract:

This review of late-Holocene palaeoclimatology represents the results from a PAGES/CLIVAR Intersection Panel meeting that took place in June 2006. The review is in three parts: the principal high-resolution proxy disciplines (trees, corals, ice cores and documentary evidence), emphasizing current issues in their use for climate reconstruction; the various approaches that have been adopted to combine multiple climate proxy records to provide estimates of past annual-to-decadal timescale Northern Hemisphere surface temperatures and other climate variables, such as large-scale circulation indices; and the forcing histories used in climate model simulations of the past millennium. We discuss the need to develop a framework through which current and new approaches to interpreting these proxy data may be rigorously assessed using pseudo-proxies derived from climate model runs, where the 'answer' is known. The article concludes with a list of recommendations. First, more raw proxy data are required from the diverse disciplines and from more locations, as well as replication, for all proxy sources, of the basic raw measurements, to improve absolute dating and to better distinguish the proxy climate signal from noise. Second, more effort is required to improve the understanding of what individual proxies respond to, supported by more site measurements and process studies. These activities should also be mindful of the correlation structure of instrumental data, indicating which adjacent proxy records ought to be in agreement and which should not. Third, large-scale climate reconstructions should be attempted using a wide variety of techniques, emphasizing those for which quantified errors can be estimated at specified timescales. Fourth, a greater use of climate model simulations is needed to guide the choice of reconstruction techniques (the pseudo-proxy concept) and possibly help determine where, given limited resources, future sampling should be concentrated.
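The pseudo-proxy concept in the recommendations, i.e. testing a reconstruction technique against model output where the 'answer' is known, can be sketched in a few lines. This is a deliberately simple illustration: the sinusoidal "model" signal, the noise level and the multi-proxy-mean "reconstruction" are all illustrative assumptions, not an actual reconstruction method from the literature.

```python
import math
import random

def make_pseudo_proxies(truth, n_proxies=10, noise_sd=1.0, seed=1):
    """Degrade a known model temperature series into noisy pseudo-proxies."""
    rng = random.Random(seed)
    return [[t + rng.gauss(0.0, noise_sd) for t in truth]
            for _ in range(n_proxies)]

def reconstruct(proxies):
    """A deliberately simple 'reconstruction': the multi-proxy mean."""
    n = len(proxies[0])
    return [sum(p[i] for p in proxies) / len(proxies) for i in range(n)]

def correlation(a, b):
    """Pearson correlation, used here as the reconstruction skill score."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# 'Model run' with a known answer: a smooth multidecadal signal.
truth = [math.sin(2 * math.pi * i / 50.0) for i in range(200)]
proxies = make_pseudo_proxies(truth)
skill = correlation(truth, reconstruct(proxies))
```

Because the target series is known exactly, the skill score measures the technique itself rather than the data; in a real pseudo-proxy experiment the "truth" would come from a climate model simulation of the past millennium.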

Relevance:

30.00%

Publisher:

Abstract:

Innovations in hardware and network technologies have led to an exploding number of non-interrelated parallel media streams. By itself, this does not add value for consumers. The broadcasting and advertising industries have not yet found new formats to reach the individual user with their content. In this work we propose and describe a novel digital broadcasting framework that allows for the live staging of (mass) media events and improved consumer personalisation. In addition, new professions that will emerge in future TV production workflows are described, namely the 'video composer' and the 'live video conductor'.

Relevance:

30.00%

Publisher:

Abstract:

The future of Brazilian children who have the protection offered by familial bonds is threatened by social inequities that force them to seek shelter and grow up in institutions. According to the Institute of Applied Economic Research, an estimated 20,000 children and adolescents are served by such institutions. The majority of these children are afro-descendent males between the ages of seven and fifteen years old. Of those researched, 87.6% have families (58.2% receive visits from their families, 22.7% are rarely visited by their families and 5.8% are legally prohibited from contacting or being contacted by their families). The percentage of children and adolescents "without families" or with "missing families" is 11.3%. There is no information available for 2% of the children and adolescents residing in shelters. The principal factors that necessitate the placement of Brazilian children in institutions that provide care and shelter include poverty (including children forced to work, sell drugs or beg, for example); domestic violence; chemical dependence of parents or guardians; homelessness; death of parents or guardians; imprisonment of their parents; and sexual abuse committed by their parents or guardians. The issue of abandoned children and adolescents and their care and shelter in the Brazilian context expresses a perverse violation of Child and Adolescent Rights.