913 results for World Bank and IMF


Relevance:

100.00%

Publisher:

Abstract:

University of Pretoria / Dissertation / Department of Church History and Church Policy / Advised by Prof J W Hofmeyr

Relevance:

100.00%

Publisher:

Abstract:

The Internet and World Wide Web have had, and continue to have, an incredible impact on our civilization. These technologies have radically influenced the way that society is organised and the manner in which people around the world communicate and interact. The structure and function of individual, social, organisational, economic and political life begin to resemble the digital network architectures upon which they are increasingly reliant. It is increasingly difficult to imagine how our ‘offline’ world would look or function without the ‘online’ world; it is becoming less meaningful to distinguish between the ‘actual’ and the ‘virtual’. Thus, the major architectural project of the twenty-first century is to “imagine, build, and enhance an interactive and ever changing cyberspace” (Lévy, 1997, p. 10). Virtual worlds are at the forefront of this evolving digital landscape. Virtual worlds have “critical implications for business, education, social sciences, and our society at large” (Messinger et al., 2009, p. 204). This study focuses on the possibilities of virtual worlds in terms of communication, collaboration, innovation and creativity. The concept of knowledge creation is at the core of this research. The study shows that scholars increasingly recognise that knowledge creation, as a socially enacted process, goes to the very heart of innovation. However, efforts to build upon these insights have struggled to escape the influence of the information processing paradigm of old and have failed to move beyond the persistent but problematic conceptualisation of knowledge creation in terms of tacit and explicit knowledge. Based on these insights, the study leverages extant research to develop the conceptual apparatus necessary to carry out an investigation of innovation and knowledge creation in virtual worlds. The study derives and articulates a set of definitions (of virtual worlds, innovation, knowledge and knowledge creation) to guide research. The study also leverages a number of extant theories in order to develop a preliminary framework to model knowledge creation in virtual worlds. Using a combination of participant observation and six case studies of innovative educational projects in Second Life, the study yields a range of insights into the process of knowledge creation in virtual worlds and into the factors that affect it. The study’s contributions to theory are expressed as a series of propositions and findings and are represented as a revised and empirically grounded theoretical framework of knowledge creation in virtual worlds. These findings highlight the importance of prior related knowledge and intrinsic motivation in terms of shaping and stimulating knowledge creation in virtual worlds. At the same time, they highlight the importance of meta-knowledge (knowledge about knowledge) in terms of guiding the knowledge creation process whilst revealing the diversity of behavioural approaches actually used to create knowledge in virtual worlds. This theoretical framework is itself one of the chief contributions of the study, and the analysis explores how it can be used to guide further research in virtual worlds and on knowledge creation. The study’s contributions to practice are presented as an actionable guide to stimulating knowledge creation in virtual worlds. This guide utilises a theoretically based classification of four knowledge-creator archetypes (the sage, the lore master, the artisan, and the apprentice) and derives an actionable set of behavioural prescriptions for each archetype. The study concludes with a discussion of its implications for future research.

Relevance:

100.00%

Publisher:

Abstract:

Avian malaria and related haematozoa are nearly ubiquitous parasites that can impose fitness costs of variable severity and may, in some cases, cause substantial mortality in their host populations. One example of the latter, the emergence of avian malaria in the endemic avifauna of Hawaii, has become a model for understanding the consequences of human-mediated disease introduction. The drastic declines of native Hawaiian birds due to avian malaria provided the impetus for examining more closely several aspects of host-parasite interactions in this system. Host-specificity is an important character determining the extent to which a parasite may emerge. Traditional parasite classification, however, has used host information as a character in taxonomical identification, potentially obscuring the true host range of many parasites. To improve upon previous methods, I first developed molecular tools to identify parasites infecting a particular host. I then used these molecular techniques to characterize host-specificity of parasites in the genera Plasmodium and Haemoproteus. I show that parasites in the genus Plasmodium exhibit low specificity and are therefore most likely to emerge in new hosts in the future. Subsequently, I characterized the global distribution of the single lineage of P. relictum that has emerged in Hawaii. I demonstrate that this parasite has a broad host distribution worldwide, that it is likely of Old World origin and that it has been introduced to numerous islands around the world, where it may have been overlooked as a cause of decline in native birds. I also demonstrate that morphological classification of P. relictum does not capture differences among groups of parasites that appear to be reproductively isolated based on molecular evidence. Finally, I examined whether reduced immunological capacity, which has been proposed to explain the susceptibility of Hawaiian endemics, is a general feature of an "island syndrome" in isolated avifauna of the remote Pacific. I show that, over multiple time scales, changes in immune response are not uniform and that observed changes probably reflect differences in genetic diversity, parasite exposure and life history that are unique to each species.
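The host-specificity characterisation described above hinges on counting how many host species each molecularly identified parasite lineage infects. Below is a minimal sketch of that bookkeeping, using hypothetical infection records and lineage names rather than the study's actual data or pipeline.

from collections import defaultdict

# Hypothetical infection records (host species, parasite lineage), as would be
# produced by molecular screening of blood samples; not data from the study.
records = [
    ("Himatione sanguinea", "Plasmodium_lineage_A"),
    ("Vestiaria coccinea", "Plasmodium_lineage_A"),
    ("Zosterops japonicus", "Plasmodium_lineage_A"),
    ("Turdus merula", "Haemoproteus_lineage_B"),
]

def host_range(records):
    """Number of distinct host species recorded for each parasite lineage.
    A crude proxy for host-specificity: lineages with a broad host range
    (low specificity) are the likeliest candidates to emerge in new hosts."""
    hosts = defaultdict(set)
    for host_species, lineage in records:
        hosts[lineage].add(host_species)
    return {lineage: len(species) for lineage, species in hosts.items()}

print(host_range(records))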

Relevance:

100.00%

Publisher:

Abstract:

Gemstone Team AUDIO (Assessing and Understanding Deaf Individuals' Occupations)

Relevance:

100.00%

Publisher:

Abstract:

An event memory is a mental construction of a scene recalled as a single occurrence. It therefore requires the hippocampus and ventral visual stream needed for all scene construction. The construction need not come with a sense of reliving or be made by a participant in the event, and it can be a summary of occurrences from more than one encoding. The mental construction, or physical rendering, of any scene must be done from a specific location and time; this introduces a "self" located in space and time, which is a necessary, but need not be a sufficient, condition for a sense of reliving. We base our theory on scene construction rather than reliving because this allows the integration of many literatures and because there is more accumulated knowledge about scene construction's phenomenology, behavior, and neural basis. Event memory differs from episodic memory in that it does not conflate the independent dimensions of whether or not a memory is relived, is about the self, is recalled voluntarily, or is based on a single encoding with whether it is recalled as a single occurrence of a scene. Thus, we argue that event memory provides a clearer contrast to semantic memory, which also can be about the self, be recalled voluntarily, and be from a unique encoding; allows for a more comprehensive dimensional account of the structure of explicit memory; and better accounts for laboratory and real-world behavioral and neural results, including those from neuropsychology and neuroimaging, than does episodic memory.

Relevance:

100.00%

Publisher:

Abstract:

Our media is saturated with claims of “facts” made from data. Database research has in the past focused on how to answer queries, but has not devoted much attention to discerning more subtle qualities of the resulting claims, e.g., is a claim “cherry-picking”? This paper proposes a Query Response Surface (QRS) based framework that models claims based on structured data as parameterized queries. A key insight is that we can learn a lot about a claim by perturbing its parameters and seeing how its conclusion changes. This framework lets us formulate and tackle practical fact-checking tasks, such as reverse-engineering vague claims and countering questionable claims, as computational problems. Within the QRS based framework, we take one step further and propose a problem, along with efficient algorithms, for finding high-quality claims of a given form from data, i.e. raising good questions, in the first place. This is achieved by using a limited number of high-valued claims to represent high-valued regions of the QRS. Besides the general-purpose high-quality claim finding problem, lead-finding can be tailored towards specific claim quality measures, also defined within the QRS framework. An example of uniqueness-based lead-finding is presented for “one-of-the-few” claims, yielding interpretable high-quality claims and an adjustable mechanism for ranking objects, e.g. NBA players, based on what claims can be made for them. Finally, we study the use of visualization as a powerful way of conveying the results of a large number of claims. An efficient two-stage sampling algorithm is proposed for generating the input for a 2D scatter plot with heatmap, evaluating only a limited amount of data while preserving the two essential visual features, namely outliers and clusters. For all the problems, we present real-world examples and experiments that demonstrate the power of our model, the efficiency of our algorithms, and the usefulness of their results.
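The key insight above, that we can learn about a claim by perturbing its parameters and watching how the conclusion changes, can be illustrated with a small sketch. The toy data, the query form and the robustness measure below are illustrative assumptions, not the paper's actual algorithms.

import statistics

# Hypothetical per-game scoring data for one player.
points = [12, 35, 38, 41, 9, 15, 33, 36, 40, 11]

def query(start, end):
    """Parameterized query: average points over games [start, end)."""
    return statistics.mean(points[start:end])

def claim_holds(start, end, threshold=30.0):
    """The claim being checked: 'averaged at least `threshold` points over the window'."""
    return query(start, end) >= threshold

def robustness(start, end, radius=2, threshold=30.0):
    """Fraction of nearby parameter settings for which the claim still holds;
    a value near 1 suggests a robust claim, a low value suggests cherry-picking."""
    trials, holds = 0, 0
    for s in range(max(0, start - radius), min(len(points) - 1, start + radius) + 1):
        for e in range(max(s + 1, end - radius), min(len(points), end + radius) + 1):
            trials += 1
            holds += claim_holds(s, e, threshold)
    return holds / trials

# A possibly cherry-picked claim: "averaged at least 30 points over games 1-3".
print(claim_holds(1, 4))            # True
print(round(robustness(1, 4), 2))   # how often the claim survives perturbation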

Relevance:

100.00%

Publisher:

Abstract:

Unstructured grid meshes used in most commercial CFD codes inevitably adopt collocated variable solution schemes. These schemes have several shortcomings, mainly due to the interpolation of the pressure gradient, that lead to slow convergence. In this publication we show how it is possible to use a much more stable staggered mesh arrangement in an unstructured code. Several alternative groupings of variables are investigated in a search for the optimum scheme.
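As an illustration of the difference in variable arrangement discussed above (not the paper's actual solver): a collocated scheme stores pressure and the full velocity vector at cell centres, so the pressure gradient driving the face flux must be interpolated, whereas a staggered scheme stores the face-normal velocity on the face itself, where the pressure difference between the two adjacent cells acts on it directly. A minimal sketch of the two storage layouts:

from dataclasses import dataclass, field
from typing import List

@dataclass
class CollocatedCell:
    # Collocated arrangement: all unknowns live at the cell centre; face values
    # (and the pressure gradient at the face) must be interpolated.
    pressure: float = 0.0
    velocity: List[float] = field(default_factory=lambda: [0.0, 0.0, 0.0])

@dataclass
class StaggeredFace:
    # Staggered arrangement: one unknown per face, the velocity component
    # normal to the face, driven directly by the adjacent cell pressures.
    owner: int = 0            # index of the cell on one side of the face
    neighbour: int = 0        # index of the cell on the other side
    normal_velocity: float = 0.0

def face_pressure_gradient(p_owner: float, p_neighbour: float, distance: float) -> float:
    """Pressure gradient across a face from the two adjacent cell-centre values;
    in the staggered arrangement this acts on the face-normal velocity without
    any interpolation of cell-centre gradients."""
    return (p_neighbour - p_owner) / distance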

Relevance:

100.00%

Publisher:

Abstract:

Heidegger famously identified Modernity with a technological leveling of being to a single order of a “standing reserve.” In a radically different tone, Gilles Deleuze articulated a single “plane of immanence” within which ontological distinctions between mind and body, God and world, interiority and exteriority become indiscernible. Taking such philosophical declarations as points of departure, this panel will consider how a collapse of ontological distinction emerged as a thematic and structural trope in literary and cinematic modernisms. We hope to consider how writers and film-makers of the twentieth century utilize the resources of their media to ask “the question of being” that troubled their philosophical contemporaries and heirs. In this vein, we will examine how these modernist ontologies of immanence describe the crisis of a subject saturated and eclipsed by a world which comprises her while also remaining strange or opaque. Papers will ask what is lost with the departure of a distinctly human sense of “being” and how the historical arrival of an alternative ontological order may be evident in the lived experience of modernity. In this sense, the relationship to departures and arrivals becomes the modern subject’s suspicion that he is unable to do either vis-à-vis the world.

Relevance:

100.00%

Publisher:

Abstract:

Water operators need to be efficient, accountable, honest public institutions providing a universal service. Many water services, however, lack the institutional strength, the human resources, the technical expertise and equipment, or the financial or managerial capacity to provide these services. They need support to develop these capacities. The vast majority of water operators in the world are in the public sector – 90% of all major cities are served by such bodies. This means that the largest pool of experience and expertise, and the great majority of examples of good practice and sound institutions, are to be found in existing public sector water operators. Because they are public sector, however, they do not have any natural commercial incentive to provide international support. Their incentive stems from solidarity, not profit. Since 1990, however, the policies of donors and development banks have focussed on the private companies and their incentives. The vast resources of the public sector have been overlooked, even blocked by pro-private policies. Out of sight of these global policy-makers, however, a growing number of public sector water companies have been engaged, in a great variety of ways, in helping others develop the capacity to be effective and accountable public services. These supportive arrangements are now called 'public-public partnerships' (PUPs). A public-public partnership (PUP) is simply a collaboration between two or more public authorities or organisations, based on solidarity, to improve the capacity and effectiveness of one partner in providing public water or sanitation services. They have been described as: “a peer relationship forged around common values and objectives, which exclude profit-seeking”. Neither partner expects a commercial profit, directly or indirectly. This makes PUPs very different from the public–private partnerships (PPPs) which have been promoted by the international financial institutions (IFIs) like the World Bank. The problems of PPPs have been examined in a number of reports. A great advantage of PUPs is that they avoid the risks of such partnerships: transaction costs, contract failure, renegotiation, the complexities of regulation, commercial opportunism, monopoly pricing, commercial secrecy, currency risk, and lack of public legitimacy. PUPs are not merely an abstract concept. The list in the annexe to this paper includes over 130 PUPs in around 70 countries. This means that far more countries have hosted PUPs than host PPPs in water – according to a report from PPIAF in December 2008, there are only 44 countries with private participation in water. These PUPs cover a period of over 20 years and have been used in all regions of the world. The earliest date back to the 1980s, when the Yokohama Waterworks Bureau first started partnerships to help train staff in other Asian countries. Many of the PUP projects have been initiated in the last few years, a result of the growing recognition of PUPs as a tool for achieving improvements in public water management. This paper attempts to provide an overview of the typical objectives of PUPs; the different forms of PUPs and partners involved; a series of case studies of actual PUPs; and an examination of the recent WOPs initiative. It then offers recommendations for future development of PUPs.

Relevance:

100.00%

Publisher:

Abstract:

The EU-based industry for non-leisure games is an emerging business. As such it is still fragmented and needs to achieve critical mass to compete globally. Nevertheless its growth potential is widely recognized. To become competitive, the relevant applied gaming communities and SMEs require support in fostering the generation of innovation potential. The European project Realizing an Applied Gaming Ecosystem (RAGE) aims to address this challenge. RAGE will help by making available an interoperable set of advanced technology assets, tuned to applied gaming, as well as proven practices of using asset-based applied games in various real-world contexts, and finally centralized access to a wide range of applied gaming software modules, services and related documents, media, and educational resources within an online community portal called the RAGE Ecosystem. It is based on an integrative, user-centered approach to Knowledge Management and Innovation Processes in the shape of a service-based implementation.

Relevance:

100.00%

Publisher:

Abstract:

The main purpose of this paper is to provide the core description of the modelling exercise within the Shelf Edge Advection Mortality And Recruitment (SEAMAR) programme. An individual-based model (IBM) was developed for the prediction of year-to-year survival of the early life-history stages of mackerel (Scomber scombrus) in the eastern North Atlantic. The IBM is one of two components of the model system. The first component is a circulation model to provide physical input data for the IBM. The circulation model is a geographical variant of the HAMburg Shelf Ocean Model (HAMSOM). The second component is the IBM, which is an i-space configuration model in which large numbers of individuals are followed as discrete entities to simulate the transport, growth and mortality of mackerel eggs, larvae and post-larvae. Larval and post-larval growth is modelled as a function of length, temperature and food distribution; mortality is modelled as a function of length and absolute growth rate. Each particle is considered as a super-individual representing 10⁶ eggs at the outset of the simulation, and then declining according to the mortality function. Simulations were carried out for the years 1998-2000. Results showed concentrations of particles at Porcupine Bank and the adjacent Irish shelf, along the Celtic Sea shelf-edge, and in the southern Bay of Biscay. High survival was observed only at Porcupine and the adjacent shelf areas, and, more patchily, around the coastal margin of Biscay. The low survival along the shelf-edge of the Celtic Sea was due to the consistently low estimates of food availability in that area.
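A minimal sketch of the super-individual bookkeeping described above: each particle starts out representing 10⁶ eggs, grows as a function of length, temperature and food, and the number it represents declines according to a mortality rate that depends on length and absolute growth rate. The functional forms and coefficients below are placeholders, not the SEAMAR parameterisations.

import math
from dataclasses import dataclass

@dataclass
class SuperIndividual:
    length_mm: float = 3.0        # initial larval length
    represented: float = 1.0e6    # eggs represented at the outset

def growth_rate(length_mm: float, temperature_c: float, food: float) -> float:
    """Placeholder growth model (mm per day): increases with temperature and
    food availability, slows for larger individuals."""
    return 0.05 * temperature_c * food / (1.0 + 0.1 * length_mm)

def mortality_rate(length_mm: float, growth_mm_day: float) -> float:
    """Placeholder mortality model (per day): decreases with length and with
    absolute growth rate (larger, faster-growing larvae die less)."""
    return 0.3 / (1.0 + 0.2 * length_mm + 2.0 * growth_mm_day)

def step(ind: SuperIndividual, temperature_c: float, food: float, dt_days: float = 1.0) -> None:
    g = growth_rate(ind.length_mm, temperature_c, food)
    m = mortality_rate(ind.length_mm, g)
    ind.length_mm += g * dt_days
    ind.represented *= math.exp(-m * dt_days)   # decline of the represented number

ind = SuperIndividual()
for day in range(60):                           # roughly two months of drift
    step(ind, temperature_c=14.0, food=0.8)
print(f"length: {ind.length_mm:.1f} mm, survivors represented: {ind.represented:.0f}")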

Relevance:

100.00%

Publisher:

Abstract:

An individual-based model (IBM) for the simulation of year-to-year survival during the early life-history stages of the north-east Atlantic stock of mackerel (Scomber scombrus) was developed within the EU funded Shelf-Edge Advection, Mortality and Recruitment (SEAMAR) programme. The IBM included transport, growth and survival and was used to track the passive movement of mackerel eggs, larvae and post-larvae and determine their distribution and abundance after approximately 2 months of drift. One of the main outputs from the IBM, namely the distributions and numbers of surviving post-larvae, is compared with field data on recruit (age-0/age-1 juvenile) distribution and abundance for the years 1998, 1999 and 2000. The juvenile distributions show more inter-annual and spatial variability than the modelled distributions of survivors; this may be due to the restriction of using the same initial egg distribution for all 3 years of simulation. The IBM simulations indicate two main recruitment areas for the north-east Atlantic stock of mackerel, these being Porcupine Bank and the south-eastern Bay of Biscay. These areas correspond to areas of high juvenile catches, although the juveniles generally have a more widespread distribution than the model simulations. The best agreement between modelled data and field data for distribution (juveniles and model survivors) is for the year 1998. The juvenile catches in different representative nursery areas are totalled to give a field abundance index (FAI). This index is compared with a model survivor index (MSI), which is calculated from the total of survivors for the whole spawning season. The MSI compares favourably with the FAI for 1998 and 1999 but not for 2000; in this year, juvenile catches dropped sharply compared with the previous years but there was no equivalent drop in modelled survivors.

Relevance:

100.00%

Publisher:

Abstract:

This study is an integrated overview of pigment and microscopic analysis of phytoplankton communities throughout the Mozambican coast. Collected samples revealed notable patterns of phytoplankton occurrence and distribution, with community structure changing between regions and sample depth. Pigment data showed Delagoa Bight, Sofala Bank and Angoche as the most productive regions throughout the sampled area. In general, micro-sized phytoplankton, particularly diatoms, were important contributors to biomass in both surface and sub-surface maximum (SSM) samples, although they were almost absent at the northern stations. In contrast, nano- and pico-sized phytoplankton revealed opposing patterns. Picophytoplankton were most abundant at the surface, as opposed to nanophytoplankton, which were more abundant at the SSM. Microphytoplankton were associated with cooler southern water masses, while picophytoplankton were related to warmer northern water masses. Nanophytoplankton were found to increase their contribution to biomass with increasing SSM. Microscopy information at the genus and species level revealed the diatoms Chaetoceros spp., Proboscia alata, Pseudo-nitzschia spp., Cylindrotheca closterium and Hemiaulus haukii as the most abundant taxa of the micro-sized phytoplankton. Discosphaera tubifera and Emiliania huxleyi were the most abundant coccolithophores (nano-sized phytoplankton).
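One widely used way of translating pigment concentrations into the micro-, nano- and pico-phytoplankton biomass contributions discussed above is diagnostic pigment analysis. The sketch below uses the weighting coefficients commonly cited from Uitz et al. (2006); whether this study applied exactly this scheme is an assumption, not stated in the abstract, and the sample values are hypothetical.

def size_class_fractions(pigments):
    """Diagnostic pigment analysis: biomass fractions of micro-, nano- and
    picophytoplankton from seven diagnostic pigments (all concentrations in the
    same units, e.g. mg m^-3). Coefficients follow Uitz et al. (2006)."""
    micro = 1.41 * pigments["fucoxanthin"] + 1.41 * pigments["peridinin"]
    nano = (1.27 * pigments["19'-hexanoyloxyfucoxanthin"]
            + 0.35 * pigments["19'-butanoyloxyfucoxanthin"]
            + 0.60 * pigments["alloxanthin"])
    pico = 1.01 * pigments["chlorophyll b"] + 0.86 * pigments["zeaxanthin"]
    total = micro + nano + pico
    return {"micro": micro / total, "nano": nano / total, "pico": pico / total}

# Hypothetical surface sample dominated by diatoms (high fucoxanthin):
sample = {
    "fucoxanthin": 0.50, "peridinin": 0.05,
    "19'-hexanoyloxyfucoxanthin": 0.10, "19'-butanoyloxyfucoxanthin": 0.03,
    "alloxanthin": 0.02, "chlorophyll b": 0.04, "zeaxanthin": 0.06,
}
print(size_class_fractions(sample))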

Relevance:

100.00%

Publisher:

Abstract:

The presence of a quasi-stationary anticyclonic eddy within the southeastern Bay of Biscay (centred around 44°30′N-4°W) has been reported on various occasions in the literature. The analysis made in this study for the period 2003–2010, using in situ and remote sensing measurements and model results, shows that this mesoscale coherent structure is present almost every year from the end of winter/beginning of spring to the beginning of fall. During this period it remains in an area limited to the east by the Landes Plateau, to the west by Le Danois Bank and Torrelavega canyon and to the northwest by the Jovellanos seamount. All the observations and analysis made in this contribution suggest that this structure is generated between Capbreton and Torrelavega canyons. Detailed monitoring of an anticyclonic quasi-stationary eddy in 2008, from in situ and remote sensing data, shows that this structure originated from a warm water current located around 43°42′N-3°30′W in mid-January. This coherent structure is monitored until August around the same area, where it has a marked influence on the Sea Level Anomaly, Sea Surface Temperature and surface Chlorophyll-a concentration. An eddy tracking method, applied to the outputs of a numerical model, shows that the model is able to reproduce this type of eddy, with similar 2D characteristics and lifetimes to those suggested by the observations and previous works. This is the case, for instance, of the simulated MAY04 eddy, which was generated in May 2004 around Torrelavega canyon and remained quasi-stationary in the area for 4 months. The diameter of this eddy ranged from 40 to 60 km, its azimuthal velocity was less than 20 cm s⁻¹, its vertical extension reached 3000–3500 m depth during April and May and it was observed to interact with other coherent structures.
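The eddy tracking method applied to the model outputs is not detailed in the abstract. A common detection approach (an assumption here, not necessarily the one used in the study) flags rotation-dominated regions of the modelled velocity field with the Okubo-Weiss parameter and labels them anticyclonic where the relative vorticity is negative (Northern Hemisphere):

import numpy as np

def okubo_weiss(u, v, dx, dy):
    """Okubo-Weiss parameter W = s_n**2 + s_s**2 - omega**2 for velocity fields
    u, v indexed as [y, x] on a regular grid with spacings dy, dx (metres)."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    normal_strain = du_dx - dv_dy
    shear_strain = dv_dx + du_dy
    vorticity = dv_dx - du_dy
    return normal_strain**2 + shear_strain**2 - vorticity**2, vorticity

def anticyclonic_core_mask(u, v, dx, dy, threshold_factor=0.2):
    """Grid points that are rotation-dominated (W below -threshold_factor * std(W),
    a common heuristic) and rotating anticyclonically (negative vorticity)."""
    w, vorticity = okubo_weiss(u, v, dx, dy)
    return (w < -threshold_factor * np.std(w)) & (vorticity < 0)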
