984 results for Framework colaborativa


Relevance:

20.00%

Publisher:

Abstract:

One of the primary features of modern government-to-citizen (G2C) service provision is the ability to offer a citizen-centric view of the e-government portal. The life-event approach is one of the most widely adopted paradigms supporting the idea of solving a complex event in a citizen's life through a single service provision. Several studies have used this approach to design e-government portals, but they were limited in terms of use and scalability: no mechanisms showed how to specify a life-event for structuring public e-services, or how to systematically match life-events with these services while taking citizen needs into account. We introduce the NOrm-Based Life-Event (NoBLE) framework for G2C e-service provision, with a set of mechanisms that guide the design of active life-event-oriented e-government portals.

Relevance:

20.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill, nor is there any agreed protocol for estimating it. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to uninitialized climate change projections? and (2) Is the prediction model's ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts, comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales, which can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale or downscale it to even higher resolution. An overall statement on the skill of CMIP5 decadal hindcasts is not the aim of this paper; the results presented are only illustrative of the framework, which would enable such studies.
However, broad conclusions that are beginning to emerge from the CMIP5 results include: (1) most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) additional skill, though moderate, is added by the initial conditions over what is imparted by external forcing alone; however, in some regions initialization may yield overall worse predictions than uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common in seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted as a starting point for comparing prediction quality across prediction systems; it can provide a baseline against which future improvements are quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including the adjustment of mean and conditional biases and consideration of how best to approach forecast uncertainty.
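The deterministic comparison between initialized and uninitialized hindcasts asked for in question (1) is commonly expressed as a mean squared skill score (MSSS). The sketch below is a minimal illustration with toy data, assuming MSE as the accuracy measure and the uninitialized projection as the reference; the paper's framework includes further metrics and bias adjustment.

```python
import numpy as np

def msss(hindcast, reference, observations):
    """Mean squared skill score: 1 - MSE(hindcast) / MSE(reference).

    Positive values mean the initialized hindcast is more accurate than
    the reference (e.g. an uninitialized climate change projection)."""
    mse_h = np.mean((hindcast - observations) ** 2)
    mse_r = np.mean((reference - observations) ** 2)
    return 1.0 - mse_h / mse_r

# Toy series: the "initialized" hindcast tracks the observations more closely.
obs = np.array([0.1, 0.3, 0.2, 0.5, 0.4])
initialized = obs + 0.05    # small, constant errors
uninitialized = obs + 0.20  # larger errors
skill = msss(initialized, uninitialized, obs)  # positive: initialization adds skill
```

In practice the same score would be computed per grid cell or over smoothed regions, after any adjustment of mean bias.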

Relevance:

20.00%

Publisher:

Abstract:

We present a new iterative approach, Line Adaptation for the Singular Sources Objective (LASSO), to object or shape reconstruction, based on the singular sources (or probe) method for reconstructing scatterers from the far-field pattern of scattered acoustic or electromagnetic waves. The scheme is based on an indicator function, given by the scattered field for incident point sources evaluated at the source point, which is constructed from the given far-field patterns for plane waves. The indicator function is then used to drive the contraction of a surface that surrounds the unknown scatterers. A stopping criterion is formulated for those parts of the surface that touch the unknown scatterers, and a splitting approach for the contracting surfaces is introduced so that scatterers consisting of several separate components can be reconstructed. Convergence of the scheme is shown, and its feasibility is demonstrated in a numerical study with several examples.
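The contraction loop at the heart of such a scheme can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the indicator here is a stand-in callable that blows up near a disk, whereas in LASSO it is computed from the far-field data, and no splitting of surfaces is attempted.

```python
import numpy as np

def contract_surface(points, indicator, step=0.05, threshold=1.0, max_iter=200):
    """Contract a closed test surface (here a polygon of 2-D points) toward
    its centroid, freezing each point once the indicator function signals
    that it touches the unknown scatterer."""
    pts = np.array(points, dtype=float)
    frozen = np.zeros(len(pts), dtype=bool)
    for _ in range(max_iter):
        centroid = pts.mean(axis=0)
        for i in range(len(pts)):
            if frozen[i]:
                continue
            if indicator(pts[i]) >= threshold:        # stopping criterion: contact
                frozen[i] = True
            else:
                pts[i] += step * (centroid - pts[i])  # move inward
        if frozen.all():
            break
    return pts

# Stand-in indicator: blows up near a "scatterer" disk of radius 0.5 at the
# origin, mimicking the blow-up of the scattered field near the boundary.
scatterer_radius = 0.5
toy_indicator = lambda p: 1.0 / max(float(np.linalg.norm(p)) - scatterer_radius, 1e-9)

theta = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)
circle = np.stack([2.0 * np.cos(theta), 2.0 * np.sin(theta)], axis=1)
final = contract_surface(circle, toy_indicator)  # points stop near the disk
```

The surrounding circle shrinks uniformly until every point is flagged by the indicator, which here happens just outside the disk boundary.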

Relevance:

20.00%

Publisher:

Abstract:

Multi-gas approaches to climate change policies require a metric establishing ‘equivalences’ among emissions of various species. Climate scientists and economists have proposed four kinds of such metrics and debated their relative merits. We present a unifying framework that clarifies the relationships among them. We show, as have previous authors, that the global warming potential (GWP), used in international law to compare emissions of greenhouse gases, is a special case of the global damage potential (GDP), assuming (1) a finite time horizon, (2) a zero discount rate, (3) constant atmospheric concentrations, and (4) impacts that are proportional to radiative forcing. Both the GWP and GDP follow naturally from a cost–benefit framing of the climate change issue. We show that the global temperature change potential (GTP) is a special case of the global cost potential (GCP), assuming a (slight) fall in the global temperature after the target is reached. We show how the four metrics should be generalized if there are intertemporal spillovers in abatement costs, distinguishing between private (e.g., capital stock turnover) and public (e.g., induced technological change) spillovers. Both the GTP and GCP follow naturally from a cost-effectiveness framing of the climate change issue. We also argue that if (1) damages are zero below a threshold and (2) infinitely large above a threshold, then cost-effectiveness analysis and cost–benefit analysis lead to identical results. Therefore, the GCP is a special case of the GDP. The UN Framework Convention on Climate Change uses the GWP, a simplified cost–benefit concept. The UNFCCC is framed around the ultimate goal of stabilizing greenhouse gas concentrations. Once a stabilization target has been agreed under the convention, implementation is clearly a cost-effectiveness problem. It would therefore be more consistent to use the GCP or its simplification, the GTP.
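The GWP's structure as a ratio of time-integrated radiative forcings is easy to make concrete numerically. The sketch below assumes single-exponential decay for both gases (real CO2 decay follows a multi-exponential impulse response) and uses purely illustrative parameter values; it mirrors the point above that the GWP integrates forcing to a finite horizon with no discounting.

```python
import numpy as np

def agwp(radiative_efficiency, lifetime, horizon):
    """Absolute GWP: radiative efficiency times the time-integrated fraction
    of a pulse emission still airborne, assuming exp(-t / lifetime) decay.
    The integral of exp(-t/tau) from 0 to H is tau * (1 - exp(-H/tau))."""
    return radiative_efficiency * lifetime * (1.0 - np.exp(-horizon / lifetime))

def gwp(gas_eff, gas_life, ref_eff, ref_life, horizon=100.0):
    """GWP over `horizon` years: AGWP of the gas over AGWP of the reference."""
    return agwp(gas_eff, gas_life, horizon) / agwp(ref_eff, ref_life, horizon)

# Illustrative (made-up) parameters: a short-lived but strongly absorbing gas
# compared against a long-lived, weakly absorbing reference gas.
gwp_100 = gwp(gas_eff=100.0, gas_life=12.0, ref_eff=1.0, ref_life=150.0)
gwp_20 = gwp(gas_eff=100.0, gas_life=12.0, ref_eff=1.0, ref_life=150.0,
             horizon=20.0)
# A shorter horizon weights the short-lived gas more heavily, so gwp_20 > gwp_100.
```

Shortening the horizon raises the metric for short-lived gases, which is exactly the kind of horizon sensitivity the choice among GWP, GDP, GTP and GCP is meant to address.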

Relevance:

20.00%

Publisher:

Abstract:

Sampling strategies for monitoring status and trends in wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information into the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys, whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling; the rest is selected with inclusion probability proportional to predicted abundance, where abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends, and their variances, are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, this mixed strategy gives more precise estimates of status and trends than a strategy in which all of the sample is retained or all of it is selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance, and estimates of status are more precise than those obtained from a rotating panel design.
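The mixed selection step can be sketched directly. This is an illustrative reduction with hypothetical unit indices and a made-up abundance model; the paper's framework adds the two-phase design-based estimators on top of this selection.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixed_sample(prev_sample, predicted_abundance, n_retain, n_pps):
    """Mixed survey design: keep `n_retain` units of the previous sample by
    simple random sampling, then draw `n_pps` further units (without
    replacement) with probability proportional to predicted abundance."""
    prev = np.asarray(prev_sample)
    retained = rng.choice(prev, size=n_retain, replace=False)
    taken = set(int(u) for u in retained)
    pool = np.array([u for u in range(len(predicted_abundance)) if u not in taken])
    p = predicted_abundance[pool] / predicted_abundance[pool].sum()
    fresh = rng.choice(pool, size=n_pps, replace=False, p=p)
    return np.concatenate([retained, fresh])

# 20 survey units; the previous survey sampled units 0..7; a (made-up) model
# predicts abundance for every unit in the region.
predicted = rng.gamma(shape=2.0, scale=3.0, size=20)
sample = mixed_sample(prev_sample=np.arange(8), predicted_abundance=predicted,
                      n_retain=4, n_pps=4)
```

Retaining part of the old sample preserves comparability for trend estimation, while the PPS draw concentrates new effort where the model expects animals.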

Relevance:

20.00%

Publisher:

Abstract:

An interdisciplinary theoretical framework is proposed for analysing justice in global working conditions. In addition to gender and race as popular criteria to identify disadvantaged groups in organizations, in multinational corporations (MNCs) local employees (i.e. host country nationals (HCNs) working in foreign subsidiaries) deserve special attention. Their working conditions are often substantially worse than those of expatriates (i.e. parent country nationals temporarily assigned to a foreign subsidiary). Although a number of reasons have been put forward to justify such inequalities—usually with efficiency goals in mind—recent studies have used equity theory to question the extent to which they are perceived as fair by HCNs. However, since perceptual equity theory has limitations, this study develops an alternative and non-perceptual framework for analysing such inequalities. Employment discrimination theory and elements of Rawls’s ‘Theory of Justice’ are the theoretical pillars of this framework. This article discusses the advantages of this approach for MNCs and identifies some expatriation practices that are fair according to our non-perceptual justice standards, whilst also reasonably (if not highly) efficient.

Relevance:

20.00%

Publisher:

Abstract:

To retain competitiveness, succeed and flourish, organizations are forced to innovate continuously. This drive for innovation is not limited to product and process innovation; more profoundly, it relates to a continuous process of improving how organizations work internally, requiring a constant stream of ideas and suggestions from motivated employees. In this chapter we investigate some recent developments and propose a conceptual framework for creative participation as a personality-driven interface between creativity and innovation. Under the assumption that employees' intrinsic willingness to contribute novel ideas and solutions requires a set of personal characteristics and skills that may well be unique to each organizational unit, the chapter explores personal characteristics associated with creativity, innovation and innovative behavior. Various studies on the correlation between creativity and personality types are also reviewed. The chapter closes with a discussion of solutions and future developments, together with recommendations for future research.

Relevance:

20.00%

Publisher:

Abstract:

This article considers the evolution and impact on schools in England of the "Framework for English" since its introduction in 2001, a national initiative that follows on from the National Literacy Strategy, which focused on primary schools. Whilst acknowledging that the Framework is part of a whole-school policy, "The Key Stage Three Strategy", I concentrate on its direct impact on the school subject "English" and on standards within that subject. Such a discussion must incorporate some consideration of the rise of "Literacy" as a dominant term and theme in England (and globally) and its challenge to a politically controversial and much-contested curriculum area, i.e. "English". If the Framework is considered within the context of the Literacy drive since the mid-1990s, then it can be seen to be evolving within a much-changed policy context and is therefore likely to change substantially in the next few years. In a global context, England has for some time been regarded as at the extreme edge of standards-driven policy and practice. It is hoped that the story of "English" in England may be salutary for educators from other countries.

Relevance:

20.00%

Publisher:

Abstract:

In a world where massive amounts of data are recorded on a large scale, we need data mining technologies to gain knowledge from the data in a reasonable time. The Top Down Induction of Decision Trees (TDIDT) algorithm is a very widely used technology for predicting the classification of newly recorded data. However, alternative technologies have been derived that often produce better rules but do not scale well to large datasets. One such alternative to TDIDT is the PrismTCS algorithm, which performs particularly well on noisy data but does not scale well to large datasets. In this paper we introduce Prism and investigate its scaling behaviour. We describe how we improved the scalability of the serial version of Prism and investigate its limitations. We then describe our work to overcome these limitations by developing a framework for parallelising algorithms of the Prism family and similar algorithms. We also present the scale-up results of a first prototype implementation.
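The core covering loop of a Prism-family algorithm, which such a parallelisation framework targets, can be sketched as follows. This is a minimal serial toy on a made-up dataset (greedy specialisation by highest term precision, then removal of covered instances), not the authors' PrismTCS or parallel implementation.

```python
def prism_rules_for_class(data, target_class):
    """Minimal Prism-style covering algorithm for a single class.

    `data` is a list of (attribute_dict, label) pairs. Each rule is grown by
    greedily adding the attribute=value term with the highest precision for
    the target class until the covered instances are pure; covered instances
    are then removed and the next rule is grown."""
    remaining = list(data)
    rules = []
    while any(label == target_class for _, label in remaining):
        rule = {}            # conjunction of attribute=value terms
        covered = remaining
        while any(label != target_class for _, label in covered):
            best, best_prec = None, -1.0
            for attrs, _ in covered:
                for a, v in attrs.items():
                    if a in rule:
                        continue
                    match = [(x, y) for x, y in covered if x.get(a) == v]
                    prec = sum(y == target_class for _, y in match) / len(match)
                    if prec > best_prec:
                        best, best_prec = (a, v), prec
            if best is None:  # no term left to add; accept the impure rule
                break
            a, v = best
            rule[a] = v
            covered = [(x, y) for x, y in covered if x.get(a) == v]
        rules.append(dict(rule))
        remaining = [(x, y) for x, y in remaining
                     if not all(x.get(a) == v for a, v in rule.items())]
    return rules

# Toy dataset: induce rules for the class "play".
data = [({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "sunny", "windy": "yes"}, "stay"),
        ({"outlook": "rain", "windy": "no"}, "stay"),
        ({"outlook": "sunny", "windy": "no"}, "play")]
rules = prism_rules_for_class(data, "play")
```

The repeated precision computations over the covered instances are where the cost concentrates on large datasets, which is what motivates parallelising this family of algorithms.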

Relevance:

20.00%

Publisher:

Abstract:

Undeniably, anticipation plays a crucial role in cognition. By what means, to what extent, and what it achieves remain open questions. In a recent BBS target article, Clark (in press) depicts an integrative model of the brain that builds on hierarchical Bayesian models of neural processing (Rao and Ballard, 1999; Friston, 2005; Brown et al., 2011), and their most recent formulation using the free-energy principle borrowed from thermodynamics (Feldman and Friston, 2010; Friston, 2010; Friston et al., 2010). Hierarchical generative models of cognition, such as those described by Clark, presuppose the manipulation of representations and internal models of the world, in as much detail as is perceptually available. Perhaps surprisingly, Clark acknowledges the existence of a “virtual version of the sensory data” (p. 4), but with no reference to some of the historical debates that shaped cognitive science, related to the storage, manipulation, and retrieval of representations in a cognitive system (Shanahan, 1997), or accounting for the emergence of intentionality within such a system (Searle, 1980; Preston and Bishop, 2002). Instead of demonstrating how this Bayesian framework responds to these foundational questions, Clark describes the structure and the functional properties of an action-oriented, multi-level system that is meant to combine perception, learning, and experience (Niedenthal, 2007).
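The prediction-error minimisation that hierarchical Bayesian models of this kind rely on can be illustrated with a single-level toy. This is a textbook caricature with made-up weights standing in for precisions, assuming Gaussian errors throughout; it is not Clark's or Friston's actual model.

```python
def predictive_coding_step(mu, observation, prior_mean,
                           sigma_obs=1.0, sigma_prior=1.0, lr=0.1):
    """One gradient step of a single-level predictive-coding update.

    The latent estimate `mu` moves to reduce two weighted prediction
    errors: a bottom-up sensory error (observation - mu) and a top-down
    prior error (mu - prior_mean)."""
    eps_obs = (observation - mu) / sigma_obs      # sensory prediction error
    eps_prior = (mu - prior_mean) / sigma_prior   # prior prediction error
    return mu + lr * (eps_obs - eps_prior)

# Iterating the update settles between the prior and the data, weighted by
# their assumed reliabilities: with sigma_prior=4 the data dominate and the
# estimate converges to 1.6 rather than to the observation (2.0) or prior (0.0).
mu = 0.0
for _ in range(500):
    mu = predictive_coding_step(mu, observation=2.0, prior_mean=0.0,
                                sigma_obs=1.0, sigma_prior=4.0)
```

Even this caricature shows the key structural feature of the target article's model: perception as a compromise between top-down predictions and bottom-up error signals.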

Relevance:

20.00%

Publisher:

Abstract:

A metal-organic framework of Cu(II), tartrate (tar) and 2,2'-bipyridyl (2,2'-bipy), {[Cu(tar)(2,2'-bipy)]·5H2O}n (1), has been synthesized under mild ambient conditions and characterized by single-crystal X-ray crystallography. In the compound, the Cu(2,2'-bipy) entities are bridged by tartrate ions, which coordinate to Cu(II) through both hydroxyl and monodentate carboxylate oxygen atoms to form a one-dimensional chain. The non-coordinated water molecules form 1D water chains of edge-sharing cyclic water pentamers with dangling water dimers, and show reversible water expulsion upon heating. The water chains join the 1D coordination polymeric chains into a 3D network through hydrogen-bond interactions.

Relevance:

20.00%

Publisher:

Abstract:

Cross-bred cow adoption is an important and potent policy variable precipitating subsistence-household entry into emerging milk markets. This paper focuses on the problem of designing policies that encourage and sustain milk-market expansion among a sample of subsistence households in the Ethiopian highlands. In this context it is desirable to measure households' 'proximity' to market in terms of their level of deficiency in essential inputs. The problem is compounded by four factors: first, cross-bred cow numbers (count data) are an important, endogenous decision of the household; second, there is no standard multivariate generalization of the Poisson regression model; third, the milk sales data are censored (sales from non-participating households are, essentially, censored at zero); and fourth, an important simultaneity exists between the decision to adopt a cross-bred cow, the decision about how much milk to produce, the decision about how much milk to consume, and the decision to market the milk that is produced but not consumed within the household. Routine application of Gibbs sampling and data augmentation overcomes these problems in a relatively straightforward manner. We model the count data from two sites close to Addis Ababa in a latent, categorical-variable setting with known bin boundaries. The single-equation model is then extended to a multivariate system that accommodates the covariance between the cross-bred-cow adoption, milk-output and milk-sales equations. The latent-variable procedure proves tractable in the extension to the multivariate setting and provides important information for policy formation in emerging-market settings.
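The data-augmentation step for the censored sales equation can be sketched in a simplified univariate form. This toy imputes censored-at-zero observations inside each Gibbs sweep and then draws the mean from its conjugate posterior; the data, priors and univariate normal model are illustrative stand-ins, whereas the paper's actual model is a multivariate system with count data.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_censored_mean(y, censored, n_iter=2000, sigma=1.0, tau=10.0):
    """Gibbs sampler with data augmentation for the mean of a normal model
    when some observations are censored at zero (like milk sales reported
    as zero by non-participating households).

    Each sweep (a) imputes the censored values from the normal truncated
    to (-inf, 0], then (b) draws the mean from its conjugate posterior
    given the augmented complete data and a N(0, tau^2) prior."""
    y = np.array(y, dtype=float)
    censored = np.asarray(censored)
    mu = y.mean()
    draws = []
    for _ in range(n_iter):
        for i in np.where(censored)[0]:
            z = rng.normal(mu, sigma)   # rejection sampling from the
            while z > 0.0:              # normal truncated below zero
                z = rng.normal(mu, sigma)
            y[i] = z
        n = len(y)
        post_var = 1.0 / (n / sigma**2 + 1.0 / tau**2)
        post_mean = post_var * y.sum() / sigma**2
        mu = rng.normal(post_mean, np.sqrt(post_var))
        draws.append(mu)
    return np.array(draws)

# Five households; two report zero sales (censored non-participants).
sales = [1.2, 0.8, 0.0, 1.5, 0.0]
is_censored = [False, False, True, False, True]
draws = gibbs_censored_mean(sales, is_censored)
```

Treating the censored sales as latent variables is what makes each conditional draw standard, which is the "relatively straightforward" property the abstract refers to; the multivariate extension stacks analogous augmented draws across the adoption, output and sales equations.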