862 results for Non-negative sources


Relevance: 80.00%

Abstract:

The development of public service broadcasters (PSBs) in the 20th century was framed around debates about their difference from commercial broadcasting. These debates navigated between two poles. One concerned the relationship between non-commercial sources of funding and the role played by statutory Charters as guarantors of the independence of PSBs. The other concerned the relationship between PSBs being both a complementary and a comprehensive service, although there are tensions inherent in this duality. In the 21st century, as reconfigured public service media organisations (PSMs) operate across multiple platforms in a convergent media environment, how are these debates changing, if at all? Has the case for PSM "exceptionalism" changed with Web-based services, catch-up TV, podcasting, ancillary product sales, and the commissioning of programs from external sources in order to operate in highly diversified cross-media environments? Do the traditional assumptions about non-commercialism still hold as the basis for different forms of PSM governance and accountability? This paper considers the question of PSM exceptionalism in the context of three reviews into Australian media that took place over 2011-2012: the Convergence Review undertaken through the Department of Broadband, Communications and the Digital Economy; the National Classification Scheme Review undertaken by the Australian Law Reform Commission; and the Independent Media Inquiry that considered the future of news and journalism.

Relevance: 80.00%

Abstract:

The aim of this paper is to provide a comparison of various algorithms and parameters used to build reduced semantic spaces. The effect of dimension reduction, the stability of the representation, and the effect of word order are examined in the context of five algorithms for producing semantic vectors: random projection (RP), singular value decomposition (SVD), non-negative matrix factorization (NMF), permutations, and holographic reduced representations (HRR). The quality of the semantic representation was tested by means of a synonym-finding task using the TOEFL test on the TASA corpus. Dimension reduction was found to improve the quality of the semantic representation, but it is hard to find the optimal parameter settings. Even though dimension reduction by RP was found to be more generally applicable than SVD, the semantic vectors produced by RP are somewhat unstable. Encoding word order into the semantic vector representation via HRR did not lead to any increase in scores over vectors constructed from word co-occurrence-in-context information. In this regard, very small context windows resulted in better semantic vectors for the TOEFL test.
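As a rough illustration of one of the dimension-reduction steps compared above (a Gaussian variant of random projection, not the paper's implementation), reduced semantic vectors can be obtained as follows; the matrix cooc, the dimensionality k, and the seed are hypothetical:

import numpy as np

def random_projection(cooc: np.ndarray, k: int, seed: int = 0) -> np.ndarray:
    """Project rows of a (terms x contexts) co-occurrence matrix into k
    dimensions with a Gaussian random matrix (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    # Entries ~ N(0, 1/k) approximately preserve pairwise distances
    # (Johnson-Lindenstrauss lemma).
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(cooc.shape[1], k))
    return cooc @ R

# Cosine similarity between the reduced row vectors can then drive the
# synonym-finding (TOEFL) task.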

Relevance: 80.00%

Abstract:

Application of poultry litter (PL) to soil can lead to substantial nitrous oxide (N2O) emissions due to the co-application of labile carbon (C) and nitrogen (N). Slow pyrolysis of PL to produce biochar may mitigate N2O emissions from this source, whilst still providing agronomic benefits. In a corn crop on ferrosol with similarly matched available N inputs of ca. 116 kg N/ha, PL-biochar plus urea emitted significantly less N2O (1.5 kg N2O-N/ha) than raw PL at 4.9 kg N2O-N/ha. Urea amendment without the PL-biochar emitted 1.2 kg N2O-N/ha, and the PL-biochar alone emitted only 0.35 kg N2O-N/ha. Both PL and PL-biochar resulted in similar corn yields and total N uptake, which were significantly greater than for urea alone. Stable isotope methodology showed that the majority (~80%) of N2O emissions came from non-urea sources. Amendment with raw PL significantly increased C mineralisation and the quantity of permanganate-oxidisable organic C. The low molar H/C (0.49) and O/C (0.16) ratios of the PL-biochar suggest that it is more stable in soil than raw PL. The PL-biochar also had higher P and K fertiliser value than raw PL. This study suggests that PL-biochar is a valuable soil amendment with the potential to significantly reduce emissions of soil greenhouse gases compared to the raw product. Contrary to other studies, PL-biochar incorporated to 100 mm did not reduce N2O emissions from surface-applied urea, which suggests that further field evaluation of biochar impacts, and of the methods of application of both biochar and fertiliser, is needed.
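For orientation, molar H/C and O/C ratios such as those quoted above are conventionally computed from elemental mass fractions (w_H, w_C, w_O) and atomic masses; this standard conversion is shown here for reference and is not taken from the paper:

\[
\mathrm{(H/C)_{molar}} = \frac{w_{\mathrm{H}}/1.008}{w_{\mathrm{C}}/12.011},
\qquad
\mathrm{(O/C)_{molar}} = \frac{w_{\mathrm{O}}/15.999}{w_{\mathrm{C}}/12.011}
\]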

Relevance: 80.00%

Abstract:

Samples of sea water contain phytoplankton taxa in varying amounts, and marine scientists are interested in the relative abundance of each taxon. Their relative biomass can be ascertained indirectly by measuring the quantity of various pigments using high-performance liquid chromatography. However, the conversion from pigments to taxa is mathematically non-trivial, as it is a positive matrix factorisation problem in which both matrices are unknown beyond the level of initial estimates. Prior information on the pigment-to-taxa conversion matrix is used to give the problem a unique solution. Iterating between two non-negative least squares algorithms gives satisfactory results. Analysis of sample data indicates good prospects for this type of analysis. An alternative, more computationally intensive approach using Bayesian methods is discussed.
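A minimal sketch of the alternating non-negative least squares idea just described, using scipy; the matrix names, shapes, and iteration count are assumptions, and the initial estimate F0 plays the role of the prior information on the pigment-to-taxa matrix:

import numpy as np
from scipy.optimize import nnls

def alternating_nnls(P, F0, iters=50):
    """Factor P ~ A @ F with A, F >= 0, starting from a prior estimate F0.
    P: (samples x pigments), A: (samples x taxa), F: (taxa x pigments).
    Illustrative sketch only, not the authors' code."""
    F = F0.astype(float).copy()
    A = np.zeros((P.shape[0], F.shape[0]))
    for _ in range(iters):
        # Solve for taxa abundances A row by row, holding F fixed.
        for i in range(P.shape[0]):
            A[i], _ = nnls(F.T, P[i])
        # Solve for pigment ratios F column by column, holding A fixed.
        for j in range(P.shape[1]):
            F[:, j], _ = nnls(A, P[:, j])
    return A, F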

Relevance: 80.00%

Abstract:

Since the revisions to the International Health Regulations (IHR) in 2005, much attention has turned to two concerns relating to infectious disease control. The first is how to assist states to strengthen their capacity to identify and verify public health emergencies of international concern (PHEIC). The second is the question of how the World Health Organization (WHO) will operate its expanded mandate under the revised IHR. Very little attention has been paid to the potential individual power afforded under the IHR revisions – primarily through the inclusion, for the first time, of human rights principles in the instrument and the allowance for the WHO to receive non-state surveillance intelligence and informal reports of health emergencies. These inclusions mark the individual as a powerful actor, but also recognise the vulnerability of the individual to the whim of the state in outbreak response and containment. In this paper we examine why these changes to the IHR occurred and explore the consequences of expanding the sovereignty-as-responsibility concept to disease outbreak response. To this end, our paper considers both the strengths and weaknesses of incorporating reports from non-official sources and including human rights principles in the IHR framework.

Relevance: 80.00%

Abstract:

The rate at which people move and resettle around the world is unprecedented. Mobility and resettlement are now greatly assisted by the use of inexpensive internet communication technologies (ICTs) for a wide variety of functions: communicating locally and across territories, localised information seeking, geo-locational mapping, and forging new social connections in host countries and cities. This article is based on a qualitative study of newly arrived migrants and mobile people from non-English-speaking backgrounds (NESB) in the city of Brisbane, Australia, and investigates how the internet is used to assist the initial period of settling into the city. As increasing amounts of essential information are placed online, the study asks how people from NESB communities manage to negotiate the types of information they require during the early stages of resettlement, given varying levels of access to ICTs and of digital and language literacy. The study finds that the internet is widely used for location-specific information seeking (such as accommodation and job-seeking), but this is often supplemented with other non-mediated sources of information. The study identified implications for social policy in regard to the resourcing and access of information. While the findings are specific to the study location, it is feasible that the patterns of internet use for resettlement have relevance in a broader context.

Relevance: 80.00%

Abstract:

Descriptions of a patient's injuries are recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for the mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families: decision tree, probabilistic, neural network, instance-based, ensemble-based, and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspelling correction is carried out by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, instead of having parts of phrases removed as stop words. Abbreviations appearing in many forms of entry are manually identified, and only one form of each abbreviation is used. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse: few features are irrelevant, but the features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space, and classifiers have been built on these reduced feature spaces. In experiments, a set of tests is conducted to determine which classification method is best for medical text classification. The Non-Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, higher than all the tested traditional classifiers. We also found that TF/IDF weighting, which works well for long-text classification, is inferior to binary weighting in short-document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as their removal affects the classification performance.
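A hedged sketch of the best-performing combination reported above (NNMF for dimensionality reduction followed by a Support Vector Machine), assembled with scikit-learn. The binary term weighting follows the paper's finding that it beats TF/IDF on short documents; the number of components and all variable names are assumptions, not the authors' settings:

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Binary term weighting, which the paper found superior to TF/IDF
# for these short narrative documents.
vectorizer = CountVectorizer(binary=True)
pipeline = make_pipeline(
    vectorizer,
    NMF(n_components=100, init="nndsvd", max_iter=500),  # reduced feature space
    LinearSVC(),  # linear SVM as an illustrative stand-in
)
# pipeline.fit(train_texts, train_codes)
# predicted = pipeline.predict(test_texts)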

Relevance: 80.00%

Abstract:

Narrative text is a useful way of identifying injury circumstances from routine emergency department data collections. Automatically classifying narratives based on machine learning techniques is promising, as it can reduce the tedious manual classification process. Existing work focuses on using Naive Bayes, which does not always offer the best performance. This paper proposes matrix factorization approaches along with a learning enhancement process for this task. The results are compared with the performance of various other classification approaches. The impact of parameter settings on the classification results for a medical text dataset is discussed. With the right choice of dimension k, the Non-Negative Matrix Factorization model method achieves a 10-fold cross-validation accuracy of 0.93.
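The "right dimension k" is typically found empirically; below is a minimal sketch of choosing k by 10-fold cross-validation, with a hypothetical candidate grid and a linear SVM standing in for the paper's learning enhancement process:

from sklearn.decomposition import NMF
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def select_k(X, y, candidates=(25, 50, 100, 200)):
    """Return the NMF dimension k with the best 10-fold CV accuracy.
    X: non-negative document-term matrix, y: injury codes (illustrative)."""
    scores = {}
    for k in candidates:
        model = make_pipeline(NMF(n_components=k, max_iter=500), LinearSVC())
        scores[k] = cross_val_score(model, X, y, cv=10).mean()
    return max(scores, key=scores.get), scores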

Relevance: 80.00%

Abstract:

The dissertation analyses the political culture of Sweden during the reign of King Gustav III (1771-1792). This period, commonly referred to as the Gustavian era, followed the so-called Age of Liberty, ending half a century of strong parliamentary rule in Sweden. The question at the heart of this study engages with the practice of monarchical rule under Gustav III, its ideological origins and power-political objectives, as well as its symbolic expression. The study thereby addresses the very nature of kingship. In concrete terms, why did Gustav III, his court, and his civil service vigorously pursue projects that contemporaneous political opponents and, in particular, subsequent historiography have variously pictured as irrelevant, superficial, or products of pure vanity? The answer, the study argues, is to be found in the patterns of political practice developed and exercised by Gustav III and his administration, which formed a significant part of the political culture of Gustavian Sweden. The dissertation is divided into three parts. The first traces the use and development of royal graces: chivalric orders, medals, titles, privileges, and other gifts issued by the king. The practice of royal reward is illustrated through two case studies: the 1772 coup d'état that established Gustav III's rule, and the birth and baptism of the crown prince, Gustav Adolf, in 1778. The second part deals with the establishment of the Court of Appeal in Vasa in 1776. The formation of the Appeals Court was accompanied by a host of ceremonial, rhetorical, emblematic, and architectural features solidifying its importance as one of Gustav III's most symbolic administrative reform projects, and hence portraying the king as an enlightened monarch par excellence. The third and final part of the thesis engages with war as a cultural phenomenon and focuses on the Russo-Swedish War of 1788-1790. In this study, the war against Russia is primarily seen as an arena for the king and other players to stage, create, re-create, and articulate themselves through scenes and roles adhering to a particular cultural idiom. Its codes and symbolic forms were communicated by means of theatre, literature, art, history, and classical mythology. The dissertation makes use of a host of sources: protocols, speeches, letters, diaries, newspapers, poetry, art, medals, architecture, inscriptions, and registers. Traditional political source material and literary and art sources are studied as totalities, not as separate entities. It is also argued that political and non-fictional sources cannot be understood properly without acknowledging the context of genre, literary conventions, and artistic modes. The study critically views the futile, but nonetheless almost habitual, juxtaposition of the reality of images, ideas, and metaphors with the reality of supposedly factual historical events. Significantly, the thesis presumes the symbolic dimension to be a constitutive element of reality, not its cooked-up misrepresentation. This presumption is reflected in a discussion of the concept of 'role', which should not be understood anachronistically as roles in which the king cast himself at different times and in different situations. Neither Gustav III nor the other European sovereigns of this period merely 'played' roles as rulers or majesties. Rather, they were monarchs, both in their own eyes and in the eyes of their contemporaries, in all relations and contexts.
Keywords: Eighteenth Century, Gustav III, Cultural History, Monarchs, Royal Graces, the Vasa Court of Appeal, the Russo-Swedish War 1788–1790.

Relevance: 80.00%

Abstract:

The transfer from aluminum to copper metallization and the decreasing feature size of integrated circuit devices generated a need for a new diffusion barrier process. Copper metallization required an entirely new process flow with new materials, such as low-k insulators and etch stoppers, which made diffusion barrier integration demanding. The Atomic Layer Deposition (ALD) technique was seen as one of the most promising techniques for depositing the copper diffusion barrier of future devices. ALD was utilized to deposit titanium nitride, tungsten nitride, and tungsten nitride carbide diffusion barriers. Titanium nitride was deposited with a conventional process, and also with a new in situ reduction process in which titanium metal was used as the reducing agent. Tungsten nitride was deposited with a well-known process from tungsten hexafluoride and ammonia, but tungsten nitride carbide, as a new material, required a new process chemistry. In addition to material properties, process integration for copper metallization was studied through compatibility experiments on different surface materials. Based on these studies, the titanium nitride and tungsten nitride processes were found to be incompatible with copper metal. However, the tungsten nitride carbide film was compatible with copper and exhibited the most promising properties for integration into the copper metallization scheme. The process scale-up to 300 mm wafers comprised extensive film uniformity studies, which improved understanding of the non-uniformity sources of ALD growth and the process-specific requirements for ALD reactor design. Based on these studies, it was discovered that the TiN process from titanium tetrachloride and ammonia required a perpendicular-flow reactor design for successful scale-up. The copper metallization scheme also includes the process steps of copper oxide reduction prior to barrier deposition and copper seed deposition prior to copper metal deposition. A simple copper oxide reduction process was developed, in which the substrate was exposed to a gaseous reducing agent under vacuum at elevated temperature. Because the reduction was efficient enough to reduce a thick copper oxide film, the process was also considered an alternative method of forming the copper seed film via copper oxide reduction.

Relevance: 80.00%

Abstract:

The multiplier ideals of an ideal in a regular local ring form a family of ideals parametrized by non-negative rational numbers. As the rational number increases, the corresponding multiplier ideal remains unchanged until, at some point, it becomes strictly smaller. A rational number where this kind of diminishing occurs is called a jumping number of the ideal. In this manuscript we give an explicit formula for the jumping numbers of a simple complete ideal in a two-dimensional regular local ring. In particular, we obtain a formula for the jumping numbers of an analytically irreducible plane curve. We then show that the jumping numbers determine the equisingularity class of the curve.
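In standard notation (given for orientation, not quoted from the manuscript), writing \(\mathcal{J}(c \cdot \mathfrak{a})\) for the multiplier ideal of an ideal \(\mathfrak{a}\) at parameter \(c \ge 0\), the family is decreasing in \(c\), and \(\xi > 0\) is a jumping number precisely when the ideal shrinks at \(\xi\):

\[
\mathcal{J}(c \cdot \mathfrak{a}) \supseteq \mathcal{J}(c' \cdot \mathfrak{a}) \ \text{ for } c \le c',
\qquad
\mathcal{J}(\xi \cdot \mathfrak{a}) \subsetneq \mathcal{J}((\xi - \varepsilon) \cdot \mathfrak{a}) \ \text{ for all } \varepsilon > 0.
\]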

Relevance: 80.00%

Abstract:

Tools known as maximal functions are frequently used in harmonic analysis when studying the local behaviour of functions. Typically they measure the suprema of local averages of non-negative functions. It is essential that the size (more precisely, the L^p-norm) of the maximal function is comparable to the size of the original function. When dealing with families of operators between Banach spaces, we are often forced to replace the uniform bound with the larger R-bound. Hence such a replacement is also needed in the maximal function for functions taking values in spaces of operators. More specifically, the supremum of the norms of local averages (i.e. their uniform bound in the operator norm) has to be replaced by their R-bound. This procedure gives the Rademacher maximal function, which was introduced by Hytönen, McIntosh and Portal in order to prove a certain vector-valued Carleson embedding theorem. They noticed that the sizes of an operator-valued function and its Rademacher maximal function are comparable for many common range spaces, but not for all. Certain requirements on the type and cotype of the spaces involved are necessary for this comparability, henceforth referred to as the “RMF-property”. It was shown that other objects and parameters appearing in the definition, such as the domain of the functions and the exponent p of the norm, make no difference to this. After a short introduction to randomized norms and geometry in Banach spaces, we study the Rademacher maximal function on Euclidean spaces. The requirements on type and cotype are considered, providing examples of spaces without RMF. L^p-spaces are shown to have RMF not only for p greater than or equal to 2 (when it is trivial) but also for 1 < p < 2. A dyadic version of Carleson's embedding theorem is proven for scalar- and operator-valued functions. As the analysis with dyadic cubes can be generalized to filtrations on sigma-finite measure spaces, we consider the Rademacher maximal function in this case as well. It turns out that the RMF-property is independent of the filtration and the underlying measure space, and that it suffices to consider very simple ones known as Haar filtrations. Scalar- and operator-valued analogues of Carleson's embedding theorem are also provided. With the RMF-property proven independent of the underlying measure space, we can use probabilistic notions and formulate it for martingales. Following a similar result for UMD-spaces, a weak-type inequality is shown to be (necessary and) sufficient for the RMF-property. The RMF-property is also studied using concave functions, giving yet another proof of its independence from various parameters.
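For orientation, the classical (scalar) Hardy-Littlewood maximal function alluded to in the opening sentences, together with the comparability of sizes, reads as follows in standard notation; this is background, not quoted from the thesis:

\[
Mf(x) = \sup_{r > 0} \frac{1}{|B(x,r)|} \int_{B(x,r)} |f(y)| \, dy,
\qquad
\|f\|_{L^p} \le \|Mf\|_{L^p} \le C_p \, \|f\|_{L^p} \quad (1 < p \le \infty).
\]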

Relevance: 80.00%

Abstract:

An iterative algorithm based on probabilistic estimation is described for obtaining the minimum-norm solution of a very large, consistent, linear system of equations Ax = g, where A is an m × n matrix with non-negative elements, and x and g are, respectively, n × 1 and m × 1 vectors with positive components.
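The paper's own probabilistic algorithm is not reproduced here; as a related illustration under the same setting (consistent system, minimum-norm solution), the classical randomized Kaczmarz scheme can be sketched as follows. Starting from x = 0 keeps the iterates in the row space of A, so the limit is the minimum-norm solution; all names and the sampling rule below are standard for that scheme, not taken from the paper.

import numpy as np

def randomized_kaczmarz(A, g, iters=10_000, seed=0):
    """Minimum-norm solution of a consistent system A x = g
    (illustrative sketch; not the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sample rows with probability proportional to their squared norm.
    row_norms = np.einsum("ij,ij->i", A, A)
    probs = row_norms / row_norms.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Project onto the hyperplane defined by row i.
        x += (g[i] - A[i] @ x) / row_norms[i] * A[i]
    return x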

Relevance: 80.00%

Abstract:

In this paper, we first describe a framework for modelling the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for the sponsored search auction: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction, which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality for the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and certain special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity. Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page of results containing the links most relevant to the query and also sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. Against every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on a price to be charged to each advertiser. Due to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose, called sponsored search auctions. A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored auction context, and pursue the objective of designing a mechanism superior to these two. In particular, we propose a new mechanism, which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction, since they are assured of gaining a non-negative payoff by doing so.
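As a concrete illustration of the GSP rule discussed above (GSP only; the OPT mechanism is not sketched here): slots are awarded in decreasing order of bids, and each winner pays the next-highest bid per click. The function name, the bids, and the slot count below are made up:

def gsp_allocate(bids: dict[str, float], num_slots: int):
    """Generalized Second Price: rank advertisers by bid; the winner of
    slot j pays the bid of the advertiser ranked j+1 (illustrative)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for j in range(min(num_slots, len(ranked))):
        name, _ = ranked[j]
        # Pay the next bid if one exists, otherwise a reserve price of 0.
        price = ranked[j + 1][1] if j + 1 < len(ranked) else 0.0
        results.append((name, price))
    return results

# Example: three slots, four advertisers.
print(gsp_allocate({"a": 4.0, "b": 3.0, "c": 2.0, "d": 1.0}, 3))
# [('a', 3.0), ('b', 2.0), ('c', 1.0)]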

Relevance: 80.00%

Abstract:

The max-coloring problem is to compute a legal coloring of the vertices of a graph G = (V, E) with a non-negative weight function w on V such that Σ_{i=1}^{k} max_{v ∈ C_i} w(v) is minimized, where C_1, ..., C_k are the color classes. Max-coloring general graphs is as hard as the classical vertex coloring problem, a special case in which vertices have unit weight. In fact, in some cases it can even be harder: for example, no polynomial-time algorithm is known for max-coloring trees. In this paper we consider the problem of max-coloring paths and its generalization, max-coloring a broad class of trees, and show it can be solved in time O(|V| + time for sorting the vertex weights). When vertex weights belong to R, we show a matching lower bound of Ω(|V| log |V|) in the algebraic computation tree model.
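To make the objective concrete, the small helper below evaluates the max-coloring cost Σ_{i} max_{v ∈ C_i} w(v) of a given coloring; the graph, weights, and coloring in the example are hypothetical, and no optimization is performed:

from collections import defaultdict

def max_coloring_cost(coloring: dict[str, int], w: dict[str, float]) -> float:
    """Sum over color classes of the maximum vertex weight in the class."""
    heaviest = defaultdict(float)
    for v, color in coloring.items():
        heaviest[color] = max(heaviest[color], w[v])
    return sum(heaviest.values())

# Example: path u-v-x properly 2-colored.
print(max_coloring_cost({"u": 0, "v": 1, "x": 0}, {"u": 3.0, "v": 5.0, "x": 4.0}))
# max{3, 4} + max{5} = 9.0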