405 results for Local press
Abstract:
For decades, indeed centuries, the Scottish media have been a source of national pride. Alongside the education system, the Church of Scotland and the legal apparatus, the media have been rightly viewed as a distinctive Scottish cultural institution, a key part of what makes Scotland a nation rather than a region. Scotland has long sustained, per capita, one of the richest and most diverse media systems in the world, encapsulating a heady mix of local newspapers such as the West Highland Free Press, national [i.e., Scotland-wide] newspapers and broadcast outlets such as the Scotsman and BBC Scotland, and UK-based media with Scottish editions such as the Sun and the Mail. These media have reflected and fuelled what is in turn a distinctive Scottish political identity, separate from, though connected with, that of the United Kingdom as a whole. There has, for example, been no major paper with a pro-Tory editorial line north of the border for longer than most of us can remember, reflecting (and perhaps contributing to) the Conservative Party’s poor showing in successive Scottish elections.
Abstract:
More recently, lifespan development psychology models of adaptive development have been applied to the workforce to investigate ageing-worker and lifespan issues. The current study uses the Learning and Development Survey (LDS) to investigate employees’ selection of and engagement with learning and development goals, and the opportunities and constraints for learning at work, in relation to demographics and career goals. It was found that mature age was associated with perceptions of preferential treatment of younger workers with respect to learning and development. Age was also correlated with several career goals. Findings suggest that younger workers’ learning and development options are better catered for in the workplace. Mature-aged workers may compensate for unequal learning opportunities at work by studying for an educational qualification or seeking alternative job opportunities. The desire for a higher-level job within the organization or for an educational qualification was linked to engagement in learning and development goals at work. It is suggested that an understanding of employee perceptions of goals and activities in the workplace may be important in designing strategies to retain workers.
Abstract:
This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but societies permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, grounded in an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of knowing.
Abstract:
Bringing together some of Australia’s leading and emerging researchers studying different aspects of the not-for-profit (NFP) sector, Strategic Issues in the Not-for-Profit Sector draws on original Australian and comparative research to provide a spirited exploration of strategic issues facing NFP organisations. Diverse, vital and ever-growing, the NFP sector in Australia provides the organisational framework through which many of the most disadvantaged in the community receive access to services and support. However, pressures such as a changing composition, an erosion of financial sustainability, the need to professionalise, and demographic trends affecting patterns of volunteering have pushed the sector to innovate and grow in new directions. Strategic Issues in the Not-for-Profit Sector considers the local and global drivers of change, as well as the industry, policy and community imperatives affecting NFP sustainability, providing a unique insight not only into the strategic issues but also into the strategic responses emerging within the sector.
Abstract:
Robust, affine-covariant feature extractors provide a means to extract correspondences between images captured by widely separated cameras. Advances in wide baseline correspondence extraction require looking beyond the robust feature extraction and matching approach. This study examines new techniques for extracting correspondences that take advantage of information contained in affine feature matches. Methods of improving the accuracy of a set of putative matches, eliminating incorrect matches and extracting large numbers of additional correspondences are explored. It is assumed that knowledge of the camera geometry is not available and not immediately recoverable. The new techniques are evaluated by means of an epipolar geometry estimation task. It is shown that these methods enable the computation of camera geometry in many cases where existing feature extractors cannot produce sufficient numbers of accurate correspondences.
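For orientation, the following is a minimal sketch of the standard pipeline this abstract builds on (robust feature extraction, putative matching, and epipolar geometry estimation with RANSAC to discard incorrect matches), not the paper's new correspondence techniques. It assumes OpenCV is available; the image paths are placeholders.

```python
# Sketch of the baseline wide-baseline matching pipeline (illustrative assumptions:
# OpenCV with SIFT, placeholder image paths).
import cv2
import numpy as np

img1 = cv2.imread("view_left.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_right.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Ratio-test matching to obtain putative correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
putative = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.8 * pair[1].distance:
        putative.append(pair[0])

pts1 = np.float32([kp1[m.queryIdx].pt for m in putative])
pts2 = np.float32([kp2[m.trainIdx].pt for m in putative])

# Epipolar geometry estimation doubles as an outlier filter: only matches consistent
# with a single fundamental matrix survive.
F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
inliers = inlier_mask.ravel().astype(bool)
print(f"{inliers.sum()} of {len(putative)} putative matches are epipolar-consistent")
```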
Abstract:
Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among the embedded data points. The embedding is performed implicitly, by specifying the inner products between each pair of points in the embedding space. This information is contained in the so-called kernel matrix, a symmetric and positive semidefinite matrix that encodes the relative positions of all points. Specifying this matrix amounts to specifying the geometry of the embedding space and inducing a notion of similarity in the input space - classical model selection problems in machine learning. In this paper we show how the kernel matrix can be learned from data via semidefinite programming (SDP) techniques. When applied to a kernel matrix associated with both training and test data, this gives a powerful transductive algorithm: using the labeled part of the data, one can learn an embedding for the unlabeled part as well. The similarity between test points is inferred from training points and their labels. Importantly, these learning problems are convex, so we obtain a method for learning both the model class and the function without local minima. Furthermore, this approach leads directly to a convex method for learning the 2-norm soft margin parameter in support vector machines, solving an important open problem.
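The transductive setting described above can be sketched with an off-the-shelf solver. The following is an illustrative, simplified formulation rather than the paper's exact optimization problem: the kernel matrix over training and test points is a linear combination of fixed candidate kernels, constrained to be positive semidefinite with fixed trace, and its training block is aligned with the label matrix y yᵀ. The data, candidate kernels and alignment objective are stand-ins, assuming cvxpy and scikit-learn are available.

```python
# Simplified kernel-learning SDP sketch (illustrative formulation, toy data).
import numpy as np
import cvxpy as cp
from sklearn.metrics.pairwise import linear_kernel, polynomial_kernel, rbf_kernel

rng = np.random.default_rng(0)
X_train, X_test = rng.normal(size=(30, 5)), rng.normal(size=(10, 5))
y_train = np.sign(rng.normal(size=30))

X_all = np.vstack([X_train, X_test])
n, n_tr = len(X_all), len(X_train)

# Candidate kernels evaluated jointly on training and test points (transductive setting).
candidates = [linear_kernel(X_all), polynomial_kernel(X_all, degree=2), rbf_kernel(X_all)]

mu = cp.Variable(len(candidates))          # combination weights (may be negative)
K = cp.Variable((n, n), PSD=True)          # learned kernel matrix, constrained PSD
Y = np.outer(y_train, y_train)             # label similarity matrix y y^T

constraints = [
    K == sum(mu[i] * candidates[i] for i in range(len(candidates))),
    cp.trace(K) == n,                      # fix the overall scale of the embedding
]
# Maximize alignment between the training block of K and the labels.
objective = cp.Maximize(cp.sum(cp.multiply(K[:n_tr, :n_tr], Y)))
cp.Problem(objective, constraints).solve(solver=cp.SCS)

K_learned = K.value   # includes similarities involving test points, inferred via the labels
```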
Abstract:
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
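As a rough illustration of the quantity in question, the following toy Monte Carlo estimate compares a global empirical Rademacher average with a "local" one restricted to functions whose empirical error on the sample falls below a radius r; the finite threshold class and all numbers are purely illustrative and not drawn from the paper.

```python
# Toy Monte Carlo illustration of global vs. local empirical Rademacher averages
# (illustrative finite class of threshold classifiers; not the paper's analysis).
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.uniform(-1, 1, size=n)
y = np.sign(X)                                        # toy labels: sign of x

# Finite class of threshold classifiers f_t(x) = sign(x - t).
thresholds = np.linspace(-1, 1, 101)
preds = np.sign(X[None, :] - thresholds[:, None])     # shape (num_functions, n)
preds[preds == 0] = 1
emp_error = np.mean(preds != y[None, :], axis=1)      # empirical error of each function

def empirical_rademacher(preds, n_draws=2000, rng=rng):
    """Monte Carlo estimate of E_sigma sup_f (1/n) sum_i sigma_i f(x_i)."""
    n = preds.shape[1]
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # For each random sign vector, take the best correlation achieved within the class.
    return np.mean(np.max(sigma @ preds.T / n, axis=1))

r = 0.1
global_avg = empirical_rademacher(preds)
local_avg = empirical_rademacher(preds[emp_error <= r])   # restrict to small empirical error
print(f"global: {global_avg:.3f}, local (error <= {r}): {local_avg:.3f}")
```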
Abstract:
Urban expansion continues to encroach on existing or newly implemented sewerage infrastructure. In this context, national and international legislation and guidelines provide limited direction to land use planners and infrastructure providers on allocating appropriate amenity buffer distances. A review of published literature suggests the dominant influences include topography, wind speed and direction, temperature, humidity, existing land uses and vegetation profiles. A statistical review of these factors against six years of sewerage odour complaint data was undertaken to ascertain their influence, and a complaint severity hierarchy was established. These hierarchical results suggested the main criteria were topographical location, elevation relative to the odour source and wind speed. Establishing justifiable criteria for buffer zone allocation will provide an analytical basis for determining buffer separations and will assist planners and infrastructure designers in identifying lower-impact locations for sewerage infrastructure.
Abstract:
Single particle analysis (SPA) coupled with high-resolution electron cryo-microscopy is emerging as a powerful technique for the structure determination of membrane protein complexes and soluble macromolecular assemblies. Current estimates suggest that ∼10⁴–10⁵ particle projections are required to attain a 3 Å resolution 3D reconstruction (symmetry dependent). Selecting this number of molecular projections differing in size, shape and symmetry is a rate-limiting step for the automation of 3D image reconstruction. Here, we present SwarmPS, a feature-rich, GUI-based software package to manage large-scale, semi-automated particle picking projects. The software provides cross-correlation and edge-detection algorithms. Algorithm-specific parameters are transparently and automatically determined through user interaction with the image, rather than by trial and error. Other features include multiple image handling (∼10²), local and global particle selection options, interactive image freezing, automatic particle centering, and full manual override to correct false positives and negatives. SwarmPS is user friendly, flexible, extensible, fast, and capable of exporting boxed-out projection images, or particle coordinates, compatible with downstream image processing suites.
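As an illustration of the generic cross-correlation picking step mentioned above (not SwarmPS code, which is not reproduced here), a candidate-picking sketch using scikit-image might look as follows; the file names and score threshold are placeholder assumptions.

```python
# Generic cross-correlation particle picking sketch (illustrative only; placeholder
# file names and threshold): score a micrograph against a reference projection by
# normalised cross-correlation, then keep well-separated local maxima as candidates.
import numpy as np
from skimage import io
from skimage.feature import match_template, peak_local_max

micrograph = io.imread("micrograph.tif", as_gray=True).astype(np.float32)
template = io.imread("reference_projection.tif", as_gray=True).astype(np.float32)

# Normalised cross-correlation map; high values indicate template-like regions.
score = match_template(micrograph, template, pad_input=True)

# Candidate particle centres: local maxima above a correlation threshold, spaced by
# roughly half a box size to avoid duplicate picks.
box = max(template.shape)
coords = peak_local_max(score, min_distance=box // 2, threshold_abs=0.4)

# Export picked (row, col) coordinates for downstream processing, e.g. box files.
np.savetxt("particle_coordinates.txt", coords, fmt="%d")
print(f"picked {len(coords)} candidate particles")
```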