181 results for Cohesion-based discourse analysis


Relevance:

100.00%

Publisher:

Abstract:

Jean Anyon’s (1981) “Social class and school knowledge” was a landmark work in North American educational research. It provided a richly detailed qualitative description of differential, social-class-based constructions of knowledge and epistemological stance. This essay situates Anyon’s work in two parallel traditions of critical educational research: the sociology of the curriculum, and classroom interaction and discourse analysis. It argues for the renewed importance of both quantitative and qualitative research on social reproduction and equity in the current policy context.

Relevance:

100.00%

Publisher:

Abstract:

In what follows, I put forward an argument for an analytical method for social science that operates at the level of genre. I argue that generic convergence, generic hybridity, and generic instability provide us with a powerful perspective on changes in political, cultural, and economic relationships, most specifically at the level of institutions. Such a perspective can help us identify the transitional elements, relationships, and trajectories that define the place of our current system in history, thereby grounding our understanding of possible futures. In historically contextualising our present with this method, my concern is to indicate possibilities for the future. Systemic contradictions indicate possibility spaces within which systemic change must and will emerge. We live in a system currently dominated by many fully expressed contradictions, and so in the presence of many possible futures. The contradictions of the current age are expressed most overtly in the public genres of power politics. Contemporary public policy, indeed politics in general, is an excellent focus for any investigation of possible futures, precisely because of its future-oriented function. It is overtly hortatory; it is designed ‘to get people to do things’ (Muntigl in press: 147). There is no point in trying to get people to do things in the past. Consequently, policy discourse is inherently oriented towards creating some future state of affairs (Graham in press), along with concomitant ways of being, knowing, representing, and acting (Fairclough 2000).

Relevance:

100.00%

Publisher:

Abstract:

In this paper we identify elements in Marx's economic and political writings that are relevant to contemporary critical discourse analysis (CDA). We argue that Marx can be seen to be engaging in a form of discourse analysis. We identify the elements in Marx's historical materialist method that support such a perspective, and exemplify these in a longitudinal comparison of Marx's texts.

Relevance:

100.00%

Publisher:

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation of these images. A specific fixed decomposition structure is designed for the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics; the decision rests on the effect of each subband on the reconstructed image according to the mean-square criterion as well as human visual sensitivity.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed, based on the generalized Gaussian distribution. A least-squares algorithm on a nonlinear function of the distribution model's shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used; in this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed: no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage.

Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while accommodating the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification, so enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating-average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
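The abstract does not reproduce the least-squares shape-parameter estimator it mentions, so as a rough illustration of fitting a generalized Gaussian to a subband's wavelet coefficients, here is a minimal Python sketch using the classic moment-ratio method instead; the function name and bracketing interval are hypothetical choices, not the thesis's algorithm.

```python
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def ggd_shape_estimate(coeffs):
    """Estimate the shape parameter beta of a zero-mean generalized
    Gaussian fitted to subband coefficients via the moment-ratio method."""
    coeffs = np.asarray(coeffs, dtype=float)
    # Observed ratio of first absolute moment to standard deviation.
    rho = np.mean(np.abs(coeffs)) / np.sqrt(np.mean(coeffs ** 2))

    # Theoretical moment ratio for shape parameter beta; it is monotone
    # in beta, so the estimate reduces to a one-dimensional root find.
    def r(beta):
        return gamma(2.0 / beta) / np.sqrt(gamma(1.0 / beta) * gamma(3.0 / beta))

    return brentq(lambda b: r(b) - rho, 0.05, 10.0)

# Sanity check: a Laplacian source corresponds to beta = 1.
rng = np.random.default_rng(0)
print(ggd_shape_estimate(rng.laplace(size=100_000)))  # ~1.0
```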

Relevance:

100.00%

Publisher:

Abstract:

A good object representation or object descriptor is one of the key issues in object-based image analysis. To effectively fuse color and texture into a unified descriptor at the object level, this paper presents a novel method for feature fusion. A color histogram and uniform local binary patterns are extracted from arbitrary-shaped image-objects, and kernel principal component analysis (kernel PCA) is employed to find nonlinear relationships among the extracted color and texture features. A maximum likelihood approach is used to estimate the intrinsic dimensionality, which is then used as a criterion for automatic selection of an optimal feature set from the fused features. The proposed method is evaluated using an SVM as the benchmark classifier and is applied to object-based vegetation species classification using high-spatial-resolution aerial imagery. Experimental results demonstrate that a great improvement can be achieved by using the proposed feature fusion method.
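As a sketch of the fusion pipeline described above, the following Python fragment concatenates per-object color and texture histograms, applies kernel PCA, and trains an SVM. The data are synthetic and the fixed component count stands in for the maximum-likelihood intrinsic-dimensionality estimate, so everything here is an assumption rather than the paper's implementation.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-object features for 200 image-objects.
rng = np.random.default_rng(1)
color_hist = rng.random((200, 64))   # e.g. 64-bin color histogram per object
lbp_hist = rng.random((200, 59))     # e.g. 59-bin uniform-LBP histogram
labels = rng.integers(0, 3, 200)     # three hypothetical vegetation classes

# Concatenate the two descriptors, then let kernel PCA capture
# nonlinear relationships between the color and texture dimensions.
fused = np.hstack([color_hist, lbp_hist])
model = make_pipeline(
    StandardScaler(),
    KernelPCA(n_components=20, kernel="rbf"),  # 20 stands in for the estimated dimensionality
    SVC(kernel="rbf"),                         # SVM as the benchmark classifier
)
model.fit(fused, labels)
print(model.score(fused, labels))
```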

Relevance:

100.00%

Publisher:

Abstract:

Road accidents are of great concern to road and transport departments around the world, as they cause tremendous loss of life and danger to the public. Reducing accident rates and crash severity are imperative goals that governments, road and transport authorities, and researchers aim to achieve. In Australia, road crash trauma costs the nation A$15 billion annually. Five people are killed, and 550 are injured, every day. Each fatality costs the taxpayer A$1.7 million, and serious injury cases can cost many times the cost of a fatality. Crashes are in general uncontrolled events and depend on a number of interrelated factors such as driver behaviour, traffic conditions, travel speed, road geometry and condition, and vehicle characteristics (e.g. tyre type, pressure and condition, and suspension type and condition). Skid resistance is considered one of the most important surface characteristics as it has a direct impact on traffic safety, and attempts have been made worldwide to study the relationship between skid resistance and road crashes. Most of these studies used statistical regression and correlation methods to analyse this relationship, and their outcomes were mixed and inconclusive. The objective of this paper is to present a probability-based method, from an ongoing study, for identifying the relationship between skid resistance and road crashes. Historical skid resistance and crash data for a road network located on the tropical east coast of Queensland were analysed using the probability-based method. The analysis methodology and the results on the relationships between skid resistance, road characteristics and crashes are presented.
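The paper's probability-based method is not specified in the abstract. As a hedged illustration of one simple way to relate skid resistance to crash occurrence, the sketch below bins synthetic road segments by skid-resistance band and reports the empirical crash probability per band; all data, band widths and the toy crash model are invented.

```python
import numpy as np
import pandas as pd

# Hypothetical segment-level data: a skid-resistance reading per segment
# and whether a crash was recorded on that segment in the study period.
rng = np.random.default_rng(2)
skid = rng.uniform(0.3, 0.7, 5000)
crash = rng.random(5000) < (0.9 - skid)  # toy model: lower skid resistance, more crashes

df = pd.DataFrame({"skid": skid, "crash": crash})
df["band"] = pd.cut(df["skid"], bins=np.arange(0.3, 0.75, 0.05))

# Empirical crash probability per skid-resistance band.
print(df.groupby("band", observed=True)["crash"].mean())
```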

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a novel two-stage information filtering model which combines the merits of term-based and pattern-based approaches to effectively filter the sheer volume of incoming information. In particular, the first filtering stage is supported by a novel rough analysis model which efficiently removes a large number of irrelevant documents, thereby addressing the overload problem. The second filtering stage is empowered by a semantically rich pattern taxonomy mining model which effectively fetches incoming documents according to the specific information needs of a user, thereby addressing the mismatch problem. Experiments were conducted to compare the proposed two-stage filtering (T-SM) model with other possible "term-based + pattern-based" or "term-based + term-based" IF models. The results based on the RCV1 corpus show that the T-SM model significantly outperforms the other types of "two-stage" IF models.
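As an illustration of the general two-stage idea (not the paper's rough analysis or pattern-taxonomy-mining models), here is a minimal Python skeleton in which a cheap first-stage score discards the bulk of documents and a richer second-stage score filters the survivors; all names and thresholds are hypothetical.

```python
from typing import Callable, Iterable

def two_stage_filter(
    docs: Iterable[str],
    cheap_score: Callable[[str], float],
    rich_score: Callable[[str], float],
    threshold1: float,
    threshold2: float,
) -> list[str]:
    """Generic two-stage filtering skeleton: a cheap first-stage score
    removes most irrelevant documents (the overload problem), then a
    richer second-stage score ranks the survivors (the mismatch problem)."""
    survivors = [d for d in docs if cheap_score(d) >= threshold1]
    return [d for d in survivors if rich_score(d) >= threshold2]

# Toy usage: stage 1 counts query terms, stage 2 rewards an exact phrase.
query_terms = {"wavelet", "compression"}
stage1 = lambda d: sum(t in d.lower() for t in query_terms)
stage2 = lambda d: 1.0 if "wavelet compression" in d.lower() else 0.0
print(two_stage_filter(
    ["Wavelet compression of images", "Cooking with herbs"],
    stage1, stage2, threshold1=1, threshold2=0.5,
))  # -> ['Wavelet compression of images']
```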

Relevance:

100.00%

Publisher:

Abstract:

Aim. Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Background. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. Method. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as a secondary form of data. Findings. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative way of analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Conclusion. Our example of using this package of situational and social worlds mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice. © 2007 Blackwell Publishing Ltd.

Relevance:

100.00%

Publisher:

Abstract:

This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge; rather, knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, grounded in an axiology of narrowly economic imperatives that is at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of knowing.

Relevance:

100.00%

Publisher:

Abstract:

The bulk of homicide research to date has focused on male offending, with little consideration given to women's offending and, in particular, their construction within the courtroom following a homicide-related charge. This thesis examines, in detail, nineteen homicide cases finalised in the Queensland Supreme Courts between 01/01/1997 and 31/12/2002, in order to document and discuss the various legal stories available to women who kill. Predominantly, two “stock stories” are available within the court. The first, presented by the defence, offers the accused woman a victimised position to occupy. Evidence of victimisation is made available through previous abuse, expert testimony from psychologists and psychiatrists, challenges to her mental health, or appeals to her emotional nature. The second stock story, presented by the prosecution, positions the accused woman as angry, full of revenge, calculating and self-serving. Such a script is usually supported by witnesses, police evidence, and family members. This thesis examines these competing and contradictory scripts using thematic discourse analysis of the court transcripts. It argues that the "truth" of the fatal incident is based on one of these two prevailing scripts. This research destabilises the dominant script of violent female offending in the feminist literature. Most research to date has focused on explaining the circumstances in which women kill, concentrating attention on the victimisation of the violent offending woman and negating or de-prioritising any volition on her part. By analysing all transcripts of women whose trials were held within the specified period, this research is able to demonstrate the stories used to describe their complex offending, and to draw attention to the anger and intent that can occur alongside the victimisation.

Relevance:

100.00%

Publisher:

Abstract:

Trees, shrubs and other vegetation are of continued importance to the environment and our daily life. They provide shade around our roads and houses, offer a habitat for birds and wildlife, and absorb air pollutants. However, vegetation touching power lines is a risk to public safety and the environment, and one of the main causes of power supply problems. Vegetation management, which includes tree trimming and vegetation control, is a significant cost component of the maintenance of electrical infrastructure. For example, Ergon Energy, the Australian energy distributor with the largest geographic footprint, currently spends over $80 million a year inspecting and managing vegetation that encroaches on power line assets. Currently, most vegetation management programs for distribution systems rely on calendar-based ground patrols. However, calendar-based inspection by linesmen is labour-intensive, time-consuming and expensive. It also results in some zones being trimmed more frequently than needed and others not cut often enough. Moreover, it is seldom practicable to measure all the plants around power line corridors by field methods. Remote sensing data captured from airborne sensors has great potential to assist vegetation management in power line corridors.

This thesis presents a comprehensive study of using spiking neural networks in a specific image analysis application: power line corridor monitoring. Theoretically, the thesis focuses on a biologically inspired spiking cortical model: the pulse coupled neural network (PCNN). The original PCNN model was simplified in order to better analyse the pulse dynamics and control the performance. New and effective algorithms were developed based on the proposed spiking cortical model for object detection, image segmentation and invariant feature extraction, and were evaluated in a number of experiments using real image data collected from our flight trials. The experimental results demonstrate the effectiveness and advantages of spiking neural networks in image processing tasks. Operationally, the knowledge gained from this research project offers a good reference to our industry partner (i.e. Ergon Energy) and other energy utilities that want to improve their vegetation management activities. The novel approaches described in this thesis show the potential of using cutting-edge sensor technologies and intelligent computing techniques to improve power line corridor monitoring. The lessons learnt from this project are also expected to increase the confidence of energy companies to move from a traditional vegetation management strategy to a more automated, accurate and cost-effective solution using aerial remote sensing techniques.
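The thesis's simplified spiking cortical model is not detailed in the abstract. As a generic sketch of the pulse-coupled-neural-network family it builds on, the following Python fragment iterates a textbook simplified PCNN over an image (one neuron per pixel); all parameter values and the kernel are chosen only for illustration, not taken from the thesis.

```python
import numpy as np
from scipy.ndimage import convolve

def simplified_pcnn(img, steps=10, f=0.9, g=0.8, h=20.0, beta=0.2):
    """Iterate a minimal simplified PCNN and return the cumulative
    firing map, which tends to segment regions of similar intensity."""
    S = img.astype(float) / img.max()          # normalized external stimulus
    W = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])            # linking kernel (neighbour coupling)
    U = np.zeros_like(S)                       # internal activity
    E = np.ones_like(S)                        # dynamic threshold
    Y = np.zeros_like(S)                       # pulse output
    fired = np.zeros_like(S)
    for _ in range(steps):
        # Activity decays and is driven by the stimulus, modulated by
        # pulses from neighbouring neurons (the linking term).
        U = f * U + S * (1.0 + beta * convolve(Y, W, mode="constant"))
        Y = (U > E).astype(float)              # neurons fire where activity beats threshold
        E = g * E + h * Y                      # firing raises a neuron's own threshold
        fired += Y
    return fired

# Toy usage: a bright square on a dim background pulses as a coherent region.
img = np.zeros((32, 32)); img[8:24, 8:24] = 200.0; img += 20.0
print(simplified_pcnn(img).max())
```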

Relevance:

100.00%

Publisher:

Abstract:

In the current climate of accountability, political manoeuvring, changing curriculum, increasingly diverse student cohorts and community expectations, teachers, more than ever, need to develop the skills and abilities to be reflective and reflexive practitioners. This study examines national teacher professional standards from Australia and the UK to identify the extent to which reflexivity is embedded in key policy documents that are intended to guide the work of teachers in those countries. Using Margaret Archer's theories of reflexivity and morphogenesis, and methods of critical discourse analysis, we argue that these blueprints for teachers’ work exclude reflexivity as an essential and overarching discourse of teacher professionalism.