97 results for dichroic mirror


Relevance: 10.00%

Abstract:

The notion of pedagogy for anyone in the teaching profession is innocuous. The term itself is steeped in history, but the details of the practice can be elusive. What does it mean for an academic to embrace pedagogy? The problem is not limited to academics; most teachers baulk at the introduction of a pedagogic agenda and resist attempts to have them reflect on their classroom teaching practice, wherever that classroom might be constituted. This paper explores the application of a pedagogic model (Education Queensland, 2001) which was developed in the context of primary and secondary teaching and was part of a schooling agenda to improve pedagogy. As a teacher educator, I introduced the model to classroom teachers (Hill, 2002) using an Appreciative Inquiry (Cooperrider and Srivastva, 1987) model and at the same time applied the model to my own pedagogy as an academic. Despite being instigated as a model for classroom teachers, I found through my own practitioner investigation that the model was useful for exploring my own pedagogy as a university academic (Hill, 2007, 2008).
References:
Cooperrider, D.L. and Srivastva, S. (1987) Appreciative inquiry in organisational life, in Passmore, W. and Woodman, R. (Eds) Research in Organisational Change and Development (Vol. 1). Greenwich, CT: JAI Press, pp. 129-169.
Education Queensland (2001) School Reform Longitudinal Study (QSRLS). Brisbane: Queensland Government.
Hill, G. (2002, December) Reflecting on professional practice with a cracked mirror: Productive Pedagogy experiences. Australian Association for Research in Education Conference, Brisbane, Australia.
Hill, G. (2007) Making the assessment criteria explicit through writing feedback: A pedagogical approach to developing academic writing. International Journal of Pedagogies and Learning, 3(1), 59-66.
Hill, G. (2008) Supervising practice based research. Studies in Learning, Evaluation, Innovation and Development, 5(4), 78-87.

Relevance: 10.00%

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on analysis of the information packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision. To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and scaling factor. In lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined, leading to a nonoptimized quantization approach.
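The generalized Gaussian coefficient model described above can be made concrete. The thesis formulates a least squares problem on a nonlinear function of the shape parameter; as a simplified stand-in, the sketch below estimates the shape parameter by matching the sample moment ratio E|x|/sqrt(E[x^2]) against its closed form over a search grid (the function name and grid range are illustrative, not from the thesis):

```python
import numpy as np
from math import gamma

def ggd_shape_moment_match(coeffs, betas=None):
    """Estimate the generalized Gaussian shape parameter (beta = 2 is
    Gaussian, beta = 1 is Laplacian) by matching the sample ratio
    E|x| / sqrt(E[x^2]) to its closed form
    Gamma(2/b) / sqrt(Gamma(1/b) * Gamma(3/b)) over a search grid."""
    if betas is None:
        betas = np.linspace(0.2, 4.0, 2000)   # illustrative search range
    x = np.asarray(coeffs, dtype=float)
    r_emp = np.mean(np.abs(x)) / np.sqrt(np.mean(x ** 2))
    r_model = np.array([gamma(2.0 / b) / np.sqrt(gamma(1.0 / b) * gamma(3.0 / b))
                        for b in betas])
    # the ratio is monotonic in b, so the grid minimum is the matched shape
    return float(betas[np.argmin((r_model - r_emp) ** 2)])
```

Because the moment ratio is monotonic in the shape parameter, minimising the squared mismatch over the grid behaves like the nonlinear least squares fit the abstract describes, at grid resolution.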
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project these onto the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high quality reconstructed images with better compression ratios than other available algorithms.
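As an illustration of non-cubic lattice quantization of the kind the abstract contrasts with cubic-only algorithms, the sketch below implements the well-known Conway-Sloane fast nearest-point search for the D_n lattice (integer vectors with even coordinate sum). This is a standard published routine, not the thesis's piecewise-uniform pyramid LVQ algorithm:

```python
import numpy as np

def quantize_Dn(x):
    """Find the nearest point of the D_n lattice (integer vectors with
    even coordinate sum): round every coordinate, and if the sum comes
    out odd, re-round the coordinate with the largest rounding error
    the other way (Conway-Sloane fast quantizing algorithm)."""
    x = np.asarray(x, dtype=float)
    f = np.round(x)                      # nearest point of Z^n
    if int(f.sum()) % 2 == 0:
        return f                         # already in D_n
    err = x - f
    k = int(np.argmax(np.abs(err)))      # worst-rounded coordinate
    f[k] += np.sign(err[k]) if err[k] != 0 else 1.0
    return f
```

A full quantizer would still scale the input and truncate to a finite shell (the truncation level and scaling factor mentioned above); this sketch covers only the nearest-lattice-point step.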
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
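The ridge extraction step above relies on local ridge dominant directions. A standard gradient-based orientation-field estimate is sketched below; this is a common textbook approach, not the thesis's own enhancement pipeline, and the block size and function name are illustrative:

```python
import numpy as np

def block_orientations(img, block=16):
    """Estimate the dominant ridge direction (radians from the x-axis)
    in each block of a grey-level image from averaged gradient
    products; ridges run perpendicular to the mean gradient."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)            # np.gradient returns d/drow, d/dcol
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            gxx = np.sum(gx[sl] ** 2)
            gyy = np.sum(gy[sl] ** 2)
            gxy = np.sum(gx[sl] * gy[sl])
            # least-squares gradient direction, rotated 90 degrees
            theta[i, j] = 0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2
    return theta
```

On a synthetic image of vertical ridges (intensity varying only along x), every block's estimate comes out at pi/2, i.e. ridges running vertically.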

Relevance: 10.00%

Abstract:

The legal power to declare war has traditionally been a part of a prerogative to be exercised solely on advice that passed from the King to the Governor-General no later than 1942. In 2003, the Governor-General was not involved in the decision by the Prime Minister and Cabinet to commit Australian troops to the invasion of Iraq. The authors explore the alternative legal means by which Australia can go to war - means the government in fact used in 2003 - and the constitutional basis of those means. While the prerogative power can be regulated and/or devolved by legislation, and just possibly by practice, there does not seem to be a sound legal basis to assert that the power has been devolved to any other person. It appears that in 2003 the Defence Minister used his legal powers under the Defence Act 1903 (Cth) (as amended in 1975) to give instructions to the service head(s). A powerful argument could be made that the relevant sections of the Defence Act were not intended to be used for the decision to go to war, and that such instructions are for peacetime or in bello decisions. If so, the power to make war remains within the prerogative, to be exercised on advice. Interviews with the then Governor-General indicate that Prime Minister Howard had planned to take the matter to the Federal Executive Council 'for noting', but did not do so after the Governor-General sought the views of the then Attorney-General about relevant issues of international law. The exchange raises many issues, but those of interest concern the kinds of questions the Governor-General could and should ask about proposed international action and whether they in any way mirror the assurances that are uncontroversially required for domestic action.
In 2003, the Governor-General's scrutiny was the only independent scrutiny available because the legality of the decision to go to war was not a matter that could be determined in the High Court, and the federal government had taken action in March 2002 that effectively prevented the matter coming before the International Court of Justice

Relevance: 10.00%

Abstract:

Driver simulators provide safe conditions in which to assess driver behaviour, together with controlled and repeatable environments for study; this combination of safety and experimental control makes them a promising research tool. Driver simulators range widely, from laptops to advanced systems in which a real car, mounted on a platform with six degrees of freedom of movement, is controlled by several computers. The applicability of simulator-based research to a particular study needs to be considered before starting the study, to determine whether the use of a simulator is actually appropriate for the research. Given the wide range of driver simulators and their uses, it is important to know beforehand how closely the results from a driver simulator match results found in the real world. Comparison between drivers’ performance under real road conditions and in a particular simulator is a fundamental part of validation: the important question is whether the results obtained in a simulator mirror real-world results. In this paper, the results of the most recently conducted research into the validity of simulators are presented.

Relevance: 10.00%

Abstract:

Purpose: To compare subjective blur limits for cylinder and defocus. ---------- Method: Blur was induced with a deformable, adaptive-optics mirror when either the subjects’ own astigmatisms were corrected or when both astigmatisms and higher-order aberrations were corrected. Subjects were cyclopleged and had 5 mm artificial pupils. Black letter targets (0.1, 0.35 and 0.6 logMAR) were presented on white backgrounds. ---------- Results: For ten subjects, blur limits were approximately 50% greater for cylinder than for defocus (in diopters). While there were considerable effects of axis for individuals, overall this effect was not strong, with the 0° (or 180°) axis having about 20% greater limits than oblique axes. In a second experiment with text (equivalent in angle to N10 print at a 40 cm distance), cylinder blur limits for six subjects were approximately 30% greater than those for defocus; this percentage was slightly smaller than for the three letters. Blur limits for the text were intermediate between those of 0.35 logMAR and 0.6 logMAR letters. Extensive blur limit measurements for one subject with single letters did not show the expected interactions between target detail orientation and cylinder axis. ---------- Conclusion: Subjective blur limits for cylinder are 30%-50% greater than those for defocus, with the overall influence of cylinder axis being about 20%.

Relevance: 10.00%

Abstract:

While critical success factors (CSFs) of enterprise system (ES) implementation are mature concepts and have received considerable attention for over a decade, researchers have very often focused on only a specific aspect of the implementation process or a specific CSF. As a result, there is (1) little documented research that encompasses all significant CSF considerations and (2) little empirical research into the important factors of successful ES implementation. This paper is part of a larger research effort that aims to contribute to understanding the phenomenon of ES CSFs, and reports on preliminary findings from a case study conducted at the Queensland University of Technology (QUT) in Australia. This paper reports on an empirically derived CSFs framework based on a directed content analysis of 79 studies from top IS outlets, employing the characteristics of analytic theory, and on six different projects implemented at QUT.

Relevance: 10.00%

Abstract:

While the information services function's (ISF's) service quality is not a new concept and has received considerable attention for over two decades, cross-cultural research into ISF service quality is not very mature. The author argues that the relationship between cultural dimensions and ISF service quality dimensions may provide useful insights into how organisations should deal with different cultural groups. This paper shows that ISF service quality dimensions vary from one culture to another. The study adopts Hofstede's (1980, 1991) typology of cultures and the "zones of tolerance" (ZOT) service quality measure reported by Kettinger & Lee (2005) as the primary commencing theory-base. In this paper, the author hypothesised and tested the influences of culture on users' service quality perceptions and found strong empirical support for the study's hypotheses. The results of this study indicate that, as a result of their cultural characteristics, users vary in both their overall service quality perceptions and their perceptions of each of the four dimensions of ZOT service quality.

Relevance: 10.00%

Abstract:

To date, much work has been done to examine the ways in which information literacy (IL) – a way of thinking about, existing alongside and working with information – functions in an academic setting. However, its role in the non-academic library professions has been largely ignored. Given that the public librarian is responsible for designing and delivering services and programmes aimed at supporting the information literacy needs of the community at large, there is great value to be had from examining the ways in which public librarians understand and experience IL. The research described in this paper investigates, through the use of phenomenography, the ways in which public librarians understand and experience the concept of information literacy.

Relevance: 10.00%

Abstract:

In this paper I analyse UK artist Alison Jones’ sonic interventions Portrait of the Artist by Proxy (2008), Voyeurism by Proxy (2008) and Art, Lies and Audio Tapes (2009). In Portrait of the Artist by Proxy, Jones – who, due to deteriorating vision, has not seen her reflection in a mirror in years – asks and trusts participants to audio-describe her own image back to her. In Voyeurism by Proxy, Jones asks participants to audio-describe erotic drawings by Gustav Klimt. In Art, Lies and Audio Tapes, Jones asks participants to audio-describe other artworks, such as W.F. Yeames’ And When Did You Last See Your Father?. In these portraits by proxy, Jones opens her image, and other images, to interpretation. In doing so, she draws attention to the way sight is privileged as a mode of access to fixed, fundamental truths in Western culture – a mode assumed to be untainted by filters that skew perception of the object. “In a culture where vision is by far the dominant sense,” Jones says, “and as a visual artist with a visual impairment, I am reliant on audio-description … Inevitably, there are limitations imposed by language, time and the interpreter’s background knowledge of the subject viewed, as well as their personal bias of what is deemed important to impart in their description”. In these works, Jones strips these background knowledges, biases and assumptions bare. She reveals different perceptions, as well as tendencies to censor, edit or exaggerate descriptions. In this paper, I investigate how, by revealing unconscious biases, Jones’ works render both herself and her participants vulnerable to a change of perception. I also examine how Jones’ later editing of the audio-descriptions allows her to show the instabilities of sight and, in Portrait of the Artist by Proxy, to reclaim authorship of her own image.

Relevance: 10.00%

Abstract:

In a resource-constrained business world, strategic choices must be made about process improvement and service delivery. There are calls for more agile forms of enterprise, and much effort is being directed at moving organizations from a complex landscape of disparate application systems to an integrated and flexible enterprise that accesses complex systems landscapes through a service-oriented architecture (SOA). This paper describes the analysis of strategies to detect supporting business services. These services can then be delivered in a variety of ways: web services, new application services or outsourced services. The focus of this paper is on strategy analysis to identify those strategies that are common to lines of business and thus can be supported through shared services. A case study of a state government is used to show the analytical method and the detection of shared strategies.
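The core of the strategy analysis - finding strategies common to several lines of business - can be illustrated with a simple grouping sketch. The lines of business and strategy names below are invented for illustration, not taken from the paper's case study, and the actual analysis in the paper is qualitative:

```python
from collections import defaultdict

def shared_strategies(lob_strategies, min_lobs=2):
    """Group strategies by the lines of business that rely on them;
    strategies appearing in at least `min_lobs` lines of business are
    candidates for delivery as shared services."""
    users = defaultdict(set)
    for lob, strategies in lob_strategies.items():
        for s in strategies:
            users[s].add(lob)
    return {s: sorted(lobs) for s, lobs in users.items() if len(lobs) >= min_lobs}
```

For example, if Health, Transport and Education lines of business all list "online payments" among their strategies, it surfaces as a shared-service candidate, while a strategy unique to one line of business is filtered out.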

Relevance: 10.00%

Abstract:

Human SSB1 (single-stranded binding protein 1 [hSSB1]) was recently identified as a part of the ataxia telangiectasia mutated (ATM) signaling pathway. To investigate hSSB1 function, we performed tandem affinity purifications of hSSB1 mutants mimicking the unphosphorylated and ATM-phosphorylated states. Both hSSB1 mutants copurified a subset of Integrator complex subunits and the uncharacterized protein LOC58493/c9orf80 (henceforth minute INTS3/hSSB-associated element [MISE]). The INTS3–MISE–hSSB1 complex plays a key role in ATM activation and RAD51 recruitment to DNA damage foci during the response to genotoxic stresses. These effects on the DNA damage response are caused by the control of hSSB1 transcription via INTS3, demonstrating a new network controlling hSSB1 function.

Relevance: 10.00%

Abstract:

Like a set of bookends, the cellular, molecular, and genetic changes at the beginnings of life mirror those of one of the most common causes of death—metastatic cancer. Epithelial to mesenchymal transition (EMT) is an important change in cell phenotype which allows the escape of epithelial cells from the structural constraints imposed by tissue architecture, and was first recognized by Elizabeth Hay in the early to mid-1980s to be a central process in early embryonic morphogenesis. Reversals of these changes, termed mesenchymal to epithelial transitions (METs), also occur and are important in tissue construction in normal development. Over the last decade, evidence has mounted for EMT as the means through which solid tissue epithelial cancers invade and metastasize. However, demonstrating this potentially rapid and transient process in vivo has proven difficult, and data connecting the relevance of this process to tumor progression are still somewhat limited and controversial. Evidence for an important role of MET in the development of clinically overt metastases is starting to accumulate, and model systems have been developed. This review details recent advances in the knowledge of EMT as it occurs in breast development and carcinoma, and in prostate cancer progression, and highlights the role that MET plays in cancer metastasis. Finally, perspectives from a clinical and translational viewpoint are discussed.

Relevance: 10.00%

Abstract:

This study examined whether physical, social, cultural and economic environmental factors are associated with obesogenic dietary behaviours and overweight/obesity among adults. Literature searches of databases (i.e. PubMed, CSA Illumina, Web of Science, PsycINFO) identified studies examining environmental factors and the consumption of energy, fat, fibre, fruit, vegetables, sugar-sweetened drinks, meal patterns and weight status. Twenty-eight studies were in scope; the majority (n = 16) were conducted in the USA. Weight status was consistently associated with the food environment; greater accessibility to supermarkets or less access to takeaway outlets was associated with a lower BMI or a lower prevalence of overweight/obesity. However, obesogenic dietary behaviours did not mirror these associations; mixed associations were found between the environment and obesogenic dietary behaviours. Living in a socioeconomically deprived area was the only environmental factor consistently associated with a number of obesogenic dietary behaviours. Associations between the environment and weight status are more consistent than those seen between the environment and dietary behaviours. The environment may play an important role in the development of overweight/obesity; however, the dietary mechanisms that contribute to this remain unclear, and the physical activity environment may also play an important role in weight gain, overweight and obesity.

Relevance: 10.00%

Abstract:

Orlando (Sally Potter, 1992) is a significant filmic achievement: in only ninety minutes it offers a rich, layered, and challenging account of a life lived across four hundred years, across two sexes and genders, and across multiple countries and cultures. Already established as a feminist artist, Potter aligns herself with a genealogy of feminist art by adapting Virginia Woolf’s Orlando: A Biography (1928) to tell the story of Orlando: a British subject who must negotiate their “identity” while living a strangely long time and, also somewhat strangely, changing biological sex from male to female. Both novel and film interrogate norms of gender and culture. They each take up issues of sex, gender, and sexuality as socially-constructed phenomena rather than as “essential truths”, and Orlando’s attempts to tell his/her story and make sense of his/her life mirror readers’ attempts to understand and interpret Orlando’s journey within inherited artistic traditions.