871 results for Technology Readiness Level


Relevance:

20.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states, and over 1000 single-base RTK systems, supporting precise positioning applications for surveying, mining, agriculture and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:

• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users' requests (reverse RTK)

These four challenges translate into two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using Grid Computing, in particular for large data-processing burdens and complex computation algorithms.
This research proposes a Grid Computing based NRTK framework, layered as follows: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software was adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computing. System performance was analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, while some aspects of the system's performance remain to be improved in future work.
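The abstract names two ingredients, acquiring RTCM streams via Ntrip and scheduling work across Grid nodes, without giving implementation detail. The sketch below is therefore only a hedged illustration: Ntrip casters speak an HTTP-like protocol and answer a mountpoint request with an RTCM byte stream, and the round-robin scheduler, mountpoint name and credentials here are assumptions for illustration, not the thesis's actual design.

```python
import base64
from itertools import cycle

def ntrip_request(mountpoint, user, password):
    """Build an NTRIP v1 client request for one caster mountpoint.

    NTRIP is HTTP-like: the caster replies "ICY 200 OK" and then
    streams raw RTCM frames until the connection is closed.
    """
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    return (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        "User-Agent: NTRIP SimpleClient/1.0\r\n"
        f"Authorization: Basic {auth}\r\n"
        "\r\n"
    )

def schedule_stations(stations, grid_nodes):
    """Round-robin assignment of reference-station streams to Grid nodes,
    a stand-in for the framework's service-layer scheduler."""
    assignment = {node: [] for node in grid_nodes}
    for station, node in zip(stations, cycle(grid_nodes)):
        assignment[node].append(station)
    return assignment
```

In a real deployment the request would be written to a TCP socket connected to the caster, and each node's station list would drive a separate RTK computing job.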


In recent years, practitioners and researchers alike have turned their attention to knowledge management (KM) in order to increase organisational performance (OP). As a result, many different approaches and strategies have been investigated and suggested for how knowledge should be managed to make organisations more effective and efficient. However, most research has been undertaken in the for-profit sector, with only a few studies focusing on the benefits nonprofit organisations might gain by managing knowledge. This study broadly investigates the impact of knowledge management on the organisational performance of nonprofit organisations. Organisational performance can be evaluated through either financial or non-financial measurements. In order to evaluate knowledge management and organisational performance, non-financial measurements are argued to be more suitable, given that knowledge is an intangible asset which often cannot be expressed through financial indicators. Non-financial measurement concepts of performance, such as the balanced scorecard or the concept of Intellectual Capital (IC), are well accepted and used within the for-profit and nonprofit sectors to evaluate organisational performance. This study utilised the concept of IC as the method to evaluate KM and OP in the context of nonprofit organisations, due to the close link between KM and IC. KM is concerned with managing the KM processes of creating, storing, sharing and applying knowledge, and with the organisational KM infrastructure, such as organisational culture or organisational structure, that supports these processes. IC, on the other hand, measures the knowledge stocks at different ontological levels: at the individual level (human capital), at the group level (relational capital) and at the organisational level (structural capital). In other words, IC measures the value of the knowledge which has been managed through KM.
As KM encompasses the different KM processes and the KM infrastructure facilitating these processes, previous research has investigated the relationship between KM infrastructure and KM processes. Organisational culture, organisational structure and the level of IT support have been identified as the main factors of the KM infrastructure influencing the KM processes of creating, storing, sharing and applying knowledge. Other research has focused on the link between KM and OP or organisational effectiveness. Based on existing literature, a theoretical model was developed to enable the investigation of the relation between KM (encompassing KM infrastructure and KM processes) and IC. The model assumes an association between KM infrastructure and KM processes, as well as an association between KM processes and the various levels of IC (human capital, structural capital and relational capital). As a result, five research questions (RQ) concerning the various factors of the KM infrastructure, and the relationship between KM infrastructure and IC, were raised and included in the research model:

RQ 1: Do nonprofit organisations which have a Hierarchy culture have stronger IT support than nonprofit organisations which have an Adhocracy culture?
RQ 2: Do nonprofit organisations which have a centralised organisational structure have stronger IT support than nonprofit organisations which have a decentralised organisational structure?
RQ 3: Do nonprofit organisations which have stronger IT support have a higher value of Human Capital than nonprofit organisations which have weaker IT support?
RQ 4: Do nonprofit organisations which have stronger IT support have a higher value of Structural Capital than nonprofit organisations which have weaker IT support?
RQ 5: Do nonprofit organisations which have stronger IT support have a higher value of Relational Capital than nonprofit organisations which have weaker IT support?
In order to investigate the research questions, measurements for IC were developed which were linked to the main KM processes. The final KM/IC model contained four items for evaluating human capital, five items for evaluating structural capital and four items for evaluating relational capital. The research questions were investigated through empirical research using a case study approach, with the focus on two nonprofit organisations providing trade promotion services through local offices worldwide. Data for the investigation of the assumptions were collected via qualitative as well as quantitative research methods. The qualitative study included interviews with representatives of the two participating organisations as well as in-depth document research. The purpose of the qualitative study was to investigate the factors of the KM infrastructure (organisational culture, organisational structure, IT support) of the organisations and how these factors were related to each other. The quantitative study, on the other hand, was carried out through an online survey amongst staff of the various local offices. The purpose of the quantitative study was to investigate what impact the level of IT support, as the main instrument of the KM infrastructure, had on IC. Overall, several key themes emerged from the study:

• Knowledge Management and Intellectual Capital are complementary to each other, which should be expressed through measurements of IC based on KM processes.
• The various factors of the KM infrastructure (organisational culture, organisational structure and level of IT support) are interdependent.
• IT was a primary instrument through which the different KM processes (creating, storing, sharing and applying knowledge) were performed.
• A high level of IT support was evident where participants reported higher levels of IC (human capital, structural capital and relational capital).
The study supported previous research in the field of KM and replicated the findings from other case studies in this area. The study also contributed to theory by placing KM research within the nonprofit context and analysing the linkage between KM and IC. From the managerial perspective, the findings give clear indications that would allow interested parties, such as nonprofit managers or consultants, to understand more about the implications of KM for OP and to use this knowledge to implement efficient and effective KM strategies within their organisations.


Neurodegenerative disorders are heterogeneous in nature and include a range of ataxias with oculomotor apraxia, which are characterised by a wide variety of neurological and ophthalmological features. This family includes recessive and dominant disorders. A subfamily of autosomal recessive cerebellar ataxias is characterised by defects in the cellular response to DNA damage. These include the well-characterised disorders Ataxia-Telangiectasia (A-T) and Ataxia-Telangiectasia Like Disorder (A-TLD), the recently identified diseases Spinocerebellar Ataxia with Axonal Neuropathy Type 1 (SCAN1) and Ataxia with Oculomotor Apraxia Type 2 (AOA2), as well as the subject of this thesis, Ataxia with Oculomotor Apraxia Type 1 (AOA1). AOA1 is caused by mutations in the APTX gene, which is located at chromosomal locus 9p13. This gene codes for the 342 amino acid protein Aprataxin. Mutations in APTX destabilise Aprataxin; AOA1 is therefore a result of Aprataxin deficiency. Aprataxin has three functional domains: an N-terminal Forkhead Associated (FHA) phosphoprotein interaction domain, a central Histidine Triad (HIT) nucleotide hydrolase domain and a C-terminal C2H2 zinc finger. Aprataxin's FHA domain has homology to the FHA domain of the DNA repair protein 5' polynucleotide kinase 3' phosphatase (PNKP). PNKP interacts with a range of DNA repair proteins via its FHA domain and plays a critical role in processing damaged DNA termini. The presence of this domain together with a nucleotide hydrolase domain and a DNA binding motif suggested that Aprataxin may be involved in DNA repair and that AOA1 may be caused by a DNA repair deficit. This was substantiated by the interaction of Aprataxin with proteins involved in the repair of both single and double strand DNA breaks (X-Ray Cross-Complementing 1 (XRCC1), XRCC4 and Poly-ADP Ribose Polymerase-1 (PARP-1)) and by the hypersensitivity of AOA1 patient cell lines to single and double strand break inducing agents.
At the commencement of this study little was known about the in vitro and in vivo properties of Aprataxin. Initially this study focused on the generation of recombinant Aprataxin proteins to facilitate examination of the in vitro properties of Aprataxin. Using recombinant Aprataxin proteins I found that Aprataxin binds to double-stranded DNA. Consistent with a role for Aprataxin as a DNA repair enzyme, this binding is not sequence specific. I also report that the HIT domain of Aprataxin hydrolyses adenosine derivatives and, interestingly, that this activity is competitively inhibited by DNA. This provided initial evidence that DNA binds to the HIT domain of Aprataxin, and the interaction of DNA with the nucleotide hydrolase domain suggested that Aprataxin may be a DNA-processing factor. Following these studies, Aprataxin was found to hydrolyse 5'-adenylated DNA, which can be generated by unscheduled ligation at DNA breaks with non-standard termini. I found that cell extracts from AOA1 patients have no DNA-adenylate hydrolase activity, indicating that Aprataxin is the only DNA-adenylate hydrolase in mammalian cells. I further characterised this activity by examining the contribution of the zinc finger and FHA domains to DNA-adenylate hydrolysis by the HIT domain. Deletion of the zinc finger ablated the activity of the HIT domain against adenylated DNA, indicating that the zinc finger may be required for the formation of a stable enzyme-substrate complex. Deletion of the FHA domain stimulated DNA-adenylate hydrolysis, indicating that the activity of the HIT domain may be regulated by the FHA domain. Given that the FHA domain is involved in protein-protein interactions, I propose that the activity of Aprataxin's HIT domain may be regulated by proteins which interact with its FHA domain.
I examined this possibility by measuring the DNA-adenylate hydrolase activity of extracts from cells deficient for the Aprataxin-interacting DNA repair proteins XRCC1 and PARP-1. XRCC1 deficiency did not affect Aprataxin activity, but I found that Aprataxin is destabilised in the absence of PARP-1, resulting in a deficiency of DNA-adenylate hydrolase activity in PARP-1 knockout cells. This implies a critical role for PARP-1 in the stabilisation of Aprataxin. Conversely, I found that PARP-1 is destabilised in the absence of Aprataxin. PARP-1 is a central player in a number of DNA repair mechanisms, which implies that AOA1 cells not only lack Aprataxin but may also have defects in PARP-1-dependent cellular functions. Based on this, I identified a defect in a PARP-1-dependent DNA repair mechanism in AOA1 cells. Additionally, I identified elevated levels of oxidised DNA in AOA1 cells, indicative of a defect in Base Excision Repair (BER). I attribute this to the reduced level of the BER protein Apurinic Endonuclease 1 (APE1) that I identified in Aprataxin-deficient cells. This study has identified and characterised multiple DNA repair defects in AOA1 cells, indicating that Aprataxin deficiency has far-reaching cellular consequences. Consistent with the literature, I show that Aprataxin is a nuclear protein with nucleoplasmic and nucleolar distribution. Previous studies have shown that Aprataxin interacts with the nucleolar rRNA processing factor nucleolin and that AOA1 cells appear to have a mild defect in rRNA synthesis. Given the nucleolar localisation of Aprataxin, I examined its protein-protein interactions and found that Aprataxin interacts with a number of rRNA transcription and processing factors. On this basis I proposed that Aprataxin may have an alternative role in the nucleolus.
I therefore examined the transcriptional activity of Aprataxin-deficient cells using nucleotide analogue incorporation. I found that AOA1 cells do not display a defect in basal levels of RNA synthesis; however, they display defective transcriptional responses to DNA damage. In summary, this thesis demonstrates that Aprataxin is a DNA repair enzyme responsible for the repair of adenylated DNA termini and that it is required for the stabilisation of at least two other DNA repair proteins. Thus not only do AOA1 cells have no Aprataxin protein or activity, they also have deficiencies in PARP-1- and APE1-dependent DNA repair mechanisms. I additionally demonstrate DNA-damage-inducible transcriptional defects in AOA1 cells, indicating that Aprataxin deficiency confers a broad range of cellular defects and highlighting the complexity of the cellular response to DNA damage. My detailed characterisation of the cellular consequences of Aprataxin deficiency provides an important contribution to our understanding of interlinking DNA repair processes.


Transmissible diseases are re-emerging as a global problem, with Sexually Transmitted Diseases (STDs) becoming endemic. Chlamydia trachomatis is the leading cause of bacterial STDs worldwide, with the Australian cost of infection estimated at $90–$160 million annually. Studies using animal models of genital tract Chlamydia infection suggested that the hormonal status of the genital tract epithelium at the time of exposure may influence the outcome of infection. Oral contraceptive use has also been found to increase the risk of contracting chlamydial infection compared to women not using contraception. Generally, it has been suggested that the outcome of chlamydial infection is determined in part by the hormonal status of the epithelium at the time of exposure. Using the human endometrial cell line ECC-1, this study investigated the effects of C. trachomatis serovar D infection, in conjunction with the female sex hormones 17β-estradiol and progesterone, on chlamydial gene expression. While previous studies have examined the host response, this is the first study to examine C. trachomatis gene expression under different hormonal conditions. We have outlined a basic model of C. trachomatis gene regulation in the presence of steroid hormones by identifying 60 genes that were regulated by the addition of estradiol and/or progesterone. In addition, the third chapter of this thesis discussed and compared the significance of the current findings in the context of data from other research groups, to improve our understanding of the molecular basis of chlamydial persistence under different hormonal conditions. This study also analysed the effects of these female sex hormones and C. trachomatis serovar D infection on host susceptibility and bacterial growth.
Our results clearly demonstrated that the addition of steroid hormones had a great impact not only on the infectivity of epithelial cells with C. trachomatis serovar D, but also on the morphology of chlamydial inclusions.


Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research explores the viability of a universal conception of SQ, primarily through a careful revisitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model, SERVQUAL, which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, fails to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multi-dimensional and multi-level; this hierarchical approach to SQ measurement better reflects how customers form perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalisable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfil the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:

• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level, encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’ so as to better encompass both physical and virtual settings (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively; that is, customers evaluate each primary dimension based on the corresponding sub-dimension. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalisation of the IS-Impact model, that there is value in alternatively conceptualising the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overhead. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
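The difference between the two measurement approaches can be shown with a small sketch. The dimension names and Likert-style item scores below are invented for illustration: the disconfirmation approach scores each item as perception minus expectation, while the perceptions-only (SERVPERF-style) approach averages perceptions alone.

```python
from statistics import mean

# Illustrative 7-point Likert item scores for two SQ dimensions.
perceptions = {"reliability": [6, 5, 6], "responsiveness": [4, 5, 4]}
expectations = {"reliability": [7, 6, 7], "responsiveness": [5, 5, 6]}

def disconfirmation_scores(perc, expect):
    """Gap (disconfirmation) score per dimension: mean(P - E).
    Negative gaps flag dimensions falling short of expectations."""
    return {d: mean(p - e for p, e in zip(perc[d], expect[d])) for d in perc}

def perceptions_only_scores(perc):
    """Perceptions-only score per dimension: mean(P) alone."""
    return {d: mean(perc[d]) for d in perc}
```

The gap scores pinpoint shortfall areas, whereas the perceptions-only scores give the simpler overall evaluation described above.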


The shift from 20th century mass communications media towards convergent media and Web 2.0 has raised the possibility of a renaissance of the public sphere, based around citizen journalism and participatory media culture. This paper evaluates such claims both conceptually and empirically. At a conceptual level, it is noted that the question of whether media democratization is occurring depends in part upon how democracy is understood, with some critical differences in understandings of democracy, the public sphere and media citizenship. The empirical work in this paper draws upon various case studies of new developments in Australian media, including online-only newspapers, developments in public service media, and the rise of commercially based online alternative media. It is argued that participatory media culture is being expanded if understood in terms of media pluralism, but that the implications for the public sphere depend in part upon how media democratization is defined.


The over-representation of novice drivers in crashes is alarming. Driver training is one of the interventions aimed at mitigating the number of crashes involving young drivers. Experienced drivers have better hazard perception ability than inexperienced drivers, and eye gaze patterns have been found to be an indicator of a driver's competency level. The aim of this paper is to develop an in-vehicle system which correlates information about the driver's gaze with vehicle dynamics, and which is then used to assist driver trainers in assessing driving competency. The system allows visualisation of the complete driving manoeuvre data on interactive maps. It uses an eye tracker and perspective projection algorithms to compute the depth of gaze and plots it on Google Maps. The interactive map also features the trajectory of the vehicle and turn indicator usage. The system allows efficient and user-friendly analysis of the driving task, and can be used by driver trainers and trainees to understand objectively the risks encountered during driving manoeuvres. This paper presents a prototype that plots the driver's eye gaze depth and direction on an interactive map along with the vehicle dynamics information. The prototype will be used in future work to study the difference in gaze patterns between novice and experienced drivers prior to a given manoeuvre.
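The paper does not spell out its projection algorithm, so the following is only a hedged sketch of the final plotting step: turning a gaze direction and estimated gaze depth, together with the vehicle's position and heading, into a map coordinate. The function name, parameters and flat-earth approximation are assumptions for illustration, not the prototype's actual code.

```python
import math

def gaze_point_on_map(lat, lon, heading_deg, gaze_yaw_deg, gaze_depth_m):
    """Project the driver's gaze onto map coordinates.

    lat, lon      : vehicle position (degrees)
    heading_deg   : vehicle heading, clockwise from north
    gaze_yaw_deg  : gaze direction relative to the vehicle's axis
    gaze_depth_m  : estimated distance to the fixated point

    Uses a flat-earth approximation, adequate for the tens of
    metres a driver typically fixates ahead of the vehicle.
    """
    bearing = math.radians(heading_deg + gaze_yaw_deg)
    d_north = gaze_depth_m * math.cos(bearing)
    d_east = gaze_depth_m * math.sin(bearing)
    earth_r = 6_371_000.0  # mean Earth radius in metres
    dlat = math.degrees(d_north / earth_r)
    dlon = math.degrees(d_east / (earth_r * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

The returned coordinate could then be handed to a mapping API as a gaze marker alongside the vehicle trajectory.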


The Brisbane Media Map is both an online resource and a tertiary-level authentic learning project: an online database which provides a detailed overview of about 600 media industry organisations in Brisbane, Australia. In addition to providing contact details and a synopsis in each organisation's profile, the Brisbane Media Map includes supplementary information on current issues, trends, and individuals in the media and communication industry sectors. The resource is produced and updated annually by final-year undergraduate Media and Communication students. This article introduces the Brisbane Media Map, its functionality and systems design approach, as well as its alignment with key learning infrastructures. It examines authentic learning as the pedagogical framework underpinning the ongoing development of the resource and highlights some synergies between this framework and participatory design principles. The Brisbane Media Map is a useful example of an authentic learning approach that successfully engages students from non-traditional and non-design areas of study in human-computer interaction, usability, and participatory design activities.


INTRODUCTION

In their target article, Yuri Hanin and Muza Hanina outlined a novel multidisciplinary approach to performance optimisation for sport psychologists called the Identification-Control-Correction (ICC) programme. According to the authors, this empirically verified, psycho-pedagogical strategy is designed to improve the quality of coaching and the consistency of performance in highly skilled athletes, and involves a number of steps: (i) identifying and increasing self-awareness of 'optimal' and 'non-optimal' movement patterns for individual athletes; (ii) learning to deliberately control the process of task execution; and (iii) correcting habitual and random errors and managing radical changes of movement patterns. Although no specific examples were provided, the ICC programme has apparently been successful in enhancing the performance of Olympic-level athletes. In this commentary, we address what we consider to be some important issues arising from the target article. We specifically focus attention on the contentious topic of optimisation in neurobiological movement systems, the role of constraints in shaping emergent movement patterns, and the functional role of movement variability in producing stable performance outcomes. In our view, the target article and, indeed, the proposed ICC programme would benefit from a dynamical systems theoretical backdrop rather than the cognitive scientific approach that appears to be advocated. Although Hanin and Hanina made reference to, and attempted to integrate, constructs typically associated with dynamical systems accounts of motor control and learning (e.g., Bernstein's problem, movement variability), these ideas require more detailed elaboration, which we provide in this commentary.


The composition of many professional services firms in the Urban Development area has moved away from a discipline-specific 'silo' structure to a more multidisciplinary environment. The benefits of multidisciplinarity have been seen in industry through the synergies it provides across many of the related disciplines. Similarly, the Queensland University of Technology Bachelor of Urban Development degree has sought to broaden the knowledge base of students and achieve a greater level of synergy between related urban development disciplines through the introduction of generic and multidisciplinary units. This study aims to evaluate the effectiveness of delivering core property units in a multidisciplinary context. A comparative analysis was undertaken between core property units and more generic units offered in a multidisciplinary context, drawn from the introductory, intermediate and advanced years of the property program. The analysis was based on data collected from course performance surveys, student performance results and a student focus group, and was informed by a reflective process from the student perspective and by lecturer/tutor feedback. The study showed that there are many benefits associated with multidisciplinary unit offerings across the QUT Urban Development program, particularly in the more generic units. However, these units require a greater degree of management. It is more difficult to organise, teach and coordinate multidisciplinary student cohorts due to differences in prior knowledge and experience between the discipline groups. In addition, the interaction between lecturers/tutors and students frequently becomes more limited. A perception exists within the student body that this more limited face-to-face contact with academic staff is not valuable, a perception which may be exacerbated by the quality of complementary online teaching materials. For many academics, non-attendance at lectures was coupled with an increase in email communication.
From the limited data collected during the study, there appears to be no clear correlation between large multidisciplinary student classes and student academic performance or satisfaction.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

An emergent form of political economy, facilitated by information and communication technologies (ICTs), is widely propagated as the apotheosis of unmitigated social, economic, and technological progress. Meanwhile, throughout the world, social degradation and economic inequality are increasing logarithmically. Valued categories of thought are, axiomatically, the basic commodities of the “knowledge economy”. Language is its means of exchange. This paper proposes a sociolinguistic method with which to critically engage the hyperbole of the “Information Age”. The method is grounded in a systemic social theory that synthesises aspects of autopoiesis and Marxist political economy. A trade policy statement is analysed to exemplify the sociolinguistically created aberrations that are today most often construed as social and political determinants.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The importance of agriculture in many countries has tended to decline as their economies move from a resource base to a manufacturing base. Although the level of agricultural production in first-world countries has increased over the past two decades, this increase has generally been at a slower rate than in other sectors of those economies. Despite this growth in secondary and high-technology industries, developed countries have continued to encourage and support their agricultural industries through both tariffs and price support. Although the average farm property may require this support to maintain long-term production, the better farms can achieve production levels and commodity prices that make them competitive on a free-market basis. This paper will analyse the total return performance of UK farmland over the period 1981-2004, comparing the total return from rural properties in the UK with the performance of commercial property (total, office, retail and industrial), equities and gilts over this 24-year period. The analysis will be based on the IPD UK let land index and the IPD property index. The portfolio diversification and risk-reduction benefits of UK farmland will be highlighted. The analysis shows that rural property has negative correlations with equities and gilts, as well as insignificant positive correlations with retail, industrial and office property. Rural property therefore provides portfolio diversification benefits.
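The diversification argument in the abstract rests on pairwise correlations between annual asset-class returns. A minimal sketch of that calculation is shown below, using invented return figures (deliberately chosen so farmland moves against equities) rather than the actual IPD index data, which is not reproduced here:

```python
import numpy as np

# Hypothetical annual total returns (%) for illustration only --
# these are NOT the IPD UK let land / IPD property index figures.
farmland = np.array([4.0, 6.5, 3.2, 8.1, 5.5, 2.9, 7.4, 6.0])
equities = np.array([12.0, -5.0, 9.5, -2.0, 15.0, 11.0, -8.0, 6.0])
gilts    = np.array([7.0, 9.0, 4.5, 10.0, 3.0, 8.5, 11.0, 5.0])

# Pairwise Pearson correlations, the statistic used in
# diversification analysis; np.corrcoef returns a 2x2 matrix
# whose off-diagonal entry is the correlation coefficient.
corr_fe = np.corrcoef(farmland, equities)[0, 1]
corr_fg = np.corrcoef(farmland, gilts)[0, 1]
print(f"farmland vs equities: {corr_fe:+.2f}")
print(f"farmland vs gilts:    {corr_fg:+.2f}")
```

A negative or near-zero coefficient is what supports adding the asset to a mixed portfolio: its returns do not move in step with the other holdings, so portfolio-level volatility falls.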

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The importance of agriculture in many countries has tended to decline as their economies move from a resource base to a manufacturing base. Although the level of agricultural production in first-world countries has increased over the past two decades, this increase has generally been at a slower rate than in other sectors of those economies. Despite this growth in secondary and high-technology industries, developed countries have continued to encourage and support their agricultural industries through both tariffs and price support. Following pressure from developing economies, particularly through the World Trade Organisation (WTO), the GATT Uruguay Round and the Cairns Group, developed countries are now in various stages of winding back or de-coupling agricultural support within their economies. A major concern of farmers in protected agricultural markets is the impact of free-market trade in agricultural commodities on farm incomes, profitability and land values. This paper will analyse both the capital and income performance of the NSW rural land market over the period 1990-1999. The analysis will be based on several rural land use classifications and will compare the total return from rural properties based on the farm income generated both by the average farmer and by farmers in the top 20% of the various land use areas. The analysis will provide a comprehensive overview of rural production in a free trade economy.
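The "total return" combined from capital and income performance in this abstract is conventionally the income return plus the capital return, both measured against the opening capital value. A minimal sketch, with hypothetical figures (not drawn from the NSW dataset):

```python
def total_return(opening_value, closing_value, net_income):
    """Single-period total return: income return plus capital return,
    both expressed as a fraction of the opening capital value."""
    income_return = net_income / opening_value
    capital_return = (closing_value - opening_value) / opening_value
    return income_return + capital_return

# Hypothetical figures for one rural property (illustration only):
# $1.0m opening value, $1.05m closing value, $40k net farm income.
print(f"{total_return(1_000_000, 1_050_000, 40_000):.1%}")  # 9.0%
```

Comparing average-farmer income against top-20% income in the numerator is what drives the gap between the two return series the paper describes.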

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first-year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester, in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and the generic skills that will aid students throughout their undergraduate course and in their careers. This discussion relates to one of these four first-semester units, namely Building IT Systems. The aim of this unit is to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications. In the prior history of teaching introductory computer programming at QUT, programming was taught as a standalone subject, and the integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In effect, the teaching of the building blocks of computer applications has been compartmentalised, with each taught in isolation from the others. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students are equally capable of learning (Bruce et al., 2004), as is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996).
This introductory material has been taught in much the same way for the past thirty years. Over the period that introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted, in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester; some basics can be learnt, but it can take many years to master (Norvig, 2001). Faculty data typically show a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there is one group of students who understand and excel with the introductory material, while another group struggles to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first-year students. This attrition does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.
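The bimodal pattern described above — two peaks, one at high marks and one at low or failing marks, rather than a single bell curve — can be made concrete with simulated data. The cohort size, cluster means and spreads below are invented purely for illustration, not taken from the faculty data the abstract refers to:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated final marks for a cohort of 200 students, drawn from two
# clusters to mimic the bimodal result distribution described above.
# All parameters here are hypothetical.
high_cluster = rng.normal(loc=78, scale=8, size=120)
low_cluster = rng.normal(loc=38, scale=10, size=80)
marks = np.clip(np.concatenate([high_cluster, low_cluster]), 0, 100)

# A text histogram makes the two peaks visible in a terminal.
counts, edges = np.histogram(marks, bins=10, range=(0, 100))
for lo, n in zip(edges[:-1], counts):
    print(f"{int(lo):3d}-{int(lo) + 9:3d}: {'#' * n}")
```

In a unimodal cohort the bar lengths would rise to one peak and fall; here they rise, dip in the middle bands, and rise again, which is the signature that motivated the curriculum redesign.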

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The importance of agriculture in many countries has tended to decline as their economies move from a resource base to a manufacturing base. Although the level of agricultural production in first-world countries has increased over the past two decades, this increase has generally been at a slower rate than in other sectors of those economies. Despite this growth in secondary and high-technology industries, developed countries have continued to encourage and support their agricultural industries through both tariffs and price support. Following pressure from developing economies, particularly through the World Trade Organisation (WTO), the GATT Uruguay Round and the Cairns Group, developed countries are now in various stages of winding back or de-coupling agricultural support within their economies. A major concern of farmers in protected agricultural markets is the impact of free-market trade in agricultural commodities on farm incomes and land values. This paper will analyse the capital and income performance of the NSW rural land market over the period 1990-1999. The analysis will be based on land use and will compare the total return from rural properties based on world agricultural commodity prices.