Abstract:
This study investigates variation in IT professionals' experience of ethics with a view to enhancing their formation and support. This is explored through an examination of the experience of IT, IT professional ethics and IT professional ethics education. The study's principal contribution is the empirical study and description of IT professionals' experience of ethics. The empirical phase is preceded by a review of conceptions of IT and followed by an application of the findings to IT education. The study's empirical findings are based on 30 semi-structured interviews with IT professionals who represent a wide range of demographics, experience and IT sub-disciplines. Their experience of ethics is depicted as five citizenships: Citizenship of my world, Citizenship of the corporate world, Citizenship of a shared world, Citizenship of the client's world and Citizenship of the wider world. These signify an expanding awareness, which progressively accords rights to others and defines responsibility in terms of others. The empirical findings inform a Model of Ethical IT. This maps an IT professional space increasingly oriented towards others. Such a model provides a conceptual tool, available to prompt discussion and reflection, which may be employed in pursuing formation aimed at experiential change. Its usefulness for the education of IT professionals with respect to ethics is explored. The research approach employed in this study is phenomenography. This method seeks to elicit and represent variation of experience. It understands experience as a relationship between a subject (IT professionals) and an object (ethics), and describes this relationship in terms of its foci and boundaries. The study's findings culminate in three observations that change is indicated in the formation and support of IT professionals in: 1. IT professionals' experience of their discipline, moving towards a focus on information users; 2. 
IT professionals' experience of professional ethics, moving towards the adoption of other-centred attitudes; and 3. IT professionals' experience of professional development, moving towards an emphasis on a change in lived experience. Based on these results, employers, educators and professional bodies may want to evaluate how they approach professional formation and support, if they aim to promote a comprehensive awareness of ethics in IT professionals.
Abstract:
The purpose of this study is, first, to demonstrate the appropriateness of “Japanese Manufacturing Management” (JMM) strategies in the Asian, ASEAN and Australasian automotive sectors. Second, the study assesses JMM as a prompt, effective and efficient global manufacturing management practice from which automotive manufacturing companies can learn, benchmark best practice, acquire product and process innovation, and enhance their capabilities and capacities. In this study, the philosophies, systems and tools that have been adopted in various automotive manufacturing assembly plants and their tier 1 suppliers in the three Regions were examined. A number of top and middle managers in these companies, located in Thailand, Indonesia, Malaysia, Singapore, the Philippines, Viet Nam and Australia, were interviewed using a qualitative methodology. The results confirmed that the six pillars of JMM (culture change, quality at the shop floor, consensus, incremental continual improvement, benchmarking, and backward-forward integration) are key enablers of success in adopting JMM in both the automotive and other manufacturing sectors in the three Regions. The analysis and on-site interviews identified a number of recommendations that were validated by the automotive manufacturing companies' managers as the most functional JMM strategies.
Abstract:
A key concern organisations face is how to incorporate Internet tools into their marketing communications mix. Where and how should companies invest their human, technological and financial resources? This paper explores a subset of this problem, online complaining and electronic customer service. It applies diffusion of innovation as a theoretical framework to investigate organisational implementation of email technology and explain the outcome of annual customer service surveys in 2001, 2002 and 2003. The results add to the small body of research on electronic service recovery by extending diffusion of innovations to email service recovery and underscoring the importance of adoption phases, particularly for SMEs. Larger companies provide more channels for submitting complaints, which represents an early phase of adoption. There was little difference in how large and small companies respond to online complaints, a later phase of adoption.
Abstract:
Denial-of-service attacks (DoS) and distributed denial-of-service attacks (DDoS) attempt to temporarily disrupt users or computer resources to cause service unavailability to legitimate users in the internetworking system. The most common type of DoS attack occurs when adversaries flood a large amount of bogus data to interfere with or disrupt the service on the server. The attack can be either a single-source attack, which originates at only one host, or a multi-source attack, in which multiple hosts coordinate to flood a large number of packets to the server. Cryptographic mechanisms in authentication schemes are an example approach to help the server validate malicious traffic. Since authentication in key establishment protocols requires the verifier to spend some resources before successfully detecting bogus messages, adversaries might be able to exploit this flaw to mount an attack that overwhelms the server's resources. The attacker is able to perform this kind of attack because many key establishment protocols incorporate strong authentication at the beginning phase, before they can identify the attacks. This is an example of the DoS threat in most key establishment protocols: they have been implemented to support confidentiality and data integrity, but do not carefully consider other security objectives, such as availability. The main objective of this research is to design denial-of-service-resistant mechanisms for key establishment protocols. In particular, we focus on the design of cryptographic protocols related to key establishment protocols that implement client puzzles to protect the server against resource exhaustion attacks. Another objective is to extend formal analysis techniques to include DoS resistance. Basically, the formal analysis approach is used not only to analyse and verify the security of a cryptographic scheme carefully, but also to help in the design stage of new protocols with a high level of security guarantee. 
In this research, we focus on an analysis technique based on Meadows' cost-based framework, and we implement a DoS-resistance model using Coloured Petri Nets. Meadows' cost-based framework is proposed directly to assess denial-of-service vulnerabilities in cryptographic protocols using mathematical proof, while Coloured Petri Nets are used to model and verify communication protocols using interactive simulations. In addition, Coloured Petri Nets are able to help the protocol designer clarify and reduce inconsistencies in the protocol specification. Therefore, the second objective of this research is to explore vulnerabilities in existing DoS-resistant protocols, as well as to extend a formal analysis approach to our new framework for improving DoS resistance and evaluating the performance of the newly proposed mechanism. In summary, the specific outcomes of this research include the following results: 1. A taxonomy of denial-of-service-resistant strategies and techniques used in key establishment protocols; 2. A critical analysis of existing DoS-resistant key exchange and key establishment protocols; 3. An implementation of Meadows's cost-based framework using Coloured Petri Nets for modelling and evaluating DoS-resistant protocols; and 4. The development of new efficient and practical DoS-resistant mechanisms to improve resistance to denial-of-service attacks in key establishment protocols.
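The client-puzzle defence described in this abstract can be illustrated with a minimal hash-reversal puzzle. This is a hedged sketch of the general technique only, not the thesis's specific mechanism; the function names and the `difficulty_bits` parameter are assumptions for illustration.

```python
import hashlib
import os

def issue_puzzle() -> bytes:
    # Server side: a fresh random nonce, cheap to generate.
    return os.urandom(16)

def solve_puzzle(nonce: bytes, difficulty_bits: int) -> int:
    # Client side: brute-force a counter until SHA-256(nonce || counter)
    # begins with `difficulty_bits` zero bits. Expected cost grows as
    # 2**difficulty_bits, so the client pays before the server commits resources.
    counter = 0
    while True:
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0:
            return counter
        counter += 1

def verify_solution(nonce: bytes, counter: int, difficulty_bits: int) -> bool:
    # Server side: verification is a single hash, so the verifier spends
    # almost nothing before deciding whether to continue the protocol run.
    digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0
```

The asymmetry between `solve_puzzle` and `verify_solution` is the point: it rebalances the cost of authentication that the abstract identifies as the exploitable flaw.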
Abstract:
This program of research examines the experience of chronic pain in a community sample. While it is clear that, as in patient samples, chronic pain in non-patient samples is associated with psychological distress and physical disability, the experience of pain across the total spectrum of pain conditions (including acute and episodic pain conditions) and during the early course of chronic pain is less clear. Information about these aspects of the pain experience is important because effective early intervention for chronic pain relies on identification of people who are likely to progress to chronicity post-injury. A conceptual model of the transition from acute to chronic pain was proposed by Gatchel (1991a). In brief, Gatchel’s model describes three stages that individuals who have a serious pain experience move through, each with worsening psychological dysfunction and physical disability. The aims of this program of research were to describe the experience of pain in a community sample in order to obtain pain-specific data on the problem of pain in Queensland, and to explore the usefulness of Gatchel’s Model in a non-clinical sample. Additionally, five risk factors and six protective factors were proposed as possible extensions to Gatchel’s Model. To address these aims, a prospective longitudinal mixed-method research design was used. Quantitative data were collected in Phase 1 via a comprehensive postal questionnaire. Phase 2 consisted of a follow-up questionnaire 3 months post-baseline. Phase 3 consisted of semi-structured interviews with a subset of the original sample 12 months post follow-up, which used qualitative data to provide a further in-depth examination of the experience and process of chronic pain from respondents’ point of view. The results indicate chronic pain is associated with high levels of anxiety and depressive symptoms. 
However, the levels of disability reported by this Queensland sample were generally lower than those reported by clinical samples and consistent with disability data reported in a New South Wales population-based study. With regard to the second aim of this program of research, while some elements of the pain experience of this sample were consistent with that described by Gatchel’s Model, overall the model was not a good fit with the experience of this non-clinical sample. The findings indicate that passive coping strategies (minimising activity), catastrophising, self-efficacy, optimism, social support, active strategies (use of distraction) and the belief that emotions affect pain may be important to consider in understanding the processes that underlie the transition to and continuation of chronic pain.
Abstract:
This paper presents research findings about the use of remote desktop applications to teach music sequencing software. It highlights the successes, shortcomings and interactive issues encountered during a pilot project with a theoretical focus on a specific interactive bottleneck. The paper proposes a new delivery and partnership model to widen this bottleneck, which currently hinders interactions between the technical support, education and professional development communities in music technology.
Abstract:
With the advent of Service Oriented Architecture, Web Services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirement of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods to improve the accuracy of Web service discovery to match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used in describing the services, as well as the use of input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individual matched services should fully satisfy the user’s requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology has been proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content present in the Web service description language document, the support-based latent semantic kernel is constructed using an innovative concept of binning and merging on the large quantity of text documents covering diverse areas of domain of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of the query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. 
Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost for traversal. The third phase, system integration, integrates the results from the preceding two phases by using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations including individual and composite Web services to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of the standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 Web services that are found in phase-I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase-I) and the link analysis (phase-II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
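The link analysis phase described above relies on an all-pairs shortest-path computation over a graph of services. The abstract does not name the algorithm used, so the following is a hedged sketch using the classic Floyd-Warshall algorithm, with nodes standing for Web services and edge weights for hypothetical linking costs.

```python
from typing import List, Tuple

def all_pairs_shortest_paths(n: int, edges: List[Tuple[int, int, float]]) -> List[List[float]]:
    # Floyd-Warshall: after the triple loop, dist[i][j] holds the minimum
    # total cost of chaining services from node i to node j;
    # float("inf") marks pairs that cannot be composed.
    INF = float("inf")
    dist = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        dist[u][v] = min(dist[u][v], w)  # keep the cheapest direct link
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```

For example, with services 0→1 (cost 1), 1→2 (cost 2) and a direct but more expensive link 0→2 (cost 5), the computed optimum composition 0→1→2 costs 3, illustrating how chaining two services can beat a single direct match.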
Abstract:
In this article I outline and demonstrate a synthesis of the methods developed by Lemke (1998) and Martin (2000) for analyzing evaluations in English. I demonstrate the synthesis using examples from a 1.3-million-word technology policy corpus drawn from institutions at the local, state, national, and supranational levels. Lemke's (1998) critical model is organized around the broad 'evaluative dimensions' that are deployed to evaluate propositions and proposals in English. Martin's (2000) model is organized with a more overtly systemic-functional orientation around the concept of 'encoded feeling'. In applying both these models at different times, whilst recognizing their individual usefulness and complementarity, I found specific limitations that led me to work towards a synthesis of the two approaches. I also argue for the need to consider genre, media, and institutional aspects more explicitly when claiming intertextual and heteroglossic relations as the basis for inferred evaluations. A basic assertion made in this article is that the perceived Desirability of a process, person, circumstance, or thing is identical to its 'value'. But the Desirability of anything is a socially and thus historically conditioned attribution that requires significant amounts of institutional inculcation of other 'types' of value: appropriateness, importance, beauty, power, and so on. I therefore propose a method informed by critical discourse analysis (CDA) that sees evaluation as happening on at least four interdependent levels of abstraction.
Abstract:
In architecture courses, instilling a wider understanding of the industry-specific representations practiced in the Building Industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally, these mainly two-dimensional representations, such as plans, sections, elevations, schedules, etc., were produced manually, using a drawing board. Currently, this manual process has been digitised in the form of Computer Aided Design and Drafting (CADD) or, ubiquitously, simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent. Essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the Construction Industry. CAD is mostly a technical tool that conforms to existing industry practices. BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact Virtual Prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and finally, in the real world. There is, however, no established model for learning through the use of this technology in Architecture courses. 
Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping their graduates with skills that are relevant to industry. BIM skills are in increasing demand throughout the construction industry as industry practices evolve. As such, during the second half of 2008, QUT 4th year architectural students were formally introduced for the first time to BIM, as both a technology and as an industry practice. This paper will outline the teaching team’s experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provide a description of the learning model. The paper will present the results of a survey on the learners’ perspectives of both BIM and their learning experiences as they learn about and through this technology.
Abstract:
Organisations are increasingly investing in complex technological innovations such as enterprise information systems with the aim of improving the operations of the business, and in this way gaining competitive advantage. However, the implementation of technological innovations tends to have an excessive focus on either technology innovation effectiveness (also known as system effectiveness) or the resulting operational effectiveness; focusing on either one of them is detrimental to long-term enterprise benefits through failure to achieve the real value of technological innovations. The lack of research on the dimensions and performance objectives that organisations must focus on is the main reason for this misalignment. This research uses a combined qualitative and quantitative, three-stage methodological approach. Initial findings suggest that factors such as quality of information from technology innovation effectiveness, and quality and speed from operational effectiveness, are important and significantly well-correlated factors that promote the alignment between technology innovation effectiveness and operational effectiveness.
Abstract:
This chapter reports on Australian and Swedish experiences in the iterative design, development, and ongoing use of interactive educational systems we call ‘Media Maps.’ Like maps in general, Media Maps are usefully understood as complex cultural technologies; that is, they are not only physical objects, tools and artefacts, but also information creation and distribution technologies, the use and development of which are embedded in systems of knowledge and social meaning. Drawing upon Australian and Swedish experiences with one Media Map technology, this paper illustrates this three-layered approach to the development of media mapping. It shows how media mapping is being used to create authentic learning experiences for students preparing for work in the rapidly evolving media and communication industries. We also contextualise media mapping as a response to various challenges for curriculum and learning design in Media and Communication Studies that arise from shifts in tertiary education policy in a global knowledge economy.
Abstract:
The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. The research proposes a solution to address the difficulties associated with multi-step attack scenario specification and detection for such environments. The research has focused on two distinct problems: the representation of events derived from heterogeneous sources and multi-step attack specification and detection. The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models where low-level information may be discarded during the abstraction process, the event abstraction model presented in this work preserves all low-level information as well as providing high-level information in the form of abstract events. The event abstraction model presented in this work was designed independently of any particular IDS and thus may be used by any IDS, intrusion forensic tools, or monitoring tools. The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect as they often involve the correlation of events from multiple sources which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism by using variable instantiation where variables represent events as defined in the event abstraction model. The third part of the research looks into the solution to address time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios which involve logs from multiple hosts. 
Issues involving time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing time uncertainty issues: clock skew compensation and clock drift modelling using linear regression. An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed based on the Python programming language syntax, and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs and a synthetic dataset. The distinct feature of the public dataset is that it contains multi-step attacks which involve multiple hosts with clock skew and clock drift. These features allow us to demonstrate the application and the advantages of the contributions of this research. All instances of multi-step attacks in the dataset have been correctly identified even though there exists significant clock skew and drift in the dataset. Future work identified by this research would be to develop a refined unification algorithm suitable for processing streams of events to enable on-line detection. In terms of time uncertainty, identified future work would be to develop mechanisms which allow automatic clock skew and clock drift identification and correction. The immediate application of the research presented in this thesis is the framework of an off-line IDS which processes events from heterogeneous sources using abstraction and which can detect multi-step attack scenarios which may involve time uncertainty.
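The clock drift modelling via linear regression described above can be sketched with ordinary least squares, fitting a per-host offset as a linear function of time. The data layout (reference times paired with observed offsets) and the function names are assumptions for illustration, not the thesis's implementation.

```python
from typing import List, Tuple

def fit_clock_drift(ref_times: List[float], offsets: List[float]) -> Tuple[float, float]:
    # Ordinary least squares for offset(t) = skew + drift * t, where each
    # offset is the observed difference between a remote host's clock and
    # the reference clock at reference time t.
    n = len(ref_times)
    mean_t = sum(ref_times) / n
    mean_o = sum(offsets) / n
    num = sum((t - mean_t) * (o - mean_o) for t, o in zip(ref_times, offsets))
    den = sum((t - mean_t) ** 2 for t in ref_times)
    drift = num / den          # seconds of drift per second of reference time
    skew = mean_o - drift * mean_t  # constant offset at t = 0
    return skew, drift

def correct_timestamp(t: float, skew: float, drift: float) -> float:
    # Map a remote log timestamp back onto the reference timeline
    # (a first-order correction, adequate when drift is small).
    return t - (skew + drift * t)
```

With the fitted model, timestamps from different hosts' logs can be projected onto a common timeline before the unification-based matcher correlates them, which is what makes cross-host multi-step scenarios detectable despite skew and drift.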
Abstract:
Various reasons have been proffered for female under-representation in tertiary information technology (IT) courses and the IT industry, with most relating to cultural mores. The 2006 Geek Goddess calendar was designed to alter IT’s “geeky image” and the term is used here to represent young women enrolled in pre-service IT teaching courses. Their special mix of IT and teaching draws on conflicting stereotypes and represents a micro-climate which is typically lost in studies of IT occupations because of the aggregation of all IT roles. This paper will report on a small-scale investigation of female students (N=25) at a university in Queensland (Australia) studying to become teachers of secondary IT subjects. They are entering the IT industry, gendered as a “male” occupation, through the safe space of teaching a discipline allied to feminine qualities of nurturing. They are “geek goddesses” who, perhaps to balance the masculine and feminine of these occupations, have decided to go to school rather than into corporations or government.
Abstract:
This article reports on the impact on student personal creativity of a longitudinal study that had as its major goal the creation of a unique intervention program for elementary students. The intervention was based on the National Profile and Statement (Curriculum Corporation, 1994a, 1994b) for the curriculum area of technology. The intervention program comprised thematically based units of work that integrated all eight Australian Key Learning Areas (KLA). A pretest/posttest control group design investigation (Campbell & Stanley, 1963) was undertaken with 580 students from 7 schools and 24 class groups that were randomly divided into 3 treatment groups. One group (10 classes) formed the control group. Another 7 classes received the year-long intervention program, while the remaining 7 classes received the intervention, but with the added seamless integration of their available classroom computer technologies. The effect of the intervention on the personal creativity characteristics of the students involved in the study was assessed using the Creativity Checklist, an instrument that was developed during the study. The results suggest that the purposeful integration of computer technology with the intervention program positively affects the personal creativity characteristics of students.
Abstract:
The National survey was the third phase in an ongoing initiative to identify critical success factors in ICT mediated supply chains. This study has been designed to harness the tacit and explicit knowledge to be found on the subject from the widest range of appropriate sources. At its core is the assumption that, provided with the fullest list of candidate success factors, a representative sample of experienced industry-based practitioners will (with the aid of statistical analysis) reveal a set of critical success factors. A postal survey has been judged to be the most appropriate mechanism for achieving this outcome.