259 results for internet-based application components


Relevance:

40.00%

Publisher:

Abstract:

Although internet chat is a significant aspect of many internet users’ lives, the manner in which participants in quasi-synchronous chat situations orient to issues of social and moral order remains to be studied in depth. The research presented here is therefore at the forefront of a continually developing area of study. This work contributes new insights into how members construct and make accountable the social and moral orders of an adult-oriented Internet Relay Chat (IRC) channel by addressing three questions: (1) What conversational resources do participants use in addressing matters of social and moral order? (2) How are these conversational resources deployed within IRC interaction? and (3) What interactional work is locally accomplished through use of these resources? A survey of the literature reveals considerable research in the field of computer-mediated communication, exploring both asynchronous and quasi-synchronous discussion forums. The research discussed represents a range of communication interests including group and collaborative interaction, the linguistic construction of social identity, and the linguistic features of online interaction. It is suggested that the present research differs from previous studies in three ways: (1) it focuses on the interaction itself, rather than the ways in which the medium affects the interaction; (2) it offers turn-by-turn analysis of interaction in situ; and (3) it discusses membership categories only insofar as they are shown to be relevant by participants through their talk. Through consideration of the literature, the present study is firmly situated within the broader computer-mediated communication field. 
Ethnomethodology, conversation analysis and membership categorization analysis were adopted as appropriate methodological approaches to explore the research focus on interaction in situ, and in particular to investigate the ways in which participants negotiate and co-construct social and moral orders in the course of their interaction. IRC logs collected from one chat room were analysed using a two-pass method, based on a modification of the approaches proposed by Pomerantz and Fehr (1997) and ten Have (1999). From this detailed examination of the data corpus three interaction topics are identified by means of which participants clearly orient to issues of social and moral order: challenges to rule violations, ‘trolling’ for cybersex, and experiences regarding the 9/11 attacks. Instances of these interactional topics are subjected to fine-grained analysis, to demonstrate the ways in which participants draw upon various interactional resources in their negotiation and construction of channel social and moral orders. While these analytical topics stand alone in individual focus, together they illustrate different instances in which participants’ talk serves to negotiate social and moral orders or collaboratively construct new orders. Building on the work of Vallis (2001), Chapter 5 illustrates three ways that rule violation is initiated as a channel discussion topic: (1) through a visible violation in open channel, (2) through an official warning or sanction by a channel operator regarding the violation, and (3) through a complaint or announcement of a rule violation by a non-channel operator participant. Once the topic has been initiated, it is shown to become available as a topic for others, including the perceived violator. The fine-grained analysis of challenges to rule violations ultimately demonstrates that channel participants orient to the rules as a resource in developing categorizations of both the rule violation and violator. 
These categorizations are contextual in that they are locally based and understood within specific contexts and practices. Thus, it is shown that compliance with rules and an orientation to rule violations as inappropriate within the social and moral orders of the channel serve two purposes: (1) to orient the speaker as a group member, and (2) to reinforce the social and moral orders of the group. Chapter 6 explores a particular type of rule violation: solicitations for ‘cybersex’, known in IRC parlance as ‘trolling’. In responding to trolling violations, participants are demonstrated to use affiliative and aggressive humour, in particular irony, sarcasm and insults. These conversational resources perform solidarity building within the group, positioning non-Troll respondents as compliant group members. This solidarity work is shown to have three outcomes: (1) consensus building, (2) collaborative construction of group membership, and (3) the continued construction and negotiation of existing social and moral orders. Chapter 7, the final data analysis chapter, offers insight into how participants, in discussing the events of 9/11 on the day itself, collaboratively constructed new social and moral orders while orienting to issues of appropriate and reasonable emotional responses. This analysis demonstrates how participants go about ‘doing being ordinary’ (Sacks, 1992b) in formulating their ‘first thoughts’ (Jefferson, 2004). Through sharing their initial impressions of the event, participants perform support work within the interaction, in essence working to normalise both the event and their initial misinterpretation of it. Normalising as a support work mechanism is also shown in relation to participants constructing the ‘quiet’ following the event as unusual.
Normalising is accomplished by reference to the indexical ‘it’ and location formulations, which participants use both to negotiate who can claim to experience the ‘unnatural quiet’ and to identify the extent of the quiet. Through their talk participants upgrade the quiet from something legitimately experienced by one person in a particular place to something that could be experienced ‘anywhere’, moving the phenomenon from local to global provenance. With its methodological design and detailed analysis and findings, this research contributes to existing knowledge in four ways. First, it shows how rules are used by participants as a resource in negotiating and constructing social and moral orders. Second, it demonstrates that irony, sarcasm and insults are three devices of humour which can be used to perform solidarity work and reinforce existing social and moral orders. Third, it demonstrates how new social and moral orders are collaboratively constructed in relation to extraordinary events, which serve to frame the event and evoke reasonable responses for participants. And last, the detailed analysis and findings further support the use of conversation analysis and membership categorization as valuable methods for approaching quasi-synchronous computer-mediated communication.

Relevance:

40.00%

Publisher:

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users additional rights such as free use, copying, modification and distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the 'bundling' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an explorative case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims to depict the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which these generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of 'mutualisation', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source related literature. In this article, a business model is considered not only as a way of generating income (a 'revenue model' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the standpoint of these two components.

Relevance:

40.00%

Publisher:

Abstract:

One of the key issues facing public asset owners is the decision of whether to refurbish aged built assets. This decision requires an assessment of the “remaining service life” of the key components in a building. The remaining service life depends significantly upon the existing condition of the asset and future degradation patterns, considering durability and functional obsolescence. Recently developed methods for residual service life modelling require sophisticated data that are not readily available. Most of the data available are in the form of reports prepared prior to undertaking major repairs or in the form of sessional audit reports. Valuable information from these available sources can serve as benchmarks for estimating the reference service life. The authors have acquired such information from a public asset building in Melbourne. Using this information, the residual service life of a case study building façade is estimated in this paper based on state-of-the-art approaches. These estimates have been evaluated against expert opinion. Though the results are encouraging, it is clear that state-of-the-art methodologies can only provide meaningful estimates when data of sufficient level and quality are available. This investigation resulted in the development of a new framework for maintenance that integrates condition assessment procedures and the factors influencing residual service life.
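The abstract does not name the estimation method used; one common state-of-the-art approach is the ISO 15686-style factor method, in which a reference service life is scaled by condition and exposure factors, and the residual life is whatever remains at the asset's current age. The sketch below uses hypothetical factor values and ages, purely for illustration:

```python
def estimated_service_life(reference_life_years, factors):
    """ISO 15686-style factor method: multiply the reference service life
    by modifying factors (materials quality, design level, workmanship,
    environment, in-use conditions, maintenance)."""
    esl = reference_life_years
    for f in factors.values():
        esl *= f
    return esl

def residual_service_life(esl_years, current_age_years):
    """Residual life is what remains of the estimated service life."""
    return max(0.0, esl_years - current_age_years)

# Hypothetical facade: 40-year reference service life, currently 25 years old.
factors = {
    "materials_quality": 1.1,
    "design_level": 1.0,
    "workmanship": 0.9,
    "indoor_environment": 1.0,
    "outdoor_environment": 0.8,
    "in_use_conditions": 1.0,
    "maintenance_level": 1.1,
}
esl = estimated_service_life(40.0, factors)   # adjusted service life, years
rsl = residual_service_life(esl, 25.0)        # remaining life, years
```

In practice the factor values would be calibrated from the condition assessment and audit reports the abstract describes, which is precisely where the availability and quality of data constrain the estimate.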

Relevance:

40.00%

Publisher:

Abstract:

This study sought to improve understanding of the persuasive process of emotion-based appeals, not only in relation to negative, fear-based appeals but also for appeals based upon positive emotions. In particular, the study investigated whether response efficacy, as a cognitive construct, mediated outcome measures of message effectiveness in terms of both acceptance and rejection of negative and positive emotion-based messages. Licensed drivers (N = 406) participated via the completion of an online survey. Within the survey, participants received either a negative (fear-based) appeal or one of two possible positive appeals (pride- or humour-based). Overall, the study's findings confirmed the importance of the emotional and cognitive components of persuasive health messages and identified response efficacy as a key cognitive construct influencing the effectiveness not only of fear-based messages but also of positive emotion-based messages. Interestingly, however, the results suggested that response efficacy's influence on message effectiveness may differ for positive and negative emotion-based appeals, such that significant indirect (and mediational) effects were found with both acceptance and rejection of the positive appeals, yet only with rejection of the fear-based appeal. As such, the study's findings provide an important extension to the extant literature and may inform future advertising message design.
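Indirect (mediational) effects of the kind reported above are typically estimated as the product of two regression coefficients, with a bootstrap confidence interval. As an illustration only (the data below are simulated and the effect sizes hypothetical, not the study's), a product-of-coefficients estimate with a percentile bootstrap might look like:

```python
import random

def slope_simple(x, y):
    """OLS slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def slope_partial(m, x, y):
    """Coefficient of m in the regression y ~ m + x (2x2 normal equations)."""
    n = len(y)
    mm, mx, my = sum(m) / n, sum(x) / n, sum(y) / n
    smm = sum((a - mm) ** 2 for a in m)
    sxx = sum((a - mx) ** 2 for a in x)
    smx = sum((a - mm) * (b - mx) for a, b in zip(m, x))
    smy = sum((a - mm) * (b - my) for a, b in zip(m, y))
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return (smy * sxx - smx * sxy) / (smm * sxx - smx ** 2)

def indirect_effect(x, m, y):
    """a*b: effect of x on mediator m, times effect of m on y given x."""
    return slope_simple(x, m) * slope_partial(m, x, y)

rng = random.Random(7)
n = 300
x = [rng.gauss(0, 1) for _ in range(n)]                  # message appeal (proxy)
m = [0.6 * xi + rng.gauss(0, 1) for xi in x]             # response efficacy
y = [0.5 * mi + 0.2 * xi + rng.gauss(0, 1)               # message acceptance
     for mi, xi in zip(m, x)]

ab = indirect_effect(x, m, y)  # point estimate of the indirect effect

# Percentile bootstrap: resample cases, re-estimate a*b, take ~2.5%/97.5%.
boots = []
for _ in range(500):
    idx = [rng.randrange(n) for _ in range(n)]
    boots.append(indirect_effect([x[i] for i in idx],
                                 [m[i] for i in idx],
                                 [y[i] for i in idx]))
boots.sort()
ci = (boots[12], boots[487])  # approximate 95% interval
```

A confidence interval excluding zero is the usual evidence for a significant indirect effect, which is the pattern the study reports for acceptance and rejection of the positive appeals.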

Relevance:

40.00%

Publisher:

Abstract:

The enhanced accessibility, affordability and capability of the Internet have created enormous possibilities in terms of designing, developing and implementing innovative teaching methods in the classroom. As existing pedagogies are revamped and new ones are added, there is a need to assess the effectiveness of these approaches from the students’ perspective. For more than three decades, proven qualitative and quantitative research methods associated with learning environments research have yielded productive results for educators. This article presents the findings of a study in which Getsmart, a teacher-designed website, was blended into science and physics lessons at an Australian high school. Students’ perceptions of this environment were investigated, together with differences between the perceptions of students in the junior and senior years of schooling. The article also explores the impact of teachers in such an environment. The investigation undertaken in this study also gave an indication of how effective Getsmart was as a teaching model in such environments.

Relevance:

40.00%

Publisher:

Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating network and single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:

• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users’ requests (reverse RTK)

From these four challenges follow two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to address these future NRTK challenges and requirements using a Grid Computing facility, in particular for large data processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research: a layered framework consisting of (1) a client layer in the form of a Grid portal, (2) a service layer, and (3) an execution layer. The user’s request is passed through these layers and scheduled onto different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software was adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, while some aspects of system performance remain to be improved in future work.
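As an illustration of the Ntrip data path mentioned above: an Ntrip (v1) client requests a caster's mountpoint over plain HTTP, after which the caster streams raw RTCM bytes. The sketch below only builds the request; the mountpoint and credentials are hypothetical, not those of any real caster:

```python
import base64

def build_ntrip_request(mountpoint, username, password):
    """Build an Ntrip v1 request for a caster mountpoint.

    Ntrip rides on HTTP: the client sends a GET for the mountpoint with
    Basic authentication; the caster answers 'ICY 200 OK' and then
    streams raw RTCM frames until the connection is closed.
    """
    credentials = base64.b64encode(
        f"{username}:{password}".encode("ascii")
    ).decode("ascii")
    return (
        f"GET /{mountpoint} HTTP/1.0\r\n"
        "User-Agent: NTRIP ExampleClient/1.0\r\n"
        f"Authorization: Basic {credentials}\r\n"
        "\r\n"
    ).encode("ascii")

# A real client would send this over a TCP socket to the caster and read
# the RTCM stream that follows the response header.
request = build_ntrip_request("MOUNT1", "user", "secret")
```

In the framework described, each scheduled Grid job could open one such connection per reference station and feed the resulting RTCM stream into the RTK computation.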

Relevance:

40.00%

Publisher:

Abstract:

Web 1.0 referred to the early, read-only internet; Web 2.0 refers to the ‘read-write web’ in which users actively contribute to as well as consume online content; Web 3.0 is now being used to refer to the convergence of mobile and Web 2.0 technologies and applications. One of the most important developments in Web 3.0 is geography: with many mobile phones now equipped with GPS, mobiles promise to “bring the internet down to earth” through geographically-aware, or locative, media. The internet was earlier heralded as “the death of geography”, with predictions that, since anyone could access information from anywhere, geography would no longer matter. But mobiles are disproving this. GPS allows the location of the user to be pinpointed, and the mobile internet allows the user to access locally-relevant information, or to upload content which is geotagged to a specific location. It also allows locally-specific content to be sent to the user when the user enters a specific space. Location-based services are one of the fastest-growing segments of the mobile internet market: the 2008 AIMIA report indicates that user access of local maps increased by 347% over the previous 12 months, and restaurant guides/reviews increased by 174%. The central tenet of cultural geography is that places are culturally constructed, comprising the physical space itself, culturally-inflected perceptions of that space, and people’s experiences of the space (Lefebvre 1991). This paper takes a cultural geographical approach to locative media, anatomising the various spaces which have emerged through locative media, or “the geoweb” (Lake 2004). The geoweb is such a new concept that to date, critical discourse has treated it as a somewhat homogenous spatial formation.
To counter this, and to demonstrate the dynamic complexity of the emerging spaces of the geoweb, the paper provides a topography of different types of locative media space, including: the personal/aesthetic, in which individual users geotag specific physical sites with their own content and meanings; the commercial, like the billboards which speak to individuals as they pass in Minority Report; and the social, in which one’s location is defined by the proximity of friends rather than by geography.
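Technically, the location-triggered delivery this paper discusses rests on a simple geometric test: is the user's GPS fix inside a geofenced space? A minimal sketch (the coordinates and radius are hypothetical) using the haversine great-circle distance:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    r = 6371.0  # mean Earth radius in kilometres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(user, centre, radius_km):
    """True when the user's GPS fix falls within the geofenced space."""
    return haversine_km(user[0], user[1], centre[0], centre[1]) <= radius_km

# Hypothetical 1 km fence around a city-centre site: locally-specific
# content would be pushed only to users whose fixes fall inside it.
fence_centre = (-27.4698, 153.0251)
nearby = (-27.4710, 153.0260)
far_away = (-27.5600, 153.1000)
```

Commercial locative media of the kind described (the Minority Report billboard scenario) amount to running this test continuously against a catalogue of fences.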

Relevance:

40.00%

Publisher:

Abstract:

In Web service based systems, new value-added Web services can be constructed by integrating existing Web services. A Web service may have many implementations which are functionally identical but have different Quality of Service (QoS) attributes, such as response time, price, reputation, reliability and availability. Thus, a significant research problem in Web service composition is how to select an implementation for each of the component Web services so that the overall QoS of the composite Web service is optimal. This is the so-called QoS-aware Web service composition problem. In some composite Web services there are dependencies and conflicts between Web service implementations, and existing approaches cannot handle such constraints. This paper tackles the QoS-aware Web service composition problem with inter-service dependencies and conflicts using a penalty-based genetic algorithm (GA). Experimental results demonstrate the effectiveness and the scalability of the penalty-based GA.
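The paper's exact encoding and penalty function are not given in the abstract; the following is a minimal sketch of how a penalty-based GA for this problem might look, with hypothetical QoS values, aggregation weights and constraint forms. Each chromosome assigns one implementation index to each component service, and constraint violations add a fixed penalty to the fitness rather than being repaired:

```python
import random

# Hypothetical QoS data: for each of 3 abstract services, candidate
# implementations given as (response_time_ms, price, reliability).
CANDIDATES = [
    [(200, 5.0, 0.99), (120, 9.0, 0.95), (300, 2.0, 0.97)],
    [(150, 4.0, 0.98), (250, 3.0, 0.99)],
    [(100, 8.0, 0.96), (180, 6.0, 0.99), (90, 10.0, 0.94)],
]
# Assumed constraint forms: a dependency (i, a, j, b) means "choosing
# implementation a for service i requires implementation b for service j";
# a conflict (i, a, j, b) means that pair must not co-occur.
DEPENDENCIES = [(0, 1, 2, 2)]
CONFLICTS = [(1, 0, 2, 0)]
PENALTY = 1000.0

def fitness(chrom):
    """Lower is better: weighted QoS aggregate plus constraint penalties."""
    rt = sum(CANDIDATES[i][g][0] for i, g in enumerate(chrom))
    price = sum(CANDIDATES[i][g][1] for i, g in enumerate(chrom))
    rel = 1.0
    for i, g in enumerate(chrom):
        rel *= CANDIDATES[i][g][2]      # reliability aggregates multiplicatively
    cost = rt + 20 * price + 500 * (1 - rel)  # arbitrary example weights
    for (i, a, j, b) in DEPENDENCIES:
        if chrom[i] == a and chrom[j] != b:
            cost += PENALTY
    for (i, a, j, b) in CONFLICTS:
        if chrom[i] == a and chrom[j] == b:
            cost += PENALTY
    return cost

def evolve(pop_size=30, gens=60, seed=1):
    """Elitist GA: keep the best half, refill with crossover + mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(len(c)) for c in CANDIDATES] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(CANDIDATES))   # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                    # point mutation
                i = rng.randrange(len(CANDIDATES))
                child[i] = rng.randrange(len(CANDIDATES[i]))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

Because infeasible chromosomes are penalised rather than discarded, the population can traverse infeasible regions of the search space while selection pressure still drives it towards implementations that satisfy the dependency and conflict constraints.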

Relevance:

40.00%

Publisher:

Abstract:

Young drivers aged 17-24 are consistently overrepresented in motor vehicle crashes. Research has shown that a young driver’s crash risk increases when carrying similarly aged passengers, with fatal crash risk increasing two- to three-fold with two or more passengers. Recent growth in access to and use of the internet has led to a corresponding increase in the number of web based behaviour change interventions. An increasing body of literature describes the evaluation of web based programs targeting risk behaviours and health issues. Evaluations have shown promise for such strategies, with evidence of positive changes in knowledge, attitudes and behaviour. The growing popularity of web based programs is due in part to their wide accessibility, the ability to personally tailor intervention messages, and the self-direction and pacing of online content. Young people are also highly receptive to the internet, and the interactive elements of online programs are particularly attractive. The current study was designed to assess the feasibility of a web based intervention to increase the use of personal and peer protective strategies among young adult passengers. An extensive review was conducted of the development and evaluation of web based programs. Year 12 students were also surveyed about their use of the internet in general and for health and road safety information. All students reported internet access at home or at school, and 74% had searched for road safety information. Additional findings show promise for the development of a web based passenger safety program for young adults. Design and methodological issues will be discussed.

Relevance:

40.00%

Publisher:

Abstract:

The School Based Youth Health Nurse Program was established in 1999 by the Queensland Government to fund school nurse positions in Queensland state high schools. Schools were required to apply for a School Based Youth Health Nurse during a five-phase recruitment process, managed by the health districts and rolled out over four years. The only mandatory selection criterion for the position of School Based Youth Health Nurse was registration as a General Nurse, and most School Based Youth Health Nurses are allocated to two state high schools. Currently, there are approximately 115 Full Time Equivalent School Based Youth Health Nurse positions across all Queensland state high schools. The literature review revealed an abundance of information about school nursing. Most of the literature came from the United Kingdom and the United States, which have a different model of school nursing to school based youth health nursing. However, there is literature to suggest school nursing is gradually moving from a disease-focused approach to a social view of health. The noticeable number of articles about, for example, drug and alcohol use, mental health, and contemporary sexual health issues is evidence of this change. Additionally, there is a significant volume of literature about partnerships and collaboration, much of which is about health education, team teaching and how school nurses and schools do health business together. The surfacing of this literature is a good indication that school nursing is aligning with the broader national health priority areas. More particularly, the literature exposed a small but relevant and current body of research, predominantly from Queensland, about school based youth health nursing. However, there remain significant gaps in the knowledge about school based youth health nursing. In particular, there is a deficit in knowledge about how School Based Youth Health Nurses understand the experience of school based youth health nursing.
This research aimed to reveal the meaning of the experience of school based youth health nursing. The research question was: How do School Based Youth Health Nurses understand the experience of school based youth health nursing? This enquiry was instigated because the researcher, who had a positive experience of school based youth health nursing, considered it important to validate other School Based Youth Health Nurses’ experiences. Consequently, a qualitative research approach was considered the most appropriate way to explore this research question. Within this qualitative paradigm, the research framework consists of the epistemology of social constructionism, the theoretical perspective of interpretivism and the approach of phenomenography. After ethical approval was gained, purposeful and snowball sampling were used to recruit a sample of 16 participants. In-depth interviews, which were voluntary, confidential and anonymous, were mostly conducted in public venues and lasted from 40 to 75 minutes. The researcher also kept a researcher’s journal as another form of data collection. Data analysis was guided by Dahlgren and Fallsberg’s (1991, p. 152) seven phases of data analysis: familiarization, condensation, comparison, grouping, articulating, labelling and contrasting. The most important finding in this research is the outcome space, which represents the entirety of the experience of school based youth health nursing. The outcome space consists of two components: inside the school environment and outside the school environment. Metaphorically, and considered as wholes in themselves, these two components are not discrete but intertwined with each other. The outcome space consists of eight categories. Each category of description comprises several sub-categories of description but, as a whole, is a conception of school based youth health nursing. The eight conceptions of school based youth health nursing are:

1. The conception of school based youth health nursing as out there all by yourself.
2. The conception of school based youth health nursing as no real backup.
3. The conception of school based youth health nursing as confronted by many barriers.
4. The conception of school based youth health nursing as hectic and full-on.
5. The conception of school based youth health nursing as working together.
6. The conception of school based youth health nursing as belonging to school.
7. The conception of school based youth health nursing as treated the same as others.
8. The conception of school based youth health nursing as the reason it’s all worthwhile.

These eight conceptions of school based youth health nursing are logically related and form a staged hierarchical relationship because they are not equally dependent on each other. The conceptions are grouped into negative, both negative and positive, and positive conceptions of school based youth health nursing. The conceptions build on each other, from the bottom upwards, to reach the authorized, or most desired, conception of school based youth health nursing. This research adds to the knowledge about school nursing in general, and about school based youth health nursing in particular. Furthermore, this research has operational and strategic implications, highlighted in the negative conceptions of school based youth health nursing, for the School Based Youth Health Nurse Program. The researcher suggests the School Based Youth Health Nurse Program, as a priority, address the operational issues. The researcher recommends a range of actions to tackle issues and problems associated with accommodation and information, consultations and referral pathways, confidentiality, health promotion and education, professional development, line management, School Based Youth Health Nurse Program support, and school management and community.
Strategically, the researcher proposes a variety of actions to address strategic issues such as the School Based Youth Health Nurse Program vision, model, and policy and practice framework, recruitment and retention rates, and evaluation. Additionally, the researcher believes the findings of this research have the capacity to spawn a myriad of future research projects. The researcher has identified the most important areas for future research as confidentiality, information, qualifications and health outcomes.

Relevance:

40.00%

Publisher:

Abstract:

It has been suggested that the Internet is the most significant driver of international trade in recent years, to the extent that the term ‘internetalisation’ has been coined (Bell, Deans, Ibbotson & Sinkovics, 2001; Buttriss & Wilkinson, 2003). This term is used to describe the Internet’s effect on the internationalisation process of the firm. Consequently, researchers have argued that the internationalisation process of the firm has altered due to the Internet and hence is in need of further investigation. However, as there is limited research and understanding, ambiguity remains about how the Internet has influenced international market growth. Thus, the purpose of this study was to explore how the Internet influences firms’ internationalisation process, specifically, international market growth. To this end, Internet marketing and international market growth theories are used to illuminate this ambiguity in the body of knowledge. Thus, the research problem ‘How and why does the Internet influence international market growth of the firm?’ is justified for investigation. To explore the research question a two-stage approach is used. Firstly, twelve case studies were used to evaluate key concepts, generate hypotheses and develop a model of internetalisation for testing. The participants held key positions within their firms, so that rich data could be drawn from international market growth decision makers. Secondly, a quantitative confirmation process analysed the identified themes or constructs, using two hundred and twenty-four valid responses. Constructs were evaluated through an exploratory factor analysis, confirmatory factor analysis and structural equation modelling process.
Structural equation modelling was used to test the model of internetalisation and to examine the interrelationships between the internationalisation process components: information availability, information usage, interaction communication, international mindset, business relationship usage, psychic distance, the Internet intensity of the firm and international market growth. This study found that the Internet intensity of the firm mediates information availability, information usage, international mindset, and business relationships when firms grow in international markets. These results therefore provide empirical evidence that the Internet has a positive influence on international information, knowledge, entrepreneurship and networks, and that these in turn influence international market growth. The theoretical contributions are threefold. Firstly, the study identifies a holistic model of the impact the Internet has had on the outward internationalisation of the firm. This contribution extends the body of knowledge pertaining to Internet international marketing by mapping and confirming interrelationships between the Internet, internationalisation and growth concepts. Secondly, the study highlights the broad scope and accelerated rate of international market growth of firms. Evidence that the Internet influences both traditional and virtual networks in the pursuit of international market growth extends the current understanding. Thirdly, this study confirms that international information, knowledge, entrepreneurship and network concepts are valid in a single model. These three contributions thus identify constructs, measure constructs in a multi-item capacity, map interrelationships and confirm a single holistic model of internetalisation. The main practical contribution is that the findings identify information, knowledge and entrepreneurial opportunities for firms wishing to maximise international market growth.
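The mediation finding above (Internet intensity mediating the effect of factors such as information availability on international market growth) can be illustrated with a minimal regression sketch. This uses synthetic data, not the study's data, and a simple two-regression check rather than full structural equation modelling; the variable names and effect sizes are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Synthetic illustration: information availability drives Internet
# intensity, which in turn drives international market growth.
info = rng.normal(size=n)
intensity = 0.8 * info + rng.normal(scale=0.5, size=n)
growth = 0.7 * intensity + rng.normal(scale=0.5, size=n)

def slope(x, y):
    """OLS slope of y on x (with an intercept term)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Total effect of information availability on growth:
c = slope(info, growth)

# Direct effect after controlling for the mediator (Internet intensity);
# under mediation this coefficient shrinks toward zero.
X = np.column_stack([np.ones(n), info, intensity])
direct = np.linalg.lstsq(X, growth, rcond=None)[0][1]
```

A shrinking direct effect once the mediator is included is the classic signature of mediation; SEM generalises this idea to many constructs and latent variables at once.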
To capitalise on these opportunities suggestions are offered to assist firms to develop greater Internet intensity and internationalisation capabilities. From a policy perspective, educational institutions and government bodies need to promote more applied programs for Internet international marketing. The study provides future researchers with a platform of identified constructs and interrelationships related to internetalisation, with which to investigate. However, a single study has limitations of generalisability; thus, future research should replicate this study. Such replication or cross validation will assist in the verification of scales used in this research and enhance the validity of causal predications. Furthermore, this study was undertaken in the Australian outward-bound context. Research in other nations, as well as research into inbound internationalisation would be fruitful.

Relevância:

40.00%

Publicador:

Resumo:

Cultural objects are increasingly generated and stored in digital form, yet effective methods for their indexing and retrieval remain an important area of research. The main problem arises from the disconnection between the content-based indexing approach used by computer scientists and the description-based approach used by information scientists. There is also a lack of representational schemes that allow semantics and context to be aligned with keywords and low-level features that can be automatically extracted from the content of these cultural objects. This paper presents an integrated approach to address these problems, taking advantage of both computer science and information science approaches. We first discuss the requirements from a number of perspectives: users, content providers, content managers and technical systems. We then present an overview of our system architecture and describe the techniques underlying the major components of the system. These include: automatic object category detection; user-driven tagging; metadata transform and augmentation; and an expression language for digital cultural objects. In addition, we discuss our experience in testing and evaluating the system on existing collections, analyse the difficulties encountered and propose ways to address them.
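The integrated approach described above aligns automatically extracted content features with human-supplied descriptions and tags in a single record. A minimal sketch of such a record is given below; the class and field names are hypothetical, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class CulturalObjectRecord:
    """Hypothetical record joining the two indexing traditions:
    description-based fields (title) from information science and
    content-based fields (detected category, features) from
    computer vision, plus user-driven tags."""
    object_id: str
    title: str                      # curator-supplied description
    detected_category: str          # automatic category detection
    user_tags: list = field(default_factory=list)
    feature_vector: list = field(default_factory=list)

    def index_terms(self):
        """Merge descriptive and content-derived terms for retrieval."""
        return {self.title.lower(), self.detected_category, *self.user_tags}
```

A retrieval layer could then match queries against the merged term set, so that a search on either the curator's description or the automatically detected category finds the same object.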

Relevância:

40.00%

Publicador:

Resumo:

Purpose: Computer vision has been widely used in the inspection of electronic components. This paper proposes a computer vision system for the automatic detection, localisation, and segmentation of solder joints on Printed Circuit Boards (PCBs) under different illumination conditions. Design/methodology/approach: An illumination normalisation approach is applied to each image, which effectively and efficiently eliminates the effect of uneven illumination while keeping the properties of the processed image the same as in the corresponding image under normal lighting conditions. Consequently, the need for special lighting and instrumentation in order to detect solder joints can be reduced. These normalised images are insensitive to illumination variations and are used for the subsequent solder joint detection stages. In the segmentation approach, the PCB image is transformed from the RGB color space to the YIQ color space for effective detection of solder joints against the background. Findings: The segmentation results show that the proposed approach improves performance significantly for images under varying illumination conditions. Research limitations/implications: This paper proposes a front-end system for the automatic detection, localisation, and segmentation of solder joint defects. Further research is required to complete the full system, including the classification of solder joint defects. Practical implications: The methodology presented in this paper can be an effective method to reduce cost and improve quality in PCB production in the manufacturing industry. Originality/value: This research proposes the automatic location, identification and segmentation of solder joints under different illumination conditions.
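The RGB-to-YIQ transform used in the segmentation stage is a fixed linear transform (the NTSC standard matrix): the Y channel carries luminance while I and Q carry chrominance, which helps separate reflective solder from the board background. A minimal sketch of the conversion, independent of the paper's full pipeline:

```python
import numpy as np

# NTSC RGB -> YIQ transform matrix: row 0 gives luminance (Y),
# rows 1-2 give the chrominance channels (I, Q).
RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],
    [0.596, -0.274, -0.322],
    [0.211, -0.523,  0.312],
])

def rgb_to_yiq(image):
    """Convert an H x W x 3 RGB image (floats in [0, 1]) to YIQ."""
    return image @ RGB_TO_YIQ.T
```

Thresholding in the I/Q chrominance planes is one common way to segment solder regions once the image is in YIQ space, since chrominance is less sensitive to the illumination variation the paper's normalisation step targets.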

Relevância:

40.00%

Publicador:

Resumo:

This paper presents a high voltage pulsed power system based on low voltage switched-capacitor units connected to a current source, for applications such as plasma systems. A buck-boost converter topology is used to realise the current source, and a series of low voltage switched-capacitor units is connected to the current source in order to provide high voltage with high voltage stress (dv/dt) as demanded by loads. This pulsed power converter is flexible in terms of energy control, in that the energy stored in the current source can be adjusted by changing the current magnitude, significantly improving efficiency across systems with different requirements. Output voltage magnitude and stress (dv/dt) can be controlled by proper selection of components and of the control algorithm used to turn the switching devices on and off.
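The scaling behaviour described above can be sketched with two back-of-envelope relations (illustrative idealisations, not the paper's design equations): an ideal series stack of N units charged to a unit voltage yields roughly N times that voltage at the output, and the voltage stress across one capacitor charged by a constant current follows i = C dv/dt.

```python
def stacked_output_voltage(n_units: int, v_unit: float) -> float:
    """Ideal series stack of switched-capacitor units: output
    magnitude scales with the number of units (losses ignored)."""
    return n_units * v_unit

def capacitor_dvdt(i_source: float, capacitance: float) -> float:
    """dv/dt of one unit charged by a constant current source,
    from i = C * dv/dt."""
    return i_source / capacitance
```

For example, ten hypothetical 400 V units would stack to about 4 kV, and a 1 A source into a 1 uF unit gives a 1 V/us charging slope; the paper's point is that adjusting the source current tunes both the stored energy and the achievable dv/dt.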

Relevância:

40.00%

Publicador:

Resumo:

Monitoring Internet traffic is critical to acquiring a good understanding of threats to computer and network security and to designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another widely tried and accepted method for monitoring and analysing malicious traffic is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is by definition potentially malicious or suspicious. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most work on honeypots has been devoted to designing new honeypots or optimising existing ones. Approaches for analysing data collected from honeypots, especially low-interaction honeypots, are presently immature: analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analysing Internet traffic data collected from low-interaction honeypots. We believe that characterising honeypot traffic will improve the security of networks and, if the honeypot data is analysed promptly, give early warning of new vulnerabilities or outbreaks of new automated malicious code, such as worms.
The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes, through grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualise attackers' behaviours;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components' residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
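The residual-space detection outcome above can be sketched as follows: fit PCA on normal honeypot traffic features, then flag new observations whose squared prediction error (SPE, the Q statistic) — the part of the observation not explained by the principal subspace — is large. This is a minimal generic sketch of the statistic, assuming a feature matrix of traffic measurements; it is not the thesis's implementation.

```python
import numpy as np

def fit_pca(X, k):
    """Fit PCA on training traffic features X (n x d); keep k components.
    Returns the feature mean and the d x k loading matrix."""
    mu = X.mean(axis=0)
    # SVD of the centred data gives principal directions as rows of Vt
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k].T

def spe(x, mu, P):
    """Squared prediction error (Q statistic): squared norm of the
    residual of x outside the principal subspace. Large values flag
    activity that does not fit the learned traffic structure."""
    xc = x - mu
    residual = xc - P @ (P.T @ xc)
    return float(residual @ residual)
```

In practice a control limit is set on the SPE from training data, and observations exceeding it are reported as potential new attacks; the recursive variant mentioned in the outcomes updates the PCA model incrementally so the limit tracks traffic in real time.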