898 results for Trojan 2.0


Relevance:

100.00%

Publisher:

Abstract:

Recently, the botnet, a network of compromised computers, has been recognized as the biggest threat to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called the Command and Control (C&C) channel. There are three main types of C&C channel: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, the web-based botnet has reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C&C channel and as temporary storage for its stolen information. We then propose a novel approach to thwart this type of attack. Our method combines a unique identifier of the computer, an encryption algorithm with session keys and CAPTCHA verification.
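The abstract names three defensive building blocks (a machine-unique identifier, an encryption algorithm with session keys and CAPTCHA verification) without specifying how they fit together. The Python sketch below is a hypothetical composition of those blocks for illustration only, not the protocol proposed in the paper; all function names and parameters are assumptions.

```python
# Hypothetical composition of the three building blocks named in the abstract:
# a machine-unique identifier, session-key derivation and CAPTCHA verification.
# This is an illustrative sketch, not the paper's actual protocol.
import hashlib
import hmac
import os
import uuid


def machine_identifier() -> bytes:
    """A stable, machine-unique identifier (here derived from the MAC address)."""
    return hashlib.sha256(uuid.getnode().to_bytes(6, "big")).digest()


def derive_session_key(machine_id: bytes, captcha_answer: str) -> tuple[bytes, bytes]:
    """Bind a fresh session key to this machine and a human-supplied CAPTCHA answer."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", captcha_answer.encode(), machine_id + salt, 100_000)
    return salt, key


def tag_outbound_post(message: bytes, salt: bytes, key: bytes) -> bytes:
    """Authorisation tag for a post to a Web 2.0 service; untagged traffic can be blocked."""
    return hmac.new(key, salt + message, hashlib.sha256).digest()


# Example: a human solves one CAPTCHA per session, after which each post is tagged.
salt, key = derive_session_key(machine_identifier(), "captcha answer typed by the user")
tag = tag_outbound_post(b"status update", salt, key)
```

The intuition is that an automated bot on the infected machine cannot obtain a valid session key without a human solving the CAPTCHA, so its posts to the Web 2.0 C&C channel carry no valid tag and can be filtered.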

Relevance:

100.00%

Publisher:

Abstract:

PLATO 2.0 has recently been selected for ESA’s M3 launch opportunity (2022/24). By providing accurate key planet parameters (radius, mass, density and age) in statistical numbers, it addresses fundamental questions such as: How do planetary systems form and evolve? Are there other systems with planets like ours, including potentially habitable planets? The PLATO 2.0 instrument consists of 34 small-aperture telescopes (32 with 25 s readout cadence and 2 with 2.5 s cadence) providing a wide field of view (2232 deg²) and a large photometric magnitude range (4–16 mag). It focusses on bright (4–11 mag) stars in wide fields to detect and characterize planets down to Earth-size by photometric transits, whose masses can then be determined by ground-based radial-velocity follow-up measurements. Asteroseismology will be performed for these bright stars to obtain highly accurate stellar parameters, including masses and ages. The combination of bright targets and asteroseismology results in high accuracy for the bulk planet parameters: 2 %, 4–10 % and 10 % for planet radii, masses and ages, respectively. The planned baseline observing strategy includes two long pointings (2–3 years) to detect and bulk-characterize planets reaching into the habitable zone (HZ) of solar-like stars, and an additional step-and-stare phase to cover in total about 50 % of the sky. PLATO 2.0 will observe up to 1,000,000 stars and detect and characterize hundreds of small planets, and thousands of planets in the Neptune-to-gas-giant regime, out to the HZ. It will therefore provide the first large-scale catalogue of bulk-characterized planets with accurate radii, masses, mean densities and ages. This catalogue will include terrestrial planets at intermediate orbital distances, where surface temperatures are moderate. Coverage of this parameter range with statistical numbers of bulk-characterized planets is unique to PLATO 2.0. The PLATO 2.0 catalogue will allow us, for example, to: complete our knowledge of planet diversity for low-mass objects; correlate the planet mean density-orbital distance distribution with predictions from planet formation theories; constrain the influence of planet migration and scattering on the architecture of multiple systems; and specify how planet and system parameters change with host star characteristics, such as type, metallicity and age. The catalogue will allow us to study planets and planetary systems at different evolutionary phases. It will further provide a census of small, low-mass planets. This will serve to identify objects which retained their primordial hydrogen atmosphere and, more generally, the typical characteristics of planets in this low-mass, low-density range. Planets detected by PLATO 2.0 will orbit bright stars, and many of them will be targets for future atmospheric spectroscopy. Furthermore, the mission has the potential to detect exomoons, planetary rings, binary and Trojan planets. The planetary science possible with PLATO 2.0 is complemented by its impact on stellar and galactic science via asteroseismology, as well as light curves of all kinds of variable stars, together with observations of stellar clusters of different ages. This will allow us to improve stellar models and study stellar activity. A large number of well-determined ages from red giant stars will probe the structure and evolution of our Galaxy. Asteroseismic ages of bright stars for different phases of stellar evolution will allow calibration of stellar age-rotation relationships.
Together with the results of ESA’s Gaia mission, the results of PLATO 2.0 will provide a huge legacy to planetary, stellar and galactic science.
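To make the quoted accuracies concrete: a planet’s mean density follows from the transit radius and the radial-velocity mass, and its relative error follows from standard first-order error propagation. The short sketch below illustrates this; the Earth-like test values and the specific 2 % and 8 % errors are assumed for illustration and are not mission figures beyond the ranges quoted above.

```python
# Bulk density from a transit radius and an RV mass, with first-order error
# propagation: sigma_rho / rho = sqrt((sigma_M / M)**2 + (3 * sigma_R / R)**2).
# Input values below are illustrative assumptions, not PLATO 2.0 requirements.
import math


def bulk_density(mass_kg: float, radius_m: float) -> float:
    """Mean density of a spherical planet."""
    return mass_kg / ((4.0 / 3.0) * math.pi * radius_m ** 3)


def relative_density_error(rel_mass_err: float, rel_radius_err: float) -> float:
    """Propagate independent relative errors on mass and radius into density."""
    return math.sqrt(rel_mass_err ** 2 + (3.0 * rel_radius_err) ** 2)


M_EARTH, R_EARTH = 5.972e24, 6.371e6       # kg, m
print(bulk_density(M_EARTH, R_EARTH))       # ~5500 kg/m^3
print(relative_density_error(0.08, 0.02))   # 2 % radius, 8 % mass -> ~10 % density
```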

Relevance:

100.00%

Publisher:

Abstract:

Online technological advances are pioneering the wider distribution of geospatial information for general mapping purposes. The use of popular web-based applications, such as Google Maps, is ensuring that mapping-based applications are becoming commonplace amongst Internet users, which has facilitated the rapid growth of geo-mashups. These user-generated creations enable Internet users to aggregate and publish information over specific geographical points. This article identifies privacy-invasive geo-mashups that involve the unauthorized use of personal information, the inadvertent disclosure of personal information and invasion-of-privacy issues. Building on Zittrain’s Privacy 2.0, the author contends that first-generation information privacy laws, founded on the notions of fair information practices or information privacy principles, may have a limited impact on the resolution of privacy problems arising from privacy-invasive geo-mashups, principally because geo-mashups have different patterns of personal information provision, collection, storage and use that reflect fundamental changes in the Web 2.0 environment. The author concludes by recommending embedded technical and social solutions to minimize the risks arising from privacy-invasive geo-mashups, which could lead to the establishment of guidelines for the general protection of privacy in geo-mashups.

Relevance:

100.00%

Publisher:

Abstract:

This report is the primary output of Project 4: Copyright and Intellectual Property, the aim of which was to produce a report considering how greater access to and use of government information could be achieved within the scope of the current copyright law. In our submission for Project 4, we undertook to address:
• the policy rationales underlying copyright and how they apply in the context of materials owned, held and used by government;
• the recommendations of the Copyright Law Review Committee (CLRC) in its 2005 report on Crown copyright;
• the legislative and regulatory barriers to information sharing in key domains, including where legal impediments such as copyright have been relied upon (whether rightly or wrongly) to justify a refusal to provide access to government data;
• copyright licensing models appropriate to government materials and examples of licensing initiatives in Australia and other relevant jurisdictions; and
• issues specific to the galleries, libraries, archives and museums (“GLAM”) sector, including management of copyright in legacy materials and “orphan” works.
In addressing these areas, we analysed the submissions received in response to the Government 2.0 Taskforce Issues Paper, consulted with members of the Taskforce as well as several key stakeholders, and considered the comments posted on the Taskforce’s blog. This Project Report sets out our findings on the above issues. It puts forward recommendations for consideration by the Government 2.0 Taskforce on steps that can be taken to ensure that copyright and intellectual property promote access to and use of government information.

Relevance:

100.00%

Publisher:

Abstract:

The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population (N=481) to determine adoption trends and predictors of SMC usage. Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were built using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage. Other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytic focus from individual attitudes and behaviours to the shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed.
Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition: the Web 2.0 learning initiative was useful-in-principle but useless-in-practice. While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that obfuscated the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) being sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others. These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic for these individuals, then, is ‘both-and’ rather than ‘either-or’: a capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school and its complex relationship to students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
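Because the incremental CART modelling strategy is only described at a high level above, the following is a minimal sketch of what the three nested models could look like, assuming scikit-learn’s DecisionTreeClassifier as a stand-in CART implementation; the column names, outcome variable and tree settings are hypothetical placeholders for the thesis constructs rather than the actual analysis.

```python
# Sketch of incremental CART models: (i) individual, (ii) +social, (iii) +technological.
# Column names and settings are hypothetical placeholders, not the thesis instruments.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

INDIVIDUAL = ["learning_goals", "performance_goals",
              "cognitive_playfulness", "personal_innovativeness"]
SOCIAL = ["peer_support"]
TECHNOLOGICAL = ["perceived_usefulness", "perceived_ease_of_use"]


def incremental_cart(df: pd.DataFrame, outcome: str = "smc_usage") -> dict:
    """Cross-validated accuracy for three nested CART models of SMC usage."""
    blocks = {
        "individual": INDIVIDUAL,
        "individual+social": INDIVIDUAL + SOCIAL,
        "individual+social+technological": INDIVIDUAL + SOCIAL + TECHNOLOGICAL,
    }
    scores = {}
    for label, cols in blocks.items():
        tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20, random_state=0)
        scores[label] = cross_val_score(tree, df[cols], df[outcome], cv=5).mean()
    return scores
```

The nested structure mirrors the incremental comparison described above, so the gain in predictive accuracy from each added block indicates how much the social and technological variables contribute beyond individual dispositions.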

Relevance:

100.00%

Publisher:

Abstract:

This Report, prepared for Smart Service Queensland (“SSQ”), addresses legal issues, areas of risk and other factors associated with activities conducted on three popular online platforms—YouTube, MySpace and Second Life (which are referred to throughout this Report as the “Platforms”). The Platforms exemplify online participatory spaces and behaviours, including blogging and networking, multimedia sharing, and immersive virtual environments.

Relevance:

100.00%

Publisher:

Abstract:

The traditional model for information dissemination in disaster response is unidirectional, from official channels to the public. However, recent crises in the US, such as Hurricane Katrina and the Californian bushfires, show that civilians are now turning to Web 2.0 technologies as a means of sharing disaster-related information. These technologies present enormous potential benefits to disaster response authorities that cannot be overlooked. In Australia, the Victorian Bushfires Royal Commission has recently recommended that Australian disaster response authorities utilize information technologies to improve the dissemination of disaster-related bushfire information. However, whilst the use of these technologies has many positive attributes, potential legal liabilities for disaster response authorities arise. This paper identifies some of the potential legal liabilities arising from the use of Web 2.0 technologies in disaster response situations, thereby enhancing crisis-related information sharing by highlighting the legal concerns that need to be addressed.

Relevance:

100.00%

Publisher:

Abstract:

Following the position of Beer and Burrows (2007), this paper poses a re-conceptualization of Web 2.0 interaction in order to understand the properties of action possibilities in and of Web 2.0. The paper discusses the positioning of Web 2.0 social interaction in light of current descriptions, which point toward the capacities of technology in the production of social affordances within that domain (Bruns 2007; Jenkins 2006; O’Reilly 2005). While this diminishes the agency and reflexivity of users of Web 2.0, it also inadvertently positions tools as the central driver of the interactive potential available (Everitt and Mills 2009; van Dijck 2009). In doing so, it neglects the possibility that participants may be more involved in the production of Web 2.0 than the technology that underwrites it. It is this aspect of Web 2.0 that the study questions, with particular interest in how an analytical option may be made available to broaden the scope of investigations into Web 2.0 to include the capacity for interactive potential in light of how action possibilities are presented to users through communication with others (Bonderup Dohn 2009).

Relevance:

100.00%

Publisher:

Abstract:

An essential challenge for organizations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. Using three case examples, this paper explores how Enterprise 2.0 technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering organizations. The paper is intended to be a timely introduction to the benefits and issues associated with the use of Enterprise 2.0 technologies, with the aim of achieving the positive outcomes associated with knowledge management.

Relevance:

100.00%

Publisher:

Abstract:

This paper anatomises emerging developments in online community engagement in a major global industry: real estate. Economists argue that we are entering a ‘social network economy’ in which ‘complex social networks’ govern consumer choice and product value. In light of this, organisations are shifting from thinking and behaving in the conventional ‘value chain’ model, in which exchanges between firms and customers flow one way only (from the firm to the consumer), to the ‘value ecology’ model, in which consumers and their networks become co-creators of the value of the product. This paper studies the way in which the global real estate industry is responding to this environment. It identifies three key areas in which online real estate ‘value ecology’ work is occurring: real estate social networks, games, and locative media / augmented reality applications. Uptake of real estate applications is, of course, user-driven: the paper not only highlights emerging innovations but also identifies which of these innovations are actually being taken up by users, and the content contributed as a result. The paper thus provides a case study of one major industry’s shift into a Web 2.0 communication model, focusing on emerging trends and issues.

Relevance:

100.00%

Publisher:

Abstract:

The intersection of current arguments about the role of creative industries in economic development, online user-generated content, and the uptake of broadband in economically disadvantaged communities provides the content for this article. From 2006 to 2008, the authors carried out a research project in Ipswich, Queensland, involving local creative practitioners and community groups in their development of edgeX, a Web-based platform for content uploads and social networking. The project aimed to explore issues of local identity and community building through online networking, as well as the possibilities for creating pathways from amateur to professional practice in the creative industries under the auspices of the Website. Set against the backdrop of a rapidly changing technological environment that has problematic implications for research projects aiming to build new online platforms, we present several case studies from the project to illustrate the challenges to participation experienced by people with limited access to, and literacy with, the Internet.

Relevance:

100.00%

Publisher:

Abstract:

The Northern Hemisphere slumbers, dreaming that – one day – it is going to split up its empire, before the seas boil and the towers collapse. During this same dark night, Australia is wide awake, chirpy as a Canadian, strapping as a Bondi blonde, having an election...

Relevance:

100.00%

Publisher:

Abstract:

In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions, few systematic and empirically based studies have taken place. This article will discuss a research project that fills this gap. Funded by the Australian Learning and Teaching Council, the project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence-based practice, learning and education, and personal traits. This article will provide a detailed discussion of each of these themes. The study’s findings also suggest that “librarian 2.0” is a state of mind, and that the Australian LIS profession is undergoing a significant shift in “attitude.”

Relevance:

100.00%

Publisher:

Abstract:

An often neglected but well-recognised aspect of successful engineering asset management is the achievement of co-operation and collaboration between the various occupational, functional and hierarchical levels present within complex technical environments. Engineering and technical contexts have been well documented for the presence of highly cohesive groups based around functional or role orientations. However, while highly cohesive groups are potentially advantageous, they are also often correlated with the emergence of knowledge and information silos based around those same functional or occupational clusters. Improved collaboration and co-operation between groups has been demonstrated to result in a number of positive outcomes at the individual, group and organisational levels. Example outcomes include an increased capacity for problem solving, improved responsiveness and adaptation to organisational crises, higher morale and an increased ability to leverage workforce capability. However, an essential challenge for organisations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. This paper reviews the ability of Web 2.0 technologies and mobile computing devices to facilitate and encourage knowledge sharing between “silo’d” groups. Commonly available tools such as Facebook, Twitter, blogs, wikis and others are reviewed in relation to their applicability, functionality and ease of use by engineering and technical personnel. The paper also documents three case examples of engineering organisations that have successfully employed Web 2.0 to achieve superior knowledge management. With a number of clear recommendations, the paper is an essential starting point for any organisation looking at the use of new-generation technologies for achieving the significant outcomes associated with knowledge transfer.