814 results for Peer-to-Peer Networks


Relevance: 100.00%

Abstract:

Informed by a model of family role-redistribution derived from the Family Ecology Framework (Pedersen & Revenson, 2005), this study examined differences in two proposed psychological components of role-redistribution (youth caregiving experiences and responsibilities) between youth of a parent with illness and their peers from ‘healthy’ families, controlling for whether the ill person is a parent or another family member, illness type, and demographics. Based on self-report questionnaire data, four groups of Australian children were derived from a community sample of 2,474 youth (‘healthy’ family, n=1768; parental illness, n=336; other family member illness, n=254; both parental and other family member illness, n=116). The presence of any family member with a serious illness is associated with an intensification of youth caregiving experiences relative to peers from healthy families. This risk is elevated if the ill family member is a parent, if more illnesses are present, and by certain youth and family demographics, and especially by higher caregiving responsibilities. The presence of a family member, particularly a parent, with a serious medical condition has pervasive effects on youth caregiving compared to healthy families, and these effects are not fully accounted for by illness type, demographics or caregiving responsibilities.

Relevance: 100.00%

Abstract:

In 2012 the Australian Commonwealth government was scheduled to release the first dedicated policy for culture and the arts since the Keating government's Creative Nation (1994). Investing in a Creative Australia was to appear after a lengthy period of consultation between the Commonwealth government and all interested cultural sectors and organisations. When it eventuates, the policy will be of particular interest to those information professionals working in the GLAM (galleries, libraries, archives and museums) environment. GLAM is a cross-institutional field which seeks to find points of commonality among various cultural-heritage institutions, while still recognising their points of difference. Digitisation, collaboration and convergence are key themes and characteristics of the GLAM sector and its associated theoretical discipline. The GLAM movement has seen many institutions seeking to work together to create networks of practice that are beneficial to the cultural-heritage industry and sector. With a new Australian cultural policy imminent, it is timely to reflect on the issues and challenges that GLAM principles present to national cultural-heritage institutions by discussing their current practices. In doing so, it is possible to suggest productive ways forward for these institutions which could then be supported at a policy level by the Commonwealth government. Specifically, this paper examines four institutions: the National Gallery of Australia, the National Library of Australia, the National Archives of Australia and the National Museum of Australia. The paper reflects on their responses to the Commonwealth's 2011 Cultural Policy Discussion Paper. It argues that by encouraging and supporting collecting institutions to participate more fully in GLAM practices the Commonwealth government's cultural policy would enable far greater public access to, and participation in, Australia's cultural heritage. 
Furthermore, by considering these four institutions, the paper presents a discussion of the challenges and the opportunities that GLAM theoretical and disciplinary principles present to the cultural-heritage sector. Implications for Best Practice:
- GLAM is a developing field of theory and practice that encompasses many issues and challenges for practitioners in this area.
- GLAM principles and practices are increasingly influencing the cultural-heritage sector.
- Cultural policy is a key element in shaping the future of Australia's cultural-heritage sector and needs to incorporate GLAM principles.

Relevance: 100.00%

Abstract:

Background This study addresses limitations of prior research that have used group comparison designs to test the effects of parental illness on youth. Purpose This study examined differences in adjustment between children of a parent with illness and peers from ‘healthy’ families controlling for the effects of whether a parent or non-parent family member is ill, illness type, demographics and caregiving. Methods Based on questionnaire data, groups were derived from a community sample of 2,474 youth (‘healthy’ family, n = 1768; parental illness, n = 336; other family member illness, n = 254; both parental and other family illness, n = 116). Results The presence of any family member with an illness is associated with greater risk of mental health difficulties for youth relative to peers from healthy families. This risk is elevated if the ill family member is a parent and has mental illness or substance misuse. Conclusions Serious health problems within a household adversely impact youth adjustment.

Relevance: 100.00%

Abstract:

This article considers the challenges posed to intellectual property law by the emerging field of bioinformatics. It examines the intellectual property strategies of established biotechnology companies, such as Celera Genomics, and information technology firms entering into the marketplace, such as IBM. First, this paper argues that copyright law is not irrelevant to biotechnology, as some commentators would suggest. It claims that the use of copyright law and contract law is fundamental to the protection of biomedical and genomic databases. Second, this article questions whether biotechnology companies are exclusively interested in patenting genes and genetic sequences. Recent evidence suggests that biotechnology companies and IT firms are patenting bioinformatics software and Internet business methods, as well as underlying instrumentation such as microarrays and genechips. Finally, this paper evaluates what impact the privatisation of bioinformatics will have on public research and scientific communication. It raises important questions about integration, interoperability, and the risks of monopoly. It concludes by considering whether open source software such as the Ensembl Project and peer-to-peer technology like DSAS will be able to counter this trend of privatisation.

Relevance: 100.00%

Abstract:

In recent years, rapid advances in information technology have led to various data collection systems which are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors including loop detectors, probe vehicles, cell-phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Despite the fact that there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the enhanced complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, un-signalized or signalized, at which the switching of the traffic lights and the turning maneuvers of the road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology to build up a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly from loop detectors and partial observations from Bluetooth and GPS devices.
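The multi-source fusion the paper appeals to can be illustrated with simple inverse-variance weighting of two independent estimates; this is a generic sketch, not the model-based method the paper develops, and `fuse_estimates` and its arguments are illustrative names:

```python
def fuse_estimates(x_loop, var_loop, x_probe, var_probe):
    """Combine two independent noisy estimates of the same traffic
    quantity (e.g. a link travel time) by inverse-variance weighting:
    the more reliable source receives the larger weight."""
    w_loop, w_probe = 1.0 / var_loop, 1.0 / var_probe
    fused = (w_loop * x_loop + w_probe * x_probe) / (w_loop + w_probe)
    fused_var = 1.0 / (w_loop + w_probe)   # fused estimate is less uncertain
    return fused, fused_var
```

Fusing a loop-detector estimate of 10 with a probe estimate of 12 under equal variances yields 11 with half the variance of either source, matching the claim that combining complementary information improves accuracy and reduces ambiguity.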

Relevance: 100.00%

Abstract:

This chapter considers the legal ramifications of Wikipedia, and other online media, such as the Encyclopedia of Life. Nathaniel Tkacz (2007) has observed: 'Wikipedia is an ideal entry-point from which to approach the shifting character of knowledge in contemporary society.' He observes: 'Scholarship on Wikipedia from computer science, history, philosophy, pedagogy and media studies has moved beyond speculation regarding its considerable potential, to the task of interpreting - and potentially intervening in - the significance of Wikipedia's impact' (Tkacz 2007). After an introduction, Part II considers the evolution and development of Wikipedia, and the legal troubles that have attended it. It also considers the establishment of rival online encyclopedias - such as Citizendium, set up by Larry Sanger, the co-founder of Wikipedia; and Knol, the mysterious new project of Google. Part III explores the use of mass, collaborative authorship in the field of science. In particular, it looks at the development of the Encyclopedia of Life, which seeks to document the world's biodiversity. This chapter expresses concern that Wiki-based software had to develop in a largely hostile legal environment. It contends that copyright law and related fields of intellectual property need to be reformed to better accommodate users of copyright material (Rimmer 2007). This chapter makes a number of recommendations. First, there is a need to acknowledge and recognize forms of mass, collaborative production and consumption - not just individual authorship. Second, the view of a copyright 'work' and other subject matter as a complete and closed piece of cultural production also should be reconceptualised. Third, the defense of fair use should be expanded to accommodate a wide range of amateur, peer-to-peer production activities - not only in the United States, but in other jurisdictions as well.
Fourth, the safe harbor protections accorded to Internet intermediaries, such as Wikipedia, should be strengthened. Fifth, there should be a defense in respect of the use of 'orphan works' - especially in cases of large-scale digitization. Sixth, the innovations of open source licensing should be expressly incorporated and entrenched within the formal framework of copyright laws. Finally, courts should craft judicial remedies to take into account concerns about political censorship and freedom of speech.

Relevance: 100.00%

Abstract:

We consider a single server queue with the interarrival times and the service times forming a regenerative sequence. This traffic class includes the standard models: i.i.d., periodic, Markov modulated (e.g., the BMAP model of Lucantoni [18]) and their superpositions. This class also includes the recently proposed traffic models in high speed networks exhibiting long-range dependence. Under minimal conditions we obtain the rates of convergence to stationary distributions, finiteness of stationary moments, various functional limit theorems and the continuity of stationary distributions and moments. We use the continuity results to obtain approximations for stationary distributions and moments of an MMPP/GI/1 queue where the modulating chain has a countable state space. We extend all our results to feedforward networks where the external arrivals to each queue can be regenerative. Finally, we show that the output process of a leaky bucket is regenerative if the input process is, and hence our results extend to a queue with arrivals controlled by a leaky bucket.
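Waiting times in such a single server queue can be simulated directly with the standard Lindley recursion, regardless of the dependence structure of the input sequences (a minimal sketch; the function name is ours):

```python
def lindley_waits(interarrivals, services):
    """Waiting times in a single-server FIFO queue via the Lindley
    recursion W_{n+1} = max(0, W_n + S_n - A_{n+1}).

    interarrivals[i] is the gap between arrivals i and i+1;
    services[i] is the service time of customer i."""
    waits = [0.0]                          # the first customer never waits
    for i, gap in enumerate(interarrivals):
        waits.append(max(0.0, waits[i] + services[i] - gap))
    return waits
```

Because the recursion consumes only the two input sequences, the same code applies to i.i.d., periodic, Markov-modulated or long-range dependent inputs; only the way the sequences are generated changes.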

Relevance: 100.00%

Abstract:

The present study examined how personality and social psychological factors affect third and fourth graders' computer-mediated communication. Personality was analysed in terms of the following strategies: optimism, pessimism and defensive pessimism. Students worked either individually or in dyads which were paired homogeneously or heterogeneously according to the strategies. Moreover, the present study compared horizontal and vertical interaction. The study also examined the role that popularity plays, and students were divided into groups based on their popularity level. The results show that an optimistic strategy is useful. Optimism was found to be related to the active production and processing of ideas. Although previous research has identified drawbacks to pessimism in achievement settings, this study shows that the pessimistic strategy is not as debilitating a strategy as is usually assumed. Pessimistic students were able to process their ideas. However, defensive pessimists were somewhat cautious in introducing or changing ideas. Heterogeneous dyads were not beneficial configurations with respect to producing, introducing, or changing ideas. Moreover, many differences were found to exist between the horizontal and vertical interaction; specifically, the students expressed more opinions and feelings when teachers took no part in the discussions. Strong emotions were observed especially in the horizontal interaction. Further, group working skills were found to be more important for boys than for girls, while rejected students were not at a disadvantage compared to popular ones. Schools can encourage emotional and social learning. The present study shows that students can use computers to express their feelings. In addition, students who are unpopular in non-computer contexts or students who use pessimism can benefit from computers. Participation in computer discussions can give unpopular children a chance to develop confidence when relating to peers.

Relevance: 100.00%

Abstract:

Mesh topologies are important for large-scale peer-to-peer systems that use low-power transceivers. The Quality of Service (QoS) in such systems is known to decrease as the scale increases. We present a scalable approach for dissemination that exploits all the shortest paths between a pair of nodes and improves the QoS. Despite the presence of multiple shortest paths in a system, we show that these paths cannot be exploited by spreading the messages over the paths in a simple round-robin manner; nodes along one of these paths will always handle more messages than the nodes along the other paths. We characterize the set of shortest paths between a pair of nodes in regular mesh topologies and derive rules, using this characterization, to effectively spread the messages over all the available paths. These rules ensure that all the nodes that are at the same distance from the source handle roughly the same number of messages. By modeling the multihop propagation in the mesh topology as a multistage queuing network, we present simulation results from a variety of scenarios that include link failures and propagation irregularities to reflect real-world characteristics. Our method achieves improved QoS in all these scenarios.
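The imbalance described above can be made concrete by counting shortest (monotone) paths in a regular mesh: nodes near the diagonal lie on many more shortest paths than nodes near the edges, so naive round-robin spreading overloads them. This is an illustrative sketch, not the paper's spreading rules:

```python
from math import comb

def paths_through(node, dst):
    """Number of shortest monotone grid paths from (0, 0) to dst that
    pass through `node`: paths into the node times paths out of it."""
    (i, j), (r, c) = node, dst
    return comb(i + j, i) * comb((r - i) + (c - j), r - i)
```

Between (0, 0) and (2, 2) there are comb(4, 2) = 6 shortest paths; the central node (1, 1) lies on 4 of them, while the off-diagonal nodes (0, 2) and (2, 0) lie on only 1 each, which is why the spreading rules must weight paths unevenly to equalize load at each distance.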

Relevance: 100.00%

Abstract:

BACKGROUND OR CONTEXT Thermodynamics is a core concept for mechanical engineers, yet it is notoriously difficult. Evidence suggests students struggle to understand and apply the core fundamental concepts of thermodynamics, with analysis indicating a problem with student learning/engagement. A contributing factor is that thermodynamics is a ‘science involving concepts based on experiments’ (Mayhew 1990), with subject matter that cannot be completely defined a priori. To succeed, students must engage in a deep-holistic approach while taking ownership of their learning. The difficulty in achieving this often manifests itself in students ‘not getting’ the principles and declaring thermodynamics ‘hard’. PURPOSE OR GOAL Traditionally, students practise and “learn” the application of thermodynamics in their tutorials; however, these do not consider prior conceptions (Holman & Pilling 2004). As ‘hands-on’ learning is the desired outcome of tutorials, it is pertinent to study methods of improving their efficacy. Within the Australian context, the format of thermodynamics tutorials has remained relatively unchanged over the decades, relying anecdotally on a primarily didactic pedagogical approach. Such approaches are not conducive to deep learning (Ramsden 2003), with students often disengaged from the learning process. Evidence suggests (Haglund & Jeppsson 2012), however, that a deeper level and ownership of learning can be achieved using a more constructivist approach, for example through self-generated analogies. This pilot study aimed to collect data to support the hypothesis that the ‘difficulty’ of thermodynamics is associated with the pedagogical approach of tutorials rather than actual difficulty in subject content or deficiency in students. APPROACH Successful application of thermodynamic principles requires solid knowledge of the core concepts. Typically, tutorial sessions guide students in this application.
However, a lack of deep and comprehensive understanding can lead to student confusion in the applications, resulting in learning the ‘process’ of application without understanding ‘why’. The aim of this study was to gain empirical data on student learning of both concepts and application within thermodynamics tutorials. The approach taken for data collection and analysis was:
- 1. Four concurrent tutorial streams were timetabled to examine student engagement/learning in traditional ‘didactic’ (3 weeks) and non-traditional (3 weeks) formats. In each week, two of the selected four sessions were traditional and two non-traditional. This provided a control group for each week.
- 2. The non-traditional tutorials involved activities designed to promote student-centered deep learning. Specific pedagogies employed were: self-generated analogies, constructivist learning, peer-to-peer learning, inquiry-based learning, ownership of learning and active learning.
- 3. After a three-week period, the teaching styles of the selected groups were switched, to allow each group to experience both approaches with the same tutor. This also acted to minimise any influence of tutor personality/style on the data.
- 4. At the conclusion of the trial, participants completed a ‘5 minute essay’ on how they liked the sessions, a small questionnaire modelled on the modified (Christo & Hoang, 2013) SPQ designed by Biggs (1987), and a small formative quiz to gauge the level of learning achieved.
DISCUSSION Preliminary results indicate that overall students respond positively to in-class demonstrations (inquiry-based learning) and active learning activities. Within the active learning exercises, the current data suggest students preferred individual rather than group or peer-to-peer activities.
Preliminary results from the open-ended questions, such as “What did you like most/least about this tutorial” and “Do you have other comments on how this tutorial could better facilitate your learning”, however, indicated polarising views on the non-traditional tutorial. Some students responded that they really liked the format and emphasis on understanding the concepts, while others were very vocal that they ‘hated’ the style and just wanted the solutions to be presented by the tutor. RECOMMENDATIONS/IMPLICATIONS/CONCLUSION Preliminary results indicated a mixed, but overall positive, response by students to more collaborative tutorials employing tasks promoting inquiry-based, peer-to-peer, active, and ownership-of-learning activities. Preliminary results from student feedback support evidence that students learn differently, and that running tutorials focusing on only one pedagogical approach (typically didactic) may not be beneficial to all students. Further, preliminary data suggest that the learning/teaching styles of both students and tutor are important to promoting deep learning in students. Data collection is still ongoing and scheduled for completion at the end of First Semester (Australian academic calendar). The final paper will examine in more detail the results and analysis of this project.

Relevance: 100.00%

Abstract:

Delay and disruption tolerant networks (DTNs) are computer networks where round-trip delays and error rates are high and disconnections frequent. Examples of these extreme networks are space communications, sensor networks, connecting rural villages to the Internet, and even interconnecting commodity portable wireless devices and mobile phones. The basic elements of delay tolerant networks are store-and-forward message transfer resembling traditional mail delivery, opportunistic and intermittent routing, and an extensible cross-region resource naming service. Individual nodes of the network take an active part in routing the traffic and provide in-network data storage for application data that flows through the network. Application architecture for delay tolerant networks also differs from that used in traditional networks. It has become feasible to design applications that are network-aware and opportunistic, taking advantage of different network connection speeds and capabilities. This might change some of the basic paradigms of network application design. DTN protocols also support the design of applications that depend on processes persisting over reboots and power failures. DTN protocols could likewise be applicable to traditional networks in cases where high tolerance to delays or errors is desired. It is apparent that challenged networks also challenge the traditional strictly layered model of network application design. This thesis provides an extensive introduction to delay tolerant networking concepts and applications. Most attention is given to the challenging problems of routing and application architecture. Finally, future prospects of DTN applications and implementations are envisioned through recent research results and an interview with an active researcher of DTN networks.
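The store-and-forward element described above can be sketched as a toy epidemic-style exchange, in which bundles are buffered at each node until a contact occurs. This is illustrative only; `DTNNode` and its methods are hypothetical names, not the Bundle Protocol API:

```python
class DTNNode:
    """Minimal store-and-forward node: messages ("bundles") are held in
    an in-network buffer and copied onward at opportunistic contacts."""

    def __init__(self, name):
        self.name = name
        self.buffer = {}                       # bundle id -> payload

    def create_bundle(self, bundle_id, payload):
        self.buffer[bundle_id] = payload

    def contact(self, peer):
        """An opportunistic contact: each side copies over every bundle
        the other does not yet hold (epidemic-style dissemination)."""
        for bid, data in list(self.buffer.items()):
            peer.buffer.setdefault(bid, data)
        for bid, data in list(peer.buffer.items()):
            self.buffer.setdefault(bid, data)
```

A bundle created at one node can reach a node it never meets directly through a chain of intermittent contacts, which is the essence of the opportunistic routing the thesis discusses.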

Relevance: 100.00%

Abstract:

In recent years, XML has been accepted as the format of messages for several applications. Prominent examples include SOAP for Web services, XMPP for instant messaging, and RSS and Atom for content syndication. This XML usage is understandable, as the format itself is a well-accepted standard for structured data, and it has excellent support for many popular programming languages, so inventing an application-specific format no longer seems worth the effort. Simultaneously with XML's rise to prominence there has been an upsurge in the number and capabilities of various mobile devices. These devices are connected through various wireless technologies to larger networks, and a goal of current research is to integrate them seamlessly into these networks. These two developments seem to be at odds with each other. XML as a fully text-based format takes up more processing power and network bandwidth than binary formats would, whereas the battery-powered nature of mobile devices dictates that energy, both in processing and transmitting, be utilized efficiently. This thesis presents the work we have performed to reconcile these two worlds. We present a message transfer service that we have developed to address what we have identified as the three key issues: XML processing at the application level, a more efficient XML serialization format, and the protocol used to transfer messages. Our presentation includes both a high-level architectural view of the whole message transfer service, as well as detailed descriptions of the three new components. These components consist of an API, and an associated data model, for XML processing designed for messaging applications, a binary serialization format for the data model of the API, and a message transfer protocol providing two-way messaging capability with support for client mobility. We also present relevant performance measurements for the service and its components.
As a result of this work, we do not consider XML to be inherently incompatible with mobile devices. As the fixed networking world moves toward XML for interoperable data representation, so should the wireless world, to provide a better-integrated networking infrastructure. However, the problems raised by XML adoption touch all of the higher layers of application programming, so instead of concentrating simply on the serialization format, we conclude that improvements need to be made in an integrated fashion across all of these layers.
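The space savings a binary serialization format can offer over textual XML can be illustrated with a toy tokenized encoding, in the spirit of formats such as WBXML or EXI. This is not the thesis's format; the event model and byte codes here are invented for illustration:

```python
def binary_encode(events, table=None):
    """Encode a stream of XML-like events into a compact byte string.
    Element names are replaced by one-byte codes from a string table
    that grows as new names are seen, so repeated tags cost one byte."""
    table = {} if table is None else table
    out = bytearray()
    for kind, value in events:
        if kind == "start":                    # 0x01 <name code>
            code = table.setdefault(value, len(table))
            out += bytes([0x01, code])
        elif kind == "end":                    # 0x02
            out.append(0x02)
        else:                                  # text: 0x03 <len> <utf-8 bytes>
            data = value.encode("utf-8")
            out += bytes([0x03, len(data)]) + data
    return bytes(out), table
```

The document `<msg><to>me</to></msg>` takes 22 bytes as text but 10 bytes in this encoding; real formats layer typed content and schema knowledge on top of the same basic idea.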

Relevance: 100.00%

Abstract:

- Background and Purpose Given the turbulent and highly contested environment in which professional coaches work, a prime concern for coach developers is how coaches learn their craft. Understanding the learning and development of senior coaches (SCs) and assistant coaches (ACs) in the Australian Football League (AFL – the peak organisation for Australian Rules Football) is important to better develop the next generation of performance coaches. Hence the focus of this research was to examine the learning of SCs and ACs in the AFL. Fundamental to this research was an understanding that the AFL, and each club within the league, be regarded as learning organisations and workplaces with their own learning cultures where learning takes place. The purpose of this paper was to examine the learning culture for AFL coaches. - Method Five SCs, six ACs, and five administrators (four of whom were former coaches) at 11 of the 16 AFL clubs were recruited for the research project. First, demographic data were collected for each participant (e.g. age, playing and coaching experience, development and coach development activities). Second, all participants were involved in one semi-structured interview of between 45 and 90 minutes' duration. An interpretative (hierarchical content) analysis of the interview data was conducted to identify key emergent themes. - Results Learning was central to AFL coaches becoming an SC. Nevertheless, coaches reported a sense of isolation and a lack of support in developing their craft within their particular learning culture. These coaches developed a unique dynamic social network (DSN) that involved episodic contact with a number of respected confidantes, often from diverse fields (used here in the Bourdieuian sense), in developing their coaching craft. Although there were some opportunities in their workplace, much of their learning was unmediated by others, underscoring the importance of their agentic engagement with limited workplace affordances.
- Conclusion The variety of people accessed for the purposes of learning (often beyond the immediate workplace) and the long time taken to establish networks of supporters meant that a new way of describing the social networks of AFL coaches was needed: the DSN. However, despite the acknowledged utility of learning from others, all coaches reported some sense of isolation in their learning. The sense of isolation brought about by professional volatility in high-performance Australian Football offers an alternative view on Hodkinson, Biesta and James' attempt at overcoming dualisms in learning.

Relevance: 100.00%

Abstract:

This book deals with the television of the future and certain marketing-law and copyright questions that arise in connection with new forms of television broadcasting. TV technology has developed considerably in recent years, and it is above all the transmission technology that has undergone the greatest changes. The emphasis of this work is on the delivery of TV broadcasts over the Internet using P2P (peer-to-peer) technology, and the marketing-law and copyright questions this raises. The new delivery technology enables a range of new marketing methods, and the book discusses how these methods relate to current marketing-law regulation. The distribution of copyright-protected material over the Internet has led to several reforms of current copyright legislation aimed at strengthening rights holders' exclusive rights. An important question the book addresses is how current copyright legislation should be interpreted in relation to new forms of distribution over the Internet. The book also takes up issues connected with the fact that a large share of the material distributed over the Internet is distributed without the rights holder's consent, which entails an economic loss for the rights holder. The book presents alternative solutions for how Internet distribution could provide the rights holder with economic compensation even though the distribution itself is free of charge.

Relevance: 100.00%

Abstract:

Erasure coding techniques are used to increase the reliability of distributed storage systems while minimizing storage overhead. Also of interest is minimization of the bandwidth required to repair the system following a node failure. In a recent paper, Wu et al. characterize the tradeoff between the repair bandwidth and the amount of data stored per node. They also prove the existence of regenerating codes that achieve this tradeoff. In this paper, we introduce Exact Regenerating Codes, which are regenerating codes possessing the additional property of being able to duplicate the data stored at a failed node. Such codes require low processing and communication overheads, making the system practical and easy to maintain. An explicit construction of exact regenerating codes is provided for the minimum-bandwidth point on the storage-repair bandwidth tradeoff, relevant to distributed-mail-server applications. A subspace-based approach is provided and shown to yield necessary and sufficient conditions on a linear code to possess the exact regeneration property, as well as to prove the uniqueness of our construction. Also included in the paper is an explicit construction of regenerating codes for the minimum-storage point, for parameters relevant to storage in peer-to-peer systems. This construction supports a variable number of nodes and can handle multiple simultaneous node failures. All constructions given in the paper are of low complexity, requiring a low field size in particular.
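The exact-regeneration property can be illustrated with the simplest possible erasure code, a (3, 2) single-parity code, in which the repaired node ends up holding exactly the bytes it held before failing. This is a toy sketch, far simpler than the paper's constructions:

```python
def xor_bytes(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def encode(d1, d2):
    """(3, 2) single-parity code: two data blocks plus their XOR.
    Any single node failure leaves enough information to repair."""
    return [d1, d2, xor_bytes(d1, d2)]

def repair(nodes, failed):
    """Exact repair: the two surviving blocks XOR back to precisely the
    block the failed node stored, not merely an equivalent block."""
    survivors = [blk for i, blk in enumerate(nodes) if i != failed]
    return xor_bytes(survivors[0], survivors[1])
```

Regenerating codes generalize this idea, trading per-node storage against the bandwidth downloaded from the surviving nodes during repair; the exact variant introduced in the paper adds the verbatim-restoration guarantee sketched here.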