Abstract:
The pulsed decline and eventual extinction of 51 species of elongate, cylindrical deep-sea benthic foraminifera (Stilostomellidae, Pleurostomellidae, and some Nodosariidae) occurred at intermediate water depths (1145-2168 m, Sites 980 and 982) in the northern North Atlantic during the mid-Pleistocene transition (MPT, 1.2-0.6 Ma). In the early Pleistocene, prior to their disappearance, these species comprised up to 20% of the total abundance of the benthic foraminiferal assemblage at 2168 m, but only up to 2% at 1145 m. The MPT extinction of 51 species represents ~20% of the total benthic foraminiferal diversity at bathyal depths in the North Atlantic (excluding the myriad of small unilocular forms). The extinction rate during the MPT was approximately 10 species per 0.1 myr, one to two orders of magnitude greater than normal background turnover rates of deep-sea benthic foraminifera. Comparison of the precise timings of declines and disappearances (= highest occurrences) of each species shows that they were often diachronous between the two depths. The last of these species to disappear in the North Atlantic was Pleurostomella alternans, at ~0.679 and ~0.694 Ma at Sites 980 and 982, respectively, which is in good agreement with the previously documented global "Stilostomella extinction" datum within the period 0.7-0.58 Ma. Comparison with similar studies in intermediate-depth waters in the Southwest Pacific Gateway indicates that ~61% of the extinct species were common to both regions and that, although the pattern of pulsed decline was similar, the precise order and timing of the extinction of individual species were mostly different on opposite sides of the world. Previous studies have indicated that this extinct group of elongate, cylindrical foraminifera lived infaunally and had their greatest abundances in poorly ventilated, lower-oxygen environments.
Our study supports this view: there is a strong positive correlation (r ≈ +0.8) between the flux of the extinction group and that of low-oxygen/high-organic-input species (such as Uvigerina, Bulimina and Bolivina) during the MPT, suggesting a close relationship with lower oxygen levels and a high food supply to the sea floor. The absolute abundance, flux, and number of species in the extinction group show a progressive withdrawal pattern, with major decreases occurring in cold periods with high δ13C values. This might be related to increasing chemical ventilation of glacial intermediate water.
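The kind of flux co-variation reported above can be illustrated with a minimal Pearson-correlation sketch. The flux values below are hypothetical stand-ins for two downcore records (extinction-group flux vs. low-oxygen taxa flux), not data from the study:

```python
import numpy as np

# Hypothetical downcore flux series (specimens cm^-2 kyr^-1); values are
# illustrative only -- they merely mimic two records declining together.
extinction_flux = np.array([12.0, 9.5, 7.1, 5.8, 3.2, 2.5, 1.1, 0.4])
low_oxygen_flux = np.array([30.0, 26.0, 21.0, 18.5, 11.0, 9.0, 4.5, 2.0])

# Pearson correlation coefficient between the two records
r = np.corrcoef(extinction_flux, low_oxygen_flux)[0, 1]
print(round(r, 2))
```

A strongly positive r (as here) is consistent with, but does not by itself establish, the shared environmental control inferred in the abstract.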
Abstract:
The chemical composition of surface-associated metabolites of two Fucus species (Fucus vesiculosus and Fucus serratus) was analysed by means of gas chromatography-mass spectrometry (GC-MS) to describe temporal patterns in chemical surface composition. Method: The two perennial brown macroalgae F. vesiculosus and F. serratus were sampled monthly at Bülk, outer Kiel Fjord, Germany (54°27'21 N / 10°11'57 E) over an entire year (August 2012 - July 2013). Per month and species, six non-fertile Fucus individuals were collected from mixed stands at a depth of 0.5 m below mid-water level. For surface extraction, approx. 50 g of the upper 5-10 cm apical thalli tips were cut off per species. The surface extraction of Fucus was performed according to the protocol of de Nys and co-workers (1998) with minor modifications (see Rickert et al. 2015). GC/EI-MS measurements were performed with a Waters GCT Premier (Waters, Manchester, UK) coupled to an Agilent 6890N GC equipped with a DB-5 ms 30 m column (0.25 mm internal diameter, 0.25 µm film thickness; Agilent, USA). The inlet temperature was maintained at 250°C and samples were injected in split-10 mode. The He carrier gas flow was adjusted to 1 ml min-1. Alkanes were used for referencing of retention times. For further details (GC-MS sample preparation and analysis) see the related publication (Rickert et al. submitted to PLOS ONE).
Abstract:
During the late Pliocene (~3 to 2.5 Ma), oceanic records of opal and C37 alkenone accumulation from around the world show a secular shift towards lower values in the high latitudes and higher values in the low and mid latitudes. These shifts are broadly coincident with the intensification of northern hemisphere glaciation and are suggestive of changes in export productivity, with potential implications for Pliocene atmospheric carbon dioxide concentrations. The interpretation of a global latitudinal shift in productivity, however, requires testing because of the potential uncertainties associated with site to site comparisons of records that can be influenced by highly nonlinear processes associated with production, export, and preservation. Here, we assess the inferred Pliocene latitudinal productivity shift interpretation by presenting new records of C37 alkenone accumulation from Ocean Drilling Program (ODP) Site 982 in the North Atlantic and biotic assemblages (calcareous nannoplankton) from this site and ODP Site 846 in the eastern tropical Pacific. Our results corroborate the interpretation of C37 alkenone accumulation as a proxy for gross export productivity at these sites, indicating that large-scale productivity decreases at high latitudes and increases at tropical sites are recorded robustly. We conclude that the intensification of northern hemisphere glaciation during the late Pliocene was associated with a profound reorganisation of ocean biogeochemistry.
Abstract:
Online technological advances are pioneering the wider distribution of geospatial information for general mapping purposes. The use of popular web-based applications, such as Google Maps, is ensuring that mapping-based applications become commonplace amongst Internet users, which has facilitated the rapid growth of geo-mashups. These user-generated creations enable Internet users to aggregate and publish information over specific geographical points. This article identifies privacy-invasive geo-mashups that involve the unauthorized use of personal information, the inadvertent disclosure of personal information, and other invasions of privacy. Building on Zittrain’s Privacy 2.0, the author contends that first-generation information privacy laws, founded on the notions of fair information practices or information privacy principles, may have a limited impact on the resolution of privacy problems arising from privacy-invasive geo-mashups, principally because geo-mashups have different patterns of personal information provision, collection, storage and use that reflect fundamental changes in the Web 2.0 environment. The author concludes by recommending embedded technical and social solutions to minimize the risks arising from privacy-invasive geo-mashups, which could lead to the establishment of guidelines for the general protection of privacy in geo-mashups.
Abstract:
Introduction: 3.0 Tesla MRI offers the potential to quantify the volume fraction and structural texture of cancellous bone, along with quantification of marrow composition, in a single non-invasive examination. This study describes our preliminary investigations to identify parameters which describe cancellous bone structure, including the relationships between texture and volume fraction.
Abstract:
This report is the primary output of Project 4: Copyright and Intellectual Property, the aim of which was to produce a report considering how greater access to and use of government information could be achieved within the scope of the current copyright law. In our submission for Project 4, we undertook to address: • the policy rationales underlying copyright and how they apply in the context of materials owned, held and used by government; • the recommendations of the Copyright Law Review Committee (CLRC) in its 2005 report on Crown copyright; • the legislative and regulatory barriers to information sharing in key domains, including where legal impediments such as copyright have been relied upon (whether rightly or wrongly) to justify a refusal to provide access to government data; • copyright licensing models appropriate to government materials and examples of licensing initiatives in Australia and other relevant jurisdictions; and • issues specific to the galleries, libraries, archives and museums (“GLAM”) sector, including management of copyright in legacy materials and “orphan” works. In addressing these areas, we analysed the submissions received in response to the Government 2.0 Taskforce Issues Paper, consulted with members of the Task Force as well as several key stakeholders and considered the comments posted on the Task Force’s blog. This Project Report sets out our findings on the above issues. It puts forward recommendations for consideration by the Government 2.0 Task Force on steps that can be taken to ensure that copyright and intellectual property promote access to and use of government information.
Abstract:
The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). 
Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were developed using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage. Other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytic focus from individual attitudes and behaviours to the shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed. Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition: the Web 2.0 learning initiative was useful-in-principle but useless-in-practice.
While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that hindered the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) being sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others.
These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic for these individuals is then ‘both/and’ rather than ‘either/or’: a capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school, and to its complex relationship with students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
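The incremental CART modelling summarised in this abstract can be sketched with a minimal, self-contained illustration of a single CART split: an exhaustive search for the (feature, threshold) pair that minimises weighted Gini impurity. The feature names, scales and survey rows below are hypothetical stand-ins, not data from the thesis:

```python
def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_stump(rows, labels, features):
    """One CART step: pick the (feature, threshold) split that minimises
    the weighted Gini impurity of the two resulting child nodes."""
    best = None
    for f in features:
        for t in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best

# Hypothetical survey rows (1-5 scales) and usage labels (1 = frequent user)
rows = [
    {"playfulness": 2, "peer_support": 1},
    {"playfulness": 4, "peer_support": 1},
    {"playfulness": 3, "peer_support": 5},
    {"playfulness": 5, "peer_support": 4},
]
usage = [0, 0, 1, 1]

# Incremental predictor blocks: individual-level only, then individual + social
print(best_stump(rows, usage, ["playfulness"]))
print(best_stump(rows, usage, ["playfulness", "peer_support"]))
```

In this toy data the social variable yields a cleaner split than the individual disposition alone, loosely mirroring the finding that peer support emerged as the best predictor; a real CART analysis would of course grow full trees and validate them.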
Abstract:
This Report, prepared for Smart Service Queensland (“SSQ”), addresses legal issues, areas of risk and other factors associated with activities conducted on three popular online platforms—YouTube, MySpace and Second Life (which are referred to throughout this Report as the “Platforms”). The Platforms exemplify online participatory spaces and behaviours, including blogging and networking, multimedia sharing, and immersive virtual environments.
Abstract:
The traditional model for information dissemination in disaster response is unidirectional, from official channels to the public. However, recent crises in the US, such as Hurricane Katrina and the Californian bushfires, show that civilians are now turning to Web 2.0 technologies as a means of sharing disaster-related information. These technologies present enormous potential benefits to disaster response authorities that cannot be overlooked. In Australia, the Victorian Bushfires Royal Commission has recently recommended that Australian disaster response authorities utilize information technologies to improve the dissemination of disaster-related bushfire information. However, whilst the use of these technologies has many positive attributes, potential legal liabilities for disaster response authorities arise. This paper identifies some potential legal liabilities arising from the use of Web 2.0 technologies in disaster response situations, thereby enhancing crisis-related information sharing by highlighting legal concerns that need to be addressed.
Abstract:
Following the position of Beer and Burrows (2007), this paper proposes a re-conceptualization of Web 2.0 interaction in order to understand the properties of action possibilities in and of Web 2.0. The paper discusses the positioning of Web 2.0 social interaction in light of current descriptions, which point toward the capacities of technology in the production of social affordances within that domain (Bruns 2007; Jenkins 2006; O’Reilly 2005). While this positioning diminishes the agency and reflexivity of users of Web 2.0, it also inadvertently positions tools as the central driver of the interactive potential available (Everitt and Mills 2009; van Dijck 2009). In doing so it neglects the possibility that participants may be more involved in the production of Web 2.0 than the technology that underwrites it. It is this aspect of Web 2.0 that is questioned in the study, with particular interest in how an analytical option may be made available to broaden the scope of investigations into Web 2.0 to include a study of the capacity for an interactive potential in light of how action possibilities are presented to users through communication with others (Bonderup Dohn 2009).
Abstract:
An essential challenge for organizations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. Using three case examples, this paper explores how Enterprise 2.0 technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering organizations. The paper is intended to be a timely introduction to the benefits and issues associated with the use of Enterprise 2.0 technologies, with the aim of achieving the positive outcomes associated with knowledge management.