979 results for Core issues


Relevance:

30.00%

Publisher:

Abstract:

This paper is a rebuttal to the reviews of Moum and Land. Publication has been facilitated by Journal policy (see the Instructions to Authors, 'Procedure when the authors and reviewers disagree', on the Journal website), which details the procedure to be followed in the case of disputed manuscripts. Our reply is in two parts: the first concerns issues of theory and the second issues of methodology. Each point of critique raised by the reviewers has been answered and shown to be either false or irrelevant. We contend that our original conclusion still stands: subjective wellbeing is dominated by Core Affect.

Relevance:

30.00%

Publisher:

Abstract:

The Research Quality Framework (RQF) uses Thomson-ISI citation benchmarks as its main set of objective measures of research quality. The Thomson-ISI measures rely on identifying a core set of journals in which the major publications for a discipline are to be found. The core for a discipline is determined by applying a non-transparent process that is partly based on Bradford's Law (1934). Yet Bradford was not seeking measures of the quality of publications or journals. How valid, then, is it to base measures of publication quality on Bradford's Law? We explore this by returning to Bradford's Law and subsequent related research, asking 'what is Bradford's Law really about?' We go further and ask 'does Bradford's Law apply in Information Systems?' We use data from John Lamp's internationally respected Index of Information Systems Journals to explore the latter question. We have found that Information Systems may have a core of journals, only a subset of which is also in the list of Thomson-ISI journals. There remain many unanswered questions about the RQF metrics based on Thomson-ISI and their applicability to Information Systems.
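For orientation, Bradford's Law is usually stated in its zone form: when the journals of a field are ranked by the number of relevant articles they yield and divided into zones of roughly equal article counts, the number of journals needed per zone grows geometrically. In the classic three-zone statement (the Bradford multiplier n is field-dependent and is not reported in this abstract),

\[ n_1 : n_2 : n_3 \approx 1 : n : n^{2}, \]

where n_1, n_2 and n_3 are the numbers of journals in the core, second and third zones. The question raised above is whether a 'core' defined this way says anything about the quality of the journals it contains.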

Relevance:

30.00%

Publisher:

Abstract:

Recent literature in higher education argues that university assessment has been too narrow and has not adequately reflected the quality, breadth and depth of students' learning. Research shows students often prioritise and learn what they need to know for formal, graded assessment and disregard other academic content seen as less relevant to those requirements. The predominance of essays and examinations has therefore tended to constrain learning. The case for a more comprehensive approach has been clearly articulated. So what happens when staff take up the unique challenge of designing fair and uniform assessment for a large, core, multi-modal, multi-campus unit offered nationally and internationally?
When developing an undergraduate Bachelor of Commerce unit at Deakin University, staff considered the most appropriate ways to assess a range of conceptual understandings and communication skills. This resulted in the mapping and adoption of a comprehensive approach incorporating teacher, peer, and self-assessment aspects, individual and group work, oral and written presentations, and the use of portfolios and journals. Particular practices were adopted to control workloads, ensure fairness in marking, and overcome some problems generally associated with group work. When implementing the approach, practical issues arose that demanded adjustments. This paper details the approach taken, outlines research activities, and discusses the practical implications of issues that arose.

Relevance:

30.00%

Publisher:

Abstract:

This study found that a majority of Australian accounting firms either currently outsource accounting services or are considering outsourcing, and identified the significance of outsourcing contracts. It established that most firms outsource their core services. The study provided important information regarding ethical issues and the relinquishment of professional status.

Relevance:

30.00%

Publisher:

Abstract:

Issues addressed: Health promotion principles for practice are closely aligned with those of environmental sustainability. Health promotion practitioners are well positioned to take action on climate change. However, there has been scant discussion about practice synergies and, consequently, about the type and nature of professional competencies that underpin such action.

Methods: This commentary uses the Australian Health Promotion Association (AHPA) national core competencies for Health Promotion Practitioners as a basis to examine the synergies between climate change and health promotion action.

Results: We demonstrate that AHPA core competencies, such as program planning, evaluation and partnership building, are highly compatible with implementing climate change mitigation and adaptation strategies. We use food security examples to illustrate this case.

Conclusions: There appears to be considerable synergy between climate change and health promotion action. This should be a key focus of future health promotion competency development in Australia.

Relevance:

30.00%

Publisher:

Abstract:

Climate change poses serious threats to human health and well-being. It exacerbates existing health inequities, impacts on the social determinants of health and disproportionately affects vulnerable populations. In the Australian region these include remote Aboriginal communities, Pacific Island countries and people with low incomes. Given health promotion's remit to protect and promote health, it should be well placed to respond to emerging climate-related health challenges. Yet, to date, there has been little evidence to demonstrate this. This paper draws on the findings of a qualitative study conducted in Victoria, Australia to highlight that, while there is clearly a role for health promotion in climate change mitigation and adaptation at the national and international levels, there is also a need for the engagement of health promoters at the community level. This raises several key issues for health promotion practice. To be better prepared to respond to climate change, health promotion practitioners need, first, to re-engage with the central tenets of the Ottawa Charter, namely the interconnectedness of humans and the natural environment, and, second, to adopt ideas and frameworks from the sustainability field. The findings also open up a discussion for paradigmatic shifts in health promotion thinking and acting in the context of climate change.

Relevance:

30.00%

Publisher:

Abstract:

Spinal cord injury (SCI) results not only in paralysis but is also associated with a range of autonomic dysregulation that can interfere with cardiovascular, bladder, bowel, temperature, and sexual function. The extent of the autonomic dysfunction is related to the level and severity of injury to descending autonomic (sympathetic) pathways. For many years there was limited awareness of these issues, and the attention given to them by the scientific and medical community was scarce. Although a new system to document the impact of SCI on autonomic function has recently been proposed, the current standard of assessment of SCI (the American Spinal Injury Association (ASIA) examination) evaluates motor and sensory pathways, but not the severity of injury to autonomic pathways. Besides its severe impact on quality of life, autonomic dysfunction in persons with SCI is associated with increased risk of cardiovascular disease and mortality. Therefore, obtaining information regarding autonomic function in persons with SCI is pivotal, and clinical examinations and laboratory evaluations to detect the presence of autonomic dysfunction and quantitate its severity are mandatory. Furthermore, previous studies demonstrated that there is an intimate relationship between the autonomic nervous system and sleep from anatomical, physiological, and neurochemical points of view. However, although previous epidemiological studies demonstrated that sleep problems are common in SCI, so far only limited polysomnographic (PSG) data are available. Finally, until now, circadian and state-dependent autonomic regulation of blood pressure (BP), heart rate (HR) and body core temperature (BcT) had never been assessed in SCI patients. The aim of the current study was to establish the association between the autonomic control of cardiovascular function and thermoregulation, sleep parameters, and increased cardiovascular risk in SCI patients.

Relevance:

30.00%

Publisher:

Abstract:

Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer's responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages have been proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm to modern many-core systems, focusing on ease of programming. It focuses on OpenMP, the de facto standard for shared-memory programming. In the first part, the costs of algorithms for synchronization and data partitioning are analyzed, and the algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders-of-magnitude gains in speedup and energy efficiency compared to the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability, and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) that shares the memory banks with the cores is proposed, and a template for a scalable architecture that integrates the HWPUs through the shared-memory system is shown. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with it.
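To make the "multi-level and irregular parallelism" concrete, the sketch below expresses a recursive tree reduction with OpenMP tasks, the kind of workload such a runtime must schedule cheaply. It is a generic illustration under assumptions (the Node type and sum function are hypothetical), not code from the dissertation or its runtime library.

#include <cstdio>

// Hypothetical binary-tree node, used only for illustration.
struct Node {
    long value;
    Node *left;
    Node *right;
};

// Each subtree becomes an OpenMP task; the tree's irregular shape makes the
// amount of work per task unpredictable, which is what stresses the runtime.
long sum(const Node *n) {
    if (n == nullptr) return 0;
    long l = 0, r = 0;
    #pragma omp task shared(l)
    l = sum(n->left);
    #pragma omp task shared(r)
    r = sum(n->right);
    #pragma omp taskwait               // join the two child tasks
    return l + r + n->value;
}

int main() {
    Node c = {3, nullptr, nullptr};
    Node b = {2, &c, nullptr};
    Node root = {1, &b, nullptr};
    long total = 0;
    #pragma omp parallel
    #pragma omp single                 // one thread seeds the task graph
    total = sum(&root);
    std::printf("sum = %ld\n", total);
    return 0;
}

With a heavyweight runtime, each task creation and taskwait can cost more than the work in a small subtree, which is why the cost analysis described above matters on embedded many-cores.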

Relevance:

30.00%

Publisher:

Abstract:

Despite the several issues faced in the past, the evolutionary trend of silicon has kept its constant pace. Today an ever-increasing number of cores is integrated onto the same die. Unfortunately, the extraordinary performance achievable by the many-core paradigm is limited by several factors. Memory bandwidth limitations, combined with inefficient synchronization mechanisms, can severely curtail the potential computation capabilities. Moreover, the huge HW/SW design space requires accurate and flexible tools to perform architectural explorations and validate design choices. In this thesis we focus on the aforementioned aspects: a flexible and accurate Virtual Platform has been developed, targeting a reference many-core architecture. This tool has been used to perform architectural explorations, focusing on the instruction caching architecture and a hybrid HW/SW synchronization mechanism. Besides architectural implications, another issue of embedded systems is considered: energy efficiency. Near-Threshold Computing (NTC) is a key research area in the Ultra-Low-Power domain, as it promises a tenfold improvement in energy efficiency compared to super-threshold operation and mitigates thermal bottlenecks. The physical implications of modern deep sub-micron technology severely limit the performance and reliability of modern designs. Reliability becomes a major obstacle when operating in NTC: memory operation in particular becomes unreliable and can compromise system correctness. In the present work a novel hybrid memory architecture is devised to overcome reliability issues and, at the same time, improve energy efficiency by means of aggressive voltage scaling when allowed by workload requirements. Variability is another great drawback of near-threshold operation. The greatly increased sensitivity to threshold voltage variations is today a major concern for electronic devices. We introduce a variation-tolerant extension of the baseline many-core architecture. By means of micro-architectural knobs and a lightweight runtime control unit, the baseline architecture becomes dynamically tolerant to variations.
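As a deliberately generic example of the purely software synchronization whose cost such explorations quantify, the sketch below implements a sense-reversing spin barrier with C++ atomics. It is a sketch under assumptions, not the hybrid HW/SW mechanism studied with the Virtual Platform.

#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

// Sense-reversing barrier: the last arriving thread flips the shared sense,
// releasing every thread spinning on it.
class SenseBarrier {
    const int nthreads_;
    std::atomic<int> count_;
    std::atomic<bool> sense_{false};
public:
    explicit SenseBarrier(int n) : nthreads_(n), count_(n) {}
    void wait() {
        const bool my_sense = !sense_.load(std::memory_order_relaxed);
        if (count_.fetch_sub(1, std::memory_order_acq_rel) == 1) {
            count_.store(nthreads_, std::memory_order_relaxed); // reset for reuse
            sense_.store(my_sense, std::memory_order_release);  // release everyone
        } else {
            while (sense_.load(std::memory_order_acquire) != my_sense) {
                // busy-wait until the last thread flips the sense
            }
        }
    }
};

int main() {
    const int kThreads = 4;
    SenseBarrier barrier(kThreads);
    std::vector<std::thread> workers;
    for (int t = 0; t < kThreads; ++t)
        workers.emplace_back([&barrier, t] {
            std::printf("thread %d before barrier\n", t);
            barrier.wait();                        // all threads meet here
            std::printf("thread %d after barrier\n", t);
        });
    for (auto &w : workers) w.join();
    return 0;
}

On a many-core without coherent data caches, each iteration of the polling loop turns into interconnect traffic, which illustrates why hybrid HW/SW alternatives are worth exploring.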

Relevance:

30.00%

Publisher:

Abstract:

Polylactide (PLA) is a biodegradable polymer that has been used in particle form for drug release, due to its biocompatibility, tailorable degradation kinetics, and desirable mechanical properties. Active pharmaceutical ingredients (APIs) may be either dissolved or encapsulated within these biomaterials to create micro- or nanoparticles. Delivery of an API within fine particles may overcome solubility or stability issues that can result in early elimination or degradation of the API in a hostile biological environment. Furthermore, it is a promising method for controlling the rate of drug delivery and dosage. The goal of this project is to develop a simple and cost-effective device that allows us to produce monodisperse micro- and nanocapsules with controllable size and adjustable sheath thickness on demand. To achieve this goal, we have studied dual-capillary electrospray and pulsed electrospray. Dual-capillary electrospray has received considerable attention in recent years due to its ability to create core-shell structures in a single step. However, it also increases the difficulty of controlling the inner and outer particle morphology, since two simultaneous flows are required. Conventional electrospraying has been mainly conducted using direct-current (DC) voltage with little control over anything but the electrical potential. In contrast, control over the input voltage waveform (i.e. pulsing) in electrospraying offers greater control over the process variables. Poly(L-lactic acid) (PLLA) microspheres and microcapsules were successfully fabricated via pulsed-DC electrospray and dual-capillary electrospray, respectively. Core-shell combinations produced include water/PLLA, PLLA/polyethylene glycol (PEG), and oleic acid/PLLA. In this study, we designed a novel high-voltage pulse forming network and a set of new designs for coaxial electrospray nozzles. We also investigated the effect of the pulsed voltage characteristics (e.g. pulse frequency, pulse amplitude and pulse width) on particle size and uniformity. We found that pulse frequency, pulse amplitude, pulse width, and the combinations of these factors had a statistically significant effect on particle size. In addition, factors such as polymer concentration, solvent type, feed flow rate, collection method, temperature, and humidity can significantly affect the size and shape of the particles formed.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present several contributions to collision detection optimization centered on hardware performance. We focus on the broad phase, which is the first step of the collision detection process, and propose three new ways of parallelizing the well-known Sweep and Prune algorithm. We first developed a multi-core model that takes into account the number of available cores. The multi-core architecture enables us to distribute geometric computations through multi-threading. Critical writing sections and thread idling have been minimized by introducing new data structures for each thread. Programming with directives, as in OpenMP, appears to be a good compromise for code portability. We then proposed a new GPU-based algorithm, also based on Sweep and Prune, that has been adapted to multi-GPU architectures. Our technique is based on a spatial subdivision method used to distribute computations among GPUs. Results show that a significant speed-up can be obtained by moving from 1 to 4 GPUs in a large-scale environment.
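To make the broad phase concrete, the sketch below runs a Sweep and Prune pass along a single axis, parallelized with an OpenMP loop and with per-thread pair lists merged once at the end (in the spirit of minimizing critical sections and thread idling). It is a generic illustration under assumptions, not the authors' code; the AABB structure and function names are hypothetical.

#include <algorithm>
#include <cstdio>
#include <utility>
#include <vector>

struct AABB { float min_x, max_x; int id; };   // bounding box, one axis shown

// Returns the potentially colliding pairs along the x axis.
std::vector<std::pair<int, int>> sweep_and_prune(std::vector<AABB> boxes) {
    // 1) Sort boxes by their lower bound on the sweep axis.
    std::sort(boxes.begin(), boxes.end(),
              [](const AABB &a, const AABB &b) { return a.min_x < b.min_x; });

    std::vector<std::pair<int, int>> pairs;
    // 2) For each box, scan forward while the intervals still overlap.
    //    Each thread fills a private list to avoid a critical section per pair.
    #pragma omp parallel
    {
        std::vector<std::pair<int, int>> local;
        #pragma omp for schedule(dynamic) nowait
        for (int i = 0; i < (int)boxes.size(); ++i) {
            for (int j = i + 1; j < (int)boxes.size() &&
                                boxes[j].min_x <= boxes[i].max_x; ++j) {
                local.emplace_back(boxes[i].id, boxes[j].id);
            }
        }
        #pragma omp critical            // merge once per thread, not per pair
        pairs.insert(pairs.end(), local.begin(), local.end());
    }
    return pairs;
}

int main() {
    std::vector<AABB> boxes = {{0.0f, 1.0f, 0}, {0.5f, 2.0f, 1}, {3.0f, 4.0f, 2}};
    for (const auto &p : sweep_and_prune(boxes))
        std::printf("potential pair: %d %d\n", p.first, p.second);
    return 0;
}

A full broad phase would apply the same interval test on the remaining axes (or intersect the per-axis results) before handing candidate pairs to the narrow phase.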

Relevance:

30.00%

Publisher:

Abstract:

Abstract of Bazin et al. (2013): An accurate and coherent chronological framework is essential for the interpretation of climatic and environmental records obtained from deep polar ice cores. Until now, one common ice core age scale had been developed based on an inverse dating method (Datice), combining glaciological modelling with absolute and stratigraphic markers between 4 ice cores covering the last 50 ka (thousands of years before present) (Lemieux-Dudon et al., 2010). In this paper, together with the companion paper of Veres et al. (2013), we present an extension of this work back to 800 ka for the NGRIP, TALDICE, EDML, Vostok and EDC ice cores using an improved version of the Datice tool. The AICC2012 (Antarctic Ice Core Chronology 2012) chronology includes numerous new gas and ice stratigraphic links as well as improved evaluation of background and associated variance scenarios. This paper concentrates on the long timescales between 120 and 800 ka. In this framework, new measurements of δ18Oatm over Marine Isotope Stage (MIS) 11-12 on EDC and a complete δ18Oatm record of the TALDICE ice core permit us to derive additional orbital gas age constraints. The coherency of the different orbitally deduced ages (from δ18Oatm, δO2/N2 and air content) has been verified before implementation in AICC2012. The new chronology is now independent of other archives and shows only small differences, most of the time within the original uncertainty range calculated by Datice, when compared with the previous ice core reference age scale EDC3, the Dome F chronology, or with a comparison between speleothems and methane. For instance, the largest deviation between AICC2012 and EDC3 (5.4 ka) is obtained around MIS 12. Despite significant modifications of the chronological constraints around MIS 5, now independent of speleothem records in AICC2012, the date of Termination II is very close to the EDC3 one.

Abstract of Veres et al. (2013): The deep polar ice cores provide reference records commonly employed in global correlation of past climate events. However, temporal divergences reaching up to several thousand years (ka) exist between ice cores over the last climatic cycle. In this context, we are hereby introducing the Antarctic Ice Core Chronology 2012 (AICC2012), a new and coherent timescale developed for four Antarctic ice cores, namely Vostok, EPICA Dome C (EDC), EPICA Dronning Maud Land (EDML) and Talos Dome (TALDICE), alongside the Greenlandic NGRIP record. The AICC2012 timescale has been constructed using the Bayesian tool Datice (Lemieux-Dudon et al., 2010) that combines glaciological inputs and data constraints, including a wide range of relative and absolute gas and ice stratigraphic markers. We focus here on the last 120 ka, whereas the companion paper by Bazin et al. (2013) focuses on the interval 120-800 ka. Compared to previous timescales, AICC2012 presents an improved timing for the last glacial inception, respecting the glaciological constraints of all analyzed records. Moreover, with the addition of numerous new stratigraphic markers and an improved calculation of the lock-in depth (LID) based on δ15N data employed as the Datice background scenario, AICC2012 presents a slightly improved timing for the bipolar sequence of events over Marine Isotope Stage 3 associated with the seesaw mechanism, with maximum differences of about 600 yr with respect to the previous Datice-derived chronology of Lemieux-Dudon et al. (2010), hereafter denoted LD2010. Our improved scenario confirms the regional differences in the millennial-scale variability over the last glacial period: while the EDC isotopic record (events of triangular shape) displays peaks roughly at the same time as the NGRIP abrupt isotopic increases, the EDML isotopic record (events characterized by broader peaks or even extended periods of high isotope values) reached the isotopic maximum several centuries before. It is expected that the future contribution of both other long ice core records and other types of chronological constraints to the Datice tool will lead to further refinements in the ice core chronologies beyond the AICC2012 chronology. For the time being, however, we recommend that AICC2012 be used as the preferred chronology for the Vostok, EDC, EDML and TALDICE ice core records, both over the last glacial cycle (this study) and beyond (following Bazin et al., 2013). The ages for NGRIP in AICC2012 are virtually identical to those of GICC05 for the last 60.2 ka, whereas the ages beyond are independent of those in GICC05modelext (as in the construction of AICC2012, the GICC05modelext was included only via the background scenarios and not as age markers). As such, where issues of phasing between Antarctic records included in AICC2012 and NGRIP are involved, the NGRIP ages in AICC2012 should therefore be used to avoid introducing false offsets. However, for issues involving only Greenland ice cores, there is not yet a strong basis to recommend superseding GICC05modelext as the recommended age scale for Greenland ice cores.

Relevance:

30.00%

Publisher:

Abstract:

Despite intensive research on the different domains of the marine phosphorus (P) cycle during the last decades, frequently discussed open questions still exist, especially concerning the controlling factors for the benthic behaviour of P and its general distribution in sediment-pore water systems. Steady state, or the internal balance of all relevant physical and (bio)geochemical processes, is amongst the key issues. In this study we present and discuss an extended data set from surface sediments recovered from three locations on the NW African continental slope. Pore water data and results from sequential sediment extractions give clear evidence of the well-known close relationship between the benthic cycles of P and iron. Accordingly, most of the dissolved phosphate must have been released by microbially catalyzed reductive dissolution of iron (oxyhydr)oxides. However, the rates of release and association of P and iron, respectively, are not directly represented in profiles of element-specific sediment compositions. Results from steady-state transport-reaction modelling suggest that particle mixing due to active bioturbation, or rather a physical net downward transport of P associated with iron (oxyhydr)oxides, is an essential process for the balance of the inspected benthic cycles. This study emphasizes the importance of balancing analytical data for a comprehensive understanding of all processes involved in biogeochemical cycles.
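For orientation, steady-state transport-reaction models of this kind generally build on the one-dimensional diagenetic equation, with bioturbation represented as a diffusive mixing term; a generic form (not the specific formulation used in this study) is, at steady state (∂C/∂t = 0),

\[ 0 = \frac{\partial}{\partial z}\!\left(D_B\,\frac{\partial C}{\partial z}\right) - \omega\,\frac{\partial C}{\partial z} + \sum R , \]

where C is the concentration of the solid or dissolved species, z the sediment depth, D_B the bioturbation (mixing) coefficient, ω the burial velocity, and ΣR the net rate of the (bio)geochemical reactions. In this framing, the physical net downward transport of P associated with iron (oxyhydr)oxides discussed above enters through the mixing term D_B together with burial.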