Abstract:
PtSi/Si Schottky junctions, fabricated using a conventional technique of Pt deposition with a subsequent thermal anneal, are examined using X-ray diffraction, atomic force microscopy and a novel prism/gap/sample optical coupling system. With the aid of X-ray diffraction and atomic force microscopy it is shown that a post-anneal etch in aqua regia is essential for the removal of an unreacted, rough surface layer of Pt, to leave a much smoother PtSi film. The prism/gap/sample or Otto coupling rig is mounted in a small UHV chamber and has facilities for remote variation of the gap (by virtue of a piezoactuator system) and variation of the temperature in the range ~300 K to 85 K. The system is used to excite surface plasmon polaritons on the outer surface of the PtSi and thus produce sensitive optical characterisation as a function of temperature. This is performed in order to yield an understanding of the temperature dependence of phonon and interface scattering of carriers in the PtSi.
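The optical characterisation described above rests on the standard prism-coupling condition. As a hedged sketch (textbook physics, not taken from the paper itself): in the Otto geometry a surface plasmon polariton is excited when the in-plane wavevector of the evanescent wave in the gap matches the SPP wavevector of the film/dielectric interface.

```latex
% Sketch of the Otto (prism/gap/sample) coupling condition, assuming the
% textbook SPP dispersion for a single metal(-like)/dielectric interface.
% \varepsilon_m: complex permittivity of the PtSi film,
% \varepsilon_d: permittivity of the dielectric above it,
% n_p: prism refractive index, \theta: internal angle of incidence.
\[
  k_x \;=\; \frac{\omega}{c}\, n_p \sin\theta
      \;=\; \operatorname{Re}\!\left[\frac{\omega}{c}
      \sqrt{\frac{\varepsilon_m \varepsilon_d}{\varepsilon_m + \varepsilon_d}}\right]
\]
```

The piezo-controlled gap width sets the coupling strength, which is why remote variation of the gap matters for a sensitive measurement.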
Abstract:
In many countries formal or informal palliative care networks (PCNs) have evolved to better integrate community-based services for individuals with a life-limiting illness. We conducted a cross-sectional survey using a customized tool to determine the perceptions of the processes of palliative care delivery reflective of horizontal integration from the perspective of nurses, physicians and allied health professionals working in a PCN, as well as to assess the utility of this tool. The process elements examined were part of a conceptual framework for evaluating integration of a system of care and centred on interprofessional collaboration. We used the Index of Interdisciplinary Collaboration (IIC) as a basis of measurement. The 86 respondents (85% response rate) placed high value on working collaboratively and most reported being part of an interprofessional team. The survey tool showed utility in identifying strengths and gaps in integration across the network and in detecting variability in some factors according to respondent agency affiliation and profession. Specifically, support for interprofessional communication and for evaluative activities was viewed as insufficient. Impediments to these aspects of horizontal integration may reflect workload constraints, differences in agency operations or an absence of key structural features.
Abstract:
The motivation for this study was to reduce physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre- and on-treatment trajectory log files and phantom-based ionization chamber array measurements. The correlation in this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce patient-specific QA workload. Thirty VMAT plans from three treatment sites - prostate only, prostate and pelvic node (PPN), and head and neck (H&N) - were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pretreatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pretreatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pretreatment and on-treatment) and ionization chamber array gamma results (pretreatment). Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pretreatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated to on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pretreatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001). Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy. However, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pretreatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates potential to reduce our current patient-specific QA. Additionally, insight into MLC and gantry position accuracy through trajectory log file analysis and the strong correlation between gamma analysis results and the MCS could also provide further methodologies to optimize both the VMAT planning and QA processes.
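As a hedged illustration of the gamma analysis used throughout this abstract, the sketch below implements a minimal 1D global gamma index on toy data. The paper's comparisons are 2D fluence maps, but the acceptance criterion has the same form; all data here are illustrative.

```python
import numpy as np

def gamma_index(ref, ev, positions, dose_tol=0.01, dist_tol=1.0):
    """Global gamma: dose_tol is a fraction of the max reference dose,
    dist_tol is in the same units as `positions` (e.g. mm)."""
    d_max = ref.max()
    gammas = np.empty_like(ref)
    for i, (x_r, d_r) in enumerate(zip(positions, ref)):
        dd = (ev - d_r) / (dose_tol * d_max)   # dose-difference term
        dx = (positions - x_r) / dist_tol      # distance-to-agreement term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

rng = np.random.default_rng(0)
positions = np.linspace(0, 50, 501)                 # mm
planned = np.exp(-((positions - 25) / 8) ** 2)      # toy planned dose profile
delivered = planned * (1 + 0.005 * rng.standard_normal(positions.size))
g = gamma_index(planned, delivered, positions, dose_tol=0.01, dist_tol=1.0)
print(f"1%/1 mm passing rate: {100 * (g <= 1).mean():.1f}%")
```

With `dose_tol=0.01` and `dist_tol=1.0` this mirrors the 1%/1 mm criterion quoted above; 2%/2 mm is the corresponding setting for the chamber array comparison.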
Abstract:
Child protection social work is acknowledged as a very stressful occupation, with high turnover and poor retention of staff being a major concern. This paper highlights themes that emerged from the findings of sixty-five articles included in a systematic literature review. The review focused on the evaluation of research findings concerning individual and organisational factors associated with resilience or burnout in child protection social work staff. Nine themes were identified in total, categorised as either 'Individual' or 'Organisational'. Individual themes included personal history of maltreatment, training and preparation for child welfare, coping, secondary traumatic stress, compassion fatigue and compassion satisfaction. Organisational themes included workload, social support and supervision, organisational culture and climate, organisational and professional commitment, and job satisfaction or dissatisfaction. The range of factors is discussed, recommendations are made and areas for future research are highlighted.
Abstract:
Polybrominated diphenyl ethers (PBDEs) are a group of flame retardants that have been in use since the 1970s. They are included in the list of hazardous substances known as persistent organic pollutants (POPs) because of the serious risks they pose to the environment and human health. PBDEs have been extensively used in industry and manufacturing in Taiwan, and its citizens are thus at high risk of exposure to these chemicals.
An assessment of the environmental fate of these compounds was conducted for three high-risk congeners, BDE-47, -99 and -209, in the Zhuoshui River and Changhua County regions of western Taiwan, including the adjacent area of the Taiwan Strait, to obtain information on the partitioning, advection, transfer and long-range transport potential of the PBDEs and to identify the level of risk posed by these pollutants in the region.
The results indicate that large amounts of PBDEs presently reside in all model compartments – air, soil, water, and sediment – with particularly high levels found in air and especially in sediment. The high levels found in sediment, particularly for BDE-209, are significant, since these pollutants could enter the food chain either directly through benthic feeding or through resuspension and subsequent feeding in the pelagic region of the water column, a distinct possibility given the strong currents within the Taiwan Strait. Another important result is that a substantial portion of emissions leaves the model domain directly through advection, particularly for BDE-47 (58%) and BDE-209 (75%), thus posing a risk to adjacent communities.
Model results were generally in reasonable agreement with available measured concentrations. In air, model concentrations for BDE-47 and -99 are a factor of 2-3 higher than measured values, while BDE-209 is within the measured range. In soil, model results are somewhat lower than measured values; in sediment, they are at the high end of measured values.
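As a rough sketch of the kind of multimedia mass-balance reasoning described above (not the study's actual model, which the abstract does not specify; every rate constant below is a hypothetical placeholder), a steady-state four-compartment box model with inter-compartment transfer, degradation and advective loss reduces to a linear system:

```python
import numpy as np

comps = ["air", "water", "soil", "sediment"]
E = np.array([1000.0, 50.0, 0.0, 0.0])   # emissions into each compartment (kg/yr)

# k[i, j]: first-order transfer rate (1/yr) from compartment i to compartment j
k = np.zeros((4, 4))
k[0, 1], k[0, 2] = 0.5, 0.8              # air -> water, air -> soil (deposition)
k[1, 3] = 1.2                            # water -> sediment (settling)
k[2, 1] = 0.05                           # soil -> water (runoff)

adv = np.array([2.0, 0.5, 0.0, 0.01])    # advective loss out of the domain (1/yr)
deg = np.array([0.3, 0.2, 0.1, 0.05])    # degradation (1/yr)

# Steady state: 0 = E - (adv + deg + outgoing transfers) * m + incoming transfers
A = np.diag(adv + deg + k.sum(axis=1)) - k.T
m = np.linalg.solve(A, E)                # standing mass in each compartment (kg)

for name, mass in zip(comps, m):
    print(f"{name}: {mass:8.1f} kg")
print(f"fraction of emissions leaving by advection: {(adv * m).sum() / E.sum():.2f}")
```

The advective fraction printed at the end is the analogue of the 58% (BDE-47) and 75% (BDE-209) figures quoted above: the share of emitted mass that exits the model domain rather than degrading within it.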
Abstract:
This chapter discusses opportunities and limitations of height inequality, especially the role of social status and income distribution in determining height inequality. The more unequal the income distribution in a society, the more unequal the corresponding height distribution. At one time, the height gap between rich and poor teenagers in industrializing England was as high as 22 cm (8.7 inches); today, height inequality tends to be much lower (on the order of a few centimeters) because the gap between rich and poor in developed countries tends to be smaller. Results presented here suggest that height inequality is driven by differences in purchasing power, education, physical workload, and epidemiological environment. In a modern setting, social safety and the redistribution of income are also relevant. An introduction to the literature helps illustrate the opportunities this methodology offers for better understanding the dynamics of how populations experience economic development.
Abstract:
The thawing of high-latitude permafrost peatlands is a major concern due to a potential positive feedback on global climate change. We examine the ecology of testate amoebae in permafrost peatlands, based on sites in Sweden (~200 km north of the Arctic Circle). Multivariate statistical analysis confirms that water-table depth and moisture content are the dominant controls on the distribution of testate amoebae, corroborating the results from studies in mid-latitude peatlands. We present a new testate amoeba-based water-table transfer function and thoroughly test it for the effects of spatial autocorrelation, clustered sampling design and uneven sampling gradients. We find that the transfer function has good predictive power; the best-performing model is based on tolerance-downweighted weighted averaging with inverse deshrinking (performance statistics with leave-one-out cross-validation: R2 = 0.87, RMSEP = 5.25 cm). The new transfer function was applied to a short core from Stordalen mire, and reveals a major shift in peatland ecohydrology coincident with the onset of the Little Ice Age (c. AD 1400). We also applied the model to an independent contemporary dataset from Stordalen and find that it outperforms predictions based on other published transfer functions. The new transfer function will enable palaeohydrological reconstruction from permafrost peatlands in Northern Europe, thereby permitting greatly improved understanding of the long-term ecohydrological dynamics of these important carbon stores as well as their responses to recent climate change.
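A minimal sketch of the weighted-averaging transfer-function family named above, with inverse deshrinking and leave-one-out cross-validation on synthetic data. Tolerance downweighting is omitted for brevity, and the real model's R2/RMSEP come from the paper's calibration set, not from this toy:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_taxa = 60, 25
wtd = rng.uniform(0, 40, n_sites)        # observed water-table depth (cm)
optima = rng.uniform(0, 40, n_taxa)      # each taxon's (synthetic) true optimum
# Taxon abundances peak near each taxon's optimum (unimodal responses) plus noise.
Y = np.exp(-((wtd[:, None] - optima[None, :]) / 8.0) ** 2) \
    + rng.uniform(0, 0.05, (n_sites, n_taxa))

def wa_predict(Y_train, x_train, Y_test):
    u = (Y_train * x_train[:, None]).sum(0) / Y_train.sum(0)  # WA taxon optima
    x0_train = (Y_train * u).sum(1) / Y_train.sum(1)          # raw WA estimates
    b, a = np.polyfit(x0_train, x_train, 1)                   # inverse deshrinking
    x0_test = (Y_test * u).sum(1) / Y_test.sum(1)
    return b * x0_test + a

# Leave-one-out cross-validation: refit the model with each site withheld.
preds = np.empty(n_sites)
for i in range(n_sites):
    mask = np.arange(n_sites) != i
    preds[i] = wa_predict(Y[mask], wtd[mask], Y[i:i + 1])[0]

rmsep = np.sqrt(np.mean((preds - wtd) ** 2))
r2 = np.corrcoef(preds, wtd)[0, 1] ** 2
print(f"LOO R2 = {r2:.2f}, RMSEP = {rmsep:.2f} cm")
```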
Abstract:
Species-area relationships (SAR) are fundamental in the understanding of biodiversity patterns and of critical importance for predicting species extinction risk worldwide. Despite the enormous attention given to SAR in the form of many individual analyses, little attempt has been made to synthesize these studies. We conducted a quantitative meta-analysis of 794 SAR, comprising a wide span of organisms, habitats and locations. We identified factors reflecting both pattern-based and dynamic approaches to SAR and tested whether these factors leave significant imprints on the slope and strength of SAR. Our analysis revealed that SAR are significantly affected by variables characterizing the sampling scheme, the spatial scale, and the types of organisms or habitats involved. We found that steeper SAR are generated at lower latitudes and by larger organisms. SAR varied significantly between nested and independent sampling schemes and between major ecosystem types, but not generally between the terrestrial and the aquatic realm. Both the fit and the slope of the SAR were scale-dependent. We conclude that factors dynamically regulating species richness at different spatial scales strongly affect the shape of SAR. We highlight important consequences of this systematic variation in SAR for ecological theory, conservation management and extinction risk predictions.
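For readers unfamiliar with SAR fitting, a minimal sketch on synthetic data: the classic power-law form S = c·A^z is fitted on log-log axes, the slope z and fit strength being the quantities compared across the 794 SAR in the meta-analysis (the functional form and data here are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)
areas = np.logspace(0, 4, 20)            # sampling areas (arbitrary units)
true_c, true_z = 5.0, 0.25
species = true_c * areas**true_z * rng.lognormal(0, 0.1, areas.size)

# Linear regression in log-log space: log10(S) = z * log10(A) + log10(c)
logA, logS = np.log10(areas), np.log10(species)
z, logc = np.polyfit(logA, logS, 1)
resid = logS - (z * logA + logc)
r2 = 1 - resid.var() / logS.var()
print(f"z = {z:.3f}, c = {10**logc:.2f}, R^2 = {r2:.3f}")
```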
Abstract:
Bdellovibrio bacteriovorus is a Delta-proteobacterium that oscillates between free-living growth and predation on Gram-negative bacteria including important pathogens of man, animals and plants. After entering the prey periplasm, killing the prey and replicating inside the prey bdelloplast, several motile B. bacteriovorus progeny cells emerge. The B. bacteriovorus HD100 genome encodes numerous proteins predicted to be involved in signalling via the secondary messenger cyclic di-GMP (c-di-GMP), which is known to affect bacterial lifestyle choices. We investigated the role of c-di-GMP signalling in B. bacteriovorus, focussing on the five GGDEF domain proteins that are predicted to function as diguanylyl cyclases initiating c-di-GMP signalling cascades. Inactivation of individual GGDEF domain genes resulted in remarkably distinct phenotypes. Deletion of dgcB (Bd0742) resulted in a predation-impaired, obligately axenic mutant, while deletion of dgcC (Bd1434) resulted in the opposite, an obligately predatory mutant. Deletion of dgcA (Bd0367) abolished gliding motility, producing bacteria capable of predatory invasion but unable to leave the exhausted prey. Complementation was achieved with wild type dgc genes, but not with GGAAF versions. Deletion of cdgA (Bd3125) substantially slowed predation; this was restored by wild type complementation. Deletion of dgcD (Bd3766) had no observable phenotype. In vitro assays showed that DgcA, DgcB, and DgcC were diguanylyl cyclases. CdgA lacks enzymatic activity but functions as a c-di-GMP receptor, apparently in the DgcB pathway. Activity of DgcD was not detected. Deletion of dgcA strongly decreased the extractable c-di-GMP content of axenic Bdellovibrio cells. We show that c-di-GMP signalling pathways are essential for both the free-living and predatory lifestyles of B. bacteriovorus, and that an obligately predatory dgcC- strain can be made that lacks the propensity to survive without preying on bacterial pathogens, and is thus potentially useful in anti-pathogen applications. In contrast to many studies in other bacteria, Bdellovibrio shows specificity and a lack of overlap in its c-di-GMP signalling pathways.
Abstract:
We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients: a Gaussian, a Gamma distribution, and an analytic supernova (SN) model, and one stochastic light-curve model, the Ornstein-Uhlenbeck process, in order to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm on these statistics, to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SV and BL occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe based on our verification sets. We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
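A hedged sketch of one ingredient of this pipeline: fitting the Ornstein-Uhlenbeck stochastic model by exact maximum likelihood and scoring it with the corrected Akaike information criterion (AICc). The data, parameterization and optimizer choices below are illustrative, not the paper's:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def ou_nll(params, t, x):
    """Negative log-likelihood from the exact OU transition density.
    params = [log(theta), mu, log(sigma)] so positivity is automatic."""
    theta, mu, sigma = np.exp(params[0]), params[1], np.exp(params[2])
    dt = np.diff(t)
    mean = mu + (x[:-1] - mu) * np.exp(-theta * dt)
    var = sigma**2 * (1 - np.exp(-2 * theta * dt)) / (2 * theta)
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (x[1:] - mean)**2 / var)

# Simulate an irregularly sampled OU light curve (difference flux, toy units).
t = np.sort(rng.uniform(0, 100, 200))
x = np.empty_like(t); x[0] = 0.0
theta_true, mu_true, sig_true = 0.3, 0.0, 1.0
for i in range(1, t.size):
    dt = t[i] - t[i - 1]
    m = mu_true + (x[i - 1] - mu_true) * np.exp(-theta_true * dt)
    v = sig_true**2 * (1 - np.exp(-2 * theta_true * dt)) / (2 * theta_true)
    x[i] = rng.normal(m, np.sqrt(v))

fit = minimize(ou_nll, x0=[0.0, 0.0, 0.0], args=(t, x))
k, n = 3, t.size - 1                       # parameters, transitions
aicc = 2 * fit.fun + 2 * k + 2 * k * (k + 1) / (n - k - 1)
print(f"OU fit AICc = {aicc:.1f}")
```

In the paper's scheme, statistics like this AICc (computed per band, per model) feed the K-means step that separates AGN-like SV from BL transients.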
Abstract:
Introduction
The use of video capture of lectures in Higher Education is not a recent occurrence, with web-based learning technologies, including digital recording of live lectures, increasingly being offered by universities throughout the world (Holliman and Scanlon, 2004). However, in the past decade the growth in technical infrastructure provision, including the availability of high-speed broadband, has increased the potential and use of video lecture capture. This has led to a variety of lecture capture formats, including podcasting, live streaming, and delayed broadcasting of whole or part of lectures.
Additionally, in the past five years there has been a significant increase in the popularity of online learning, specifically via Massive Open Online Courses (MOOCs) (Vardi, 2014). One of the key aspects of MOOCs is the simulated recording of lecture-like activities. There has been, and continues to be, much debate on the consequences of the popularity of MOOCs, especially in relation to their potential uses within established university programmes.
There have been a number of studies dedicated to the effects of videoing lectures.
The clustered areas of research in video lecture capture have the following main themes:
• Staff perceptions including attendance, performance of students and staff workload
• Reinforcement versus replacement of lectures
• Improved flexibility of learning
• Facilitating engaging and effective learning experiences
• Student usage, perception and satisfaction
• Facilitating students learning at their own pace
Most of the body of the research has concentrated on student and faculty perceptions, including academic achievement, student attendance and engagement (Johnston et al, 2012).
Generally, the research has reviewed the benefits of lecture capture positively for both students and faculty. This perception, coupled with technical infrastructure improvements and student demand, may well mean that the use of video lecture capture will continue to increase in tertiary education over the coming years. However, there is a relatively limited amount of research on the effects of lecture capture specifically in the area of computer programming, with Watkins et al. (2007) being one of the few studies. Video delivery of programming solutions is particularly useful for enabling a lecturer to illustrate the complex decision-making processes and iterative nature of the actual code development process (Watkins et al., 2007). As such, research in this area would appear to be particularly appropriate to help inform debate and future decisions made by policy makers.
Research questions and objectives
The purpose of the research was to investigate how a series of lecture captures (in which the audio of lectures and video of on-screen projected content were recorded) impacted on the delivery and learning of a programme of study in an MSc Software Development course at Queen's University Belfast, Northern Ireland. The MSc is a conversion programme, intended to take graduates from non-computing primary degrees and upskill them in this area. The research specifically targeted the Java programming module within the course. It also analyses and reports on empirical data from attendance records and various video viewing statistics. In addition, qualitative data was collected from staff and student feedback to help contextualise the quantitative results.
Methodology, Methods and Research Instruments Used
The study was conducted with a cohort of 85 postgraduate students taking a compulsory module in Java programming in the first semester of a one-year MSc in Software Development. A pre-course survey of students found that 58% preferred to have available videos of “key moments” of lectures rather than whole lectures. A large-scale study carried out by Guo concluded that “shorter videos are much more engaging” (Guo, 2013). Of concern was the potential for low audience retention for videos of whole lectures.
The lecturers recorded snippets of the lecture directly before or after the actual physical delivery of the lecture, in a quiet environment, and then uploaded the videos directly to a closed YouTube channel. These snippets generally concentrated on significant parts of the theory, followed by theory-related coding demonstration activities, and were faithful replications of the face-to-face lecture. Generally, each lecture was supported by two to three videos with durations ranging from 20 to 30 minutes.
Attendance
The MSc programme has several attendance-based modules, of which Java Programming was one element. To assess the effect on attendance for the programming module, a control was established. The control used was a Database module taken by the same students and running in the same semester.
Access engagement
The videos were hosted on a closed YouTube channel made available only to the students in the class. The channel had analytics enabled, which reported on the following areas for all videos and for each individual video: views (hits), audience retention, viewing devices / operating systems used, and minutes watched.
Student attitudes
Three surveys were conducted to investigate student attitudes towards the videoing of lectures: the first before the start of the programming module, the second at the mid-point, and the third after the programme was complete.
The questions in the first survey were targeted at eliciting student attitudes towards lecture capture before they had experienced it in the programme. The midpoint survey gathered data on how the students were individually using the system up to that point, including feedback on how many videos an individual had watched, viewing duration, primary reasons for watching and the effect on attendance, in addition to probing for comments or suggestions. The final survey, on course completion, contained questions similar to the midpoint survey but took a summative view of the whole video programme.
Conclusions and Outcomes
The study confirmed the findings of other such investigations, illustrating that there is little or no effect on attendance at lectures. The use of the videos appears to help promote continual learning, but they are particularly accessed by students at assessment periods. Students respond positively to the ability to access lectures digitally as a means of reinforcing learning experiences rather than replacing them. Feedback from students was overwhelmingly positive, indicating that the videos benefited their learning. There are also significant benefits to recording parts of lectures rather than whole lectures. The viewing-trend analytics suggest that, despite the increase in the popularity of online learning via MOOCs and the promotion of video learning on mobile devices, in this study the vast majority of students accessed the online videos at home on laptops or desktops. However, this is likely due in part to the nature of the taught subject, that being programming.
The research involved prerecording the lecture in smaller timed units and then uploading them for distribution, to counteract existing quality issues with recording entire live lectures. However, the advancement and consequent improvement in quality of in situ lecture capture equipment may well negate the need to record elsewhere. The research has also highlighted an area of potentially very significant use for performance analysis and improvement that could have major implications for the quality of teaching: a study of the analytics of the viewings of the videos could provide a quick-response formative feedback mechanism for the lecturer. If a videoed lecture, whether recorded live or later, is a true reflection of the face-to-face lecture, an analysis of the viewing patterns for the video may well reveal trends that correspond with the live delivery.
Abstract:
We present a rigorous methodology and new metrics for fair comparison of server and microserver platforms. Deploying our methodology and metrics, we compare a microserver with ARM cores against two servers with x86 cores running the same real-time financial analytics workload. We define workload-specific but platform-independent performance metrics for platform comparison, targeting both datacenter operators and end users. Our methodology establishes that a server based on the Xeon Phi co-processor delivers the highest performance and energy efficiency. However, by scaling out energy-efficient microservers, we achieve competitive or better energy efficiency than a power-equivalent server with two Sandy Bridge sockets, despite the microserver's slower cores. Using a new iso-QoS metric, we find that the ARM microserver scales enough to meet market throughput demand, that is, 100% QoS in terms of timely option pricing, with as little as 55% of the energy consumed by the Sandy Bridge server.
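As an illustrative reading of the iso-QoS idea (the numbers below are placeholders chosen to echo the abstract's 55% figure, not measured data): record each platform's energy at matched QoS levels, then compare platforms at the same service level rather than at the same configuration:

```python
# Hypothetical (QoS level -> energy) curves for two platforms; QoS here is
# the fraction of option-pricing tasks completed before their deadline.
qos_energy = {
    "arm_microserver_cluster": {0.90: 40.0, 0.95: 46.0, 1.00: 55.0},   # kJ
    "sandy_bridge_server":     {0.90: 70.0, 0.95: 85.0, 1.00: 100.0},  # kJ
}

target_qos = 1.00
e_arm = qos_energy["arm_microserver_cluster"][target_qos]
e_x86 = qos_energy["sandy_bridge_server"][target_qos]
print(f"iso-QoS({target_qos:.0%}): ARM cluster uses {e_arm / e_x86:.0%} "
      f"of the x86 server's energy")
```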
Abstract:
Energy efficiency is an essential requirement for all contemporary computing systems. We thus need tools to measure the energy consumption of computing systems and to understand how workloads affect it. Significant recent research effort has targeted direct power measurements on production computing systems using on-board sensors or external instruments. These direct methods have in turn guided studies of software techniques to reduce energy consumption via workload allocation and scaling. Unfortunately, direct energy measurements are hampered by the low power sampling frequency of power sensors. The coarse granularity of power sensing limits our understanding of how power is allocated in systems and our ability to optimize energy efficiency via workload allocation.
We present ALEA, a tool to measure power and energy consumption at the granularity of basic blocks, using a probabilistic approach. ALEA provides fine-grained energy profiling via statistical sampling, which overcomes the limitations of power sensing instruments. Compared to state-of-the-art energy measurement tools, ALEA provides finer granularity without sacrificing accuracy. ALEA achieves low overhead energy measurements with mean error rates between 1.4% and 3.5% in 14 sequential and parallel benchmarks tested on both Intel and ARM platforms. The sampling method caps execution time overhead at approximately 1%. ALEA is thus suitable for online energy monitoring and optimization. Finally, ALEA is a user-space tool with a portable, machine-independent sampling method. We demonstrate two use cases of ALEA, where we reduce the energy consumption of a k-means computational kernel by 37% and an ocean modelling code by 33%, compared to high-performance execution baselines, by varying the power optimization strategy between basic blocks.
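A toy sketch of the probabilistic attribution idea (not ALEA's actual implementation, which samples native program counters): observe which region of code is executing at random instants, then apportion the run's total measured energy by each region's sampling frequency:

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical ground truth: fraction of execution time per basic block.
time_share = {"bb_loop": 0.70, "bb_init": 0.05, "bb_reduce": 0.25}
total_energy_j = 120.0   # energy for the whole run (e.g. from a platform sensor)

# Statistical sampling: each sample lands in a block with probability
# proportional to that block's share of execution time.
blocks, weights = zip(*time_share.items())
samples = Counter(random.choices(blocks, weights=weights, k=10_000))

for block, count in samples.items():
    share = count / 10_000
    print(f"{block}: ~{share:.3f} of time, ~{share * total_energy_j:.1f} J")
```

Because the estimate is statistical, its accuracy is governed by the sample count rather than by the power sensor's sampling frequency, which is the limitation the tool is designed to overcome.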
Abstract:
This paper examines a place-making project in post-conflict Belfast, analyzing efforts to transform an area which has often been used as a byword for militant Irish nationalism and social deprivation into an inclusive, vibrant tourist destination and cultural hub themed around the Irish language (called the "Gaeltacht Quarter"). The antagonistic and territorial assumptions about place that characterize divided cities now co-exist with global trends towards the commodification of difference as recreation or spectacle, and longstanding struggles over the representation of contested identities are intertwined with the struggle to compete for international tourism and investment. The proliferation of officially themed quarters in many cities across the world reflects the enthusiasm with which planning authorities have embraced the vision of difference as a benign resource for the creation of tourist revenue. Yet, analysis of "quartering" processes reveals that such commodification does not neutralise or evade the political potency of naming, representing and delimiting cultural difference. Indeed, this paper argues that such projects offer a valuable insight into the inseparable roles of physical and representational space as both loci and catalysts of contestation in urban conflicts. Bringing together a wide range of public and private interest groups, projects redefining parts of Belfast as distinctive quarters have been explicitly linked with efforts to deterritorialize the city. The creation of bounded, themed spaces as an attempt to leave behind the ethno-sectarian geographical segregation that parts of Belfast still experience has its particular ironies, but is in many ways typical of contemporary trends in urban planning. The Gaeltacht Quarter exemplifies both the importance and the challenge of representation within cities where culturally distinguishing features have acted as markers of violent division, and where negotiations about how to successfully encompass difference necessarily address multiple local and international audiences simultaneously.
Abstract:
This chapter focuses on the relationship between improvisation and indeterminacy. We discuss the two practices by referring to play theory and game studies and situate them in recent network music performance. We develop a parallel with game theory in which indeterminacy is seen as a way of articulating situations where structural decisions are left to the discernment of the performers, and we discuss improvisation as a method of play. The improvisation-indeterminacy relationship is discussed in the context of network music performance, which employs digital networks in the exchange of data between performers and hence relies on topological structures with varying degrees of openness and flexibility. Artists such as Max Neuhaus and The League of Automatic Music Composers initiated the development of a multitude of practices and technologies exploring the network as an environment for music making. Even though the technologies behind “the network” have shifted dramatically since Neuhaus' use of radio in the 1960s, a preoccupation with the distribution and sharing of artistic agency has remained at the centre of networked practices. Golo Föllmer, after undertaking an extensive review of network music initiatives, produced a typology that comprises categories as diverse as remix lists, sound toys, real/virtual space installations and network performances. For Föllmer, “the term ‘Net music’ comprises all formal and stylistic kinds of music upon which the specifics of electronic networks leave considerable traces, whereby the electronic networks strongly influence the process of musical production, the musical aesthetic, or the way music is received” (2005: 185).