700 results for Multiple scale
Abstract:
This paper explores the concept that individual dancers leave traces in a choreographer’s body of work and similarly, that dancers carry forward residue of embodied choreographies into other working processes. This presentation will be grounded in a study of the multiple iterations of a programme of solo works commissioned in 2008 from choreographers John Jasperse, Jodi Melnick, Liz Roche and Rosemary Butcher and danced by the author. This includes an exploration of the development by John Jasperse of themes from his solo into the pieces PURE (2008) and Truth, Revised Histories, Wishful Thinking and Flat Out Lies (2009); an adaptation of the solo Business of the Bloom by Jodi Melnick in 2008 and a further adaptation of Business of the Bloom by this author in 2012. It will map some of the developments that occurred through a number of further performances over five years of the solo Shared Material on Dying by Liz Roche and the working process of the (uncompleted) solo Episodes of Flight by Rosemary Butcher. The purpose is to reflect back on authorship in dance, an art form in which lineages of influence can often be clearly observed. Normally, once a choreographic work is created and performed, it is archived through video recording, notation and/or reviews. The dancer is no longer called upon to represent the dance piece within the archive and thus her/his lived presence and experiential perspective disappears. The author will draw on the different traces still inhabiting her body as pathways towards understanding how choreographic movement circulates beyond this moment of performance. This will include the interrogation of ownership of choreographic movement, as once it becomes integrated in the body of the dancer, who owns the dance? Furthermore, certain dancers, through their individual physical characteristics and moving identities, can deeply influence the formation of choreographic signatures, a proposition that challenges the sole authorship role of the choreographer in dance production. This paper will be delivered in a presentation format that will bleed into movement demonstrations alongside video footage of the works and auto-ethnographic accounts of dancing experience. A further source of knowledge will be drawn from extracts of interviews with other dancers including Sara Rudner, Rebecca Hilton and Catherine Bennett.
Abstract:
This paper presents an overview of the strengths and limitations of existing and emerging geophysical tools for landform studies. The objectives are to discuss recent technical developments and to provide a review of relevant recent literature, with a focus on propagating field methods with terrestrial applications. For various methods in this category, including ground-penetrating radar (GPR), electrical resistivity (ER), seismics, and electromagnetic (EM) induction, the technical backgrounds are introduced, followed by a section on novel developments relevant to landform characterization. For several decades, GPR has been popular for characterization of the shallow subsurface and in particular sedimentary systems. Novel developments in GPR include the use of multi-offset systems to improve signal-to-noise ratios and data collection efficiency, amongst others, and the increased use of 3D data. Multi-electrode ER systems have become popular in recent years as they allow for relatively fast and detailed mapping. Novel developments include time-lapse monitoring of dynamic processes as well as the use of capacitively-coupled systems for fast, non-invasive surveys. EM induction methods are especially popular for fast mapping of spatial variation, but can also be used to obtain information on the vertical variation in subsurface electrical conductivity. In recent years several examples of the use of plane wave EM for characterization of landforms have been published. Seismic methods for landform characterization include seismic reflection and refraction techniques and the use of surface waves. A recent development is the use of passive sensing approaches. The use of multiple geophysical methods, which can benefit from the sensitivity to different subsurface parameters, is becoming more common. Strategies for coupled and joint inversion of complementary datasets will, once more widely available, benefit the geophysical study of landforms. Three case studies are presented on the use of electrical and GPR methods for characterization of landforms in the range of meters to 100s of meters in dimension. In a study of polygonal patterned ground in the Saginaw Lowlands, Michigan, USA, electrical resistivity tomography was used to characterize differences in subsurface texture and water content associated with polygon-swale topography. Also, a sand-filled thermokarst feature was identified using electrical resistivity data. The second example is on the use of constant spread traversing (CST) for characterization of large-scale glaciotectonic deformation in the Ludington Ridge, Michigan. Multiple CST surveys parallel to an ~60 m high cliff, where broad (~100 m) synclines and narrow clay-rich anticlines are visible, illustrated that at least one of the narrow structures extended inland. A third case study discusses internal structures of an eolian dune on a coastal spit in New Zealand. Both 35 and 200 MHz GPR data, which clearly identified a paleosol and internal sedimentary structures of the dune, were used to improve understanding of the development of the dune, which may shed light on paleo-wind directions.
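As a hedged illustration of the depth estimation behind GPR profiles such as the 35 and 200 MHz data mentioned above, the sketch below applies the textbook low-loss relation v = c/√εr and d = v·t/2; the permittivity and travel-time values are assumptions for the example, not values from the study.

```python
# Illustrative only: a standard low-loss GPR relation, not code or values from
# the reviewed study. Radar velocity is approximated by v = c / sqrt(eps_r),
# and reflector depth by d = v * t / 2 for two-way travel time t.

C_M_PER_NS = 0.2998  # speed of light in vacuum, metres per nanosecond

def gpr_depth(two_way_time_ns: float, eps_r: float) -> float:
    """Estimate reflector depth (m) from two-way travel time (ns)."""
    velocity = C_M_PER_NS / eps_r ** 0.5   # subsurface velocity, m/ns
    return velocity * two_way_time_ns / 2.0

# Example with assumed values: a reflection at 80 ns in moist sand (eps_r ~ 9)
print(f"{gpr_depth(80.0, 9.0):.1f} m")     # roughly 4.0 m
```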
Abstract:
Intimate partner homicides are fatal violent attacks perpetrated by intimate partners, and are often the extreme and unplanned consequence of abusive relationships. Although recognised as an important risk factor for death and disability among women, previous country-level assessments and the recent Global Burden of Disease Study 2010 (GBD 2010) have not considered the extent of intimate partner violence among male victims...
Abstract:
BACKGROUND Measurement of the global burden of disease with disability-adjusted life-years (DALYs) requires disability weights that quantify health losses for all non-fatal consequences of disease and injury. There has been extensive debate about a range of conceptual and methodological issues concerning the definition and measurement of these weights. Our primary objective was a comprehensive re-estimation of disability weights for the Global Burden of Disease Study 2010 through a large-scale empirical investigation in which judgments about health losses associated with many causes of disease and injury were elicited from the general public in diverse communities through a new, standardised approach. METHODS We surveyed respondents in two ways: household surveys of adults aged 18 years or older (face-to-face interviews in Bangladesh, Indonesia, Peru, and Tanzania; telephone interviews in the USA) between Oct 28, 2009, and June 23, 2010; and an open-access web-based survey between July 26, 2010, and May 16, 2011. The surveys used paired comparison questions, in which respondents considered two hypothetical individuals with different, randomly selected health states and indicated which person they regarded as healthier. The web survey added questions about population health equivalence, which compared the overall health benefits of different life-saving or disease-prevention programmes. We analysed paired comparison responses with probit regression analysis on all 220 unique states in the study. We used results from the population health equivalence responses to anchor the results from the paired comparisons on the disability weight scale from 0 (implying no loss of health) to 1 (implying a health loss equivalent to death). Additionally, we compared new disability weights with those used in WHO's most recent update of the Global Burden of Disease Study for 2004. FINDINGS 13,902 individuals participated in household surveys and 16,328 in the web survey. Analysis of paired comparison responses indicated a high degree of consistency across surveys: correlations between individual survey results and results from analysis of the pooled dataset were 0·9 or higher in all surveys except in Bangladesh (r=0·75). Most of the 220 disability weights were located on the mild end of the severity scale, with 58 (26%) having weights below 0·05. Five (11%) states had weights below 0·01, such as mild anaemia, mild hearing or vision loss, and secondary infertility. The health states with the highest disability weights were acute schizophrenia (0·76) and severe multiple sclerosis (0·71). We identified a broad pattern of agreement between the old and new weights (r=0·70), particularly in the moderate-to-severe range. However, in the mild range below 0·2, many states had significantly lower weights in our study than previously. INTERPRETATION This study represents the most extensive empirical effort as yet to measure disability weights. By contrast with the popular hypothesis that disability assessments vary widely across samples with different cultural environments, we have reported strong evidence of highly consistent results.
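The paired-comparison analysis summarised above can be sketched, in miniature, as a probit model in which each health state has a latent severity score and the probability that state a is judged less healthy than state b is Φ(s_a − s_b). The snippet below is not the study's code: the states and responses are invented, and the anchoring of scores onto the 0 to 1 disability-weight scale via the population health equivalence questions is omitted.

```python
# Toy probit fit to paired-comparison data (synthetic; not the GBD 2010 analysis).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

states = ["mild anaemia", "severe hearing loss", "acute schizophrenia"]
# Each tuple (a, b) records one response: state a was judged LESS healthy than b.
comparisons = [
    (2, 0), (2, 0), (2, 0), (0, 2),   # schizophrenia vs mild anaemia
    (2, 1), (2, 1), (1, 2),           # schizophrenia vs hearing loss
    (1, 0), (1, 0), (0, 1),           # hearing loss vs mild anaemia
]

def neg_log_lik(free):
    s = np.append(free, 0.0)          # fix the last state's score for identifiability
    return -sum(np.log(norm.cdf(s[a] - s[b])) for a, b in comparisons)

fit = minimize(neg_log_lik, np.zeros(len(states) - 1), method="BFGS")
scores = np.append(fit.x, 0.0)
print(dict(zip(states, scores.round(2))))  # higher score = judged less healthy
```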
Abstract:
Conservation planning and management programs typically assume relatively homogeneous ecological landscapes. Such “ecoregions” serve multiple purposes: they support assessments of competing environmental values, reveal priorities for allocating scarce resources, and guide effective on-ground actions such as the acquisition of a protected area and habitat restoration. Ecoregions have evolved from a history of organism–environment interactions, and are delineated at the scale or level of detail required to support planning. Depending on the delineation method, scale, or purpose, they have been described as provinces, zones, systems, land units, classes, facets, domains, subregions, and ecological, biological, biogeographical, or environmental regions. In each case, they are essential to the development of conservation strategies and are embedded in government policies at multiple scales.
Abstract:
Background Pharmacist prescribing has been introduced in several countries and is a possible future role for pharmacy in Australia. Objective To assess whether patient satisfaction with the pharmacist as a prescriber, and patient experiences in two settings of collaborative doctor-pharmacist prescribing, may be barriers to implementation of pharmacist prescribing. Design Surveys containing closed questions with Likert scale responses were completed in both settings to investigate patient satisfaction after each consultation. A further survey investigating attitudes towards pharmacist prescribing, after multiple consultations, was completed in the sexual health clinic. Setting and Participants A surgical pre-admission clinic (PAC) in a tertiary hospital and an outpatient sexual health clinic at a university hospital. Two hundred patients scheduled for elective surgery, and 17 patients diagnosed with HIV infection, respectively, were recruited to the pharmacist prescribing arm of two collaborative doctor-pharmacist prescribing studies. Results Consultation satisfaction response rates in PAC and the sexual health clinic were 182/200 (91%) and 29/34 (85%), respectively. In the sexual health clinic, the response rate for the attitudes towards pharmacist prescribing survey was 14/17 (82%). Consultation satisfaction was high in both studies; most patients (98% and 97%, respectively) agreed they were satisfied with the consultation. In the sexual health clinic, all patients (14/14) agreed that they trusted the pharmacist’s ability to prescribe, that care was as good as usual care, and that they would recommend seeing a pharmacist prescriber to friends. Discussion and Conclusion Most patients reported high satisfaction with pharmacist prescriber consultations and a positive outlook on the collaborative model of care in the sexual health clinic.
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood and severity of a process fault from occurring. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action to perform which minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements like task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to tasks to be performed, in order to deal with the interplay between risks relative to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that the process instances executed concurrently complete with significantly fewer faults and with lower fault severities, when the recommendations provided by our recommendation system are taken into account.
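To make the resource-assignment step concrete, the sketch below assigns resources to pending tasks so that the sum of predicted risks is minimal. It is only a stand-in for the integer linear programming formulation described above: the resource and task names and the risk values are invented, and a rectangular assignment solver replaces both the ILP and the decision-tree predictions.

```python
# Hypothetical example: pick the resource-task assignment with the lowest total
# predicted risk. Values stand in for decision-tree risk predictions; the real
# system uses an ILP over concurrently running process instances.
import numpy as np
from scipy.optimize import linear_sum_assignment

resources = ["clerk_A", "clerk_B", "senior_analyst"]     # invented names
tasks = ["assess_claim", "approve_payment"]              # invented names

# predicted_risk[r, t]: likelihood x severity of a fault if resource r does task t
predicted_risk = np.array([
    [0.30, 0.80],
    [0.25, 0.60],
    [0.10, 0.15],
])

rows, cols = linear_sum_assignment(predicted_risk)       # minimises total cost
for r, t in zip(rows, cols):
    print(f"{resources[r]} -> {tasks[t]} (predicted risk {predicted_risk[r, t]:.2f})")
```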
Abstract:
A hippocampal-CA3 memory model was constructed with PGENESIS, a recently developed version of GENESIS that allows for distributed processing of a neural network simulation. A number of neural models of the human memory system have identified the CA3 region of the hippocampus as storing the declarative memory trace. However, computational models designed to assess the viability of the putative mechanisms of storage and retrieval have generally been too abstract to allow comparison with empirical data. Recent experimental evidence has shown that selective knock-out of NMDA receptors in the CA1 of mice leads to reduced stability of firing specificity in place cells. Here a similar reduction of stability of input specificity is demonstrated in a biologically plausible neural network model of the CA3 region, under conditions of Hebbian synaptic plasticity versus an absence of plasticity. The CA3 region is also commonly associated with seizure activity. Further simulations of the same model tested the response to continuously repeating versus randomized non-repeating input patterns. Each paradigm delivered input of equal intensity and duration. Non-repeating input patterns elicited a greater pyramidal cell spike count. This suggests that repetitive versus non-repeating neocortical input has a quantitatively different effect on the hippocampus. This may be relevant to the production of independent epileptogenic zones and the process of encoding new memories.
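As a far simpler stand-in for the Hebbian plasticity condition in the PGENESIS model, the sketch below grows the weights onto a single model cell wherever presynaptic input and postsynaptic firing coincide, and contrasts this with a no-plasticity run; all parameters and the firing rule are invented for illustration.

```python
# Minimal Hebbian-vs-static comparison (illustrative only; not the CA3 model).
import numpy as np

rng = np.random.default_rng(0)
n_inputs, eta, threshold = 50, 0.05, 0.5
pattern = (rng.random(n_inputs) < 0.2).astype(float)     # one repeating input pattern

def run(plastic: bool, steps: int = 200) -> np.ndarray:
    w = np.full(n_inputs, 0.2)                            # uniform initial weights
    for _ in range(steps):
        post = 1.0 if w @ pattern > threshold else 0.0    # crude postsynaptic spike
        if plastic:
            w += eta * post * pattern                     # Hebbian: pre x post
            w = np.clip(w, 0.0, 1.0)
    return w

w_plastic, w_static = run(True), run(False)
print("mean weight on active inputs, plastic:", float(w_plastic[pattern > 0].mean()))
print("mean weight on active inputs, static: ", float(w_static[pattern > 0].mean()))
```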
Abstract:
Numerous efforts have been dedicated to the synthesis of large-volume methacrylate monoliths for large-scale biomolecule purification, but most were obstructed by the enormous release of exotherms during preparation, thereby introducing structural heterogeneity in the monolith pore system. A significant radial temperature gradient develops along the monolith thickness, reaching a terminal temperature that exceeds the maximum temperature required for the preparation of structurally homogeneous monoliths. The enormous heat build-up is perceived to encompass the heat associated with initiator decomposition and the heat released from free radical-monomer and monomer-monomer interactions. The heat resulting from initiator decomposition was expelled along with some gaseous fumes before commencing polymerization in a gradual addition fashion. The characteristics of an 80 mL monolith prepared using this technique were compared with those of a similar monolith synthesized in a bulk polymerization mode. A close similarity in the radial temperature profiles was observed for the monolith synthesized via the heat expulsion technique. A maximum radial temperature gradient of only 4.3°C was recorded at the center and 2.1°C at the monolith periphery for the combined heat expulsion and gradual addition technique. The comparable radial temperature distributions obtained produced identical pore size distributions at different radial points along the monolith thickness.
Abstract:
The recognition of the potential efficacy of plasmid DNA (pDNA) molecules as vectors in the treatment and prevention of emerging diseases has built confidence in combating global pandemics. This is due to the close-to-zero safety concern associated with pDNA vectors compared to viral vectors in cell transfection and targeting. Considerable attention has been paid to the potential of pDNA vectors, but comparatively less thought has been given to the practical challenges in producing large quantities to meet current rising demands. A pilot-scale fermentation scheme was developed by employing a stoichiometrically-designed growth medium whose exceptional plasmid yield performance was attested in a shake flask environment for pUC19 and pEGFP-N1 transformed into E. coli DH5α and E. coli JM109, respectively. Batch fermentation of E. coli DH5α-pUC19 employing the stoichiometric medium displayed maximum plasmid volumetric and specific yields of 62.6 mg/L and 17.1 mg/g (mg plasmid/g dry cell weight), respectively. Fed-batch fermentation of E. coli DH5α-pUC19 on a glycerol substrate demonstrated one of the highest pilot-scale plasmid specific yields ever reported, 48.98 mg/g, and a volumetric yield of 0.53 g/L. The attainment of high plasmid specific yields constitutes a decrease in plasmid manufacturing cost and enhances the effectiveness of downstream processes by reducing the proportion of intracellular impurities. The effect of step-rise temperature induction was also considered to maximize ColE1-origin plasmid replication.
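As a small arithmetic check of the fed-batch figures quoted above, the specific yield (mg plasmid per g dry cell weight) is the volumetric yield divided by the biomass concentration; the dry cell weight derived below is a back-calculation, not a value stated in the abstract.

```python
# Back-calculate the implied biomass concentration from the two reported yields.
volumetric_yield_mg_per_L = 530.0     # 0.53 g/L reported for the fed-batch run
specific_yield_mg_per_g = 48.98       # mg plasmid per g dry cell weight

implied_dcw_g_per_L = volumetric_yield_mg_per_L / specific_yield_mg_per_g
print(f"implied dry cell weight ~ {implied_dcw_g_per_L:.1f} g/L")   # about 10.8 g/L
```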
Abstract:
Welcome to the Evaluation of course matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Evaluation of course matrix is to provide a tool with which a group of academic staff at universities can collaboratively review the assessment within a course, major or unit annually. The annual review will result in you being ready for an external curriculum review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation. I hope you find this tool useful in your assessment review.
Abstract:
This paper overviews the development of a vision-based AUV along with a set of complementary operational strategies to allow reliable autonomous data collection in relatively shallow water and coral reef environments. The development of the AUV, called Starbug, encountered many challenges in terms of vehicle design, navigation and control. Some of these challenges are discussed with a focus on operational strategies for estimating and reducing the total navigation error when using lower-resolution sensing modalities. Results are presented from recent field trials which illustrate the ability of the vehicle and associated operational strategies to enable rapid collection of visual data sets suitable for marine research applications.