Abstract:
Lawyers have traditionally viewed law as a closed system, and doctrinal research has been the methodology used most widely in the profession, reflecting traditional concepts of legal reasoning. Yet there is a wealth of reliable and valid social science data available to lawyers and judges, and judges in fact often refer to general facts about the world, society, institutions and human behaviour (‘empirical facts’). Legal education needs to prepare students for this broader legal context. This paper examines how ‘empirical facts’ are used in Australian and other common law courts. Specifically, it argues that there is a need for enhanced training in non-doctrinal research methodologies across the law school curriculum: a broad introduction to social science methods, with more attention paid to the cross-section of methodologies, such as content analysis, comparative law and surveys, that are best suited to legal questions.
Abstract:
We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invasive invertebrates whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
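A minimal sketch of the detection-power arithmetic behind such a plan, assuming each device independently detects a present incursion with a known per-device probability (the abstract does not specify the actual model, strata or cost terms, so all values here are illustrative):

```python
import math

def devices_needed(p_detect: float, target_power: float) -> int:
    """Smallest n such that 1 - (1 - p_detect)**n >= target_power."""
    return math.ceil(math.log(1.0 - target_power) / math.log(1.0 - p_detect))

# Hypothetical strata: (per-device detection probability, required power),
# with higher-risk strata assigned a higher required power of detection.
strata = {"port": (0.10, 0.95), "urban": (0.05, 0.80), "rural": (0.02, 0.50)}
plan = {name: devices_needed(p, power) for name, (p, power) in strata.items()}
print(plan)  # {'port': 29, 'urban': 32, 'rural': 35}
```

Resource constraints would then be imposed by capping the summed device and labour costs and re-balancing the per-stratum powers until the plan fits the budget.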
Abstract:
The book within which this chapter appears is published as a research reference book (not a coursework textbook) on Management Information Systems (MIS) for seniors or graduate students in Chinese universities. It is hoped that this chapter, along with the others, will be helpful to MIS scholars and PhD/Masters research students in China who seek an understanding of several central Information Systems (IS) research topics and related issues. The subject of this chapter – ‘Evaluating Information Systems’ – is broad, and cannot be addressed in its entirety in any depth within a single book chapter. The chapter proceeds from the truism that organizations have limited resources and that those resources need to be invested in a way that provides the greatest benefit to the organization. IT expenditure represents a substantial portion of any organization’s investment budget, and IT-related innovations have broad organizational impacts. Evaluation of the impact of this major investment is essential to justify the expenditure both pre- and post-investment, and to prioritize possible improvements. The chapter (and most of the literature reviewed herein) admittedly assumes a black-box view of IS/IT, emphasizing measures of its consequences (e.g. for organizational performance or the economy) or perceptions of its quality from a user perspective. This reflects the MIS emphasis – a ‘management’ emphasis rather than a software engineering emphasis, where the latter might focus on technical characteristics and technical performance. Though a black-box approach limits the diagnostic specificity of findings from a technical perspective, it offers many benefits: in addition to superior management information, these may include economy of measurement and comparability of findings (e.g. see Part 4 on benchmarking IS). The chapter does not purport to be a comprehensive treatment of the relevant literature. It does, however, reflect many of the more influential works, and a representative range of important writings in the area. The author has been somewhat opportunistic in Part 2, employing a single journal – the Journal of Strategic Information Systems – to derive a classification of literature in the broader domain. Nonetheless, the arguments for this approach are believed to be sound, and the value of the exercise real. The chapter drills down from the general to the specific. It commences with a high-level overview of the general topic area, in two parts: Part 1 addresses existing research in the more comprehensive IS research outlets (e.g. MISQ, JAIS, ISR, JMIS, ICIS), and Part 2 addresses existing research in a key specialist outlet (the Journal of Strategic Information Systems). Subsequently, in Part 3, the chapter narrows to focus on the sub-topic ‘Information Systems Success Measurement’, then drills deeper to become even more focused in Part 4 on ‘Benchmarking Information Systems’. In other words, the chapter drills down from Parts 1 and 2 (the value of IS), to Part 3 (measuring IS success), to Part 4 (benchmarking IS). While the commencing parts (1 and 2) are by definition broadly relevant to the chapter topic, the subsequent, more focused parts (3 and 4) admittedly reflect the author’s more specific interests. Thus, the three chapter foci – the value of IS, measuring IS success, and benchmarking IS – are not mutually exclusive; rather, each subsequent focus is in most respects a sub-set of the preceding one.
Parts 1 and 2, ‘the Value of IS’, take a broad view, with much emphasis on the business value of IS, or the relationship between information technology and organizational performance. Part 3, ‘Information System Success Measurement’, focuses more specifically on the measures and constructs employed in empirical research into the drivers of IS success (ISS). DeLone and McLean (1992) inventoried and rationalized disparate prior measures of ISS into six constructs – System Quality, Information Quality, Individual Impact, Organizational Impact, Satisfaction and Use – later suggesting a seventh construct, Service Quality (DeLone and McLean 2003). These constructs have been used extensively, individually or in combination, as the dependent variable in research seeking to better understand the important antecedents or drivers of IS success. Part 3 reviews this body of work. Part 4, ‘Benchmarking Information Systems’, drills deeper again, focusing more specifically on a measure of the IS that can be used as a ‘benchmark’. This section consolidates and extends the work of the author and his colleagues to derive a robust, validated IS-Impact measurement model for benchmarking contemporary Information Systems. Though IS-Impact, like ISS, has potential value in empirical, causal research, its design and validation have emphasized its role and value as a comparator: a measure that is simple, robust and generalizable, and which yields results that are as far as possible comparable across time, across stakeholders, and across differing systems and system contexts.
Abstract:
3D motion capture is a medium that plots motion, typically human motion, converting it into a form that can be represented digitally. It is a fast-evolving field, and recent inertial technology may provide new artistic possibilities for its use in live performance. Although not often used in this context, motion capture has a combination of attributes that can provide unique forms of collaboration with the performance arts. The inertial motion capture suit used for this study has orientation sensors placed at strategic points on the body to map body motion. Its portability, real-time performance, ease of use, and immunity from the line-of-sight problems inherent in optical systems suggest it would work well as a live performance technology. Many animation techniques can be used in real time. This research examines a broad cross-section of these techniques using four practice-led cases to assess the suitability of inertial motion capture for live performance. Although each case explores different visual possibilities, all make use of the performativity of the medium, using either an improvisational format or interactivity among stage, audience and screen that would be difficult to emulate any other way. A real-time environment is not capable of reproducing the depth and sophistication of animation that people have come to expect through media, since such animation takes many hours to render. In time, the combination of what can be produced in real time and the tools available in a 3D environment will no doubt create its own tree of aesthetic directions in live performance. The case studies also examine the potential for interactivity that this technology offers.
Abstract:
Motor vehicles are a major source of gaseous and particulate matter pollution in urban areas, particularly of ultrafine particles (diameters < 0.1 µm). Exposure to particulate matter has been found to be associated with serious health effects, including respiratory and cardiovascular disease, and mortality. Particle emissions generated by motor vehicles span a very broad size range (from around 0.003–10 µm) and are measured as different subsets of particle mass concentration or particle number count. However, there are scientific challenges in analysing and interpreting the large data sets on motor vehicle emission factors, and little understanding of how different particle metrics might serve as a basis for air quality regulation. To date, a comprehensive inventory covering the broad size range of particles emitted by motor vehicles, and which includes particle number, does not exist anywhere in the world. This thesis covers research related to four important and interrelated aspects of particulate matter generated by motor vehicle fleets: the derivation of suitable particle emission factors for use in transport modelling and health impact assessments; the quantification of motor vehicle particle emission inventories; the investigation of modality within particle size distributions as a potential basis for developing air quality regulation; and the review and synthesis of current knowledge on ultrafine particles as it relates to motor vehicles – together with the application of these aspects to the quantification, control and management of motor vehicle particle emissions. In order to quantify emissions in terms of a comprehensive inventory covering the full size range of particles emitted by motor vehicle fleets, it was necessary to derive a suitable set of particle emission factors for different vehicle and road type combinations for particle number, particle volume, and PM1, PM2.5 and PM10 (mass concentrations of particles with aerodynamic diameters < 1 µm, < 2.5 µm and < 10 µm respectively). The very large data set of emission factors analysed in this study was sourced from measurement studies conducted in developed countries, and hence the derived set of emission factors is suitable for preparing inventories in other urban regions of the developed world. These emission factors are particularly useful for regions that lack the measurement data to derive their own, or where experimental data are available but of insufficient scope. The comprehensive particle emissions inventory presented in this thesis is the first published inventory of tailpipe particle emissions prepared for a motor vehicle fleet that covers the full size range of particles emitted by vehicles, based on measurement data. The inventory quantified particle emissions measured in terms of particle number and different particle mass size fractions. It was developed for the urban South-East Queensland fleet in Australia, and included testing the particle emission implications of future scenarios for different passenger and freight travel demand. The thesis also presents evidence of the usefulness of examining modality within particle size distributions as a basis for developing air quality regulations, and finds evidence to support the relevance of introducing a new PM1 mass ambient air quality standard for the majority of environments worldwide.
The study found that a combination of PM1 and PM10 standards is likely to be a more discerning and suitable set of ambient air quality standards for controlling particles emitted from combustion and mechanically generated sources, such as motor vehicles, than the current mass standards of PM2.5 and PM10. The study also reviewed and synthesized existing knowledge on ultrafine particles, with a specific focus on those originating from motor vehicles. It found that motor vehicles are significant contributors to both air pollution and ultrafine particles in urban areas, and that a standardized measurement procedure is not currently available for ultrafine particles. The review found that discrepancies exist between the outcomes of different instrumentation used to measure ultrafine particles; that few data are available on ultrafine particle chemistry and composition, long-term monitoring, or the characterization of their spatial and temporal distribution in urban areas; and that no particle number inventories are available for motor vehicle fleets. This knowledge is critical for epidemiological studies and exposure-response assessment. Conclusions from this review included the recommendation that ultrafine particles in populated urban areas be considered a likely target for future air quality regulation based on particle number, due to their potential impacts on the environment. The research in this PhD thesis successfully integrated the elements needed to quantify and manage motor vehicle fleet emissions, and its novelty lies in combining expertise from two distinctly separate disciplines: aerosol science and transport modelling. The new knowledge and concepts developed in this PhD research provide previously unavailable data and methods which can be used to develop comprehensive, size-resolved inventories of motor vehicle particle emissions, and air quality regulations to control particle emissions to protect the health and well-being of current and future generations.
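The inventory calculation itself reduces to summing emission factor × activity over fleet segments for each particle metric. A hedged sketch with entirely illustrative numbers (the thesis's actual emission factors and South-East Queensland travel data are not reproduced here):

```python
# (vehicle type, road type) -> emission factor, particles per vehicle-km (illustrative)
emission_factors = {
    ("passenger", "urban"): 2.0e14,
    ("passenger", "highway"): 5.0e14,
    ("heavy_duty", "urban"): 2.0e15,
}
# Same fleet segments -> annual vehicle-km travelled (illustrative)
vkt = {
    ("passenger", "urban"): 1.2e10,
    ("passenger", "highway"): 8.0e9,
    ("heavy_duty", "urban"): 9.0e8,
}

# Inventory = sum over fleet segments of emission factor x activity.
total = sum(ef * vkt[segment] for segment, ef in emission_factors.items())
print(f"Fleet particle number inventory: {total:.2e} particles/year")
```

The same loop, run with mass-based factors (e.g. mg per vehicle-km), yields the PM1, PM2.5 and PM10 inventories.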
Abstract:
Unresolved painful emotional experiences such as bereavement, trauma and disturbances in core relationships, are common presenting problems for clients of psychodrama or psychotherapy more generally. Emotional pain is experienced as a shattering of the sense of self and disconnection from others and, when unresolved, produces avoidant responses which inhibit the healing process. There is agreement across therapeutic modalities that exposure to emotional experience can increase the efficacy of therapeutic interventions. Moreno proposes that the activation of spontaneity is the primary curative factor in psychodrama and that healing occurs when the protagonist (client) engages with his or her wider social system and develops greater flexibility in response to that system. An extensive case-report literature describes the application of the psychodrama method in healing unresolved painful emotional experiences, but there is limited empirical research to verify the efficacy of the method or to identify the processes that are linked to therapeutic change. The purpose of this current research was to construct a model of protagonist change processes that could extend psychodrama theory, inform practitioners’ therapeutic decisions and contribute to understanding the common factors in therapeutic change. Four studies investigated protagonist processes linked to in-session resolution of painful emotional experiences. Significant therapeutic events were analysed using recordings and transcripts of psychodrama enactments, protagonist and director recall interviews and a range of process and outcome measures. A preliminary study (3 cases) identified four themes that were associated with helpful therapeutic events: enactment, the working alliance with the director and with group members, emotional release or relief and social atom repair. The second study (7 cases) used Comprehensive Process Analysis (CPA) to construct a model of protagonists’ processes linked to in-session resolution. This model was then validated across four more cases in Study 3. Five meta-processes were identified: (i) a readiness to engage in the psychodrama process; (ii) re-experiencing and insight; (iii) activating resourcefulness; (iv) social atom repair with emotional release and (v) integration. Social atom repair with emotional release involved deeply experiencing a wished-for interpersonal experience accompanied by a free flowing release of previously restricted emotion and was most clearly linked to protagonists’ reports of reaching resolution and to post session improvements in interpersonal relationships and sense of self. Acceptance of self in the moment increased protagonists’ capacity to generate new responses within each meta-process and, in resolved cases, there was evidence of spontaneity developing over time. The fourth study tested Greenberg’s allowing and accepting painful emotional experience model as an alternative explanation of protagonist change. The findings of this study suggested that while the process of allowing emotional pain was present in resolved cases, Greenberg’s model was not sufficient to explain the processes that lead to in-session resolution. The protagonist’s readiness to engage and activation of resourcefulness appear to facilitate the transition from problem identification to emotional release. Furthermore, experiencing a reparative relationship was found to be central to the healing process. 
This research verifies that there can be in-session resolution of painful emotional experience during psychodrama and protagonists’ reports suggest that in-session resolution can heal the damage to the sense of self and the interpersonal disconnection that are associated with unresolved emotional pain. A model of protagonist change processes has been constructed that challenges the view of psychodrama as a primarily cathartic therapy, by locating the therapeutic experience of emotional release within the development of new role relationships. The five meta-processes which are described within the model suggest broad change principles which can assist practitioners to make sense of events as they unfold and guide their clinical decision making in the moment. Each meta-process was linked to specific post-session changes, so that the model can inform the development of therapeutic plans for individual clients and can aid communication for practitioners when a psychodrama intervention is used for a specific therapeutic purpose within a comprehensive program of therapy.
Abstract:
This article considers the distinctive ways in which the Special Broadcasting Service (SBS) has evolved over its history since 1980, and how it has managed competing claims to being a multicultural yet broad-appeal broadcaster, and a comprehensive yet low-cost media service. It draws attention to the challenges presented by a global rethinking of the nature of citizenship and its relationship to media, for which SBS is well placed as a leader, and the challenges of online media for traditional public service media models, where SBS has arguably been a laggard, particularly when compared with the Australian Broadcasting Corporation (ABC). It notes recent work undertaken by the author and others on user-created content strategies at SBS and on how its online news and current affairs services have been evolving in recent years.
Abstract:
The two outcome indices described in a companion paper (Sanson et al., Child Indicators Research, 2009) were developed using data from the Longitudinal Study of Australian Children (LSAC). These indices, one for infants and the other for 4 to 5 year old children, were designed to fill the need for parsimonious measures of children’s developmental status to be used in analyses by a broad range of data users and to guide government policy and interventions to support young children’s optimal development. This paper presents evidence from Wave 1 data from LSAC to support the validity of these indices and their three domain scores of Physical, Social/Emotional, and Learning. Relationships between the indices and child, maternal, family, and neighborhood factors which are known to relate concurrently to child outcomes were examined. Meaningful associations were found with the selected variables, thereby demonstrating the usefulness of the outcome indices as tools for understanding children’s development in their family and socio-cultural contexts. It is concluded that the outcome indices are valuable tools for increasing understanding of influences on children’s development, and for guiding policy and practice to optimize children’s life chances.
Abstract:
The Longitudinal Study of Australian Children (LSAC) is a major national study examining the lives of Australian children, using a cross-sequential cohort design and data from parents, children, and teachers for 5,107 infants (3–19 months) and 4,983 children (4–5 years). Its data are publicly accessible and are used by researchers from many disciplinary backgrounds. It contains multiple measures of children’s developmental outcomes as well as a broad range of information on the contexts of their lives. This paper reports on the development of summary outcome indices of child development using the LSAC data. The indices were developed to fill the need for indicators suitable for use by diverse data users in order to guide government policy and interventions which support young children’s optimal development. The concepts underpinning the indices and the methods of their development are presented. Two outcome indices (infant and child) were developed, each consisting of three domains—health and physical development, social and emotional functioning, and learning competency. A total of 16 measures are used to make up these three domains in the Outcome Index for the Child Cohort and six measures for the Infant Cohort. These measures are described and evidence supporting the structure of the domains and their underlying latent constructs is provided for both cohorts. The factorial structure of the Outcome Index is adequate for both cohorts, but was stronger for the child than infant cohort. It is concluded that the LSAC Outcome Index is a parsimonious measure representing the major components of development which is suitable for non-specialist data users. A companion paper (Sanson et al. 2010) presents evidence of the validity of the Index.
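The papers do not spell out the aggregation rule, but a composite index of this kind is typically built by standardising each measure, averaging within domains, and averaging the domain scores. A sketch under that assumption (measure values are illustrative, not LSAC data):

```python
import statistics

# Illustrative standardised measures grouped by the three LSAC domains;
# the real Child Cohort index uses 16 measures, the Infant Cohort six.
children = {
    "child_a": {"physical": [0.4, 0.9], "social_emotional": [0.1], "learning": [0.7, 0.2]},
    "child_b": {"physical": [0.6, 0.3], "social_emotional": [0.8], "learning": [0.5, 0.9]},
}

def domain_score(measures):
    """Domain score as the mean of that domain's standardised measures."""
    return statistics.mean(measures)

def outcome_index(domains):
    """Overall index as the unweighted mean of the three domain scores (assumed rule)."""
    return statistics.mean(domain_score(m) for m in domains.values())

for child, domains in children.items():
    print(child, {d: round(domain_score(m), 2) for d, m in domains.items()},
          "index:", round(outcome_index(domains), 3))
```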
Abstract:
Paropsis atomaria is a recently emerged pest of eucalypt plantations in subtropical Australia. Its broad host range of at least 20 eucalypt species and wide geographical distribution give it the potential to become a serious forestry pest both within Australia and, if accidentally introduced, overseas. Although populations of P. atomaria are genetically similar throughout its range, population dynamics differ between regions. Here, we determine temperature-dependent developmental requirements using beetles sourced from temperate and subtropical zones, calculating lower temperature thresholds, temperature-induced mortality, and day-degree requirements. We combine these data with field mortality estimates of immature life stages to produce a cohort-based model, ParopSys, using DYMEX™, that accurately predicts the timing, duration, and relative abundance of life stages in the field and the number of generations in a spring–autumn (September–May) field season. Voltinism was identified as a seasonally plastic trait dependent upon environmental conditions, with two generations observed and predicted in the Australian Capital Territory, and up to four in Queensland. Lower temperature thresholds for development ranged between 4 and 9 °C, and overall development rates did not differ according to beetle origin. Total immature development time (egg–adult) was approximately 769.2 ± 127.8 (S.E.) day-degrees above a lower temperature threshold of 6.4 ± 2.6 (S.E.) °C. ParopSys provides a basic tool enabling forest managers to use the number of generations and seasonal fluctuations in the abundance of damaging life stages to estimate the pest risk of P. atomaria prior to plantation establishment, and to predict the occurrence and duration of damaging life stages in the field. Additionally, using local climatic data, the pest potential of P. atomaria can be estimated to predict the risk of it establishing if accidentally introduced overseas. Improvements to ParopSys’ capability and complexity can be made as more biological data become available.
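Using the egg–adult requirement and threshold reported above, development timing can be sketched with a simple degree-day accumulation (the averaging method here is the basic one; the actual ParopSys/DYMEX™ model will use its own, likely more refined, temperature response):

```python
def daily_degree_days(t_min, t_max, threshold=6.4):
    """Average method: DD = max(0, mean of daily min/max - threshold)."""
    return max(0.0, (t_min + t_max) / 2.0 - threshold)

def days_to_adult(daily_temps, requirement=769.2, threshold=6.4):
    """Days until accumulated degree-days reach the egg-adult requirement."""
    accumulated = 0.0
    for day, (t_min, t_max) in enumerate(daily_temps, start=1):
        accumulated += daily_degree_days(t_min, t_max, threshold)
        if accumulated >= requirement:
            return day
    return None  # requirement not met within the supplied season

# A hypothetical season averaging 21 C accrues ~14.6 DD/day,
# giving egg-adult development in ~53 days.
season = [(16.0, 26.0)] * 120
print(days_to_adult(season))  # 53
```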
Abstract:
Rapid advances in information and communications technology (ICT) – particularly the development of online technologies – have transformed the nature of economic, social and cultural relations across the globe. In the context of higher education in post-industrial societies, technological change has had a significant impact on university operating environments. In a broad sense, technological advancement has contributed significantly to the increasing complexity of global economies and societies, which is reflected in the rise of lifelong learning discourses with which universities are engaging. More specifically, the ever-expanding array of ICT available within the university sector has generated new management and pedagogical imperatives for higher education in the information age.
Abstract:
The New Zealand creative sector was responsible for almost 121,000 jobs at the time of the 2006 Census (6.3% of total employment). These are divided between:
• 35,751 creative specialists – persons employed doing creative work in creative industries;
• 42,300 support workers – persons providing management and support services in creative industries;
• 42,792 embedded creative workers – persons engaged in creative work in other types of enterprise.
The most striking feature of this breakdown is that the largest group of creative workers is employed outside the creative industries, i.e. in other types of businesses. Even within the creative industries, fewer people are directly engaged in creative work than in providing management and support. Creative sector employees earned incomes of approximately $52,000 per annum at the time of the 2006 Census. This is relatively uniform across all three types of creative worker, and is significantly above the average for all employed persons (approximately $40,700). Creative employment and incomes grew strongly over both five-year periods between the 1996, 2001 and 2006 Censuses. However, when we compare creative and general trends, we see two distinct phases in the development of the creative sector:
• rapid structural growth over the five years to 2001 (especially led by developments in ICT), with creative employment and incomes increasing rapidly at a time when they were growing modestly across the whole economy;
• subsequent consolidation, with growth driven more by national economic expansion than by structural change, and creative employment and incomes moving in parallel with strong economy-wide growth.
Other important trends revealed by the data are that:
• the strongest growth during the decade was in embedded creative workers, especially over the first five years; the weakest growth was in creative specialists, with support workers in creative industries in the middle rank;
• by far the strongest growth in creative industries’ employment was in Software & Digital Content, which trebled in size over the decade.
Comparing New Zealand with the United Kingdom and Australia, the two southern hemisphere nations have significantly lower proportions of total employment in the creative sector (both in creative industries and in embedded employment). New Zealand’s and Australia’s creative shares in 2001 were similar (5.4% each), but over the following five years our share expanded (to 5.7%) whereas Australia’s fell slightly (to 5.2%) – in both cases through changes in creative industries’ employment. The creative industries generated $10.5 billion in total gross output in the March 2006 year. This resulted in value added totalling $5.1b, representing 3.3% of New Zealand’s total GDP. Overall, value added in the creative industries represents 49% of industry gross output, which is higher than the average across the whole economy (45%). This reflects the relatively high labour intensity and high earnings of the creative industries. Industries with an above-average ratio of value added to gross output are usually labour-intensive, especially when wages and salaries are above average. This is true for Software & Digital Content and Architecture, Design & Visual Arts, with ratios of 60.4% and 55.2% respectively. However, there is significant variation in this ratio between different parts of the creative industries, with some parts (e.g. Software & Digital Content and Architecture, Design & Visual Arts) generating even higher value added relative to output, and others (e.g. TV & Radio, Publishing and Music & Performing Arts) less, because of high capital intensity and import content. When we take into account the impact of the creative industries’ demand for goods and services from their suppliers, and consumption spending from the incomes earned, we estimate an addition to economic activity of:
• $30.9 billion in gross output ($41.4b in total);
• $15.1b in value added ($20.3b in total);
• 158,100 people employed (234,600 in total).
The total economic impact of the creative industries is approximately four times their direct output and value added, and three times their direct employment. Their effect on output and value added is roughly in line with the average over all industries, although the effect on employment is significantly lower. This is because of the relatively high labour intensity (and high earnings) of the creative industries, which generate below-average demand from suppliers, but normal levels of demand through expenditure from incomes. Drawing on these numbers and conclusions, we suggest some (slightly speculative) directions for future research. The goal is to better understand the contribution the creative sector makes to productivity growth; in particular, the distinctive contributions of creative firms and embedded creative workers. The ideas for future research can be organised into several categories:
• Understand the categories of the creative sector – who is doing the business? In other words, examine via more fine-grained research (perhaps at a firm level) just what the creative contribution is from the different parts of the creative sector; it may be possible to categorise these in terms of more or less striking innovations.
• Investigate the relationship between the characteristics and the performance of the various creative industries/sectors.
• Look more closely at innovation at an industry level, e.g. using an index of relative growth of exports, and see whether this can be related to intensity of use of creative inputs.
• Undertake case studies of the creative sector.
• Undertake case studies of the embedded contribution to growth, by examining several high-performing non-creative firms and industries that employ creative workers (in the same way as proposed for the creative sector).
• Look at the aggregates – drawing on the broad picture of the numbers of creative workers embedded within different industries, consider the extent to which these might explain aspects of the industries’ varied performance in terms of exports, growth and so on. This might be extended to examine issues such as the types of creative workers that are most effective when embedded, or to test the hypothesis that each industry has its own particular requirements for embedded creative workers that overwhelm any generic contributions from, say, design or IT.
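A quick arithmetic check of the ratios quoted above, using only the figures reported in this abstract (direct employment is the implied difference between the totals):

```python
# Reported figures, March 2006 year (NZ$).
gross_output_direct = 10.5e9   # creative industries' gross output
value_added_direct = 5.1e9     # direct value added (3.3% of GDP)
value_added_total = 20.3e9     # direct plus flow-on value added
employment_direct = 76_500     # implied: 234,600 total less 158,100 flow-on

print(f"VA/GO ratio: {value_added_direct / gross_output_direct:.0%}")   # ~49%
print(f"VA multiplier: {value_added_total / value_added_direct:.1f}x")  # ~4.0x
print(f"Employment multiplier: {234_600 / employment_direct:.1f}x")     # ~3.1x
```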
Abstract:
The international focus on embracing daylighting for energy-efficient lighting purposes, and the corporate sector’s indulgence in the perception of workplace and work practice ‘transparency’, has spurred an increase in highly glazed commercial buildings. This in turn has renewed issues of visual comfort and daylight-derived glare for occupants. In order to ascertain evidence, or predict risk, of these events, appraisals of these complex visual environments require detailed information on the luminances present in an occupant’s field of view. Conventional luminance meters are an expensive and time-consuming method of achieving these results: to create a luminance map of an occupant’s visual field using such a meter requires too many individual measurements to be a practical measurement technique. The application of digital cameras as luminance measurement devices has solved this problem. With high dynamic range imaging, a single digital image can be created to provide luminances on a pixel-by-pixel level within the broad field of view afforded by a fish-eye lens: virtually replicating an occupant’s visual field and providing rapid yet detailed luminance information for the entire scene. With proper calibration, relatively inexpensive digital cameras can be successfully applied to the task of luminance measurement, placing them in the realm of tools that any lighting professional should own. This paper discusses how a digital camera can become a luminance measurement device, and then presents an analysis of results obtained from post-occupancy measurements from building assessments conducted by the Mobile Architecture Built Environment Laboratory (MABEL) project. This discussion leads to the important realisation that the placement of such tools in the hands of lighting professionals internationally will provide new opportunities for the lighting community in terms of research on critical issues in lighting, such as daylight glare and visual quality and comfort.
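As a sketch of the conversion step, per-pixel luminance can be derived from linear HDR pixel values using standard photopic weights; the 179 lm/W factor is the Radiance convention, and the per-camera calibration factor k is an assumption that must be fitted against a reference luminance meter reading:

```python
import numpy as np

def luminance_map(rgb, k=1.0):
    """Per-pixel luminance (cd/m^2) from linear HDR RGB.

    Uses Rec.709 photopic weights and the Radiance-convention 179 lm/W
    scaling; k is a camera-specific calibration factor (assumed here).
    """
    weights = np.array([0.2126, 0.7152, 0.0722])
    return 179.0 * k * (rgb @ weights)

# Synthetic 2x2 linear HDR patch standing in for a calibrated fisheye capture.
hdr = np.array([[[0.10, 0.10, 0.10], [1.00, 1.00, 1.00]],
                [[5.00, 4.00, 3.00], [0.02, 0.03, 0.01]]])
print(luminance_map(hdr))  # luminances in cd/m^2, one value per pixel
```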
Abstract:
Low back pain is an increasing problem in industrialised countries and, although it is a major socio-economic problem in terms of medical costs and lost productivity, relatively little is known about the processes underlying the development of the condition. This is in part due to the complex interactions between bone, muscle, nerves and other soft tissues of the spine, and the fact that direct observation and/or measurement of the human spine is not possible using non-invasive techniques. Biomechanical models have been used extensively to estimate the forces and moments experienced by the spine. These models provide a means of estimating internal parameters which cannot be measured directly. However, application of most of the models currently available is restricted to tasks resembling those for which the model was designed, due to the simplified representation of the anatomy. The aim of this research was to develop a biomechanical model to investigate the changes in forces and moments which are induced by muscle injury. In order to accurately simulate muscle injuries, a detailed quasi-static three-dimensional model representing the anatomy of the lumbar spine was developed. This model includes the nine major force-generating muscles of the region (erector spinae, comprising the longissimus thoracis and iliocostalis lumborum; multifidus; quadratus lumborum; latissimus dorsi; transverse abdominis; internal oblique and external oblique), as well as the thoracolumbar fascia, through which the transverse abdominis and parts of the internal oblique and latissimus dorsi muscles attach to the spine. The muscles included in the model are represented using 170 muscle fascicles, each having its own force-generating characteristics and line of action. Particular attention has been paid to ensuring the muscle lines of action are anatomically realistic, particularly for muscles which have broad attachments (e.g. internal and external obliques), muscles which attach to the spine via the thoracolumbar fascia (e.g. transverse abdominis), and muscles whose paths are altered by bony constraints such as the rib cage (e.g. iliocostalis lumborum pars thoracis and parts of the longissimus thoracis pars thoracis). To this end, a separate sub-model, which accounts for the shape of the torso by modelling it as a series of ellipses, has been developed to model the lines of action of the oblique muscles. Likewise, a separate sub-model of the thoracolumbar fascia has been developed which accounts for the middle and posterior layers of the fascia, and ensures that the line of action of the posterior layer is related to the size and shape of the erector spinae muscle. Published muscle activation data are used to enable the model to predict the maximum forces and moments that may be generated by the muscles. These predictions are validated against published experimental studies reporting maximum isometric moments for a variety of exertions. The model performs well for flexion, extension and lateral bend exertions, but underpredicts the axial twist moments that may be developed. This discrepancy is most likely the result of differences between the experimental methodology and the modelled task. The application of the model is illustrated using examples of muscle injuries created by surgical procedures. The three examples represent a posterior surgical approach to the spine, an anterior approach to the spine, and unilateral total hip replacement surgery.
Although the three examples simulate different muscle injuries, all demonstrate the production of significant asymmetrical moments and/or reduced joint compression following surgical intervention. This result has implications for patient rehabilitation and the potential for further injury to the spine. The development and application of the model has highlighted a number of areas where current knowledge is deficient. These include muscle activation levels for tasks in postures other than upright standing, changes in spinal kinematics following surgical procedures such as spinal fusion or fixation, and a general lack of understanding of how the body adjusts to muscle injuries with respect to muscle activation patterns and levels, rate of recovery from temporary injuries and compensatory actions by other muscles. Thus the comprehensive and innovative anatomical model which has been developed not only provides a tool to predict the forces and moments experienced by the intervertebral joints of the spine, but also highlights areas where further clinical research is required.
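The core mechanical step such a model repeats for each of its 170 fascicles is the moment contribution about an intervertebral joint. A simplified straight-line sketch (the thesis models curved paths via the torso-ellipse and fascia sub-models; the coordinates and force here are purely illustrative):

```python
import numpy as np

def fascicle_moment(origin, insertion, joint_centre, force_magnitude):
    """Moment (Nm) about a joint centre: M = r x F, with the force directed
    along a straight line of action from insertion toward origin."""
    line = np.asarray(origin, float) - np.asarray(insertion, float)
    f = force_magnitude * line / np.linalg.norm(line)                   # force vector (N)
    r = np.asarray(insertion, float) - np.asarray(joint_centre, float)  # lever arm (m)
    return np.cross(r, f)

# Illustrative coordinates (m) for one extensor fascicle crossing L4/L5.
m = fascicle_moment(origin=[0.0, -0.05, 0.30],
                    insertion=[0.0, -0.06, 0.05],
                    joint_centre=[0.0, 0.0, 0.07],
                    force_magnitude=100.0)
print(m)  # moment components about the x, y and z axes
```

Summing such contributions over all fascicles, with the forces of injured muscles set to zero, exposes the kind of asymmetrical moments described above.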
Abstract:
This thesis examines the culture of contemporary writers’ festivals in an international context. In the last five decades writers’ festivals have emerged in cities across the world, and during this time they have expanded their literary discussions and debates to include numerous other topics of broad interest to society. To examine the expanded popularity and function of writers’ festivals, this thesis establishes a new vantage point for theorising the content now typically generated by these events, using concepts from urban festivals and public culture research. Importantly, this new vantage point addresses the limitations of current commentary on writers’ festivals, which routinely claims they trivialise literature and, more generally, contribute to the decline of public culture. The thesis presents two case studies: one on the Brisbane Writers Festival in Australia and the other on the International Festival of Authors in Toronto, Canada. The first case study, which focuses on the 2007 Brisbane Writers Festival, illustrates the many overlapping and often conflicting discourses and opinions productively discussed and debated at writers’ festivals. Key topics discussed and debated at the Festival include local topics about the host city – its history, literature and politics – as well as broader literary, political and celebrity culture topics. The diversity of topics discussed at the 2007 Brisbane Writers Festival is typical of the majority of writers’ festivals similarly located outside the largest geographic centres of global literary production and circulation, designated ‘peripheral’ festivals in this research. The second case study, on Toronto’s International Festival of Authors, examines the ways in which the 2006 Festival almost exclusively focussed on literary and celebrity culture discourses, and promoted itself on these terms. The 2006 International Festival of Authors’ discussion and debate of a narrow range of topics is typical of the few writers’ festivals located in global centres of literary production and circulation; unlike ‘peripheral’ festivals, they are not experiencing the same growth in number or popularity. The aim of these ‘international’ festivals is not to democratise their elite literary beginnings, but rather to promote ‘literature’ as a niche brand for quality writing that is valid on a global scale. This thesis asserts that while all writers’ festivals are influenced by the marketing desires of publishing companies, the aim of international writers’ festivals in marketing to a virtually and globally connected elite literary audience makes them more susceptible to declines in audience and author participation.