904 results for Turn Around Time


Relevance: 30.00%

Publisher:

Abstract:

This body of photographic work was created, first, to explore a new approach to practice-led research that uses an “action genre” approach to reflective practice (Lemke) and, second, to visually explore human interaction with the fundamental element of life: water. The first aim rests on the contention that to understand the meanings inherent in photographs we cannot look merely at the end result; it is essential to keep looking at the actions of practitioners, and the influences upon them, to determine how external influences affect the meaning potential of editorial photographs (Grayson, 2012). WATER therefore provides an ideal platform to reflect upon the actions and influences involved in creating work within the photographic genre of photojournalism. It enables this practitioner to reflect on each stage of production to gain a better understanding of how external influences shape the narrative potential of the images created. Photographers creating images experience multi-faceted influences that, in turn, help construct and present the narrative potential of editorial photographs, and there is an important relationship between professional photographers and the technical, cultural, economic and institutional forces that impinge upon all stages of production and publication. What results is a greater understanding of those forces. Therefore, to understand the meanings inherent in the photographs within WATER, I do not look merely at the end result; the project provides a case study of my actions in the field, and the influences upon me, to determine how external influences affect the meaning potential of these photographs (Grayson, 2012).
As a result, this project adds to the body of scholarship around the definition of photojournalism and how it has adapted to the current media environment, and provides scope for further research into emerging genres within editorial photography, such as citizen photojournalism. Concurrently, the photographs themselves were created to visually explore how a humanistic desire to interact with the natural form of water persists even within a modern cosmopolitan life lived around it. Taking a photojournalistic approach to exploring this phenomenon, the images were created by “capturing moments as they happened”, with no posing or setting up of images. This serendipitous approach to the photographic medium at least allows the practitioner to attempt to direct the subjectivity contained explicitly in photographs. What results is a series of images that extends the visual dialogue around the role of water within modern humanistic lifestyles and how it remains an integral part of our society’s behaviors. It captures important moments that document this relationship at this time of modern development. The resulting works were exhibited and published as part of the Head On Photo Festival (Australia’s largest photo festival and the world’s second largest), held in Sydney, 20–24 May 2013. The WATER series was curated by three Magnum members: Ian Berry, Eli Reed and Chris Steele-Perkins. Magnum is a highly regarded international photographic co-operative with editorial offices in New York, London, Paris and Tokyo. The works were projected as part of the official festival programme, presented to both members of the public and Sydney’s photography professionals. In addition, a sample of images from the WATER series was chosen for inclusion in the Magnum-published hardcover book.

References
Grayson, Louise. 2012. “Editorial photographs and patterns of practice.” Journalism Practice. http://www.tandfonline.com/doi/abs/10.1080/17512786.2012.726836#.UbZN-L--1RQ
Lemke, Jay. 1995. Textual Politics: Discourse and Social Dynamics. London: Taylor & Francis.

Relevance: 30.00%

Publisher:

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information, whether that is material for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or signals which give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting those which need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as authoritative sources.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and women create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, as with American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created — in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
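The tweet-triage idea in the first paper (content scoring plus user profiling) can be sketched in a few lines. Everything below is illustrative: the keyword lists, weights, account names, and threshold are invented for the sketch, not taken from the panel papers.

```python
# Hypothetical triage of a tweet stream: score each tweet for topic
# relevance and urgency (content analysis) plus a bonus for known
# authoritative accounts (user profiling), then surface only tweets
# above a threshold for manual review by responders.
URGENT_TERMS = {"trapped": 3, "injured": 3, "help": 2, "evacuate": 2}
TOPIC_TERMS = {"flood": 2, "storm": 1, "river": 1}
AUTHORITATIVE = {"emergency_authority", "local_news"}  # assumed account names

def score_tweet(text, author):
    words = text.lower().split()
    topic = sum(TOPIC_TERMS.get(w, 0) for w in words)
    urgency = sum(URGENT_TERMS.get(w, 0) for w in words)
    profile = 2 if author in AUTHORITATIVE else 0  # user-profiling component
    return topic + urgency + profile

def triage(tweets, threshold=4):
    # Keep only tweets scored highly enough for immediate human attention.
    return [t for t in tweets if score_tweet(t["text"], t["author"]) >= threshold]

tweets = [
    {"text": "Help we are trapped by the flood", "author": "resident1"},
    {"text": "Nice weather today", "author": "resident2"},
]
print(triage(tweets))
```

In practice the keyword weights would themselves be refined iteratively as the event's vocabulary shifts, which is the feedback loop the abstract describes.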

Relevance: 30.00%

Publisher:

Abstract:

Background
Transmission of Plasmodium vivax malaria is dependent on vector availability, biting rates and parasite development. In turn, each of these is influenced by climatic conditions. Correlations have previously been detected between seasonal rainfall, temperature and malaria incidence patterns in various settings. An understanding of seasonal patterns of malaria, and their weather drivers, can provide vital information for control and elimination activities. This research aimed to describe temporal patterns in malaria, rainfall and temperature, and to examine the relationships between these variables within four counties of Yunnan Province, China.
Methods
Plasmodium vivax malaria surveillance data (1991–2006), and average monthly temperature and rainfall, were acquired. Seasonal trend decomposition was used to examine secular trends and seasonal patterns in malaria. Distributed lag non-linear models were used to estimate the weather drivers of malaria seasonality, including the lag periods between weather conditions and malaria incidence.
Results
There was a declining trend in malaria incidence in all four counties. Increasing temperature resulted in increased malaria risk in all four areas, while increasing rainfall resulted in increased malaria risk in one area and decreased malaria risk in another. The lag times for these associations varied between areas.
Conclusions
The differences detected between the four counties highlight the need for local understanding of seasonal patterns of malaria and its climatic drivers.

Relevance: 30.00%

Publisher:

Abstract:

In Thomas Mann’s tetralogy of the 1930s and 1940s, Joseph and His Brothers, the narrator declares history is not only “that which has happened and that which goes on happening in time,” but it is also “the stratified record upon which we set our feet, the ground beneath us.” By opening up history to its spatial, geographical, and geological dimensions, Mann both predicts and encapsulates the twentieth century’s “spatial turn,” a critical shift that divested geography of its largely passive role as history’s “stage” and brought to the fore intersections between the humanities and the earth sciences. In this paper, I draw out the relationships between history, narrative, geography, and geology revealed by this spatial turn and the questions these pose for thinking about the disciplinary relationship between geography and the humanities. As Mann’s statement exemplifies, the spatial turn itself has often been captured most strikingly in fiction, and I would argue nowhere more so than in Graham Swift’s Waterland (1983) and Anne Michaels’s Fugitive Pieces (1996), both of which present space, place, and landscape as having a palpable influence on history and memory. The geographical/geological line that runs through both Waterland and Fugitive Pieces continues through Tim Robinson’s non-fictional, two-volume “topographical” history Stones of Aran. Robinson’s Stones of Aran—which is not history, not geography, and not literature, and yet is all three—constructs an imaginative geography that renders inseparable geography, geology, history, memory, and the act of writing.

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such “small” schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated in such a way that the number of “small” schemes is polynomial in the number of participants, which in turn gives rise to a polynomial-time construction. We also show that if we apply the Stinson construction to the “small” schemes arising from our new construction, both constructions achieve the same information rate.
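The flavour of a decomposition construction can be shown with a deliberately tiny example: cover the edges of the access-structure graph with stars, realise each star with a trivial perfect scheme, and give every participant one share per star it belongs to. This is an illustrative toy (a one-time-pad star scheme over 32-bit secrets), not the linear-programming formulation the paper actually develops.

```python
import secrets

def share_star(secret, star_id, centre, leaves):
    # Trivial perfect scheme for a star: the centre holds a random pad r,
    # every leaf holds r XOR secret. Only a centre-leaf pair (an edge of
    # the star) can combine shares to recover the secret; two leaves both
    # hold r XOR secret and learn nothing new.
    r = secrets.randbits(32)
    shares = {centre: [(star_id, "pad", r)]}
    for leaf in leaves:
        shares[leaf] = [(star_id, "masked", r ^ secret)]
    return shares

def distribute(secret, star_cover):
    # star_cover: list of (centre, leaves) stars covering every edge of
    # the access-structure graph; each participant accumulates one share
    # per star it belongs to.
    all_shares = {}
    for star_id, (centre, leaves) in enumerate(star_cover):
        for participant, share in share_star(secret, star_id, centre, leaves).items():
            all_shares.setdefault(participant, []).extend(share)
    return all_shares

def reconstruct(shares_u, shares_v):
    # An edge recovers the secret when its endpoints hold the pad and the
    # masked value from the same star; otherwise nothing is recovered.
    for sid_u, kind_u, val_u in shares_u:
        for sid_v, kind_v, val_v in shares_v:
            if sid_u == sid_v and {kind_u, kind_v} == {"pad", "masked"}:
                return val_u ^ val_v
    return None

# Path graph A-B-C (edges AB and BC) covered by a single star centred at B.
shares = distribute(1234, [("B", ["A", "C"])])
print(reconstruct(shares["A"], shares["B"]))  # 1234
print(reconstruct(shares["A"], shares["C"]))  # None (A-C is not an edge)
```

The efficiency question the paper addresses is precisely how many such sub-schemes the cover needs (and hence how many shares each participant carries), which is what the linear programme optimises.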

Relevance: 30.00%

Publisher:

Abstract:

Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, usually drawn from outside the arena of the enterprise, to expand activity through better service to current clients as well as to identify new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The over-50-year-old phrase expressing mistrust in computer systems, “garbage in, garbage out” or “GIGO”, is used to describe problems of unqualified and unquestioning dependency on information systems. However, a more relevant GIGO interpretation arose somewhat later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open datasets as well as “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable and unverifiable results. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. This paper discusses the pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment. Some appropriate technologies currently on offer are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research for the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
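One baseline integrity safeguard in the spirit of the services discussed above can be sketched with standard cryptographic digests: a published dataset is fingerprinted once, and consumers verify the fingerprint before any analytics run. The file name and contents below are purely illustrative, and this addresses only integrity, not the identity, naming, or audit services the paper calls for.

```python
# Fingerprint a dataset with SHA-256 and refuse to trust it if the
# digest no longer matches: a minimal "garbage in, gospel out" guard.
import hashlib
from pathlib import Path

def fingerprint(path, algorithm="sha256"):
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_digest):
    # Consumers check the published digest before analysing the data.
    return fingerprint(path) == expected_digest

dataset = Path("open_data.csv")  # illustrative file name
dataset.write_bytes(b"id,value\n1,42\n")
digest = fingerprint(dataset)
print(verify(dataset, digest))               # True
dataset.write_bytes(b"id,value\n1,43\n")     # illicit modification
print(verify(dataset, digest))               # False
```

A digest alone says nothing about who published the data; binding digests to identities (e.g. via signatures) is exactly the kind of enhanced service whose absence the paper highlights.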

Relevance: 30.00%

Publisher:

Abstract:

"In this chapter the authors present a critique of Participatory Evaluation as worked in development projects, in this case, in Nepal. The article works between established claims that Participatory Evaluation builds capacity at programmatic and organisational levels, and the specific experiences of these claims in the authors’ current work. They highlight the need to address key difficulties such as high turn-over of staff and resulting loss of capacity to engage in Participatory Evaluation, and the difficulty of communication between academic as compared with local practical wisdoms. A key issue is the challenge of addressing the inevitable issues of power inequities that such approaches encounter. While Participatory Evaluation has been around for some time, it has only enjoyed more widespread recognition of its value in comparatively recent times, with its uptake in international development environments. To this extent, the practice is still in its early stages of development, and Jo, June and Michael’s work contributes to strengthening and more comprehensively understanding it. With regard to the meta-theme of this publication, this chapter is an example of how context not only influences the methodology to be used and the praxis of how it is to be used, but contributes to early explication of the core nature of an emerging methodology."

Relevance: 30.00%

Publisher:

Abstract:

The occurrence of extreme water levels along low-lying, highly populated and/or developed coastlines can lead to considerable loss of life and billions of dollars of damage to coastal infrastructure. It is therefore vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood management, engineering and future land-use planning, ensuring that the risks of catastrophic structural failure due to under-design, and of expensive waste due to over-design, are minimised. This paper estimates, for the first time, present-day extreme water level exceedance probabilities around the whole coastline of Australia. A high-resolution depth-averaged hydrodynamic model has been configured for the Australian continental shelf region and forced with tidal levels from a global tidal model and meteorological fields from a global reanalysis to generate a 61-year hindcast of water levels. Output from this model has been successfully validated against measurements from 30 tide gauge sites. At each coastal grid point of the model, extreme value distributions have been fitted to the derived time series of annual maxima and of the several largest water levels each year to estimate exceedance probabilities. This provides a reliable estimate of water level probabilities around southern Australia, a region mainly impacted by extra-tropical cyclones. However, as the meteorological forcing used only weakly includes the effects of tropical cyclones, extreme water level probabilities are underestimated around the western, northern and north-eastern Australian coastline. In a companion paper we build on the work presented here and more accurately include tropical cyclone-induced surges in the estimation of extreme water levels.
The multi-decadal hindcast generated here has been used primarily to estimate extreme water level exceedance probabilities but could be used more widely in the future for a variety of other research and practical applications.
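The extreme-value step at each grid point can be illustrated with a standard GEV fit to annual maxima. The 61 "annual maxima" below are simulated from a Gumbel distribution purely so the fit has data; the numbers are not from the hindcast described in the paper.

```python
# Fit a GEV distribution to synthetic annual-maximum water levels and
# read off the level with a 1-in-100-year exceedance probability.
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(1)
# 61 years of simulated annual maxima (metres), standing in for one
# coastal grid point of the hindcast.
annual_maxima = gumbel_r.rvs(loc=1.8, scale=0.15, size=61, random_state=rng)

c, loc, scale = genextreme.fit(annual_maxima)
# Water level exceeded with annual probability 1/100 (the "100-year" level).
level_100yr = genextreme.isf(1.0 / 100.0, c, loc, scale)
print(round(level_100yr, 2))
```

The paper also fits to the several largest levels per year (an r-largest approach), which tightens the parameter estimates relative to annual maxima alone; that variant is not sketched here.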

Relevance: 30.00%

Publisher:

Abstract:

Long Time, No See? is a crowd-sourced project that asks people to reflect upon what kind of long-term future they would each like to promote. It is an evolving experiment in the social practice of ‘everyday futuring’. To participate, download the Long Time, No See? iPhone app, which gently guides you during a short walk, encouraging you to experience new places, sensations and thoughts in your locality. At nine stages along that journey you donate ‘field notes’ as images, texts, sounds and ‘themes’, offering a unique opportunity to reveal possible pathways towards more sustaining futures. The app records the shape of your walk on the ground and draws an island on the ‘map’ shown here, populated by your nine sets of responses. The themes you have chosen then connect your island into an evolving ‘world’ map of connections and possibilities, which you can then explore at your leisure. In these ways, Long Time, No See? doesn’t ask you for lofty visions or a programme of action; instead it asks you to consider what is around you today, steering your eyes, ears and embodied experiences towards new futures that demonstrate your ‘care’ for what comes after you. Please use the contribute tab below to learn how to add your voice!
PARTICIPATE
To contribute:
1. Download the app {bit.do/ltns}; iPhone/iPad is supported right now.
2. Register a ‘walker name’.
3. Take a leisurely walk (30–60 mins) and contribute image, text, sound and themes when asked.
4. Wait while we verify and upload your walk (allow about 24 hours).
5. View your contributions via your ‘walker name’ and discover how it relates to others, here at the Cube and at www.long-time-no-see.org.
NB: You can undertake each walk over more than one day if that suits. You may even drive, cycle or move by other modes.
DOWNLOAD THE APP: bit.do/ltns (insert QR code)
FIND OUT MORE: www.long-time-no-see.org

Relevance: 30.00%

Publisher:

Abstract:

Numerous initiatives have been employed around the world in order to address rising greenhouse gas (GHG) emissions originating from the transport sector. These measures include: travel demand management (congestion‐charging), increased fuel taxes, alternative fuel subsidies and low‐emission vehicle (LEV) rebates. Incentivizing the purchase of LEVs has been one of the more prevalent approaches in attempting to tackle this global issue. LEVs, whilst having the advantage of lower emissions and, in some cases, more efficient fuel consumption, also bring the downsides of increased purchase cost, reduced convenience of vehicle fuelling, and operational uncertainty. To stimulate demand in the face of these challenges, various incentive‐based policies, such as toll exemptions, have been used by national and local governments to encourage the purchase of these types of vehicles. In order to address rising GHG emissions in Stockholm, and in line with the Swedish Government’s ambition to operate a fossil free fleet by 2030, a number of policies were implemented targeting the transport sector. Foremost amongst these was the combination of a congestion charge – initiated to discourage emissions‐intensive travel – and an exemption from this charge for some LEVs, established to encourage a transition towards a ‘green’ vehicle fleet. Although both policies shared the aim of reducing GHG emissions, the exemption for LEVs carried the risk of diminishing the effectiveness of the congestion charging scheme. As the number of vehicle owners choosing to transition to an eligible LEV increased, the congestion‐reduction effectiveness of the charging scheme weakened. In fact, policy makers quickly recognized this potential issue and consequently phased out the LEV exemption less than 18 months after its introduction (1). 
Several studies have investigated the demand for LEVs through stated-preference (SP) surveys across multiple countries, including Denmark (2), Germany (3, 4), the UK (5), Canada (6), the USA (7, 8) and Australia (9). Although each of these studies differed in approach, all involved SP surveys in which differing characteristics of various types of vehicles, including LEVs, were presented to respondents, who in turn made hypothetical decisions about which vehicle they would be most likely to purchase. Although these studies revealed a number of interesting findings regarding the potential demand for LEVs, they relied on SP data. In contrast, this paper employs an approach in which LEV choice is modelled retrospectively using revealed preference (RP) data. By examining the revealed preferences of vehicle owners in Stockholm, this study overcomes one of the principal limitations of SP data, namely that stated preferences may not in fact reflect individuals’ actual choices when cost, time, and inconvenience factors are real rather than hypothetical. This paper’s RP approach involves modelling the characteristics of individuals who purchased new LEVs, whilst estimating the effect of the congestion charging exemption upon choice probabilities and subsequent aggregate demand. The paper contributes to the current literature by examining the effectiveness of a toll exemption under revealed preference conditions, and by assessing the total effect of the policy based on key indicators for policy makers, including vehicle owner home location, commuting patterns, number of children, age, gender and income. (Extended abstract submitted to the Kuhmo Nectar Conference 2014.) The two main research questions motivating this study were:
1. Which individuals chose to purchase a new LEV in Stockholm in 2008?
2. How did the congestion charging exemption affect the aggregate demand for new LEVs in Stockholm in 2008?
In order to answer these research questions the analysis was split into two stages. Firstly, a multinomial logit (MNL) model was used to identify which demographic characteristics were most significantly related to the purchase of an LEV over a conventional vehicle. The three most significant variables were found to be: intra‐cordon residency (positive); commuting across the cordon (positive); and distance of residence from the cordon (negative). In order to estimate the effect of the exemption policy on vehicle purchase choice, the model included variables to control for geographic differences in preferences, based on the location of the vehicle owners’ homes and workplaces in relation to the congestion‐charging cordon boundary. These variables included one indicator representing commutes across the cordon and another indicator representing intra‐cordon residency. The effect of the exemption policy on the probability of purchasing LEVs was estimated in the second stage of the analysis by focusing on the groups of vehicle owners that were most likely to have been affected by the policy i.e. those commuting across the cordon boundary (in both directions). Given the inclusion of the indicator variable representing commutes across the cordon, it is assumed that the estimated coefficient of this variable captures the effect of the exemption policy on the utility of choosing to purchase an exempt LEV for these two groups of vehicle owners. The intra‐cordon residency indicator variable also controls for differences between the two groups, based upon direction of travel across the cordon boundary. A counter‐hypothesis to this assumption is that the coefficient of the variable representing commuting across the cordon boundary instead only captures geo‐demographic differences that lead to variations in LEV ownership across the different groups of vehicle owners in relation to the cordon boundary. 
In order to address this counter-hypothesis, an additional analysis was performed on data from Gothenburg, Sweden’s second-largest city, which has a geodemographic pattern similar to Stockholm’s. The results of this analysis provided evidence to support the argument that the coefficient of the variable representing commutes across the cordon was capturing the effect of the exemption policy. Based upon this framework, the predicted vehicle type shares were calculated using the estimated coefficients of the MNL model and compared with predicted vehicle type shares from a simulated scenario in which the exemption policy was inactive. This simulated scenario was constructed by setting the coefficient for the variable representing commutes across the cordon boundary to zero for all observations, removing the utility benefit of the exemption policy. Overall, this second stage of the analysis showed that the exemption had a substantial effect upon the probability of purchasing, and aggregate demand for, exempt LEVs in Stockholm during 2008. By making use of unique evidence of the revealed preferences of LEV owners, this study identifies the common characteristics of new LEV owners and estimates the effect of Stockholm’s congestion charging exemption upon the demand for new LEVs during 2008. It was found that the variables with the greatest effect upon the choice of purchasing an exempt LEV included intra-cordon residency (positive), distance of home from the cordon (negative), and commuting across the cordon (positive). It was also determined that owners under the age of 30 preferred non-exempt LEVs (low-CO2 LEVs), whilst those over 30 preferred electric vehicles. Among electric vehicles, individuals living within the city had the highest propensity towards purchasing this vehicle type.
A negative relationship between choosing an electric vehicle and the distance of an individual’s residence from the cordon was also evident. Overall, the congestion charging exemption was found to have increased the share of exempt LEVs in Stockholm by 1.9%, with, as expected, a much stronger effect on those commuting across the boundary: owners living inside the cordon showed a 13.1% increase and those living outside the cordon a 5.0% increase. This increase in demand corresponded to an additional 538 (+/- 93; 95% C.I.) new exempt LEVs purchased in Stockholm during 2008 (out of a total of 5,427; 9.9%). Policy makers can take note that an incentive-based policy can increase the demand for LEVs and appears to be an appropriate approach when attempting to reduce transport emissions by encouraging a transition towards a ‘green’ vehicle fleet.
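The two-stage logic (a logit choice model with a cordon-commuting indicator, then a counterfactual with that coefficient set to zero) can be sketched as follows. The coefficients and the tiny sample are invented for illustration; they are not the paper's estimates, and a binary logit stands in for the full MNL over all vehicle types.

```python
# Sketch of the exemption counterfactual: compute each person's
# probability of choosing an exempt LEV, then zero out the
# cordon-commuting coefficient (assumed to carry the exemption's
# utility) and compare aggregate shares.
import math

BETA = {
    "asc_lev": -2.0,             # alternative-specific constant (invented)
    "commutes_cordon": 1.2,      # assumed to capture the exemption's utility
    "intra_cordon_resident": 0.8,
}

def p_lev(person, beta):
    v = (beta["asc_lev"]
         + beta["commutes_cordon"] * person["commutes_cordon"]
         + beta["intra_cordon_resident"] * person["intra_cordon_resident"])
    return 1.0 / (1.0 + math.exp(-v))  # binary logit share for the LEV

sample = [
    {"commutes_cordon": 1, "intra_cordon_resident": 1},
    {"commutes_cordon": 1, "intra_cordon_resident": 0},
    {"commutes_cordon": 0, "intra_cordon_resident": 0},
]

with_policy = sum(p_lev(p, BETA) for p in sample) / len(sample)
no_policy_beta = dict(BETA, commutes_cordon=0.0)  # switch the exemption off
without_policy = sum(p_lev(p, no_policy_beta) for p in sample) / len(sample)
print(with_policy - without_policy)  # aggregate demand effect of the exemption
```

The counter-hypothesis discussed above corresponds to the worry that `commutes_cordon` would be non-zero even without the policy, which is why the study checks against Gothenburg data.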

Relevance: 30.00%

Publisher:

Abstract:

FROM KCWS 2011 CHAIRS AND SUMMIT PROCEEDING EDITORS
In recent years, with the impact of the global knowledge economy, a more comprehensive development approach has gained significant popularity. This new approach, so-called ‘knowledge-based development’, differs from its traditional predecessor. With a much more balanced focus on all four key development domains – economic, enviro-urban, institutional, and socio-cultural – this contemporary approach aims to bring economic prosperity, environmental sustainability and local institutional competence, with a just socio-spatial order, to our cities and regions. The ultimate goal of knowledge-based development is to produce a city purposefully designed to encourage the continuous production, circulation and commercialisation of social and scientific knowledge; this in turn establishes a ‘knowledge city’. A city following the ‘knowledge city’ concept embarks on a strategic mission to firmly encourage and nurture locally focussed innovation, science and creativity within the context of an expanding knowledge economy and society. In this regard a ‘knowledge city’ can be seen as an integrated city, which physically and institutionally combines the functions of a science and technology park with civic and residential functions and urban amenities. It also offers an effective paradigm for the sustainable cities of our time. This fourth edition of KCWS – The 4th Knowledge Cities World Summit 2011 – serves as an important reminder that the ‘knowledge city’ concept is a key notion in 21st-century development. With this notion in view, the Summit sheds light on the multi-faceted dimensions and various scales of building a ‘knowledge city’ via the ‘knowledge-based development’ paradigm, focusing in particular on the overall Summit theme of ‘Knowledge Cities for Future Generations’.
At this summit, the theoretical and practical maturing of the knowledge-based development paradigm is advanced through the interplay between the theories of the world’s leading academics and the practical models and strategies of practitioners and policy makers drawn from around the world. This summit proceeding is compiled in order to disseminate the knowledge generated, shared and co-created at KCWS 2011 to the wider research, governance, and practice communities. All papers in this proceeding have gone through a double-blind peer review process and been reviewed by our summit editorial review and advisory board members. We, the organisers of the summit, cordially thank the members of the Summit Proceeding Editorial Review and Advisory Board for their diligent work in reviewing the papers. We hope the papers in this proceeding will inspire and make a significant contribution to the research, governance, and practice circles.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. It is therefore important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study was undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) depth-averaged hydrodynamic model was configured for the whole coastline of Australia using the Danish Hydraulics Institute’s Mike21 modelling suite of tools. The model was forced with astronomical tidal levels, derived from the TPXO7.2 global tidal model, and meteorological fields, from the US National Centers for Environmental Prediction’s global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output was validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms. 
However, as the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australia region, with characteristics based on the observed tropical cyclones over the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period within the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present day extreme water level probabilities around the whole coastline of Australia.
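The statistical core of the approach described above – fitting annual maxima to an extreme value distribution, converting the fit to return levels, and combining exceedance probabilities from the two independent storm populations – can be sketched as follows. This is an illustrative sketch only: the synthetic data, parameter values and function names are assumptions, not the study's actual code.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maxima (metres) standing in for the 61-year hindcast
# at one coastal grid point (hypothetical loc/scale values).
annual_maxima = rng.gumbel(loc=1.2, scale=0.15, size=61)

# Fit a generalised extreme value (GEV) distribution to the annual maxima.
shape, loc, scale = genextreme.fit(annual_maxima)

def return_level(return_period_years):
    """Water level exceeded on average once per return period."""
    p_exceed = 1.0 / return_period_years
    return genextreme.ppf(1.0 - p_exceed, shape, loc=loc, scale=scale)

level_100yr = return_level(100)

def combined_probability(p_extratropical, p_tropical):
    """Annual exceedance probability of a given level when two
    independent storm populations can each cause an exceedance."""
    return 1.0 - (1.0 - p_extratropical) * (1.0 - p_tropical)
```

The combination step assumes the extra-tropical (stage 1) and tropical cyclone (stage 2) exceedances are statistically independent, so the probability that a level is exceeded by either population is one minus the probability that neither exceeds it.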

Relevância:

30.00% 30.00%

Publicador:

Resumo:

While the body, time and space are fundamental to human experience, comparatively little attention has been given to the connections between them. Here scholars from a wide range of disciplines explore important themes of embodied life in time and space across cultures, activities and bodymind states. Motivated by a common desire to deepen and extend our comprehension of these phenomena and the connections and conversations between them, this book emerged from intense inter-disciplinary dialogue during the 1st Global Conferences on Time, Space and the Body and Body Horror. A plenitude of theoretical approaches and media are deployed to investigate assumptions and pose problems, to creatively deconstruct and reconstruct the terms through which experience is rendered meaningful, pleasurable, and functional. These investigations, pursued through various research methods in fields of the arts, social and psychological sciences and humanities, invite readers into a genuinely pluralistic conversation around the most basic and profound aspects of being.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The experiences and constructs of time, space and bodies saturate human discourse—naturally enough, since they are fundamental to existence—yet there has long been a tendency for the terms to be approached somewhat independently, belying the depth of their interconnections. It was a desire to address that apparent shortcoming that inspired this book, and the interdisciplinary meetings from which it was born, the 1st Global Conferences on ‘Time, Space and the Body’ and ‘Body Horror’ held in Sydney in February 2013. Following the lively, often provocative, exchange of ideas throughout those meetings, the writing here crosses conventional boundaries, inhabiting everyday life and liminal experiences across cultures, life circumstances, and bodily states. Through numerous theoretical frameworks and with reference to a variety of media, the authors problematize or deconstruct commonplace assumptions to reveal challenging new perspectives on the diverse cultures and communities which make our world. If there is an overarching theme of this collection it is diversity itself. The writers here come from numerous academic fields, but a good number of them also draw on first-hand cultural production in the arts: photography, sculpture and fine art installation, for example. Of course, however laudable it might be, there is a potential problem in such diversity: does it produce fruitful dialogue moving toward creative, workable syntheses or simply a cacophony of competing, incomprehensible, barely comprehending voices? To a large degree this depends upon the intellectual and existential ambitions as well as the old-fashioned good-natured tolerance of both writers and readers. 
But we hope three unifying characteristics are discernible in the following chapters viewed as a whole: firstly, a genuine concern for the world humans inhabit and the communities they form as bodies in space and time; secondly, an emphasis upon the experience of the human subject, exemplified perhaps by the number of chapters drawing on phenomenology; thirdly, an adventurous, explorative impulse associated with an underlying sense that being, since it is inseparable from the body’s temporality, is always becoming, and here the presence of poststructuralist influences is unmistakable, often explicit. Our challenge as editors has been to present the enormous variety of subjects and views in a way that renders the book coherent and at the same time encourages readers to make explorations themselves into realms they might usually consider beyond their field of interest. To that end we have divided the book into six sections around loosely defined themes, each offering different angles on how time and/or space unfold in and around bodies.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Objective: To measure alcohol-related harms to the health of young people presenting to emergency departments (EDs) of Gold Coast public hospitals before and after the increase in the federal government "alcopops" tax in 2008. Design, setting and participants: Interrupted time series analysis over 5 years (28 April 2005 to 27 April 2010) of 15-29-year-olds presenting to EDs with alcohol-related harms compared with presentations of selected control groups. Main outcome measures: Proportion of 15-29-year-olds presenting to EDs with alcohol-related harms compared with (i) 30-49-year-olds with alcohol-related harms, (ii) 15-29-year-olds with asthma or appendicitis, and (iii) 15-29-year-olds with any non-alcohol and non-injury related ED presentation. Results: Over a third of 15-29-year-olds presented to ED with alcohol-related conditions, as opposed to around a quarter for all other age groups. There was no significant decrease in alcohol-related ED presentations of 15-29-year-olds compared with any of the control groups after the increase in the tax. We found similar results for males and females, narrow and broad definitions of alcohol-related harms, under-19s, and visitors to and residents of the Gold Coast. Conclusions: The increase in the tax on alcopops was not associated with any reduction in alcohol-related harms in this population in a unique tourist and holiday region. A more comprehensive approach to reducing alcohol harms in young people is needed.