117 results for mega-event
Abstract:
The recent floods in south-east Queensland have focused policy, academic and community attention on the challenges associated with severe weather events (SWE), specifically pre-disaster preparation, disaster response and post-disaster community resilience. Financially, the cost of SWE was $9 billion in the 2011 Australian Federal Budget (Swan 2011); psychologically and emotionally, the impact on individual mental health and community wellbeing is also significant but more difficult to quantify. Recent estimates suggest that as many as one in five people affected will subsequently experience major emotional distress (Bonanno et al. 2010). With climate change predicted to increase the frequency and intensity of a wide range of SWE in Australia (Garnaut 2011; The Climate Institute 2011), there is an urgent and critical need to ensure that the unique psychological and social needs of more vulnerable community members - such as older residents - are better understood and integrated into disaster preparedness and response policy, planning and protocols. Navigating the complex dynamics of SWE can be particularly challenging for older adults, whose disaster experience is frequently magnified by a wide array of cumulative and interactive stressors that intertwine to make them uniquely vulnerable to significant short- and long-term adverse effects. This article provides a brief introduction to the current literature in this area and highlights a gap in the research relating to communication tools during and after severe weather events.
Abstract:
Event report on the Open Access and Research 2013 conference, which focused on recent developments and the strategic advantages they bring to the research sector.
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation and Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
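To make the contrast concrete, here is a minimal sketch of the two approaches for a toy rare-event problem: estimating P(X > 4) for a standard normal variable (true value about 3.2e-5). The toy problem, sample sizes and level probability p0 are illustrative assumptions, not the paper's case studies:

```python
import math
import random

def standard_mc(limit, n, rng):
    # Crude Monte Carlo estimate of P(X > limit) for X ~ N(0, 1):
    # impractical when the target probability is very small, since
    # almost no samples land in the region of interest.
    hits = sum(1 for _ in range(n) if rng.gauss(0, 1) > limit)
    return hits / n

def subset_simulation(limit, n_per_level, p0, rng, max_levels=20):
    # Express the rare event as a product of conditional probabilities,
    # each roughly p0, sampled with a Metropolis random walk per level.
    samples = [rng.gauss(0, 1) for _ in range(n_per_level)]
    n_seed = int(p0 * n_per_level)
    chain_len = n_per_level // n_seed
    prob = 1.0
    for _ in range(max_levels):
        samples.sort(reverse=True)
        threshold = samples[n_seed - 1]  # adaptive intermediate level
        if threshold >= limit:
            # final level: count samples already beyond the true limit
            n_fail = sum(1 for x in samples if x > limit)
            return prob * n_fail / len(samples)
        prob *= p0
        seeds = samples[:n_seed]
        samples = []
        for x in seeds:
            for _ in range(chain_len):
                cand = x + rng.gauss(0, 1)
                # accept by the standard-normal density ratio, and only
                # if the candidate stays inside the current subset
                if cand > threshold and rng.random() < math.exp((x * x - cand * cand) / 2):
                    x = cand
                samples.append(x)
    return prob
```

With 2,000 samples per level, subset simulation resolves a probability of order 1e-5, whereas a standard Monte Carlo run of the same total size would typically record zero hits.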
Abstract:
Until recently, sustainable development was perceived as essentially an environmental issue, relating to the integration of environmental concerns into economic decision-making. As a result, environmental considerations have been the primary focus of sustainability decision making during the economic development process for major projects, and the assessment and preservation of social and cultural systems has been arguably too limited. The practice of social impact and sustainability assessment is an established and accepted part of project planning; however, these practices are not aimed at delivering sustainability outcomes for social systems; rather, they are designed to minimise ‘unsustainability’ and contribute to project approval. Currently, there exists no widely recognised standard approach for assessing social sustainability and accounting for positive externalities of existing social systems in project decision making. As a result, very different approaches are applied around the world, and even by the same organisations from one project to another. This situation is an impediment not only to generating a shared understanding of the social implications of major projects but, more importantly, to identifying common approaches to help improve social sustainability outcomes of proposed activities. This paper discusses the social dimension of sustainability decision making for mega-projects, and argues that to improve accountability and transparency of project outcomes it is important to understand the characteristics that make some communities more vulnerable than others to mega-project development. This paper highlights issues with current operational-level approaches to social sustainability assessment at the project level, and asserts that the starting point for project planning and sustainability decision making for mega-projects needs to include the preservation, maintenance, and enhancement of existing social and cultural systems.
It draws attention to the need for a scoping mechanism to systematically assess community vulnerability (or sensitivity) to major infrastructure development during the feasibility and planning stages of a project.
Abstract:
Dose-finding designs estimate the dose level of a drug based on observed adverse events. Relatedness of the adverse event to the drug has been generally ignored in all proposed design methodologies. These designs assume that the adverse events observed during a trial are definitely related to the drug, which can lead to flawed dose-level estimation. We incorporate adverse event relatedness into the so-called continual reassessment method. Adverse events that have ‘doubtful’ or ‘possible’ relationships to the drug are modelled using a two-parameter logistic model with an additive probability mass. Adverse events ‘probably’ or ‘definitely’ related to the drug are modelled using a cumulative logistic model. To search for the maximum tolerated dose, we use the maximum estimated toxicity probability of these two adverse event relatedness categories. We conduct a simulation study that illustrates the characteristics of the design under various scenarios. This article demonstrates that adverse event relatedness is important for improved dose estimation. It opens up further research pathways into continual reassessment design methodologies.
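The abstract extends the continual reassessment method (CRM); the relatedness-aware extension itself is not reproduced here, but a minimal sketch of the classic one-parameter power-model CRM illustrates the machinery the authors build on. The skeleton probabilities, target toxicity rate and prior variance below are illustrative assumptions:

```python
import math

# hypothetical prior toxicity "skeleton" for five dose levels
SKELETON = [0.05, 0.10, 0.20, 0.35, 0.50]
TARGET = 0.25  # target toxicity probability

def crm_recommend(doses_given, toxicities, grid_n=2001):
    # One-parameter power model: p_i(a) = skeleton_i ** exp(a),
    # with a ~ N(0, 1.34) prior; the posterior is integrated
    # numerically over a grid of a-values.
    post_num = [0.0] * len(SKELETON)
    post_den = 0.0
    for k in range(grid_n):
        a = -4.0 + 8.0 * k / (grid_n - 1)
        like = math.exp(-a * a / (2 * 1.34))  # prior density (unnormalised)
        for d, tox in zip(doses_given, toxicities):
            p = SKELETON[d] ** math.exp(a)
            like *= p if tox else (1 - p)     # binomial likelihood
        post_den += like
        for i, s in enumerate(SKELETON):
            post_num[i] += like * (s ** math.exp(a))
    est = [n / post_den for n in post_num]    # posterior mean toxicity per dose
    # recommend the dose whose estimated toxicity is closest to target
    return min(range(len(est)), key=lambda i: abs(est[i] - TARGET))
```

After each patient cohort, the posterior toxicity estimates are updated and the next cohort is assigned the dose nearest the target; observing toxicities pushes the recommendation down, observing none pushes it up.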
Abstract:
With the growing size and variety of social media files on the web, it is becoming critical to efficiently organize them into clusters for further processing. This paper presents a novel scalable constrained document clustering method that harnesses the power of search engines capable of dealing with large text data. Instead of calculating the distance between a document and all of the clusters’ centroids, a neighborhood of best cluster candidates is chosen using a document ranking scheme. To make the method faster and less memory dependent, in-memory and in-database processing are combined in a semi-incremental manner. The method has been extensively tested in the social event detection application. Empirical analysis shows that the proposed method is efficient in both computation and memory usage while producing notable accuracy.
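As an illustration of the core idea, shortlisting candidate clusters via term ranking rather than scanning every centroid, here is a minimal sketch. The inverted index, the top-k term heuristic and the similarity threshold are assumptions made for the example, not the paper's actual ranking scheme:

```python
from collections import Counter, defaultdict

def top_terms(vec, k=3):
    # the k highest-frequency terms act as the "query" to the index
    return [t for t, _ in Counter(vec).most_common(k)]

class RankedClusterer:
    # Clusters are summarized by term-frequency centroids; an inverted
    # index (term -> cluster ids) plays the role of the search engine,
    # shortlisting candidates instead of scanning all centroids.
    def __init__(self, threshold=0.3):
        self.centroids = []            # list of Counter term vectors
        self.index = defaultdict(set)  # term -> set of cluster ids
        self.threshold = threshold

    def _cosine(self, a, b):
        num = sum(a[t] * b[t] for t in set(a) & set(b))
        den = (sum(v * v for v in a.values()) ** 0.5) \
            * (sum(v * v for v in b.values()) ** 0.5)
        return num / den if den else 0.0

    def add(self, doc_terms):
        vec = Counter(doc_terms)
        # shortlist clusters that share at least one top term
        candidates = set()
        for t in top_terms(vec):
            candidates |= self.index[t]
        best, best_sim = None, self.threshold
        for cid in candidates:
            sim = self._cosine(vec, self.centroids[cid])
            if sim > best_sim:
                best, best_sim = cid, sim
        if best is None:               # no candidate close enough: new cluster
            best = len(self.centroids)
            self.centroids.append(vec)
        else:                          # merge into the best candidate
            self.centroids[best].update(vec)
        for t in top_terms(self.centroids[best]):
            self.index[t].add(best)
        return best
```

The distance computation is thus bounded by the size of the shortlist rather than the total number of clusters, which is what makes the approach scale.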
Abstract:
In 2006, the International Law Commission began a study into the role of states and international organizations in protecting persons in the event of a disaster. Special Rapporteur Mr. Eduardo Valencia-Ospina was appointed to head the study, and in 2011 the findings of the study will be presented to the United Nations General Assembly. Of interest to this paper has been the inclusion of “epidemics” under the natural disaster category in all of the reports detailing the Commission’s program of work on the protection of persons. This paper seeks to examine the legal and political ramifications involved in including “epidemics” in the concept of protection by exploring where sovereign responsibility for epidemic control begins and ends, particularly in light of the revisions to the International Health Regulations by the World Health Assembly in 2005. The paper will first analyze the findings already presented by the Special Rapporteur, examining the existing “responsibilities” of both states and international organizations. Then, the paper will consider to what extent the concept of protection entails a duty to assist individuals when an affected state proves unwilling or unable to assist its own population in the event of a disease outbreak. In an attempt to answer this question, the third part of the paper will examine the recent cholera outbreak in Zimbabwe.
Abstract:
From June 7th to 15th, the Thesis Eleven Centre for Cultural Sociology at La Trobe University, directed by Peter Beilharz, put together a programme of public lectures, cultural events and master classes under the theme ‘Word, Image, Action: Popular Print and Visual Cultures’. This article reports on the highlights of the festival, including a forum titled ‘Does WikiLeaks Matter?’, a half-day event ‘On Bauman’, and a public lecture by Ron Jacobs on ‘Media Narratives of Economic Crisis’.
Abstract:
PURPOSE To compare diffusion-weighted functional magnetic resonance imaging (DfMRI), a novel alternative to blood oxygenation level-dependent (BOLD) contrast, with BOLD in a functional MRI experiment. MATERIALS AND METHODS Nine participants viewed contrast-reversing (7.5 Hz) black-and-white checkerboard stimuli using block and event-related paradigms. DfMRI (b = 1800 s/mm²) and BOLD sequences were acquired. Four parameters describing the observed signal were assessed: percent signal change, spatial extent of the activation, the Euclidean distance between peak voxel locations, and the time-to-peak (TTP) of the best-fitting impulse response for the different paradigms and sequences. RESULTS The BOLD conditions showed a higher percent signal change relative to DfMRI; however, event-related DfMRI showed the strongest group activation (t = 21.23, P < 0.0005). Activation was more diffuse and spatially closer to the BOLD response for DfMRI when the block design was used. Event-related DfMRI showed the shortest TTP (4.4 ± 0.88 sec). CONCLUSION The hemodynamic contribution to DfMRI may increase with the use of block designs.
Abstract:
In this paper, the complete mitochondrial genome of Acraea issoria (Lepidoptera: Nymphalidae: Heliconiinae: Acraeini) is reported; it is a circular molecule 15,245 bp in size. For A. issoria, genes are arranged in the same order and orientation as in the other completely sequenced lepidopteran mitochondrial genomes, except for the presence of an extra copy of tRNAIle(AUR)b in the control region. All protein-coding genes of the A. issoria mitogenome start with a typical ATN codon and terminate in the common stop codon TAA, except that the COI gene uses TTG as its initial codon and terminates in a single T residue. All tRNA genes possess the typical cloverleaf secondary structure except for tRNASer(AGN), which has a simple loop with the absence of the DHU stem. The sequence, organization and other features of this mitochondrial genome, including nucleotide composition and codon usage, are also reported and compared with those of other sequenced lepidopteran mitochondrial genomes. There are some short microsatellite-like repeat regions (e.g., (TA)9, polyA and polyT) scattered in the control region; however, the conspicuous macro-repeat units commonly found in other insect species are absent.
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This research study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to utilise cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
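A minimal sketch of the general technique, a genetic algorithm searching task-to-resource assignments under a cost function that trades labour cost against duration, is given below. The tasks, resources, cost weights and GA parameters are all illustrative assumptions, not the paper's comprehensive cost structure:

```python
import random

TASKS = [4, 2, 6, 3, 5]  # hypothetical task durations (hours)
RESOURCES = [            # hypothetical (hourly_cost, speed_factor) pairs
    (50, 1.0),
    (80, 1.6),
    (30, 0.7),
]

def cost(assignment):
    # labour cost plus a penalty on the makespan (busiest resource),
    # a toy stand-in for a richer multi-dimensional cost structure
    busy = [0.0] * len(RESOURCES)
    labour = 0.0
    for duration, r in zip(TASKS, assignment):
        rate, speed = RESOURCES[r]
        hours = duration / speed
        busy[r] += hours
        labour += hours * rate
    return labour + 40 * max(busy)

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(RESOURCES)) for _ in TASKS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(TASKS))
            child = p1[:cut] + p2[cut:]        # one-point crossover
            if rng.random() < 0.2:             # random mutation
                child[rng.randrange(len(TASKS))] = rng.randrange(len(RESOURCES))
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)
```

Because the best individuals survive each generation, the cheapest discovered scenario only improves over time; in practice it quickly beats naive baselines such as assigning every task to one resource.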
Abstract:
Salons became popular in Europe in the 17th century as sites of philosophic and literary conversation. A group of female academics interested in Deleuzian theories experimented with the salon to challenge presentation and dissemination norms that hierarchize and centralize the human. For Deleuze and Guattari (1987), assemblages are shifting and decentering, so how might assemblages of chairs, tables, bodies, lights and space help to trouble thinking about the methodological conventions around academic disseminations? The authors discuss the salon as a critical-cultural site: Cumming presents Deleuze and play-dough, an exploration of how the playful dissemination format of the salon prompted a re-reading of a methodological vignette from earlier research. Knight, an arts-based researcher, uses video art as a creative methodology to examine conceptualizations of rhizomes and assemblages at the salon as a dissemination site. The authors conclude that the salon, as a critical-cultural site, disrupts hierarchized ways of approaching and presenting research.
Abstract:
What does the future look like for music festivals in Australia? This article examines the decline of the large festivals that have grown to dominate the scene in Australia in the last twenty years, and the rise of small, specialized festivals that offer a boutique experience.
Abstract:
This paper presents an event-based failure model to predict the number of failures that occur in water distribution assets. Often, such models have been based on analysis of historical failure data combined with pipe characteristics and environmental conditions. In this paper weather data have been added to the model to take into account the commonly observed seasonal variation of the failure rate. The theoretical basis of existing logistic regression models is briefly described in this paper, along with the refinements made to the model for inclusion of seasonal variation of weather. The performance of these refinements is tested using data from two Australian water authorities.
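As a sketch of the kind of model described, here is a toy logistic regression of pipe failure events on age and a winter indicator (standing in for seasonal weather covariates), fitted on synthetic data. The covariates, coefficient values and fitting scheme are illustrative assumptions, not the authors' refined model:

```python
import math
import random

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.05, epochs=300):
    # plain stochastic gradient ascent on the log-likelihood;
    # each row of X carries a leading 1.0 for the intercept
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
            for j, xj in enumerate(xi):
                w[j] += lr * (yi - p) * xj
    return w

# synthetic per-period pipe records: [intercept, age_decades, winter_flag]
rng = random.Random(3)
X, y = [], []
for _ in range(400):
    age = rng.uniform(0.0, 5.0)
    winter = 1.0 if rng.random() < 0.5 else 0.0
    # assumed "true" model: failure odds rise with age and in winter
    p_true = sigmoid(-3.0 + 0.6 * age + 1.2 * winter)
    X.append([1.0, age, winter])
    y.append(1 if rng.random() < p_true else 0)

weights = fit_logistic(X, y)
```

A positive fitted coefficient on the winter flag reproduces, in miniature, the seasonal variation of the failure rate that motivates adding weather data to the model.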
Abstract:
Through the application of process mining, valuable evidence-based insights can be obtained about business processes in organisations. As a result, the field has seen increased uptake in recent years, as evidenced by success stories and increased tool support. However, despite this impact, current performance analysis capabilities remain somewhat limited in the context of information-poor event logs. For example, natural daily and weekly patterns are not considered. In this paper a new framework for analysing event logs is defined, based on the concept of the event gap. The framework allows for a systematic approach to sophisticated performance-related analysis of event logs containing varying degrees of information. The paper formalises a range of event gap types and then presents an implementation as well as an evaluation of the proposed approach.
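The basic event-gap computation can be sketched as follows: group an event log by case, sort timestamps, and take the differences between consecutive events; long gaps (e.g. overnight idle periods) can then be flagged. The log format and the threshold below are illustrative assumptions, not the paper's formal gap typology:

```python
from collections import defaultdict
from datetime import datetime

def event_gaps(log):
    # log: iterable of (case_id, activity, iso_timestamp), possibly
    # unordered. Returns, per case, the list of gaps in seconds between
    # consecutive events -- the basic unit of event-gap analysis.
    by_case = defaultdict(list)
    for case, _activity, ts in log:
        by_case[case].append(datetime.fromisoformat(ts))
    gaps = {}
    for case, stamps in by_case.items():
        stamps.sort()
        gaps[case] = [(b - a).total_seconds()
                      for a, b in zip(stamps, stamps[1:])]
    return gaps

def idle_periods(gaps, threshold_s):
    # flag gaps longer than a threshold, e.g. periods spanning
    # nights or weekends where no work happened on the case
    return {c: [g for g in gs if g > threshold_s]
            for c, gs in gaps.items()}
```

Gap distributions computed this way expose the daily and weekly patterns the abstract mentions, even when the log records little beyond case ids and timestamps.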