858 results for Seismic event
Abstract:
Sharing photos through mobile devices has great potential for creating shared experiences of social events among co-located as well as remote participants. To design novel event-sharing tools, we need to develop an in-depth understanding of current practices surrounding these so-called ‘event photos’: photos about, and taken during, social events such as weddings, picnics, and music concerts. We studied people’s practices related to event photos through in-depth interviews, guided home visits and naturalistic observations. Our results show four major themes describing practices surrounding event photos: 1) representing events, 2) significant moments, 3) situated activities through photos, and 4) collectivism and roles of participants.
Abstract:
The recent floods in south-east Queensland have focused policy, academic and community attention on the challenges associated with severe weather events (SWE), specifically pre-disaster preparation, disaster response and post-disaster community resilience. Financially, the cost of SWE was put at $9 billion in the 2011 Australian Federal Budget (Swan 2011); psychologically and emotionally, the impact on individual mental health and community wellbeing is also significant but more difficult to quantify. Recent estimates suggest that as many as one in five people affected will subsequently experience major emotional distress (Bonanno et al. 2010). With climate change predicted to increase the frequency and intensity of a wide range of SWE in Australia (Garnaut 2011; The Climate Institute 2011), there is an urgent and critical need to ensure that the unique psychological and social needs of more vulnerable community members, such as older residents, are better understood and integrated into disaster preparedness and response policy, planning and protocols. Navigating the complex dynamics of SWE can be particularly challenging for older adults, whose disaster experience is frequently magnified by a wide array of cumulative and interactive stressors that intertwine to make them uniquely vulnerable to significant short- and long-term adverse effects. This article provides a brief introduction to the current literature in this area and highlights a gap in the research relating to communication tools during and after severe weather events.
Abstract:
Event report on the Open Access and Research 2013 conference, which focused on recent developments in open access and the strategic advantages they bring to the research sector.
Abstract:
Standard Monte Carlo (sMC) simulation models have been widely used in architecture, engineering and construction (AEC) industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation and Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling. Consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
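To make the contrast concrete, the following is a minimal, self-contained sketch in Python/NumPy. The limit-state function, threshold and sample sizes are invented for illustration and are not drawn from the paper's case studies; it only shows why crude standard Monte Carlo struggles when the failure region is small, and how a subset-simulation-style sampler (componentwise Metropolis steps through intermediate thresholds) reaches the rare region with far fewer samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical limit-state function: "failure" when the sum of two
    # independent standard-normal inputs exceeds a high threshold.
    return x[..., 0] + x[..., 1]

THRESHOLD = 7.0    # true failure probability is roughly 3.7e-7
N = 1000           # samples per level
P0 = 0.1           # conditional probability assigned to each intermediate level

def standard_mc(n):
    x = rng.standard_normal((n, 2))
    return np.mean(g(x) > THRESHOLD)          # almost always 0 for modest n

def subset_simulation():
    x = rng.standard_normal((N, 2))
    y = g(x)
    p = 1.0
    while True:
        level = np.quantile(y, 1.0 - P0)      # intermediate threshold
        if level >= THRESHOLD:
            return p * np.mean(y > THRESHOLD)
        p *= P0
        cur_x, cur_y = x[y > level], y[y > level]   # seeds for the next level
        xs, ys = [cur_x], [cur_y]
        n_have = len(cur_x)
        while n_have < N:                     # componentwise Metropolis walk
            cand = cur_x + 0.8 * rng.standard_normal(cur_x.shape)
            accept = rng.random(cur_x.shape) < np.exp(0.5 * (cur_x**2 - cand**2))
            prop = np.where(accept, cand, cur_x)
            prop_y = g(prop)
            stay = prop_y > level             # reject moves that leave the subset
            cur_x = np.where(stay[:, None], prop, cur_x)
            cur_y = np.where(stay, prop_y, cur_y)
            xs.append(cur_x.copy())
            ys.append(cur_y.copy())
            n_have += len(cur_x)
        x, y = np.vstack(xs)[:N], np.concatenate(ys)[:N]

print("standard MC estimate:", standard_mc(N))
print("subset simulation   :", subset_simulation())
```

With only 1,000 samples the standard estimator almost never sees a failure, whereas the subset sampler returns an order-of-magnitude estimate at comparable cost.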
Abstract:
Dose-finding designs estimate the dose level of a drug based on observed adverse events. Relatedness of the adverse event to the drug has generally been ignored in proposed design methodologies. These designs assume that the adverse events observed during a trial are definitely related to the drug, which can lead to flawed dose-level estimation. We incorporate adverse event relatedness into the continual reassessment method. Adverse events that have ‘doubtful’ or ‘possible’ relationships to the drug are modelled using a two-parameter logistic model with an additive probability mass. Adverse events ‘probably’ or ‘definitely’ related to the drug are modelled using a cumulative logistic model. To search for the maximum tolerated dose, we use the maximum estimated toxicity probability of these two adverse event relatedness categories. We conduct a simulation study that illustrates the characteristics of the design under various scenarios. This article demonstrates that adverse event relatedness is important for improved dose estimation. It opens up further research pathways into continual reassessment design methodologies.
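As a rough illustration only, the sketch below implements a standard one-parameter power-model continual reassessment method rather than the two-parameter logistic model with an additive probability mass described above, and then applies the abstract's key idea: toxicity is estimated separately for each relatedness category and the maximum of the two estimates drives the dose recommendation. The skeleton, target rate, prior variance and toy trial history are all invented.

```python
import numpy as np
from scipy import integrate

# Hypothetical prior toxicity skeleton for five dose levels and target rate.
SKELETON = np.array([0.05, 0.12, 0.20, 0.30, 0.40])
TARGET = 0.25

def crm_estimate(doses_given, tox_observed):
    """One-parameter power-model CRM: p_i(a) = skeleton_i ** exp(a),
    with a ~ N(0, 1.34) prior; returns posterior-mean toxicity per dose."""
    p = SKELETON[np.asarray(doses_given)]
    tox = np.asarray(tox_observed)
    def likelihood(a):
        q = p ** np.exp(a)
        return np.prod(np.where(tox, q, 1.0 - q))
    def prior(a):
        return np.exp(-a**2 / (2 * 1.34)) / np.sqrt(2 * np.pi * 1.34)
    norm, _ = integrate.quad(lambda a: likelihood(a) * prior(a), -10, 10)
    post_tox = np.empty_like(SKELETON)
    for i, s in enumerate(SKELETON):
        num, _ = integrate.quad(
            lambda a: (s ** np.exp(a)) * likelihood(a) * prior(a), -10, 10)
        post_tox[i] = num / norm
    return post_tox

def next_dose(history_by_category):
    """history_by_category maps a relatedness category to (doses, toxicity flags).
    Toxicity is estimated per category and the maximum across categories is
    compared with the target to pick the next dose level."""
    est = np.max([crm_estimate(d, t) for d, t in history_by_category.values()],
                 axis=0)
    return int(np.argmin(np.abs(est - TARGET)))

history = {
    "doubtful/possible": ([0, 0, 1, 1, 2], [0, 0, 0, 1, 0]),
    "probable/definite": ([0, 0, 1, 1, 2], [0, 0, 0, 0, 1]),
}
print("recommended next dose level:", next_dose(history))
```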
Abstract:
With the growing size and variety of social media files on the web, it is becoming critical to efficiently organize them into clusters for further processing. This paper presents a novel scalable constrained document clustering method that harnesses the power of search engines capable of dealing with large text data. Instead of calculating the distance between each document and all of the clusters’ centroids, a neighborhood of best cluster candidates is chosen using a document ranking scheme. To make the method faster and less memory dependent, in-memory and in-database processing are combined in a semi-incremental manner. The method has been extensively tested in a social event detection application. Empirical analysis shows that the proposed method is efficient in both computation and memory usage while producing notable accuracy.
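A minimal sketch of the candidate-ranking idea follows (Python with scikit-learn). It is not the proposed method: there are no constraints, no in-database component and no incremental processing, and the "search engine" is faked with a short query built from each document's top TF-IDF terms. It only shows how a cheap ranking step can shortlist candidate clusters so that full comparisons are limited to that neighborhood.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def cluster_with_candidate_ranking(docs, n_clusters=20, n_query_terms=10,
                                   n_candidates=3, n_iter=5, seed=0):
    """Toy ranking-based assignment: each document issues a short 'query'
    (its highest-weighted terms) against the centroids, and only the
    top-ranked candidate clusters are compared in full."""
    rng = np.random.default_rng(seed)
    # Dense matrix for simplicity; a real implementation would stay sparse
    # and use an inverted index or an external search engine.
    X = TfidfVectorizer(stop_words="english",
                        max_features=20000).fit_transform(docs).toarray()
    n = X.shape[0]
    centroids = X[rng.choice(n, n_clusters, replace=False)]
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        for i in range(n):
            q = np.argsort(-X[i])[:n_query_terms]          # query terms
            rank_scores = centroids[:, q] @ X[i, q]         # cheap ranking step
            cand = np.argsort(-rank_scores)[:n_candidates]  # candidate neighborhood
            full = centroids[cand] @ X[i]                   # exact comparison
            labels[i] = cand[np.argmax(full)]
        for c in range(n_clusters):
            members = X[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)
    return labels

# Usage: labels = cluster_with_candidate_ranking(list_of_document_strings)
```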
Abstract:
In 2006, the International Law Commission began a study into the role of states and international organizations in protecting persons in the event of a disaster. Special Rapporteur Mr. Eduardo Valencia-Ospina was appointed to head the study, and in 2011 its findings will be presented to the United Nations General Assembly. Of interest to this paper is the inclusion of “epidemics” under the natural disaster category in all of the reports detailing the Commission’s program of work on the protection of persons. This paper examines the legal and political ramifications of including “epidemics” within the concept of protection by exploring where sovereign responsibility for epidemic control begins and ends, particularly in light of the revisions to the International Health Regulations by the World Health Assembly in 2005. The paper will first analyze the findings already presented by the Special Rapporteur, examining the existing “responsibilities” of both states and international organizations. It will then consider to what extent the concept of protection entails a duty to assist individuals when an affected state proves unwilling or unable to assist its own population in the event of a disease outbreak. In an attempt to answer this question, the third part of the paper will examine the recent cholera outbreak in Zimbabwe.
Abstract:
From June 7th to 15th, the Thesis Eleven Centre for Cultural Sociology at La Trobe University, directed by Peter Beilharz, put together a programme of public lectures, cultural events and master classes under the theme ‘Word, Image, Action: Popular Print and Visual Cultures’. This article reports on the highlights of the festival, including a forum titled ‘Does WikiLeaks Matter?’, a half-day event ‘On Bauman’, and a public lecture by Ron Jacobs on ‘Media Narratives of Economic Crisis’.
Abstract:
PURPOSE To compare diffusion-weighted functional magnetic resonance imaging (DfMRI), a novel alternative to the blood oxygenation level-dependent (BOLD) contrast, with BOLD in a functional MRI experiment. MATERIALS AND METHODS Nine participants viewed contrast-reversing (7.5 Hz) black-and-white checkerboard stimuli using block and event-related paradigms. DfMRI (b = 1800 s/mm2) and BOLD sequences were acquired. Four parameters describing the observed signal were assessed: percent signal change, spatial extent of the activation, the Euclidean distance between peak voxel locations, and the time-to-peak (TTP) of the best-fitting impulse response for the different paradigms and sequences. RESULTS The BOLD conditions showed a higher percent signal change relative to DfMRI; however, event-related DfMRI showed the strongest group activation (t = 21.23, P < 0.0005). Activation was more diffuse and spatially closer to the BOLD response for DfMRI when the block design was used. The event-related DfMRI condition showed the shortest TTP (4.4 ± 0.88 s). CONCLUSION The hemodynamic contribution to DfMRI may increase with the use of block designs.
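For readers unfamiliar with the reported summary measures, the snippet below sketches how two of them, percent signal change and the time-to-peak of a best-fitting impulse response, can be computed from a voxel time series. It is a generic illustration with an invented gamma-shaped impulse response and hypothetical timing, not the analysis pipeline used in the study.

```python
import numpy as np

def gamma_hrf(t, peak):
    """Gamma-shaped impulse response peaking at `peak` seconds (unit rate,
    so the mode sits at shape - 1). Crude stand-in, not a canonical HRF."""
    shape = peak + 1.0
    h = t ** (shape - 1.0) * np.exp(-t)
    return h / h.max()

def time_to_peak(avg_response, t, candidate_peaks=np.arange(2.0, 9.0, 0.2)):
    """Grid-search the peak latency whose amplitude-scaled impulse response
    best fits a trial-averaged response in the least-squares sense."""
    best_peak, best_err = None, np.inf
    for p in candidate_peaks:
        h = gamma_hrf(t, p)
        beta = np.dot(h, avg_response) / np.dot(h, h)   # closed-form amplitude
        err = np.sum((avg_response - beta * h) ** 2)
        if err < best_err:
            best_peak, best_err = p, err
    return best_peak

def percent_signal_change(ts, active_idx, baseline_idx):
    """Mean signal in 'active' volumes relative to 'baseline' volumes."""
    base = ts[baseline_idx].mean()
    return 100.0 * (ts[active_idx].mean() - base) / base

# Toy usage with a simulated response sampled every 0.5 s.
t = np.arange(0, 20, 0.5)
simulated = gamma_hrf(t, 5.0) + 0.05 * np.random.default_rng(1).standard_normal(t.size)
print("estimated time-to-peak:", time_to_peak(simulated, t), "s")
```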
Abstract:
In this paper, the complete mitochondrial genome of Acraea issoria (Lepidoptera: Nymphalidae: Heliconiinae: Acraeini) is reported; it is a circular molecule of 15,245 bp. In A. issoria, the genes are arranged in the same order and orientation as in the other completely sequenced lepidopteran mitochondrial genomes, except for the presence of an extra copy of tRNAIle(AUR)b in the control region. All protein-coding genes of the A. issoria mitogenome start with a typical ATN codon and terminate with the common stop codon TAA, except that the COI gene uses TTG as its initial codon and terminates in a single T residue. All tRNA genes possess the typical cloverleaf secondary structure except tRNASer(AGN), which has a simple loop owing to the absence of the DHU stem. The sequence, organization and other features of this mitochondrial genome, including nucleotide composition and codon usage, are also reported and compared with those of other sequenced lepidopteran mitochondrial genomes. Some short microsatellite-like repeat regions (e.g., (TA)9, poly-A and poly-T) are scattered in the control region; however, the conspicuous macro-repeat units commonly found in other insect species are absent.
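Purely as an illustration of how such codon-usage features can be inspected in an annotated record, the short Biopython sketch below reads a GenBank file (the file name is assumed) and prints each protein-coding gene's first and last codon, flagging incomplete stop codons such as the single T reported for COI.

```python
from Bio import SeqIO

# Hypothetical file name for an annotated mitogenome GenBank record.
record = SeqIO.read("acraea_issoria_mitogenome.gb", "genbank")

for feat in record.features:
    if feat.type != "CDS":
        continue
    gene = feat.qualifiers.get("gene", ["?"])[0]
    cds = feat.extract(record.seq)        # coding sequence in reading orientation
    start = str(cds[:3])
    remainder = len(cds) % 3              # non-zero => incomplete stop codon
    stop = str(cds[-3:]) if remainder == 0 else str(cds[-remainder:]) + " (incomplete)"
    print(f"{gene}: start={start} stop={stop}")
```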
Abstract:
Organisations are constantly seeking new ways to improve operational efficiency. This research investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. The paper demonstrates how these trade-offs can be incorporated into the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that the genetic algorithm-based approach is able to use cost reduction to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
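The sketch below is a generic genetic algorithm over a toy encoding (each activity is assigned one of three hypothetical resources with different duration/rate trade-offs) and a crude weighted objective. It is not the paper's comprehensive cost structure, but it shows how a cost-based fitness function lets the search trade case duration against resource cost.

```python
import random

# Hypothetical process: 8 activities, each assignable to one of three
# resources with different (duration_hours, hourly_rate) trade-offs.
RESOURCES = [(4.0, 30.0), (3.0, 45.0), (2.0, 70.0)]
N_ACTIVITIES = 8
W_TIME, W_COST = 0.5, 0.5            # assumed weights in the composite objective

def total_cost(chromosome):
    """Composite objective combining case duration and labour cost."""
    duration = sum(RESOURCES[g][0] for g in chromosome)
    labour = sum(RESOURCES[g][0] * RESOURCES[g][1] for g in chromosome)
    return W_TIME * duration + W_COST * labour / 10.0   # crude scaling

def genetic_search(pop_size=40, generations=200, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(RESOURCES)) for _ in range(N_ACTIVITIES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_cost)                       # elitist selection
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_ACTIVITIES)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if rng.random() < p_mut:                   # mutation: reassign one activity
                child[rng.randrange(N_ACTIVITIES)] = rng.randrange(len(RESOURCES))
            children.append(child)
        pop = survivors + children
    best = min(pop, key=total_cost)
    return best, total_cost(best)

print(genetic_search())
```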
Abstract:
Salons became popular in Europe in the 17th century as sites of philosophic and literary conversation. A group of female academics interested in Deleuzian theories experimented with the salon to challenge presentation and dissemination norms that hierarchize and centralize the human. For Deleuze and Guattari (1987), assemblages are shifting and decentering, so how might assemblages of chairs, tables, bodies, lights and space help to trouble thinking about the methodological conventions around academic dissemination? The authors discuss the salon as a critical-cultural site: Cumming presents ‘Deleuze and play-dough’, an exploration of how the playful dissemination format of the salon prompted a re-reading of a methodological vignette from earlier research; Knight, an arts-based researcher, uses video art as a creative methodology to examine conceptualizations of rhizomes and assemblages at the salon as a dissemination site. The authors conclude that the salon, as a critical, cultural site, disrupts hierarchized ways of approaching and presenting research.
Abstract:
This paper presents an event-based failure model to predict the number of failures that occur in water distribution assets. Such models have often been based on analysis of historical failure data combined with pipe characteristics and environmental conditions. In this paper, weather data have been added to the model to take into account the commonly observed seasonal variation of the failure rate. The theoretical basis of existing logistic regression models is briefly described, along with the refinements made to the model to include the seasonal variation of weather. The performance of these refinements is tested using data from two Australian water authorities.
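For illustration, the sketch below fits a generic logistic regression on synthetic pipe records with seasonal harmonics and a weather covariate included as predictors. The covariates, coefficients and data are fabricated so the example runs end to end; they stand in for the historical failure, pipe and weather data described above and are not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical monthly records per pipe: age (years), diameter (mm), a
# standardised rainfall-deficit weather covariate, and the calendar month.
rng = np.random.default_rng(42)
n = 5000
age      = rng.uniform(0, 80, n)
diameter = rng.choice([100, 150, 225, 300], n)
month    = rng.integers(1, 13, n)
rain_def = rng.normal(0, 1, n)

X = np.column_stack([
    age,
    diameter / 100.0,                    # scaled for conditioning
    rain_def,
    np.sin(2 * np.pi * month / 12),      # seasonal harmonics
    np.cos(2 * np.pi * month / 12),
])

# Synthetic labels (failed in that month or not), just so the sketch runs;
# real models are fit to recorded failure histories.
logit = -6 + 0.03 * age + 0.8 * rain_def + 0.5 * np.sin(2 * np.pi * month / 12)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(["age", "diameter_scaled", "rain_deficit", "sin_month", "cos_month"],
               model.coef_[0].round(3))))
```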
Abstract:
This research provides validated finite element techniques for analysing pile foundations under seismic loads. The results demonstrate the capability of the technique to capture important aspects of pile response, including kinematic and inertial interaction effects, the effects of soil stiffness and depth on pile deflection patterns, and permanent deformations.