854 results for Paleoseismic events
Abstract:
Much progress has been made in estimating recurrence intervals of great and giant subduction earthquakes using terrestrial, lacustrine, and marine paleoseismic archives. Recent detailed records suggest these earthquakes may have variable recurrence periods and magnitudes, forming supercycles. Understanding seismic supercycles requires long paleoseismic archives that record the timing and magnitude of such events. Turbidite paleoseismic archives may potentially extend past earthquake records to the Pleistocene and can thus complement commonly shorter-term terrestrial archives. However, in order to unambiguously establish recurring seismicity as a trigger mechanism for turbidity currents, synchronous deposition of turbidites in widely spaced, isolated depocenters has to be ascertained. Furthermore, the characteristics that predispose a seismically active continental margin to turbidite paleoseismology and the correct selection of sample sites have to be taken into account. Here we analyze 8 marine sediment cores along 950 km of the Chile margin to test the feasibility of compiling detailed and continuous paleoseismic records based on turbidites. Our results suggest that the deposition of areally widespread, synchronous turbidites triggered by seismicity is largely controlled by sediment supply and, hence, the climatic and geomorphic conditions of the adjacent subaerial setting. The feasibility of compiling a turbidite paleoseismic record depends on the delicate balance between sufficient sediment supply, providing material to fail frequently during seismic shaking, and sufficiently low sedimentation rates to allow for coeval accumulation of planktonic foraminifera for high-resolution radiocarbon dating. We conclude that offshore northern central Chile (29-32.5°S), Holocene turbidite paleoseismology is not feasible, because sediment supply from the semi-arid mainland is low and almost no Holocene turbidity-current deposits are found in the cores. In contrast, in the humid region between 36 and 38°S, frequent Holocene turbidite deposition may generally correspond to paleoseismic events; however, high terrigenous sedimentation rates prevent high-resolution radiocarbon dating. The climatic transition region between 32.5 and 36°S appears to be best suited for turbidite paleoseismology.
Abstract:
The nature of this research is to investigate paleoseismic deformation of glacial soft sediments from three sampling sites in the Scottish Highlands: Arrat's Mill, Meikleour and Glen Roy. The paleoseismic evidence investigated in this research will provide a basis for applying criteria to soft-sediment deformation structures and to the trigger mechanisms that create these structures. Micromorphology is the tool used in this research to investigate paleoseismic deformation structures in thin section. Thin-section analysis (micromorphology) of glacial sediments from the three sampling sites is used to determine microscale evidence of past earthquakes that can be correlated to modern-day events and possibly lead to a better understanding of the impact of earthquakes across a range of sediment types. The significance of the three sampling locations is their proximity to two major active fault zones that cross Scotland. These fault zones, the Highland Boundary Fault and the Great Glen Fault, parallel each other and divide the country in half. Sims (1975) used a set of seven criteria to identify soft-sediment deformation structures created by a magnitude six earthquake in California. Using the criteria set forth by Sims (1975), the paleoseismic evidence can be correlated to the magnitude of the deformation structures found in the glacial sediments. This research determined that the microstructures at Arrat's Mill, Meikleour and Glen Roy are consistent with a seismically induced origin. It has also been demonstrated that, even without the presence of macrostructures, the use of micromorphology techniques in detecting such activity within sediments is of immense value.
Abstract:
We studied sediment cores from Lake Vens (2,327 m asl), in the Tinée Valley of the SW Alps, to test the paleoseismic archive potential of the lake sediments in this particularly earthquake-sensitive area. The historical earthquake catalogue shows that moderate to strong earthquakes, with intensities of IX–X, have impacted the Southern Alps during the last millennium. Sedimentological (X-ray images, grain-size distribution) and geochemical (major elements and organic matter) analyses show that Lake Vens sediments consist of a terrigenous, silty material (minerals and organic matter) sourced from the watershed and diatom frustules. A combination of X-ray images, grain-size distribution, major elements and magnetic properties shows the presence of six homogenite-type deposits interbedded in the sedimentary background. These sedimentological features are ascribed to sediment reworking and grain sorting caused by earthquake-generated seiches. The presence of microfaults that cross-cut the sediment supports the hypothesis of seismic deposits in this system. A preliminary sediment chronology is provided by ²¹⁰Pb measurement and AMS ¹⁴C ages. According to the chronology, the most recent homogenite events are attributable to damaging historic earthquakes in AD 1887 (Ligure) and 1564 (Roquebillière). Hence, the Lake Vens sediment recorded large-magnitude earthquakes in the region and permits a preliminary estimate of recurrence time for such events of ~400 years.
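To make the closing recurrence estimate concrete, here is a minimal sketch of the underlying arithmetic: the mean inter-event interval computed from deposit ages. The ages below are hypothetical placeholders (only the AD 1887 and 1564 historical events are named in the abstract), so the numbers merely illustrate how a ~400-year recurrence can follow from six dated deposits.

```python
# Minimal sketch (hypothetical ages, not the dated Lake Vens chronology):
# estimating a mean recurrence time from the ages of homogenite-type deposits
# as the mean inter-event interval.
import numpy as np

# Hypothetical calibrated ages (years AD) of six earthquake-related deposits;
# only the last two correspond to events named in the abstract.
event_ages_ad = np.array([-100, 350, 700, 1150, 1564, 1887])

intervals = np.diff(event_ages_ad)   # years between successive events
print(intervals.mean())              # ~400 years mean recurrence
```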
Abstract:
The occurrence of and conditions favourable to nucleation were investigated at an industrial and commercial coastal location in Brisbane, Australia, during five different campaigns covering a total period of 13 months. To identify potential nucleation events, the difference in number concentration in the size range 14-30 nm (N14-30) between consecutive observations was calculated using first-order differencing. The data showed that nucleation events were a rare occurrence, and that in the absence of nucleation the particle number was dominated by particles in the range 30-300 nm. In many instances, total particle concentration declined during nucleation. There was no clear pattern of change in NO and NO2 concentrations during the events. SO2 concentration declined during nucleation in the majority of cases, but there were exceptions. Most events took place in summer, followed by winter and then spring, and no events were observed during the autumn campaigns. The events were associated with sea breezes and long-range transport. Roadside emissions, in contrast, did not contribute to nucleation, probably due to the predominance of particles in the range 50-100 nm associated with these emissions.
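Since the abstract hinges on first-order differencing of N14-30 to flag potential nucleation events, a minimal sketch of that step may help. The function name, the threshold, and the data below are hypothetical illustrations, not values or code from the study.

```python
# Minimal sketch (assumed, not the study's code): flagging candidate
# nucleation events from first-order differences of N14-30, the particle
# number concentration in the 14-30 nm size range. The threshold is a
# hypothetical illustration only.
import numpy as np

def candidate_nucleation_events(n14_30, threshold=5000.0):
    """Return indices where the increase in N14-30 between consecutive
    observations exceeds `threshold` (particles per cm^3)."""
    diffs = np.diff(n14_30)                     # first-order differencing
    return np.where(diffs > threshold)[0] + 1   # index of the later observation

# Hypothetical consecutive observations of N14-30 (particles cm^-3)
series = np.array([1200.0, 1350.0, 1280.0, 9800.0, 15200.0, 7400.0])
print(candidate_nucleation_events(series))      # -> [3 4]
```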
Abstract:
This research project set out to explore Unitary Authority (UA) involvement in festivals and special events across Wales. It considers the level and nature of UA involvement and investigates activity by event purpose; reasons for, and characteristics of, UA engagement; and, crucially, the extent and nature of event evaluation. The study’s aim was to begin the development of a baseline of information for further research into the growing use of festivals and special events as a strategy for local economic development in Wales. A quantitative survey approach facilitated a comprehensive snapshot of UA responses whilst also incorporating discursive elements. A telephone survey was designed and undertaken with representatives of all 22 UA departments responsible for festivals and events in Wales. The research reveals a significant level of festival and special event activity across Wales, supported primarily for its perceived socio-cultural value. However, evaluation would appear to be focused on improving processes and measuring economic outputs rather than assessing whether socio-cultural objectives are being achieved. Whilst overwhelmingly positive about efforts to improve approaches to evaluation, respondents held clear views about the complications most likely to hamper any such efforts. These responses focused upon the need for flexibility, cost effectiveness and comparability across festival and special event typologies.
Abstract:
The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. The research proposes a solution to address the difficulties associated with multi-step attack scenario specification and detection for such environments. The research has focused on two distinct problems: the representation of events derived from heterogeneous sources and multi-step attack specification and detection.

The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the event abstraction model presented in this work preserves all low-level information as well as providing high-level information in the form of abstract events. The event abstraction model presented in this work was designed independently of any particular IDS and thus may be used by any IDS, intrusion forensic tool, or monitoring tool.

The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect, as they often involve the correlation of events from multiple sources which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism by using variable instantiation, where variables represent events as defined in the event abstraction model.

The third part of the research addresses time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios which involve logs from multiple hosts, yet issues involving time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing time uncertainty: clock skew compensation and clock drift modelling using linear regression.

An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed based on the Python programming language syntax, and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs and a synthetic dataset. The distinct features of the public dataset are that it contains multi-step attacks involving multiple hosts with clock skew and clock drift. These features allow us to demonstrate the application and the advantages of the contributions of this research. All instances of multi-step attacks in the dataset have been correctly identified even though significant clock skew and drift exist in the dataset.

Future work identified by this research would be to develop a refined unification algorithm suitable for processing streams of events to enable on-line detection. In terms of time uncertainty, identified future work would be to develop mechanisms which allow automatic clock skew and clock drift identification and correction.
The immediate application of the research presented in this thesis is the framework of an off-line IDS that processes events from heterogeneous sources using abstraction and can detect multi-step attack scenarios that may involve time uncertainty.
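As an illustration of the clock drift modelling mentioned above, the sketch below fits a linear model of a remote host's clock offset against reference time and uses it to correct timestamps before cross-host correlation. This is an assumed, simplified rendering, not the thesis implementation; all probe data are hypothetical.

```python
# Minimal sketch (assumed, not the thesis code): modelling the clock drift of
# a remote log source as a linear function of reference time, then correcting
# event timestamps before cross-host correlation. All data are hypothetical.
import numpy as np

# (reference_time, observed_offset) pairs, e.g. from periodic time probes.
ref_t  = np.array([0.0, 600.0, 1200.0, 1800.0, 2400.0])   # seconds
offset = np.array([2.00, 2.03, 2.06, 2.09, 2.12])          # seconds fast

# Least-squares fit: offset(t_ref) = skew + drift_rate * t_ref
drift_rate, skew = np.polyfit(ref_t, offset, 1)

def correct_timestamp(t_remote):
    """Map a remote timestamp back onto the reference clock."""
    # Invert t_remote = t_ref + skew + drift_rate * t_ref
    return (t_remote - skew) / (1.0 + drift_rate)

print(f"skew={skew:.3f}s, drift={drift_rate:.2e} s/s")
print(correct_timestamp(1802.09))   # ≈ 1800.0 on the reference clock
```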
Abstract:
Monitoring unused or dark IP addresses offers opportunities to extract useful information about both on-going and new attack patterns. In recent years, different techniques have been used to analyze such traffic, including sequential analysis, where a change in traffic behavior, for example a change in mean, is used as an indication of malicious activity. Change points themselves say little about the detected change; further data processing is necessary to extract useful information and to identify the exact cause of the detected change, which is difficult given the size and nature of the observed traffic. In this paper, we address the problem of analyzing a large volume of such traffic by correlating change points identified in different traffic parameters. The significance of the proposed technique is two-fold. Firstly, it automatically extracts information related to change points by correlating change points detected across multiple traffic parameters. Secondly, it validates a detected change point through the simultaneous presence of another change point in a different parameter. Using a real network trace collected from unused IP addresses, we demonstrate that the proposed technique enables us not only to validate change points but also to extract useful information about their causes.
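A minimal sketch of the correlation step described above: a change point detected in one traffic parameter is validated by the presence of a temporally close change point in another parameter. The function name, the 60-second window, and the parameter names are hypothetical choices for illustration, not the paper's implementation.

```python
# Minimal sketch (assumed, not the paper's code): validating a change point in
# one darknet traffic parameter by the presence of a temporally close change
# point in another parameter. Timestamps and parameter names are hypothetical.

def correlate_change_points(cp_by_param, window=60.0):
    """cp_by_param maps parameter name -> list of change-point times (seconds).
    Returns (time, parameters) pairs where a change point in one parameter has
    a matching change point in at least one other parameter within `window`."""
    correlated = []
    params = list(cp_by_param)
    for p in params:
        for t in cp_by_param[p]:
            matches = [q for q in params if q != p and
                       any(abs(t - s) <= window for s in cp_by_param[q])]
            if matches:
                correlated.append((t, [p] + matches))
    return correlated

# Hypothetical change points detected in different traffic parameters
cps = {
    "packet_count":   [120.0, 3600.0],
    "unique_sources": [150.0],
    "dst_port_445":   [130.0, 7200.0],
}
for t, ps in correlate_change_points(cps):
    print(t, ps)   # the cluster around 120-150 s mutually validates; isolated points do not
```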