926 results for Event-based timing


Relevance: 30.00%

Abstract:

During Ocean Drilling Program (ODP) Leg 180, 11 sites were drilled in the vicinity of the Moresby Seamount to study processes associated with the transition from continental rifting to seafloor spreading in the Woodlark Basin. This paper presents thermochronologic (40Ar/39Ar, 238U/206Pb, and fission track) results from igneous rocks recovered during ODP Leg 180 that help constrain the latest Cretaceous to present-day tectonic development of the Woodlark Basin. The igneous rocks recovered (primarily from Sites 1109, 1114, 1117, and 1118) consist predominantly of diabase and metadiabase, with minor basalt and gabbro. Zircon ion microprobe analyses gave a 238U/206Pb age of 66.4 ± 1.5 Ma, interpreted to date crystallization of the diabase. 40Ar/39Ar plagioclase apparent ages vary considerably according to the degree to which the diabase was altered subsequent to crystallization. The least altered sample (from Site 1109) yielded a plagioclase isochron age of 58.9 ± 5.8 Ma, interpreted to represent cooling following intrusion. The most altered sample (from Site 1117) yielded an isochron age of 31.0 ± 0.9 Ma, interpreted to represent a maximum age for the timing of subsequent hydrothermal alteration. The diabase has not been thermally affected by Miocene-Pliocene rift-related events, supporting our inference that these rocks have remained at shallow and cool levels in the crust (i.e., upper plate) since they were partially reset as a result of middle Oligocene hydrothermal alteration. These results suggest that crustal extension in the vicinity of the Moresby Seamount, immediately west of the active seafloor spreading tip, is being accommodated by normal faulting within latest Cretaceous to early Paleocene oceanic crust. Felsic clasts provide additional evidence for middle Miocene and Pliocene magmatic events in the region. Two rhyolitic clasts (from Sites 1110 and 1111) gave zircon 238U/206Pb ages of 15.7 ± 0.4 Ma and provide evidence for Miocene volcanism in the region. 40Ar/39Ar total fusion ages on single grains of K-feldspar from these clasts yielded younger apparent ages of 12.5 ± 0.2 and 14.4 ± 0.6 Ma due to variable sericitization of K-feldspar phenocrysts. 238U/206Pb zircon, 40Ar/39Ar K-feldspar and biotite total fusion, and apatite fission track analyses of a microgranite clast (from Site 1108) provide evidence for the existence of a rapidly cooled 3.0 to 1.8 Ma granitic protolith. The clast may have been transported longitudinally from the west (e.g., from the D'Entrecasteaux Islands). Alternatively, it may have been derived from a more proximal, but presently unknown, source in the vicinity of the Moresby Seamount.
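For reference, a 238U/206Pb zircon age such as the 66.4 ± 1.5 Ma crystallization age above follows from the standard uranium decay relation, where 206Pb* denotes radiogenic lead and λ238 is the 238U decay constant (≈ 1.55125 × 10⁻¹⁰ a⁻¹):

\[
\frac{^{206}\mathrm{Pb}^{*}}{^{238}\mathrm{U}} = e^{\lambda_{238} t} - 1
\qquad\Longleftrightarrow\qquad
t = \frac{1}{\lambda_{238}} \ln\!\left(1 + \frac{^{206}\mathrm{Pb}^{*}}{^{238}\mathrm{U}}\right)
\]

For t = 66.4 Ma this corresponds to a measured 206Pb*/238U of about e^(0.0103) − 1 ≈ 0.0104.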

Relevance: 30.00%

Abstract:

Recent revisions of the geological time scale by Kent and Gradstein (in press) suggest that, on average, Cretaceous magnetic anomalies are approximately 10 m.y. older than in Larson and Hilde's (1975) previous time scale. These revised basement ages change estimates for the duration of alteration in the ocean crust, based on the difference between secondary-mineral isochron ages and magnetic isochron-crustal ages, from 3 to approximately 13 m.y. In addition to the revised time scale, Burke et al.'s (1982) new data on the temporal variation of 87Sr/86Sr in seawater allow a better understanding of the timing of alteration and more realistic determinations of water/rock ratios during seawater-basalt interaction. Carbonates from all DSDP sites which reached Layer 2 of Atlantic crust (Sites 105, 332, 417, and 418) were deposited within 10-15 m.y. of crustal formation from solutions with 87Sr/86Sr ratios identical to unaltered or contemporaneous seawater. Comparison of the revised seawater curve with the 87Sr/86Sr of basement carbonates is consistent with a duration of approximately 10-15 m.y. for alteration in the ocean crust. Our preliminary Sr and 87Sr/86Sr data for carbonates from Hole 504B, on 5.9-m.y.-old crust south of the Costa Rica Rift, suggest that the hydrous solutions from which the carbonates precipitated contained substantial amounts of basaltic Sr. For this reason, carbonate 87Sr/86Sr cannot be used to estimate the duration of alteration at this site. A basalt-dominated alteration environment at Hole 504B is consistent with heat-flow evidence which indicates rapid sediment burial of crust at the Costa Rica Rift, sealing it from access by seawater and resulting in unusually low water/rock ratios during alteration.
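For orientation, water/rock (W/R) ratios of this kind are commonly estimated from a closed-system Sr-isotope mass balance. One standard formulation, assuming the altered rock and the final fluid reach isotopic equilibrium (R = 87Sr/86Sr; c = Sr concentration; subscripts r and w for rock and water; superscripts i and f for initial and final, with R_w^f = R_r^f), is:

\[
\frac{W}{R} = \frac{c_{r}\,\left(R_{r}^{f} - R_{r}^{i}\right)}{c_{w}\,\left(R_{w}^{i} - R_{r}^{f}\right)}
\]

The larger the isotopic shift of the rock relative to its remaining contrast with seawater, the higher the implied water/rock ratio.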

Relevance: 30.00%

Abstract:

Climate change and continuous urbanization contribute to an increased urban vulnerability towards flooding. Relying only on traditional flood control measures is recognized as inadequate, since the damage can be catastrophic if those controls fail. The idea of a flood-resilient city – one which can withstand or adapt to a flood event without being harmed in its functionality – seems promising. But what does resilience actually mean when it is applied to urban environments exposed to flood risk, and how can resilience be achieved? This paper presents a heuristic framework for assessing the flood resilience of cities, for scientists and policy-makers alike. It enriches the current literature on flood resilience by clarifying the meaning of its three key characteristics – robustness, adaptability and transformability – and identifying important components for implementing resilience strategies. The resilience discussion thus moves a step forward, from predominantly defining resilience to generating insight into “doing” resilience in practice. The framework is illustrated with two case studies from Hamburg, showing that resilience, and particularly the underlying notions of adaptability and transformability, first and foremost require further capacity-building among public as well as private stakeholders. The case studies suggest that flood resilience alone currently provides too little motivation to move from traditional to more resilient flood protection schemes in practice; rather, it needs to be integrated into a broader urban agenda.

Relevance: 30.00%

Abstract:

Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned due to problems such as zapping, saturation and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence for that development: it analyzes the recall generated by four non-conventional advertising formats in a real environment – short programme (branded content), television sponsorship, and internal and external telepromotion – versus the more conventional spot. The methodology integrated secondary data with primary data from computer-assisted telephone interviews (CATI) performed ad hoc on a sample of 2000 individuals, aged 16 to 65, representative of the total television audience. Our findings show that non-conventional advertising formats are more effective at a cognitive level, generating higher levels of both unaided and aided recall than the spot in all the formats analyzed.

Relevance: 30.00%

Abstract:

The branched vs. isoprenoid tetraether (BIT) index is based on the relative abundance of branched tetraether lipids (brGDGTs) and the isoprenoidal GDGT crenarchaeol. In Lake Challa sediments the BIT index has been applied as a proxy for local monsoon precipitation, on the assumption that the primary source of brGDGTs is soil washed in from the lake's catchment. Since then, microbial production within the water column has been identified as the primary source of brGDGTs in Lake Challa sediments, meaning that either an alternative mechanism links BIT index variation with rainfall or the proxy's application must be reconsidered. We investigated GDGT concentrations and BIT index variation in Lake Challa sediments at decadal resolution over the past 2200 years, in combination with GDGT time-series data from 45 monthly sediment-trap samples and a chronosequence of profundal surface sediments.
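For orientation, the BIT index is conventionally computed as the fractional abundance of the major branched GDGTs relative to crenarchaeol (bracketed terms denote concentrations or peak areas of the respective compounds; values near 1 indicate brGDGT dominance, values near 0 crenarchaeol dominance):

\[
\mathrm{BIT} = \frac{[\mathrm{Ia}] + [\mathrm{IIa}] + [\mathrm{IIIa}]}{[\mathrm{Ia}] + [\mathrm{IIa}] + [\mathrm{IIIa}] + [\mathrm{Crenarchaeol}]}
\]

This makes clear why, as reported below, BIT variations can be driven by the crenarchaeol term alone, without any change in brGDGT concentrations.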

Our 2200-year geochemical record reveals high-frequency variability in GDGT concentrations, and therefore in the BIT index, superimposed on distinct lower-frequency fluctuations at multi-decadal to century timescales. These changes in BIT index are correlated with changes in the concentration of crenarchaeol but not with those of the brGDGTs. A clue for understanding the indirect link between rainfall and crenarchaeol concentration (and thus thaumarchaeotal abundance) was provided by the observation that surface sediments collected in January 2010 show a distinct shift in GDGT composition relative to sediments collected in August 2007. This shift is associated with increased bulk flux of settling mineral particles with high Ti / Al ratios during March–April 2008, reflecting an event of unusually high detrital input to Lake Challa concurrent with intense precipitation at the onset of the principal rain season that year. Although brGDGT distributions in the settling material are initially unaffected, this soil-erosion event is succeeded by a massive dry-season diatom bloom in July–September 2008 and a concurrent increase in the flux of GDGT-0. Complete absence of crenarchaeol in settling particles during the austral summer following this bloom indicates that no Thaumarchaeota bloom developed at that time. We suggest that increased nutrient availability, derived from the eroded soil washed into the lake, caused the massive bloom of diatoms and that the higher concentrations of ammonium (formed from breakdown of this algal matter) resulted in a replacement of nitrifying Thaumarchaeota, which in typical years prosper during the austral summer, by nitrifying bacteria. The decomposing dead diatoms passing through the suboxic zone of the water column probably also formed a substrate for GDGT-0-producing archaea. Hence, through a cascade of events, intensive rainfall affects thaumarchaeotal abundance, resulting in high BIT index values.

Decade-scale BIT index fluctuations in Lake Challa sediments exactly match the timing of three known episodes of prolonged regional drought within the past 250 years. Additionally, the principal trends of inferred rainfall variability over the past two millennia are consistent with the hydroclimatic history of equatorial East Africa, as documented in other (but less well dated) regional lake records. We therefore propose that variation in GDGT production originating from the episodic recurrence of strong soil-erosion events, when integrated over (multi-)decadal and longer timescales, generates a stable positive relationship between the sedimentary BIT index and monsoon rainfall at Lake Challa. Application of this paleoprecipitation proxy at other sites requires ascertaining the local processes that affect the production of crenarchaeol by Thaumarchaeota and of brGDGTs.

Relevance: 30.00%

Abstract:

Modern manufacturing systems should satisfy emerging needs related to sustainable development. The design of sustainable manufacturing systems can be valuably supported by simulation, traditionally employed mainly for time and cost reduction. In this paper, a multi-purpose digital simulation approach is proposed to deal with sustainable manufacturing systems design through Discrete Event Simulation (DES) and 3D digital human modelling. DES models integrated with data on the power consumption of the manufacturing equipment are utilized to simulate different scenarios with the aim of improving productivity as well as energy efficiency, avoiding resource and energy waste. 3D simulation based on digital human modelling is employed to assess human factors issues related to the ergonomics and safety of manufacturing systems. The approach is implemented to enhance the sustainability of a real aerospace-industry manufacturing cell for automated robotic deburring. Alternative scenarios are proposed and simulated, obtaining a significant reduction in energy consumption for the new deburring cell (−87%) and a reduction of around −69% for the coordinate measuring machine, with high potential annual energy cost savings and increased energy efficiency. Moreover, the simulation-based ergonomic assessment of human operator postures allows a 25% improvement of the workcell ergonomic index.
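To illustrate the DES-plus-power-data idea, the sketch below (Python with the SimPy discrete event simulation library; the power ratings, cycle time and shift length are invented placeholders, not the paper's data) accumulates the energy consumption of a single cell across busy and idle states over one simulated shift:

    import simpy

    BUSY_KW, IDLE_KW = 5.0, 1.2   # assumed power draw of the cell (kW)
    CYCLE_MIN = 12.0              # assumed processing time per part (min)

    class Cell:
        def __init__(self, env):
            self.env = env
            self.energy_kwh = 0.0
            self.last_t = 0.0
            self.busy = False

        def account(self):
            # Accumulate energy used since the last state change.
            dt_h = (self.env.now - self.last_t) / 60.0
            self.energy_kwh += (BUSY_KW if self.busy else IDLE_KW) * dt_h
            self.last_t = self.env.now

        def process_part(self):
            self.account(); self.busy = True
            yield self.env.timeout(CYCLE_MIN)
            self.account(); self.busy = False

    def run_shift(env, cell, n_parts):
        for _ in range(n_parts):
            yield env.process(cell.process_part())
            yield env.timeout(3.0)  # assumed idle gap between parts

    env = simpy.Environment()
    cell = Cell(env)
    env.process(run_shift(env, cell, n_parts=20))
    env.run(until=480)            # one 8-hour shift, in minutes
    cell.account()                # close the final idle interval
    print(f"energy used over one shift: {cell.energy_kwh:.1f} kWh")

Comparing such energy totals across alternative scenarios (schedules, equipment, standby policies) is the kind of trade-off analysis the paper performs at much greater fidelity.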

Relevance: 30.00%

Abstract:

Purpose: Educational attainment has been shown to be positively associated with mental health and to be a potential buffer against stressful events. One stressful life event likely to affect everyone in their lifetime is bereavement. This paper assesses the effect of educational attainment on mental health post bereavement.
Methods: Utilising large administrative datasets linking Census returns to death records and prescribed-medication data, we analysed the bereavement exposure of 208,332 individuals aged 25-74 years. Two-level logistic regression models were constructed to determine the likelihood of antidepressant medication use (a proxy for mental ill-health) post bereavement, given level of educational attainment.
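For concreteness, a generic two-level logistic specification of the kind described in the Methods is given below; the grouping unit j is not stated in the abstract (area of residence is a plausible assumption, given the adjustment for area factors), and the remaining covariates are abbreviated as x_ij:

\[
\operatorname{logit} \Pr\left(y_{ij} = 1\right) = \beta_{0} + \beta_{1}\,\mathrm{Educ}_{ij} + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_{j}, \qquad u_{j} \sim N\!\left(0, \sigma_{u}^{2}\right)
\]

Here y_ij indicates receipt of antidepressant medication for individual i in group j, and reported odds ratios are exp(β); e.g. OR = 0.27 corresponds to 73% lower odds.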
Results: Individuals who are bereaved have greater antidepressant use than those who are not bereaved, with over a quarter (26.5%) of those bereaved by suicide in receipt of antidepressant medication compared to just 12.4% of those not bereaved. Among individuals bereaved by a sudden death, those with a university degree or higher qualifications are 73% less likely to be in receipt of antidepressant medication than those with no qualifications, after full adjustment for demographic, socio-economic and area factors (OR = 0.27, 95% CI 0.09-0.75). For those bereaved by suicide, by contrast, higher educational attainment and no qualifications have an equivalent effect.
Conclusions: Education may protect against poor mental health, as measured by the use of antidepressant medication, post bereavement, except in those bereaved by suicide. This protective effect is likely due to the improved cognitive, personal and psychological skills gained from time spent in education.

Relevance: 30.00%

Abstract:

The resilience of a social-ecological system is measured by its ability to retain core functionality when subjected to perturbation. Resilience is contextually dependent on the state of system components, the complex interactions among these components, and the timing, location, and magnitude of perturbations. The stability landscape concept provides a useful framework for considering resilience within the specified context of a particular social-ecological system but has proven difficult to operationalize. This difficulty stems largely from the complex, multidimensional nature of the systems of interest and uncertainty in system response. Agent-based models are an effective methodology for understanding how cross-scale processes within and across social and ecological domains contribute to overall system resilience. We present the results of a stylized model of agricultural land use in a small watershed that is typical of the Midwestern United States. The spatially explicit model couples land use, biophysical models, and economic drivers with an agent-based model to explore the effects of perturbations and policy adaptations on system outcomes. By applying the coupled modeling approach within the resilience and stability landscape frameworks, we (1) estimate the sensitivity of the system to context-specific perturbations, (2) determine potential outcomes of those perturbations, (3) identify possible alternative states within state space, (4) evaluate the resilience of system states, and (5) characterize changes in system-scale resilience brought on by changes in individual land use decisions.
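To make the coupled agent-based approach concrete, the toy Python sketch below (not the authors' model; all parameter values are invented) shows the minimal mechanics: heterogeneous farmer agents choose between two land uses by expected profit, a one-year price perturbation is applied, and the share of agents that do not switch back afterwards reveals an alternative system state:

    import random

    random.seed(42)

    class Farmer:
        def __init__(self):
            self.use = "corn"                     # current land use
            self.inertia = random.uniform(0, 30)  # switching cost, $/acre

        def decide(self, profit):
            # Switch only if the alternative beats the current use by
            # more than this agent's switching cost.
            alt = "grass" if self.use == "corn" else "corn"
            if profit[alt] - profit[self.use] > self.inertia:
                self.use = alt

    def run(price_shock=1.0, years=20, n=500):
        farmers = [Farmer() for _ in range(n)]
        for year in range(years):
            factor = price_shock if year == 10 else 1.0  # one-year shock
            profit = {"corn": 120 * factor, "grass": 100}
            for f in farmers:
                f.decide(profit)
        return sum(f.use == "corn" for f in farmers) / n

    # Fraction of land in corn at the end, without and with a corn-price shock.
    print(run(price_shock=1.0), run(price_shock=0.6))

Because agents with intermediate switching costs change land use during the shock but never switch back, the perturbed run ends in a different configuration: a crude analogue of the alternative states and stability landscape the paper explores.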

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Heterogeneity has to be taken into account when integrating a set of existing information sources into a distributed information system, which is nowadays often based on a Service-Oriented Architecture (SOA). This is particularly applicable to distributed services such as event monitoring, which are useful in the context of Event Driven Architectures (EDA) and Complex Event Processing (CEP). Web services deal with this heterogeneity at a technical level, but provide little support for event processing. Our central thesis is that a fully generic solution cannot provide complete support for event monitoring; instead, source-specific semantics, such as certain event types or support for certain event monitoring techniques, have to be taken into account. Our core result is the design of a configurable event monitoring (Web) service that allows us to trade genericity for the exploitation of source-specific characteristics. It thus delivers results for the areas of SOA, Web services, CEP and EDA.
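A minimal Python sketch of the configurability idea (the names here are illustrative, not the paper's actual service interface): a generic publish/subscribe core whose behaviour is specialized by a per-source configuration carrying known event types and source-specific filters:

    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass
    class Event:
        type: str
        payload: dict

    @dataclass
    class MonitorConfig:
        # Source-specific knowledge the generic core can exploit.
        known_types: set = field(default_factory=set)
        filters: list = field(default_factory=list)

    class EventMonitor:
        def __init__(self, config: MonitorConfig):
            self.config = config
            self.subscribers = []

        def subscribe(self, handler: Callable[[Event], None]) -> None:
            self.subscribers.append(handler)

        def publish(self, event: Event) -> None:
            # Exploit source-specific semantics when configured.
            if self.config.known_types and event.type not in self.config.known_types:
                return  # drop events this source cannot emit
            if all(f(event) for f in self.config.filters):
                for handler in self.subscribers:
                    handler(event)

    # Configure the generic monitor for a source with two known event types.
    cfg = MonitorConfig(known_types={"order.created", "order.shipped"},
                        filters=[lambda e: e.payload.get("region") == "EU"])
    mon = EventMonitor(cfg)
    mon.subscribe(lambda e: print("monitored:", e.type))
    mon.publish(Event("order.created", {"region": "EU"}))

The more a configuration says about its source, the more filtering the monitor can do itself; an empty configuration degrades to the fully generic (and weaker) behaviour the thesis argues against.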

Relevance: 30.00%

Abstract:

The effects of spatial attention and part-whole configuration on recognition of repeated objects were investigated with behavioral and event-related potential (ERP) measures. Short-term repetition effects were measured for probe objects as a function of whether a preceding prime object was shown as an intact image or coarsely scrambled (split into two halves) and whether or not it had been attended during the prime display. In line with previous behavioral experiments, priming effects were observed from both intact and split primes for attended objects, but only from intact (repeated same-view) objects when they were unattended. These behavioral results were reflected in ERP waveforms at occipito-temporal locations as more negative-going deflections for repeated items in the time window between 220 and 300 ms after probe onset (N250r). Attended intact images showed generally more enhanced repetition effects than split ones. Unattended images showed repetition effects only when presented in an intact configuration, and this finding was limited to the right-hemisphere electrodes. Repetition effects in earlier time windows (before 200 ms) were limited to attended conditions at occipito-temporal sites (N1), a component linked to the encoding of object structure, while repetition effects at central locations during the same time window (P150) were found only for objects repeated in the same intact configuration (both previously attended and unattended probe objects). The data indicate that view-generalization is mediated by a combination of analytic (part-based) representations and automatic view-dependent representations.

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and gather all information that would be relevant for their research. As a response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature, and to extract and aggregate the information found within while automatically normalizing the variability of natural language statements. Among different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in biomedical literature, and their representation as a set of recursive event structures. The 2009-2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at a large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine learning and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are spotted by the end-users. This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes/gene-products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as types and directions for these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at The 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
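To illustrate the post-processing idea in miniature, the Python sketch below (scikit-learn; the features, events and labels are toy stand-ins for the thesis's actual feature set and training data) scores extracted events with a supervised classifier and filters out those predicted to be extraction errors:

    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Each extracted event is described by simple features plus a gold label
    # (1 = correct event, 0 = extraction error), e.g. from annotated data.
    events = [
        {"trigger": "phosphorylation", "type": "Phosphorylation", "n_args": 1},
        {"trigger": "expression",      "type": "Gene_expression", "n_args": 1},
        {"trigger": "levels",          "type": "Gene_expression", "n_args": 1},
        {"trigger": "binding",         "type": "Binding",         "n_args": 2},
    ]
    labels = [1, 1, 0, 1]

    clf = make_pipeline(DictVectorizer(), LogisticRegression())
    clf.fit(events, labels)

    # Filter a new batch: keep only events predicted to be correct.
    new_events = [{"trigger": "binding", "type": "Binding", "n_args": 2}]
    kept = [e for e in new_events if clf.predict([e])[0] == 1]
    print(kept)

Applied to a database of millions of events, the same score-then-filter step removes a proportion of false positives at the cost of some correct events, a precision/recall trade-off such an approach has to evaluate carefully.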

Relevance: 30.00%

Abstract:

Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These infeasible test cases threaten the integrity of the entire model-based suite, and any coverage of contexts the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier. To support this study, I also develop a number of tools and infrastructure components for scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
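A minimal sketch of such a feasibility classifier (Python with scikit-learn; the event IDs, sequences and labels are invented for illustration): each test case is represented by the event IDs it contains, and a logistic regression predicts whether the case can actually execute:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Each test case is a sequence of GUI event IDs proposed by the MBT tool.
    test_cases = [
        "e_open e_edit e_save",
        "e_open e_save e_close",
        "e_save e_close",         # saving before anything is open
        "e_open e_close e_edit",  # editing after the window is closed
    ]
    feasible = [1, 1, 0, 0]  # ground truth from attempted execution

    # Bag-of-event-IDs features, mirroring the finding that event IDs
    # (rather than event types) carry the feasibility signal.
    clf = make_pipeline(CountVectorizer(token_pattern=r"\S+"),
                        LogisticRegression())
    clf.fit(test_cases, feasible)

    print(clf.predict(["e_open e_edit e_save e_close"]))

In practice one would also encode ordered features (e.g. event bigrams), since feasibility often depends on event order; this is one motivation for evaluating several feature types.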

Relevance: 30.00%

Abstract:

The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for the correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other. It is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, visible GUI behaviors, and static analysis of the GUI's program-code. GUIs are typically dynamic in nature, with user-visible state guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based software application and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI testcases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed to construct long GUI testcases. These testcases are able to drive the GUI into states that were not possible using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
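As a minimal illustration of the core idea (a toy Python sketch; the event names and interaction graph are invented, not output of the actual GUITAR toolchain), code-level interactions between event handlers, e.g. one handler writing state that another reads, can be modelled as a directed graph and chained into long testcases by walking its paths:

    # interacts[a] lists events whose handlers read state that
    # the handler of event `a` writes.
    interacts = {
        "SetUsername": ["Login"],
        "SetPassword": ["Login"],
        "Login":       ["OpenReport", "Logout"],
        "OpenReport":  ["ExportPDF"],
    }

    def long_testcases(start, depth):
        """Enumerate event sequences of the given length that follow a
        code-level interaction at every step."""
        if depth == 1:
            return [[start]]
        cases = []
        for nxt in interacts.get(start, []):
            for tail in long_testcases(nxt, depth - 1):
                cases.append([start] + tail)
        return cases

    for case in long_testcases("SetUsername", 4):
        print(" -> ".join(case))
    # SetUsername -> Login -> OpenReport -> ExportPDF

Because every step follows an actual code-level dependency, such chains are more likely to reach deep, interacting GUI states than sequences drawn from visible GUI structure alone.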