899 results for Aftermath of cerebrovascular event


Relevance:

100.00%

Publisher:

Abstract:

News of the attacks on New York and Washington on September 11th 2001 spread fast, mainly through dramatic images of the events broadcast by global television media, particularly 24-hour news channels such as BBC News 24 and CNN. Following the initial report, many news channels moved to dedicated live coverage of the story. This move, to what Liebes (1998) describes as a 'disaster marathon', entails shifting from the routine, regular news agenda to one where the event and its aftermath become the main story and reference for all other news. In this paper, we draw upon recordings from the BBC News 24 channel on September 11th 2001 during the immediate aftermath of the attacks on the World Trade Centre and Pentagon to argue that the coverage of this event, and other similar types of events, may be characterised as news permeated with strategic and emergent silences. Identifying silence as both concrete and metaphorical, we suggest that there are a number of types of silence found in the coverage, and that these not only act to cover for a lack of new news, or to give emphasis or gravitas, but also that the vacuum created by a lack of news creates an emotional space in which collective shock, grieving or wonder are managed through news presented as phatic communion.


A major task of traditional temporal event sequence mining is to predict the occurrences of a special type of event (called the target event) in a long temporal sequence. Our previous work defined a new type of pattern, called an event-oriented pattern, which can potentially predict the target event within a certain period of time. However, because the interval size used for prediction in event-oriented pattern discovery is pre-defined, the mining results can be inaccurate and carry misleading information. In this paper, we introduce a new concept, called the temporal feature, to rectify this shortcoming. For any event-oriented pattern discovered under the pre-defined interval size, the temporal feature is the minimal interval size that makes the pattern interesting. Thus, by further investigating the temporal features of discovered event-oriented patterns, we can refine the knowledge used for target event prediction.
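
As a hedged illustration only (not the paper's actual algorithm), the temporal-feature idea can be sketched as a search for the smallest prediction window at which a pattern's confidence clears an interestingness threshold; the function name, event representation, and the use of confidence as the interestingness criterion are all assumptions:

```python
from bisect import bisect_right

def minimal_interesting_interval(pattern_times, target_times,
                                 candidate_sizes, min_confidence=0.5):
    """Hypothetical sketch: for each occurrence time of a pattern, check
    whether the target event occurs within `size` time units afterwards;
    return the smallest candidate size whose confidence meets the threshold."""
    target_times = sorted(target_times)
    for size in sorted(candidate_sizes):
        hits = 0
        for t in pattern_times:
            i = bisect_right(target_times, t)  # first target strictly after t
            if i < len(target_times) and target_times[i] <= t + size:
                hits += 1
        if pattern_times and hits / len(pattern_times) >= min_confidence:
            return size
    return None  # no candidate size makes the pattern interesting
```

With pattern occurrences at times 0, 10, and 20 and targets at 3, 14, and 100, a window of 5 already covers two of three occurrences, so it would be returned before the larger candidates.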


Purpose - To provide an example of the use of system dynamics within the context of a discrete-event simulation study. Design/methodology/approach - A discrete-event simulation study of a production-planning facility in a gas cylinder-manufacturing plant is presented. The case study evidence incorporates questionnaire responses from sales managers involved in the order-scheduling process. Findings - As the project progressed it became clear that, although the discrete-event simulation would meet the objectives of the study in a technical sense, the organizational problem of "delivery performance" would not be solved by the discrete-event simulation study alone. The case shows how the qualitative outcomes of the discrete-event simulation study led to an analysis using the system dynamics technique. The system dynamics technique was able to model the decision-makers in the sales and production process and provide a deeper understanding of the performance of the system. Research limitations/implications - The case study describes a traditional discrete-event simulation study which incorporated an unplanned investigation using system dynamics. Further case studies that plan explicitly for the consideration of organizational issues in discrete-event simulation studies are required. Such studies would clarify the role of qualitative data in discrete-event simulation and of supplementary tools that incorporate organizational aspects, and may help generate a methodology for discrete-event simulation that incorporates human aspects, improving its relevance for decision making. Practical implications - It is argued that system dynamics can provide a useful addition to the toolkit of the discrete-event simulation practitioner, helping them incorporate a human aspect in their analysis.
Originality/value - Helps decision makers gain a broader perspective on the tools available to them by showing the use of system dynamics to supplement the use of discrete-event simulation. © Emerald Group Publishing Limited.
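
For readers unfamiliar with the technique, a discrete-event simulation is at its core an event loop over a time-ordered queue. The following minimal sketch is illustrative only and does not model the gas cylinder plant from the case study; all names are hypothetical:

```python
import heapq

def simulate(events, horizon):
    """Minimal discrete-event loop: pop the earliest event, run its handler
    (which may schedule further events), and stop at the time horizon."""
    queue = list(events)  # entries are (time, sequence_no, handler) tuples
    heapq.heapify(queue)
    log, seq = [], len(queue)
    while queue and queue[0][0] <= horizon:
        time, _, handler = heapq.heappop(queue)
        log.append(time)
        for delay, nxt in handler(time):  # handler returns new (delay, handler) pairs
            seq += 1  # sequence number breaks ties without comparing handlers
            heapq.heappush(queue, (time + delay, seq, nxt))
    return log
```

A self-rescheduling arrival process, e.g. `def arrival(t): return [(2.0, arrival)]` started at time 0.0, would log an event every 2.0 time units up to the horizon.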


This thesis is a study of performance management of Complex Event Processing (CEP) systems. CEP systems have characteristics distinct from other well-studied computer systems such as batch and online transaction processing systems and database-centric applications, and these characteristics introduce new challenges and opportunities for the performance management of CEP systems. Methodologies used to benchmark CEP systems in many performance studies focus on scaling the load injection but do not consider the impact of the functional capabilities of CEP systems. This thesis proposes the approach of evaluating the performance of CEP engines' functional behaviours on events and develops a benchmark platform for CEP systems: CEPBen. The CEPBen benchmark platform is developed to explore the fundamental functional performance of event processing systems: filtering, transformation, and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and evaluating their performance. Studies on factors and new metrics are carried out with the CEPBen benchmark platform on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of a targeted event is proposed as a quality-of-service metric to be used alongside the traditional response time in CEP systems. Maximum query load is proposed as a capacity indicator with respect to query complexity, and the number of live objects in memory as a performance indicator with respect to memory management. Query depth is studied as a performance factor that influences CEP system performance.
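
As a rough illustration of the kind of functional behaviour such benchmarks exercise (this is not CEPBen's or Esper's actual API), a "followed-by" pattern detector over a timestamped event stream might look like the following; the event representation and names are assumptions:

```python
from collections import deque

def detect_followed_by(stream, first, second, window):
    """Toy CEP pattern detector: emit a match whenever an event of type
    `second` arrives within `window` time units after one of type `first`.
    Events are (timestamp, type) pairs in timestamp order."""
    pending, matches = deque(), []
    for ts, etype in stream:
        while pending and ts - pending[0] > window:
            pending.popleft()  # expire stale `first` events outside the window
        if etype == first:
            pending.append(ts)
        elif etype == second and pending:
            matches.append((pending.popleft(), ts))  # pair earliest pending first
    return matches
```

A real engine would add query compilation, shared state across many such patterns, and memory management for pending partial matches, which is exactly where metrics like live objects in memory and query depth become relevant.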


Recent literature has argued that whereas remembering the past and imagining the future make use of shared cognitive substrates, simulating future events places heavier demands on executive resources. These propositions were explored in 3 experiments comparing the impact of imagery and concurrent task demands on the speed and accuracy of past event retrieval and future event simulation. Results support the suggestion that both past and future episodes can be constructed through 2 mechanisms: a noneffortful "direct" pathway and a controlled, effortful "generative" pathway. However, limited evidence emerged for the suggestion that simulating future episodes, compared with retrieving past ones, places heavier demands on executive resources; only under certain conditions did it emerge as a more error-prone and lengthier process. The findings are discussed in terms of how retrieval and simulation make use of the same cognitive substrates in subtly different ways. © 2011 American Psychological Association.


Before dawn on August 24, 1992, Hurricane Andrew smashed into south Florida, particularly southern Dade County, and soon became the costliest natural disaster in U.S. history. Andrew's impacts quickly overwhelmed local and state emergency response capabilities and eventually required major federal assistance, including regular military units. While the social and economic impacts of Hurricane Andrew are relatively well researched, much less attention has been given to its possible political effects. Focusing on incumbent officeholders at three levels (municipal, state legislative, and statewide) who stood for reelection after Hurricane Andrew, this study seeks to determine whether they experienced any political effects from Andrew. That is, this study explores the possible interaction between the famous "incumbency advantage" and an "extreme event," in this case a natural disaster. The specific foci were (1) campaigns and campaigning (a research process that included 43 personal interviews), and (2) election results before and after the event. Given well-documented response problems, the working hypothesis was that incumbents experienced largely negative political fallout from the disaster. The null hypothesis was that incumbents saw no net political effects, but the reverse hypothesis was also considered: incumbents benefited politically from the event. In the end, this study found that although the election process was physically disrupted, especially in south Dade County, the disaster largely reinforced the incumbency advantage. More specifically, the aftermath of Hurricane Andrew allowed most incumbent officeholders to (1) enhance constituency service, (2) associate themselves with the flow of external assistance, (3) achieve major personal visibility and media coverage, and yet (4) appear non-political or at least above normal politics.
Overall, this combination allowed incumbents to very effectively "campaign without campaigning," a point borne out by post-Andrew election results.


Background: Patients with autoimmune disease have an increased incidence of stroke. Hemorrhagic stroke (HS) is associated with loss of cerebrovascular function, leading to micro-vessel rupture and hemorrhage. We believe chronic inflammation is involved in the loss of cerebrovascular function and HS. We established a hypertensive-arthritis model in spontaneously hypertensive rats (SHR) fed either a standard rodent diet (0.59% NaCl) (RD) or a high-salt diet (4% NaCl) (HSD) and compared them to non-inflamed SHR. Methods: Complete Freund's adjuvant (CFA) was injected into the left paw to induce mono-arthritis. Blood pressure and inflammation were monitored. At endpoint, animals were sacrificed and evaluated for HS while the middle cerebral artery (MCA) was isolated for functional studies. Results: HS was observed in 90% of the CFA-treated groups. The MCA of arthritic RD-SHR exhibited a decreased ability to undergo pressure-dependent constriction (PDC). All HSD-SHR showed a decreased PDC response. However, arthritic HSD-SHR also demonstrated a diminished response to vasoactive peptides. Conclusion: HS occurring with CFA injection corresponds with loss of MCA function. Chronic HSD appears to further exacerbate vascular dysfunction in the MCA.


This article will discuss notions and concepts of remembering in the aftermath of the Charlie Hebdo attacks. Much has been written about the immediate response to the attacks, both commending the collective spirit of unity that defined the ‘marche républicaine’ of 11 January 2015, and criticising the alleged hypocrisy and cynicism of, most notably, the political figures that took to the streets that day, hand in hand. I will consider a selection of the memory practices that have emerged since then, notably on the anniversary of the event. This demonstration of memory provides key insights into the form and manner of remembering within a particular cultural group, but also reflects how the present moment is integral to our understanding of memory. The purpose of this article is to consider how official and non-official remembering of Charlie Hebdo can intertwine as well as pull in separate directions. A focus on the politics, the language, the aesthetics and the geography of commemorative activities in this article will enable an appreciation of the multidirectional character of remembering Charlie Hebdo.


The current thesis examines memory bias for state anxiety prior to academic achievement situations such as writing an exam and giving a speech. The thesis relies on the reconstruction principle, which assumes that memories for past emotions are reconstructed rather than stored permanently and accurately. This makes them prone to memory bias, which is affected by several influencing factors. A major aim is to include four important influencing factors simultaneously. Early research on mood and emotional autobiographical memory found evidence for the existence of a propositional associative network (Bower, 1981; Collins & Loftus, 1975), leading to mood-congruent recall. But empirical findings also gave strong evidence for the existence of mood-incongruent recall of one's own emotions, which has been linked, for example, to mood regulation via mood repair (e.g. Clark & Isen, 1982), which in turn seems to be associated with the personality traits extraversion and neuroticism (Lischetzke & Eid, 2006; Ng & Diener, 2009). Moreover, neuroticism and trait anxiety are related to rumination, which is seen as negative post-event processing (e.g. Wells & Clark, 1997). Overall, the time elapsed since the emotional event should have an impact on the recall of emotions. Following the affect infusion model by Robinson and Clore (2002a), the influence of personality on memory bias should increase over time. Therefore, three longitudinal studies were realized, using naturally occurring as well as laboratory settings. The same paradigm was used in all studies. Subjects were asked about their actual state anxiety prior to an academic achievement situation. Directly after the situation, current mood and recall of former anxiety were assessed. The same procedure was repeated a few weeks later. Personality traits and post-event processing were also assessed. The results suggest that predicting memory bias requires a differentiated view.
Study 1 (N = 131) as well as study 3 (N = 53) found evidence for mood-incongruent memory in the sense of mood repair and downward regulation as a function of personality. Rumination was found to cause stable overestimation of pre-event anxiety in study 2 (N = 141) as well as in study 3. Although the relevance of the influencing factors changed over time, an increasing relevance of personality could not consistently be observed. The markedly different effects in the laboratory-based study 2 indicate that such settings are not well suited to studying these questions. Theoretical and psychotherapeutically relevant conclusions are drawn and several limitations are discussed.


The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and gather all information that would be relevant for their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature and extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among these tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in biomedical literature, and their representation as a set of recursive event structures. The 2009-2013 series of BioNLP Shared Tasks on Event Extraction have given rise to a number of event extraction systems, several of which have been applied at a large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine-learning approaches and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are then encountered by end users.
This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation, suggest that the problem is indeed learnable. This work won the Best Paper Award at The 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
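
The network-topology framing can be illustrated with a deliberately simplified sketch: score absent edges by the number of neighbours the two nodes share, a crude stand-in for the richer feature set and trained model the thesis describes (all names here are hypothetical):

```python
from itertools import combinations

def rank_candidate_edges(edges):
    """Toy network-topology prediction: score each absent node pair in an
    undirected interaction network by its common-neighbour count, and rank
    candidates by that score (highest first)."""
    nbrs = {}
    for u, v in edges:
        nbrs.setdefault(u, set()).add(v)
        nbrs.setdefault(v, set()).add(u)
    existing = {frozenset(e) for e in edges}
    scores = {}
    for u, v in combinations(sorted(nbrs), 2):
        if frozenset((u, v)) not in existing:
            scores[(u, v)] = len(nbrs[u] & nbrs[v])  # shared-neighbour feature
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

A learned system would replace the single common-neighbour feature with many event-derived features and a supervised classifier that also predicts edge type and direction.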


Sequences of timestamped events are currently being generated across nearly every domain of data analytics, from e-commerce web logging to electronic health records used by doctors and medical researchers. Every day, this data type is reviewed by humans who apply statistical tests, hoping to learn everything they can about how these processes work, why they break, and how they can be improved upon. To further uncover how these processes work the way they do, researchers often compare two groups, or cohorts, of event sequences to find the differences and similarities between outcomes and processes. With temporal event sequence data, this task is complex because of the variety of ways single events and sequences of events can differ between the two cohorts of records: the structure of the event sequences (e.g., event order, co-occurring events, or frequencies of events), the attributes about the events and records (e.g., gender of a patient), or metrics about the timestamps themselves (e.g., duration of an event). Running statistical tests to cover all these cases and determining which results are significant becomes cumbersome. Current visual analytics tools for comparing groups of event sequences emphasize a purely statistical or purely visual approach for comparison. Visual analytics tools leverage humans' ability to easily see patterns and anomalies that they were not expecting, but are limited by uncertainty in their findings. Statistical tools emphasize finding significant differences in the data, but often require researchers to have a concrete question and do not facilitate more general exploration of the data. Combining visual analytics tools with statistical methods leverages the benefits of both approaches for quicker and easier insight discovery.
Integrating statistics into a visualization tool presents many challenges on the frontend (e.g., displaying the results of many different metrics concisely) and in the backend (e.g., scalability challenges with running various metrics on multi-dimensional data at once). I begin by exploring the problem of comparing cohorts of event sequences and understanding the questions that analysts commonly ask in this task. From there, I demonstrate that combining automated statistics with an interactive user interface amplifies the benefits of both types of tools, thereby enabling analysts to conduct quicker and easier data exploration, hypothesis generation, and insight discovery. The direct contributions of this dissertation are: (1) a taxonomy of metrics for comparing cohorts of temporal event sequences, (2) a statistical framework for exploratory data analysis with a method I refer to as high-volume hypothesis testing (HVHT), (3) a family of visualizations and guidelines for interaction techniques that are useful for understanding and parsing the results, and (4) a user study, five long-term case studies, and five short-term case studies which demonstrate the utility and impact of these methods in various domains: four in the medical domain, one in web log analysis, two in education, and one each in social networks, sports analytics, and security. My dissertation contributes an understanding of how cohorts of temporal event sequences are commonly compared and the difficulties associated with applying and parsing the results of these metrics. It also contributes a set of visualizations, algorithms, and design guidelines for balancing automated statistics with user-driven analysis to guide users to significant, distinguishing features between cohorts. 
This work opens avenues for future research in comparing two or more groups of temporal event sequences, opening traditional machine learning and data mining techniques to user interaction, and extending the principles found in this dissertation to data types beyond temporal event sequences.
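
One standard way to decide which of many simultaneous test results are significant, the core difficulty that high-volume hypothesis testing addresses, is false-discovery-rate control. The Benjamini-Hochberg sketch below is a generic illustration of that idea, not the dissertation's HVHT implementation:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg FDR control: sort the m p-values, find the largest
    rank k with p_(k) <= (k/m) * alpha, and reject the hypotheses with the
    k smallest p-values. Returns the indices of rejected (significant) tests."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            k_max = rank  # keep the largest rank passing the threshold
    return {order[r] for r in range(k_max)}
```

For p-values [0.01, 0.04, 0.03, 0.2] at alpha = 0.05, only the first test survives: 0.01 <= (1/4)*0.05, but 0.03 exceeds (2/4)*0.05.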


The Legislative Oversight Committee of the South Carolina House of Representatives referred allegations pertaining to the Department of Juvenile Justice (DJJ) that were generated during its ongoing oversight study of DJJ. Specifically, the safety issues focused on lack of control, lack of trust, and lack of adequate staffing. This review's scope and objectives were: investigate specific complainant allegations of DJJ employees underreporting, misreporting, or destroying event reports (ERs); review the efficiency and effectiveness of DJJ's event reporting process and follow up on anomalies or potential patterns of systemic underreporting, misreporting, or missing ERs; and assess juvenile and employee safety conditions through interviews with a cross-section of relevant employees, record review, and possibly an employee survey.


We propose a method, denoted the synthetic portfolio, for event studies in market microstructure that is particularly well suited to high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that for Colombian Stock Market securities the asynchronicity of intraday data reduces the analysis to a selected group of stocks; however, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism is not enhancing the price discovery process.
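
The tracking idea can be sketched in a deliberately reduced form: with only two donor assets and a weight constrained to [0, 1], the pre-event tracking weight has a closed form. This is an illustrative simplification under assumed names, not the paper's estimator (which handles many donors and high-frequency asynchronicity):

```python
def synthetic_weight(asset, donor_a, donor_b):
    """Choose the weight w on donor_a (and 1 - w on donor_b) minimising the
    squared pre-event tracking error sum((r - w*a - (1-w)*b)^2). Rewriting as
    sum(((r - b) - w*(a - b))^2) gives the least-squares solution below."""
    num = den = 0.0
    for r, a, b in zip(asset, donor_a, donor_b):
        num += (r - b) * (a - b)
        den += (a - b) ** 2
    w = num / den if den else 0.5  # degenerate case: identical donors
    return min(1.0, max(0.0, w))   # clip weight into [0, 1]
```

The post-event difference between the asset's returns and the weighted portfolio's returns then serves as the estimated effect of the auction, in the spirit of the counterfactual comparison described above.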


We show effects of event-by-event fluctuations of the initial conditions (IC) on some observables in the hydrodynamic description of high-energy nuclear collisions. Such IC produce not only fluctuations in observables but, due to their bumpy structure, several non-trivial effects. They enhance the production of isotropically distributed high-p(T) particles, making v(2) smaller there. They also reduce v(2) in the forward and backward regions, where the global matter density is smaller and such effects therefore become more pronounced. They may also produce the so-called ridge effect in the correlation of two large-p(T) particles.


Parity (P)-odd domains, corresponding to nontrivial topological solutions of the QCD vacuum, might be created during relativistic heavy-ion collisions. These domains are predicted to lead to charge separation of quarks along the orbital momentum of the system created in noncentral collisions. To study this effect, we investigate a three-particle mixed-harmonics azimuthal correlator which is a P-even observable, but directly sensitive to the charge-separation effect. We report measurements of this observable using the STAR detector in Au + Au and Cu + Cu collisions at √s_NN = 200 and 62 GeV. The results are presented as a function of collision centrality, particle separation in rapidity, and particle transverse momentum. A signal consistent with several of the theoretical expectations is detected in all four data sets. We compare our results to the predictions of existing event generators and discuss in detail possible contributions from other effects that are not related to P violation.