926 results for Event-based timing
Abstract:
Background: increasing numbers of patients are surviving critical illness, but survival may be associated with a constellation of physical and psychological sequelae that can cause ongoing disability and reduced health-related quality of life. Limited evidence currently exists to guide the optimum structure, timing, and content of rehabilitation programmes. There is a need to both develop and evaluate interventions to support and expedite recovery during the post-ICU discharge period. This paper describes the construct development for a complex rehabilitation intervention intended to promote physical recovery following critical illness. The intervention is currently being evaluated in a randomised trial (ISRCTN09412438; funder Chief Scientist Office, Scotland). Methods: the intervention was developed using the Medical Research Council (MRC) framework for developing complex healthcare interventions. We ensured representation from a wide variety of stakeholders, including content experts from multiple specialties, methodologists, and patient representatives. The intervention construct was initially based on literature review, local observational and audit work, qualitative studies with ICU survivors, and brainstorming activities. Iterative refinement was aided by the publication of a National Institute for Health and Care Excellence guideline (No. 83), publicly available patient stories (Healthtalkonline), a stakeholder event in collaboration with the James Lind Alliance, and local piloting. Modelling and further work involved a feasibility trial and development of a novel generic rehabilitation assistant (GRA) role. Several rounds of external peer review during successive funding applications also contributed to development.
Results: the final construct for the complex intervention involved a dedicated GRA trained to pre-defined competencies across multiple rehabilitation domains (physiotherapy, dietetics, occupational therapy, and speech/language therapy), with specific training in post-critical illness issues. The intervention extended from ICU discharge to 3 months post-discharge, including inpatient and post-hospital discharge elements. Clear strategies to provide information to patients/families were included. A detailed taxonomy was developed to define and describe the processes undertaken, and to capture them during the trial. The detailed process measure description, together with a range of patient, health service, and economic outcomes, was successfully mapped onto the modified CONSORT recommendations for reporting non-pharmacologic trial interventions. Conclusions: the MRC complex intervention framework was an effective guide to developing a novel post-ICU rehabilitation intervention. Combining a clearly defined new healthcare role with a detailed taxonomy of process and activity enabled the intervention to be clearly described for the purposes of trial delivery and reporting. These data will be useful when interpreting the results of the randomised trial, will increase internal and external trial validity, and will help others implement the intervention should it prove clinically and cost-effective.
Abstract:
Importance: critical illness results in disability and reduced health-related quality of life (HRQOL), but the optimum timing and components of rehabilitation are uncertain. Objective: to evaluate the effect of increasing physical and nutritional rehabilitation plus information delivered during the post–intensive care unit (ICU) acute hospital stay by dedicated rehabilitation assistants on subsequent mobility, HRQOL, and prevalent disabilities. Design, Setting, and Participants: a parallel group, randomized clinical trial with blinded outcome assessment at 2 hospitals in Edinburgh, Scotland, of 240 patients discharged from the ICU between December 1, 2010, and January 31, 2013, who required at least 48 hours of mechanical ventilation. Analysis for the primary outcome and other 3-month outcomes was performed between June and August 2013; for the 6- and 12-month outcomes and the health economic evaluation, between March and April 2014. Interventions: during the post-ICU hospital stay, both groups received physiotherapy and dietetic, occupational, and speech/language therapy, but patients in the intervention group received rehabilitation that typically increased the frequency of mobility and exercise therapies 2- to 3-fold, increased dietetic assessment and treatment, used individualized goal setting, and provided greater illness-specific information. Intervention group therapy was coordinated and delivered by a dedicated rehabilitation practitioner. Main Outcomes and Measures: the Rivermead Mobility Index (RMI) (range 0-15) at 3 months; higher scores indicate greater mobility. Secondary outcomes included HRQOL, psychological outcomes, self-reported symptoms, patient experience, and cost-effectiveness during a 12-month follow-up (completed in February 2014). Results: median RMI at randomization was 3 (interquartile range [IQR], 1-6) and at 3 months was 13 (IQR, 10-14) for the intervention and usual care groups (mean difference, −0.2 [95% CI, −1.3 to 0.9; P = .71]). 
The HRQOL scores were unchanged by the intervention (mean difference in the Physical Component Summary score, −0.1 [95% CI, −3.3 to 3.1; P = .96]; and in the Mental Component Summary score, 0.2 [95% CI, −3.4 to 3.8; P = .91]). No differences were found for self-reported symptoms of fatigue, pain, appetite, joint stiffness, or breathlessness. Levels of anxiety, depression, and posttraumatic stress were similar, as were hand grip strength and the timed Up & Go test. No differences were found at the 6- or 12-month follow-up for any outcome measures. However, patients in the intervention group reported greater satisfaction with physiotherapy, nutritional support, coordination of care, and information provision. Conclusions and Relevance: post-ICU hospital-based rehabilitation, including increased physical and nutritional therapy plus information provision, did not improve physical recovery or HRQOL, but improved patient satisfaction with many aspects of recovery.
Abstract:
Many individuals who have had a stroke have motor impairments such as timing deficits that hinder their ability to complete daily activities like getting dressed. Robotic rehabilitation is an increasingly popular therapeutic avenue for improving motor recovery in this population. Yet most studies have focused on improving the spatial aspect of movement (e.g. reaching), and not the temporal one (e.g. timing). Hence, the main aim of this study was to compare two types of robotic rehabilitation on the immediate improvement of timing accuracy: haptic guidance (HG), which consists of guiding the person to make the correct movement, thus decreasing his or her movement errors, and error amplification (EA), which consists of increasing the person's movement errors. The secondary objective was to explore whether the side of the stroke lesion had an effect on timing accuracy following HG and EA training. Thirty-four persons who had a stroke (average age 67 ± 7 years) participated in a single training session of a timing-based task (a simulated pinball-like task), in which they had to activate a robot at the correct moment to successfully hit targets that were presented at random on a computer screen. Participants were randomly divided into two groups, receiving either HG or EA. During the same session, a baseline phase and a retention phase were administered before and after each training, and these phases were compared in order to evaluate the immediate impact of HG and EA on movement timing accuracy. The results showed that HG improved immediate timing accuracy (p = 0.03), but EA did not (p = 0.45). After comparing both trainings, HG was revealed to be superior to EA at improving timing (p = 0.04). Furthermore, a significant correlation was found between the side of the stroke lesion and the change in timing accuracy following EA (r_pb = 0.7, p = 0.001), but not HG (r_pb = 0.18, p = 0.24).
In other words, a deterioration in timing accuracy was found for participants with a lesion in the left hemisphere that had trained with EA. On the other hand, for the participants having a right-sided stroke lesion, an improvement in timing accuracy was noted following EA. In sum, it seems that HG helps improve the immediate timing accuracy for individuals that had a stroke. Still, the side of the stroke lesion seems to play a part in the participants’ response to training. This remains to be further explored, in addition to the impact of providing more training sessions in order to assess any long-term benefits of HG or EA.
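The point-biserial coefficient r_pb reported above is simply a Pearson correlation in which one variable is dichotomous (here, lesion side coded 0/1). A minimal sketch of the computation, with hypothetical data rather than the study's:

```python
import math

def point_biserial(binary, continuous):
    """Point-biserial correlation: Pearson r between a dichotomous variable
    (e.g. lesion side coded 0/1) and a continuous one (e.g. change in
    timing accuracy)."""
    n = len(binary)
    g1 = [c for b, c in zip(binary, continuous) if b == 1]
    g0 = [c for b, c in zip(binary, continuous) if b == 0]
    mean1, mean0 = sum(g1) / len(g1), sum(g0) / len(g0)
    mean_all = sum(continuous) / n
    # Population standard deviation of the continuous variable.
    s = math.sqrt(sum((c - mean_all) ** 2 for c in continuous) / n)
    p, q = len(g1) / n, len(g0) / n
    return (mean1 - mean0) / s * math.sqrt(p * q)
```

The formula is algebraically identical to Pearson's r applied to the 0/1 codes, so it inherits the same [-1, 1] range and significance tests.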
Abstract:
As an emerging innovation paradigm gaining momentum in recent years, the open innovation paradigm calls for greater theoretical depth and more empirical research. This dissertation proposes that open innovation, in the context of open source software sponsorship, may be viewed as a knowledge strategy of the firm. Hence, this dissertation examines the performance determinants of open innovation through the lens of knowledge-based perspectives. Using event study and regression methodologies, it finds that open source software sponsorship events can indeed boost the stock market performance of US public firms. In addition, both the knowledge capabilities of the firms and the knowledge profiles of the open source projects they sponsor matter for performance. In terms of firm knowledge capabilities, internet service firms perform better than other firms owing to their advantageous complementary capabilities. Also, strong knowledge exploitation capabilities of the firm are positively associated with performance. In terms of the knowledge profile of sponsored projects, platform projects perform better than component projects. Also, community-originated projects outperform firm-originated projects. Finally, based on these findings, the dissertation discusses important theoretical implications for the strategic tradeoff between knowledge protection and sharing.
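The event-study logic referred to above — measuring the stock-price impact of an announcement against a market-model expectation — can be sketched as follows. This is a generic illustration, not the dissertation's actual code; the estimation-window and event-window lengths are assumptions:

```python
import numpy as np

def event_study_car(stock_returns, market_returns, event_idx,
                    est_len=120, window=2):
    """Cumulative abnormal return (CAR) around an event, market model.

    stock_returns, market_returns: 1-D arrays of daily returns.
    event_idx: index of the event day.
    est_len: length of the pre-event estimation window (assumed).
    window: half-width of the event window (event_idx +/- window).
    """
    # Fit r_stock = alpha + beta * r_market on the pre-event window.
    est = slice(event_idx - window - est_len, event_idx - window)
    beta, alpha = np.polyfit(market_returns[est], stock_returns[est], 1)

    # Abnormal return = actual return minus market-model expectation.
    ev = slice(event_idx - window, event_idx + window + 1)
    abnormal = stock_returns[ev] - (alpha + beta * market_returns[ev])

    # CAR = sum of abnormal returns over the event window.
    return abnormal.sum()
```

A positive average CAR across sponsorship announcements, tested against its cross-sectional standard error, is the usual evidence that the events "boost" market performance.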
Abstract:
The next generation of vehicles will be equipped with automated Accident Warning Systems (AWSs) capable of warning neighbouring vehicles about hazards that might lead to accidents. The key enabling technology for these systems is the Vehicular Ad-hoc Network (VANET), but the dynamics of such networks make the crucial timely delivery of warning messages challenging. While most previously attempted implementations have used broadcast-based data dissemination schemes, these do not cope well as data traffic load or network density increases. This thesis addresses the problem of sending warning messages in a timely manner by employing a network coding technique. The proposed NETwork COded DissEmination (NETCODE) is a VANET-based AWS responsible for generating and sending warnings to the vehicles on the road. NETCODE offers an XOR-based data dissemination scheme that sends multiple warnings in a single transmission and therefore reduces the total number of transmissions required to send the same number of warnings that broadcast schemes send. Hence, it reduces contention and collisions in the network, improving the delivery time of the warnings. The first part of this research (Chapters 3 and 4) asserts that in order to build a warning system, it is necessary to ascertain the system requirements, the information to be exchanged, and the protocols best suited for communication between vehicles. Therefore, a study of these factors, along with a review of existing proposals identifying their strengths and weaknesses, is carried out. An analysis of existing broadcast-based warning schemes is then conducted, which concludes that although broadcasting is the most straightforward approach, increasing load can cause an effective collapse, resulting in unacceptably long transmission delays.
The second part of this research (Chapter 5) proposes the NETCODE design, including the main contribution of this thesis: a pair of encoding and decoding algorithms that make use of an XOR-based technique to reduce transmission overheads and thus allow warnings to be delivered in time. The final part of this research (Chapters 6--8) evaluates how well the proposed scheme reduces the number of transmissions in the network in response to growing data traffic load and network density, and investigates its capacity to detect potential accidents. The evaluations use a custom-built simulator to model real-world scenarios such as city areas, junctions, roundabouts, and motorways. The study shows that the reduction in the number of transmissions significantly reduces contention in the network, which allows vehicles to deliver warning messages more rapidly to their neighbours. It also examines the relative performance of NETCODE when handling both sudden event-driven and longer-term periodic messages in diverse scenarios under stress caused by increasing numbers of vehicles and transmissions per vehicle. This work confirms the thesis' primary contention that XOR-based network coding provides a potential foundation on which a more efficient AWS data dissemination scheme can be built.
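The XOR principle behind such coded dissemination can be shown with a toy example: two equal-length warning payloads are combined into one coded packet, and a receiver that already holds one payload recovers the other by XORing again. A minimal sketch, not NETCODE's actual algorithms (the payload contents are hypothetical):

```python
def xor_encode(msgs):
    """XOR several equal-length warning payloads into one coded packet."""
    coded = bytes(len(msgs[0]))
    for m in msgs:
        coded = bytes(a ^ b for a, b in zip(coded, m))
    return coded

def xor_decode(coded, known):
    """Recover the one missing payload, given the coded packet and all
    other payloads the receiver already holds (XOR is self-inverse)."""
    return xor_encode([coded] + known)
```

One coded transmission thus replaces two plain broadcasts whenever neighbours each already hold one of the two warnings, which is the source of the contention reduction described above.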
Abstract:
Fault tolerance allows a system to remain operational to some degree when some of its components fail. One of the most common fault tolerance mechanisms consists of logging the system state periodically and recovering the system to a consistent state in the event of a failure. This paper describes a general logging-based fault tolerance mechanism that can be layered over deterministic systems. Our proposal describes how a logging mechanism can recover the underlying system to a consistent state even if an action or set of actions was interrupted mid-way due to a server crash. We also propose different methods of storing the logging information, and describe how to deploy a fault-tolerant master-slave cluster for information replication. We adapt our model to a previously proposed framework that provided common relational features, such as transactions with atomic, consistent, isolated, and durable (ACID) properties, to NoSQL database management systems.
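The log-and-recover idea can be sketched as a minimal redo log: each action is persisted before it is applied, so that after a crash a deterministic system can be driven back to a consistent state by replaying the log from the start. A simplified illustration of the general pattern, not the paper's actual mechanism:

```python
import json
import os

class WriteAheadLog:
    """Minimal redo log: record each action durably before applying it,
    then replay the whole log on restart to rebuild state."""

    def __init__(self, path):
        self.path = path

    def append(self, action):
        # Persist and fsync the action BEFORE it is applied, so an action
        # interrupted mid-way by a crash can be redone on recovery.
        with open(self.path, "a") as f:
            f.write(json.dumps(action) + "\n")
            f.flush()
            os.fsync(f.fileno())

    def replay(self, apply_fn):
        # Re-apply every logged action in order; because the underlying
        # system is deterministic, this reconstructs the pre-crash state.
        if not os.path.exists(self.path):
            return
        with open(self.path) as f:
            for line in f:
                apply_fn(json.loads(line))
```

Replay requires either deterministic, idempotent actions or periodic checkpoints to truncate the log, which is where the paper's choice of storage methods and master-slave replication comes in.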
Abstract:
Abundance and composition of marine benthic communities have been relatively well studied on the SE Brazilian coast, but little is known about the patterns controlling the distribution of their planktonic larval stages. A survey of larval abundance in the continental margin, using a Multi-Plankton Sampler, was conducted along a cross-shelf transect off Cabo Frio (23 degrees S and 42 degrees W) during a coastal upwelling event. Hydrographic conditions were monitored through discrete CTD casts. Chlorophyll-a in the top 100 m of the water column was determined, and changes in surface chlorophyll-a were estimated using SeaWiFS images. Based on the larval abundances and the meso-scale hydrodynamic scenario, our results suggest two different processes affecting larval distributions. High larval densities were found nearshore due to the upwelling event, associated with high chlorophyll-a and a strong alongshore current. On the continental slope, high larval abundance was associated with a clockwise-rotating meander, which may have entrapped larvae from a region located further north (Cabo de Sao Tome, 22 degrees S and 41 degrees W). In mid-shelf areas, our data suggest that vertical migration may occur as a response to avoid offshore transport by upwelling plumes and/or cyclonic meanders. The hydrodynamic scenario observed in the study area has two distinct yet extremely important consequences: larval retention in food-rich upwelling areas and the broadening of the tropical domain to southernmost subtropical areas. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The Homogeneous Charge Compression Ignition (HCCI) engine is a promising combustion concept for reducing NOx and particulate matter (PM) emissions and providing high thermal efficiency in internal combustion engines. This concept, though, has limitations in the areas of combustion control and achieving stable combustion at high loads. For HCCI to be a viable option for on-road vehicles, further understanding of its combustion phenomena and their control is essential. Thus, this thesis focuses both on the experimental setup of an HCCI engine at Michigan Technological University (MTU) and on developing a physical numerical simulation model, called the Sequential Model for Residual Affected HCCI (SMRH), to investigate the performance of HCCI engines. The primary focus is on understanding the effects of intake and exhaust valve timings on HCCI combustion. On the experimental side, this thesis contributed to the development of the HCCI setup at MTU, in particular in the areas of measuring valve profiles and measuring piston-to-valve contact clearance for procuring new pistons for further studies of high geometric compression ratio HCCI engines. It also covers the development and testing of a supercharging station and the setup of an electrical air heater to extend the HCCI operating region. The HCCI engine setup is based on a GM 2.0 L LHU Gen 1 engine, a direct-injected engine with variable valve timing (VVT) capabilities. For the simulation studies, a computationally efficient modeling platform has been developed and validated against experimental data from a single-cylinder HCCI engine. The in-cylinder pressure trace, combustion phasing (CA10, CA50, BD), and the performance metrics IMEP, thermal efficiency, and CO emission are found to be in good agreement with experimental data for different operating conditions. The effects of phasing the intake and exhaust valves are analyzed using SMRH.
In addition, a novel index called Fuel Efficiency and Emissions (FEE) index is defined and is used to determine the optimal valve timings for engine operation through the use of FEE contour maps.
Abstract:
The loss of prestressing force over time influences the long-term deflection of a prestressed concrete element. Prestress losses are inherently complex due to the interaction of concrete creep, concrete shrinkage, and steel relaxation. Implementing advanced materials such as ultra-high performance concrete (UHPC) further complicates the estimation of prestress losses because the material models depend on the curing regime. Past research shows that compressive creep is "locked in" when UHPC cylinders are subjected to thermal treatment before being loaded in compression. However, the current precasting manufacturing process would typically load the element (through prestressing strand release from the prestressing bed) before the element is taken to the curing facility. Members of varying ages are stored until curing can be applied to all of them at once. This research was conducted to determine the impact of variable curing times for UHPC on prestress losses, and hence deflections. Three UHPC beams, a rectangular section, a modified bulb tee section, and a pi-girder, were assessed for losses and deflections using an incremental time-step approach and material models specific to UHPC based on compressive creep and shrinkage testing. Results show that although it is important for prestressed UHPC beams to be thermally treated, to "lock in" material properties, the timing of thermal treatment leads to negligible differences in long-term deflections. Results also show that for UHPC elements that are thermally treated, changes in deflection are caused only by external loads, because prestress losses are "locked in" following thermal treatment.
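The incremental time-step approach can be illustrated with a deliberately simplified loss tally: at each step, creep and shrinkage strain increments shorten the strand, and relaxation removes a fraction of the remaining force. All quantities and the model form below are hypothetical simplifications (elastic interaction between the loss components is omitted), not the study's UHPC-specific material models:

```python
def prestress_history(P0, Ap, Ep, increments):
    """Toy incremental time-step tally of prestress losses.

    P0: initial prestress force (kN); Ap: strand area (mm^2);
    Ep: strand elastic modulus (MPa).
    increments: per-step tuples (creep_strain, shrinkage_strain,
    relaxation_fraction). Returns the force after each step.
    """
    P, history = P0, []
    for d_creep, d_shrink, d_relax in increments:
        # Creep and shrinkage strain increments shorten the strand,
        # reducing its stress: dP = d_eps * Ep * Ap (N -> kN).
        P -= (d_creep + d_shrink) * Ep * Ap / 1000.0
        # Relaxation taken as a fraction of the current force.
        P -= d_relax * P
        history.append(P)
    return history
```

In this framing, "locking in" creep after thermal treatment corresponds to setting the creep increments to zero for all subsequent steps, which is why post-treatment deflection changes come from external loads alone.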
Abstract:
Event extraction from texts aims to detect structured information such as what has happened, to whom, where and when. Event extraction and visualization are typically considered as two different tasks. In this paper, we propose a novel approach based on probabilistic modelling to jointly extract and visualize events from tweets where both tasks benefit from each other. We model each event as a joint distribution over named entities, a date, a location and event-related keywords. Moreover, both tweets and event instances are associated with coordinates in the visualization space. The manifold assumption that the intrinsic geometry of tweets is a low-rank, non-linear manifold within the high-dimensional space is incorporated into the learning framework using a regularization. Experimental results show that the proposed approach can effectively deal with both event extraction and visualization and performs remarkably better than both the state-of-the-art event extraction method and a pipeline approach for event extraction and visualization.
Abstract:
Problem: formal volunteers in volunteer-based organizations drop out at a fast pace for many reasons, such as lack of interest in what they are doing, conflict among volunteers, lack of motivation, and job dissatisfaction due to prolonged volunteering. This impairs the functioning of these organizations, to the point where they find it difficult to operate properly. The author addresses this issue of formal volunteer drop-out. Purpose: the purpose of this study is to explore the factors that help retain formal volunteers in a volunteer-based organization for a longer period. Method: the research is qualitative, with primary data collected through participant observation and open interviews in two voluntary organizations; the collected data were analyzed using content analysis. Secondary data were collected in the form of relevant documents provided by the participating organizations. Results: many factors were found to influence volunteer retention, namely job satisfaction, motivation, public service motivation, organizational commitment, mission attachment, workload, relationships with coworkers, organizational justice, flexible timing, and training and orientation. Conclusions: recommendations to improve retention are given and a future model is proposed. The results of this research can be generalized to other small-scale volunteer organizations whose workforce consists mainly of formal volunteers.
Abstract:
Strong convective events can produce extreme precipitation, hail, lightning, or gusts, potentially inducing severe socio-economic impacts. These events have a relatively small spatial extent and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events based on large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and high scale parameter value. A radius of 90 km around a station reporting a convective situation turned out to be suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability of occurrence of extreme convective events is estimated from the KO-index, indicating stability, and the relative humidity at 1000 hPa. Both variables are computed from the ERA-Interim reanalysis. In a first version of the methodology, these two variables are applied to estimate the spatial rainfall distribution and thus the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, in lightning measurements, and in severe weather reports. In order to take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented. While generally higher instability is found in the vicinity of fronts, the skill of this approach is largely unchanged. Additional improvements were achieved by a bias correction and the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events.
Two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions in the period 1979-2014, which were initialized every year. It is shown that decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after the initialization.
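Fitting the Weibull shape and scale parameters to a sample of rainfall values, as the methodology above requires, can be done by maximum likelihood. A minimal sketch using bisection on the standard profile-likelihood equation for the shape parameter (the study's actual estimation procedure may differ):

```python
import math
import random

def fit_weibull(data, lo=0.05, hi=20.0, iters=80):
    """Maximum-likelihood Weibull fit.

    Solves the profile-likelihood equation for the shape k:
        sum(x^k ln x) / sum(x^k) - 1/k - mean(ln x) = 0
    (monotonically increasing in k) by bisection, then recovers the
    scale as lam = (mean(x^k))^(1/k). Returns (k, lam).
    """
    logs = [math.log(x) for x in data]
    mean_log = sum(logs) / len(data)

    def g(k):
        xk = [x ** k for x in data]
        return sum(v * l for v, l in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
    return k, lam
```

A "low shape, high scale" fit, in this parameterization, corresponds to a heavy-tailed rainfall distribution: a small k with a large lam puts substantial probability on extreme rain rates.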
Abstract:
We propose a method, denoted the synthetic portfolio, for event studies in market microstructure that is particularly interesting to use with high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that for Colombian stock market securities, the asynchronicity of intraday data restricts the analysis to a selected group of stocks; however, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism is not enhancing the price discovery process.
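The synthetic-portfolio construction — choosing donor-asset weights that best track the target's pre-event returns — can be framed as least squares constrained to the probability simplex, as in standard SCM. A generic sketch via projected gradient descent, not the authors' implementation:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {w : w >= 0, sum(w) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u - css / idx > 0)[0][-1]
    theta = css[rho] / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def synthetic_weights(donors, target, iters=5000):
    """Donor-pool weights (non-negative, summing to 1) minimizing the
    pre-event tracking error ||donors @ w - target||^2 over the simplex,
    by projected gradient descent.

    donors: (T, n) matrix of donor returns; target: (T,) target returns.
    """
    n = donors.shape[1]
    w = np.full(n, 1.0 / n)
    # Step size 1/L, with L the Lipschitz constant of the gradient.
    step = 1.0 / (2.0 * np.linalg.norm(donors, 2) ** 2)
    for _ in range(iters):
        grad = 2.0 * donors.T @ (donors @ w - target)
        w = project_simplex(w - step * grad)
    return w
```

Post-event, the gap between the target's returns and the synthetic portfolio's returns serves as the estimated effect of the volatility call auction.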
Abstract:
Glyphosate-based herbicides (GBHs) are the most globally used herbicides, raising the risk of environmental exposure. Carcinogenic effects are only one component of the multiple adverse health effects of glyphosate and GBHs that have been reported. Questions related to the hazards, and corresponding risks, identified in relation to endocrine-disrupting effects are rising. The present study investigated the possible reproductive/developmental toxicity of GBHs administered to male and female Sprague-Dawley rats under various treatment schedules. Assessments included the maternal and reproductive outcomes of F0 and F1 dams exposed to GBHs throughout pregnancy and lactation, and the developmental landmarks and sexual characteristics of the offspring. The study was designed in two stages. In the first stage, glyphosate, or its commercial formulation Roundup Bioflow, was administered to rats at a dose of 1.75 mg/kg bw/day (the glyphosate US Acceptable Daily Intake) from the prenatal period until adulthood. In the second stage, multiple toxicological parameters were simultaneously assessed, including the multigeneration reproductive/developmental toxicity of glyphosate and two GBHs (Roundup Bioflow and Ranger Pro). Human-equivalent doses, ranging from 0.5 mg/kg bw/day (ADI Europe) up to 50 mg/kg bw/day (NOAEL glyphosate), were administered to male and female rats, covering specific windows of biological susceptibility.
The results of stage 1 and preliminary data from the stage 2 experiments characterize GBHs as probable endocrine disruptors, as suggested by: 1) androgen-like effects of Roundup Bioflow, including a significant increase in anogenital distance in both males and females, a delay of first estrous, and increased testosterone in females; 2) slightly earlier onset of puberty in the high-dose Ranger Pro group, observed in the F1 generation treated from in utero life until adulthood; 3) delayed achievement of balano-preputial separation in high-dose Ranger Pro-treated males exposed only during the peri-pubertal period, indicating a direct and specific effect of GBHs depending on the timing of exposure.
Abstract:
The aim of this thesis was to quantify experimentally, in the field, the effects of different timing regimes of hypoxia on the structure of benthic communities in a transitional habitat. The experiment was performed from 8 July to 29 July 2019 in a shallow subtidal area in Pialassa Baiona (Italy), a lagoon characterized by mixing regimes dominated by the tide. The benthic community was isolated using cylinders of 15.5 cm x 20 cm. Hypoxic conditions were imposed by covering the treated cylinders with a black plastic bag, while control cylinders were left uncovered. We created 4 different timing regimes of hypoxia by manipulating both the duration of hypoxia (4 or 8 days) and the ratio between the duration of subsequent periods of hypoxia and the duration of a normoxic period between subsequent hypoxic events (D4R3/2, D8R3/2). At the end of each experimental trial, the benthic communities within each pot were retrieved, sieved in the field, and subsequently analyzed in the laboratory, where organisms were identified and counted. Results showed that benthic organisms were generally negatively affected by hypoxic stress events. As expected, longer hypoxic events caused a stronger decrease in benthic community abundance. When the hypoxic events were interrupted by a normoxic event, there were two different outcomes. If the hypoxic period was too long, the normoxic period did not produce a positive recovery effect, and further decline of the benthic community was observed. Conversely, normoxia had positive effects if the period of hypoxia was short enough not to compromise the benthic community. This resulted in a statistically significant interaction between the tested factors Duration and Ratio. Amphipods were the most sensitive organisms to hypoxia. We conclude that the effects of hypoxia can be greatly relieved by short normoxic periods if they occur frequently enough.