986 results for Validated Interval Software


Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: Compensatory responses may attenuate the effectiveness of exercise training in weight management. The aim of this study was to compare the effect of moderate- and high-intensity interval training on eating behavior compensation. METHODS: Using a crossover design, 10 overweight and obese men participated in 4 weeks of moderate- (MIIT) and high-intensity (HIIT) interval training. MIIT consisted of 5-min cycling stages at ±20% of the mechanical work at 45%VO2peak, and HIIT consisted of alternating 30-s work bouts at 90%VO2peak and 30-s rests, for 30 to 45 min. Assessments included a constant-load exercise test at 45%VO2peak for 45 min followed by 60-min recovery. Appetite sensations were measured during the exercise test using a Visual Analog Scale. Food preferences (liking and wanting) were assessed using a computer-based paradigm with 20 photographic food stimuli varying along two dimensions, fat (high or low) and taste (sweet or nonsweet). An ad libitum test meal was provided after the constant-load exercise test. RESULTS: Exercise-induced hunger and desire to eat decreased after HIIT, and the difference between MIIT and HIIT in desire to eat approached significance (p = .07). Exercise-induced liking for high-fat nonsweet food tended to increase after MIIT and decrease after HIIT (p = .09). Fat intake decreased by 16% after HIIT and increased by 38% after MIIT, with the difference between MIIT and HIIT approaching significance (p = .07). CONCLUSIONS: This study provides evidence that energy intake compensation differs between MIIT and HIIT.
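
As an illustration only, the two interval protocols can be written down as work-rate schedules; the helper functions and the power outputs assumed below for 45%VO2peak and 90%VO2peak are hypothetical, not values from the study.

```python
def miit_schedule(p45: float, duration_min: int = 30, stage_min: int = 5) -> list:
    """Alternate 5-min stages at -20% and +20% of the work rate at 45%VO2peak."""
    stages = []
    for i in range(duration_min // stage_min):
        factor = 0.8 if i % 2 == 0 else 1.2
        stages.extend([p45 * factor] * stage_min)      # one entry per minute
    return stages

def hiit_schedule(p90: float, duration_min: int = 30) -> list:
    """Alternate 30-s work at 90%VO2peak with 30-s rest, one entry per 30-s epoch."""
    return [p90 if i % 2 == 0 else 0.0 for i in range(duration_min * 2)]

# Example: assume (hypothetically) 100 W at 45%VO2peak and 200 W at 90%VO2peak.
print(miit_schedule(100.0)[:10])   # first 10 min of MIIT targets
print(hiit_schedule(200.0)[:10])   # first 5 min of HIIT epochs
```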

Relevance:

20.00%

Publisher:

Abstract:

Background: Transthoracic echocardiography (TTE) during extracorporeal membrane oxygenation (ECMO) is important but can be technically challenging. Contrast-specific TTE can improve imaging in suboptimal studies, although the contrast microspheres are hydrodynamically labile structures. This study assessed the feasibility of contrast echocardiography (CE) during venovenous (VV) ECMO in a validated ovine model. Method: Twenty-four sheep were commenced on VV ECMO. Parasternal long-axis (Plax) and short-axis (Psax) views were obtained pre- and postcontrast while on VV ECMO. Endocardial definition scores (EDS) per segment were graded: 1 = good, 2 = suboptimal, 3 = not seen. An endocardial border definition score index (EBDSI) was calculated for each view. Endocardial length (EL) in the Plax view was measured for the left ventricle (LV) and right ventricle (RV). Results: Summation EDS data for the LV and RV for unenhanced (UE) versus CE TTE imaging were: EDS 1 = 289 versus 346, EDS 2 = 38 versus 10, EDS 3 = 33 versus 4, respectively. Wilcoxon matched-pairs signed-rank tests showed a significant ranking difference (improvement) pre- and postcontrast for the LV (P < 0.0001), RV (P < 0.0001) and combined ventricular data (P < 0.0001). EBDSI for CE TTE was significantly lower than for UE TTE for the LV (1.05 ± 0.17 vs. 1.22 ± 0.38, P = 0.0004) and RV (1.06 ± 0.22 vs. 1.42 ± 0.47, P = 0.0006), respectively. Visualized EL was significantly longer in CE versus UE for both the LV (58.6 ± 11.0 mm vs. 47.4 ± 11.7 mm, P < 0.0001) and the RV (52.3 ± 8.6 mm vs. 36.0 ± 13.1 mm, P < 0.0001), respectively. Conclusions: Despite exposure to destructive hydrodynamic forces, CE is a feasible technique in an ovine ECMO model. CE results in significantly improved EDS and increased EL.
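
The pre-/postcontrast comparison of scores is a paired, ordinal problem, which is why Wilcoxon signed-rank tests are used. A minimal sketch with scipy, using made-up paired score indices rather than the study's data:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-animal EBDSI values (lower = better endocardial definition).
unenhanced = np.array([1.4, 1.2, 1.5, 1.1, 1.3, 1.6, 1.2, 1.4])
contrast   = np.array([1.1, 1.0, 1.2, 1.0, 1.1, 1.2, 1.0, 1.1])

# Paired, non-parametric comparison of pre- vs postcontrast scores.
stat, p_value = wilcoxon(unenhanced, contrast)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.4f}")
```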

Relevance:

20.00%

Publisher:

Abstract:

Metabolic imaging using positron emission tomography (PET) has found increasing clinical use for the management of infiltrating tumours such as glioma. However, the heterogeneous biological nature of tumours and intrinsic treatment resistance in some regions mean that knowledge of multiple biological factors is needed for effective treatment planning; for example, 18F-FDOPA can be used to identify infiltrative tumour and 18F-FMISO to localize hypoxic regions. Performing multiple PET acquisitions is impractical in many clinical settings, but previous studies suggest multiplexed PET imaging could be viable. The fidelity of the two signals is affected by the injection interval, scan timing and injected dose. The contribution of this work is a framework that explicitly trades off signal fidelity against logistical constraints when designing the imaging protocol. The particular case of estimating 18F-FMISO from a single frame acquired prior to injection of 18F-FDOPA is considered. Theoretical experiments using simulations of typical biological scenarios in humans demonstrate that results comparable to a pair of single-tracer acquisitions can be obtained provided protocol timings are carefully selected. These results were validated using a preclinical data set that was synthetically multiplexed. The results indicate that dual acquisition of 18F-FMISO and 18F-FDOPA could be feasible in the clinical setting. The proposed framework could also be used to design protocols for other tracers.
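
Both tracers here are labelled with fluorine-18 (physical half-life about 109.8 min), so one component of any timing trade-off is how much residual activity from the first injection remains in later frames. The sketch below is a toy grid search over injection intervals with a placeholder objective; it only illustrates the kind of trade-off the framework formalizes and is not the paper's actual fidelity model.

```python
import numpy as np

HALF_LIFE_F18 = 109.8  # minutes, physical half-life of fluorine-18

def residual_fraction(delay_min: float) -> float:
    """Fraction of the first tracer's activity remaining after a delay."""
    return 2.0 ** (-delay_min / HALF_LIFE_F18)

def protocol_score(delay_min: float, contamination_weight: float = 5.0) -> float:
    """Toy objective: penalize residual first-tracer signal and total scan time."""
    contamination = residual_fraction(delay_min)   # hurts second-tracer fidelity
    duration_cost = delay_min / 240.0              # normalized logistical cost
    return -(contamination_weight * contamination + duration_cost)

delays = np.arange(10, 181, 10)                    # candidate injection intervals (min)
best = max(delays, key=protocol_score)
print(f"Best delay under this toy model: {best} min")
```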

Relevance:

20.00%

Publisher:

Abstract:

Objective: The aim of this study was to develop a model capable of predicting variability in the mental workload experienced by frontline operators under routine and nonroutine conditions. Background: Excess workload is a risk that needs to be managed in safety-critical industries. Predictive models are needed to manage this risk effectively yet are difficult to develop. Much of the difficulty stems from the fact that workload prediction is a multilevel problem. Method: A multilevel workload model was developed in Study 1 with data collected from an en route air traffic management center. Dynamic density metrics were used to predict variability in workload within and between work units while controlling for variability among raters. The model was cross-validated in Studies 2 and 3 with the use of a high-fidelity simulator. Results: Reported workload generally remained within the bounds of the 90% prediction interval in Studies 2 and 3. Workload crossed the upper bound of the prediction interval only under nonroutine conditions. Qualitative analyses suggest that nonroutine events caused workload to cross the upper bound of the prediction interval because the controllers could not manage their workload strategically. Conclusion: The model performed well under both routine and nonroutine conditions and over different patterns of workload variation. Application: Workload prediction models can be used to support both strategic and tactical workload management. Strategic uses include the analysis of historical and projected workflows and the assessment of staffing needs. Tactical uses include the dynamic reallocation of resources to meet changes in demand.
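
A simplified, single-level sketch of how a 90% prediction interval for reported workload can be derived from traffic (dynamic density) metrics and used to flag exceedances; the study's model is multilevel and controls for rater variability, which this illustration omits, and all data below are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical training data: two dynamic density metrics and a workload rating.
X = rng.uniform(0, 10, size=(200, 2))
y = 1.0 + 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 1.0, 200)

model = sm.OLS(y, sm.add_constant(X)).fit()

# 90% prediction interval for a new traffic situation.
x_new = sm.add_constant(np.array([[7.5, 4.0]]), has_constant="add")
frame = model.get_prediction(x_new).summary_frame(alpha=0.10)
lower, upper = frame["obs_ci_lower"].iloc[0], frame["obs_ci_upper"].iloc[0]

reported = 8.2  # hypothetical workload report during a nonroutine event
print(f"90% PI: [{lower:.2f}, {upper:.2f}]; exceeds upper bound: {reported > upper}")
```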

Relevance:

20.00%

Publisher:

Abstract:

Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. More accurate software, however, requires sequence alignment or complex calculations that are time-consuming for large data sets, either during preprocessing or during the search stage. It is therefore imperative to develop a practical program for accurate and scalable species identification in DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is used to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform can deal with both large-scale and multilocus barcoding data accurately and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
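
A compact sketch of the two-stage idea: an alignment-free k-mer composition vector screen shortlists candidate references, then Kimura two-parameter (K2P) distances pick the nearest neighbour within the shortlist. Real barcodes must be aligned before computing K2P; here the sequences are assumed equal-length and pre-aligned, and the reference set is fabricated.

```python
import math
from collections import Counter

def composition_vector(seq: str, k: int = 3) -> Counter:
    """Alignment-free k-mer frequency vector."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[kmer] * b[kmer] for kmer in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def k2p(s1: str, s2: str) -> float:
    """Kimura two-parameter distance for pre-aligned, equal-length sequences."""
    transitions = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}
    n = len(s1)
    p = sum((a, b) in transitions for a, b in zip(s1, s2)) / n        # transition proportion
    q = sum(a != b and (a, b) not in transitions for a, b in zip(s1, s2)) / n  # transversions
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))

def identify(query: str, reference: dict, shortlist_size: int = 2) -> str:
    qv = composition_vector(query)
    # Stage 1: alignment-free screen of the reference database.
    shortlist = sorted(reference,
                       key=lambda name: cosine(qv, composition_vector(reference[name])),
                       reverse=True)[:shortlist_size]
    # Stage 2: alignment-based K2P nearest neighbour within the shortlist.
    return min(shortlist, key=lambda name: k2p(query, reference[name]))

reference = {  # hypothetical pre-aligned barcodes
    "species_A": "ACGTACGTACGTACGTACGT",
    "species_B": "ACGTACGAACGTTCGAACGT",
    "species_C": "TTGTACGTACGAACGTACGA",
}
print(identify("ACGTACGTACGTTCGTACGT", reference))
```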

Relevance:

20.00%

Publisher:

Abstract:

This study compared the fat oxidation rate from a graded exercise test (GXT) with that from a moderate-intensity interval training session (MIIT) in obese men. Twelve sedentary obese males (age 29 ± 4.1 years; BMI 29.1 ± 2.4 kg·m-2; fat mass 31.7 ± 4.4% of body mass) completed two exercise sessions: a GXT to determine maximal fat oxidation (MFO) and maximal aerobic power (VO2max), and an interval cycling session during which respiratory gases were measured. The 30-min MIIT involved 5-min repetitions of workloads 20% below and 20% above the MFO intensity. VO2max was 31.8 ± 5.5 ml·kg-1·min-1 and all participants achieved ≥3 of the designated VO2max test criteria. The MFO identified during the GXT was not significantly different from the average fat oxidation rate in the MIIT session. During the MIIT session, fat oxidation rate increased with time; the highest rate (0.18 ± 0.11 g·min-1), at minute 25, was significantly higher than the rates at minutes 5 and 15 (p ≤ 0.01 and p ≤ 0.05, respectively). In this cohort with low aerobic fitness, fat oxidation during the MIIT session was comparable with the MFO determined during a GXT. Future research may consider whether the varying workload in moderate-intensity interval training helps adherence to exercise without compromising fat oxidation.
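
The abstract does not state how fat oxidation was derived from the respiratory gases; one common choice is Frayn's stoichiometric equations, sketched here with hypothetical VO2 and VCO2 values and protein oxidation assumed negligible.

```python
def fat_oxidation_g_per_min(vo2_l_min: float, vco2_l_min: float) -> float:
    """Frayn (1983): fat oxidation (g/min) = 1.67*VO2 - 1.67*VCO2, protein neglected."""
    return 1.67 * vo2_l_min - 1.67 * vco2_l_min

def cho_oxidation_g_per_min(vo2_l_min: float, vco2_l_min: float) -> float:
    """Frayn (1983): carbohydrate oxidation (g/min) = 4.55*VCO2 - 3.21*VO2."""
    return 4.55 * vco2_l_min - 3.21 * vo2_l_min

# Hypothetical steady-state gas exchange during a moderate-intensity interval.
vo2, vco2 = 1.80, 1.70  # L/min
print(f"Fat: {fat_oxidation_g_per_min(vo2, vco2):.2f} g/min, "
      f"CHO: {cho_oxidation_g_per_min(vo2, vco2):.2f} g/min")
```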

Relevance:

20.00%

Publisher:

Abstract:

In order to progress beyond currently available medical devices and implants, the concept of tissue engineering has moved into the centre of biomedical research worldwide. The aim of this approach is not to replace damaged tissue with an implant or device but rather to prompt the patient's own tissue to mount a regenerative response, using a tissue-engineered construct to assemble new functional and healthy tissue. More recently, it has been suggested that combining Synthetic Biology with translational tissue-engineering techniques could advance personalized medicine, not only from a regenerative medicine perspective but also by providing frontier technologies for building and transforming the research landscape of in vitro and in vivo disease models.

Relevance:

20.00%

Publisher:

Abstract:

Objectives. To confirm the association of a functional single-nucleotide polymorphism (SNP), C1858T (rs2476601), in the PTPN22 gene with rheumatoid arthritis (RA) in British Caucasian patients and to evaluate its influence on the RA phenotype. Methods. A total of 686 RA patients and 566 healthy volunteers, all of British Caucasian origin, were genotyped for the C1858T polymorphism by PCR-restriction fragment length polymorphism assay. Data were analysed using SPSS software and the χ2 test as applicable. Results. The PTPN22 1858T risk allele was more prevalent in the RA patients (13.9%) than in the healthy controls (10.3%) (P = 0.008, odds ratio 1.4, 95% confidence interval 1.09-1.79). The association of the T allele was restricted to those with rheumatoid factor (RF)-positive disease (n = 524, 76.4%) (P = 0.004, odds ratio 1.5, 95% confidence interval 1.1-1.9). We found no association between PTPN22 and the presence of the HLA-DRB1 shared epitope or clinical characteristics. Conclusions. We confirmed the previously reported association of PTPN22 with RF-positive RA, which was independent of the HLA-DRB1 genotype.
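
A small sketch of the allele-level odds ratio and 95% confidence interval calculation reported above; the 2x2 allele counts are approximate reconstructions from the stated frequencies, not the published genotype table.

```python
import math

# Approximate 1858T / 1858C allele counts (two alleles per subject).
t_cases, c_cases = 191, 1181        # ~13.9% of 1372 RA alleles
t_controls, c_controls = 117, 1015  # ~10.3% of 1132 control alleles

odds_ratio = (t_cases * c_controls) / (c_cases * t_controls)
se_log_or = math.sqrt(1 / t_cases + 1 / c_cases + 1 / t_controls + 1 / c_controls)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {lower:.2f}-{upper:.2f}")
```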

Relevance:

20.00%

Publisher:

Abstract:

This research explored how small and medium enterprises can achieve success with software as a service (SaaS) applications from the cloud. Based on an empirical investigation of six growth-oriented, early-technology-adopting small and medium enterprises, the study proposes a model of SaaS success for small and medium enterprises with two variants: one for basic and one for advanced benefits. The basic model explains the effective use of SaaS for achieving informational and transactional benefits. The advanced model explains the enhanced use of SaaS for achieving strategic and transformational benefits. Both models explicate the information systems capabilities and organizational complementarities needed to achieve success with SaaS.

Relevance:

20.00%

Publisher:

Abstract:

The chemical composition of rainwater changes from the sea towards inland under the influence of several major factors: the topographic location of the area, its distance from the sea and the annual rainfall. A model is developed here to quantify the variation in precipitation chemistry as a function of inland distance and rainfall amount. Various sites in India categorized as 'urban', 'suburban' and 'rural' were considered for model development. pH, HCO3, NO3 and Mg do not change much from the coast to inland, while changes in SO4 and Ca are subject to local emissions. Cl and Na originate solely from sea salinity and are the chemistry parameters in the model. Non-linear multiple regressions performed for the various categories revealed that both rainfall amount and precipitation chemistry obey a power-law reduction with distance from the sea. Cl and Na decrease rapidly over the first 100 km from the sea, decrease marginally over the next 100 km, and later stabilize. Regression parameters estimated for the different cases were found to be consistent (R² ≈ 0.8). Variation in one of the parameters accounted for urbanization. The model was validated using data points from the southern peninsular region of the country; estimates were within the 99.9% confidence interval. Finally, the relationship between the three parameters (rainfall amount, coastline distance and concentration in terms of Cl and Na) was validated with experiments conducted in a small experimental watershed in south-west India. Chemistry estimated using the model correlated well with observed values, with a relative error of about 5%. Monthly variation in the chemistry was predicted from a downscaling model and compared with the observed data. The model developed for rain chemistry is therefore useful for estimating concentrations at different spatio-temporal scales and is especially applicable to the south-west region of India.
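
The abstract does not give the regression equation explicitly; one plausible reading is a power-law decay of Cl (or Na) concentration with coastline distance and rainfall amount, which could be fitted by non-linear regression as sketched below on synthetic data.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(X, a, b, c):
    """Assumed model: concentration = a * distance^(-b) * rainfall^(-c)."""
    distance_km, rainfall_mm = X
    return a * distance_km ** (-b) * rainfall_mm ** (-c)

rng = np.random.default_rng(1)
distance = rng.uniform(5, 500, 80)       # km from the coast
rainfall = rng.uniform(500, 3000, 80)    # annual rainfall, mm
cl_true = power_law((distance, rainfall), 120.0, 0.6, 0.2)
cl_obs = cl_true * rng.normal(1.0, 0.05, 80)   # ~5% multiplicative noise

params, _ = curve_fit(power_law, (distance, rainfall), cl_obs, p0=(100.0, 0.5, 0.1))
print("Fitted a, b, c:", np.round(params, 3))
```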

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents a novel approach to building large-scale agent-based models of networked physical systems, using a compositional approach to provide extensibility and flexibility in building the models and simulations. A software framework (MODAM - MODular Agent-based Model) was implemented for this purpose and validated through simulations. These simulations allow assessment of the impact of technological change on the electricity distribution network by examining the trajectories of electricity consumption at key locations over many years.
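
The abstract does not describe MODAM's API; the sketch below merely illustrates the compositional idea with hypothetical household agents assembled from pluggable behaviour modules, whose demand is aggregated at a feeder over a simulated day.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# A behaviour module maps an hour-of-day to a net load contribution in kW.
Module = Callable[[int], float]

def base_load(hour: int) -> float:
    return 0.8 if 7 <= hour <= 22 else 0.3          # kW, flat daytime demand

def air_conditioning(hour: int) -> float:
    return 1.5 if 14 <= hour <= 20 else 0.0         # kW, afternoon peak

def rooftop_pv(hour: int) -> float:
    return -2.0 if 10 <= hour <= 15 else 0.0        # kW, midday generation

@dataclass
class HouseholdAgent:
    modules: List[Module] = field(default_factory=list)
    def demand(self, hour: int) -> float:
        return sum(m(hour) for m in self.modules)

# Compose agents from different module mixes and aggregate at the feeder.
feeder = [HouseholdAgent([base_load]),
          HouseholdAgent([base_load, air_conditioning]),
          HouseholdAgent([base_load, air_conditioning, rooftop_pv])]

profile = [sum(agent.demand(h) for agent in feeder) for h in range(24)]
print("Feeder peak demand (kW):", max(profile))
```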

Relevance:

20.00%

Publisher:

Abstract:

Bug fixing is a highly cooperative work activity in which developers, testers, product managers and other stakeholders collaborate using a bug tracking system. In the context of Global Software Development (GSD), where software development is distributed across different geographical locations, we focus on understanding the role of bug trackers in supporting software bug fixing activities. We carried out small-scale ethnographic fieldwork in a software product team distributed between Finland and India at a multinational engineering company. Using semi-structured interviews and in-situ observations of 16 bug cases, we show that the bug tracker (1) supported the information needs of different stakeholders, (2) established common ground, and (3) reinforced issues related to ownership, performance and power. Consequently, we provide implications for design around these findings.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model that helps clarify this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software "bugs", the failure history of the software system in the various phases of its life cycle, the reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process have all been considered in varying degrees of detail. We also discuss the notion of software fault tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
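
As one example of the kind of reliability growth model this survey covers (the specific models discussed are not named in the abstract), a Goel-Okumoto style fit takes the expected cumulative number of failures as mu(t) = a(1 - exp(-b*t)), so a - mu(T) estimates the faults remaining after T units of testing. The failure counts below are fabricated for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Expected cumulative failures by time t: mu(t) = a * (1 - exp(-b * t))."""
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 13)
cumulative_failures = np.array([8, 15, 21, 26, 30, 33, 36, 38, 40, 41, 42, 43])  # fabricated

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cumulative_failures, p0=(50.0, 0.1))
remaining = a_hat - goel_okumoto(weeks[-1], a_hat, b_hat)
print(f"Estimated total faults: {a_hat:.1f}, estimated faults remaining: {remaining:.1f}")
```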

Relevance:

20.00%

Publisher:

Abstract:

Poor pharmacokinetics is one of the reasons for the withdrawal of drug candidates from clinical trials. There is an urgent need for investigating in vitro ADME (absorption, distribution, metabolism and excretion) properties and recognising unsuitable drug candidates as early as possible in the drug development process. Current throughput of in vitro ADME profiling is insufficient because effective new synthesis techniques, such as drug design in silico and combinatorial synthesis, have vastly increased the number of drug candidates. Assay technologies for larger sets of compounds than are currently feasible are critically needed. The first part of this work focused on the evaluation of the cocktail strategy in studies of drug permeability and metabolic stability. N-in-one liquid chromatography-tandem mass spectrometry (LC/MS/MS) methods were developed and validated for the multiple-component analysis of samples in cocktail experiments. Together, cocktail dosing and LC/MS/MS were found to form an effective tool for increasing throughput. First, cocktail dosing, i.e. the use of a mixture of many test compounds, was applied in permeability experiments with the Caco-2 cell culture, a widely used in vitro model of small intestinal absorption. A cocktail of 7-10 reference compounds was successfully evaluated for standardization and routine testing of the performance of Caco-2 cell cultures. Secondly, the cocktail strategy was used in metabolic stability studies of drugs with UGT isoenzymes, which are among the most important phase II drug-metabolizing enzymes. The study confirmed that the determination of intrinsic clearance (Clint) as a cocktail of seven substrates is possible. The LC/MS/MS methods that were developed were fast and reliable for the quantitative analysis of a heterogeneous set of drugs from Caco-2 permeability experiments and of the glucuronides from in vitro stability experiments. The performance of a new ionization technique, atmospheric pressure photoionization (APPI), was evaluated through comparison with electrospray ionization (ESI); both techniques were used for the analysis of Caco-2 samples. Like ESI, APPI proved to be a reliable technique for the analysis of Caco-2 samples and even more flexible than ESI because of its wider linear dynamic range. The second part of the experimental study focused on metabolite profiling. Different mass spectrometric instruments and commercially available software tools were investigated for profiling metabolites in urine and hepatocyte samples. All the instruments tested (triple quadrupole, quadrupole time-of-flight, ion trap) exhibited some good and some bad features in searching for and identifying expected and unexpected metabolites. Although current profiling software is helpful, it is still insufficient. Thus a time-consuming, largely manual approach is still required for metabolite profiling from complex biological matrices.
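
The abstract does not state how intrinsic clearance was calculated; a common in vitro approach is the substrate-depletion (half-life) method, sketched here with hypothetical incubation data. The elimination rate constant comes from the slope of ln(% substrate remaining) versus time, and Clint is scaled by the incubation volume per amount of protein.

```python
import numpy as np

# Hypothetical substrate depletion in a microsomal/UGT incubation.
time_min = np.array([0, 5, 10, 20, 30, 45])
pct_remaining = np.array([100.0, 82.0, 66.0, 45.0, 30.0, 17.0])

# First-order elimination rate constant from ln(% remaining) vs time.
slope, _ = np.polyfit(time_min, np.log(pct_remaining), 1)
k = -slope                                  # 1/min
half_life = np.log(2) / k                   # min

incubation_volume_ml = 0.5
protein_mg = 0.25
clint = k * incubation_volume_ml / protein_mg * 1000.0   # µL/min/mg protein

print(f"t1/2 = {half_life:.1f} min, Clint = {clint:.0f} µL/min/mg protein")
```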

Relevance:

20.00%

Publisher:

Abstract:

The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, realized by a system of software filters and random number generators. The model represents neither the neurological mechanisms responsible for generating the EEG nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid 'partial' statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user; the same selected parameters always produce the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the 'stationary EEG'. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given. The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
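
A simplified sketch of the filtered-noise approach described above (not Zetterberg's exact parameterization): independent Gaussian random number sequences are passed through band-pass filters for the delta, alpha and beta bands, each output is scaled to a user-selected share of total power, and the components are summed into a 'stationary EEG'.

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 128                     # sampling rate, Hz
DURATION_S = 25              # as in the 25 s simulated record
BANDS = {"delta": (0.5, 4.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}
TARGET_POWER = {"delta": 0.5, "alpha": 0.35, "beta": 0.15}   # user-selected shares

rng = np.random.default_rng(42)
n = FS * DURATION_S
eeg = np.zeros(n)

for band, (lo, hi) in BANDS.items():
    # Rational transfer function realized as a band-pass IIR filter on white noise.
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    component = lfilter(b, a, rng.normal(0.0, 1.0, n))
    # Scale the component so its variance matches the requested power share.
    component *= np.sqrt(TARGET_POWER[band] / np.var(component))
    eeg += component

print(f"Simulated EEG: {n} samples, variance = {np.var(eeg):.2f}")
```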