954 results for "lean implementation time"


Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Robotic surgery is a further step towards new potential developments in minimally invasive surgery. Surgeons must keep abreast of these new technologies and learn their limits and possibilities. Robot-assisted laparoscopic cholecystectomy had not yet been performed in our institution. The purpose of this report is to present the pathway of implementation of robotic laparoscopic cholecystectomy in a university hospital. METHODS: The Zeus® robotic system was used. Experimental training was performed on animals, and the results of this training allowed us to perform our first two clinical cases. RESULTS: Robot arm set-up and trocar placement required 53 and 35 minutes, respectively. Operative times were 59 and 45 minutes, and overall operative times were 112 and 80 minutes, respectively. There were no intraoperative complications. Patients were discharged from the hospital after an overnight stay. CONCLUSION: Robotic laparoscopic cholecystectomy is safe, and patient recovery is similar to that after standard laparoscopy. At present, robotic surgery offers no advantage over conventional surgery. Nevertheless, it is not reserved for a select few: this technology deserves more attention because it has the potential to change the way surgery is performed.

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: Lipids stored in adipose tissue can originate from dietary lipids or from de novo lipogenesis (DNL) from carbohydrates. Whether DNL is abnormal in the adipose tissue of overweight individuals remains unknown. The present study was undertaken to assess the effect of carbohydrate overfeeding on glucose-induced whole-body DNL and adipose tissue lipogenic gene expression in lean and overweight humans. DESIGN: Prospective, cross-over study. SUBJECTS AND METHODS: A total of 11 lean (five male, six female, mean BMI 21.0 ± 0.5 kg/m²) and eight overweight (four male, four female, mean BMI 30.1 ± 0.6 kg/m²) volunteers were studied on two occasions. On one occasion, they received an isoenergetic diet containing 50% carbohydrate for 4 days prior to testing; on the other, they received a hyperenergetic diet (175% of energy requirements) containing 71% carbohydrate. After each 4-day period of controlled diet, they were studied over 6 h after receiving 3.25 g glucose/kg fat-free mass. Whole-body glucose oxidation and net DNL were monitored by means of indirect calorimetry. An adipose tissue biopsy was obtained at the end of this 6-h period, and the levels of SREBP-1c, acetyl-CoA carboxylase, and fatty acid synthase mRNA were measured by real-time PCR. RESULTS: After isocaloric feeding, whole-body net DNL over the 5 h following glucose ingestion amounted to 35 ± 9 mg/kg fat-free mass/5 h in lean subjects and 49 ± 3 mg/kg fat-free mass/5 h in overweight subjects. These figures increased (P<0.001) to 156 ± 21 mg/kg fat-free mass/5 h in lean and 64 ± 11 mg/kg fat-free mass/5 h (P<0.05 vs lean) in overweight subjects after carbohydrate overfeeding. Whole-body DNL after overfeeding was thus lower (P<0.001), and glycogen synthesis higher (P<0.001), in overweight than in lean subjects. Adipose tissue SREBP-1c mRNA increased by 25% in overweight and by 43% in lean subjects (P<0.05) after carbohydrate overfeeding, whereas fatty acid synthase mRNA increased by 66% and 84%, respectively (P<0.05). CONCLUSION: Whole-body net DNL is not increased during carbohydrate overfeeding in overweight individuals, nor is the stimulation of adipose lipogenic enzymes higher in overweight subjects. Carbohydrate overfeeding does not stimulate whole-body net DNL or the expression of lipogenic enzymes in adipose tissue to a larger extent in overweight than in lean subjects.

Relevance: 30.00%

Publisher:

Abstract:

This project consists of the analysis, design and implementation of a One-Time Password (OTP) authentication system for mobile devices. To avoid the use of static passwords, we will build a mobile-phone application capable of generating random passwords from previously agreed parameters, as well as keeping a record of the services where they can be used. We will start from a challenge/response protocol in which the user interacts with a mobile phone and a personal computer with an Internet connection. The user will be able to register and, by entering into the phone certain data provided by the server, complete the authentication process to gain access to restricted areas of the service.
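The one-time-password generation described above can be sketched with a standard HMAC-based OTP routine (HOTP, RFC 4226). This is an illustrative stand-in, not the project's actual implementation; the secret and counter handling are assumptions.

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time password from a shared secret and a counter (RFC 4226)."""
    # HMAC-SHA1 over the big-endian 8-byte counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: take 4 bytes at an offset given by the low nibble
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# In a challenge/response variant such as the one described above, the
# server's challenge would replace (or be mixed into) the counter value.
```

With the RFC 4226 test secret `b"12345678901234567890"`, counters 0 and 1 yield 755224 and 287082, the specification's published test vectors.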

Relevance: 30.00%

Publisher:

Abstract:

The quintessence of recent natural-science studies is that the 2 °C target can only be achieved with massive emission reductions in the next few years. The central contribution of this paper is to incorporate this limited time to act into a non-perpetual real-options framework for analysing optimal climate policy under uncertainty. The window-of-opportunity modelling set-up shows that the limited time to act may spark a trend reversal in the direction of low-carbon alternatives. However, the implementation of a climate policy is impeded by high uncertainty about possible climate pathways.
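The option-value logic at work here can be illustrated with a deliberately minimal one-period sketch (a toy model with assumed numbers and names, not the paper's actual framework): acting now yields a known payoff, while waiting preserves flexibility only as long as the window of opportunity stays open.

```python
def act_or_wait(payoff_now, payoff_up, payoff_down, p_up, discount, window_open):
    """Compare acting now against keeping the option to act one period later.

    While the window of opportunity is open, waiting retains the flexibility
    to act only in the favourable state; once the window closes, that option
    is worthless and immediate action is evaluated on its own merits.
    """
    if window_open:
        # Expected discounted value of waiting: next period, act only if the
        # realised payoff is positive (the defining asymmetry of an option).
        wait_value = discount * (p_up * max(payoff_up, 0.0)
                                 + (1.0 - p_up) * max(payoff_down, 0.0))
    else:
        wait_value = 0.0
    if payoff_now >= wait_value:
        return ("act now", payoff_now)
    return ("wait", wait_value)
```

Under high uncertainty, waiting can dominate a positive immediate payoff while the window is open; once the window closes, action is triggered, mirroring the trend-reversal effect described above.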

Relevance: 30.00%

Publisher:

Abstract:

Interim Report on Progress of European Working Time Directive Pilot Projects

The establishment of the National Implementation Group EWTD and the subsequent commencement of a number of EWTD pilot projects mark a significant stage in the implementation of the European Working Time Directive in Ireland.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Enhanced recovery protocols may reduce postoperative complications and length of hospital stay. However, the implementation of these protocols requires time and financial investment. This study evaluated the cost-effectiveness of enhanced recovery implementation. METHODS: The first 50 consecutive patients treated during implementation of an enhanced recovery programme were compared with 50 consecutive patients treated in the year before its introduction. The enhanced recovery protocol principally implemented preoperative counselling, reduced preoperative fasting, preoperative carbohydrate loading, avoidance of premedication, optimized fluid balance, standardized postoperative analgesia, a no-drain policy, and early nutrition and mobilization. Length of stay, readmissions and complications within 30 days were compared, and a cost-minimization analysis was performed. RESULTS: Hospital stay was significantly shorter in the enhanced recovery group: median 7 (interquartile range 5-12) versus 10 (7-18) days (P = 0.003); two patients were readmitted in each group. The rate of severe complications was lower in the enhanced recovery group (12 versus 20 per cent), but there was no difference in overall morbidity. The mean saving per patient in the enhanced recovery group was €1651. CONCLUSION: Enhanced recovery is cost-effective, with savings evident even in the initial implementation period.

Relevance: 30.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions that obey a fixed or dynamic set of rules to determine trading orders. It has grown to account for up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped in the so-called technical models, the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, if the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is by contrast fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude can be defined with scientific rigour and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasise the calibration of the strategies' parameters to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be performed at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
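The distance-based pairs-trading rule discussed above can be illustrated with a minimal sketch (the window, thresholds and rolling z-score rule are illustrative assumptions, not the thesis's calibrated parameters): trade when the spread between two co-moving assets deviates from its rolling mean, and close when it reverts.

```python
import numpy as np

def pairs_signal(px_a, px_b, window=20, entry=2.0, exit=0.5):
    """Mean-reversion signal on the log-price spread of two co-moving assets.

    Returns +1 (long spread), -1 (short spread) or 0 (flat) per time step.
    """
    spread = np.log(px_a) - np.log(px_b)
    signals = np.zeros(len(spread))
    pos = 0
    for t in range(window, len(spread)):
        hist = spread[t - window:t]
        z = (spread[t] - hist.mean()) / (hist.std() + 1e-12)
        if pos == 0 and z > entry:
            pos = -1            # spread too wide: short A, long B
        elif pos == 0 and z < -entry:
            pos = +1            # spread too narrow: long A, short B
        elif pos != 0 and abs(z) < exit:
            pos = 0             # spread has reverted to its mean: close
        signals[t] = pos
    return signals
```

The same skeleton accommodates the co-integration or Ornstein-Uhlenbeck variants by replacing the rolling z-score with a residual from the fitted model.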

Relevance: 30.00%

Publisher:

Abstract:

We performed a comprehensive study to assess the fitness for purpose of four chromatographic conditions for the determination of six groups of marine lipophilic toxins (okadaic acid (OA) and dinophysistoxins, pectenotoxins, azaspiracids, yessotoxins, gymnodimine and spirolides) by LC-MS/MS, in order to select the most suitable conditions as stated by the European Union Reference Laboratory for Marine Biotoxins (EURLMB). In every case, the elution gradient was optimized to achieve a total run-time cycle of 12 min. We performed a single-laboratory validation for the analysis of three matrices relevant to the seafood aquaculture industry (mussels, Pacific oysters and clams), and for sea urchins, for which no data on lipophilic toxins had been reported before. Moreover, we compared the method performance under alkaline conditions using two quantification strategies: external standard calibration (EXS) and matrix-matched standard calibration (MMS). Alkaline conditions were the only scenario that allowed detection windows with polarity switching in a 3200 QTrap mass spectrometer; thus the analysis of all toxins can be accomplished in a single run, increasing sample throughput. The limits of quantification under alkaline conditions met the validation requirements established by the EURLMB for all toxins and matrices, while the remaining conditions failed in some cases. The accuracy of the method and the matrix effects were generally dependent on the mobile phases and the seafood species. MMS had a moderate positive impact on method accuracy for crude extracts, but it showed poor trueness for seafood species other than mussels when analyzing hydrolyzed extracts. Alkaline conditions with EXS and recovery correction for OA were selected as the most suitable conditions in the context of our laboratory. This comparative study can help other laboratories choose the best conditions for the implementation of LC-MS/MS according to their own needs.

Relevance: 30.00%

Publisher:

Abstract:

In the finite field (FF) treatment of vibrational polarizabilities and hyperpolarizabilities, the field-free Eckart conditions must be enforced in order to prevent molecular reorientation during geometry optimization. These conditions are implemented here for the first time. Our procedure facilitates the identification of the field-induced internal coordinates that make the major contribution to the vibrational properties. Using only two of these coordinates, quantitative accuracy for nuclear relaxation polarizabilities and hyperpolarizabilities is achieved in π-conjugated systems. From these two coordinates a single most efficient natural conjugation coordinate (NCC) can be extracted; the limitations of this one-coordinate approach are discussed. It is shown that the Eckart conditions can lead to an isotope effect that is comparable to the isotope effect on zero-point vibrational averaging, but with a different mass dependence.
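For reference, the field-free Eckart conditions mentioned above take the following standard form (notation assumed, not from the abstract: nuclear masses \(m_i\), instantaneous positions \(\mathbf{r}_i\), reference positions \(\mathbf{r}_i^0\)):

```latex
% Translational Eckart condition: no net displacement of the centre of mass
\sum_i m_i \left(\mathbf{r}_i - \mathbf{r}_i^0\right) = \mathbf{0}

% Rotational Eckart condition: no net angular momentum relative to the
% reference geometry, which is what forbids molecular reorientation
\sum_i m_i \, \mathbf{r}_i^0 \times \left(\mathbf{r}_i - \mathbf{r}_i^0\right) = \mathbf{0}
```

The mass weighting in both sums is also the origin of the isotope effect noted in the abstract: substituting an isotope changes \(m_i\) and therefore the orientation singled out by the conditions.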

Relevance: 30.00%

Publisher:

Abstract:

Continuous respiratory-exchange measurements were performed on ten moderately obese and ten lean young women for 1 h before, 3 h during, and 3 h after either parenteral (IV) or intragastric (IG) administration of a nutrient mixture infused at twice the postabsorptive resting energy expenditure (REE). REE rose significantly from 0.98 ± 0.02 to 1.13 ± 0.03 kcal/min (IV) and from 0.99 ± 0.02 to 1.13 ± 0.02 kcal/min (IG) in the lean group, and from 1.10 ± 0.02 to 1.27 ± 0.03 kcal/min (IV) and from 1.11 ± 0.02 to 1.29 ± 0.03 kcal/min (IG) in the obese group. These increases resulted in similar nutrient-induced thermogenesis: 10.0 ± 0.7% (IV) and 9.3 ± 0.9% (IG) in the lean group, and 9.2 ± 0.7% (IV) and 10.1 ± 0.8% (IG) in the obese group. Nutrient utilization was comparable in both groups and with both routes of administration, although the response to IG feeding was delayed. These results showed no significant difference in either thermogenic response or nutrient utilization between the moderately obese and control groups during acute IV or IG feeding.

Relevance: 30.00%

Publisher:

Abstract:

In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and their delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), a probability theory of finite sets. Because the optimal filter has no closed form, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
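A full RBPF is beyond the scope of a sketch, but the core idea of tracking an unknown number of taps with weighted particles can be illustrated with a simplified bootstrap particle filter (all function names, parameters and the scalar observation model below are assumptions, not the paper's algorithm):

```python
import numpy as np

def pf_tap_estimate(y, pilots, max_taps=4, n_particles=200, noise_var=0.01, seed=0):
    """Bootstrap particle filter over the number of channel taps and their gains.

    y:      received pilot observations, shape (T,)
    pilots: rows of known delayed pilot symbols, shape (T, max_taps)
    Returns the posterior-mean estimate of the number of active taps.
    """
    rng = np.random.default_rng(seed)
    # Each particle hypothesises a tap count L and real-valued tap gains h
    L = rng.integers(1, max_taps + 1, size=n_particles)
    h = rng.normal(0.0, 1.0, size=(n_particles, max_taps))
    logw = np.full(n_particles, -np.log(n_particles))
    for t in range(len(y)):
        h += rng.normal(0.0, 0.05, size=h.shape)      # random-walk gain dynamics
        active = np.arange(max_taps)[None, :] < L[:, None]
        y_hat = (h * active) @ pilots[t]              # per-particle prediction
        logw += -0.5 * (y[t] - y_hat) ** 2 / noise_var
        logw -= logw.max()                            # guard against underflow
        w = np.exp(logw)
        w /= w.sum()
        # Resample when the effective sample size collapses
        if 1.0 / np.sum(w ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, size=n_particles, p=w)
            L, h = L[idx], h[idx].copy()
            logw = np.full(n_particles, -np.log(n_particles))
    w = np.exp(logw)
    w /= w.sum()
    return float(w @ L)
```

The Rao-Blackwellization in the paper goes further by marginalizing the (conditionally linear-Gaussian) tap gains analytically, so that particles are spent only on the discrete tap-count hypotheses.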

Relevance: 30.00%

Publisher:

Abstract:

In coronary magnetic resonance angiography, a magnetization-preparation scheme for T2-weighting (T2Prep) is widely used to enhance contrast between the coronary blood-pool and the myocardium. This prepulse is commonly applied without spatial selection to minimize flow sensitivity, but the nonselective implementation results in a reduced magnetization of the in-flowing blood and a related penalty in signal-to-noise ratio. It is hypothesized that a spatially selective T2Prep would leave the magnetization of blood outside the T2Prep volume unaffected and thereby lower the signal-to-noise ratio penalty. To test this hypothesis, a spatially selective T2Prep was implemented in which the user could freely adjust the angulation and position of the T2Prep slab to avoid covering the ventricular blood-pool and saturating the in-flowing spins. A time gap of 150 ms was further added between the T2Prep and the other prepulses to allow in-flow of a larger volume of unsaturated spins. Consistent with numerical simulation, the spatially selective T2Prep increased in vivo human coronary artery signal-to-noise ratio (42.3 ± 2.9 vs. 31.4 ± 2.2, n = 22, P < 0.0001) and contrast-to-noise ratio (18.6 ± 1.5 vs. 13.9 ± 1.2, P = 0.009) compared with the nonselective T2Prep. Additionally, a segmental analysis demonstrated that the spatially selective T2Prep was most beneficial in proximal and mid segments, where the in-flowing blood volume was largest compared with the distal segments. Magn Reson Med, 2013. © 2012 Wiley Periodicals, Inc.

Relevance: 30.00%

Publisher:

Abstract:

We perform an experimental test of Maskin's canonical mechanism for Nash implementation, using three subjects in non-repeated groups, as well as three outcomes, states of nature, and integer choices. We find that this mechanism successfully implements the desired outcome a large majority of the time, and an embedded comprehension test indicates that subjects were generally able to comprehend their decision tasks. Performance can also be improved by imposing a fine on non-designated dissidents. We offer some explanations for the imperfect implementation, including risk preferences, the possibilities agents have for collusion, and the mixed-strategy equilibria of the game.
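The outcome rule of the canonical mechanism being tested can be sketched in its textbook three-case form (a stylised version with assumed names; the experiment's exact parameterisation is not given in the abstract):

```python
from collections import Counter

def canonical_outcome(announcements, scr, prefers):
    """Outcome rule of a stylised Maskin canonical mechanism.

    announcements: one (state, outcome, integer) triple per agent.
    scr(state):    the set of socially desired outcomes in that state.
    prefers(i, state, a, b): True if agent i weakly prefers a to b in `state`.
    """
    n = len(announcements)
    pairs = [(s, o) for s, o, _ in announcements]
    # Rule 1: unanimous (state, outcome) with the outcome socially desired
    if len(set(pairs)) == 1 and pairs[0][1] in scr(pairs[0][0]):
        return pairs[0][1]
    # Rule 2: exactly one dissident against an otherwise unanimous report
    common, count = Counter(pairs).most_common(1)[0]
    if count == n - 1 and common[1] in scr(common[0]):
        i = next(j for j, p in enumerate(pairs) if p != common)
        deviant_outcome = announcements[i][1]
        # The dissident's proposal is implemented only if it lies in the
        # dissident's lower contour set at the common outcome -- this is
        # what removes the incentive to deviate from a truthful consensus.
        if prefers(i, common[0], common[1], deviant_outcome):
            return deviant_outcome
        return common[1]
    # Rule 3: integer game -- the agent announcing the highest integer wins
    winner = max(range(n), key=lambda j: announcements[j][2])
    return announcements[winner][1]
```

The "fine on non-designated dissidents" mentioned in the abstract would be a payoff modification layered on Rule 2, not a change to this outcome function.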

Relevance: 30.00%

Publisher:

Abstract:

1.1 Fundamentals

Chest pain is a common complaint in primary care patients (1 to 3% of all consultations) (1), and its aetiology is varied, ranging from harmless to potentially life-threatening conditions. In primary care practice, the most prevalent aetiologies are chest wall syndrome (43%), coronary heart disease (12%) and anxiety (7%) (2). In up to 20% of cases, potentially serious conditions such as cardiac, respiratory or neoplastic diseases underlie chest pain. In this context, a large number of laboratory tests are run (42%) and over 16% of patients are referred to a specialist or hospitalized (2).

A cardiovascular origin of chest pain can threaten the patient's life, and the investigations run to exclude a serious condition can be expensive, involving a large number of exams or referral to a specialist, often without real clinical need. In emergency settings, up to 80% of chest pain presentations are due to cardiovascular events (3), and scoring methods have been developed to identify conditions such as coronary heart disease (CHD) quickly and efficiently (4-6). In primary care, a cardiovascular origin is present in only about 12% of patients with chest pain (2), and general practitioners (GPs) need to exclude as safely as possible a potentially serious condition underlying chest pain. A simple clinical prediction rule (CPR) like those available in emergency settings may therefore help GPs save time and spare extra investigations when ruling out CHD in primary care patients. Such a tool may also help GPs reassure patients with more common origins of chest pain.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Although there is no strong evidence of benefit, chest physiotherapy (CP) seems to be commonly used in simple pneumonia. CP requires equipment and frequently involves the assistance of a respiratory therapist, engendering a significant medical workload and cost. AIM: To measure and compare the efficacy of two modalities of CP guideline implementation on the appropriateness of CP prescription among patients hospitalised for community-acquired pneumonia (CAP). PATIENTS AND METHODS: We measured the CP prescription rate and duration in all consecutive CAP inpatients admitted to a division of general internal medicine at an urban teaching community hospital during three consecutive one-year periods: (1) before any guideline implementation; (2) after passive implementation through medical grand rounds and guideline diffusion by mailing; (3) after additionally adding a one-page reminder highlighting our recommendations to the CAP patient's medical chart. Death and recurrent hospitalisation rates within one year after hospitalisation were recorded to assess whether any reduction in CP prescription impaired patient outcomes. RESULTS: During the three successive phases, 127, 157, and 147 patients with similar characteristics were included. Among all CAP inpatients, the CP prescription rate decreased from 68% (86/127) to 51% (80/157) and then 48% (71/147) (P<0.01 for trend). A significant reduction in CP duration was observed after the active guideline implementation (12.0, 11.0, and 7.0 days, respectively), which persisted after adjustment for length of stay. Reductions in CP prescription rate and duration were also observed among CAP patients with COPD: prescription rates of 97% (30/31), 67% (24/36), and 75% (35/47), respectively (P<0.01 for trend). The mean cost of CP per patient was reduced by 56%, from $709 to $481 and then $309. Neither in-hospital deaths nor one-year overall or CAP-specific recurrent hospitalisations differed significantly between the three phases. CONCLUSION: Both passive and active implementation of guidelines appear to improve the appropriateness of CP prescription among inpatients with CAP without impairing their outcomes. Restricting CP use to patients who benefit from this treatment might be an opportunity to decrease the medical cost and workload of CAP.