928 results for ANESTHETIC TECHNIQUES, General


Relevance: 30.00%

Abstract:

In this paper we propose a hybrid hazard regression model with threshold stress which includes the proportional hazards and the accelerated failure time models as particular cases. To express the behavior of lifetimes, the generalized gamma distribution is assumed, and an inverse power law model with a threshold stress is considered. For parameter estimation we develop a sampling-based posterior inference procedure based on Markov chain Monte Carlo (MCMC) techniques, assuming proper but vague priors for the parameters of interest. A simulation study investigates the frequentist properties of the proposed estimators obtained under the assumption of vague priors. Further, some discussion of model selection criteria is given. The methodology is illustrated on simulated and real lifetime data sets.
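
The following is a minimal sketch of the kind of sampling-based posterior inference the abstract describes, not the authors' implementation. For brevity, lifetimes follow a Weibull distribution (a special case of the generalized gamma), the inverse power law with threshold stress enters through the scale parameter, and all data, priors and tuning constants are illustrative assumptions.

```python
# Random-walk Metropolis for an inverse power law model with threshold stress.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic accelerated life test data: stress levels V and lifetimes t.
V = np.repeat([40.0, 50.0, 60.0], 30)
true_scale = 1e4 / (V - 30.0) ** 1.5            # inverse power law, threshold 30
t = rng.weibull(2.0, V.size) * true_scale

def log_post(theta):
    a, b, log_shape, v0 = theta
    if v0 <= 0 or v0 >= V.min():                # threshold must lie below all stresses
        return -np.inf
    scale = np.exp(a) / (V - v0) ** b
    loglik = stats.weibull_min.logpdf(t, c=np.exp(log_shape), scale=scale).sum()
    logprior = stats.norm.logpdf(theta, 0.0, 100.0).sum()   # proper but vague
    return loglik + logprior

theta = np.array([9.0, 1.0, 0.5, 10.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.05, 0.05, 0.05, 0.5])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis acceptance test
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[5000:]                   # discard burn-in
print(chain.mean(axis=0))                        # posterior means of (a, b, log k, V0)
```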

Relevance: 30.00%

Abstract:

Patients with cleft lip and palate usually present dental anomalies of number, shape, structure and position in the cleft area, and the general dentist is frequently asked to restore or extract these teeth. Considering that several anatomic variations are expected in teeth adjacent to cleft areas, and that knowledge of these variations by general dentists is required for optimal treatment, the objectives of this paper are: 1) to describe changes in the innervation pattern of anterior teeth and soft tissue caused by the presence of a cleft, 2) to describe a local anesthetic procedure in unilateral and bilateral clefts, and 3) to provide recommendations to improve anesthetic procedures in patients with cleft lip and palate. The cases of two patients are presented: one with complete unilateral cleft lip and palate, and the other with complete bilateral cleft lip and palate. The patients underwent local anesthesia in the cleft area for the extraction of teeth with poor bone support. The modified anesthetic procedure, respecting the altered course of the nerves in the cleft maxilla and the soft tissue alterations at the cleft site, was accomplished successfully, and the extractions were performed with no pain to the patients. General dentists should be aware of the anatomic variations in nerve courses in the cleft area in order to offer high-quality treatment to patients with cleft lip and palate.

Relevance: 30.00%

Abstract:

OBJECTIVES: This prospective, randomized, experimental study aimed to investigate the influence of general training strategies on the motor recovery of Wistar rats with moderate contusive spinal cord injury. METHODS: A total of 51 Wistar rats were randomized into five groups: control, maze, ramp, runway, and sham (laminectomy only). The rats underwent spinal cord injury at the T9-T10 level using the NYU-Impactor. Each group except the control was trained for 12 minutes twice a week for two weeks before and five weeks after the spinal cord injury. Functional motor recovery was assessed with the Basso, Beattie, and Bresnahan scale on the first postoperative day and then once a week for five weeks. The animals were then euthanized, and the spinal cords were collected for histological analysis. RESULTS: The ramp and maze groups showed earlier and greater functional improvement than the control and runway groups. Unexpectedly, however, over time all of the groups reached a level similar to that of the control group, which recovered spontaneously. There were no histological differences in the injured area between the trained and control groups. CONCLUSION: Short-term benefits can be associated with a specific training regime; however, the same training was ineffective at maintaining superior long-term recovery. These results might support new considerations before hospital discharge of patients with spinal cord injuries.

Relevance: 30.00%

Abstract:

OBJECTIVE: The standard therapy for patients with high-level spinal cord injury is long-term mechanical ventilation through a tracheostomy; in some cases, however, this approach results in death or disability. The aim of this study is to highlight the anesthetic and perioperative aspects of patients undergoing insertion of a diaphragmatic pacemaker. METHODS: Five patients with quadriplegia following high cervical traumatic spinal cord injury and ventilator-dependent chronic respiratory failure were implanted laparoscopically with a diaphragmatic pacemaker, after preoperative assessment of their phrenic nerve function and diaphragm contractility through transcutaneous nerve stimulation. ClinicalTrials.gov: NCT01385384. RESULTS: Diaphragmatic pacemaker placement was successful in all of the patients. Two patients presented with capnothorax during the perioperative period, which resolved without consequences. After six months, three patients had achieved continuous use of the diaphragm pacing system, and one patient could be weaned from mechanical ventilation for more than 4 hours per day. CONCLUSIONS: The implantation of a diaphragmatic pacing system is a new and safe technique with the potential to improve the quality of life of patients who are dependent on mechanical ventilation because of spinal cord injuries. Appropriate indication and adequate perioperative care are fundamental to achieving better results.

Relevance: 30.00%

Abstract:

Adequate polymerization plays an important role in the longevity of composite resin restorations. Objectives: The aim of this study was to evaluate the effect of light-curing units, curing modes and storage media on the sorption, solubility and biaxial flexural strength (BFS) of a composite resin. Material and Methods: Two hundred and forty specimens were made of one composite resin (Esthet-X) in a stainless steel mold (2 mm high x 8 mm in diameter), and divided into 24 groups (n=10) according to the 4 study factors: light-curing unit: quartz tungsten halogen (QTH) lamp or light-emitting diode (LED); energy density: 16 J/cm² or 20 J/cm²; curing mode: conventional (CM) or pulse-delay (PD); and permeant: deionized water or 75% ethanol, for 28 days. Sorption and solubility tests were performed according to the ISO 4049:2000 specification. All specimens were then tested for BFS according to the ASTM F394-78 specification. Data were analyzed by three-way ANOVA followed by Tukey, Kruskal-Wallis and Mann-Whitney tests (alpha=0.05). Results: In general, no significant differences were found in sorption, solubility or BFS means between the light-curing units and curing modes (p>0.05). Only the LED unit at 16 J/cm² in PD mode (10 s) produced higher sorption and solubility values than QTH; conversely, with CM at 16 J/cm², LED produced lower BFS values than QTH (p<0.05). The 75% ethanol permeant produced higher sorption and solubility values and lower BFS values than water (p<0.05). Conclusion: The ethanol storage medium caused more damage to the composite resin than water. In general, the LED and QTH curing units at 16 and 20 J/cm² in the CM and PD curing modes had no influence on the sorption, solubility or BFS of the tested resin.
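
As background to the test method cited above, the following is a minimal sketch of the ISO 4049 sorption and solubility computation: m1 is the initially conditioned constant mass, m2 the mass after immersion, and m3 the reconditioned (final dry) mass. The numeric values are made up for illustration only.

```python
import math

def sorption_solubility(m1_ug: float, m2_ug: float, m3_ug: float, v_mm3: float):
    """Return (sorption, solubility) in micrograms per cubic millimetre."""
    wsp = (m2_ug - m3_ug) / v_mm3   # liquid taken up by the specimen
    wsl = (m1_ug - m3_ug) / v_mm3   # mass dissolved out of the specimen
    return wsp, wsl

# Disc of 8 mm diameter and 2 mm height: V = pi * r^2 * h, about 100.5 mm^3.
v = math.pi * 4.0 ** 2 * 2.0
print(sorption_solubility(180_000.0, 182_500.0, 179_200.0, v))
```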

Relevance: 30.00%

Abstract:

Gossip protocols have proved to be a viable solution for setting up and managing large-scale P2P services or applications in a fully decentralised scenario. The gossip or epidemic communication scheme is heavily based on stochastic behaviors and is the fundamental idea behind many large-scale P2P protocols. It provides many remarkable features, such as scalability, robustness to failures, emergent load-balancing capabilities, fast spreading, and redundancy of information. In some sense, these services or protocols mimic the behavior of natural systems in order to achieve their goals. The key observation of this work is that the remarkable properties of gossip hold only as long as all participants follow the rules dictated by the protocol; if one or more malicious nodes join the network and start cheating according to some strategy, the result can be catastrophic. In order to study how serious the threat posed by malicious nodes can be, and what can be done to prevent attackers from cheating, we focused on a general attack model aimed at defeating a key service in gossip overlay networks, the Peer Sampling Service [JGKvS04]. We also focused on the problem of protecting against forged information exchanged in gossip services. We propose a solution technique for each problem; both techniques are general enough to be applied to distinct service implementations. Like gossip protocols themselves, our solutions are based on stochastic behavior and are fully decentralized. In addition, each technique's behaviour is abstracted by a general primitive function extending the basic gossip scheme; this approach allows our solutions to be adopted with minimal changes in different scenarios. We provide an extensive experimental evaluation to support the effectiveness of our techniques. In essence, these techniques aim to be building blocks, or P2P architecture guidelines, for building more resilient and secure P2P services.
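
To make the Peer Sampling Service concrete, here is a minimal, illustrative sketch of a gossip-based view shuffle. All names and parameters are assumptions; real implementations add entry ages, healing and swapping policies, and the defences against malicious nodes that this thesis studies.

```python
import random

VIEW_SIZE = 8

class Node:
    def __init__(self, node_id: int):
        self.node_id = node_id
        self.view: set[int] = set()          # partial view: ids of known peers

    def gossip_round(self, network: dict[int, "Node"]) -> None:
        if not self.view:
            return
        peer = network[random.choice(sorted(self.view))]
        # Push-pull shuffle: merge the two views, then keep a random sample.
        merged = (self.view | peer.view | {peer.node_id}) - {self.node_id}
        self.view = set(random.sample(sorted(merged), min(VIEW_SIZE, len(merged))))
        merged_peer = (peer.view | self.view | {self.node_id}) - {peer.node_id}
        peer.view = set(random.sample(sorted(merged_peer),
                                      min(VIEW_SIZE, len(merged_peer))))

# Bootstrap a tiny network on a ring and let the views mix.
network = {i: Node(i) for i in range(50)}
for i in network:
    network[i].view = {(i + 1) % 50, (i + 2) % 50}
for _ in range(30):
    for node in network.values():
        node.gossip_round(network)
print(sorted(network[0].view))               # an approximately random sample of peers
```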

Relevance: 30.00%

Abstract:

The popularity of herbal products, especially plant food supplements (PFS) and herbal medicines, is on the rise in Europe and other parts of the world, with increased use in the general population as well as among specific subgroups such as children, women, and those suffering from diseases such as cancer. The aim of this paper is to examine the PFS market structures in European Community (EC) Member States and to examine issues concerning the methodologies and consumption data relating to PFS use in Europe. A review of recent reports on market data, trends and main distribution channels is presented, together with an example of PFS consumption in Spain. An overview of the methods and administration techniques used...

Relevance: 30.00%

Abstract:

Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (such as scheduling, project planning, transportation, telecommunications, economics and finance, timetabling, etc.) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, more than 50 years of intensive research has dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community remains very active in trying to answer some of them; as a consequence, a huge number of papers are continuously produced and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first arises when we are asked to handle a general MIP and cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general purpose techniques. The second arises when mixed integer programming is used to address a somehow structured problem. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special purpose techniques. This thesis tries to give some insights into both of the above mentioned situations. The first part of the work is focused on general purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers. Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of the weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas on multiple-row cuts. Very recently, a series of papers has drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago; however, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress: it presents a possible way of generating two-row cuts, arising from lattice-free triangles, from the simplex tableau, along with some preliminary computational results.
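
As a concrete reference for the Gomory mixed integer (GMI) cuts recalled in Chapter 2, the following sketch derives the textbook GMI cut from a single simplex tableau row x_i + sum_j a_j x_j = b with fractional b, where the nonbasic variables x_j are nonnegative. The variable names and the example row are illustrative.

```python
import math

def gmi_cut(a: dict[str, float], b: float, is_integer: dict[str, bool]):
    """Return (coefficients, rhs) of the GMI cut  sum_j c_j x_j >= 1."""
    f0 = b - math.floor(b)
    assert 1e-6 < f0 < 1 - 1e-6, "source row must have a fractional rhs"
    cut = {}
    for j, aj in a.items():
        if is_integer[j]:
            fj = aj - math.floor(aj)
            # Integer nonbasic variables: use the stronger of the two slopes.
            cut[j] = fj / f0 if fj <= f0 else (1.0 - fj) / (1.0 - f0)
        else:
            # Continuous nonbasic variables: split by coefficient sign.
            cut[j] = aj / f0 if aj >= 0 else -aj / (1.0 - f0)
    return cut, 1.0

# Example row:  x1 + 0.5 y1 - 1.5 y2 + 0.3 s = 2.7  (y1, y2 integer; s continuous)
coeffs, rhs = gmi_cut({"y1": 0.5, "y2": -1.5, "s": 0.3},
                      b=2.7,
                      is_integer={"y1": True, "y2": True, "s": False})
print(coeffs, ">=", rhs)
```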
The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs). The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution) in which a class of exponential neighborhoods is iteratively explored by heuristically solving an integer programming formulation through a general purpose MIP solver; a sketch of this paradigm is given below. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well-known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one that has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut methods in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (in particular, the usage of general purpose cutting planes) can improve on the branch-and-cut methods proposed in the literature.
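
The sketch below illustrates the destroy-and-repair paradigm on a toy VRP-style problem. The actual method repairs the destroyed solution by solving an integer programming formulation with a general purpose MIP solver; here the repair step is deliberately replaced by a greedy reinsertion placeholder so the sketch stays self-contained, and all names are hypothetical.

```python
import random

def destroy(routes: list[list[int]], k: int):
    """Randomly remove k customers from the current routes."""
    customers = [c for r in routes for c in r]
    removed = random.sample(customers, k)
    kept = [[c for c in r if c not in removed] for r in routes]
    return kept, removed

def insert(routes, c, i, p):
    new = [list(r) for r in routes]
    new[i].insert(p, c)
    return new

def repair(routes, removed, cost):
    """Greedy reinsertion, standing in for the MIP-based neighborhood search."""
    for c in removed:
        best = min(((i, p) for i, r in enumerate(routes) for p in range(len(r) + 1)),
                   key=lambda ip: cost(insert(routes, c, *ip)))
        routes = insert(routes, c, *best)
    return routes

def destroy_and_repair(routes, cost, iters=100, k=3):
    best = routes
    for _ in range(iters):
        candidate = repair(*destroy(best, k), cost)
        if cost(candidate) < cost(best):        # accept only improvements
            best = candidate
    return best

# Toy usage: customers 1..8 on a line, depot at 0; cost = total route length.
def cost(routes):
    return sum(abs(r[0]) + sum(abs(b - a) for a, b in zip(r, r[1:])) + abs(r[-1])
               for r in routes if r)

print(destroy_and_repair([[3, 1, 7], [2, 8, 4], [5, 6]], cost))
```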

Relevance: 30.00%

Abstract:

The work presented in this thesis is focused on the open-ended coaxial-probe frequency-domain reflectometry technique for the complex permittivity measurement of dispersive dielectric multilayer materials at microwave frequencies. An effective dielectric model is introduced and validated to extend the applicability of this technique to multilayer materials in an on-line system context. In addition, the thesis presents: 1) a numerical study of the imperfection of the contact at the probe-material interface, 2) a review of the available models and techniques, and 3) a new classification of the extraction schemes, with guidelines on how they can be used to improve the overall performance of the probe according to the problem requirements.

Relevance: 30.00%

Abstract:

Atmospheric aerosol particles directly impact air quality and participate in controlling the climate system. Organic Aerosol (OA) in general accounts for a large fraction (10–90%) of the global submicron (PM1) particulate mass. Chemometric methods for source identification are used in many disciplines, but methods relying on the analysis of NMR datasets are rarely used in the atmospheric sciences. This thesis provides an original application of NMR-based chemometric methods to atmospheric OA source apportionment. The method was tested on chemical composition databases obtained from samples collected in different environments in Europe, hence exploring the impact of a great diversity of natural and anthropogenic sources. We focused on sources of water-soluble OA (WSOA), for which NMR analysis provides substantial advantages over alternative methods. Different factor analysis techniques were applied independently to NMR datasets from nine field campaigns of the EUCAARI project and allowed the identification of recurrent source contributions to WSOA in the European background troposphere: 1) marine SOA; 2) aliphatic amines from ground sources (agricultural activities, etc.); 3) biomass burning POA; 4) biogenic SOA from terpene oxidation; 5) "aged" SOA, including humic-like substances (HULIS); and 6) other factors, possibly including contributions from primary biological aerosol particles and products of cooking activities. Biomass burning POA accounted for more than 50% of WSOC in the winter months, while aged SOA associated with HULIS was predominant (>75%) in spring and summer, suggesting that secondary sources and transboundary transport become more important in those seasons. The comprehensive aerosol measurements carried out, involving several foreign research groups, provided the opportunity to compare the source apportionment results obtained by NMR analysis with those provided by the more widespread Aerodyne aerosol mass spectrometer (AMS) techniques, whose OA categorization schemes are becoming a standard for atmospheric chemists. The results emerging from this thesis partly confirm the AMS classification and partly challenge it.
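
For readers unfamiliar with factor-analysis-based source apportionment, here is a generic sketch assuming an NMR dataset arranged as a samples-by-spectral-bins matrix of non-negative intensities. The abstract does not specify the exact algorithms used, so non-negative matrix factorization, one common choice for this task, is used on synthetic data purely for illustration.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
n_samples, n_bins, n_sources = 60, 200, 4

# Synthetic mixing model: X = contributions @ source_profiles + noise.
profiles = rng.random((n_sources, n_bins))
contributions = rng.random((n_samples, n_sources))
X = contributions @ profiles + 0.01 * rng.random((n_samples, n_bins))

model = NMF(n_components=n_sources, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(X)      # estimated source contributions per sample
H = model.components_           # estimated spectral profile per source

# Normalize each factor profile to unit sum, pushing the scale into W.
scale = H.sum(axis=1, keepdims=True)
H, W = H / scale, W * scale.T
print(W.sum(axis=0) / W.sum())  # fraction of signal attributed to each factor
```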

Relevance: 30.00%

Abstract:

The development of a multibody model of a motorbike engine cranktrain is presented in this work, with an emphasis on flexible component model reduction. A modelling methodology based upon the adoption of non-ideal joints at interface locations, and on the inclusion of component flexibility, is developed: both are necessary if one wants to capture the dynamic effects that arise in lightweight, high-speed applications. With regard to the first topic, both a ball bearing model and a journal bearing model are implemented in order to properly capture the dynamic effects of the main connections in the system: the angular contact ball bearings are modelled according to a five-DOF nonlinear scheme to capture the behaviour of the crankshaft main bearings, while an impedance-based hydrodynamic bearing model provides an enhanced operation prediction at the conrod big end locations. Concerning the second topic, flexible models of the crankshaft and the connecting rod are produced. The well-established Craig-Bampton reduction technique is adopted as a general framework to obtain reduced model representations suitable for the subsequent multibody analyses. A particular component mode selection procedure, based on the concept of Effective Interface Mass, is implemented, allowing an assessment of the accuracy of the reduced models prior to the nonlinear simulation phase. In addition, a procedure to alleviate the effects of modal truncation, based on the Modal Truncation Augmentation approach, is developed. In order to assess the performance of the proposed modal reduction schemes, numerical tests are performed on the crankshaft and conrod models in both the frequency and modal domains. A multibody model of the cranktrain is eventually assembled and simulated using commercial software. Numerical results are presented, demonstrating the effectiveness of the implemented flexible model reduction techniques and their advantages over the conventional frequency-based truncation approach.
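
For reference, here is a minimal NumPy sketch of the textbook Craig-Bampton reduction adopted above: the DOFs are partitioned into boundary (b) and interior (i) sets, and the reduction basis combines static constraint modes with a truncated set of fixed-interface normal modes. The matrices are tiny and illustrative, not taken from the thesis.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(M, K, b_idx, i_idx, n_modes):
    """Return reduced matrices (M_cb, K_cb) and the transformation T."""
    Kib = K[np.ix_(i_idx, b_idx)]
    Kii = K[np.ix_(i_idx, i_idx)]
    Mii = M[np.ix_(i_idx, i_idx)]

    # Static constraint modes: interior response to unit boundary displacements.
    psi = -np.linalg.solve(Kii, Kib)

    # Fixed-interface normal modes: lowest eigenvectors of (Kii, Mii).
    _, phi = eigh(Kii, Mii)
    phi = phi[:, :n_modes]

    nb = len(b_idx)
    T = np.zeros((M.shape[0], nb + n_modes))
    T[b_idx, :nb] = np.eye(nb)
    T[np.ix_(i_idx, range(nb))] = psi
    T[np.ix_(i_idx, range(nb, nb + n_modes))] = phi
    return T.T @ M @ T, T.T @ K @ T, T

# Toy 4-DOF spring chain: boundary DOFs at the ends, two interior DOFs.
K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  2.]])
M = np.eye(4)
M_cb, K_cb, T = craig_bampton(M, K, b_idx=[0, 3], i_idx=[1, 2], n_modes=1)
print(M_cb.shape, K_cb.shape)   # (3, 3) (3, 3): 2 boundary DOFs + 1 kept mode
```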

Relevance: 30.00%

Abstract:

This thesis proposes an integrated, holistic approach to the study of neuromuscular fatigue, encompassing all the causes and all the consequences underlying the phenomenon. Starting from the metabolic processes occurring at the cellular level, the reader is guided toward the physiological changes at the motor neuron and motor unit level, and from these to the more general biomechanical alterations. Chapter 1 reports the various definitions of fatigue spanning several contexts. Chapter 2 extensively reviews the electrophysiological changes, in terms of motor unit behavior and descending neural drive to the muscle, as well as the biomechanical adaptations they induce. Chapter 3 reports a study based on the observation of temporal features extracted from sEMG signals, which highlights the need for a more robust and reliable indicator during fatiguing tasks; therefore, in Chapter 4, a novel bi-dimensional parameter is proposed. The study of sEMG-based indicators also opened questions about the neurophysiological mechanisms underlying fatigue. For this purpose, Chapter 5 presents a protocol designed for the analysis of motor unit-related parameters during prolonged fatiguing contractions. In particular, two methodologies were applied to multichannel sEMG recordings of isometric contractions of the tibialis anterior muscle: the state-of-the-art technique for sEMG decomposition, and a coherence analysis of MU spike trains. The importance of a multi-scale approach is finally highlighted in the context of the evaluation of cycling performance, where fatigue is one of the limiting factors. The last chapter of this thesis can be considered a paradigm: physiological, metabolic, environmental, psychological and biomechanical factors all influence the performance of a cyclist, and only when all of these are considered together in a novel integrative way is it possible to derive a clear model and make correct assessments.
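
As context for the sEMG-based indicators discussed in Chapters 3 and 4, the sketch below computes one classical spectral fatigue indicator: the downward shift of the median frequency (MDF) of the sEMG power spectrum during a sustained contraction. This is the standard approach, not the novel bi-dimensional parameter proposed in the thesis, and the signal is synthetic.

```python
import numpy as np
from scipy.signal import welch

fs = 2048.0                          # sampling rate, Hz
rng = np.random.default_rng(0)

def median_frequency(x: np.ndarray) -> float:
    f, pxx = welch(x, fs=fs, nperseg=1024)
    cum = np.cumsum(pxx)
    return f[np.searchsorted(cum, 0.5 * cum[-1])]   # frequency splitting power in half

# Synthetic "fatiguing" signal: band-limited activity whose band slides downward.
def emg_epoch(center_hz: float, n: int = 4096) -> np.ndarray:
    t = np.arange(n) / fs
    carriers = [np.sin(2 * np.pi * (center_hz + d) * t + rng.uniform(0, 2 * np.pi))
                for d in rng.normal(0, 15, 40)]
    return np.sum(carriers, axis=0) + 0.1 * rng.normal(size=n)

for k, c in enumerate([120, 100, 80, 60]):   # spectrum compresses as fatigue develops
    print(f"epoch {k}: MDF = {median_frequency(emg_epoch(c)):.1f} Hz")
```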

Relevance: 30.00%

Abstract:

Basic concepts and definitions relative to Lagrangian Particle Dispersion Models (LPDMs) for the description of turbulent dispersion are introduced. The study focuses on LPDMs that use, as input for the large-scale motion, fields produced by Eulerian models, with the small-scale motions described by Lagrangian Stochastic Models (LSMs). Data from two different dynamical models have been used: a Large Eddy Simulation (LES) and a General Circulation Model (GCM). After reviewing the small-scale closure adopted by each Eulerian model, the development and implementation of appropriate LSMs is outlined. The basic requirement of every LPDM used in this work is that it fulfill the Well-Mixed Condition (WMC). For the dispersion description in the GCM domain, a stochastic model of Markov order 0, consistent with the eddy-viscosity closure of the dynamical model, is implemented. An LSM of Markov order 1, more suitable for shorter timescales, has been implemented for the description of the unresolved motion of the LES fields, with different assumptions on the small-scale correlation time. Tests of the LSM on GCM fields suggest that the use of an interpolation algorithm able to maintain analytical consistency between the diffusion coefficient and its derivative is mandatory if the model is to satisfy the WMC. A dynamical time-step selection scheme based on the shape of the diffusion coefficient is also introduced, and the criteria for the integration-step selection are discussed. Absolute and relative dispersion experiments are performed with various unresolved-motion settings for the LSM on LES data, and the results are compared with laboratory data. The study shows that the unresolved turbulence parameterization has a negligible influence on absolute dispersion, while it affects the contributions of relative dispersion and meandering to absolute dispersion, as well as the Lagrangian correlation.
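
To illustrate the Markov order 0 model and the Well-Mixed Condition, here is a minimal random displacement sketch consistent with an eddy-diffusivity closure: for a diffusivity profile K(z), the WMC requires the drift correction dK/dz in the particle update z(t+dt) = z(t) + K'(z) dt + sqrt(2 K(z) dt) N(0,1). The profile and boundary treatment below are arbitrary illustrative choices, not those of the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 1000.0                                   # boundary layer depth, m

def K(z):                                    # eddy diffusivity profile, m^2/s
    return 1.0 + 50.0 * (z / H) * (1.0 - z / H)

def dKdz(z):                                 # its analytical derivative
    return 50.0 / H * (1.0 - 2.0 * z / H)

def step(z, dt):
    z_new = z + dKdz(z) * dt + np.sqrt(2.0 * K(z) * dt) * rng.normal(size=z.size)
    # Reflective boundaries at z = 0 and z = H.
    z_new = np.abs(z_new)
    return np.where(z_new > H, 2.0 * H - z_new, z_new)

# WMC check: an initially uniform particle distribution must stay uniform.
z = rng.uniform(0.0, H, 50_000)
for _ in range(1000):
    z = step(z, dt=5.0)
hist, _ = np.histogram(z, bins=10, range=(0.0, H))
print(hist / z.size)                         # each bin should stay close to 0.1
```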

Relevance: 30.00%

Abstract:

During the last few decades an unprecedented technological growth has been at the center of the embedded systems design landscape, with Moore's Law being the leading factor of this trend. Today, in fact, an ever-increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space that must be explored to find the best design has exploded, and hardware designers face a huge design space. Virtual Platforms have always been used to enable hardware-software co-design, but today they must cope with the huge complexity of both the hardware and the software systems. In this thesis two different research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs) with the goal of increased simulation speed. The term virtualization can be used in the context of many-core systems not only to refer to the aforementioned hardware emulation tools (Virtual Platforms), but also for two other main purposes: 1) to help the programmer achieve the maximum possible performance of an application, by hiding the complexity of the underlying hardware; and 2) to efficiently exploit the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis is focused on virtualization techniques, with the goal of mitigating, and overcoming where possible, some of the challenges introduced by the many-core design paradigm.

Relevance: 30.00%

Abstract:

Although it is clear that regional analgesia in association with general anaesthesia substantially reduces postoperative pain, the benefits in terms of overall perioperative outcome are less evident. The aim of this nonsystematic review was to evaluate the effect, on middle- and long-term postoperative outcomes, of adding regional perioperative analgesia to general anaesthesia. The study is based mostly on systematic reviews, large epidemiological studies and large or high-quality randomized controlled trials that were selected and evaluated by the author. The endpoints discussed are perioperative morbidity, cancer recurrence, chronic postoperative pain, postoperative rehabilitation and the risk of neurological damage. Epidural analgesia may have a favourable but very small effect on perioperative morbidity; the influence of other regional anaesthetic techniques on perioperative morbidity is unclear. Preliminary data suggest that regional analgesia might reduce the incidence of cancer recurrence, but adequately powered randomized controlled trials are lacking. The sparse literature available suggests that regional analgesia may prevent the development of chronic postoperative pain. Rehabilitation in the immediate postoperative period is possibly improved, but the advantages in the long term remain unclear. Permanent neurological damage is extremely rare. In conclusion, while the risk of permanent neurological damage remains extremely low, the evidence suggests that regional analgesia may improve relevant long-term outcomes. The effect size is mostly small or the number-needed-to-treat is high; however, considering the importance of the outcomes of interest, even a minor improvement probably has substantial clinical relevance.
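
As a purely illustrative worked example of the number-needed-to-treat (the figures are hypothetical, not taken from the review): if adding regional analgesia lowered the rate of a given complication from 10% to 8%, the absolute risk reduction would be 0.10 - 0.08 = 0.02, giving NNT = 1/0.02 = 50; that is, fifty patients would need to receive the technique for one of them to avoid that complication, which is consistent with a high NNT despite a real effect.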